Dataset columns: in_source_id — string (13–58 chars); issue — string (3–241k chars); before_files — list (0–3 items); after_files — list (0–3 items); pr_diff — string (109–107M chars).
conda__conda-build-537
GIT_DESCRIBE_TAG isn't set when the .git is a file Sometimes the .git directory is actually a file; this happens, for example, when the source checkout is a git submodule. That causes this check to fail incorrectly: https://github.com/conda/conda-build/blob/master/conda_build/environ.py#L36 Unfortunately, the .git file may point to the real git repo via a relative path, so simply moving the directory into the conda build directory can break its status as a git repo.
[ { "content": "from __future__ import absolute_import, division, print_function\n\nimport os\nimport sys\nfrom os.path import join\nimport subprocess\nimport multiprocessing\n\nimport conda.config as cc\n\nfrom conda_build.config import config\n\nfrom conda_build import source\n\n\ndef get_perl_ver():\n retur...
[ { "content": "from __future__ import absolute_import, division, print_function\n\nimport os\nimport sys\nfrom os.path import join\nimport subprocess\nimport multiprocessing\n\nimport conda.config as cc\n\nfrom conda_build.config import config\n\nfrom conda_build import source\n\n\ndef get_perl_ver():\n retur...
diff --git a/conda_build/environ.py b/conda_build/environ.py index d3dc2ca6f6..2d7d4cb680 100644 --- a/conda_build/environ.py +++ b/conda_build/environ.py @@ -33,7 +33,7 @@ def get_git_build_info(src_dir): env = os.environ.copy() d = {} git_dir = join(src_dir, '.git') - if os.path.isdir(git_dir): + if os.path.exists(git_dir): env['GIT_DIR'] = git_dir else: return d
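The one-character fix works because a submodule checkout stores its `.git` as a plain file containing a `gitdir:` pointer rather than a directory. A minimal sketch of the behavior (the helper name and layout are illustrative, not conda-build's actual code):

```python
import os
import tempfile

def find_git_dir(src_dir):
    """Return the path to export as GIT_DIR, or None if absent.

    os.path.isdir() misses submodule checkouts, where .git is a plain
    file reading `gitdir: <relative path>`; os.path.exists() accepts
    both the directory and the file form, matching the diff above.
    """
    git_dir = os.path.join(src_dir, ".git")
    if os.path.exists(git_dir):  # was: os.path.isdir(git_dir)
        return git_dir
    return None

with tempfile.TemporaryDirectory() as d:
    # Simulate a submodule: .git is a file, not a directory.
    with open(os.path.join(d, ".git"), "w") as f:
        f.write("gitdir: ../.git/modules/mysub\n")
    print(find_git_dir(d))  # the .git path; isdir() would have missed it
```

Note the issue's caveat still applies: because the `gitdir:` pointer can be relative, the `.git` file is only meaningful in its original location.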
python__mypy-16229
Add setuptools as a dependency on Python 3.12? Mypyc needs `distutils` or `setuptools` to run, but Python 3.12 no longer bundles `distutils` ([PEP 632](https://peps.python.org/pep-0632/)). This seems to imply that we need to include `setuptools` as a dependency of mypy (at least on Python 3.12 or later), or unbundle mypyc into a separate distribution on PyPI. Thoughts?
[ { "content": "#!/usr/bin/env python\n\nfrom __future__ import annotations\n\nimport glob\nimport os\nimport os.path\nimport sys\nfrom typing import TYPE_CHECKING, Any\n\nif sys.version_info < (3, 8, 0): # noqa: UP036\n sys.stderr.write(\"ERROR: You need Python 3.8 or later to use mypy.\\n\")\n exit(1)\n\...
[ { "content": "#!/usr/bin/env python\n\nfrom __future__ import annotations\n\nimport glob\nimport os\nimport os.path\nimport sys\nfrom typing import TYPE_CHECKING, Any\n\nif sys.version_info < (3, 8, 0):\n sys.stderr.write(\"ERROR: You need Python 3.8 or later to use mypy.\\n\")\n exit(1)\n\n# we'll import...
diff --git a/mypyc/doc/getting_started.rst b/mypyc/doc/getting_started.rst index 2db8aae149ec..adc617419ffa 100644 --- a/mypyc/doc/getting_started.rst +++ b/mypyc/doc/getting_started.rst @@ -38,17 +38,17 @@ Installation ------------ Mypyc is shipped as part of the mypy distribution. Install mypy like -this (you need Python 3.5 or later): +this (you need Python 3.8 or later): .. code-block:: - $ python3 -m pip install -U mypy + $ python3 -m pip install -U 'mypy[mypyc]' On some systems you need to use this instead: .. code-block:: - $ python -m pip install -U mypy + $ python -m pip install -U 'mypy[mypyc]' Example program --------------- diff --git a/setup.py b/setup.py index bbb655ea4537..9b945c9047c9 100644 --- a/setup.py +++ b/setup.py @@ -227,6 +227,7 @@ def run(self): # Same here. extras_require={ "dmypy": "psutil >= 4.0", + "mypyc": "setuptools >= 50", "python2": "", "reports": "lxml", "install-types": "pip",
microsoft__DeepSpeed-2698
[BUG] the `benchmarks` folder is included upon installation I noticed that while inspecting the conda package during my attempt to create a conda forge build. ![image](https://user-images.githubusercontent.com/528003/211929466-436decd1-a733-4db6-8a67-c948736575a5.png) The fix is likely as simple as adding `benchmarks` to `packages=find_packages(exclude=[....])` in the `setup.py` file.
[ { "content": "\"\"\"\nCopyright 2020 The Microsoft DeepSpeed Team\n\nDeepSpeed library\n\nTo build wheel on Windows:\n 1. Install pytorch, such as pytorch 1.12 + cuda 11.6\n 2. Install visual cpp build tool\n 3. Include cuda toolkit\n 4. Launch cmd console with Administrator privilege for creating r...
[ { "content": "\"\"\"\nCopyright 2020 The Microsoft DeepSpeed Team\n\nDeepSpeed library\n\nTo build wheel on Windows:\n 1. Install pytorch, such as pytorch 1.12 + cuda 11.6\n 2. Install visual cpp build tool\n 3. Include cuda toolkit\n 4. Launch cmd console with Administrator privilege for creating r...
diff --git a/.github/workflows/formatting.yml b/.github/workflows/formatting.yml index 6ebd9c6d1e9f..f05f3056994b 100644 --- a/.github/workflows/formatting.yml +++ b/.github/workflows/formatting.yml @@ -22,8 +22,10 @@ jobs: steps: - uses: actions/checkout@v2 - - id: setup-venv - uses: ./.github/workflows/setup-venv + - name: environment + run: | + which python + python --version - name: Install deepspeed run: | @@ -32,4 +34,5 @@ jobs: - name: Formatting checks run: | + pip show pre-commit clang-format pre-commit run --all-files diff --git a/setup.py b/setup.py index ae9ece39e3f1..dd942b7b37c1 100755 --- a/setup.py +++ b/setup.py @@ -291,7 +291,9 @@ def create_dir_symlink(src, dest): "release", "requirements", "scripts", - "tests" + "tests", + "benchmarks", + "accelerator" ]), include_package_data=True, scripts=[
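`find_packages(exclude=...)` only drops directories that both look like packages and match the given patterns, which is easy to verify on a throwaway tree. A hedged sketch (the package layout below is invented for illustration):

```python
import os
import tempfile

from setuptools import find_packages

def make_pkg(root, dotted_name):
    # Create a package directory with an __init__.py so that
    # find_packages() will discover it.
    path = os.path.join(root, *dotted_name.split("."))
    os.makedirs(path, exist_ok=True)
    open(os.path.join(path, "__init__.py"), "w").close()

with tempfile.TemporaryDirectory() as root:
    for name in ("deepspeed", "deepspeed.ops", "benchmarks", "tests"):
        make_pkg(root, name)
    # Without the exclusion, benchmarks/ would ship in the wheel as in
    # the screenshot; with it, only the real package tree is found.
    found = find_packages(root, exclude=["tests", "benchmarks"])
    print(sorted(found))  # ['deepspeed', 'deepspeed.ops']
```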
sopel-irc__sopel-555
Configuration error with Python 3.3 On the first run of willie 4.4.1, installed from pip, the configuration file can't be created because of the use of raw_input(), a function replaced by input() in Python 3. Here is the error: ``` bash Welcome to Willie! I can't seem to find the configuration file, so let's generate it! Please answer the following questions to create your configuration file: Encountered an error while writing the config file. This shouldn't happen. Check permissions. Traceback (most recent call last): File "/home/willie/IRC/bin/willie", line 213, in <module> main() File "/home/willie/IRC/bin/willie", line 131, in main create_config(configpath) File "/home/willie/IRC/lib/python3.3/site-packages/willie/config.py", line 450, in create_config config._core() File "/home/willie/IRC/lib/python3.3/site-packages/willie/config.py", line 298, in _core 'Willie') File "/home/willie/IRC/lib/python3.3/site-packages/willie/config.py", line 228, in interactive_add value = raw_input(prompt + ' [%s]: ' % default) or default NameError: global name 'raw_input' is not defined ``` If someone can reproduce the problem, I'll submit a pull request
[ { "content": "# coding=utf8\n\"\"\"\n*Availability: 3+ for all functions; attributes may vary.*\n\nThe config class is an abstraction class for accessing the active Willie\nconfiguration file.\n\nThe Willie config file is divided to sections, and each section contains keys\nand values. A section is an attribute...
[ { "content": "# coding=utf8\n\"\"\"\n*Availability: 3+ for all functions; attributes may vary.*\n\nThe config class is an abstraction class for accessing the active Willie\nconfiguration file.\n\nThe Willie config file is divided to sections, and each section contains keys\nand values. A section is an attribute...
diff --git a/willie/config.py b/willie/config.py index 092e674e0f..9db78041ff 100644 --- a/willie/config.py +++ b/willie/config.py @@ -53,6 +53,7 @@ if sys.version_info.major >= 3: unicode = str basestring = str + raw_input = input class ConfigurationError(Exception): """ Exception type for configuration errors """
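The one-line fix follows the shim pattern already present a few lines up in the same module: on Python 3, rebind the removed Python 2 builtins at module level so legacy call sites keep working unchanged. A minimal standalone sketch:

```python
import sys

# On Python 3, alias the builtins that Python 2 code expects; on
# Python 2 the names already exist, so nothing needs rebinding.
if sys.version_info.major >= 3:
    unicode = str
    basestring = str
    raw_input = input  # the line the fix adds

# Legacy code like `value = raw_input(prompt) or default` now resolves
# raw_input to this module-level alias instead of raising NameError.
print(raw_input is input)  # True on Python 3
```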
xonsh__xonsh-4631
Interactive printing fails for objects that implement hasattr in a non-standard way Interactive printing fails for objects that implement `hasattr` in a non-standard way. For example, in the popular [BeautifulSoup](https://pypi.org/project/beautifulsoup4/) library, some objects have a `getattr` implementation that always returns True, irrespective of whether a value actually exists. This causes ```python if hasattr(obj, "xonsh_display"): return obj.xonsh_display() ``` in `pretty.py` to fail. ```console $ import bs4 $ bs4.BeautifulSoup("<html></html>", 'html.parser') $ ... traceback: TypeError: 'NoneType' object is not callable ``` ## xonfig <details> ``` $ xonfig +------------------+---------------------+ | xonsh | 0.11.0.dev40.dev40 | | Git SHA | 70dd8bf2 | | Commit Date | Jan 7 22:34:52 2022 | | Python | 3.6.15 | | PLY | 3.11 | | have readline | True | | prompt toolkit | 3.0.7 | | shell type | prompt_toolkit | | history backend | json | | pygments | 2.11.2 | | on posix | True | | on linux | True | | distro | unknown | | on wsl | False | | on darwin | False | | on windows | False | | on cygwin | False | | on msys2 | False | | is superuser | True | | default encoding | utf-8 | | xonsh encoding | utf-8 | | encoding errors | surrogateescape | | on jupyter | False | | jupyter kernel | None | | xontrib | [] | | RC file | [] | +------------------+---------------------+ ``` </details> ## Expected Behavior The value gets printed. ## Current Behavior <!--- Tell us what happens instead of the expected behavior --> <!--- If part of your bug report is a traceback, please first enter debug mode before triggering the error To enter debug mode, set the environment variable `XONSH_DEBUG=1` _before_ starting `xonsh`. 
On Linux and OSX, an easy way to do this is to run `env XONSH_DEBUG=1 xonsh` --> ### Traceback (if applicable) ``` Traceback (most recent call last): File "/usr/local/lib/python3.9/dist-packages/xonsh/__amalgam__.py", line 16776, in default run_compiled_code(code, self.ctx, None, "single") File "/usr/local/lib/python3.9/dist-packages/xonsh/__amalgam__.py", line 3563, in run_compiled_code func(code, glb, loc) File "<xonsh-code>", line 1, in <module> File "/usr/local/lib/python3.9/dist-packages/xonsh/__amalgam__.py", line 21612, in _pprint_displayhook printed_val = pretty(value) File "/usr/local/lib/python3.9/dist-packages/xonsh/__amalgam__.py", line 2034, in pretty return obj.xonsh_display() TypeError: 'NoneType' object is not callable ``` </details> ## Steps to Reproduce Described in first section. ## For community ⬇️ **Please click the 👍 reaction instead of leaving a `+1` or 👍 comment**
[ { "content": "\"\"\"\nPython advanced pretty printer. This pretty printer is intended to\nreplace the old `pprint` python module which does not allow developers\nto provide their own pretty print callbacks.\n\nThis module is based on ruby's `prettyprint.rb` library by `Tanaka Akira`.\n\nThe following implement...
[ { "content": "\"\"\"\nPython advanced pretty printer. This pretty printer is intended to\nreplace the old `pprint` python module which does not allow developers\nto provide their own pretty print callbacks.\n\nThis module is based on ruby's `prettyprint.rb` library by `Tanaka Akira`.\n\nThe following implement...
diff --git a/xonsh/pretty.py b/xonsh/pretty.py index 23013a9df2..f0e1536f57 100644 --- a/xonsh/pretty.py +++ b/xonsh/pretty.py @@ -118,7 +118,7 @@ def pretty( """ Pretty print the object's representation. """ - if hasattr(obj, "xonsh_display"): + if _safe_getattr(obj, "xonsh_display"): return obj.xonsh_display() stream = io.StringIO()
sktime__sktime-556
[DOC] SlidingWindowSplitter start_with_window default value not consistent #### Describe the issue linked to the documentation https://github.com/alan-turing-institute/sktime/blob/139b9291fb634cce367f714a6132212b0172e199/sktime/forecasting/model_selection/_split.py#L174 It looks like the default value is start_with_window=False, but the documentation states that it is True. Not sure which one was intended. https://github.com/alan-turing-institute/sktime/blob/139b9291fb634cce367f714a6132212b0172e199/sktime/forecasting/model_selection/_split.py#L183 #### Suggest a potential alternative/fix Change either the documentation or the default argument so the two match
[ { "content": "#!/usr/bin/env python3 -u\n# -*- coding: utf-8 -*-\n# copyright: sktime developers, BSD-3-Clause License (see LICENSE file)\n\n__all__ = [\n \"SlidingWindowSplitter\",\n \"CutoffSplitter\",\n \"SingleWindowSplitter\",\n \"temporal_train_test_split\",\n]\n__author__ = [\"Markus Löning\"...
[ { "content": "#!/usr/bin/env python3 -u\n# -*- coding: utf-8 -*-\n# copyright: sktime developers, BSD-3-Clause License (see LICENSE file)\n\n__all__ = [\n \"SlidingWindowSplitter\",\n \"CutoffSplitter\",\n \"SingleWindowSplitter\",\n \"temporal_train_test_split\",\n]\n__author__ = [\"Markus Löning\"...
diff --git a/.all-contributorsrc b/.all-contributorsrc index 3cdf7415418..cd1c417af59 100644 --- a/.all-contributorsrc +++ b/.all-contributorsrc @@ -717,7 +717,8 @@ "profile": "https://github.com/ngupta23", "contributions": [ "code", - "bug" + "bug", + "doc" ] }, { diff --git a/sktime/forecasting/model_selection/_split.py b/sktime/forecasting/model_selection/_split.py index 7082fcf92fd..d1598c108f5 100644 --- a/sktime/forecasting/model_selection/_split.py +++ b/sktime/forecasting/model_selection/_split.py @@ -172,7 +172,7 @@ class SlidingWindowSplitter(BaseWindowSplitter): window_length : int step_length : int initial_window : int - start_with_window : bool, optional (default=True) + start_with_window : bool, optional (default=False) Examples --------
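Mismatches like this are mechanically detectable, since the authoritative default lives in the signature, not in the docstring. A small sketch using `inspect` (the function below is a stand-in, not sktime's class):

```python
import inspect

def splitter(fh=1, window_length=10, start_with_window=False):
    """Stand-in for SlidingWindowSplitter.__init__."""

# Read the real default straight from the signature; a docs check can
# then compare this value against what the docstring claims.
default = inspect.signature(splitter).parameters["start_with_window"].default
print(default)  # False — so the docstring's "(default=True)" was wrong
```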
scikit-hep__pyhf-1790
Guard SCHEMA_VERSION from version bumps I don't think it is going to be possible to guard the `SCHEMA_VERSION` from `bump2version` so we might need to look for a replacement for `bump2version` that gives guard support. This is going to be a problem when https://github.com/scikit-hep/pyhf/blob/6b0a9317b14da2a452f51d089cb9e493c8f19347/.bumpversion.cfg#L1-L2 hits `1.0.0` and conflicts with https://github.com/scikit-hep/pyhf/blob/f824afe77d9e48e90651931700ccfc3d3c268c18/src/pyhf/utils.py#L13 and also has to properly pick up the multiple correct instances in https://github.com/scikit-hep/pyhf/blob/f824afe77d9e48e90651931700ccfc3d3c268c18/src/pyhf/utils.py#L145 _Originally posted by @matthewfeickert in https://github.com/scikit-hep/pyhf/issues/1218#issuecomment-744590434_
[ { "content": "from setuptools import setup\n\nextras_require = {\n 'shellcomplete': ['click_completion'],\n 'tensorflow': [\n 'tensorflow>=2.3.1', # c.f. https://github.com/tensorflow/tensorflow/pull/40789\n 'tensorflow-probability>=0.11.0', # c.f. PR #1657\n ],\n 'torch': ['torch>=1...
[ { "content": "from setuptools import setup\n\nextras_require = {\n 'shellcomplete': ['click_completion'],\n 'tensorflow': [\n 'tensorflow>=2.3.1', # c.f. https://github.com/tensorflow/tensorflow/pull/40789\n 'tensorflow-probability>=0.11.0', # c.f. PR #1657\n ],\n 'torch': ['torch>=1...
diff --git a/setup.py b/setup.py index 52dbd81742..9fb006aeb5 100644 --- a/setup.py +++ b/setup.py @@ -68,7 +68,7 @@ + extras_require['test'] + [ 'nbdime', - 'bump2version', + 'tbump>=6.7.0', 'ipython', 'pre-commit', 'check-manifest', diff --git a/tbump.toml b/tbump.toml new file mode 100644 index 0000000000..e03aeb71c4 --- /dev/null +++ b/tbump.toml @@ -0,0 +1,61 @@ +github_url = "https://github.com/scikit-hep/pyhf/" + +[version] +current = "0.6.3" + +# Example of a semver regexp. +# Make sure this matches current_version before +# using tbump +regex = ''' + (?P<major>\d+) + \. + (?P<minor>\d+) + \. + (?P<patch>\d+) + (rc + (?P<candidate>\d+) + )? + ''' + +[git] +# The current version will get updated when tbump is run +message_template = "Bump version: 0.6.3 → {new_version}" +tag_template = "v{new_version}" + +# For each file to patch, add a [[file]] config +# section containing the path of the file, relative to the +# tbump.toml location. +[[file]] +src = "tbump.toml" +# Restrict search to make it explicit why tbump.toml +# is even included as a file to bump, as it will get +# its version.current attribute bumped anyway. +search = "Bump version: {current_version} → " + +[[file]] +src = "src/pyhf/utils.py" +# Guard SCHEMA_VERSION +# This search is just identifying the line to restrict the +# regex to, but all matches in the line will get bumped. +search = "pyhf: v{current_version}" + +[[file]] +src = "README.rst" + +[[file]] +src = "src/pyhf/data/citation.bib" + +[[file]] +src = ".zenodo.json" + +[[file]] +src = "codemeta.json" + +[[file]] +src = "CITATION.cff" + +[[field]] +# the name of the field +name = "candidate" +# the default value to use, if there is no match +default = ""
joke2k__faker-512
Using É, é (e-acute) in emails. It looks like É and é (e-acute) are not valid in email addresses. I used https://pypi.python.org/pypi/robotframework-faker/ which uses this library, and the following email was returned: andré38@mentzel.net But email verification failed for this address. Could you remove É, é, and other such letters from generated email addresses?
[ { "content": "# coding=utf-8\n\nfrom __future__ import unicode_literals\nfrom .. import Provider as InternetProvider\n\nclass Provider(InternetProvider):\n\n free_email_domains = (\n 'aol.de', 'gmail.com', 'gmx.de', 'googlemail.com', 'hotmail.de',\n 'web.de', 'yahoo.de',\n )\n tlds = ('co...
[ { "content": "# coding=utf-8\n\nfrom __future__ import unicode_literals\nfrom .. import Provider as InternetProvider\n\nclass Provider(InternetProvider):\n\n free_email_domains = (\n 'aol.de', 'gmail.com', 'gmx.de', 'googlemail.com', 'hotmail.de',\n 'web.de', 'yahoo.de',\n )\n tlds = ('co...
diff --git a/faker/providers/internet/de_DE/__init__.py b/faker/providers/internet/de_DE/__init__.py index 76aaec7ddf..231d57aa0f 100644 --- a/faker/providers/internet/de_DE/__init__.py +++ b/faker/providers/internet/de_DE/__init__.py @@ -15,5 +15,7 @@ class Provider(InternetProvider): ('ä', 'ae'), ('Ä', 'Ae'), ('ö', 'oe'), ('Ö', 'Oe'), ('ü', 'ue'), ('Ü', 'Ue'), + ('é', 'e'), ('É', 'E'), + ('à', 'a'), ('À', 'A'), ('ß', 'ss'), )
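The provider's replacement table is applied pairwise to a name before it becomes the local part of an address, so extending the tuple, as the diff does, is the whole fix. A standalone sketch of that mechanism (the helper name is illustrative):

```python
# Mirror of the extended replacement table from the diff.
replacements = (
    ('ä', 'ae'), ('Ä', 'Ae'), ('ö', 'oe'), ('Ö', 'Oe'),
    ('ü', 'ue'), ('Ü', 'Ue'),
    ('é', 'e'), ('É', 'E'),
    ('à', 'a'), ('À', 'A'),
    ('ß', 'ss'),
)

def ascii_local_part(name):
    # Apply each substitution in order, yielding an ASCII-safe string
    # for the part of the address before the @.
    for src, dst in replacements:
        name = name.replace(src, dst)
    return name

print(ascii_local_part("andré"))  # andre
print(ascii_local_part("Müßig"))  # Muessig
```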
openstates__openstates-scrapers-2982
OR failing since at least 2019-06-09 OR has been failing since 2019-06-09 Based on automated runs it appears that OR has not run successfully in 2 days (2019-06-09). ``` loaded Open States pupa settings... or (scrape, import) bills: {} votes: {} 08:01:13 CRITICAL pupa: Session(s) 2019-2020 Interim were reported by Oregon.get_session_list() but were not found in Oregon.legislative_sessions or Oregon.ignored_scraped_sessions. ``` Visit http://bobsled.openstates.org for more info.
[ { "content": "from pupa.scrape import Jurisdiction, Organization\nfrom .people import ORPersonScraper\n# from .committees import ORCommitteeScraper\nfrom .bills import ORBillScraper\nfrom .votes import ORVoteScraper\n\n\nclass Oregon(Jurisdiction):\n division_id = \"ocd-division/country:us/state:or\"\n cl...
[ { "content": "from pupa.scrape import Jurisdiction, Organization\nfrom .people import ORPersonScraper\n# from .committees import ORCommitteeScraper\nfrom .bills import ORBillScraper\nfrom .votes import ORVoteScraper\n\n\nclass Oregon(Jurisdiction):\n division_id = \"ocd-division/country:us/state:or\"\n cl...
diff --git a/openstates/or/__init__.py b/openstates/or/__init__.py index 2630876cc0..18c4f1eaa6 100644 --- a/openstates/or/__init__.py +++ b/openstates/or/__init__.py @@ -108,6 +108,7 @@ class Oregon(Jurisdiction): ] ignored_scraped_sessions = [ "Today", + "2019-2020 Interim", "2017-2018 Interim", "2015-2016 Interim", "2013 1st Special Session",
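The CRITICAL message comes from a consistency check: every session string the site reports must appear either in the known legislative sessions or in the explicit ignore list. A hedged sketch of that check, with illustrative session data:

```python
legislative_sessions = ["2019 Regular Session"]  # sessions to scrape
ignored_scraped_sessions = [                     # known but skipped
    "Today",
    "2017-2018 Interim",
]

# What get_session_list() found on the site (illustrative values).
scraped = ["2019 Regular Session", "2019-2020 Interim", "Today"]

accounted_for = set(legislative_sessions) | set(ignored_scraped_sessions)
unaccounted = [s for s in scraped if s not in accounted_for]
print(unaccounted)  # ['2019-2020 Interim'] — exactly what the log reports
```

Adding the new interim session to `ignored_scraped_sessions`, as the diff does, empties this list and lets the run proceed.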
chainer__chainer-8219
pytest is causing error in Jenkins Example: https://jenkins.preferred.jp/job/chainer/job/chainer_pr/2162/TEST=CHAINERX_chainer-py3,label=mn1-p100/console ``` 14:33:27 + pytest -rfEX --showlocals -m 'not slow and not ideep' /repo/tests/chainer_tests 14:33:28 Traceback (most recent call last): 14:33:28 File "/workspace/conda/envs/testenv/bin/pytest", line 10, in <module> 14:33:28 sys.exit(main()) 14:33:28 File "/workspace/conda/envs/testenv/lib/python3.6/site-packages/_pytest/config/__init__.py", line 61, in main 14:33:28 config = _prepareconfig(args, plugins) 14:33:28 File "/workspace/conda/envs/testenv/lib/python3.6/site-packages/_pytest/config/__init__.py", line 182, in _prepareconfig 14:33:28 config = get_config() 14:33:28 File "/workspace/conda/envs/testenv/lib/python3.6/site-packages/_pytest/config/__init__.py", line 156, in get_config 14:33:28 pluginmanager.import_plugin(spec) 14:33:28 File "/workspace/conda/envs/testenv/lib/python3.6/site-packages/_pytest/config/__init__.py", line 530, in import_plugin 14:33:28 __import__(importspec) 14:33:28 File "/workspace/conda/envs/testenv/lib/python3.6/site-packages/_pytest/tmpdir.py", line 25, in <module> 14:33:28 class TempPathFactory(object): 14:33:28 File "/workspace/conda/envs/testenv/lib/python3.6/site-packages/_pytest/tmpdir.py", line 35, in TempPathFactory 14:33:28 lambda p: Path(os.path.abspath(six.text_type(p))) 14:33:28 TypeError: attrib() got an unexpected keyword argument 'convert' ```
[ { "content": "#!/usr/bin/env python\n\nimport os\nimport pkg_resources\nimport sys\n\nfrom setuptools import setup\n\nimport chainerx_build_helper\n\n\nif sys.version_info[:3] == (3, 5, 0):\n if not int(os.getenv('CHAINER_PYTHON_350_FORCE', '0')):\n msg = \"\"\"\nChainer does not work with Python 3.5....
[ { "content": "#!/usr/bin/env python\n\nimport os\nimport pkg_resources\nimport sys\n\nfrom setuptools import setup\n\nimport chainerx_build_helper\n\n\nif sys.version_info[:3] == (3, 5, 0):\n if not int(os.getenv('CHAINER_PYTHON_350_FORCE', '0')):\n msg = \"\"\"\nChainer does not work with Python 3.5....
diff --git a/setup.py b/setup.py index 61bce8169532..2fabb8e706f1 100644 --- a/setup.py +++ b/setup.py @@ -45,6 +45,7 @@ ], 'test': [ 'pytest<4.2.0', # 4.2.0 is slow collecting tests and times out on CI. + 'attrs<19.2.0', # pytest 4.1.1 does not run with attrs==19.2.0 'mock', ], 'doctest': [
redis__redis-py-1069
AttributeError: 'UnixDomainSocketConnection' object has no attribute '_buffer_cutoff' Since version 3.0, redis client seems broken. I cannot even get, set or keys() anything when connecting to unix socket. I tried running it in a docker container running centos 7.5.1804 core using python3.6. Steps to reproduce: install redis 3.0 ```python import redis red = redis.Redis(unix_socket_path = "/path/to/socket") red.keys() ``` Throws this: ``` Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/usr/lib/python3.6/site-packages/redis/client.py", line 1262, in keys return self.execute_command('KEYS', pattern) File "/usr/lib/python3.6/site-packages/redis/client.py", line 754, in execute_command connection.send_command(*args) File "/usr/lib/python3.6/site-packages/redis/connection.py", line 619, in send_command self.send_packed_command(self.pack_command(*args)) File "/usr/lib/python3.6/site-packages/redis/connection.py", line 658, in pack_command buffer_cutoff = self._buffer_cutoff AttributeError: 'UnixDomainSocketConnection' object has no attribute '_buffer_cutoff' ```
[ { "content": "from __future__ import unicode_literals\nfrom distutils.version import StrictVersion\nfrom itertools import chain\nimport io\nimport os\nimport socket\nimport sys\nimport threading\nimport warnings\n\ntry:\n import ssl\n ssl_available = True\nexcept ImportError:\n ssl_available = False\n\...
[ { "content": "from __future__ import unicode_literals\nfrom distutils.version import StrictVersion\nfrom itertools import chain\nimport io\nimport os\nimport socket\nimport sys\nimport threading\nimport warnings\n\ntry:\n import ssl\n ssl_available = True\nexcept ImportError:\n ssl_available = False\n\...
diff --git a/redis/connection.py b/redis/connection.py index b38f24c42d..9b949c5c08 100755 --- a/redis/connection.py +++ b/redis/connection.py @@ -759,6 +759,7 @@ def __init__(self, path='', db=0, password=None, 'db': self.db, } self._connect_callbacks = [] + self._buffer_cutoff = 6000 def _connect(self): "Create a Unix domain socket connection"
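The root cause is a classic subclassing bug: the subclass constructor does not chain to the parent, so attributes introduced in the parent `__init__` (like `_buffer_cutoff` in redis-py 3.0) never exist on socket connections. A stripped-down sketch of the pattern (not redis-py's real code):

```python
class Connection:
    def __init__(self):
        self._buffer_cutoff = 6000  # read later by pack_command()

class UnixDomainSocketConnection(Connection):
    def __init__(self, path=""):
        # Bug: no super().__init__() call, so nothing set by the parent
        # constructor exists on this instance.
        self.path = path

conn = UnixDomainSocketConnection("/tmp/redis.sock")
print(hasattr(conn, "_buffer_cutoff"))  # False — hence the AttributeError

# The merged fix sets self._buffer_cutoff = 6000 directly in the
# subclass __init__; calling super().__init__() would also have worked.
```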
hylang__hy-1201
fix setup.py At least `hy.extra` is missing from `package_data`
[ { "content": "#!/usr/bin/env python\n# Copyright (c) 2012, 2013 Paul Tagliamonte <paultag@debian.org>\n#\n# Permission is hereby granted, free of charge, to any person obtaining a\n# copy of this software and associated documentation files (the \"Software\"),\n# to deal in the Software without restriction, incl...
[ { "content": "#!/usr/bin/env python\n# Copyright (c) 2012, 2013 Paul Tagliamonte <paultag@debian.org>\n#\n# Permission is hereby granted, free of charge, to any person obtaining a\n# copy of this software and associated documentation files (the \"Software\"),\n# to deal in the Software without restriction, incl...
diff --git a/setup.py b/setup.py index 3fc2be325..a64cb7b80 100755 --- a/setup.py +++ b/setup.py @@ -71,6 +71,7 @@ package_data={ 'hy.contrib': ['*.hy'], 'hy.core': ['*.hy'], + 'hy.extra': ['*.hy'], }, author="Paul Tagliamonte", author_email="tag@pault.ag",
DataBiosphere__toil-4011
We're getting the new Werkzeug without asking for it CI for the merge https://github.com/DataBiosphere/toil/commit/0e256d63cb974a87b8f6b807bf7d23bc9a12fb76 failed at the lint stage, because the merge commit ends up installing a different Werkzeug than the PR's test run did, and the new one has type hints, which upsets MyPy because we now have an unused ignore. This is because the `connexion` devs finally got access to the `connexion` PyPI package again, and published the current release there. So we started picking up connexion 2.10 instead of 2.5, which is compatible with Flask 2. So we started installing Flask 2 and Werkzeug 2. If we're going to import out of Werkzeug, we need to depend on a particular major version of it, so it can't be changed out from under us by pip. ┆Issue is synchronized with this [Jira Task](https://ucsc-cgl.atlassian.net/browse/TOIL-1130) ┆friendlyId: TOIL-1130
[ { "content": "# Modified from: https://github.com/common-workflow-language/workflow-service\nimport functools\nimport json\nimport os\nimport logging\nimport tempfile\nfrom abc import abstractmethod\nfrom typing import Optional, List, Dict, Any, Tuple, Callable\nfrom urllib.parse import urldefrag\n\nimport conn...
[ { "content": "# Modified from: https://github.com/common-workflow-language/workflow-service\nimport functools\nimport json\nimport os\nimport logging\nimport tempfile\nfrom abc import abstractmethod\nfrom typing import Optional, List, Dict, Any, Tuple, Callable\nfrom urllib.parse import urldefrag\n\nimport conn...
diff --git a/requirements-server.txt b/requirements-server.txt index 5657b16453..8d0d9a790b 100644 --- a/requirements-server.txt +++ b/requirements-server.txt @@ -1,4 +1,6 @@ -connexion[swagger-ui]>=2.5.1, <3 +connexion[swagger-ui]>=2.10.0, <3 +flask>=2.0,<3 +werkzeug>=2.0,<3 flask-cors==3.0.10 gunicorn==20.1.0 celery>=5.1.0, <6 diff --git a/src/toil/server/wes/abstract_backend.py b/src/toil/server/wes/abstract_backend.py index 95fb5427b2..c340c5969c 100644 --- a/src/toil/server/wes/abstract_backend.py +++ b/src/toil/server/wes/abstract_backend.py @@ -9,7 +9,7 @@ from urllib.parse import urldefrag import connexion # type: ignore -from werkzeug.utils import secure_filename # type: ignore +from werkzeug.utils import secure_filename logger = logging.getLogger(__name__)
mitmproxy__mitmproxy-1150
ServerException instead of ProxyServerError ##### Steps to reproduce the problem: ``` >>> from libmproxy.proxy.server import ProxyServer >>> from libmproxy.proxy.config import ProxyConfig >>> ProxyServer(ProxyConfig(port=80)) (...) ServerException: Error starting proxy server: error(13, 'Permission denied') ``` ##### What is the expected behavior? According to the documentation: ``` >>> ProxyServer? Type: type String form: <class 'libmproxy.proxy.server.ProxyServer'> File: /usr/lib/python2.7/dist-packages/libmproxy/proxy/server.py Init definition: ProxyServer(self, config) Docstring: <no docstring> Init docstring: Raises ProxyServerError if there's a startup problem. ``` the expected behavior is ``` >>> ProxyServer(ProxyConfig(port=80)) (...) ProxyServerError: Error starting proxy server: error(13, 'Permission denied') ``` ##### What went wrong? Maybe the documentation is wrong? ##### Any other comments? Nope. --- Mitmproxy Version: 0.15-2 Operating System: Debian Sid.
[ { "content": "from __future__ import (absolute_import, print_function, division)\n\nimport traceback\nimport sys\nimport socket\nimport six\n\nfrom netlib import tcp\nfrom netlib.exceptions import TcpException\nfrom netlib.http.http1 import assemble_response\nfrom ..exceptions import ProtocolException, ServerEx...
[ { "content": "from __future__ import (absolute_import, print_function, division)\n\nimport traceback\nimport sys\nimport socket\nimport six\n\nfrom netlib import tcp\nfrom netlib.exceptions import TcpException\nfrom netlib.http.http1 import assemble_response\nfrom ..exceptions import ProtocolException, ServerEx...
diff --git a/mitmproxy/proxy/server.py b/mitmproxy/proxy/server.py index 4304bd0be3..8483d3df6c 100644 --- a/mitmproxy/proxy/server.py +++ b/mitmproxy/proxy/server.py @@ -36,7 +36,7 @@ class ProxyServer(tcp.TCPServer): def __init__(self, config): """ - Raises ProxyServerError if there's a startup problem. + Raises ServerException if there's a startup problem. """ self.config = config try:
huggingface__transformers-9379
Improve coverage of the documentation Currently, some public classes are not documented anywhere because we didn't create the corresponding doc pages. Those missing pages are: - Benchmark classes - Bert Japanese - Data collators If someone feels like working on one of those, please tag yourself with a comment on this issue. Once the objects are properly documented, they can be removed from the `SHOULD_BE_DOCUMENTED` constant in [this file](https://github.com/huggingface/transformers/blob/1310e1a758edc8e89ec363db76863c771fbeb1de/utils/check_repo.py#L374).
[ { "content": "# coding=utf-8\n# Copyright 2020 The HuggingFace Inc. team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#...
[ { "content": "# coding=utf-8\n# Copyright 2020 The HuggingFace Inc. team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#...
diff --git a/docs/source/index.rst b/docs/source/index.rst
index 1ada9c18d71c..f8b9c43670b6 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -356,6 +356,7 @@ TensorFlow and/or Flax.
     model_doc/bart
     model_doc/barthez
     model_doc/bert
+    model_doc/bertweet
     model_doc/bertgeneration
     model_doc/blenderbot
     model_doc/camembert
diff --git a/docs/source/model_doc/bertweet.rst b/docs/source/model_doc/bertweet.rst
new file mode 100644
index 000000000000..4fe1470def83
--- /dev/null
+++ b/docs/source/model_doc/bertweet.rst
@@ -0,0 +1,64 @@
+..
+    Copyright 2020 The HuggingFace Team. All rights reserved.
+
+    Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
+    the License. You may obtain a copy of the License at
+
+        http://www.apache.org/licenses/LICENSE-2.0
+
+    Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
+    an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
+    specific language governing permissions and limitations under the License.
+
+Bertweet
+-----------------------------------------------------------------------------------------------------------------------
+
+Overview
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The BERTweet model was proposed in `BERTweet: A pre-trained language model for English Tweets
+<https://www.aclweb.org/anthology/2020.emnlp-demos.2.pdf>`__ by Dat Quoc Nguyen, Thanh Vu, Anh Tuan Nguyen.
+
+The abstract from the paper is the following:
+
+*We present BERTweet, the first public large-scale pre-trained language model for English Tweets. Our BERTweet, having
+the same architecture as BERT-base (Devlin et al., 2019), is trained using the RoBERTa pre-training procedure (Liu et
+al., 2019). Experiments show that BERTweet outperforms strong baselines RoBERTa-base and XLM-R-base (Conneau et al.,
+2020), producing better performance results than the previous state-of-the-art models on three Tweet NLP tasks:
+Part-of-speech tagging, Named-entity recognition and text classification.*
+
+Example of use:
+
+.. code-block::
+
+    import torch
+    from transformers import AutoModel, AutoTokenizer
+
+    bertweet = AutoModel.from_pretrained("vinai/bertweet-base")
+
+    # For transformers v4.x+:
+    tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-base", use_fast=False)
+
+    # For transformers v3.x:
+    # tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-base")
+
+    # INPUT TWEET IS ALREADY NORMALIZED!
+    line = "SC has first two presumptive cases of coronavirus , DHEC confirms HTTPURL via @USER :cry:"
+
+    input_ids = torch.tensor([tokenizer.encode(line)])
+
+    with torch.no_grad():
+        features = bertweet(input_ids)  # Models outputs are now tuples
+
+    ## With TensorFlow 2.0+:
+    # from transformers import TFAutoModel
+    # bertweet = TFAutoModel.from_pretrained("vinai/bertweet-base")
+
+
+The original code can be found `here <https://github.com/VinAIResearch/BERTweet>`__.
+
+BertweetTokenizer
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+.. autoclass:: transformers.BertweetTokenizer
+    :members:
diff --git a/utils/check_repo.py b/utils/check_repo.py
index 5fd890a34770..7897f1ca6212 100644
--- a/utils/check_repo.py
+++ b/utils/check_repo.py
@@ -382,8 +382,6 @@ def find_all_documented_objects():
         "BertJapaneseTokenizer",
         "CharacterTokenizer",
         "MecabTokenizer",
-        # Bertweet
-        "BertweetTokenizer",
         # Herbert
         "HerbertTokenizer",
         "HerbertTokenizerFast",
paperless-ngx__paperless-ngx-2280
[Bug] cannot save Mail Rule with "mail and attachment as separate documents" in 1.11.1

Maybe it's just me, but I cannot save a Mail Rule with "mail and attachment as separate documents".

_Originally posted by @Limerick-gh in https://github.com/paperless-ngx/paperless-ngx/discussions/2265#discussioncomment-4557234_

[Bug] Missing consumption scope options in frontend

### Discussed in https://github.com/paperless-ngx/paperless-ngx/discussions/2265

<div type='discussions-op-text'>
<sup>Originally posted by **morremeyer** December 30, 2022</sup>

With #2000, frontend configuration for mail consumption was added. With #848, at about the same time, email body & .eml file consumption was added.

#848 added the **consumption scope** for email consumption (see https://github.com/p-h-a-i-l/paperless-ngx/blob/0fda35723d62275a5beb783cbf9061d4d4a15703/src/paperless_mail/models.py#L59-L65) to decide between consuming:

* only the attachments
* the full email as .eml
* the full email as .eml **and** the attachments

The **consumption scope** is not yet configurable on the frontend. I'd be really happy if it were configurable in the frontend in a future version.

I'm pretty sure someone already has that planned, but I couldn't find an issue or discussion for it, so I'm opening this one to track this request.</div>
[ { "content": "from documents.serialisers import CorrespondentField\nfrom documents.serialisers import DocumentTypeField\nfrom documents.serialisers import TagsField\nfrom paperless_mail.models import MailAccount\nfrom paperless_mail.models import MailRule\nfrom rest_framework import serializers\n\n\nclass Obfus...
[ { "content": "from documents.serialisers import CorrespondentField\nfrom documents.serialisers import DocumentTypeField\nfrom documents.serialisers import TagsField\nfrom paperless_mail.models import MailAccount\nfrom paperless_mail.models import MailRule\nfrom rest_framework import serializers\n\n\nclass Obfus...
diff --git a/src-ui/messages.xlf b/src-ui/messages.xlf index d833ae3eead..21ac728b301 100644 --- a/src-ui/messages.xlf +++ b/src-ui/messages.xlf @@ -967,7 +967,7 @@ </context-group> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">36</context> + <context context-type="linenumber">37</context> </context-group> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/storage-path-edit-dialog/storage-path-edit-dialog.component.html</context> @@ -1006,7 +1006,7 @@ </context-group> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">37</context> + <context context-type="linenumber">38</context> </context-group> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/storage-path-edit-dialog/storage-path-edit-dialog.component.html</context> @@ -1194,141 +1194,159 @@ <context context-type="linenumber">14</context> </context-group> </trans-unit> + <trans-unit id="559099472394646919" datatype="html"> + <source>Consumption scope</source> + <context-group purpose="location"> + <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> + <context context-type="linenumber">15</context> + </context-group> + </trans-unit> <trans-unit id="56643687972548912" datatype="html"> <source>See docs for .eml processing requirements</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">14</context> + <context 
context-type="linenumber">15</context> </context-group> </trans-unit> <trans-unit id="5488632521862493221" datatype="html"> <source>Paperless will only process mails that match <x id="START_EMPHASISED_TEXT" ctype="x-em" equiv-text="&lt;em&gt;"/>all<x id="CLOSE_EMPHASISED_TEXT" ctype="x-em" equiv-text="&lt;/em&gt;"/> of the filters specified below.</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">17</context> + <context context-type="linenumber">18</context> </context-group> </trans-unit> <trans-unit id="6925928412364847639" datatype="html"> <source>Filter from</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">18</context> + <context context-type="linenumber">19</context> </context-group> </trans-unit> <trans-unit id="8497813481090627874" datatype="html"> <source>Filter subject</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">19</context> + <context context-type="linenumber">20</context> </context-group> </trans-unit> <trans-unit id="7314357616097563149" datatype="html"> <source>Filter body</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">20</context> + <context context-type="linenumber">21</context> </context-group> </trans-unit> <trans-unit id="5031687746498952417" datatype="html"> <source>Filter attachment filename</source> <context-group purpose="location"> <context 
context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">21</context> + <context context-type="linenumber">22</context> </context-group> </trans-unit> <trans-unit id="4245210767172267486" datatype="html"> <source>Only consume documents which entirely match this filename if specified. Wildcards such as *.pdf or *invoice* are allowed. Case insensitive.</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">21</context> + <context context-type="linenumber">22</context> </context-group> </trans-unit> <trans-unit id="9216117865911519658" datatype="html"> <source>Action</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">24</context> + <context context-type="linenumber">25</context> </context-group> </trans-unit> <trans-unit id="4274038999388817994" datatype="html"> <source>Action is only performed when documents are consumed from the mail. 
Mails without attachments remain entirely untouched.</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">24</context> + <context context-type="linenumber">25</context> </context-group> </trans-unit> <trans-unit id="1261794314435932203" datatype="html"> <source>Action parameter</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">25</context> + <context context-type="linenumber">26</context> </context-group> </trans-unit> <trans-unit id="6093797930511670257" datatype="html"> <source>Assign title from</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">26</context> + <context context-type="linenumber">27</context> </context-group> </trans-unit> <trans-unit id="6695990587380209737" datatype="html"> <source>Assign document type</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">28</context> + <context context-type="linenumber">29</context> </context-group> </trans-unit> <trans-unit id="4754802869258527587" datatype="html"> <source>Assign correspondent from</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">29</context> + <context context-type="linenumber">30</context> </context-group> </trans-unit> <trans-unit 
id="4875491778188965469" datatype="html"> <source>Assign correspondent</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">30</context> + <context context-type="linenumber">31</context> </context-group> </trans-unit> <trans-unit id="1519954996184640001" datatype="html"> <source>Error</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html</context> - <context context-type="linenumber">35</context> + <context context-type="linenumber">36</context> </context-group> <context-group purpose="location"> <context context-type="sourcefile">src/app/services/toast.service.ts</context> <context context-type="linenumber">32</context> </context-group> </trans-unit> - <trans-unit id="6233529027580744166" datatype="html"> - <source>Only process attachments.</source> + <trans-unit id="6886003843406464884" datatype="html"> + <source>Only process attachments</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">24</context> + <context context-type="linenumber">25</context> + </context-group> + <context-group purpose="location"> + <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> + <context context-type="linenumber">36</context> </context-group> </trans-unit> - <trans-unit id="3622418743488695840" datatype="html"> - <source>Process with embedded attachments as .eml</source> + <trans-unit id="936923743212522897" datatype="html"> + <source>Process all files, including &apos;inline&apos; attachments</source> <context-group purpose="location"> 
<context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">28</context> + <context context-type="linenumber">29</context> </context-group> </trans-unit> - <trans-unit id="7205371824972320534" datatype="html"> - <source>Process as .eml and attachments as separate documents</source> + <trans-unit id="9025522236384167767" datatype="html"> + <source>Process message as .eml</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">32</context> + <context context-type="linenumber">40</context> + </context-group> + </trans-unit> + <trans-unit id="7411485377918318115" datatype="html"> + <source>Process message as .eml and attachments separately</source> + <context-group purpose="location"> + <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> + <context context-type="linenumber">44</context> </context-group> </trans-unit> <trans-unit id="7022070615528435141" datatype="html"> <source>Delete</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">39</context> + <context context-type="linenumber">51</context> </context-group> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/document-detail/document-detail.component.html</context> @@ -1391,84 +1409,84 @@ <source>Move to specified folder</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context 
context-type="linenumber">43</context> + <context context-type="linenumber">55</context> </context-group> </trans-unit> <trans-unit id="4593278936733161020" datatype="html"> <source>Mark as read, don&apos;t process read mails</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">47</context> + <context context-type="linenumber">59</context> </context-group> </trans-unit> <trans-unit id="2378921144019636516" datatype="html"> <source>Flag the mail, don&apos;t process flagged mails</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">51</context> + <context context-type="linenumber">63</context> </context-group> </trans-unit> <trans-unit id="6457024618858980302" datatype="html"> <source>Tag the mail with specified tag, don&apos;t process tagged mails</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">55</context> + <context context-type="linenumber">67</context> </context-group> </trans-unit> <trans-unit id="4673329664686432878" datatype="html"> <source>Use subject as title</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">62</context> + <context context-type="linenumber">74</context> </context-group> </trans-unit> <trans-unit id="8645471396972938185" datatype="html"> <source>Use attachment filename as title</source> <context-group purpose="location"> <context 
context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">66</context> + <context context-type="linenumber">78</context> </context-group> </trans-unit> <trans-unit id="1568902914205618549" datatype="html"> <source>Do not assign a correspondent</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">73</context> + <context context-type="linenumber">85</context> </context-group> </trans-unit> <trans-unit id="3567746385454588269" datatype="html"> <source>Use mail address</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">77</context> + <context context-type="linenumber">89</context> </context-group> </trans-unit> <trans-unit id="445154175758965852" datatype="html"> <source>Use name (or mail address if not available)</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">81</context> + <context context-type="linenumber">93</context> </context-group> </trans-unit> <trans-unit id="1258862217749148424" datatype="html"> <source>Use correspondent selected below</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">85</context> + <context context-type="linenumber">97</context> </context-group> </trans-unit> <trans-unit id="3147349817770432927" datatype="html"> <source>Create new mail rule</source> <context-group 
purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">125</context> + <context context-type="linenumber">137</context> </context-group> </trans-unit> <trans-unit id="3374331029704382439" datatype="html"> <source>Edit mail rule</source> <context-group purpose="location"> <context context-type="sourcefile">src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts</context> - <context context-type="linenumber">129</context> + <context context-type="linenumber">141</context> </context-group> </trans-unit> <trans-unit id="6036319582202941456" datatype="html"> diff --git a/src-ui/src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html b/src-ui/src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html index 4af044407c2..64d54a72cdc 100644 --- a/src-ui/src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html +++ b/src-ui/src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.html @@ -11,7 +11,8 @@ <h4 class="modal-title" id="modal-basic-title">{{getTitle()}}</h4> <app-input-select i18n-title title="Account" [items]="accounts" formControlName="account"></app-input-select> <app-input-text i18n-title title="Folder" formControlName="folder" i18n-hint hint="Subfolders must be separated by a delimiter, often a dot ('.') or slash ('/'), but it varies by mail server." 
[error]="error?.folder"></app-input-text> <app-input-number i18n-title title="Maximum age (days)" formControlName="maximum_age" [showAdd]="false" [error]="error?.maximum_age"></app-input-number> - <app-input-select i18n-title title="Attachment type" [items]="attachmentTypeOptions" formControlName="attachment_type" i18n-hint hint="See docs for .eml processing requirements"></app-input-select> + <app-input-select i18n-title title="Attachment type" [items]="attachmentTypeOptions" formControlName="attachment_type"></app-input-select> + <app-input-select i18n-title title="Consumption scope" [items]="consumptionScopeOptions" formControlName="consumption_scope" i18n-hint hint="See docs for .eml processing requirements"></app-input-select> </div> <div class="col"> <p class="small" i18n>Paperless will only process mails that match <em>all</em> of the filters specified below.</p> diff --git a/src-ui/src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts b/src-ui/src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts index a2486e141d1..63699fd66a3 100644 --- a/src-ui/src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts +++ b/src-ui/src/app/components/common/edit-dialog/mail-rule-edit-dialog/mail-rule-edit-dialog.component.ts @@ -12,6 +12,7 @@ import { MailMetadataCorrespondentOption, MailMetadataTitleOption, PaperlessMailRule, + MailRuleConsumptionScope, } from 'src/app/data/paperless-mail-rule' import { CorrespondentService } from 'src/app/services/rest/correspondent.service' import { DocumentTypeService } from 'src/app/services/rest/document-type.service' @@ -21,15 +22,26 @@ import { MailRuleService } from 'src/app/services/rest/mail-rule.service' const ATTACHMENT_TYPE_OPTIONS = [ { id: MailFilterAttachmentType.Attachments, - name: $localize`Only process attachments.`, + name: $localize`Only process attachments`, }, { - id: 
MailFilterAttachmentType.Email_Only, - name: $localize`Process with embedded attachments as .eml`, + id: MailFilterAttachmentType.Everything, + name: $localize`Process all files, including 'inline' attachments`, }, +] + +const CONSUMPTION_SCOPE_OPTIONS = [ { - id: MailFilterAttachmentType.Everything, - name: $localize`Process as .eml and attachments as separate documents`, + id: MailRuleConsumptionScope.Attachments, + name: $localize`Only process attachments`, + }, + { + id: MailRuleConsumptionScope.Email_Only, + name: $localize`Process message as .eml`, + }, + { + id: MailRuleConsumptionScope.Everything, + name: $localize`Process message as .eml and attachments separately`, }, ] @@ -140,6 +152,7 @@ export class MailRuleEditDialogComponent extends EditDialogComponent<PaperlessMa filter_attachment_filename: new FormControl(null), maximum_age: new FormControl(null), attachment_type: new FormControl(MailFilterAttachmentType.Attachments), + consumption_scope: new FormControl(MailRuleConsumptionScope.Attachments), action: new FormControl(MailAction.MarkRead), action_parameter: new FormControl(null), assign_title_from: new FormControl(MailMetadataTitleOption.FromSubject), @@ -181,4 +194,8 @@ export class MailRuleEditDialogComponent extends EditDialogComponent<PaperlessMa get metadataCorrespondentOptions() { return METADATA_CORRESPONDENT_OPTIONS } + + get consumptionScopeOptions() { + return CONSUMPTION_SCOPE_OPTIONS + } } diff --git a/src-ui/src/app/data/paperless-mail-rule.ts b/src-ui/src/app/data/paperless-mail-rule.ts index 1c9f1be7b18..9f526d4042d 100644 --- a/src-ui/src/app/data/paperless-mail-rule.ts +++ b/src-ui/src/app/data/paperless-mail-rule.ts @@ -1,6 +1,11 @@ import { ObjectWithId } from './object-with-id' export enum MailFilterAttachmentType { + Attachments = 1, + Everything = 2, +} + +export enum MailRuleConsumptionScope { Attachments = 1, Email_Only = 2, Everything = 3, diff --git a/src/paperless_mail/serialisers.py b/src/paperless_mail/serialisers.py 
index 5944656a7ef..0d15f617cd6 100644 --- a/src/paperless_mail/serialisers.py +++ b/src/paperless_mail/serialisers.py @@ -86,6 +86,7 @@ class Meta: "assign_document_type", "order", "attachment_type", + "consumption_scope", ] def update(self, instance, validated_data):
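The patch above wires the backend's three consumption scopes (added in #848) into the frontend form, with values matching the new `MailRuleConsumptionScope` enum (1, 2, 3). As an illustration only — plain Python, not the actual Django model or Angular enum — the three-way choice amounts to:

```python
from enum import IntEnum

class ConsumptionScope(IntEnum):
    """Mirrors the three scopes exposed in the mail-rule form: consume only
    the attachments, only the message as .eml, or both separately."""
    ATTACHMENTS_ONLY = 1
    EML_ONLY = 2
    EVERYTHING = 3

def documents_to_consume(scope):
    """Return which artifacts a rule with the given scope would consume."""
    wants_eml = scope in (ConsumptionScope.EML_ONLY, ConsumptionScope.EVERYTHING)
    wants_attachments = scope in (ConsumptionScope.ATTACHMENTS_ONLY,
                                  ConsumptionScope.EVERYTHING)
    return {"eml": wants_eml, "attachments": wants_attachments}

print(documents_to_consume(ConsumptionScope.EVERYTHING))
# {'eml': True, 'attachments': True}
```

The bug report boils down to the frontend only knowing the first two of these three options, so rules using the third could not be saved.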
nipy__nipype-2852
nipype/conftest.py should be excluded from API documentation

### Summary

The auto-generated API docs include `conftest.py`, which has a fixture. Pytest has turned calling a fixture directly into an error, and apparently the fixture is getting called when the docs are generated. This is what's currently breaking the Circle builds.
[ { "content": "#!/usr/bin/env python\n# emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: nil -*-\n# vi: set ft=python sts=4 ts=4 sw=4 et:\n\"\"\"Script to auto-generate interface docs.\n\"\"\"\nfrom __future__ import print_function, unicode_literals\n# stdlib imports\nimport os\nimport sys\n\n# **...
[ { "content": "#!/usr/bin/env python\n# emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: nil -*-\n# vi: set ft=python sts=4 ts=4 sw=4 et:\n\"\"\"Script to auto-generate interface docs.\n\"\"\"\nfrom __future__ import print_function, unicode_literals\n# stdlib imports\nimport os\nimport sys\n\n# **...
diff --git a/tools/build_interface_docs.py b/tools/build_interface_docs.py
index 6fa518381e..37b99cb476 100755
--- a/tools/build_interface_docs.py
+++ b/tools/build_interface_docs.py
@@ -41,6 +41,7 @@
     '\.pipeline\.s3_node_wrapper$',
     '\.testing',
     '\.scripts',
+    '\.conftest',
 ]
 docwriter.class_skip_patterns += [
     'AFNICommand',
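The fix works because the doc generator skips any module whose dotted name matches one of the regex skip patterns. A minimal sketch of that filtering step (function and variable names hypothetical — this is not the actual nipype `docwriter`):

```python
import re

# A few of the patterns from the diff above, including the new one:
module_skip_patterns = [r"\.testing", r"\.scripts", r"\.conftest"]

def keep_module(dotted_name, skip_patterns=module_skip_patterns):
    """Return True unless any skip pattern matches the module's dotted name."""
    return not any(re.search(p, dotted_name) for p in skip_patterns)

modules = ["nipype.interfaces.fsl", "nipype.conftest", "nipype.testing.utils"]
print([m for m in modules if keep_module(m)])
# ['nipype.interfaces.fsl']
```

Because the patterns are matched with `re.search`, `\.conftest` excludes the module wherever it appears in the package tree.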
akvo__akvo-rsr-3132
When logged in landing page should be "myRSR"
[ { "content": "# -*- coding: utf-8 -*-\n\n\"\"\"Akvo RSR is covered by the GNU Affero General Public License.\n\nSee more details in the license.txt file located at the root folder of the Akvo RSR module.\nFor additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\"\"\"\n\n...
[ { "content": "# -*- coding: utf-8 -*-\n\n\"\"\"Akvo RSR is covered by the GNU Affero General Public License.\n\nSee more details in the license.txt file located at the root folder of the Akvo RSR module.\nFor additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\"\"\"\n\n...
diff --git a/akvo/rsr/views/__init__.py b/akvo/rsr/views/__init__.py
index 517aa420d7..a1f5dbf4ca 100644
--- a/akvo/rsr/views/__init__.py
+++ b/akvo/rsr/views/__init__.py
@@ -11,5 +11,7 @@


 def index(request):
-    """."""
-    return HttpResponseRedirect(reverse('project-directory', args=[]))
+    """Redirect user to project directory or My RSR."""
+
+    redirect_url = 'project-directory' if request.user.is_anonymous() else 'my_rsr'
+    return HttpResponseRedirect(reverse(redirect_url, args=[]))
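Stripped of the Django plumbing (`request.user`, `reverse`, `HttpResponseRedirect`), the change is a single branch on the authentication state — a sketch, not the actual view:

```python
def landing_route(user_is_anonymous):
    """Anonymous visitors land on the public project directory;
    logged-in users are redirected to their personal My RSR page."""
    return "project-directory" if user_is_anonymous else "my_rsr"

print(landing_route(True), landing_route(False))
# project-directory my_rsr
```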
pulp__pulpcore-4727
pulp file python package reporting wrongly

Starting with pulpcore 3.40, the pulp_file plugin's Python package started reporting itself as pulp_file instead of pulp-file.
[ { "content": "from pulpcore.plugin import PulpPluginAppConfig\n\n\nclass PulpFilePluginAppConfig(PulpPluginAppConfig):\n \"\"\"\n Entry point for pulp_file plugin.\n \"\"\"\n\n name = \"pulp_file.app\"\n label = \"file\"\n version = \"3.41.1.dev\"\n python_package_name = \"pulp_file\" # TO...
[ { "content": "from pulpcore.plugin import PulpPluginAppConfig\n\n\nclass PulpFilePluginAppConfig(PulpPluginAppConfig):\n \"\"\"\n Entry point for pulp_file plugin.\n \"\"\"\n\n name = \"pulp_file.app\"\n label = \"file\"\n version = \"3.41.1.dev\"\n python_package_name = \"pulp-file\" # TO...
diff --git a/CHANGES/4724.bugfix b/CHANGES/4724.bugfix
new file mode 100644
index 0000000000..f5de72e61d
--- /dev/null
+++ b/CHANGES/4724.bugfix
@@ -0,0 +1 @@
+Fixed that `pulp_file` presented its `python_package` as `pulp_file` instead of `pulp-file`.
diff --git a/pulp_file/app/__init__.py b/pulp_file/app/__init__.py
index 7ed000a8e8..d1bb39be6d 100644
--- a/pulp_file/app/__init__.py
+++ b/pulp_file/app/__init__.py
@@ -9,5 +9,5 @@ class PulpFilePluginAppConfig(PulpPluginAppConfig):
     name = "pulp_file.app"
     label = "file"
     version = "3.41.1.dev"
-    python_package_name = "pulp_file"  # TODO Add python_module_name
+    python_package_name = "pulp-file"  # TODO Add python_module_name
     domain_compatible = True
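The underscore/hyphen distinction matters because the import package is `pulp_file` while the PyPI distribution is `pulp-file`, and distribution names are compared after PEP 503 normalization, which collapses runs of `-`, `_`, and `.` into a single `-`. A sketch of that standard rule (PEP 503, not pulpcore code):

```python
import re

def normalize(name):
    """PEP 503 project-name normalization used by package indexes."""
    return re.sub(r"[-_.]+", "-", name).lower()

print(normalize("pulp_file"))   # pulp-file
print(normalize("Pulp.File"))   # pulp-file
```

So `pulp_file` and `pulp-file` refer to the same distribution after normalization, but tooling that displays or matches the raw string expects the canonical hyphenated form.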
cocotb__cocotb-1298
Change setup.py to list the version as 1.x-dev for versions installed from github

As suggested by @themperek, it would be neat if cocotb behaved like this:

```
> pip install git+https://github.com/cocotb/cocotb
> python -c "import cocotb; print(cocotb.__version__)"
1.4.0-dev
```
[ { "content": "# Package versioning solution originally found here:\n# http://stackoverflow.com/q/458550\n\n# Store the version here so:\n# 1) we don't load dependencies by storing it in __init__.py\n# 2) we can import it in setup.py for the same reason\n# 3) we can import it into your module\n__version__ = '1.3...
[ { "content": "# Package versioning solution originally found here:\n# http://stackoverflow.com/q/458550\n\n# Store the version here so:\n# 1) we don't load dependencies by storing it in __init__.py\n# 2) we can import it in setup.py for the same reason\n# 3) we can import it into your module\n__version__ = '1.4...
diff --git a/cocotb/_version.py b/cocotb/_version.py index e88ee234c7..46ea3afc99 100644 --- a/cocotb/_version.py +++ b/cocotb/_version.py @@ -5,4 +5,4 @@ # 1) we don't load dependencies by storing it in __init__.py # 2) we can import it in setup.py for the same reason # 3) we can import it into your module -__version__ = '1.3.0' +__version__ = '1.4.0.dev0'
locustio__locust-1760
Locust stopped working after Flask 2.0 got released

In setup.py I can see:
`"flask>=1.1.2",`
I guess it should be hardcoded to ==1.1.2 for now. It crashes with:
```
File "/root/.local/share/virtualenvs/xxxxxxx/lib/python3.6/site-packages/locust/web.py", line 102, in __init__
    app.jinja_options["extensions"].append("jinja2.ext.do")
KeyError: 'extensions'
```
[ { "content": "# -*- coding: utf-8 -*-\nimport ast\nimport os\nimport re\nimport sys\n\nfrom setuptools import find_packages, setup\n\nROOT_PATH = os.path.abspath(os.path.dirname(__file__))\n\n# parse version from locust/__init__.py\n_version_re = re.compile(r\"__version__\\s+=\\s+(.*)\")\n_init_file = os.path.j...
[ { "content": "# -*- coding: utf-8 -*-\nimport ast\nimport os\nimport re\nimport sys\n\nfrom setuptools import find_packages, setup\n\nROOT_PATH = os.path.abspath(os.path.dirname(__file__))\n\n# parse version from locust/__init__.py\n_version_re = re.compile(r\"__version__\\s+=\\s+(.*)\")\n_init_file = os.path.j...
diff --git a/setup.py b/setup.py index c2596f5e5e..a03f1a3f03 100644 --- a/setup.py +++ b/setup.py @@ -19,7 +19,7 @@ version=version, install_requires=[ "gevent>=20.9.0", - "flask>=1.1.2", + "flask==1.1.2", "Werkzeug>=1.0.1", "requests>=2.9.1", "msgpack>=0.6.2",
quantumlib__Cirq-1160
Broken Hadamard gate decomposition

Steps to reproduce:
```
In [1]: import cirq

In [2]: q = cirq.NamedQubit('q')

In [3]: cirq.Circuit.from_ops(cirq.decompose([cirq.H(q)]))._unitary_()
Out[3]:
array([[ 0.5+0.5j,  0.5+0.5j],
       [ 0.5+0.5j, -0.5-0.5j]])

In [4]: cirq.Circuit.from_ops([cirq.H(q)])._unitary_()
Out[4]:
array([[ 0.70710678+0.j,  0.70710678+0.j],
       [ 0.70710678+0.j, -0.70710678+0.j]])
```
Note that exponentiating the gate to a power different from 1.0 makes this work, suggesting a special casing of a decomposition is the culprit. This affects other gates whose decomposition includes Hadamards (e.g. iSwaps).

There is a unit test that compares the unitary given by the gate and by its decomposition, but the assert it uses makes the comparison merely up to global phase. I think this is incorrect. Consider two qubits q0 and q1 and a circuit that applies U0 to q0 and U1 to q1. Suppose that the decomposition of U0 yields a unitary that is consistent with U0 merely up to global phase. What happens when you replace U0 with its decomposition? Well, this alters the *relative* phase between q0 and q1 producing observable effect.
[ { "content": "# Copyright 2018 The Cirq Developers\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by...
[ { "content": "# Copyright 2018 The Cirq Developers\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by...
diff --git a/cirq/ops/common_gates.py b/cirq/ops/common_gates.py index 0ab85680bdc..0c8bc36a218 100644 --- a/cirq/ops/common_gates.py +++ b/cirq/ops/common_gates.py @@ -518,7 +518,8 @@ def _decompose_(self, qubits): q = qubits[0] if self._exponent == 1: - yield Y(q)**0.5, X(q) + yield cirq.Y(q)**0.5 + yield cirq.XPowGate(global_shift=-0.25).on(q) return yield Y(q)**0.25 diff --git a/cirq/testing/consistent_decomposition.py b/cirq/testing/consistent_decomposition.py index aac0f041788..90dc3b877bc 100644 --- a/cirq/testing/consistent_decomposition.py +++ b/cirq/testing/consistent_decomposition.py @@ -14,8 +14,9 @@ from typing import Any +import numpy as np + from cirq import protocols, ops, line, circuits -from cirq.testing import lin_alg_utils def assert_decompose_is_consistent_with_unitary(val: Any): @@ -36,6 +37,4 @@ def assert_decompose_is_consistent_with_unitary(val: Any): actual = circuits.Circuit.from_ops(dec).to_unitary_matrix( qubit_order=qubits) - lin_alg_utils.assert_allclose_up_to_global_phase(actual, - expected, - atol=1e-8) + assert np.allclose(actual, expected, atol=1e-8)
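The phase argument in this report can be checked numerically without cirq. A minimal numpy sketch (the controlled-gate construction below is an illustration of why the phase matters, not code from the PR): a single-qubit unitary and a phase-shifted copy agree up to global phase, but their controlled versions do not, so the phase becomes observable.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
H_phased = np.exp(1j * np.pi / 4) * H  # entries 0.5+0.5j, as in the bug report

def controlled(u):
    """Embed u as a controlled gate: |0><0| tensor I + |1><1| tensor u."""
    c = np.eye(4, dtype=complex)
    c[2:, 2:] = u
    return c

# Alone, the two matrices differ only by a global phase:
assert np.allclose(H_phased, np.exp(1j * np.pi / 4) * H)

# Controlled, no single phase relates them any more:
CH, CH_phased = controlled(H), controlled(H_phased)
phases = np.exp(1j * np.linspace(0, 2 * np.pi, 720))
assert not any(np.allclose(CH_phased, p * CH) for p in phases)
```

This is consistent with the PR's test change from an up-to-global-phase comparison to a plain `np.allclose`.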
PaddlePaddle__PaddleSpeech-19
Fix some problems in the ctc beam search decoder

- [x] Make the characters' indices in the FST start from one, otherwise wrong decoding results would be produced, especially when space is the first character in the vocabulary;
- [x] Add version check in the setup script;
- [x] Remove unused code.
[ { "content": "\"\"\"Script to build and install decoder package.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom setuptools import setup, Extension, distutils\nimport glob\nimport platform\nimport os, sys\nimport multiprocessing.pool\...
[ { "content": "\"\"\"Script to build and install decoder package.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom setuptools import setup, Extension, distutils\nimport glob\nimport platform\nimport os, sys\nimport multiprocessing.pool\...
diff --git a/decoders/swig/path_trie.cpp b/decoders/swig/path_trie.cpp index 40d90970556..152efa82c64 100644 --- a/decoders/swig/path_trie.cpp +++ b/decoders/swig/path_trie.cpp @@ -52,7 +52,7 @@ PathTrie* PathTrie::get_path_trie(int new_char, bool reset) { } else { if (has_dictionary_) { matcher_->SetState(dictionary_state_); - bool found = matcher_->Find(new_char); + bool found = matcher_->Find(new_char + 1); if (!found) { // Adding this character causes word outside dictionary auto FSTZERO = fst::TropicalWeight::Zero(); diff --git a/decoders/swig/scorer.cpp b/decoders/swig/scorer.cpp index 686c67c77e1..27b61cd033e 100644 --- a/decoders/swig/scorer.cpp +++ b/decoders/swig/scorer.cpp @@ -149,13 +149,15 @@ void Scorer::set_char_map(const std::vector<std::string>& char_list) { char_list_ = char_list; char_map_.clear(); + // Set the char map for the FST for spelling correction for (size_t i = 0; i < char_list_.size(); i++) { if (char_list_[i] == " ") { SPACE_ID_ = i; - char_map_[' '] = i; - } else if (char_list_[i].size() == 1) { - char_map_[char_list_[i][0]] = i; } + // The initial state of FST is state 0, hence the index of chars in + // the FST should start from 1 to avoid the conflict with the initial + // state, otherwise wrong decoding results would be given. + char_map_[char_list_[i]] = i + 1; } } @@ -193,17 +195,11 @@ std::vector<std::string> Scorer::make_ngram(PathTrie* prefix) { void Scorer::fill_dictionary(bool add_space) { fst::StdVectorFst dictionary; - // First reverse char_list so ints can be accessed by chars - std::unordered_map<std::string, int> char_map; - for (size_t i = 0; i < char_list_.size(); i++) { - char_map[char_list_[i]] = i; - } - // For each unigram convert to ints and put in trie int dict_size = 0; for (const auto& word : vocabulary_) { bool added = add_word_to_dictionary( - word, char_map, add_space, SPACE_ID_, &dictionary); + word, char_map_, add_space, SPACE_ID_ + 1, &dictionary); dict_size += added ? 
1 : 0; } diff --git a/decoders/swig/scorer.h b/decoders/swig/scorer.h index 61836463597..5ebc719c701 100644 --- a/decoders/swig/scorer.h +++ b/decoders/swig/scorer.h @@ -104,7 +104,7 @@ class Scorer { int SPACE_ID_; std::vector<std::string> char_list_; - std::unordered_map<char, int> char_map_; + std::unordered_map<std::string, int> char_map_; std::vector<std::string> vocabulary_; }; diff --git a/decoders/swig/setup.py b/decoders/swig/setup.py index b6bc0ca06af..a4bb2e9dadb 100644 --- a/decoders/swig/setup.py +++ b/decoders/swig/setup.py @@ -113,7 +113,7 @@ def compile_test(header, library): setup( name='swig_decoders', - version='1.0', + version='1.1', description="""CTC decoders""", ext_modules=decoders_module, py_modules=['swig_decoders'], ) diff --git a/setup.sh b/setup.sh index 7c40415db32..ec5e47ec8fc 100644 --- a/setup.sh +++ b/setup.sh @@ -27,7 +27,7 @@ if [ $? != 0 ]; then fi # install decoders -python -c "import swig_decoders" +python -c "import pkg_resources; pkg_resources.require(\"swig_decoders==1.1\")" if [ $? != 0 ]; then cd decoders/swig > /dev/null sh setup.sh
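The core of the fix is the off-by-one shift applied when vocabulary indices are used as FST labels. A small Python rendition of the shifted map built in `Scorer::set_char_map` (the vocabulary below is made up for illustration):

```python
# Vocabulary with space first: the case the PR says produced wrong
# decoding results before the fix.
char_list = [" ", "a", "b", "c"]

# Pre-fix mapping: space gets label 0, colliding with the FST's reserved 0.
broken_map = {ch: i for i, ch in enumerate(char_list)}

# Post-fix mapping: labels start from 1, as in `char_map_[char_list_[i]] = i + 1`.
char_map = {ch: i + 1 for i, ch in enumerate(char_list)}

space_id = char_list.index(" ")
assert broken_map[" "] == 0           # the collision
assert char_map[" "] == space_id + 1  # shifted, so label 0 stays free
```

The same shift explains why `fill_dictionary` now passes `SPACE_ID_ + 1` instead of `SPACE_ID_`.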
iterative__dvc-7234
dvc.fs.Path.parts wrong results

**EDIT**: This issue will just be for this first problem of handling a sep at the end of a path. I made the windows-style path problem a separate issue #7233

When a path ends with the path sep, the `parts` function doesn't split. It returns a tuple with a single item:
```python
from dvc.fs.path import Path
Path('/').parts('/a/b/c/')
```
```python
('/a/b/c',)
```
A second problem occurs when using windows style paths. We get the sep between the drive and the rest of the path:
```python
Path('\\').parts('c:\\a')
```
```python
('c:', '\\', 'a')
```
The first problem could be solved by simply stripping the final sep:
```python
drive, path = self.flavour.splitdrive(path.rstrip(self.flavour.sep))
```
but the second problem would still exist. We should really get these results:
```python
Path('/').parts('/a/b/c/')
```
```python
('/', 'a', 'b', 'c')
```
and
```python
Path('\\').parts('c:\\a')
```
```python
('c:', 'a')
```
Note the second case is still a little different from pathlib, which would include the sep with the drive:
```python
from pathlib import PureWindowsPath
PureWindowsPath('c:\\a').parts
```
```python
('c:\\', 'a')
```
but this is probably more in-line with fsspec, which basically treats the drive letter as the first element of a relative path:
```python
fsspec.AbstractFileSystem._parent('c:/a')
```
```python
'c:'
```
version info:
```
DVC version: 2.9.4.dev28+gd90fe54d.d20220106
---------------------------------
Platform: Python 3.10.1 on Linux-5.15.11-arch2-1-x86_64-with-glibc2.33
Supports:
    azure (adlfs = 2021.10.0, knack = 0.9.0, azure-identity = 1.7.1),
    gdrive (pydrive2 = 1.10.0),
    gs (gcsfs = 2021.11.1),
    hdfs (fsspec = 2021.11.1, pyarrow = 6.0.1),
    webhdfs (fsspec = 2021.11.1),
    http (aiohttp = 3.8.1, aiohttp-retry = 2.4.6),
    https (aiohttp = 3.8.1, aiohttp-retry = 2.4.6),
    s3 (s3fs = 2021.11.1, boto3 = 1.19.8),
    ssh (sshfs = 2021.11.2),
    oss (ossfs = 2021.8.0),
    webdav (webdav4 = 0.9.3),
    webdavs (webdav4 = 0.9.3)
Cache types: <https://error.dvc.org/no-dvc-cache>
Caches: local
Remotes: https
Workspace directory: btrfs on /dev/mapper/nvme0n1p3_crypt
Repo: dvc, git
```
[ { "content": "import ntpath\nimport posixpath\n\n\nclass Path:\n def __init__(self, sep):\n if sep == posixpath.sep:\n self.flavour = posixpath\n elif sep == ntpath.sep:\n self.flavour = ntpath\n else:\n raise ValueError(f\"unsupported separator '{sep}'\"...
[ { "content": "import ntpath\nimport posixpath\n\n\nclass Path:\n def __init__(self, sep):\n if sep == posixpath.sep:\n self.flavour = posixpath\n elif sep == ntpath.sep:\n self.flavour = ntpath\n else:\n raise ValueError(f\"unsupported separator '{sep}'\"...
diff --git a/dvc/fs/path.py b/dvc/fs/path.py index b0c02db8c7..7545fb7a88 100644 --- a/dvc/fs/path.py +++ b/dvc/fs/path.py @@ -15,7 +15,7 @@ def join(self, *parts): return self.flavour.join(*parts) def parts(self, path): - drive, path = self.flavour.splitdrive(path) + drive, path = self.flavour.splitdrive(path.rstrip(self.flavour.sep)) ret = [] while True: diff --git a/tests/unit/fs/test_path.py b/tests/unit/fs/test_path.py new file mode 100644 index 0000000000..caf41342c6 --- /dev/null +++ b/tests/unit/fs/test_path.py @@ -0,0 +1,29 @@ +import pytest + +from dvc.fs.path import Path + + +@pytest.mark.parametrize("prefix", ["", "/"]) +@pytest.mark.parametrize("postfix", ["", "/"]) +@pytest.mark.parametrize( + "path,expected", + [ + ("path", ("path",)), + ("some/path", ("some", "path")), + ], +) +def test_parts_posix(prefix, postfix, path, expected): + assert Path("/").parts(prefix + path + postfix) == tuple(prefix) + expected + + +@pytest.mark.parametrize("postfix", ["", "\\"]) +@pytest.mark.parametrize( + "path,expected", + [ + ("path", ("path",)), + ("c:\\path", ("c:", "\\", "path")), + ("some\\path", ("some", "path")), + ], +) +def test_parts_nt(postfix, path, expected): + assert Path("\\").parts(path + postfix) == expected
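The patched `parts` can be exercised outside dvc. A self-contained rendition of the fixed splitting loop using `posixpath` (the function mirrors the patch but is a sketch, not the dvc source verbatim):

```python
import posixpath

def parts(path, flavour=posixpath):
    # As in the fix: strip trailing separators first, so '/a/b/c/'
    # and '/a/b/c' split identically.
    drive, path = flavour.splitdrive(path.rstrip(flavour.sep))
    ret = []
    while True:
        path, part = flavour.split(path)
        if part:
            ret.append(part)
            continue
        if path:
            ret.append(path)
        break
    ret.reverse()
    if drive:
        ret = [drive] + ret
    return tuple(ret)

print(parts("/a/b/c/"))   # ('/', 'a', 'b', 'c')
print(parts("some/path"))  # ('some', 'path')
```

These match the cases added in `tests/unit/fs/test_path.py` for the posix flavour.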
inventree__InvenTree-4843
PanelMixin get_custom_panels not getting called for part list view

### Please verify that this bug has NOT been raised before.

- [X] I checked and didn't find a similar issue

### Describe the bug*

I want to add a custom part import panel, for that I'm trying to use the PanelMixin for my plugin. But I realized the part list view "http://inventree_server/part/" ignores the plugin and doesn't call the get_custom_panels method.

### Steps to Reproduce

create plugin with PanelMixin that always returns a test panel.
open part list, the panel doesn't show

### Expected behaviour

The panel should show

### Deployment Method

- [ ] Docker
- [X] Bare metal

### Version Information

# Version Information:
InvenTree-Version: 0.10.0
Django Version: 3.2.16
Database: sqlite3
Debug-Mode: False
Deployed using Docker: False
Active plugins: [{'name': 'InvenTreeBarcode', 'slug': 'inventreebarcode', 'version': '2.0.0'}, {'name': 'InvenTreeCoreNotificationsPlugin', 'slug': 'inventreecorenotificationsplugin', 'version': '1.0.0'}, {'name': 'EMEImport', 'slug': 'emeimport', 'version': '0.0.1'}]

### Relevant log output

_No response_
[ { "content": "\"\"\"Django views for interacting with Part app.\"\"\"\n\nimport os\nfrom decimal import Decimal\n\nfrom django.conf import settings\nfrom django.contrib import messages\nfrom django.core.exceptions import ValidationError\nfrom django.shortcuts import HttpResponseRedirect, get_object_or_404\nfrom...
[ { "content": "\"\"\"Django views for interacting with Part app.\"\"\"\n\nimport os\nfrom decimal import Decimal\n\nfrom django.conf import settings\nfrom django.contrib import messages\nfrom django.core.exceptions import ValidationError\nfrom django.shortcuts import HttpResponseRedirect, get_object_or_404\nfrom...
diff --git a/InvenTree/part/views.py b/InvenTree/part/views.py index 93b088716494..db1cf52df206 100644 --- a/InvenTree/part/views.py +++ b/InvenTree/part/views.py @@ -27,7 +27,7 @@ from .part import MakePartTemplate -class PartIndex(InvenTreeRoleMixin, ListView): +class PartIndex(InvenTreeRoleMixin, InvenTreePluginViewMixin, ListView): """View for displaying list of Part objects.""" model = Part
LibraryOfCongress__concordia-463
Pagination and filtering don't work together

**What behavior did you observe? Please describe the bug**

The filter became unset and went to all images.

**How can we reproduce the bug?**

Steps to reproduce the behavior:
1. Go to an item that has several assets in open and submitted states.
2. Use the filter to only view submitted for review assets.
3. Scroll down and click page 2.

**What is the expected behavior?**

When I click page 2, the filter should be maintained.
[ { "content": "# TODO: use correct copyright header\nimport os\n\nfrom django.contrib import messages\nfrom dotenv import load_dotenv\n\n# Build paths inside the project like this: os.path.join(SITE_ROOT_DIR, ...)\nCONCORDIA_APP_DIR = os.path.abspath(os.path.dirname(__file__))\nSITE_ROOT_DIR = os.path.dirname(CO...
[ { "content": "# TODO: use correct copyright header\nimport os\n\nfrom django.contrib import messages\nfrom dotenv import load_dotenv\n\n# Build paths inside the project like this: os.path.join(SITE_ROOT_DIR, ...)\nCONCORDIA_APP_DIR = os.path.abspath(os.path.dirname(__file__))\nSITE_ROOT_DIR = os.path.dirname(CO...
diff --git a/concordia/settings_template.py b/concordia/settings_template.py index 261edf2e4..58dd93e94 100755 --- a/concordia/settings_template.py +++ b/concordia/settings_template.py @@ -78,6 +78,7 @@ "raven.contrib.django.raven_compat", "maintenance_mode", "bootstrap4", + "bittersweet", "concordia.apps.ConcordiaAppConfig", "exporter", "importer", diff --git a/concordia/templates/standard-pagination.html b/concordia/templates/standard-pagination.html index dfb470e05..830fd92a0 100644 --- a/concordia/templates/standard-pagination.html +++ b/concordia/templates/standard-pagination.html @@ -1,3 +1,5 @@ +{% load bittersweet_querystring %} + {% comment %} This template fragment assumes that you are using Bootstrap's default pagination with a Django ListView CBV or equivalent which has the default is_paginated, @@ -9,7 +11,7 @@ <ul class="pagination mx-auto justify-content-center"> {% if page_obj.has_previous %} <li class="page-item"> - <a class="page-link" href="?page={{ page_obj.previous_page_number }}" aria-title="Previous Page">&larr;</a> + <a class="page-link" href="?{% qs_alter request.GET page=page_obj.previous_page_number %}" aria-title="Previous Page">&larr;</a> </li> {% else %} <li class="page-item disabled" aria-hidden="true"> @@ -19,7 +21,7 @@ {% if page_obj.number > 1 %} <li class="page-item"> - <a class="page-link" href="?page=1">1</a> + <a class="page-link" href="?{% qs_alter request.GET page=1 %}">1</a> </li> {% endif %} @@ -31,7 +33,7 @@ {% with page_obj.previous_page_number|add:-1 as second_previous_page %} {% if second_previous_page > 1 %} <li class="page-item"> - <a class="page-link" href="?page={{ second_previous_page }}">{{ second_previous_page }}</a> + <a class="page-link" href="?{% qs_alter request.GET page=second_previous_page %}">{{ second_previous_page }}</a> </li> {% endif %} {% endwith %} @@ -39,19 +41,19 @@ {% if page_obj.previous_page_number > 1 %} <li class="page-item"> - <a class="page-link" href="?page={{ page_obj.previous_page_number 
}}">{{ page_obj.previous_page_number }}</a> + <a class="page-link" href="?{% qs_alter request.GET page=page_obj.previous_page_number %}">{{ page_obj.previous_page_number }}</a> </li> {% endif %} <li class="page-item active"> - <a class="page-link" href="?page={{ page_obj.number }}"> + <a class="page-link" href="?{% qs_alter request.GET page=page_obj.number %}"> {{ page_obj.number }} </a> </li> {% if page_obj.next_page_number < paginator.num_pages %} <li class="page-item"> - <a class="page-link" href="?page={{ page_obj.next_page_number }}">{{ page_obj.next_page_number }}</a> + <a class="page-link" href="?{% qs_alter request.GET page=page_obj.next_page_number %}">{{ page_obj.next_page_number }}</a> </li> {% endif %} @@ -59,7 +61,7 @@ {% with page_obj.next_page_number|add:1 as second_next_page %} {% if second_next_page < paginator.num_pages %} <li class="page-item"> - <a class="page-link" href="?page={{ second_next_page }}">{{ second_next_page }}</a> + <a class="page-link" href="?{% qs_alter request.GET page=second_next_page %}">{{ second_next_page }}</a> </li> {% endif %} {% endwith %} @@ -71,13 +73,13 @@ {% if page_obj.number < paginator.num_pages %} <li class="page-item"> - <a class="page-link" href="?page={{ paginator.num_pages }}">{{ paginator.num_pages }}</a> + <a class="page-link" href="?{% qs_alter request.GET page=paginator.num_pages %}">{{ paginator.num_pages }}</a> </li> {% endif %} {% if page_obj.has_next %} <li class="page-item"> - <a class="page-link" href="?page={{ page_obj.next_page_number }}">&rarr;</a> + <a class="page-link" href="?{% qs_alter request.GET page=page_obj.next_page_number %}">&rarr;</a> </li> {% else %} <li class="page-item disabled" aria-hidden="true"> diff --git a/concordia/templates/transcriptions/item_detail.html b/concordia/templates/transcriptions/item_detail.html index 505c629d6..8fba3d32c 100644 --- a/concordia/templates/transcriptions/item_detail.html +++ b/concordia/templates/transcriptions/item_detail.html @@ -63,7 +63,7 @@ 
<h1 class="m-3">{{ item.title }}</h1> <div class="col-md-4"> <small>Contributors: {{ contributor_count|intcomma }}</small> </div> - </div> + </div> </div> </div> <div class="card-deck justify-content-center">
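The fix works by rebuilding pagination links from the full current querystring instead of hardcoding `?page=N`. A rough Python stand-in for what a `qs_alter`-style helper does (parameter names below are illustrative, not taken from the Concordia templates):

```python
from urllib.parse import parse_qs, urlencode

def qs_alter(querystring, **overrides):
    # Keep every existing parameter (e.g. a status filter) and replace
    # only the ones being paged, so the filter survives page changes.
    params = parse_qs(querystring, keep_blank_values=True)
    for key, value in overrides.items():
        params[key] = [str(value)]
    return urlencode(params, doseq=True)

print(qs_alter("transcription_status=submitted&page=1", page=2))
# transcription_status=submitted&page=2
```

In the template this is used as `?{% qs_alter request.GET page=... %}`, which is why the filter no longer drops when moving between pages.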
LMFDB__lmfdb-4167
Random link for Dirichlet characters is broken

https://www.lmfdb.org/Character/Dirichlet/random gives an invalid label error (two in fact). Also, three error messages are displayed when you enter the label "banana". Only one should be displayed.
[ { "content": "# -*- coding: utf-8 -*-\n\nfrom __future__ import absolute_import\nfrom lmfdb.app import app\nimport re\nfrom flask import render_template, url_for, request, redirect, abort\nfrom sage.all import gcd, euler_phi\nfrom lmfdb.utils import (\n to_dict, flash_error, SearchArray, YesNoBox, display_kn...
[ { "content": "# -*- coding: utf-8 -*-\n\nfrom __future__ import absolute_import\nfrom lmfdb.app import app\nimport re\nfrom flask import render_template, url_for, request, redirect, abort\nfrom sage.all import gcd, euler_phi\nfrom lmfdb.utils import (\n to_dict, flash_error, SearchArray, YesNoBox, display_kn...
diff --git a/lmfdb/characters/main.py b/lmfdb/characters/main.py index da37849a43..0c4f779909 100644 --- a/lmfdb/characters/main.py +++ b/lmfdb/characters/main.py @@ -189,6 +189,7 @@ def url_for_label(label): shortcuts={ "jump": jump }, url_for_label=url_for_label, learnmore=learn, + random_projection="label", bread=lambda: bread("Search results"), credit=credit, )
scverse__scanpy-997
`datasets.pbmc68k_reduced` isn't contained in the pypi package anymore

This still works in `1.4.4.post1`. It's very likely caused by changes to `setup.py`. I experienced similar problems before and fixed them via `package_data`. But this got removed. It's probably only a problem for the source-based installs.

https://github.com/theislab/scanpy/commit/881f0bef31cdfe0df7333641dc847a60894b5c41#diff-2eeaed663bd0d25b7e608891384b7298

```
>>> import scanpy
>>> scanpy.__version__
<Version('1.4.5.post2')>
>>> scanpy.datasets.pbmc68k_reduced()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/alexwolf/miniconda3/lib/python3.6/site-packages/scanpy/datasets/__init__.py", line 239, in pbmc68k_reduced
    return read(filename)
  File "/Users/alexwolf/miniconda3/lib/python3.6/site-packages/scanpy/readwrite.py", line 114, in read
    **kwargs,
  File "/Users/alexwolf/miniconda3/lib/python3.6/site-packages/scanpy/readwrite.py", line 524, in _read
    return read_h5ad(filename, backed=backed)
  File "/Users/alexwolf/miniconda3/lib/python3.6/site-packages/anndata/readwrite/read.py", line 447, in read_h5ad
    constructor_args = _read_args_from_h5ad(filename=filename, chunk_size=chunk_size)
  File "/Users/alexwolf/miniconda3/lib/python3.6/site-packages/anndata/readwrite/read.py", line 481, in _read_args_from_h5ad
    f = h5py.File(filename, 'r')
  File "/Users/alexwolf/miniconda3/lib/python3.6/site-packages/anndata/h5py/h5sparse.py", line 162, in __init__
    **kwds,
  File "/Users/alexwolf/miniconda3/lib/python3.6/site-packages/h5py/_hl/files.py", line 312, in __init__
    fid = make_fid(name, mode, userblock_size, fapl, swmr=swmr)
  File "/Users/alexwolf/miniconda3/lib/python3.6/site-packages/h5py/_hl/files.py", line 142, in make_fid
    fid = h5f.open(name, flags, fapl=fapl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5f.pyx", line 78, in h5py.h5f.open
```
[ { "content": "import sys\n\nif sys.version_info < (3, 6):\n sys.exit('scanpy requires Python >= 3.6')\nfrom pathlib import Path\n\nfrom setuptools import setup, find_packages\n\n\ntry:\n from scanpy import __author__, __email__\nexcept ImportError: # Deps not yet installed\n __author__ = __email__ = '...
[ { "content": "import sys\n\nif sys.version_info < (3, 6):\n sys.exit('scanpy requires Python >= 3.6')\nfrom pathlib import Path\n\nfrom setuptools import setup, find_packages\n\n\ntry:\n from scanpy import __author__, __email__\nexcept ImportError: # Deps not yet installed\n __author__ = __email__ = '...
diff --git a/setup.py b/setup.py index 9e6cdfa2fb..2dcff3cdec 100644 --- a/setup.py +++ b/setup.py @@ -50,6 +50,7 @@ ], ), packages=find_packages(), + include_package_data=True, entry_points=dict(console_scripts=['scanpy=scanpy.cli:console_main']), zip_safe=False, classifiers=[
cognitedata__cognite-sdk-python-291
client.time_series.get_time_series does not return metadata

**Describe the bug**

When executing `client.time_series.get_time_series()` with `include_metadata = True` no metadata is returned.

**To Reproduce**

Runnable code reproducing the error.
```
import cognite
import requests
import os
import numpy as np
import pandas as pd
from datetime import datetime, timedelta
from cognite.client.stable.time_series import TimeSeries

sm_api = os.environ['SM_API_KEY']
client = cognite.CogniteClient(api_key = sm_api)

ts_name = 'Test_tssssss'
my_time_series = [TimeSeries(name=ts_name, description = 'test_description', metadata = { 'ASSETSCOPENAME' : 'meta_test_1' })]
client.time_series.post_time_series(my_time_series)

# create dummy data
np.random.seed(1338)
start_time = int((datetime.now()-timedelta(1)).strftime("%s"))
timestamps = [(start_time + i * 10)*1000 for i in np.arange(11)]
df = pd.DataFrame({'timestamp' : timestamps})
df[ts_name] = np.random.random(df.shape[0])
client.datapoints.post_datapoints_frame(df)

# get time_series
ts1 = client.time_series.get_time_series(name = ts_name, include_metadata = True).to_pandas()
ts1_id = ts1['id'].loc[0]
print(ts1.loc[0]) # no meta data

# requests:
# first with no metadata
r1 = requests.get(url = 'https://api.cognitedata.com/api/0.5/projects/smart-maintenance-sandbox/timeseries/' + str(ts1_id) , headers= { 'Api-Key' : sm_api} , params = {"includeMetadata" : False})
print(r1.text.split('\n'))
# then with metadata
r1 = requests.get(url = 'https://api.cognitedata.com/api/0.5/projects/smart-maintenance-sandbox/timeseries/' + str(ts1_id) , headers= { 'Api-Key' : sm_api} , params = {"includeMetadata" : True})
print(r1.text.split('\n'))
```

**Expected behavior**

The `client.time_series.get_time_series(name = ts_name, include_metadata = True)` should return the metadata.
[ { "content": "# -*- coding: utf-8 -*-\nfrom copy import deepcopy\nfrom typing import List\nfrom urllib.parse import quote\n\nimport pandas as pd\n\nfrom cognite.client._api_client import APIClient, CogniteCollectionResponse, CogniteResource, CogniteResponse\n\n\nclass TimeSeriesResponse(CogniteResponse):\n \...
[ { "content": "# -*- coding: utf-8 -*-\nfrom copy import deepcopy\nfrom typing import List\nfrom urllib.parse import quote\n\nimport pandas as pd\n\nfrom cognite.client._api_client import APIClient, CogniteCollectionResponse, CogniteResource, CogniteResponse\n\n\nclass TimeSeriesResponse(CogniteResponse):\n \...
diff --git a/cognite/client/stable/time_series.py b/cognite/client/stable/time_series.py index 26708c4fcb..a002097efc 100644 --- a/cognite/client/stable/time_series.py +++ b/cognite/client/stable/time_series.py @@ -45,7 +45,7 @@ class TimeSeriesListResponse(CogniteCollectionResponse): _RESPONSE_CLASS = TimeSeriesResponse - def to_pandas(self, include_metadata: bool = False): + def to_pandas(self, include_metadata: bool = True): """Returns data as a pandas dataframe Args:
pymodbus-dev__pymodbus-1197
client.ModbusClientMixin does not have __init__, but ModbusBaseClient tries to call it

During its initialization, class ModbusBaseClient tries to call super().\_\_init\_\_(), even though ModbusClientMixin does not have \_\_init\_\_(). Usually it is not a problem. However, if one later tries to inherit from, for example, ModbusTcpClient and from another class which has \_\_init\_\_(), that class's \_\_init\_\_() is called twice, with unexpected consequences:

```python
from pymodbus.client.tcp import *

class SyncClientMixin:
    def __init__(self, **kwargs):
        print("This is gonna be called twice")

class TcpClientWrapper(ModbusTcpClient, SyncClientMixin):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        SyncClientMixin.__init__(self, **kwargs)

wrap = TcpClientWrapper(host = 'localhost')
```

The resolution is to have an empty \_\_init\_\_ in ModbusClientMixin.
[ { "content": "\"\"\"Modbus Client Common.\"\"\"\nimport logging\nfrom typing import List, Union\n\nimport pymodbus.bit_read_message as pdu_bit_read\nimport pymodbus.bit_write_message as pdu_bit_write\nimport pymodbus.diag_message as pdu_diag\nimport pymodbus.other_message as pdu_other_msg\nimport pymodbus.regis...
[ { "content": "\"\"\"Modbus Client Common.\"\"\"\nimport logging\nfrom typing import List, Union\n\nimport pymodbus.bit_read_message as pdu_bit_read\nimport pymodbus.bit_write_message as pdu_bit_write\nimport pymodbus.diag_message as pdu_diag\nimport pymodbus.other_message as pdu_other_msg\nimport pymodbus.regis...
diff --git a/pymodbus/client/mixin.py b/pymodbus/client/mixin.py index 72a89456b..5cd99a462 100644 --- a/pymodbus/client/mixin.py +++ b/pymodbus/client/mixin.py @@ -42,6 +42,9 @@ class ModbusClientMixin: # pylint: disable=too-many-public-methods last_frame_end = 0 silent_interval = 0 + def __init__(self): + """Initialize.""" + def execute(self, request: ModbusRequest) -> ModbusResponse: """Execute request.
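The double call is pure Python MRO mechanics and can be reproduced with stand-in classes (the names below only mimic the pymodbus ones): because the mixin defines no `__init__`, the `super().__init__()` chain passes through it and lands on the sibling class, which the wrapper then calls a second time. Giving the mixin an empty `__init__`, as the fix does, stops the chain there.

```python
calls = []

class Mixin:                 # stand-in for ModbusClientMixin: no __init__
    pass

class Base(Mixin):           # stand-in for ModbusBaseClient
    def __init__(self):
        # Mixin has no __init__, so this resolves further along the MRO.
        super().__init__()

class Other:                 # stand-in for SyncClientMixin
    def __init__(self):
        calls.append("Other")

class Wrapper(Base, Other):  # stand-in for TcpClientWrapper
    def __init__(self):
        super().__init__()
        Other.__init__(self)  # explicit call, as in the bug report

Wrapper()
assert calls == ["Other", "Other"]  # ran twice

# With the fix (an empty __init__ on the mixin) the chain stops there:
calls.clear()

class FixedMixin:
    def __init__(self):
        """Initialize."""

class FixedBase(FixedMixin):
    def __init__(self):
        super().__init__()

class FixedWrapper(FixedBase, Other):
    def __init__(self):
        super().__init__()
        Other.__init__(self)

FixedWrapper()
assert calls == ["Other"]  # only the explicit call remains
```

This is exactly the behavior the empty `ModbusClientMixin.__init__` in the patch restores.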
facebookresearch__hydra-1531
Add `env` to Hydra's config group

This is a follow-up to #1441. The `env` config group will allow users to manually change the env defaults value (such as providing default callbacks or updating run.dir).
[ { "content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\nfrom dataclasses import dataclass, field\nfrom typing import Any, Dict, List, Optional\n\nfrom omegaconf import MISSING\n\nfrom hydra.core.config_store import ConfigStore\n\n\n@dataclass\nclass HelpConf:\n app_name: str = M...
[ { "content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\nfrom dataclasses import dataclass, field\nfrom typing import Any, Dict, List, Optional\n\nfrom omegaconf import MISSING\n\nfrom hydra.core.config_store import ConfigStore\n\n\n@dataclass\nclass HelpConf:\n app_name: str = M...
diff --git a/hydra/conf/__init__.py b/hydra/conf/__init__.py index dab5348094f..efd536bf88e 100644 --- a/hydra/conf/__init__.py +++ b/hydra/conf/__init__.py @@ -99,6 +99,8 @@ class HydraConf: {"hydra_logging": "default"}, {"job_logging": "default"}, {"callbacks": None}, + # env specific overrides + {"env": "default"}, ] ) diff --git a/hydra/conf/hydra/env/default.yaml b/hydra/conf/hydra/env/default.yaml new file mode 100644 index 00000000000..e69de29bb2d diff --git a/tests/test_apps/custom_env_defaults/hydra_plugins/env_defaults/__init__.py b/tests/test_apps/custom_env_defaults/hydra_plugins/env_defaults/__init__.py new file mode 100644 index 00000000000..168f9979a46 --- /dev/null +++ b/tests/test_apps/custom_env_defaults/hydra_plugins/env_defaults/__init__.py @@ -0,0 +1 @@ +# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved diff --git a/tests/test_apps/custom_env_defaults/hydra_plugins/env_defaults/conf/__init__.py b/tests/test_apps/custom_env_defaults/hydra_plugins/env_defaults/conf/__init__.py new file mode 100644 index 00000000000..168f9979a46 --- /dev/null +++ b/tests/test_apps/custom_env_defaults/hydra_plugins/env_defaults/conf/__init__.py @@ -0,0 +1 @@ +# Copyright (c) Facebook, Inc. and its affiliates. 
All Rights Reserved diff --git a/tests/test_apps/custom_env_defaults/hydra_plugins/env_defaults/conf/hydra/env/default.yaml b/tests/test_apps/custom_env_defaults/hydra_plugins/env_defaults/conf/hydra/env/default.yaml new file mode 100644 index 00000000000..d39c37b6290 --- /dev/null +++ b/tests/test_apps/custom_env_defaults/hydra_plugins/env_defaults/conf/hydra/env/default.yaml @@ -0,0 +1,6 @@ +# @package _global_ + +hydra: + job: + env_set: + FOO: bar diff --git a/tests/test_apps/custom_env_defaults/hydra_plugins/env_defaults/env_defaults.py b/tests/test_apps/custom_env_defaults/hydra_plugins/env_defaults/env_defaults.py new file mode 100644 index 00000000000..5338cf21df5 --- /dev/null +++ b/tests/test_apps/custom_env_defaults/hydra_plugins/env_defaults/env_defaults.py @@ -0,0 +1,13 @@ +# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved +from hydra.core.config_search_path import ConfigSearchPath +from hydra.plugins.search_path_plugin import SearchPathPlugin + + +class TestEnvDefaultSearchPathPlugin(SearchPathPlugin): + def manipulate_search_path(self, search_path: ConfigSearchPath) -> None: + # prepend search path to override env default + search_path.prepend( + provider="test-env-defaults", + path="pkg://hydra_plugins.env_defaults.conf", + anchor="hydra", + ) diff --git a/tests/test_apps/custom_env_defaults/my_app.py b/tests/test_apps/custom_env_defaults/my_app.py new file mode 100644 index 00000000000..57994e76e11 --- /dev/null +++ b/tests/test_apps/custom_env_defaults/my_app.py @@ -0,0 +1,18 @@ +# Copyright (c) Facebook, Inc. and its affiliates. 
All Rights Reserved +import logging +import os + +from omegaconf import DictConfig + +import hydra + +log = logging.getLogger(__name__) + + +@hydra.main() +def my_app(_: DictConfig) -> None: + assert os.getenv("FOO") == "bar" + + +if __name__ == "__main__": + my_app() diff --git a/tests/test_completion.py b/tests/test_completion.py index 65eca2452ec..fa1613893fe 100644 --- a/tests/test_completion.py +++ b/tests/test_completion.py @@ -120,6 +120,7 @@ def test_bash_completion_with_dot_in_path() -> None: "hydra/", 3, [ + "hydra/env=", "hydra/help=", "hydra/hydra_help=", "hydra/hydra_logging=", @@ -165,11 +166,13 @@ def test_completion_plugin( config_loader = create_config_loader() bc = DefaultCompletionPlugin(config_loader) ret = bc._query(config_name="config.yaml", line=line_prefix + line) + assert ret == expected ret = bc._query( config_name="config.yaml", line="--multirun " + line_prefix + line ) + assert ret == expected @mark.skipif( diff --git a/tests/test_config_loader.py b/tests/test_config_loader.py index 6b014569e06..d624db55ec9 100644 --- a/tests/test_config_loader.py +++ b/tests/test_config_loader.py @@ -402,6 +402,7 @@ def test_list_groups() -> None: ] assert sorted(config_loader.list_groups("hydra")) == [ + "env", "help", "hydra_help", "hydra_logging", @@ -757,6 +758,7 @@ def test_overriding_with_dict(config: str, overrides: Any, expected: Any) -> Non [], { "optimizer": "nesterov", + "hydra/env": "default", "hydra/callbacks": None, "hydra/hydra_help": "default", "hydra/help": "default", @@ -773,6 +775,7 @@ def test_overriding_with_dict(config: str, overrides: Any, expected: Any) -> Non ["optimizer=adam"], { "optimizer": "adam", + "hydra/env": "default", "hydra/callbacks": None, "hydra/hydra_help": "default", "hydra/help": "default", diff --git a/tests/test_env_defaults.py b/tests/test_env_defaults.py new file mode 100644 index 00000000000..5d1d20d9ade --- /dev/null +++ b/tests/test_env_defaults.py @@ -0,0 +1,15 @@ +# Copyright (c) Facebook, Inc. 
and its affiliates. All Rights Reserved +from pathlib import Path + +from hydra.test_utils.test_utils import chdir_hydra_root, run_python_script + +chdir_hydra_root() + + +def test_env_defaults(tmpdir: Path) -> None: + + cmd = [ + "tests/test_apps/custom_env_defaults/my_app.py", + "hydra.run.dir=" + str(tmpdir), + ] + run_python_script(cmd)
cal-itp__benefits-213
Send X-XSS-Protection header The X-XSS-Protection header can be used to manage certain browsers' protection against reflected cross-site scripting (XSS), stopping a page from being loaded if an attack is detected. In modern browsers, the Content-Security-Policy header can provide better protection against XSS and setting X-XSS-Protection might be redundant (#203 tracks CSP implementation). See more at https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/X-XSS-Protection We'll want the following header/value: ``` X-XSS-Protection: 1; mode=block ``` This can be done in a new Middleware and configured in [`settings.py`](https://github.com/cal-itp/benefits/blob/dev/benefits/settings.py#L45) for all requests/responses.
[ { "content": "\"\"\"\nDjango settings for benefits project.\n\"\"\"\nimport os\n\n# Build paths inside the project like this: os.path.join(BASE_DIR, ...)\nBASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))\n\n# SECURITY WARNING: keep the secret key used in production secret!\nSECRET_KEY = os...
[ { "content": "\"\"\"\nDjango settings for benefits project.\n\"\"\"\nimport os\n\n# Build paths inside the project like this: os.path.join(BASE_DIR, ...)\nBASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))\n\n# SECURITY WARNING: keep the secret key used in production secret!\nSECRET_KEY = os...
diff --git a/benefits/settings.py b/benefits/settings.py index 722aeafcfd..830d0bb5b3 100644 --- a/benefits/settings.py +++ b/benefits/settings.py @@ -75,6 +75,8 @@ CSRF_FAILURE_VIEW = "benefits.core.views.csrf_failure" SESSION_COOKIE_SECURE = True +SECURE_BROWSER_XSS_FILTER = True + ROOT_URLCONF = "benefits.urls" template_ctx_processors = [
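The issue above suggests a custom middleware, while the merged diff instead enables Django's built-in `SECURE_BROWSER_XSS_FILTER` setting. For context, a minimal standalone sketch of the middleware approach the issue describes — the class name is illustrative, and a plain mapping stands in for Django's dict-style response object:

```python
class XSSProtectionMiddleware:
    """Illustrative sketch: attach X-XSS-Protection to every response."""

    def __init__(self, get_response):
        # Django-style middleware: wrap the next handler in the chain.
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        # Django responses support dict-style header assignment, so any
        # mapping behaves the same way in this sketch.
        response["X-XSS-Protection"] = "1; mode=block"
        return response
```

Django's `SecurityMiddleware` with `SECURE_BROWSER_XSS_FILTER = True` produces the same header, which is why the diff takes that route instead of a hand-rolled class.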
Netflix__lemur-455
A custom cert name with spaces causes AWS Upload failures Creating a cert with a custom name that has spaces, such as: `My Certificate` will not properly get uploaded to AWS. -- Potential Fixes: 1. Prevent spaces in custom names 2. Allow custom cert names to be editable 3. If spaces are allowed, the AWS uploader plugin needs to upload it in a way that can work properly.
[ { "content": "\"\"\"\n.. module: lemur.certificates.models\n :platform: Unix\n :copyright: (c) 2015 by Netflix Inc., see AUTHORS for more\n :license: Apache, see LICENSE for more details.\n.. moduleauthor:: Kevin Glisson <kglisson@netflix.com>\n\"\"\"\nimport datetime\n\nimport lemur.common.utils\nfrom...
[ { "content": "\"\"\"\n.. module: lemur.certificates.models\n :platform: Unix\n :copyright: (c) 2015 by Netflix Inc., see AUTHORS for more\n :license: Apache, see LICENSE for more details.\n.. moduleauthor:: Kevin Glisson <kglisson@netflix.com>\n\"\"\"\nimport datetime\n\nimport lemur.common.utils\nfrom...
diff --git a/lemur/certificates/models.py b/lemur/certificates/models.py index f5a7d9caa5..30acbbb0a5 100644 --- a/lemur/certificates/models.py +++ b/lemur/certificates/models.py @@ -27,6 +27,7 @@ def get_or_increase_name(name): + name = '-'.join(name.strip().split(' ')) count = Certificate.query.filter(Certificate.name.ilike('{0}%'.format(name))).count() if count >= 1:
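The one-line fix in the diff normalizes the name before the AWS uploader sees it: strip surrounding whitespace, then join the space-separated words with dashes. A minimal sketch of that transformation (the function name is illustrative; in the diff the expression runs inline inside `get_or_increase_name`):

```python
def dasherize_name(name):
    # Mirrors the fix above: strip surrounding whitespace, then join the
    # space-separated words with dashes so AWS never receives a space.
    return '-'.join(name.strip().split(' '))
```

So a custom name like `"My Certificate"` becomes `"My-Certificate"` before upload, addressing option 3 from the issue's list of potential fixes.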
nltk__nltk-1274
Tox fails with "ERROR: Failure: ImportError (No module named 'six')" When I try to run the tests with Tox (on Ubuntu) from within a local clone of the repo, it manages to install the dependencies but blows up when trying to import things from within NLTK. I imagine I can work around this by figuring out how to manually run just the tests I care about, but it's inconvenient. I'm not sure whether I'm doing something dumb or whether the Tox setup is broken; if the former, the CONTRIBUTING docs should probably mention what needs to be done besides just running Tox; if the latter, it should probably be fixed. Here's the full output (had to pastebin it due to GitHub's post length limit): http://pastebin.com/ENuCLnv6
[ { "content": "# Natural Language Toolkit: Tokenizer Interface\n#\n# Copyright (C) 2001-2015 NLTK Project\n# Author: Edward Loper <edloper@gmail.com>\n# Steven Bird <stevenbird1@gmail.com>\n# URL: <http://nltk.org/>\n# For license information, see LICENSE.TXT\n\n\"\"\"\nTokenizer Interface\n\"\"\"\n\nfro...
[ { "content": "# Natural Language Toolkit: Tokenizer Interface\n#\n# Copyright (C) 2001-2015 NLTK Project\n# Author: Edward Loper <edloper@gmail.com>\n# Steven Bird <stevenbird1@gmail.com>\n# URL: <http://nltk.org/>\n# For license information, see LICENSE.TXT\n\n\"\"\"\nTokenizer Interface\n\"\"\"\n\nfro...
diff --git a/nltk/test/probability.doctest b/nltk/test/probability.doctest index b19c9d689a..594d18b00a 100644 --- a/nltk/test/probability.doctest +++ b/nltk/test/probability.doctest @@ -69,33 +69,33 @@ ConditionalFreqDist ------------------- >>> cfd1 = ConditionalFreqDist() - >>> cfd1[1] = FreqDist('abbbc') + >>> cfd1[1] = FreqDist('abbbb') >>> cfd1[2] = FreqDist('xxxxyy') >>> cfd1 <ConditionalFreqDist with 2 conditions> >>> cfd2 = ConditionalFreqDist() - >>> cfd2[1] = FreqDist('bccd') - >>> cfd2[2] = FreqDist('xxyyyzz') + >>> cfd2[1] = FreqDist('bbccc') + >>> cfd2[2] = FreqDist('xxxyyyzz') >>> cfd2[3] = FreqDist('m') >>> cfd2 <ConditionalFreqDist with 3 conditions> >>> r = cfd1 + cfd2 >>> [(i,r[i]) for i in r.conditions()] - [(1, FreqDist({'b': 4, 'c': 3, 'a': 1, 'd': 1})), (2, FreqDist({'x': 6, 'y': 5, 'z': 2})), (3, FreqDist({'m': 1}))] + [(1, FreqDist({'b': 6, 'c': 3, 'a': 1})), (2, FreqDist({'x': 7, 'y': 5, 'z': 2})), (3, FreqDist({'m': 1}))] >>> r = cfd1 - cfd2 >>> [(i,r[i]) for i in r.conditions()] - [(1, FreqDist({'b': 2, 'a': 1})), (2, FreqDist({'x': 2}))] + [(1, FreqDist({'b': 2, 'a': 1})), (2, FreqDist({'x': 1}))] >>> r = cfd1 | cfd2 >>> [(i,r[i]) for i in r.conditions()] - [(1, FreqDist({'b': 3, 'c': 2, 'a': 1, 'd': 1})), (2, FreqDist({'x': 4, 'y': 3, 'z': 2})), (3, FreqDist({'m': 1}))] + [(1, FreqDist({'b': 4, 'c': 3, 'a': 1})), (2, FreqDist({'x': 4, 'y': 3, 'z': 2})), (3, FreqDist({'m': 1}))] >>> r = cfd1 & cfd2 >>> [(i,r[i]) for i in r.conditions()] - [(1, FreqDist({'c': 1, 'b': 1})), (2, FreqDist({'y': 2, 'x': 2}))] + [(1, FreqDist({'b': 2})), (2, FreqDist({'x': 3, 'y': 2}))] Testing some HMM estimators --------------------------- diff --git a/nltk/tokenize/api.py b/nltk/tokenize/api.py index 98c168b43b..174d23f394 100644 --- a/nltk/tokenize/api.py +++ b/nltk/tokenize/api.py @@ -11,7 +11,7 @@ """ from abc import ABCMeta, abstractmethod -from six import add_metaclass +from nltk.six import add_metaclass from nltk.internals import overridden from 
nltk.tokenize.util import string_span_tokenize
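The fix swaps the external `six` import for NLTK's vendored copy, so the tokenizer API no longer depends on `six` being installed in the Tox environment. For context on what is being vendored, a simplified sketch of how `add_metaclass` works — this is not NLTK's vendored code, just the core idea of rebuilding the decorated class under the given metaclass:

```python
def add_metaclass(metaclass):
    # Rebuild the decorated class with `metaclass`, dropping the slots
    # that type() recreates on its own.
    def wrapper(cls):
        body = {k: v for k, v in vars(cls).items()
                if k not in ('__dict__', '__weakref__')}
        return metaclass(cls.__name__, cls.__bases__, body)
    return wrapper
```

`nltk/tokenize/api.py` uses this decorator with `ABCMeta` so that `TokenizerI` behaves as an abstract base class on both Python 2 and 3.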
python-telegram-bot__python-telegram-bot-699
Bot tries to assign user id to anonymous channel post <!-- Thanks for reporting issues of python-telegram-bot! To make it easier for us to help you please enter detailed information below. Please note, we only support the latest version of python-telegram-bot and master branch. Please make sure to upgrade & recreate the issue on the latest version prior to opening an issue. --> ### Steps to reproduce 1. Add a bot to a channel that doesn't have post signing. 2. Post something to that channel. 3. Bot tries to assign a user id to the update, bot gets an AttributeError because there is no user. ### Expected behaviour The bot checks if the user is in the update before trying to get the id. ### Actual behaviour An AttributeError is thrown and the bot fails to process the update. ### Configuration **Operating System:** macOS Sierra **Version of Python, python-telegram-bot & dependencies:** ``` python-telegram-bot 6.1.0 urllib3 1.20 certifi 2017.04.17 future 0.16.0 Python 3.6.0 (default, Dec 24 2016, 08:01:42) [GCC 4.2.1 Compatible Apple LLVM 8.0.0 (clang-800.0.42.1)] ``` ### Logs ``` Traceback (most recent call last): File "/Users/jelle/.venv/lib/python3.6/site-packages/telegram/ext/dispatcher.py", line 264, in process_update if handler.check_update(update): File "/Users/jelle/.venv/lib/python3.6/site-packages/telegram/ext/conversationhandler.py", line 181, in check_update key = self._get_key(update) File "/Users/jelle/.venv/lib/python3.6/site-packages/telegram/ext/conversationhandler.py", line 164, in _get_key key.append(user.id) AttributeError: 'NoneType' object has no attribute 'id' ``` Message.from is not mandatory due to channels Currently it's a required attribute in the Message constructor. We should fix that. CC: @tsnoam
[ { "content": "#!/usr/bin/env python\n#\n# A library that provides a Python interface to the Telegram Bot API\n# Copyright (C) 2015-2017\n# Leandro Toledo de Souza <devs@python-telegram-bot.org>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Lesser Pub...
[ { "content": "#!/usr/bin/env python\n#\n# A library that provides a Python interface to the Telegram Bot API\n# Copyright (C) 2015-2017\n# Leandro Toledo de Souza <devs@python-telegram-bot.org>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Lesser Pub...
diff --git a/AUTHORS.rst b/AUTHORS.rst index 040853d53e9..5a30084174a 100644 --- a/AUTHORS.rst +++ b/AUTHORS.rst @@ -32,6 +32,7 @@ The following wonderful people contributed directly or indirectly to this projec - `Jacob Bom <https://github.com/bomjacob>`_ - `JASON0916 <https://github.com/JASON0916>`_ - `jeffffc <https://github.com/jeffffc>`_ +- `Jelle Besseling <https://github.com/pingiun>`_ - `jh0ker <https://github.com/jh0ker>`_ - `John Yong <https://github.com/whipermr5>`_ - `jossalgon <https://github.com/jossalgon>`_ diff --git a/telegram/ext/conversationhandler.py b/telegram/ext/conversationhandler.py index bfe880047f1..e6f9b6da4c6 100644 --- a/telegram/ext/conversationhandler.py +++ b/telegram/ext/conversationhandler.py @@ -160,7 +160,7 @@ def _get_key(self, update): if self.per_chat: key.append(chat.id) - if self.per_user: + if self.per_user and user is not None: key.append(user.id) if self.per_message: diff --git a/tests/test_conversationhandler.py b/tests/test_conversationhandler.py index f15f28df6a4..231306da78c 100644 --- a/tests/test_conversationhandler.py +++ b/tests/test_conversationhandler.py @@ -330,6 +330,12 @@ def test_perChatMessageWithoutChat(self): update = Update(0, callback_query=cbq) handler.check_update(update) + def test_channelMessageWithoutChat(self): + handler = ConversationHandler(entry_points=[CommandHandler('start', self.start_end)], states={}, fallbacks=[]) + message = Message(0, None, None, Chat(0, Chat.CHANNEL, "Misses Test")) + update = Update(0, message=message) + handler.check_update(update) + if __name__ == '__main__': unittest.main()
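The fix guards the user lookup so anonymous channel posts (which carry no `from` user) no longer raise `AttributeError`. A simplified standalone sketch of the guarded key construction — this is not the actual `ConversationHandler._get_key`, just the shape of the fix:

```python
def conversation_key(chat, user, per_chat=True, per_user=True):
    # Build the conversation key from whatever identities are present;
    # channel posts arrive with user=None, so skip that part of the key.
    key = []
    if per_chat and chat is not None:
        key.append(chat.id)
    if per_user and user is not None:
        key.append(user.id)
    return tuple(key)
```

With the guard in place, a channel post simply contributes no user component instead of crashing update processing.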
OctoPrint__OctoPrint-973
Add more longRunningCommands (Specifically M400) 1. What were you doing? > Running a print that makes liberal use of `M400`s 1. What did you expect to happen? > The print to finish to completion 1. What happened instead? > The print failed with a communication error 1. Branch & Commit or Version of OctoPrint: > Version: 1.3.0-dev-71-g3cb8757 (HEAD branch) 1. Printer model & used firmware incl. version (if applicable - always include if unsure): > Marlin Integration Branch (3c54992c1c76af1c4206fb4b1ae915ad6873f3bb) 1. Browser and Version of Browser, Operating System running Browser (if applicable - always include if unsure): > Chrome on Windows 1. Link to octoprint.log on gist.github.com or pastebin.com (ALWAYS INCLUDE AND DO NOT TRUNCATE): > N/A 1. Link to contents of terminal tab or serial.log on gist.github.com or pastebin.com (if applicable - always include if unsure or reporting communication issues AND DO NOT TRUNCATE): > N/A 1. Link to contents of Javascript console in the browser on gist.github.com or pastebin.com or alternatively a screenshot (if applicable - always include if unsure or reporting UI issues): > N/A 1. Screenshot(s) showing the problem (if applicable - always include if unsure or reporting UI issues): > N/A I have read the FAQ. I use M400 a good amount in my GCode, and that combined with a large move buffer can cause >30s delay between sending and receiving the response. This is fixed by adding M400 to the "Long running commands" list in the settings. I think it should be there by default.
[ { "content": "# coding=utf-8\n\"\"\"\nThis module represents OctoPrint's settings management. Within this module the default settings for the core\napplication are defined and the instance of the :class:`Settings` is held, which offers getter and setter\nmethods for the raw configuration values as well as vario...
[ { "content": "# coding=utf-8\n\"\"\"\nThis module represents OctoPrint's settings management. Within this module the default settings for the core\napplication are defined and the instance of the :class:`Settings` is held, which offers getter and setter\nmethods for the raw configuration values as well as vario...
diff --git a/src/octoprint/settings.py b/src/octoprint/settings.py index f5fd85a235..982e932da3 100644 --- a/src/octoprint/settings.py +++ b/src/octoprint/settings.py @@ -83,7 +83,7 @@ def settings(init=False, basedir=None, configfile=None): "sdStatus": 1 }, "additionalPorts": [], - "longRunningCommands": ["G4", "G28", "G29", "G30", "G32"] + "longRunningCommands": ["G4", "G28", "G29", "G30", "G32", "M400", "M226"] }, "server": { "host": "0.0.0.0",
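The fix itself only extends the default `longRunningCommands` list (adding `M400` and `M226`), so OctoPrint tolerates a long silence after sending those commands. A hypothetical helper showing how such a list might be consulted — OctoPrint's real matching logic lives elsewhere and differs:

```python
# The updated default list from the diff above.
LONG_RUNNING_COMMANDS = ["G4", "G28", "G29", "G30", "G32", "M400", "M226"]

def is_long_running(line):
    # Take the first gcode word (e.g. "M400" from "M400 ; wait for moves")
    # and check it against the configured list, case-insensitively.
    word = line.strip().split(None, 1)[0].upper()
    return word in LONG_RUNNING_COMMANDS
```

Commands matched this way would get a relaxed communication timeout instead of triggering a communication error when the firmware blocks on, say, draining its move buffer.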
ibis-project__ibis-9088
docs: improvements to the home page The Ibis project home page is better than it once was [citation needed], but can use some improvements. In particular, it'd be great if we could have an [interactive demo similar to DuckDB's](https://shell.duckdb.org/#queries=v0,%20%20-Create-table-from-Parquet-file%0ACREATE-TABLE-train_services-AS%0A----FROM-'s3%3A%2F%2Fduckdb%20blobs%2Ftrain_services.parquet'~,%20%20-Get-the-top%203-busiest-train-stations%0ASELECT-station_name%2C-count(*)-AS-num_services%0AFROM-train_services%0AGROUP-BY-ALL%0AORDER-BY-num_services-DESC%0ALIMIT-3~). This would require [PyArrow in Pyodide](https://github.com/pyodide/pyodide/issues/2933) as the last blocker, I think. Regardless, we should ensure the landing page answers, for a new/prospective user: - What is Ibis? - Why should I use Ibis? - Confidence that Ibis is a well-supported, production-ready library Unfortunately, this may involve more HTML/CSS than I'm comfortable doing, but we'll figure something out.
[ { "content": "from __future__ import annotations\n\nimport plotly.graph_objects as go\n\n\ndef to_greyish(hex_code, grey_value=128):\n hex_code = hex_code.lstrip(\"#\")\n r, g, b = int(hex_code[0:2], 16), int(hex_code[2:4], 16), int(hex_code[4:6], 16)\n\n new_r = (r + grey_value) // 2\n new_g = (g +...
[ { "content": "from __future__ import annotations\n\nimport plotly.graph_objects as go\n\n\ndef to_greyish(hex_code, grey_value=128):\n hex_code = hex_code.lstrip(\"#\")\n r, g, b = int(hex_code[0:2], 16), int(hex_code[2:4], 16), int(hex_code[4:6], 16)\n\n new_r = (r + grey_value) // 2\n new_g = (g +...
diff --git a/docs/backends_sankey.py b/docs/backends_sankey.py index 9bc7270c988c..b04cea212faf 100644 --- a/docs/backends_sankey.py +++ b/docs/backends_sankey.py @@ -94,7 +94,7 @@ def to_greyish(hex_code, grey_value=128): fig.update_layout( title_text="Ibis backend types", - font_size=24, + font_size=20, # font_family="Arial", title_font_size=30, margin=dict(l=30, r=30, t=80, b=30), diff --git a/docs/index.qmd b/docs/index.qmd index 4f1688ebb158..5be4a9bd4dd8 100644 --- a/docs/index.qmd +++ b/docs/index.qmd @@ -40,513 +40,226 @@ about: ::: {#about} ::: -## Install +{{< pagebreak >}} -We recommend starting with the default backend (DuckDB). +::: {.column-page} -```bash -pip install 'ibis-framework[duckdb,examples]' # <1> -``` - -1. Install Ibis with the DuckDB backend along with examples. - -<div class="d-grid gap-2"><a class="btn btn-lg btn-primary" data-bs-toggle="collapse" href="#collapseBackends" role="button" aria-expanded="false" aria-controls="collapseBackends" margin="100px">Show supported backends</a></div> - -### - -::: {#collapseBackends .collapse .multi-collapse} - -## Backends - -Need to use Ibis with a backend that isn't currently supported? [Let us know!](https://github.com/ibis-project/ibis/discussions/new?category=q-a) - -{{< include ./_tabsets/install.qmd >}} +### An open source dataframe library that works with any data system -See the [backend support matrix](support_matrix.qmd) for details on operations supported. [Open a feature request](https://github.com/ibis-project/ibis/issues/new?assignees=&labels=feature&projects=&template=feature-request.yml&title=feat) if you'd like to see support for an operation in a given backend. If the backend supports it, we'll do our best to add it quickly! 
+- Use the same API for 20+ backends +- Fast local dataframes with embedded DuckDB (default), Polars, or DataFusion +- Iterate locally and deploy remotely by changing a single line of code +- Compose SQL and Python dataframe code, bridging the gap between data engineering and data science -::: - -<div class="d-grid gap-2"><a class="btn btn-lg btn-primary" data-bs-toggle="collapse" href="#collapseQuickstart" role="button" aria-expanded="false" aria-controls="collapseQuickstart">Show quickstart</a></div> +```{python} +#| code-fold: true +#| echo: false -### +import ibis -::: {#collapseQuickstart .collapse .multi-collapse} +t = ibis.examples.penguins.fetch() +t.to_parquet("penguins.parquet") +``` -## Quickstart +## Ibis: the portable Python dataframe library -See [the getting started tutorial](tutorials/getting_started.qmd) for a more in-depth introduction to Ibis. Below is a quick overview. +Ibis offers a familiar local dataframe experience with outstanding performance, +using [DuckDB](https://duckdb.org) by default. ```{python} import ibis # <1> -import ibis.selectors as s # <1> ibis.options.interactive = True # <2> -t = ibis.examples.penguins.fetch() # <3> +t = ibis.read_parquet("penguins.parquet", table_name="penguins") # <3> t.head(3) # <4> ``` -1. Ensure you install Ibis first. -2. Use interactive mode for exploratory data analysis (EDA) or demos. -3. Load a dataset from the built-in examples. -4. Display the table. - - -Ibis is a dataframe library with familiar syntax. - -```{python} -t[10:15] # <1> -``` - -1. Display a slice of the table. - -<div class="d-grid gap-2"><a class="btn btn-lg btn-primary" data-bs-toggle="collapse" href="#collapseAnalytics" role="button" aria-expanded="false" aria-controls="collapseAnalytics">Show analytics</a></div> - -### +1. Import Ibis. +2. Enable interactive mode for exploratory data analysis (EDA) or demos. +3. Read a Parquet file and specify a table name (optional). +4. Display the first few rows of the table. 
-::: {#collapseAnalytics .collapse .multi-collapse} - -### Analytics - -Ibis is built for easy analytics at scale in Python. +Iterate and explore data locally: ```{python} -( # <1> - t.filter(ibis._["body_mass_g"] != None) # <1> - .group_by(["species", "island"]) # <1> - .aggregate(count=ibis._.count()) # <1> - .order_by(ibis.desc("count")) # <1> -) # <1> +grouped = t.group_by("species", "island").agg(count=t.count()).order_by("count") # <1> +grouped # <2> ``` -1. Group by species and island, and compute the number of rows in each group. - -::: - -<div class="d-grid gap-2"><a class="btn btn-lg btn-primary" data-bs-toggle="collapse" href="#collapseVisualization" role="button" aria-expanded="false" aria-controls="collapseVisualization">Show EDA + visualization</a></div> - -### - -::: {#collapseVisualization .collapse .multi-collapse} +1. Transform the table. +2. Display the transformed table. -### Exploratory data analysis (EDA) and visualization +### One API for 20+ backends -#### Exploratory data analysis - -Ibis has built-in methods for exploration and [visualization](#visualization). +Use the same dataframe API for 20+ backends: ```{python} -num_species = int(t.select("species").nunique().to_pandas()) # <1> -t["species"].topk(num_species) # <2> -``` - -1. Compute the number of species in the dataset. -2. Display the top species by count. - -#### Visualization +#| code-fold: true +#| echo: false -Ibis works with any Python plotting library that supports the [dataframe interchange protocol](https://data-apis.org/dataframe-protocol/latest/index.html). - -```{python} -grouped = ( # <1> - t.group_by("species") # <1> - .aggregate(count=ibis._.count()) # <1> - .order_by(ibis.desc("count")) # <1> -) # <1> -grouped # <2> +from backends_sankey import fig +fig.show() ``` -1. Setup data to plot. -2. Display the table. 
+For example: ::: {.panel-tabset} -## Altair - -```{.bash} -pip install altair -``` +## DuckDB ```{python} -import altair as alt - -chart = ( - alt.Chart(grouped.to_pandas()) - .mark_bar() - .encode( - x="species", - y="count", - tooltip=["species", "count"], - ) - .properties(width=600, height=400) - .interactive() -) -chart -``` - -## matplotlib - -```{.bash} -pip install matplotlib +con = ibis.connect("duckdb://") ``` ```{python} -import matplotlib.pyplot as plt - -chart = grouped.to_pandas().plot.bar( - x="species", - y="count", - figsize=(600 / 100, 400 / 100), -) -plt.show() -``` - -## Plotly - -```{.bash} -pip install plotly +t = con.read_parquet("penguins.parquet") +t.head(3) ``` ```{python} -import plotly.express as px - -chart = px.bar( - grouped.to_pandas(), - x="species", - y="count", - width=600, - height=400, -) -chart +t.group_by("species", "island").agg(count=t.count()).order_by("count") ``` -## plotnine +## Polars -```{.bash} -pip install plotnine -``` ```{python} -from plotnine import ggplot, aes, geom_bar, theme - -chart = ( - ggplot( - grouped, - aes(x="species", y="count"), - ) - + geom_bar(stat="identity") - + theme(figure_size=(600 / 100, 400 / 100)) -) -chart -``` - -## seaborn - -```{.bash} -pip install seaborn +con = ibis.connect("polars://") ``` ```{python} -import seaborn as sns - -chart = sns.barplot( - data=grouped.to_pandas(), - x="species", - y="count", -) -chart.figure.set_size_inches(600 / 100, 400 / 100) +t = con.read_parquet("penguins.parquet") +t.head(3) ``` -::: - -::: - -<div class="d-grid gap-2"><a class="btn btn-lg btn-primary" data-bs-toggle="collapse" href="#collapseDataScience" role="button" aria-expanded="false" aria-controls="collapseDataScience">Show data science</a></div> - -### - -::: {#collapseDataScience .collapse .multi-collapse} - -### Data science - -Use Ibis with your favorite data science libraries for concise and efficient workflows. 
- ```{python} -import ibis.selectors as s # <1> - - -def transform(t): # <2> - t = t.mutate( # <2> - s.across(s.numeric(), {"zscore": lambda x: (x - x.mean()) / x.std()}) # <2> - ).dropna() # <2> - return t # <2> - - -f = transform(t.drop("year")) # <3> -f.select("species", "island", s.contains("zscore")) # <4> +t.group_by("species", "island").agg(count=t.count()).order_by("count") ``` -1. Import the selectors module. -2. Define a function to transform the table for code reuse (compute z-scores on numeric columns). -3. Apply the function to the table and assign it to a new variable. -4. Display the transformed table. - -```bash -pip install scikit-learn -``` +## DataFusion ```{python} -import plotly.express as px # <1> -from sklearn.decomposition import PCA # <1> - -X = f.select(s.contains("zscore")) # <2> - -n_components = 3 # <3> -pca = PCA(n_components=n_components).fit(X) # <3> - -t_pca = ibis.memtable(pca.transform(X)).rename( # <4> - {"pc1": "col0", "pc2": "col1", "pc3": "col2"} # <4> -) # <4> - -f = f.mutate(row_number=ibis.row_number().over()).join( # <5> - t_pca.mutate(row_number=ibis.row_number().over()), # <5> - "row_number", # <5> -) # <5> - -px.scatter_3d( # <6> - f.to_pandas(), # <6> - x="pc1", # <6> - y="pc2", # <6> - z="pc3", # <6> - color="species", # <6> - symbol="island", # <6> -) # <6> +con = ibis.connect("datafusion://") ``` -1. Import data science libraries -2. Select "features" (numeric columns) as X -3. Compute PCA -4. Create a table from the PCA results -5. Join the PCA results to the original table -6. Plot the results - -::: - -### - -<div class="d-grid gap-2"><a class="btn btn-lg btn-primary" data-bs-toggle="collapse" href="#collapseInputOutput" role="button" aria-expanded="false" aria-controls="collapseInputOutput">Show input and output</a></div> - -### - -::: {#collapseInputOutput .collapse .multi-collapse} - -### Input and output - -Ibis supports a variety of input and output options. 
- -{{< include /_code/input_output_penguins.qmd >}} - -::: - -<div class="d-grid gap-2"><a class="btn btn-lg btn-primary" data-bs-toggle="collapse" href="#collapseSQLPython" role="button" aria-expanded="false" aria-controls="collapseSQLPython">Show SQL + Python</a></div> - -::: {#collapseSQLPython .collapse .multi-collapse} - -### SQL + Python - -Ibis has the `ibis.to_sql` to generate SQL strings. - -In a Jupyter notebook or IPython shell session, the output of `ibis.to_sql` will be syntax highlighted. - -In a plain Python REPL use `print(ibis.to_sql(...))` to pretty print SQL. - -Ibis uses [SQLGlot](https://sqlglot.com) under the hood to allow passing a `dialect` parameter to SQL methods. - -::: {.panel-tabset} - -## BigQuery - -```{python} -dialect = "bigquery" # <1> -sql = ibis.to_sql( # <2> - grouped, # <2> - dialect=dialect, # <2> -) # <2> -sql # <3> -``` - -1. Set the dialect. -2. Convert the table to a SQL string. -3. Display the SQL string. - -You can chain Ibis expressions and `.sql` together. - ```{python} -con.sql(sql, dialect=dialect).filter(ibis._["species"] == "Adelie") # <1> +t = con.read_parquet("penguins.parquet") +t.head(3) ``` -1. Chain `.sql` calls and Ibis expressions together. - -## Snowflake - ```{python} -dialect = "snowflake" # <1> -sql = ibis.to_sql( # <2> - grouped, # <2> - dialect=dialect, # <2> -) # <2> -sql # <3> +t.group_by("species", "island").agg(count=t.count()).order_by("count") ``` -1. Set the dialect. -2. Convert the table to a SQL string. -3. Display the SQL string. - -You can chain Ibis expressions and `.sql` together. +## PySpark ```{python} -con.sql(sql, dialect=dialect).filter(ibis._["species"] == "Adelie") # <1> +con = ibis.connect("pyspark://") ``` -1. Chain `.sql` calls and Ibis expressions together. - -## Oracle - ```{python} -dialect = "oracle" # <1> -sql = ibis.to_sql( # <2> - grouped, # <2> - dialect=dialect, # <2> -) # <2> -sql # <3> +t = con.read_parquet("penguins.parquet") +t.head(3) ``` -1. Set the dialect. -2. 
Convert the table to a SQL string. -3. Display the SQL string. - -You can chain Ibis expressions and `.sql` together. - ```{python} -con.sql(sql, dialect=dialect).filter(ibis._["species"] == "Adelie") # <1> +t.group_by("species", "island").agg(count=t.count()).order_by("count") ``` -1. Chain `.sql` calls and Ibis expressions together. +::: -## MySQL +This allows you to iterate locally and deploy remotely by changing a single line +of code. For instance, develop locally with DuckDB and deploy remotely to +BigQuery. Or, using any combination of backends that meet your requirements. -```{python} -dialect = "mysql" # <1> -sql = ibis.to_sql( # <2> - grouped, # <2> - dialect=dialect, # <2> -) # <2> -sql # <3> -``` +### Python + SQL: better together -1. Set the dialect. -2. Convert the table to a SQL string. -3. Display the SQL string. +Ibis works by decoupling the dataframe API from the backend execution. Most +backends support a SQL dialect, which Ibis compiles its expressions into using +[SQLGlot](https://github.com/tobymao/sqlglot). You can inspect the SQL that Ibis +generates for any SQL backend: -You can chain Ibis expressions and `.sql` together. ```{python} -con.sql(sql, dialect=dialect).filter(ibis._["species"] == "Adelie") # <1> +ibis.to_sql(grouped) # <1> ``` -1. Chain `.sql` calls and Ibis expressions together. +1. Display the SQL generated from the table expression. -## MSSQL +And use SQL strings directly, mixing and matching with Python dataframe code: ```{python} -dialect = "mssql" # <1> -sql = ibis.to_sql( # <2> - grouped, # <2> - dialect=dialect, # <2> -) # <2> -sql # <3> -``` +#| code-fold: true +#| echo: false -1. Set the dialect. -2. Convert the table to a SQL string. -3. Display the SQL string. - -You can chain Ibis expressions and `.sql` together. - -```{python} -con.sql(sql, dialect=dialect).filter(ibis._["species"] == "Adelie") # <1> +t = ibis.read_parquet("penguins.parquet", table_name="penguins") ``` -1. 
Chain `.sql` calls and Ibis expressions together. - -## PostgreSQL - ```{python} -dialect = "postgres" # <1> -sql = ibis.to_sql( # <2> - grouped, # <2> - dialect=dialect, # <2> -) # <2> -sql # <3> +t.sql( # <1> + "SELECT species, island, COUNT(*) AS count FROM penguins GROUP BY species, island" # <1> +).order_by("count") # <2> ``` -1. Set the dialect. -2. Convert the table to a SQL string. -3. Display the SQL string. - -You can chain Ibis expressions and `.sql` together. +1. Transform the table using SQL. +2. Then, transform the table using Python dataframe code. -```{python} -con.sql(sql, dialect=dialect).filter(ibis._["species"] == "Adelie") # <1> -``` +This allows you to combine the flexibility of Python with the scale and +performance of modern SQL. -1. Chain `.sql` calls and Ibis expressions together. +::: {.text-center} +## Users say... +::: -## SQLite +::: {.index-grid} -```{python} -dialect = "sqlite" # <1> -sql = ibis.to_sql( # <2> - grouped, # <2> - dialect=dialect, # <2> -) # <2> -sql # <3> -``` +::: {.index-g-col-4 .card .border-light .mb-3 .text-center} +::: {.card-body} +["Ibis is amazing, there is so much bikeshedding out there that this library +improves upon. I love that now we can empower any visualization with nearly +any dataset! Big thanks to those who have contributed!"]{.card-text} -1. Set the dialect. -2. Convert the table to a SQL string. -3. Display the SQL string. +[Nick Shook]{.blockquote-footer} +::: +::: -You can chain Ibis expressions and `.sql` together. +::: {.index-g-col-4 .card .border-light .mb-3 .text-center} +::: {.card-body} +"I now have Ibis code that runs PySpark in my Databricks environment and Polars +on my laptop which is pretty slick 🔥" -```{python} -con.sql(sql, dialect=dialect).filter(ibis._["species"] == "Adelie") # <1> -``` +[Mark Druffel]{.blockquote-footer} +::: +::: -1. Chain `.sql` calls and Ibis expressions together. 
+::: {.index-g-col-4 .card .border-light .mb-3 .text-center} +::: {.card-body} +"I love that with Ibis, I can use SQL for the heavy lifting or aggregations and +then switch to a dataframe-like API for the type of dynamic transformations that +would otherwise be tedious to do in pure SQL." -## Trino +[Daniel Kim]{.blockquote-footer} +::: +::: -```{python} -dialect = "trino" # <1> -sql = ibis.to_sql( # <2> - grouped, # <2> - dialect=dialect, # <2> -) # <2> -sql # <3> -``` +::: -1. Set the dialect. -2. Convert the table to a SQL string. -3. Display the SQL string. +::: {.text-center} +## Get started with Ibis +::: -You can chain Ibis expressions and `.sql` together. +::: {.index-grid .text-center} -```{python} -con.sql(sql, dialect=dialect).filter(ibis._["species"] == "Adelie") # <1> -``` +::: {.index-g-col-4} +[Why Ibis?](why.qmd){.btn .btn-primary .w-100} +::: -1. Chain `.sql` calls and Ibis expressions together. +::: {.index-g-col-4} +[Tutorial: getting started](tutorials/getting_started.qmd){.btn .btn-primary .w-100} +::: +::: {.index-g-col-4} +[API reference](/reference){.btn .btn-primary .w-100} ::: ::: diff --git a/docs/styles.css b/docs/styles.css index a4608cdf0d1a..01ed4c84633b 100644 --- a/docs/styles.css +++ b/docs/styles.css @@ -21,3 +21,17 @@ section[id^="parameters-"] { margin: auto; display: block; } + +.index-grid { + @extend .grid; + display: flex; + justify-content: space-between; +} + +.index-g-col-4 { + @extend .g-col-4; + flex: 1; + /* Ensures all columns grow to fill the same space */ + margin: 0 5px; + /* Adds a small margin between columns */ +} diff --git a/docs/tutorials/getting_started.qmd b/docs/tutorials/getting_started.qmd index 049a72ec495f..f55c79d3a59e 100644 --- a/docs/tutorials/getting_started.qmd +++ b/docs/tutorials/getting_started.qmd @@ -2,6 +2,12 @@ This is a quick tour of some basic commands and usage patterns, just to get your flippers wet. 
+::: {.callout-tip} +You can run this tutorial in a GitHub Codespace with everything setup for you: + +[![](https://github.com/codespaces/badge.svg)](https://codespaces.new/ibis-project/ibis) +::: + ## Install Ibis {{< include ../_tabsets/install_default.qmd >}}
readthedocs__readthedocs.org-2712
Document that RTD uses `rel` branch for production Hi, I'd like to add a new builder for doxygen documentation (but native, not with Breathe). Since there are a lot of branches like rel/relcorp which are far ahead of master, I'd like to know which branch to choose for development. Thanks in advance! Oli
[ { "content": "# -*- coding: utf-8 -*-\n#\nimport os\nimport sys\n\nfrom recommonmark.parser import CommonMarkParser\n\nsys.path.insert(0, os.path.abspath('..'))\nsys.path.append(os.path.dirname(__file__))\nos.environ.setdefault(\"DJANGO_SETTINGS_MODULE\", \"readthedocs.settings.dev\")\n\nfrom django.conf import...
[ { "content": "# -*- coding: utf-8 -*-\n#\nimport os\nimport sys\n\nfrom recommonmark.parser import CommonMarkParser\n\nsys.path.insert(0, os.path.abspath('..'))\nsys.path.append(os.path.dirname(__file__))\nos.environ.setdefault(\"DJANGO_SETTINGS_MODULE\", \"readthedocs.settings.dev\")\n\nfrom django.conf import...
diff --git a/LICENSE.mit b/LICENSE.mit index fcb099e132c..2447f29c36a 100644 --- a/LICENSE.mit +++ b/LICENSE.mit @@ -1,4 +1,4 @@ -Copyright (c) 2011 Charles Leifer, Eric Holscher, Bobby Grace +Copyright (c) 2010-2017 Read the Docs, Inc & contributors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation diff --git a/docs/alternate_domains.rst b/docs/alternate_domains.rst index 6593c3ee0fa..9c84fef99a0 100644 --- a/docs/alternate_domains.rst +++ b/docs/alternate_domains.rst @@ -20,9 +20,12 @@ This requires two steps: * Add a CNAME record in your DNS that point to our servers `readthedocs.io` * Add a Domain object in the **Project Admin > Domains** page for your project. +.. note:: The ``Domain`` that should be used is the actual subdomain that you want your docs served on. + Generally it will be `docs.projectname.org`. + Using pip as an example, http://www.pip-installer.org resolves, but is hosted on our infrastructure. -As an example, fabric's dig record looks like this:: +As another example, fabric's dig record looks like this:: -> dig docs.fabfile.org ... diff --git a/docs/api.rst b/docs/api.rst index a7e252e78aa..4b51fd85614 100644 --- a/docs/api.rst +++ b/docs/api.rst @@ -4,6 +4,9 @@ Read the Docs Public API We have a limited public API that is available for you to get data out of the site. This document covers only part of the API provided. We have plans to create a read/write API, so that you can easily automate interactions with your project. +.. warning:: This API is out of date and not currently maintained. + We have a v2 API that is currently supported at http://readthedocs.org/api/v2/. 
+ A basic API client using slumber -------------------------------- diff --git a/docs/builds.rst b/docs/builds.rst index 2d8dfacc202..ec36a3e32a4 100644 --- a/docs/builds.rst +++ b/docs/builds.rst @@ -15,6 +15,10 @@ Our current build limits are: We can increase build limits on a per-project basis, if you provide a good reason your documentation needs more resources. +You can see the current Docker build images that we use in our `docker repository <https://github.com/rtfd/readthedocs-docker-images>`_. `Docker Hub <https://hub.docker.com/r/readthedocs/build/>`_ also shows the latest set of images that have been built. + +Currently in production we're using the ``readthedocs/build:2.0`` docker image as our default image. + How we build documentation -------------------------- diff --git a/docs/conf.py b/docs/conf.py index 072c96a5bce..5a3b27dc95a 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -32,7 +32,7 @@ master_doc = 'index' project = u'Read The Docs' -copyright = u'2010, Eric Holscher, Charlie Leifer, Bobby Grace' +copyright = u'2010-2017, Read the Docs, Inc & contributors' version = '1.0' release = '1.0' exclude_patterns = ['_build'] diff --git a/docs/contribute.rst b/docs/contribute.rst index 6b7b629b234..95e288981a0 100644 --- a/docs/contribute.rst +++ b/docs/contribute.rst @@ -45,6 +45,9 @@ Triaging tickets Here is a brief explanation on how we triage incoming tickets to get a better sense of what needs to be done on what end. +.. note:: You will need Triage permission on the project in order to do this. + You can ask one of the members of the :doc:`team` to give you access. + Initial triage ~~~~~~~~~~~~~~ diff --git a/docs/faq.rst b/docs/faq.rst index 8e6540bd310..c6ad95f4690 100644 --- a/docs/faq.rst +++ b/docs/faq.rst @@ -96,8 +96,6 @@ Deleting a stale or broken build environment If you're having trouble getting your version to build, try wiping out the existing build/environment files. 
On your version list page ``/projects/[project]/versions`` there is a "Wipe" button that will remove all of the files associated with your documentation build, but not the documentation itself. - - How do I host multiple projects on one CNAME? --------------------------------------------- @@ -151,11 +149,6 @@ How do I support multiple languages of documentation? See the section on :ref:`Localization of Documentation`. -Do I need to be whitelisted? ----------------------------- - -No. Whitelisting has been removed as a concept in Read the Docs. You should have access to all of the features already. - Does Read The Docs work well with "legible" docstrings? ------------------------------------------------------- @@ -207,3 +200,8 @@ file* field. .. _Sphinx's autoapi: http://sphinx-doc.org/ext/autodoc.html .. _pip requirements file: https://pip.pypa.io/en/stable/user_guide.html#requirements-files + +What commit of Read the Docs is in production? +---------------------------------------------- + +We deploy readthedocs.org from the `rel` branch in our GitHub repository. You can see the latest commits that have been deployed by looking on GitHub: https://github.com/rtfd/readthedocs.org/commits/rel \ No newline at end of file
adamchainz__django-cors-headers-851
Listing Origin, DNT, or Accept-Encoding as allowed request headers is never necessary ### Understanding CORS - [X] I have read the resources. ### Python Version _No response_ ### Django Version _No response_ ### Package Version _No response_ ### Description The [README](https://github.com/adamchainz/django-cors-headers#cors_allow_headers-sequencestr) explicitly lists `"accept-encoding"`, `"dnt"`, and `"origin"` in the `CORS_ALLOW_HEADERS` list: ```python CORS_ALLOW_HEADERS = [ # omitted "accept-encoding", # omitted "dnt", "origin", # omitted ] ``` However, contrary to popular belief and according to the Fetch standard, allowing those request headers is never necessary. As so-called [_forbidden request headers_](https://fetch.spec.whatwg.org/#forbidden-request-header), they're indeed handled by the browser, not by the client. You can safely drop those three elements from that list.
[ { "content": "from __future__ import annotations\n\ndefault_headers = (\n \"accept\",\n \"accept-encoding\",\n \"authorization\",\n \"content-type\",\n \"dnt\",\n \"origin\",\n \"user-agent\",\n \"x-csrftoken\",\n \"x-requested-with\",\n)\n\ndefault_methods = (\"DELETE\", \"GET\", \"O...
[ { "content": "from __future__ import annotations\n\ndefault_headers = (\n \"accept\",\n \"authorization\",\n \"content-type\",\n \"user-agent\",\n \"x-csrftoken\",\n \"x-requested-with\",\n)\n\ndefault_methods = (\"DELETE\", \"GET\", \"OPTIONS\", \"PATCH\", \"POST\", \"PUT\")\n", "path": "...
diff --git a/CHANGELOG.rst b/CHANGELOG.rst index c66c08e5..d3df4261 100644 --- a/CHANGELOG.rst +++ b/CHANGELOG.rst @@ -2,6 +2,12 @@ Changelog ========= +* Remove three headers from the default "accept list": ``accept-encoding``, ``dnt``, and ``origin``. + These are `Forbidden header names <https://developer.mozilla.org/en-US/docs/Glossary/Forbidden_header_name>`__, which means requests JavaScript can never set them. + Consequently, allowing them via CORS has no effect. + + Thanks to jub0bs for the report in `Issue #842 <https://github.com/adamchainz/django-cors-headers/issues/842>`__. + * Drop the ``CORS_REPLACE_HTTPS_REFERER`` setting and ``CorsPostCsrfMiddleware``. Since Django 1.9, the ``CSRF_TRUSTED_ORIGINS`` setting has been the preferred solution to making CSRF checks pass for CORS requests. The removed setting and middleware only existed as a workaround for Django versions before 1.9. diff --git a/README.rst b/README.rst index 0822b081..b9fc9f87 100644 --- a/README.rst +++ b/README.rst @@ -237,17 +237,14 @@ __ https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Allo .. code-block:: python - CORS_ALLOW_HEADERS = [ + CORS_ALLOW_HEADERS = ( "accept", - "accept-encoding", "authorization", "content-type", - "dnt", - "origin", "user-agent", "x-csrftoken", "x-requested-with", - ] + ) The default can be imported as ``corsheaders.defaults.default_headers`` so you can extend it with your custom headers. This allows you to keep up to date with any future changes. 
@@ -257,9 +254,10 @@ For example: from corsheaders.defaults import default_headers - CORS_ALLOW_HEADERS = list(default_headers) + [ + CORS_ALLOW_HEADERS = ( + *default_headers, "my-custom-header", - ] + ) ``CORS_EXPOSE_HEADERS: Sequence[str]`` ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ diff --git a/src/corsheaders/defaults.py b/src/corsheaders/defaults.py index 0897582b..35ec7a4b 100644 --- a/src/corsheaders/defaults.py +++ b/src/corsheaders/defaults.py @@ -2,11 +2,8 @@ default_headers = ( "accept", - "accept-encoding", "authorization", "content-type", - "dnt", - "origin", "user-agent", "x-csrftoken", "x-requested-with",
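As a hedged illustration of the change above (a plain-Python sketch, not the actual `corsheaders.defaults` module), the trimmed defaults can still be extended with custom headers, while the three forbidden header names are simply never needed:

```python
# Stand-in for corsheaders.defaults.default_headers after the change
# (reproduced here so the sketch runs without django-cors-headers installed).
default_headers = (
    "accept",
    "authorization",
    "content-type",
    "user-agent",
    "x-csrftoken",
    "x-requested-with",
)

# Extending the defaults with a project-specific header, per the README:
CORS_ALLOW_HEADERS = (
    *default_headers,
    "my-custom-header",
)

# Forbidden request header names are controlled by the browser, so client
# JavaScript can never set them and allowing them via CORS has no effect.
forbidden = {"accept-encoding", "dnt", "origin"}
assert forbidden.isdisjoint(CORS_ALLOW_HEADERS)
```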
microsoft__botbuilder-python-1907
German language is not appropriately used when using ConfirmPrompt ### The Issue I am building a chatbot for German users. I am sending the locale "de-de" as a user, and can confirm this actually arrives at the bot. When I want to use ConfirmPrompt, the bot returns Yes and No and not "Ja"/"Nein". ### The Solution After a lot of digging, I found the underlying cause and a fix. The culture model does not actually recognize German (de-de) as a supported language, and thus switches to the default (English). But in prompt_culture_models.py German actually exists, and there is a TODO: "# TODO: Replace with Culture.German after Recognizers-Text package updates." I looked it up and the Recognizers-Text package is already updated :) . Still, this is not the real issue. The reason is that German is not listed in the supported cultures function. I simply added it and everything works fine. ` @classmethod def get_supported_cultures(cls) -> List[PromptCultureModel]: """ Gets a list of the supported culture models. """ return [ cls.Chinese, cls.German, cls.Dutch, cls.English, cls.French, cls.Italian, cls.Japanese, cls.Korean, cls.Portuguese, cls.Spanish, cls.Turkish, ]`
[ { "content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nfrom typing import List\n\nfrom recognizers_text import Culture\n\n\nclass PromptCultureModel:\n \"\"\"\n Culture model used in Choice and Confirm Prompts.\n \"\"\"\n\n def __init__(\n ...
[ { "content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nfrom typing import List\n\nfrom recognizers_text import Culture\n\n\nclass PromptCultureModel:\n \"\"\"\n Culture model used in Choice and Confirm Prompts.\n \"\"\"\n\n def __init__(\n ...
diff --git a/libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/prompt_culture_models.py b/libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/prompt_culture_models.py index 1572ac688..abb527e21 100644 --- a/libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/prompt_culture_models.py +++ b/libraries/botbuilder-dialogs/botbuilder/dialogs/prompts/prompt_culture_models.py @@ -174,6 +174,7 @@ def get_supported_cultures(cls) -> List[PromptCultureModel]: """ return [ cls.Chinese, + cls.German, cls.Dutch, cls.English, cls.French,
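A minimal runnable sketch of why the one-line fix above matters (the class and lookup here are hypothetical stand-ins for botbuilder-dialogs' `PromptCultureModel` and its locale-matching logic; the real names may differ):

```python
# Hypothetical stand-ins for botbuilder-dialogs' PromptCultureModel and the
# locale lookup that ConfirmPrompt relies on.
class PromptCultureModel:
    def __init__(self, locale, yes_in_language, no_in_language):
        self.locale = locale
        self.yes_in_language = yes_in_language
        self.no_in_language = no_in_language

ENGLISH = PromptCultureModel("en-us", "Yes", "No")
GERMAN = PromptCultureModel("de-de", "Ja", "Nein")

def map_to_nearest_supported(locale, supported, default):
    # Any locale missing from `supported` silently falls back to the default.
    locale = (locale or "").lower()
    for culture in supported:
        if culture.locale == locale:
            return culture
    return default

# Before the fix: German is not in the supported list -> English fallback.
assert map_to_nearest_supported("de-de", [ENGLISH], ENGLISH) is ENGLISH
# After the fix: German is listed -> "Ja"/"Nein" are used.
assert map_to_nearest_supported("de-de", [ENGLISH, GERMAN], ENGLISH).yes_in_language == "Ja"
```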
AnalogJ__lexicon-1356
Bug in create action for glesys provider When creating an A record with the glesys provider, the full name is added instead of the host name. ``` lexicon_config = { "provider_name" : "glesys", "action": "create", "domain": "somedomain.com", "type": "A", "name": "lexicon", "content": "1.2.3.4", "glesys": { } } ``` Results in the A-record: `{'id': 2723410, 'type': 'A', 'name': 'lexicon.somedomain.com', 'ttl': 3600, 'content': '1.2.3.4'}` While the expected result is: `{'id': 2723410, 'type': 'A', 'name': 'lexicon', 'ttl': 3600, 'content': '1.2.3.4'}` The request data sent to `domain/addrecord` : `{'domainname': 'somedomain.com', 'host': 'lexicon.somedomain.com', 'type': 'A', 'data': '1.2.3.4', 'ttl': 3600}` Expected request data to `domain/addrecord`: `{'domainname': 'somedomain.com', 'host': 'lexicon', 'type': 'A', 'data': '1.2.3.4', 'ttl': 3600}` Glesys API documentation: ``` domain/addrecord Url: https://api.glesys.com/domain/addrecord Method: Only Https POST Required arguments: domainname , host , type , data Optional arguments: ttl Description: Adds a dns record to a domain ```
[ { "content": "\"\"\"Module provider for Glesys\"\"\"\nimport json\n\nimport requests\n\nfrom lexicon.exceptions import AuthenticationError\nfrom lexicon.providers.base import Provider as BaseProvider\n\nNAMESERVER_DOMAINS = [\"glesys.com\"]\n\n\ndef provider_parser(subparser):\n \"\"\"Generate a subparser fo...
[ { "content": "\"\"\"Module provider for Glesys\"\"\"\nimport json\n\nimport requests\n\nfrom lexicon.exceptions import AuthenticationError\nfrom lexicon.providers.base import Provider as BaseProvider\n\nNAMESERVER_DOMAINS = [\"glesys.com\"]\n\n\ndef provider_parser(subparser):\n \"\"\"Generate a subparser fo...
diff --git a/lexicon/providers/glesys.py b/lexicon/providers/glesys.py index 2b30919b9..4bbaffd20 100644 --- a/lexicon/providers/glesys.py +++ b/lexicon/providers/glesys.py @@ -44,7 +44,7 @@ def _create_record(self, rtype, name, content): request_data = { "domainname": self.domain, - "host": self._full_name(name), + "host": name, "type": rtype, "data": content, }
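The bug and the fix can be sketched with a simplified stand-in for lexicon's `_full_name` helper (hypothetical reconstruction; the real helper has more normalization):

```python
# Hedged sketch of the bug: the provider passed _full_name(name) (an FQDN)
# as "host", but the GleSYS domain/addrecord API expects the bare host part.
def full_name(name, domain):
    # Simplified stand-in for lexicon's _full_name helper.
    name = name.rstrip(".")
    if not name.endswith(domain):
        name = f"{name}.{domain}"
    return name

domain = "somedomain.com"
name = "lexicon"

buggy_payload = {"domainname": domain, "host": full_name(name, domain),
                 "type": "A", "data": "1.2.3.4"}
fixed_payload = {"domainname": domain, "host": name,
                 "type": "A", "data": "1.2.3.4"}

assert buggy_payload["host"] == "lexicon.somedomain.com"  # what was sent
assert fixed_payload["host"] == "lexicon"                 # what the API expects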
imAsparky__django-cookiecutter-59
[FEAT]: Add Pyup to the Django project. **Is your feature request related to a problem? Please describe.** A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] **Describe the solution you'd like** A clear and concise description of what you want to happen. **Describe alternatives you've considered** A clear and concise description of any alternative solutions or features you've considered. **Additional context** Add any other context or screenshots about the feature request here.
[ { "content": "#!/usr/bin/env python\n\"\"\"django-cookiecutter post project generation jobs.\"\"\"\nimport os\nimport subprocess # nosec\n\nPROJECT_DIRECTORY = os.path.realpath(os.path.curdir)\n\nREMOTE_REPO = \"git@github.com:{{cookiecutter.github_username}}/\\\n{{cookiecutter.git_project_name}}.git\"\n\n\nGI...
[ { "content": "#!/usr/bin/env python\n\"\"\"django-cookiecutter post project generation jobs.\"\"\"\nimport os\nimport subprocess # nosec\n\nPROJECT_DIRECTORY = os.path.realpath(os.path.curdir)\n\nREMOTE_REPO = \"git@github.com:{{cookiecutter.github_username}}/\\\n{{cookiecutter.git_project_name}}.git\"\n\n\nGI...
diff --git a/cookiecutter.json b/cookiecutter.json index a664ffac..4144ada1 100644 --- a/cookiecutter.json +++ b/cookiecutter.json @@ -7,10 +7,11 @@ "git_project_name": "{{ cookiecutter.project_name.lower().replace(' ', '-').replace('_', '-') }}", "project_slug": "{{ cookiecutter.project_name.lower()|replace(' ', '_')|replace('-', '_')|replace('.', '_')|trim() }}", "project_short_description": "A Django project with all the boilerplate", - "add_contributors_list": "n", + "add_contributors_list": ["n", "y"], "version": "0.1.0", "use_repo_status_badge": ["no", "concept", "wip", "active"], "use_pre_commit": ["y", "n"], + "use_pyup_io": ["y", "n"], "include_sphinx_docs": ["y", "n"], "use_readthedocs": ["y", "n"], @@ -38,7 +39,6 @@ "_copy_without_render": [ ".pre-commit-config.yaml", ".github/workflows/semantic_release.yaml", - ".github/workflows/semantic_release_test_pypi.yaml", ".github/workflows/test_contribution.yaml" ] diff --git a/hooks/post_gen_project.py b/hooks/post_gen_project.py index e0212c0a..e8273d7c 100644 --- a/hooks/post_gen_project.py +++ b/hooks/post_gen_project.py @@ -179,3 +179,6 @@ def git_configure_custom_commit_message(): if "{{ cookiecutter.create_repo_auto_test_workflow }}" == "n": remove_file(".github/workflows/test_contribution.yaml") + + if "{{ cookiecutter.use_pyup_io }}" == "n": + remove_file(".pyup.yaml") diff --git a/tests/test_bake_django.py b/tests/test_bake_django.py index acebd19f..cc9162c9 100644 --- a/tests/test_bake_django.py +++ b/tests/test_bake_django.py @@ -565,6 +565,51 @@ def test_baked_django_readme_without_precommit_badge(cookies): assert " :alt: pre-commit" not in readme_file +def test_baked_django_with_pyup_io(cookies): + """Test Django pyup.io file has been generated correctly.""" + default_django = cookies.bake() + + assert ".pyup.yaml" in os.listdir(default_django.project_path) + + pyup_path = default_django.project_path / ".pyup.yaml" + pyup_file = pyup_path.read_text().splitlines() + + assert ' - "imAsparky""' in 
pyup_file + + readme_path = default_django.project_path / "README.rst" + readme_file = readme_path.read_text().splitlines() + + assert ( + ".. image:: https://pyup.io/repos/github/imAsparky/django-boilerplate/shield.svg" + in readme_file + ) + assert ( + " :target: https://pyup.io/repos/github/imAsparky/django-boilerplate/" + in readme_file + ) + assert " :alt: Updates" in readme_file + + +def test_baked_django_without_pyup_io(cookies): + """Test Django pyup.io file has not been generated.""" + non_default_django = cookies.bake(extra_context={"use_pyup_io": "n"}) + + assert ".pyup.yaml" not in os.listdir(non_default_django.project_path) + + readme_path = non_default_django.project_path / "README.rst" + readme_file = readme_path.read_text().splitlines() + + assert ( + ".. image:: https://pyup.io/repos/github/imAsparky/django-boilerplate/shield.svg" + not in readme_file + ) + assert ( + " :target: https://pyup.io/repos/github/imAsparky/django-boilerplate/" + not in readme_file + ) + assert " :alt: Updates" not in readme_file + + def test_baked_django_with_read_the_docs(cookies): """Test Django readthedocs config has been generated correctly.""" default_django = cookies.bake() diff --git a/{{cookiecutter.git_project_name}}/.pyup.yaml b/{{cookiecutter.git_project_name}}/.pyup.yaml new file mode 100644 index 00000000..1de1b733 --- /dev/null +++ b/{{cookiecutter.git_project_name}}/.pyup.yaml @@ -0,0 +1,32 @@ +# configure updates globally +# default: all +# allowed: all, insecure, False +update: all + +# configure dependency pinning globally +# default: True +# allowed: True, False +pin: True + +# set the default branch +# default: empty, the default branch on GitHub +Default Branch: main + +# assign users to pull requests, default is not set +# requires private repo permissions, even on public repos +# default: empty +assignees: + - "{{cookiecutter.github_username}}"" + +# add a label to pull requests, default is not set +# requires private repo permissions, even on 
public repos +# default: empty +label_prs: "fix(pyup): Update dependencies" + +# allow to close stale PRs +# default: True +close_prs: False + +requirements: + - "requirements_dev.txt" + - "docs/requirements.txt" diff --git a/{{cookiecutter.git_project_name}}/README.rst b/{{cookiecutter.git_project_name}}/README.rst index 85959e5b..efdf14f1 100644 --- a/{{cookiecutter.git_project_name}}/README.rst +++ b/{{cookiecutter.git_project_name}}/README.rst @@ -4,12 +4,19 @@ *{{cookiecutter.project_short_description}}* -{%- if cookiecutter.use_repo_status_badge != "no" %} +{%- if cookiecutter.use_repo_status_badge != "n" %} .. image:: https://www.repostatus.org/badges/latest/{{cookiecutter.use_repo_status_badge}}.svg :target: https://www.repostatus.org/#{{cookiecutter.use_repo_status_badge}} :alt: Project Status: {{cookiecutter.use_repo_status_badge}} {%- endif %} +{%- if cookiecutter.use_pyup_io == "y" %} +.. image:: https://pyup.io/repos/github/{{cookiecutter.github_username}}/{{cookiecutter.git_project_name}}/shield.svg + :target: https://pyup.io/repos/github/{{cookiecutter.github_username}}/{{cookiecutter.git_project_name}}/ + :alt: Updates +{%- endif %} + + {%- if cookiecutter.use_pre_commit == "y" %} .. image:: https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit&logoColor=white :target: https://github.com/pre-commit/pre-commit
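The `post_gen_project.py` change above follows a common cookiecutter pattern: the template variable is rendered directly into the hook, and optional files are deleted when their feature flag is off. A minimal runnable sketch (the rendered value and paths are faked here):

```python
# Minimal sketch of the post-gen hook pattern from the PR: cookiecutter
# renders "{{ cookiecutter.use_pyup_io }}" into the hook source, and optional
# files are deleted when the feature flag is off.
import os
import tempfile

use_pyup_io = "n"  # stands in for the rendered "{{ cookiecutter.use_pyup_io }}"

project_dir = tempfile.mkdtemp()
pyup_path = os.path.join(project_dir, ".pyup.yaml")
open(pyup_path, "w").close()  # pretend the template generated the file

def remove_file(filepath):
    os.remove(os.path.join(project_dir, filepath))

if use_pyup_io == "n":
    remove_file(".pyup.yaml")

assert not os.path.exists(pyup_path)
```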
flairNLP__flair-2711
SciSpacyTokenizer.tokenize() function is broken **Describe the bug** Two minor bugs were introduced to Flair's `SciSpacyTokenizer` as part of #2645. The bugs prevent usage of `SciSpacyTokenizer` and tokenizers that leverage it. **To Reproduce** Any use of classes that leverage `SciSpacyTokenizer` will raise errors. For example: ```python from flair.tokenization import SciSpacySentenceSplitter splitter = SciSpacySentenceSplitter() sentences = splitter.split("abcd") ``` Causes ``` [/usr/local/lib/python3.7/dist-packages/flair/tokenization.py](https://localhost:8080/#) in tokenize(self, text) 257 words: List[str] = [] 258 for word in sentence: --> 259 word.append(word) 260 return words 261 AttributeError: 'spacy.tokens.token.Token' object has no attribute 'append' ``` **To Fix** ***First issue*** The first problem is that there is a typo here: https://github.com/flairNLP/flair/blob/480d2c9afd66ab8d3bf40a676917e84dba3c4cee/flair/tokenization.py#L259 It should be `words.append`, not `word.append`. ***Second issue*** The `SciSpacyTokenizer.tokenize()` method is supposed to return a list of `str`, but instead it returns a list of spaCy `Token` objects. Happy to open a PR shortly
[ { "content": "import logging\nfrom abc import ABC, abstractmethod\nfrom typing import Any, Callable, List, Union\n\nfrom segtok.segmenter import split_multi, split_single\nfrom segtok.tokenizer import split_contractions, word_tokenizer\n\nfrom flair.data import Sentence, Tokenizer\n\nlog = logging.getLogger(\"f...
[ { "content": "import logging\nfrom abc import ABC, abstractmethod\nfrom typing import Any, Callable, List, Union\n\nfrom segtok.segmenter import split_multi, split_single\nfrom segtok.tokenizer import split_contractions, word_tokenizer\n\nfrom flair.data import Sentence, Tokenizer\n\nlog = logging.getLogger(\"f...
diff --git a/flair/tokenization.py b/flair/tokenization.py index 33134ad555..c285c2ddd0 100644 --- a/flair/tokenization.py +++ b/flair/tokenization.py @@ -256,7 +256,7 @@ def tokenize(self, text: str) -> List[str]: sentence = self.model(text) words: List[str] = [] for word in sentence: - word.append(word) + words.append(word.text) return words @property
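Both bugs can be demonstrated without spaCy installed, using a hypothetical stand-in `Token` class (real spaCy tokens expose the same `.text` attribute used here):

```python
# Stand-in Token objects (hypothetical) to show both bugs at once:
# appending to the wrong variable, and returning Token objects rather
# than their .text strings.
class Token:
    def __init__(self, text):
        self.text = text

sentence = [Token("ab"), Token("cd")]

def tokenize_fixed(sentence):
    words = []
    for word in sentence:
        words.append(word.text)  # was: word.append(word) -> AttributeError
    return words

assert tokenize_fixed(sentence) == ["ab", "cd"]
```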
typeddjango__django-stubs-1794
Next release planning (4.2.6) As [announced in 4.2.5 release notes](https://github.com/typeddjango/django-stubs/releases/tag/4.2.5), we will drop the hard dependency on mypy. Users of django-stubs with mypy will need to add their own dependency on mypy, or install `django-stubs[compatible-mypy]` extra. I'm hoping to release this in less than a week to fix a few bugs. Blockers: * #1778 (fixes a bug introduced in version 4.2.5) * #1782 * #1784 * #1786 setup.py install is deprecated Will need to deal with this at one point. https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html This warning shows up when making a release: ``` .../site-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated. !! ******************************************************************************** Please avoid running ``setup.py`` directly. Instead, use pypa/build, pypa/installer or other standards-based tools. See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details. ******************************************************************************** !! ```
[ { "content": "#!/usr/bin/env python\nimport os\nfrom typing import List\n\nfrom setuptools import find_packages, setup\n\n\ndef find_stub_files(name: str) -> List[str]:\n result = []\n for root, _dirs, files in os.walk(name):\n for file in files:\n if file.endswith(\".pyi\"):\n ...
[ { "content": "#!/usr/bin/env python\nimport os\nfrom typing import List\n\nfrom setuptools import find_packages, setup\n\n\ndef find_stub_files(name: str) -> List[str]:\n result = []\n for root, _dirs, files in os.walk(name):\n for file in files:\n if file.endswith(\".pyi\"):\n ...
diff --git a/.gitignore b/.gitignore index cc1ac47cb..1141379bd 100644 --- a/.gitignore +++ b/.gitignore @@ -11,3 +11,4 @@ out/ pip-wheel-metadata/ stubgen/ build/ +dist/ diff --git a/README.md b/README.md index a578e8267..c7cd38c1d 100644 --- a/README.md +++ b/README.md @@ -49,6 +49,7 @@ We rely on different `django` and `mypy` versions: | django-stubs | Mypy version | Django version | Django partial support | Python version | |----------------|--------------|----------------|------------------------|----------------| +| 4.2.6 | 1.6.x | 4.2 | 4.1, 3.2 | 3.8 - 3.12 | | 4.2.5 | 1.6.x | 4.2 | 4.1, 3.2 | 3.8 - 3.12 | | 4.2.4 | 1.5.x | 4.2 | 4.1, 3.2 | 3.8 - 3.11 | | 4.2.3 | 1.4.x | 4.2 | 4.1, 3.2 | 3.8 - 3.11 | diff --git a/scripts/release.sh b/scripts/release.sh index 71281961a..de8ab2da6 100755 --- a/scripts/release.sh +++ b/scripts/release.sh @@ -5,9 +5,9 @@ if [[ -z $(git status -s) ]] then if [[ "$VIRTUAL_ENV" != "" ]] then - pip install --upgrade setuptools wheel twine + pip install --upgrade setuptools wheel twine build rm -rf dist/ build/ - python setup.py sdist bdist_wheel + python -m build twine upload dist/* rm -rf dist/ build/ else diff --git a/setup.py b/setup.py index 5973d2e63..18bb048eb 100644 --- a/setup.py +++ b/setup.py @@ -37,7 +37,7 @@ def find_stub_files(name: str) -> List[str]: setup( name="django-stubs", - version="4.2.5", + version="4.2.6", description="Mypy stubs for Django", long_description=readme, long_description_content_type="text/markdown",
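The `find_stub_files` helper shown (truncated) in `setup.py` can be sketched as follows — a hedged reconstruction: the walk-and-filter logic matches the visible snippet, but the exact path handling in the real helper is an assumption:

```python
import os
import tempfile

def find_stub_files(name):
    """Collect *.pyi stub paths under a package directory (sketch)."""
    result = []
    for root, _dirs, files in os.walk(name):
        for file in files:
            if file.endswith(".pyi"):
                # The real helper's exact relative-path handling is assumed here.
                result.append(os.path.join(root, file))
    return result

# Quick demonstration against a throwaway directory tree:
pkg = tempfile.mkdtemp()
open(os.path.join(pkg, "models.pyi"), "w").close()
open(os.path.join(pkg, "ignored.py"), "w").close()

stubs = find_stub_files(pkg)
assert len(stubs) == 1 and stubs[0].endswith("models.pyi")
```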
oppia__oppia-6846
Adding Lesson Topics to Lesson-Specific Landing Pages **Is your feature request related to a problem? Please describe.** Currently, our lesson landing pages don't include many of the keywords related to the lessons themselves, which makes them more difficult to surface in searches and in our ads. **Describe the solution you'd like** I would like to add lesson topics/areas to the lesson landing page (as seen in the screenshot below). In mobile view, the Topics covered list will be seen above the Otter in one column. Also note that Mark recommended using a more colorful cake, like the one seen in the screenshot below, for the Fractions landing page. **Describe alternatives you've considered** I've also considered adding more keywords by adding exploration titles to the collection landing pages to increase relevancy to those pages as well. **Additional context** <img width="499" alt="Screenshot 2019-05-24 14 01 05" src="https://user-images.githubusercontent.com/12034267/58350733-60d8ea00-7e2c-11e9-91e5-7d934471f1f6.png"> <img width="499" alt="Screenshot 2019-05-24 14 00 24" src="https://user-images.githubusercontent.com/12034267/58350707-4868cf80-7e2c-11e9-8734-497549b6464c.png">
[ { "content": "# coding: utf-8\n#\n# Copyright 2014 The Oppia Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licen...
[ { "content": "# coding: utf-8\n#\n# Copyright 2014 The Oppia Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licen...
diff --git a/assets/images/landing/maths/negative-numbers/negative_1.png b/assets/images/landing/maths/negative-numbers/negative_1.png new file mode 100644 index 0000000000000..1dfc7b286dbad Binary files /dev/null and b/assets/images/landing/maths/negative-numbers/negative_1.png differ diff --git a/assets/images/landing/maths/negative-numbers/negative_2.png b/assets/images/landing/maths/negative-numbers/negative_2.png new file mode 100644 index 0000000000000..f196d5d55d6ad Binary files /dev/null and b/assets/images/landing/maths/negative-numbers/negative_2.png differ diff --git a/assets/videos/landing/maths/negative-numbers/negative-numbers_video.mp4 b/assets/videos/landing/maths/negative-numbers/negative-numbers_video.mp4 new file mode 100644 index 0000000000000..aac65e3e68b59 Binary files /dev/null and b/assets/videos/landing/maths/negative-numbers/negative-numbers_video.mp4 differ diff --git a/core/templates/dev/head/pages/landing-pages/topic-landing-page/topic-landing-page.controller.ts b/core/templates/dev/head/pages/landing-pages/topic-landing-page/topic-landing-page.controller.ts index 66b5fbdef1187..3e7d78cea5f83 100644 --- a/core/templates/dev/head/pages/landing-pages/topic-landing-page/topic-landing-page.controller.ts +++ b/core/templates/dev/head/pages/landing-pages/topic-landing-page/topic-landing-page.controller.ts @@ -16,8 +16,11 @@ * @fileoverview Controller for landing page. 
*/ +require( + 'components/common-layout-directives/common-elements/' + + 'background-banner.directive.ts'); + require('domain/utilities/UrlInterpolationService.ts'); -require('filters/string-utility-filters/capitalize.filter.ts'); require( 'pages/landing-pages/topic-landing-page/topic-landing-page.controller.ts'); require('services/PageTitleService.ts'); @@ -28,26 +31,72 @@ require('services/SiteAnalyticsService.ts'); oppia.constant('TOPIC_LANDING_PAGE_DATA', { maths: { fractions: { + topic_title: 'Fractions', collection_id: '4UgTQUc1tala', page_data: { - image_1: 'matthew_paper.png', - image_2: 'matthew_fractions.png', + image_1: { + file_name: 'matthew_paper.png', + alt: 'Matthew showing parts of fractions written on a card.' + }, + image_2: { + file_name: 'matthew_fractions.png', + alt: 'Matthew solving problems on Oppia.' + }, video: 'fractions_video.mp4', + lessons: [ + 'What is a Fraction?', + 'Comparing Fractions', + 'The Meaning of "Equal Parts"', + 'Adding & Subtracting Fractions' + ] + } + }, + 'negative-numbers': { + topic_title: 'Negative Numbers', + collection_id: 'GdYIgsfRZwG7', + page_data: { + image_1: { + file_name: 'negative_1.png', + alt: 'A boy showing 3 + -24 written on a slate.' + }, + image_2: { + file_name: 'negative_2.png', + alt: 'A boy smiling and solving negative-number problems on Oppia.' + }, + video: 'negative-numbers_video.mp4', + lessons: [ + 'The Number Line', + 'What is a Negative Number?', + 'Adding & Subtracting Negative Numbers', + 'Multiplying & Dividing Negative Numbers' + ] } }, ratios: { + topic_title: 'Ratios', collection_id: '53gXGLIR044l', page_data: { - image_1: 'ratios_James.png', - image_2: 'ratios_question.png', + image_1: { + file_name: 'ratios_James.png', + alt: 'A boy showing 2 is to 3 ratio on a card.' + }, + image_2: { + file_name: 'ratios_question.png', + alt: 'A smoothie shop and a card having question "What does a ratio' + + 'tell us?" with options.' 
+ }, video: 'ratios_video.mp4', + lessons: [ + 'What is a Ratio?', + 'Equivalent Ratios', + 'Ratios & Proportional Reasoning', + 'Writing Ratios in Simplest Form' + ] } } } }); - - oppia.controller('TopicLandingPage', [ '$filter', '$scope', '$timeout', '$window', 'PageTitleService', 'SiteAnalyticsService', 'UrlInterpolationService', 'TOPIC_LANDING_PAGE_DATA', @@ -56,36 +105,42 @@ oppia.controller('TopicLandingPage', [ SiteAnalyticsService, UrlInterpolationService, TOPIC_LANDING_PAGE_DATA) { var pathArray = $window.location.pathname.split('/'); $scope.subject = pathArray[2]; - $scope.topic = pathArray[3]; - var landingPageData = ( - TOPIC_LANDING_PAGE_DATA[$scope.subject][$scope.topic].page_data); + var topic = pathArray[3]; + var topicData = TOPIC_LANDING_PAGE_DATA[$scope.subject][topic]; + var landingPageData = topicData.page_data; var assetsPathFormat = '/landing/<subject>/<topic>/<file_name>'; - - var capitalizedTopic = $filter('capitalize')($scope.topic); - var pageTitle = 'Learn ' + capitalizedTopic + ' - Oppia'; + $scope.topicTitle = topicData.topic_title; + $scope.lessons = landingPageData.lessons; + var pageTitle = 'Learn ' + $scope.topicTitle + ' - Oppia'; PageTitleService.setPageTitle(pageTitle); + $scope.bookImageUrl = UrlInterpolationService.getStaticImageUrl( + '/splash/books.svg'); - $scope.getRowImageUrl = function(index) { + var getImageData = function(index) { var imageKey = 'image_' + index; if (landingPageData[imageKey]) { var imagePath = UrlInterpolationService.interpolateUrl( angular.copy(assetsPathFormat), { subject: $scope.subject, - topic: $scope.topic, - file_name: landingPageData[imageKey] + topic: topic, + file_name: landingPageData[imageKey].file_name }); - return UrlInterpolationService.getStaticImageUrl(imagePath); - } else { - throw Error('page_data does not have ' + imageKey + ' key.'); + return { + src: UrlInterpolationService.getStaticImageUrl(imagePath), + alt: landingPageData[imageKey].alt + }; } }; + $scope.image1 = 
getImageData(1); + $scope.image2 = getImageData(2); + $scope.getVideoUrl = function() { if (landingPageData.video) { var videoPath = UrlInterpolationService.interpolateUrl( angular.copy(assetsPathFormat), { subject: $scope.subject, - topic: $scope.topic, + topic: topic, file_name: landingPageData.video }); return UrlInterpolationService.getStaticVideoUrl(videoPath); @@ -94,14 +149,8 @@ oppia.controller('TopicLandingPage', [ } }; - $scope.getStaticSubjectImageUrl = function(subjectName) { - return UrlInterpolationService.getStaticImageUrl( - '/subjects/' + subjectName + '.svg'); - }; - $scope.onClickGetStartedButton = function() { - var collectionId = ( - TOPIC_LANDING_PAGE_DATA[$scope.subject][$scope.topic].collection_id); + var collectionId = topicData.collection_id; SiteAnalyticsService.registerOpenCollectionFromLandingPageEvent( collectionId); $timeout(function() { diff --git a/core/templates/dev/head/pages/landing-pages/topic-landing-page/topic-landing-page.mainpage.html b/core/templates/dev/head/pages/landing-pages/topic-landing-page/topic-landing-page.mainpage.html index 3d9162c73f56b..20e8fa0ef8322 100644 --- a/core/templates/dev/head/pages/landing-pages/topic-landing-page/topic-landing-page.mainpage.html +++ b/core/templates/dev/head/pages/landing-pages/topic-landing-page/topic-landing-page.mainpage.html @@ -1,135 +1,39 @@ {% extends 'dist/base.html' %} {% block content %} - <div ng-controller="TopicLandingPage"> + <div ng-controller="TopicLandingPage" class="oppia-landing-page"> <div class="oppia-landing-section text-center" style="background-color: #afd2eb"> <div class="oppia-landing-section-inner"> <div class="container-fluid"> - <div class="row"> - <div class="col-sm-6 col-sm-push-6 oppia-landing-image-div" style="height: auto"> - <img ng-src="<[getRowImageUrl(1)]>" class="oppia-landing-image" style="width: 60%;" alt=""> + <background-banner class="oppia-landing-background-image"></background-banner> + <div class="row oppia-landing-section-row"> + <div 
class="col-sm-6 col-sm-push-6 oppia-landing-image-div"> + <img ng-src="<[image1.src]>" class="oppia-landing-image" alt="<[image1.alt]>"> </div> - <div class="col-sm-6 col-sm-pull-6" style="z-index: 20"> - <div class="oppia-landing-text-box-0"> - <h1 class="oppia-landing-h1"><[topic | capitalize]> just got easier</h1> - <h2 class="oppia-landing-h2" style="padding-right: 15px;">Get your students and kids started with our free, effective <[subject]> lessons.</h2> + <div class="col-sm-6 col-sm-pull-6"> + <div class="oppia-landing-text-box-1"> + <h1 class="oppia-landing-h1 oppia-text-color-green"><[topicTitle]> just got easier</h1> + <h2 class="oppia-landing-h2 oppia-text-color-green">Get your students and kids started with our free, effective <[subject]> lessons.</h2> </div> - <button class="btn oppia-landing-get-started" ng-click="onClickGetStartedButton('teacher')">Get Started</button> - <button class="btn oppia-landing-learn-more" ng-click="onClickLearnMoreButton()">Learn More</button> + <button class="btn oppia-landing-page-button" ng-click="onClickGetStartedButton('teacher')">Get Started</button> + <button class="btn oppia-landing-page-button oppia-make-button-transparent" ng-click="onClickLearnMoreButton()">Learn More</button> </div> </div> </div> - <div style="position: absolute; top: 115px"> - <div class="oppia-landing-background-icon-row" style="margin-top: 20px"> - <img ng-src="<[getStaticSubjectImageUrl('Humor')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Combinatorics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Cooking')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Government')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Architecture')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('History')]>" 
class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Microbiology')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Engineering')]>" class="oppia-landing-background-icon" alt=""> - - <img ng-src="<[getStaticSubjectImageUrl('Algorithms')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Economics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Computing')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Reading')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Art')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Creativity')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Physics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Language')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Arithmetic')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Chess')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Astronomy')]>" class="oppia-landing-background-icon" alt=""> - - <img ng-src="<[getStaticSubjectImageUrl('Religion')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Mathematics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Philosophy')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Humor')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Combinatorics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Cooking')]>" 
class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Government')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Architecture')]>" class="oppia-landing-background-icon" alt=""> - </div> - - <div class="oppia-landing-background-icon-row"> - <img ng-src="<[getStaticSubjectImageUrl('Genetics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Space')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Algebra')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Music')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Chemistry')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Poetry')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Puzzles')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Calculus')]>" class="oppia-landing-background-icon" alt=""> - - <img ng-src="<[getStaticSubjectImageUrl('Business')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Geography')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Biology')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Genetics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Space')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Algebra')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Music')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Chemistry')]>" class="oppia-landing-background-icon" alt=""> - <img 
ng-src="<[getStaticSubjectImageUrl('Poetry')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Puzzles')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Calculus')]>" class="oppia-landing-background-icon" alt=""> - - <img ng-src="<[getStaticSubjectImageUrl('Business')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Geography')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Biology')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Genetics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Space')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Algebra')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Music')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Chemistry')]>" class="oppia-landing-background-icon" alt=""> - </div> - - <div class="oppia-landing-background-icon-row"> - <img ng-src="<[getStaticSubjectImageUrl('Economics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Algorithms')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Creativity')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Astronomy')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Chess')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Arithmetic')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Language')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Physics')]>" class="oppia-landing-background-icon" 
alt=""> - - <img ng-src="<[getStaticSubjectImageUrl('Combinatorics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Humor')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Philosophy')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Mathematics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Religion')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Cooking')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Engineering')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Microbiology')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('History')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Architecture')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Government')]>" class="oppia-landing-background-icon" alt=""> - - <img ng-src="<[getStaticSubjectImageUrl('Art')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Reading')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Computing')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Economics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Algorithms')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Creativity')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Astronomy')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Chess')]>" class="oppia-landing-background-icon" alt=""> - </div> - 
</div> </div> </div> <div class="oppia-landing-section text-center" style="background-color: #e8e7e3"> <div class="oppia-landing-section-inner"> <div class="container-fluid"> - <div class="row"> + <div class="row oppia-landing-section-row"> <div class="col-sm-6 col-sm-push-6 oppia-landing-image-div"> - <img ng-src="<[getRowImageUrl(2)]>" class="oppia-landing-image" alt=""> + <img ng-src="<[image2.src]>" class="oppia-landing-image" alt="<[image2.alt]>"> </div> <div class="col-sm-6 col-sm-pull-6"> <div class="oppia-landing-text-box-1"> - <h1 class="oppia-landing-h1" style="color: #242424">Fun storytelling for all</h1> - <h2 class="oppia-landing-h2" style="color: #242424">Students are guided through explorations with targeted feedback and immersive storytelling. + <h1 class="oppia-landing-h1">Fun storytelling for all</h1> + <h2 class="oppia-landing-h2">Students are guided through explorations with targeted feedback and immersive storytelling. <br> <br> Oppia guides students step-by-step with helpful hints, so they can complete the lessons on their own. @@ -144,14 +48,10 @@ <h2 class="oppia-landing-h2" style="color: #242424">Students are guided through <div class="oppia-landing-section text-center" style="background-color: #429488; "> <div class="oppia-landing-section-inner" style="margin-bottom: 2%;"> <div class="container-fluid"> - <div class="row"> + <div class="row oppia-landing-section-row"> <div class="col-sm-6 oppia-landing-image-div"> <div class="oppia-landing-video-frame"> - <video controls type="video/mp4" ng-src="<[getVideoUrl()]>" class="oppia-landing-image - oppia-landing-image-desktop" onclick="this.paused ? this.play() : this.pause();">Sorry, your browser doesn't support embedded videos. - </video> - <video controls type="video/mp4" ng-src="<[getVideoUrl()]>" class="oppia-landing-image - oppia-landing-image-mobile" onclick="this.paused ? this.play() : this.pause();">Sorry, your browser doesn't support embedded videos. 
+ <video controls type="video/mp4" ng-src="<[getVideoUrl()]>" onclick="this.paused ? this.play() : this.pause();">Sorry, your browser doesn't support embedded videos. </video> </div> </div> @@ -168,82 +68,62 @@ <h2 class="oppia-landing-h2" style="color: #FFFFFF;">By working through lessons </div> </div> - <div class="oppia-landing-section text-center" style="background-color: #afd2eb"> + <div class="oppia-landing-section text-center" style="background-color: #e8e7e3"> <div class="oppia-landing-section-inner"> - <h1 class="oppia-landing-h1" style="margin-left: 10%; margin-right: 10%; padding-left: 0px; padding-bottom: 10px; text-align: center;white-space: pre-line;">Imagine what your students could learn today!</h1> - <button class="btn oppia-landing-get-started oppia-landing-get-started-mobile" style="margin-top: 2%;" ng-click="onClickGetStartedButton('teacher')">Get Started</button> - <h2 class="oppia-landing-h2 oppia-landing-centered-h2 library-text" style="margin-top: 2.5%; text-align: center; width: 55%;">To see high quality lessons on subjects other than <[topic | capitalize]>, visit our Library.</h2> - <button class="btn oppia-landing-explore-lessons oppia-landing-explore-lessons-mobile" ng-click="onClickExploreLessonsButton()">Explore Lessons</button> - <div style="position: absolute; top: 40%"> - <div class="oppia-landing-background-icon-row"> - <img ng-src="<[getStaticSubjectImageUrl('Humor')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Combinatorics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Cooking')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Government')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Architecture')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('History')]>" class="oppia-landing-background-icon" alt=""> - 
<img ng-src="<[getStaticSubjectImageUrl('Microbiology')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Engineering')]>" class="oppia-landing-background-icon" alt=""> - - <img ng-src="<[getStaticSubjectImageUrl('Algorithms')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Economics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Computing')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Reading')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Art')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Creativity')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Physics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Language')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Arithmetic')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Chess')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Astronomy')]>" class="oppia-landing-background-icon" alt=""> - - <img ng-src="<[getStaticSubjectImageUrl('Religion')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Mathematics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Philosophy')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Humor')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Combinatorics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Cooking')]>" class="oppia-landing-background-icon" alt=""> - <img 
ng-src="<[getStaticSubjectImageUrl('Government')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Architecture')]>" class="oppia-landing-background-icon" alt=""> + <div class="container-fluid"> + <div class="row oppia-landing-section-row"> + <div class="col-sm-6 col-sm-push-6 oppia-landing-image-div"> + <img ng-src="<[bookImageUrl]>" class="oppia-landing-image" alt=""> + </div> + <div class="col-sm-6 col-sm-pull-6"> + <div class="oppia-landing-text-box-1"> + <h1 class="oppia-landing-h1 text-center">Topics covered in this lesson</h1> + <br> + <h2 class="oppia-landing-h2 oppia-lessons-title oppia-text-color-green" ng-repeat="lessonTitle in lessons"><[lessonTitle]></h2> + <h2 class="oppia-landing-h2 oppia-lessons-title oppia-text-color-black">... and more!</h2> + </div> + </div> </div> + </div> + </div> + </div> - <div class="oppia-landing-background-icon-row"> - <img ng-src="<[getStaticSubjectImageUrl('Genetics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Space')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Algebra')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Music')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Chemistry')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Poetry')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Puzzles')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Calculus')]>" class="oppia-landing-background-icon" alt=""> - - <img ng-src="<[getStaticSubjectImageUrl('Business')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Geography')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Biology')]>" 
class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Genetics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Space')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Algebra')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Music')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Chemistry')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Poetry')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Puzzles')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Calculus')]>" class="oppia-landing-background-icon" alt=""> - - <img ng-src="<[getStaticSubjectImageUrl('Business')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Geography')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Biology')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Genetics')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Space')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Algebra')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Music')]>" class="oppia-landing-background-icon" alt=""> - <img ng-src="<[getStaticSubjectImageUrl('Chemistry')]>" class="oppia-landing-background-icon" alt=""> + <div class="oppia-landing-section text-center" style="background-color: #afd2eb"> + <div class="oppia-landing-section-inner"> + <div class="container-fluid"> + <background-banner class="oppia-landing-background-image"></background-banner> + <div class="row oppia-landing-section-row"> + <div class="col-sm-12"> + <h1 
class="oppia-landing-h1 oppia-text-centered oppia-text-color-green" style="padding: 0">Imagine what your students could learn today!</h1> + </div> + </div> + <div class="row oppia-landing-section-row"> + <div class="col-sm-12"> + <button class="btn oppia-landing-page-button" ng-click="onClickGetStartedButton('teacher')">Get Started</button> + </div> + </div> + <div class="row oppia-landing-section-row"> + <div class="col-sm-12"> + <h2 class="oppia-landing-h2 oppia-text-centered oppia-text-color-green"> + To see high quality lessons on subjects other than <[topicTitle]>, visit our Library. + </h2> + </div> + </div> + <div class="row oppia-landing-section-row oppia-text-color-green"> + <div class="col-sm-12"> + <button class="btn oppia-landing-page-button" ng-click="onClickExploreLessonsButton()">Explore Lessons</button> + </div> </div> </div> </div> + </div> </div> <style> + .oppia-landing-page h1, h2, button { + font-family: "Capriola", "Roboto", Arial, sans-serif; + } .oppia-landing-section { height: auto; margin-left: auto; @@ -251,87 +131,72 @@ <h2 class="oppia-landing-h2 oppia-landing-centered-h2 library-text" style="margi overflow: hidden; position: relative; } + .oppia-landing-section-row { + display: flex; + align-items: center; + } + .oppia-landing-background-image { + zoom: 2; + } .oppia-landing-h1 { font-size: 2.8vw; margin: 0; text-align: left; - top: 160px; white-space: pre; } - .oppia-landing-section-inner { - height: auto; - margin-left: auto; - margin-right: auto; - padding-bottom: 3%; - padding-top: 8%; - position: relative; - } .oppia-landing-h2 { font-size: 1.95vmax; line-height: 1.6em; margin-top: 5%; - position: relative; text-align: left; width: 80%; - z-index: 20; } - .oppia-landing-h1, .oppia-landing-h2 { - color: #005c5e; - font-family: "Capriola", "Roboto", Arial, sans-serif; + .oppia-text-centered { + padding: 0 25%; + margin: 2% 0; + text-align: center; + width: 100%; } - - .oppia-landing-learn-more, - .oppia-landing-get-started, - 
   .oppia-landing-explore-lessons {
-    border-radius: 0;
-    font-family: "Capriola", "Roboto", Arial, sans-serif;
-    font-size: 1.9vmax;
-    height: 56px;
-    position: relative;
+  .oppia-landing-section-inner {
+    height: auto;
+    margin-left: auto;
+    margin-right: auto;
+    padding-bottom: 8%;
+    padding-top: 6%;
+  }
+  .oppia-lessons-title {
+    margin: 10px;
     text-align: center;
-    text-transform: uppercase;
-    width: 38%;
-    z-index: 20;
+    width: 100%;
   }
-  .oppia-landing-get-started,
-  .oppia-landing-explore-lessons {
+  .oppia-landing-page-button {
     background-color: #015c53;
+    border-radius: 0;
+    border: 4px solid #265a53;
     color: #fff;
-    left: 0;
+    font-size: 1.9vmax;
     margin-right: 15px;
+    text-transform: uppercase;
+    width: 40%;
   }
-  .oppia-landing-learn-more {
+  .oppia-make-button-transparent {
     background-color: transparent;
-    border: 4px solid #265a53;
-    box-sizing: border-box;
     color: #265a53;
-    margin-right: 15px;
   }
-  .oppia-landing-get-started:hover,
-  .oppia-landing-get-started:focus,
-  .oppia-landing-get-started:active,
-  .oppia-landing-explore-lessons:hover,
-  .oppia-landing-explore-lessons:focus,
-  .oppia-landing-explore-lessons:active {
+  .oppia-landing-page-button:hover,
+  .oppia-landing-page-button:focus,
+  .oppia-landing-page-button:active {
     background-color: #05beb2;
+    border-color: #05beb2;
     color: #fff;
   }
-  .oppia-landing-learn-more:hover,
-  .oppia-landing-learn-more:focus,
-  .oppia-landing-learn-more:active {
+  .oppia-make-button-transparent:hover,
+  .oppia-make-button-transparent:focus,
+  .oppia-make-button-transparent:active {
+    background-color: transparent;
     border-color: #05beb2;
     color: #05beb2;
   }
-  .oppia-landing-centered-h2 {
-    left: 0px;
-    margin-left: 25%;
-    margin-right: 25%;
-    padding-left: 0px;
-  }
-  .library-text {
-    margin-left: 22%;
-  }
-  .oppia-landing-text-box-0,
   .oppia-landing-text-box-1,
   .oppia-landing-text-box-2 {
     margin-left: 60px;
@@ -341,35 +206,15 @@ <h2 class="oppia-landing-h2 oppia-landing-centered-h2 library-text" style="margi
     margin-top: 100px;
     padding-right: 40px;
   }
-  .oppia-landing-background-icon-row {
-    margin-top: 0;
-    margin-bottom: 0;
-    margin-left: -webkit-calc((100% - 2700px) / 2);
-    margin-left: -moz-calc((100% - 2700px) / 2);
-    margin-left: -o-calc((100% - 2700px) / 2);
-    margin-left: calc((100% - 2700px) / 2);
-    margin-right: -webkit-calc((100% - 2700px) / 2);
-    margin-right: -moz-calc((100% - 2700px) / 2);
-    margin-right: -o-calc((100% - 2700px) / 2);
-    margin-right: calc((100% - 2700px) / 2);
-    opacity: 0.4;
-    position: relative;
-    text-align: center;
-    width: 2700px;
-  }
-  .oppia-landing-background-icon {
-    margin: -1px;
-    max-width: 96px;
-    width: 10%;
-  }
   .oppia-landing-image {
     max-width: 100%;
-    max-height: 500px;
-    position: relative;
-    z-index: 20;
+    max-height: 400px;
   }
-  .oppia-landing-image-mobile {
-    display: none;
+  .oppia-text-color-black {
+    color: #242424;
+  }
+  .oppia-text-color-green {
+    color: #005c5e;
   }
   .oppia-landing-video-frame {
     border: 11px black solid;
@@ -377,7 +222,7 @@ <h2 class="oppia-landing-h2 oppia-landing-centered-h2 library-text" style="margi
     border-bottom-width: 50px;
     border-radius: 36px;
     background: #000000;
-    height: 600px;
+    min-height: 500px;
     margin: auto;
     position: relative;
     width: 270px;
@@ -420,60 +265,43 @@ <h2 class="oppia-landing-h2 oppia-landing-centered-h2 library-text" style="margi
   .oppia-landing-section {
     height: auto;
   }
+  .oppia-landing-section-row {
+    display: block;
+  }
   .oppia-landing-h1 {
     font-size: 1.6em;
-    margin-left: 10%;
-    margin-right: 20%;
-    padding-left: 0;
-    padding-right: 0;
     padding-top: 40px;
     text-align: center;
-    width: 80%;
+    white-space: pre-line;
   }
   .oppia-landing-h2 {
     font-size: 1em;
-    left: 10%;
+    padding: 0 10%;
     text-align: center;
-    width: 80%;
+    width: 100%;
+  }
+  .oppia-landing-background-image {
+    zoom: 1.3;
+  }
+  .oppia-landing-image {
+    max-height: 250px;
   }
-  .oppia-landing-text-box-0,
   .oppia-landing-text-box-1,
   .oppia-landing-text-box-2 {
     margin-left: auto;
   }
-  .oppia-landing-learn-more,
-  .oppia-landing-get-started,
-  .oppia-landing-explore-lessons {
+  .oppia-landing-page-button {
     font-size: 2.3vmax;
     margin-bottom: 30px;
     width: auto;
   }
-  .oppia-landing-get-started-mobile {
-    left: 9.5px;
-  }
-  .oppia-landing-explore-lessons-mobile {
-    left: 10.5px;
-  }
-  .oppia-landing-image-mobile {
-    display: block;
-    margin: auto;
-  }
-  .oppia-landing-image-desktop {
-    display: none;
-  }
-  .oppia-landing-centered-h2 {
-    left: 0;
-    margin-right: 10%;
-  }
   .oppia-landing-text-box-2 {
     margin-top: 0;
     padding-right: inherit;
   }
 }
 @media screen and (max-width: 320px) {
-  .oppia-landing-learn-more,
-  .oppia-landing-get-started,
-  .oppia-landing-explore-lessons {
+  .oppia-landing-page-button {
     font-size: 2.8vmax;
   }
   .oppia-landing-h1 {
diff --git a/feconf.py b/feconf.py
index 4542194789af3..34a549dbc8e87 100644
--- a/feconf.py
+++ b/feconf.py
@@ -951,5 +951,5 @@ def get_empty_ratings():
 # oppia constant defined in
 # core/templates/dev/head/pages/landing-pages/TopicLandingPage.js file.
 AVAILABLE_LANDING_PAGES = {
-    'maths': ['fractions', 'ratios']
+    'maths': ['fractions', 'negative-numbers', 'ratios']
 }
networkx__networkx-4326
Use a UTF-8-friendly LaTeX backend
The current sphinx configuration in docs/conf.py defaults to pdflatex. This is causing problems on #4169, which introduces API-level doctests with unicode characters in them. I tried several iterations of lualatex and xelatex to try to get it to work, but latex errors are never the most helpful. I will open a PR to resolve this shortly.
[ { "content": "from datetime import date\nfrom sphinx_gallery.sorting import ExplicitOrder\nimport sphinx_rtd_theme\nfrom warnings import filterwarnings\n\nfilterwarnings(\n \"ignore\", message=\"Matplotlib is currently using agg\", category=UserWarning\n)\n\n# General configuration\n# ---------------------\n...
[ { "content": "from datetime import date\nfrom sphinx_gallery.sorting import ExplicitOrder\nimport sphinx_rtd_theme\nfrom warnings import filterwarnings\n\nfilterwarnings(\n \"ignore\", message=\"Matplotlib is currently using agg\", category=UserWarning\n)\n\n# General configuration\n# ---------------------\n...
diff --git a/.circleci/config.yml b/.circleci/config.yml
index 8ba6d421622..a7ba4d54654 100644
--- a/.circleci/config.yml
+++ b/.circleci/config.yml
@@ -23,7 +23,7 @@ jobs:
       - run:
           name: Install TeX
           command: |
-            sudo apt-get install texlive texlive-latex-extra latexmk
+            sudo apt-get install texlive texlive-latex-extra latexmk texlive-xetex fonts-freefont-otf xindy

       - run:
           name: Install cartopy dependencies
diff --git a/.github/workflows/deploy-docs.yml b/.github/workflows/deploy-docs.yml
index ae24a8f8b03..8abffa754ad 100644
--- a/.github/workflows/deploy-docs.yml
+++ b/.github/workflows/deploy-docs.yml
@@ -21,7 +21,8 @@ jobs:
         run: |
           sudo apt-get update
           sudo apt-get install libgdal-dev graphviz graphviz-dev
-          sudo apt-get install texlive texlive-latex-extra latexmk
+          sudo apt-get install texlive texlive-latex-extra latexmk texlive-xetex
+          sudo apt-get install fonts-freefont-otf xindy
          sudo apt-get install libgeos-dev libproj-dev
          sudo apt-get install libspatialindex-dev
          python3 -m venv ~/venv
diff --git a/.travis.yml b/.travis.yml
index e7bb6ba6cd8..ff15808ae8a 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -20,6 +20,9 @@ matrix:
           - texlive
           - texlive-latex-extra
           - latexmk
+          - texlive-xetex
+          - fonts-freefont-otf
+          - xindy
           - libgeos-dev
           - libproj-dev
           - libspatialindex-dev
diff --git a/doc/conf.py b/doc/conf.py
index b58f9aaea6d..12998e33980 100644
--- a/doc/conf.py
+++ b/doc/conf.py
@@ -170,6 +170,8 @@
 # Options for LaTeX output
 # ------------------------

+# Use a latex engine that allows for unicode characters in docstrings
+latex_engine = "xelatex"
 # The paper size ('letter' or 'a4').
 latex_paper_size = "letter"
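The doc/conf.py change above comes down to one Sphinx setting: selecting a LaTeX engine that reads UTF-8 input natively. A minimal conf.py fragment (only the relevant line; everything else in a real conf.py is omitted) would be:

```python
# doc/conf.py -- Sphinx configuration (fragment)

# pdflatex (the Sphinx default) chokes on many unicode characters that can
# appear in docstrings and doctests; xelatex handles UTF-8 source natively.
latex_engine = "xelatex"
```

The CI changes then install the packages that engine needs (texlive-xetex, an OpenType fallback font, and xindy for indexing).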
sosreport__sos-3483
Obtain CNI files for containerd
Containerd uses the CNI configuration present in the folders defined by its configuration:

```
[plugins."io.containerd.grpc.v1.cri".cni]
  conf_dir = "/etc/cni/net.d
```

It would be very useful to obtain the CNI configurations present in that folder for debugging networking-related problems.

https://github.com/sosreport/sos/blob/b94ced8370824bd62f3c7573ae33fcb96c5da531/sos/report/plugins/containerd.py#L12-L28
[ { "content": "# This file is part of the sos project: https://github.com/sosreport/sos\n#\n# This copyrighted material is made available to anyone wishing to use,\n# modify, copy, or redistribute it subject to the terms and conditions of\n# version 2 of the GNU General Public License.\n#\n# See the LICENSE file...
[ { "content": "# This file is part of the sos project: https://github.com/sosreport/sos\n#\n# This copyrighted material is made available to anyone wishing to use,\n# modify, copy, or redistribute it subject to the terms and conditions of\n# version 2 of the GNU General Public License.\n#\n# See the LICENSE file...
diff --git a/sos/report/plugins/containerd.py b/sos/report/plugins/containerd.py
index 33231da1ab..ecba988721 100644
--- a/sos/report/plugins/containerd.py
+++ b/sos/report/plugins/containerd.py
@@ -19,6 +19,7 @@ class Containerd(Plugin, RedHatPlugin, UbuntuPlugin, CosPlugin):
     def setup(self):
         self.add_copy_spec([
             "/etc/containerd/",
+            "/etc/cni/net.d/",
         ])

         self.add_cmd_output('containerd config dump')
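Outside of sos, the effect of the new copy spec can be sketched with plain Python: collect every file in the `conf_dir` that containerd points at. The function name and default path below are illustrative (the default CNI directory from the issue), not read from containerd's real config:

```python
import glob
import os


def collect_cni_configs(conf_dir="/etc/cni/net.d"):
    """Gather the CNI configuration files a debugger would want to inspect."""
    # CNI plugins read *.conf, *.conflist and *.json files from this directory;
    # for a support bundle we simply grab everything that is a regular file.
    return sorted(
        path
        for path in glob.glob(os.path.join(conf_dir, "*"))
        if os.path.isfile(path)
    )
```

`add_copy_spec` in the actual plugin does the equivalent recursively and preserves the files in the report archive.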
bokeh__bokeh-4542
clustering app example needs updates for recent changes
Fails because `theme.yaml` tries to set `title_text_font_size` on `Plot`. This bypasses the (python) property that deprecates this former `Plot` property, and tries to set a (Bokeh) property with that name directly on the plot. This fails because of the work to make `Title` its own model. Will fix up the `theme.yaml` and note this problem in the migration guide. Since we barely demonstrated and never discussed theming, hopefully this will not bite many people at all.
[ { "content": "import numpy as np\nnp.random.seed(0)\n\nfrom bokeh.io import curdoc\nfrom bokeh.models import ColumnDataSource, VBox, HBox, Select, Slider\nfrom bokeh.plotting import Figure\nfrom bokeh.palettes import Spectral6\n\nfrom sklearn import cluster, datasets\nfrom sklearn.neighbors import kneighbors_gr...
[ { "content": "import numpy as np\nnp.random.seed(0)\n\nfrom bokeh.io import curdoc\nfrom bokeh.models import ColumnDataSource, VBox, HBox, Select, Slider\nfrom bokeh.plotting import Figure\nfrom bokeh.palettes import Spectral6\n\nfrom sklearn import cluster, datasets\nfrom sklearn.neighbors import kneighbors_gr...
diff --git a/examples/app/clustering/main.py b/examples/app/clustering/main.py
index 1e4ad5e663d..360fb62937a 100644
--- a/examples/app/clustering/main.py
+++ b/examples/app/clustering/main.py
@@ -153,7 +153,7 @@ def update_algorithm_or_clusters(attrname, old, new):
     source.data['x'] = X[:, 0]
     source.data['y'] = X[:, 1]

-    plot.title = algorithm
+    plot.title.text = algorithm

 def update_samples_or_dataset(attrname, old, new):
     global X, y
diff --git a/examples/app/clustering/theme.yaml b/examples/app/clustering/theme.yaml
index 804befe1d52..ab939448def 100644
--- a/examples/app/clustering/theme.yaml
+++ b/examples/app/clustering/theme.yaml
@@ -3,9 +3,11 @@ attrs:
     Figure:
         plot_width: 400
         plot_height: 400
-        title_text_font_size: '10pt'
         background_fill_color: 'lightgrey'
         background_fill_alpha: 0.2

     Grid:
         grid_line_color: null
+
+    Title:
+        text_font_size: '10pt'
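The main.py change reflects `Title` becoming its own model: the plot's title is now an object, so the app updates its `.text` attribute instead of assigning a bare string. A tiny bokeh-free mock (these classes are stand-ins for illustration, not the real bokeh API) shows why the distinction matters:

```python
class Title:
    """Stand-in for a Title model: the title is an object, not a string."""
    def __init__(self, text=""):
        self.text = text


class Plot:
    def __init__(self):
        self.title = Title("original")


# Old style: this silently replaces the whole Title model with a bare string,
# so anything else configured on the title (e.g. font size) is lost.
plot = Plot()
plot.title = "KMeans"
assert isinstance(plot.title, str)

# New style: keep the model, update only its text.
plot = Plot()
plot.title.text = "KMeans"
assert isinstance(plot.title, Title) and plot.title.text == "KMeans"
```

The theme.yaml change is the same migration on the styling side: title properties now live under a `Title` model rather than on `Figure`.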
pytorch__tnt-85
PyTorch 0.4 test errors
Need to fix these:
```
..........
----------------------------------------------------------------------
Ran 10 tests in 0.015s

OK
E.../Users/szagoruyko/anaconda3/lib/python3.6/site-packages/numpy/core/_methods.py:135: RuntimeWarning: Degrees of freedom <= 0 for slice
  keepdims=keepdims)
/Users/szagoruyko/anaconda3/lib/python3.6/site-packages/numpy/core/_methods.py:127: RuntimeWarning: invalid value encountered in double_scalars
  ret = ret.dtype.type(ret / rcount)
.....E
======================================================================
ERROR: testAPMeter (__main__.TestMeters)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "test_meters.py", line 208, in testAPMeter
    ap = mtr.value()
  File "/Users/szagoruyko/anaconda3/lib/python3.6/site-packages/torchnet/meter/apmeter.py", line 137, in value
    ap[k] = precision[truth.byte()].sum() / max(truth.sum(), 1)
RuntimeError: Expected object of type torch.FloatTensor but found type torch.LongTensor for argument #2 'other'
======================================================================
ERROR: testmAPMeter (__main__.TestMeters)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "test_meters.py", line 329, in testmAPMeter
    ap = mtr.value()
  File "/Users/szagoruyko/anaconda3/lib/python3.6/site-packages/torchnet/meter/mapmeter.py", line 30, in value
    return self.apmeter.value().mean()
  File "/Users/szagoruyko/anaconda3/lib/python3.6/site-packages/torchnet/meter/apmeter.py", line 137, in value
    ap[k] = precision[truth.byte()].sum() / max(truth.sum(), 1)
RuntimeError: Expected object of type torch.FloatTensor but found type torch.LongTensor for argument #2 'other'
----------------------------------------------------------------------
Ran 10 tests in 0.118s

FAILED (errors=2)
```
[ { "content": "import math\nfrom . import meter\nimport torch\n\n\nclass APMeter(meter.Meter):\n \"\"\"\n The APMeter measures the average precision per class.\n\n The APMeter is designed to operate on `NxK` Tensors `output` and\n `target`, and optionally a `Nx1` Tensor weight where (1) the `output`\...
[ { "content": "import math\nfrom . import meter\nimport torch\n\n\nclass APMeter(meter.Meter):\n \"\"\"\n The APMeter measures the average precision per class.\n\n The APMeter is designed to operate on `NxK` Tensors `output` and\n `target`, and optionally a `Nx1` Tensor weight where (1) the `output`\...
diff --git a/torchnet/meter/apmeter.py b/torchnet/meter/apmeter.py
index 5058e29e43..57991d1241 100644
--- a/torchnet/meter/apmeter.py
+++ b/torchnet/meter/apmeter.py
@@ -134,5 +134,5 @@ def value(self):
             precision = tp.div(rg)

             # compute average precision
-            ap[k] = precision[truth.byte()].sum() / max(truth.sum(), 1)
+            ap[k] = precision[truth.byte()].sum() / max(float(truth.sum()), 1)
         return ap
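The fix casts the divisor to a Python float, so the division no longer mixes a FloatTensor numerator with a LongTensor divisor (in PyTorch 0.4, `truth.sum()` returns a 0-dim LongTensor rather than a plain int, which triggers the type error in the log above). The arithmetic itself can be sketched without torch, with plain lists standing in for the per-rank tensors (hypothetical data, not torchnet code):

```python
def average_precision(precision, truth):
    """precision: per-rank precision values; truth: 0/1 relevance flag per rank."""
    relevant = [p for p, t in zip(precision, truth) if t]
    # Cast the count to float before max(), mirroring the one-line fix above;
    # max(..., 1) also guards classes that have no positive examples.
    return sum(relevant) / max(float(sum(truth)), 1)


# Two relevant items ranked 1st and 3rd: precision at those ranks is 1.0 and 2/3.
ap = average_precision([1.0, 0.5, 2.0 / 3.0], [1, 0, 1])
```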
conda__conda-3257
Zsh.exe not supported on MSYS2
The following error is reported in a MSYS2 zsh shell:
```
➜ dotfiles git:(master) ✗ source activate py35_32
Traceback (most recent call last):
  File "C:\Miniconda3\Scripts\conda-script.py", line 5, in <module>
    sys.exit(main())
  File "C:\Miniconda3\lib\site-packages\conda\cli\main.py", line 48, in main
    activate.main()
  File "C:\Miniconda3\lib\site-packages\conda\cli\activate.py", line 105, in main
    shelldict = shells[shell]
KeyError: 'zsh.exe'
```
[ { "content": "from __future__ import print_function, division, absolute_import\n\nimport collections\nimport errno\nimport hashlib\nimport logging\nimport os\nimport re\nimport sys\nimport time\nimport threading\nfrom functools import partial\nfrom os.path import isdir, join, basename, exists\n# conda build imp...
[ { "content": "from __future__ import print_function, division, absolute_import\n\nimport collections\nimport errno\nimport hashlib\nimport logging\nimport os\nimport re\nimport sys\nimport time\nimport threading\nfrom functools import partial\nfrom os.path import isdir, join, basename, exists\n# conda build imp...
diff --git a/conda/utils.py b/conda/utils.py
index 90b328b85ba..9c8a6f82859 100644
--- a/conda/utils.py
+++ b/conda/utils.py
@@ -313,6 +313,12 @@ def human_bytes(n):
         "sh.exe": dict(
             msys2_shell_base, exe="sh.exe",
         ),
+        "zsh.exe": dict(
+            msys2_shell_base, exe="zsh.exe",
+        ),
+        "zsh": dict(
+            msys2_shell_base, exe="zsh",
+        ),
     }

 else:
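The traceback comes from a plain dict lookup (`shelldict = shells[shell]`), so any shell name missing from the registry raises `KeyError`. A stripped-down sketch of that registry and the fix (the dict contents here are illustrative, not conda's full shell table):

```python
msys2_shell_base = {"path_from": "unix", "sep": "/"}  # illustrative base config

shells = {
    "bash.exe": dict(msys2_shell_base, exe="bash.exe"),
    "sh.exe": dict(msys2_shell_base, exe="sh.exe"),
    # The fix: register zsh under both its Windows-style and bare names.
    "zsh.exe": dict(msys2_shell_base, exe="zsh.exe"),
    "zsh": dict(msys2_shell_base, exe="zsh"),
}


def shell_dict(shell_name):
    # Before the fix, shells["zsh.exe"] raised KeyError right here.
    return shells[shell_name]
```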
dask__dask-533
ProgressBar is not visible in the notebook
The `ProgressBar` doesn't update itself during execution while in the notebook. Afterwards the full bar will pop up, but it doesn't give you any cues during execution.
[ { "content": "from __future__ import division\nimport sys\nimport threading\nimport time\nfrom timeit import default_timer\n\nfrom ..core import istask\nfrom .core import Diagnostic\n\n\ndef format_time(t):\n \"\"\"Format seconds into a human readable form.\n\n >>> format_time(10.4)\n '10.4s'\n >>> ...
[ { "content": "from __future__ import division\nimport sys\nimport threading\nimport time\nfrom timeit import default_timer\n\nfrom ..core import istask\nfrom .core import Diagnostic\n\n\ndef format_time(t):\n \"\"\"Format seconds into a human readable form.\n\n >>> format_time(10.4)\n '10.4s'\n >>> ...
diff --git a/dask/diagnostics/progress.py b/dask/diagnostics/progress.py
index d79f5476300..08a37459bb0 100644
--- a/dask/diagnostics/progress.py
+++ b/dask/diagnostics/progress.py
@@ -54,6 +54,7 @@ def _start(self, dsk, state):

     def _posttask(self, key, value, dsk, state, id):
         self._ndone += 1
+        sys.stdout.flush()

     def _finish(self, dsk, state, errored):
         self._running = False
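The one-line fix pushes the carriage-return-style progress line out of Python's stdout buffer after every task; without the flush, a notebook frontend only sees the output once the buffer drains at the end, which is exactly the symptom reported. The pattern can be shown with a self-contained sketch (not dask's actual `ProgressBar` class; the function name is made up):

```python
import sys


def draw_progress(ndone, ntasks, width=20, file=sys.stdout):
    """Redraw a single-line progress bar; flush so notebooks see each update."""
    frac = ndone / ntasks if ntasks else 1.0
    filled = int(width * frac)
    bar = "#" * filled + "-" * (width - filled)
    # \r rewinds to the start of the line so the bar redraws in place.
    file.write("\r[{}] {:3.0f}%".format(bar, frac * 100))
    file.flush()  # the crucial step: push the update out immediately
```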
cupy__cupy-5225
[info] NumPy/SciPy new version pinning recommendation
See:
- https://github.com/numpy/numpy/pull/18505
- scipy/scipy#12862

The most important takeaway is that NumPy/SciPy now recommend that downstream distributions pin the upper-bound version if NumPy/SciPy are runtime dependencies. (The example: if the latest NumPy out there is 1.20, one should pin to `<1.23`; the notation used in the docs, `<1.xx+3.0`, is a bit confusing; see the clarification in https://github.com/scipy/scipy/pull/12862#discussion_r575790007.)

There are other suggestions too, but I think this is potentially the most impactful one.
[ { "content": "#!/usr/bin/env python\n\nimport glob\nimport os\nfrom setuptools import setup, find_packages\nimport sys\n\nimport cupy_setup_build\n\n\nfor submodule in ('cupy/_core/include/cupy/cub/',\n 'cupy/_core/include/cupy/jitify'):\n if len(os.listdir(submodule)) == 0:\n msg = '...
[ { "content": "#!/usr/bin/env python\n\nimport glob\nimport os\nfrom setuptools import setup, find_packages\nimport sys\n\nimport cupy_setup_build\n\n\nfor submodule in ('cupy/_core/include/cupy/cub/',\n 'cupy/_core/include/cupy/jitify'):\n if len(os.listdir(submodule)) == 0:\n msg = '...
diff --git a/setup.py b/setup.py
index 200fa0da730..9ba1a739d44 100644
--- a/setup.py
+++ b/setup.py
@@ -31,11 +31,11 @@
     ],
     'install': [
-        'numpy>=1.17',
+        'numpy>=1.17,<1.23',  # see #4773
         'fastrlock>=0.5',
     ],
     'all': [
-        'scipy>=1.4',
+        'scipy>=1.4,<1.9',  # see #4773
         'optuna>=2.0',
     ],
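The chosen upper bounds follow the rule quoted in the issue: with the latest release at minor version xx, pin below minor version xx+3 (NumPy's latest was 1.20, hence `<1.23`). A tiny helper makes the arithmetic explicit (the function name is made up for illustration):

```python
def upper_pin(latest_release):
    """Given the latest 'major.minor' release, return the recommended upper bound."""
    major, minor = (int(part) for part in latest_release.split("."))
    # NumPy/SciPy guidance: allow up to, but excluding, three minor releases ahead.
    return "<{}.{}".format(major, minor + 3)


# Latest NumPy 1.20 -> pin 'numpy>=1.17,<1.23', as in the diff above.
```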
feast-dev__feast-3966
Bump the cryptography version to 42
**Is your feature request related to a problem? Please describe.**
The `cryptography<42` package has some medium vulnerabilities. Example: https://scout.docker.com/vulnerabilities/id/CVE-2023-50782?s=github&n=cryptography&t=pypi&vr=%3C42.0.0&utm_source=desktop&utm_medium=ExternalLink

starlette and fastapi had some high vulnerabilities, but those packages were recently bumped and, thanks to that, the vulnerabilities are removed.

**Describe the solution you'd like**
Bump the cryptography package to >=42. Nice to have: bump other compatible packages as well.
[ { "content": "# Copyright 2019 The Feast Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by a...
[ { "content": "# Copyright 2019 The Feast Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by a...
diff --git a/sdk/python/requirements/py3.10-ci-requirements.txt b/sdk/python/requirements/py3.10-ci-requirements.txt
index f20bc05df90..051eee0ad1c 100644
--- a/sdk/python/requirements/py3.10-ci-requirements.txt
+++ b/sdk/python/requirements/py3.10-ci-requirements.txt
@@ -124,7 +124,7 @@ comm==0.2.1
     #   ipywidgets
 coverage[toml]==7.4.1
     # via pytest-cov
-cryptography==41.0.7
+cryptography==42.0.4
     # via
     #   azure-identity
     #   azure-storage-blob
@@ -659,7 +659,7 @@ pymysql==1.1.0
     # via feast (setup.py)
 pyodbc==5.1.0
     # via feast (setup.py)
-pyopenssl==23.3.0
+pyopenssl==24.0.0
     # via snowflake-connector-python
 pyparsing==3.1.1
     # via
@@ -805,7 +805,7 @@ sniffio==1.3.0
     #   httpx
 snowballstemmer==2.2.0
     # via sphinx
-snowflake-connector-python[pandas]==3.7.0
+snowflake-connector-python[pandas]==3.7.1
     # via feast (setup.py)
 sortedcontainers==2.4.0
     # via snowflake-connector-python
diff --git a/sdk/python/requirements/py3.8-ci-requirements.txt b/sdk/python/requirements/py3.8-ci-requirements.txt
index afa43ec2a2b..bb177f2ec22 100644
--- a/sdk/python/requirements/py3.8-ci-requirements.txt
+++ b/sdk/python/requirements/py3.8-ci-requirements.txt
@@ -131,7 +131,7 @@ comm==0.2.1
     #   ipywidgets
 coverage[toml]==7.4.1
     # via pytest-cov
-cryptography==41.0.7
+cryptography==42.0.4
     # via
     #   azure-identity
     #   azure-storage-blob
@@ -680,7 +680,7 @@ pymysql==1.1.0
     # via feast (setup.py)
 pyodbc==5.1.0
     # via feast (setup.py)
-pyopenssl==23.3.0
+pyopenssl==24.0.0
     # via snowflake-connector-python
 pyparsing==3.1.1
     # via
@@ -829,7 +829,7 @@ sniffio==1.3.0
     #   httpx
 snowballstemmer==2.2.0
     # via sphinx
-snowflake-connector-python[pandas]==3.7.0
+snowflake-connector-python[pandas]==3.7.1
     # via feast (setup.py)
 sortedcontainers==2.4.0
     # via snowflake-connector-python
diff --git a/sdk/python/requirements/py3.9-ci-requirements.txt b/sdk/python/requirements/py3.9-ci-requirements.txt
index 6c26f889e27..a4104065bab 100644
--- a/sdk/python/requirements/py3.9-ci-requirements.txt
+++ b/sdk/python/requirements/py3.9-ci-requirements.txt
@@ -124,7 +124,7 @@ comm==0.2.1
     #   ipywidgets
 coverage[toml]==7.4.1
     # via pytest-cov
-cryptography==41.0.7
+cryptography==42.0.4
     # via
     #   azure-identity
     #   azure-storage-blob
@@ -666,7 +666,7 @@ pymysql==1.1.0
     # via feast (setup.py)
 pyodbc==5.1.0
     # via feast (setup.py)
-pyopenssl==23.3.0
+pyopenssl==24.0.0
     # via snowflake-connector-python
 pyparsing==3.1.1
     # via
@@ -814,7 +814,7 @@ sniffio==1.3.0
     #   httpx
 snowballstemmer==2.2.0
     # via sphinx
-snowflake-connector-python[pandas]==3.7.0
+snowflake-connector-python[pandas]==3.7.1
     # via feast (setup.py)
 sortedcontainers==2.4.0
     # via snowflake-connector-python
diff --git a/setup.py b/setup.py
index c14d64557a2..4a19b49f168 100644
--- a/setup.py
+++ b/setup.py
@@ -148,7 +148,7 @@
     [
         "build",
         "virtualenv==20.23.0",
-        "cryptography>=35.0,<42",
+        "cryptography>=35.0,<43",
         "flake8>=6.0.0,<6.1.0",
         "black>=22.6.0,<23",
         "isort>=5,<6",
pytorch__vision-7613
make_grid doesn't use kwargs
### 🐛 Describe the bug
In the `make_grid` function from `torchvision.utils`, `kwargs` is not used:
https://github.com/pytorch/vision/blob/300a90926e88f13abbaf3d8155cdba36aab86ab4/torchvision/utils.py#LL24C1-L33C19
Is this a bug? It's very easy to mistype some argument and not even notice because no exception is raised.

### Versions
PyTorch version: 2.0.0+cpu
Is debug build: False
CUDA used to build PyTorch: None
ROCM used to build PyTorch: N/A

OS: Manjaro Linux (x86_64)
GCC version: (GCC) 12.2.1 20230201
Clang version: 15.0.7
CMake version: Could not collect
Libc version: glibc-2.37

Python version: 3.9.16 (main, Mar 8 2023, 14:00:05) [GCC 11.2.0] (64-bit runtime)
Python platform: Linux-6.1.29-1-MANJARO-x86_64-with-glibc2.37
Is CUDA available: False
CUDA runtime version: No CUDA
CUDA_MODULE_LOADING set to: N/A
GPU models and configuration: No CUDA
Nvidia driver version: No CUDA
cuDNN version: No CUDA
HIP runtime version: N/A
MIOpen runtime version: N/A
Is XNNPACK available: True

CPU:
Architecture: x86_64
CPU op-mode(s): 32-bit, 64-bit
Address sizes: 39 bits physical, 48 bits virtual
Byte Order: Little Endian
CPU(s): 4
On-line CPU(s) list: 0-3
Vendor ID: GenuineIntel
Model name: Intel(R) Core(TM) i7-7500U CPU @ 2.70GHz
CPU family: 6
Model: 142
Thread(s) per core: 2
Core(s) per socket: 2
Socket(s): 1
Stepping: 9
CPU(s) scaling MHz: 54%
CPU max MHz: 3500,0000
CPU min MHz: 400,0000
BogoMIPS: 5802,42
Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm 3dnowprefetch cpuid_fault epb invpcid_single pti ssbd ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid ept_ad fsgsbase tsc_adjust bmi1 avx2 smep bmi2 erms invpcid mpx rdseed adx smap clflushopt intel_pt xsaveopt xsavec xgetbv1 xsaves dtherm ida arat pln pts hwp hwp_notify hwp_act_window hwp_epp md_clear flush_l1d arch_capabilities
Virtualization: VT-x
L1d cache: 64 KiB (2 instances)
L1i cache: 64 KiB (2 instances)
L2 cache: 512 KiB (2 instances)
L3 cache: 4 MiB (1 instance)
NUMA node(s): 1
NUMA node0 CPU(s): 0-3
Vulnerability Itlb multihit: KVM: Mitigation: VMX disabled
Vulnerability L1tf: Mitigation; PTE Inversion; VMX conditional cache flushes, SMT vulnerable
Vulnerability Mds: Mitigation; Clear CPU buffers; SMT vulnerable
Vulnerability Meltdown: Mitigation; PTI
Vulnerability Mmio stale data: Mitigation; Clear CPU buffers; SMT vulnerable
Vulnerability Retbleed: Mitigation; IBRS
Vulnerability Spec store bypass: Mitigation; Speculative Store Bypass disabled via prctl
Vulnerability Spectre v1: Mitigation; usercopy/swapgs barriers and __user pointer sanitization
Vulnerability Spectre v2: Mitigation; IBRS, IBPB conditional, STIBP conditional, RSB filling, PBRSB-eIBRS Not affected
Vulnerability Srbds: Mitigation; Microcode
Vulnerability Tsx async abort: Not affected

Versions of relevant libraries:
[pip3] efficientnet-pytorch==0.7.1
[pip3] mypy-extensions==1.0.0
[pip3] numpy==1.24.1
[pip3] segmentation-models-pytorch==0.3.2
[pip3] torch==2.0.0+cpu
[pip3] torchaudio==2.0.1+cpu
[pip3] torchvision==0.15.1+cpu
[conda] efficientnet-pytorch 0.7.1 pypi_0 pypi
[conda] numpy 1.24.1 pypi_0 pypi
[conda] segmentation-models-pytorch 0.3.2 pypi_0 pypi
[conda] torch 2.0.0+cpu pypi_0 pypi
[conda] torchaudio 2.0.1+cpu pypi_0 pypi
[conda] torchvision 0.15.1+cpu pypi_0 pypi
[ { "content": "import collections\nimport math\nimport pathlib\nimport warnings\nfrom itertools import repeat\nfrom types import FunctionType\nfrom typing import Any, BinaryIO, List, Optional, Tuple, Union\n\nimport numpy as np\nimport torch\nfrom PIL import Image, ImageColor, ImageDraw, ImageFont\n\n__all__ = [...
[ { "content": "import collections\nimport math\nimport pathlib\nimport warnings\nfrom itertools import repeat\nfrom types import FunctionType\nfrom typing import Any, BinaryIO, List, Optional, Tuple, Union\n\nimport numpy as np\nimport torch\nfrom PIL import Image, ImageColor, ImageDraw, ImageFont\n\n__all__ = [...
diff --git a/torchvision/utils.py b/torchvision/utils.py
index bc9d88b2849..1418656a7f2 100644
--- a/torchvision/utils.py
+++ b/torchvision/utils.py
@@ -29,7 +29,6 @@ def make_grid(
     value_range: Optional[Tuple[int, int]] = None,
     scale_each: bool = False,
     pad_value: float = 0.0,
-    **kwargs,
 ) -> torch.Tensor:
     """
     Make a grid of images.
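Dropping the unused `**kwargs` is precisely what makes mistyped arguments loud. A torch-free sketch shows the behavioral difference (simplified stand-in signatures, not the real `make_grid`):

```python
def make_grid_old(tensor, nrow=8, pad_value=0.0, **kwargs):
    # The unused **kwargs silently swallows any mistyped keyword argument.
    return {"nrow": nrow, "pad_value": pad_value}


def make_grid_new(tensor, nrow=8, pad_value=0.0):
    return {"nrow": nrow, "pad_value": pad_value}


# Typo: "n_row" instead of "nrow".
silently_wrong = make_grid_old([], n_row=4)  # no error; nrow silently stays 8

mistake_caught = False
try:
    make_grid_new([], n_row=4)
except TypeError:
    mistake_caught = True  # after the fix, the typo raises immediately
```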
localstack__localstack-1695
Downloading files from localstack S3 generates "Requested Range Not Satisfiable"
Seems related to #1185, but I am copying a tar file (~414MB) up to my localstack S3 instance and trying to download it. Early on in the download (looks like it's roughly 32MB in) the following is generated:
```
+ aws --endpoint-url http://localhost:4572 s3 cp s3://test-bucket/test.tar /tmp/test.copy.tar
download failed: s3://test-bucket/test.tar to ./test.copy.tar An error occurred (416) when calling the GetObject operation: Requested Range Not Satisfiable
```
I am running the following to replicate this:
```
#!/bin/bash
set -x
set -e

# Create Bucket
aws --endpoint-url=http://localhost:4572 s3 mb s3://test-bucket

# Bucket ACL
aws --endpoint-url=http://localhost:4572 s3api put-bucket-acl --bucket test-bucket --acl public-read

# Copy to Bucket
aws --endpoint-url=http://localhost:4572 s3 cp /tmp/test.tar s3://test-bucket/test.tar

# ls bucket
aws --endpoint-url=http://localhost:4572 s3 ls s3://test-bucket/

# Download
aws --endpoint-url http://localhost:4572 s3 cp s3://test-bucket/test.tar /tmp/test.copy.tar
```
Perhaps I am doing something wrong. The command I'm using to copy from s3 works on a real s3 instance, but not with localstack's. As far as I can tell I'm using the latest localstack image installed w/ `pip3 install localstack[all]`.
```
localstack/localstack latest 06af7745282d 18 hours ago 829MB
```
I also don't think I am running out of Memory, Disk, etc. And the localstack instance generates the following error messages:
```
2019-10-06T12:42:12:WARNING:localstack.services.generic_proxy: Connection prematurely closed by client (broken pipe).
----------------------------------------
Exception happened during processing of request from ('172.17.0.1', 48000)
Traceback (most recent call last):
  File "/opt/code/localstack/localstack/services/generic_proxy.py", line 313, in forward
    self.wfile.write(to_bytes(response.content))
  File "/usr/lib/python3.6/socketserver.py", line 803, in write
    self._sock.sendall(b)
BrokenPipeError: [Errno 32] Broken pipe

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.6/socketserver.py", line 654, in process_request_thread
    self.finish_request(request, client_address)
  File "/usr/lib/python3.6/socketserver.py", line 364, in finish_request
    self.RequestHandlerClass(request, client_address, self)
  File "/opt/code/localstack/localstack/services/generic_proxy.py", line 103, in __init__
    BaseHTTPRequestHandler.__init__(self, request, client_address, server)
  File "/usr/lib/python3.6/socketserver.py", line 724, in __init__
    self.handle()
  File "/usr/lib/python3.6/http/server.py", line 418, in handle
    self.handle_one_request()
  File "/usr/lib/python3.6/http/server.py", line 406, in handle_one_request
    method()
  File "/opt/code/localstack/localstack/services/generic_proxy.py", line 128, in do_GET
    self.forward('GET')
  File "/opt/code/localstack/localstack/services/generic_proxy.py", line 330, in forward
    self.end_headers()
  File "/usr/lib/python3.6/http/server.py", line 520, in end_headers
    self.flush_headers()
  File "/usr/lib/python3.6/http/server.py", line 524, in flush_headers
    self.wfile.write(b"".join(self._headers_buffer))
  File "/usr/lib/python3.6/socketserver.py", line 803, in write
    self._sock.sendall(b)
BrokenPipeError: [Errno 32] Broken pipe
----------------------------------------
2019-10-06T12:42:12:WARNING:localstack.services.generic_proxy: Connection prematurely closed by client (broken pipe).
```
(The identical broken-pipe traceback then repeats for client ports 47988, 48006, 47960, 48012 and 47958, each followed by the same `Connection prematurely closed by client (broken pipe)` warning.)
```
Downloading files from localstack S3 generates "Requested Range Not Satisfiable"

Seems related to #1185, but I am copying a tar file (~414MB) up to my localstack S3 instance and trying to download it.
Early on in the download (looks like it's roughly 32MB in) the following is generated:

```
+ aws --endpoint-url http://localhost:4572 s3 cp s3://test-bucket/test.tar /tmp/test.copy.tar
download failed: s3://test-bucket/test.tar to ./test.copy.tar An error occurred (416) when calling the GetObject operation: Requested Range Not Satisfiable
```

I am running the following to replicate this:

```
#!/bin/bash
set -x
set -e

# Create Bucket
aws --endpoint-url=http://localhost:4572 s3 mb s3://test-bucket

# Bucket ACL
aws --endpoint-url=http://localhost:4572 s3api put-bucket-acl --bucket test-bucket --acl public-read

# Copy to Bucket
aws --endpoint-url=http://localhost:4572 s3 cp /tmp/test.tar s3://test-bucket/test.tar

# ls bucket
aws --endpoint-url=http://localhost:4572 s3 ls s3://test-bucket/

# Download
aws --endpoint-url http://localhost:4572 s3 cp s3://test-bucket/test.tar /tmp/test.copy.tar
```

Perhaps I am doing something wrong. The command I'm using to copy from S3 works on a real S3 instance, but not with localstack's. As far as I can tell I'm using the latest localstack image, installed with `pip3 install localstack[all]`.

```
localstack/localstack latest 06af7745282d 18 hours ago 829MB
```

I also don't think I am running out of memory, disk, etc. And the localstack instance generates the following error messages:

```
2019-10-06T12:42:12:WARNING:localstack.services.generic_proxy: Connection prematurely closed by client (broken pipe).
---------------------------------------- Exception happened during processing of request from ('172.17.0.1', 48000) Traceback (most recent call last): File "/opt/code/localstack/localstack/services/generic_proxy.py", line 313, in forward self.wfile.write(to_bytes(response.content)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/usr/lib/python3.6/socketserver.py", line 654, in process_request_thread self.finish_request(request, client_address) File "/usr/lib/python3.6/socketserver.py", line 364, in finish_request self.RequestHandlerClass(request, client_address, self) File "/opt/code/localstack/localstack/services/generic_proxy.py", line 103, in __init__ BaseHTTPRequestHandler.__init__(self, request, client_address, server) File "/usr/lib/python3.6/socketserver.py", line 724, in __init__ self.handle() File "/usr/lib/python3.6/http/server.py", line 418, in handle self.handle_one_request() File "/usr/lib/python3.6/http/server.py", line 406, in handle_one_request method() File "/opt/code/localstack/localstack/services/generic_proxy.py", line 128, in do_GET self.forward('GET') File "/opt/code/localstack/localstack/services/generic_proxy.py", line 330, in forward self.end_headers() File "/usr/lib/python3.6/http/server.py", line 520, in end_headers self.flush_headers() File "/usr/lib/python3.6/http/server.py", line 524, in flush_headers self.wfile.write(b"".join(self._headers_buffer)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe ---------------------------------------- 2019-10-06T12:42:12:WARNING:localstack.services.generic_proxy: Connection prematurely closed by client (broken pipe). 
---------------------------------------- Exception happened during processing of request from ('172.17.0.1', 47988) Traceback (most recent call last): File "/opt/code/localstack/localstack/services/generic_proxy.py", line 313, in forward self.wfile.write(to_bytes(response.content)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/usr/lib/python3.6/socketserver.py", line 654, in process_request_thread self.finish_request(request, client_address) File "/usr/lib/python3.6/socketserver.py", line 364, in finish_request self.RequestHandlerClass(request, client_address, self) File "/opt/code/localstack/localstack/services/generic_proxy.py", line 103, in __init__ BaseHTTPRequestHandler.__init__(self, request, client_address, server) File "/usr/lib/python3.6/socketserver.py", line 724, in __init__ self.handle() File "/usr/lib/python3.6/http/server.py", line 418, in handle self.handle_one_request() File "/usr/lib/python3.6/http/server.py", line 406, in handle_one_request method() File "/opt/code/localstack/localstack/services/generic_proxy.py", line 128, in do_GET self.forward('GET') File "/opt/code/localstack/localstack/services/generic_proxy.py", line 330, in forward self.end_headers() File "/usr/lib/python3.6/http/server.py", line 520, in end_headers self.flush_headers() File "/usr/lib/python3.6/http/server.py", line 524, in flush_headers self.wfile.write(b"".join(self._headers_buffer)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe ---------------------------------------- 2019-10-06T12:42:12:WARNING:localstack.services.generic_proxy: Connection prematurely closed by client (broken pipe). 
---------------------------------------- Exception happened during processing of request from ('172.17.0.1', 48006) Traceback (most recent call last): File "/opt/code/localstack/localstack/services/generic_proxy.py", line 313, in forward self.wfile.write(to_bytes(response.content)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/usr/lib/python3.6/socketserver.py", line 654, in process_request_thread self.finish_request(request, client_address) File "/usr/lib/python3.6/socketserver.py", line 364, in finish_request self.RequestHandlerClass(request, client_address, self) File "/opt/code/localstack/localstack/services/generic_proxy.py", line 103, in __init__ BaseHTTPRequestHandler.__init__(self, request, client_address, server) File "/usr/lib/python3.6/socketserver.py", line 724, in __init__ self.handle() File "/usr/lib/python3.6/http/server.py", line 418, in handle self.handle_one_request() File "/usr/lib/python3.6/http/server.py", line 406, in handle_one_request method() File "/opt/code/localstack/localstack/services/generic_proxy.py", line 128, in do_GET self.forward('GET') File "/opt/code/localstack/localstack/services/generic_proxy.py", line 330, in forward self.end_headers() File "/usr/lib/python3.6/http/server.py", line 520, in end_headers self.flush_headers() File "/usr/lib/python3.6/http/server.py", line 524, in flush_headers self.wfile.write(b"".join(self._headers_buffer)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe ---------------------------------------- 2019-10-06T12:42:12:WARNING:localstack.services.generic_proxy: Connection prematurely closed by client (broken pipe). 
---------------------------------------- Exception happened during processing of request from ('172.17.0.1', 47960) Traceback (most recent call last): File "/opt/code/localstack/localstack/services/generic_proxy.py", line 313, in forward self.wfile.write(to_bytes(response.content)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/usr/lib/python3.6/socketserver.py", line 654, in process_request_thread self.finish_request(request, client_address) File "/usr/lib/python3.6/socketserver.py", line 364, in finish_request self.RequestHandlerClass(request, client_address, self) File "/opt/code/localstack/localstack/services/generic_proxy.py", line 103, in __init__ BaseHTTPRequestHandler.__init__(self, request, client_address, server) File "/usr/lib/python3.6/socketserver.py", line 724, in __init__ self.handle() File "/usr/lib/python3.6/http/server.py", line 418, in handle self.handle_one_request() File "/usr/lib/python3.6/http/server.py", line 406, in handle_one_request method() File "/opt/code/localstack/localstack/services/generic_proxy.py", line 128, in do_GET self.forward('GET') File "/opt/code/localstack/localstack/services/generic_proxy.py", line 330, in forward self.end_headers() File "/usr/lib/python3.6/http/server.py", line 520, in end_headers self.flush_headers() File "/usr/lib/python3.6/http/server.py", line 524, in flush_headers self.wfile.write(b"".join(self._headers_buffer)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe ---------------------------------------- 2019-10-06T12:42:12:WARNING:localstack.services.generic_proxy: Connection prematurely closed by client (broken pipe). 
---------------------------------------- Exception happened during processing of request from ('172.17.0.1', 48012) Traceback (most recent call last): File "/opt/code/localstack/localstack/services/generic_proxy.py", line 313, in forward self.wfile.write(to_bytes(response.content)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/usr/lib/python3.6/socketserver.py", line 654, in process_request_thread self.finish_request(request, client_address) File "/usr/lib/python3.6/socketserver.py", line 364, in finish_request self.RequestHandlerClass(request, client_address, self) File "/opt/code/localstack/localstack/services/generic_proxy.py", line 103, in __init__ BaseHTTPRequestHandler.__init__(self, request, client_address, server) File "/usr/lib/python3.6/socketserver.py", line 724, in __init__ self.handle() File "/usr/lib/python3.6/http/server.py", line 418, in handle self.handle_one_request() File "/usr/lib/python3.6/http/server.py", line 406, in handle_one_request method() File "/opt/code/localstack/localstack/services/generic_proxy.py", line 128, in do_GET self.forward('GET') File "/opt/code/localstack/localstack/services/generic_proxy.py", line 330, in forward self.end_headers() File "/usr/lib/python3.6/http/server.py", line 520, in end_headers self.flush_headers() File "/usr/lib/python3.6/http/server.py", line 524, in flush_headers self.wfile.write(b"".join(self._headers_buffer)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe ---------------------------------------- 2019-10-06T12:42:12:WARNING:localstack.services.generic_proxy: Connection prematurely closed by client (broken pipe). 
---------------------------------------- Exception happened during processing of request from ('172.17.0.1', 47958) Traceback (most recent call last): File "/opt/code/localstack/localstack/services/generic_proxy.py", line 313, in forward self.wfile.write(to_bytes(response.content)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/usr/lib/python3.6/socketserver.py", line 654, in process_request_thread self.finish_request(request, client_address) File "/usr/lib/python3.6/socketserver.py", line 364, in finish_request self.RequestHandlerClass(request, client_address, self) File "/opt/code/localstack/localstack/services/generic_proxy.py", line 103, in __init__ BaseHTTPRequestHandler.__init__(self, request, client_address, server) File "/usr/lib/python3.6/socketserver.py", line 724, in __init__ self.handle() File "/usr/lib/python3.6/http/server.py", line 418, in handle self.handle_one_request() File "/usr/lib/python3.6/http/server.py", line 406, in handle_one_request method() File "/opt/code/localstack/localstack/services/generic_proxy.py", line 128, in do_GET self.forward('GET') File "/opt/code/localstack/localstack/services/generic_proxy.py", line 330, in forward self.end_headers() File "/usr/lib/python3.6/http/server.py", line 520, in end_headers self.flush_headers() File "/usr/lib/python3.6/http/server.py", line 524, in flush_headers self.wfile.write(b"".join(self._headers_buffer)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe ---------------------------------------- 2019-10-06T12:42:12:WARNING:localstack.services.generic_proxy: Connection prematurely closed by client (broken pipe). 
---------------------------------------- Exception happened during processing of request from ('172.17.0.1', 48018) Traceback (most recent call last): File "/opt/code/localstack/localstack/services/generic_proxy.py", line 313, in forward self.wfile.write(to_bytes(response.content)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/usr/lib/python3.6/socketserver.py", line 654, in process_request_thread self.finish_request(request, client_address) File "/usr/lib/python3.6/socketserver.py", line 364, in finish_request self.RequestHandlerClass(request, client_address, self) File "/opt/code/localstack/localstack/services/generic_proxy.py", line 103, in __init__ BaseHTTPRequestHandler.__init__(self, request, client_address, server) File "/usr/lib/python3.6/socketserver.py", line 724, in __init__ self.handle() File "/usr/lib/python3.6/http/server.py", line 418, in handle self.handle_one_request() File "/usr/lib/python3.6/http/server.py", line 406, in handle_one_request method() File "/opt/code/localstack/localstack/services/generic_proxy.py", line 128, in do_GET self.forward('GET') File "/opt/code/localstack/localstack/services/generic_proxy.py", line 330, in forward self.end_headers() File "/usr/lib/python3.6/http/server.py", line 520, in end_headers self.flush_headers() File "/usr/lib/python3.6/http/server.py", line 524, in flush_headers self.wfile.write(b"".join(self._headers_buffer)) File "/usr/lib/python3.6/socketserver.py", line 803, in write self._sock.sendall(b) BrokenPipeError: [Errno 32] Broken pipe ---------------------------------------- ```
[ { "content": "import sys\nimport logging\nimport traceback\nfrom moto.s3 import models as s3_models\nfrom moto.server import main as moto_main\nfrom localstack import config\nfrom localstack.constants import DEFAULT_PORT_S3_BACKEND\nfrom localstack.utils.aws import aws_stack\nfrom localstack.utils.common import...
[ { "content": "import sys\nimport logging\nimport traceback\nfrom moto.s3 import models as s3_models\nfrom moto.server import main as moto_main\nfrom localstack import config\nfrom localstack.constants import DEFAULT_PORT_S3_BACKEND\nfrom localstack.utils.aws import aws_stack\nfrom localstack.utils.common import...
diff --git a/localstack/services/s3/s3_starter.py b/localstack/services/s3/s3_starter.py
index 49c60ba000558..428e54db934f1 100644
--- a/localstack/services/s3/s3_starter.py
+++ b/localstack/services/s3/s3_starter.py
@@ -14,7 +14,7 @@
 LOGGER = logging.getLogger(__name__)
 
 # max file size for S3 objects (in MB)
-S3_MAX_FILE_SIZE_MB = 128
+S3_MAX_FILE_SIZE_MB = 2048
 
 
 def check_s3(expect_shutdown=False, print_error=False):
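The one-line fix above raises moto's in-memory size cap for S3 objects from 128 MB to 2048 MB. A hedged sketch (names hypothetical, not localstack's actual handler code) of how a server-side size cap turns a large ranged download into a 416: once the client requests a byte range starting past the end of what the server actually stored, the range is unsatisfiable.

```python
# Hypothetical sketch (not localstack's actual code): why a server-side
# size cap on stored objects makes ranged GETs fail with 416 partway
# through a large download.

def ranged_get_status(range_start: int, stored_size: int) -> int:
    """HTTP status for a GET with 'Range: bytes=<range_start>-'."""
    if range_start >= stored_size:
        return 416  # Requested Range Not Satisfiable
    return 206  # Partial Content

MB = 1024 * 1024
stored_size = 128 * MB   # object truncated at the old S3_MAX_FILE_SIZE_MB cap
claimed_size = 414 * MB  # size the client expects (the uploaded tar)
part = 8 * MB            # the client streams the object part by part

statuses = [ranged_get_status(start, stored_size)
            for start in range(0, claimed_size, part)]
first_failure = statuses.index(416) * part
print(first_failure // MB)  # 128 -> the download dies at the cap boundary
```

The exact offset at which a real client fails depends on its part size and buffering, but the mechanism is the same: the stored object is shorter than the client believes.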
scikit-hep__pyhf-126
test_backend_consistency not resetting to default backend if test fails unexpectedly

# Description

A cascading error is observed when test_backend_consistency fails, which keeps the backend as tensorflow and causes all the other tests to erroneously fail.

<img width="1550" alt="screenshot 2018-04-15 20 45 50" src="https://user-images.githubusercontent.com/761483/38786764-92380ebc-40ef-11e8-921c-fc20a2d96578.png">

Easy to reproduce, run `pytest` and see `test_pdf.py` fail. Run `pytest tests/test_pdf.py` and see that it's fine (as in screenshot).
[ { "content": "import logging\nimport pyhf.optimize as optimize\nimport pyhf.tensor as tensor\n\n\nlog = logging.getLogger(__name__)\ntensorlib = tensor.numpy_backend()\noptimizer = optimize.scipy_optimizer()\n\ndef set_backend(backend):\n \"\"\"\n Set the backend and the associated optimizer\n\n Args:\...
[ { "content": "import logging\nimport pyhf.optimize as optimize\nimport pyhf.tensor as tensor\n\n\nlog = logging.getLogger(__name__)\ntensorlib = tensor.numpy_backend()\ndefault_backend = tensorlib\noptimizer = optimize.scipy_optimizer()\ndefault_optimizer = optimizer\n\ndef set_backend(backend):\n \"\"\"\n ...
diff --git a/pyhf/__init__.py b/pyhf/__init__.py index cb29a5e3e4..2c1c1598a3 100644 --- a/pyhf/__init__.py +++ b/pyhf/__init__.py @@ -5,7 +5,9 @@ log = logging.getLogger(__name__) tensorlib = tensor.numpy_backend() +default_backend = tensorlib optimizer = optimize.scipy_optimizer() +default_optimizer = optimizer def set_backend(backend): """ diff --git a/tests/benchmarks/test_benchmark.py b/tests/benchmarks/test_benchmark.py index 8dabac86cf..9ddbec1033 100644 --- a/tests/benchmarks/test_benchmark.py +++ b/tests/benchmarks/test_benchmark.py @@ -106,7 +106,6 @@ def test_runOnePoint(benchmark, backend, n_bins): Returns: None """ - default_backend = pyhf.tensorlib pyhf.set_backend(backend) source = generate_source_static(n_bins) @@ -118,8 +117,6 @@ def test_runOnePoint(benchmark, backend, n_bins): assert benchmark(runOnePoint, pdf, data) is not None except AssertionError: print('benchmarking has failed for n_bins = {}'.formant(n_bins)) - pyhf.set_backend(default_backend) assert False # Reset backend - pyhf.set_backend(default_backend) diff --git a/tests/conftest.py b/tests/conftest.py new file mode 100644 index 0000000000..d69e6c447d --- /dev/null +++ b/tests/conftest.py @@ -0,0 +1,7 @@ +import pytest +import pyhf + +@pytest.fixture(scope='function', autouse=True) +def reset_backend(): + yield reset_backend + pyhf.set_backend(pyhf.default_backend) diff --git a/tests/test_backend_consistency.py b/tests/test_backend_consistency.py index e365097f99..ddf81b870c 100644 --- a/tests/test_backend_consistency.py +++ b/tests/test_backend_consistency.py @@ -4,7 +4,6 @@ import numpy as np import pytest - def generate_source_static(n_bins): """ Create the source structure for the given number of bins. 
@@ -86,7 +85,6 @@ def test_runOnePoint_q_mu(n_bins, Returns: None """ - default_backend = pyhf.tensorlib source = generate_source_static(n_bins) pdf = hepdata_like(source['bindata']['sig'], @@ -128,15 +126,10 @@ def test_runOnePoint_q_mu(n_bins, except AssertionError: print('Ratio to NumPy+SciPy exceeded tolerance of {}: {}'.format( tolerance['numpy'], numpy_ratio_delta_unity.tolist())) - pyhf.set_backend(default_backend) assert False try: assert (tensors_ratio_delta_unity < tolerance['tensors']).all() except AssertionError: print('Ratio between tensor backends exceeded tolerance of {}: {}'.format( tolerance['tensors'], tensors_ratio_delta_unity.tolist())) - pyhf.set_backend(default_backend) assert False - - # Reset backend - pyhf.set_backend(default_backend) diff --git a/tests/test_optim.py b/tests/test_optim.py index 1d278e329f..a9a27fde46 100644 --- a/tests/test_optim.py +++ b/tests/test_optim.py @@ -42,7 +42,6 @@ def test_optim_numpy(): init_pars = pdf.config.suggested_init() par_bounds = pdf.config.suggested_bounds() - oldlib = pyhf.tensorlib pyhf.set_backend(pyhf.tensor.numpy_backend(poisson_from_normal=True)) optim = pyhf.optimizer @@ -53,8 +52,6 @@ def test_optim_numpy(): result = optim.constrained_bestfit(pyhf.loglambdav, 1.0, data, pdf, init_pars, par_bounds) assert pyhf.tensorlib.tolist(result) - pyhf.set_backend(oldlib) - def test_optim_pytorch(): source = { @@ -96,8 +93,6 @@ def test_optim_pytorch(): init_pars = pdf.config.suggested_init() par_bounds = pdf.config.suggested_bounds() - oldlib = pyhf.tensorlib - pyhf.set_backend(pyhf.tensor.pytorch_backend(poisson_from_normal=True)) optim = pyhf.optimizer @@ -107,8 +102,6 @@ def test_optim_pytorch(): result = optim.constrained_bestfit(pyhf.loglambdav, 1.0, data, pdf, init_pars, par_bounds) assert pyhf.tensorlib.tolist(result) - pyhf.set_backend(oldlib) - def test_optim_tflow(): source = { @@ -150,8 +143,6 @@ def test_optim_tflow(): init_pars = pdf.config.suggested_init() par_bounds = 
pdf.config.suggested_bounds() - oldlib = pyhf.tensorlib - pyhf.set_backend(pyhf.tensor.tensorflow_backend()) pyhf.tensorlib.session = tf.Session() optim = pyhf.optimizer @@ -161,5 +152,3 @@ def test_optim_tflow(): result = optim.constrained_bestfit(pyhf.loglambdav, 1.0, data, pdf, init_pars, par_bounds) assert pyhf.tensorlib.tolist(result) - - pyhf.set_backend(oldlib) diff --git a/tests/test_tensor.py b/tests/test_tensor.py index 0bac85d27a..1318ec7fbf 100644 --- a/tests/test_tensor.py +++ b/tests/test_tensor.py @@ -41,8 +41,6 @@ def test_common_tensor_backends(): def test_pdf_eval(): - oldlib = pyhf.tensorlib - tf_sess = tf.Session() backends = [numpy_backend(poisson_from_normal=True), pytorch_backend(), @@ -92,12 +90,8 @@ def test_pdf_eval(): assert np.std(values) < 1e-6 - pyhf.set_backend(oldlib) - def test_pdf_eval_2(): - oldlib = pyhf.tensorlib - tf_sess = tf.Session() backends = [numpy_backend(poisson_from_normal=True), pytorch_backend(), @@ -126,5 +120,3 @@ def test_pdf_eval_2(): values.append(pyhf.tensorlib.tolist(v1)[0]) assert np.std(values) < 1e-6 - - pyhf.set_backend(oldlib)
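The key piece of the diff above is the new tests/conftest.py autouse fixture: code after `yield` runs as teardown after every test, so `pyhf.set_backend(pyhf.default_backend)` restores the default even when a test fails unexpectedly, which the scattered per-test reset calls it removes could not guarantee. A minimal pytest-free sketch of that save/restore pattern (module and backend names are stand-ins for pyhf's globals):

```python
# Minimal sketch of the reset-after-each-test pattern from conftest.py,
# written as a plain generator so it runs without pytest.
# "tensorlib" stands in for pyhf's module-level backend global.

DEFAULT_BACKEND = "numpy"
tensorlib = DEFAULT_BACKEND

def set_backend(backend):
    global tensorlib
    tensorlib = backend

def reset_backend():
    # Everything before `yield` is setup; everything after is teardown,
    # which pytest runs even if the test body raised.
    yield
    set_backend(DEFAULT_BACKEND)

def run_test(test_fn):
    fixture = reset_backend()
    next(fixture)                # setup
    try:
        test_fn()
    except AssertionError:
        pass                     # a failing test must not leak state
    finally:
        next(fixture, None)      # teardown: restore the default backend

def failing_tensorflow_test():
    set_backend("tensorflow")
    assert False                 # fails while the backend is switched

run_test(failing_tensorflow_test)
print(tensorlib)  # numpy
```

This is why the fix can delete every `oldlib = pyhf.tensorlib` / `pyhf.set_backend(oldlib)` pair from the individual tests: the teardown runs unconditionally.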
locustio__locust-1395
Update flask version

Our minimum required flask version is too old (saw at least one person having an issue https://stackoverflow.com/questions/61969924/typeerror-when-i-run-a-locustfile-py)

https://flask.palletsprojects.com/en/1.1.x/changelog/#version-0-12-5 is a minimum, but we should probably go to 1.x right away. I can do the PR
[ { "content": "# -*- coding: utf-8 -*-\nimport ast\nimport os\nimport re\nimport sys\n\nfrom setuptools import find_packages, setup\n\nROOT_PATH = os.path.abspath(os.path.dirname(__file__))\n\n# parse version from locust/__init__.py\n_version_re = re.compile(r'__version__\\s+=\\s+(.*)')\n_init_file = os.path.joi...
[ { "content": "# -*- coding: utf-8 -*-\nimport ast\nimport os\nimport re\nimport sys\n\nfrom setuptools import find_packages, setup\n\nROOT_PATH = os.path.abspath(os.path.dirname(__file__))\n\n# parse version from locust/__init__.py\n_version_re = re.compile(r'__version__\\s+=\\s+(.*)')\n_init_file = os.path.joi...
diff --git a/setup.py b/setup.py
index 80716c5a28..26f4eec5e8 100644
--- a/setup.py
+++ b/setup.py
@@ -20,7 +20,7 @@
     version=version,
     install_requires=[
         "gevent>=1.5.0",
-        "flask>=0.10.1",
+        "flask>=1.1.2",
         "requests>=2.9.1",
         "msgpack>=0.6.2",
         "pyzmq>=16.0.2",
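The setup.py shown in before_files above pulls the version out of locust/__init__.py with a regex and `ast.literal_eval` instead of importing the package (which would drag in its dependencies at install time). A minimal sketch of that pattern, with the file contents inlined rather than read from disk and a made-up version string:

```python
# Sketch of the version-parsing pattern visible in locust's setup.py:
# grab the right-hand side of `__version__ = ...` with a regex, then
# evaluate it safely as a Python literal.
import ast
import re

_version_re = re.compile(r'__version__\s+=\s+(.*)')

# Stand-in for the contents of locust/__init__.py.
init_file = '__version__ = "1.0.3"\n'

match = _version_re.search(init_file)
version = ast.literal_eval(match.group(1))  # strips the quotes safely
print(version)  # 1.0.3
```

`ast.literal_eval` only accepts Python literals, so this works for `"1.0.3"` or `'1.0.3'` without executing arbitrary code the way `eval` would.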
encode__httpx-1054
Type-checking our tests

I know this is not a standard thing to do across Encode projects, but I've been wondering if it would be worth starting to type-hint our tests. I've seen at least two instances of this recently:

- In HTTPX: https://github.com/encode/httpx/pull/648#discussion_r359862603
- In Starlette: https://github.com/encode/starlette/issues/722

My rationale is based on two aspects:

- It improves our upfront knowledge about how users will actually use HTTPX — currently their usage of type hints in the wild is not reflected anywhere.
- It helps us catch type hint inconsistencies we wouldn't see in the core package.

The main counter-argument, I suppose, is that type hinting tests is tedious. I think that's fair, but I believe the two pros above make it compelling. Thoughts?
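A small illustration of the second point (all names hypothetical, not httpx's API): an annotation that looks fine inside the package can turn out too narrow the moment a typed test calls it differently. At runtime both functions below behave identically, but mypy would reject passing a tuple to the `List`-typed one, while the `Sequence`-typed one accepts it.

```python
# Hypothetical sketch: List vs Sequence in a public signature.
# Type-checked tests are where a too-narrow annotation like this
# tends to get caught, because tests exercise call styles the
# core package itself never uses.
from typing import List, Sequence, Tuple

def encode_headers_narrow(headers: List[Tuple[str, str]]) -> str:
    return "&".join(f"{k}={v}" for k, v in headers)

def encode_headers(headers: Sequence[Tuple[str, str]]) -> str:
    return "&".join(f"{k}={v}" for k, v in headers)

pairs = (("a", "1"), ("b", "2"))   # a tuple of pairs, not a list

# mypy: error for the narrow version -- a tuple is not a List
# encode_headers_narrow(pairs)
result = encode_headers(pairs)     # fine: a tuple is a Sequence
assert result == encode_headers_narrow(list(pairs))
print(result)  # a=1&b=2
```

This is exactly the kind of inconsistency a test suite run under mypy surfaces while the un-type-checked runtime tests stay green.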
[ { "content": "\"\"\"\nType definitions for type checking purposes.\n\"\"\"\n\nimport ssl\nfrom http.cookiejar import CookieJar\nfrom typing import (\n IO,\n TYPE_CHECKING,\n AsyncIterator,\n Callable,\n Dict,\n Iterator,\n List,\n Mapping,\n Optional,\n Sequence,\n Tuple,\n U...
[ { "content": "\"\"\"\nType definitions for type checking purposes.\n\"\"\"\n\nimport ssl\nfrom http.cookiejar import CookieJar\nfrom typing import (\n IO,\n TYPE_CHECKING,\n AsyncIterator,\n Callable,\n Dict,\n Iterator,\n List,\n Mapping,\n Optional,\n Sequence,\n Tuple,\n U...
diff --git a/httpx/_types.py b/httpx/_types.py index a74020a4ae..d2fc098e24 100644 --- a/httpx/_types.py +++ b/httpx/_types.py @@ -72,4 +72,4 @@ # (filename, file (or text), content_type) Tuple[Optional[str], FileContent, Optional[str]], ] -RequestFiles = Union[Mapping[str, FileTypes], List[Tuple[str, FileTypes]]] +RequestFiles = Union[Mapping[str, FileTypes], Sequence[Tuple[str, FileTypes]]] diff --git a/scripts/check b/scripts/check index 2b42506f6f..f9fc19343b 100755 --- a/scripts/check +++ b/scripts/check @@ -10,5 +10,5 @@ set -x ${PREFIX}black --check --diff --target-version=py36 $SOURCE_FILES ${PREFIX}flake8 $SOURCE_FILES -${PREFIX}mypy httpx +${PREFIX}mypy $SOURCE_FILES ${PREFIX}isort --check --diff --project=httpx $SOURCE_FILES diff --git a/tests/client/test_async_client.py b/tests/client/test_async_client.py index 649e428f5e..6818b4a444 100644 --- a/tests/client/test_async_client.py +++ b/tests/client/test_async_client.py @@ -174,8 +174,11 @@ def test_dispatch_deprecated(): def test_asgi_dispatch_deprecated(): + async def app(scope, receive, send): + pass + with pytest.warns(DeprecationWarning) as record: - ASGIDispatch(None) + ASGIDispatch(app) assert len(record) == 1 assert ( diff --git a/tests/client/test_auth.py b/tests/client/test_auth.py index 818e65904a..13184a06ba 100644 --- a/tests/client/test_auth.py +++ b/tests/client/test_auth.py @@ -11,7 +11,6 @@ Auth, Client, DigestAuth, - Headers, ProtocolError, Request, RequestBodyUnavailable, @@ -86,23 +85,26 @@ def __init__( async def request( self, method: bytes, - url: typing.Tuple[bytes, bytes, int, bytes], - headers: typing.List[typing.Tuple[bytes, bytes]], - stream: ContentStream, + url: typing.Tuple[bytes, bytes, typing.Optional[int], bytes], + headers: typing.List[typing.Tuple[bytes, bytes]] = None, + stream: httpcore.AsyncByteStream = None, timeout: typing.Dict[str, typing.Optional[float]] = None, ) -> typing.Tuple[ bytes, int, bytes, typing.List[typing.Tuple[bytes, bytes]], ContentStream ]: if 
self._response_count < self.send_response_after_attempt: - return self.challenge_send(method, url, headers, stream) + assert headers is not None + return self.challenge_send(method, headers) authorization = get_header_value(headers, "Authorization") body = JSONStream({"auth": authorization}) return b"HTTP/1.1", 200, b"", [], body def challenge_send( - self, method: bytes, url: URL, headers: Headers, stream: ContentStream, - ) -> typing.Tuple[int, bytes, Headers, ContentStream]: + self, method: bytes, headers: typing.List[typing.Tuple[bytes, bytes]], + ) -> typing.Tuple[ + bytes, int, bytes, typing.List[typing.Tuple[bytes, bytes]], ContentStream + ]: self._response_count += 1 nonce = ( hashlib.sha256(os.urandom(8)).hexdigest() @@ -297,7 +299,8 @@ async def test_auth_hidden_header() -> None: async def test_auth_invalid_type() -> None: url = "https://example.org/" client = AsyncClient( - transport=AsyncMockTransport(), auth="not a tuple, not a callable", + transport=AsyncMockTransport(), + auth="not a tuple, not a callable", # type: ignore ) with pytest.raises(TypeError): await client.get(url) diff --git a/tests/client/test_client.py b/tests/client/test_client.py index 02f78f6999..1426fc216c 100644 --- a/tests/client/test_client.py +++ b/tests/client/test_client.py @@ -182,8 +182,11 @@ def test_dispatch_deprecated(): def test_wsgi_dispatch_deprecated(): + def app(start_response, environ): + pass + with pytest.warns(DeprecationWarning) as record: - WSGIDispatch(None) + WSGIDispatch(app) assert len(record) == 1 assert ( diff --git a/tests/client/test_cookies.py b/tests/client/test_cookies.py index a109ccc6ee..68b6c64cf5 100644 --- a/tests/client/test_cookies.py +++ b/tests/client/test_cookies.py @@ -20,14 +20,15 @@ class MockTransport(httpcore.AsyncHTTPTransport): async def request( self, method: bytes, - url: typing.Tuple[bytes, bytes, int, bytes], - headers: typing.List[typing.Tuple[bytes, bytes]], - stream: ContentStream, + url: typing.Tuple[bytes, bytes, 
typing.Optional[int], bytes], + headers: typing.List[typing.Tuple[bytes, bytes]] = None, + stream: httpcore.AsyncByteStream = None, timeout: typing.Dict[str, typing.Optional[float]] = None, ) -> typing.Tuple[ bytes, int, bytes, typing.List[typing.Tuple[bytes, bytes]], ContentStream ]: host, scheme, port, path = url + body: ContentStream if path.startswith(b"/echo_cookies"): cookie = get_header_value(headers, "cookie") body = JSONStream({"cookies": cookie}) diff --git a/tests/client/test_headers.py b/tests/client/test_headers.py index 6f26b1c7b3..2f87c38a1f 100755 --- a/tests/client/test_headers.py +++ b/tests/client/test_headers.py @@ -13,13 +13,14 @@ class MockTransport(httpcore.AsyncHTTPTransport): async def request( self, method: bytes, - url: typing.Tuple[bytes, bytes, int, bytes], - headers: typing.List[typing.Tuple[bytes, bytes]], - stream: ContentStream, + url: typing.Tuple[bytes, bytes, typing.Optional[int], bytes], + headers: typing.List[typing.Tuple[bytes, bytes]] = None, + stream: httpcore.AsyncByteStream = None, timeout: typing.Dict[str, typing.Optional[float]] = None, ) -> typing.Tuple[ bytes, int, bytes, typing.List[typing.Tuple[bytes, bytes]], ContentStream ]: + assert headers is not None headers_dict = { key.decode("ascii"): value.decode("ascii") for key, value in headers } diff --git a/tests/client/test_properties.py b/tests/client/test_properties.py index 5dbbb4690a..011c593cd3 100644 --- a/tests/client/test_properties.py +++ b/tests/client/test_properties.py @@ -3,14 +3,14 @@ def test_client_headers(): client = AsyncClient() - client.headers = {"a": "b"} + client.headers = {"a": "b"} # type: ignore assert isinstance(client.headers, Headers) assert client.headers["A"] == "b" def test_client_cookies(): client = AsyncClient() - client.cookies = {"a": "b"} + client.cookies = {"a": "b"} # type: ignore assert isinstance(client.cookies, Cookies) mycookies = list(client.cookies.jar) assert len(mycookies) == 1 diff --git a/tests/client/test_proxies.py 
b/tests/client/test_proxies.py index 5222b08e34..fb21760bf7 100644 --- a/tests/client/test_proxies.py +++ b/tests/client/test_proxies.py @@ -1,3 +1,4 @@ +import httpcore import pytest import httpx @@ -24,7 +25,9 @@ def test_proxies_parameter(proxies, expected_proxies): for proxy_key, url in expected_proxies: assert proxy_key in client.proxies - assert client.proxies[proxy_key].proxy_origin == httpx.URL(url).raw[:3] + proxy = client.proxies[proxy_key] + assert isinstance(proxy, httpcore.AsyncHTTPProxy) + assert proxy.proxy_origin == httpx.URL(url).raw[:3] assert len(expected_proxies) == len(client.proxies) @@ -81,6 +84,7 @@ def test_transport_for_request(url, proxies, expected): if expected is None: assert transport is client.transport else: + assert isinstance(transport, httpcore.AsyncHTTPProxy) assert transport.proxy_origin == httpx.URL(expected).raw[:3] diff --git a/tests/client/test_queryparams.py b/tests/client/test_queryparams.py index 97e1199620..10a03539e2 100644 --- a/tests/client/test_queryparams.py +++ b/tests/client/test_queryparams.py @@ -3,7 +3,7 @@ import httpcore import pytest -from httpx import URL, AsyncClient, Headers, QueryParams +from httpx import URL, AsyncClient, QueryParams from httpx._content_streams import ContentStream, JSONStream @@ -11,16 +11,15 @@ class MockTransport(httpcore.AsyncHTTPTransport): async def request( self, method: bytes, - url: typing.Tuple[bytes, bytes, int, bytes], - headers: typing.List[typing.Tuple[bytes, bytes]], - stream: ContentStream, + url: typing.Tuple[bytes, bytes, typing.Optional[int], bytes], + headers: typing.List[typing.Tuple[bytes, bytes]] = None, + stream: httpcore.AsyncByteStream = None, timeout: typing.Dict[str, typing.Optional[float]] = None, ) -> typing.Tuple[ bytes, int, bytes, typing.List[typing.Tuple[bytes, bytes]], ContentStream ]: - headers = Headers() body = JSONStream({"ok": "ok"}) - return b"HTTP/1.1", 200, b"OK", headers, body + return b"HTTP/1.1", 200, b"OK", [], body def 
test_client_queryparams(): @@ -35,7 +34,7 @@ def test_client_queryparams_string(): assert client.params["a"] == "b" client = AsyncClient() - client.params = "a=b" + client.params = "a=b" # type: ignore assert isinstance(client.params, QueryParams) assert client.params["a"] == "b" diff --git a/tests/client/test_redirects.py b/tests/client/test_redirects.py index ae800fa792..30b6f6a128 100644 --- a/tests/client/test_redirects.py +++ b/tests/client/test_redirects.py @@ -103,8 +103,8 @@ async def body(): headers_dict = { key.decode("ascii"): value.decode("ascii") for key, value in headers } - content = ByteStream(json.dumps({"headers": headers_dict}).encode()) - return b"HTTP/1.1", 200, b"OK", [], content + stream = ByteStream(json.dumps({"headers": headers_dict}).encode()) + return b"HTTP/1.1", 200, b"OK", [], stream elif path == b"/redirect_body": code = codes.PERMANENT_REDIRECT @@ -121,10 +121,10 @@ async def body(): headers_dict = { key.decode("ascii"): value.decode("ascii") for key, value in headers } - body = ByteStream( + stream = ByteStream( json.dumps({"body": content.decode(), "headers": headers_dict}).encode() ) - return b"HTTP/1.1", 200, b"OK", [], body + return b"HTTP/1.1", 200, b"OK", [], stream elif path == b"/cross_subdomain": host = get_header_value(headers, "host") @@ -402,9 +402,9 @@ class MockCookieTransport(httpcore.AsyncHTTPTransport): async def request( self, method: bytes, - url: typing.Tuple[bytes, bytes, int, bytes], - headers: typing.List[typing.Tuple[bytes, bytes]], - stream: ContentStream, + url: typing.Tuple[bytes, bytes, typing.Optional[int], bytes], + headers: typing.List[typing.Tuple[bytes, bytes]] = None, + stream: httpcore.AsyncByteStream = None, timeout: typing.Dict[str, typing.Optional[float]] = None, ) -> typing.Tuple[ bytes, int, bytes, typing.List[typing.Tuple[bytes, bytes]], ContentStream @@ -432,7 +432,8 @@ async def request( ] return b"HTTP/1.1", status_code, b"See Other", headers, ByteStream(b"") - elif path == b"/logout": + 
else: + assert path == b"/logout" status_code = codes.SEE_OTHER headers = [ (b"location", b"/"), diff --git a/tests/conftest.py b/tests/conftest.py index a145ce0fa0..10576ebd8a 100644 --- a/tests/conftest.py +++ b/tests/conftest.py @@ -56,7 +56,7 @@ async def my_async_test(): @pytest.fixture(scope="function", autouse=True) -def clean_environ() -> typing.Dict[str, typing.Any]: +def clean_environ(): """Keeps os.environ clean for every test without having to mock os.environ""" original_environ = os.environ.copy() os.environ.clear() diff --git a/tests/test_api.py b/tests/test_api.py index 4c1d611620..2d51d99e8a 100644 --- a/tests/test_api.py +++ b/tests/test_api.py @@ -68,7 +68,6 @@ def test_stream(server): assert response.http_version == "HTTP/1.1" -@pytest.mark.asyncio -async def test_get_invalid_url(server): +def test_get_invalid_url(): with pytest.raises(httpx.InvalidURL): - await httpx.get("invalid://example.org") + httpx.get("invalid://example.org") diff --git a/tests/test_config.py b/tests/test_config.py index 41d81916ad..46d154cdb8 100644 --- a/tests/test_config.py +++ b/tests/test_config.py @@ -56,7 +56,7 @@ def test_load_ssl_config_verify_env_file(https_server, ca_cert_pem_file, config) def test_load_ssl_config_verify_directory(): path = Path(certifi.where()).parent - ssl_config = SSLConfig(verify=path) + ssl_config = SSLConfig(verify=str(path)) context = ssl_config.ssl_context assert context.verify_mode == ssl.VerifyMode.CERT_REQUIRED assert context.check_hostname is True @@ -192,7 +192,7 @@ def test_ssl_config_support_for_keylog_file(tmpdir, monkeypatch): # pragma: noc ssl_config = SSLConfig(trust_env=True) - assert ssl_config.ssl_context.keylog_filename is None + assert ssl_config.ssl_context.keylog_filename is None # type: ignore filename = str(tmpdir.join("test.log")) @@ -201,11 +201,11 @@ def test_ssl_config_support_for_keylog_file(tmpdir, monkeypatch): # pragma: noc ssl_config = SSLConfig(trust_env=True) - assert ssl_config.ssl_context.keylog_filename 
== filename + assert ssl_config.ssl_context.keylog_filename == filename # type: ignore ssl_config = SSLConfig(trust_env=False) - assert ssl_config.ssl_context.keylog_filename is None + assert ssl_config.ssl_context.keylog_filename is None # type: ignore @pytest.mark.parametrize( diff --git a/tests/test_content_streams.py b/tests/test_content_streams.py index 2b2adc92ae..140aa8d2af 100644 --- a/tests/test_content_streams.py +++ b/tests/test_content_streams.py @@ -203,7 +203,7 @@ async def test_empty_request(): def test_invalid_argument(): with pytest.raises(TypeError): - encode(123) + encode(123) # type: ignore @pytest.mark.asyncio diff --git a/tests/test_multipart.py b/tests/test_multipart.py index fbabc7c483..7d6f8e025d 100644 --- a/tests/test_multipart.py +++ b/tests/test_multipart.py @@ -8,7 +8,7 @@ import pytest import httpx -from httpx._content_streams import AsyncIteratorStream, encode +from httpx._content_streams import AsyncIteratorStream, MultipartStream, encode from httpx._utils import format_form_param @@ -16,7 +16,7 @@ class MockTransport(httpcore.AsyncHTTPTransport): async def request( self, method: bytes, - url: typing.Tuple[bytes, bytes, int, bytes], + url: typing.Tuple[bytes, bytes, typing.Optional[int], bytes], headers: typing.List[typing.Tuple[bytes, bytes]] = None, stream: httpcore.AsyncByteStream = None, timeout: typing.Dict[str, typing.Optional[float]] = None, @@ -27,6 +27,7 @@ async def request( typing.List[typing.Tuple[bytes, bytes]], httpcore.AsyncByteStream, ]: + assert stream is not None content = AsyncIteratorStream(aiterator=(part async for part in stream)) return b"HTTP/1.1", 200, b"OK", [], content @@ -46,7 +47,10 @@ async def test_multipart(value, output): # bit grungy, but sufficient just for our testing purposes. 
boundary = response.request.headers["Content-Type"].split("boundary=")[-1] content_length = response.request.headers["Content-Length"] - pdict = {"boundary": boundary.encode("ascii"), "CONTENT-LENGTH": content_length} + pdict: dict = { + "boundary": boundary.encode("ascii"), + "CONTENT-LENGTH": content_length, + } multipart = cgi.parse_multipart(io.BytesIO(response.content), pdict) # Note that the expected return type for text fields @@ -91,7 +95,10 @@ async def test_multipart_file_tuple(): # bit grungy, but sufficient just for our testing purposes. boundary = response.request.headers["Content-Type"].split("boundary=")[-1] content_length = response.request.headers["Content-Length"] - pdict = {"boundary": boundary.encode("ascii"), "CONTENT-LENGTH": content_length} + pdict: dict = { + "boundary": boundary.encode("ascii"), + "CONTENT-LENGTH": content_length, + } multipart = cgi.parse_multipart(io.BytesIO(response.content), pdict) # Note that the expected return type for text fields @@ -117,6 +124,7 @@ def test_multipart_encode(tmp_path: typing.Any) -> None: boundary = os.urandom(16).hex() stream = encode(data=data, files=files) + assert isinstance(stream, MultipartStream) assert stream.can_replay() assert stream.content_type == f"multipart/form-data; boundary={boundary}" @@ -143,6 +151,7 @@ def test_multipart_encode_files_allows_filenames_as_none() -> None: boundary = os.urandom(16).hex() stream = encode(data={}, files=files) + assert isinstance(stream, MultipartStream) assert stream.can_replay() assert stream.content_type == f"multipart/form-data; boundary={boundary}" @@ -169,6 +178,7 @@ def test_multipart_encode_files_guesses_correct_content_type( boundary = os.urandom(16).hex() stream = encode(data={}, files=files) + assert isinstance(stream, MultipartStream) assert stream.can_replay() assert stream.content_type == f"multipart/form-data; boundary={boundary}" @@ -192,6 +202,7 @@ def test_multipart_encode_files_allows_bytes_or_str_content( boundary = 
os.urandom(16).hex() stream = encode(data={}, files=files) + assert isinstance(stream, MultipartStream) assert stream.can_replay() assert stream.content_type == f"multipart/form-data; boundary={boundary}" @@ -226,7 +237,7 @@ def data() -> typing.Iterator[bytes]: yield b"Hello" yield b"World" - fileobj = IteratorIO(data()) + fileobj: typing.Any = IteratorIO(data()) files = {"file": fileobj} stream = encode(files=files, boundary=b"+++") assert not stream.can_replay() diff --git a/tests/test_status_codes.py b/tests/test_status_codes.py index e62b3e067b..c53e95965d 100644 --- a/tests/test_status_codes.py +++ b/tests/test_status_codes.py @@ -7,7 +7,7 @@ def test_status_code_as_int(): def test_lowercase_status_code(): - assert httpx.codes.not_found == 404 + assert httpx.codes.not_found == 404 # type: ignore def test_reason_phrase_for_status_code():
pyca__cryptography-4307
incorrect key_size of sect571r1

Hello!

https://github.com/pyca/cryptography/blob/17c8f126c7c7d5ce886112a6e924277a7b203f25/src/cryptography/hazmat/primitives/asymmetric/ec.py#L138

The value there should be 570. From [the standard](http://www.secg.org/sec2-v2.pdf) the order of the published generator is

```py
>>> order = 0x03FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFE661CE18FF55987308059B186823851EC7DD9CA1161DE93D5174D66E8382E9BB2FE84E47
>>> print(len(bin(order))-2)
570
```
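The bit-length computation in the report can be double-checked with Python's `int.bit_length()`. The order value below is the one quoted from SEC 2 v2; the underscore grouping is only for readability:

```python
# Order n of the sect571r1 base point, as quoted above from SEC 2 v2.
# Underscores group the hex digits into 32-bit words for readability.
order = 0x03FFFFFF_FFFFFFFF_FFFFFFFF_FFFFFFFF_FFFFFFFF_FFFFFFFF_FFFFFFFF_FFFFFFFF_FFFFFFFF_E661CE18_FF559873_08059B18_6823851E_C7DD9CA1_161DE93D_5174D66E_8382E9BB_2FE84E47

# int.bit_length() counts significant bits directly; it matches the
# len(bin(order)) - 2 trick used in the report.
assert order.bit_length() == len(bin(order)) - 2 == 570
```

This confirms the report's claim: the group order is a 570-bit integer, so `key_size` should be 570, which is exactly what the PR diff below this row changes.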
[ { "content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nimport abc\n\nimport six\n\nfrom cryptography...
[ { "content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nimport abc\n\nimport six\n\nfrom cryptography...
diff --git a/src/cryptography/hazmat/primitives/asymmetric/ec.py b/src/cryptography/hazmat/primitives/asymmetric/ec.py index 83266bb4681c..6cbfcab4c1bd 100644 --- a/src/cryptography/hazmat/primitives/asymmetric/ec.py +++ b/src/cryptography/hazmat/primitives/asymmetric/ec.py @@ -135,7 +135,7 @@ def verify(self, signature, data, signature_algorithm): @utils.register_interface(EllipticCurve) class SECT571R1(object): name = "sect571r1" - key_size = 571 + key_size = 570 @utils.register_interface(EllipticCurve)
cloud-custodian__cloud-custodian-1049
efs tag support

I am finding that searching for tagging of EFS resources does not consistently report the correct results. It did find an EFS that was incorrectly tagged, but after it was corrected it continues to report the same resource. I use the same filter for other resource types and do not see this behavior.

```
- name: efs-tag-compliance
  resource: efs
  description: Notify if an EFS does not comply with tagging best practices.
  mode:
    type: periodic
    schedule: "rate(24 hours)"
    role: arn:aws:iam::MYACCOUNT:role/cloud-custodian
  filters:
    - or:
        - "tag:CostCenter": absent
        - "tag:POC": absent
        - "tag:Service": absent
        - "tag:Name": absent
...
```
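To illustrate the filter semantics the policy relies on (a hypothetical sketch, not Cloud Custodian's actual filter code): a `tag:<key>: absent` check can only be accurate if each EFS record actually carries its `Tags` list, so a resource whose tags were never fetched reports every key as absent:

```python
# Hypothetical sketch of "tag:<key>: absent" semantics. If a resource record
# carries no Tags list at all (tags never fetched from the API), every key
# reports absent -- matching the stale results described above.
def tag_absent(resource, key):
    tags = {t["Key"]: t["Value"] for t in resource.get("Tags", [])}
    return key not in tags

fs = {
    "FileSystemId": "fs-123",
    "Tags": [{"Key": "Name", "Value": "MyDocs"}, {"Key": "POC", "Value": "Test User"}],
}
assert not tag_absent(fs, "Name")                       # tag present
assert tag_absent(fs, "CostCenter")                     # genuinely missing tag
assert tag_absent({"FileSystemId": "fs-456"}, "Name")   # no Tags fetched at all
```

This is why the fix in this row's PR enriches each filesystem with a separate `describe_tags` call via a `detail_spec`.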
[ { "content": "# Copyright 2016 Capital One Services, LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required b...
[ { "content": "# Copyright 2016 Capital One Services, LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required b...
diff --git a/c7n/resources/efs.py b/c7n/resources/efs.py index 2687e2113eb..3dfe9b58682 100644 --- a/c7n/resources/efs.py +++ b/c7n/resources/efs.py @@ -27,6 +27,7 @@ class resource_type(object): name = 'Name' date = 'CreationTime' dimension = None + detail_spec = ('describe_tags', 'FileSystemId', 'FileSystemId', None) @ElasticFileSystem.action_registry.register('delete') diff --git a/tests/data/placebo/test_efs_delete/elasticfilesystem.DescribeTags_1.json b/tests/data/placebo/test_efs_delete/elasticfilesystem.DescribeTags_1.json new file mode 100644 index 00000000000..b07ee79843d --- /dev/null +++ b/tests/data/placebo/test_efs_delete/elasticfilesystem.DescribeTags_1.json @@ -0,0 +1,30 @@ +{ + "status_code": 200, + "data": { + "ResponseMetadata": { + "RetryAttempts": 0, + "HTTPStatusCode": 200, + "RequestId": "d6e6f4a2-13e8-11e7-a10d-03dc08f89c32", + "HTTPHeaders": { + "x-amzn-requestid": "d6e6f4a2-13e8-11e7-a10d-03dc08f89c32", + "date": "Tue, 28 Mar 2017 19:00:41 GMT", + "content-length": "142", + "content-type": "application/json" + } + }, + "Tags": [ + { + "Value": "MyDocs", + "Key": "Name" + }, + { + "Value": "Test User", + "Key": "POC" + }, + { + "Value": "skunk", + "Key": "Service" + } + ] + } +} \ No newline at end of file diff --git a/tests/data/placebo/test_efs_query/elasticfilesystem.DescribeTags_1.json b/tests/data/placebo/test_efs_query/elasticfilesystem.DescribeTags_1.json new file mode 100644 index 00000000000..b07ee79843d --- /dev/null +++ b/tests/data/placebo/test_efs_query/elasticfilesystem.DescribeTags_1.json @@ -0,0 +1,30 @@ +{ + "status_code": 200, + "data": { + "ResponseMetadata": { + "RetryAttempts": 0, + "HTTPStatusCode": 200, + "RequestId": "d6e6f4a2-13e8-11e7-a10d-03dc08f89c32", + "HTTPHeaders": { + "x-amzn-requestid": "d6e6f4a2-13e8-11e7-a10d-03dc08f89c32", + "date": "Tue, 28 Mar 2017 19:00:41 GMT", + "content-length": "142", + "content-type": "application/json" + } + }, + "Tags": [ + { + "Value": "MyDocs", + "Key": "Name" + }, + { + 
"Value": "Test User", + "Key": "POC" + }, + { + "Value": "skunk", + "Key": "Service" + } + ] + } +} \ No newline at end of file
bokeh__bokeh-2235
VBoxForm broken

Added a `float:left` to fix `sliders.py`, which broke the stock app example even worse.
[ { "content": "\nfrom bokeh.io import vform\nfrom bokeh.plotting import figure, hplot, output_file, show, vplot, ColumnDataSource\nfrom bokeh.models.actions import Callback\nfrom bokeh.models.widgets import Slider\n\nimport numpy as np\n\nx = np.linspace(0, 10, 500)\ny = np.sin(x)\n\nsource = ColumnDataSource(da...
[ { "content": "\nfrom bokeh.io import vform\nfrom bokeh.plotting import figure, hplot, output_file, show, vplot, ColumnDataSource\nfrom bokeh.models.actions import Callback\nfrom bokeh.models.widgets import Slider\n\nimport numpy as np\n\nx = np.linspace(0, 10, 500)\ny = np.sin(x)\n\nsource = ColumnDataSource(da...
diff --git a/bokehjs/src/less/widgets.less b/bokehjs/src/less/widgets.less index 0069826f406..0d937da33d8 100644 --- a/bokehjs/src/less/widgets.less +++ b/bokehjs/src/less/widgets.less @@ -14,7 +14,6 @@ .bk-widget-form { padding: 30px 30px 30px 30px; overflow: hidden; - float:left; } .bk-widget-form-group { diff --git a/examples/plotting/file/slider.py b/examples/plotting/file/slider.py index 71f3af31ad6..314dbe0f088 100644 --- a/examples/plotting/file/slider.py +++ b/examples/plotting/file/slider.py @@ -42,8 +42,8 @@ callback.args["offset"] = offset_slider layout = hplot( + plot, vform(amp_slider, freq_slider, phase_slider, offset_slider), - plot ) output_file("slider.html")
typeddjango__django-stubs-1371
Next release planning (1.15.0)

I'll make a new release soonish, perhaps this weekend or next week, now that mypy 1.0 is being tested in CI and used for `django-stubs[compatible-mypy]`.

* #1360

I'd like to make a dual release together with djangorestframework-stubs, so the recommended mypy version stays in sync between both projects. But there's some work still to be done on that side: https://github.com/typeddjango/djangorestframework-stubs/issues/324#issuecomment-1421098490

Additionally, nice-to-have PRs waiting; community reviewers welcome:

* #1309
* #1308
[ { "content": "import os\nfrom typing import List\n\nfrom setuptools import find_packages, setup\n\n\ndef find_stub_files(name: str) -> List[str]:\n result = []\n for root, _dirs, files in os.walk(name):\n for file in files:\n if file.endswith(\".pyi\"):\n if os.path.sep in...
[ { "content": "import os\nfrom typing import List\n\nfrom setuptools import find_packages, setup\n\n\ndef find_stub_files(name: str) -> List[str]:\n result = []\n for root, _dirs, files in os.walk(name):\n for file in files:\n if file.endswith(\".pyi\"):\n if os.path.sep in...
diff --git a/README.md b/README.md index f75a1e57a..2fde6dddf 100644 --- a/README.md +++ b/README.md @@ -51,6 +51,7 @@ We rely on different `django` and `mypy` versions: | django-stubs | mypy version | django version | python version |--------------| ---- | ---- | ---- | +| 1.15.0 | 1.0.x | 3.2.x or 4.0.x or 4.1.x | ^3.7 | 1.14.0 | 0.990+ | 3.2.x or 4.0.x or 4.1.x | ^3.7 | 1.13.0 | 0.980+ | 3.2.x or 4.0.x or 4.1.x | ^3.7 | 1.12.0 | 0.931+ | 3.2.x or 4.0.x | ^3.7 diff --git a/setup.py b/setup.py index 11a5ae240..019e1ca1b 100644 --- a/setup.py +++ b/setup.py @@ -36,7 +36,7 @@ def find_stub_files(name: str) -> List[str]: setup( name="django-stubs", - version="1.14.0", + version="1.15.0", description="Mypy stubs for Django", long_description=readme, long_description_content_type="text/markdown",
voxel51__fiftyone-3297
[BUG] fiftyone forces starlette=0.16.0 and it breaks integrations with applications that use FastAPI in newer versions.

### System information

- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**: Linux 18.04
- **FiftyOne installed from (pip or source)**: pip
- **FiftyOne version (run `fiftyone --version`)**: 0.17.2
- **Python version**: 3.8

### Commands to reproduce

requirements.txt:

```
fastapi==0.79.0
fiftyone==0.17.2
```

```
pip install -r requirements.txt
```

### Describe the problem

fiftyone cannot be used with newer versions of FastAPI, because it forces starlette to version starlette==0.16.0. Is it possible to relax the pin to a condition like starlette>=0.16.0? That way it would not break apps that use fiftyone.

### Code to reproduce issue

```
fastapi==0.79.0
fiftyone==0.17.2
pip install -r requirements.txt
```

### Other info / logs

```
#0 388.1
#0 388.1 The conflict is caused by:
#0 388.1     bentoml 1.0.4 depends on starlette
#0 388.1     fastapi 0.79.0 depends on starlette==0.19.1
#0 388.1     fiftyone 0.17.2 depends on starlette==0.16.0
#0 388.1
#0 388.1 To fix this you could try to:
#0 388.1 1. loosen the range of package versions you've specified
#0 388.1 2. remove package versions to allow pip attempt to solve the dependency conflict
#0 388.1
#0 388.1 ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
#0 388.1 WARNING: You are using pip version 22.0.4; however, version 22.3 is available.
#0 388.1 You should consider upgrading via the '/usr/bin/python -m pip install --upgrade pip' command.
------
```

### What areas of FiftyOne does this bug affect?

- [x] `App`: FiftyOne application issue
- [ ] `Core`: Core `fiftyone` Python library issue
- [ ] `Server`: FiftyOne server issue

### Willingness to contribute

The FiftyOne Community encourages bug fix contributions. Would you or another member of your organization be willing to contribute a fix for this bug to the FiftyOne codebase?

- [x] Yes. I can contribute a fix for this bug independently.
- [ ] Yes. I would be willing to contribute a fix for this bug with guidance from the FiftyOne community.
- [ ] No. I cannot contribute a bug fix at this time.
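To make the conflict concrete, here is a toy version-specifier check (a minimal sketch, not pip's real resolver): the exact pin `starlette==0.16.0` can never be satisfied together with FastAPI's `starlette==0.19.1`, while a lower bound like `starlette>=0.16.0` would admit it.

```python
def satisfies(version, spec):
    # Toy specifier matcher: supports only "==X" and ">=X" on dotted versions.
    op, val = spec[:2], spec[2:]
    v = tuple(map(int, version.split(".")))
    w = tuple(map(int, val.split(".")))
    return v == w if op == "==" else v >= w

# fastapi 0.79.0 needs starlette==0.19.1; fiftyone pinned starlette==0.16.0.
assert not satisfies("0.19.1", "==0.16.0")  # the exact pin rejects it
assert satisfies("0.19.1", ">=0.16.0")      # a lower bound would accept it
```

This is exactly the relaxation the PR diff for this row makes in `setup.py`: an exact/upper-bounded pin becomes `starlette>=0.24.0`.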
[ { "content": "#!/usr/bin/env python\n\"\"\"\nInstalls FiftyOne.\n\n| Copyright 2017-2023, Voxel51, Inc.\n| `voxel51.com <https://voxel51.com/>`_\n|\n\"\"\"\ntry:\n from importlib import metadata\nexcept ImportError:\n import importlib_metadata as metadata\n\nimport os\nimport re\nfrom setuptools import se...
[ { "content": "#!/usr/bin/env python\n\"\"\"\nInstalls FiftyOne.\n\n| Copyright 2017-2023, Voxel51, Inc.\n| `voxel51.com <https://voxel51.com/>`_\n|\n\"\"\"\ntry:\n from importlib import metadata\nexcept ImportError:\n import importlib_metadata as metadata\n\nimport os\nimport re\nfrom setuptools import se...
diff --git a/setup.py b/setup.py index 826a6ba39a5..002e6a8e3ed 100644 --- a/setup.py +++ b/setup.py @@ -67,7 +67,7 @@ def get_version(): "setuptools", "sseclient-py>=1.7.2,<2", "sse-starlette>=0.10.3,<1", - "starlette>=0.24.0,<0.27", + "starlette>=0.24.0", "strawberry-graphql==0.138.1", "tabulate", "xmltodict",
learningequality__kolibri-5872
update perseus to use new build config scheme

### Observed behavior

follow-up from #5864, need to update perseus to use new buildconfig. Currently builds but does not run.

### Errors and logs

Currently getting:

```
ERROR Internal Server Error: /en/user/
Traceback (most recent call last):
  File "/Users/d/Projects/le/kolibri/kolibri/core/webpack/hooks.py", line 111, in _stats_file_content
    with io.open(self._stats_file, mode="r", encoding="utf-8") as f:
FileNotFoundError: [Errno 2] No such file or directory: '/Users/d/Projects/le/kolibri/.venv/lib/python3.7/site-packages/kolibri_exercise_perseus_plugin/build/_stats.json'
```

### Context

current 0.13.0 develop branch
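The traceback comes from an unconditional `io.open` of the plugin's webpack stats file. As a sketch (hypothetical helper and error message, not the actual `hooks.py` code), a guarded read makes the failure mode explicit:

```python
import io
import json
import os

def read_stats(stats_file):
    # A plugin packaged under the old build scheme never produced
    # build/_stats.json, so fail with a pointed message instead of a
    # bare FileNotFoundError at request time.
    if not os.path.exists(stats_file):
        raise RuntimeError(
            "%s not found -- was this plugin built with the new build config?"
            % stats_file
        )
    with io.open(stats_file, mode="r", encoding="utf-8") as f:
        return json.load(f)
```

The real fix is to rebuild perseus with the new build config so the stats file exists; the guard only turns a confusing server error into a diagnosable one.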
[ { "content": "import argparse\nimport importlib\nimport json\nimport logging\nimport os\nimport sys\nimport tempfile\n\nfrom pkg_resources import DistributionNotFound\nfrom pkg_resources import get_distribution\nfrom pkg_resources import resource_exists\nfrom pkg_resources import resource_filename\nfrom pkg_res...
[ { "content": "import argparse\nimport importlib\nimport json\nimport logging\nimport os\nimport sys\nimport tempfile\n\nfrom pkg_resources import DistributionNotFound\nfrom pkg_resources import get_distribution\nfrom pkg_resources import resource_exists\nfrom pkg_resources import resource_filename\nfrom pkg_res...
diff --git a/kolibri/core/package.json b/kolibri/core/package.json index 89b1a9847aa..407ca6f379e 100644 --- a/kolibri/core/package.json +++ b/kolibri/core/package.json @@ -40,7 +40,7 @@ "vuex": "^3.1.0" }, "devDependencies": { - "kolibri-tools": "0.12.0-beta.3.2", + "kolibri-tools": "0.13.0-dev.3", "responselike": "^1.0.2" } -} +} \ No newline at end of file diff --git a/package.json b/package.json index f32f1d4fc70..057a24fd3ff 100644 --- a/package.json +++ b/package.json @@ -54,7 +54,7 @@ "devDependencies": { "@types/jest": "^24.0.12", "black-fmt": "https://github.com/learningequality/black-fmt#v0.1.3", - "kolibri-tools": "0.12.0-beta.3.2", + "kolibri-tools": "0.13.0-dev.3", "yarn-run-all": "^3.1.1" }, "optionalDependencies": { @@ -65,4 +65,4 @@ "node": "10.x", "yarn": ">= 1.12.3" } -} +} \ No newline at end of file diff --git a/packages/eslint-plugin-kolibri/package.json b/packages/eslint-plugin-kolibri/package.json index 1e88fbbcfac..dd57c58f7ca 100644 --- a/packages/eslint-plugin-kolibri/package.json +++ b/packages/eslint-plugin-kolibri/package.json @@ -1,6 +1,6 @@ { "name": "eslint-plugin-kolibri", - "version": "0.12.0-beta.3.2", + "version": "0.13.0-dev.3", "description": "Custom rules.", "author": "Learning Equality", "main": "lib/index.js", @@ -14,4 +14,4 @@ "node": ">=0.10.0" }, "license": "MIT" -} +} \ No newline at end of file diff --git a/packages/kolibri-core-for-export/package.json b/packages/kolibri-core-for-export/package.json index 04b9050f3ff..830d8eb2024 100644 --- a/packages/kolibri-core-for-export/package.json +++ b/packages/kolibri-core-for-export/package.json @@ -1,6 +1,6 @@ { "name": "kolibri", - "version": "0.12.0-beta.3.2", + "version": "0.13.0-dev.3", "description": "The Kolibri core API", "repository": "github.com/learningequality/kolibri", "author": "Learning Equality", @@ -8,6 +8,6 @@ "private": false, "dependencies": {}, "devDependencies": { - "kolibri-tools": "0.12.0-beta.3.2" + "kolibri-tools": "0.13.0-dev.3" } -} +} \ No newline 
at end of file diff --git a/packages/kolibri-tools/.npmignore b/packages/kolibri-tools/.npmignore index 8b38d75008a..2b60090d23d 100644 --- a/packages/kolibri-tools/.npmignore +++ b/packages/kolibri-tools/.npmignore @@ -1 +1 @@ -build.js +build_kolibri_tools.js diff --git a/packages/kolibri-tools/lib/read_webpack_json.js b/packages/kolibri-tools/lib/read_webpack_json.js index 0cd09255ade..4098a251d11 100644 --- a/packages/kolibri-tools/lib/read_webpack_json.js +++ b/packages/kolibri-tools/lib/read_webpack_json.js @@ -5,31 +5,11 @@ const temp = require('temp').track(); const webpack_json = path.resolve(path.dirname(__filename), './webpack_json.py'); -function parseConfig(buildConfig, pythonData) { +function parseConfig(buildConfig, pythonData, configPath, index = null) { // Set the main entry for this module, set the name based on the data.name and the path to the // entry file from the data.src_file const bundleId = buildConfig.bundle_id; - const webpackConfig = buildConfig.webpack_config; const pluginPath = pythonData.plugin_path; - if (typeof webpackConfig.entry === 'string') { - webpackConfig.entry = { - [bundleId]: path.join(pluginPath, webpackConfig.entry), - }; - } else { - Object.keys(webpackConfig.entry).forEach(key => { - function makePathAbsolute(entryPath) { - if (entryPath.startsWith('./') || entryPath.startsWith('../')) { - return path.join(pluginPath, entryPath); - } - return entryPath; - } - if (Array.isArray(webpackConfig.entry[key])) { - webpackConfig.entry[key] = webpackConfig.entry[key].map(makePathAbsolute); - } else { - webpackConfig.entry[key] = makePathAbsolute(webpackConfig.entry[key]); - } - }); - } return { name: bundleId, static_dir: path.join(pluginPath, 'static'), @@ -37,7 +17,8 @@ function parseConfig(buildConfig, pythonData) { locale_data_folder: pythonData.locale_data_folder, plugin_path: pluginPath, version: pythonData.version, - config: webpackConfig, + config_path: configPath, + index, }; } @@ -84,13 +65,14 @@ module.exports = 
function({ pluginFile, plugins, pluginPath }) { const parsedResult = JSON.parse(result); const output = []; parsedResult.forEach(pythonData => { - const buildConfig = require(path.join(pythonData.plugin_path, 'buildConfig.js')); + const configPath = path.join(pythonData.plugin_path, 'buildConfig.js'); + const buildConfig = require(configPath); if (Array.isArray(buildConfig)) { - buildConfig.forEach(configObj => { - output.push(parseConfig(configObj, pythonData)); + buildConfig.forEach((configObj, i) => { + output.push(parseConfig(configObj, pythonData, configPath, i)); }); } else { - output.push(parseConfig(buildConfig, pythonData)); + output.push(parseConfig(buildConfig, pythonData, configPath)); } }); return output; diff --git a/packages/kolibri-tools/lib/webpack.config.base.js b/packages/kolibri-tools/lib/webpack.config.base.js index 141090bcb2a..86169fc5a5f 100644 --- a/packages/kolibri-tools/lib/webpack.config.base.js +++ b/packages/kolibri-tools/lib/webpack.config.base.js @@ -38,7 +38,7 @@ const WebpackMessages = require('./webpackMessages'); module.exports = (data, { mode = 'development', hot = false } = {}) => { if ( typeof data.name === 'undefined' || - typeof data.config === 'undefined' || + typeof data.config_path === 'undefined' || typeof data.static_dir === 'undefined' || typeof data.stats_file === 'undefined' || typeof data.locale_data_folder === 'undefined' || @@ -48,7 +48,32 @@ module.exports = (data, { mode = 'development', hot = false } = {}) => { logging.error(data.name + ' plugin is misconfigured, missing parameter(s)'); return; } - + const configData = require(data.config_path); + let webpackConfig; + if (data.index !== null) { + webpackConfig = configData[data.index].webpack_config; + } else { + webpackConfig = configData.webpack_config; + } + if (typeof webpackConfig.entry === 'string') { + webpackConfig.entry = { + [data.name]: path.join(data.plugin_path, webpackConfig.entry), + }; + } else { + Object.keys(webpackConfig.entry).forEach(key => 
{ + function makePathAbsolute(entryPath) { + if (entryPath.startsWith('./') || entryPath.startsWith('../')) { + return path.join(data.plugin_path, entryPath); + } + return entryPath; + } + if (Array.isArray(webpackConfig.entry[key])) { + webpackConfig.entry[key] = webpackConfig.entry[key].map(makePathAbsolute); + } else { + webpackConfig.entry[key] = makePathAbsolute(webpackConfig.entry[key]); + } + }); + } const production = mode === 'production'; const cssInsertionLoader = hot ? 'style-loader' : MiniCssExtractPlugin.loader; @@ -82,7 +107,7 @@ module.exports = (data, { mode = 'development', hot = false } = {}) => { let externals; - if (!data.config.output || data.config.output.library !== kolibriName) { + if (!webpackConfig.output || webpackConfig.output.library !== kolibriName) { // If this is not the core bundle, then we need to add the external library mappings. externals = coreExternals; } else { @@ -236,7 +261,7 @@ module.exports = (data, { mode = 'development', hot = false } = {}) => { stats: 'minimal', }; - bundle = merge.smart(bundle, data.config); + bundle = merge.smart(bundle, webpackConfig); return bundle; }; diff --git a/packages/kolibri-tools/lib/webpack_json.py b/packages/kolibri-tools/lib/webpack_json.py index 976d6a6de22..b1fbd757008 100644 --- a/packages/kolibri-tools/lib/webpack_json.py +++ b/packages/kolibri-tools/lib/webpack_json.py @@ -15,6 +15,9 @@ logger = logging.getLogger("webpack_json") logger.setLevel(level=logging.INFO) +handler = logging.StreamHandler() +handler.setLevel(logging.INFO) +logger.addHandler(handler) BUILD_CONFIG = "buildConfig.js" diff --git a/packages/kolibri-tools/package.json b/packages/kolibri-tools/package.json index 0f615cd15d7..4f52b5b7e62 100644 --- a/packages/kolibri-tools/package.json +++ b/packages/kolibri-tools/package.json @@ -1,6 +1,6 @@ { "name": "kolibri-tools", - "version": "0.12.0-beta.3.2", + "version": "0.13.0-dev.3", "description": "Tools for building Kolibri frontend plugins", "main": "lib/cli.js", 
"repository": "github.com/learningequality/kolibri", @@ -32,7 +32,7 @@ "eslint-config-prettier": "^2.9.0", "eslint-plugin-import": "^2.14.0", "eslint-plugin-jest": "^21.26.1", - "eslint-plugin-kolibri": "0.12.0-beta.3.2", + "eslint-plugin-kolibri": "0.13.0-dev.3", "eslint-plugin-vue": "^5.2.2", "espree": "^5.0.1", "esquery": "^1.0.1", @@ -93,6 +93,6 @@ "readline-sync": "^1.4.9" }, "optionalDependencies": { - "kolibri": "0.12.0-beta.3.2" + "kolibri": "0.13.0-dev.3" } -} +} \ No newline at end of file diff --git a/packages/kolibri-tools/test/test_webpack.config.base.spec.js b/packages/kolibri-tools/test/test_webpack.config.base.spec.js index d4d471865b1..fe77c30aef8 100644 --- a/packages/kolibri-tools/test/test_webpack.config.base.spec.js +++ b/packages/kolibri-tools/test/test_webpack.config.base.spec.js @@ -16,6 +16,16 @@ jest.mock('../lib/logging', () => ({ }, })); +jest.mock( + 'test', + () => ({ + webpack_config: { + entry: 'test', + }, + }), + { virtual: true } +); + const baseData = { name: 'kolibri.plugin.test.test_plugin', stats_file: 'output.json', @@ -24,11 +34,8 @@ const baseData = { locale_data_folder: 'kolibri/locale/test', version: 'test', plugin_path: 'kolibri/plugin', - config: { - entry: { - test_plugin: 'src/file.js', - }, - }, + config_path: 'test', + index: null, }; describe('webpackConfigBase', function() { @@ -86,9 +93,9 @@ describe('webpackConfigBase', function() { expectParsedDataIsUndefined(data); }); }); - describe('input is missing config, bundles output', function() { + describe('input is missing config_path, bundles output', function() { it('should be undefined', function() { - delete data.config; + delete data.config_path; expectParsedDataIsUndefined(data); }); }); diff --git a/requirements/base.txt b/requirements/base.txt index f2df3082486..f3274de4b98 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -17,7 +17,7 @@ porter2stemmer==1.0 unicodecsv==0.14.1 metafone==0.5 le-utils==0.1.17 
-kolibri_exercise_perseus_plugin==1.2.0 +kolibri_exercise_perseus_plugin==1.2.1 jsonfield==2.0.2 morango==0.4.9 requests-toolbelt==0.8.0
oppia__oppia-8773
All the Frontend services should be documented with jsdoc. **This starter issue is currently on hold because we do not have the capacity to support new contributors working on it.** -------------- We aim to document all the files listed below. Each of the below-listed files should have a file overview signifying the purpose of the file, and each function should have its meaning, arguments and return statement documented with the help of jsdoc decorators like `@fileoverview`, `@param`, `@return`. You can go through these services for reference: - graph-input-rules.service.ts - exploration-html-formatter.service.ts - graph-utils.service.ts - alerts.service.ts - playthrough-issues.service.ts **Deducing a variable's significance and meaning from the code:** Try to execute the code by running a dev server locally, log the variable type (you can use typeof for this), and try to find out the purpose of the variable (what is the variable storing, what is it being used for, what would break if we removed the variable?). To figure out how to execute the code, grep to see what methods call the function, and add console logs to ensure that the code is being executed when you perform the corresponding action in the UI. (As a sanity check, you might also want to ensure that the suspected variable type is consistent with any TypeScript types that are already provided.) **Overview of the function:** Finding or deducing the overview or the purpose of the function can sometimes be a bit tricky; some general advice is to think-- - why is this function even required, what does it help us achieve. Try to think from the perspective of the person who created the function and try to mimic the thought process of the original author. - Look at the callers of the function, see all the places where this function is being called, and try to get a better understanding of the function.
- If you are unable to understand the purpose of the function, feel free to reach out to your mentor (always happy to help). Please go through this [doc](https://docs.google.com/document/d/1jr8X3oqW7WqKxOgsK8b4TxIraODAV23vDJgYso1R7Pk/edit?usp=sharing) for deeper context. **Please don't include types in the JSDoc, use the TypeScript annotations for that.** PRs for reference: [#8773](https://github.com/oppia/oppia/pull/8773) **To be assigned to a file or for any queries, comment on the thread and tag @nithusha21.** The service files listed below need to be documented: - [ ] admin-config-tab-backend-api.service.ts - [ ] admin-data.service.ts - [ ] admin-router.service.ts @anumehaagrawal - [ ] admin-task-manager.service.ts @larakhdavies - [ ] alerts.service.ts - [ ] angular-name.service.ts @parulpriyedarshani - [ ] answer-classification.service.ts - [ ] answer-groups-cache.service.ts - [ ] assets-backend-api.service.ts - [ ] audio-player.service.ts - [ ] audio-preloader.service.ts - [ ] audio-translation-language.service.ts @kaylahardie - [ ] audio-translation-manager.service.ts - [ ] autogenerated-audio-player.service.ts @BlakeHan01 - [ ] autoplayed-videos.service.ts @darkpsychic - [ ] autosave-info-modals.service.ts - [ ] background-mask.service.ts - [ ] base-undo-redo.service.ts - [ ] browser-checker.service.ts - [ ] change-list.service.ts - [ ] changes-in-human-readable-form.service.ts - [ ] classroom-backend-api.service.ts @ReshuKumari - [ ] code-normalizer.service.ts - [ ] collection-creation-backend-api.service.ts - [ ] collection-creation.service.ts - [ ] collection-editor-state.service.ts - [ ] collection-linearizer.service.ts - [ ] collection-rights-backend-api.service.ts - [ ] collection-update.service.ts - [ ] collection-validation.service.ts - [ ] compare-versions.service.ts - [ ] compute-graph.service.ts - [ ] concept-card-backend-api.service.ts - [ ] construct-translation-ids.service.ts @BlakeHan01 - [ ] context.service.ts - [ ]
contribution-and-review.service.ts @lelouchB - [ ] contribution-opportunities-backend-api.service.ts - [ ] contribution-opportunities.service.ts - [ ] creator-dashboard-backend-api.service.ts - [ ] csrf-token.service.ts - [ ] current-interaction.service.ts - [ ] date-time-format.service.ts @linnhallonqvist - [ ] debouncer.service.ts - [ ] debug-info-tracker.service.ts - [ ] device-info.service.ts - [ ] document-attribute-customization.service.ts - [ ] editability.service.ts - [ ] editable-collection-backend-api.service.ts - [ ] editable-exploration-backend-api.service.ts - [ ] editable-question-backend-api.service.ts - [ ] editable-skill-backend-api.service.ts - [ ] editable-story-backend-api.service.ts - [ ] editable-topic-backend-api.service.ts - [ ] editor-first-time-events.service.ts - [ ] email-dashboard-data.service.ts - [ ] exploration-automatic-text-to-speech.service.ts - [ ] exploration-category.service.ts - [ ] exploration-correctness-feedback.service.ts - [ ] exploration-creation.service.ts - [ ] exploration-data.service.ts - [ ] exploration-diff.service.ts - [ ] exploration-embed-button.service.ts - [ ] exploration-engine.service.ts - [ ] exploration-features-backend-api.service.ts - [ ] exploration-features.service.ts @parulpriyedarshani - [ ] exploration-html-formatter.service.ts - [ ] exploration-init-state-name.service.ts - [ ] exploration-language-code.service.ts - [ ] exploration-objective.service.ts - [ ] exploration-param-changes.service.ts - [ ] exploration-param-specs.service.ts - [ ] exploration-player-state.service.ts - [ ] exploration-property.service.ts - [ ] exploration-recommendations.service.ts - [ ] exploration-rights.service.ts - [ ] exploration-save.service.ts - [ ] exploration-states.service.ts - [ ] exploration-summary-backend-api.service.ts - [ ] exploration-tags.service.ts @shrutisatish00 - [ ] exploration-title.service.ts - [ ] exploration-warnings.service.ts - [ ] expression-evaluator.service.ts - [ ] 
expression-interpolation.service.ts - [ ] expression-parser.service.ts - [ ] expression-syntax-tree.service.ts - [ ] expression-type-parser.service.ts - [ ] extension-tag-assembler.service.ts - [ ] extract-image-filenames-from-state.service.ts - [ ] fatigue-detection.service.ts - [ ] focus-manager.service.ts - [ ] generate-content-id.service.ts - [ ] graph-data.service.ts - [ ] graph-layout.service.ts - [ ] guest-collection-progress.service.ts - [ ] hint-and-solution-modal.service.ts - [ ] hints-and-solution-manager.service.ts - [ ] html-escaper.service.ts @tianqi-wu - [ ] id-generation.service.ts - [ ] image-preloader.service.ts - [ ] image-upload-helper.service.ts - [ ] improvement-modal.service.ts - [ ] improvement-task.service.ts - [ ] improvements-display.service.ts - [ ] improvements.service.ts - [ ] interaction-details-cache.service.ts - [ ] language-util.service.ts - [ ] learner-action-render.service.ts - [ ] learner-answer-details-backend-api.service.ts - [ ] learner-answer-details-data.service.ts - [ ] learner-answer-info.service.ts - [ ] learner-dashboard-backend-api.service.ts - [ ] learner-dashboard-ids-backend-api.service.ts - [ ] learner-params.service.ts - [ ] learner-playlist.service.ts - [ ] learner-view-rating.service.ts - [ ] local-storage.service.ts - [ ] logger.service.ts @remigourdon - [ ] messenger.service.ts @remigourdon - [ ] meta-tag-customization.service.ts - [ ] navigation.service.ts - [ ] nested-directives-recursion-timeout-prevention.service.ts - [ ] number-attempts.service.ts @gp201 - [ ] page-title.service.ts - [ ] parameter-metadata.service.ts - [ ] player-correctness-feedback-enabled.service.ts - [ ] player-position.service.ts @tianqi-wu - [ ] player-transcript.service.ts - [ ] playthrough-issues-backend-api.service.ts - [ ] playthrough-issues.service.ts - [ ] playthrough.service.ts - [ ] prediction-algorithm-registry.service.ts - [ ] pretest-question-backend-api.service.ts - [ ] promo-bar.service.ts - [ ] 
question-backend-api.service.ts - [ ] question-creation.service.ts - [ ] question-player-engine.service.ts - [ ] question-player-state.service.ts - [ ] question-suggestion.service.ts - [ ] question-undo-redo.service.ts - [ ] question-update.service.ts - [ ] questions-list.service.ts - [ ] rating-computation.service.ts - [ ] read-only-collection-backend-api.service.ts - [ ] read-only-exploration-backend-api.service.ts - [ ] refresher-exploration-confirmation-modal.service.ts - [ ] request-interceptor.service.ts - [ ] responses.service.ts - [ ] review-test-backend-api.service.ts - [ ] review-test-engine.service.ts - [ ] router.service.ts - [ ] rte-helper.service.ts - [ ] schema-default-value.service.ts - [ ] schema-undefined-last-element.service.ts - [ ] search-explorations-backend-api.service.ts - [ ] search.service.ts - [ ] sidebar-status.service.ts - [ ] site-analytics.service.ts - [ ] skill-creation.service.ts - [ ] skill-editor-routing.service.ts - [ ] skill-editor-state.service.ts - [ ] skill-mastery-backend-api.service.ts - [ ] skill-rights-backend-api.service.ts - [ ] skill-update.service.ts - [ ] solution-validity.service.ts - [ ] solution-verification.service.ts - [ ] speech-synthesis-chunker.service.ts - [ ] state-classifier-mapping.service.ts - [ ] state-content.service.ts - [ ] state-customization-args.service.ts - [ ] state-editor.service.ts - [ ] state-hints.service.ts - [ ] state-improvement-suggestion.service.ts @bobbychen1999 - [ ] state-interaction-id.service.ts - [ ] state-name.service.ts - [ ] state-param-changes.service.ts - [ ] state-property.service.ts - [ ] state-recorded-voiceovers.service.ts - [ ] state-rules-stats.service.ts - [ ] state-solicit-answer-details.service.ts - [ ] state-solution.service.ts - [ ] state-top-answers-stats-backend-api.service.ts - [ ] state-top-answers-stats.service.ts - [ ] state-tutorial-first-time.service.ts @akeeoaobh - [ ] state-written-translations.service.ts - [ ] stats-reporting.service.ts - [ ] 
story-creation.service.ts - [ ] story-editor-state.service.ts @pengcheng95 - [ ] story-update.service.ts - [ ] story-viewer-backend-api.service.ts - [ ] subtopic-viewer-backend-api.service.ts - [ ] suggestion-modal-for-creator-view.service.ts - [ ] suggestion-modal-for-exploration-editor.service.ts - [ ] suggestion-modal-for-exploration-player.service.ts - [ ] suggestion-modal-for-learner-dashboard.service.ts - [ ] suggestion-modal.service.ts - [ ] thread-data.service.ts - [ ] thread-status-display.service.ts - [ ] topic-creation.service.ts - [ ] topic-editor-routing.service.ts - [ ] topic-editor-state.service.ts - [ ] topic-rights-backend-api.service.ts - [ ] topic-update.service.ts - [ ] topic-viewer-backend-api.service.ts - [ ] topics-and-skills-dashboard-backend-api.service.ts - [ ] training-data-editor-panel.service.ts - [ ] training-data.service.ts @felicityzhao99 - [ ] training-modal.service.ts @varuncj02 - [ ] translate-text.service.ts - [ ] translation-file-hash-loader.service.ts - [ ] translation-language.service.ts - [ ] translation-status.service.ts - [ ] translation-tab-active-content-id.service.ts - [ ] translation-tab-active-mode.service.ts - [ ] undo-redo.service.ts - [ ] url-interpolation.service.ts @qinghaoyang - [ ] url.service.ts @tianqi-wu - [ ] user-email-preferences.service.ts @felicityzhao99 - [ ] user-exploration-permissions.service.ts - [ ] user.service.ts - [ ] utils.service.ts @rriyaldhi - [ ] validators.service.ts - [ ] version-tree.service.ts - [ ] voiceover-recording.service.ts - [ ] window-dimensions.service.ts @asafprivman - [ ] window-ref.service.ts @larakhdavies Note: For a guide on how to access Oppia's webpages, see [this](https://github.com/oppia/oppia/wiki/How-to-access-Oppia-webpages).
[ { "content": "# Copyright 2019 The Oppia Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n...
[ { "content": "# Copyright 2019 The Oppia Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n...
diff --git a/core/templates/pages/exploration-editor-page/services/user-email-preferences.service.ts b/core/templates/pages/exploration-editor-page/services/user-email-preferences.service.ts index 52e55b80ca706..aedb11d3fc0f0 100644 --- a/core/templates/pages/exploration-editor-page/services/user-email-preferences.service.ts +++ b/core/templates/pages/exploration-editor-page/services/user-email-preferences.service.ts @@ -34,24 +34,42 @@ angular.module('oppia').factory('UserEmailPreferencesService', [ this.feedbackNotificationsMuted = feedbackNotificationsMuted; this.suggestionNotificationsMuted = suggestionNotificationsMuted; }, + /** + * @return {boolean} Whether the feedback notification is muted. + */ areFeedbackNotificationsMuted: function() { return this.feedbackNotificationsMuted; }, + /** + * @return {boolean} Whether the suggestion notification is muted. + */ areSuggestionNotificationsMuted: function() { return this.suggestionNotificationsMuted; }, + /** + * Set the message type to feedback and mute to true or false. + * @param {boolean} mute - Whether the feedback notification is muted. + */ setFeedbackNotificationPreferences: function(mute) { this.saveChangeToBackend({ message_type: MESSAGE_TYPE_FEEDBACK, mute: mute }); }, + /** + * Set the message type to suggestion and mute to true or false. + * @param {boolean} mute - Whether the suggestion notification is muted. + */ setSuggestionNotificationPreferences: function(mute) { this.saveChangeToBackend({ message_type: MESSAGE_TYPE_SUGGESTION, mute: mute }); }, + /** + * Save the change of message_type and mute to backend. + * @param {object} requestParams - Info about message_type and mute. 
+ */ saveChangeToBackend: function(requestParams) { var that = this; var emailPreferencesUrl = UrlInterpolationService.interpolateUrl( diff --git a/scripts/create_expression_parser.py b/scripts/create_expression_parser.py index d34d2ebd5039f..0695d6835fd41 100644 --- a/scripts/create_expression_parser.py +++ b/scripts/create_expression_parser.py @@ -18,9 +18,7 @@ from __future__ import unicode_literals # pylint: disable=import-only-modules import argparse -import fileinput import os -import re import subprocess import python_utils
freedomofpress__securedrop-6051
Alembic operations fail with multiple head revisions ## Description All Alembic operations fail with Alembic error: ERROR [alembic.util.messaging] Multiple head revisions are present for given argument 'head'; please specify a specific target revision, '<branchname>@head' to narrow to a specific head, or 'heads' for all heads Cf. consistent recent failures of CI jobs `app-tests` and `staging-test-with-rebase` since #5974. ## Steps to Reproduce `make test` on `develop`; open or push to a PR; etc. ## Expected Behavior Alembic operations succeed and Alembic-based tests pass. ## Actual Behavior All Alembic operations and tests fail with Alembic error: ERROR [alembic.util.messaging] Multiple head revisions are present for given argument 'head'; please specify a specific target revision, '<branchname>@head' to narrow to a specific head, or 'heads' for all heads ## Comments This is essentially an Alembic-level merge-conflict. PR forthcoming with the one-line fix.
[ { "content": "\"\"\"unique_index_for_instanceconfig_valid_until\n\nRevision ID: 1ddb81fb88c2\nRevises: 92fba0be98e9\nCreate Date: 2021-06-04 17:28:25.725563\n\n\"\"\"\nfrom alembic import op\nimport sqlalchemy as sa\n\n\n# revision identifiers, used by Alembic.\nrevision = '1ddb81fb88c2'\ndown_revision = '92fba...
[ { "content": "\"\"\"unique_index_for_instanceconfig_valid_until\n\nRevision ID: 1ddb81fb88c2\nRevises: 92fba0be98e9\nCreate Date: 2021-06-04 17:28:25.725563\n\n\"\"\"\nfrom alembic import op\nimport sqlalchemy as sa\n\n\n# revision identifiers, used by Alembic.\nrevision = '1ddb81fb88c2'\ndown_revision = 'b060f...
diff --git a/securedrop/alembic/versions/1ddb81fb88c2_unique_index_for_instanceconfig_valid_.py b/securedrop/alembic/versions/1ddb81fb88c2_unique_index_for_instanceconfig_valid_.py index 74342aed1e..ec65d3a49d 100644 --- a/securedrop/alembic/versions/1ddb81fb88c2_unique_index_for_instanceconfig_valid_.py +++ b/securedrop/alembic/versions/1ddb81fb88c2_unique_index_for_instanceconfig_valid_.py @@ -11,7 +11,7 @@ # revision identifiers, used by Alembic. revision = '1ddb81fb88c2' -down_revision = '92fba0be98e9' +down_revision = 'b060f38c0c31' branch_labels = None depends_on = None
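A toy sketch of why the one-line re-parenting in the diff above resolves the error: Alembic treats any revision that no other revision names as its `down_revision` as a "head", so two migrations sharing the same parent produce two heads until one is re-parented. The revision ids below are the real ones from the diff; the `find_heads` helper is our own simplification of Alembic's graph logic, not its actual API.

```python
# Toy model of an Alembic revision graph: revision -> down_revision.
def find_heads(graph):
    """A head is any revision that no other revision descends from."""
    parents = set(graph.values())
    return sorted(rev for rev in graph if rev not in parents)

broken = {
    "92fba0be98e9": None,             # earlier revision (treated as root here)
    "b060f38c0c31": "92fba0be98e9",
    "1ddb81fb88c2": "92fba0be98e9",   # conflict: same parent as the line above
}
# The fix re-parents 1ddb81fb88c2 onto b060f38c0c31, as in the diff.
fixed = dict(broken, **{"1ddb81fb88c2": "b060f38c0c31"})

assert find_heads(broken) == ["1ddb81fb88c2", "b060f38c0c31"]  # two heads -> error
assert find_heads(fixed) == ["1ddb81fb88c2"]                   # single head again
```

With two heads, `alembic upgrade head` cannot pick a target, which is exactly the "Multiple head revisions are present" failure quoted in the issue.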
spack__spack-18268
Installation issue: dbus (missing libsm dependency) <!-- Thanks for taking the time to report this build failure. To proceed with the report please: 1. Title the issue "Installation issue: <name-of-the-package>". 2. Provide the information required below. We encourage you to try, as much as possible, to reduce your problem to the minimal example that still reproduces the issue. That would help us a lot in fixing it quickly and effectively! --> I am trying to install visit, and I am hitting an error when it tries to install dbus. This appears to be due to dbus depending on libSM (and through that libuuid), but not declaring that dependency in Spack. So in my build of visit, the libuuid dependency is picked up and set to use the spack installed libuuid via some other package visit depends on, but dbus ends up using the system installed libSM, and there is a mismatch between the two. But the dbus package should not be linking against system libSM. ### Steps to reproduce the issue Running spack install dbus@1.12.8%gcc@8.4.0 ^libuuid@1.0.3 eventually aborts with CCLD dbus-run-session /lib/../lib64/libSM.so: undefined reference to `uuid_unparse_lower@UUID_1.0' /lib/../lib64/libSM.so: undefined reference to `uuid_generate@UUID_1.0' collect2: error: ld returned 1 exit status Error appears due to the attempt to link the system /lib64/libSM.so ### Information on your system spack debug report * **Spack:** 0.14.2 * **Python:** 2.7.16 * **Platform:** linux-rhel7-broadwell ### Additional information [spack-build-env.txt](https://github.com/spack/spack/files/5125717/spack-build-env.txt) [spack-build-out.txt](https://github.com/spack/spack/files/5125718/spack-build-out.txt) No maintainers for dbus ### General information <!-- These boxes can be checked by replacing [ ] with [x] or by clicking them after submitting the issue.
--> - [x] I have run `spack debug report` and reported the version of Spack/Python/Platform - [x] I have run `spack maintainers <name-of-the-package>` and @mentioned any maintainers - [x] I have uploaded the build log and environment files - [x] I have searched the issues of this repo and believe this is not a duplicate
[ { "content": "# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other\n# Spack Project Developers. See the top-level COPYRIGHT file for details.\n#\n# SPDX-License-Identifier: (Apache-2.0 OR MIT)\n\nfrom spack import *\n\n\nclass Dbus(Package):\n \"\"\"D-Bus is a message bus system, a simpl...
[ { "content": "# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other\n# Spack Project Developers. See the top-level COPYRIGHT file for details.\n#\n# SPDX-License-Identifier: (Apache-2.0 OR MIT)\n\nfrom spack import *\n\n\nclass Dbus(Package):\n \"\"\"D-Bus is a message bus system, a simpl...
diff --git a/var/spack/repos/builtin/packages/dbus/package.py b/var/spack/repos/builtin/packages/dbus/package.py index 31495a06b5510a..f47f7f4b16265d 100644 --- a/var/spack/repos/builtin/packages/dbus/package.py +++ b/var/spack/repos/builtin/packages/dbus/package.py @@ -30,6 +30,7 @@ class Dbus(Package): depends_on('pkgconfig', type='build') depends_on('expat') depends_on('glib') + depends_on('libsm') def install(self, spec, prefix): configure(
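The fix above is a single declared edge in the dependency graph. A toy sketch of why the missing edge caused the link error (a simplified stand-in resolver, not Spack's actual concretizer): without `dbus -> libsm`, libSM comes from the system while libuuid comes from Spack, producing the undefined-reference mismatch.

```python
# Toy dependency closure: package -> direct dependencies (names from the issue).
deps = {
    "dbus": ["pkgconfig", "expat", "glib"],   # as declared before the fix
    "libsm": ["libuuid"],
}

def closure(pkg, deps):
    """All transitive dependencies of pkg."""
    seen, stack = set(), [pkg]
    while stack:
        for d in deps.get(stack.pop(), []):
            if d not in seen:
                seen.add(d)
                stack.append(d)
    return seen

assert "libsm" not in closure("dbus", deps)           # system libSM leaks in
deps["dbus"] = deps["dbus"] + ["libsm"]               # the one-line fix
assert {"libsm", "libuuid"} <= closure("dbus", deps)  # Spack now provides both
```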
oobabooga__text-generation-webui-4905
coqui_tts fails to load as assumes interactive sessions to accept ToS ### Describe the bug When enabled coqui_tts prevents textgen from starting as it expects an interactive session for a user to accept a ToS agreement ### Is there an existing issue for this? - [X] I have searched the existing issues ### Reproduction - Enable coqui_tts - Restart textgen - Note that textgen never starts - Check console logs ``` 2023-12-12 22:13:22 INFO:Loading the extension "coqui_tts"... [XTTS] Loading XTTS... > You must agree to the terms of service to use this model. | > Please see the terms of service at https://coqui.ai/cpml.txt | > "I have read, understood and agreed to the Terms and Conditions." - [y/n] ``` - No way to accept non-interactively ### Screenshot _No response_ ### Logs ```shell INFO: Started server process [37] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://0.0.0.0:5001 (Press CTRL+C to quit) 2023-12-12 22:13:18 DEBUG:Intercepting all calls to posthog. 2023-12-12 22:13:19 DEBUG:Creating Sentence Embedder... 2023-12-12 22:13:20 WARNING:Using embedded DuckDB without persistence: data will be transient 2023-12-12 22:13:22 DEBUG:Loading hyperparameters... 2023-12-12 22:13:22 INFO:Loading the extension "coqui_tts"... [XTTS] Loading XTTS... > You must agree to the terms of service to use this model. | > Please see the terms of service at https://coqui.ai/cpml.txt | > "I have read, understood and agreed to the Terms and Conditions." - [y/n] ``` ### System Info ```shell Latest official docker image running on server. ``` Note that a workaround for this is to remove coqui_tts and install "alltalk_tts" instead which seems to work without issue.
[ { "content": "import html\nimport json\nimport random\nimport time\nfrom pathlib import Path\n\nimport gradio as gr\n\nfrom modules import chat, shared, ui_chat\nfrom modules.logging_colors import logger\nfrom modules.ui import create_refresh_button\nfrom modules.utils import gradio\n\ntry:\n from TTS.api im...
[ { "content": "import os\nimport html\nimport json\nimport random\nimport time\nfrom pathlib import Path\n\nimport gradio as gr\n\nfrom modules import chat, shared, ui_chat\nfrom modules.logging_colors import logger\nfrom modules.ui import create_refresh_button\nfrom modules.utils import gradio\n\ntry:\n from...
diff --git a/extensions/coqui_tts/script.py b/extensions/coqui_tts/script.py index 81e85117d4..682cb94ca4 100644 --- a/extensions/coqui_tts/script.py +++ b/extensions/coqui_tts/script.py @@ -1,3 +1,4 @@ +import os import html import json import random @@ -26,6 +27,7 @@ raise +os.environ["COQUI_TOS_AGREED"] = "1" params = { "activate": True,
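The diff sets `COQUI_TOS_AGREED=1` before the TTS import so the library's interactive [y/n] prompt is skipped. A simplified sketch of the gate pattern involved (the environment variable name is the one from the fix; the helper itself is our own illustration, not Coqui's code):

```python
def tos_agreed(environ, ask=None):
    """Return whether the terms of service are accepted. An exported
    COQUI_TOS_AGREED=1 short-circuits the interactive prompt that
    otherwise hangs a non-interactive (e.g. Docker) startup."""
    if environ.get("COQUI_TOS_AGREED") == "1":
        return True
    if ask is None:  # no TTY available -> cannot consent interactively
        return False
    return ask('"I have read, understood and agreed to the Terms and Conditions." - [y/n] ') == "y"

assert tos_agreed({}) is False                        # the container-hang scenario
assert tos_agreed({"COQUI_TOS_AGREED": "1"}) is True  # effect of the one-line fix
```

Setting the variable must happen before the library is imported, which is why the fix places the assignment at module scope ahead of the `TTS` import.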
boto__botocore-1117
Support Python 3.6 Python 3.6 got released, and some distros (like Fedora) are switching to it.
[ { "content": "#!/usr/bin/env python\nimport botocore\nimport sys\n\nfrom setuptools import setup, find_packages\n\n\nrequires = ['jmespath>=0.7.1,<1.0.0',\n 'python-dateutil>=2.1,<3.0.0',\n 'docutils>=0.10']\n\n\nif sys.version_info[:2] == (2, 6):\n # For python2.6 we have a few other d...
[ { "content": "#!/usr/bin/env python\nimport botocore\nimport sys\n\nfrom setuptools import setup, find_packages\n\n\nrequires = ['jmespath>=0.7.1,<1.0.0',\n 'python-dateutil>=2.1,<3.0.0',\n 'docutils>=0.10']\n\n\nif sys.version_info[:2] == (2, 6):\n # For python2.6 we have a few other d...
diff --git a/.travis.yml b/.travis.yml index b4c6cd4f63..75633c23f1 100644 --- a/.travis.yml +++ b/.travis.yml @@ -5,7 +5,7 @@ python: - "3.3" - "3.4" - "3.5" - - "3.6-dev" + - "3.6" sudo: false before_install: - if [ "$TRAVIS_PULL_REQUEST" != "false" ] && [ "$TRAVIS_BRANCH" == "master" ]; then diff --git a/requirements.txt b/requirements.txt index 88d780206d..772ad4129c 100644 --- a/requirements.txt +++ b/requirements.txt @@ -1,4 +1,4 @@ -tox>=2.3.1,<3.0.0 +tox>=2.5.0,<3.0.0 python-dateutil>=2.1,<3.0.0 nose==1.3.0 mock==1.3.0 diff --git a/setup.py b/setup.py index ec976c7656..e16b6245f9 100644 --- a/setup.py +++ b/setup.py @@ -57,5 +57,6 @@ 'Programming Language :: Python :: 3.3', 'Programming Language :: Python :: 3.4', 'Programming Language :: Python :: 3.5', + 'Programming Language :: Python :: 3.6', ), ) diff --git a/tox.ini b/tox.ini index 8e0fb21423..17e3fb91a9 100644 --- a/tox.ini +++ b/tox.ini @@ -1,5 +1,5 @@ [tox] -envlist = py26,py27,py33,py34,py35 +envlist = py26,py27,py33,py34,py35,py36 skipsdist = True
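The truncated `setup.py` above shows the version-gated dependency pattern botocore uses: a base requirements list plus extras keyed off the running interpreter. A minimal runnable sketch of that pattern (the base package names are from the snippet; the 2.6-only extra is a hypothetical stand-in for the elided lines):

```python
def build_requires(version_info):
    """Return install requirements for a given sys.version_info tuple."""
    requires = ['jmespath>=0.7.1,<1.0.0',
                'python-dateutil>=2.1,<3.0.0',
                'docutils>=0.10']
    if version_info[:2] == (2, 6):
        # Hypothetical backport needed only on the oldest interpreter.
        requires.append('ordereddict==1.1')
    return requires

assert 'docutils>=0.10' in build_requires((3, 6, 0))
assert len(build_requires((2, 6, 9))) == len(build_requires((3, 6, 0))) + 1
```

Declaring 3.6 support itself needs no code change beyond adding the `Programming Language :: Python :: 3.6` classifier and the `py36` tox environment, as the diff does.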
huggingface__dataset-viewer-2409
Retry jobs that finish with `ClientConnection` error? Maybe here: https://github.com/huggingface/datasets-server/blob/f311a9212aaa91dd0373e5c2d4f5da9b6bdabcb5/chart/env/prod.yaml#L209 Internal conversation on Slack: https://huggingface.slack.com/archives/C0311GZ7R6K/p1698224875005729 Anyway: I'm wondering if we can have the error now that the dataset scripts are disabled by default.
[ { "content": "# SPDX-License-Identifier: Apache-2.0\n# Copyright 2022 The HuggingFace Authors.\n\nCACHE_COLLECTION_RESPONSES = \"cachedResponsesBlue\"\nCACHE_MONGOENGINE_ALIAS = \"cache\"\nHF_DATASETS_CACHE_APPNAME = \"hf_datasets_cache\"\nPARQUET_METADATA_CACHE_APPNAME = \"datasets_server_parquet_metadata\"\nD...
[ { "content": "# SPDX-License-Identifier: Apache-2.0\n# Copyright 2022 The HuggingFace Authors.\n\nCACHE_COLLECTION_RESPONSES = \"cachedResponsesBlue\"\nCACHE_MONGOENGINE_ALIAS = \"cache\"\nHF_DATASETS_CACHE_APPNAME = \"hf_datasets_cache\"\nPARQUET_METADATA_CACHE_APPNAME = \"datasets_server_parquet_metadata\"\nD...
diff --git a/libs/libcommon/src/libcommon/constants.py b/libs/libcommon/src/libcommon/constants.py index 0a17db2ce6..075c5529af 100644 --- a/libs/libcommon/src/libcommon/constants.py +++ b/libs/libcommon/src/libcommon/constants.py @@ -36,6 +36,7 @@ PARQUET_REVISION = "refs/convert/parquet" ERROR_CODES_TO_RETRY = { + "ConnectionError", "CreateCommitError", "ExternalServerError", "JobManagerCrashedError",
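The retry gate implied by the diff can be sketched as a simple set membership check: a job's cached `error_code` is looked up in `ERROR_CODES_TO_RETRY`, and the fix adds `"ConnectionError"` so jobs killed by transient network failures get re-queued. The set contents come from the diff; the `should_retry` helper name is our own.

```python
ERROR_CODES_TO_RETRY = {
    "ConnectionError",        # added by the fix: transient network failures
    "CreateCommitError",
    "ExternalServerError",
    "JobManagerCrashedError",
}

def should_retry(error_code):
    """Retryable errors are re-queued instead of cached as permanent failures."""
    return error_code in ERROR_CODES_TO_RETRY

assert should_retry("ConnectionError")
assert not should_retry("DatasetNotFoundError")  # hypothetical permanent error
```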
getredash__redash-4189
JIRA setup: change password field name to "API Token" While a password can be used there, it's not recommended and eventually will be deprecated.
[ { "content": "import re\nfrom collections import OrderedDict\n\nfrom redash.query_runner import *\nfrom redash.utils import json_dumps, json_loads\n\n\n# TODO: make this more general and move into __init__.py\nclass ResultSet(object):\n def __init__(self):\n self.columns = OrderedDict()\n self....
[ { "content": "import re\nfrom collections import OrderedDict\n\nfrom redash.query_runner import *\nfrom redash.utils import json_dumps, json_loads\n\n\n# TODO: make this more general and move into __init__.py\nclass ResultSet(object):\n def __init__(self):\n self.columns = OrderedDict()\n self....
diff --git a/redash/query_runner/jql.py b/redash/query_runner/jql.py index 76e707e3a3..47a47b2fe6 100644 --- a/redash/query_runner/jql.py +++ b/redash/query_runner/jql.py @@ -144,7 +144,7 @@ class JiraJQL(BaseHTTPQueryRunner): requires_authentication = True url_title = 'JIRA URL' username_title = 'Username' - password_title = 'Password' + password_title = 'API Token' @classmethod def name(cls):
conda__conda-3740
conda env create giving ImportError for yaml package `conda env create` suddenly started giving `"ImportError: No module named 'yaml'"` with the latest miniconda on my TravisCI builds: https://travis-ci.org/leouieda/website/builds/170917743 I changed nothing significant in my code. Tried rebuilding previously passing builds and started getting the same error. Is this something from a recent release?
[ { "content": "\"\"\"\nWrapper around yaml to ensure that everything is ordered correctly.\n\nThis is based on the answer at http://stackoverflow.com/a/16782282\n\"\"\"\nfrom __future__ import absolute_import, print_function\nfrom collections import OrderedDict\nimport yaml\n\n\ndef represent_ordereddict(dumper,...
[ { "content": "\"\"\"\nWrapper around yaml to ensure that everything is ordered correctly.\n\nThis is based on the answer at http://stackoverflow.com/a/16782282\n\"\"\"\nfrom __future__ import absolute_import, print_function\nfrom collections import OrderedDict\n\nfrom conda.common.yaml import get_yaml\nyaml = g...
diff --git a/conda_env/yaml.py b/conda_env/yaml.py index 74a462f3579..fbe8dc7e088 100644 --- a/conda_env/yaml.py +++ b/conda_env/yaml.py @@ -5,7 +5,9 @@ """ from __future__ import absolute_import, print_function from collections import OrderedDict -import yaml + +from conda.common.yaml import get_yaml +yaml = get_yaml() def represent_ordereddict(dumper, data):
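The fix above replaces a hard `import yaml` with conda's vendored `get_yaml()` helper, so `conda env` no longer depends on a system-wide PyYAML. A minimal sketch of that vendored-first import pattern in plain Python (the module names below are illustrative, not conda's actual vendoring layout):

```python
import importlib


def get_first_importable(candidates):
    """Return the first module in `candidates` that imports cleanly.

    Mirrors the idea behind conda.common.yaml.get_yaml(): prefer a
    vendored copy of the library, fall back to a system install, and
    raise one clear error if nothing is available.
    """
    for name in candidates:
        try:
            return importlib.import_module(name)
        except ImportError:
            continue
    raise ImportError("none of %r could be imported" % (candidates,))


# A hypothetical vendored name that is missing, falling back to stdlib json:
mod = get_first_importable(("no_such_vendored_yaml", "json"))
```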
typeddjango__django-stubs-1343
Next release planning (1.14.0) Tracking a few regressions in [version 1.13.2](https://github.com/typeddjango/django-stubs/releases/tag/1.13.2), we should probably get out a release quickly once these are resolved. * #1335 * Fixes #1333 * Fixes #1336 * https://github.com/typeddjango/django-stubs/pull/1331 * Fixes #1330 * #1345 * Fixes #1327 We can update to mypy 0.991 thanks to @flaeppe, and will call the next version 1.14.0. * #1329 * Unblocked #1260 * Fixes #1261 Also some nice to have PRs still waiting for community reviewers: * #1309 * #1308 * #1315
[ { "content": "import os\nfrom typing import List\n\nfrom setuptools import find_packages, setup\n\n\ndef find_stub_files(name: str) -> List[str]:\n result = []\n for root, _dirs, files in os.walk(name):\n for file in files:\n if file.endswith(\".pyi\"):\n if os.path.sep in...
[ { "content": "import os\nfrom typing import List\n\nfrom setuptools import find_packages, setup\n\n\ndef find_stub_files(name: str) -> List[str]:\n result = []\n for root, _dirs, files in os.walk(name):\n for file in files:\n if file.endswith(\".pyi\"):\n if os.path.sep in...
diff --git a/README.md b/README.md index 65c83a392..f75a1e57a 100644 --- a/README.md +++ b/README.md @@ -51,6 +51,7 @@ We rely on different `django` and `mypy` versions: | django-stubs | mypy version | django version | python version |--------------| ---- | ---- | ---- | +| 1.14.0 | 0.990+ | 3.2.x or 4.0.x or 4.1.x | ^3.7 | 1.13.0 | 0.980+ | 3.2.x or 4.0.x or 4.1.x | ^3.7 | 1.12.0 | 0.931+ | 3.2.x or 4.0.x | ^3.7 | 1.11.0 | 0.931+ | 3.2.x | ^3.7 diff --git a/setup.py b/setup.py index f685e7d4e..19ec016db 100644 --- a/setup.py +++ b/setup.py @@ -36,7 +36,7 @@ def find_stub_files(name: str) -> List[str]: setup( name="django-stubs", - version="1.13.2", + version="1.14.0", description="Mypy stubs for Django", long_description=readme, long_description_content_type="text/markdown",
pytorch__TensorRT-371
🐛 [Bug] An error occurs in CompileGraph when gpu_id == 1 When I tried to Complie on the second GPU in a multi-GPU environment, an error occurred. The code sample used is as follows. ```cpp void load(const std::string& model_path, int64_t gpu_id, int64_t opt_batch_size) { torch::jit::Module module = torch::jit::load(model_path); torch::Device device = (torch::cuda::is_available() ? torch::Device(torch::kCUDA, gpu_id) : torch::Device(torch::kCPU)); module.to(device, torch::kHalf); module.eval(); std::vector<int64_t> in_opt = { opt_batch_size, INPUT_CHANNEL_NUM, BOARD_WIDTH, BOARD_WIDTH }; trtorch::CompileSpec::InputRange range(in_opt); trtorch::CompileSpec info({ range }); info.op_precision = torch::kHalf; info.device.gpu_id = gpu_id; module = trtorch::CompileGraph(module, info); } ``` #### Error1 I called this function with gpu_id = 1. I got the following error: ``` terminate called after throwing an instance of 'trtorch::Error' what(): [enforce fail at core/conversion/conversionctx/ConversionCtx.cpp:107] Expected cudaSetDevice(settings.device.gpu_id) to be true but got false Unable to set gpu id: 1 ``` I think this line is the cause. https://github.com/NVIDIA/TRTorch/blob/1d4b967a28e36beee048703f5645ee6fcc95793d/core/conversion/conversionctx/ConversionCtx.cpp#L112 `cudaSetDevice` returns `cudaSuccess` (= 0) on success. However, `TRTORCH_CHECK` judges success or failure as a Boolean type. I fixed it as follows and rebuilt it so that this error disappeared. 
```diff diff --git a/core/conversion/conversionctx/ConversionCtx.cpp b/core/conversion/conversionctx/ConversionCtx.cpp index ff23692..bc5bf68 100644 --- a/core/conversion/conversionctx/ConversionCtx.cpp +++ b/core/conversion/conversionctx/ConversionCtx.cpp @@ -109,7 +109,7 @@ ConversionCtx::ConversionCtx(BuilderSettings build_settings) cfg->setEngineCapability(settings.capability); if (settings.device.gpu_id) { - TRTORCH_CHECK(cudaSetDevice(settings.device.gpu_id), "Unable to set gpu id: " << settings.device.gpu_id); + TRTORCH_CHECK(cudaSetDevice(settings.device.gpu_id) == cudaSuccess, "Unable to set gpu id: " << settings.device.gpu_id); } if (settings.device.device_type == nvinfer1::DeviceType::kDLA) { ``` You may also use `set_device`. https://github.com/NVIDIA/TRTorch/blob/1d4b967a28e36beee048703f5645ee6fcc95793d/core/compiler.cpp#L176-L178 #### Error2 After making the above fix, I get the following error: ``` ERROR: [TRTorch Conversion Context] - Builder was created on device different than current device. ``` I changed `cudaSetDevice` to do it at the beginning of the function and it worked fine. 
```diff diff --git a/core/conversion/conversionctx/ConversionCtx.cpp b/core/conversion/conversionctx/ConversionCtx.cpp index ff23692..09a419c 100644 --- a/core/conversion/conversionctx/ConversionCtx.cpp +++ b/core/conversion/conversionctx/ConversionCtx.cpp @@ -47,6 +47,10 @@ ConversionCtx::ConversionCtx(BuilderSettings build_settings) util::logging::get_logger().get_reportable_severity(), util::logging::get_logger().get_is_colored_output_on()) { // TODO: Support FP16 and FP32 from JIT information + if (settings.device.gpu_id) { + TRTORCH_CHECK(cudaSetDevice(settings.device.gpu_id) == cudaSuccess, "Unable to set gpu id: " << settings.device.gpu_id); + } + builder = nvinfer1::createInferBuilder(logger); net = builder->createNetworkV2(1U << static_cast<uint32_t>(nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH)); @@ -108,10 +112,6 @@ ConversionCtx::ConversionCtx(BuilderSettings build_settings) cfg->setDefaultDeviceType(settings.device.device_type); cfg->setEngineCapability(settings.capability); - if (settings.device.gpu_id) { - TRTORCH_CHECK(cudaSetDevice(settings.device.gpu_id), "Unable to set gpu id: " << settings.device.gpu_id); - } - if (settings.device.device_type == nvinfer1::DeviceType::kDLA) { auto nbDLACores = builder->getNbDLACores(); TRTORCH_CHECK( ``` It's working, but I'm not sure if this is a good fix as there may be other side effects as well. I would appreciate it if you could respond appropriately.
[ { "content": "from typing import List, Dict, Any\nimport torch\nfrom torch import nn\n\nimport trtorch._C\nfrom trtorch._compile_spec import _parse_compile_spec\nfrom trtorch._version import __version__\nfrom types import FunctionType\n\n\ndef compile(module: torch.jit.ScriptModule, compile_spec: Any) -> torch....
[ { "content": "from typing import List, Dict, Any\nimport torch\nfrom torch import nn\n\nimport trtorch._C\nfrom trtorch._compile_spec import _parse_compile_spec\nfrom trtorch._version import __version__\nfrom types import FunctionType\n\n\ndef compile(module: torch.jit.ScriptModule, compile_spec: Any) -> torch....
diff --git a/core/conversion/conversionctx/ConversionCtx.cpp b/core/conversion/conversionctx/ConversionCtx.cpp index ff236921c2..9d47026c60 100644 --- a/core/conversion/conversionctx/ConversionCtx.cpp +++ b/core/conversion/conversionctx/ConversionCtx.cpp @@ -47,6 +47,11 @@ ConversionCtx::ConversionCtx(BuilderSettings build_settings) util::logging::get_logger().get_reportable_severity(), util::logging::get_logger().get_is_colored_output_on()) { // TODO: Support FP16 and FP32 from JIT information + if (settings.device.gpu_id) { + TRTORCH_CHECK( + cudaSetDevice(settings.device.gpu_id) == cudaSuccess, "Unable to set gpu id: " << settings.device.gpu_id); + } + builder = nvinfer1::createInferBuilder(logger); net = builder->createNetworkV2(1U << static_cast<uint32_t>(nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH)); @@ -108,10 +113,6 @@ ConversionCtx::ConversionCtx(BuilderSettings build_settings) cfg->setDefaultDeviceType(settings.device.device_type); cfg->setEngineCapability(settings.capability); - if (settings.device.gpu_id) { - TRTORCH_CHECK(cudaSetDevice(settings.device.gpu_id), "Unable to set gpu id: " << settings.device.gpu_id); - } - if (settings.device.device_type == nvinfer1::DeviceType::kDLA) { auto nbDLACores = builder->getNbDLACores(); TRTORCH_CHECK( diff --git a/docsrc/py_api/trtorch.rst b/docsrc/py_api/trtorch.rst index d7376cb2f0..6063b4c69c 100644 --- a/docsrc/py_api/trtorch.rst +++ b/docsrc/py_api/trtorch.rst @@ -11,6 +11,8 @@ trtorch Functions ------------ +.. autofunction:: set_device + .. autofunction:: compile .. 
autofunction:: convert_method_to_trt_engine diff --git a/py/trtorch/_compiler.py b/py/trtorch/_compiler.py index 76a7923ca2..fdb6fc2e80 100644 --- a/py/trtorch/_compiler.py +++ b/py/trtorch/_compiler.py @@ -156,3 +156,6 @@ def get_build_info() -> str: build_info = trtorch._C.get_build_info() build_info = "TRTorch Version: " + str(__version__) + '\n' + build_info return build_info + +def set_device(gpu_id): + trtorch._C.set_device(gpu_id) diff --git a/py/trtorch/csrc/trtorch_py.cpp b/py/trtorch/csrc/trtorch_py.cpp index 420e27cccc..418423db41 100644 --- a/py/trtorch/csrc/trtorch_py.cpp +++ b/py/trtorch/csrc/trtorch_py.cpp @@ -15,6 +15,10 @@ namespace py = pybind11; namespace trtorch { namespace pyapi { +void set_device(const int device_id) { + core::set_device(device_id); +} + torch::jit::Module CompileGraph(const torch::jit::Module& mod, CompileSpec& info) { py::gil_scoped_acquire gil; auto trt_mod = core::CompileGraph(mod, info.toInternalCompileSpec()); @@ -146,6 +150,7 @@ PYBIND11_MODULE(_C, m) { m.def("_get_is_colored_output_on", &logging::get_is_colored_output_on, "Get if the logging output will be colored"); m.def("_set_is_colored_output_on", &logging::set_is_colored_output_on, "Set if the logging output should be colored"); m.def("_log", &logging::log, "Add a message to the logger"); + m.def("set_device", &trtorch::pyapi::set_device, "Set CUDA device id"); py::enum_<core::util::logging::LogLevel>(m, "LogLevel", py::arithmetic()) .value("INTERNAL_ERROR", core::util::logging::LogLevel::kINTERNAL_ERROR) diff --git a/tests/py/BUILD b/tests/py/BUILD index d8798f0175..2f20daaf67 100644 --- a/tests/py/BUILD +++ b/tests/py/BUILD @@ -15,9 +15,23 @@ py_test( "test_api.py", "model_test_case.py" ] + select({ - ":aarch64_linux": [ - "test_api_dla.py" - ], + ":aarch64_linux": [ + "test_api_dla.py" + ], + "//conditions:default" : [] + }), + deps = [ + requirement("torchvision") + ] +) + +# Following multi_gpu test is only targeted for multi-gpu configurations. 
It is not included in the test suite by default. +py_test( + name = "test_multi_gpu", + srcs = [ + "test_multi_gpu.py", + "model_test_case.py" + ], "//conditions:default" : [] }), deps = [ diff --git a/tests/py/test_multi_gpu.py b/tests/py/test_multi_gpu.py new file mode 100644 index 0000000000..4fb433f441 --- /dev/null +++ b/tests/py/test_multi_gpu.py @@ -0,0 +1,69 @@ +import unittest +import trtorch +import torch +import torchvision.models as models + +from model_test_case import ModelTestCase + +class TestMultiGpuSwitching(ModelTestCase): + def setUp(self): + if torch.cuda.device_count() < 2: + self.fail("Test is not relevant for this platform since number of available CUDA devices is less than 2") + + trtorch.set_device(0) + self.target_gpu = 1 + self.input = torch.randn((1, 3, 224, 224)).to("cuda:1") + self.model = self.model.to("cuda:1") + self.traced_model = torch.jit.trace(self.model, [self.input]) + self.scripted_model = torch.jit.script(self.model) + + def test_compile_traced(self): + trtorch.set_device(0) + compile_spec = { + "input_shapes": [self.input.shape], + "device": { + "device_type": trtorch.DeviceType.GPU, + "gpu_id": self.target_gpu, + "dla_core": 0, + "allow_gpu_fallback": False, + "disable_tf32": False + } + } + + trt_mod = trtorch.compile(self.traced_model, compile_spec) + trtorch.set_device(self.target_gpu) + same = (trt_mod(self.input) - self.traced_model(self.input)).abs().max() + trtorch.set_device(0) + self.assertTrue(same < 2e-3) + + def test_compile_script(self): + trtorch.set_device(0) + compile_spec = { + "input_shapes": [self.input.shape], + "device": { + "device_type": trtorch.DeviceType.GPU, + "gpu_id": self.target_gpu, + "dla_core": 0, + "allow_gpu_fallback": False, + "disable_tf32": False + } + } + + trt_mod = trtorch.compile(self.scripted_model, compile_spec) + trtorch.set_device(self.target_gpu) + same = (trt_mod(self.input) - self.scripted_model(self.input)).abs().max() + trtorch.set_device(0) + self.assertTrue(same < 2e-3) 
+ +def test_suite(): + suite = unittest.TestSuite() + suite.addTest(TestMultiGpuSwitching.parametrize(TestMultiGpuSwitching, model=models.resnet18(pretrained=True))) + + return suite + +suite = test_suite() + +runner = unittest.TextTestRunner() +result = runner.run(suite) + +exit(int(not result.wasSuccessful()))
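The root cause of Error1 above is that `cudaSetDevice` returns the status code `cudaSuccess` (0) on success, so a plain truthiness check inverts the logic. A small Python sketch of the inverted check (the CUDA names here are stand-ins, not the real API):

```python
CUDA_SUCCESS = 0            # cudaSuccess
CUDA_ERROR_INVALID = 101    # stand-in for an error status
NUM_GPUS = 2                # pretend two devices are visible


def cuda_set_device(gpu_id):
    # Hypothetical stand-in for cudaSetDevice(): returns a status code,
    # 0 on success and non-zero on failure, just like the real call.
    return CUDA_SUCCESS if 0 <= gpu_id < NUM_GPUS else CUDA_ERROR_INVALID


def buggy_check(gpu_id):
    # Mirrors TRTORCH_CHECK(cudaSetDevice(id), ...): a truthiness test,
    # which treats the success code 0 as failure.
    return bool(cuda_set_device(gpu_id))


def fixed_check(gpu_id):
    # Mirrors the patched TRTORCH_CHECK(cudaSetDevice(id) == cudaSuccess, ...).
    return cuda_set_device(gpu_id) == CUDA_SUCCESS
```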
Gallopsled__pwntools-1716
Importing pwntools breaks carriage return Normally, a loop like below ``` for i in range(0, 5): print(str(i), end="\r") ``` should print each number in the same space. However, when I import pwntools, this behavior breaks and each one is printed on a new line or sequentially. System is ubuntu 18.04. Latest pwntools version.
[ { "content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport atexit\nimport os\nimport re\nimport signal\nimport six\nimport struct\nimport sys\nimport threading\nimport traceback\n\nif sys.platform != 'win32':\n import fcntl\n import termios\n\nfrom pwnlib.context impor...
[ { "content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport atexit\nimport os\nimport re\nimport signal\nimport six\nimport struct\nimport sys\nimport threading\nimport traceback\n\nif sys.platform != 'win32':\n import fcntl\n import termios\n\nfrom pwnlib.context impor...
diff --git a/pwnlib/term/term.py b/pwnlib/term/term.py index df35e68a5..2d0f41d6c 100644 --- a/pwnlib/term/term.py +++ b/pwnlib/term/term.py @@ -425,7 +425,7 @@ def render_cell(cell, clear_after = False): put('\x08') col -= 1 elif t == CR: -# put('\r') + put('\r') col = 0 elif t == SOH: put('\x01')
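The reporter's loop can be checked without a terminal by capturing stdout; each iteration should emit the digit followed by a bare carriage return, which is exactly the byte the patched `render_cell` re-emits via `put('\r')`:

```python
import io
from contextlib import redirect_stdout

buf = io.StringIO()
with redirect_stdout(buf):
    for i in range(0, 5):
        print(str(i), end="\r")

# Five digits, each followed by a carriage return and no newline.
captured = buf.getvalue()
```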
bokeh__bokeh-9368
Use `npm ci` to force usage of the lock file `npm ci` should be used in situations where any update of dependencies is undesired, especially in CI. `npm ci` installs dependencies exactly as specified in `package-lock.json`. On the other hand `npm install` can still perform minor updates.
[ { "content": "'''\n\n'''\nimport shutil\nfrom glob import glob\nfrom os.path import dirname, exists, join, realpath, relpath\nimport os, re, subprocess, sys, time\n\nimport versioneer\n\n# provide fallbacks for highlights in case colorama is not installed\ntry:\n import colorama\n from colorama import For...
[ { "content": "'''\n\n'''\nimport shutil\nfrom glob import glob\nfrom os.path import dirname, exists, join, realpath, relpath\nimport os, re, subprocess, sys, time\n\nimport versioneer\n\n# provide fallbacks for highlights in case colorama is not installed\ntry:\n import colorama\n from colorama import For...
diff --git a/_setup_support.py b/_setup_support.py index 80bb17de726..e9affaf8933 100644 --- a/_setup_support.py +++ b/_setup_support.py @@ -406,7 +406,7 @@ def package_path(path, filters=()): %s -Have you run `npm install --no-save` from the bokehjs subdirectory? +Have you run `npm ci` from the bokehjs subdirectory? For more information, see the Dev Guide: https://docs.bokeh.org/en/latest/docs/dev_guide.html diff --git a/bokehjs/LICENSE b/bokehjs/LICENSE index b31dc04a84c..b0ce3e6ea57 100644 --- a/bokehjs/LICENSE +++ b/bokehjs/LICENSE @@ -45,6 +45,6 @@ complete set of unique licenses found in BokehJS packages is listed here: A detailed list of specific package licenses may be obtained by executing the following commands in the "bokehjs" subdirectory: - $ npm install + $ npm ci $ npx license-checker --production --csv diff --git a/docker-tools/Dockerfile-from-source b/docker-tools/Dockerfile-from-source index 244388d1b23..faf509b6e65 100644 --- a/docker-tools/Dockerfile-from-source +++ b/docker-tools/Dockerfile-from-source @@ -38,7 +38,7 @@ RUN conda install --yes --quiet `python scripts/deps.py build` RUN npm install -g npm WORKDIR $BOKEH_SOURCE_DIR/bokehjs # build BokehJS -RUN npm install --no-save --no-progress +RUN npm ci --no-progress RUN sh -c 'if [[ -d make ]]; then node make build; else node_modules/.bin/gulp build; fi' WORKDIR $BOKEH_SOURCE_DIR # build a noarch conda package for Bokeh using the just-built BokehJS diff --git a/docker-tools/npm-install.sh b/docker-tools/npm-install.sh index 278c96a3c48..f0b26c6c996 100755 --- a/docker-tools/npm-install.sh +++ b/docker-tools/npm-install.sh @@ -1,5 +1,5 @@ #!/bin/bash -COMMAND="cd /bokeh/bokehjs && npm install --no-save --no-progress" +COMMAND="cd /bokeh/bokehjs && npm ci --no-progress" source "$(dirname $0)/base.sh" diff --git a/scripts/ci/appveyor/build.ps1 b/scripts/ci/appveyor/build.ps1 index 4cd08c654b3..a2e07d900ef 100644 --- a/scripts/ci/appveyor/build.ps1 +++ b/scripts/ci/appveyor/build.ps1 @@ -3,7 
+3,7 @@ set-psdebug -trace 2 function build() { npm install -g npm Push-Location -Path ".\\bokehjs" - npm install --no-save --no-progress + npm ci --no-progress Pop-Location python setup.py -q install --build-js } diff --git a/scripts/ci/install.build b/scripts/ci/install.build index 40e7e5f8e45..574c800429c 100755 --- a/scripts/ci/install.build +++ b/scripts/ci/install.build @@ -22,5 +22,5 @@ npm install -g npm # install NPM dependencies pushd bokehjs -npm install --no-save --no-progress +npm ci --no-progress popd diff --git a/scripts/ci/install.codebase b/scripts/ci/install.codebase index 827a95aefba..1600d495c2f 100755 --- a/scripts/ci/install.codebase +++ b/scripts/ci/install.codebase @@ -8,5 +8,5 @@ set -x # echo commands # install NPM dependencies pushd bokehjs -npm install --no-save --no-progress +npm ci --no-progress popd diff --git a/scripts/ci/install.deploy b/scripts/ci/install.deploy index 785f7f20e45..711df98a8a3 100755 --- a/scripts/ci/install.deploy +++ b/scripts/ci/install.deploy @@ -23,7 +23,7 @@ npm install -g npm # install NPM dependencies pushd bokehjs -npm install --no-save --no-progress +npm ci --no-progress popd # install sampledata diff --git a/scripts/ci/install.examples b/scripts/ci/install.examples index 4daaa36a272..3237fb49b9a 100755 --- a/scripts/ci/install.examples +++ b/scripts/ci/install.examples @@ -5,6 +5,6 @@ set -x # echo commands # install NPM dependencies and build JS examples pushd bokehjs -npm install --no-save --no-progress +npm ci --no-progress node make examples --no-build popd diff --git a/scripts/ci/install.js b/scripts/ci/install.js index b339e359ac4..91b2318720d 100755 --- a/scripts/ci/install.js +++ b/scripts/ci/install.js @@ -5,5 +5,5 @@ set -x # echo commands # install NPM dependencies pushd bokehjs -npm install --no-save --no-progress +npm ci --no-progress popd diff --git a/scripts/ci/install.unit b/scripts/ci/install.unit index 186fa00dc2b..fd579437b4c 100755 --- a/scripts/ci/install.unit +++ 
b/scripts/ci/install.unit @@ -5,7 +5,7 @@ set -x # echo commands # install NPM dependencies pushd bokehjs -npm install --no-save --no-progress +npm ci --no-progress popd if [[ ! -z "${MINIMAL}" ]]; then diff --git a/sphinx/source/docs/dev_guide/setup.rst b/sphinx/source/docs/dev_guide/setup.rst index ffa119e4b3a..bbb11b09287 100644 --- a/sphinx/source/docs/dev_guide/setup.rst +++ b/sphinx/source/docs/dev_guide/setup.rst @@ -182,7 +182,7 @@ to install all of BokehJS JavaScript dependencies: .. code-block:: sh - npm install --no-save + npm ci This command will install the necessary packages into the ``node_modules`` subdirectory.
keras-team__keras-nlp-1166
Add compute_output_shape method to WordPieceTokenizer When we run Pretraining Transformer from Scratch guide with PyTorch and JAX backend, it raises ``` RuntimeError: Exception encountered when calling WordPieceTokenizer.call(). Could not automatically infer the output shape / dtype of 'word_piece_tokenizer_1' (of type WordPieceTokenizer). Either the `WordPieceTokenizer.call()` method is incorrect, or you need to implement the `WordPieceTokenizer.compute_output_spec() / compute_output_shape()` method. Error encountered: 'string' Arguments received by WordPieceTokenizer.call(): • args=('<KerasTensor shape=(None,), dtype=string, name=keras_tensor_59>',) • kwargs=<class 'inspect._empty'> ``` cc: @mattdangerw
[ { "content": "# Copyright 2023 The KerasNLP Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required b...
[ { "content": "# Copyright 2023 The KerasNLP Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required b...
diff --git a/keras_nlp/tokenizers/tokenizer.py b/keras_nlp/tokenizers/tokenizer.py index dbc7c4bbdb..e03123b0c5 100644 --- a/keras_nlp/tokenizers/tokenizer.py +++ b/keras_nlp/tokenizers/tokenizer.py @@ -123,3 +123,6 @@ def token_to_id(self, token: str) -> int: def call(self, inputs, *args, training=None, **kwargs): return self.tokenize(inputs, *args, **kwargs) + + def compute_output_shape(self, inputs_shape): + return tuple(inputs_shape) + (self.sequence_length,)
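The patch's shape rule says each input string maps to `sequence_length` token ids. A stripped-down sketch of that rule, independent of Keras (class and attribute names are illustrative):

```python
class TokenizerShapeRule:
    """Minimal model of the compute_output_shape() added in the patch.

    For a tokenizer padding to a fixed length, an input batch of
    strings with shape (batch,) maps to ids of shape
    (batch, sequence_length).
    """

    def __init__(self, sequence_length):
        self.sequence_length = sequence_length

    def compute_output_shape(self, inputs_shape):
        # Append the token axis to whatever batch shape came in;
        # None (an unknown batch size) passes through untouched.
        return tuple(inputs_shape) + (self.sequence_length,)


rule = TokenizerShapeRule(sequence_length=128)
```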
djangopackages__djangopackages-267
Add errormator.com
[ { "content": "# -*- coding: utf-8 -*-\n\"\"\"Heroku specific settings. These are used to deploy opencomparison to\nHeroku's platform.\n\"\"\"\n\n\nfrom os import environ\n\nfrom memcacheify import memcacheify\nfrom postgresify import postgresify\nfrom S3 import CallingFormat\n\nfrom settings.base import *\n\n\n...
[ { "content": "# -*- coding: utf-8 -*-\n\"\"\"Heroku specific settings. These are used to deploy opencomparison to\nHeroku's platform.\n\"\"\"\n\n\nfrom os import environ\n\nfrom memcacheify import memcacheify\nfrom postgresify import postgresify\nfrom S3 import CallingFormat\n\nfrom settings.base import *\n\n\n...
diff --git a/requirements.txt b/requirements.txt index a6f0c0f28..194b08b16 100644 --- a/requirements.txt +++ b/requirements.txt @@ -40,3 +40,4 @@ raven==1.4.6 rq==0.3.8 requests==1.2.3 six==1.1.0 +appenlight-client==0.6.4 \ No newline at end of file diff --git a/settings/heroku.py b/settings/heroku.py index f555a4450..d257723d9 100644 --- a/settings/heroku.py +++ b/settings/heroku.py @@ -168,3 +168,14 @@ ) ########## end templates + +#------------------- +# appenlight-client +#------------------ + +import appenlight_client.client as e_client +APPENLIGHT = e_client.get_config({'appenlight.api_key': os.environ.get('APPENLIGHT_KEY', '')}) + +MIDDLEWARE_CLASSES += ( + 'appenlight_client.django_middleware.AppenlightMiddleware', +) \ No newline at end of file
kivy__kivy-611
Label Text Clipped Horizontally (Moved) **Originally reported as a continuation of #576 by esbullington** I think I'm having trouble with this same issue. I'm trying to use markup with a Label, and am finding that my Label text is cut-off along the horizontal axis if I have markup set to True. This probably is only occurring with the latest development version, even after the above path was pulled. The problem does not occur with Kivy 1.3.0. If needed, I can re-install the development version and make a screen shot, but for now I'm working with Kivy 1.3.0. I've only started working with Kivy in the past few days, so I'm not yet in a place where I feel comfortable sending in a patch. (awesome framework, by the way, congrats on the great work!). Oh, and it doesn't look like I can re-open the issue, so someone else may wish to do so, or else tell me to open another issue for the problem. UPDATE: I coped markup.py from Kivy 1.3 to Kivy1.4-dev and it resolved this issue for me. I may now have problems with rst, but at least my markdown labels aren't cut in half.
[ { "content": "'''\nText Markup\n===========\n\n.. versionadded:: 1.1.0\n\nWe provide a simple text-markup for inline text styling. The syntax look the\nsame as the `BBCode <http://en.wikipedia.org/wiki/BBCode>`_.\n\nA tag is defined as ``[tag]``, and might have a closed tag associated:\n``[/tag]``. Example of a...
[ { "content": "'''\nText Markup\n===========\n\n.. versionadded:: 1.1.0\n\nWe provide a simple text-markup for inline text styling. The syntax look the\nsame as the `BBCode <http://en.wikipedia.org/wiki/BBCode>`_.\n\nA tag is defined as ``[tag]``, and might have a closed tag associated:\n``[/tag]``. Example of a...
diff --git a/kivy/core/text/markup.py b/kivy/core/text/markup.py index 1d05d14dbc..46e41ea351 100644 --- a/kivy/core/text/markup.py +++ b/kivy/core/text/markup.py @@ -266,7 +266,7 @@ def _real_render(self): y = 0 w, h = self._size refs = self._refs - no_of_lines = len(self._lines)-1 + no_of_lines = len(self._lines) for line in self._lines: lh = line[1] diff --git a/kivy/tests/test_issue_609.py b/kivy/tests/test_issue_609.py new file mode 100644 index 0000000000..e1d2320474 --- /dev/null +++ b/kivy/tests/test_issue_609.py @@ -0,0 +1,19 @@ +from common import GraphicUnitTest + + +class Issue609(GraphicUnitTest): + + def test_markup_pos(self): + from kivy.uix.label import Label + from kivy.uix.gridlayout import GridLayout + + lbl = Label(text="TextToTest") + lbl.bind(text_size=lbl.setter('size')) + mrkp = Label(text="TextToTest", markup = True) + mrkp.bind(text_size=mrkp.setter('size')) + + grid = GridLayout(rows=1, size_hint=(1, 1)) + grid.add_widget(lbl) + grid.add_widget(mrkp) + + self.render(grid, 2)
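The one-character fix above is an off-by-one: `len(self._lines) - 1` drops the final line from the bound used during rendering. A simplified model (not the actual kivy rendering code) of how that bound clips the last line:

```python
def rendered_lines(lines, buggy=False):
    # Simplified model: render only the first `no_of_lines` entries.
    # With the off-by-one, the final line is never drawn, which shows
    # up as markup Label text being clipped.
    no_of_lines = len(lines) - 1 if buggy else len(lines)
    return lines[:no_of_lines]
```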
OpenNMT__OpenNMT-tf-6
Poor translation results with the Transformer The Transformer model produces very bad translation results. Its implementation should be revised and fixed. See also the reference implementation at https://github.com/tensorflow/tensor2tensor.
[ { "content": "\"\"\"Define functions related to the Google's Transformer model.\"\"\"\n\nimport tensorflow as tf\n\n\ndef scaled_dot_attention(queries,\n keys,\n values,\n mode,\n values_length=None,\n ...
[ { "content": "\"\"\"Define functions related to the Google's Transformer model.\"\"\"\n\nimport tensorflow as tf\n\n\ndef scaled_dot_attention(queries,\n keys,\n values,\n mode,\n values_length=None,\n ...
diff --git a/opennmt/utils/transformer.py b/opennmt/utils/transformer.py index 68d1a89c1..70d0fc9e6 100644 --- a/opennmt/utils/transformer.py +++ b/opennmt/utils/transformer.py @@ -163,5 +163,5 @@ def add_and_norm(inputs, rate=dropout, training=mode == tf.estimator.ModeKeys.TRAIN) outputs += inputs - outputs = tf.contrib.layers.layer_norm(outputs) + outputs = tf.contrib.layers.layer_norm(outputs, begin_norm_axis=-1) return outputs
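The one-line fix passes `begin_norm_axis=-1`, so layer-normalization statistics are computed over the last (depth) axis only, per position, rather than over every axis after the batch. A plain-Python sketch of that per-position normalization:

```python
def layer_norm_last_axis(batch, eps=1e-6):
    """Normalize each feature vector to zero mean and unit variance.

    Sketch of what begin_norm_axis=-1 asks of
    tf.contrib.layers.layer_norm: mean/variance are taken over the
    last axis independently for each position in the sequence.
    """
    out = []
    for row in batch:  # row = the feature vector at one position
        mean = sum(row) / len(row)
        var = sum((x - mean) ** 2 for x in row) / len(row)
        out.append([(x - mean) / (var + eps) ** 0.5 for x in row])
    return out


normed = layer_norm_last_axis([[1.0, 3.0], [10.0, 14.0]])
```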
qutip__qutip-834
Incorrect docstring of spin coherent state In qutip.states the docstring of the `spin_coherent` state is the same of `spin_state`. The correct description should be: "Generate the coherent spin state |theta, phi>."
[ { "content": "# This file is part of QuTiP: Quantum Toolbox in Python.\n#\n# Copyright (c) 2011 and later, Paul D. Nation and Robert J. Johansson.\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the followi...
[ { "content": "# This file is part of QuTiP: Quantum Toolbox in Python.\n#\n# Copyright (c) 2011 and later, Paul D. Nation and Robert J. Johansson.\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the followi...
diff --git a/qutip/states.py b/qutip/states.py index 0345945771..d0710ad7df 100644 --- a/qutip/states.py +++ b/qutip/states.py @@ -1060,8 +1060,7 @@ def spin_state(j, m, type='ket'): def spin_coherent(j, theta, phi, type='ket'): - """Generates the spin state |j, m>, i.e. the eigenstate - of the spin-j Sz operator with eigenvalue m. + """Generate the coherent spin state |theta, phi>. Parameters ----------
cookiecutter__cookiecutter-588
No way to define options that have no defaults Currently if you set a value in `cookiecutter.json` to `null` it becomes `None` and is then turned into the _string_ `'None'`.
[ { "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\n\"\"\"\ncookiecutter.prompt\n---------------------\n\nFunctions for prompting the user for project info.\n\"\"\"\n\nfrom collections import OrderedDict\n\nimport click\nfrom past.builtins import basestring\n\nfrom future.utils import iteritems\nfro...
[ { "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\n\"\"\"\ncookiecutter.prompt\n---------------------\n\nFunctions for prompting the user for project info.\n\"\"\"\n\nfrom collections import OrderedDict\n\nimport click\nfrom past.builtins import basestring\n\nfrom future.utils import iteritems\nfro...
diff --git a/cookiecutter/prompt.py b/cookiecutter/prompt.py index d06409c42..0809f9e56 100755 --- a/cookiecutter/prompt.py +++ b/cookiecutter/prompt.py @@ -81,6 +81,8 @@ def read_user_choice(var_name, options): def render_variable(env, raw, cookiecutter_dict): + if raw is None: + return None if not isinstance(raw, basestring): raw = str(raw) template = env.from_string(raw) diff --git a/tests/test_prompt.py b/tests/test_prompt.py index 3000fda1a..583da1d84 100644 --- a/tests/test_prompt.py +++ b/tests/test_prompt.py @@ -22,7 +22,8 @@ (1, '1'), (True, 'True'), ('foo', 'foo'), - ('{{cookiecutter.project}}', 'foobar') + ('{{cookiecutter.project}}', 'foobar'), + (None, None), ]) def test_convert_to_str(mocker, raw_var, rendered_var): env = Environment() @@ -35,10 +36,13 @@ def test_convert_to_str(mocker, raw_var, rendered_var): result = prompt.render_variable(env, raw_var, context) assert result == rendered_var - # Make sure that non str variables are conerted beforehand - if not isinstance(raw_var, basestring): - raw_var = str(raw_var) - from_string.assert_called_once_with(raw_var) + # Make sure that non None non str variables are conerted beforehand + if raw_var is not None: + if not isinstance(raw_var, basestring): + raw_var = str(raw_var) + from_string.assert_called_once_with(raw_var) + else: + assert not from_string.called @pytest.fixture(autouse=True)
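The guard added to `render_variable` keeps a JSON `null` as a real `None` instead of coercing it through `str(None)`. A sketch of the patched logic with Jinja2 swapped out for a stand-in renderer (the `render` callable here is hypothetical):

```python
def render_variable(render, raw, context):
    # Patched logic: a None default short-circuits before any string
    # coercion, so it is never turned into the string 'None'.
    if raw is None:
        return None
    if not isinstance(raw, str):
        raw = str(raw)
    return render(raw, context)


# Stand-in for env.from_string(raw).render(...):
fake_render = lambda s, ctx: s.format(**ctx)
```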
openvinotoolkit__datumaro-125
The infer result passed from the OpenVINO launcher to the interpreter is not appropriate. I tried a model run using OpenVINO's mobilenet-v2-pytorch model (using mobilenet-v2-pytorch.xml, mobilenet-v2-pytorch.bin): `datum model run -p proj -m model-0`. However, only the name of the output layer (e.g. the string 'prob') is passed to the interpreter's input parameter (outputs). Please check the return value of OpenvinoLauncher.infer, `results = self._net.infer(inputs)`, line 178 of openvino_launcher.py. Debugging results are normal up to the line above, but it seems that only the name of the result layer is returned when the result is passed to the interpreter.
[ { "content": "\n# Copyright (C) 2019-2020 Intel Corporation\n#\n# SPDX-License-Identifier: MIT\n\n# pylint: disable=exec-used\n\nimport cv2\nimport logging as log\nimport numpy as np\nimport os.path as osp\nimport shutil\n\nfrom openvino.inference_engine import IECore\n\nfrom datumaro.components.cli_plugin impo...
[ { "content": "\n# Copyright (C) 2019-2020 Intel Corporation\n#\n# SPDX-License-Identifier: MIT\n\n# pylint: disable=exec-used\n\nimport cv2\nimport logging as log\nimport numpy as np\nimport os.path as osp\nimport shutil\n\nfrom openvino.inference_engine import IECore\n\nfrom datumaro.components.cli_plugin impo...
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 12e1062671..f16e5fce62 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -20,7 +20,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 -
 
 ### Fixed
--
+- Inference result for only one output layer in OpenVINO launcher (<https://github.com/openvinotoolkit/datumaro/pull/125>)
 
 ### Security
 -
diff --git a/datumaro/plugins/openvino_launcher.py b/datumaro/plugins/openvino_launcher.py
index 6a21750356..7c64d6fa44 100644
--- a/datumaro/plugins/openvino_launcher.py
+++ b/datumaro/plugins/openvino_launcher.py
@@ -177,7 +177,7 @@ def infer(self, inputs):
 
         results = self._net.infer(inputs)
         if len(results) == 1:
-            return next(iter(results))
+            return next(iter(results.values()))
         else:
             return results
 
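The one-line fix above rests on a basic property of Python dicts: iterating a dict yields its keys, so `next(iter(results))` returned the output layer's *name* rather than its blob, exactly as described in the issue. A minimal reproduction, where the `results` dict is a stand-in for the `IECore` inference output:

```python
# Iterating a dict yields its keys, so next(iter(results)) returns the
# output layer *name*, not the inference data -- the bug described above.
results = {"prob": [0.1, 0.9]}   # stand-in for the infer() return value

wrong = next(iter(results))            # the key: 'prob'
right = next(iter(results.values()))   # the value: [0.1, 0.9]

assert wrong == "prob"
assert right == [0.1, 0.9]
```

Switching to `results.values()` makes the single-output shortcut hand the interpreter the actual tensor while keeping the multi-output case (returning the whole dict) unchanged.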
ietf-tools__datatracker-6907
the ietf meeting parts of upcoming.ics end a day early Our custom .ics should take into account (from RFC 5545):

> The "DTEND" property for a "VEVENT" calendar component specifies the non-inclusive end of the event.

See https://github.com/ietf-tools/datatracker/blob/287cf0fe46c0b1b7548389b4327854567e6b29f8/ietf/templates/meeting/upcoming.ics#L28
[ { "content": "# Copyright The IETF Trust 2007-2023, All Rights Reserved\n# -*- coding: utf-8 -*-\n\n\nimport datetime\nimport re\nfrom urllib.parse import urljoin\nfrom zoneinfo import ZoneInfo\n\nfrom django import template\nfrom django.conf import settings\nfrom django.utils.html import escape\nfrom django.te...
[ { "content": "# Copyright The IETF Trust 2007-2023, All Rights Reserved\n# -*- coding: utf-8 -*-\n\n\nimport datetime\nimport re\nfrom urllib.parse import urljoin\nfrom zoneinfo import ZoneInfo\n\nfrom django import template\nfrom django.conf import settings\nfrom django.utils.html import escape\nfrom django.te...
diff --git a/ietf/doc/templatetags/ietf_filters.py b/ietf/doc/templatetags/ietf_filters.py
index 8d9336b536..cfed7aa1db 100644
--- a/ietf/doc/templatetags/ietf_filters.py
+++ b/ietf/doc/templatetags/ietf_filters.py
@@ -539,6 +539,10 @@ def ics_date_time(dt, tzname):
         return f':{timestamp}Z'
     else:
         return f';TZID={ics_esc(tzname)}:{timestamp}'
+
+@register.filter
+def next_day(value):
+    return value + datetime.timedelta(days=1)
 
 
 @register.filter
diff --git a/ietf/templates/meeting/upcoming.ics b/ietf/templates/meeting/upcoming.ics
index fb5b37d772..5eca7ec81d 100644
--- a/ietf/templates/meeting/upcoming.ics
+++ b/ietf/templates/meeting/upcoming.ics
@@ -25,7 +25,7 @@ SUMMARY:IETF {{ meeting.number }}{% if meeting.city %}
 LOCATION:{{ meeting.city }},{{ meeting.country }}{% endif %}
 CLASS:PUBLIC
 DTSTART;VALUE=DATE{% if meeting.time_zone %};TZID={{ meeting.time_zone|ics_esc }}{% endif %}:{{ meeting.date|date:"Ymd" }}
-DTEND;VALUE=DATE{% if meeting.time_zone %};TZID={{ meeting.time_zone|ics_esc }}{% endif %}:{{ meeting.end_date|date:"Ymd" }}
+DTEND;VALUE=DATE{% if meeting.time_zone %};TZID={{ meeting.time_zone|ics_esc }}{% endif %}:{{ meeting.end_date|next_day|date:"Ymd" }}
 DTSTAMP{% ics_date_time meeting.cached_updated|utc 'utc' %}
 URL:{{ request.scheme }}://{{ request.get_host }}{% url 'agenda' num=meeting.number %}
 END:VEVENT
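The `next_day` filter added above exists because RFC 5545 defines `DTEND` as non-inclusive: an all-day `VEVENT` covering a meeting whose last day is March 8 must declare a `DTEND` of March 9. A small sketch of the same arithmetic (the dates are illustrative):

```python
import datetime

# RFC 5545: DTEND is the non-inclusive end of the event, so the ics value
# must be one day past the meeting's actual last day.  This mirrors the
# next_day template filter added in the diff above.
def next_day(value):
    return value + datetime.timedelta(days=1)

end_date = datetime.date(2024, 3, 8)          # illustrative last meeting day
dtend = next_day(end_date).strftime("%Y%m%d")
print(dtend)  # 20240309
```

Applying the filter only in the template keeps `meeting.end_date` itself as the human-facing "last day" everywhere else in the datatracker.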
microsoft__playwright-python-1497
[Question]: How to get the right BrowserType from a device name? ### Your question I noticed that the CLI is able to figure out the right `BrowserType` to use when it is launched from the command line:

```
playwright open --device="Desktop Safari" wikipedia.org # Webkit
playwright open --device="Desktop Firefox" wikipedia.org # Firefox
playwright open --device="Desktop Chrome" wikipedia.org # Chrome
```

But [the documentation](https://playwright.dev/python/docs/api/class-playwright#playwright-devices) seems to say that I have to initialize a `BrowserType` before I can pass the settings to the context, which partially defeats the purpose of the device settings. I can implement my own logic to initialize the right `BrowserType` for each device, but as `playwright open` can already do that, that seems superfluous.
[ { "content": "# Copyright (c) Microsoft Corporation.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by ap...
[ { "content": "# Copyright (c) Microsoft Corporation.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by ap...
diff --git a/playwright/_impl/_playwright.py b/playwright/_impl/_playwright.py
index 354e3d11c..f9fe7617f 100644
--- a/playwright/_impl/_playwright.py
+++ b/playwright/_impl/_playwright.py
@@ -81,4 +81,5 @@ def parse_device_descriptor(dict: Dict) -> Dict:
         "device_scale_factor": dict["deviceScaleFactor"],
         "is_mobile": dict["isMobile"],
         "has_touch": dict["hasTouch"],
+        "default_browser_type": dict["defaultBrowserType"],
     }
diff --git a/tests/async/test_device_descriptors.py b/tests/async/test_device_descriptors.py
new file mode 100644
index 000000000..c8790b2a8
--- /dev/null
+++ b/tests/async/test_device_descriptors.py
@@ -0,0 +1,37 @@
+import pytest
+
+
+@pytest.mark.only_browser("chromium")
+async def test_should_work(playwright) -> None:
+    device_descriptor = playwright.devices["Pixel 2"]
+    device_type = device_descriptor["default_browser_type"]
+    browser = await playwright[device_type].launch()
+    context = await browser.new_context(
+        **device_descriptor,
+    )
+    page = await context.new_page()
+    assert device_descriptor["default_browser_type"] == "chromium"
+    assert browser.browser_type.name == "chromium"
+
+    assert "Pixel 2" in device_descriptor["user_agent"]
+    assert "Pixel 2" in await page.evaluate("navigator.userAgent")
+
+    assert device_descriptor["device_scale_factor"] > 2
+    assert await page.evaluate("window.devicePixelRatio") > 2
+
+    assert device_descriptor["viewport"]["height"] > 700
+    assert device_descriptor["viewport"]["height"] < 800
+    inner_height = await page.evaluate("window.innerHeight")
+    inner_height > 700
+    inner_height < 800
+
+    assert device_descriptor["viewport"]["width"] > 400
+    assert device_descriptor["viewport"]["width"] < 500
+    inner_width = await page.evaluate("window.innerWidth")
+    inner_width > 400
+    inner_width < 500
+
+    assert device_descriptor["has_touch"]
+    assert device_descriptor["is_mobile"]
+
+    await browser.close()
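With `default_browser_type` exposed on each descriptor, callers can resolve the right `BrowserType` generically, e.g. `getattr(playwright, descriptor["default_browser_type"]).launch()`, instead of hard-coding per-device logic. A sketch of that lookup with a stub standing in for the real `Playwright` object (the stub and its string values are assumptions for illustration, not Playwright API):

```python
# Resolving the right BrowserType from a device descriptor via the
# "default_browser_type" key exposed by the fix above.
device = {                                   # stand-in for playwright.devices["Pixel 2"]
    "user_agent": "Mozilla/5.0 ... Pixel 2 ...",   # abbreviated, illustrative
    "default_browser_type": "chromium",
}

class FakePlaywright:
    """Stub with the same attribute shape as the sync Playwright object."""
    chromium = "chromium-browser-type"
    firefox = "firefox-browser-type"
    webkit = "webkit-browser-type"

playwright = FakePlaywright()
browser_type = getattr(playwright, device["default_browser_type"])
assert browser_type == "chromium-browser-type"
```

The added async test takes the same route via subscript access (`playwright[device_type]`); either way the device dict itself now names the engine it was recorded against.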
mkdocs__mkdocs-2893
Support latest release of Markdown library

I believe there has been some update to the `Markdown` library and how it internally records its version that is breaking things. With a brand new environment and a fresh install of `mkdocs`, a `mkdocs build --strict --verbose` fails my project with this error:

```bash
DEBUG - Loading configuration file: /Users/sh/Projects/dataportalapiclient/mkdocs.yml
ERROR - Config value: 'markdown_extensions'. Error: module 'markdown' has no attribute 'version_info'
```

At this point, mkdocs has a dependency on `Markdown==3.4.1`, which was released [three days ago](https://github.com/Python-Markdown/markdown/tags). After running `pip install Markdown==3.3.7` to downgrade the version, rerunning the build is successful:

```bash
DEBUG - Loading configuration file: /Users/sh/Projects/dataportalapiclient/mkdocs.yml
...
DEBUG - mkdocstrings: Tearing handlers down
INFO - Documentation built in 3.45 seconds
```

I notice in [this commit from May 27th on the Markdown repository](https://github.com/Python-Markdown/markdown/commit/a767b2daaad78ba32d45a4f1dabb7c5e218f030a) that the deprecated `version_info` object was removed and replaced with the `__version_info__` object, as per this table:

| Deprecated Object | Replacement Object |
|----------------------------------------|-------------------------------------|
| `markdown.version` | `markdown.__version__` |
| `markdown.version_info` | `markdown.__version_info__` |
| `markdown.util.etree` | `xml.etree.ElementTree` |
| `markdown.util.string_type` | `str` |
| `markdown.util.text_type` | `str` |
| `markdown.util.int2str` | `chr` |
| `markdown.util.iterrange` | `range` |
| `markdown.util.isBlockLevel` | `markdown.Markdown.is_block_level` |
| `markdown.util.Processor().markdown` | `markdown.util.Processor().md` |
| `markdown.util.Registry().__setitem__` | `markdown.util.Registry().register` |
| `markdown.util.Registry().__delitem__` | `markdown.util.Registry().deregister` |
| `markdown.util.Registry().add` | `markdown.util.Registry().register` |

Hopefully the fix is a simple change to this dunder object! I'm unsure whether this repository is the right place for the packaged markdown extension; I couldn't quite see where that config check runs, either here or in the [Python Markdown library](https://github.com/Python-Markdown/markdown/). If this isn't the place, I'd appreciate it if you could point me towards the right repo.
[ { "content": "#!/usr/bin/env python\n\nfrom setuptools import setup\nimport re\nimport os\nimport sys\n\nfrom mkdocs.commands.setup import babel_cmdclass\n\nwith open('README.md') as f:\n long_description = f.read()\n\n\ndef get_version(package):\n \"\"\"Return package version as listed in `__version__` i...
[ { "content": "#!/usr/bin/env python\n\nfrom setuptools import setup\nimport re\nimport os\nimport sys\n\nfrom mkdocs.commands.setup import babel_cmdclass\n\nwith open('README.md') as f:\n long_description = f.read()\n\n\ndef get_version(package):\n \"\"\"Return package version as listed in `__version__` i...
diff --git a/requirements/project.txt b/requirements/project.txt
index 1402023779..b158a10a42 100644
--- a/requirements/project.txt
+++ b/requirements/project.txt
@@ -1,7 +1,7 @@
 babel>=2.9.0
 click>=7.0
 Jinja2>=2.10.2
-Markdown>=3.2.1
+Markdown>=3.2.1,<3.4
 PyYAML>=5.2
 watchdog>=2.0.0
 mdx_gh_links>=0.2
diff --git a/setup.py b/setup.py
index 77cabc87e3..1115ce0e75 100755
--- a/setup.py
+++ b/setup.py
@@ -64,7 +64,7 @@ def get_packages(package):
     install_requires=[
         'click>=3.3',
         'Jinja2>=2.10.2',
-        'Markdown>=3.2.1',
+        'Markdown>=3.2.1,<3.4',
         'PyYAML>=3.10',
         'watchdog>=2.0',
         'ghp-import>=1.0',
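The PR above resolves the breakage by pinning `Markdown<3.4` rather than migrating to the renamed attribute. A code-side alternative would be a small compatibility shim that prefers the new `__version_info__` and falls back to the removed `version_info`; it is sketched here against `SimpleNamespace` stand-ins for the module, since this is not mkdocs code:

```python
from types import SimpleNamespace

# Markdown 3.4 removed version_info in favour of __version_info__.
# Reading the new name first and falling back to the old one works on
# both sides of the rename.
def md_version_info(module):
    info = getattr(module, "__version_info__", None)
    if info is None:
        info = getattr(module, "version_info", None)  # pre-3.4 fallback
    return info

new_style = SimpleNamespace(__version_info__=(3, 4, 1))   # Markdown >= 3.4
old_style = SimpleNamespace(version_info=(3, 3, 7))       # Markdown < 3.4

assert md_version_info(new_style) == (3, 4, 1)
assert md_version_info(old_style) == (3, 3, 7)
```

The pin is the safer short-term fix because third-party extensions may still touch the other deprecated names in the issue's table; the shim only covers the version attributes.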
django-cms__django-cms-3868
Implements a different live/draft switcher for placeholders outside of the CMS. Placeholders outside of the CMS currently cause the live/draft switcher to be displayed, which toggles the editing mode. However, for non-CMS models, draft versions are not implemented. This can be confusing to users. This PR adds another switcher template that uses the language "Editing Off" in place of "Draft" and "Editing Live" in place of "Live." The PageToolbar.add_draft_live() method has been modified to take the parameter "is_page" which determines the template used. Tests have not been implemented as there are currently no tests for cms_toolbar.py. Some thoughts:
- Should there be a separate method for adding the new toggle, as opposed to piggybacking off of add_draft_live?
- Should there be one template for the switcher that we hand text to?
- Can anyone offer guidance on implementing tests?
[ { "content": "# -*- coding: utf-8 -*-\n\nfrom django.conf import settings\nfrom django.contrib.auth import get_user_model\nfrom django.core.urlresolvers import reverse, NoReverseMatch, resolve, Resolver404\nfrom django.utils.translation import ugettext_lazy as _\nfrom django.contrib import admin\nfrom django.co...
[ { "content": "# -*- coding: utf-8 -*-\n\nfrom django.conf import settings\nfrom django.contrib.auth import get_user_model\nfrom django.core.urlresolvers import reverse, NoReverseMatch, resolve, Resolver404\nfrom django.utils.translation import ugettext_lazy as _\nfrom django.contrib import admin\nfrom django.co...
diff --git a/cms/cms_toolbar.py b/cms/cms_toolbar.py index 9285aaf348d..992099bd4e3 100644 --- a/cms/cms_toolbar.py +++ b/cms/cms_toolbar.py @@ -274,8 +274,8 @@ def populate(self): def post_template_populate(self): self.init_placeholders_from_request() - self.add_publish_button() self.add_draft_live() + self.add_publish_button() # Buttons diff --git a/cms/static/cms/css/cms.base.css b/cms/static/cms/css/cms.base.css index 537dcfcdf74..20c578b28ae 100644 --- a/cms/static/cms/css/cms.base.css +++ b/cms/static/cms/css/cms.base.css @@ -2,11 +2,11 @@ * @copyright: https://github.com/divio/django-cms */.cms_reset div,.cms_reset p,.cms_reset a,.cms_reset a:hover,.cms_reset a:active,.cms_reset a:focus,.cms_reset ul,.cms_reset li,.cms_reset form,.cms_reset fieldset,.cms_reset label,.cms_reset input,.cms_reset textarea{font:normal 13px/20px Helvetica,Arial,sans-serif;color:#222;text-decoration:none;text-align:left;outline:none;list-style-type:none;height:auto;padding:0;margin:0;border:none;background:none}#page_form_lang_tabs{position:relative}#page_form_lang_tabs input.language_button{background:#ccc}#page_form_lang_tabs input.selected{color:black;text-shadow:none;background:white}#page_form_lang_tabs input.notfilled{color:#bbb;background:none}#page_form_lang_tabs .lang_tabs_line{position:absolute;left:0;bottom:-5px;width:100%;height:5px;background:white}.cms_dialog{position:absolute;left:50%;top:50%;z-index:99999;width:500px;height:200px;padding:25px;margin:-100px 0 0 -275px;background:white;border:1px solid #ccc;-moz-border-radius:5px;-webkit-border-radius:5px;border-radius:5px}.cms_dialog h1{padding:0;margin:0 0 15px}.cms_dialog form{padding:15px 0;margin:15px 0;border-top:1px solid #f3f3f3}.cms_toolbar-noscroll{position:fixed;overflow-y:scroll;width:100%}/*! 
* @copyright: https://github.com/divio/django-cms - */#cms_toolbar .cms_loader{background:#fcfcfc url("../img/loader.gif") no-repeat center center !important}#cms_toolbar .cms_btn{color:#666;-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px;border:1px solid #e6e6e6;background:#e6e6e6;background-image:url('data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4gPHN2ZyB2ZXJzaW9uPSIxLjEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PGRlZnM+PGxpbmVhckdyYWRpZW50IGlkPSJncmFkIiBncmFkaWVudFVuaXRzPSJvYmplY3RCb3VuZGluZ0JveCIgeDE9IjAuNSIgeTE9IjAuMCIgeDI9IjAuNSIgeTI9IjEuMCI+PHN0b3Agb2Zmc2V0PSIwJSIgc3RvcC1jb2xvcj0iI2YyZjJmMiIvPjxzdG9wIG9mZnNldD0iMTAwJSIgc3RvcC1jb2xvcj0iI2U2ZTZlNiIvPjwvbGluZWFyR3JhZGllbnQ+PC9kZWZzPjxyZWN0IHg9IjAiIHk9IjAiIHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InVybCgjZ3JhZCkiIC8+PC9zdmc+IA==');background-size:100%;background-image:-webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(0%, #f2f2f2),color-stop(100%, #e6e6e6));background-image:-moz-linear-gradient(top, #f2f2f2,#e6e6e6);background-image:-webkit-linear-gradient(top, #f2f2f2,#e6e6e6);background-image:linear-gradient(to bottom, #f2f2f2,#e6e6e6);-moz-box-shadow:inset #f2f2f2 0px 1px 0px;-webkit-box-shadow:inset #f2f2f2 0px 1px 0px;box-shadow:inset #f2f2f2 0px 1px 0px}#cms_toolbar .cms_btn:hover,#cms_toolbar .cms_btn:active,#cms_toolbar .cms_btn:focus{background:#e6e6e6;border:1px solid #e6e6e6;-moz-box-shadow:none;-webkit-box-shadow:none;box-shadow:none}#cms_toolbar .cms_btn:active,#cms_toolbar .cms_btn:focus{border:1px solid #ccc;background:#ccc}#cms_toolbar .cms_btn-disabled{border-right:1px solid #ccc;border-top:1px solid #ccc;-moz-box-shadow:inset 0px 1px 0px #e6e6e6;-webkit-box-shadow:inset 0px 1px 0px #e6e6e6;box-shadow:inset 0px 1px 0px #e6e6e6;background:#ededed}#cms_toolbar .cms_btn-disabled:hover,#cms_toolbar .cms_btn-disabled:active,#cms_toolbar .cms_btn-disabled:focus{background-color:#e6e6e6}#cms_toolbar .cms_btn-active{color:white;border:1px 
solid #333 !important;border-bottom:none !important;-moz-box-shadow:inset 0 1px 0 #999;-webkit-box-shadow:inset 0 1px 0 #999;box-shadow:inset 0 1px 0 #999;background:#666;background-image:url('data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4gPHN2ZyB2ZXJzaW9uPSIxLjEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PGRlZnM+PGxpbmVhckdyYWRpZW50IGlkPSJncmFkIiBncmFkaWVudFVuaXRzPSJvYmplY3RCb3VuZGluZ0JveCIgeDE9IjAuNSIgeTE9IjAuMCIgeDI9IjAuNSIgeTI9IjEuMCI+PHN0b3Agb2Zmc2V0PSIwJSIgc3RvcC1jb2xvcj0iIzY2NjY2NiIvPjxzdG9wIG9mZnNldD0iMTAwJSIgc3RvcC1jb2xvcj0iIzMzMzMzMyIvPjwvbGluZWFyR3JhZGllbnQ+PC9kZWZzPjxyZWN0IHg9IjAiIHk9IjAiIHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InVybCgjZ3JhZCkiIC8+PC9zdmc+IA==');background-size:100%;background-image:-webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(0%, #666666),color-stop(100%, #333333));background-image:-moz-linear-gradient(top, #666666,#333333);background-image:-webkit-linear-gradient(top, #666666,#333333);background-image:linear-gradient(to bottom, #666666,#333333)}#cms_toolbar .cms_btn-active:hover,#cms_toolbar .cms_btn-active:active,#cms_toolbar .cms_btn-active:focus{background:#454545;-moz-box-shadow:none;-webkit-box-shadow:none;box-shadow:none}#cms_toolbar .cms_btn-active:active,#cms_toolbar .cms_btn-active:focus{background:#000}#cms_toolbar .cms_btn-action{color:white;border:1px solid #0e72ec !important;-moz-box-shadow:inset #3abcf3 0px 1px 0px;-webkit-box-shadow:inset #3abcf3 0px 1px 0px;box-shadow:inset #3abcf3 0px 1px 
0px;background:#0eaaec;background-image:url('data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4gPHN2ZyB2ZXJzaW9uPSIxLjEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PGRlZnM+PGxpbmVhckdyYWRpZW50IGlkPSJncmFkIiBncmFkaWVudFVuaXRzPSJvYmplY3RCb3VuZGluZ0JveCIgeDE9IjAuNSIgeTE9IjAuMCIgeDI9IjAuNSIgeTI9IjEuMCI+PHN0b3Agb2Zmc2V0PSIwJSIgc3RvcC1jb2xvcj0iIzBlOTdlYyIvPjxzdG9wIG9mZnNldD0iMTAwJSIgc3RvcC1jb2xvcj0iIzBlNzJlYyIvPjwvbGluZWFyR3JhZGllbnQ+PC9kZWZzPjxyZWN0IHg9IjAiIHk9IjAiIHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InVybCgjZ3JhZCkiIC8+PC9zdmc+IA==');background-size:100%;background-image:-webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(0%, #0e97ec),color-stop(100%, #0e72ec));background-image:-moz-linear-gradient(top, #0e97ec,#0e72ec);background-image:-webkit-linear-gradient(top, #0e97ec,#0e72ec);background-image:linear-gradient(to bottom, #0e97ec,#0e72ec)}#cms_toolbar .cms_btn-action:hover,#cms_toolbar .cms_btn-action:active,#cms_toolbar .cms_btn-action:focus{background:#0e72ec;-moz-box-shadow:none;-webkit-box-shadow:none;box-shadow:none}#cms_toolbar .cms_btn-action:active,#cms_toolbar .cms_btn-action:focus{background:#0b5bbc}#cms_toolbar .cms_btn-caution{color:white;border:1px solid #ff4000 !important;-moz-box-shadow:inset #f66 0px 1px 0px;-webkit-box-shadow:inset #f66 0px 1px 0px;box-shadow:inset #f66 0px 1px 0px;background:red;background-image:url('data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4gPHN2ZyB2ZXJzaW9uPSIxLjEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PGRlZnM+PGxpbmVhckdyYWRpZW50IGlkPSJncmFkIiBncmFkaWVudFVuaXRzPSJvYmplY3RCb3VuZGluZ0JveCIgeDE9IjAuNSIgeTE9IjAuMCIgeDI9IjAuNSIgeTI9IjEuMCI+PHN0b3Agb2Zmc2V0PSIwJSIgc3RvcC1jb2xvcj0iI2ZmMTUwMCIvPjxzdG9wIG9mZnNldD0iMTAwJSIgc3RvcC1jb2xvcj0iI2ZmNDAwMCIvPjwvbGluZWFyR3JhZGllbnQ+PC9kZWZzPjxyZWN0IHg9IjAiIHk9IjAiIHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InVybCgjZ3JhZCkiIC8+PC9zdmc+IA==');background-size:100%;background-image:-webkit-gradient(linear, 50% 0%, 50% 
100%, color-stop(0%, #ff1500),color-stop(100%, #ff4000));background-image:-moz-linear-gradient(top, #ff1500,#ff4000);background-image:-webkit-linear-gradient(top, #ff1500,#ff4000);background-image:linear-gradient(to bottom, #ff1500,#ff4000)}#cms_toolbar .cms_btn-caution:hover,#cms_toolbar .cms_btn-caution:active,#cms_toolbar .cms_btn-caution:focus{background:#ff4000;-moz-box-shadow:none;-webkit-box-shadow:none;box-shadow:none}#cms_toolbar .cms_btn-caution:active,#cms_toolbar .cms_btn-caution:focus{background:#c30}#cms_toolbar .cms_btn-publish{display:none}#cms_toolbar .cms_btn-publish-active{display:block}#cms_toolbar .cms_tooltip{visibility:hidden;position:absolute;left:0;top:0;-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px;font-size:11px;line-height:11px;font-weight:bold;white-space:nowrap;padding:5px 7px 7px 24px;margin:0;color:#e6e6e6;background:#454545 url("../img/toolbar/sprite_toolbar.png") no-repeat -78px -169px}#cms_toolbar .cms_tooltip span{float:right;position:absolute;right:0;top:20px;-moz-border-radius:3px 0 3px 3px;-webkit-border-radius:3px;border-radius:3px 0 3px 3px;color:white;font-weight:normal;padding:5px 7px;background:#454545}/*! 
+ */#cms_toolbar .cms_loader{background:#fcfcfc url("../img/loader.gif") no-repeat center center !important}#cms_toolbar .cms_btn{color:#666;-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px;border:1px solid #e6e6e6;background:#e6e6e6;background-image:-moz-linear-gradient(top, #f2f2f2,#e6e6e6);background-image:-webkit-linear-gradient(top, #f2f2f2,#e6e6e6);background-image:linear-gradient(to bottom, #f2f2f2,#e6e6e6);-moz-box-shadow:inset #f2f2f2 0px 1px 0px;-webkit-box-shadow:inset #f2f2f2 0px 1px 0px;box-shadow:inset #f2f2f2 0px 1px 0px}#cms_toolbar .cms_btn:hover,#cms_toolbar .cms_btn:active,#cms_toolbar .cms_btn:focus{background:#e6e6e6;border:1px solid #e6e6e6;-moz-box-shadow:none;-webkit-box-shadow:none;box-shadow:none}#cms_toolbar .cms_btn:active,#cms_toolbar .cms_btn:focus{border:1px solid #ccc;background:#ccc}#cms_toolbar .cms_btn-disabled{border-right:1px solid #ccc;border-top:1px solid #ccc;-moz-box-shadow:inset 0px 1px 0px #e6e6e6;-webkit-box-shadow:inset 0px 1px 0px #e6e6e6;box-shadow:inset 0px 1px 0px #e6e6e6;background:#ededed}#cms_toolbar .cms_btn-disabled:hover,#cms_toolbar .cms_btn-disabled:active,#cms_toolbar .cms_btn-disabled:focus{background-color:#e6e6e6}#cms_toolbar .cms_btn-active{color:white;border:1px solid #333 !important;border-bottom:none !important;-moz-box-shadow:inset 0 1px 0 #999;-webkit-box-shadow:inset 0 1px 0 #999;box-shadow:inset 0 1px 0 #999;background:#666;background-image:-moz-linear-gradient(top, #666666,#333333);background-image:-webkit-linear-gradient(top, #666666,#333333);background-image:linear-gradient(to bottom, #666666,#333333)}#cms_toolbar .cms_btn-active:hover,#cms_toolbar .cms_btn-active:active,#cms_toolbar .cms_btn-active:focus{background:#454545;-moz-box-shadow:none;-webkit-box-shadow:none;box-shadow:none}#cms_toolbar .cms_btn-active:active,#cms_toolbar .cms_btn-active:focus{background:#000}#cms_toolbar .cms_btn-action{color:white;border:1px solid #0e72ec !important;-moz-box-shadow:inset #3abcf3 
0px 1px 0px;-webkit-box-shadow:inset #3abcf3 0px 1px 0px;box-shadow:inset #3abcf3 0px 1px 0px;background:#0eaaec;background-image:-moz-linear-gradient(top, #0e97ec,#0e72ec);background-image:-webkit-linear-gradient(top, #0e97ec,#0e72ec);background-image:linear-gradient(to bottom, #0e97ec,#0e72ec)}#cms_toolbar .cms_btn-action:hover,#cms_toolbar .cms_btn-action:active,#cms_toolbar .cms_btn-action:focus{background:#0e72ec;-moz-box-shadow:none;-webkit-box-shadow:none;box-shadow:none}#cms_toolbar .cms_btn-action:active,#cms_toolbar .cms_btn-action:focus{background:#0b5bbc}#cms_toolbar .cms_btn-caution{color:white;border:1px solid #ff4000 !important;-moz-box-shadow:inset #f66 0px 1px 0px;-webkit-box-shadow:inset #f66 0px 1px 0px;box-shadow:inset #f66 0px 1px 0px;background:red;background-image:-moz-linear-gradient(top, #ff1500,#ff4000);background-image:-webkit-linear-gradient(top, #ff1500,#ff4000);background-image:linear-gradient(to bottom, #ff1500,#ff4000)}#cms_toolbar .cms_btn-caution:hover,#cms_toolbar .cms_btn-caution:active,#cms_toolbar .cms_btn-caution:focus{background:#ff4000;-moz-box-shadow:none;-webkit-box-shadow:none;box-shadow:none}#cms_toolbar .cms_btn-caution:active,#cms_toolbar .cms_btn-caution:focus{background:#c30}#cms_toolbar .cms_btn-publish{display:none}#cms_toolbar .cms_btn-publish-active{display:block}#cms_toolbar .cms_tooltip{visibility:hidden;position:absolute;left:0;top:0;-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px;font-size:11px;line-height:11px;font-weight:bold;white-space:nowrap;padding:5px 7px 7px 24px;margin:0;color:#e6e6e6;background:#454545 url("../img/toolbar/sprite_toolbar.png") no-repeat -78px -169px}#cms_toolbar .cms_tooltip span{float:right;position:absolute;right:0;top:20px;-moz-border-radius:3px 0 3px 3px;-webkit-border-radius:3px;border-radius:3px 0 3px 3px;color:white;font-weight:normal;padding:5px 7px;background:#454545}/*! 
* @copyright: https://github.com/divio/django-cms */.cms_plugin{display:inline}.cms_plugin-active{outline:#0e72ec auto 4px}.cms_placeholder{height:0px;overflow:hidden}.cms_render_model_icon{display:inline-block;width:18px;height:18px;padding:0;margin:0;cursor:pointer}.cms_render_model_icon img{max-width:none;position:relative;padding:0 !important;margin:0 !important;background:url("../img/toolbar/render_model_icon.png") no-repeat}.cms_render_model_add{display:inline-block;width:18px;height:18px;padding:0;margin:0;cursor:pointer}.cms_render_model_add img{max-width:none;position:relative;padding:0 !important;margin:0 !important;background:url("../img/toolbar/render_model_add.png") no-repeat}/*! * @copyright: https://github.com/divio/django-cms - */#cms_toolbar{position:absolute;left:0;top:5px;z-index:9999999;width:100%}#cms_toolbar .cms_toolbar{display:none;position:fixed;left:0;top:0;z-index:999999;width:100%;min-width:320px;height:30px;border-bottom:1px solid #666 !important;background-color:white;background:rgba(250,250,250,0);background-image:url('data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4gPHN2ZyB2ZXJzaW9uPSIxLjEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PGRlZnM+PGxpbmVhckdyYWRpZW50IGlkPSJncmFkIiBncmFkaWVudFVuaXRzPSJvYmplY3RCb3VuZGluZ0JveCIgeDE9IjAuNSIgeTE9IjAuMCIgeDI9IjAuNSIgeTI9IjEuMCI+PHN0b3Agb2Zmc2V0PSIwJSIgc3RvcC1jb2xvcj0iI2ZhZmFmYSIgc3RvcC1vcGFjaXR5PSIwLjk3Ii8+PHN0b3Agb2Zmc2V0PSI1MCUiIHN0b3AtY29sb3I9IiNmY2ZjZmMiIHN0b3Atb3BhY2l0eT0iMC45NyIvPjxzdG9wIG9mZnNldD0iMTAwJSIgc3RvcC1jb2xvcj0iI2ZhZmFmYSIgc3RvcC1vcGFjaXR5PSIwLjk1Ii8+PC9saW5lYXJHcmFkaWVudD48L2RlZnM+PHJlY3QgeD0iMCIgeT0iMCIgd2lkdGg9IjEwMCUiIGhlaWdodD0iMTAwJSIgZmlsbD0idXJsKCNncmFkKSIgLz48L3N2Zz4g');background-size:100%;background-image:-webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(0%, rgba(250,250,250,0.97)),color-stop(50%, rgba(252,252,252,0.97)),color-stop(100%, rgba(250,250,250,0.95)));background-image:-moz-linear-gradient(top, rgba(250,250,250,0.97) 
0%,rgba(252,252,252,0.97) 50%,rgba(250,250,250,0.95) 100%);background-image:-webkit-linear-gradient(top, rgba(250,250,250,0.97) 0%,rgba(252,252,252,0.97) 50%,rgba(250,250,250,0.95) 100%);background-image:linear-gradient(to bottom, rgba(250,250,250,0.97) 0%,rgba(252,252,252,0.97) 50%,rgba(250,250,250,0.95) 100%);-moz-box-shadow:0 0 5px rgba(0,0,0,0.2);-webkit-box-shadow:0 0 5px rgba(0,0,0,0.2);box-shadow:0 0 5px rgba(0,0,0,0.2);background/**/:#fcfcfc}#cms_toolbar .cms_toolbar .cms_toolbar-left{float:left;padding-left:10px;position:relative;z-index:10}#cms_toolbar .cms_toolbar .cms_toolbar-right{float:right;padding-right:32px;position:relative;z-index:10}#cms_toolbar .cms_toolbar .cms_toolbar-left .cms_toolbar-item{margin-left:10px}#cms_toolbar .cms_toolbar .cms_toolbar-right .cms_toolbar-item{margin-right:20px}#cms_toolbar .cms_toolbar .cms_toolbar-item{float:left}#cms_toolbar .cms_toolbar .cms_toolbar-item-buttons a{border-bottom:none !important}@media only screen and (max-width: 800px){#cms_toolbar .cms_toolbar-right{display:none}}#cms_toolbar.cms_toolbar-debug .cms_toolbar{top:5px !important}#cms_toolbar.cms_toolbar-debug .cms_toolbar-trigger{top:5px !important}#cms_toolbar.cms_toolbar-debug .cms_debug-bar{position:fixed;left:0;top:0;z-index:99999999;width:100%;height:4px;border-bottom:1px solid #ddd;background:#fdffc8 url("../img/toolbar/sprite_toolbar.png") repeat-x left -444px}#cms_toolbar.cms_toolbar-debug #container{padding-top:35px !important}#cms_toolbar .cms_toolbar-item-navigation li{float:left;position:relative;zoom:1}#cms_toolbar .cms_toolbar-item-navigation li a{float:left;padding:5px 10px;zoom:1;cursor:default}#cms_toolbar .cms_toolbar-item-navigation li ul{display:none}#cms_toolbar .cms_toolbar-item-navigation>li:first-child>a span{font-weight:800;line-height:12px}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-hover ul{position:absolute;left:0;top:30px;display:block;min-width:180px;padding:4px 0;-moz-border-radius:0 0 4px 
4px;-webkit-border-radius:0;border-radius:0 0 4px 4px;border:1px solid white;border-top:none;background-color:white;background:rgba(255,255,255,0.97);-moz-box-shadow:0 1px 1px rgba(0,0,0,0.4);-webkit-box-shadow:0 1px 1px rgba(0,0,0,0.4);box-shadow:0 1px 1px rgba(0,0,0,0.4)}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-hover ul li{float:none;zoom:1}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-hover ul li a{float:none;display:block;zoom:1;cursor:pointer;white-space:nowrap;padding:2px 10px 2px 15px}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-hover ul ul{-moz-border-radius:0 4px 4px 0;-webkit-border-radius:0;border-radius:0 4px 4px 0;border-top:1px solid #f5f5f5}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-hover .cms_toolbar-item-navigation-children ul{display:none;left:100%;top:-5px}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-hover .cms_toolbar-item-navigation-children>a{cursor:default}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-hover .cms_toolbar-item-navigation-children>a span{display:block;background:url("../img/toolbar/sprite_toolbar.png") no-repeat right -270px}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-hover>a{color:white;background:#0e72ec;background-image:url('data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4gPHN2ZyB2ZXJzaW9uPSIxLjEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PGRlZnM+PGxpbmVhckdyYWRpZW50IGlkPSJncmFkIiBncmFkaWVudFVuaXRzPSJvYmplY3RCb3VuZGluZ0JveCIgeDE9IjAuNSIgeTE9IjAuMCIgeDI9IjAuNSIgeTI9IjEuMCI+PHN0b3Agb2Zmc2V0PSIwJSIgc3RvcC1jb2xvcj0iIzBlOTdlYyIvPjxzdG9wIG9mZnNldD0iMTAwJSIgc3RvcC1jb2xvcj0iIzBlNzJlYyIvPjwvbGluZWFyR3JhZGllbnQ+PC9kZWZzPjxyZWN0IHg9IjAiIHk9IjAiIHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InVybCgjZ3JhZCkiIC8+PC9zdmc+IA==');background-size:100%;background-image:-webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(0%, 
#0e97ec),color-stop(100%, #0e72ec));background-image:-moz-linear-gradient(top, #0e97ec,#0e72ec);background-image:-webkit-linear-gradient(top, #0e97ec,#0e72ec);background-image:linear-gradient(to bottom, #0e97ec,#0e72ec)}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-hover>a span{background-position:right -300px !important}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-hover .cms_toolbar-item-navigation-active>a{font-weight:800}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-break{height:1px;margin:0 0 4px;padding:0 0 3px;text-indent:-119988px;overflow:hidden;text-align:left;text-transform:capitalize;border-bottom:1px solid #e6e6e6}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-disabled a{cursor:default !important;filter:progid:DXImageTransform.Microsoft.Alpha(Opacity=20);opacity:0.2}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-disabled a:hover,#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-disabled a:active,#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-disabled a:focus{-moz-box-shadow:none;-webkit-box-shadow:none;box-shadow:none;background:none !important;color:black !important}#cms_toolbar .cms_toolbar-item-navigation .cms_toolbar-item-navigation-disabled ul{display:none !important}#cms_toolbar .cms_toolbar-item-cms-mode-switcher{display:none}#cms_toolbar .cms_messages{display:none;position:fixed;z-index:999999;top:30px;width:300px;min-height:14px;margin:0;padding:6px 10px 8px;background:rgba(0,0,0,0.74);-moz-border-radius:0 0 3px 3px;-webkit-border-radius:0;border-radius:0 0 3px 3px;color:white;font-size:12px;line-height:16px;font-weight:200}#cms_toolbar .cms_messages *{color:white;font-size:12px;line-height:16px;font-weight:200}#cms_toolbar .cms_messages a{color:#0eaaec}#cms_toolbar .cms_messages a:hover{text-decoration:underline}#cms_toolbar .cms_messages 
strong{color:#3abcf3;font-weight:200}#cms_toolbar .cms_messages ul{display:inline;color:white}#cms_toolbar .cms_messages ul li{display:inline;color:white;font-weight:200}#cms_toolbar .cms_messages .cms_messages-close{display:none;float:right;cursor:pointer;width:20px;height:14px;margin-left:10px;position:relative;top:-2px;left:3px;background:url("../img/toolbar/sprite_toolbar.png") no-repeat -100px -90px}#cms_toolbar .cms_messages .cms_messages-close:hover{background-position:-120px -90px}#cms_toolbar .cms_messages-error strong{color:red}#cms_toolbar .cms_toolbar-item-logo{margin:0 !important}#cms_toolbar .cms_toolbar-item-logo a{display:block;width:92px;height:20px;margin:5px 0;text-indent:-119988px;overflow:hidden;text-align:left;text-transform:capitalize;background:url("../img/toolbar/sprite_toolbar.png") no-repeat left top}#cms_toolbar .cms_toolbar-item-logo a:hover,#cms_toolbar .cms_toolbar-item-logo a:active,#cms_toolbar .cms_toolbar-item-logo a:focus{background-position:left -20px}#cms_toolbar .cms_form-login{padding:3px 0 0 0}#cms_toolbar .cms_form-login label{float:left;cursor:pointer;padding-left:10px}#cms_toolbar .cms_form-login label span{padding-top:1px;display:inline-block;vertical-align:middle;*vertical-align:auto;*zoom:1;*display:inline}#cms_toolbar .cms_form-login input[type="text"],#cms_toolbar .cms_form-login input[type="password"]{font-size:13px;line-height:13px;width:100px;padding:3px 5px;margin:0;-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px;color:#666;border:1px solid #d9d9d9;-moz-box-shadow:0px 1px 0px #fff;-webkit-box-shadow:0px 1px 0px #fff;box-shadow:0px 1px 0px #fff}#cms_toolbar .cms_form-login input[type="text"]:focus,#cms_toolbar .cms_form-login input[type="password"]:focus{border-color:#0eaaec;-moz-box-shadow:inset 0px 0px 2px #e6e6e6;-webkit-box-shadow:inset 0px 0px 2px #e6e6e6;box-shadow:inset 0px 0px 2px #e6e6e6;-moz-transition:outline,0.2s 1s;-o-transition:outline,0.2s 1s;-webkit-transition:outline,0.2s 
1s;transition:outline 0.2s 1s}#cms_toolbar .cms_form-login input[type="submit"]{display:block;color:white;font-size:12px;text-transform:uppercase;cursor:pointer;height:23px;padding:1px 15px 0;-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px;border:1px solid #333;background-color:#666;-moz-box-shadow:inset 0 1px 0 #999;-webkit-box-shadow:inset 0 1px 0 #999;box-shadow:inset 0 1px 0 #999;background-image:url('data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4gPHN2ZyB2ZXJzaW9uPSIxLjEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PGRlZnM+PGxpbmVhckdyYWRpZW50IGlkPSJncmFkIiBncmFkaWVudFVuaXRzPSJvYmplY3RCb3VuZGluZ0JveCIgeDE9IjAuNSIgeTE9IjAuMCIgeDI9IjAuNSIgeTI9IjEuMCI+PHN0b3Agb2Zmc2V0PSIwJSIgc3RvcC1jb2xvcj0iIzY2NjY2NiIvPjxzdG9wIG9mZnNldD0iMTAwJSIgc3RvcC1jb2xvcj0iIzMzMzMzMyIvPjwvbGluZWFyR3JhZGllbnQ+PC9kZWZzPjxyZWN0IHg9IjAiIHk9IjAiIHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InVybCgjZ3JhZCkiIC8+PC9zdmc+IA==');background-size:100%;background-image:-webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(0%, #666666),color-stop(100%, #333333));background-image:-moz-linear-gradient(top, #666666,#333333);background-image:-webkit-linear-gradient(top, #666666,#333333);background-image:linear-gradient(to bottom, #666666,#333333)}#cms_toolbar .cms_form-login input[type="submit"]:hover,#cms_toolbar .cms_form-login input[type="submit"]:active,#cms_toolbar .cms_form-login input[type="submit"]:focus{background:#454545;-moz-box-shadow:none;-webkit-box-shadow:none;box-shadow:none}#cms_toolbar .cms_form-login input[type="submit"]:active,#cms_toolbar .cms_form-login input[type="submit"]:focus{background:#000}#cms_toolbar .cms_form-login .cms_error{color:red}#cms_toolbar .cms_form-login .cms_error input{border:1px solid red}#cms_toolbar .cms_toolbar-item-buttons{margin:4px 0 4px}#cms_toolbar .cms_toolbar-item-buttons a{float:left;font-size:11px;line-height:1;padding:5px 12px}#cms_toolbar .cms_toolbar-item-buttons a:first-child{-moz-border-radius:3px 0 
0 3px;-webkit-border-radius:3px;border-radius:3px 0 0 3px}#cms_toolbar .cms_toolbar-item-buttons a:last-child{-moz-border-radius:0 3px 3px 0;-webkit-border-radius:0;border-radius:0 3px 3px 0}#cms_toolbar .cms_toolbar-item-buttons a:only-child{-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px}#cms_toolbar .cms_toolbar-trigger{position:fixed;right:0;top:0;z-index:999999;border-left:1px solid #666;border-bottom:1px solid #666}#cms_toolbar .cms_toolbar-trigger a{display:block;width:30px;height:29px;text-indent:-119988px;overflow:hidden;text-align:left;text-transform:capitalize;color:#454545;border-left:1px solid white;border-top:1px solid white;background:#fafafa url("../img/toolbar/sprite_toolbar.png") no-repeat -60px -40px}#cms_toolbar .cms_toolbar-trigger a:hover,#cms_toolbar .cms_toolbar-trigger a:active,#cms_toolbar .cms_toolbar-trigger a:focus{background-position:-90px -40px;background-color:white}#cms_toolbar .cms_toolbar-trigger-expanded a{background-position:0 -40px}#cms_toolbar .cms_toolbar-trigger-expanded a:hover,#cms_toolbar .cms_toolbar-trigger-expanded a:active,#cms_toolbar .cms_toolbar-trigger-expanded a:focus{background-position:-30px -40px}#cms_toolbar .cms_toolbar-loader a{background:#fcfcfc url("../img/loader.gif") no-repeat center center !important;background-size:20px 20px !important}#cms_toolbar .cms_toolbar-item_switch{position:relative;left:0;top:0;margin:4px 0 4px;-moz-border-radius:20px;-webkit-border-radius:20px;border-radius:20px;border-top:1px solid #e6e6e6;background:#ededed;-moz-box-shadow:inset #e6e6e6 0px 1px 0px;-webkit-box-shadow:inset #e6e6e6 0px 1px 0px;box-shadow:inset #e6e6e6 0px 1px 0px}#cms_toolbar .cms_toolbar-item_switch:hover,#cms_toolbar .cms_toolbar-item_switch:active,#cms_toolbar .cms_toolbar-item_switch:focus{background-color:#e6e6e6}#cms_toolbar .cms_toolbar-item_switch a{float:left;position:relative;z-index:100;font-size:11px;line-height:11px;text-transform:uppercase;letter-spacing:1px;padding:6px 14px 
4px 28px;color:black;text-shadow:0 1px 0 #fff}#cms_toolbar .cms_toolbar-item_switch .cms_toolbar-item_switch-knob{float:left;position:absolute;left:2px;top:1px;z-index:99;width:16px;height:16px;-moz-border-radius:16px;-webkit-border-radius:16px;border-radius:16px;text-indent:-119988px;overflow:hidden;text-align:left;text-transform:capitalize;border:1px solid black;background:#454545;-moz-box-shadow:inset 0 1px 0 #999;-webkit-box-shadow:inset 0 1px 0 #999;box-shadow:inset 0 1px 0 #999;background-image:url('data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4gPHN2ZyB2ZXJzaW9uPSIxLjEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PGRlZnM+PGxpbmVhckdyYWRpZW50IGlkPSJncmFkIiBncmFkaWVudFVuaXRzPSJvYmplY3RCb3VuZGluZ0JveCIgeDE9IjAuNSIgeTE9IjAuMCIgeDI9IjAuNSIgeTI9IjEuMCI+PHN0b3Agb2Zmc2V0PSIwJSIgc3RvcC1jb2xvcj0iIzY2NjY2NiIvPjxzdG9wIG9mZnNldD0iMTAwJSIgc3RvcC1jb2xvcj0iIzMzMzMzMyIvPjwvbGluZWFyR3JhZGllbnQ+PC9kZWZzPjxyZWN0IHg9IjAiIHk9IjAiIHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InVybCgjZ3JhZCkiIC8+PC9zdmc+IA==');background-size:100%;background-image:-webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(0%, #666666),color-stop(100%, #333333));background-image:-moz-linear-gradient(top, #666666,#333333);background-image:-webkit-linear-gradient(top, #666666,#333333);background-image:linear-gradient(to bottom, #666666,#333333)}#cms_toolbar .cms_toolbar-item_switch .cms_toolbar-item_switch-on{display:none;position:relative;top:-1px}#cms_toolbar .cms_toolbar-item_switch .cms_toolbar-item_switch-off{display:inline;position:relative;top:-1px}#cms_toolbar .cms_toolbar-item_switch-active a{padding:6px 28px 4px 14px;color:#693}#cms_toolbar .cms_toolbar-item_switch-active .cms_toolbar-item_switch-knob{left:auto;right:2px;border:1px solid #80bf40;background:#80bf40;-moz-box-shadow:inset 0 1px 0 #b3d98c;-webkit-box-shadow:inset 0 1px 0 #b3d98c;box-shadow:inset 0 1px 0 
#b3d98c;background-image:url('data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4gPHN2ZyB2ZXJzaW9uPSIxLjEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PGRlZnM+PGxpbmVhckdyYWRpZW50IGlkPSJncmFkIiBncmFkaWVudFVuaXRzPSJvYmplY3RCb3VuZGluZ0JveCIgeDE9IjAuNSIgeTE9IjAuMCIgeDI9IjAuNSIgeTI9IjEuMCI+PHN0b3Agb2Zmc2V0PSIwJSIgc3RvcC1jb2xvcj0iIzgwYmY0MCIvPjxzdG9wIG9mZnNldD0iNTAlIiBzdG9wLWNvbG9yPSIjNjZhZDFmIi8+PHN0b3Agb2Zmc2V0PSIxMDAlIiBzdG9wLWNvbG9yPSIjNjZiODE0Ii8+PC9saW5lYXJHcmFkaWVudD48L2RlZnM+PHJlY3QgeD0iMCIgeT0iMCIgd2lkdGg9IjEwMCUiIGhlaWdodD0iMTAwJSIgZmlsbD0idXJsKCNncmFkKSIgLz48L3N2Zz4g');background-size:100%;background-image:-webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(0%, #80bf40),color-stop(50%, #66ad1f),color-stop(100%, #66b814));background-image:-moz-linear-gradient(top, #80bf40 0%,#66ad1f 50%,#66b814 100%);background-image:-webkit-linear-gradient(top, #80bf40 0%,#66ad1f 50%,#66b814 100%);background-image:linear-gradient(to bottom, #80bf40 0%,#66ad1f 50%,#66b814 100%)}#cms_toolbar .cms_toolbar-item_switch-active .cms_toolbar-item_switch-on{display:inline}#cms_toolbar .cms_toolbar-item_switch-active .cms_toolbar-item_switch-off{display:none}#cms_toolbar .cms_toolbar-item_switch-highlight a{color:#0eaaec}#cms_toolbar .cms_toolbar-item_switch-highlight .cms_toolbar-item_switch-knob{border:1px solid #0b87bc;background:#3abcf3;-moz-box-shadow:inset 0 1px 0 #6accf6;-webkit-box-shadow:inset 0 1px 0 #6accf6;box-shadow:inset 0 1px 0 
#6accf6;background-image:url('data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4gPHN2ZyB2ZXJzaW9uPSIxLjEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PGRlZnM+PGxpbmVhckdyYWRpZW50IGlkPSJncmFkIiBncmFkaWVudFVuaXRzPSJvYmplY3RCb3VuZGluZ0JveCIgeDE9IjAuNSIgeTE9IjAuMCIgeDI9IjAuNSIgeTI9IjEuMCI+PHN0b3Agb2Zmc2V0PSIwJSIgc3RvcC1jb2xvcj0iIzBlOTdlYyIvPjxzdG9wIG9mZnNldD0iMTAwJSIgc3RvcC1jb2xvcj0iIzBlNzJlYyIvPjwvbGluZWFyR3JhZGllbnQ+PC9kZWZzPjxyZWN0IHg9IjAiIHk9IjAiIHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InVybCgjZ3JhZCkiIC8+PC9zdmc+IA==');background-size:100%;background-image:-webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(0%, #0e97ec),color-stop(100%, #0e72ec));background-image:-moz-linear-gradient(top, #0e97ec,#0e72ec);background-image:-webkit-linear-gradient(top, #0e97ec,#0e72ec);background-image:linear-gradient(to bottom, #0e97ec,#0e72ec)}#cms_toolbar .cms_screenblock{color:white;text-align:center;position:fixed;right:0;top:0;z-index:100;width:100%;height:100%;background-color:black;background:rgba(0,0,0,0.9)}#cms_toolbar .cms_screenblock .cms_screenblock-inner{margin-top:300px}#cms_toolbar .cms_screenblock .cms_screenblock-inner h1{font-size:28px;line-height:30px}#cms_toolbar .cms_screenblock .cms_screenblock-inner h1,#cms_toolbar .cms_screenblock .cms_screenblock-inner p{color:#a6a6a6;text-align:center}#cms_toolbar .cms_screenblock .cms_screenblock-inner a{color:white}#cms_toolbar .cms_screenblock .cms_screenblock-inner a:hover{text-decoration:underline}/*! 
* @copyright: https://github.com/divio/django-cms */#cms_toolbar .cms_modal{display:none;position:fixed;left:50%;top:50%;z-index:999999;overflow:hidden;-moz-border-radius:5px;-webkit-border-radius:5px;border-radius:5px;-moz-box-shadow:0 0 20px rgba(0,0,0,0.5);-webkit-box-shadow:0 0 20px rgba(0,0,0,0.5);box-shadow:0 0 20px rgba(0,0,0,0.5);background:white}#cms_toolbar .cms_modal .cms_modal-body{position:relative;z-index:10;width:800px;height:400px;border-top:1px solid #e6e6e6;border-bottom:1px solid #e6e6e6}#cms_toolbar .cms_modal .cms_modal-foot{position:relative;height:32px;-moz-border-radius:0px 0px 5px 5px;-webkit-border-radius:0px;border-radius:0px 0px 5px 5px;clear:both;overflow:hidden;background:#fafafa}#cms_toolbar .cms_modal .cms_modal-shim{display:none;position:absolute;left:0;top:0;z-index:20;width:100%;height:100%}#cms_toolbar .cms_modal .cms_modal-frame{position:relative;z-index:10;width:100%;height:100%}#cms_toolbar .cms_modal .cms_modal-frame iframe{width:100%;height:100%}#cms_toolbar .cms_modal .cms_modal-title{display:block;font-size:13px;font-weight:bold;text-align:center;cursor:move;padding:4px 75px 3px;-moz-border-radius:5px 5px 0px 0px;-webkit-border-radius:5px;border-radius:5px 5px 0px 0px;color:#454545;background:#fafafa}#cms_toolbar .cms_modal .cms_modal-collapse,#cms_toolbar .cms_modal .cms_modal-close,#cms_toolbar .cms_modal .cms_modal-maximize{display:block;position:absolute;right:3px;top:3px;text-indent:-119988px;overflow:hidden;text-align:left;text-transform:capitalize;cursor:pointer;width:20px;height:20px;background:url("../img/toolbar/sprite_toolbar.png") no-repeat left top}#cms_toolbar .cms_modal .cms_modal-collapse{right:40px;background-position:0 -70px}#cms_toolbar .cms_modal .cms_modal-collapse:hover,#cms_toolbar .cms_modal .cms_modal-collapse:active,#cms_toolbar .cms_modal .cms_modal-collapse:focus{background-position:-20px -70px}#cms_toolbar .cms_modal .cms_modal-collapsed{background-position:-100px -70px}#cms_toolbar .cms_modal 
.cms_modal-collapsed:hover,#cms_toolbar .cms_modal .cms_modal-collapsed:active,#cms_toolbar .cms_modal .cms_modal-collapsed:focus{background-position:-100px -70px}#cms_toolbar .cms_modal .cms_modal-maximize{right:22px;background-position:0 -90px}#cms_toolbar .cms_modal .cms_modal-maximize:hover,#cms_toolbar .cms_modal .cms_modal-maximize:active,#cms_toolbar .cms_modal .cms_modal-maximize:focus{background-position:-20px -90px}#cms_toolbar .cms_modal .cms_modal-maximize-active{background-position:-20px -90px !important}#cms_toolbar .cms_modal .cms_modal-close{background-position:-40px -70px}#cms_toolbar .cms_modal .cms_modal-close:hover,#cms_toolbar .cms_modal .cms_modal-close:active,#cms_toolbar .cms_modal .cms_modal-close:focus{background-position:-60px -70px}#cms_toolbar .cms_modal .cms_modal-resize{position:absolute;right:0;bottom:0;z-index:102;width:20px;height:20px;cursor:nw-resize;background:url("../img/toolbar/sprite_toolbar.png") no-repeat -117px -67px}#cms_toolbar .cms_modal .cms_modal-resize:hover{background-position:-137px -67px}#cms_toolbar .cms_modal .cms_modal-breadcrumb{display:none;float:left;font-size:12px;line-height:12px;position:relative;z-index:100;height:32px;min-width:225px;overflow:hidden;width:100%}#cms_toolbar .cms_modal .cms_modal-breadcrumb .cms_modal-breadcrumb-items{position:absolute;left:35px;top:0;width:9999px;background:#fcfcfc}#cms_toolbar .cms_modal .cms_modal-breadcrumb a{float:left;font-size:12px;line-height:12px;margin-left:-10px;position:relative;z-index:100;padding-right:10px;color:#666;background:url("../img/toolbar/sprite_toolbar.png") no-repeat right -200px}#cms_toolbar .cms_modal .cms_modal-breadcrumb a span{float:left;padding:10px 15px 10px 25px;color:black}#cms_toolbar .cms_modal .cms_modal-breadcrumb a:nth-child(1){z-index:100}#cms_toolbar .cms_modal .cms_modal-breadcrumb a:nth-child(2){z-index:80}#cms_toolbar .cms_modal .cms_modal-breadcrumb a:nth-child(3){z-index:70}#cms_toolbar .cms_modal .cms_modal-breadcrumb 
a:nth-child(4){z-index:60}#cms_toolbar .cms_modal .cms_modal-breadcrumb a:nth-child(5){z-index:50}#cms_toolbar .cms_modal .cms_modal-breadcrumb a:nth-child(6){z-index:40}#cms_toolbar .cms_modal .cms_modal-breadcrumb a:nth-child(7){z-index:30}#cms_toolbar .cms_modal .cms_modal-breadcrumb a:nth-child(8){z-index:20}#cms_toolbar .cms_modal .cms_modal-breadcrumb a:nth-child(9){z-index:10}#cms_toolbar .cms_modal .cms_modal-breadcrumb a:nth-child(10){z-index:1}#cms_toolbar .cms_modal .cms_modal-breadcrumb a span,#cms_toolbar .cms_modal .cms_modal-breadcrumb .cms_modal-breadcrumb-title{float:left;position:relative;z-index:120;color:#666}#cms_toolbar .cms_modal .cms_modal-breadcrumb .cms_modal-breadcrumb-title{padding:10px 20px 10px 15px;border-right:1px solid #e6e6e6;-moz-border-radius:0 0 0 5px;-webkit-border-radius:0;border-radius:0 0 0 5px;background:#fff url("../img/toolbar/sprite_toolbar.png") no-repeat -133px -84px;text-indent:-119988px;overflow:hidden;text-align:left;text-transform:capitalize}#cms_toolbar .cms_modal .cms_modal-breadcrumb a:hover{color:black;background-position:right -232px !important}#cms_toolbar .cms_modal .cms_modal-breadcrumb a:hover span{color:black;background-color:white}#cms_toolbar .cms_modal .cms_modal-breadcrumb .cms_modal-breadcrumb-last{cursor:default}#cms_toolbar .cms_modal .cms_modal-breadcrumb .cms_modal-breadcrumb-last span{color:#0eaaec}#cms_toolbar .cms_modal .cms_modal-buttons{position:absolute;right:0;top:0;z-index:101;float:right;padding:0 20px 0 10px;-moz-border-radius:0 0 5px 0;-webkit-border-radius:0;border-radius:0 0 5px 0;background:#fcfcfc}#cms_toolbar .cms_modal .cms_modal-buttons div{float:right;font-size:12px;cursor:pointer;padding:2px 10px;margin:3px 5px 3px 0}/*! 
* @copyright: https://github.com/divio/django-cms */#cms_toolbar .cms_structure{display:none;position:absolute;top:0;right:0;width:100%;height:100%;z-index:9999}#cms_toolbar .cms_structure .cms_structure-dimmer{display:none;position:fixed;top:0;right:0;bottom:0;left:0;width:100%;height:100%;z-index:10;background:rgba(255,255,255,0.95)}#cms_toolbar .cms_structure .cms_structure-content{position:absolute;left:0;top:0;z-index:100;width:100%;height:100%}#cms_toolbar .cms_structure .cms_dragarea{position:absolute;padding:5px 5px 4px;margin:0 0 5px;-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px;background:#454545;-moz-box-sizing:border-box;-webkit-box-sizing:border-box;box-sizing:border-box}#cms_toolbar .cms_structure .cms_dragarea-static{background:#454545 url("../img/toolbar/pattern.png")}#cms_toolbar .cms_structure .cms_dragbar{font-size:13px;line-height:20px;position:relative;left:0;top:0;z-index:9999;-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px}#cms_toolbar .cms_structure .cms_dragbar .cms_dragbar-title{font-size:12px;line-height:17px;text-transform:uppercase;font-weight:500;padding:0 0 0 15px;height:16px;cursor:pointer;color:white;text-shadow:0px 1px 0px #000}#cms_toolbar .cms_structure .cms_dragbar .cms_dragbar-title:before{content:" ";position:absolute;left:0;top:0;width:16px;height:15px;background:url("../img/toolbar/sprite_toolbar.png") no-repeat -85px -113px}#cms_toolbar .cms_structure .cms_dragbar .cms_dragbar-title:hover:before{background-position:-105px -113px}#cms_toolbar .cms_structure .cms_dragbar .cms_dragbar-title-expanded:before{background-position:-124px -114px}#cms_toolbar .cms_structure .cms_dragbar .cms_dragbar-title-expanded:hover:before{background-position:-144px -114px !important}#cms_toolbar .cms_structure .cms_dragbar-empty{font-size:11px;text-transform:uppercase;padding-top:0;padding-bottom:0}#cms_toolbar .cms_structure
.cms_dragbar-empty-wrapper{display:none}#cms_toolbar .cms_structure .cms_draggables{list-style-type:none;padding:0;margin:0}#cms_toolbar .cms_structure .cms_draggables .cms_draggables{display:none;min-height:25px;padding-left:6px}#cms_toolbar .cms_structure .cms_draggables .cms_draggables>.cms_draggable:first-child,#cms_toolbar .cms_structure .cms_draggables .cms_draggables>.cms_draggable:only-child,#cms_toolbar .cms_structure .cms_draggable>.cms_draggable{margin-top:0}#cms_toolbar .cms_structure .cms_draggables>.cms_draggable:last-child{margin-bottom:1px}#cms_toolbar .cms_structure .cms_draggables .cms_draggables>.cms_draggable:last-child{margin-bottom:2px}#cms_toolbar .cms_structure .cms_draggable,#cms_toolbar .cms_structure .cms_droppable{list-style-type:none;position:relative;left:0;top:0;z-index:99;-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px;padding:4px 5px 3px 5px;margin:5px 0 0;margin-left:0 !important}#cms_toolbar .cms_structure .cms_draggable .cms_draggable,#cms_toolbar .cms_structure .cms_droppable .cms_draggable{position:relative;z-index:99;white-space:nowrap;border-color:#e6e6e6;background:white}#cms_toolbar .cms_structure .cms_draggable .cms_draggable:hover,#cms_toolbar .cms_structure .cms_droppable .cms_draggable:hover{border-color:#a6a6a6}#cms_toolbar .cms_structure .cms_draggable .cms_draggable .cms_draggable,#cms_toolbar .cms_structure .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable,#cms_toolbar .cms_structure .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable,#cms_toolbar .cms_structure .cms_droppable .cms_draggable .cms_draggable,#cms_toolbar .cms_structure .cms_droppable .cms_draggable .cms_draggable .cms_draggable .cms_draggable,#cms_toolbar .cms_structure .cms_droppable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable{background:#fafafa}#cms_toolbar .cms_structure .cms_draggable .cms_draggable 
.cms_draggable .cms_draggable,#cms_toolbar .cms_structure .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable,#cms_toolbar .cms_structure .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable,#cms_toolbar .cms_structure .cms_droppable .cms_draggable .cms_draggable .cms_draggable,#cms_toolbar .cms_structure .cms_droppable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable,#cms_toolbar .cms_structure .cms_droppable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable{background:white}#cms_toolbar .cms_structure .cms_draggable .cms_submenu,#cms_toolbar .cms_structure .cms_droppable .cms_submenu{display:none;margin-top:2px}#cms_toolbar .cms_structure .cms_draggable .cms_submenu-dropdown,#cms_toolbar .cms_structure .cms_droppable .cms_submenu-dropdown{right:-6px;top:22px}#cms_toolbar .cms_structure .cms_draggable .cms_submenu-quicksearch,#cms_toolbar .cms_structure .cms_droppable .cms_submenu-quicksearch{right:-5px;top:-6px;-moz-border-radius:0;-webkit-border-radius:0;border-radius:0;height:28px;border-left:1px dotted #e6e6e6;background:#fafafa url("../img/toolbar/sprite_toolbar.png") no-repeat right -415px}#cms_toolbar .cms_structure .cms_draggable .cms_submenu-quicksearch input,#cms_toolbar .cms_structure .cms_droppable .cms_submenu-quicksearch input{color:black;margin-top:1px}#cms_toolbar .cms_structure .cms_draggable .cms_draggable .cms_submenu-quicksearch,#cms_toolbar .cms_structure .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_submenu-quicksearch,#cms_toolbar .cms_structure .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_submenu-quicksearch,#cms_toolbar .cms_structure .cms_droppable .cms_draggable .cms_submenu-quicksearch,#cms_toolbar .cms_structure .cms_droppable .cms_draggable .cms_draggable .cms_draggable 
.cms_submenu-quicksearch,#cms_toolbar .cms_structure .cms_droppable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_submenu-quicksearch{background-color:white}#cms_toolbar .cms_structure .cms_draggable .cms_draggable .cms_draggable .cms_submenu-quicksearch,#cms_toolbar .cms_structure .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_submenu-quicksearch,#cms_toolbar .cms_structure .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_submenu-quicksearch,#cms_toolbar .cms_structure .cms_droppable .cms_draggable .cms_draggable .cms_submenu-quicksearch,#cms_toolbar .cms_structure .cms_droppable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_submenu-quicksearch,#cms_toolbar .cms_structure .cms_droppable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_draggable .cms_submenu-quicksearch{background-color:#fafafa}#cms_toolbar .cms_structure .cms_draggable .cms_dragitem-text,#cms_toolbar .cms_structure .cms_droppable .cms_dragitem-text{display:inline-block;vertical-align:middle;*vertical-align:auto;*zoom:1;*display:inline;width:90%;height:21px;overflow:hidden}#cms_toolbar .cms_structure .cms_draggable{z-index:100;color:black;border:1px solid #fafafa;background:#fafafa}#cms_toolbar .cms_structure .cms_draggable:hover{-moz-box-shadow:inset 0px 0px 3px #e6e6e6;-webkit-box-shadow:inset 0px 0px 3px #e6e6e6;box-shadow:inset 0px 0px 3px #e6e6e6}#cms_toolbar .cms_structure .cms_droppable{-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px;color:#bfbfbf;border:1px dashed #bfbfbf}#cms_toolbar .cms_structure .cms_dragitem{cursor:move}#cms_toolbar .cms_structure .cms_dragitem-collapsable,#cms_toolbar .cms_structure .cms_dragitem-expanded{cursor:pointer;padding-left:15px}#cms_toolbar .cms_structure .cms_dragitem-collapsable{background:url("../img/toolbar/sprite_toolbar.png") no-repeat 1px -359px}#cms_toolbar 
.cms_structure .cms_dragitem-expanded{background:url("../img/toolbar/sprite_toolbar.png") no-repeat 0 -389px}#cms_toolbar .cms_structure .cms_dragitem-success{position:absolute;left:-1px;top:-1px;-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px;width:100%;height:100%;filter:progid:DXImageTransform.Microsoft.Alpha(Opacity=60);opacity:0.6}#cms_toolbar .cms_structure .cms_draggable-selected .cms_dragitem,#cms_toolbar .cms_structure .cms_draggable-selected .cms_dragitem strong{color:#0e72ec}#cms_toolbar .cms_structure .cms_draggable-selected .cms_draggable .cms_dragitem,#cms_toolbar .cms_structure .cms_draggable-selected .cms_draggable .cms_dragitem strong{color:black}#cms_toolbar .cms_structure .cms_draggable-allowed,#cms_toolbar .cms_structure .cms_draggable-hover-allowed,#cms_toolbar .cms_structure .cms_draggable-placeholder{color:#cce6b3;border-color:#cce6b3}#cms_toolbar .cms_structure .cms_draggable-hover-allowed,#cms_toolbar .cms_structure .cms_draggable-placeholder{color:white;background:rgba(102,153,51,0.2)}#cms_toolbar .cms_structure .cms_dragitem-success{border:1px solid #cce6b3;background:#cce6b3}#cms_toolbar .cms_structure .cms_draggable-disallowed,#cms_toolbar .cms_structure .cms_draggable-hover-disallowed{color:red;border:1px dashed red;background:rgba(255,0,0,0.1)}#cms_toolbar .cms_structure .cms_draggable-disabled>.cms_dragitem-collapsable{background:none !important;padding-left:0}#cms_toolbar .cms_structure .cms_draggable-disabled .cms_draggables{display:none !important}body>.cms_draggable{list-style-type:none;white-space:nowrap;-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px;padding:4px 5px 3px 5px;margin:0;border-color:#e6e6e6;background:white}body>.cms_draggable .cms_switcher{display:none !important}body>.cms_draggable .cms_submenu{display:none !important}body>.cms_draggable .cms_draggables{display:none !important}/*! 
* @copyright: https://github.com/divio/django-cms - */#cms_toolbar .cms_submenu{display:block;width:20px;height:15px;cursor:pointer;position:absolute;right:5px;background:url("../img/toolbar/sprite_toolbar.png") no-repeat 3px -152px}#cms_toolbar .cms_submenu-lang{padding:0 5px;position:absolute;top:3px;right:3px;border:1px solid #e6e6e6;background:white;-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px}#cms_toolbar .cms_submenu-dropdown{display:none;zoom:1;position:absolute;right:0;top:20px;z-index:999;min-width:140px;max-height:230px;overflow:auto;border:1px solid #e6e6e6;background:white;-moz-box-shadow:0 1px 1px rgba(0,0,0,0.1);-webkit-box-shadow:0 1px 1px rgba(0,0,0,0.1);box-shadow:0 1px 1px rgba(0,0,0,0.1)}#cms_toolbar .cms_submenu-dropdown::-webkit-scrollbar{-webkit-appearance:none;width:7px;background:#e6e6e6}#cms_toolbar .cms_submenu-dropdown::-webkit-scrollbar-thumb{background-color:#454545;border-left:1px solid #e6e6e6;-moz-box-shadow:0 0 1px rgba(255,255,255,0.5);-webkit-box-shadow:0 0 1px rgba(255,255,255,0.5);box-shadow:0 0 1px rgba(255,255,255,0.5)}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item{zoom:1}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item a,#cms_toolbar .cms_submenu-dropdown span{display:block;font-size:12px;line-height:15px;text-align:left;padding:4px 8px 3px 8px}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item a{color:black}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item a:hover,#cms_toolbar .cms_submenu-dropdown .cms_submenu-item 
a:focus{color:white;background:#0e72ec;background-image:url('data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4gPHN2ZyB2ZXJzaW9uPSIxLjEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+PGRlZnM+PGxpbmVhckdyYWRpZW50IGlkPSJncmFkIiBncmFkaWVudFVuaXRzPSJvYmplY3RCb3VuZGluZ0JveCIgeDE9IjAuNSIgeTE9IjAuMCIgeDI9IjAuNSIgeTI9IjEuMCI+PHN0b3Agb2Zmc2V0PSIwJSIgc3RvcC1jb2xvcj0iIzBlOTdlYyIvPjxzdG9wIG9mZnNldD0iMTAwJSIgc3RvcC1jb2xvcj0iIzBlNzJlYyIvPjwvbGluZWFyR3JhZGllbnQ+PC9kZWZzPjxyZWN0IHg9IjAiIHk9IjAiIHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InVybCgjZ3JhZCkiIC8+PC9zdmc+IA==');background-size:100%;background-image:-webkit-gradient(linear, 50% 0%, 50% 100%, color-stop(0%, #0e97ec),color-stop(100%, #0e72ec));background-image:-moz-linear-gradient(top, #0e97ec,#0e72ec);background-image:-webkit-linear-gradient(top, #0e97ec,#0e72ec);background-image:linear-gradient(to bottom, #0e97ec,#0e72ec)}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item a:first-child{border-top:none}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item span{cursor:default;font-weight:bold;color:black;border-top:1px solid #a6a6a6;border-bottom:1px solid #e6e6e6}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item:first-child span{border-top:none}#cms_toolbar .cms_submenu-quicksearch{display:none;position:absolute;right:-5px;top:-5px;z-index:1000;cursor:default;text-align:right;height:25px;-moz-border-radius:4px;-webkit-border-radius:4px;border-radius:4px;background:#454545 url("../img/toolbar/sprite_toolbar.png") no-repeat right -326px}#cms_toolbar .cms_submenu-quicksearch label{cursor:pointer}#cms_toolbar .cms_submenu-quicksearch input{display:block;font-size:12px;color:white;text-align:right;-webkit-appearance:none;width:109px;height:20px;padding:3px 1px 1px 5px;margin-right:25px;border:none;background:none}#cms_toolbar .cms_submenu-scroll-hint{display:none;color:#a6a6a6;font-size:12px;line-height:1;text-align:center;position:absolute;left:0;bottom:0;width:100%;padding:5px 0 
4px;background-color:#e6e6e6}@media print, (-o-min-device-pixel-ratio: 5 / 4), (-webkit-min-device-pixel-ratio: 1.25), (min-resolution: 1.25dppx){.cms_toolbar-item-navigation-children>a span,.cms_sideframe-btn div,.cms_clipboard ul a,.cms_clipboard-empty a,.cms_messages .cms_messages-close,.cms_modal-collapse,.cms_modal-close,.cms_modal-maximize,.cms_modal-resize,.cms_modal-breadcrumb a,.cms_modal-breadcrumb-title,.cms_toolbar-item-logo a,.cms_toolbar-trigger a,.cms_tooltip,.cms_placeholders-menu,.cms_toolbar-debug .cms_debug-bar{background-image:url("../img/toolbar/sprite_toolbar@2x.png") !important;background-size:190px !important}#cms_toolbar .cms_loader{background-image:url("../img/loader@2x.gif") !important;background-size:32px !important}.cms_submenu,.cms_submenu-quicksearch,.cms_placeholder-title:before,.cms_placeholder .cms_dragitem-collapsable,.cms_placeholder .cms_dragitem-collapsed{background-image:url("../img/toolbar/sprite_toolbar@2x.png") !important;background-size:190px !important}} + */#cms_toolbar .cms_submenu{display:block;width:20px;height:15px;cursor:pointer;position:absolute;right:5px;background:url("../img/toolbar/sprite_toolbar.png") no-repeat 3px -152px}#cms_toolbar .cms_submenu-lang{padding:0 5px;position:absolute;top:3px;right:3px;border:1px solid #e6e6e6;background:white;-moz-border-radius:3px;-webkit-border-radius:3px;border-radius:3px}#cms_toolbar .cms_submenu-dropdown{display:none;zoom:1;position:absolute;right:0;top:20px;z-index:999;min-width:140px;max-height:230px;overflow:auto;border:1px solid #e6e6e6;background:white;-moz-box-shadow:0 1px 1px rgba(0,0,0,0.1);-webkit-box-shadow:0 1px 1px rgba(0,0,0,0.1);box-shadow:0 1px 1px rgba(0,0,0,0.1)}#cms_toolbar .cms_submenu-dropdown::-webkit-scrollbar{-webkit-appearance:none;width:7px;background:#e6e6e6}#cms_toolbar .cms_submenu-dropdown::-webkit-scrollbar-thumb{background-color:#454545;border-left:1px solid #e6e6e6;-moz-box-shadow:0 0 1px rgba(255,255,255,0.5);-webkit-box-shadow:0 0 1px 
rgba(255,255,255,0.5);box-shadow:0 0 1px rgba(255,255,255,0.5)}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item{zoom:1}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item a,#cms_toolbar .cms_submenu-dropdown span{display:block;font-size:12px;line-height:15px;text-align:left;padding:4px 8px 3px 8px}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item a{color:black}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item a:hover,#cms_toolbar .cms_submenu-dropdown .cms_submenu-item a:focus{color:white;background:#0e72ec;background-image:-moz-linear-gradient(top, #0e97ec,#0e72ec);background-image:-webkit-linear-gradient(top, #0e97ec,#0e72ec);background-image:linear-gradient(to bottom, #0e97ec,#0e72ec)}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item a:first-child{border-top:none}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item span{cursor:default;font-weight:bold;color:black;border-top:1px solid #a6a6a6;border-bottom:1px solid #e6e6e6}#cms_toolbar .cms_submenu-dropdown .cms_submenu-item:first-child span{border-top:none}#cms_toolbar .cms_submenu-quicksearch{display:none;position:absolute;right:-5px;top:-5px;z-index:1000;cursor:default;text-align:right;height:25px;-moz-border-radius:4px;-webkit-border-radius:4px;border-radius:4px;background:#454545 url("../img/toolbar/sprite_toolbar.png") no-repeat right -326px}#cms_toolbar .cms_submenu-quicksearch label{cursor:pointer}#cms_toolbar .cms_submenu-quicksearch input{display:block;font-size:12px;color:white;text-align:right;-webkit-appearance:none;width:109px;height:20px;padding:3px 1px 1px 5px;margin-right:25px;border:none;background:none}#cms_toolbar .cms_submenu-scroll-hint{display:none;color:#a6a6a6;font-size:12px;line-height:1;text-align:center;position:absolute;left:0;bottom:0;width:100%;padding:5px 0 4px;background-color:#e6e6e6}@media print, (-o-min-device-pixel-ratio: 5 / 4), (-webkit-min-device-pixel-ratio: 1.25), (min-resolution: 1.25dppx){.cms_toolbar-item-navigation-children>a span,.cms_sideframe-btn 
div,.cms_clipboard ul a,.cms_clipboard-empty a,.cms_messages .cms_messages-close,.cms_modal-collapse,.cms_modal-close,.cms_modal-maximize,.cms_modal-resize,.cms_modal-breadcrumb a,.cms_modal-breadcrumb-title,.cms_toolbar-item-logo a,.cms_toolbar-trigger a,.cms_tooltip,.cms_placeholders-menu,.cms_toolbar-debug .cms_debug-bar{background-image:url("../img/toolbar/sprite_toolbar@2x.png") !important;background-size:190px !important}#cms_toolbar .cms_loader{background-image:url("../img/loader@2x.gif") !important;background-size:32px !important}.cms_submenu,.cms_submenu-quicksearch,.cms_placeholder-title:before,.cms_placeholder .cms_dragitem-collapsable,.cms_placeholder .cms_dragitem-collapsed{background-image:url("../img/toolbar/sprite_toolbar@2x.png") !important;background-size:190px !important}} diff --git a/cms/static/cms/sass/includes/_toolbar.scss b/cms/static/cms/sass/includes/_toolbar.scss index af3647aaf67..c5b55d5aebd 100644 --- a/cms/static/cms/sass/includes/_toolbar.scss +++ b/cms/static/cms/sass/includes/_toolbar.scss @@ -180,45 +180,7 @@ position:absolute; left:0; top:5px; z-index:9999999; width:100%; .cms_toolbar-loader a { background:#fcfcfc url('../img/loader.gif') no-repeat center center !important; background-size:20px 20px !important; } -// #TOOLBAR/elements/switch# -.cms_toolbar-item_switch { position:relative; left:0; top:0; margin:4px 0 4px; @include border-radius(20px); - border-top:1px solid $color-grey-10; background:darken($color-grey-5, 5%); - @include box-shadow(inset $color-grey-10 0px 1px 0px); - &:hover, &:active, &:focus { background-color:$color-grey-10; } - - a { float:left; position:relative; z-index:100; font-size:11px; line-height:11px; - text-transform:uppercase; letter-spacing:1px; padding:6px 14px 4px 28px; - color:black; @include text-shadow(0 1px 0 white); } - .cms_toolbar-item_switch-knob { - float:left; position:absolute; left:2px; top:1px; z-index:99; width:16px; height:16px; - @include border-radius(16px); @include 
hide-text(); - - border:1px solid black; background:$color-grey-70; - @include box-shadow(inset 0 1px 0 lighten($color-grey, 20%)); - @include background-image($gradient-dark); - } - .cms_toolbar-item_switch-on { display:none; position:relative; top:-1px; } - .cms_toolbar-item_switch-off { display:inline; position:relative; top:-1px; } -} -.cms_toolbar-item_switch-active { - a { padding:6px 28px 4px 14px; color:$color-green; } - .cms_toolbar-item_switch-knob { left:auto; right:2px; - border:1px solid lighten($color-green, 10%); background:lighten($color-green, 10%); - @include box-shadow(inset 0 1px 0 lighten($color-green, 30%)); - @include background-image($gradient-green); } - .cms_toolbar-item_switch-on { display:inline; } - .cms_toolbar-item_switch-off { display:none; } -} -// highlight -.cms_toolbar-item_switch-highlight { - a { color:$color-blue; } - .cms_toolbar-item_switch-knob { - border:1px solid darken($color-blue, 10%); background:lighten($color-blue, 10%); - @include box-shadow(inset 0 1px 0 lighten($color-blue, 20%)); - @include background-image($gradient-blue); - } -} - +// TODO Reimplement blinking if unpublished content is present //################################################################################################################## // #TOOLBAR/blocker# .cms_screenblock { color:white; text-align:center; @@ -234,4 +196,4 @@ position:absolute; left:0; top:5px; z-index:9999999; width:100%; } // end of toolbar -} \ No newline at end of file +} diff --git a/cms/templates/cms/toolbar/items/live_draft.html b/cms/templates/cms/toolbar/items/live_draft.html index 6d731f02751..19e27f12ad9 100644 --- a/cms/templates/cms/toolbar/items/live_draft.html +++ b/cms/templates/cms/toolbar/items/live_draft.html @@ -1,8 +1,13 @@ -{% load i18n %} -<div class="cms_toolbar-item cms_toolbar-item_switch{% if not request.toolbar.edit_mode %} cms_toolbar-item_switch-active{% endif %}"> - <a href="{% if request.toolbar.edit_mode %}?{{ 
request.toolbar.edit_mode_url_off }}{% else %}?{{ request.toolbar.edit_mode_url_on }}{% endif %}"> - <span class="cms_toolbar-item_switch-on">{% trans "Live" %}</span> - <span class="cms_toolbar-item_switch-off">{% trans "Draft" %}</span> +{% load i18n %}{% spaceless %} +<div class="cms_toolbar-item cms_toolbar-item-buttons cms_toolbar-item_switch_save-edit"> + {% if request.toolbar.edit_mode %} + <a class="cms_btn cms_btn-switch-save" href="?{{ request.toolbar.edit_mode_url_off }}"> + {% trans "Save and close" %} </a> - <span class="cms_toolbar-item_switch-knob">{% trans "Change" %}</span> + {% else %} + <a class="cms_btn cms_btn-active cms_btn-switch-edit" href="?{{ request.toolbar.edit_mode_url_on }}"> + {% trans "Edit" %} + </a> + {% endif %} </div> +{% endspaceless %} diff --git a/cms/templates/cms/toolbar/toolbar.html b/cms/templates/cms/toolbar/toolbar.html index 4133350bede..904936b3903 100644 --- a/cms/templates/cms/toolbar/toolbar.html +++ b/cms/templates/cms/toolbar/toolbar.html @@ -27,15 +27,6 @@ {% for item in request.toolbar.get_right_items %} {{ item.render }} {% endfor %} - {% if request.toolbar.can_change %} - <div class="cms_toolbar-item cms_toolbar-item_switch{% if not request.toolbar.edit_mode %} cms_toolbar-item_switch-active{% endif %}"> - <a href="{% if request.toolbar.edit_mode %}?{{ request.toolbar.edit_mode_url_off }}{% else %}?{{ request.toolbar.edit_mode_url_on }}{% endif %}"> - <span class="cms_toolbar-item_switch-on">{% trans "Live" %}</span> - <span class="cms_toolbar-item_switch-off">{% trans "Draft" %}</span> - </a> - <span class="cms_toolbar-item_switch-knob">{% trans "Change" %}</span> - </div> - {% endif %} </div> </div> <div class="cms_toolbar-trigger"><a href="#">{% trans "Toggle toolbar" %}</a></div> diff --git a/cms/tests/views.py b/cms/tests/views.py index dd6cc3193e6..932f70845a9 100644 --- a/cms/tests/views.py +++ b/cms/tests/views.py @@ -149,13 +149,13 @@ def test_edit_permission(self): page = create_page("page", 
"nav_playground.html", "en", published=True) # Anon user response = self.client.get("/en/?%s" % get_cms_setting('CMS_TOOLBAR_URL__EDIT_ON')) - self.assertNotContains(response, "cms_toolbar-item_switch", 200) + self.assertNotContains(response, "cms_toolbar-item_switch_save-edit", 200) # Superuser user = self.get_superuser() with self.login_user_context(user): response = self.client.get("/en/?%s" % get_cms_setting('CMS_TOOLBAR_URL__EDIT_ON')) - self.assertContains(response, "cms_toolbar-item_switch", 4, 200) + self.assertContains(response, "cms_toolbar-item_switch_save-edit", 1, 200) # Admin but with no permission user = self.get_staff_user_with_no_permissions() @@ -163,12 +163,12 @@ def test_edit_permission(self): with self.login_user_context(user): response = self.client.get("/en/?%s" % get_cms_setting('CMS_TOOLBAR_URL__EDIT_ON')) - self.assertNotContains(response, "cms_toolbar-item_switch", 200) + self.assertNotContains(response, "cms_toolbar-item_switch_save-edit", 200) PagePermission.objects.create(can_change=True, user=user, page=page) with self.login_user_context(user): response = self.client.get("/en/?%s" % get_cms_setting('CMS_TOOLBAR_URL__EDIT_ON')) - self.assertContains(response, "cms_toolbar-item_switch", 4, 200) + self.assertContains(response, "cms_toolbar-item_switch_save-edit", 1, 200) @override_settings(ROOT_URLCONF='cms.test_utils.project.urls')
scikit-image__scikit-image-3790
0.14.2 test suite fails with `NameError: global name 'osp'` ## Description The test suite does not pass. As far as I know `osp` is a common alias for `os.path`. Is this a typo in the code? Or related to the base python version? ## Way to reproduce ```python pytest -vv ``` ## Version information ```python 2.7.16 (default, Mar 4 2019, 19:30:43) [GCC 8.2.0] Linux-4.20.2-gentoo-x86_64-Intel-R-_Core-TM-_i7-8550U_CPU_@_1.80GHz-with-gentoo-2.6 scikit-image version: 0.14.2 numpy version: 1.16.1 ``` OR ```python 3.6.8 (default, Mar 4 2019, 19:32:41) [GCC 8.2.0] Linux-4.20.2-gentoo-x86_64-Intel-R-_Core-TM-_i7-8550U_CPU_@_1.80GHz-with-gentoo-2.6 scikit-image version: 0.14.2 numpy version: 1.16.1 ``` ## My output [build.log](https://github.com/scikit-image/scikit-image/files/2937545/build.log)
[ { "content": "\"\"\"Image Processing SciKit (Toolbox for SciPy)\n\n``scikit-image`` (a.k.a. ``skimage``) is a collection of algorithms for image\nprocessing and computer vision.\n\nThe main package of ``skimage`` only provides a few utilities for converting\nbetween image data types; for most features, you need...
[ { "content": "\"\"\"Image Processing SciKit (Toolbox for SciPy)\n\n``scikit-image`` (a.k.a. ``skimage``) is a collection of algorithms for image\nprocessing and computer vision.\n\nThe main package of ``skimage`` only provides a few utilities for converting\nbetween image data types; for most features, you need...
diff --git a/skimage/__init__.py b/skimage/__init__.py index 52c8a3af409..d77fd21e9a0 100644 --- a/skimage/__init__.py +++ b/skimage/__init__.py @@ -135,6 +135,7 @@ def _test(doctest=False, verbose=False): def _raise_build_error(e): # Raise a comprehensible error + import os.path as osp local_dir = osp.split(__file__)[0] msg = _STANDARD_MSG if local_dir == "skimage":
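The one-line fix above imports `os.path as osp` locally inside `_raise_build_error`, which previously reached `osp.split(__file__)` with no `osp` in scope. A simplified mirror of the repaired helper (the `package_file` parameter and message strings are illustrative additions, not the real `skimage/__init__.py` code) shows the intended behaviour:

```python
import os.path as osp  # the missing import that caused the NameError

def _raise_build_error(e, package_file="skimage/__init__.py"):
    # Simplified mirror of the fixed helper: resolve the directory the
    # package was imported from, then raise a readable ImportError that
    # wraps the original build failure ``e`` instead of masking it.
    local_dir = osp.split(package_file)[0]
    msg = "scikit-image could not be imported from %r" % local_dir
    raise ImportError(msg + "\ncaused by: " + str(e))

try:
    _raise_build_error(ValueError("boom"))
except ImportError as exc:
    caught = str(exc)
```

Before the patch, any real build error was hidden behind the `NameError: global name 'osp'` seen in the test suite; with the local import, the wrapped cause survives.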
pytorch__vision-2933
Change default value of eps in FrozenBatchNorm to match BatchNorm ## ❓ Questions and Help Hello Loss is nan error occurs when I learn fast rcnn with resnext101 backbone My code is as follows ```python backbone = resnet_fpn_backbone('resnext101_32x8d', pretrained=True) model = FasterRCNN(backbone, num_classes) in_features = model.roi_heads.box_predictor.cls_score.in_features model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes) ``` error message ``` Epoch: [0] [ 0/7208] eta: 1:27:42 lr: 0.000040 loss: 40613806080.0000 (40613806080.0000) loss_box_reg: 7979147264.0000 (7979147264.0000) loss_classifier: 11993160704.0000 (11993160704.0000) loss_objectness: 9486380032.0000 (9486380032.0000) loss_rpn_box_reg: 11155118080.0000 (11155118080.0000) time: 0.7301 data: 0.4106 max mem: 1241 Loss is nan, stopping training ``` When I change the backbone to resnet50 and resnet152, no error occurs. ### Please note that this issue tracker is not a help form and this issue will be closed. We have a set of [listed resources available on the website](https://pytorch.org/resources). Our primary means of support is our discussion forum: - [Discussion Forum](https://discuss.pytorch.org/)
[ { "content": "\"\"\"\nhelper class that supports empty tensors on some nn functions.\n\nIdeally, add support directly in PyTorch to empty tensors in\nthose functions.\n\nThis can be removed once https://github.com/pytorch/pytorch/issues/12013\nis implemented\n\"\"\"\n\nimport warnings\nimport torch\nfrom torch ...
[ { "content": "\"\"\"\nhelper class that supports empty tensors on some nn functions.\n\nIdeally, add support directly in PyTorch to empty tensors in\nthose functions.\n\nThis can be removed once https://github.com/pytorch/pytorch/issues/12013\nis implemented\n\"\"\"\n\nimport warnings\nimport torch\nfrom torch ...
diff --git a/test/test_models.py b/test/test_models.py index acff816852b..b37fb176a2b 100644 --- a/test/test_models.py +++ b/test/test_models.py @@ -6,9 +6,10 @@ import numpy as np from torchvision import models import unittest -import traceback import random +from torchvision.ops.misc import FrozenBatchNorm2d + def set_rng_seed(seed): torch.manual_seed(seed) @@ -149,6 +150,10 @@ def _test_detection_model(self, name, dev): if "retinanet" in name: kwargs["score_thresh"] = 0.013 model = models.detection.__dict__[name](num_classes=50, pretrained_backbone=False, **kwargs) + if "keypointrcnn" in name or "retinanet" in name: + for module in model.modules(): + if isinstance(module, FrozenBatchNorm2d): + module.eps = 0 model.eval().to(device=dev) input_shape = (3, 300, 300) # RNG always on CPU, to ensure x in cuda tests is bitwise identical to x in cpu tests diff --git a/test/test_ops.py b/test/test_ops.py index 7c13de4dedc..79294ed173e 100644 --- a/test/test_ops.py +++ b/test/test_ops.py @@ -623,10 +623,10 @@ def test_frozenbatchnorm2d_eps(self): running_var=torch.rand(sample_size[1]), num_batches_tracked=torch.tensor(100)) - # Check that default eps is zero for backward-compatibility + # Check that default eps is equal to the one of BN fbn = ops.misc.FrozenBatchNorm2d(sample_size[1]) fbn.load_state_dict(state_dict, strict=False) - bn = torch.nn.BatchNorm2d(sample_size[1], eps=0).eval() + bn = torch.nn.BatchNorm2d(sample_size[1]).eval() bn.load_state_dict(state_dict) # Difference is expected to fall in an acceptable range self.assertTrue(torch.allclose(fbn(x), bn(x), atol=1e-6)) diff --git a/torchvision/ops/misc.py b/torchvision/ops/misc.py index 3b52c0d8c4d..3e9f13c9daf 100644 --- a/torchvision/ops/misc.py +++ b/torchvision/ops/misc.py @@ -51,7 +51,7 @@ class FrozenBatchNorm2d(torch.nn.Module): def __init__( self, num_features: int, - eps: float = 0., + eps: float = 1e-5, n: Optional[int] = None, ): # n=None for backward-compatibility
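The diff changes `FrozenBatchNorm2d`'s default `eps` from `0.` to `1e-5`, BatchNorm's default, and updates the tests to match. A torch-free sketch (the function name is illustrative) of the folded scale factor shows why `eps=0` explodes on a channel whose running variance collapsed during pretraining, matching the enormous losses in the report:

```python
import math

def frozen_bn_scale(weight, running_var, eps):
    # FrozenBatchNorm folds its fixed statistics into an affine transform:
    #   y = (x - running_mean) * weight / sqrt(running_var + eps) + bias
    # so this multiplier is applied to every activation in the channel.
    return weight / math.sqrt(running_var + eps)

tiny_var = 1e-12                                      # a collapsed channel
exploding = frozen_bn_scale(1.0, tiny_var, eps=0.0)   # ~1e6 multiplier
bounded = frozen_bn_scale(1.0, tiny_var, eps=1e-5)    # capped near 1/sqrt(eps)
```

With `eps=0` the multiplier is unbounded as the variance shrinks; with `1e-5` it can never exceed roughly `1/sqrt(1e-5) ≈ 316`, which is why the NaN only appeared with some pretrained backbones.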
OCA__server-tools-464
runbot 9.0 red due to letsencrypt? Hi, It seems the 9.0 branch is red on runbot due to the letsencrypt module? ``` Call of self.pool.get('letsencrypt').cron(cr, uid, *()) failed in Job 2 Traceback (most recent call last): File "/srv/openerp/instances/openerp-oca-runbot/parts/odoo-extra/runbot/static/build/3148182-9-0-209efa/openerp/addons/base/ir/ir_cron.py", line 129, in _callback getattr(model, method_name)(cr, uid, *args) File "/srv/openerp/instances/openerp-oca-runbot/parts/odoo-extra/runbot/static/build/3148182-9-0-209efa/openerp/api.py", line 250, in wrapper return old_api(self, *args, **kwargs) File "/srv/openerp/instances/openerp-oca-runbot/parts/odoo-extra/runbot/static/build/3148182-9-0-209efa/openerp/api.py", line 354, in old_api result = method(recs, *args, **kwargs) File "/srv/openerp/instances/openerp-oca-runbot/parts/odoo-extra/runbot/static/build/3148182-9-0-209efa/openerp/addons/letsencrypt/models/letsencrypt.py", line 151, in cron account_key, csr, acme_challenge, log=_logger, CA=DEFAULT_CA) File "/srv/openerp/instances/openerp-oca-runbot/sandbox/local/lib/python2.7/site-packages/acme_tiny.py", line 104, in get_crt raise ValueError("Error requesting challenges: {0} {1}".format(code, result)) ValueError: Error requesting challenges: 400 { "type": "urn:acme:error:malformed", "detail": "Error creating new authz :: Invalid character in DNS name", "status": 400 } ``` @hbrunn
[ { "content": "# -*- coding: utf-8 -*-\n# © 2016 Therp BV <http://therp.nl>\n# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).\n{\n \"name\": \"Let's encrypt\",\n \"version\": \"9.0.1.0.0\",\n \"author\": \"Therp BV,\"\n \"Tecnativa,\"\n \"Odoo Community Asso...
[ { "content": "# -*- coding: utf-8 -*-\n# © 2016 Therp BV <http://therp.nl>\n# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).\n{\n \"name\": \"Let's encrypt\",\n \"version\": \"9.0.1.0.0\",\n \"author\": \"Therp BV,\"\n \"Tecnativa,\"\n \"Odoo Community Asso...
diff --git a/auth_supplier/security/auth_supplier_security.xml b/auth_supplier/security/auth_supplier_security.xml index 93293bee2f6..0108e381f29 100644 --- a/auth_supplier/security/auth_supplier_security.xml +++ b/auth_supplier/security/auth_supplier_security.xml @@ -4,7 +4,6 @@ <record id="group_auth_supplier" model="res.groups"> <field name="name">Supplier Portal</field> <field name="category_id" ref="base.module_category_extra"/> - <field name="is_portal" eval="True"/> </record> </odoo> diff --git a/letsencrypt/__openerp__.py b/letsencrypt/__openerp__.py index 01457b8073a..626b17e12b2 100644 --- a/letsencrypt/__openerp__.py +++ b/letsencrypt/__openerp__.py @@ -16,6 +16,7 @@ "data": [ "data/ir_config_parameter.xml", "data/ir_cron.xml", + "demo/ir_cron.xml", ], "post_init_hook": 'post_init_hook', "installable": True, diff --git a/letsencrypt/demo/ir_cron.xml b/letsencrypt/demo/ir_cron.xml new file mode 100644 index 00000000000..e4451aa5946 --- /dev/null +++ b/letsencrypt/demo/ir_cron.xml @@ -0,0 +1,8 @@ +<?xml version="1.0" encoding="UTF-8"?> +<openerp> + <data> + <record id="cronjob" model="ir.cron"> + <field name="active" eval="False" /> + </record> + </data> +</openerp>
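The merged fix ships demo data that deactivates the cron (`active=False`), so runbot's synthetic build hostnames never reach ACME. As a purely illustrative alternative (the regex and function name are assumptions, not module code), a hostname guard could let a cron skip invalid names instead of crashing on the 400 response shown above:

```python
import re

# One DNS label: 1-63 chars, alphanumeric or hyphen, no leading/trailing hyphen.
_LABEL = r"(?!-)[A-Za-z0-9-]{1,63}(?<!-)"
_DNS_NAME = re.compile(r"^%s(\.%s)*$" % (_LABEL, _LABEL))

def is_acme_safe_name(domain):
    # Let's Encrypt rejects names containing characters such as ``_`` with
    # "Invalid character in DNS name"; validating first would let the cron
    # log and skip rather than raise ValueError from acme_tiny.
    return len(domain) <= 253 and bool(_DNS_NAME.match(domain))
```

Runbot build hosts (which embed underscores and other oddities) would fail this check, while any real deployment domain would pass.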
scrapy__scrapy-4503
Fix the hoverxref configuration > You shouldn't override hoverxref_version and hoverxref_project since they are taken automatically from Read the Docs. > > If you want to avoid your CI failing because of this, you can define the environment variables as Read the Docs does: > > READTHEDOCS_PROJECT=scrapy > READTHEDOCS_VERSION='' > > With the current configuration, all the versions built on Read the Docs will point to a different version on Read the Docs and this will conflict. For example, current master version in Read the Docs defines hoverxref_version='2.0.0' but that version does not exist on Read the Docs and the tooltip does not known where to get the content from. @humitos at https://github.com/scrapy/scrapy/pull/4480#discussion_r409026912
[ { "content": "# -*- coding: utf-8 -*-\n#\n# Scrapy documentation build configuration file, created by\n# sphinx-quickstart on Mon Nov 24 12:02:52 2008.\n#\n# This file is execfile()d with the current directory set to its containing dir.\n#\n# The contents of this file are pickled, so don't put values in the nam...
[ { "content": "# -*- coding: utf-8 -*-\n#\n# Scrapy documentation build configuration file, created by\n# sphinx-quickstart on Mon Nov 24 12:02:52 2008.\n#\n# This file is execfile()d with the current directory set to its containing dir.\n#\n# The contents of this file are pickled, so don't put values in the nam...
diff --git a/docs/conf.py b/docs/conf.py index 4414ef6371a..813417bae17 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -295,8 +295,6 @@ # ------------------------------------ hoverxref_auto_ref = True -hoverxref_project = "scrapy" -hoverxref_version = release hoverxref_role_types = { "class": "tooltip", "confval": "tooltip", diff --git a/tox.ini b/tox.ini index b1babc7fd63..cd118c921d0 100644 --- a/tox.ini +++ b/tox.ini @@ -74,11 +74,15 @@ deps = changedir = docs deps = -rdocs/requirements.txt +setenv = + READTHEDOCS_PROJECT=scrapy + READTHEDOCS_VERSION=master [testenv:docs] basepython = python3 changedir = {[docs]changedir} deps = {[docs]deps} +setenv = {[docs]setenv} commands = sphinx-build -W -b html . {envtmpdir}/html @@ -86,6 +90,7 @@ commands = basepython = python3 changedir = {[docs]changedir} deps = {[docs]deps} +setenv = {[docs]setenv} commands = sphinx-build -b coverage . {envtmpdir}/coverage @@ -93,6 +98,7 @@ commands = basepython = python3 changedir = {[docs]changedir} deps = {[docs]deps} +setenv = {[docs]setenv} commands = sphinx-build -W -b linkcheck . {envtmpdir}/linkcheck
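The tox changes mimic what Read the Docs injects at build time; the real `conf.py` simply stops setting `hoverxref_project`/`hoverxref_version` so the extension derives them from the environment. A small sketch (the helper name and fallback values are illustrative) of that resolution logic:

```python
import os

def hoverxref_identity(environ):
    # hoverxref reads the project/version Read the Docs exports instead of
    # hardcoded conf.py values; CI supplies the same variables so tooltips
    # resolve to a version that actually exists, not e.g. "2.0.0".
    return (
        environ.get("READTHEDOCS_PROJECT", "scrapy"),
        environ.get("READTHEDOCS_VERSION", "master"),
    )

# Real builds pass os.environ; a CI run defines the variables explicitly.
project, version = hoverxref_identity(os.environ)
ci_project, ci_version = hoverxref_identity(
    {"READTHEDOCS_PROJECT": "scrapy", "READTHEDOCS_VERSION": "master"}
)
```

The key point is that both the hosted build and CI now agree on where tooltip content lives, instead of `conf.py` pinning a release that Read the Docs never built.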
pallets__click-2187
click.echo is improperly typed I'm getting a repeat of #2174: although click.secho has been fixed, pyright is continuing to complain about the type annotation for click.echo. #2175 only fixes the problem for click.secho; I think the same should be done for click.echo? (Running with 1c588834)
[ { "content": "import os\nimport sys\nimport typing as t\nfrom functools import update_wrapper\nfrom types import ModuleType\n\nfrom ._compat import _default_text_stderr\nfrom ._compat import _default_text_stdout\nfrom ._compat import _find_binary_writer\nfrom ._compat import auto_wrap_for_ansi\nfrom ._compat im...
[ { "content": "import os\nimport sys\nimport typing as t\nfrom functools import update_wrapper\nfrom types import ModuleType\n\nfrom ._compat import _default_text_stderr\nfrom ._compat import _default_text_stdout\nfrom ._compat import _find_binary_writer\nfrom ._compat import auto_wrap_for_ansi\nfrom ._compat im...
diff --git a/CHANGES.rst b/CHANGES.rst index 6c9a79327..d02c3e952 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -14,7 +14,8 @@ Unreleased - Fix a typo in the Bash completion script that affected file and directory completion. If this script was generated by a previous version, it should be regenerated. :issue:`2163` -- Fix typing for ``secho`` file argument. :issue:`2174` +- Fix typing for ``echo`` and ``secho`` file argument. + :issue:`2174, 2185` Version 8.0.3 diff --git a/src/click/utils.py b/src/click/utils.py index 051cf7009..8dd3a00c7 100644 --- a/src/click/utils.py +++ b/src/click/utils.py @@ -203,7 +203,7 @@ def __iter__(self) -> t.Iterator[t.AnyStr]: def echo( message: t.Optional[t.Any] = None, - file: t.Optional[t.IO] = None, + file: t.Optional[t.IO[t.Any]] = None, nl: bool = True, err: bool = False, color: t.Optional[bool] = None,
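The fix is the same one applied to `secho`: parametrizing the `file` annotation. A reduced stand-in for `click.echo` (not the full implementation, which also handles bytes, color stripping, etc.) shows the annotation strict checkers accept:

```python
import io
import sys
import typing as t

def echo(
    message: t.Optional[t.Any] = None,
    file: t.Optional[t.IO[t.Any]] = None,  # parametrized, unlike bare ``t.IO``
    nl: bool = True,
) -> None:
    # Reduced stand-in for click.echo: a bare ``t.IO`` is an unparameterized
    # generic, which pyright flags in strict mode; ``t.IO[t.Any]`` is
    # behaviourally identical at runtime but explicit for the type checker.
    out = file if file is not None else sys.stdout
    out.write(("" if message is None else str(message)) + ("\n" if nl else ""))

buf = io.StringIO()
echo("hello", file=buf)
captured = buf.getvalue()
```

Passing a `StringIO` (a `t.IO[str]`) or a binary stream both satisfy `t.IO[t.Any]`, which is why the parametrized form fixes the complaint without narrowing the accepted arguments.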
alltheplaces__alltheplaces-4514
Domains missing from New Look websites The new_look_gb.py spider is returning websites that are missing the domain name. This is because that's how the website appears in the schema.org block on the pages being scraped. e.g. we have `"url":"/uk/store/Beccles-Beccles-GB-1775"`. The scheme and domain `https://www.newlook.com` needs to be prepended to each of the returned URLs. This is the same issue as in #4302 but for a different spider.
[ { "content": "from scrapy.spiders import SitemapSpider\n\nfrom locations.structured_data_spider import StructuredDataSpider\n\n\nclass NewLookGB(SitemapSpider, StructuredDataSpider):\n name = \"new_look_gb\"\n item_attributes = {\"brand\": \"New Look\", \"brand_wikidata\": \"Q12063852\"}\n sitemap_urls...
[ { "content": "from scrapy.spiders import SitemapSpider\n\nfrom locations.structured_data_spider import StructuredDataSpider\n\n\nclass NewLookGB(SitemapSpider, StructuredDataSpider):\n name = \"new_look_gb\"\n item_attributes = {\"brand\": \"New Look\", \"brand_wikidata\": \"Q12063852\"}\n sitemap_urls...
diff --git a/locations/spiders/new_look_gb.py b/locations/spiders/new_look_gb.py index 18e75cd91eb..7dc6f8af6c8 100644 --- a/locations/spiders/new_look_gb.py +++ b/locations/spiders/new_look_gb.py @@ -15,3 +15,7 @@ def sitemap_filter(self, entries): for entry in entries: if "closed" not in entry["loc"].lower(): yield entry + + def inspect_item(self, item, response): + item["website"] = response.urljoin(item["website"]) + yield item
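The prepending the patch performs is done by Scrapy's `response.urljoin`, which delegates to the standard library's `urljoin`; a minimal sketch of that behaviour (URLs illustrative, taken from the issue):

```python
from urllib.parse import urljoin

# Scrapy's response.urljoin wraps urllib.parse.urljoin: joining a
# root-relative path against the page URL restores scheme and domain.
page_url = "https://www.newlook.com/uk/store/Beccles-Beccles-GB-1775"
website = "/uk/store/Beccles-Beccles-GB-1775"  # value from the schema.org block

print(urljoin(page_url, website))
# https://www.newlook.com/uk/store/Beccles-Beccles-GB-1775

# Already-absolute URLs pass through unchanged, so the fix is safe to
# apply unconditionally in inspect_item.
print(urljoin(page_url, "https://www.newlook.com/uk/store/Other"))
```

This is why the one-line `item["website"] = response.urljoin(item["website"])` fix covers both relative and absolute cases.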
ibis-project__ibis-4637
Problem with formatting union expressions when using `value_counts`
I'm working on a subclass of the MySQL backend and using unions. When attempting to do a `value_counts` on a union, I get an attribute error. Here is a simple test using our backend (this DataFrame upload might not work in the actual MySQL, but should be fairly simple to replicate).

```
df = pd.DataFrame([[0, 1], [2, 3]], columns=['a', 'b'])
tbl = conn.create_table('test_union', df, force=True)
u = ibis.union(tbl, tbl)
u.a.value_counts()
```

Here is the tail end of the exception.

```
...
~/.pyenv/versions/3.9.4/lib/python3.9/site-packages/ibis/expr/format.py in _fmt_value_expr(expr, aliases)
    555     Forwards the call on to the specific operation dispatch rule.
    556     """
--> 557     return fmt_value(expr.op(), aliases=aliases)
    558
    559

~/.pyenv/versions/3.9.4/lib/python3.9/functools.py in wrapper(*args, **kw)
    875                             '1 positional argument')
    876
--> 877         return dispatch(args[0].__class__)(*args, **kw)
    878
    879     funcname = getattr(func, '__name__', 'singledispatch function')

~/.pyenv/versions/3.9.4/lib/python3.9/site-packages/ibis/expr/format.py in _fmt_value_table_node(op, aliases, **_)
    669     if not hasattr(op, 'table'):
    670         import pdb; pdb.set_trace()
--> 671     return f"{aliases[op.table.op()]}"
    672
    673

AttributeError: 'Union' object has no attribute 'table'
```
[ { "content": "from __future__ import annotations\n\nimport collections\nimport functools\nimport textwrap\nimport types\nfrom typing import Any, Callable, Deque, Iterable, Mapping, Tuple\n\nimport rich.pretty\n\nimport ibis\nimport ibis.common.graph as graph\nimport ibis.expr.datatypes as dt\nimport ibis.expr.o...
[ { "content": "from __future__ import annotations\n\nimport collections\nimport functools\nimport textwrap\nimport types\nfrom typing import Any, Callable, Deque, Iterable, Mapping, Tuple\n\nimport rich.pretty\n\nimport ibis\nimport ibis.common.graph as graph\nimport ibis.expr.datatypes as dt\nimport ibis.expr.o...
diff --git a/ibis/expr/format.py b/ibis/expr/format.py index 3c6065baf22a..1d3ae3e7075d 100644 --- a/ibis/expr/format.py +++ b/ibis/expr/format.py @@ -643,7 +643,7 @@ def _fmt_value_table_node(op: ops.TableNode, *, aliases: Aliases, **_: Any) -> s This function is called when a table is used in a value expression. An example is `table.count()`. """ - return f"{aliases[op.table]}" + return f"{aliases[op]}" @fmt_value.register diff --git a/ibis/tests/expr/test_format.py b/ibis/tests/expr/test_format.py index cc5c47ebccf6..454d7facb7b0 100644 --- a/ibis/tests/expr/test_format.py +++ b/ibis/tests/expr/test_format.py @@ -275,6 +275,21 @@ def test_tables_have_format_value_rules(cls): assert cls in ibis.expr.format.fmt_value.registry +@pytest.mark.parametrize( + "f", + [ + lambda t1, t2: t1.count(), + lambda t1, t2: t1.join(t2, t1.a == t2.a).count(), + lambda t1, t2: ibis.union(t1, t2).count(), + ], +) +def test_table_value_expr(f): + t1 = ibis.table([("a", "int"), ("b", "float")], name="t1") + t2 = ibis.table([("a", "int"), ("b", "float")], name="t2") + expr = f(t1, t2) + repr(expr) # smoketest + + def test_window_no_group_by(): t = ibis.table(dict(a="int64", b="string"), name="t") expr = t.a.mean().over(ibis.window(preceding=0))
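The failure mode can be sketched with stand-in classes (these are not the real ibis node types): most table nodes that reach this formatter expose a `.table` attribute, but `Union` does not, so keying the alias lookup on `op.table` raised, while keying on the op itself, as the diff does, works for every table node.

```python
# Stand-ins sketching the dispatch bug: Union has no .table attribute,
# so the old formatter keyed on op.table and raised AttributeError.
# The fixed formatter keys on the op itself.
class Selection:
    def __init__(self, table):
        self.table = table

class Union:
    pass  # no .table attribute, unlike most table nodes

def fmt_broken(op, aliases):
    return f"{aliases[op.table]}"

def fmt_fixed(op, aliases):
    return f"{aliases[op]}"

u = Union()
aliases = {u: "r0"}
print(fmt_fixed(u, aliases))  # r0
try:
    fmt_broken(u, aliases)
except AttributeError as exc:
    print(exc)  # 'Union' object has no attribute 'table'
```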
scalableminds__webknossos-libs-312
Convenience for wkcuber.api
To open/create a dataset with the cool new high-level API the following code is required:

```python
from wkcuber.api.Dataset import WKDataset
from pathlib import Path

ds1 = WKDataset.create(Path("path") / "to" / "dataset1", scale=(128,128,128))
ds2 = WKDataset.open(Path("path") / "to" / "dataset2")
```

For one-off scripts, I think that could be a bit more convenient, if we had an API like this

```python
from wkcuber import WKDataset

ds1 = WKDataset.create("path/to/dataset1", scale=(128, 128, 128))
ds2 = WKDataset.open("path/to/dataset2")
```

Any thoughts? @rschwanhold @jstriebel @philippotto
[ { "content": "from .cubing import cubing\nfrom .downsampling import downsample_mags\nfrom .compress import compress_mag\nfrom .metadata import write_webknossos_metadata\n", "path": "wkcuber/__init__.py" } ]
[ { "content": "from .api.Dataset import WKDataset\nfrom .cubing import cubing\nfrom .downsampling import downsample_mags\nfrom .compress import compress_mag\nfrom .mag import Mag\nfrom .metadata import write_webknossos_metadata\n", "path": "wkcuber/__init__.py" } ]
diff --git a/tests/test_reexport.py b/tests/test_reexport.py new file mode 100644 index 000000000..124704668 --- /dev/null +++ b/tests/test_reexport.py @@ -0,0 +1,8 @@ +from wkcuber import Mag, WKDataset +from wkcuber.mag import Mag as _Mag +from wkcuber.api.Dataset import WKDataset as _WKDataset + + +def test_reexport_classes() -> None: + assert Mag == _Mag, "Mag exports should be the same class" + assert WKDataset == _WKDataset, "WKDataset exports should be the same class" diff --git a/wkcuber/__init__.py b/wkcuber/__init__.py index e53f7a6ee..e1298544c 100644 --- a/wkcuber/__init__.py +++ b/wkcuber/__init__.py @@ -1,4 +1,6 @@ +from .api.Dataset import WKDataset from .cubing import cubing from .downsampling import downsample_mags from .compress import compress_mag +from .mag import Mag from .metadata import write_webknossos_metadata
opsdroid__opsdroid-1241
Exiting opsdroid with ctrl+c fails with exception <!-- Before you post an issue or if you are unsure about something join our matrix channel https://riot.im/app/#/room/#opsdroid-general:matrix.org and ask away! We are more than happy to help you. --> # Description I am trying to build a Slack bot using Opsdroid (master branch). When pressing `ctrl+c` to exit opsdroid, the process does not stop and throws an error. ## Steps to Reproduce 1. Start opsdroid and wait for it to run ``` opsdroid start ``` 2. Press `ctrl+c` to exit the process ## Expected Functionality The opsdroid process should exit on pressing `ctrl+c`. ## Experienced Functionality The opsdroid process fails to exit with an exception. The debug log is as follows: ``` INFO opsdroid.logging: ======================================== INFO opsdroid.logging: Started opsdroid v0.16.0+82.g4c55e97 INFO opsdroid: ======================================== INFO opsdroid: You can customise your opsdroid by modifying your configuration.yaml INFO opsdroid: Read more at: http://opsdroid.readthedocs.io/#configuration INFO opsdroid: Watch the Get Started Videos at: http://bit.ly/2fnC0Fh INFO opsdroid: Install Opsdroid Desktop at: https://github.com/opsdroid/opsdroid-desktop/releases INFO opsdroid: ======================================== WARNING opsdroid.loader: No databases in configuration.This will cause skills which store things in memory to lose data when opsdroid is restarted. INFO opsdroid.connector.slack: Connecting to Slack INFO opsdroid.connector.slack: Connected successfully INFO opsdroid.web: Started web server on http://0.0.0.0:8080 INFO opsdroid.core: Opsdroid is now running, press ctrl+c to exit. ^CINFO opsdroid.core: Received stop signal, exiting. INFO opsdroid.core: Removing skills... INFO opsdroid.core: Removed hello INFO opsdroid.core: Removed seen INFO opsdroid.core: Removed help INFO opsdroid.core: Stopping connector slack... ERROR: Unhandled exception in opsdroid, exiting... 
Caught exception {'message': 'Task exception was never retrieved', 'exception': TypeError("object NoneType can't be used in 'await' expression",), 'future': <Task finished coro=<OpsDroid.handle_signal() done, defined at /home/daniccan/c8/OpsDroid/c8-alertbot/env/lib/python3.6/site-packages/opsdroid/core.py:147> exception=TypeError("object NoneType can't be used in 'await' expression",)>} WARNING slack.rtm.client: Websocket was closed. ``` ## Versions - **Opsdroid version:** master branch in git - **Python version:** 3.6.8 - **OS/Docker version:** Ubuntu 18.04 LTS ## Configuration File Please include your version of the configuration file below. ```yaml # Your code goes here. welcome-message: true connectors: - name: slack api-token: "<Bot OAuth Token>" skills: - name: hello - name: seen - name: help ``` ## Additional Details Any other details you wish to include such as screenshots, console messages, etc. <!-- Love opsdroid? Please consider supporting our collective: +👉 https://opencollective.com/opsdroid/donate -->
[ { "content": "\"\"\"A connector for Slack.\"\"\"\nimport logging\nimport re\nimport ssl\nimport certifi\n\nimport slack\nfrom emoji import demojize\n\nfrom opsdroid.connector import Connector, register_event\nfrom opsdroid.events import Message, Reaction\nfrom opsdroid.connector.slack.events import Blocks\n\n\n...
[ { "content": "\"\"\"A connector for Slack.\"\"\"\nimport logging\nimport re\nimport ssl\nimport certifi\n\nimport slack\nfrom emoji import demojize\n\nfrom opsdroid.connector import Connector, register_event\nfrom opsdroid.events import Message, Reaction\nfrom opsdroid.connector.slack.events import Blocks\n\n\n...
diff --git a/opsdroid/connector/slack/__init__.py b/opsdroid/connector/slack/__init__.py index 833b32730..6cca1d43e 100644 --- a/opsdroid/connector/slack/__init__.py +++ b/opsdroid/connector/slack/__init__.py @@ -87,7 +87,7 @@ async def connect(self): async def disconnect(self): """Disconnect from Slack.""" - await self.slack_rtm.stop() + self.slack_rtm.stop() self.listening = False async def listen(self):
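The traceback's `TypeError: object NoneType can't be used in 'await' expression` comes from awaiting the return value of a plain synchronous method; a minimal sketch with a fake client (not slack's real `RTMClient`) reproduces it and shows why dropping the `await`, as the diff does, fixes it:

```python
import asyncio

# slack's RTMClient.stop() is a plain synchronous method returning None;
# awaiting its return value is what raised the TypeError in the log.
class FakeRTMClient:
    def __init__(self):
        self.stopped = False

    def stop(self):
        self.stopped = True  # synchronous, returns None

async def broken_disconnect(rtm):
    await rtm.stop()  # awaits None -> TypeError

async def fixed_disconnect(rtm):
    rtm.stop()  # just call it; there is nothing to await

rtm = FakeRTMClient()
try:
    asyncio.run(broken_disconnect(rtm))
except TypeError as exc:
    print(exc)  # object NoneType can't be used in 'await' expression

asyncio.run(fixed_disconnect(rtm))
print(rtm.stopped)  # True
```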
TabbycatDebate__tabbycat-2348
Crash when generating QF draw (WS)
**Running:** a1ca1a390866199e1884db12c215ddaa867a98dc

When generating the draw for the first elimination round in a WS tournament, I encountered this exception:

```python
[2023-07-09 12:01:47,564] ERROR django.request: Internal Server Error: /xxx-yyz/admin/draw/round/7/create/
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/django/core/handlers/exception.py", line 56, in inner
    response = get_response(request)
  File "/usr/local/lib/python3.9/site-packages/django/core/handlers/base.py", line 197, in _get_response
    response = wrapped_callback(request, *callback_args, **callback_kwargs)
  File "/usr/local/lib/python3.9/site-packages/django/views/generic/base.py", line 103, in view
    return self.dispatch(request, *args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/django/contrib/auth/mixins.py", line 135, in dispatch
    return super().dispatch(request, *args, **kwargs)
  File "/tcd/tabbycat/tournaments/mixins.py", line 125, in dispatch
    return super().dispatch(request, *args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/django/views/generic/base.py", line 142, in dispatch
    return handler(request, *args, **kwargs)
  File "/tcd/tabbycat/draw/views.py", line 664, in post
    manager.create()
  File "/tcd/tabbycat/draw/manager.py", line 157, in create
    drawer = DrawGenerator(self.teams_in_debate, generator_type, teams,
  File "/tcd/tabbycat/draw/generator/__init__.py", line 93, in DrawGenerator
    return klass(teams, results, rrseq, **kwargs)
  File "/tcd/tabbycat/draw/generator/common.py", line 182, in __init__
    super().__init__(teams, results, rrseq, **kwargs)
  File "/tcd/tabbycat/draw/generator/common.py", line 73, in __init__
    raise ValueError("Unrecognised options: " + ", ".join(unrecognised))
ValueError: Unrecognised options: avoid_conflicts
```

I quickly patched around it like so and we manually confirmed the draw was correct:

```diff
diff --git a/tabbycat/draw/generator/common.py b/tabbycat/draw/generator/common.py
index 2a61de6ea..3d7167aa1 100644
--- a/tabbycat/draw/generator/common.py
+++ b/tabbycat/draw/generator/common.py
@@ -68,9 +68,10 @@ class BaseDrawGenerator:
         # Compute the full dictionary of default options
         self.options = self.BASE_DEFAULT_OPTIONS.copy()
         self.options.update(self.DEFAULT_OPTIONS)
+        print(self.__class__)
         unrecognised = [key for key in kwargs if key not in self.options]
-        if unrecognised:
-            raise ValueError("Unrecognised options: " + ", ".join(unrecognised))
+#        if unrecognised:
+#            raise ValueError("Unrecognised options: " + ", ".join(unrecognised))
         self.options.update(kwargs)

     def generate(self):
```

Of course, this is not a fix for the problem, just avoiding the symptoms. **I intend to find the cause of this issue and fix it in the following days**, but I'm dropping an issue here so I don't forget.
[ { "content": "import logging\n\nfrom django.utils.translation import gettext as _\n\nlogger = logging.getLogger(__name__)\n\n\nclass BaseDrawError(Exception):\n pass\n\n\nclass DrawUserError(BaseDrawError):\n \"\"\"DrawUserError is raised by any DrawGenerator class when a problem that\n would appear to...
[ { "content": "import logging\n\nfrom django.utils.translation import gettext as _\n\nlogger = logging.getLogger(__name__)\n\n\nclass BaseDrawError(Exception):\n pass\n\n\nclass DrawUserError(BaseDrawError):\n \"\"\"DrawUserError is raised by any DrawGenerator class when a problem that\n would appear to...
diff --git a/tabbycat/draw/generator/common.py b/tabbycat/draw/generator/common.py index 2a61de6ea3c..6b31bf373ee 100644 --- a/tabbycat/draw/generator/common.py +++ b/tabbycat/draw/generator/common.py @@ -167,6 +167,7 @@ class BasePairDrawGenerator(BaseDrawGenerator): "side_penalty" : 0, "pullup_debates_penalty": 0, "pairing_penalty" : 0, + "avoid_conflicts" : "off", } TEAMS_PER_DEBATE = 2
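The option-merging logic that raised can be sketched in isolation (names simplified; this is not the real Tabbycat class hierarchy): a kwarg is only accepted if some class declares it as a default, which is why the fix adds `"avoid_conflicts"` to the pair-draw defaults rather than touching the validation.

```python
# Minimal sketch of BaseDrawGenerator's option handling: defaults are
# merged from class-level dicts, then any kwarg without a declared
# default is rejected with the exact error seen in the traceback.
BASE_DEFAULT_OPTIONS = {"side_allocations": "balance"}

def build_options(default_options, **kwargs):
    options = BASE_DEFAULT_OPTIONS.copy()
    options.update(default_options)
    unrecognised = [key for key in kwargs if key not in options]
    if unrecognised:
        raise ValueError("Unrecognised options: " + ", ".join(unrecognised))
    options.update(kwargs)
    return options

# Before the fix: no "avoid_conflicts" default -> ValueError.
try:
    build_options({}, avoid_conflicts="off")
except ValueError as exc:
    print(exc)  # Unrecognised options: avoid_conflicts

# After the fix: the key is declared, so the kwarg is accepted.
opts = build_options({"avoid_conflicts": "off"}, avoid_conflicts="graded")
print(opts["avoid_conflicts"])  # graded
```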
openai__gym-2162
close the env when finished
https://github.com/openai/gym/blob/345c65973fc7160d8be374745a60c36869d8accc/gym/envs/box2d/lunar_lander.py#L449

Shall we add `env.close()` before returning here? I've seen the error below if it's not closed. `ImportError: sys.meta_path is None, Python is likely shutting down`.
[ { "content": "\"\"\"\nRocket trajectory optimization is a classic topic in Optimal Control.\n\nAccording to Pontryagin's maximum principle it's optimal to fire engine full throttle or\nturn it off. That's the reason this environment is OK to have discreet actions (engine on or off).\n\nThe landing pad is always...
[ { "content": "\"\"\"\nRocket trajectory optimization is a classic topic in Optimal Control.\n\nAccording to Pontryagin's maximum principle it's optimal to fire engine full throttle or\nturn it off. That's the reason this environment is OK to have discreet actions (engine on or off).\n\nThe landing pad is always...
diff --git a/gym/envs/box2d/lunar_lander.py b/gym/envs/box2d/lunar_lander.py index 247184591f7..b83256cd271 100644 --- a/gym/envs/box2d/lunar_lander.py +++ b/gym/envs/box2d/lunar_lander.py @@ -446,6 +446,8 @@ def demo_heuristic_lander(env, seed=None, render=False): print("step {} total_reward {:+0.2f}".format(steps, total_reward)) steps += 1 if done: break + if render: + env.close() return total_reward
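The shape of the fix can be sketched with a stub environment (not the real Box2D env): closing the env right after a rendered episode releases the viewer while the interpreter is still healthy, instead of leaving cleanup to interpreter shutdown, which is when the `sys.meta_path is None` ImportError fires.

```python
# Stub env sketching the demo-loop fix: close after a rendered episode
# so viewer resources are released before interpreter shutdown.
class StubEnv:
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

def demo_heuristic_lander(env, render=False):
    total_reward = 0.0
    # ... episode loop elided ...
    if render:
        env.close()  # the added call from the diff
    return total_reward

env = StubEnv()
demo_heuristic_lander(env, render=True)
print(env.closed)  # True
```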
liqd__a4-meinberlin-2857
ValueError: Missing staticfiles manifest entry for 'js/select_dropdown_init.js'
https://sentry.liqd.net/sentry/meinberlin-dev/issues/1032/

```
ValueError: Missing staticfiles manifest entry for 'js/select_dropdown_init.js'
(35 additional frame(s) were not displayed)
...
  File "django/templatetags/static.py", line 118, in handle_simple
    return staticfiles_storage.url(path)
  File "django_cloudflare_push/middleware.py", line 47, in url
    return super(DebugStaticFilesStorage, self).url(path)
  File "django/contrib/staticfiles/storage.py", line 153, in url
    return self._url(self.stored_name, name, force)
  File "django/contrib/staticfiles/storage.py", line 132, in _url
    hashed_name = hashed_name_func(*args)
  File "django/contrib/staticfiles/storage.py", line 420, in stored_name
    raise ValueError("Missing staticfiles manifest entry for '%s'" % clean_name)

Internal Server Error: /kiezkasse/create/module/kiezkasse-2/
```
[ { "content": "from django import forms\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom adhocracy4.categories.forms import CategorizableFieldMixin\nfrom adhocracy4.labels.mixins import LabelsAddableFieldMixin\nfrom adhocracy4.maps import widgets as maps_widgets\nfrom meinberlin.apps.contrib.mixi...
[ { "content": "from django import forms\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom adhocracy4.categories.forms import CategorizableFieldMixin\nfrom adhocracy4.labels.mixins import LabelsAddableFieldMixin\nfrom adhocracy4.maps import widgets as maps_widgets\nfrom meinberlin.apps.contrib.mixi...
diff --git a/meinberlin/apps/mapideas/forms.py b/meinberlin/apps/mapideas/forms.py index 18eaf8c807..e69a32391d 100644 --- a/meinberlin/apps/mapideas/forms.py +++ b/meinberlin/apps/mapideas/forms.py @@ -22,7 +22,7 @@ def __init__(self, *args, **kwargs): 'Please locate your proposal on the map.') class Media: - js = ('js/select_dropdown_init.js',) + js = ('select_dropdown_init.js',) class Meta: model = models.MapIdea
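Why the one-character-prefix change fixes the crash: Django's manifest storage resolves each static path through a collected manifest, so a `Media` path that doesn't match a manifest key raises at template render time. A toy sketch (manifest contents and hash are illustrative, not from the real build):

```python
# Sketch of ManifestStaticFilesStorage.stored_name: the widget listed
# "js/select_dropdown_init.js", but the file was collected under
# "select_dropdown_init.js", so the manifest lookup raised.
manifest = {"select_dropdown_init.js": "select_dropdown_init.5f2a1c3b.js"}

def stored_name(clean_name):
    try:
        return manifest[clean_name]
    except KeyError:
        raise ValueError(
            "Missing staticfiles manifest entry for '%s'" % clean_name
        )

print(stored_name("select_dropdown_init.js"))  # select_dropdown_init.5f2a1c3b.js
try:
    stored_name("js/select_dropdown_init.js")
except ValueError as exc:
    print(exc)
```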
elastic__apm-agent-python-1466
Missing HTTP status code since version 6.3.1 using Starlette **Describe the bug**: HTTP status code is missing when using agent versions > 6.2.3 and Starlette **To Reproduce** 1. Create a hello world REST service with FastAPI and agent 6.7.2. 2. Send requests 2. APM transactions are uploaded to ES but are missing HTTP status code **Environment (please complete the following information)** - OS: Indifferent. Happens on Mac OS X and Docker containers - Python version: 3.9.7 - Framework and version: Starlette 0.17.1 - APM Server version: 7.12.0 - Agent version: 6.3.1 - 6.7.2 **Additional context** Add any other context about the problem here. - Agent config options <!-- be careful not to post sensitive information --> <details> <summary>Click to expand</summary> ``` # Nothing special here, app is just the FastAPI instance def configure_apm(app): apm_config = { "SERVICE_NAME": APPLICATION_NAME, "SERVER_URL": os.environ.get("ELASTIC_APM_SERVER_HOST"), "SECRET_TOKEN": os.environ.get("ELASTIC_APM_SECRET_TOKEN"), "ENVIRONMENT": os.environ.get("ENVIRONMENT", "staging").lower(), } apm = make_apm_client(apm_config) app.add_middleware(ElasticAPM, client=apm) ``` </details> - `requirements.txt`: <details> <summary>Click to expand</summary> ``` elastic-apm==6.7.2 starlette==0.17.1 uvicorn==0.17.1 fastapi==0.73.0 ``` </details> - Example APM JSON document using agent 6.7.2: <details> ``` { "_index": "apm-7.12.0-transaction-000010", "_type": "_doc", "_id": "H-xCBn8BoEtMrRa0MKNx", "_version": 1, "_score": null, "fields": { "transaction.name.text": [ "GET /test/fran" ], "user_agent.original.text": [ "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.80 Safari/537.36" ], "host.hostname": [ "MacBook-Pro-de-Fran.local" ], "process.pid": [ 70040 ], "service.language.name": [ "python" ], "transaction.result": [ "HTTP 2xx" ], "user_agent.os.version": [ "10.15.7" ], "transaction.id": [ "d01f7447213a4374" ], "http.request.method": [ 
"GET" ], "processor.event": [ "transaction" ], "agent.name": [ "python" ], "host.name": [ "MacBook-Pro-de-Fran.local" ], "user_agent.version": [ "98.0.4758.80" ], "event.outcome": [ "success" ], "user_agent.original": [ "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.80 Safari/537.36" ], "process.ppid": [ 70038 ], "processor.name": [ "transaction" ], "transaction.duration.us": [ 642 ], "service.runtime.version": [ "3.9.7" ], "user_agent.name": [ "Chrome" ], "host.architecture": [ "arm64" ], "timestamp.us": [ 1645077472362757 ], "url.path": [ "/test/fran" ], "ecs.version": [ "1.8.0" ], "observer.type": [ "apm-server" ], "observer.version": [ "7.12.0" ], "agent.version": [ "6.7.2" ], "transaction.name": [ "GET /test/fran" ], "service.framework.version": [ "0.17.1" ], "observer.name": [ "instance-0000000001" ], "user_agent.os.full": [ "Mac OS X 10.15.7" ], "service.node.name": [ "MacBook-Pro-de-Fran.local" ], "url.scheme": [ "http" ], "transaction.sampled": [ true ], "user_agent.os.name": [ "Mac OS X" ], "host.ip": [ "-" ], "trace.id": [ "0c161d26c928799b770ccddcf4cfe3c4" ], "transaction.span_count.dropped": [ 0 ], "url.port": [ 8000 ], "url.full": [ "http://localhost:8000/test/fran" ], "service.environment": [ "staging" ], "service.name": [ "test" ], "service.framework.name": [ "starlette" ], "service.runtime.name": [ "CPython" ], "process.args": [ "/Users/fgarcia/miniconda3/envs/test-rest/lib/python3.9/site-packages/uvicorn/__main__.py", "app.main:app", "--reload" ], "observer.version_major": [ 7 ], "observer.hostname": [ "c2c026e5b645" ], "transaction.type": [ "request" ], "event.ingested": [ "2022-02-17T05:57:55.440Z" ], "@timestamp": [ "2022-02-17T05:57:52.362Z" ], "host.os.platform": [ "darwin" ], "service.language.version": [ "3.9.7" ], "url.domain": [ "localhost" ], "user_agent.device.name": [ "Mac" ] }, "highlight": { "host.architecture": [ "@kibana-highlighted-field@arm64@/kibana-highlighted-field@" ], 
"service.name": [ "@kibana-highlighted-field@test@/kibana-highlighted-field@" ], "service.framework.name": [ "@kibana-highlighted-field@starlette@/kibana-highlighted-field@" ], "processor.name": [ "@kibana-highlighted-field@transaction@/kibana-highlighted-field@" ], "agent.version": [ "@kibana-highlighted-field@6.7.2@/kibana-highlighted-field@" ] }, "sort": [ 1645077472362 ] } ``` </details> - Example APM JSON document using agent 6.2.3: <details> ``` { "_index": "apm-7.12.0-transaction-000010", "_type": "_doc", "_id": "oOw-Bn8BoEtMrRa0M5-0", "_version": 1, "_score": null, "fields": { "transaction.name.text": [ "GET /test/fran" ], "user_agent.original.text": [ "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.80 Safari/537.36" ], "host.hostname": [ "MacBook-Pro-de-Fran.local" ], "process.pid": [ 69858 ], "service.language.name": [ "python" ], "transaction.result": [ "HTTP 2xx" ], "user_agent.os.version": [ "10.15.7" ], "transaction.id": [ "ab3e2d9c98d72380" ], "http.request.method": [ "GET" ], "processor.event": [ "transaction" ], "agent.name": [ "python" ], "host.name": [ "MacBook-Pro-de-Fran.local" ], "user_agent.version": [ "-" ], "http.response.status_code": [ 200 ], "event.outcome": [ "success" ], "user_agent.original": [ "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.80 Safari/537.36" ], "process.ppid": [ 69856 ], "processor.name": [ "transaction" ], "transaction.duration.us": [ 656 ], "service.runtime.version": [ "3.9.7" ], "user_agent.name": [ "Chrome" ], "host.architecture": [ "arm64" ], "timestamp.us": [ 1645077212632517 ], "url.path": [ "/test/fran" ], "ecs.version": [ "1.8.0" ], "observer.type": [ "apm-server" ], "observer.version": [ "7.12.0" ], "agent.version": [ "6.2.3" ], "transaction.name": [ "GET /test/fran" ], "service.framework.version": [ "0.17.1" ], "observer.name": [ "instance-0000000001" ], "user_agent.os.full": [ "Mac OS X 
10.15.7" ], "service.node.name": [ "MacBook-Pro-de-Fran.local" ], "url.scheme": [ "http" ], "transaction.sampled": [ true ], "user_agent.os.name": [ "Mac OS X" ], "host.ip": [ "-" ], "trace.id": [ "527836b27e7cfbe629eedca1f073ad38" ], "transaction.span_count.dropped": [ 0 ], "url.port": [ 8000 ], "url.full": [ "http://localhost:8000/test/fran" ], "service.environment": [ "staging" ], "service.name": [ "test" ], "service.framework.name": [ "starlette" ], "service.runtime.name": [ "CPython" ], "process.args": [ "/Users/fgarcia/miniconda3/envs/test-rest/lib/python3.9/site-packages/uvicorn/__main__.py", "app.main:app", "--reload" ], "observer.version_major": [ 7 ], "observer.hostname": [ "c2c026e5b645" ], "transaction.type": [ "request" ], "event.ingested": [ "2022-02-17T05:53:34.130Z" ], "@timestamp": [ "2022-02-17T05:53:32.632Z" ], "host.os.platform": [ "darwin" ], "service.language.version": [ "3.9.7" ], "url.domain": [ "localhost" ], "user_agent.device.name": [ "Mac" ] }, "highlight": { "service.name": [ "@kibana-highlighted-field@test@/kibana-highlighted-field@" ], "service.framework.name": [ "@kibana-highlighted-field@starlette@/kibana-highlighted-field@" ] }, "sort": [ 1645077212632 ] } ``` </details>
[ { "content": "# BSD 3-Clause License\n#\n# Copyright (c) 2019, Elasticsearch BV\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n#\n# * Redistributions of source code must retain...
[ { "content": "# BSD 3-Clause License\n#\n# Copyright (c) 2019, Elasticsearch BV\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n#\n# * Redistributions of source code must retain...
diff --git a/elasticapm/contrib/starlette/utils.py b/elasticapm/contrib/starlette/utils.py index 48ac251bd..f06c19055 100644 --- a/elasticapm/contrib/starlette/utils.py +++ b/elasticapm/contrib/starlette/utils.py @@ -86,7 +86,7 @@ async def get_data_from_response(message: dict, config: Config, event_type: str) """ result = {} - if "status_code" in message: + if "status" in message: result["status_code"] = message["status"] if config.capture_headers and "headers" in message: diff --git a/tests/contrib/asyncio/starlette_tests.py b/tests/contrib/asyncio/starlette_tests.py index dc11aa021..99b3cc1d3 100644 --- a/tests/contrib/asyncio/starlette_tests.py +++ b/tests/contrib/asyncio/starlette_tests.py @@ -146,6 +146,10 @@ def test_get(app, elasticapm_client): assert request["method"] == "GET" assert request["socket"] == {"remote_address": "127.0.0.1"} + response = transaction["context"]["response"] + assert response["status_code"] == 200 + assert response["headers"]["content-type"] == "text/plain; charset=utf-8" + assert span["name"] == "test"
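The one-word diff makes more sense with the ASGI message shape in view: `http.response.start` messages carry the status under the key `"status"`, so the old check for `"status_code"` was always false and the field was silently dropped. A stripped-down sketch (simplified from the real `get_data_from_response`):

```python
# ASGI "http.response.start" messages use the key "status", not
# "status_code"; testing for the wrong key meant the branch never ran.
message = {"type": "http.response.start", "status": 200, "headers": []}

def extract_broken(message):
    result = {}
    if "status_code" in message:  # never true for ASGI messages
        result["status_code"] = message["status"]
    return result

def extract_fixed(message):
    result = {}
    if "status" in message:  # the corrected key from the diff
        result["status_code"] = message["status"]
    return result

print(extract_broken(message))  # {}
print(extract_fixed(message))   # {'status_code': 200}
```

This matches the added test assertions, which check that `status_code` is 200 on the captured transaction.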