in_source_id stringlengths 13 58 | issue stringlengths 3 241k | before_files listlengths 0 3 | after_files listlengths 0 3 | pr_diff stringlengths 109 107M ⌀ |
|---|---|---|---|---|
pulp__pulpcore-4641 | pulp_file version is set to 3.40.0.dev
**Version**
pulpcore 3.40.0
**Describe the bug**
Status API reports pulp_file version as 3.40.0.dev
| [
{
"content": "from pulpcore.plugin import PulpPluginAppConfig\n\n\nclass PulpFilePluginAppConfig(PulpPluginAppConfig):\n \"\"\"\n Entry point for pulp_file plugin.\n \"\"\"\n\n name = \"pulp_file.app\"\n label = \"file\"\n version = \"3.40.0.dev\"\n python_package_name = \"pulp_file\" # TO... | [
{
"content": "from pulpcore.plugin import PulpPluginAppConfig\n\n\nclass PulpFilePluginAppConfig(PulpPluginAppConfig):\n \"\"\"\n Entry point for pulp_file plugin.\n \"\"\"\n\n name = \"pulp_file.app\"\n label = \"file\"\n version = \"3.41.0.dev\"\n python_package_name = \"pulp_file\" # TO... | diff --git a/.bumpversion.cfg b/.bumpversion.cfg
index 540db5cc96..aba3e10392 100644
--- a/.bumpversion.cfg
+++ b/.bumpversion.cfg
@@ -19,3 +19,5 @@ values =
[bumpversion:file:./setup.py]
[bumpversion:file:./docs/conf.py]
+
+[bumpversion:file:pulp_file/app/__init__.py]
diff --git a/CHANGES/pulp_file/4633.bugfix b/CHANGES/pulp_file/4633.bugfix
new file mode 100644
index 0000000000..36a8711407
--- /dev/null
+++ b/CHANGES/pulp_file/4633.bugfix
@@ -0,0 +1 @@
+Fix pulp_file advertised version.
diff --git a/pulp_file/app/__init__.py b/pulp_file/app/__init__.py
index fe1d7362c6..d92d113268 100644
--- a/pulp_file/app/__init__.py
+++ b/pulp_file/app/__init__.py
@@ -8,6 +8,6 @@ class PulpFilePluginAppConfig(PulpPluginAppConfig):
name = "pulp_file.app"
label = "file"
- version = "3.40.0.dev"
+ version = "3.41.0.dev"
python_package_name = "pulp_file" # TODO Add python_module_name
domain_compatible = True
|
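The fix above keeps the advertised plugin version in sync by registering `pulp_file/app/__init__.py` with bumpversion. As an illustrative sketch (the `read_declared_version` helper is hypothetical, not part of the PR), the stale vs. fixed version strings can be extracted like this:

```python
import re

def read_declared_version(source: str) -> str:
    """Extract the version string from an AppConfig-style module source."""
    match = re.search(r'version\s*=\s*"([^"]+)"', source)
    if not match:
        raise ValueError("no version attribute found")
    return match.group(1)

# The stale module still advertised the previous dev version.
stale = 'version = "3.40.0.dev"'
fixed = 'version = "3.41.0.dev"'
print(read_declared_version(stale))  # 3.40.0.dev
print(read_declared_version(fixed))  # 3.41.0.dev
```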
DataBiosphere__toil-239 | Jenkins should only deploy to PyPI when building off the master branch
| [
{
"content": "from setuptools import setup, find_packages\n\nsetup(\n name='toil',\n version='3.0.4',\n description='Pipeline management software for clusters.',\n author='Benedict Paten',\n author_email='benedict@soe.usc.edu',\n url=\"https://github.com/BD2KGenomics/toil\",\n install_requi... | [
{
"content": "from setuptools import setup, find_packages\n\nsetup(\n name='toil',\n version='3.0.5.dev1',\n description='Pipeline management software for clusters.',\n author='Benedict Paten',\n author_email='benedict@soe.usc.edu',\n url=\"https://github.com/BD2KGenomics/toil\",\n install_... | diff --git a/Makefile b/Makefile
index 252cda9d12..a854a9d914 100644
--- a/Makefile
+++ b/Makefile
@@ -76,7 +76,11 @@ check_running_on_jenkins:
@test -n "$$BUILD_NUMBER" || ( echo "\033[0;31mThis target should only be invoked on Jenkins.\033[0m" ; false )
pypi: check_clean_working_copy check_running_on_jenkins
- $(python) setup.py egg_info --tag-build=dev$$BUILD_NUMBER register sdist bdist_egg upload
+ test "$$(git rev-parse --verify remotes/origin/master)" != "$$(git rev-parse --verify HEAD)" \
+ && echo "Not on master branch, silently skipping deployment to PyPI." \
+ || $(python) setup.py egg_info --tag-build=build$$BUILD_NUMBER register sdist bdist_egg upload
pypi_stable: check_clean_working_copy check_running_on_jenkins
- $(python) setup.py egg_info register sdist bdist_egg upload
\ No newline at end of file
+ test "$$(git rev-parse --verify remotes/origin/master)" != "$$(git rev-parse --verify HEAD)" \
+ && echo "Not on master branch, silently skipping deployment to PyPI." \
+ || $(python) setup.py egg_info register sdist bdist_egg upload
diff --git a/setup.py b/setup.py
index 34c01ab695..85111970c2 100755
--- a/setup.py
+++ b/setup.py
@@ -2,7 +2,7 @@
setup(
name='toil',
- version='3.0.4',
+ version='3.0.5.dev1',
description='Pipeline management software for clusters.',
author='Benedict Paten',
author_email='benedict@soe.usc.edu',
|
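The Makefile guard above compares `HEAD` against `remotes/origin/master` and skips the upload when they differ. The decision itself reduces to a SHA comparison; a minimal Python sketch (the `should_deploy` helper is hypothetical, assuming the SHAs were obtained via `git rev-parse`):

```python
def should_deploy(head_sha: str, master_sha: str) -> bool:
    """Deploy to PyPI only when the current commit is the tip of master."""
    return head_sha == master_sha

assert should_deploy("abc123", "abc123") is True   # on master: deploy
assert should_deploy("abc123", "def456") is False  # feature branch: skip silently
```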
google__mobly-524 | Fix pytest warnings in Python 3
`pytest` currently produces the following warnings:
mobly/mobly/test_runner.py:181: PytestWarning: cannot collect test class 'TestRunner' because it has a __init__ constructor
class TestRunner(object):
mobly/tests/mobly/base_instrumentation_test_test.py:179: DeprecationWarning: Please use assertEqual instead.
expected_completed_and_passed)
mobly/tests/mobly/base_instrumentation_test_test.py:192: DeprecationWarning: Please use assertEqual instead.
self.assertEquals(actual_test.begin_time, expected_begin_time)
mobly/tests/mobly/base_instrumentation_test_test.py:193: DeprecationWarning: Please use assertEqual instead.
self.assertEquals(actual_test.end_time, expected_end_time)
mobly/tests/mobly/output_test.py:171: DeprecationWarning: Please use assertNotEqual instead.
self.assertNotEquals(output_dir1, output_dir2)
mobly/tests/mobly/output_test.py:205: DeprecationWarning: Please use assertNotEqual instead.
self.assertNotEquals(output_dir1, output_dir2)
-- Docs: https://docs.pytest.org/en/latest/warnings.html
| [
{
"content": "# Copyright 2016 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicab... | [
{
"content": "# Copyright 2016 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicab... | diff --git a/setup.py b/setup.py
index 9e593a80..adc1df5d 100755
--- a/setup.py
+++ b/setup.py
@@ -40,7 +40,7 @@ class PyTest(test.test):
def finalize_options(self):
test.test.finalize_options(self)
- self.test_args = ['-x', "tests"]
+ self.test_args = ['-x', "tests/mobly"]
self.test_suite = True
def run_tests(self):
|
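The deprecation warnings above come from the legacy `assertEquals`/`assertNotEquals` aliases; the non-deprecated `assertEqual`/`assertNotEqual` spellings behave identically. A minimal self-contained example:

```python
import unittest

class ExampleTest(unittest.TestCase):
    def test_equality(self):
        # Preferred: assertEqual / assertNotEqual (no trailing "s"),
        # which emit no DeprecationWarning under Python 3.
        self.assertEqual(1 + 1, 2)
        self.assertNotEqual("output_dir1", "output_dir2")

result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(ExampleTest))
print(result.wasSuccessful())  # True
```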
carpentries__amy-430 | Skills should be displayed in sorted order
Skills are currently displayed in a more-or-less random order (based, I presume, on the order in which they were added to the DB). They should be sorted, either alphabetically (which would put all 'dc' before all 'swc') or by the second part (e.g., by what comes after the '/').
| [
{
"content": "import datetime\nimport re\n\nfrom django.contrib.auth.models import (\n AbstractBaseUser, BaseUserManager, PermissionsMixin)\nfrom django.core.exceptions import ObjectDoesNotExist\nfrom django.core.urlresolvers import reverse\nfrom django.db import models\nfrom django.db.models import Q\n\nfro... | [
{
"content": "import datetime\nimport re\n\nfrom django.contrib.auth.models import (\n AbstractBaseUser, BaseUserManager, PermissionsMixin)\nfrom django.core.exceptions import ObjectDoesNotExist\nfrom django.core.urlresolvers import reverse\nfrom django.db import models\nfrom django.db.models import Q\n\nfro... | diff --git a/workshops/models.py b/workshops/models.py
index 0fd73428d..d52ca94f5 100644
--- a/workshops/models.py
+++ b/workshops/models.py
@@ -391,6 +391,9 @@ class Lesson(models.Model):
def __str__(self):
return self.name
+ class Meta:
+ ordering = ['name']
+
#------------------------------------------------------------
class Qualification(models.Model):
diff --git a/workshops/test/test_person.py b/workshops/test/test_person.py
index c21ff1426..b15763cc2 100644
--- a/workshops/test/test_person.py
+++ b/workshops/test/test_person.py
@@ -425,8 +425,8 @@ def setUp(self):
self.badge1 = Badge.objects.create(name='Badge1')
self.badge2 = Badge.objects.create(name='Badge2')
- self.lesson1 = Lesson.objects.get(name='swc/python')
- self.lesson2 = Lesson.objects.get(name='dc/spreadsheets')
+ self.lesson1 = Lesson.objects.get(name='dc/spreadsheets')
+ self.lesson2 = Lesson.objects.get(name='swc/python')
self.domain1 = KnowledgeDomain.objects.get(pk=1) # space sciences
self.domain2 = KnowledgeDomain.objects.get(pk=2) # geo* sciences
|
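Adding `Meta.ordering = ['name']` makes Django return lessons alphabetically, which puts every `dc/...` lesson before every `swc/...` lesson, as the issue requested. The effect in plain Python (hypothetical lesson names in the style of the test fixtures):

```python
lessons = ["swc/python", "dc/spreadsheets", "swc/shell", "dc/sql"]

# Alphabetical ordering on the full name groups all 'dc/' entries
# ahead of all 'swc/' entries.
print(sorted(lessons))
# ['dc/spreadsheets', 'dc/sql', 'swc/python', 'swc/shell']
```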
magenta__magenta-629 | ValueError: Cell returns tuple of states, but the flag state_is_tuple is not set. State size is: (LSTMStateTuple(c=128, h=128), LSTMStateTuple(c=128, h=128))
Hey guys,
I've just set up my conda environment and packages. When I run the `bazel test //magenta/...` command, the test //magenta/models/shared:events_rnn_graph_test fails. I am new to this project, so hopefully someone can point me in the right direction! For your info, I have installed all the required packages according to setup.py, and confirmed the installation with the 'pip freeze' and 'conda list' commands.
Thanks in advance!
Simon
Below is the error message from the log file:
`ERROR: testBuildGraphWithAttention (__main__.EventSequenceRNNGraphTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/private/var/tmp/_bazel_simonttk/2d57163c72209284de52b06652358cc7/execroot/magenta/bazel-out/local-opt/bin/magenta/models/shared/events_rnn_graph_test.runfiles/__main__/magenta/models/shared/events_rnn_graph_test.py", line 58, in testBuildGraphWithAttention
'train', self.config, sequence_example_file_paths=['test'])
File "/private/var/tmp/_bazel_simonttk/2d57163c72209284de52b06652358cc7/execroot/magenta/bazel-out/local-opt/bin/magenta/models/shared/events_rnn_graph_test.runfiles/__main__/magenta/models/shared/events_rnn_graph.py", line 98, in build_graph
attn_length=hparams.attn_length)
File "/private/var/tmp/_bazel_simonttk/2d57163c72209284de52b06652358cc7/execroot/magenta/bazel-out/local-opt/bin/magenta/models/shared/events_rnn_graph_test.runfiles/__main__/magenta/models/shared/events_rnn_graph.py", line 47, in make_rnn_cell
cell = tf.contrib.rnn.AttentionCellWrapper(cell, attn_length)
File "/Users/simonttk/anaconda2/envs/magenta/lib/python2.7/site-packages/tensorflow/contrib/rnn/python/ops/rnn_cell.py", line 1077, in __init__
% str(cell.state_size))
ValueError: Cell returns tuple of states, but the flag state_is_tuple is not set. State size is: (LSTMStateTuple(c=128, h=128), LSTMStateTuple(c=128, h=128))`
| [
{
"content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless ... | [
{
"content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless ... | diff --git a/magenta/models/shared/events_rnn_graph.py b/magenta/models/shared/events_rnn_graph.py
index c4e6f7dd02..3b58b22d7e 100644
--- a/magenta/models/shared/events_rnn_graph.py
+++ b/magenta/models/shared/events_rnn_graph.py
@@ -44,7 +44,8 @@ def make_rnn_cell(rnn_layer_sizes,
cell = tf.contrib.rnn.MultiRNNCell(cells)
if attn_length:
- cell = tf.contrib.rnn.AttentionCellWrapper(cell, attn_length)
+ cell = tf.contrib.rnn.AttentionCellWrapper(
+ cell, attn_length, state_is_tuple=True)
return cell
|
projectmesa__mesa-826 | Push new Mesa release
We are overdue for an official release. Before I push one, does anyone have anything they really want to try to get in, or should I just tag and release?
Discuss.
| [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"\nMesa Agent-Based Modeling Framework\n\nCore Objects: Model, and Agent.\n\n\"\"\"\nimport datetime\n\nfrom .model import Model\nfrom .agent import Agent\n\n\n__all__ = [\"Model\", \"Agent\"]\n\n__title__ = \"mesa\"\n__version__ = \"0.8.6\"\n__license__ = \"Apache 2.... | [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"\nMesa Agent-Based Modeling Framework\n\nCore Objects: Model, and Agent.\n\n\"\"\"\nimport datetime\n\nfrom .model import Model\nfrom .agent import Agent\n\n\n__all__ = [\"Model\", \"Agent\"]\n\n__title__ = \"mesa\"\n__version__ = \"0.8.7\"\n__license__ = \"Apache 2.... | diff --git a/HISTORY.rst b/HISTORY.rst
index 9a6869ac774..2be43bafb32 100644
--- a/HISTORY.rst
+++ b/HISTORY.rst
@@ -3,6 +3,74 @@
Release History
---------------
+0.8.7 (2020-05-XX) Lake Havasu City
++++++++++++++++++++++++++++++++++++++++++++
+
+**Improvements**
+
+* Enable BatchRunner to run specified set of parameter combinations #651 (#607)
+* Restructured runcontrol.js #661
+* Add pipenv support for mesa #678
+* Increase test coverage and change to codecov #692
+* Updates Travis to explicitly set the dist to be Xenial #699
+* time: Remove resolved TODO on random seed of random scheduler #708
+* hex_snowflake: Update description to be more informative #712
+* Added Coverall to Codecov in Contributing file #734
+* Makes opening the browser optional when launching the server #755 #754
+* NetworkGrid: Update to networkx 2.4 API #763
+* Apply black to mesa/ directory #775
+* Updated travis to 3.8 and updated gitignore #777
+* Add information (to docstring) on image as agent portrayal shape #791
+* Change grid empties from list to set #649 (improves speed)
+* Adding mypy annotation
+ * space: Add type annotation to Grid class #779
+ * add Mypy annotation to time, agent, and model #792
+ * space: Add mypy annotation to the remaining methods/functions #796
+* Docs related
+ * Bulk merge of docs from 'docs' to 'master' #684
+ * Created useful snippets code section in the docs #668 #669
+ * Updating index.rst #672
+ * Clarify runserver snippet in index.rst #682
+ * Add documentation for feature (pipenv) added in #678 #683
+ * Add docs for BatchRunner to support Variable and Fixed Parameter Contribution #679 #683
+ * Resources #651 in docs branch #691. This preps for #683 to be merged.
+ * intro tutorial: Clarify a function that is not defined in the class #705
+ * Updates formatting the readme Docs markdown #737
+* Examples related
+ * Schelling: Separate text-only viz into run_ascii.py #706
+ * examples/Readme.md: Update description to be consistent with the folder names #707
+
+**Fixes**
+
+* Fixes link to update code coverage module - Updates Removing last link to coveralls and replacing to codecoverage #748
+* Fixes D3 Network Visualization to update (rather than overwrite) #765 #767
+* Fix parameter order in initializing SingleGrid object #770 #769
+* Updating pipenv link #773
+* Fixed pip install from github by specifying egg #802
+* Compatibility fixes
+ * Fixes VisualizationServer to be compatible with recent versions of Tornado #655
+ * Fixes #749 networkx incompatibility #750
+* Fixing typos
+ * Fixes documentation typos in example code #695 #696
+ * Fixes typo in ModularServer's last parameter #711
+ * Fixed typo in BarChartModule line 100 #747
+ * Fix typo in documentation #809
+* Doc fixes (not relating to typos)
+ * Update tutorial to point to correct repo location #671 #670
+ * Updating sphinx and reverting issues #674 #675 #677 #681
+ * Fixes code blocks that weren't showing up in the tutorial #686
+ * Remove figure from advanced tutorial showing the empty visualization #729
+ * Removes git clone from tutorial - Update intro_tutorial.rst #730
+ * Fixes citation links in docs tutorial section #736
+ * Fix histogram in advanced tutorial #794 #610
+ * Fixes Advanced Tutorial #elements #804 #803
+* Fixes to examples
+ * Fixing test_random_walk bug - wolf sheep. #821
+ * Fixes shape_example server launch #762 #756
+ * Fixing broken table in pd_grid example #824
+
+
+
0.8.6 (2019-05-02) Lake Havasu City
+++++++++++++++++++++++++++++++++++++++++++
diff --git a/mesa/__init__.py b/mesa/__init__.py
index c54d71e9eb0..a2598b0890d 100644
--- a/mesa/__init__.py
+++ b/mesa/__init__.py
@@ -14,6 +14,6 @@
__all__ = ["Model", "Agent"]
__title__ = "mesa"
-__version__ = "0.8.6"
+__version__ = "0.8.7"
__license__ = "Apache 2.0"
__copyright__ = "Copyright %s Project Mesa Team" % datetime.date.today().year
|
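The release itself is a micro-version bump from 0.8.6 to 0.8.7 in `mesa/__init__.py`. A hypothetical helper (not part of the PR) illustrating the bump:

```python
def bump_micro(version: str) -> str:
    """Increment the last numeric component of a dotted version string."""
    parts = version.split(".")
    parts[-1] = str(int(parts[-1]) + 1)
    return ".".join(parts)

print(bump_micro("0.8.6"))  # 0.8.7
```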
mindsdb__mindsdb-28 | IndexError: list index out of range when missing predict value
**Is your feature request related to a problem? Please describe.**
When an empty string is provided as the predict value, e.g.:
```
result = mdb.predict(predict=' ', model_name='home_rentals')
```
an `IndexError: list index out of range` is thrown.
**Describe the solution you'd like**
A user-friendly message should be raised instead, e.g.:
ValueError: Please provide valid predict value
**Additional context**
We can check for empty predict values in https://github.com/mindsdb/main/blob/76c691c4b18a4723626dfcbff8228da614d93e8b/mindsdb/libs/controllers/mindsdb_controller.py#L170 and raise a ValueError if predict is not provided.
| [
{
"content": "import sqlite3\nimport pandas\nimport requests\nimport logging\nimport os\nimport platform\nimport _thread\nimport uuid\nimport traceback\nimport urllib\n\nfrom mindsdb.libs.helpers.sqlite_helpers import *\nfrom mindsdb.libs.helpers.multi_data_source import getDS\nfrom mindsdb.config import SQLITE... | [
{
"content": "import sqlite3\nimport pandas\nimport requests\nimport logging\nimport os\nimport platform\nimport _thread\nimport uuid\nimport traceback\nimport urllib\n\nfrom mindsdb.libs.helpers.sqlite_helpers import *\nfrom mindsdb.libs.helpers.multi_data_source import getDS\nfrom mindsdb.config import SQLITE... | diff --git a/mindsdb/libs/controllers/mindsdb_controller.py b/mindsdb/libs/controllers/mindsdb_controller.py
index 149e2533f88..4d70d3ae767 100644
--- a/mindsdb/libs/controllers/mindsdb_controller.py
+++ b/mindsdb/libs/controllers/mindsdb_controller.py
@@ -174,6 +174,9 @@ def predict(self, predict, from_data = None, when={}, model_name='mdsb_model', b
:param model_name:
:return:
"""
+
+ if not predict:
+ raise ValueError('Please provide valid predict value.')
transaction_type = TRANSACTION_PREDICT
|
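The merged guard raises `ValueError` for falsy predict values. Note that the issue's example passes `' '` (whitespace), which `not predict` alone would not catch; the sketch below (hypothetical, extending the merged diff) also strips whitespace:

```python
def predict(predict_value, model_name="mdsb_model"):
    # Guard clause mirroring the fix: reject empty predict targets early,
    # and additionally catch whitespace-only strings like ' '.
    if not predict_value or not str(predict_value).strip():
        raise ValueError("Please provide valid predict value.")
    return f"predicting {predict_value} with {model_name}"

print(predict("rental_price"))  # predicting rental_price with mdsb_model
try:
    predict(" ")
except ValueError as err:
    print(err)  # Please provide valid predict value.
```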
pytorch__ignite-2826 | WandBLogger and TensorboardLogger have different APIs for logging audio
## 🚀 Feature
The following code doesn't work:
```
logger = WandBLogger()
logger.writer
```
This is how you would typically add audio with a tensorboard logger:
```
logger.writer.add_audio('mixture', x.t(), engine.state.epoch)
```
The workaround (similar to the one discussed in https://github.com/Lightning-AI/lightning/issues/7028) would be to use the underlying _wandb object:
```
logger._wandb.log({"validation": [wandb.Audio(x.t(), caption="mixture", sample_rate=44100)]})
logger._wandb.log({"validation": [wandb.Audio(x.t(), caption="vocals", sample_rate=44100)]})
```
Is there a proposal for a standardized media logging API?
| [
{
"content": "\"\"\"TensorBoard logger and its helper handlers.\"\"\"\nfrom typing import Any, Callable, List, Optional, Union\n\nfrom torch.optim import Optimizer\n\nfrom ignite.contrib.handlers.base_logger import (\n BaseLogger,\n BaseOptimizerParamsHandler,\n BaseOutputHandler,\n BaseWeightsHandl... | [
{
"content": "\"\"\"TensorBoard logger and its helper handlers.\"\"\"\nfrom typing import Any, Callable, List, Optional, Union\n\nfrom torch.optim import Optimizer\n\nfrom ignite.contrib.handlers.base_logger import (\n BaseLogger,\n BaseOptimizerParamsHandler,\n BaseOutputHandler,\n BaseWeightsHandl... | diff --git a/ignite/contrib/handlers/tensorboard_logger.py b/ignite/contrib/handlers/tensorboard_logger.py
index 042d19198320..f8b002e3020b 100644
--- a/ignite/contrib/handlers/tensorboard_logger.py
+++ b/ignite/contrib/handlers/tensorboard_logger.py
@@ -160,6 +160,9 @@ def __init__(self, *args: Any, **kwargs: Any):
self.writer = SummaryWriter(*args, **kwargs)
+ def __getattr__(self, attr: Any) -> Any:
+ return getattr(self.writer, attr)
+
def close(self) -> None:
self.writer.close()
diff --git a/tests/ignite/contrib/handlers/test_tensorboard_logger.py b/tests/ignite/contrib/handlers/test_tensorboard_logger.py
index 7645eddd335f..60c8a1f4483c 100644
--- a/tests/ignite/contrib/handlers/test_tensorboard_logger.py
+++ b/tests/ignite/contrib/handlers/test_tensorboard_logger.py
@@ -32,6 +32,17 @@ def test_optimizer_params_handler_wrong_setup():
handler(mock_engine, mock_logger, Events.ITERATION_STARTED)
+def test_getattr_method():
+ # Create a mock SummaryWriter object
+ mock_writer = MagicMock()
+ # Assign the mock object to the writer attribute of a TensorboardLogger instance
+ logger = TensorboardLogger()
+ logger.writer = mock_writer
+ # Test that a method passed through the __getattr__ method calls the corresponding method on the mock object
+ logger.add_scalar("loss", 0.5)
+ mock_writer.add_scalar.assert_called_once_with("loss", 0.5)
+
+
def test_optimizer_params():
optimizer = torch.optim.SGD([torch.tensor(0.0)], lr=0.01)
|
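The `__getattr__` method added above forwards unknown attribute lookups to the wrapped `SummaryWriter`, so calls like `logger.add_audio(...)` work directly on the logger. The delegation pattern in isolation (toy classes, not ignite's actual ones):

```python
class Writer:
    """Stand-in for SummaryWriter with one logging method."""
    def add_audio(self, tag, data, step):
        return f"logged {tag} at step {step}"

class Logger:
    def __init__(self):
        self.writer = Writer()

    def __getattr__(self, attr):
        # Invoked only when normal attribute lookup fails,
        # so self.writer (set in __init__) never recurses here.
        return getattr(self.writer, attr)

logger = Logger()
print(logger.add_audio("mixture", b"...", 3))  # logged mixture at step 3
```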
mosaicml__composer-79 | Add Colab Example
* Add Example Jupyter notebook to the examples folder
* Add "Open in Colab" to the README.md
| [
{
"content": "# Copyright 2021 MosaicML. All Rights Reserved.\n\nimport os\nimport sys\n\nimport setuptools\nfrom setuptools import setup\n\n\ndef package_files(directory):\n # from https://stackoverflow.com/a/36693250\n paths = []\n for (path, directories, filenames) in os.walk(directory):\n fo... | [
{
"content": "# Copyright 2021 MosaicML. All Rights Reserved.\n\nimport os\nimport sys\n\nimport setuptools\nfrom setuptools import setup\n\n\ndef package_files(directory):\n # from https://stackoverflow.com/a/36693250\n paths = []\n for (path, directories, filenames) in os.walk(directory):\n fo... | diff --git a/README.md b/README.md
index 8caf8f42ca..d50b6b9df0 100644
--- a/README.md
+++ b/README.md
@@ -7,11 +7,18 @@ The library features:
* Standardized approach to implement and compose efficiency methods, extended from two-way callbacks ([Howard et al, 2020](https://arxiv.org/abs/2002.04688))
* Easy way to access our methods either directly for your trainer loops, or through the MosaicML Trainer.
+[](https://colab.research.google.com/github/mosaicml/composer/blob/main/examples/composer.ipynb)
+
+
+## Installing Composer
+
To install `Composer`:
```
pip install mosaicml
```
+## Using Composer
+
A few ways to use `Composer`:
1. Import the functional form of our methods:
diff --git a/examples/composer.ipynb b/examples/composer.ipynb
new file mode 100644
index 0000000000..91544e8826
--- /dev/null
+++ b/examples/composer.ipynb
@@ -0,0 +1,104 @@
+{
+ "cells": [
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "tags": [
+ "requirements"
+ ]
+ },
+ "outputs": [],
+ "source": [
+ "!pip install mosaicml"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "tags": [
+ "imports"
+ ]
+ },
+ "outputs": [],
+ "source": [
+ "import torch\n",
+ "\n",
+ "from composer import trainer, algorithms"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "tags": [
+ "hparams"
+ ]
+ },
+ "outputs": [],
+ "source": [
+ "if torch.cuda.is_available():\n",
+ " trainer_hparams = trainer.load(\"classify_mnist\")\n",
+ "else:\n",
+ " trainer_hparams = trainer.load(\"classify_mnist_cpu\")\n",
+ "\n",
+ "trainer_hparams.algorithms = algorithms.load_multiple(\n",
+ " \"blurpool\",\n",
+ " \"scale_schedule\")\n",
+ "trainer_hparams.set_datadir(\"~/datasets\")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "tags": [
+ "trainer"
+ ]
+ },
+ "outputs": [],
+ "source": [
+ "mosaic_trainer = trainer_hparams.initialize_object()"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "tags": [
+ "train"
+ ]
+ },
+ "outputs": [],
+ "source": [
+ "mosaic_trainer.fit()"
+ ]
+ }
+ ],
+ "metadata": {
+ "celltoolbar": "Tags",
+ "interpreter": {
+ "hash": "40ad569553f4172ee5f9f9f1cdecfe3a03f28f5ebfb04d4146b885c5108ed381"
+ },
+ "kernelspec": {
+ "display_name": "Python 3 (ipykernel)",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.9.7"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/setup.py b/setup.py
index 0040c525b9..bd31c35993 100755
--- a/setup.py
+++ b/setup.py
@@ -49,6 +49,7 @@ def package_files(directory):
'sphinxcontrib.katex>=0.8.6',
'sphinxext.opengraph>=0.4.2',
'sphinx_rtd_theme>=1.0.0',
+ 'testbook>=0.4.2',
'myst-parser>=0.15.2',
]
extra_deps['wandb'] = ['wandb>=0.12.2']
diff --git a/tests/examples/__init__.py b/tests/examples/__init__.py
new file mode 100644
index 0000000000..0929d967ff
--- /dev/null
+++ b/tests/examples/__init__.py
@@ -0,0 +1 @@
+# Copyright 2021 MosaicML. All Rights Reserved.
diff --git a/tests/examples/test_composer_ipynb.py b/tests/examples/test_composer_ipynb.py
new file mode 100644
index 0000000000..ea594e6125
--- /dev/null
+++ b/tests/examples/test_composer_ipynb.py
@@ -0,0 +1,22 @@
+# Copyright 2021 MosaicML. All Rights Reserved.
+
+import os
+
+import pytest
+import testbook
+import testbook.client
+
+import composer
+
+examples_path = os.path.join(os.path.dirname(composer.__file__), '..', 'examples')
+
+
+@testbook.testbook(os.path.join(examples_path, 'composer.ipynb'))
+@pytest.mark.timeout(120) # long timeout to download the dataset (if needed) and train one epoch
+def test_composer_notebook(tb: testbook.client.TestbookNotebookClient):
+ tb.execute_cell("imports")
+ tb.execute_cell("hparams")
+ tb.inject("trainer_hparams.max_epochs = 1")
+ tb.execute_cell("trainer")
+ assert tb.get('mosaic_trainer').state.max_epochs == 1
+ tb.execute_cell("train")
|
beeware__toga-928 | toga-demo alias doesn't work on Windows
## Expected Behavior
Examples in the documentation should work. I have to specify version 0.2.15 for anything to run properly - the normal pip installation of toga installs dev builds that do not work.
## Current Behavior
They all fail with various errors of missing items, etc.
```
C:\Users\bubth\Development\togatest> pip install --pre toga-demo
Collecting toga-demo
Downloading https://files.pythonhosted.org/packages/33/05/61d94bccdfe6831eb60fc59cd79c60d7780983d07df984d82e2a8f298b8b
/toga_demo-0.3.0.dev19-py3-none-any.whl (616kB)
|████████████████████████████████| 624kB 819kB/s
Collecting toga==0.3.0.dev18 (from toga-demo)
Downloading https://files.pythonhosted.org/packages/9c/cd/4ec127b063c9b1c6f045791e7613e05247dc30e0cb817bccf09de9377ecf
/toga-0.3.0.dev18-py3-none-any.whl
Collecting toga-winforms==0.3.0.dev18; sys_platform == "win32" (from toga==0.3.0.dev18->toga-demo)
Downloading https://files.pythonhosted.org/packages/81/67/6e16ddc4c4286a4b6f08005c66006524e305c3befca01df34f509ef76202
/toga_winforms-0.3.0.dev18-py3-none-any.whl
Collecting toga-core==0.3.0.dev18 (from toga-winforms==0.3.0.dev18; sys_platform == "win32"->toga==0.3.0.dev18->toga-dem
o)
/toga_core-0.3.0.dev18-py3-none-any.whl (512kB)
|████████████████████████████████| 522kB 6.8MB/s
Requirement already satisfied: pythonnet in c:\program files\python37\lib\site-packages (from toga-winforms==0.3.0.dev18Requirement already satisfied: importlib-metadata; python_version < "3.8" in c:\users\bubth\appdata\roaming\python\pythotoga-demo) (0.18)
Collecting travertino>=0.1.0 (from toga-core==0.3.0.dev18->toga-winforms==0.3.0.dev18; sys_platform == "win32"->toga==0.3.0.dev18->toga-demo)
Downloading https://files.pythonhosted.org/packages/4c/78/b33e38d372707fbf2c461d1bde6797a12c8d20f97279db63cb57dc24eacb/travertino-0.1.3-py3-none-any.whl
Requirement already satisfied: zipp>=0.5 in c:\users\bubth\appdata\roaming\python\python37\site-packages (from importlib-metadata; python_version < "3.8"->toga-core==0.3.0.dev18->toga-winforms==0.3.0.dev18; sys_platform == "win32"->toga==0.3.0.dev18->toga-demo) (0.5.2)
Installing collected packages: travertino, toga-core, toga-winforms, toga, toga-demo
Found existing installation: toga-core 0.2.15
Uninstalling toga-core-0.2.15:
Successfully uninstalled toga-core-0.2.15
Found existing installation: toga-winforms 0.2.15
Uninstalling toga-winforms-0.2.15:
Successfully uninstalled toga-winforms-0.2.15
Found existing installation: toga 0.2.15
Uninstalling toga-0.2.15:
Successfully uninstalled toga-0.2.15
Successfully installed toga-0.3.0.dev18 toga-core-0.3.0.dev18 toga-demo-0.3.0.dev19 toga-winforms-0.3.0.dev18 travertino-0.1.3
WARNING: You are using pip version 19.2.1, however version 20.1.1 is available.
You should consider upgrading via the 'python -m pip install --upgrade pip' command.
C:\Users\bubth\Development\togatest> python --versoin
unknown option --versoin
usage: C:\Program Files\Python37\python.exe [option] ... [-c cmd | -m mod | file | -] [arg] ...
Try `python -h' for more information.
C:\Users\bubth\Development\togatest> python --version
Python 3.7.3
C:\Users\bubth\Development\togatest> toga-demo
Traceback (most recent call last):
File "c:\program files\python37\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "c:\program files\python37\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\Program Files\Python37\Scripts\toga-demo.exe\__main__.py", line 9, in <module>
File "c:\program files\python37\lib\site-packages\toga_demo\__main__.py", line 5, in run
main().main_loop()
File "c:\program files\python37\lib\site-packages\toga_demo\app.py", line 98, in main
return TogaDemo('Toga Demo', 'org.beeware.toga-demo')
File "c:\program files\python37\lib\site-packages\toga\app.py", line 184, in __init__
self.icon = 'resources/{app_name}'.format(app_name=self.app_name)
File "c:\program files\python37\lib\site-packages\toga\app.py", line 317, in icon
self._icon.bind(self.factory)
File "c:\program files\python37\lib\site-packages\toga\icons.py", line 41, in bind
resource_path = factory.paths.app
File "c:\program files\python37\lib\site-packages\toga_winforms\paths.py", line 10, in app
return Path(sys.modules[App.app.module_name].__file__).parent
KeyError: ''
C:\Users\bubth\Development\togatest>
```
```
Traceback (most recent call last):
File ".\test.py", line 2, in <module>
from toga.style.pack import Pack, ROW, CENTER, COLUMN
ModuleNotFoundError: No module named 'toga.style'
```
```
C:\Users\bubth\Development\togatest> python .\test.py
Traceback (most recent call last):
File ".\test.py", line 24, in <module>
main().main_loop()
File "C:\Program Files\Python37\lib\site-packages\toga_winforms\app.py", line 49, in main_loop
self._startup()
File "C:\Program Files\Python37\lib\site-packages\toga_winforms\app.py", line 41, in _startup
self.startup()
File "C:\Program Files\Python37\lib\site-packages\toga\interface\app.py", line 144, in startup
self.main_window.content = self._startup_method(self)
File ".\test.py", line 9, in build
box = toga.Box()
File "C:\Program Files\Python37\lib\site-packages\toga_winforms\widgets\box.py", line 10, in __init__
super().__init__(id=id, style=style, children=children)
File "C:\Program Files\Python37\lib\site-packages\toga\interface\widgets\box.py", line 21, in __init__
super().__init__(id=id, style=style, children=children)
File "C:\Program Files\Python37\lib\site-packages\toga\interface\widgets\base.py", line 144, in __init__
self.style = CSS()
File "C:\Program Files\Python37\lib\site-packages\toga\interface\widgets\base.py", line 170, in style
self._style = value.bind(self)
AttributeError: 'CSS' object has no attribute 'bind'
```
## Steps to reproduce
<!--- Provide a set of steps describing how to reproduce this bug. If you have a live example, provide the link below -->
1. Be on windows
2. install toga
3. Follow the browser tutorial or hello world tutorial
## Your Environment
<!--- Provide details on your current environment you found the bug in -->
* Python Version (list the specific version number)
```
C:\Users\bubth\Development\togatest> python --version
Python 3.7.3
```
* Operating System and Version (select from the following and list the specific version number; if your OS is not listed, list that as well)
```
OS Name Microsoft Windows 10 Pro
Version 10.0.19041 Build 19041
Other OS Description Not Available
OS Manufacturer Microsoft Corporation
System Name LAPPYTOPPY
System Manufacturer Micro-Star International Co., Ltd.
System Model GP73 Leopard 8RF
System Type x64-based PC
System SKU 17C5.1
Processor Intel(R) Core(TM) i7-8750H CPU @ 2.20GHz, 2201 Mhz, 6 Core(s), 12 Logical Processor(s)
BIOS Version/Date American Megatrends Inc. E17C5IMS.10A, 7/13/2018
SMBIOS Version 3.1
Embedded Controller Version 255.255
BIOS Mode UEFI
BaseBoard Manufacturer Micro-Star International Co., Ltd.
BaseBoard Product MS-17C5
BaseBoard Version REV:1.0
Platform Role Mobile
Secure Boot State On
PCR7 Configuration Elevation Required to View
Windows Directory C:\WINDOWS
System Directory C:\WINDOWS\system32
Boot Device \Device\HarddiskVolume3
Locale United States
Hardware Abstraction Layer Version = "10.0.19041.1"
User Name LAPPYTOPPY\bubth
Time Zone Mountain Daylight Time
Installed Physical Memory (RAM) 16.0 GB
Total Physical Memory 15.8 GB
Available Physical Memory 4.19 GB
Total Virtual Memory 18.2 GB
Available Virtual Memory 4.69 GB
Page File Space 2.38 GB
Page File C:\pagefile.sys
Kernel DMA Protection Off
Virtualization-based security Running
Virtualization-based security Required Security Properties
Virtualization-based security Available Security Properties Base Virtualization Support, Secure Boot, DMA Protection, SMM Security Mitigations 1.0, Mode Based Execution Control
Virtualization-based security Services Configured
Virtualization-based security Services Running
Device Encryption Support Elevation Required to View
A hypervisor has been detected. Features required for Hyper-V will not be displayed.
```
* Toga Version (list the specific version number or git hash)
```
C:\Users\bubth\Development\togatest> python
Python 3.7.3 (v3.7.3:ef4ec6ed12, Mar 25 2019, 22:22:05) [MSC v.1916 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import toga
>>> toga.__version__
'0.3.0.dev18'
```
* Toga Target (the type of app you are trying to generate)
- [ ] android
- [ ] cocoa
- [ ] django
- [ ] gtk
- [ ] iOS
- [ ] tvOS
- [ ] watchOS
- [x] winforms
- [ ] win32
- [ ] Other (please specify)
| [
{
"content": "#!/usr/bin/env python\nimport io\n\nfrom setuptools import setup, find_packages\n\n\nwith io.open('README.rst', encoding='utf8') as readme:\n long_description = readme.read()\n\n\nsetup(\n name='toga-demo',\n version='0.3.0.dev20',\n description='A demonstration of the capabilities of ... | [
{
"content": "#!/usr/bin/env python\nimport io\n\nfrom setuptools import setup, find_packages\n\n\nwith io.open('README.rst', encoding='utf8') as readme:\n long_description = readme.read()\n\n\nsetup(\n name='toga-demo',\n version='0.3.0.dev20',\n description='A demonstration of the capabilities of ... | diff --git a/demo/pyproject.toml b/demo/pyproject.toml
index 29077abdbe..abf54d10f8 100644
--- a/demo/pyproject.toml
+++ b/demo/pyproject.toml
@@ -1,5 +1,3 @@
-[build-system]
-requires = ["briefcase"]
[tool.briefcase]
project_name = "Toga Demo"
diff --git a/demo/setup.py b/demo/setup.py
index c8776abf61..1e6ac8011e 100644
--- a/demo/setup.py
+++ b/demo/setup.py
@@ -23,7 +23,7 @@
'toga_demo': ['resources/*.icns', 'resources/*.png'],
},
install_requires=[
- 'toga==0.3.0.dev18'
+ 'toga==0.3.0.dev20'
],
entry_points={
'console_scripts': [
|
streamlit__streamlit-6348 | experimental_get_query_params won't work before rerun
### Checklist
- [X] I have searched the [existing issues](https://github.com/streamlit/streamlit/issues) for similar issues.
- [X] I added a very descriptive title to this issue.
- [X] I have provided sufficient information below to help reproduce this issue.
### Summary
The user cannot get the correct query params before a rerun.
### Reproducible Code Example
```Python
import streamlit as st
st.experimental_set_query_params(param=3)
st.write(st.experimental_get_query_params())
```
### Steps To Reproduce
Run script, `{"param ": 3}` will not appear at first time until rerun script after querystring in browser already changed.
### Expected Behavior
Show `{"param ": 3}`
### Current Behavior
show empty dict
### Is this a regression?
- [X] Yes, this used to work in a previous version.
### Debug info
- Streamlit version: 1.20.0
- Python version: 3.10.6
- Operating System: Linux
- Browser: Chrome
- Virtual environment: None
### Additional Information
In previous versions, `set_query_params` would set `ctx.query_string = parse.urlencode(query_params, doseq=True)` immediately.
But in 1.20 this line was removed, while `get_query_params` still gets it from `ctx.query_string`.
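A minimal, framework-free sketch of the fix this describes (names are simplified stand-ins for Streamlit's script-run context, not its real API): mirroring the encoded query string into the context at set time lets a same-run `get` observe the new value.

```python
from urllib import parse

class Ctx:
    """Stand-in for the script-run context holding the query string."""
    query_string = ""

ctx = Ctx()

def set_query_params(**params):
    encoded = parse.urlencode(params, doseq=True)
    # the one-line fix: update the context immediately instead of
    # waiting for the browser to send the new query string on rerun
    ctx.query_string = encoded

def get_query_params():
    return parse.parse_qs(ctx.query_string)

set_query_params(param=3)
print(get_query_params())  # {'param': ['3']}
```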
### Are you willing to submit a PR?
- [x] Yes, I am willing to submit a PR!
| [
{
"content": "# Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022)\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2... | [
{
"content": "# Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022)\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2... | diff --git a/lib/streamlit/commands/query_params.py b/lib/streamlit/commands/query_params.py
index b15e753aa1d6..7b632f3391ed 100644
--- a/lib/streamlit/commands/query_params.py
+++ b/lib/streamlit/commands/query_params.py
@@ -97,6 +97,7 @@ def set_query_params(**query_params: Any) -> None:
msg.page_info_changed.query_string = _ensure_no_embed_params(
query_params, ctx.query_string
)
+ ctx.query_string = msg.page_info_changed.query_string
ctx.enqueue(msg)
diff --git a/lib/tests/streamlit/streamlit_test.py b/lib/tests/streamlit/streamlit_test.py
index 168fde6b07c8..c6cd234b8e18 100644
--- a/lib/tests/streamlit/streamlit_test.py
+++ b/lib/tests/streamlit/streamlit_test.py
@@ -693,6 +693,13 @@ def test_set_query_params_exceptions(self):
with self.assertRaises(StreamlitAPIException):
st.experimental_set_query_params(embed_options="show_colored_line")
+ def test_get_query_params_after_set_query_params(self):
+ """Test valid st.set_query_params sends protobuf message."""
+ p_set = dict(x=["a"])
+ st.experimental_set_query_params(**p_set)
+ p_get = st.experimental_get_query_params()
+ self.assertEqual(p_get, p_set)
+
@parameterized.expand([(st.error,), (st.warning,), (st.info,), (st.success,)])
def test_st_alert_exceptions(self, alert_func):
"""Test that alert functions throw an exception when a non-emoji is given as an icon."""
|
EleutherAI__gpt-neox-1024 | 'attention.bias' and 'attention.masked_bias' not in `hf_layer.state_dict()` when converting gpt-neox model to huggingface
**Describe the bug**
A clear and concise description of what the bug is.
I encounter the following error when I am converting GPTNeoX models to Huggingface using the `tools/convert_module_to_hf.py` script.
```
(gpt-neox) johnny@ink-lucy:~/gpt-neox$ bash haveibeentrainedon/wikitext/pilot/convert_to_hf.sh
[2023-08-18 23:37:21,695] [INFO] [real_accelerator.py:133:get_accelerator] Setting ds_accelerator to cuda (auto detect)
> building GPT2BPETokenizer tokenizer ...
> padded vocab (size: 50257) with 47 dummy tokens (new size: 50304)
Saving weights in fp16 precision...
0%| | 0/24 [00:00<?, ?it/s]
Traceback (most recent call last):
File "./tools/convert_module_to_hf.py", line 307, in <module>
hf_model = convert(args.input_dir, loaded_config, args.output_dir)
File "./tools/convert_module_to_hf.py", line 230, in convert
state_dict["attention.bias"] = hf_layer.state_dict()["attention.bias"]
KeyError: 'attention.bias'
```
**Expected behavior**
Successful conversion.
**Proposed solution**
If you comment out lines 230 and 231, the script will run through. From an eyeballing of the results, it doesn't seem like language modelling performance seriously degraded. Could this be some code that was supposed to be taken out?
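A hedged sketch of the guarded copy the proposed solution implies (plain dicts stand in for the tensors; the real call sites use `hf_layer.state_dict()`): copy the bias buffers only when the Hugging Face layer still exposes them.

```python
def copy_optional_buffers(state_dict, hf_state_dict,
                          keys=("attention.bias", "attention.masked_bias")):
    """Copy buffers only if the target layer still registers them
    (newer transformers releases dropped these attention buffers)."""
    for key in keys:
        if key in hf_state_dict:
            state_dict[key] = hf_state_dict[key]
    return state_dict

# stand-in dicts; no KeyError whether or not the buffers exist
print(copy_optional_buffers({}, {"attention.bias": 0.5}))
print(copy_optional_buffers({}, {}))  # {}
```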
**Additional context**
This is for a model trained with the config `configs/pythia/410m.yml`
| [
{
"content": "# Copyright (c) 2021, EleutherAI\n# This file is based on code by the authors denoted below and has been modified from its original version.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain... | [
{
"content": "# Copyright (c) 2021, EleutherAI\n# This file is based on code by the authors denoted below and has been modified from its original version.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain... | diff --git a/configs/neox_arguments.md b/configs/neox_arguments.md
index b8367075a..f7cc3f084 100644
--- a/configs/neox_arguments.md
+++ b/configs/neox_arguments.md
@@ -111,7 +111,7 @@ Logging Arguments
- **git_hash**: str
- Default = 16485ee
+ Default = 7bdda99
current git hash of repository
diff --git a/requirements/requirements.txt b/requirements/requirements.txt
index 3f3a70882..88e49f073 100644
--- a/requirements/requirements.txt
+++ b/requirements/requirements.txt
@@ -12,4 +12,4 @@ sentencepiece
six
tiktoken>=0.1.2
tokenizers>=0.12.1
-transformers>=4.24.0
+transformers==4.30.2
diff --git a/tools/corpora.py b/tools/corpora.py
index b9e846454..35977b908 100644
--- a/tools/corpora.py
+++ b/tools/corpora.py
@@ -290,7 +290,7 @@ class C4OpenWebText(DataDownloader):
class Enwik8(DataDownloader):
name = "enwik8"
- urls = ["https://data.deepai.org/enwik8.zip"]
+ urls = ["http://mattmahoney.net/dc/enwik8.zip"]
def maybe_download_gpt2_tokenizer_data(tokenizer_type, data_dir):
|
scoutapp__scout_apm_python-583 | Support Python 3.9
Python 3.9 will be released 2020-10-05.
Here are some steps before its release:
* Start testing with prerelease
After release:
* Ensure tests run with released version
* Add 3.9 PyPI classifier
* Enable Python wheel building in release
| [
{
"content": "# coding=utf-8\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport os\nimport sys\n\nfrom setuptools import Extension, find_packages, setup\n\nwith open(\"README.md\", \"r\") as fp:\n long_description = fp.read()\n\npackages = find_packages(\"src\")\nif... | [
{
"content": "# coding=utf-8\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport os\nimport sys\n\nfrom setuptools import Extension, find_packages, setup\n\nwith open(\"README.md\", \"r\") as fp:\n long_description = fp.read()\n\npackages = find_packages(\"src\")\nif... | diff --git a/.github/workflows/main.yml b/.github/workflows/main.yml
index 88366fde..4e3aa315 100644
--- a/.github/workflows/main.yml
+++ b/.github/workflows/main.yml
@@ -21,6 +21,7 @@ jobs:
- 3.6
- 3.7
- 3.8
+ - 3.9
services:
elasticsearch:
@@ -47,7 +48,7 @@ jobs:
steps:
- uses: actions/checkout@v2
- - uses: actions/setup-python@v2.1.1
+ - uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Upgrade packaging tools
@@ -56,5 +57,5 @@ jobs:
run: python -m pip install --upgrade tox
- name: Run tox targets for ${{ matrix.python-version }}
run: |
- ENV_PREFIX=$(tr -d "." <<< "py${{ matrix.python-version }}-")
- TOXENV=$(tox --listenvs | grep $ENV_PREFIX | tr '\n' ',') python -m tox
+ ENV_PREFIX=$(tr -C -d "0-9" <<< "${{ matrix.python-version }}")
+ TOXENV=$(tox --listenvs | grep "^py$ENV_PREFIX" | tr '\n' ',') python -m tox
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 95da24af..cb811ebf 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,5 +1,12 @@
# Changelog
+## Pending
+
+### Added
+
+- Support Python 3.9.
+ ([PR #583](https://github.com/scoutapp/scout_apm_python/pull/583))
+
## [2.16.2] 2020-09-17
- Moved core agent on Linux to default to the musl version, rather than try
diff --git a/setup.py b/setup.py
index 2e6caef2..f33ae372 100644
--- a/setup.py
+++ b/setup.py
@@ -98,5 +98,6 @@
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
+ "Programming Language :: Python :: 3.9",
],
)
diff --git a/tests/compat.py b/tests/compat.py
index 35dd36cf..72c2b416 100644
--- a/tests/compat.py
+++ b/tests/compat.py
@@ -44,4 +44,17 @@ def nullcontext(obj):
yield obj
-__all__ = ["mock", "nullcontext", "TemporaryDirectory"]
+if sys.version_info >= (3, 4):
+ from contextlib import suppress
+else:
+ from contextlib import contextmanager
+
+ @contextmanager
+ def suppress(*exceptions):
+ try:
+ yield
+ except exceptions:
+ pass
+
+
+__all__ = ["mock", "nullcontext", "suppress", "TemporaryDirectory"]
diff --git a/tests/integration/django_app.py b/tests/integration/django_app.py
index 078186b5..4ec7d181 100644
--- a/tests/integration/django_app.py
+++ b/tests/integration/django_app.py
@@ -1,11 +1,15 @@
# coding: utf-8
from __future__ import absolute_import, division, print_function, unicode_literals
+import sys
+
import django
import wrapt
from django.conf import settings
from django.template.response import TemplateResponse
+from tests.compat import suppress
+
config = {
"ALLOWED_HOSTS": ["*"],
"DATABASES": {
@@ -118,22 +122,22 @@ def sql_kwargs(request):
def sql_type_errors(request):
with connection.cursor() as cursor:
- try:
+ with suppress(TypeError):
cursor.execute()
- except TypeError:
- pass
- try:
+
+ if sys.version_info >= (3, 9):
+ exc_type = TypeError
+ else:
+ exc_type = ValueError
+
+ with suppress(exc_type):
cursor.execute(sql=None)
- except ValueError:
- pass
- try:
+
+ with suppress(TypeError):
cursor.executemany()
- except TypeError:
- pass
- try:
+
+ with suppress(TypeError):
cursor.executemany(sql=None, param_list=[(1,)])
- except TypeError:
- pass
return HttpResponse("Done")
diff --git a/tox.ini b/tox.ini
index abf2debe..d16a6c1a 100644
--- a/tox.ini
+++ b/tox.ini
@@ -6,9 +6,9 @@ envlist =
{py27,py34,py35,py36}-django111
{py34,py35,py36,py37,py38}-django20
{py35,py36,py37,py38}-django21
- {py35,py36,py37,py38}-django22
- {py36,py37,py38}-django30
- {py36,py37,py38}-django31
+ {py35,py36,py37,py38,py39}-django22
+ {py36,py37,py38,py39}-django30
+ {py36,py37,py38,py39}-django31
[testenv]
passenv =
|
rlworkgroup__garage-714 | Cannot plot during training
```
from garage.experiment import LocalRunner, run_experiment
from garage.np.baselines import LinearFeatureBaseline
from garage.tf.algos import TRPO
from garage.tf.envs import TfEnv
from garage.tf.policies import CategoricalMLPPolicy
def run_task(*_):
with LocalRunner() as runner:
env = TfEnv(env_name='CartPole-v1')
policy = CategoricalMLPPolicy(
name='policy', env_spec=env.spec, hidden_sizes=(32, 32))
baseline = LinearFeatureBaseline(env_spec=env.spec)
algo = TRPO(
env_spec=env.spec,
policy=policy,
baseline=baseline,
max_path_length=100,
discount=0.99,
max_kl_step=0.01)
runner.setup(algo, env)
runner.train(n_epochs=100, batch_size=4000,plot=True)
run_experiment(
run_task,
snapshot_mode='last',
seed=4,
n_parallel=4,
plot=True,
use_tf=False,
use_gpu=False
)
```
##########################################################
3) Why was viskit removed? I cannot find it in garage.
Thanks! I really like rllab and garage.
| [
{
"content": "\"\"\"\nThe local runner for tensorflow algorithms.\n\nA runner setup context for algorithms during initialization and\npipelines data between sampler and algorithm during training.\n\"\"\"\nimport copy\nimport time\nfrom types import SimpleNamespace\n\nfrom dowel import logger, tabular\nimport te... | [
{
"content": "\"\"\"\nThe local runner for tensorflow algorithms.\n\nA runner setup context for algorithms during initialization and\npipelines data between sampler and algorithm during training.\n\"\"\"\nimport copy\nimport time\nfrom types import SimpleNamespace\n\nfrom dowel import logger, tabular\nimport te... | diff --git a/src/garage/experiment/local_tf_runner.py b/src/garage/experiment/local_tf_runner.py
index 8ecc20931c..5573f7fcfe 100644
--- a/src/garage/experiment/local_tf_runner.py
+++ b/src/garage/experiment/local_tf_runner.py
@@ -388,6 +388,7 @@ def _train(self,
pause_for_plot=pause_for_plot,
start_epoch=start_epoch)
+ self.plot = plot
self.start_worker()
self.start_time = time.time()
diff --git a/tests/garage/experiment/test_local_tf_runner.py b/tests/garage/experiment/test_local_tf_runner.py
index 9b5c31e1aa..0715775158 100644
--- a/tests/garage/experiment/test_local_tf_runner.py
+++ b/tests/garage/experiment/test_local_tf_runner.py
@@ -5,6 +5,7 @@
from garage.sampler import singleton_pool
from garage.tf.algos import VPG
from garage.tf.envs import TfEnv
+from garage.tf.plotter import Plotter
from garage.tf.policies import CategoricalMLPPolicy
from garage.tf.samplers import BatchSampler
from tests.fixtures import TfGraphTestCase
@@ -45,7 +46,6 @@ def test_batch_sampler(self):
policy=policy,
baseline=baseline,
max_path_length=1,
- whole_paths=True,
discount=0.99)
runner.setup(
@@ -62,7 +62,8 @@ def test_batch_sampler(self):
runner.start_worker()
- paths = runner.sampler.obtain_samples(0, 8)
+ paths = runner.sampler.obtain_samples(
+ 0, batch_size=8, whole_paths=True)
self.assertGreaterEqual(
len(paths), max_cpus, 'BatchSampler should sample more than '
'max_cpus={} trajectories'.format(max_cpus))
@@ -103,3 +104,27 @@ def test_external_sess(self):
pass
# sess should still be the default session here.
tf.no_op().run()
+
+ def test_set_plot(self):
+ with LocalRunner() as runner:
+ env = TfEnv(env_name='CartPole-v1')
+
+ policy = CategoricalMLPPolicy(
+ name='policy', env_spec=env.spec, hidden_sizes=(8, 8))
+
+ baseline = LinearFeatureBaseline(env_spec=env.spec)
+
+ algo = VPG(
+ env_spec=env.spec,
+ policy=policy,
+ baseline=baseline,
+ max_path_length=100,
+ discount=0.99,
+ optimizer_args=dict(
+ tf_optimizer_args=dict(learning_rate=0.01, )))
+
+ runner.setup(algo, env)
+ runner.train(n_epochs=1, batch_size=100, plot=True)
+
+ assert isinstance(runner.plotter, Plotter), (
+ 'self.plotter in LocalRunner should be set to Plotter.')
|
python-pillow__Pillow-4455 | PIL cannot read JPEG comment
### What did you do?
I want PIL to read the JPEG comment (marker: 0xFF 0xFE).
I took an image with an attached JPEG comment - verified with exiftools & IrfanView to exist.
```python
from PIL import Image, JpegImagePlugin
pic = Image.open(<path_to_pic_with_JPEG_comment>)
print(pic.info)
```
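As background on the 0xFF 0xFE marker mentioned above, here is a rough standard-library sketch (not a full JPEG parser; fill bytes and some standalone markers are ignored) of scanning raw bytes for COM segments:

```python
import struct

def jpeg_comments(data: bytes) -> list:
    """Collect the payloads of COM (0xFFFE) segments from JPEG bytes."""
    comments = []
    i = 2  # skip the SOI marker (0xFFD8)
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            break  # not at a marker; bail out of this rough sketch
        marker = data[i + 1]
        # SOI/EOI/RSTn are standalone markers with no length field
        if marker in (0xD8, 0xD9) or 0xD0 <= marker <= 0xD7:
            i += 2
            continue
        (length,) = struct.unpack(">H", data[i + 2:i + 4])
        if marker == 0xFE:  # COM: the comment marker from this report
            comments.append(data[i + 4:i + 2 + length])
        if marker == 0xDA:  # SOS: entropy-coded data follows; stop
            break
        i += 2 + length
    return comments

# tiny synthetic byte string: SOI, COM("hello"), EOI
blob = b"\xff\xd8" + b"\xff\xfe" + struct.pack(">H", 7) + b"hello" + b"\xff\xd9"
print(jpeg_comments(blob))  # [b'hello']
```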
### What did you expect to happen?
Show the JPEG comment in the dict.
### What actually happened?
> {'jfif': 257, 'jfif_version': (1, 1), 'dpi': (96, 96), 'jfif_unit': 1, 'jfif_density': (96, 96), 'exif': b'...'}
### What are your OS, Python and Pillow versions?
* OS: W7x64
* Python: Python 3.8.1 x64
* Pillow: Pillow 7.0.0
I cannot attach an image via github ("Something went really wrong, ..."), so here is the file (5.61 KiB) (I downloaded it and verified it's byte-identical to the uploaded one): [](https://postimg.cc/BLrFc0kf)
| [
{
"content": "#\n# The Python Imaging Library.\n# $Id$\n#\n# JPEG (JFIF) file handling\n#\n# See \"Digital Compression and Coding of Continuous-Tone Still Images,\n# Part 1, Requirements and Guidelines\" (CCITT T.81 / ISO 10918-1)\n#\n# History:\n# 1995-09-09 fl Created\n# 1995-09-13 fl Added full parser\n#... | [
{
"content": "#\n# The Python Imaging Library.\n# $Id$\n#\n# JPEG (JFIF) file handling\n#\n# See \"Digital Compression and Coding of Continuous-Tone Still Images,\n# Part 1, Requirements and Guidelines\" (CCITT T.81 / ISO 10918-1)\n#\n# History:\n# 1995-09-09 fl Created\n# 1995-09-13 fl Added full parser\n#... | diff --git a/Tests/test_file_jpeg.py b/Tests/test_file_jpeg.py
index f13536d5868..33045122891 100644
--- a/Tests/test_file_jpeg.py
+++ b/Tests/test_file_jpeg.py
@@ -60,6 +60,8 @@ def test_app(self):
)
assert len(im.applist) == 2
+ assert im.info["comment"] == b"File written by Adobe Photoshop\xa8 4.0\x00"
+
def test_cmyk(self):
# Test CMYK handling. Thanks to Tim and Charlie for test data,
# Michael for getting me to look one more time.
diff --git a/docs/handbook/image-file-formats.rst b/docs/handbook/image-file-formats.rst
index 7ce685ed2a3..18f547a498c 100644
--- a/docs/handbook/image-file-formats.rst
+++ b/docs/handbook/image-file-formats.rst
@@ -298,6 +298,11 @@ The :py:meth:`~PIL.Image.Image.open` method may set the following
**exif**
Raw EXIF data from the image.
+**comment**
+ A comment about the image.
+
+ .. versionadded:: 7.1.0
+
The :py:meth:`~PIL.Image.Image.save` method supports the following options:
diff --git a/docs/releasenotes/7.1.0.rst b/docs/releasenotes/7.1.0.rst
index 1369177d26e..e3bc107ddff 100644
--- a/docs/releasenotes/7.1.0.rst
+++ b/docs/releasenotes/7.1.0.rst
@@ -18,6 +18,15 @@ been resolved.
im = Image.open("hopper.jpg")
im.save("out.jpg", quality=0)
+API Additions
+=============
+
+Reading JPEG comments
+^^^^^^^^^^^^^^^^^^^^^
+
+When opening a JPEG image, the comment may now be read into
+:py:attr:`~PIL.Image.Image.info`.
+
Other Changes
=============
diff --git a/src/PIL/JpegImagePlugin.py b/src/PIL/JpegImagePlugin.py
index 229eac2141e..2aa029efbff 100644
--- a/src/PIL/JpegImagePlugin.py
+++ b/src/PIL/JpegImagePlugin.py
@@ -176,6 +176,7 @@ def COM(self, marker):
n = i16(self.fp.read(2)) - 2
s = ImageFile._safe_read(self.fp, n)
+ self.info["comment"] = s
self.app["COM"] = s # compatibility
self.applist.append(("COM", s))
|
feast-dev__feast-244 | Feast cli config file should be settable by an env var
**Is your feature request related to a problem? Please describe.**
If I have multiple feast instances, I want to be able to set different .feast files to configure the CLI.
**Describe the solution you'd like**
export FEAST_CONFIG=path/to/feast/configfile
it should default to ~/.feast
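A minimal sketch of the requested lookup (the helper name is hypothetical; Feast's actual config module differs): `FEAST_CONFIG` wins when set, `~/.feast` is the default.

```python
import os

def feast_config_dir() -> str:
    """Hypothetical helper: FEAST_CONFIG wins, ~/.feast is the fallback."""
    default = os.path.join(os.path.expanduser("~"), ".feast")
    return os.environ.get("FEAST_CONFIG", default)

os.environ["FEAST_CONFIG"] = "/tmp/feast-instance-a/.feast"
print(feast_config_dir())  # /tmp/feast-instance-a/.feast
```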
| [
{
"content": "#\n# Copyright 2019 The Feast Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless requ... | [
{
"content": "#\n# Copyright 2019 The Feast Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless requ... | diff --git a/sdk/python/feast/config.py b/sdk/python/feast/config.py
index 77867cbaa4d..9b6a6dd4d83 100644
--- a/sdk/python/feast/config.py
+++ b/sdk/python/feast/config.py
@@ -28,7 +28,7 @@
feast_configuration_properties = {"core_url": "URL", "serving_url": "URL"}
-CONFIGURATION_FILE_DIR = ".feast"
+CONFIGURATION_FILE_DIR = os.environ.get("FEAST_CONFIG", ".feast")
CONFIGURATION_FILE_NAME = "config.toml"
|
chainer__chainer-5586 | Docstring of `functions.forget` is incorrect as `+` doesn't retain inputs anymore
The docstring says that `(x + y) + x` retains the immediate variable holding `x + y`.
```
Let ``f`` be a function defined as:
>>> def f(a, b):
... return a + b + a
and, ``x`` and ``y`` be :class:`~chainer.Variable`\\ s:
>>> x = chainer.Variable(np.random.uniform(-1, 1, 5).astype(np.float32))
>>> y = chainer.Variable(np.random.uniform(-1, 1, 5).astype(np.float32))
When ``z`` is calculated as ``z = f(x, y)``, its intermediate result
``x + y`` is stored in memory. Instead, if you call ``f`` with
``F.forget``:
>>> z = F.forget(f, x, y)
intermediate ``x + y`` is forgotten.
```
But this isn't true for the new-style function of `+`, because addition doesn't require retained inputs for backpropagation.
I checked the behavior by the following script, which traverses retained variables.
```python
import chainer
import chainer.functions as F
import numpy as np
def f(a, b):
return (a + b) + a
def recur_check_vars(v, x, y):
creator = v.creator_node
if creator is None:
return
for pnode in creator.inputs:
p = pnode.get_variable()
assert p.data is None or p is x or p is y
print(p)
recur_check_vars(p, x, y)
def main():
x = chainer.Variable(np.random.uniform(-1, 1, 5).astype(np.float32))
y = chainer.Variable(np.random.uniform(-1, 1, 5).astype(np.float32))
print(x)
print(y)
print()
z = f(x, y)
recur_check_vars(z, x, y)
if __name__ == '__main__':
main()
```
The script doesn't fail, and the output is as follows. We can see that `x + y` is discarded. Living variables `x` and `y` are retrieved, as each `VariableNode` instance has a weakref to the corresponding variable.
```
variable([-0.7699733 -0.50523347 -0.20869003 -0.7912116 0.92058474])
variable([ 0.58832335 -0.06183117 0.1939743 0.9021316 -0.19973369])
variable(None)
variable([-0.7699733 -0.50523347 -0.20869003 -0.7912116 0.92058474])
variable([ 0.58832335 -0.06183117 0.1939743 0.9021316 -0.19973369])
variable([-0.7699733 -0.50523347 -0.20869003 -0.7912116 0.92058474])
```
| [
{
"content": "import chainer\nfrom chainer import function\nfrom chainer import function_node\nfrom chainer import variable\n\n\ndef _call_func(func, xs):\n outs = func(*xs)\n\n if isinstance(outs, tuple):\n for i, out in enumerate(outs):\n if isinstance(out, variable.Variable):\n ... | [
{
"content": "import chainer\nfrom chainer import function\nfrom chainer import function_node\nfrom chainer import variable\n\n\ndef _call_func(func, xs):\n outs = func(*xs)\n\n if isinstance(outs, tuple):\n for i, out in enumerate(outs):\n if isinstance(out, variable.Variable):\n ... | diff --git a/chainer/functions/util/forget.py b/chainer/functions/util/forget.py
index 14e62093af8c..44c1c2271d57 100644
--- a/chainer/functions/util/forget.py
+++ b/chainer/functions/util/forget.py
@@ -89,7 +89,7 @@ def forget(func, *xs):
Let ``f`` be a function defined as:
>>> def f(a, b):
- ... return a + b + a
+ ... return (a + b) * a
and, ``x`` and ``y`` be :class:`~chainer.Variable`\\ s:
|
ray-project__ray-9297 | [tune] Parameters from `tune.choice()` do not get logged to TensorBoard when integers
### What is the problem?
When providing parameters via `tune.choice()` that include integers, the values are not logged to TensorBoard's HPARAMS section.
The issue is that `numpy.random.choice([1, 2, 3])` (for example) returns `numpy.int32`/`numpy.int64` and those types are not included in the `VALID_HPARAMS = (str, bool, int, float, list)` tuple (python/ray/tune/logger.py).
Since TensorBoard has no issues with logging `numpy.int32/64`, one simple solution would be to just include those types in the tuple above. Happy to provide a PR if you think this is the way to go.
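The type mismatch is easy to see in isolation. The first tuple below mirrors the `VALID_HPARAMS` quoted above; the widened tuple is an assumption in the spirit of the proposal, not necessarily the exact fix.

```python
import numpy as np

# types TBXLogger accepted before the fix (python/ray/tune/logger.py)
VALID_HPARAMS = (str, bool, int, float, list)

v = np.random.choice([1, 2, 3])  # a numpy integer scalar, not a Python int
print(isinstance(v, VALID_HPARAMS))  # False, so the hparam is dropped

# widening the tuple to numpy scalar types fixes the check
VALID_HPARAMS_WIDENED = (str, bool, np.bool_, int, np.integer, float, list)
print(isinstance(v, VALID_HPARAMS_WIDENED))  # True
```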
*Ray version and other system information (Python version, TensorFlow version, OS):*
ray: 0.8.6
python: 3.7.7
tensorboard: 2.2.2
ubuntu: 20.04
### Reproduction (REQUIRED)
```python
from ray import tune
def trainable(config):
tune.report(score=config["a"])
config_dict = {"a": tune.choice([1, 2, 3])}
tune.run(trainable, config=config_dict, num_samples=1)
```
- [x] I have verified my script runs in a clean environment and reproduces the issue.
- [x] I have verified the issue also occurs with the [latest wheels](https://docs.ray.io/en/latest/installation.html).
| [
{
"content": "import csv\nimport json\nimport logging\nimport os\nimport yaml\nimport numbers\nimport numpy as np\n\nimport ray.cloudpickle as cloudpickle\nfrom ray.util.debug import log_once\nfrom ray.tune.result import (NODE_IP, TRAINING_ITERATION, TIME_TOTAL_S,\n TIMESTEPS_TOTAL, ... | [
{
"content": "import csv\nimport json\nimport logging\nimport os\nimport yaml\nimport numbers\nimport numpy as np\n\nimport ray.cloudpickle as cloudpickle\nfrom ray.util.debug import log_once\nfrom ray.tune.result import (NODE_IP, TRAINING_ITERATION, TIME_TOTAL_S,\n TIMESTEPS_TOTAL, ... | diff --git a/python/ray/tune/logger.py b/python/ray/tune/logger.py
index 044448d47c622..d2fae3723fd0a 100644
--- a/python/ray/tune/logger.py
+++ b/python/ray/tune/logger.py
@@ -187,7 +187,7 @@ class TBXLogger(Logger):
"""
# NoneType is not supported on the last TBX release yet.
- VALID_HPARAMS = (str, bool, int, float, list)
+ VALID_HPARAMS = (str, bool, np.bool8, int, np.integer, float, list)
def _init(self):
try:
diff --git a/python/ray/tune/tests/test_logger.py b/python/ray/tune/tests/test_logger.py
index 9a52ec61c4dd0..17215b36c9fe9 100644
--- a/python/ray/tune/tests/test_logger.py
+++ b/python/ray/tune/tests/test_logger.py
@@ -2,6 +2,7 @@
import unittest
import tempfile
import shutil
+import numpy as np
from ray.tune.logger import JsonLogger, CSVLogger, TBXLogger
@@ -46,7 +47,17 @@ def testJSON(self):
logger.close()
def testTBX(self):
- config = {"a": 2, "b": [1, 2], "c": {"c": {"D": 123}}}
+ config = {
+ "a": 2,
+ "b": [1, 2],
+ "c": {
+ "c": {
+ "D": 123
+ }
+ },
+ "d": np.int64(1),
+ "e": np.bool8(True)
+ }
t = Trial(evaluated_params=config, trial_id="tbx")
logger = TBXLogger(config=config, logdir=self.test_dir, trial=t)
logger.on_result(result(0, 4))
|
pyodide__pyodide-3868 | Aborted fetch requests freeze REPL
## 🐛 Bug
Fetch requests aborted (using the [signal](https://developer.mozilla.org/en-US/docs/Web/API/AbortSignal/timeout_static#examples) option) freeze the REPL and do not raise an exception.
### To Reproduce
```python
def fetch_response(url, timeout):
options = js.Object.new()
options.signal = js.AbortSignal.timeout(timeout)
return js.fetch(url, options)
response = await fetch_response('slow api', 1)
```
Dev Console shows:
```
Uncaught (in promise) PythonError: TypeError: invalid exception object
at new_error (pyodide.asm.js:9:14992)
at pyodide.asm.wasm:0x152d67
at pyodide.asm.wasm:0x152e6c
at Module.callPyObjectKwargs (pyodide.asm.js:9:75811)
at Module.callPyObject (pyodide.asm.js:9:76020)
at onRejected (pyodide.asm.js:9:59090)
```
This appears to occur because the `PyodideFuture` object, which the fetch returns, expects to receive a Python `Exception` object on rejection. Instead, the returned `PyodideFuture` object gets a `JsProxy` of an `AbortError` (`DOMException`), and a `JsProxy` of a `DOMException` can't be raised in pyodide.
```python
>>> import js
>>> raise js.DOMException.new('')
```
```
Traceback (most recent call last):
File "<console>", line 1, in <module>
TypeError: exceptions must derive from BaseException
```
### Expected behavior
Should raise a `JsException`.
Possible solution: Allow js.DOMException objects to be raised much like js.Error objects:
```python
>>> from pyodide.webloop import PyodideFuture
>>> fut = PyodideFuture()
>>> fut.set_exception(js.Error('hi'))
>>> fut.exception()
Error: hi
```
### Environment
- Pyodide Version<0.23.2>:
```
>>> import pyodide
>>> pyodide.__version__
'0.23.2'
```
- Browser version<Chrome 113.0.5672.114->:
| [
{
"content": "import sys\nfrom collections.abc import (\n AsyncIterator,\n Awaitable,\n Callable,\n ItemsView,\n Iterable,\n Iterator,\n KeysView,\n Mapping,\n MutableMapping,\n Sequence,\n ValuesView,\n)\nfrom functools import reduce\nfrom types import TracebackType\nfrom typin... | [
{
"content": "import sys\nfrom collections.abc import (\n AsyncIterator,\n Awaitable,\n Callable,\n ItemsView,\n Iterable,\n Iterator,\n KeysView,\n Mapping,\n MutableMapping,\n Sequence,\n ValuesView,\n)\nfrom functools import reduce\nfrom types import TracebackType\nfrom typin... | diff --git a/docs/project/changelog.md b/docs/project/changelog.md
index a56412e88ec..bf1866a1654 100644
--- a/docs/project/changelog.md
+++ b/docs/project/changelog.md
@@ -31,6 +31,10 @@ myst:
- {{ Enhancement }} Added `headers` property to `pyodide.http.FetchResponse`.
{pr}`2078`
+- {{ Fix }} A `JSProxy` of a `DOMException` will now inherit from exception so
+ it can be raised in Python.
+ {pr}`3868`
+
### Packages
- OpenBLAS has been added and scipy now uses OpenBLAS rather than CLAPACK
diff --git a/src/core/jsproxy.c b/src/core/jsproxy.c
index a558b4ed75e..56b34a67456 100644
--- a/src/core/jsproxy.c
+++ b/src/core/jsproxy.c
@@ -4011,6 +4011,7 @@ EM_JS_NUM(int, JsProxy_compute_typeflags, (JsRef idobj), {
}
const isBufferView = safeBool(() => ArrayBuffer.isView(obj));
const isArray = safeBool(() => Array.isArray(obj));
+ const constructorName = safeBool(() => obj.constructor.name) || "";
// If we somehow set more than one of IS_CALLABLE, IS_BUFFER, and IS_ERROR,
// we'll run into trouble. I think that for this to happen, someone would have
@@ -4040,7 +4041,28 @@ EM_JS_NUM(int, JsProxy_compute_typeflags, (JsRef idobj), {
isBufferView && typeTag !== '[object DataView]');
SET_FLAG_IF(IS_GENERATOR, typeTag === "[object Generator]");
SET_FLAG_IF(IS_ASYNC_GENERATOR, typeTag === "[object AsyncGenerator]");
- SET_FLAG_IF(IS_ERROR, (hasProperty(obj, "name") && hasProperty(obj, "message") && hasProperty(obj, "stack")) && !(type_flags & (IS_CALLABLE | IS_BUFFER)));
+
+ /**
+ * DOMException is a weird special case. According to WHATWG, there are two
+ * types of Exception objects, simple exceptions and DOMExceptions. The spec
+ * says:
+ *
+ * > if an implementation gives native Error objects special powers or
+ * > nonstandard properties (such as a stack property), it should also expose
+ * > those on DOMException objects
+ *
+ * Firefox respects this and has DOMException.stack. But Safari and Chrome do
+ * not. Hence the special check here for DOMException.
+ */
+ SET_FLAG_IF(IS_ERROR,
+ (
+ hasProperty(obj, "name")
+ && hasProperty(obj, "message")
+ && (
+ hasProperty(obj, "stack")
+ || constructorName === "DOMException"
+ )
+ ) && !(type_flags & (IS_CALLABLE | IS_BUFFER)));
// clang-format on
return type_flags;
});
diff --git a/src/py/_pyodide/_core_docs.py b/src/py/_pyodide/_core_docs.py
index da86e3c713d..74b696e5f86 100644
--- a/src/py/_pyodide/_core_docs.py
+++ b/src/py/_pyodide/_core_docs.py
@@ -988,6 +988,10 @@ def _new_exc(cls, name: str, message: str = "", stack: str = "") -> "JsException
result.stack = stack
return result
+ @classmethod
+ def new(cls, *args: Any) -> "JsException":
+ return cls()
+
def __str__(self):
return f"{self.name}: {self.message}"
diff --git a/src/py/js.pyi b/src/py/js.pyi
index f5c365ad634..0ea16163fb2 100644
--- a/src/py/js.pyi
+++ b/src/py/js.pyi
@@ -5,6 +5,7 @@ from _pyodide._core_docs import _JsProxyMetaClass
from pyodide.ffi import (
JsArray,
JsDomElement,
+ JsException,
JsFetchResponse,
JsProxy,
JsTypedArray,
@@ -90,3 +91,6 @@ class document(_JsObject):
def createElement(tagName: str) -> JsDomElement: ...
@staticmethod
def appendChild(child: JsDomElement) -> None: ...
+
+class DOMException(JsException):
+ pass
diff --git a/src/tests/test_pyodide.py b/src/tests/test_pyodide.py
index ab3bd278635..ddb4836b456 100644
--- a/src/tests/test_pyodide.py
+++ b/src/tests/test_pyodide.py
@@ -583,6 +583,18 @@ def test_run_python_js_error(selenium):
)
+@pytest.mark.xfail_browsers(node="No DOMException in node")
+@run_in_pyodide
+def test_run_python_dom_error(selenium):
+ import pytest
+
+ from js import DOMException
+ from pyodide.ffi import JsException
+
+ with pytest.raises(JsException, match="oops"):
+ raise DOMException.new("oops")
+
+
def test_run_python_locals(selenium):
selenium.run_js(
"""
diff --git a/src/tests/test_typeconversions.py b/src/tests/test_typeconversions.py
index 5e5f6265c8e..cfbfa1fe47d 100644
--- a/src/tests/test_typeconversions.py
+++ b/src/tests/test_typeconversions.py
@@ -951,6 +951,10 @@ def test_dict_js2py2js(selenium):
def test_error_js2py2js(selenium):
selenium.run_js("self.err = new Error('hello there?');")
assert_js_to_py_to_js(selenium, "err")
+ if selenium.browser == "node":
+ return
+ selenium.run_js("self.err = new DOMException('hello there?');")
+ assert_js_to_py_to_js(selenium, "err")
def test_error_py2js2py(selenium):
|
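The core of the pyodide fix above is a language rule: only objects deriving from `BaseException` can be raised in Python, so a JS error proxy must inherit from (or be wrapped in) an Exception subclass. A minimal editorial sketch — the `JsException` name mirrors pyodide's class, but this stand-in is hypothetical:

```python
class JsException(Exception):
    """Hypothetical stand-in for pyodide's JsException: carrying a JS
    error's name/message in an Exception subclass makes it raisable,
    which a plain proxy object is not."""

    def __init__(self, name, message):
        super().__init__(f"{name}: {message}")
        self.name = name
        self.message = message


# A non-Exception object cannot be raised at all...
try:
    raise object()
except TypeError as exc:
    plain_error = str(exc)  # "exceptions must derive from BaseException"

# ...but the wrapped form can, and round-trips its message.
try:
    raise JsException("AbortError", "signal timed out")
except JsException as exc:
    caught = str(exc)
```

This is the same `TypeError: exceptions must derive from BaseException` the issue reproduces with `raise js.DOMException.new('')`.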
kivy__kivy-6128 | 🙏 emoji in README.md breaks return fileh.read() in pip install from master
When running with latest pip 19
```
pip3 install https://github.com/kivy/kivy/archive/master.zip
```
I get:
```
Collecting https://github.com/kivy/kivy/archive/master.zip
Downloading https://github.com/kivy/kivy/archive/master.zip
/ 41.6MB 24.9MB/s
Complete output from command python setup.py egg_info:
fatal: not a git repository (or any of the parent directories): .git
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/tmp/pip-req-build-qbahrg18/setup.py", line 1007, in <module>
long_description=get_description(),
File "/tmp/pip-req-build-qbahrg18/setup.py", line 44, in get_description
return fileh.read()
File "/usr/lib/python3.6/encodings/ascii.py", line 26, in decode
return codecs.ascii_decode(input, self.errors)[0]
UnicodeDecodeError: 'ascii' codec can't decode byte 0xf0 in position 4973: ordinal not in range(128)
Using distutil
```
position 4973 is the 🙏 emoji
| [
{
"content": "#\n# Kivy - Cross-platform UI framework\n# https://kivy.org/\n#\nfrom __future__ import print_function\n\nimport sys\nbuild_examples = False\nif \"--build_examples\" in sys.argv:\n build_examples = True\n sys.argv.remove(\"--build_examples\")\n\nfrom copy import deepcopy\nimport os\nfrom os.... | [
{
"content": "#\n# Kivy - Cross-platform UI framework\n# https://kivy.org/\n#\nfrom __future__ import print_function\n\nimport sys\nbuild_examples = False\nif \"--build_examples\" in sys.argv:\n build_examples = True\n sys.argv.remove(\"--build_examples\")\n\nfrom copy import deepcopy\nimport os\nfrom os.... | diff --git a/setup.py b/setup.py
index eac026084d..4eebf8fdf9 100644
--- a/setup.py
+++ b/setup.py
@@ -40,8 +40,8 @@ def ver_equal(self, other):
def get_description():
- with open(join(dirname(__file__), 'README.md')) as fileh:
- return fileh.read()
+ with open(join(dirname(__file__), 'README.md'), 'rb') as fileh:
+ return fileh.read().decode("utf8")
def get_version(filename='kivy/version.py'):
|
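The one-line kivy fix above (open in binary, decode as UTF-8 explicitly) can be demonstrated in isolation; the temp file below is a hypothetical stand-in for `README.md` containing the offending emoji:

```python
import os
import tempfile


def get_description(path):
    # Open in binary mode and decode explicitly, so a non-UTF-8 default
    # locale encoding (e.g. ASCII, as in the traceback) cannot break it.
    with open(path, "rb") as fileh:
        return fileh.read().decode("utf8")


# Demo file standing in for README.md.
with tempfile.NamedTemporaryFile("wb", suffix=".md", delete=False) as f:
    f.write("thanks \N{PERSON WITH FOLDED HANDS}\n".encode("utf8"))
    path = f.name

description = get_description(path)
os.unlink(path)
```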
scikit-hep__pyhf-2220 | Menu on mobile page not accessible for pyhf v0.7.1 docs
### Summary
On the [`pyhf` `v0.7.1` docs](https://pyhf.readthedocs.io/en/v0.7.1/) and on the `main` docs build the drop down menu (circled in screen shot bellow) fails to open when clicked on.

Things work fine on desktop and, confusingly, @alexander-held has pointed out that the [`v0.5.2` `cabinetry` docs](https://cabinetry.readthedocs.io/en/stable/) (which were [released](https://github.com/scikit-hep/cabinetry/releases/tag/v0.5.2) very close in time to the `pyhf` `v0.7.1` docs) have a menu that works fine on mobile.
### Documentation Page Link
https://pyhf.readthedocs.io/en/v0.7.1/
### Code of Conduct
- [X] I agree to follow the Code of Conduct
| [
{
"content": "#\n# pyhf documentation build configuration file, created by\n# sphinx-quickstart on Fri Feb 9 11:58:49 2018.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in this\n# autogenerated file.\n#... | [
{
"content": "#\n# pyhf documentation build configuration file, created by\n# sphinx-quickstart on Fri Feb 9 11:58:49 2018.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in this\n# autogenerated file.\n#... | diff --git a/docs/conf.py b/docs/conf.py
index 30b9f2c6aa..cda874ee89 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -55,6 +55,7 @@ def setup(app):
'sphinx.ext.viewcode',
'sphinx.ext.githubpages',
'sphinx.ext.intersphinx',
+ 'sphinx_rtd_theme',
'sphinxcontrib.bibtex',
'sphinx.ext.napoleon',
'sphinx_click.ext',
|
mindsdb__lightwood-603 | :wrench: Add default logging level environment variable
## Task
Add a `LIGHTWOOD_LOG` environment variable that controls the default logging level for lightwood. It should accept `DEBUG`, `INFO`, `WARNING`, `ERROR` and `CRITICAL` as possible values. The logger lightwood uses is declared and exported [here](https://github.com/mindsdb/lightwood/blob/stable/lightwood/helpers/log.py).
## Steps :male_detective: :female_detective:
- Fork the Lightwood repository, checkout the `staging` branch and from it create a new one.
- Implement the necessary changes.
- Check that only the appropriate logs are getting through. For this, you can run any of the integration tests, like [`test_boston_housing`](https://github.com/mindsdb/lightwood/blob/stable/tests/integration/basic/test_boston_housing.py), and analyze the output.
- Make the PR and address any comments that reviewers might make.
## Additional rewards :1st_place_medal:
Each documentation PR brings :one: point for entry into the draw for a :computer: Deep Learning Laptop powered by the NVIDIA RTX 3080 Max-Q GPU or other swag :shirt: :bear: . For more info check out https://mindsdb.com/hacktoberfest/
| [
{
"content": "import logging\nimport os\n\n\ndef initialize_log():\n pid = os.getpid()\n logging.basicConfig()\n log = logging.getLogger(f'lightwood-{pid}')\n log.setLevel(logging.DEBUG)\n return log\n\n\nlog = initialize_log()\n",
"path": "lightwood/helpers/log.py"
}
] | [
{
"content": "import logging\nimport os\n\n\ndef initialize_log():\n pid = os.getpid()\n logging.basicConfig()\n log = logging.getLogger(f'lightwood-{pid}')\n log_level = os.environ.get('LIGHTWOOD_LOG', 'DEBUG')\n log.setLevel(log_level)\n return log\n\n\nlog = initialize_log()\n",
"path":... | diff --git a/lightwood/helpers/log.py b/lightwood/helpers/log.py
index d25dc4c01..96f893b3e 100644
--- a/lightwood/helpers/log.py
+++ b/lightwood/helpers/log.py
@@ -6,7 +6,8 @@ def initialize_log():
pid = os.getpid()
logging.basicConfig()
log = logging.getLogger(f'lightwood-{pid}')
- log.setLevel(logging.DEBUG)
+ log_level = os.environ.get('LIGHTWOOD_LOG', 'DEBUG')
+ log.setLevel(log_level)
return log
|
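The lightwood diff above works because `Logger.setLevel` accepts level names as strings, so the env-var value can be passed straight through. A self-contained sketch (the logger name here is illustrative, not lightwood's pid-based one):

```python
import logging
import os


def initialize_log():
    log = logging.getLogger("lightwood-demo")
    # LIGHTWOOD_LOG is the env var the task introduces; setLevel accepts
    # level names such as "WARNING" directly.
    log.setLevel(os.environ.get("LIGHTWOOD_LOG", "DEBUG"))
    return log


os.environ["LIGHTWOOD_LOG"] = "WARNING"
log = initialize_log()
effective = log.getEffectiveLevel()  # 30, i.e. logging.WARNING
```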
crytic__slither-1108 | [Bug]: Infinite loop in RTLO detector
### What happened?
Slither hangs on this code indefinitely
### Can you share code with us to reproduce this bug?
https://github.com/ethereum/solidity/blob/develop/test/libsolidity/syntaxTests/comments/multiline_unicode_direction_override_5.sol
### Version
0.8.2
### Relevant log output
_No response_
| [
{
"content": "import re\nfrom slither.detectors.abstract_detector import AbstractDetector, DetectorClassification\n\n\nclass RightToLeftOverride(AbstractDetector):\n \"\"\"\n Detect the usage of a Right-To-Left-Override (U+202E) character\n \"\"\"\n\n ARGUMENT = \"rtlo\"\n HELP = \"Right-To-Left-... | [
{
"content": "import re\nfrom slither.detectors.abstract_detector import AbstractDetector, DetectorClassification\n\n\nclass RightToLeftOverride(AbstractDetector):\n \"\"\"\n Detect the usage of a Right-To-Left-Override (U+202E) character\n \"\"\"\n\n ARGUMENT = \"rtlo\"\n HELP = \"Right-To-Left-... | diff --git a/slither/detectors/source/rtlo.py b/slither/detectors/source/rtlo.py
index 904f2d2e39..df1f265952 100644
--- a/slither/detectors/source/rtlo.py
+++ b/slither/detectors/source/rtlo.py
@@ -88,6 +88,6 @@ def _detect(self):
results.append(res)
# Advance the start index for the next iteration
- start_index = result_index + 1
+ start_index = idx + 1
return results
diff --git a/tests/detectors/rtlo/0.8.0/unicode_direction_override.sol b/tests/detectors/rtlo/0.8.0/unicode_direction_override.sol
new file mode 100644
index 0000000000..80f312986d
--- /dev/null
+++ b/tests/detectors/rtlo/0.8.0/unicode_direction_override.sol
@@ -0,0 +1,11 @@
+pragma solidity ^0.8.0;
+contract my_contract {
+ function empty_func() external pure
+ {
+ // The string below contains 3 RLO and 3 PDF unicode characters
+ // RLO is U+202E and changes the print direction to right-to-left
+ // PDF is U+202C and restores the print direction to what it was before RLO
+ /*ok aaabbbcccdddeee*/
+ }
+}
+// ----
\ No newline at end of file
diff --git a/tests/detectors/rtlo/0.8.0/unicode_direction_override.sol.0.8.0.RightToLeftOverride.json b/tests/detectors/rtlo/0.8.0/unicode_direction_override.sol.0.8.0.RightToLeftOverride.json
new file mode 100644
index 0000000000..97160fb1f5
--- /dev/null
+++ b/tests/detectors/rtlo/0.8.0/unicode_direction_override.sol.0.8.0.RightToLeftOverride.json
@@ -0,0 +1,91 @@
+[
+ [
+ {
+ "elements": [
+ {
+ "type": "other",
+ "name": "rtlo-character",
+ "source_mapping": {
+ "start": 336,
+ "length": 3,
+ "filename_used": "/GENERIC_PATH",
+ "filename_relative": "tests/detectors/rtlo/0.8.0/unicode_direction_override.sol",
+ "filename_absolute": "/GENERIC_PATH",
+ "filename_short": "tests/detectors/rtlo/0.8.0/unicode_direction_override.sol",
+ "is_dependency": false,
+ "lines": [
+ 8
+ ],
+ "starting_column": 14,
+ "ending_column": 17
+ }
+ }
+ ],
+ "description": "tests/detectors/rtlo/0.8.0/unicode_direction_override.sol contains a unicode right-to-left-override character at byte offset 336:\n\t- b' /*ok \\xe2\\x80\\xaeaaa\\xe2\\x80\\xaebbb\\xe2\\x80\\xaeccc\\xe2\\x80\\xacddd\\xe2\\x80\\xaceee\\xe2\\x80\\xac*/'\n",
+ "markdown": "tests/detectors/rtlo/0.8.0/unicode_direction_override.sol contains a unicode right-to-left-override character at byte offset 336:\n\t- b' /*ok \\xe2\\x80\\xaeaaa\\xe2\\x80\\xaebbb\\xe2\\x80\\xaeccc\\xe2\\x80\\xacddd\\xe2\\x80\\xaceee\\xe2\\x80\\xac*/'\n",
+ "first_markdown_element": "",
+ "id": "2407672dea557be27d0c488ba9c714e6a7f21dd3f7759058e718c1984e142f95",
+ "check": "rtlo",
+ "impact": "High",
+ "confidence": "High"
+ },
+ {
+ "elements": [
+ {
+ "type": "other",
+ "name": "rtlo-character",
+ "source_mapping": {
+ "start": 348,
+ "length": 3,
+ "filename_used": "/GENERIC_PATH",
+ "filename_relative": "tests/detectors/rtlo/0.8.0/unicode_direction_override.sol",
+ "filename_absolute": "/GENERIC_PATH",
+ "filename_short": "tests/detectors/rtlo/0.8.0/unicode_direction_override.sol",
+ "is_dependency": false,
+ "lines": [
+ 8
+ ],
+ "starting_column": 26,
+ "ending_column": 29
+ }
+ }
+ ],
+ "description": "tests/detectors/rtlo/0.8.0/unicode_direction_override.sol contains a unicode right-to-left-override character at byte offset 348:\n\t- b'\\x80\\xaebbb\\xe2\\x80\\xaeccc\\xe2\\x80\\xacddd\\xe2\\x80\\xaceee\\xe2\\x80\\xac*/'\n",
+ "markdown": "tests/detectors/rtlo/0.8.0/unicode_direction_override.sol contains a unicode right-to-left-override character at byte offset 348:\n\t- b'\\x80\\xaebbb\\xe2\\x80\\xaeccc\\xe2\\x80\\xacddd\\xe2\\x80\\xaceee\\xe2\\x80\\xac*/'\n",
+ "first_markdown_element": "",
+ "id": "477e54031d4d30d485b9cdc2d7ef3e9ae3de52640364505df8eb9619c2bcde6b",
+ "check": "rtlo",
+ "impact": "High",
+ "confidence": "High"
+ },
+ {
+ "elements": [
+ {
+ "type": "other",
+ "name": "rtlo-character",
+ "source_mapping": {
+ "start": 342,
+ "length": 3,
+ "filename_used": "/GENERIC_PATH",
+ "filename_relative": "tests/detectors/rtlo/0.8.0/unicode_direction_override.sol",
+ "filename_absolute": "/GENERIC_PATH",
+ "filename_short": "tests/detectors/rtlo/0.8.0/unicode_direction_override.sol",
+ "is_dependency": false,
+ "lines": [
+ 8
+ ],
+ "starting_column": 20,
+ "ending_column": 23
+ }
+ }
+ ],
+ "description": "tests/detectors/rtlo/0.8.0/unicode_direction_override.sol contains a unicode right-to-left-override character at byte offset 342:\n\t- b'\\x80\\xaeaaa\\xe2\\x80\\xaebbb\\xe2\\x80\\xaeccc\\xe2\\x80\\xacddd\\xe2\\x80\\xaceee\\xe2\\x80\\xac*/'\n",
+ "markdown": "tests/detectors/rtlo/0.8.0/unicode_direction_override.sol contains a unicode right-to-left-override character at byte offset 342:\n\t- b'\\x80\\xaeaaa\\xe2\\x80\\xaebbb\\xe2\\x80\\xaeccc\\xe2\\x80\\xacddd\\xe2\\x80\\xaceee\\xe2\\x80\\xac*/'\n",
+ "first_markdown_element": "",
+ "id": "9dd23585bb0ff1f244f749281b27f62978e0bb5b0ae58c8c9cb6d3f9c7e82253",
+ "check": "rtlo",
+ "impact": "High",
+ "confidence": "High"
+ }
+ ]
+]
\ No newline at end of file
diff --git a/tests/test_detectors.py b/tests/test_detectors.py
index 7b5fd993c6..f7884d68f0 100644
--- a/tests/test_detectors.py
+++ b/tests/test_detectors.py
@@ -724,6 +724,11 @@ def id_test(test_item: Test):
"right_to_left_override.sol",
"0.6.11",
),
+ Test(
+ all_detectors.RightToLeftOverride,
+ "unicode_direction_override.sol",
+ "0.8.0",
+ ),
Test(all_detectors.VoidConstructor, "void-cst.sol", "0.4.25"),
Test(all_detectors.VoidConstructor, "void-cst.sol", "0.5.16"),
Test(all_detectors.VoidConstructor, "void-cst.sol", "0.6.11"),
|
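The slither bug above (advancing `start_index` from the wrong variable, so the scan never moves forward) is a classic hazard of hand-rolled search loops; `re.finditer` tracks positions itself and removes the index bookkeeping entirely. A hedged editorial sketch, not the detector's actual implementation:

```python
import re

RTLO = "\u202e"  # Right-To-Left-Override character


def rtlo_offsets(source):
    # finditer advances through the string on its own, so there is no
    # manually maintained start index to get wrong (and no infinite loop).
    return [m.start() for m in re.finditer(re.escape(RTLO), source)]


sample = "a" + RTLO + "bb" + RTLO + "c"
offsets = rtlo_offsets(sample)  # positions of both RTLO characters
```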
python-discord__site-402 | Replace Allauth anti-email monkey-patch with proper settings.
The allauth extension we use for the discord login and connection to a github account for any of our site accounts.
We have a [monkeypatch](https://github.com/python-discord/site/blob/master/pydis_site/apps/home/apps.py#L17-L38) to avoid saving in our database any email details that we may get from oauth authorisations. This does not avoid the request for emails showing in the auth request page though, as the scope is still requested.
Instead, we can define in settings.py the appropriate provider settings, particularly any required scopes to be requested. If we only provide `identify` for the discord one, it won't add the `email` scope in the auth request page in future, making it a cleaner and more appropriate solution.
The setting would look like:
```py
SOCIALACCOUNT_PROVIDERS = {
'discord': {
'SCOPE': [
'identify',
],
}
}
```
The relevant scope setting for github can be given also, it just needs to be looked up as to what scopes we should restrict it to in order to avoid unnecessary sensitive data being stored.
| [
{
"content": "\"\"\"\nDjango settings for pydis_site project.\n\nGenerated by 'django-admin startproject' using Django 2.1.\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/2.1/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/en/... | [
{
"content": "\"\"\"\nDjango settings for pydis_site project.\n\nGenerated by 'django-admin startproject' using Django 2.1.\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/2.1/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/en/... | diff --git a/pydis_site/settings.py b/pydis_site/settings.py
index 1f042c1bb..3769fa253 100644
--- a/pydis_site/settings.py
+++ b/pydis_site/settings.py
@@ -401,3 +401,11 @@ def WIKI_CAN_WRITE(article: "Article", user: "User") -> bool: # noqa: N802
LOGIN_REDIRECT_URL = "home"
SOCIALACCOUNT_ADAPTER = "pydis_site.utils.account.SocialAccountAdapter"
+SOCIALACCOUNT_PROVIDERS = {
+ "discord": {
+ "SCOPE": [
+ "identify",
+ ],
+ "AUTH_PARAMS": {"prompt": "none"}
+ }
+}
|
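The settings-based approach the issue proposes can be checked in plain Python: restricting the Discord provider to the `identify` scope keeps `email` out of the OAuth consent page entirely. A trivial sketch of the dict shape (as proposed in the issue, not allauth's internals):

```python
# Proposed allauth provider settings, restricted to the identify scope.
SOCIALACCOUNT_PROVIDERS = {
    "discord": {
        "SCOPE": [
            "identify",
        ],
    },
}

# The email scope is simply never requested.
requested_scopes = SOCIALACCOUNT_PROVIDERS["discord"]["SCOPE"]
```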
iterative__dvc-3828 | End of file fixer
I am using an [end-of-file fixer in the pre-commit hook](https://pre-commit.com/hooks.html). It checks that the file ends with an empty new line.
It looks like files
```
modified: .dvc/plots/confusion.json
modified: .dvc/plots/default.json
modified: .dvc/plots/scatter.json
```
That are automatically created by `dvc init` do not have an empty line at the end of the file.
| [
{
"content": "import json\nimport logging\nimport os\nimport re\n\nfrom funcy import cached_property\n\nfrom dvc.exceptions import DvcException\nfrom dvc.utils.fs import makedirs\n\nlogger = logging.getLogger(__name__)\n\n\nclass TemplateNotFoundError(DvcException):\n def __init__(self, path):\n super... | [
{
"content": "import json\nimport logging\nimport os\nimport re\n\nfrom funcy import cached_property\n\nfrom dvc.exceptions import DvcException\nfrom dvc.utils.fs import makedirs\n\nlogger = logging.getLogger(__name__)\n\n\nclass TemplateNotFoundError(DvcException):\n def __init__(self, path):\n super... | diff --git a/dvc/repo/plots/template.py b/dvc/repo/plots/template.py
index 0bfe6b415a..731897f5d2 100644
--- a/dvc/repo/plots/template.py
+++ b/dvc/repo/plots/template.py
@@ -59,6 +59,7 @@ def dump(self):
indent=self.INDENT,
separators=self.SEPARATORS,
)
+ fobj.write("\n")
@staticmethod
def get_data_anchor(template_content):
|
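The dvc diff above hinges on the fact that `json.dump` never emits a trailing newline itself, so the template writer must add one. A self-contained sketch of that dump step (constant names mirror the template class but are illustrative):

```python
import io
import json

INDENT = 4
SEPARATORS = (",", ": ")


def dump_template(obj, fobj):
    # json.dump stops at the closing brace; the explicit write("\n")
    # is the one-line fix that satisfies end-of-file checkers.
    json.dump(obj, fobj, indent=INDENT, separators=SEPARATORS)
    fobj.write("\n")


buf = io.StringIO()
dump_template({"data": {"values": []}}, buf)
content = buf.getvalue()  # ends with "}\n"
```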
pypa__pipenv-2111 | Newly added / changed sources not used
When an environment variable source (possibly any source) is updated in the Pipfile, the new source isn't used for resolution when `pipenv install` or `pipenv lock` is next run.
See for example:
<details><summary>Pipfile</summary>
```
[[source]]
url = "https://pypi.python.org/${ENV_VAR}"
verify_ssl = true
[dev-packages]
pytest = "==3.4.0"
[packages]
requests = "==2.18.0"
```
</details>
<details><summary>Pipfile.lock</summary>
```
{
"_meta": {
"hash": {
"sha256": "5f70d907b20123fa92bd105fff99886abbf573b68009a4eb8dfd3e18144ab001"
},
"pipfile-spec": 6,
"requires": {},
"sources": [
{
"url": "https://pypi.python.org/${ENV_VAR}",
"verify_ssl": true
}
]
},
"default": {
"certifi": {
"hashes": [
"sha256:13e698f54293db9f89122b0581843a782ad0934a4fe0172d2a980ba77fc61bb7",
"sha256:9fa520c1bacfb634fa7af20a76bcbd3d5fb390481724c597da32c719a7dca4b0"
],
"version": "==2018.4.16"
},
"chardet": {
"hashes": [
"sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae",
"sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"
],
"version": "==3.0.4"
},
"idna": {
"hashes": [
"sha256:3cb5ce08046c4e3a560fc02f138d0ac63e00f8ce5901a56b32ec8b7994082aab",
"sha256:cc19709fd6d0cbfed39ea875d29ba6d4e22c0cebc510a76d6302a28385e8bb70"
],
"version": "==2.5"
},
"requests": {
"hashes": [
"sha256:5e88d64aa56ac0fda54e77fb9762ebc65879e171b746d5479a33c4082519d6c6",
"sha256:cd0189f962787284bff715fddaad478eb4d9c15aa167bd64e52ea0f661e7ea5c"
],
"version": "==2.18.0"
},
"urllib3": {
"hashes": [
"sha256:8ed6d5c1ff9d6ba84677310060d6a3a78ca3072ce0684cb3c645023009c114b1",
"sha256:b14486978518ca0901a76ba973d7821047409d7f726f22156b24e83fd71382a5"
],
"version": "==1.21.1"
}
},
"develop": {
"attrs": {
"hashes": [
"sha256:1c7960ccfd6a005cd9f7ba884e6316b5e430a3f1a6c37c5f87d8b43f83b54ec9",
"sha256:a17a9573a6f475c99b551c0e0a812707ddda1ec9653bed04c13841404ed6f450"
],
"version": "==17.4.0"
},
"funcsigs": {
"hashes": [
"sha256:330cc27ccbf7f1e992e69fef78261dc7c6569012cf397db8d3de0234e6c937ca",
"sha256:a7bb0f2cf3a3fd1ab2732cb49eba4252c2af4240442415b4abce3b87022a8f50"
],
"markers": "python_version < '3.0'",
"version": "==1.0.2"
},
"pluggy": {
"hashes": [
"sha256:7f8ae7f5bdf75671a718d2daf0a64b7885f74510bcd98b1a0bb420eb9a9d0cff",
"sha256:d345c8fe681115900d6da8d048ba67c25df42973bda370783cd58826442dcd7c",
"sha256:e160a7fcf25762bb60efc7e171d4497ff1d8d2d75a3d0df7a21b76821ecbf5c5"
],
"version": "==0.6.0"
},
"py": {
"hashes": [
"sha256:29c9fab495d7528e80ba1e343b958684f4ace687327e6f789a94bf3d1915f881",
"sha256:983f77f3331356039fdd792e9220b7b8ee1aa6bd2b25f567a963ff1de5a64f6a"
],
"version": "==1.5.3"
},
"pytest": {
"hashes": [
"sha256:6074ea3b9c999bd6d0df5fa9d12dd95ccd23550df2a582f5f5b848331d2e82ca",
"sha256:95fa025cd6deb5d937e04e368a00552332b58cae23f63b76c8c540ff1733ab6d"
],
"version": "==3.4.0"
},
"six": {
"hashes": [
"sha256:70e8a77beed4562e7f14fe23a786b54f6296e34344c23bc42f07b15018ff98e9",
"sha256:832dc0e10feb1aa2c68dcc57dbb658f1c7e65b9b61af69048abc87a2db00a0eb"
],
"version": "==1.11.0"
}
}
}
```
</details>
<br>
Try updating the source in the Pipfile above to `"https://pypi.python.org/${ENV_VAR}"` and installation will still fail, complaining that `https://pypi.python.org/${ENV_VAR}` isn't reachable.
<details><summary>$ python -m pipenv.help output</summary>
Pipenv version: `'11.10.1'`
Pipenv location: `'/Users/greysteil/code/pipenv/pipenv'`
Python location: `'/Users/greysteil/.pyenv/versions/3.6.5/bin/python3'`
Other Python installations in `PATH`:
- `2.6`: `/usr/bin/python2.6`
- `2.6`: `/usr/bin/python2.6`
- `2.7`: `/Users/greysteil/.pyenv/shims/python2.7`
- `2.7`: `/Users/greysteil/.pyenv/shims/python2.7`
- `2.7`: `/usr/bin/python2.7`
- `3.5`: `/Users/greysteil/.pyenv/shims/python3.5`
- `3.6`: `/Users/greysteil/.pyenv/versions/3.6.5/bin/python3.6m`
- `3.6`: `/Users/greysteil/.pyenv/versions/3.6.5/bin/python3.6`
- `3.6`: `/Users/greysteil/.pyenv/shims/python3.6`
- `3.6`: `/usr/local/bin/python3.6`
- `3.6`: `/usr/local/bin/python3.6`
- `3.6.5`: `/Users/greysteil/.pyenv/versions/3.6.5/bin/python`
- `3.6.5`: `/Users/greysteil/.pyenv/shims/python`
- `3.6.5`: `/usr/local/bin/python`
- `3.6.5`: `/usr/local/bin/python`
- `2.7.10`: `/usr/bin/python`
- `None`: `/Users/greysteil/.pyenv/shims/python2`
- `3.6.5`: `/Users/greysteil/.pyenv/versions/3.6.5/bin/python3`
- `3.6.5`: `/Users/greysteil/.pyenv/shims/python3`
- `3.6.5`: `/usr/local/bin/python3`
- `3.6.5`: `/usr/local/bin/python3`
PEP 508 Information:
```
{'implementation_name': 'cpython',
'implementation_version': '3.6.5',
'os_name': 'posix',
'platform_machine': 'x86_64',
'platform_python_implementation': 'CPython',
'platform_release': '16.7.0',
'platform_system': 'Darwin',
'platform_version': 'Darwin Kernel Version 16.7.0: Wed Oct 4 00:17:00 PDT '
'2017; root:xnu-3789.71.6~1/RELEASE_X86_64',
'python_full_version': '3.6.5',
'python_version': '3.6',
'sys_platform': 'darwin'}
```
System environment variables:
- `TERM_PROGRAM`
- `PYENV_ROOT`
- `SHELL`
- `TERM`
- `CLICOLOR`
- `TMPDIR`
- `Apple_PubSub_Socket_Render`
- `TERM_PROGRAM_VERSION`
- `TERM_SESSION_ID`
- `PYENV_VERSION`
- `USER`
- `SSH_AUTH_SOCK`
- `PYENV_DIR`
- `__CF_USER_TEXT_ENCODING`
- `LSCOLORS`
- `PATH`
- `PWD`
- `EDITOR`
- `LANG`
- `PYENV_HOOK_PATH`
- `XPC_FLAGS`
- `RBENV_SHELL`
- `XPC_SERVICE_NAME`
- `SHLVL`
- `HOME`
- `PYENV_SHELL`
- `LOGNAME`
- `SECURITYSESSIONID`
- `PYTHONDONTWRITEBYTECODE`
- `PIP_PYTHON_PATH`
Pipenv–specific environment variables:
Debug–specific environment variables:
- `PATH`: `/Users/greysteil/.pyenv/versions/3.6.5/bin:/usr/local/Cellar/pyenv/1.2.3/libexec:/Users/greysteil/.pyenv/plugins/pyenv-virtualenv/bin:/Users/greysteil/.pyenv/plugins/pyenv-update/bin:/Users/greysteil/.pyenv/plugins/pyenv-installer/bin:/Users/greysteil/.pyenv/plugins/pyenv-doctor/bin:/Users/greysteil/.pyenv/shims:/Users/greysteil/.pyenv/bin:/Users/greysteil/.cargo/bin:/usr/local/heroku/bin:/Users/greysteil/.rbenv/shims:/usr/local/bin:./node_modules/.bin:.bundle/binstubs:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/git/bin:/Library/TeX/texbin:/usr/local/sbin`
- `SHELL`: `/bin/bash`
- `EDITOR`: `subl -w`
- `LANG`: `en_GB.UTF-8`
- `PWD`: `/Users/greysteil/code/python-test`
---------------------------
Contents of `Pipfile` ('/Users/greysteil/code/python-test/Pipfile'):
```toml
[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true
[dev-packages]
pytest = "==3.4.0"
[packages]
requests = "==2.18.0"
```
Contents of `Pipfile.lock` ('/Users/greysteil/code/python-test/Pipfile.lock'):
```json
{
"_meta": {
"hash": {
"sha256": "5f70d907b20123fa92bd105fff99886abbf573b68009a4eb8dfd3e18144ab001"
},
"pipfile-spec": 6,
"requires": {},
"sources": [
{
"url": "https://pypi.python.org/${ENV_VAR}",
"verify_ssl": true
}
]
},
"default": {
"certifi": {
"hashes": [
"sha256:13e698f54293db9f89122b0581843a782ad0934a4fe0172d2a980ba77fc61bb7",
"sha256:9fa520c1bacfb634fa7af20a76bcbd3d5fb390481724c597da32c719a7dca4b0"
],
"version": "==2018.4.16"
},
"chardet": {
"hashes": [
"sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae",
"sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"
],
"version": "==3.0.4"
},
"idna": {
"hashes": [
"sha256:3cb5ce08046c4e3a560fc02f138d0ac63e00f8ce5901a56b32ec8b7994082aab",
"sha256:cc19709fd6d0cbfed39ea875d29ba6d4e22c0cebc510a76d6302a28385e8bb70"
],
"version": "==2.5"
},
"requests": {
"hashes": [
"sha256:5e88d64aa56ac0fda54e77fb9762ebc65879e171b746d5479a33c4082519d6c6",
"sha256:cd0189f962787284bff715fddaad478eb4d9c15aa167bd64e52ea0f661e7ea5c"
],
"version": "==2.18.0"
},
"urllib3": {
"hashes": [
"sha256:8ed6d5c1ff9d6ba84677310060d6a3a78ca3072ce0684cb3c645023009c114b1",
"sha256:b14486978518ca0901a76ba973d7821047409d7f726f22156b24e83fd71382a5"
],
"version": "==1.21.1"
}
},
"develop": {
"attrs": {
"hashes": [
"sha256:1c7960ccfd6a005cd9f7ba884e6316b5e430a3f1a6c37c5f87d8b43f83b54ec9",
"sha256:a17a9573a6f475c99b551c0e0a812707ddda1ec9653bed04c13841404ed6f450"
],
"version": "==17.4.0"
},
"funcsigs": {
"hashes": [
"sha256:330cc27ccbf7f1e992e69fef78261dc7c6569012cf397db8d3de0234e6c937ca",
"sha256:a7bb0f2cf3a3fd1ab2732cb49eba4252c2af4240442415b4abce3b87022a8f50"
],
"markers": "python_version < '3.0'",
"version": "==1.0.2"
},
"pluggy": {
"hashes": [
"sha256:7f8ae7f5bdf75671a718d2daf0a64b7885f74510bcd98b1a0bb420eb9a9d0cff",
"sha256:d345c8fe681115900d6da8d048ba67c25df42973bda370783cd58826442dcd7c",
"sha256:e160a7fcf25762bb60efc7e171d4497ff1d8d2d75a3d0df7a21b76821ecbf5c5"
],
"version": "==0.6.0"
},
"py": {
"hashes": [
"sha256:29c9fab495d7528e80ba1e343b958684f4ace687327e6f789a94bf3d1915f881",
"sha256:983f77f3331356039fdd792e9220b7b8ee1aa6bd2b25f567a963ff1de5a64f6a"
],
"version": "==1.5.3"
},
"pytest": {
"hashes": [
"sha256:6074ea3b9c999bd6d0df5fa9d12dd95ccd23550df2a582f5f5b848331d2e82ca",
"sha256:95fa025cd6deb5d937e04e368a00552332b58cae23f63b76c8c540ff1733ab6d"
],
"version": "==3.4.0"
},
"six": {
"hashes": [
"sha256:70e8a77beed4562e7f14fe23a786b54f6296e34344c23bc42f07b15018ff98e9",
"sha256:832dc0e10feb1aa2c68dcc57dbb658f1c7e65b9b61af69048abc87a2db00a0eb"
],
"version": "==1.11.0"
}
}
}
```
</details>
| [
{
"content": "import os\nimport sys\nimport json\nimport logging\n\nos.environ['PIP_PYTHON_PATH'] = sys.executable\n\n\ndef _patch_path():\n pipenv_libdir = os.path.dirname(os.path.abspath(__file__))\n for _dir in ('vendor', 'patched'):\n sys.path.insert(0, os.path.join(pipenv_libdir, _dir))\n s... | [
{
"content": "import os\nimport sys\nimport json\nimport logging\n\nos.environ['PIP_PYTHON_PATH'] = sys.executable\n\n\ndef _patch_path():\n pipenv_libdir = os.path.dirname(os.path.abspath(__file__))\n for _dir in ('vendor', 'patched'):\n sys.path.insert(0, os.path.join(pipenv_libdir, _dir))\n s... | diff --git a/pipenv/resolver.py b/pipenv/resolver.py
index c04a6b3cd3..7e4b95d36a 100644
--- a/pipenv/resolver.py
+++ b/pipenv/resolver.py
@@ -68,7 +68,7 @@ def resolve(packages, pre, sources, verbose, clear, system):
results = resolve(
packages,
pre=do_pre,
- sources=project.sources,
+ sources=project.pipfile_sources,
verbose=is_verbose,
clear=do_clear,
system=system,
diff --git a/tests/integration/test_lock.py b/tests/integration/test_lock.py
index f71e02b368..a14b8b10c9 100644
--- a/tests/integration/test_lock.py
+++ b/tests/integration/test_lock.py
@@ -1,4 +1,5 @@
import pytest
+import os
from flaky import flaky
@@ -247,3 +248,41 @@ def test_private_index_lock_requirements(PipenvInstance):
assert c.return_code == 0
assert '-i https://pypi.python.org/simple' in c.out.strip()
assert '--extra-index-url https://test.pypi.org/simple' in c.out.strip()
+
+
+@pytest.mark.install
+@pytest.mark.index
+def test_lock_updated_source(PipenvInstance, pypi):
+
+ with PipenvInstance(pypi=pypi) as p:
+ with open(p.pipfile_path, 'w') as f:
+ contents = """
+[[source]]
+url = "{url}/${{MY_ENV_VAR}}"
+
+[packages]
+requests = "==2.14.0"
+ """.strip().format(url=pypi.url)
+ f.write(contents)
+
+ os.environ['MY_ENV_VAR'] = 'simple'
+ c = p.pipenv('lock')
+ assert c.return_code == 0
+ assert 'requests' in p.lockfile['default']
+
+ del os.environ['MY_ENV_VAR']
+
+ with open(p.pipfile_path, 'w') as f:
+ contents = """
+[[source]]
+url = "{url}/simple"
+
+[packages]
+requests = "==2.14.0"
+ """.strip().format(url=pypi.url)
+ f.write(contents)
+
+ c = p.pipenv('lock')
+ assert c.return_code == 0
+ assert 'requests' in p.lockfile['default']
+
|
obspy__obspy-3012 | Station.identifiers[0] should not be URI type
Hello!
Just want to say that obspy continues to be an incredibly useful package!
I'm trying to set the identifiers on an obspy Station instance.
According to FDSN schema 1.1, IdentifierType should be a simple string with a "type" attribute:
```
<xs:complexType name="IdentifierType">
<xs:annotation>
<xs:documentation>A type to document persistent identifiers.
Identifier values should be specified without a URI scheme (prefix),
instead the identifer type is documented as an attribute.
</xs:documentation>
</xs:annotation>
<xs:simpleContent>
<xs:extension base="xs:string">
<xs:attribute name="type" type="xs:string"> </xs:attribute>
</xs:extension>
</xs:simpleContent>
</xs:complexType>
```
However, obspy (v.1.2.2) seems to have encoded this as xsd:anyURI type instead:
```python
>>> wes.identifiers = ['10.157778/RESIF.FR']
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/mth/mth/miniconda3/envs/test_yasmine/lib/python3.8/site-packages/obspy/core/inventory/util.py", line 123, in identifiers
    _warn_on_invalid_uri(identifier)
  File "/Users/mth/mth/miniconda3/envs/test_yasmine/lib/python3.8/site-packages/obspy/core/inventory/util.py", line 1076, in _warn_on_invalid_uri
    msg = "Given string seems to not be a valid URI: ''" % uri
TypeError: not all arguments converted during string formatting
```

```python
>>> wes.identifiers = ['http://10.16778/RESIF.FR', 'http://32.2323/RESIF.CR']
>>> print("obspy is happy now!")
```
Tracking it down a bit further:
core/inventory/util.py:
```
@identifiers.setter
def identifiers(self, value):
if not hasattr(value, "__iter__"):
msg = "identifiers needs to be an iterable, e.g. a list."
raise ValueError(msg)
# make sure to unwind actual iterators, or the just might get exhausted
# at some point
identifiers = [identifier for identifier in value]
for identifier in identifiers:
_warn_on_invalid_uri(identifier)
self._identifiers = identifiers
```
This calls:
```
def _warn_on_invalid_uri(uri):
if not _is_valid_uri(uri):
msg = "Given string seems to not be a valid URI: ''" % uri
warnings.warn(msg)
```
And that msg seems to be missing the `%s` format specifier to print `uri`, and that seems to be the error I'm getting.
So I guess there are 2 things:
1. identifiers - shouldn't be checked as valid_uri, at least not for basenode types
2. the _warn_on_invalid_uri() func has an error in msg.
Thanks!
-Mike
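The formatting bug described above is easy to reproduce outside obspy. A minimal standalone sketch (the function names here are hypothetical and do not exist in obspy; only the format strings mirror the report):

```python
# Standalone sketch of the %-formatting bug described above; the function
# names are hypothetical stand-ins, not the actual obspy helpers.
def warn_msg_buggy(uri):
    # No %s placeholder, so applying % with an argument raises TypeError.
    return "Given string seems to not be a valid URI: ''" % uri

def warn_msg_fixed(uri):
    # With the %s placeholder, the intended warning text is produced.
    return "Given string seems to not be a valid URI: '%s'" % uri

try:
    warn_msg_buggy("10.157778/RESIF.FR")
except TypeError as exc:
    print(exc)  # not all arguments converted during string formatting

print(warn_msg_fixed("10.157778/RESIF.FR"))
```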
| [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\"\"\"\nUtility objects.\n\n:copyright:\n Lion Krischer (krischer@geophysik.uni-muenchen.de), 2013\n:license:\n GNU Lesser General Public License, Version 3\n (https://www.gnu.org/copyleft/lesser.html)\n\"\"\"\nimport copy\nimport re\nimport... | [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\"\"\"\nUtility objects.\n\n:copyright:\n Lion Krischer (krischer@geophysik.uni-muenchen.de), 2013\n:license:\n GNU Lesser General Public License, Version 3\n (https://www.gnu.org/copyleft/lesser.html)\n\"\"\"\nimport copy\nimport re\nimport... | diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index 8cdb7bc8c2f..bb87264e9ca 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -46,6 +46,8 @@ Changes:
safe route but it also prevents valid calculations as it is up to the user
to make sure that signal spectrum is properly suppressed in those
frequency ranges outside of the valid response information (see #2988)
+ * fix a bug while checking for valid URI syntax when setting identifiers on
+ inventory type objects (see #2905)
- obspy.clients.arclink:
* submodule removed completely, since ArcLink was officially deprecated and
deactivated on all big datacenters years ago (see #2994)
diff --git a/obspy/core/inventory/util.py b/obspy/core/inventory/util.py
index 230cc12fa38..160ab8ffbdf 100644
--- a/obspy/core/inventory/util.py
+++ b/obspy/core/inventory/util.py
@@ -1072,7 +1072,7 @@ def _is_valid_uri(uri):
def _warn_on_invalid_uri(uri):
if not _is_valid_uri(uri):
- msg = "Given string seems to not be a valid URI: ''" % uri
+ msg = f"Given string seems to not be a valid URI: '{uri}'"
warnings.warn(msg)
diff --git a/obspy/core/tests/test_station.py b/obspy/core/tests/test_station.py
index 31fa7853957..6b12c66ae81 100644
--- a/obspy/core/tests/test_station.py
+++ b/obspy/core/tests/test_station.py
@@ -13,6 +13,8 @@
import pytest
from obspy import read_inventory, UTCDateTime
+from obspy.core.inventory import Station
+from obspy.core.util import CatchAndAssertWarnings
from obspy.core.util.testing import WarningsCapture
@@ -153,3 +155,14 @@ def test_station_select(self):
assert len(sta.select(
latitude=47.95, longitude=12.95,
minradius=0.08, maxradius=0.1)) == 0
+
+ def test_warn_identifier_invalid_uri_syntax(self):
+ """
+ Tests the warning on Identifiers getting set with an invalid URI (not
+ having scheme-colon-path)
+ """
+ sta = Station(code='A', latitude=1, longitude=1, elevation=1)
+ invalid_uri = "this-has-no-URI-scheme-and-no-colon"
+ msg = f"Given string seems to not be a valid URI: '{invalid_uri}'"
+ with CatchAndAssertWarnings(expected=[(UserWarning, msg)]):
+ sta.identifiers = [invalid_uri]
|
adamchainz__django-cors-headers-238 | CORS_HEADER value is set to regex match object
The `CORS_ENABLED` response header is set to the regex match object in line 82 of `middleware.py`, as returned by `CorsPostCsrfMiddleware.is_enabled`.
I am not aware of the best practice here, but I would assume a boolean response makes more sense.

| [
{
"content": "import re\n\nfrom django import http\nfrom django.apps import apps\nfrom django.utils.cache import patch_vary_headers\nfrom django.utils.six.moves.urllib.parse import urlparse\n\nfrom .compat import MiddlewareMixin\nfrom .conf import conf\nfrom .signals import check_request_enabled\n\nACCESS_CONTR... | [
{
"content": "import re\n\nfrom django import http\nfrom django.apps import apps\nfrom django.utils.cache import patch_vary_headers\nfrom django.utils.six.moves.urllib.parse import urlparse\n\nfrom .compat import MiddlewareMixin\nfrom .conf import conf\nfrom .signals import check_request_enabled\n\nACCESS_CONTR... | diff --git a/HISTORY.rst b/HISTORY.rst
index 1241332d..e99b84a9 100644
--- a/HISTORY.rst
+++ b/HISTORY.rst
@@ -5,6 +5,8 @@ Pending
-------
* New release notes go here.
+* Ensured that ``request._cors_enabled`` is always a ``bool()`` - previously it
+ could be set to a regex match object.
2.1.0 (2017-05-28)
------------------
diff --git a/corsheaders/middleware.py b/corsheaders/middleware.py
index a39e94b5..1b1e672a 100644
--- a/corsheaders/middleware.py
+++ b/corsheaders/middleware.py
@@ -158,7 +158,7 @@ def origin_found_in_model(self, url):
def is_enabled(self, request):
return (
- re.match(conf.CORS_URLS_REGEX, request.path) or
+ bool(re.match(conf.CORS_URLS_REGEX, request.path)) or
self.check_signal(request)
)
diff --git a/tests/test_middleware.py b/tests/test_middleware.py
index a6b7f166..7824cfce 100644
--- a/tests/test_middleware.py
+++ b/tests/test_middleware.py
@@ -322,10 +322,21 @@ def test_get_no_origin_not_enabled(self):
assert ACCESS_CONTROL_ALLOW_ORIGIN not in resp
@override_settings(CORS_ORIGIN_WHITELIST=['example.com'])
- def test_works_if_view_deletes_is_enabled(self):
+ def test_cors_enabled_is_attached_and_bool(self):
+ """
+ Ensure that request._cors_enabled is available - although a private API
+ someone might use it for debugging
+ """
+ resp = self.client.get('/', HTTP_ORIGIN='http://example.com')
+ request = resp.wsgi_request
+ assert isinstance(request._cors_enabled, bool)
+ assert request._cors_enabled
+
+ @override_settings(CORS_ORIGIN_WHITELIST=['example.com'])
+ def test_works_if_view_deletes_cors_enabled(self):
"""
Just in case something crazy happens in the view or other middleware,
- check that get_response doesn't fall over if `is_enabled` is removed
+ check that get_response doesn't fall over if `_cors_enabled` is removed
"""
resp = self.client.get(
'/delete-is-enabled/',
|
getsentry__sentry-15491 | Simple typo in the compact docstring for utils.functional
## Important Details
How are you running Sentry?
* [ ] On-Premise docker [Version xyz]
* [ ] Saas (sentry.io)
* [x] Other [briefly describe your environment]
Observed documentation - not running sentry.
## Description
Simple typo: should be "values" rather than "valules".
## Steps to Reproduce
1. Observe docstring in utils.functional.compact method
### What you expected to happen
Should be values rather than valules.
### Possible Solution
Replace valules with values.
| [
{
"content": "from __future__ import absolute_import\n\nimport six\n\nfrom django.utils.functional import empty\n\n\ndef extract_lazy_object(lo):\n \"\"\"\n Unwrap a LazyObject and return the inner object. Whatever that may be.\n\n ProTip: This is relying on `django.utils.functional.empty`, which may\n... | [
{
"content": "from __future__ import absolute_import\n\nimport six\n\nfrom django.utils.functional import empty\n\n\ndef extract_lazy_object(lo):\n \"\"\"\n Unwrap a LazyObject and return the inner object. Whatever that may be.\n\n ProTip: This is relying on `django.utils.functional.empty`, which may\n... | diff --git a/src/sentry/utils/functional.py b/src/sentry/utils/functional.py
index ee23e33a38f021..91a5e2f4200ff9 100644
--- a/src/sentry/utils/functional.py
+++ b/src/sentry/utils/functional.py
@@ -46,7 +46,7 @@ def compact(seq):
Removes keys with a corresponding ``None`` value.
list:
- Removes ``None`` valules.
+ Removes ``None`` values.
>>> compact({'foo': 'bar', 'baz': None})
{'foo': 'bar'}
|
googleapis__google-cloud-python-9973 | Bigquery: Missing Entity Type when reading dataset.access_entries
When running the following code:
```python
from google.cloud import bigquery
gbq_client = bigquery.Client(project='project-name')
dataset_ref = gbq_client.dataset(dataset_id='dataset1', project='project-name')
dataset = gbq_client.get_dataset(dataset_ref=dataset_ref)
print(len(dataset.access_entries))
```
the following error will happen about 25% of the time:
```python
Traceback (most recent call last):
File "iam.py", line 5, in <module>
print(len(dataset.access_entries))
File "/usr/local/lib/python3.7/site-packages/google/cloud/bigquery/dataset.py", line 376, in access_entries
return [AccessEntry.from_api_repr(entry) for entry in entries]
File "/usr/local/lib/python3.7/site-packages/google/cloud/bigquery/dataset.py", line 376, in <listcomp>
return [AccessEntry.from_api_repr(entry) for entry in entries]
File "/usr/local/lib/python3.7/site-packages/google/cloud/bigquery/dataset.py", line 183, in from_api_repr
return cls(role, entity_type, entity_id)
File "/usr/local/lib/python3.7/site-packages/google/cloud/bigquery/dataset.py", line 115, in __init__
raise ValueError(message)
ValueError: Entity type 'iamMember' not among: domain, groupByEmail, specialGroup, userByEmail, view
```
It seems the Google API is returning a new 'iamMember' entity type that is not in the hard-coded list of allowed entity types in [dataset.py](https://github.com/googleapis/google-cloud-python/blob/master/bigquery/google/cloud/bigquery/dataset.py).
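The failing check is a plain membership test against a frozen allow-list, so extending the set is enough to accept the new type. A simplified sketch of that validation pattern (not the actual google-cloud-bigquery source; `check_entity_type` is a hypothetical name):

```python
# Sketch of the entity-type validation pattern described above.
ENTITY_TYPES = frozenset(
    ["userByEmail", "groupByEmail", "domain", "specialGroup", "view"]
)

def check_entity_type(entity_type, allowed=ENTITY_TYPES):
    # Reject any entity type outside the allow-list, as the traceback shows.
    if entity_type not in allowed:
        raise ValueError(
            "Entity type %r not among: %s"
            % (entity_type, ", ".join(sorted(allowed)))
        )

try:
    check_entity_type("iamMember")
except ValueError as exc:
    print(exc)

# The fix amounts to adding the new type to the allow-list.
check_entity_type("iamMember", ENTITY_TYPES | {"iamMember"})
```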
| [
{
"content": "# Copyright 2015 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicabl... | [
{
"content": "# Copyright 2015 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicabl... | diff --git a/bigquery/google/cloud/bigquery/dataset.py b/bigquery/google/cloud/bigquery/dataset.py
index 02664d87b153..99c47026fe3a 100644
--- a/bigquery/google/cloud/bigquery/dataset.py
+++ b/bigquery/google/cloud/bigquery/dataset.py
@@ -123,7 +123,7 @@ class AccessEntry(object):
"""
ENTITY_TYPES = frozenset(
- ["userByEmail", "groupByEmail", "domain", "specialGroup", "view"]
+ ["userByEmail", "groupByEmail", "domain", "specialGroup", "view", "iamMember"]
)
"""Allowed entity types."""
|
huggingface__transformers-6719 | Some weights of AlbertModel were not initialized ['albert.embeddings.position_ids']
Hello!
There seems to be a problem with the current code to load a pre-trained Albert model. This warning appears in any configuration of the Albert model:
`Some weights of AlbertModel were not initialized from the model checkpoint at albert-base-v2 and are newly initialized: ['albert.embeddings.position_ids']`
`You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.`
I found this happens only when I install it from the source. Models load correctly (without warning) when installing the library with pip.
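The fix in the diff below adds an `authorized_missing_keys` pattern list. The idea of filtering missing checkpoint keys against such patterns can be sketched in plain Python (a simplified stand-in, not the actual transformers loading code):

```python
import re

# Sketch of pattern-based suppression of "missing key" warnings: any missing
# checkpoint key matching an authorized pattern is dropped from the report.
authorized_missing_keys = [r"position_ids"]
missing_keys = ["albert.embeddings.position_ids", "albert.pooler.weight"]

for pattern in authorized_missing_keys:
    missing_keys = [k for k in missing_keys if re.search(pattern, k) is None]

print(missing_keys)  # only genuinely missing keys remain
```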
| [
{
"content": "# coding=utf-8\n# Copyright 2018 Google AI, Google Brain and the HuggingFace Inc. team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache... | [
{
"content": "# coding=utf-8\n# Copyright 2018 Google AI, Google Brain and the HuggingFace Inc. team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache... | diff --git a/src/transformers/modeling_albert.py b/src/transformers/modeling_albert.py
index eb0ed4dfbdd5..76f099895252 100755
--- a/src/transformers/modeling_albert.py
+++ b/src/transformers/modeling_albert.py
@@ -403,6 +403,7 @@ class AlbertPreTrainedModel(PreTrainedModel):
config_class = AlbertConfig
base_model_prefix = "albert"
+ authorized_missing_keys = [r"position_ids"]
def _init_weights(self, module):
""" Initialize the weights.
|
espnet__espnet-617 | Conversion of AttributeDict with vars() returns unexpected results
I found a bug.
In the training phase, `train_args` is an `argparse.Namespace`.
So `vars(train_args)` converts it into a dict as follows.
```python
(Pdb) train_args
Namespace(aconv_chans=10, aconv_filts=100, adim=320, aheads=4, asr_model=False, atype='location', awin=5, backend='pytorch', batch_size=30, beam_size=4, char_list=['<blank>', '<unk>', '<space>', 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z', '<eos>'], criterion='acc', ctc_type='warpctc', ctc_weight=0.3, debugdir='exp/train_nodev_pytorch_blstmp_e4_subsample1_2_2_1_1_unit320_proj320_d1_unit300_location_aconvc10_aconvf100_mtlalpha0.5_adadelta_sampprob0.0_bs30_mli800_mlo150', debugmode=1, dict='data/lang_1char/train_nodev_units.txt', dlayers=1, dropout_rate=0.0, dropout_rate_decoder=0.0, dtype='lstm', dunits=300, early_stop_criterion='validation/main/acc', elayers=4, elayers_sd=4, epochs=20, eprojs=320, eps=1e-08, eps_decay=0.01, etype='blstmp', eunits=320, grad_clip=5, lm_weight=0.1, lsm_type='', lsm_weight=0.0, maxlen_in=800, maxlen_out=150, maxlenratio=0.0, minibatches=0, minlenratio=0.0, mt_model=False, mtlalpha=0.5, n_iter_processes=0, nbest=1, ngpu=1, num_save_attention=3, num_spkrs=1, opt='adadelta', outdir='exp/train_nodev_pytorch_blstmp_e4_subsample1_2_2_1_1_unit320_proj320_d1_unit300_location_aconvc10_aconvf100_mtlalpha0.5_adadelta_sampprob0.0_bs30_mli800_mlo150/results', patience=3, penalty=0.0, preprocess_conf=None, report_cer=False, report_wer=False, resume=None, rnnlm=None, rnnlm_conf=None, sampling_probability=0.0, seed=1, sortagrad=0, spa=False, subsample='1_2_2_1_1', sym_blank='<blank>', sym_space='<space>', tensorboard_dir='tensorboard/train_nodev_pytorch_blstmp_e4_subsample1_2_2_1_1_unit320_proj320_d1_unit300_location_aconvc10_aconvf100_mtlalpha0.5_adadelta_sampprob0.0_bs30_mli800_mlo150', threshold=0.0001, train_json='dump/train_nodev/deltafalse/data.json', valid_json='dump/train_dev/deltafalse/data.json', verbose=1, weight_decay=0.0)
(Pdb) vars(train_args)
{'aconv_chans': 10, 'aconv_filts': 100, 'adim': 320, 'aheads': 4, 'asr_model': False, 'atype': 'location', 'awin': 5, 'backend': 'pytorch', 'batch_size': 30, 'beam_size': 4, 'char_list': ['<blank>', '<unk>', '<space>', 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z', '<eos>'], 'criterion': 'acc', 'ctc_type': 'warpctc', 'ctc_weight': 0.3, 'debugdir': 'exp/train_nodev_pytorch_blstmp_e4_subsample1_2_2_1_1_unit320_proj320_d1_unit300_location_aconvc10_aconvf100_mtlalpha0.5_adadelta_sampprob0.0_bs30_mli800_mlo150', 'debugmode': 1, 'dict': 'data/lang_1char/train_nodev_units.txt', 'dlayers': 1, 'dropout_rate': 0.0, 'dropout_rate_decoder': 0.0, 'dtype': 'lstm', 'dunits': 300, 'early_stop_criterion': 'validation/main/acc', 'elayers': 4, 'elayers_sd': 4, 'epochs': 20, 'eprojs': 320, 'eps': 1e-08, 'eps_decay': 0.01, 'etype': 'blstmp', 'eunits': 320, 'grad_clip': 5, 'lm_weight': 0.1, 'lsm_type': '', 'lsm_weight': 0.0, 'maxlen_in': 800, 'maxlen_out': 150, 'maxlenratio': 0.0, 'minibatches': 0, 'minlenratio': 0.0, 'mt_model': False, 'mtlalpha': 0.5, 'n_iter_processes': 0, 'nbest': 1, 'ngpu': 1, 'num_save_attention': 3, 'num_spkrs': 1, 'opt': 'adadelta', 'outdir': 'exp/train_nodev_pytorch_blstmp_e4_subsample1_2_2_1_1_unit320_proj320_d1_unit300_location_aconvc10_aconvf100_mtlalpha0.5_adadelta_sampprob0.0_bs30_mli800_mlo150/results', 'patience': 3, 'penalty': 0.0, 'preprocess_conf': None, 'report_cer': False, 'report_wer': False, 'resume': None, 'rnnlm': None, 'rnnlm_conf': None, 'sampling_probability': 0.0, 'seed': 1, 'sortagrad': 0, 'spa': False, 'subsample': '1_2_2_1_1', 'sym_blank': '<blank>', 'sym_space': '<space>', 'tensorboard_dir': 'tensorboard/train_nodev_pytorch_blstmp_e4_subsample1_2_2_1_1_unit320_proj320_d1_unit300_location_aconvc10_aconvf100_mtlalpha0.5_adadelta_sampprob0.0_bs30_mli800_mlo150', 'threshold': 0.0001, 'train_json': 'dump/train_nodev/deltafalse/data.json', 'valid_json': 
'dump/train_dev/deltafalse/data.json', 'verbose': 1, 'weight_decay': 0.0}
```
However, in the testing phase, the loaded `train_args` is an `AttributeDict`.
Therefore, `vars(train_args)` returns different results.
```python
(Pdb) train_args
<espnet.asr.asr_utils.AttributeDict object at 0x7f2323130a58>
(Pdb) vars(train_args)
{'obj': {'aconv_chans': 10, 'aconv_filts': 100, 'adim': 320, 'aheads': 4, 'asr_model': False, 'atype': 'location', 'awin': 5, 'backend': 'pytorch', 'batch_size': 30, 'beam_size': 4, 'char_list': ['<blank>', '<unk>', '<space>', 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z', '<eos>'], 'criterion': 'acc', 'ctc_type': 'warpctc', 'ctc_weight': 0.3, 'debugdir': 'exp/train_nodev_pytorch_blstmp_e4_subsample1_2_2_1_1_unit320_proj320_d1_unit300_location_aconvc10_aconvf100_mtlalpha0.5_adadelta_sampprob0.0_bs30_mli800_mlo150', 'debugmode': 1, 'dict': 'data/lang_1char/train_nodev_units.txt', 'dlayers': 1, 'dropout_rate': 0.0, 'dropout_rate_decoder': 0.0, 'dtype': 'lstm', 'dunits': 300, 'early_stop_criterion': 'validation/main/acc', 'elayers': 4, 'elayers_sd': 4, 'epochs': 20, 'eprojs': 320, 'eps': 1e-08, 'eps_decay': 0.01, 'etype': 'blstmp', 'eunits': 320, 'grad_clip': 5, 'lm_weight': 0.1, 'lsm_type': '', 'lsm_weight': 0.0, 'maxlen_in': 800, 'maxlen_out': 150, 'maxlenratio': 0.0, 'minibatches': 0, 'minlenratio': 0.0, 'mt_model': False, 'mtlalpha': 0.5, 'n_iter_processes': 0, 'nbest': 1, 'ngpu': 1, 'num_save_attention': 3, 'num_spkrs': 1, 'opt': 'adadelta', 'outdir': 'exp/train_nodev_pytorch_blstmp_e4_subsample1_2_2_1_1_unit320_proj320_d1_unit300_location_aconvc10_aconvf100_mtlalpha0.5_adadelta_sampprob0.0_bs30_mli800_mlo150/results', 'patience': 3, 'penalty': 0.0, 'preprocess_conf': None, 'report_cer': False, 'report_wer': False, 'resume': None, 'rnnlm': None, 'rnnlm_conf': None, 'sampling_probability': 0.0, 'seed': 1, 'sortagrad': 0, 'spa': False, 'subsample': '1_2_2_1_1', 'sym_blank': '<blank>', 'sym_space': '<space>', 'tensorboard_dir': 'tensorboard/train_nodev_pytorch_blstmp_e4_subsample1_2_2_1_1_unit320_proj320_d1_unit300_location_aconvc10_aconvf100_mtlalpha0.5_adadelta_sampprob0.0_bs30_mli800_mlo150', 'threshold': 0.0001, 'train_json': 'dump/train_nodev/deltafalse/data.json', 'valid_json': 
'dump/train_dev/deltafalse/data.json', 'verbose': 1, 'weight_decay': 0.0}}
```
This causes unexpected behavior in the following line.
https://github.com/espnet/espnet/blob/fb1cbd605c5fefc6e82c829cafc01840918c90c4/espnet/nets/pytorch_backend/ctc.py#L116
`vars(train_args).get("ctc_type")` always returns `None`, so `vars(train_args).get("ctc_type", "builtin")` will always return `"builtin"`.
@gtache Is there any reason for using `vars(train_args).get("ctc_type")` instead of `train_args.ctc_type`?
@sw005320 What is your intention in using `AttributeDict` when loading a config file?
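The difference is easy to reproduce with a tiny stand-in for `AttributeDict` (a sketch that only mirrors the wrapping behaviour shown in the pdb output above; the real class lives in `espnet.asr.asr_utils`):

```python
class AttributeDict:
    """Minimal stand-in that wraps a dict in a single ``obj`` attribute."""

    def __init__(self, obj):
        self.obj = obj

    def __getattr__(self, name):
        # Only called when normal attribute lookup fails.
        if name in self.obj:
            return self.obj[name]
        raise AttributeError(name)

args = AttributeDict({"ctc_type": "warpctc"})

# vars() just returns __dict__, so the real keys sit one level down ...
print(vars(args))                             # {'obj': {'ctc_type': 'warpctc'}}
print(vars(args).get("ctc_type", "builtin"))  # always 'builtin' -- the bug

# ... while attribute access reaches into the wrapped dict, which is why
# using args.ctc_type directly works in both phases.
print(args.ctc_type)
```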
| [
{
"content": "import logging\n\nimport numpy as np\nimport torch\nimport torch.nn.functional as F\n\nfrom espnet.nets.pytorch_backend.nets_utils import to_device\n\n\nclass CTC(torch.nn.Module):\n \"\"\"CTC module\n\n :param int odim: dimension of outputs\n :param int eprojs: number of encoder projecti... | [
{
"content": "import logging\n\nimport numpy as np\nimport torch\nimport torch.nn.functional as F\n\nfrom espnet.nets.pytorch_backend.nets_utils import to_device\n\n\nclass CTC(torch.nn.Module):\n \"\"\"CTC module\n\n :param int odim: dimension of outputs\n :param int eprojs: number of encoder projecti... | diff --git a/espnet/nets/pytorch_backend/ctc.py b/espnet/nets/pytorch_backend/ctc.py
index 4573cec9d7e..8c58ca557be 100644
--- a/espnet/nets/pytorch_backend/ctc.py
+++ b/espnet/nets/pytorch_backend/ctc.py
@@ -113,4 +113,4 @@ def ctc_for(args, odim, reduce=True):
:return: the corresponding CTC module
"""
return CTC(odim, args.eprojs, args.dropout_rate,
- ctc_type=vars(args).get('ctc_type', 'builtin'), reduce=reduce)
+ ctc_type=args.ctc_type, reduce=reduce)
|
numba__numba-941 | Update README
Two issues with our README file:
- it is not up-to-date (e.g. it mentions Cython, which we don't use anymore)
- it uses Markdown rather than reST, and therefore is badly formatted when used for the PyPI long description: https://pypi.python.org/pypi/numba
| [
{
"content": "try:\n # Try to use setuptools so as to enable support of the special\n # \"Microsoft Visual C++ Compiler for Python 2.7\" (http://aka.ms/vcpython27)\n # for building under Windows.\n # Note setuptools >= 6.0 is required for this.\n from setuptools import setup, Extension\nexcept Im... | [
{
"content": "try:\n # Try to use setuptools so as to enable support of the special\n # \"Microsoft Visual C++ Compiler for Python 2.7\" (http://aka.ms/vcpython27)\n # for building under Windows.\n # Note setuptools >= 6.0 is required for this.\n from setuptools import setup, Extension\nexcept Im... | diff --git a/MANIFEST.in b/MANIFEST.in
index f6157564a73..17330a666bc 100644
--- a/MANIFEST.in
+++ b/MANIFEST.in
@@ -1,4 +1,4 @@
-include README.md setup.py runtests.py versioneer.py CHANGE_LOG AUTHORS LICENSE
+include README.rst setup.py runtests.py versioneer.py CHANGE_LOG AUTHORS LICENSE
recursive-include docs *.ipynb *.txt *.py Makefile *.rstls
prune docs/_build
prune docs/gh-pages
diff --git a/README.md b/README.rst
similarity index 65%
rename from README.md
rename to README.rst
index 1b6f6ccefc3..657f63b557d 100644
--- a/README.md
+++ b/README.rst
@@ -1,3 +1,4 @@
+=====
Numba
=====
@@ -18,94 +19,93 @@ in the decorator.
Numba is a mechanism for producing machine code from Python syntax and typed
data structures such as those that exist in NumPy.
+
Dependencies
============
- * llvmlite
- * numpy (version 1.6 or higher)
- * argparse (for pycc in python2.6)
- * funcsigs (for Python 2)
+* llvmlite
+* numpy (version 1.6 or higher)
+* argparse (for pycc in python2.6)
+* funcsigs (for Python 2)
+
Installing
-=================
+==========
The easiest way to install numba and get updates is by using the Anaconda
Distribution: https://store.continuum.io/cshop/anaconda/
-```bash
-$ conda install numba
-```
+::
+
+ $ conda install numba
If you wanted to compile Numba from source,
it is recommended to use conda environment to maintain multiple isolated
-development environments. To create a new environment for Numba development:
+development environments. To create a new environment for Numba development::
-```bash
-$ conda create -p ~/dev/mynumba python numpy llvmlite
-```
+ $ conda create -p ~/dev/mynumba python numpy llvmlite
To select the installed version, append "=VERSION" to the package name,
-where, "VERSION" is the version number. For example:
+where, "VERSION" is the version number. For example::
-```bash
-$ conda create -p ~/dev/mynumba python=2.7 numpy=1.6 llvmlite
-```
+ $ conda create -p ~/dev/mynumba python=2.7 numpy=1.6 llvmlite
to use Python 2.7 and Numpy 1.6.
-**Note**: binary packages for llvmlite are currently available from Numba's
-own binstar account, so you'll have to add it to your channels first:
+If you need CUDA support, you should also install the CUDA toolkit::
-```bash
-$ conda config --add channels numba
-```
+ $ conda install cudatoolkit
Custom Python Environments
-==========================
+--------------------------
If you're not using conda, you will need to build llvmlite yourself:
-* Building and installing llvmlite
+Building and installing llvmlite
+''''''''''''''''''''''''''''''''
See https://github.com/numba/llvmlite for the most up-to-date instructions.
You will need a build of LLVM 3.5.
-```bash
-$ git clone https://github.com/numba/llvmlite
-$ cd llvmlite
-$ python setup.py install
-```
+::
+
+ $ git clone https://github.com/numba/llvmlite
+ $ cd llvmlite
+ $ python setup.py install
+
+Installing Numba
+''''''''''''''''
-* Installing Numba
+::
-```bash
-$ git clone https://github.com/numba/numba.git
-$ cd numba
-$ pip install -r requirements.txt
-$ python setup.py build_ext --inplace
-$ python setup.py install
-```
+ $ git clone https://github.com/numba/numba.git
+ $ cd numba
+ $ pip install -r requirements.txt
+ $ python setup.py build_ext --inplace
+ $ python setup.py install
or simply
-```bash
-$ pip install numba
-```
+::
+
+ $ pip install numba
+
+If you want to enable CUDA support, you will need CUDA Toolkit 5.5+ (which
+contains ``libnvvm``). After installing the Toolkit, you might have to
+specify a few environment variables according to
+http://numba.pydata.org/numba-doc/dev/CUDASupport.html
-If you want to enable CUDA support, you will need CUDA Toolkit 5.5+ (which contains
-``libnvvm``). After installing the Toolkit, you might have to specify a few
-environment variables according to http://numba.pydata.org/numba-doc/dev/CUDASupport.html
Documentation
=============
http://numba.pydata.org/numba-doc/dev/index.html
+
Mailing Lists
=============
Join the numba mailing list numba-users@continuum.io:
-
https://groups.google.com/a/continuum.io/d/forum/numba-users
or access it through the Gmane mirror:
@@ -113,6 +113,7 @@ http://news.gmane.org/gmane.comp.python.numba.user
Some old archives are at: http://librelist.com/browser/numba/
+
Website
=======
@@ -120,6 +121,7 @@ See if our sponsor can help you (which can help this project): http://www.contin
http://numba.pydata.org
+
Continuous Integration
======================
diff --git a/setup.py b/setup.py
index 03b1d5de616..4da981636a7 100644
--- a/setup.py
+++ b/setup.py
@@ -21,7 +21,7 @@
cmdclass = versioneer.get_cmdclass()
setup_args = {
- 'long_description': open('README.md').read(),
+ 'long_description': open('README.rst').read(),
}
GCCFLAGS = ["-std=c89", "-Wdeclaration-after-statement", "-Werror"]
|
chainer__chainer-987 | Fix the shape of return value of F.det
Currently, the return value of `det` is an `xp.array` whose shape is `(1,)`, not a scalar.
```
In [16]: a = chainer.Variable(numpy.random.uniform(-1, 1, (3, 3)).astype(numpy.float32))
In [17]: chainer.functions.det(a).data
Out[17]: array([-0.80874199], dtype=float32)
```
But the documentation says the return value should be a `chainer.Variable` whose data has the shape `()`.
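The mismatch comes down to the final reshape target, which NumPy illustrates directly without needing chainer (a sketch of the shapes only, not the actual `det` implementation):

```python
import numpy as np

batched_det = np.array([-0.80874199], dtype=np.float32)

# Reported behaviour: reshaping to (1,) keeps a one-element vector ...
vec = batched_det.reshape((1,))
print(vec.shape, vec.ndim)

# ... while the documented behaviour is a 0-d (scalar-shaped) array.
scalar = batched_det.reshape(())
print(scalar.shape, scalar.ndim)
```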
| [
{
"content": "import numpy\n\nfrom chainer import cuda\nfrom chainer import function\nfrom chainer.functions.array import reshape\nfrom chainer.functions.math import inv\nfrom chainer.functions.math import matmul\nfrom chainer import utils\nfrom chainer.utils import type_check\n\n\ndef _det_gpu(b):\n # We do... | [
{
"content": "import numpy\n\nfrom chainer import cuda\nfrom chainer import function\nfrom chainer.functions.array import reshape\nfrom chainer.functions.math import inv\nfrom chainer.functions.math import matmul\nfrom chainer import utils\nfrom chainer.utils import type_check\n\n\ndef _det_gpu(b):\n # We do... | diff --git a/chainer/functions/math/det.py b/chainer/functions/math/det.py
index c34d5ce5e2ce..219e16247dfa 100644
--- a/chainer/functions/math/det.py
+++ b/chainer/functions/math/det.py
@@ -109,4 +109,4 @@ def det(a):
shape = (1, len(a.data), a.data.shape[1])
batched_a = reshape.Reshape(shape)(a)
batched_det = BatchDet()(batched_a)
- return reshape.Reshape((1, ))(batched_det)
+ return reshape.Reshape(())(batched_det)
diff --git a/tests/chainer_tests/functions_tests/math_tests/test_det.py b/tests/chainer_tests/functions_tests/math_tests/test_det.py
index 021911c02cd9..23ae65cb9f25 100644
--- a/tests/chainer_tests/functions_tests/math_tests/test_det.py
+++ b/tests/chainer_tests/functions_tests/math_tests/test_det.py
@@ -14,11 +14,6 @@
class DetFunctionTestBase(object):
- def setUp(self):
- self.x, self.y, self.gy = self.make_data()
- self.ct = numpy.array(
- [ix.T for ix in self.x], dtype=numpy.float32)
-
def det_transpose(self, gpu=False):
if gpu:
cx = cuda.to_gpu(self.x)
@@ -122,18 +117,20 @@ def test_batch_backward_cpu(self):
x_data, y_grad = self.x, self.gy
gradient_check.check_backward(self.det, x_data, y_grad)
- def test_expect_scalar_cpu(self):
- x = numpy.random.uniform(.5, 1, (2, 2)).astype(numpy.float32)
+ def check_single_matrix(self, x):
x = chainer.Variable(x)
- y = F.det(x)
- self.assertEqual(y.data.ndim, 1)
+ y = self.det(x)
+ if self.batched:
+ self.assertEqual(y.data.ndim, 1)
+ else:
+ self.assertEqual(y.data.ndim, 0)
+
+ def test_single_matrix_cpu(self):
+ self.check_single_matrix(self.x)
@attr.gpu
def test_expect_scalar_gpu(self):
- x = cuda.cupy.random.uniform(.5, 1, (2, 2)).astype(numpy.float32)
- x = chainer.Variable(x)
- y = F.det(x)
- self.assertEqual(y.data.ndim, 1)
+ self.check_single_matrix(cuda.to_gpu(self.x))
def test_zero_det_cpu(self):
x_data, y_grad = self.x, self.gy
@@ -198,11 +195,11 @@ def det(self, x):
def matmul(self, x, y):
return F.batch_matmul(x, y)
- def make_data(self):
- x = numpy.random.uniform(.5, 1, (6, 3, 3)).astype(numpy.float32)
- y = numpy.random.uniform(.5, 1, (6, 3, 3)).astype(numpy.float32)
- gy = numpy.random.uniform(-1, 1, (6,)).astype(numpy.float32)
- return x, y, gy
+ def setUp(self):
+ self.x = numpy.random.uniform(.5, 1, (6, 3, 3)).astype(numpy.float32)
+ self.y = numpy.random.uniform(.5, 1, (6, 3, 3)).astype(numpy.float32)
+ self.gy = numpy.random.uniform(-1, 1, (6,)).astype(numpy.float32)
+ self.ct = self.x.transpose(0, 2, 1)
class TestSquareDet(DetFunctionTestBase, unittest.TestCase):
@@ -214,11 +211,11 @@ def det(self, x):
def matmul(self, x, y):
return F.matmul(x, y)
- def make_data(self):
- x = numpy.random.uniform(.5, 1, (5, 5)).astype(numpy.float32)
- y = numpy.random.uniform(.5, 1, (5, 5)).astype(numpy.float32)
- gy = numpy.random.uniform(-1, 1, (1,)).astype(numpy.float32)
- return x, y, gy
+ def setUp(self):
+ self.x = numpy.random.uniform(.5, 1, (5, 5)).astype(numpy.float32)
+ self.y = numpy.random.uniform(.5, 1, (5, 5)).astype(numpy.float32)
+ self.gy = numpy.random.uniform(-1, 1, ()).astype(numpy.float32)
+ self.ct = self.x.transpose()
class DetFunctionRaiseTest(unittest.TestCase):
|
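The diff above changes the final reshape target from `(1,)` to `()`, so the determinant of a single matrix comes back as a 0-dimensional scalar while the batched path keeps one value per matrix. The shape distinction can be sketched with plain NumPy (illustrative only, not chainer's code):

```python
import numpy as np

# Single matrix: the fixed code reshapes to (), i.e. a 0-d scalar.
single = np.linalg.det(np.eye(3))
assert np.ndim(single) == 0

# Batch of matrices: one determinant per matrix, shape (batch,).
batch = np.linalg.det(np.stack([np.eye(3), 2.0 * np.eye(3)]))
assert batch.shape == (2,)

print(np.allclose(batch, [1.0, 8.0]))  # True: det(I) = 1, det(2I) = 2**3
```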
mitmproxy__mitmproxy-4762 | When too many requests come simultaneously, mitmdump called an error and quited [ValueError: too many file descriptors in select()]
#### Problem Description
A clear and concise description of what the bug is.
When too many requests come simultaneously, mitmdump called an error and quited.
Traceback (most recent call last):
File "mitmdump", line 3, in <module>
File "mitmproxy\tools\main.py", line 147, in mitmdump
File "mitmproxy\tools\main.py", line 114, in run
File "mitmproxy\master.py", line 76, in run
File "mitmproxy\master.py", line 59, in run_loop
File "mitmproxy\master.py", line 95, in shutdown
File "asyncio\base_events.py", line 629, in run_until_complete
File "asyncio\base_events.py", line 596, in run_forever
File "asyncio\base_events.py", line 1854, in _run_once
File "selectors.py", line 324, in select
File "selectors.py", line 315, in _select
ValueError: too many file descriptors in select()
[77436] Failed to execute script 'mitmdump' due to unhandled exception!
I googled the error message, and found the following answer. Don't know if it's related.
https://stackoverflow.com/questions/57182009/why-am-i-getting-an-valueerror-too-many-file-descriptors-in-select
#### Steps to reproduce the behavior:
1. I use the following command
`mitmdump.exe -p 8080 --anticomp -q -s "d:\redirect-router.py"`
In the script, I re-write the host for a specific URL
#### System Information
mitmproxy --version
Mitmproxy: 7.0.2 binary
Python: 3.9.6
OpenSSL: OpenSSL 1.1.1k 25 Mar 2021
Platform: Windows-10-10.0.18363-SP0
| [
{
"content": "import asyncio\nimport sys\n\nif sys.platform == 'win32':\n # workaround for\n # https://github.com/tornadoweb/tornado/issues/2751\n # https://www.tornadoweb.org/en/stable/index.html#installation\n # (copied multiple times in the codebase, please remove all occurrences)\n asyncio.se... | [
{
"content": "",
"path": "mitmproxy/__init__.py"
}
] | diff --git a/CHANGELOG.md b/CHANGELOG.md
index c4f9ff6dd8..acf76819b8 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -3,6 +3,8 @@
## Unreleased: mitmproxy next
* fix some responses not being decoded properly if the encoding was uppercase #4735 (@Mattwmaster58)
+* Windows: Switch to Python's default asyncio event loop, which increases the number of sockets
+ that can be processed simultaneously.
## 4 August 2021: mitmproxy 7.0.2
diff --git a/mitmproxy/__init__.py b/mitmproxy/__init__.py
index 9deef96050..e69de29bb2 100644
--- a/mitmproxy/__init__.py
+++ b/mitmproxy/__init__.py
@@ -1,9 +0,0 @@
-import asyncio
-import sys
-
-if sys.platform == 'win32':
- # workaround for
- # https://github.com/tornadoweb/tornado/issues/2751
- # https://www.tornadoweb.org/en/stable/index.html#installation
- # (copied multiple times in the codebase, please remove all occurrences)
- asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
diff --git a/test/mitmproxy/tools/web/test_app.py b/test/mitmproxy/tools/web/test_app.py
index 79456e6003..4272b202b6 100644
--- a/test/mitmproxy/tools/web/test_app.py
+++ b/test/mitmproxy/tools/web/test_app.py
@@ -2,18 +2,10 @@
import json as _json
import logging
import os
-import sys
from unittest import mock
import pytest
-if sys.platform == 'win32':
- # workaround for
- # https://github.com/tornadoweb/tornado/issues/2751
- # https://www.tornadoweb.org/en/stable/index.html#installation
- # (copied multiple times in the codebase, please remove all occurrences)
- asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
-
import tornado.testing # noqa
from tornado import httpclient # noqa
from tornado import websocket # noqa
|
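For context on the fix: the removed snippet pinned Windows to the selector event loop (a tornado workaround), and that loop's `select()` backend caps out at roughly 512 file descriptors, which produces the ValueError in the traceback under load. The proactor loop that Python now defaults to on Windows has no such ceiling. A minimal sketch of what such an opt-in policy override looks like (illustrative, not mitmproxy's code):

```python
import asyncio
import sys

def configure_loop(force_selector: bool = False) -> None:
    """Opt into the selector loop only when explicitly requested.

    On Windows the selector loop inherits select()'s ~512-descriptor
    limit; leaving the default (proactor) policy in place avoids the
    "too many file descriptors in select()" crash.
    """
    if force_selector and sys.platform == "win32":
        asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main() -> str:
    return "ok"

configure_loop()            # no-op outside Windows / without force_selector
print(asyncio.run(main()))  # ok
```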
jazzband__django-debug-toolbar-1872 | New AJAX request resets whole view if HistoryPanel is enabled.
Maybe I am doing something wrong, but I find working with DDT with HistoryPanel enabled quite annoying.
I have notifications on my website which make a request every ~5 seconds if there is anything new.
If I have HistoryPanel enabled in DDT, this means that if I am exploring some request (from history or just the last one) and this AJAX notification request is made, I lose everything I am seeing and the whole DDT resets to the newest (notification) request.
Would it be possible to set DDT so that it switches the request only if I explicitly select it from history?
| [
{
"content": "import warnings\nfrom functools import lru_cache\n\nfrom django.conf import settings\nfrom django.dispatch import receiver\nfrom django.test.signals import setting_changed\n\nCONFIG_DEFAULTS = {\n # Toolbar options\n \"DISABLE_PANELS\": {\n \"debug_toolbar.panels.profiling.ProfilingPa... | [
{
"content": "import warnings\nfrom functools import lru_cache\n\nfrom django.conf import settings\nfrom django.dispatch import receiver\nfrom django.test.signals import setting_changed\n\nCONFIG_DEFAULTS = {\n # Toolbar options\n \"DISABLE_PANELS\": {\n \"debug_toolbar.panels.profiling.ProfilingPa... | diff --git a/debug_toolbar/settings.py b/debug_toolbar/settings.py
index eb6b59209..1df24527d 100644
--- a/debug_toolbar/settings.py
+++ b/debug_toolbar/settings.py
@@ -42,6 +42,7 @@
"SQL_WARNING_THRESHOLD": 500, # milliseconds
"OBSERVE_REQUEST_CALLBACK": "debug_toolbar.toolbar.observe_request",
"TOOLBAR_LANGUAGE": None,
+ "UPDATE_ON_FETCH": False,
}
diff --git a/debug_toolbar/static/debug_toolbar/js/history.js b/debug_toolbar/static/debug_toolbar/js/history.js
index b30fcabae..314ddb3ef 100644
--- a/debug_toolbar/static/debug_toolbar/js/history.js
+++ b/debug_toolbar/static/debug_toolbar/js/history.js
@@ -104,3 +104,6 @@ $$.on(djDebug, "click", ".refreshHistory", function (event) {
event.preventDefault();
refreshHistory();
});
+// We don't refresh the whole toolbar each fetch or ajax request,
+// so we need to refresh the history when we open the panel
+$$.onPanelRender(djDebug, "HistoryPanel", refreshHistory);
diff --git a/debug_toolbar/static/debug_toolbar/js/toolbar.js b/debug_toolbar/static/debug_toolbar/js/toolbar.js
index 6648fb52b..199616336 100644
--- a/debug_toolbar/static/debug_toolbar/js/toolbar.js
+++ b/debug_toolbar/static/debug_toolbar/js/toolbar.js
@@ -17,8 +17,10 @@ function getDebugElement() {
const djdt = {
handleDragged: false,
+ needUpdateOnFetch: false,
init() {
const djDebug = getDebugElement();
+ djdt.needUpdateOnFetch = djDebug.dataset.updateOnFetch === "True";
$$.on(djDebug, "click", "#djDebugPanelList li a", function (event) {
event.preventDefault();
if (!this.className) {
@@ -274,7 +276,9 @@ const djdt = {
storeId = encodeURIComponent(storeId);
const dest = `${sidebarUrl}?store_id=${storeId}`;
slowjax(dest).then(function (data) {
- replaceToolbarState(storeId, data);
+ if (djdt.needUpdateOnFetch){
+ replaceToolbarState(storeId, data);
+ }
});
}
diff --git a/debug_toolbar/templates/debug_toolbar/base.html b/debug_toolbar/templates/debug_toolbar/base.html
index 5447970af..6f4967f21 100644
--- a/debug_toolbar/templates/debug_toolbar/base.html
+++ b/debug_toolbar/templates/debug_toolbar/base.html
@@ -16,7 +16,7 @@
data-sidebar-url="{{ history_url }}"
{% endif %}
data-default-show="{% if toolbar.config.SHOW_COLLAPSED %}false{% else %}true{% endif %}"
- {{ toolbar.config.ROOT_TAG_EXTRA_ATTRS|safe }}>
+ {{ toolbar.config.ROOT_TAG_EXTRA_ATTRS|safe }} data-update-on-fetch="{{ toolbar.config.UPDATE_ON_FETCH }}">
<div class="djdt-hidden" id="djDebugToolbar">
<ul id="djDebugPanelList">
<li><a id="djHideToolBarButton" href="#" title="{% trans 'Hide toolbar' %}">{% trans "Hide" %} »</a></li>
diff --git a/docs/changes.rst b/docs/changes.rst
index 82185d756..9ff88b2b8 100644
--- a/docs/changes.rst
+++ b/docs/changes.rst
@@ -19,6 +19,9 @@ Pending
<https://astral.sh/blog/the-ruff-formatter>`__.
* Changed the default position of the toolbar from top to the upper top
position.
+* Added the setting, ``UPDATE_ON_FETCH`` to control whether the
+ toolbar automatically updates to the latest AJAX request or not.
+ It defaults to ``False``.
4.2.0 (2023-08-10)
------------------
diff --git a/docs/configuration.rst b/docs/configuration.rst
index 887608c6e..8271092ca 100644
--- a/docs/configuration.rst
+++ b/docs/configuration.rst
@@ -163,6 +163,16 @@ Toolbar options
but want to render your application in French, you would set this to
``"en-us"`` and :setting:`LANGUAGE_CODE` to ``"fr"``.
+.. _UPDATE_ON_FETCH:
+
+* ``UPDATE_ON_FETCH``
+
+ Default: ``False``
+
+ This controls whether the toolbar should update to the latest AJAX
+ request when it occurs. This is especially useful when using htmx
+ boosting or similar JavaScript techniques.
+
Panel options
~~~~~~~~~~~~~
diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt
index 7a15d9aeb..436977bdc 100644
--- a/docs/spelling_wordlist.txt
+++ b/docs/spelling_wordlist.txt
@@ -6,6 +6,7 @@ Pympler
Roboto
Transifex
Werkzeug
+ajax
async
backend
backends
diff --git a/tests/templates/ajax/ajax.html b/tests/templates/ajax/ajax.html
new file mode 100644
index 000000000..c9de3acb6
--- /dev/null
+++ b/tests/templates/ajax/ajax.html
@@ -0,0 +1,21 @@
+{% extends "base.html" %}
+{% block content %}
+ <div id="click_for_ajax">click for ajax</div>
+
+ <script>
+
+ let click_for_ajax = document.getElementById("click_for_ajax");
+ function send_ajax() {
+ let xhr = new XMLHttpRequest();
+ let url = '/json_view/';
+ xhr.open("GET", url, true);
+ xhr.onreadystatechange = function () {
+ if (this.readyState == 4 && this.status == 200) {
+ console.log(this.responseText);
+ }
+ }
+ xhr.send();
+ }
+ document.addEventListener("click", (event) => {send_ajax()});
+ </script>
+{% endblock %}
diff --git a/tests/test_integration.py b/tests/test_integration.py
index b77b7cede..379fafaf4 100644
--- a/tests/test_integration.py
+++ b/tests/test_integration.py
@@ -1,5 +1,6 @@
import os
import re
+import time
import unittest
import html5lib
@@ -749,3 +750,24 @@ def test_toolbar_language_will_render_to_locale_when_set_both(self):
)
self.assertIn("Query", table.text)
self.assertIn("Action", table.text)
+
+ def test_ajax_dont_refresh(self):
+ self.get("/ajax/")
+ make_ajax = self.selenium.find_element(By.ID, "click_for_ajax")
+ make_ajax.click()
+ history_panel = self.selenium.find_element(By.ID, "djdt-HistoryPanel")
+ self.assertIn("/ajax/", history_panel.text)
+ self.assertNotIn("/json_view/", history_panel.text)
+
+ @override_settings(DEBUG_TOOLBAR_CONFIG={"UPDATE_ON_FETCH": True})
+ def test_ajax_refresh(self):
+ self.get("/ajax/")
+ make_ajax = self.selenium.find_element(By.ID, "click_for_ajax")
+ make_ajax.click()
+ # Need to wait until the ajax request is over and json_view is displayed on the toolbar
+ time.sleep(2)
+ history_panel = self.wait.until(
+ lambda selenium: self.selenium.find_element(By.ID, "djdt-HistoryPanel")
+ )
+ self.assertNotIn("/ajax/", history_panel.text)
+ self.assertIn("/json_view/", history_panel.text)
diff --git a/tests/urls.py b/tests/urls.py
index 6fc8811b7..f8929f1e8 100644
--- a/tests/urls.py
+++ b/tests/urls.py
@@ -21,6 +21,7 @@
path("cached_low_level_view/", views.cached_low_level_view),
path("json_view/", views.json_view),
path("redirect/", views.redirect_view),
+ path("ajax/", views.ajax_view),
path("login_without_redirect/", LoginView.as_view(redirect_field_name=None)),
path("admin/", admin.site.urls),
path("__debug__/", include("debug_toolbar.urls")),
diff --git a/tests/views.py b/tests/views.py
index b2fd21c54..c7214029e 100644
--- a/tests/views.py
+++ b/tests/views.py
@@ -58,3 +58,7 @@ def listcomp_view(request):
def redirect_view(request):
return HttpResponseRedirect("/regular/redirect/")
+
+
+def ajax_view(request):
+ return render(request, "ajax/ajax.html")
|
translate__pootle-3671 | Confusing sentence in permissions view
There is a permission called "Can review translations" that confused me, as I thought there were also reviewers besides suggesters and translators! Hopefully you can fix it so that it lands in 2.7.0.
| [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship informa... | [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship informa... | diff --git a/pootle/core/initdb.py b/pootle/core/initdb.py
index fc14e6d8cc5..3b3b5ab3b07 100644
--- a/pootle/core/initdb.py
+++ b/pootle/core/initdb.py
@@ -124,7 +124,7 @@ def create_pootle_permissions():
'codename': "translate",
},
{
- 'name': _("Can review translations"),
+ 'name': _("Can review suggestions"),
'codename': "review",
},
{
|
archlinux__archinstall-1787 | archinstall crashing
Here's the traceback
```
Traceback (most recent call last):
File "/usr/bin/archinstall", line 5, in <module>
from archinstall import run_as_a_module
File "/usr/lib/python3.11/site-packages/archinstall/__init__.py", line 5, in <module>
from .lib import disk
File "/usr/lib/python3.11/site-packages/archinstall/lib/disk/__init__.py", line 1, in <module>
from .device_handler import device_handler, disk_layouts
File "/usr/lib/python3.11/site-packages/archinstall/lib/disk/device_handler.py", line 16, in <module>
from .device_model import (
File "/usr/lib/python3.11/site-packages/archinstall/lib/disk/device_model.py", line 849, in <module>
@dataclass
^^^^^^^^^
File "/usr/lib/python3.11/dataclasses.py", line 1223, in dataclass
return wrap(cls)
^^^^^^^^^
File "/usr/lib/python3.11/dataclasses.py", line 1213, in wrap
return _process_class(cls, init, repr, eq, order, unsafe_hash,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/dataclasses.py", line 958, in _process_class
cls_fields.append(_get_field(cls, name, type, kw_only))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/dataclasses.py", line 815, in _get_field
raise ValueError(f'mutable default {type(f.default)} for field '
ValueError: mutable default <class 'archinstall.lib.disk.device_model.Size'> for field size is not allowed: use default_factory
```
| [
{
"content": "from __future__ import annotations\n\nimport dataclasses\nimport json\nimport logging\nimport math\nimport time\nimport uuid\nfrom dataclasses import dataclass, field\nfrom enum import Enum\nfrom enum import auto\nfrom pathlib import Path\nfrom typing import Optional, List, Dict, TYPE_CHECKING, An... | [
{
"content": "from __future__ import annotations\n\nimport dataclasses\nimport json\nimport logging\nimport math\nimport time\nimport uuid\nfrom dataclasses import dataclass, field\nfrom enum import Enum\nfrom enum import auto\nfrom pathlib import Path\nfrom typing import Optional, List, Dict, TYPE_CHECKING, An... | diff --git a/archinstall/lib/disk/device_model.py b/archinstall/lib/disk/device_model.py
index 8e26b1d703..d57347b737 100644
--- a/archinstall/lib/disk/device_model.py
+++ b/archinstall/lib/disk/device_model.py
@@ -851,7 +851,7 @@ class LsblkInfo:
name: str = ''
path: Path = Path()
pkname: str = ''
- size: Size = Size(0, Unit.B)
+ size: Size = field(default_factory=lambda: Size(0, Unit.B))
log_sec: int = 0
pttype: str = ''
ptuuid: str = ''
|
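The ValueError above comes from Python 3.11's stricter dataclass check: a class-typed default that Python cannot prove immutable must be supplied via `default_factory`, which is exactly what the one-line fix does. A self-contained reproduction with a hypothetical `Size` stand-in (not archinstall's real class):

```python
from dataclasses import dataclass, field

@dataclass
class Size:          # stand-in for archinstall's Size(0, Unit.B)
    value: int = 0

# This form raises ValueError on Python 3.11+
# ("mutable default ... use default_factory"):
#   @dataclass
#   class LsblkInfo:
#       size: Size = Size(0)

@dataclass
class LsblkInfo:
    # default_factory builds a fresh Size per instance, as in the fix.
    size: Size = field(default_factory=lambda: Size(0))

a, b = LsblkInfo(), LsblkInfo()
a.size.value = 99
print(b.size.value)  # 0: each instance gets its own Size
```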
ivy-llc__ivy-17989 | fmax
| [
{
"content": "# global\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes, with_supported_dtypes\nfrom ivy.functional.frontends.paddle.func_wrapper import to_ivy_arrays_and_back\n\n\n@with_unsupported_dtypes({\"2.5.0 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\... | [
{
"content": "# global\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes, with_supported_dtypes\nfrom ivy.functional.frontends.paddle.func_wrapper import to_ivy_arrays_and_back\n\n\n@with_unsupported_dtypes({\"2.5.0 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\... | diff --git a/ivy/functional/frontends/paddle/tensor/math.py b/ivy/functional/frontends/paddle/tensor/math.py
index a55742b58857c..c0301050626a6 100644
--- a/ivy/functional/frontends/paddle/tensor/math.py
+++ b/ivy/functional/frontends/paddle/tensor/math.py
@@ -253,3 +253,9 @@ def fmin(x, y, name=None):
@to_ivy_arrays_and_back
def logit(x, eps=None, name=None):
return ivy.logit(x, eps=eps)
+
+
+@with_unsupported_dtypes({"2.5.0 and below": "bfloat16"}, "paddle")
+@to_ivy_arrays_and_back
+def fmax(x, y, name=None):
+ return ivy.fmax(x, y)
diff --git a/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_paddle_math.py b/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_paddle_math.py
index 27fc996cbfa93..4bb3bcc18a915 100644
--- a/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_paddle_math.py
+++ b/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_paddle_math.py
@@ -1080,3 +1080,29 @@ def test_paddle_angle(
on_device=on_device,
x=x[0],
)
+
+
+@handle_frontend_test(
+ fn_tree="paddle.fmax",
+ dtypes_and_x=helpers.dtype_and_values(
+ available_dtypes=helpers.get_dtypes("float"), num_arrays=2, shared_dtype=True
+ ),
+)
+def test_paddle_fmax(
+ *,
+ dtypes_and_x,
+ on_device,
+ fn_tree,
+ frontend,
+ test_flags,
+):
+ input_dtype, x = dtypes_and_x
+ helpers.test_frontend_function(
+ input_dtypes=input_dtype,
+ frontend=frontend,
+ test_flags=test_flags,
+ fn_tree=fn_tree,
+ on_device=on_device,
+ x=x[0],
+ y=x[1],
+ )
|
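For reference on the semantics being added: `fmax` differs from a plain elementwise maximum in its NaN handling — where exactly one operand is NaN, the other operand is returned. NumPy's `fmax`, whose behavior the `ivy.fmax`-backed frontend above mirrors, shows the contrast (illustrative):

```python
import numpy as np

x = np.array([1.0, np.nan, 3.0])
y = np.array([2.0, 2.0, np.nan])

# fmax: a lone NaN is ignored in favor of the other operand.
print(np.fmax(x, y).tolist())     # [2.0, 2.0, 3.0]
# maximum: NaN propagates into the result.
print(np.maximum(x, y).tolist())  # [2.0, nan, nan]
```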
conan-io__conan-127 | mark headers as "SYSTEM" headers to silence warnings
Many libraries generate tons of warnings in public headers. WebSocket++ uses auto_ptr for example and many Boost libraries truncate integers implicitly (-Wconversion). To consume these libraries you have to treat them as system headers because GCC won't emit warnings in these.
This is how Conan currently sets the include directories:
``` CMake
include_directories(${CONAN_INCLUDE_DIRS})
```
This is how you would add them as "system" headers to silence warnings:
``` CMake
include_directories(SYSTEM ${CONAN_INCLUDE_DIRS})
```
Is there a reason it is not already done this way?
This issue may apply to configurations other than CMake/GCC, too, but this is the most important one for me.
| [
{
"content": "from conans.model import Generator\nfrom conans.paths import BUILD_INFO_CMAKE\n\n\nclass DepsCppCmake(object):\n def __init__(self, deps_cpp_info):\n self.include_paths = \"\\n\\t\\t\\t\".join('\"%s\"' % p.replace(\"\\\\\", \"/\")\n for p in de... | [
{
"content": "from conans.model import Generator\nfrom conans.paths import BUILD_INFO_CMAKE\n\n\nclass DepsCppCmake(object):\n def __init__(self, deps_cpp_info):\n self.include_paths = \"\\n\\t\\t\\t\".join('\"%s\"' % p.replace(\"\\\\\", \"/\")\n for p in de... | diff --git a/conans/client/generators/cmake.py b/conans/client/generators/cmake.py
index 827f8b2cb1a..56291fdedcf 100644
--- a/conans/client/generators/cmake.py
+++ b/conans/client/generators/cmake.py
@@ -82,7 +82,7 @@ def _aux_cmake_test_setup(self):
endmacro()
macro(CONAN_FLAGS_SETUP)
- include_directories(${CONAN_INCLUDE_DIRS})
+ include_directories(SYSTEM ${CONAN_INCLUDE_DIRS})
link_directories(${CONAN_LIB_DIRS})
add_definitions(${CONAN_DEFINES})
|
pypa__pip-6427 | ensure_dir() should also check for ENOTEMPTY
**Environment**
* pip version:
pip 19.0.3
* Python version:
python 3.7
* OS:
'python:3.7-alpine3.9' docker image (docker-ce 17.9) running on Ubuntu 18.04.2 LTS in WSL (Windows)
**Description**
`pip install pipenv` fails with the following error:
> Could not install packages due to an EnvironmentError: [Errno 39] Directory not empty: '/tmp/pip-install-wx86kab7/pipenv/pipenv'
**How to Reproduce**
1. Set up environment as described above (alpine3.9, on docker-ce 17.9, running on Ubuntu in WSL). and run the command.
2. `apk --update add --virtual build-dependencies libffi-dev openssl-dev build-base`
3. `pip install --upgrade pip`
4. `pip install pipenv`
**Output**
> Could not install packages due to an EnvironmentError: [Errno 39] Directory not empty: '/tmp/pip-install-wx86kab7/pipenv/pipenv'
**Investigation**
Compared strace results of a successful run (on a different env) vs the failed run.
On a successful run, the `mkdir` command is continually executed with `/tmp/pip-install-<hash>/pipenv/pipenv` as an argument and fails with an `EEXIST` error. However, on the failed run the same command fails with an `ENOTEMPTY` error. This has to do with the environment itself (maybe docker/windows related), as the same difference is observed when simply performing mkdir from a shell.
| [
{
"content": "from __future__ import absolute_import\n\nimport contextlib\nimport errno\nimport io\n# we have a submodule named 'logging' which would shadow this if we used the\n# regular name:\nimport logging as std_logging\nimport os\nimport posixpath\nimport re\nimport shutil\nimport stat\nimport subprocess\... | [
{
"content": "from __future__ import absolute_import\n\nimport contextlib\nimport errno\nimport io\n# we have a submodule named 'logging' which would shadow this if we used the\n# regular name:\nimport logging as std_logging\nimport os\nimport posixpath\nimport re\nimport shutil\nimport stat\nimport subprocess\... | diff --git a/news/6426.bugfix b/news/6426.bugfix
new file mode 100644
index 00000000000..25512b3c808
--- /dev/null
+++ b/news/6426.bugfix
@@ -0,0 +1 @@
+Make ``ensure_dir()`` also ignore ``ENOTEMPTY`` as seen on Windows.
diff --git a/src/pip/_internal/utils/misc.py b/src/pip/_internal/utils/misc.py
index ca7a529387c..7c7fc4a09b8 100644
--- a/src/pip/_internal/utils/misc.py
+++ b/src/pip/_internal/utils/misc.py
@@ -98,7 +98,8 @@ def ensure_dir(path):
try:
os.makedirs(path)
except OSError as e:
- if e.errno != errno.EEXIST:
+ # Windows can raise spurious ENOTEMPTY errors. See #6426.
+ if e.errno != errno.EEXIST and e.errno != errno.ENOTEMPTY:
raise
|
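The patch widens `ensure_dir()` to treat `ENOTEMPTY` like `EEXIST`, since some Windows/WSL filesystems report it spuriously from `mkdir` on an existing directory. A rough stdlib sketch of the same pattern (hypothetical helper, not pip's exact code):

```python
import errno
import os
import tempfile

def ensure_dir(path):
    """Create path if needed; tolerate races and Windows/WSL quirks."""
    try:
        os.makedirs(path)
    except OSError as e:
        # EEXIST: the normal "already created" race.
        # ENOTEMPTY: spurious error seen on some Windows/WSL setups (#6426).
        if e.errno not in (errno.EEXIST, errno.ENOTEMPTY):
            raise

with tempfile.TemporaryDirectory() as tmp:
    target = os.path.join(tmp, "pkg")
    ensure_dir(target)
    ensure_dir(target)  # second call is a no-op, not an error
    print(os.path.isdir(target))  # True
```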
coala__coala-4276 | pytest-3.1 raises lots of warnings running our tests
Latest `pytest-3.1.x` versions raise several warnings when running our tests, mostly warning that the `unittest` functions `assertEquals` and `assertRaisesRegexp` should not be used anymore. We should get rid of those warnings...
| [
{
"content": "import os\nimport platform\nimport re\nfrom functools import lru_cache\n\nfrom coala_utils.decorators import yield_once\nfrom coalib.misc.Constants import GLOBBING_SPECIAL_CHARS\n\n\ndef _end_of_set_index(string, start_index):\n \"\"\"\n Returns the position of the appropriate closing bracke... | [
{
"content": "import os\nimport platform\nimport re\nfrom functools import lru_cache\n\nfrom coala_utils.decorators import yield_once\nfrom coalib.misc.Constants import GLOBBING_SPECIAL_CHARS\n\n\ndef _end_of_set_index(string, start_index):\n \"\"\"\n Returns the position of the appropriate closing bracke... | diff --git a/coalib/parsing/Globbing.py b/coalib/parsing/Globbing.py
index c31726a861..0e64920688 100644
--- a/coalib/parsing/Globbing.py
+++ b/coalib/parsing/Globbing.py
@@ -191,7 +191,7 @@ def translate(pattern):
regex += '[' + sequence + ']'
else:
regex = regex + re.escape(char)
- return regex + '\\Z(?ms)'
+ return '(?ms)' + regex + '\\Z'
def fnmatch(name, globs):
diff --git a/test-requirements.txt b/test-requirements.txt
index 05831f5b12..b622eb08e7 100644
--- a/test-requirements.txt
+++ b/test-requirements.txt
@@ -2,7 +2,7 @@ argcomplete~=1.8
coverage~=4.3.4
codecov~=2.0.5
freezegun~=0.3.9
-pytest~=3.0
+pytest~=3.1.1
pytest-cov~=2.2
pytest-env~=0.6.0
pytest-mock~=1.1
diff --git a/tests/bearlib/languages/LanguageTest.py b/tests/bearlib/languages/LanguageTest.py
index 84dba7d9e5..71a5d2e684 100644
--- a/tests/bearlib/languages/LanguageTest.py
+++ b/tests/bearlib/languages/LanguageTest.py
@@ -21,9 +21,9 @@ def tearDown(self):
pass
def test_invalid_attribute(self):
- with self.assertRaisesRegexp(AttributeError, 'not a valid attribute'):
+ with self.assertRaisesRegex(AttributeError, 'not a valid attribute'):
self.lang_cpp.not_an_attribute
def test_attribute_list_empy(self):
- with self.assertRaisesRegexp(AttributeError, 'no available attribute'):
+ with self.assertRaisesRegex(AttributeError, 'no available attribute'):
self.lang_unknown.not_an_attribute
diff --git a/tests/bears/BearTest.py b/tests/bears/BearTest.py
index c8977052e2..36da1d32f4 100644
--- a/tests/bears/BearTest.py
+++ b/tests/bears/BearTest.py
@@ -364,7 +364,7 @@ def test_connection_timeout_mocked(self):
exc = requests.exceptions.ConnectTimeout
with requests_mock.Mocker() as reqmock:
reqmock.get(self.mock_url, exc=exc)
- with self.assertRaisesRegexp(exc, '^$'):
+ with self.assertRaisesRegex(exc, '^$'):
self.uut.download_cached_file(
self.mock_url, self.filename)
@@ -380,17 +380,18 @@ def test_read_broken(self):
with requests_mock.Mocker() as reqmock:
reqmock.get(self.mock_url, body=fake_content_provider)
- with self.assertRaisesRegexp(exc, 'Fake read timeout'):
+ with self.assertRaisesRegex(exc, 'Fake read timeout'):
self.uut.download_cached_file(
self.mock_url, self.filename)
self.assertTrue(isfile(self.file_location))
- self.assertEqual(open(self.file_location, 'rb').read(),
- b''.join(fake_content))
+
+ with open(self.file_location, 'rb') as fh:
+ self.assertEqual(fh.read(), b''.join(fake_content))
def test_status_code_error(self):
exc = requests.exceptions.HTTPError
- with self.assertRaisesRegexp(exc, '418 Client Error'):
+ with self.assertRaisesRegex(exc, '418 Client Error'):
self.uut.download_cached_file(
'http://httpbin.org/status/418', self.filename)
diff --git a/tests/core/BearTest.py b/tests/core/BearTest.py
index 941db19379..843c8f9d92 100644
--- a/tests/core/BearTest.py
+++ b/tests/core/BearTest.py
@@ -211,13 +211,13 @@ def test_download_cached_file_connection_timeout_mocked(self):
exc = requests.exceptions.ConnectTimeout
with requests_mock.Mocker() as reqmock:
reqmock.get(mock_url, exc=exc)
- with self.assertRaisesRegexp(exc, '^$'):
+ with self.assertRaisesRegex(exc, '^$'):
Bear.download_cached_file(
mock_url, 'test.html')
def test_download_cached_file_status_code_error(self):
exc = requests.exceptions.HTTPError
- with self.assertRaisesRegexp(exc, '418 Client Error'):
+ with self.assertRaisesRegex(exc, '418 Client Error'):
Bear.download_cached_file(
'http://httpbin.org/status/418', 'test.html')
diff --git a/tests/output/JSONEncoderTest.py b/tests/output/JSONEncoderTest.py
index 9b05000b5f..7f624392f5 100644
--- a/tests/output/JSONEncoderTest.py
+++ b/tests/output/JSONEncoderTest.py
@@ -56,25 +56,25 @@ class JSONEncoderTest(unittest.TestCase):
kw = {'cls': JSONEncoder, 'sort_keys': True}
def test_builtins(self):
- self.assertEquals('"test"', json.dumps('test', **self.kw))
- self.assertEquals('1', json.dumps(1, **self.kw))
- self.assertEquals('true', json.dumps(True, **self.kw))
- self.assertEquals('null', json.dumps(None, **self.kw))
+ self.assertEqual('"test"', json.dumps('test', **self.kw))
+ self.assertEqual('1', json.dumps(1, **self.kw))
+ self.assertEqual('true', json.dumps(True, **self.kw))
+ self.assertEqual('null', json.dumps(None, **self.kw))
def test_iter(self):
- self.assertEquals('[0, 1]', json.dumps([0, 1], **self.kw))
- self.assertEquals('[0, 1]', json.dumps((0, 1), **self.kw))
- self.assertEquals('[0, 1]', json.dumps(range(2), **self.kw))
+ self.assertEqual('[0, 1]', json.dumps([0, 1], **self.kw))
+ self.assertEqual('[0, 1]', json.dumps((0, 1), **self.kw))
+ self.assertEqual('[0, 1]', json.dumps(range(2), **self.kw))
def test_dict(self):
- self.assertEquals('{"0": 1}', json.dumps({0: 1}, **self.kw))
- self.assertEquals('{"0": 1}', json.dumps({'0': 1}, **self.kw))
- self.assertEquals('{"0": "1"}', json.dumps({'0': '1'}, **self.kw))
+ self.assertEqual('{"0": 1}', json.dumps({0: 1}, **self.kw))
+ self.assertEqual('{"0": 1}', json.dumps({'0': 1}, **self.kw))
+ self.assertEqual('{"0": "1"}', json.dumps({'0': '1'}, **self.kw))
def test_time(self):
tf = datetime.today()
- self.assertEquals('"' + tf.isoformat() + '"',
- json.dumps(tf, **self.kw))
+ self.assertEqual('"' + tf.isoformat() + '"',
+ json.dumps(tf, **self.kw))
def test_re_object(self):
uut = re.compile('x')
@@ -83,19 +83,19 @@ def test_re_object(self):
def test_class1(self):
tc1 = TestClass1()
- self.assertEquals('{"a": 0}', json.dumps(tc1, **self.kw))
- self.assertEquals('[{"a": 0}]', json.dumps([tc1], **self.kw))
- self.assertEquals('{"0": {"a": 0}}', json.dumps({0: tc1}, **self.kw))
+ self.assertEqual('{"a": 0}', json.dumps(tc1, **self.kw))
+ self.assertEqual('[{"a": 0}]', json.dumps([tc1], **self.kw))
+ self.assertEqual('{"0": {"a": 0}}', json.dumps({0: tc1}, **self.kw))
def test_class2(self):
tc2 = TestClass2()
- self.assertEquals('{"a": 0, "b": {"a": 0}}',
- json.dumps(tc2, **self.kw))
+ self.assertEqual('{"a": 0, "b": {"a": 0}}',
+ json.dumps(tc2, **self.kw))
def test_class3(self):
tc3 = TestClass3()
- self.assertEquals('{"key": "val"}',
- json.dumps(tc3, **self.kw))
+ self.assertEqual('{"key": "val"}',
+ json.dumps(tc3, **self.kw))
def test_propertied_class(self):
uut = PropertiedClass()
diff --git a/tests/parsing/DefaultArgParserTest.py b/tests/parsing/DefaultArgParserTest.py
index 1750c474db..18f5115b41 100644
--- a/tests/parsing/DefaultArgParserTest.py
+++ b/tests/parsing/DefaultArgParserTest.py
@@ -23,11 +23,11 @@ def test_metavar_in_usage(self):
self.output,
flags=re.DOTALL)
self.assertIsNotNone(match)
- self.assertEquals(match.group(1), '-a [BOOL]')
+ self.assertEqual(match.group(1), '-a [BOOL]')
def test_metavar_not_in_optional_args_sections(self):
match = re.search('optional arguments:.+(-a, --all).*',
self.output,
flags=re.DOTALL)
self.assertIsNotNone(match)
- self.assertEquals(match.group(1), '-a, --all')
+ self.assertEqual(match.group(1), '-a, --all')
diff --git a/tests/results/result_actions/ApplyPatchActionTest.py b/tests/results/result_actions/ApplyPatchActionTest.py
index 41a11c552f..2b0c0ae02f 100644
--- a/tests/results/result_actions/ApplyPatchActionTest.py
+++ b/tests/results/result_actions/ApplyPatchActionTest.py
@@ -115,7 +115,8 @@ def test_apply_rename(self):
file_diff_dict)
self.assertFalse(isfile(f_a+'.renamed.orig'))
- file_dict = {f_a+'.renamed': open(f_a+'.renamed').readlines()}
+ with open(f_a+'.renamed') as fh:
+ file_dict = {f_a+'.renamed': fh.readlines()}
self.assertEqual(file_dict, expected_file_dict)
# Recreate file so that context manager make_temp() can delete it
|
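Besides the assert renames, the diff moves the `(?ms)` inline flags from the tail of the generated glob regex to its head; Python deprecated global inline flags that are not at the start of the pattern (and 3.11+ rejects them), which is what newer pytest surfaces as warnings. A small illustration with a made-up pattern:

```python
import re

# Correct: global flags lead the pattern, as in the patched translate().
pattern = "(?ms)" + re.escape("src/") + ".*" + r"\Z"
regex = re.compile(pattern)

assert regex.match("src/module.py")        # \Z anchors at the end
assert regex.match("src/a\nb")             # (?s) lets .* cross newlines
# The old form appended "(?ms)" after "\Z"; that placement now warns
# (and errors on Python 3.11+) as a global flag mid-expression.
print(bool(regex.match("other/file.py")))  # False: prefix must match
```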
ansible__molecule-3103 | Current directory being inadvertently added to ANSIBLE_LIBRARY
# Issue Type
- Bug report
# Molecule and Ansible details
```
ansible --version && molecule --version
ansible 2.10.7.post0
config file = None
configured module search path = ['/home/mgraves/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/mgraves/git/ansible/lib/ansible
executable location = /home/mgraves/git/ansible/bin/ansible
python version = 3.8.8 (default, Mar 5 2021, 11:47:01) [GCC 10.2.1 20210110]
molecule 3.3.0 using python 3.8
ansible:2.10.7.post0
delegated:3.3.0 from molecule
docker:0.2.4 from molecule_docker
```
Molecule installation method (one of):
- pip
Ansible installation method (one of):
- source
# Desired Behavior
Molecule should successfully complete.
# Actual Behaviour
```
~/git/ansible_collections/community/kubernetes $ molecule --debug converge -- -vvv
DEBUG Validating schema /home/mgraves/git/ansible_collections/community/kubernetes/molecule/default/molecule.yml.
INFO default scenario test matrix: dependency, create, prepare, converge
INFO Performing prerun...
INFO Added ANSIBLE_LIBRARY=:plugins/modules
INFO Added ANSIBLE_COLLECTIONS_PATH=/home/mgraves/git:/home/mgraves/.ansible/collections:/home/mgraves/git:/home/mgraves/.ansible/collections:./.cache/collections
INFO Running default > dependency
WARNING Skipping, missing the requirements file.
WARNING Skipping, missing the requirements file.
INFO Running default > create
WARNING Skipping, instances are delegated.
INFO Running default > prepare
WARNING Skipping, prepare playbook not configured.
INFO Running default > converge
DEBUG: ANSIBLE ENVIRONMENT:
ANSIBLE_COLLECTIONS_PATH: /home/mgraves/.cache/molecule/kubernetes/default/collections:/home/mgraves/git:/home/mgraves/.ansible/collections:/usr/share/ansible/collections:/etc/ansible/collections
ANSIBLE_CONFIG: /home/mgraves/.cache/molecule/kubernetes/default/ansible.cfg
ANSIBLE_FILTER_PLUGINS: /home/mgraves/git/ansible/venv/lib/python3.8/site-packages/molecule/provisioner/ansible/plugins/filter:/home/mgraves/.cache/molecule/kubernetes/default/plugins/filter:/home/mgraves/git/ansible_collections/community/kubernetes/plugins/filter:/home/mgraves/.ansible/plugins/filter:/usr/share/ansible/plugins/filter
ANSIBLE_FORCE_COLOR: 'true'
ANSIBLE_HOME: /home/mgraves/git/ansible
ANSIBLE_LIBRARY: /home/mgraves/git/ansible/venv/lib/python3.8/site-packages/molecule/provisioner/ansible/plugins/modules:/home/mgraves/.cache/molecule/kubernetes/default/library:/home/mgraves/git/ansible_collections/community/kubernetes/library:/home/mgraves/.ansible/plugins/modules:/usr/share/ansible/plugins/modules:/home/mgraves/git/ansible_collections/community/kubernetes/:plugins/modules
ANSIBLE_ROLES_PATH: '/home/mgraves/.cache/molecule/kubernetes/default/roles:/home/mgraves/git/ansible_collections/community:/home/mgraves/.ansible/roles:/usr/share/ansible/roles:/etc/ansible/roles:'
DEBUG: MOLECULE ENVIRONMENT:
MOLECULE_DEBUG: 'True'
MOLECULE_DEPENDENCY_NAME: galaxy
MOLECULE_DRIVER_NAME: delegated
MOLECULE_ENV_FILE: /home/mgraves/git/ansible_collections/community/kubernetes/.env.yml
MOLECULE_EPHEMERAL_DIRECTORY: /home/mgraves/.cache/molecule/kubernetes/default
MOLECULE_FILE: /home/mgraves/.cache/molecule/kubernetes/default/molecule.yml
MOLECULE_INSTANCE_CONFIG: /home/mgraves/.cache/molecule/kubernetes/default/instance_config.yml
MOLECULE_INVENTORY_FILE: /home/mgraves/.cache/molecule/kubernetes/default/inventory/ansible_inventory.yml
MOLECULE_PROJECT_DIRECTORY: /home/mgraves/git/ansible_collections/community/kubernetes
MOLECULE_PROVISIONER_NAME: ansible
MOLECULE_SCENARIO_DIRECTORY: /home/mgraves/git/ansible_collections/community/kubernetes/molecule/default
MOLECULE_SCENARIO_NAME: default
MOLECULE_STATE_FILE: /home/mgraves/.cache/molecule/kubernetes/default/state.yml
MOLECULE_VERIFIER_NAME: ansible
MOLECULE_VERIFIER_TEST_DIRECTORY: /home/mgraves/git/ansible_collections/community/kubernetes/molecule/default/tests
DEBUG: SHELL REPLAY:
ANSIBLE_COLLECTIONS_PATH=/home/mgraves/.cache/molecule/kubernetes/default/collections:/home/mgraves/git:/home/mgraves/.ansible/collections:/usr/share/ansible/collections:/etc/ansible/collections ANSIBLE_CONFIG=/home/mgraves/.cache/molecule/kubernetes/default/ansible.cfg ANSIBLE_FILTER_PLUGINS=/home/mgraves/git/ansible/venv/lib/python3.8/site-packages/molecule/provisioner/ansible/plugins/filter:/home/mgraves/.cache/molecule/kubernetes/default/plugins/filter:/home/mgraves/git/ansible_collections/community/kubernetes/plugins/filter:/home/mgraves/.ansible/plugins/filter:/usr/share/ansible/plugins/filter ANSIBLE_FORCE_COLOR=true ANSIBLE_HOME=/home/mgraves/git/ansible ANSIBLE_LIBRARY=/home/mgraves/git/ansible/venv/lib/python3.8/site-packages/molecule/provisioner/ansible/plugins/modules:/home/mgraves/.cache/molecule/kubernetes/default/library:/home/mgraves/git/ansible_collections/community/kubernetes/library:/home/mgraves/.ansible/plugins/modules:/usr/share/ansible/plugins/modules:/home/mgraves/git/ansible_collections/community/kubernetes/:plugins/modules ANSIBLE_ROLES_PATH=/home/mgraves/.cache/molecule/kubernetes/default/roles:/home/mgraves/git/ansible_collections/community:/home/mgraves/.ansible/roles:/usr/share/ansible/roles:/etc/ansible/roles: MOLECULE_DEBUG=True MOLECULE_DEPENDENCY_NAME=galaxy MOLECULE_DRIVER_NAME=delegated MOLECULE_ENV_FILE=/home/mgraves/git/ansible_collections/community/kubernetes/.env.yml MOLECULE_EPHEMERAL_DIRECTORY=/home/mgraves/.cache/molecule/kubernetes/default MOLECULE_FILE=/home/mgraves/.cache/molecule/kubernetes/default/molecule.yml MOLECULE_INSTANCE_CONFIG=/home/mgraves/.cache/molecule/kubernetes/default/instance_config.yml MOLECULE_INVENTORY_FILE=/home/mgraves/.cache/molecule/kubernetes/default/inventory/ansible_inventory.yml MOLECULE_PROJECT_DIRECTORY=/home/mgraves/git/ansible_collections/community/kubernetes MOLECULE_PROVISIONER_NAME=ansible 
MOLECULE_SCENARIO_DIRECTORY=/home/mgraves/git/ansible_collections/community/kubernetes/molecule/default MOLECULE_SCENARIO_NAME=default MOLECULE_STATE_FILE=/home/mgraves/.cache/molecule/kubernetes/default/state.yml MOLECULE_VERIFIER_NAME=ansible MOLECULE_VERIFIER_TEST_DIRECTORY=/home/mgraves/git/ansible_collections/community/kubernetes/molecule/default/tests
COMMAND: ansible-playbook --diff --inventory /home/mgraves/.cache/molecule/kubernetes/default/inventory --skip-tags molecule-notest,notest -vvv /home/mgraves/git/ansible_collections/community/kubernetes/molecule/default/converge.yml
ansible-playbook 2.10.7.post0
config file = /home/mgraves/.cache/molecule/kubernetes/default/ansible.cfg
configured module search path = ['/home/mgraves/git/ansible/venv/lib/python3.8/site-packages/molecule/provisioner/ansible/plugins/modules', '/home/mgraves/.cache/molecule/kubernetes/default/library', '/home/mgraves/git/ansible_collections/community/kubernetes/library', '/home/mgraves/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules', '/home/mgraves/git/ansible_collections/community/kubernetes', '/home/mgraves/git/ansible_collections/community/kubernetes/plugins/modules']
ansible python module location = /home/mgraves/git/ansible/lib/ansible
executable location = /home/mgraves/git/ansible/bin/ansible-playbook
python version = 3.8.8 (default, Mar 5 2021, 11:47:01) [GCC 10.2.1 20210110]
Using /home/mgraves/.cache/molecule/kubernetes/default/ansible.cfg as config file
[WARNING]: running playbook inside collection community.kubernetes
[WARNING]: * Failed to parse /home/mgraves/.cache/molecule/kubernetes/default/
inventory/ansible_inventory.yml with
ansible_collections.community.kubernetes.plugins.inventory.k8s plugin:
Incorrect plugin name in file: none found
File "/home/mgraves/git/ansible/lib/ansible/inventory/manager.py", line 289, in parse_source
plugin.parse(self._inventory, self._loader, source, cache=cache)
File "/home/mgraves/git/ansible_collections/community/kubernetes/plugins/inventory/k8s.py", line 153, in parse
config_data = self._read_config_data(path)
File "/home/mgraves/git/ansible/lib/ansible/plugins/inventory/__init__.py", line 227, in _read_config_data
raise AnsibleParserError("Incorrect plugin name in file: %s" % config.get('plugin', 'none found'))
[WARNING]: Unable to parse /home/mgraves/.cache/molecule/kubernetes/default/inv
entory/ansible_inventory.yml as an inventory source
[WARNING]: Invalid characters were found in group names but not replaced, use
-vvvv to see details
Parsed /home/mgraves/.cache/molecule/kubernetes/default/inventory/hosts inventory source with ansible_collections.community.kubernetes.plugins.inventory.k8s plugin
Read vars_file 'vars/main.yml'
Read vars_file 'vars/main.yml'
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
Read vars_file 'vars/main.yml'
Read vars_file 'vars/main.yml'
Read vars_file 'vars/main.yml'
Read vars_file 'vars/main.yml'
Read vars_file 'vars/main.yml'
Read vars_file 'vars/main.yml'
Read vars_file 'vars/main.yml'
Read vars_file 'vars/main.yml'
Read vars_file 'vars/main.yml'
Read vars_file 'vars/main.yml'
Read vars_file 'vars/main.yml'
redirecting (type: action) community.kubernetes.k8s to community.kubernetes.k8s_info
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: converge.yml *********************************************************
3 plays in /home/mgraves/git/ansible_collections/community/kubernetes/molecule/default/converge.yml
Read vars_file 'vars/main.yml'
Read vars_file 'vars/main.yml'
Read vars_file 'vars/main.yml'
PLAY [Converge] ****************************************************************
Read vars_file 'vars/main.yml'
TASK [Gathering Facts] *********************************************************
task path: /home/mgraves/git/ansible_collections/community/kubernetes/molecule/default/converge.yml:2
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: mgraves
<127.0.0.1> EXEC /bin/sh -c 'echo ~mgraves && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/mgraves/.ansible/tmp `"&& mkdir "` echo /home/mgraves/.ansible/tmp/ansible-tmp-1616596936.7291377-1244769-237862218132468 `" && echo ansible-tmp-1616596936.7291377-1244769-237862218132468="` echo /home/mgraves/.ansible/tmp/ansible-tmp-1616596936.7291377-1244769-237862218132468 `" ) && sleep 0'
Using module file /home/mgraves/git/ansible_collections/community/kubernetes/setup.cfg
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /home/mgraves/.ansible/tmp/ansible-tmp-1616596936.7291377-1244769-237862218132468/ > /dev/null 2>&1 && sleep 0'
fatal: [localhost]: FAILED! => {
"msg": "module (ansible.legacy.setup) is missing interpreter line"
}
PLAY RECAP *********************************************************************
localhost : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
CRITICAL Ansible return code was 2, command was: ansible-playbook --diff --inventory /home/mgraves/.cache/molecule/kubernetes/default/inventory --skip-tags molecule-notest,notest -vvv /home/mgraves/git/ansible_collections/community/kubernetes/molecule/default/converge.yml
```
Our test suite started failing with the 3.3.0 version of molecule. The prerun change that was added in #3077 started adding the current directory to `ANSIBLE_LIBRARY`. This can cause ansible to fail when there is a file with the same name as a module, as can be seen above, where ansible tries to read `setup.cfg` in our project root as the `setup` module.
The problem seems to be in https://github.com/ansible-community/molecule/blob/60b68140fb5c650c47019f5db238c0864dbd43ed/src/molecule/provisioner/ansible.py#L943. In our case, after ansible-lint has run `prepare_environment`, the `ANSIBLE_LIBRARY` env var is `:plugins/modules`. I would think calling `abs_path` on this is probably not appropriate, since this is a colon-separated path string and should just be read in unprocessed.
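A minimal stdlib-only sketch of the handling the reporter suggests: split the colon-separated `ANSIBLE_LIBRARY` value and resolve each entry on its own. Here `os.path.abspath` stands in for molecule's `util.abs_path`, and skipping empty entries is an assumption for illustration, not molecule's actual behavior:

```python
import os.path


def split_ansible_library(value):
    """Split a colon-separated ANSIBLE_LIBRARY value into absolute paths.

    Empty entries (e.g. the leading colon in ":plugins/modules") are
    skipped, so the current directory is not added by accident.
    """
    return [os.path.abspath(entry) for entry in value.split(":") if entry]
```

With `":plugins/modules"` this yields only the absolute path of `plugins/modules`, instead of also treating the whole string as a single path.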
| [
{
"content": "# Copyright (c) 2015-2018 Cisco Systems, Inc.\n\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to\n# deal in the Software without restriction, including without limitation the\n# rights... | [
{
"content": "# Copyright (c) 2015-2018 Cisco Systems, Inc.\n\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to\n# deal in the Software without restriction, including without limitation the\n# rights... | diff --git a/.github/workflows/tox.yml b/.github/workflows/tox.yml
index d4a61c8db1..af350717a0 100644
--- a/.github/workflows/tox.yml
+++ b/.github/workflows/tox.yml
@@ -25,17 +25,17 @@ jobs:
- tox_env: lint
- tox_env: docs
- tox_env: py36
- PREFIX: PYTEST_REQPASS=433
+ PREFIX: PYTEST_REQPASS=435
- tox_env: py37
- PREFIX: PYTEST_REQPASS=433
+ PREFIX: PYTEST_REQPASS=435
- tox_env: py38
- PREFIX: PYTEST_REQPASS=433
+ PREFIX: PYTEST_REQPASS=435
- tox_env: py39
- PREFIX: PYTEST_REQPASS=433
+ PREFIX: PYTEST_REQPASS=435
- tox_env: py36-devel
- PREFIX: PYTEST_REQPASS=433
+ PREFIX: PYTEST_REQPASS=435
- tox_env: py39-devel
- PREFIX: PYTEST_REQPASS=433
+ PREFIX: PYTEST_REQPASS=435
- tox_env: packaging
- tox_env: eco
- tox_env: dockerfile
diff --git a/src/molecule/provisioner/ansible.py b/src/molecule/provisioner/ansible.py
index 9982e03691..99a7173aa9 100644
--- a/src/molecule/provisioner/ansible.py
+++ b/src/molecule/provisioner/ansible.py
@@ -940,7 +940,7 @@ def _get_modules_directories(self):
)
if os.environ.get("ANSIBLE_LIBRARY"):
- paths.extend([util.abs_path(os.environ.get("ANSIBLE_LIBRARY"))])
+ paths.extend(map(util.abs_path, os.environ["ANSIBLE_LIBRARY"].split(":")))
return paths
diff --git a/src/molecule/test/unit/provisioner/test_ansible.py b/src/molecule/test/unit/provisioner/test_ansible.py
index 61bd744a1a..73ffe768a2 100644
--- a/src/molecule/test/unit/provisioner/test_ansible.py
+++ b/src/molecule/test/unit/provisioner/test_ansible.py
@@ -20,6 +20,7 @@
import collections
import os
+import re
import pytest
@@ -684,22 +685,36 @@ def test_get_plugin_directory(_instance):
assert ("molecule", "provisioner", "ansible", "plugins") == parts[-4:]
-def test_get_modules_directories(_instance, monkeypatch):
- result = _instance._get_modules_directories()[0]
- parts = pytest.helpers.os_split(result)
- x = ("molecule", "provisioner", "ansible", "plugins", "modules")
+def test_get_modules_directories_default(_instance, monkeypatch):
+ monkeypatch.delenv("ANSIBLE_LIBRARY", raising=False)
+
+ paths = _instance._get_modules_directories()
+
+ assert len(paths) == 5
+ assert re.search(r"molecule/provisioner/ansible/plugins/modules$", paths[0])
+ assert re.search(r"\.cache/molecule/[^/]+/default/library$", paths[1])
+ assert re.search(r"/library$", paths[2])
+ assert re.search(r"\.ansible/plugins/modules$", paths[3])
+ assert re.search(r"/usr/share/ansible/plugins/modules$", paths[4])
+
+
+def test_get_modules_directories_single_ansible_library(_instance, monkeypatch):
+ monkeypatch.setenv("ANSIBLE_LIBRARY", "/abs/path/lib")
+
+ paths = _instance._get_modules_directories()
+
+ assert len(paths) == 6
+ assert paths[-1] == "/abs/path/lib"
- assert x == parts[-5:]
- lib_prev = os.environ.get("ANSIBLE_LIBRARY")
- monkeypatch.setenv("ANSIBLE_LIBRARY", "/foo/bar")
- result = _instance._get_modules_directories()[-1]
- monkeypatch.setenv("ANSIBLE_LIBRARY", lib_prev if lib_prev else "")
+def test_get_modules_directories_multi_ansible_library(_instance, monkeypatch):
+ monkeypatch.setenv("ANSIBLE_LIBRARY", "relpath/lib:/abs/path/lib")
- env_lib_result_parts = pytest.helpers.os_split(result)
- env_lib_expected_parts = ("foo", "bar")
+ paths = _instance._get_modules_directories()
- assert env_lib_result_parts == env_lib_expected_parts[-2:]
+ assert len(paths) == 7
+ assert paths[-2].endswith("relpath/lib")
+ assert paths[-1] == "/abs/path/lib"
def test_get_filter_plugin_directory(_instance):
|
ccnmtl__django-pagetree-89 | Custom pageblocks in Hierarchy menu
Jess has a feature request for pagetree:
WORTH has a giant hierarchy menu: https://worth2.ccnmtl.columbia.edu/pages/edit/ and it would be nice to see which sections have which pageblocks on them. She says it shouldn't list text blocks, html blocks, quizzes etc., since many of those will be present on every page.
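The filtering rule being requested — count only block types that are not ubiquitous — can be sketched as plain Python. The type names and the exclusion set below are illustrative assumptions, not pagetree's API, and the merged change may count blocks differently:

```python
# Block types assumed to appear on nearly every page, per the request;
# this set is illustrative only.
UBIQUITOUS_BLOCKS = {"textblock", "htmlblock", "quiz"}


def notable_block_count(block_types):
    """Count pageblocks worth surfacing next to a section in the menu."""
    return sum(1 for t in block_types if t not in UBIQUITOUS_BLOCKS)
```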
| [
{
"content": "# Copyright (c) 2007-2015, Columbia Center For New Media Teaching And Learning (CCNMTL)\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n# * Redistributions of source... | [
{
"content": "# Copyright (c) 2007-2015, Columbia Center For New Media Teaching And Learning (CCNMTL)\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are met:\n# * Redistributions of source... | diff --git a/CHANGES.txt b/CHANGES.txt
index edfd1023..1d7c5f10 100644
--- a/CHANGES.txt
+++ b/CHANGES.txt
@@ -1,3 +1,7 @@
+1.1.5 (2015-05-15)
+==================
+* Add pageblock count to sections in the edit page
+
1.1.4 (2015-05-13)
==================
* Show sections with empty labels in the edit_page hierarchy
diff --git a/pagetree/media/css/pagetree.css b/pagetree/media/css/pagetree.css
index 1db47171..fde9040d 100644
--- a/pagetree/media/css/pagetree.css
+++ b/pagetree/media/css/pagetree.css
@@ -1,5 +1,4 @@
-.admin_block_column
-{
+.admin_block_column {
width: 870px;
margin: 5px 0 0 15px;
padding: 5px 0 0 25px;
@@ -13,27 +12,27 @@
border-bottom-color: #eeeeee;
}
-.admin_tools_column
-{
+.admin_tools_column {
text-align: left;
}
-.admin_block_view_row1
-{
+.admin_block_view_row1 {
background-color: #eee;
}
-.admin_block_view_row2
-{
+.admin_block_view_row2 {
background-color: #fff;
}
-#id_label, #id_slug, #id_template
-{
+#id_label, #id_slug, #id_template {
width: 300px;
}
.draghandle {
cursor: move;
float: left;
+}
+
+.pagetree-pageblock-section-count {
+ font-size: smaller;
}
\ No newline at end of file
diff --git a/pagetree/templates/pagetree/edit_page.html b/pagetree/templates/pagetree/edit_page.html
index deba18e0..d1fcc417 100644
--- a/pagetree/templates/pagetree/edit_page.html
+++ b/pagetree/templates/pagetree/edit_page.html
@@ -124,8 +124,14 @@ <h3 style="margin-top: 0;"><a href="{{root.get_edit_url}}">{{hierarchy.name}}</a
<span class="glyphicon glyphicon-hand-right"></span>
<strong>
{% endifequal %}
- <a href="{{s.get_edit_url}}"
- >{{s.label|default:"Empty Label"}}{% if s.label|length < 1 %} ({{s.slug}}){% endif %}</a>
+ <span title="The number of pageblocks in this section"
+ class="pagetree-pageblock-section-count">
+ {{s.pageblock_set.count}}
+ </span>
+ <a href="{{s.get_edit_url}}"
+ >{{s.label|default:"Empty Label"}}
+ {% if s.label|length < 1 %} ({{s.slug}}) {% endif %}
+ </a>
{% ifequal s section %}
</strong>
<span class="glyphicon glyphicon-hand-left"></span>
diff --git a/setup.py b/setup.py
index 2acf536f..8fcd3aa7 100644
--- a/setup.py
+++ b/setup.py
@@ -27,7 +27,7 @@
setup(
name="django-pagetree",
- version="1.1.4",
+ version="1.1.5",
author="Anders Pearson",
author_email="anders@columbia.edu",
url="https://github.com/ccnmtl/django-pagetree",
|
bridgecrewio__checkov-4012 | Dependent Package "packaging" upgrade halts invocation
**Describe the issue**
Currently we are running checkov in a CI environment in Azure DevOps over our Terraform configurations. Earlier today Checkov started failing to run; at first this was believed to be linked to the Checkov release that occurred earlier.
Investigation, though, has shown that the dependency `packaging` has also had a release, which dropped `LegacyVersion` from its codebase (see the stack trace below).
The quick solution is to pin `packaging==21.3` to ensure the needed codebase functionality is in place.
This seems to apply only to environments that freshly install everything, as it went unnoticed in local development until the CI pipeline triggered the issue.
**Examples**
In the ADO CI this simple version should recreate the behavior:
```
- script: |
python -m pip install --upgrade pip setuptools wheel
pip install checkov
displayName: "Install Checkov"
- task: Bash@3
displayName: Run Checkov tests
inputs:
targetType: "inline"
script: |
checkov -d . -o cli
```
**Exception Trace**
```sh
Traceback (most recent call last):
File "/opt/hostedtoolcache/Python/3.8.15/x64/bin/checkov", line 2, in <module>
from checkov.main import run
File "/opt/hostedtoolcache/Python/3.8.15/x64/lib/python3.8/site-packages/checkov/main.py", line 20, in <module>
from checkov.argo_workflows.runner import Runner as argo_workflows_runner
File "/opt/hostedtoolcache/Python/3.8.15/x64/lib/python3.8/site-packages/checkov/argo_workflows/runner.py", line 7, in <module>
from checkov.common.images.image_referencer import ImageReferencer, Image
File "/opt/hostedtoolcache/Python/3.8.15/x64/lib/python3.8/site-packages/checkov/common/images/image_referencer.py", line 12, in <module>
from checkov.common.bridgecrew.vulnerability_scanning.image_scanner import image_scanner
File "/opt/hostedtoolcache/Python/3.8.15/x64/lib/python3.8/site-packages/checkov/common/bridgecrew/vulnerability_scanning/image_scanner.py", line 15, in <module>
from checkov.common.bridgecrew.vulnerability_scanning.integrations.docker_image_scanning import \
File "/opt/hostedtoolcache/Python/3.8.15/x64/lib/python3.8/site-packages/checkov/common/bridgecrew/vulnerability_scanning/integrations/docker_image_scanning.py", line 8, in <module>
from checkov.common.bridgecrew.vulnerability_scanning.integrations.twistcli import TwistcliIntegration
File "/opt/hostedtoolcache/Python/3.8.15/x64/lib/python3.8/site-packages/checkov/common/bridgecrew/vulnerability_scanning/integrations/twistcli.py", line 11, in <module>
from checkov.common.bridgecrew.platform_integration import bc_integration
File "/opt/hostedtoolcache/Python/3.8.15/x64/lib/python3.8/site-packages/checkov/common/bridgecrew/platform_integration.py", line 31, in <module>
from checkov.common.bridgecrew.wrapper import reduce_scan_reports, persist_checks_results, \
File "/opt/hostedtoolcache/Python/3.8.15/x64/lib/python3.8/site-packages/checkov/common/bridgecrew/wrapper.py", line 14, in <module>
from checkov.common.util.json_utils import CustomJSONEncoder
File "/opt/hostedtoolcache/Python/3.8.15/x64/lib/python3.8/site-packages/checkov/common/util/json_utils.py", line 6, in <module>
from packaging.version import LegacyVersion, Version
ImportError: cannot import name 'LegacyVersion' from 'packaging.version' (/opt/hostedtoolcache/Python/3.8.15/x64/lib/python3.8/site-packages/packaging/version.py)
```
**Desktop (please complete the following information):**
- OS: Ubuntu 20.04 ADO Pipeline Container
- Checkov Version: tested 2.2.124 and 2.2.116, likely applies to others if they have the dependency
**Additional context**
Release in packaging that causes this issue is `22.0`, `21.3` appears to function as expected.
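checkov's actual remedy was to pin `packaging==21.3`, but as an illustration, an import shim along these lines also survives the `22.0` removal. The `safe_parse` helper is hypothetical, not checkov's code:

```python
try:
    from packaging.version import LegacyVersion  # exists only in packaging < 22.0
except ImportError:
    LegacyVersion = None  # packaging >= 22.0 dropped it

from packaging.version import InvalidVersion, Version


def safe_parse(raw):
    """Parse a version string; return None instead of raising on non-PEP 440 input."""
    try:
        return Version(raw)
    except InvalidVersion:
        return None
```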
| [
{
"content": "#!/usr/bin/env python\nimport logging\nimport os\nfrom importlib import util\nfrom os import path\n\nimport setuptools\nfrom setuptools import setup\n\n# read the contents of your README file\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, \"README.md\")... | [
{
"content": "#!/usr/bin/env python\nimport logging\nimport os\nfrom importlib import util\nfrom os import path\n\nimport setuptools\nfrom setuptools import setup\n\n# read the contents of your README file\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, \"README.md\")... | diff --git a/Pipfile b/Pipfile
index 49b82a3f54..74502f78a7 100644
--- a/Pipfile
+++ b/Pipfile
@@ -55,7 +55,7 @@ jmespath = "*"
tqdm = "*"
update-checker = "*"
semantic-version = "*"
-packaging = "*"
+packaging = "==21.3"
cloudsplaining = ">=0.4.3"
networkx = "<2.7"
dockerfile-parse ="*"
diff --git a/Pipfile.lock b/Pipfile.lock
index 1584a9be8e..6c50e65f7c 100644
--- a/Pipfile.lock
+++ b/Pipfile.lock
@@ -1,7 +1,7 @@
{
"_meta": {
"hash": {
- "sha256": "ae6d1b7e14b39f049aad52a8d0bee5677dc964db53939eb49eb1587b8b523bda"
+ "sha256": "74a303f71e9f83752824ccd5e623d33c3d1b2142a84db7cb4d687e5597ffd467"
},
"pipfile-spec": 6,
"requires": {
@@ -149,14 +149,6 @@
"markers": "python_version >= '3.6'",
"version": "==4.0.2"
},
- "asynctest": {
- "hashes": [
- "sha256:5da6118a7e6d6b54d83a8f7197769d046922a44d2a99c21382f0a6e4fadae676",
- "sha256:c27862842d15d83e6a34eb0b2866c323880eb3a75e4485b079ea11748fd77fac"
- ],
- "markers": "python_version < '3.8'",
- "version": "==0.13.0"
- },
"attrs": {
"hashes": [
"sha256:29adc2665447e5191d0e7c568fde78b21f9672d344281d0c6e1ab085429b22b6",
@@ -191,19 +183,19 @@
},
"boto3": {
"hashes": [
- "sha256:6531198b9d4cd86a1945eaffb2c5d8d75c5447b72870ad2b07c411ea75d6e59c",
- "sha256:f6117707d140363c58ffe41495400cc88c35c165e0f711c6b62edadbd6f600b5"
+ "sha256:7048335b099473816240046753d77ef0953ce5a037b93b2848477dcf036d3849",
+ "sha256:e0ba3620feb430e31926270ea3dc0bb94df55a0fd2b209bd91f7904f2f2166ef"
],
"index": "pypi",
- "version": "==1.26.24"
+ "version": "==1.26.25"
},
"botocore": {
"hashes": [
- "sha256:4f9c92979b29132185f645f61bdf0fecf031ecc26a8dd1a99dbd88d323211325",
- "sha256:fb37c63d5e2b7c778f52d096c6c54207d49143cd942b4dc19297086a1385a7cd"
+ "sha256:a204140c9d7adadf3919d8024d79278f1865a20c869e4f216eaea599ca3a1743",
+ "sha256:cb489ca8fbc043cd9bf901e3e105f0dec316ed438ee883e55c9f9c77bd0f6a2d"
],
"markers": "python_version >= '3.7'",
- "version": "==1.29.24"
+ "version": "==1.29.25"
},
"cached-property": {
"hashes": [
@@ -222,11 +214,11 @@
},
"certifi": {
"hashes": [
- "sha256:0d9c601124e5a6ba9712dbc60d9c53c21e34f5f641fe83002317394311bdce14",
- "sha256:90c1a32f1d68f940488354e36370f6cca89f0f106db09518524c88d6ed83f382"
+ "sha256:35824b4c3a97115964b408844d64aa14db1cc518f6562e8d7261699d1350a9e3",
+ "sha256:4ad3232f5e926d6718ec31cfc1fcadfde020920e278684144551c91769c7bc18"
],
"markers": "python_version >= '3.6'",
- "version": "==2022.9.24"
+ "version": "==2022.12.7"
},
"cffi": {
"hashes": [
@@ -318,7 +310,7 @@
"sha256:0f8ca79bc9b1d6fcaafdbe194b17ba1a2dde44ddf19087235c3efed2ad288143",
"sha256:78ee474f07a0ca0ef6c0317bb3ebe79387aafb0c4a1e03b1d8b2b0be1e42fc78"
],
- "markers": "python_version >= '3.6' and python_version < '4'",
+ "markers": "python_version >= '3.6' and python_version < '4.0'",
"version": "==0.5.5"
},
"cloudsplaining": {
@@ -506,19 +498,11 @@
},
"importlib-metadata": {
"hashes": [
- "sha256:8a8a81bcf996e74fee46f0d16bd3eaa382a7eb20fd82445c3ad11f4090334116",
- "sha256:dd0173e8f150d6815e098fd354f6414b0f079af4644ddfe90c71e2fc6174346d"
+ "sha256:d5059f9f1e8e41f80e9c56c2ee58811450c31984dfa625329ffd7c0dad88a73b",
+ "sha256:d84d17e21670ec07990e1044a99efe8d615d860fd176fc29ef5c306068fda313"
],
"index": "pypi",
- "version": "==4.13.0"
- },
- "importlib-resources": {
- "hashes": [
- "sha256:32bb095bda29741f6ef0e5278c42df98d135391bee5f932841efc0041f748dc3",
- "sha256:c09b067d82e72c66f4f8eb12332f5efbebc9b007c0b6c40818108c9870adc363"
- ],
- "markers": "python_version < '3.9'",
- "version": "==5.10.1"
+ "version": "==5.1.0"
},
"jinja2": {
"hashes": [
@@ -1402,14 +1386,6 @@
"markers": "python_version >= '3.6'",
"version": "==4.0.2"
},
- "asynctest": {
- "hashes": [
- "sha256:5da6118a7e6d6b54d83a8f7197769d046922a44d2a99c21382f0a6e4fadae676",
- "sha256:c27862842d15d83e6a34eb0b2866c323880eb3a75e4485b079ea11748fd77fac"
- ],
- "markers": "python_version < '3.8'",
- "version": "==0.13.0"
- },
"attrs": {
"hashes": [
"sha256:29adc2665447e5191d0e7c568fde78b21f9672d344281d0c6e1ab085429b22b6",
@@ -1428,11 +1404,11 @@
},
"certifi": {
"hashes": [
- "sha256:0d9c601124e5a6ba9712dbc60d9c53c21e34f5f641fe83002317394311bdce14",
- "sha256:90c1a32f1d68f940488354e36370f6cca89f0f106db09518524c88d6ed83f382"
+ "sha256:35824b4c3a97115964b408844d64aa14db1cc518f6562e8d7261699d1350a9e3",
+ "sha256:4ad3232f5e926d6718ec31cfc1fcadfde020920e278684144551c91769c7bc18"
],
"markers": "python_version >= '3.6'",
- "version": "==2022.9.24"
+ "version": "==2022.12.7"
},
"cfgv": {
"hashes": [
@@ -1682,20 +1658,12 @@
"markers": "python_version >= '3.5'",
"version": "==3.4"
},
- "importlib-metadata": {
- "hashes": [
- "sha256:8a8a81bcf996e74fee46f0d16bd3eaa382a7eb20fd82445c3ad11f4090334116",
- "sha256:dd0173e8f150d6815e098fd354f6414b0f079af4644ddfe90c71e2fc6174346d"
- ],
- "index": "pypi",
- "version": "==4.13.0"
- },
"importlib-resources": {
"hashes": [
"sha256:32bb095bda29741f6ef0e5278c42df98d135391bee5f932841efc0041f748dc3",
"sha256:c09b067d82e72c66f4f8eb12332f5efbebc9b007c0b6c40818108c9870adc363"
],
- "markers": "python_version < '3.9'",
+ "index": "pypi",
"version": "==5.10.1"
},
"iniconfig": {
@@ -1876,21 +1844,13 @@
"markers": "python_version >= '2.6'",
"version": "==5.11.0"
},
- "pkgutil-resolve-name": {
- "hashes": [
- "sha256:357d6c9e6a755653cfd78893817c0853af365dd51ec97f3d358a819373bbd174",
- "sha256:ca27cc078d25c5ad71a9de0a7a330146c4e014c2462d9af19c6b828280649c5e"
- ],
- "markers": "python_version < '3.9'",
- "version": "==1.3.10"
- },
"platformdirs": {
"hashes": [
- "sha256:1006647646d80f16130f052404c6b901e80ee4ed6bef6792e1f238a8969106f7",
- "sha256:af0276409f9a02373d540bf8480021a048711d572745aef4b7842dad245eba10"
+ "sha256:1a89a12377800c81983db6be069ec068eee989748799b946cce2a6e80dcc54ca",
+ "sha256:b46ffafa316e6b83b47489d240ce17173f123a9b9c83282141c3daf26ad9ac2e"
],
"markers": "python_version >= '3.7'",
- "version": "==2.5.4"
+ "version": "==2.6.0"
},
"pluggy": {
"hashes": [
@@ -1924,14 +1884,6 @@
"markers": "python_version >= '3.6'",
"version": "==2.5.0"
},
- "pyparsing": {
- "hashes": [
- "sha256:2b020ecf7d21b687f219b71ecad3631f644a47f01403fa1d1036b0c6416d70fb",
- "sha256:5026bae9a10eeaefb61dab2f09052b9f4307d44aee4eda64b309723d8d206bbc"
- ],
- "markers": "python_full_version >= '3.6.8'",
- "version": "==3.0.9"
- },
"pyrsistent": {
"hashes": [
"sha256:055ab45d5911d7cae397dc418808d8802fb95262751872c841c170b0dbf51eed",
@@ -2080,11 +2032,11 @@
},
"stevedore": {
"hashes": [
- "sha256:cf99f41fc0d5a4f185ca4d3d42b03be9011b0a1ec1a4ea1a282be1b4b306dcc2",
- "sha256:fa2630e3d0ad3e22d4914aff2501445815b9a4467a6edc49387c667a38faf5bf"
+ "sha256:7f8aeb6e3f90f96832c301bff21a7eb5eefbe894c88c506483d355565d88cc1a",
+ "sha256:aa6436565c069b2946fe4ebff07f5041e0c8bf18c7376dd29edf80cf7d524e4e"
],
- "markers": "python_version >= '3.6'",
- "version": "==3.5.2"
+ "markers": "python_version >= '3.8'",
+ "version": "==4.1.1"
},
"toml": {
"hashes": [
@@ -2102,36 +2054,6 @@
"markers": "python_version < '3.11'",
"version": "==2.0.1"
},
- "typed-ast": {
- "hashes": [
- "sha256:0261195c2062caf107831e92a76764c81227dae162c4f75192c0d489faf751a2",
- "sha256:0fdbcf2fef0ca421a3f5912555804296f0b0960f0418c440f5d6d3abb549f3e1",
- "sha256:183afdf0ec5b1b211724dfef3d2cad2d767cbefac291f24d69b00546c1837fb6",
- "sha256:211260621ab1cd7324e0798d6be953d00b74e0428382991adfddb352252f1d62",
- "sha256:267e3f78697a6c00c689c03db4876dd1efdfea2f251a5ad6555e82a26847b4ac",
- "sha256:2efae9db7a8c05ad5547d522e7dbe62c83d838d3906a3716d1478b6c1d61388d",
- "sha256:370788a63915e82fd6f212865a596a0fefcbb7d408bbbb13dea723d971ed8bdc",
- "sha256:39e21ceb7388e4bb37f4c679d72707ed46c2fbf2a5609b8b8ebc4b067d977df2",
- "sha256:3e123d878ba170397916557d31c8f589951e353cc95fb7f24f6bb69adc1a8a97",
- "sha256:4879da6c9b73443f97e731b617184a596ac1235fe91f98d279a7af36c796da35",
- "sha256:4e964b4ff86550a7a7d56345c7864b18f403f5bd7380edf44a3c1fb4ee7ac6c6",
- "sha256:639c5f0b21776605dd6c9dbe592d5228f021404dafd377e2b7ac046b0349b1a1",
- "sha256:669dd0c4167f6f2cd9f57041e03c3c2ebf9063d0757dc89f79ba1daa2bfca9d4",
- "sha256:6778e1b2f81dfc7bc58e4b259363b83d2e509a65198e85d5700dfae4c6c8ff1c",
- "sha256:683407d92dc953c8a7347119596f0b0e6c55eb98ebebd9b23437501b28dcbb8e",
- "sha256:79b1e0869db7c830ba6a981d58711c88b6677506e648496b1f64ac7d15633aec",
- "sha256:7d5d014b7daa8b0bf2eaef684295acae12b036d79f54178b92a2b6a56f92278f",
- "sha256:98f80dee3c03455e92796b58b98ff6ca0b2a6f652120c263efdba4d6c5e58f72",
- "sha256:a94d55d142c9265f4ea46fab70977a1944ecae359ae867397757d836ea5a3f47",
- "sha256:a9916d2bb8865f973824fb47436fa45e1ebf2efd920f2b9f99342cb7fab93f72",
- "sha256:c542eeda69212fa10a7ada75e668876fdec5f856cd3d06829e6aa64ad17c8dfe",
- "sha256:cf4afcfac006ece570e32d6fa90ab74a17245b83dfd6655a6f68568098345ff6",
- "sha256:ebd9d7f80ccf7a82ac5f88c521115cc55d84e35bf8b446fcd7836eb6b98929a3",
- "sha256:ed855bbe3eb3715fca349c80174cfcfd699c2f9de574d40527b8429acae23a66"
- ],
- "markers": "python_version < '3.8'",
- "version": "==1.5.4"
- },
"types-cachetools": {
"hashes": [
"sha256:069cfc825697cd51445c1feabbe4edc1fae2b2315870e7a9a179a7c4a5851bee",
@@ -2237,11 +2159,11 @@
},
"virtualenv": {
"hashes": [
- "sha256:0ef5be6d07181946891f5abc8047fda8bc2f0b4b9bf222c64e6e8963baee76db",
- "sha256:635b272a8e2f77cb051946f46c60a54ace3cb5e25568228bd6b57fc70eca9ff3"
+ "sha256:ce3b1684d6e1a20a3e5ed36795a97dfc6af29bc3970ca8dab93e11ac6094b3c4",
+ "sha256:f8b927684efc6f1cc206c9db297a570ab9ad0e51c16fa9e45487d36d1905c058"
],
"markers": "python_version >= '3.6'",
- "version": "==20.16.2"
+ "version": "==20.17.1"
},
"yarl": {
"hashes": [
@@ -2322,14 +2244,6 @@
],
"markers": "python_version >= '3.7'",
"version": "==1.8.2"
- },
- "zipp": {
- "hashes": [
- "sha256:83a28fcb75844b5c0cdaf5aa4003c2d728c77e05f5aeabe8e95e56727005fbaa",
- "sha256:a7a22e05929290a67401440b39690ae6563279bced5f314609d9d03798f56766"
- ],
- "markers": "python_version >= '3.7'",
- "version": "==3.11.0"
}
}
}
diff --git a/setup.py b/setup.py
index d58f039110..d8822d4ecd 100644
--- a/setup.py
+++ b/setup.py
@@ -48,7 +48,7 @@
"tqdm",
"update-checker",
"semantic-version",
- "packaging",
+ "packaging==21.3",
"cloudsplaining>=0.4.3",
"networkx<2.7",
"dockerfile-parse",
|
swcarpentry__python-novice-inflammation-736 | Lesson 10 - numpy.mean(data) and data.mean
In lesson 10, when the lesson refers to readings_03.py, the code shows that numpy.mean is used to calculate the mean over 'data' across all days: numpy.mean(data, axis=1). However, the file readings_03.py (at least the version I downloaded recently) uses the instruction data.mean(axis=1). Both lead to the same result, but for consistency I would suggest to either modify the readings_*.py to use numpy.mean (as this is what has been used throughout the entire lesson), or explain explicitly that both expressions lead to the same result (it would be a good time to remind students about object attributes).
| [
{
"content": "import sys\nimport numpy\n\n\ndef main():\n script = sys.argv[0]\n for filename in sys.argv[1:]:\n data = numpy.loadtxt(filename, delimiter=',')\n for m in data.mean(axis=1):\n print(m)\n\n\nif __name__ == '__main__':\n main()\n",
"path": "code/readings_03.py"... | [
{
"content": "import sys\nimport numpy\n\n\ndef main():\n script = sys.argv[0]\n for filename in sys.argv[1:]:\n data = numpy.loadtxt(filename, delimiter=',')\n for m in numpy.mean(data, axis=1):\n print(m)\n\n\nif __name__ == '__main__':\n main()\n",
"path": "code/readings... | diff --git a/code/readings_03.py b/code/readings_03.py
index 7736fdf73..423ec9bf5 100644
--- a/code/readings_03.py
+++ b/code/readings_03.py
@@ -6,7 +6,7 @@ def main():
script = sys.argv[0]
for filename in sys.argv[1:]:
data = numpy.loadtxt(filename, delimiter=',')
- for m in data.mean(axis=1):
+ for m in numpy.mean(data, axis=1):
print(m)
|
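The equivalence the issue above relies on — `numpy.mean(data, axis=1)` versus `data.mean(axis=1)` — is easy to verify; this is an illustrative sketch, not part of the lesson's files:

```python
import numpy

# ndarray.mean is a method wrapper around the same reduction that the
# free function numpy.mean performs, so both give identical per-row means.
data = numpy.array([[0.0, 1.0, 2.0],
                    [3.0, 4.0, 5.0]])

function_style = numpy.mean(data, axis=1)  # free-function form
method_style = data.mean(axis=1)           # attribute (method) form

assert numpy.array_equal(function_style, method_style)
print(function_style)  # [1. 4.]
```

Either spelling is fine for students; the lesson's point is simply to pick one and use it consistently.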
codespell-project__codespell-96 | README: outdated license notice
README says:
> The Python script `codespell.py` is available with the following terms:
But currently `codespell.py` is just a thin wrapper over `codespell_lib`, with little to no creativity.
This sentence should probably read something like this:
> The Python code is available with the following terms:
| [
{
"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n#\n# This program is free software; you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation; version 2 of the License.\n#\n# This program is distributed in the hop... | [
{
"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n#\n# This program is free software; you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation; version 2 of the License.\n#\n# This program is distributed in the hop... | diff --git a/Makefile b/Makefile
index b541dc91ec..9b0b051d3f 100644
--- a/Makefile
+++ b/Makefile
@@ -1,28 +1,12 @@
-prefix ?= /usr/local
-bindir ?= ${prefix}/bin
-datadir ?= ${prefix}/share/codespell
-mandir ?= ${prefix}/share/man/man1
-
-_VERSION := $(shell grep -e "VERSION = '[0-9]\.[0-9]" codespell_lib/_codespell.py | cut -f 3 -d ' ')
-VERSION = $(subst ',,$(_VERSION))
-
SORT_ARGS := -f
-PHONY := all manpage check check-dictionary sort-dictionary install git-tag-release tar-sync clean
-
-all: codespell manpage
+PHONY := all check check-dictionary sort-dictionary clean
-codespell: codespell.py check-dictionary
- sed "s|^default_dictionary = .*|default_dictionary = '${datadir}/dictionary.txt'|" < $< > $@
- chmod 755 codespell
+all: check-dictionary codespell.1
-manpage: codespell codespell.1.include
- help2man ./codespell --include codespell.1.include --no-info --output codespell.1
+codespell.1: codespell.1.include bin/codespell
+ PYTHONPATH=. help2man ./bin/codespell --include codespell.1.include --no-info --output codespell.1
sed -i '/\.SS \"Usage/,+2d' codespell.1
- gzip -9 -f codespell.1
-
-check:
- test 1bfb1f089c3c7772f0898f66df089b9e = $$(./codespell.py example/ | md5sum | cut -f1 -d\ )
check-dictionary:
@if ! LANG=C sort ${SORT_ARGS} -c codespell_lib/data/dictionary.txt; then \
@@ -33,37 +17,8 @@ check-dictionary:
sort-dictionary:
LANG=C sort ${SORT_ARGS} -u -o codespell_lib/data/dictionary.txt codespell_lib/data/dictionary.txt
-install: codespell manpage
- install -d ${DESTDIR}${datadir} ${DESTDIR}${bindir} ${DESTDIR}${mandir}
- install -m644 -t ${DESTDIR}${datadir} data/dictionary.txt data/linux-kernel.exclude
- install -m755 -T codespell ${DESTDIR}${bindir}/codespell
- install -d ${DESTDIR}${mandir}
- install -m644 -t ${DESTDIR}${mandir} codespell.1.gz
-
-git-tag-release:
- git commit -a -m "codespell $(VERSION)"
- git tag -m "codespell $(VERSION)" -s v$(VERSION)
- git gc --prune=0
-
-codespell-$(VERSION).tar.xz.asc: codespell-$(VERSION).tar.xz
- gpg --armor --detach-sign --output $@ $^
-
-codespell-$(VERSION).tar.xz:
- git archive --format=tar --prefix codespell-$(VERSION)/ v$(VERSION) | xz > $@
-
-tar-sync: codespell-$(VERSION).tar.xz codespell-$(VERSION).tar.xz.asc
- github-release release --repo codespell --tag v$(VERSION) --name v$(VERSION)
- github-release upload --repo codespell --tag v$(VERSION) \
- --name codespell-$(VERSION).tar.xz \
- --file codespell-$(VERSION).tar.xz
- github-release upload --repo codespell --tag v$(VERSION) \
- --name codespell-$(VERSION).tar.xz.asc \
- --file codespell-$(VERSION).tar.xz.asc
-
pypi:
python setup.py sdist register upload
clean:
- rm -rf codespell
rm -rf codespell.1
- rm -rf codespell.1.gz
diff --git a/README.rst b/README.rst
index 44b4f0b749..f0ae2cb330 100644
--- a/README.rst
+++ b/README.rst
@@ -71,7 +71,8 @@ directly, but instead be manually inspected. E.g.:
License
-------
-The Python script ``codespell`` is available with the following terms:
+The Python script ``codespell`` with its library ``codespell_lib`` is available
+with the following terms:
(*tl;dr*: `GPL v2`_)
Copyright (C) 2010-2011 Lucas De Marchi <lucas.de.marchi@gmail.com>
diff --git a/codespell_lib/_codespell.py b/codespell_lib/_codespell.py
index 62bff4227f..4706ec8529 100755
--- a/codespell_lib/_codespell.py
+++ b/codespell_lib/_codespell.py
@@ -510,7 +510,6 @@ def main(*args):
parser.print_help()
return 1
build_dict(dictionary)
-
colors = TermColors()
if not options.colors:
colors.disable()
|
aws__aws-cli-2892 | - Support use of colorama up to 0.3.8
+ colorama bugfix release 0.3.8 is available and contains no incompatible
changes. There is no need to restrict use to less or equal 0.3.7
| [
{
"content": "#!/usr/bin/env python\nimport codecs\nimport os.path\nimport re\nimport sys\n\nfrom setuptools import setup, find_packages\n\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\ndef read(*parts):\n return codecs.open(os.path.join(here, *parts), 'r').read()\n\n\ndef find_version(*file_paths... | [
{
"content": "#!/usr/bin/env python\nimport codecs\nimport os.path\nimport re\nimport sys\n\nfrom setuptools import setup, find_packages\n\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\ndef read(*parts):\n return codecs.open(os.path.join(here, *parts), 'r').read()\n\n\ndef find_version(*file_paths... | diff --git a/.changes/next-release/enhancement-colorama-91382.json b/.changes/next-release/enhancement-colorama-91382.json
new file mode 100644
index 000000000000..95c802ae4eac
--- /dev/null
+++ b/.changes/next-release/enhancement-colorama-91382.json
@@ -0,0 +1,5 @@
+{
+ "type": "enhancement",
+ "category": "colorama",
+ "description": "Increased the upper bound on the colorama dependency to 0.3.9."
+}
diff --git a/requirements.txt b/requirements.txt
index 1cd3845af02f..2f887cc15547 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -7,7 +7,7 @@ docutils>=0.10
-e git://github.com/boto/s3transfer.git@develop#egg=s3transfer
-e git://github.com/boto/jmespath.git@develop#egg=jmespath
nose==1.3.7
-colorama>=0.2.5,<=0.3.7
+colorama>=0.2.5,<=0.3.9
mock==1.3.0
rsa>=3.1.2,<=3.5.0
wheel==0.24.0
diff --git a/setup.cfg b/setup.cfg
index 47fa9c8240a7..f39b54f14d87 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -5,7 +5,7 @@ universal = 1
[metadata]
requires-dist =
botocore==1.10.19
- colorama>=0.2.5,<=0.3.7
+ colorama>=0.2.5,<=0.3.9
docutils>=0.10
rsa>=3.1.2,<=3.5.0
PyYAML>=3.10,<=3.12
diff --git a/setup.py b/setup.py
index aa50140c6740..56341a9feb8b 100644
--- a/setup.py
+++ b/setup.py
@@ -24,7 +24,7 @@ def find_version(*file_paths):
requires = ['botocore==1.10.19',
- 'colorama>=0.2.5,<=0.3.7',
+ 'colorama>=0.2.5,<=0.3.9',
'docutils>=0.10',
'rsa>=3.1.2,<=3.5.0',
's3transfer>=0.1.12,<0.2.0',
|
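The version-range change in the diff above can be sanity-checked with the `packaging` library (the same PEP 440 specifier machinery pip uses); this sketch only assumes `packaging` is installed:

```python
from packaging.specifiers import SpecifierSet

# The relaxed range from the PR, expressed as a PEP 440 specifier set.
allowed = SpecifierSet(">=0.2.5,<=0.3.9")

assert "0.3.8" in allowed      # the bugfix release now satisfies the range
assert "0.3.7" in allowed      # previously allowed versions still work
assert "0.4.0" not in allowed  # anything newer is still excluded
```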
cookiecutter__cookiecutter-django-4520 | Unused gulp dependency added to package.json when Webpack is chosen
## What happened?
package.json has a reference to `"gulp-concat": "^2.6.1"` even though it is not used when Webpack has been chosen
## What should've happened instead?
No reference to gulp-concat at all when using Webpack
## Additional details
Host system configuration:
Cookiecutter 2.1.1
Ventura 13.5
Python 3.11.4
Docker version 24.0.2, build cb74dfc
Docker Compose version v2.19.1
| [
{
"content": "\"\"\"\nNOTE:\n the below code is to be maintained Python 2.x-compatible\n as the whole Cookiecutter Django project initialization\n can potentially be run in Python 2.x environment\n (at least so we presume in `pre_gen_project.py`).\n\nTODO: restrict Cookiecutter Django project initia... | [
{
"content": "\"\"\"\nNOTE:\n the below code is to be maintained Python 2.x-compatible\n as the whole Cookiecutter Django project initialization\n can potentially be run in Python 2.x environment\n (at least so we presume in `pre_gen_project.py`).\n\nTODO: restrict Cookiecutter Django project initia... | diff --git a/hooks/post_gen_project.py b/hooks/post_gen_project.py
index 8d1be5a165..37f96efc03 100644
--- a/hooks/post_gen_project.py
+++ b/hooks/post_gen_project.py
@@ -183,6 +183,7 @@ def handle_js_runner(choice, use_docker, use_async):
"browser-sync",
"cssnano",
"gulp",
+ "gulp-concat",
"gulp-imagemin",
"gulp-plumber",
"gulp-postcss",
|
microsoft__botbuilder-python-1932 | Bump MSAL to the latest version
**Is your feature request related to a problem? Please describe.**
An old version of MSAL is used in [botframework-connector](https://github.com/microsoft/botbuilder-python/blob/main/libraries/botframework-connector/requirements.txt#L6) (v1.6.0)
**Describe the solution you'd like**
Upgrade to the [latest version](https://github.com/AzureAD/microsoft-authentication-library-for-python/releases) (v1.13.0 is the latest at this moment).
**Describe alternatives you've considered**
No alternatives.
**Additional context**
Please also consider to not pin this dependency (#1467).
| [
{
"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nimport os\nfrom setuptools import setup\n\nNAME = \"botframework-connector\"\nVERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\nREQUIRES = [\n \"msrest==... | [
{
"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nimport os\nfrom setuptools import setup\n\nNAME = \"botframework-connector\"\nVERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\nREQUIRES = [\n \"msrest==... | diff --git a/libraries/botframework-connector/setup.py b/libraries/botframework-connector/setup.py
index 7e6bff958..84bb6dce9 100644
--- a/libraries/botframework-connector/setup.py
+++ b/libraries/botframework-connector/setup.py
@@ -11,7 +11,7 @@
"requests>=2.23.0,<2.26",
"PyJWT>=1.5.3,<2.0.0",
"botbuilder-schema==4.15.0",
- "msal==1.6.0",
+ "msal==1.17.0",
]
root = os.path.abspath(os.path.dirname(__file__))
|
webkom__lego-1505 | Add end_time of an event when getting all events with get request
I want to be able to get the end time of an event when getting all events. I know I can get the end time when getting a specific event, but it is a bit cumbersome.
| [
{
"content": "from django.db import transaction\nfrom rest_framework import serializers\nfrom rest_framework.fields import BooleanField, CharField\n\nfrom lego.apps.comments.serializers import CommentSerializer\nfrom lego.apps.companies.fields import CompanyField\nfrom lego.apps.companies.models import Company\... | [
{
"content": "from django.db import transaction\nfrom rest_framework import serializers\nfrom rest_framework.fields import BooleanField, CharField\n\nfrom lego.apps.comments.serializers import CommentSerializer\nfrom lego.apps.companies.fields import CompanyField\nfrom lego.apps.companies.models import Company\... | diff --git a/lego/apps/events/serializers/events.py b/lego/apps/events/serializers/events.py
index 95da047a2..7f81934ee 100644
--- a/lego/apps/events/serializers/events.py
+++ b/lego/apps/events/serializers/events.py
@@ -62,6 +62,7 @@ class Meta:
"event_type",
"location",
"start_time",
+ "end_time",
"thumbnail",
"total_capacity",
"company",
diff --git a/requirements/base.txt b/requirements/base.txt
index 33e73c150..fc7b53fe4 100644
--- a/requirements/base.txt
+++ b/requirements/base.txt
@@ -29,6 +29,8 @@ libthumbor==1.3.2
channels==2.1.7
channels_redis==2.3.3
daphne==2.2.5
+# Implicit dependency
+flatbuffers==1.10
djangorestframework==3.9.0
djangorestframework-jwt==1.11.0
|
aio-libs-abandoned__aioredis-py-1048 | [2.0] Type annotations break mypy
I tried porting an existing project to aioredis 2.0. I've got it almost working, but the type annotations that have been added are too strict (and in some cases just wrong) and break mypy. The main problem is that all the functions that take keys annotate them as `str`, when `bytes` (and I think several other types) are perfectly acceptable and are used in my code. The same applies to `register_script`.
The `min` and `max` arguments of `zrangebylex` and `zrevrangebylex` are annotated as int, but they're used for lexicographical sorting so are string-like.
Getting the type annotations right is a fairly large undertaking. If there is a desire to release 2.0 soon, I'd suggest deleting `py.typed` so that mypy doesn't see this package as annotated. There are annotations for redis-py in typeshed; perhaps that would be a good place to start, although I've occasionally also had issues there.
| [
{
"content": "import os.path\nimport re\n\nfrom setuptools import find_packages, setup\n\n\ndef read(*parts):\n with open(os.path.join(*parts)) as f:\n return f.read().strip()\n\n\ndef read_version():\n regexp = re.compile(r\"^__version__\\W*=\\W*\\\"([\\d.abrc]+)\\\"\")\n init_py = os.path.join... | [
{
"content": "import os.path\nimport re\n\nfrom setuptools import find_packages, setup\n\n\ndef read(*parts):\n with open(os.path.join(*parts)) as f:\n return f.read().strip()\n\n\ndef read_version():\n regexp = re.compile(r\"^__version__\\W*=\\W*\\\"([\\d.abrc]+)\\\"\")\n init_py = os.path.join... | diff --git a/CHANGES/1009.misc b/CHANGES/1009.misc
deleted file mode 100644
index e14ad9413..000000000
--- a/CHANGES/1009.misc
+++ /dev/null
@@ -1 +0,0 @@
-Temporarily remove py.typed because there are errors in the type annotations.
diff --git a/aioredis/py.typed b/aioredis/py.typed
new file mode 100644
index 000000000..e69de29bb
diff --git a/setup.py b/setup.py
index 942ed303e..6a2eb8ac7 100644
--- a/setup.py
+++ b/setup.py
@@ -54,6 +54,7 @@ def read_version():
extras_require={
"hiredis": 'hiredis>=1.0; implementation_name=="cpython"',
},
+ package_data={"aioredis": ["py.typed"]},
python_requires=">=3.6",
include_package_data=True,
)
|
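For the key-annotation problem described above, one loosening strategy — similar in spirit to the typeshed stubs for redis-py — is a union alias for key-like values; the alias name `KeyT` and the helper below are illustrative, not aioredis's actual API:

```python
from typing import Union

# Key-like values accepted by most Redis commands; a str is encoded
# before being sent, so both forms address the same key.
KeyT = Union[str, bytes, memoryview]

def normalize_key(key: KeyT) -> bytes:
    """Normalize a key to bytes, mirroring what a client would send."""
    if isinstance(key, str):
        return key.encode("utf-8")
    return bytes(key)

assert normalize_key("user:1") == b"user:1"
assert normalize_key(b"user:1") == b"user:1"
assert normalize_key(memoryview(b"user:1")) == b"user:1"
```

Annotating command signatures with such an alias keeps mypy happy for callers that pass `bytes` keys, which plain `str` annotations reject.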
fidals__shopelectro-346 | Do `gulp build` on image side
Currently, we have problems with gulp build on the client side. See #344 for details.
Moreover, building static files on the container side is better practice
| [
{
"content": "\"\"\"\nDjango settings for shopelectro project.\n\nGenerated by 'django-admin startproject' using Django 1.9.5.\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/1.9/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/... | [
{
"content": "\"\"\"\nDjango settings for shopelectro project.\n\nGenerated by 'django-admin startproject' using Django 1.9.5.\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/1.9/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/... | diff --git a/.dockerignore b/.dockerignore
index cf35f273..ed67fd93 100644
--- a/.dockerignore
+++ b/.dockerignore
@@ -1,3 +1,4 @@
+node_modules/
media/
*.sqlite*
*.log
diff --git a/.drone.yml b/.drone.yml
index 1a762090..39dfb09d 100644
--- a/.drone.yml
+++ b/.drone.yml
@@ -13,24 +13,29 @@ pipeline:
event: [push, pull_request]
branch: master
+ # @todo #345:60m Use ready image for drone npm step. stb2
+ # Image `fidals/se-nodejs:dev` already contains built static and node_modules.
npm:
image: node:slim
environment:
- DEPS_DIR=/usr/app/deps
+ - FRONT_BUILD_DIR=/usr/app/front/build
commands:
- npm install
- npm install -g gulp-cli
- gulp build
volumes:
- - /tmp/cache/drone/shopelectro/node_modules:/drone/src/github.com/fidals/shopelectro/commit/${DRONE_COMMIT_SHA}/node_modules
- /tmp/cache/drone/shopelectro/site-packages/${DRONE_COMMIT_SHA}/site-packages:/usr/app/deps
+ - /tmp/cache/drone/shopelectro/front_build/${DRONE_COMMIT_SHA}:/usr/app/front/build
when:
event: [push, pull_request]
branch: master
+
test:
image: python
environment:
+ - FRONT_BUILD_DIR=/usr/app/front/build
- TEST_ENV=true
- DJANGO_SETTINGS_MODULE=shopelectro.settings.drone
- POSTGRES_USER=postgres
@@ -50,9 +55,10 @@ pipeline:
- python manage.py excel
- python manage.py price
- python manage.py collectstatic --noinput
- - python manage.py test --parallel --liveserver=test:8020-8030
+ - python manage.py test --parallel --liveserver=test:8021-8029
volumes:
- /tmp/cache/drone/shopelectro/site-packages/${DRONE_COMMIT_SHA}/site-packages:/usr/local/lib/python3.6/site-packages
+ - /tmp/cache/drone/shopelectro/front_build/${DRONE_COMMIT_SHA}:/usr/app/front/build
secrets: [ FTP_IP, FTP_USER, FTP_PASS ]
when:
event: [push, pull_request]
@@ -78,6 +84,9 @@ pipeline:
docker-build:
image: docker/compose:1.17.1
+ environment:
+ - DEPS_DIR=/usr/app/deps
+ - FRONT_BUILD_DIR=/usr/app/front/build
commands:
- cd docker
# Build python images with sources and static files
@@ -88,6 +97,9 @@ pipeline:
- /root/prog/shopelectro/docker/.env:/drone/src/github.com/fidals/shopelectro/commit/${DRONE_COMMIT_SHA}/docker/.env
# in case if "Pull Request Hooks" is enabled in Drone settings GUI
- /root/prog/shopelectro/docker/.env:/drone/src/github.com/fidals/shopelectro/pull/${DRONE_PULL_REQUEST}/docker/.env
+ # for nodejs build
+ - /tmp/cache/drone/shopelectro/site-packages/${DRONE_COMMIT_SHA}/site-packages:/usr/app/deps
+ - /tmp/cache/drone/shopelectro/front_build/${DRONE_COMMIT_SHA}:/usr/app/front/build
when:
event: push
branch: master
diff --git a/.env.prod b/.env.prod
deleted file mode 100644
index 7c47b15c..00000000
--- a/.env.prod
+++ /dev/null
@@ -1,48 +0,0 @@
-VIRTUAL_HOST_PORT=8000
-VIRTUAL_HOST_STAGE_PORT=8001
-VIRTUAL_HOST_LIVESERVER_PORT=8020-8030
-VIRTUAL_HOST_EXPOSE_PORT=8010
-VIRTUAL_HOST_STAGE_EXPOSE_PORT=8011
-
-DB_USER=postgres
-DB_PASS=oC7hY6qNiqeG4FnWS0dD
-DB_NAME=se_prod
-DB_DEV_NAME=se_dev
-
-DEPS_DIR=/usr/local/lib/python3.6/site-packages
-SRC_DIR=/usr/app/src
-
-RABBITMQ_DEFAULT_USER=se_rabbitmq_user
-RABBITMQ_DEFAULT_PASS=321CwIGuKKQJCcQOTk41Gw==
-
-DJANGO_SETTINGS_MODULE=shopelectro.settings.local
-SECRET_KEY=SE5&3h672i-gijy2ixibfjzp34pqpo7(iu6fv(wqu@=l&f+lqd0x
-
-EMAIL_HOST_PASSWORD=21b34b446a
-YANDEX_SHOP_PASS=0b782c87d61c9a9fa9dc
-
-FTP_PASS=e01ebbd06176f46b1f732bd0868ad6be
-FTP_USER=1c_exc
-FTP_IP=37.18.77.165
-
-REDIS_PASSWORD=ZGRiODViMDQ3ZmRmNzc5Y2U4ZGU2Njhm
-
-SLACK_REPORT_URL=https://hooks.slack.com/services/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
-
-DJANGO_LOG_LEVEL=INFO
-ENV_TYPE=PROD
-
-# TEST_ENV=false
-
-# SECRET_KEY=SE5&3h672i-gijy2ixibfjzp34pqpo7(iu6fv(wqu@=l&f+lqd0x
-# YANDEX_SHOP_PASS=0b782c87d61c9a9fa9dc
-# EMAIL_HOST_PASSWORD=21b34b446a
-# DB_PASS=oC7hY6qNiqeG4FnWS0dD
-# FTP_PASS=e01ebbd06176f46b1f732bd0868ad6be
-# FTP_USER=1c_exc
-# FTP_IP=37.18.77.165
-# REDIS_PASSWORD=ZGRiODViMDQ3ZmRmNzc5Y2U4ZGU2Njhm
-# RABBITMQ_DEFAULT_USER=se_rabbitmq_user
-# RABBITMQ_DEFAULT_PASS=321CwIGuKKQJCcQOTk41Gw==
-# SLACK_REPORT_URL=https://hooks.slack.com/services/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
-
diff --git a/docker/Makefile b/docker/Makefile
index 65482ac9..a0892ce7 100644
--- a/docker/Makefile
+++ b/docker/Makefile
@@ -29,7 +29,7 @@ excel:
create-env:
@bash ./create-env.sh
-create-env:
+create-config:
@bash ./create-config.sh
build-static:
@@ -54,25 +54,22 @@ lint:
restore:
@bash ../etc/stb-backup-restore.sh
-
-# ---------------------- Deploy section ----------------------
deploy-dev:
$(MAKE) create-env
$(MAKE) create-config
$(dc) pull
$(dc) up -d app
$(MAKE) build-static
+ $(MAKE) migrate
# Create admin user with login/pass: admin/asdfjkl;
+ $(MAKE) fixtures
# Launch "collectstatic" not in static recipe because ManifestStaticStorage writes to db
- $(dc) exec app bash -c "\
- python manage.py migrate \
- && python manage.py loaddata shopelectro/fixtures/admin.json \
- && python manage.py loaddata shopelectro/fixtures/dump.json \
- && python manage.py collectstatic --noinput \
- "
+ $(MAKE) collectstatic
# to make fresh collected static visible immediately
$(dc) stop app && $(dc) up -d app
+
+# ---------------------- Production deploy section ----------------------
backup:
$(dcp) run --rm backup-data sh /usr/bin/entrypoint.sh
diff --git a/docker/docker-compose-build.yml b/docker/docker-compose-build.yml
index 704b2a99..bdd5773b 100644
--- a/docker/docker-compose-build.yml
+++ b/docker/docker-compose-build.yml
@@ -18,6 +18,9 @@ services:
image: fidals/se-nodejs:dev
build:
context: ../
+ args:
+ - deps_dir=$DEPS_DIR
+ - front_build_dir=$FRONT_BUILD_DIR
dockerfile: docker/images/node/Dockerfile
nginx:
diff --git a/docker/docker-compose.yml b/docker/docker-compose.yml
index d18a6740..c302812f 100644
--- a/docker/docker-compose.yml
+++ b/docker/docker-compose.yml
@@ -23,6 +23,8 @@ services:
networks:
- se-backend
- se-frontend
+ volumes_from:
+ - nodejs
volumes:
- ./../:$SRC_DIR
# contains refarm-site modules
@@ -33,8 +35,12 @@ services:
nodejs:
image: fidals/se-nodejs:dev
- volumes_from:
- - app
+ volumes:
+ - $FRONT_BUILD_DIR
+ # Volumes for refarm-site's front development
+ #- ../gulpfile.babel.js:/usr/app/src_front/gulpfile.babel.js
+ #- ../package.json:/usr/app/src_front/package.json
+ #- ../front:/usr/app/src_front/front
env_file:
- env_files/paths
@@ -76,7 +82,7 @@ services:
image: selenium/standalone-chrome-debug:3.10.0
restart: always
ports:
- - 4444:4444
+ - 4444
# VNC port. Password: secret
- 5900:5900
environment:
diff --git a/docker/env_files/paths.dist b/docker/env_files/paths.dist
index e772ea4f..06f4e194 100644
--- a/docker/env_files/paths.dist
+++ b/docker/env_files/paths.dist
@@ -1,5 +1,9 @@
# Identify the dependencies folder
DEPS_DIR=/usr/local/lib/python3.6/site-packages
+# Directory, where you cloned `refarm-site` repository
+REFARM_DIR=/path/to_my/refarm_site
# Identify the source folder
SRC_DIR=/usr/app/src
+# Set smth like `/var/fidals/se_db`, if you use VirtualBox
POSTGRES_DATA_DIR=./../database
+FRONT_BUILD_DIR=/usr/app/front/build
diff --git a/docker/images/node/Dockerfile b/docker/images/node/Dockerfile
index 7a9530f2..9621a27c 100644
--- a/docker/images/node/Dockerfile
+++ b/docker/images/node/Dockerfile
@@ -1,8 +1,24 @@
FROM node:slim
-WORKDIR /usr/app/src/
+ARG front_build_dir
+ARG deps_dir
+ENV DEPS_DIR=$deps_dir
+ENV FRONT_BUILD_DIR=$front_build_dir
+
+# Also this directory differs from $SRC_DIR to avoid `node_modules/` volumes conflicts.
+WORKDIR /usr/app/src_front/
+
+RUN apt-get update \
+ && apt-get install --no-install-recommends --no-install-suggests -y ssh git \
+ && git clone https://github.com/fidals/refarm-site.git $DEPS_DIR \
+ && apt-get remove --purge -y git \
+ && apt-get -y --purge autoremove \
+ && rm -rf /var/lib/apt/lists/*
COPY package.json package.json
+COPY gulpfile.babel.js gulpfile.babel.js
# we use `--no-optional` because some optional npm dependencies fail on install
-RUN npm install -g gulp-cli && npm install --no-optional
+RUN npm install -g gulp-cli
+RUN npm install --no-optional
+RUN gulp build
diff --git a/gulpfile.babel.js b/gulpfile.babel.js
index 424bac89..f13b34c4 100755
--- a/gulpfile.babel.js
+++ b/gulpfile.babel.js
@@ -50,7 +50,7 @@ const plugins = [
}),
];
-const buildDir = 'front/build';
+const buildDir = process.env.FRONT_BUILD_DIR;
const ecommercePaths = getAppSrcPaths('ecommerce');
const genericAdminPaths = getAppSrcPaths('generic_admin');
@@ -188,7 +188,7 @@ gulp.task('build', () => {
// ================================================================
// Clear : Clear destination directory.
// ================================================================
-gulp.task('clear', () => del(`${buildDir}/**/*`));
+gulp.task('clear', () => del(`${buildDir}/**/*`, { force: true }));
// ================================================================
// STYLES
diff --git a/shopelectro/settings/base.py b/shopelectro/settings/base.py
index 02a9b34b..2be7c4cc 100644
--- a/shopelectro/settings/base.py
+++ b/shopelectro/settings/base.py
@@ -152,7 +152,7 @@
STATICFILES_STORAGE = 'django.contrib.staticfiles.storage.ManifestStaticFilesStorage'
STATICFILES_DIRS = [
- os.path.join(BASE_DIR, 'front/build'),
+ os.environ['FRONT_BUILD_DIR'],
ASSETS_DIR,
]
|
web2py__web2py-2144 | gluon.utils.unlocalised_http_header_date returns wrong time
**Describe the bug**
in function unlocalised_http_header_date, line 481:
`year_and_time = time.strftime("%Y %H:%M:%S GMT")`
should be:
`year_and_time = time.strftime("%Y %H:%M:%S GMT", data)`
| [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#pylint: disable=invalid-name,redefined-builtin\n\n\"\"\"\n| This file is part of the web2py Web Framework\n| Copyrighted by Massimo Di Pierro <mdipierro@cs.depaul.edu>\n| License: LGPLv3 (http://www.gnu.org/licenses/lgpl.html)\n\nThis file specifica... | [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#pylint: disable=invalid-name,redefined-builtin\n\n\"\"\"\n| This file is part of the web2py Web Framework\n| Copyrighted by Massimo Di Pierro <mdipierro@cs.depaul.edu>\n| License: LGPLv3 (http://www.gnu.org/licenses/lgpl.html)\n\nThis file specifica... | diff --git a/gluon/utils.py b/gluon/utils.py
index bbaba4e53..e4771d9cb 100644
--- a/gluon/utils.py
+++ b/gluon/utils.py
@@ -411,7 +411,7 @@ def unlocalised_http_header_date(data):
"12": "Dec",
}.get(time.strftime("%m", data))
- year_and_time = time.strftime("%Y %H:%M:%S GMT")
+ year_and_time = time.strftime("%Y %H:%M:%S GMT", data)
return "{}, {} {} {}".format(
short_weekday,
|
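The missing-argument bug above is easy to demonstrate: without its second argument, `time.strftime` silently formats the *current* time rather than the `struct_time` you meant to format. A minimal sketch:

```python
import time

epoch = time.gmtime(0)  # struct_time for 1970-01-01 00:00:00 UTC

# Without the second argument, strftime formats the current local time --
# the bug in unlocalised_http_header_date.
now_formatted = time.strftime("%Y %H:%M:%S GMT")
assert now_formatted.endswith("GMT")  # shape is right, but the moment is wrong

# Passing the struct_time formats the intended moment instead.
fixed = time.strftime("%Y %H:%M:%S GMT", epoch)
assert fixed == "1970 00:00:00 GMT"
```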
spyder-ide__spyder-16483 | Drag & drop error in the Help pane
## Description
### What steps will reproduce the problem?
<!--- You can use Markdown here --->
open spyder
Drag any python file from file explorer to spyder application
Spyder reports internal problem
### Traceback
```python-traceback
Traceback (most recent call last):
File "C:\Program Files\Spyder\pkgs\spyder\plugins\help\widgets.py", line 855, in handle_link_clicks
self.rich_text.load_url(url)
File "C:\Program Files\Spyder\pkgs\spyder\plugins\help\widgets.py", line 188, in load_url
self.load(qurl)
AttributeError: 'RichText' object has no attribute 'load'
```
## Versions
* Spyder version: 5.0.1
* Python version: 3.7.9
* Qt version: 5.12.10
* PyQt5 version: 5.12.3
* Operating System: Windows 10
### Dependencies
```
# Mandatory:
atomicwrites >=1.2.0 : 1.4.0 (OK)
chardet >=2.0.0 : 4.0.0 (OK)
cloudpickle >=0.5.0 : 1.6.0 (OK)
cookiecutter >=1.6.0 : 1.7.2 (OK)
diff_match_patch >=20181111 : 20200713 (OK)
intervaltree : None (OK)
IPython >=7.6.0 : 7.22.0 (OK)
jedi =0.17.2 : 0.17.2 (OK)
jsonschema >=3.2.0 : 3.2.0 (OK)
keyring >=17.0.0 : 23.0.1 (OK)
nbconvert >=4.0 : 6.0.7 (OK)
numpydoc >=0.6.0 : 1.1.0 (OK)
paramiko >=2.4.0 : 2.7.2 (OK)
parso =0.7.0 : 0.7.0 (OK)
pexpect >=4.4.0 : 4.8.0 (OK)
pickleshare >=0.4 : 0.7.5 (OK)
psutil >=5.3 : 5.8.0 (OK)
pygments >=2.0 : 2.8.1 (OK)
pylint >=1.0 : 2.7.4 (OK)
pyls >=0.36.2;<1.0.0 : 0.36.2 (OK)
pyls_black >=0.4.6 : 0.4.6 (OK)
pyls_spyder >=0.3.2 : 0.3.2 (OK)
qdarkstyle =3.0.2 : 3.0.2 (OK)
qstylizer >=0.1.10 : 0.1.10 (OK)
qtawesome >=1.0.2 : 1.0.2 (OK)
qtconsole >=5.0.3 : 5.0.3 (OK)
qtpy >=1.5.0 : 1.9.0 (OK)
rtree >=0.8.3 : 0.9.4 (OK)
setuptools >=39.0.0 : 56.0.0 (OK)
sphinx >=0.6.6 : 3.5.4 (OK)
spyder_kernels >=2.0.1;<2.1.0 : 2.0.1 (OK)
textdistance >=4.2.0 : 4.2.1 (OK)
three_merge >=0.1.1 : 0.1.1 (OK)
watchdog : 1.0.2 (OK)
zmq >=17 : 22.0.3 (OK)
# Optional:
cython >=0.21 : 0.29.23 (OK)
matplotlib >=2.0.0 : 3.4.1 (OK)
numpy >=1.7 : 1.19.3 (OK)
pandas >=1.1.1 : 1.2.4 (OK)
scipy >=0.17.0 : 1.6.2 (OK)
sympy >=0.7.3 : 1.8 (OK)
```
| [
{
"content": "# -*- coding: utf-8 -*-\n#\n# Copyright © Spyder Project Contributors\n# Licensed under the terms of the MIT License\n#\n\"\"\"\nHelp plugin widgets.\n\"\"\"\n\n# Standard library imports\nimport os\nimport re\nimport socket\nimport sys\n\n# Third party imports\nfrom qtpy.QtCore import Qt, QUrl, S... | [
{
"content": "# -*- coding: utf-8 -*-\n#\n# Copyright © Spyder Project Contributors\n# Licensed under the terms of the MIT License\n#\n\"\"\"\nHelp plugin widgets.\n\"\"\"\n\n# Standard library imports\nimport os\nimport re\nimport socket\nimport sys\n\n# Third party imports\nfrom qtpy.QtCore import Qt, QUrl, S... | diff --git a/spyder/plugins/help/widgets.py b/spyder/plugins/help/widgets.py
index 17ed408c570..12bc411b5d8 100644
--- a/spyder/plugins/help/widgets.py
+++ b/spyder/plugins/help/widgets.py
@@ -195,8 +195,7 @@ def load_url(self, url):
qurl = url
else:
qurl = QUrl(url)
-
- self.load(qurl)
+ self.webview.load(qurl)
def clear(self):
self.set_html('', self.webview.url())
|
ansible-collections__community.general-2419 | svr4pkg on Solaris 11.4: TypeError: a bytes-like object is required, not 'str'
### Summary
When you try to install a package on Solaris 11.4 with the svr4pkg module, you get an error:
TypeError: a bytes-like object is required, not 'str'
Fix:
```
--- svr4pkg.py.orig 2021-04-29 08:28:55.110835528 -0400
+++ svr4pkg.py 2021-04-29 08:27:49.567089417 -0400
@@ -121,7 +121,7 @@
def create_admin_file():
(desc, filename) = tempfile.mkstemp(prefix='ansible_svr4pkg', text=True)
- fullauto = '''
+ fullauto = b'''
mail=
instance=unique
partial=nocheck
```
After the fix it still works on Solaris 11.4 SRU15, Solaris 11.4 SRU31, Solaris 10 1/13
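The Python 3 behavior behind the one-character fix can be reproduced outside Ansible. A minimal sketch (the `demo_svr4pkg` prefix and the two-line payload are illustrative, not the module's full admin file): `os.write()` takes a raw file descriptor and only accepts bytes-like objects on Python 3, so the `str` literal in `create_admin_file()` raises exactly the reported `TypeError`, while a `b'...'` literal works on both Python 2 and 3.

```python
import os
import tempfile

# mkstemp() returns a low-level descriptor; text=True only affects
# O_TEXT on Windows, it does not make os.write() accept str.
desc, filename = tempfile.mkstemp(prefix='demo_svr4pkg', text=True)
try:
    os.write(desc, 'mail=\ninstance=unique\n')   # fails on Python 3
except TypeError as exc:
    print('str rejected:', exc)

os.write(desc, b'mail=\ninstance=unique\n')      # bytes work on 2 and 3
os.close(desc)
os.remove(filename)
```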
### Issue Type
Bug Report
### Component Name
community.general.svr4pkg
### Ansible Version
```console (paste below)
$ ansible --version
[DEPRECATION WARNING]: Ansible will require Python 3.8 or newer on the controller starting with Ansible 2.12. Current version: 3.6.8 (default, Aug 18 2020, 08:33:21)
[GCC 8.3.1 20191121 (Red Hat 8.3.1-5)]. This feature will be removed from ansible-core in version 2.12. Deprecation warnings can be disabled by setting
deprecation_warnings=False in ansible.cfg.
[WARNING]: You are running the development version of Ansible. You should only run Ansible from "devel" if you are modifying the Ansible engine, or trying out
features under development. This is a rapidly changing source of code and can become unstable at any point.
ansible [core 2.12.0.dev0] (devel 60adf8e1ee) last updated 2021/04/29 08:21:55 (GMT -400)
config file = None
configured module search path = ['/home/srml/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/srml/ansible/lib/ansible
ansible collection location = /home/srml/.ansible/collections:/usr/share/ansible/collections
executable location = /home/srml/ansible/bin/ansible
python version = 3.6.8 (default, Aug 18 2020, 08:33:21) [GCC 8.3.1 20191121 (Red Hat 8.3.1-5)]
jinja version = 2.11.3
libyaml = True
```
### Configuration
```console (paste below)
$ ansible-config dump --only-changed
[DEPRECATION WARNING]: Ansible will require Python 3.8 or newer on the controller starting with Ansible 2.12. Current version: 3.6.8 (default, Aug 18 2020, 08:33:21)
[GCC 8.3.1 20191121 (Red Hat 8.3.1-5)]. This feature will be removed from ansible-core in version 2.12. Deprecation warnings can be disabled by setting
deprecation_warnings=False in ansible.cfg.
[WARNING]: You are running the development version of Ansible. You should only run Ansible from "devel" if you are modifying the Ansible engine, or trying out
features under development. This is a rapidly changing source of code and can become unstable at any point.
```
### OS / Environment
RHEL 8.3
### Steps to Reproduce
<!--- Paste example playbooks or commands between quotes below -->
```yaml (paste below)
---
- hosts: all
become: yes
tasks:
- name: install svr4 package
community.general.svr4pkg:
name: CSWntop
state: present
src: /var/tmp/XYZsome.pkg
```
### Expected Results
Package should be installed
### Actual Results
```console (paste below)
$ ansible-playbook -i inventory -l sol11 svr4pkg.yml
[DEPRECATION WARNING]: Ansible will require Python 3.8 or newer on the controller starting with Ansible 2.12. Current version: 3.6.8 (default, Aug 18 2020, 08:33:21)
[GCC 8.3.1 20191121 (Red Hat 8.3.1-5)]. This feature will be removed from ansible-core in version 2.12. Deprecation warnings can be disabled by setting
deprecation_warnings=False in ansible.cfg.
[WARNING]: You are running the development version of Ansible. You should only run Ansible from "devel" if you are modifying the Ansible engine, or trying out
features under development. This is a rapidly changing source of code and can become unstable at any point.
PLAY [all] ***********************************************************************************************************************************************************
TASK [Gathering Facts] ***********************************************************************************************************************************************[WARNING]: Platform sunos on host sol11 is using the discovered Python interpreter at /usr/bin/python, but future installation of another Python interpreter could
change the meaning of that path. See https://docs.ansible.com/ansible/devel/reference_appendices/interpreter_discovery.html for more information.
ok: [sol11]
TASK [install svr4 package] ******************************************************************************************************************************************An exception occurred during task execution. To see the full traceback, use -vvv. The error was: TypeError: a bytes-like object is required, not 'str'
fatal: [sol11]: FAILED! => {"changed": false, "module_stderr": "Shared connection to 10.0.75.109 closed.\r\n", "module_stdout": "Traceback (most recent call last):\r\n File \"/export/home/srml/.ansible/tmp/ansible-tmp-1619699186.3019922-33970-236219862995078/AnsiballZ_svr4pkg.py\", line 100, in <module>\r\n _ansiballz_main()\r\n File \"/export/home/srml/.ansible/tmp/ansible-tmp-1619699186.3019922-33970-236219862995078/AnsiballZ_svr4pkg.py\", line 92, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File \"/export/home/srml/.ansible/tmp/ansible-tmp-1619699186.3019922-33970-236219862995078/AnsiballZ_svr4pkg.py\", line 41, in invoke_module\r\n run_name='__main__', alter_sys=True)\r\n File \"/usr/lib/python3.5/runpy.py\", line 205, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File \"/usr/lib/python3.5/runpy.py\", line 96, in _run_module_code\r\n mod_name, mod_spec, pkg_name, script_name)\r\n File \"/usr/lib/python3.5/runpy.py\", line 85, in _run_code\r\n exec(code, run_globals)\r\n File \"/tmp/ansible_community.general.svr4pkg_payload_ndukwobh/ansible_community.general.svr4pkg_payload.zip/ansible_collections/community/general/plugins/modules/svr4pkg.py\", line 262, in <module>\r\n File \"/tmp/ansible_community.general.svr4pkg_payload_ndukwobh/ansible_community.general.svr4pkg_payload.zip/ansible_collections/community/general/plugins/modules/svr4pkg.py\", line 216, in main\r\n File \"/tmp/ansible_community.general.svr4pkg_payload_ndukwobh/ansible_community.general.svr4pkg_payload.zip/ansible_collections/community/general/plugins/modules/svr4pkg.py\", line 154, in package_install\r\n File \"/tmp/ansible_community.general.svr4pkg_payload_ndukwobh/ansible_community.general.svr4pkg_payload.zip/ansible_collections/community/general/plugins/modules/svr4pkg.py\", line 142, in create_admin_file\r\nTypeError: a bytes-like object is required, not 'str'\r\n", "msg": "MODULE FAILURE\nSee stdout/stderr for the 
exact error", "rc": 1}
PLAY RECAP ***********************************************************************************************************************************************************sol11 : ok=1 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
The full traceback is:
Traceback (most recent call last):
File "/export/home/srml/.ansible/tmp/ansible-tmp-1619699820.2843351-34415-58061845298388/AnsiballZ_svr4pkg.py", line 100, in <module>
_ansiballz_main()
File "/export/home/srml/.ansible/tmp/ansible-tmp-1619699820.2843351-34415-58061845298388/AnsiballZ_svr4pkg.py", line 92, in _ansiballz_main
invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
File "/export/home/srml/.ansible/tmp/ansible-tmp-1619699820.2843351-34415-58061845298388/AnsiballZ_svr4pkg.py", line 41, in invoke_module
run_name='__main__', alter_sys=True)
File "/usr/lib/python3.5/runpy.py", line 205, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "/usr/lib/python3.5/runpy.py", line 96, in _run_module_code
mod_name, mod_spec, pkg_name, script_name)
File "/usr/lib/python3.5/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmp/ansible_community.general.svr4pkg_payload_n2ffzlfd/ansible_community.general.svr4pkg_payload.zip/ansible_collections/community/general/plugins/modules/svr4pkg.py", line 262, in <module>
File "/tmp/ansible_community.general.svr4pkg_payload_n2ffzlfd/ansible_community.general.svr4pkg_payload.zip/ansible_collections/community/general/plugins/modules/svr4pkg.py", line 216, in main
File "/tmp/ansible_community.general.svr4pkg_payload_n2ffzlfd/ansible_community.general.svr4pkg_payload.zip/ansible_collections/community/general/plugins/modules/svr4pkg.py", line 154, in package_install
File "/tmp/ansible_community.general.svr4pkg_payload_n2ffzlfd/ansible_community.general.svr4pkg_payload.zip/ansible_collections/community/general/plugins/modules/svr4pkg.py", line 142, in create_admin_file
TypeError: a bytes-like object is required, not 'str'
fatal: [sol11]: FAILED! => {
"changed": false,
"module_stderr": "Shared connection to 10.0.75.109 closed.\r\n",
"module_stdout": "Traceback (most recent call last):\r\n File \"/export/home/srml/.ansible/tmp/ansible-tmp-1619699820.2843351-34415-58061845298388/AnsiballZ_svr4pkg.py\", line 100, in <module>\r\n _ansiballz_main()\r\n File \"/export/home/srml/.ansible/tmp/ansible-tmp-1619699820.2843351-34415-58061845298388/AnsiballZ_svr4pkg.py\", line 92, in _ansiballz_main\r\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n File \"/export/home/srml/.ansible/tmp/ansible-tmp-1619699820.2843351-34415-58061845298388/AnsiballZ_svr4pkg.py\", line 41, in invoke_module\r\n run_name='__main__', alter_sys=True)\r\n File \"/usr/lib/python3.5/runpy.py\", line 205, in run_module\r\n return _run_module_code(code, init_globals, run_name, mod_spec)\r\n File \"/usr/lib/python3.5/runpy.py\", line 96, in _run_module_code\r\n mod_name, mod_spec, pkg_name, script_name)\r\n File \"/usr/lib/python3.5/runpy.py\", line 85, in _run_code\r\n exec(code, run_globals)\r\n File \"/tmp/ansible_community.general.svr4pkg_payload_n2ffzlfd/ansible_community.general.svr4pkg_payload.zip/ansible_collections/community/general/plugins/modules/svr4pkg.py\", line 262, in <module>\r\n File \"/tmp/ansible_community.general.svr4pkg_payload_n2ffzlfd/ansible_community.general.svr4pkg_payload.zip/ansible_collections/community/general/plugins/modules/svr4pkg.py\", line 216, in main\r\n File \"/tmp/ansible_community.general.svr4pkg_payload_n2ffzlfd/ansible_community.general.svr4pkg_payload.zip/ansible_collections/community/general/plugins/modules/svr4pkg.py\", line 154, in package_install\r\n File \"/tmp/ansible_community.general.svr4pkg_payload_n2ffzlfd/ansible_community.general.svr4pkg_payload.zip/ansible_collections/community/general/plugins/modules/svr4pkg.py\", line 142, in create_admin_file\r\nTypeError: a bytes-like object is required, not 'str'\r\n",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"rc": 1
}
```
### Code of Conduct
- [X] I agree to follow the Ansible Code of Conduct
| [
{
"content": "#!/usr/bin/python\n# -*- coding: utf-8 -*-\n\n# (c) 2012, Boyd Adamson <boyd () boydadamson.com>\n#\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\nDOCUMENTA... | [
{
"content": "#!/usr/bin/python\n# -*- coding: utf-8 -*-\n\n# (c) 2012, Boyd Adamson <boyd () boydadamson.com>\n#\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\nDOCUMENTA... | diff --git a/changelogs/fragments/2373-svr4pkg-fix-typeerror.yml b/changelogs/fragments/2373-svr4pkg-fix-typeerror.yml
new file mode 100644
index 00000000000..d0b35808895
--- /dev/null
+++ b/changelogs/fragments/2373-svr4pkg-fix-typeerror.yml
@@ -0,0 +1,3 @@
+---
+bugfixes:
+ - svr4pkg - convert string to a bytes-like object to avoid ``TypeError`` with Python 3 (https://github.com/ansible-collections/community.general/issues/2373).
diff --git a/plugins/modules/packaging/os/svr4pkg.py b/plugins/modules/packaging/os/svr4pkg.py
index ea3cd7d4683..aa7a5c2e523 100644
--- a/plugins/modules/packaging/os/svr4pkg.py
+++ b/plugins/modules/packaging/os/svr4pkg.py
@@ -121,7 +121,7 @@ def package_installed(module, name, category):
def create_admin_file():
(desc, filename) = tempfile.mkstemp(prefix='ansible_svr4pkg', text=True)
- fullauto = '''
+ fullauto = b'''
mail=
instance=unique
partial=nocheck
|
internetarchive__openlibrary-4551 | Add February carousel to top of homepage
@seabelis has created a collection for February! Let's add it to the top of the home page as "Books for February". We want:
- [ ] A search query that summarizes the collection so that books appear on the homepage (here's the literal aggregate of all the carousels: https://openlibrary.org/search?mode=everything&q=subject%3A%22february%22+OR++subject%3A%22groundhog+day%22+OR++subject%3A%22valentines+day%22+OR++subject%3A%22heart%22+OR++subject%3A%22black+history%22+OR++key%3A%28%2Fworks%2FOL3912087W+OR+%2Fworks%2FOL19728320W+OR+%2Fworks%2FOL19666828W+OR+%2Fworks%2FOL3459949W+OR+%2Fworks%2FOL66728W+OR+%2Fworks%2FOL17768453W+OR+%2Fworks%2FOL16190571W+OR+%2Fworks%2FOL15160873W+OR+%2Fworks%2FOL8275668W+OR+%2Fworks%2FOL17211582W+OR+%2Fworks%2FOL17628545W+OR+%2Fworks%2FOL20163236W+OR+%2Fworks%2FOL20153225W++OR+%2Fworks%2FOL17371695W%29+OR++subject%3A%22canned+food%22+OR++subject%3A%22friendship%22+OR++subject%3A%22pie%22+OR++subject%3A%22libraries%22+OR++subject%3A%22baking%22+OR++title%3A%22bird+feeding%22+OR++title%3A%22cat+health%22+OR++subject%3A%22cherries%22+OR++subject%3A%22Childrens+Dental+Health%22+OR++title%3A%22Childrens+Dental+Health%22+OR++subject%3A%22Embroidery%22+OR++title%3A%22Grapefruit%22+OR++subject%3A%22hot+breakfast%22+OR++title%3A%22hot+breakfast%22+OR++subject%3A%22snack+food%22+OR++title%3A%22Youth+Leadership%22+OR++title%3A%22Teen+Dating+Violence%22&has_fulltext=true , but the results aren't super great. Maybe @seabelis can come up with one?)
- [ ] The carousel should link to the collection: https://openlibrary.org/collections/february
### Describe the problem that you'd like solved
Showcase the February collection :)
### Proposal & Constraints
- Note: We might have to do some stuff to make sure it caches (I don't believe it caches by default)
### Additional context
- https://github.com/internetarchive/openlibrary/blob/ce81c3986dae8bce9df8e4d81b17578f30454d1b/openlibrary/templates/home/index.html#L21
### Stakeholders
@seabelis @Sabreen-Parveen
| [
{
"content": "import web\nimport json\nimport babel\nimport babel.core\nimport babel.dates\nfrom collections import defaultdict\nimport re\nimport random\nimport xml.etree.ElementTree as etree\nimport datetime\nimport logging\n\nimport six\nfrom six import PY3\nfrom six.moves import urllib\nfrom six.moves.colle... | [
{
"content": "import web\nimport json\nimport babel\nimport babel.core\nimport babel.dates\nfrom collections import defaultdict\nimport re\nimport random\nimport xml.etree.ElementTree as etree\nimport datetime\nimport logging\n\nimport six\nfrom six import PY3\nfrom six.moves import urllib\nfrom six.moves.colle... | diff --git a/openlibrary/macros/QueryCarousel.html b/openlibrary/macros/QueryCarousel.html
index 55ad3a3f842..e9998acc3c9 100644
--- a/openlibrary/macros/QueryCarousel.html
+++ b/openlibrary/macros/QueryCarousel.html
@@ -1,4 +1,4 @@
-$def with(query, title=None, sort='new', key='', limit=20, search=False, has_fulltext_only=True)
+$def with(query, title=None, sort='new', key='', limit=20, search=False, has_fulltext_only=True, url=None)
$# Takes following parameters
$# * query (str) -- Any arbitrary Open Library search query, e.g. subject:"Textbooks"
@@ -20,6 +20,7 @@
$code:
params = { 'q': query }
+ url = url or "/search?" + urlencode(params)
if has_fulltext_only:
params['has_fulltext'] = 'true'
@@ -27,4 +28,4 @@
books = [storage(b) for b in (results.get('docs', []))]
load_more = {"url": "/search.json?" + urlencode(params), "limit": limit }
-$:render_template("books/custom_carousel", books=books, title=title, url="/search?" + urlencode(params), key=key, load_more=load_more)
+$:render_template("books/custom_carousel", books=books, title=title, url=url, key=key, load_more=load_more)
diff --git a/openlibrary/plugins/upstream/utils.py b/openlibrary/plugins/upstream/utils.py
index 5daf8d7e00f..e29cc83e347 100644
--- a/openlibrary/plugins/upstream/utils.py
+++ b/openlibrary/plugins/upstream/utils.py
@@ -818,6 +818,11 @@ def render_once(key):
return True
+@public
+def today():
+ return datetime.datetime.today()
+
+
def setup():
"""Do required initialization"""
# monkey-patch get_markdown to use OL Flavored Markdown
diff --git a/openlibrary/templates/home/index.html b/openlibrary/templates/home/index.html
index 60c1f523d5a..6c16e7a5121 100644
--- a/openlibrary/templates/home/index.html
+++ b/openlibrary/templates/home/index.html
@@ -15,11 +15,17 @@
$add_metatag(name="twitter:image:alt", content="Open Library Logo")
$add_metatag(name="twitter:card", content="homepage_summary")
+$code:
+ FEB_READS = 'key:(/works/OL18181363W OR /works/OL3481095W OR /works/OL4360244W OR /works/OL20017931W OR /works/OL20615204W OR /works/OL2363176W OR /works/OL17869588W OR /works/OL17784026W OR /works/OL21179764W OR /works/OL8870595W OR /works/OL21054973W OR /works/OL21673730W OR /works/OL20548582W OR /works/OL15279153W OR /works/OL19992836W OR /works/OL15691480W OR /works/OL16305795W OR /works/OL19923407W OR /works/OL16529029W OR /works/OL9242636W OR /works/OL17529769W OR /works/OL3345332W OR /works/OL20013209W OR /works/OL20015483W OR /works/OL19987474W OR /works/OL19992114W OR /works/OL17893900W OR /works/OL18435803W OR /works/OL17314666W OR /works/OL17358927W OR /works/OL15933199W OR /works/OL17858931W OR /works/OL18187603W OR /works/OL16853133W OR /works/OL16894393W OR /works/OL19976062W OR /works/OL20037832W OR /works/OL16885033W OR /works/OL19708155W OR /works/OL17921756W OR /works/OL21037237W OR /works/OL17786027W OR /works/OL17345141W OR /works/OL21294275W OR /works/OL9582417W OR /works/OL9357555W OR /works/OL20907853W OR /works/OL20005568W OR /works/OL3296483W OR /works/OL11983310W OR /works/OL7159886W OR /works/OL1662667W OR /works/OL19990553W OR /works/OL15285884W OR /works/OL6888879W OR /works/OL17900435W OR /works/OL5706069W OR /works/OL2977589W OR /works/OL1593701W OR /works/OL16451688W OR /works/OL16910779W OR /works/OL18215336W OR /works/OL17371695W OR /works/OL3521634W OR /works/OL17355199W OR /works/OL5739152W OR /works/OL20016962W OR /works/OL3191599W OR /works/OL20896695W OR /works/OL19752490W OR /works/OL18335154W OR /works/OL4582875W OR /works/OL16515210W OR /works/OL16868407W OR /works/OL3459949W OR /works/OL16025481W OR /works/OL1928280W OR /works/OL6208302W OR /works/OL17566265W OR /works/OL20652811W OR /works/OL22059158W OR /works/OL4370955W OR /works/OL19998526W OR /works/OL6218060W OR /works/OL16813953W OR /works/OL21179974W OR /works/OL7213898W OR /works/OL17872185W OR /works/OL17340085W OR /works/OL21584979W OR /works/OL21078916W OR 
/works/OL158519W OR /works/OL4114499W OR /works/OL19638041W OR /works/OL16844793W OR /works/OL20940485W OR /works/OL17392121W OR /works/OL20030448W OR /works/OL15920474W OR /works/OL20544657W)'
+
<div id="contentBody">
$:render_template("home/categories", test=test)
$:render_template("books/custom_carousel", books=readonline_carousel(), title=_('Classic Books'), url="/read", key="public_domain", test=test)
+ $if today().month == 2 and not test:
+ $:macros.QueryCarousel(query=FEB_READS, title=_('Books For February'), key="monthly_reads", url="/collections/february", sort='editions')
+
$:render_template("home/custom_ia_carousel", title=_('Books We Love'), key="staff_picks", query='languageSorter:("English")', subject="openlibrary_staff_picks", sorts=["lending___last_browse desc"], limit=18, test=test)
$:render_template("home/custom_ia_carousel", title=_('Recently Returned'), key="recently_returned", sorts=["lending___last_browse desc"], limit=18, test=test)
|
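The diff above gates the carousel on the current month via a small `@public` `today()` helper consumed by the template's `$if today().month == 2` check. A standalone sketch of that pattern (the `show_february_carousel` name is a hypothetical stand-in for the template logic, not OpenLibrary code; injecting `now` keeps it testable, which the template achieves with its `test` flag):

```python
import datetime

def today():
    # Same shape as the helper added in openlibrary/plugins/upstream/utils.py
    return datetime.datetime.today()

def show_february_carousel(now=None):
    # Hypothetical equivalent of the template's month check
    now = now or today()
    return now.month == 2

print(show_february_carousel(datetime.datetime(2021, 2, 14)))  # → True
print(show_february_carousel(datetime.datetime(2021, 3, 1)))   # → False
```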
pwndbg__pwndbg-760 | `find_fake_fast` fails when providing a size argument
<!--
Before reporting a new issue, make sure that we do not have any duplicates already open.
If there is one it might be good to take part in the discussion there.
Please make sure you have checked that the issue persists on LATEST pwndbg version.
Below is a template for BUG REPORTS.
Don't include it if this is a FEATURE REQUEST.
-->
### Description
<!--
Briefly describe the problem you are having in a few paragraphs.
-->
Providing a size argument to the `find_fake_fast` command causes a TypeError at [heap.py:519](https://github.com/pwndbg/pwndbg/blob/dev/pwndbg/commands/heap.py#L519).
### Steps to reproduce
<!--
What do we have to do to reproduce the problem?
If this is connected to particular C/asm code,
please provide the smallest C code that reproduces the issue.
-->
1. Run gdb on a program that utilizes the heap
2. Once the heap is initialized, run `find_fake_fast &__malloc_hook 0x7f`
`find_fake_fast` working correctly as of commit 1158a3086d2eaa137e3ce30810539c1aa578e87a

Same command, same program, updated to commit 609284cee279de345dcb0706e11a0b56abe349f4

### My setup
Gdb: 7.11.1
Python: 3.5.2 (default, Oct 8 2019, 13:06:37) [GCC 5.4.0 20160609]
Pwndbg: 1.1.0 build: 609284c
Capstone: 4.0.1024
Unicorn: 1.0.1
| [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport argparse\nimport ctypes\nimport struct\n\nimport gdb\nimport six\n\nimport pwndbg.color.... | [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport argparse\nimport ctypes\nimport struct\n\nimport gdb\nimport six\n\nimport pwndbg.color.... | diff --git a/pwndbg/commands/heap.py b/pwndbg/commands/heap.py
index 743d55d4bee..ae28b14852f 100755
--- a/pwndbg/commands/heap.py
+++ b/pwndbg/commands/heap.py
@@ -511,7 +511,7 @@ def find_fake_fast(addr, size=None):
if size is None:
sizes = range(min_fast, max_fast + 1, align)
else:
- sizes = [size]
+ sizes = [int(size)]
print(C.banner("FAKE CHUNKS"))
for size in sizes:
|
ivy-llc__ivy-17162 | is_integer
| [
{
"content": "# global\nimport ivy\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@to_ivy_arrays_and_back\ndef is_complex(x):\n return ivy.is_complex_dtype(x)\n\n\n@to_ivy_arrays_and_back\ndef is_floating_point(x):\n return ivy.is_float_dtype(x)\n",
"p... | [
{
"content": "# global\nimport ivy\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@to_ivy_arrays_and_back\ndef is_complex(x):\n return ivy.is_complex_dtype(x)\n\n\n@to_ivy_arrays_and_back\ndef is_integer(x):\n return ivy.is_int_dtype(x)\n\n\n@to_ivy_arrays... | diff --git a/ivy/functional/frontends/paddle/tensor/attribute.py b/ivy/functional/frontends/paddle/tensor/attribute.py
index 9520930395f91..cc5d69066978b 100644
--- a/ivy/functional/frontends/paddle/tensor/attribute.py
+++ b/ivy/functional/frontends/paddle/tensor/attribute.py
@@ -10,6 +10,11 @@ def is_complex(x):
return ivy.is_complex_dtype(x)
+@to_ivy_arrays_and_back
+def is_integer(x):
+ return ivy.is_int_dtype(x)
+
+
@to_ivy_arrays_and_back
def is_floating_point(x):
return ivy.is_float_dtype(x)
diff --git a/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_paddle_attribute.py b/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_paddle_attribute.py
index 5e6573d4acd69..5f25f247e137b 100644
--- a/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_paddle_attribute.py
+++ b/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_paddle_attribute.py
@@ -30,6 +30,31 @@ def test_paddle_is_complex(
)
+@handle_frontend_test(
+ fn_tree="paddle.tensor.attribute.is_integer",
+ dtype_and_x=helpers.dtype_and_values(
+ available_dtypes=helpers.get_dtypes("valid"),
+ ),
+)
+def test_paddle_is_integer(
+ *,
+ dtype_and_x,
+ on_device,
+ fn_tree,
+ frontend,
+ test_flags,
+):
+ input_dtype, input = dtype_and_x
+ helpers.test_frontend_function(
+ input_dtypes=input_dtype,
+ frontend=frontend,
+ test_flags=test_flags,
+ fn_tree=fn_tree,
+ on_device=on_device,
+ x=input[0],
+ )
+
+
@handle_frontend_test(
fn_tree="paddle.tensor.attribute.is_floating_point",
dtype_and_x=helpers.dtype_and_values(
|
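The new frontend function simply delegates the dtype check to `ivy.is_int_dtype`. Stripped of the framework plumbing, the predicate amounts to something like this pure-Python stand-in (illustrative only; the real check inspects the array's dtype rather than individual scalars):

```python
def is_integer_like(values) -> bool:
    # Rough stand-in for ivy.is_int_dtype over a sequence of scalars.
    # bool is excluded explicitly because it subclasses int in Python.
    return bool(values) and all(
        isinstance(v, int) and not isinstance(v, bool) for v in values
    )

print(is_integer_like([1, 2, 3]))   # True
print(is_integer_like([1.0, 2.0]))  # False
```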
mitmproxy__mitmproxy-1510 | Divide by Zero error
It's in `netlib/strutils.py`, around line 126:

``` python
    return sum(
        i < 9 or 13 < i < 32 or 126 < i
        for i in six.iterbytes(s[:100])
    ) / len(s[:100]) > 0.3
```
If `s` is empty, it raises this error in mitmproxy (it doesn't crash outright, though, presumably due to recent improvements in mitmproxy).
| [
{
"content": "from __future__ import absolute_import, print_function, division\nimport re\nimport codecs\n\nimport six\n\n\ndef always_bytes(unicode_or_bytes, *encode_args):\n if isinstance(unicode_or_bytes, six.text_type):\n return unicode_or_bytes.encode(*encode_args)\n return unicode_or_bytes\n\... | [
{
"content": "from __future__ import absolute_import, print_function, division\nimport re\nimport codecs\n\nimport six\n\n\ndef always_bytes(unicode_or_bytes, *encode_args):\n if isinstance(unicode_or_bytes, six.text_type):\n return unicode_or_bytes.encode(*encode_args)\n return unicode_or_bytes\n\... | diff --git a/netlib/strutils.py b/netlib/strutils.py
index 4a46b6b1ff..4cb3b80560 100644
--- a/netlib/strutils.py
+++ b/netlib/strutils.py
@@ -121,6 +121,9 @@ def escaped_str_to_bytes(data):
def is_mostly_bin(s):
# type: (bytes) -> bool
+ if not s or len(s) == 0:
+ return False
+
return sum(
i < 9 or 13 < i < 32 or 126 < i
for i in six.iterbytes(s[:100])
diff --git a/test/netlib/test_strutils.py b/test/netlib/test_strutils.py
index 52299e5991..5be254a3e1 100644
--- a/test/netlib/test_strutils.py
+++ b/test/netlib/test_strutils.py
@@ -85,6 +85,7 @@ def test_escaped_str_to_bytes():
def test_is_mostly_bin():
assert not strutils.is_mostly_bin(b"foo\xFF")
assert strutils.is_mostly_bin(b"foo" + b"\xFF" * 10)
+ assert not strutils.is_mostly_bin("")
def test_is_xml():
|
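The guard added in the diff is enough to avoid the division by zero. A self-contained Python 3 version of the heuristic (dropping `six`, since iterating `bytes` already yields ints on Python 3) shows both paths:

```python
def is_mostly_bin(s: bytes) -> bool:
    # Empty input would make len(s[:100]) zero -- the reported crash --
    # so return False early, as the fix in netlib/strutils.py does.
    if not s:
        return False
    sample = s[:100]
    # Count bytes outside the usual printable/whitespace ranges.
    binish = sum(b < 9 or 13 < b < 32 or 126 < b for b in sample)
    return binish / len(sample) > 0.3

print(is_mostly_bin(b""))                    # False, no ZeroDivisionError
print(is_mostly_bin(b"foo" + b"\xFF" * 10))  # True
print(is_mostly_bin(b"foo\xFF"))             # False
```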
learningequality__kolibri-5237 | running kolibri pex on mac in background mode causes a seg fault
### Observed behavior
Trying to run the Kolibri pex in background mode causes a hard Python crash with no script-level errors:



Running with `--foreground` does not have this issue.
### Expected behavior
no crash
### User-facing consequences
frustration
### Errors and logs
This is the top of the report
```
Crashed Thread: 2
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x000000011207bb36
Exception Note: EXC_CORPSE_NOTIFY
Termination Signal: Segmentation fault: 11
Termination Reason: Namespace SIGNAL, Code 0xb
Terminating Process: exc handler [54294]
```
I can supply additional info
### Steps to reproduce
```kolibri start```
### Context
0.12
| [
{
"content": "from __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport importlib\nimport logging\nimport os\nimport signal\nimport sys\nfrom sqlite3 import DatabaseError as SQLite3DatabaseError\n\nimport django\nfrom django.core.exceptions ... | [
{
"content": "from __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport importlib\nimport logging\nimport os\nimport signal\nimport sys\nfrom sqlite3 import DatabaseError as SQLite3DatabaseError\n\nimport django\nfrom django.core.exceptions ... | diff --git a/kolibri/utils/cli.py b/kolibri/utils/cli.py
index 998417f5ebf..bb6d2c3264b 100644
--- a/kolibri/utils/cli.py
+++ b/kolibri/utils/cli.py
@@ -668,6 +668,8 @@ def main(args=None): # noqa: max-complexity=13
call_command("clearsessions")
daemon = not arguments['--foreground']
+ if sys.platform == 'darwin':
+ daemon = False
start(port, daemon=daemon)
return
|
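The fix sidesteps the crash by never daemonizing on macOS, regardless of the `--foreground` flag. The decision logic, extracted into a plain function (names are illustrative, not Kolibri's CLI internals; forking after certain system frameworks initialize is a known source of segfaults on macOS, which is presumably what the guard avoids):

```python
import sys

def resolve_daemon_flag(foreground_requested: bool,
                        platform: str = sys.platform) -> bool:
    # Daemonize only when the user did not ask for foreground mode...
    daemon = not foreground_requested
    # ...except on macOS, where the process stays in the foreground
    # to avoid the segfault described in the report.
    if platform == "darwin":
        daemon = False
    return daemon

print(resolve_daemon_flag(False, platform="darwin"))  # False
print(resolve_daemon_flag(False, platform="linux"))   # True
```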
python-gitlab__python-gitlab-2088 | sphinx warnings `reference target not found`
When building my packages I'm using the `sphinx-build` command with the `-n` switch, which shows warnings about missing references. These are not critical issues.
Here is the output with warnings:
```console
+ /usr/bin/sphinx-build -n -T -b man docs build/sphinx/man
Running Sphinx v4.5.0
making output directory... done
myst v0.17.2: MdParserConfig(commonmark_only=False, gfm_only=False, enable_extensions=[], linkify_fuzzy_links=True, dmath_allow_labels=True, dmath_allow_space=True, dmath_allow_digits=True, dmath_double_inline=False, update_mathjax=True, mathjax_classes='tex2jax_process|mathjax_process|math|output_area', disable_syntax=[], all_links_external=False, url_schemes=('http', 'https', 'mailto', 'ftp'), ref_domains=None, highlight_code_blocks=True, number_code_blocks=[], title_to_header=False, heading_anchors=None, heading_slug_func=None, footnote_transition=True, sub_delimiters=('{', '}'), words_per_minute=200)
[autosummary] generating autosummary for: api-objects.rst, api-usage.rst, api/gitlab.rst, api/gitlab.v4.rst, changelog.md, cli-examples.rst, cli-objects.rst, cli-usage.rst, faq.rst, gl_objects/access_requests.rst, ..., gl_objects/snippets.rst, gl_objects/system_hooks.rst, gl_objects/templates.rst, gl_objects/todos.rst, gl_objects/topics.rst, gl_objects/users.rst, gl_objects/variables.rst, gl_objects/wikis.rst, index.rst, release-notes.rst
building [mo]: targets for 0 po files that are out of date
building [man]: all manpages
updating environment: [new config] 65 added, 0 changed, 0 removed
reading sources... [100%] release-notes
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
writing... python-gitlab.3 { cli-usage api-usage cli-examples api-objects gl_objects/access_requests gl_objects/appearance gl_objects/applications gl_objects/emojis gl_objects/badges gl_objects/branches gl_objects/clusters gl_objects/messages gl_objects/commits gl_objects/deploy_keys gl_objects/deploy_tokens gl_objects/deployments gl_objects/discussions gl_objects/environments gl_objects/events gl_objects/epics gl_objects/features gl_objects/geo_nodes gl_objects/groups gl_objects/group_access_tokens gl_objects/issues gl_objects/keys gl_objects/boards gl_objects/labels gl_objects/notifications gl_objects/merge_trains gl_objects/merge_requests gl_objects/merge_request_approvals gl_objects/milestones gl_objects/namespaces gl_objects/notes gl_objects/packages gl_objects/pagesdomains gl_objects/personal_access_tokens gl_objects/pipelines_and_jobs gl_objects/projects gl_objects/project_access_tokens gl_objects/protected_branches gl_objects/releases gl_objects/runners gl_objects/remote_mirrors gl_objects/repositories gl_objects/repository_tags gl_objects/search gl_objects/settings gl_objects/snippets gl_objects/system_hooks gl_objects/templates gl_objects/todos gl_objects/topics gl_objects/users gl_objects/variables gl_objects/sidekiq gl_objects/wikis api/gitlab api/gitlab.v4 cli-objects changelog release-notes faq } /home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/applications.rst:10: WARNING: py:class reference target not found: gitlab.v4.objects.Applications
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/epics.rst:15: WARNING: py:attr reference target not found: gitlab.Gitlab.Group.epics
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/epics.rst:54: WARNING: py:attr reference target not found: gitlab.Gitlab.GroupEpic.issues
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/group_access_tokens.rst:14: WARNING: py:attr reference target not found: gitlab.Gitlab.group_access_tokens
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/issues.rst:239: WARNING: py:attr reference target not found: gitlab.issues_statistics
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/notes.rst:19: WARNING: py:attr reference target not found: gitlab.v4.objects.GroupEpic.notes
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/personal_access_tokens.rst:11: WARNING: py:class reference target not found: gitlab.v4.objects.PersonalAcessTokenManager
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/personal_access_tokens.rst:14: WARNING: py:class reference target not found: gitlab.v4.objects.UserPersonalAcessTokenManager
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/personal_access_tokens.rst:15: WARNING: py:attr reference target not found: gitlab.Gitlab.User.personal_access_tokens
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/project_access_tokens.rst:14: WARNING: py:attr reference target not found: gitlab.Gitlab.project_access_tokens
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/repository_tags.rst:12: WARNING: py:attr reference target not found: gitlab.v4.objects.Repository.tags
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/snippets.rst:11: WARNING: py:class reference target not found: gitlab.v4.objects.SnipptManager
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/todos.rst:10: WARNING: py:class reference target not found: gitlab.objects.Todo
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/todos.rst:11: WARNING: py:class reference target not found: gitlab.objects.TodoManager
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/gl_objects/users.rst:219: WARNING: py:attr reference target not found: gitlab.Gitlab.user
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab:: WARNING: py:class reference target not found: requests.sessions.Session
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab:: WARNING: py:class reference target not found: requests.sessions.Session
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab.from_config:: WARNING: py:class reference target not found: config_files
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab.http_delete:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab.http_delete:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab.http_get:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab.http_get:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab.http_post:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab.http_post:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab.http_put:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab.http_put:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab.http_request:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab.http_request:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/client.py:docstring of gitlab.client.Gitlab.set_license:: WARNING: py:exc reference target not found: GitlabPostError
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/packages.py:docstring of gitlab.v4.objects.packages.GenericPackageManager.upload:: WARNING: py:class reference target not found: pathlib.Path
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/packages.py:docstring of gitlab.v4.objects.packages.GenericPackageManager.upload:: WARNING: py:class reference target not found: pathlib.Path
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/groups.py:docstring of gitlab.v4.objects.groups.GroupManager.import_group:: WARNING: py:obj reference target not found: typing.BinaryIO
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/groups.py:docstring of gitlab.v4.objects.groups.GroupManager.import_group:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/groups.py:docstring of gitlab.v4.objects.groups.GroupManager.import_group:: WARNING: py:obj reference target not found: typing.BinaryIO
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/groups.py:docstring of gitlab.v4.objects.groups.GroupManager.import_group:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/projects.py:docstring of gitlab.v4.objects.Project.groups:: WARNING: py:class reference target not found: gitlab.v4.objects.projects.ProjectGroupManager
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/projects.py:docstring of gitlab.v4.objects.projects.Project.languages:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/projects.py:docstring of gitlab.v4.objects.projects.Project.languages:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/projects.py:docstring of gitlab.v4.objects.projects.Project.transfer:: WARNING: py:class reference target not found: project
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/artifacts.py:docstring of gitlab.v4.objects.artifacts.ProjectArtifactManager.download:: WARNING: py:class reference target not found: are not
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/commits.py:docstring of gitlab.v4.objects.commits.ProjectCommit.revert:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/commits.py:docstring of gitlab.v4.objects.commits.ProjectCommit.revert:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/commits.py:docstring of gitlab.v4.objects.commits.ProjectCommit.signature:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/commits.py:docstring of gitlab.v4.objects.commits.ProjectCommit.signature:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/environments.py:docstring of gitlab.v4.objects.environments.ProjectEnvironment.stop:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/environments.py:docstring of gitlab.v4.objects.environments.ProjectEnvironment.stop:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/issues.py:docstring of gitlab.v4.objects.issues.ProjectIssue.closed_by:: WARNING: py:exc reference target not found: GitlabGetErrot
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/issues.py:docstring of gitlab.v4.objects.issues.ProjectIssue.related_merge_requests:: WARNING: py:exc reference target not found: GitlabGetErrot
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/deploy_keys.py:docstring of gitlab.v4.objects.deploy_keys.ProjectKeyManager.enable:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/deploy_keys.py:docstring of gitlab.v4.objects.deploy_keys.ProjectKeyManager.enable:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/labels.py:docstring of gitlab.v4.objects.labels.ProjectLabel:1: WARNING: py:class reference target not found: gitlab.mixins.PromoteMixin
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/projects.py:docstring of gitlab.v4.objects.projects.ProjectManager.import_bitbucket_server:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/projects.py:docstring of gitlab.v4.objects.projects.ProjectManager.import_bitbucket_server:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/projects.py:docstring of gitlab.v4.objects.projects.ProjectManager.import_github:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/projects.py:docstring of gitlab.v4.objects.projects.ProjectManager.import_github:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/projects.py:docstring of gitlab.v4.objects.projects.ProjectManager.import_project:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/projects.py:docstring of gitlab.v4.objects.projects.ProjectManager.import_project:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/merge_requests.py:docstring of gitlab.v4.objects.merge_requests.ProjectMergeRequest.changes:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/merge_requests.py:docstring of gitlab.v4.objects.merge_requests.ProjectMergeRequest.changes:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/merge_requests.py:docstring of gitlab.v4.objects.merge_requests.ProjectMergeRequest.merge_ref:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/merge_requests.py:docstring of gitlab.v4.objects.merge_requests.ProjectMergeRequest.merge_ref:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/merge_requests.py:docstring of gitlab.v4.objects.merge_requests.ProjectMergeRequest.rebase:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/merge_requests.py:docstring of gitlab.v4.objects.merge_requests.ProjectMergeRequest.rebase:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/milestones.py:docstring of gitlab.v4.objects.milestones.ProjectMilestone:1: WARNING: py:class reference target not found: gitlab.mixins.PromoteMixin
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/pipelines.py:docstring of gitlab.v4.objects.pipelines.ProjectPipeline.cancel:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/pipelines.py:docstring of gitlab.v4.objects.pipelines.ProjectPipeline.cancel:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/pipelines.py:docstring of gitlab.v4.objects.pipelines.ProjectPipeline.retry:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/pipelines.py:docstring of gitlab.v4.objects.pipelines.ProjectPipeline.retry:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/repositories.py:docstring of gitlab.v4.objects.repositories.RepositoryMixin.repository_blob:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/repositories.py:docstring of gitlab.v4.objects.repositories.RepositoryMixin.repository_blob:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/repositories.py:docstring of gitlab.v4.objects.repositories.RepositoryMixin.repository_compare:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/repositories.py:docstring of gitlab.v4.objects.repositories.RepositoryMixin.repository_compare:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/repositories.py:docstring of gitlab.v4.objects.repositories.RepositoryMixin.update_submodule:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/repositories.py:docstring of gitlab.v4.objects.repositories.RepositoryMixin.update_submodule:: WARNING: py:exc reference target not found: GitlabPutError
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/repositories.py:docstring of gitlab.v4.objects.repositories.RepositoryMixin.update_submodule:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/sidekiq.py:docstring of gitlab.v4.objects.sidekiq.SidekiqManager.compound_metrics:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/sidekiq.py:docstring of gitlab.v4.objects.sidekiq.SidekiqManager.compound_metrics:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/sidekiq.py:docstring of gitlab.v4.objects.sidekiq.SidekiqManager.job_stats:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/sidekiq.py:docstring of gitlab.v4.objects.sidekiq.SidekiqManager.job_stats:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/sidekiq.py:docstring of gitlab.v4.objects.sidekiq.SidekiqManager.process_metrics:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/sidekiq.py:docstring of gitlab.v4.objects.sidekiq.SidekiqManager.process_metrics:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/sidekiq.py:docstring of gitlab.v4.objects.sidekiq.SidekiqManager.queue_metrics:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/sidekiq.py:docstring of gitlab.v4.objects.sidekiq.SidekiqManager.queue_metrics:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.users.User.activate:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.users.User.activate:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.users.User.block:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.users.User.block:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.users.User.deactivate:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.users.User.deactivate:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.users.User.follow:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.users.User.follow:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.User.followers_users:: WARNING: py:class reference target not found: UserFollowersManager
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.User.following_users:: WARNING: py:class reference target not found: UserFollowingManager
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.users.User.unblock:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.users.User.unblock:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.users.User.unfollow:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/v4/objects/users.py:docstring of gitlab.v4.objects.users.User.unfollow:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/cli.py:docstring of gitlab.cli.docs:: WARNING: py:class reference target not found: argparse.ArgumentParser
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/cli.py:docstring of gitlab.cli.docs:: WARNING: py:class reference target not found: argparse.ArgumentParser
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/cli.py:docstring of gitlab.cli.register_custom_action:: WARNING: py:class reference target not found: gitlab.cli.__F
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/cli.py:docstring of gitlab.cli.register_custom_action:: WARNING: py:class reference target not found: gitlab.cli.__F
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/cli.py:docstring of gitlab.cli.register_custom_action:: WARNING: py:class reference target not found: gitlab.cli.__F
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/cli.py:docstring of gitlab.cli.register_custom_action:: WARNING: py:class reference target not found: gitlab.cli.__F
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/cli.py:docstring of gitlab.cli.what_to_cls:: WARNING: py:class reference target not found: module
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/cli.py:docstring of gitlab.cli.what_to_cls:: WARNING: py:class reference target not found: module
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/exceptions.py:docstring of gitlab.exceptions.on_http_error:: WARNING: py:class reference target not found: gitlab.exceptions.__F
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/exceptions.py:docstring of gitlab.exceptions.on_http_error:: WARNING: py:class reference target not found: gitlab.exceptions.__F
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/exceptions.py:docstring of gitlab.exceptions.on_http_error:: WARNING: py:class reference target not found: The exception type to raise -- must inherit from
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/exceptions.py:docstring of gitlab.exceptions.on_http_error:: WARNING: py:class reference target not found: gitlab.exceptions.__F
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/exceptions.py:docstring of gitlab.exceptions.on_http_error:: WARNING: py:class reference target not found: gitlab.exceptions.__F
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/utils.py:docstring of gitlab.utils.response_content:: WARNING: py:class reference target not found: requests.models.Response
/home/tkloczko/rpmbuild/BUILD/python-gitlab-3.4.0/docs/../gitlab/utils.py:docstring of gitlab.utils.response_content:: WARNING: py:class reference target not found: requests.models.Response
done
build succeeded, 112 warnings.
```
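Every warning in the tail of this log is an unresolved `py:` domain reference, which is why the conf.py change in the diff below silences them with a single `nitpick_ignore_regex` pair instead of listing targets one by one. A quick sketch of how one logged warning matches that pair (assuming Sphinx's documented full-string matching; the reference names are copied from the log above):

```python
import re

# The pattern pair added to conf.py by the fix below.
ignore_type, ignore_target = r"py:.*", r".*"

# One of the unresolved references reported in the build log above.
ref_type, ref_target = "py:class", "requests.models.Response"

# Sphinx compares (reftype, target) against each pair with full-string
# matching, so this warning is suppressed.
assert re.fullmatch(ignore_type, ref_type)
assert re.fullmatch(ignore_target, ref_target)
```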
| [
{
"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n#\n# python-gitlab documentation build configuration file, created by\n# sphinx-quickstart on Mon Dec 8 15:17:39 2014.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configur... | [
{
"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n#\n# python-gitlab documentation build configuration file, created by\n# sphinx-quickstart on Mon Dec 8 15:17:39 2014.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configur... | diff --git a/docs/conf.py b/docs/conf.py
index 11b920e7b..13de175d0 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -287,6 +287,7 @@ def setup(sphinx):
# If true, show URL addresses after external links.
# man_show_urls = False
+nitpick_ignore_regex = [(r"py:.*", r".*")]
# -- Options for Texinfo output -------------------------------------------
diff --git a/setup.cfg b/setup.cfg
deleted file mode 100644
index 0e198a6f9..000000000
--- a/setup.cfg
+++ /dev/null
@@ -1,3 +0,0 @@
-[build_sphinx]
-warning-is-error = 1
-keep-going = 1
diff --git a/tox.ini b/tox.ini
index 144c52164..38171f2f6 100644
--- a/tox.ini
+++ b/tox.ini
@@ -83,7 +83,7 @@ per-file-ignores =
[testenv:docs]
deps = -r{toxinidir}/requirements-docs.txt
-commands = python setup.py build_sphinx
+commands = sphinx-build -n -W --keep-going -b html docs build/sphinx/html
[testenv:cover]
commands =
|
cisagov__manage.get.gov-1583 | Redirect logout to {beta.}get.gov info site
Now that we have the `cloud.gov Pages` site set up at get.gov, we should redirect logout actions to that site.
As a logged-in user of the registrar
I want to be redirected to the new get.gov informational site when I log out
So that I stay in the .gov experience rather than login.gov
AC:
- [ ] **Given** a logged-in user on the .gov registrar, **when** I logout and also choose "Yes, sign out of login.gov," **then** I am redirected to the get.gov as an unauthenticated user.
- [ ] Language on login.gov screen reads "Do you want to sign out of Login.gov and return to **get.gov**?"

### Additional Context:
Currently, if we select to "return to the .gov registrar," we go back to login.gov... and if we select to go back to the .gov registrar, we get a nasty 401 error because we aren't logged in anymore.
### Links to related issues
🔄 #1509
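The one-line settings change works because `LOGOUT_REDIRECT_URL` accepts either a URL pattern name (the old value `"home"`) or an absolute URL. A minimal sketch of how Django interprets the value (hedged: the real logic is `django.shortcuts.resolve_url`, which first tries to reverse the value as a pattern name; `KNOWN_PATTERNS` here is a hypothetical stand-in for the project's URLconf):

```python
# Hypothetical URLconf: maps pattern names to paths.
KNOWN_PATTERNS = {"home": "/"}

def resolve_logout_redirect(to):
    if to in KNOWN_PATTERNS:    # pattern name, e.g. the old value "home"
        return KNOWN_PATTERNS[to]
    if "/" in to or "." in to:  # looks like a URL: Django returns it as-is
        return to
    raise ValueError(f"no URL pattern named {to!r}")

assert resolve_logout_redirect("home") == "/"
assert resolve_logout_redirect("https://get.gov/") == "https://get.gov/"
```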
| [
{
"content": "\"\"\"\nDjango settings for .gov registrar project.\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/4.0/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/en/4.0/ref/settings/\n\nIF you'd like to see all of these set... | [
{
"content": "\"\"\"\nDjango settings for .gov registrar project.\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/4.0/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/en/4.0/ref/settings/\n\nIF you'd like to see all of these set... | diff --git a/src/registrar/config/settings.py b/src/registrar/config/settings.py
index bc46c60ba..2de7e6eb2 100644
--- a/src/registrar/config/settings.py
+++ b/src/registrar/config/settings.py
@@ -519,7 +519,7 @@
]
# where to go after logging out
-LOGOUT_REDIRECT_URL = "home"
+LOGOUT_REDIRECT_URL = "https://get.gov/"
# disable dynamic client registration,
# only the OP inside OIDC_PROVIDERS will be available
diff --git a/src/zap.conf b/src/zap.conf
index e7dc980b0..7a1e5c96d 100644
--- a/src/zap.conf
+++ b/src/zap.conf
@@ -67,6 +67,7 @@
10038 OUTOFSCOPE http://app:8080/dns/nameservers
10038 OUTOFSCOPE http://app:8080/dns/dnssec
10038 OUTOFSCOPE http://app:8080/dns/dnssec/dsdata
+10038 OUTOFSCOPE http://app:8080/org-name-address
# This URL always returns 404, so include it as well.
10038 OUTOFSCOPE http://app:8080/todo
# OIDC isn't configured in the test environment and DEBUG=True so this gives a 500 without CSP headers
|
microsoft__botbuilder-python-1637 | botbuilder-testing is missing install requirements
## Version
botbuilder-testing 4.12.0
## Describe the bug
While installing botbuilder-testing for CI I got errors about missing dependencies.
## To Reproduce
1. `python3 -m venv .venv`
2. `. .venv/bin/activate`
3. `pip install -U pip wheel`
4. `pip install botbuilder-testing`
5. `python -c "from botbuilder.testing import DialogTestClient"`
First error is missing `pytest`:
```python
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/calum/sureswift/jell/jell-bot-teams-v2/.venv-test/lib/python3.8/site-packages/botbuilder/testing/__init__.py", line 6, in <module>
from .storage_base_tests import StorageBaseTests
File "/home/calum/sureswift/jell/jell-bot-teams-v2/.venv-test/lib/python3.8/site-packages/botbuilder/testing/storage_base_tests.py", line 26, in <module>
import pytest
ModuleNotFoundError: No module named 'pytest'
```
6. `pip install pytest`
7. `python -c 'from botbuilder.testing import DialogTestClient'`
Next error is missing `botbuilder-azure`:
```python
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/home/calum/sureswift/jell/jell-bot-teams-v2/.venv-test/lib/python3.8/site-packages/botbuilder/testing/__init__.py", line 6, in <module>
from .storage_base_tests import StorageBaseTests
File "/home/calum/sureswift/jell/jell-bot-teams-v2/.venv-test/lib/python3.8/site-packages/botbuilder/testing/storage_base_tests.py", line 27, in <module>
from botbuilder.azure import CosmosDbStorage
ModuleNotFoundError: No module named 'botbuilder.azure'
```
8. `pip install botbuilder-azure`
9. `python -c 'from botbuilder.testing import DialogTestClient'`
Command works!
## Expected behavior
No errors after installing botbuilder-testing and importing the module.
I do wonder if the pytest requirement is really necessary; refactoring it out would leave the lib test-suite agnostic.
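A quick way to surface undeclared dependencies like these in one pass, instead of tripping the ImportError chain one module at a time (a sketch; the module names are taken from the tracebacks above):

```python
import importlib.util

def missing_modules(names):
    """Return the module names that cannot be found in this environment."""
    missing = []
    for name in names:
        try:
            spec = importlib.util.find_spec(name)
        except ModuleNotFoundError:  # the parent package itself is absent
            spec = None
        if spec is None:
            missing.append(name)
    return missing

# The two imports storage_base_tests.py pulls in but the pre-fix setup.py
# never declares:
print(missing_modules(["pytest", "botbuilder.azure"]))
```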
| [
{
"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nimport os\nfrom setuptools import setup\n\nREQUIRES = [\n \"botbuilder-schema==4.13.0\",\n \"botbuilder-core==4.13.0\",\n \"botbuilder-dialogs==4.13.0\",\n]\n\nTESTS_REQUIRES = [\"aiounittest... | [
{
"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nimport os\nfrom setuptools import setup\n\nREQUIRES = [\n \"botbuilder-schema==4.13.0\",\n \"botbuilder-core==4.13.0\",\n \"botbuilder-dialogs==4.13.0\",\n \"botbuilder-azure==4.13.0\",\n ... | diff --git a/libraries/botbuilder-testing/setup.py b/libraries/botbuilder-testing/setup.py
index af36832cd..bd6ed4856 100644
--- a/libraries/botbuilder-testing/setup.py
+++ b/libraries/botbuilder-testing/setup.py
@@ -8,6 +8,8 @@
"botbuilder-schema==4.13.0",
"botbuilder-core==4.13.0",
"botbuilder-dialogs==4.13.0",
+ "botbuilder-azure==4.13.0",
+ "pytest~=6.2.3",
]
TESTS_REQUIRES = ["aiounittest==1.3.0"]
|
python-poetry__poetry-3159 | Poetry fails with KeyError if the PATH environment variable is not present
- [x] I am on the [latest](https://github.com/python-poetry/poetry/releases/latest) Poetry version.
- [x] I have searched the [issues](https://github.com/python-poetry/poetry/issues) of this repo and believe that this is not a duplicate.
- [x] If an exception occurs when executing a command, I executed it again in debug mode (`-vvv` option).
- **OS version and name**: Ubuntu 18.04
- **Poetry version**: 1.0.10
## Issue
When running in CI using a docker container the `PATH` environment variable is not set and causes an issue with poetry. Unfortunately I don't see any traceback. Here's a snipped showing the issue:
```
root@5d1e49d5433c:~/src# unset PATH
root@5d1e49d5433c:~/src# /usr/local/bin/poetry run -vvv pip install pip
[KeyError]
'PATH'
```
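The fix (visible in the diff below) swaps the raising lookup for a defaulting one. A sketch of the behavior in both environments, with a plain dict standing in for `os.environ`:

```python
import os

def updated_path(bin_dir, env):
    # Pre-fix: env["PATH"] raises KeyError when PATH is unset.
    # Post-fix: fall back to an empty string instead.
    return os.pathsep.join([bin_dir, env.get("PATH", "")])

# PATH unset (the CI container case): no KeyError, trailing separator only.
assert updated_path("/venv/bin", {}) == "/venv/bin" + os.pathsep

# PATH present: bin dir is prepended as before.
assert updated_path("/venv/bin", {"PATH": "/usr/bin"}) == os.pathsep.join(["/venv/bin", "/usr/bin"])
```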
| [
{
"content": "import base64\nimport hashlib\nimport json\nimport os\nimport platform\nimport re\nimport shutil\nimport sys\nimport sysconfig\nimport textwrap\n\nfrom contextlib import contextmanager\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\nfrom typing import Optional\nfrom typi... | [
{
"content": "import base64\nimport hashlib\nimport json\nimport os\nimport platform\nimport re\nimport shutil\nimport sys\nimport sysconfig\nimport textwrap\n\nfrom contextlib import contextmanager\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\nfrom typing import Optional\nfrom typi... | diff --git a/poetry/utils/env.py b/poetry/utils/env.py
index 0a027bd668a..ccd855b5c60 100644
--- a/poetry/utils/env.py
+++ b/poetry/utils/env.py
@@ -1212,7 +1212,7 @@ def unset_env(self, key):
del os.environ[key]
def _updated_path(self):
- return os.pathsep.join([str(self._bin_dir), os.environ["PATH"]])
+ return os.pathsep.join([str(self._bin_dir), os.environ.get("PATH", "")])
class NullEnv(SystemEnv):
|
python-poetry__poetry-3146 | Poetry fails with KeyError if the PATH environment variable is not present
- [x] I am on the [latest](https://github.com/python-poetry/poetry/releases/latest) Poetry version.
- [x] I have searched the [issues](https://github.com/python-poetry/poetry/issues) of this repo and believe that this is not a duplicate.
- [x] If an exception occurs when executing a command, I executed it again in debug mode (`-vvv` option).
- **OS version and name**: Ubuntu 18.04
- **Poetry version**: 1.0.10
## Issue
When running in CI using a docker container the `PATH` environment variable is not set and causes an issue with poetry. Unfortunately I don't see any traceback. Here's a snipped showing the issue:
```
root@5d1e49d5433c:~/src# unset PATH
root@5d1e49d5433c:~/src# /usr/local/bin/poetry run -vvv pip install pip
[KeyError]
'PATH'
```
| [
{
"content": "import base64\nimport hashlib\nimport json\nimport os\nimport platform\nimport re\nimport shutil\nimport sys\nimport sysconfig\nimport textwrap\n\nfrom contextlib import contextmanager\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\nfrom typing import Optional\nfrom typi... | [
{
"content": "import base64\nimport hashlib\nimport json\nimport os\nimport platform\nimport re\nimport shutil\nimport sys\nimport sysconfig\nimport textwrap\n\nfrom contextlib import contextmanager\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\nfrom typing import Optional\nfrom typi... | diff --git a/poetry/utils/env.py b/poetry/utils/env.py
index 0a027bd668a..ccd855b5c60 100644
--- a/poetry/utils/env.py
+++ b/poetry/utils/env.py
@@ -1212,7 +1212,7 @@ def unset_env(self, key):
del os.environ[key]
def _updated_path(self):
- return os.pathsep.join([str(self._bin_dir), os.environ["PATH"]])
+ return os.pathsep.join([str(self._bin_dir), os.environ.get("PATH", "")])
class NullEnv(SystemEnv):
|
cleanlab__cleanlab-990 | Add underperforming_group issue type among the Datalab defaults
Test the issue manager with different datasets (image, tabular, etc.) to make sure that the underperforming group in the dataset is extracted successfully. List any failure cases that might need to be addressed before adding this issue type to the defaults.
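In code terms, the acceptance check is membership in the list returned by `list_default_issue_types("classification")` (a sketch; only the entries visible in the factory diff below are listed here, so this is not the full defaults list):

```python
# Entries visible in the issue_manager_factory diff for the
# "classification" task; "underperforming_group" is the new default.
classification_defaults = [
    "near_duplicate",
    "non_iid",
    "class_imbalance",
    "underperforming_group",  # newly added by this PR
]

assert "underperforming_group" in classification_defaults
```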
| [
{
"content": "# Copyright (C) 2017-2023 Cleanlab Inc.\n# This file is part of cleanlab.\n#\n# cleanlab is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Affero General Public License as published\n# by the Free Software Foundation, either version 3 of the License, or\n# (... | [
{
"content": "# Copyright (C) 2017-2023 Cleanlab Inc.\n# This file is part of cleanlab.\n#\n# cleanlab is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Affero General Public License as published\n# by the Free Software Foundation, either version 3 of the License, or\n# (... | diff --git a/cleanlab/datalab/internal/issue_manager_factory.py b/cleanlab/datalab/internal/issue_manager_factory.py
index 28cabb58a6..d85fc62e4c 100644
--- a/cleanlab/datalab/internal/issue_manager_factory.py
+++ b/cleanlab/datalab/internal/issue_manager_factory.py
@@ -223,6 +223,7 @@ def list_default_issue_types(task: str) -> List[str]:
"near_duplicate",
"non_iid",
"class_imbalance",
+ "underperforming_group",
],
"regression": [
"null",
diff --git a/tests/datalab/datalab/test_datalab.py b/tests/datalab/datalab/test_datalab.py
index 6d06d5de04..172801a1bf 100644
--- a/tests/datalab/datalab/test_datalab.py
+++ b/tests/datalab/datalab/test_datalab.py
@@ -89,6 +89,7 @@ def test_list_default_issue_types(self):
"near_duplicate",
"non_iid",
"class_imbalance",
+ "underperforming_group",
]
def tmp_path(self):
diff --git a/tests/datalab/test_cleanvision_integration.py b/tests/datalab/test_cleanvision_integration.py
index 972791bd9a..1d463473e6 100644
--- a/tests/datalab/test_cleanvision_integration.py
+++ b/tests/datalab/test_cleanvision_integration.py
@@ -32,7 +32,7 @@ def num_imagelab_issues(self):
@pytest.fixture
def num_datalab_issues(self):
- return 5
+ return 6
@pytest.fixture
def pred_probs(self, image_dataset):
@@ -69,6 +69,7 @@ def test_imagelab_issues_checked(
"near_duplicate",
"class_imbalance",
"null",
+ "underperforming_group",
# "non_iid",
]
@@ -94,12 +95,14 @@ def test_imagelab_issues_checked(
"near_duplicate",
"class_imbalance",
"null",
+ "underperforming_group",
],
- "num_issues": [1, 1, 0, 1, 1, 1, 1, 0, 0, 0, 0, 0],
+ "num_issues": [1, 1, 0, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0],
}
)
expected_count = df.sort_values(by="issue_type")["num_issues"].tolist()
count = datalab.issue_summary.sort_values(by="issue_type")["num_issues"].tolist()
+ assert set(datalab.issue_summary["issue_type"].tolist()) == set(df["issue_type"].tolist())
assert count == expected_count
assert datalab.issue_summary["num_issues"].sum() == df["num_issues"].sum()
@@ -147,7 +150,15 @@ def test_imagelab_issues_not_checked(
assert len(datalab.issues.columns) == num_datalab_issues * 2
assert len(datalab.issue_summary) == num_datalab_issues
- all_keys = ["statistics", "label", "outlier", "near_duplicate", "class_imbalance", "null"]
+ all_keys = [
+ "statistics",
+ "label",
+ "outlier",
+ "near_duplicate",
+ "class_imbalance",
+ "null",
+ "underperforming_group",
+ ]
assert set(all_keys) == set(datalab.info.keys())
datalab.report()
|
liqd__adhocracy4-1243 | Poll cannot change order of questions
Poll: after moving a question in the dashboard and saving, the question moves back to its original position.
NOTE: the flip-move lib is still working in documents; this may require a poll refactor.
https://github.com/liqd/adhocracy-plus/issues/1964
https://github.com/liqd/a4-meinberlin/issues/4370
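The client-side move handlers in the refactor below (`handleQuestionMoveUp` / `handleQuestionMoveDown`) use an immutability-helper `$splice` that removes the question and reinserts it one slot away. The same operation as a Python sketch, for reference:

```python
def move_question(questions, index, direction):
    """Mirror of the $splice move: pop at index, reinsert one slot away.

    A Python sketch of the JSX handlers in the diff below; builds a new
    list rather than mutating, just as immutability-helper does.
    """
    position = index - 1 if direction == "up" else index + 1
    moved = questions[:]
    item = moved.pop(index)
    moved.insert(position, item)
    return moved

assert move_question(["q1", "q2", "q3"], 2, "up") == ["q1", "q3", "q2"]
assert move_question(["q1", "q2", "q3"], 0, "down") == ["q2", "q1", "q3"]
```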
| [
{
"content": "from django.contrib.contenttypes.fields import GenericRelation\nfrom django.core.exceptions import ValidationError\nfrom django.db import models\nfrom django.utils.translation import gettext_lazy as _\n\nfrom adhocracy4.comments import models as comment_models\nfrom adhocracy4.models.base import U... | [
{
"content": "from django.contrib.contenttypes.fields import GenericRelation\nfrom django.core.exceptions import ValidationError\nfrom django.db import models\nfrom django.utils.translation import gettext_lazy as _\n\nfrom adhocracy4.comments import models as comment_models\nfrom adhocracy4.models.base import U... | diff --git a/adhocracy4/comments_async/static/comments_async/comment_edit_form.jsx b/adhocracy4/comments_async/static/comments_async/comment_edit_form.jsx
index 958b0108a..85ba95a8a 100644
--- a/adhocracy4/comments_async/static/comments_async/comment_edit_form.jsx
+++ b/adhocracy4/comments_async/static/comments_async/comment_edit_form.jsx
@@ -89,7 +89,7 @@ export default class CommentEditForm extends React.Component {
</button>
<button
- type="submit" value={translated.cancel} className="cancel-button"
+ type="submit" value={translated.cancel} className="btn btn--light cancel-button"
onClick={this.props.handleCancel}
>
{translated.cancel}
diff --git a/adhocracy4/polls/assets/EditPollQuestions.jsx b/adhocracy4/polls/assets/EditPollQuestions.jsx
deleted file mode 100644
index d874d2119..000000000
--- a/adhocracy4/polls/assets/EditPollQuestions.jsx
+++ /dev/null
@@ -1,286 +0,0 @@
-import React, { useState, useRef, useEffect } from 'react'
-import django from 'django'
-import dashboard from '../../../adhocracy4/dashboard/assets/dashboard'
-import update from 'immutability-helper'
-import { EditPollQuestion } from './EditPollQuestion'
-import { EditPollOpenQuestion } from './EditPollOpenQuestion'
-import Alert from '../../static/Alert'
-import PopperMenu from './PopperMenu'
-
-const api = require('adhocracy4').api
-const FlipMove = require('react-flip-move').default
-
-/*
-|--------------------------------------------------------------------------
-| Helper method for local scoped key/identifier
-|--------------------------------------------------------------------------
-*/
-
-let maxLocalKey = 0
-const getNextLocalKey = () => {
- /** Get an artificial key for non-committed items.
- *
- * The key is prefixed to prevent collisions with real database keys.
- */
- return 'local_' + maxLocalKey++
-}
-
-export const EditPollQuestions = (props) => {
- /*
- |--------------------------------------------------------------------------
- | Question state related handlers
- |--------------------------------------------------------------------------
- */
-
- const getNewQuestion = (label = '', helptext = '') => {
- return {
- label,
- help_text: helptext,
- multiple_choice: false,
- key: getNextLocalKey(),
- is_open: false,
- choices: [
- getNewChoice(),
- getNewChoice()
- ],
- answers: []
- }
- }
-
- const getNewOpenQuestion = (label = '') => {
- const newQuestion = getNewQuestion(label)
- newQuestion.is_open = true
- newQuestion.choices = []
- return newQuestion
- }
-
- const updatePopper = () => {
- popper &&
- popper.current &&
- popper.current.instance.update &&
- popper.current.instance.update()
- }
-
- const handleQuestion = (action, params) => {
- let diff = {}
- if (action === 'label') {
- const { index, label } = params
- diff[index] = { $merge: { label } }
- updatePopper()
- } else if (action === 'helptext') {
- const { index, helptext } = params
- diff[index] = { $merge: { help_text: helptext } }
- updatePopper()
- } else if (action === 'multiple-choice') {
- const { index, multipleChoice } = params
- diff[index] = { $merge: { multiple_choice: multipleChoice } }
- } else if (action === 'move') {
- const { index, direction } = params
- const position = direction === 'up' ? (index - 1) : (index + 1)
- diff = { $splice: [[index, 1], [position, 0, questions[index]]] }
- } else if (action === 'append') {
- const newQuestion = params && params.isOpen
- ? getNewOpenQuestion()
- : getNewQuestion()
- diff = { $push: [newQuestion] }
- updatePopper()
- } else if (action === 'delete') {
- const { index } = params
- diff = { $splice: [[index, 1]] }
- updatePopper()
- } else {
- return null
- }
- action && setQuestions(update(questions, diff))
- }
-
- /*
- |--------------------------------------------------------------------------
- | Choice state related handlers
- |--------------------------------------------------------------------------
- */
-
- const getNewChoice = (label = '', isOther = false) => {
- return {
- label,
- key: isOther ? 'other-choice' : getNextLocalKey(),
- is_other_choice: isOther
- }
- }
-
- const handleChoice = (action, params) => {
- const diff = {}
- if (action === 'label') {
- const { index, choiceIndex, label } = params
- diff[index] = { choices: {} }
- diff[index].choices[choiceIndex] = { $merge: { label } }
- } else if (action === 'append') {
- const { index, hasOtherOption } = params
- const position = questions[index].choices.length - 1
- const newChoice = getNewChoice()
- diff[index] = hasOtherOption
- ? { choices: { $splice: [[position, 0, newChoice]] } }
- : { choices: { $push: [newChoice] } }
- } else if (action === 'is-other-choice') {
- const { index, isOtherChoice } = params
- if (isOtherChoice) {
- const otherChoice = getNewChoice('other', true)
- diff[index] = { choices: { $push: [otherChoice] } }
- } else {
- const choiceIndex = questions[index].choices.findIndex(c => c.key === 'other-choice')
- diff[index] = { choices: { $splice: [[choiceIndex, 1]] } }
- }
- } else if (action === 'delete') {
- const { index, choiceIndex } = params
- diff[index] = { choices: { $splice: [[choiceIndex, 1]] } }
- }
- updatePopper()
- action && setQuestions(update(questions, diff))
- }
-
- /*
- |--------------------------------------------------------------------------
- | Poll form and submit logic
- |--------------------------------------------------------------------------
- */
-
- const removeAlert = () => {
- setAlert(null)
- }
-
- const handleSubmit = (e) => {
- e.preventDefault()
-
- const data = {
- questions
- }
-
- api.poll.change(data, props.pollId)
- .done((data) => {
- setQuestions(data.questions)
- setAlert({
- type: 'success',
- message: django.gettext('The poll has been updated.')
- })
- setErrors([])
- if (props.reloadOnSuccess) {
- dashboard.updateDashboard()
- }
- })
- .fail((xhr, status, err) => {
- if (xhr.responseJSON && 'questions' in xhr.responseJSON) {
- setErrors(xhr.responseJSON.questions)
- }
-
- setAlert({
- type: 'danger',
- message: django.gettext('The poll could not be updated.')
- })
- })
- }
-
- /*
- |--------------------------------------------------------------------------
- | Runtime logic and JSX render
- |--------------------------------------------------------------------------
- */
-
- const [questions, setQuestions] = useState([])
- const [errors, setErrors] = useState([])
- const [alert, setAlert] = useState(null)
- const popper = useRef()
-
- const popperMenuContent = {
- popperButton: {
- styleClass: 'btn poll__btn--light',
- buttonText: django.gettext('New question'),
- icon: 'fa fa-plus'
- },
- popperMenuItems: [
- {
- styleClass: 'btn poll__btn--light submenu-item',
- text: django.gettext('Multiple choice question'),
- handleClick: () => handleQuestion('append')
- },
- {
- styleClass: 'btn poll__btn--light submenu-item',
- text: django.gettext('Open question'),
- handleClick: () => handleQuestion('append', { isOpen: true })
- }
- ]
- }
-
- useEffect(() => {
- api.poll.get(props.pollId).done(({ questions }) => {
- questions.length > 0
- ? setQuestions(questions)
- : setQuestions([getNewQuestion()])
- })
- // eslint-disable-next-line react-hooks/exhaustive-deps
- }, [])
-
- return (
- <form
- onSubmit={(e) => handleSubmit(e)} onChange={() => removeAlert()}
- className="editpoll__questions"
- >
- <FlipMove easing="cubic-bezier(0.25, 0.5, 0.75, 1)">
- {
- questions.map((question, index, arr) => {
- const key = question.id || question.key
- return question.is_open
- ? (
- <div key={key}>
- <EditPollOpenQuestion
- id={key}
- question={question}
- onLabelChange={(label) => handleQuestion('label', { index, label })}
- onHelptextChange={(helptext) => handleQuestion('helptext', { index, helptext })}
- onMoveUp={index !== 0 ? () => handleQuestion('move', { index, direction: 'up' }) : null}
- onMoveDown={index < arr.length - 1 ? () => handleQuestion('move', { index, direction: 'down' }) : null}
- onDelete={() => handleQuestion('delete', { index })}
- errors={errors && errors[index] ? errors[index] : {}}
- />
- </div>
- )
- : (
- <div key={key}>
- <EditPollQuestion
- id={key}
- question={question}
- onLabelChange={(label) => handleQuestion('label', { index, label })}
- onHelptextChange={(helptext) => handleQuestion('helptext', { index, helptext })}
- onMultipleChoiceChange={(multipleChoice) => handleQuestion('multiple-choice', { index, multipleChoice })}
- onHasOtherChoiceChange={(isOtherChoice) => handleChoice('is-other-choice', { index, isOtherChoice })}
- onMoveUp={index !== 0 ? () => handleQuestion('move', { index, direction: 'up' }) : null}
- onMoveDown={index < arr.length - 1 ? () => handleQuestion('move', { index, direction: 'down' }) : null}
- onDelete={() => handleQuestion('delete', { index })}
- errors={errors && errors[index] ? errors[index] : {}}
- onChoiceLabelChange={(choiceIndex, label) => handleChoice('label', { index, choiceIndex, label })}
- onDeleteChoice={(choiceIndex) => handleChoice('delete', { index, choiceIndex })}
- onAppendChoice={(hasOtherOption) => handleChoice('append', { index, hasOtherOption })}
- />
- </div>
- )
- })
- }
- </FlipMove>
- <Alert onClick={() => removeAlert()} {...alert} />
- <div className="editpoll__actions-container">
- <div className="editpoll__menu-container">
- <PopperMenu
- ref={popper}
- containerStyleClass="editpoll__menu-container--override"
- >
- {popperMenuContent}
- </PopperMenu>
- </div>
- <div className="editpoll__menu-container">
- <button type="submit" className="btn poll__btn--dark">
- {django.gettext('Save')}
- </button>
- </div>
- </div>
- </form>
- )
-}
diff --git a/adhocracy4/polls/assets/EditPollChoice.jsx b/adhocracy4/polls/assets/PollDashboard/EditPollChoice.jsx
similarity index 96%
rename from adhocracy4/polls/assets/EditPollChoice.jsx
rename to adhocracy4/polls/assets/PollDashboard/EditPollChoice.jsx
index 2ef0dc556..46a57b1d8 100644
--- a/adhocracy4/polls/assets/EditPollChoice.jsx
+++ b/adhocracy4/polls/assets/PollDashboard/EditPollChoice.jsx
@@ -1,6 +1,6 @@
import React from 'react'
import django from 'django'
-import ErrorList from '../../static/ErrorList'
+import ErrorList from '../../../static/ErrorList'
export const EditPollChoice = (props) => {
return (
diff --git a/adhocracy4/polls/assets/PollDashboard/EditPollDropdown.jsx b/adhocracy4/polls/assets/PollDashboard/EditPollDropdown.jsx
new file mode 100644
index 000000000..da462a0e0
--- /dev/null
+++ b/adhocracy4/polls/assets/PollDashboard/EditPollDropdown.jsx
@@ -0,0 +1,45 @@
+import React from 'react'
+import django from 'django'
+
+const translated = {
+ new: django.gettext(' New Question'),
+ multi: django.gettext('Multiple Choice question'),
+ open: django.gettext('Open question')
+}
+
+const EditPollDropdown = (props) => {
+ return (
+ <div className="dropdown editpoll__dropdown">
+ <button
+ type="button"
+ className="dropdown-toggle btn btn--light"
+ aria-haspopup="true"
+ aria-expanded="false"
+ data-bs-toggle="dropdown"
+ >
+ <i className="fa fa-plus" />
+ {translated.new}
+ </button>
+ <div className="dropdown-menu">
+ <button
+ key="1"
+ className="dropdown-item"
+ type="button"
+ onClick={props.handleToggleMulti}
+ >
+ {translated.multi}
+ </button>
+ <button
+ key="2"
+ className="dropdown-item"
+ type="button"
+ onClick={props.handleToggleOpen}
+ >
+ {translated.open}
+ </button>
+ </div>
+ </div>
+ )
+}
+
+export default EditPollDropdown
diff --git a/adhocracy4/polls/assets/PollDashboard/EditPollManagement.jsx b/adhocracy4/polls/assets/PollDashboard/EditPollManagement.jsx
new file mode 100644
index 000000000..ee2e64961
--- /dev/null
+++ b/adhocracy4/polls/assets/PollDashboard/EditPollManagement.jsx
@@ -0,0 +1,260 @@
+import React, { useState, useEffect } from 'react'
+import django from 'django'
+import FlipMove from 'react-flip-move'
+import update from 'immutability-helper'
+
+import { EditPollQuestion } from './EditPollQuestion'
+import { EditPollOpenQuestion } from './EditPollOpenQuestion'
+import EditPollDropdown from './EditPollDropdown'
+
+import dashboard from '../../../../adhocracy4/dashboard/assets/dashboard'
+import api from '../../../static/api'
+import Alert from '../../../static/Alert'
+
+// | Helper method for local scoped key/identifier
+
+let maxLocalKey = 0
+const getNextLocalKey = () => {
+ // Get an artificial key for non-committed items.
+ // The key is prefixed to prevent collisions with real database keys.
+ return 'local_' + maxLocalKey++
+}
+
+export const EditPollManagement = (props) => {
+ const [questions, setQuestions] = useState([])
+ const [errors, setErrors] = useState([])
+ const [alert, setAlert] = useState(null)
+
+ useEffect(() => {
+ api.poll.get(props.pollId).done(({ questions }) => {
+ questions.length > 0
+ ? setQuestions(questions)
+ : setQuestions([getNewQuestion()])
+ })
+ // eslint-disable-next-line react-hooks/exhaustive-deps
+ }, [])
+
+ const getNewQuestion = (label = '', helptext = '') => {
+ return {
+ label,
+ help_text: helptext,
+ multiple_choice: false,
+ key: getNextLocalKey(),
+ is_open: false,
+ choices: [
+ getNewChoice(),
+ getNewChoice()
+ ],
+ answers: []
+ }
+ }
+
+ // | Question state related handlers
+
+ const getNewOpenQuestion = (label = '') => {
+ const newQuestion = getNewQuestion(label)
+ newQuestion.is_open = true
+ newQuestion.choices = []
+ return newQuestion
+ }
+
+ const handleQuestionLabel = (index, label) => {
+ const diff = {}
+ diff[index] = { $merge: { label } }
+ setQuestions(update(questions, diff))
+ }
+
+ const handleQuestionHelpText = (index, helptext) => {
+ const diff = {}
+ diff[index] = { $merge: { help_text: helptext } }
+ setQuestions(update(questions, diff))
+ }
+
+ const handleQuestionMultiChoice = (index, multipleChoice) => {
+ const diff = {}
+ diff[index] = { $merge: { multiple_choice: multipleChoice } }
+ setQuestions(update(questions, diff))
+ }
+
+ const handleQuestionAppend = (params, index) => {
+ let diff = {}
+ const newQuestion = params && params.isOpen
+ ? getNewOpenQuestion()
+ : getNewQuestion()
+ diff = { $push: [newQuestion] }
+ setQuestions(update(questions, diff))
+ }
+
+ const handleQuestionDelete = (index) => {
+ let diff = {}
+ diff = { $splice: [[index, 1]] }
+ setQuestions(update(questions, diff))
+ }
+
+ const handleQuestionMoveUp = (index) => {
+ let diff = {}
+ const position = index - 1
+ diff = {
+ $splice: [
+ [index, 1], // remove from current index
+ [position, 0, questions[index]] // insert to new index
+ ]
+ }
+ setQuestions(update(questions, diff))
+ }
+
+ const handleQuestionMoveDown = (index) => {
+ let diff = {}
+ const position = index + 1
+ diff = { $splice: [[index, 1], [position, 0, questions[index]]] }
+ setQuestions(update(questions, diff))
+ }
+
+ // | Choice state related handlers
+
+ const getNewChoice = (label = '', isOther = false) => {
+ return {
+ label,
+ key: isOther ? 'other-choice' : getNextLocalKey(),
+ is_other_choice: isOther
+ }
+ }
+
+ const handleChoiceLabel = (index, choiceIndex, label) => {
+ const diff = {}
+ diff[index] = { choices: {} }
+ diff[index].choices[choiceIndex] = { $merge: { label } }
+ setQuestions(update(questions, diff))
+ }
+
+ const handleChoiceAppend = (index, hasOtherOption) => {
+ const position = questions[index].choices.length - 1
+ const newChoice = getNewChoice()
+ const diff = {}
+ diff[index] = hasOtherOption
+ ? { choices: { $splice: [[position, 0, newChoice]] } }
+ : { choices: { $push: [newChoice] } }
+ setQuestions(update(questions, diff))
+ }
+
+ const handleChoiceIsOtherChoice = (index, isOtherChoice) => {
+ const diff = {}
+ if (isOtherChoice) {
+ const otherChoice = getNewChoice('other', true)
+ diff[index] = { choices: { $push: [otherChoice] } }
+ } else {
+ const choiceIndex = questions[index].choices.findIndex(c => c.key === 'other-choice')
+ diff[index] = { choices: { $splice: [[choiceIndex, 1]] } }
+ }
+ setQuestions(update(questions, diff))
+ }
+
+ const handleChoiceDelete = (index, choiceIndex) => {
+ const diff = {}
+ diff[index] = { choices: { $splice: [[choiceIndex, 1]] } }
+ setQuestions(update(questions, diff))
+ }
+
+ // | Poll form and submit logic
+
+ const removeAlert = () => {
+ setAlert(null)
+ }
+
+ const handleSubmit = (e) => {
+ e.preventDefault()
+
+ const data = {
+ questions
+ }
+
+ api.poll.change(data, props.pollId)
+ .done((data) => {
+ setQuestions(data.questions)
+ setAlert({
+ type: 'success',
+ message: django.gettext('The poll has been updated.')
+ })
+ setErrors([])
+ if (props.reloadOnSuccess) {
+ dashboard.updateDashboard()
+ }
+ })
+ .fail((xhr, status, err) => {
+ if (xhr.responseJSON && 'questions' in xhr.responseJSON) {
+ setErrors(xhr.responseJSON.questions)
+ }
+
+ setAlert({
+ type: 'danger',
+ message: django.gettext('The poll could not be updated.')
+ })
+ })
+ }
+
+ // | JSX render
+
+ return (
+ <form
+ onSubmit={(e) => handleSubmit(e)} onChange={() => removeAlert()}
+ className="editpoll__questions"
+ >
+ <FlipMove easing="cubic-bezier(0.25, 0.5, 0.75, 1)">
+ {
+ questions.map((question, index, arr) => {
+ const key = question.id || question.key
+ return question.is_open
+ ? (
+ <div key={key}>
+ <EditPollOpenQuestion
+ id={key}
+ question={question}
+ onLabelChange={(label) => handleQuestionLabel(index, label)}
+ onHelptextChange={(helptext) => handleQuestionHelpText(index, helptext)}
+ onMoveUp={index !== 0 ? () => handleQuestionMoveUp(index) : null}
+ onMoveDown={index < arr.length - 1 ? () => handleQuestionMoveDown(index) : null}
+ onDelete={() => handleQuestionDelete(index)}
+ errors={errors && errors[index] ? errors[index] : {}}
+ />
+ </div>
+ )
+ : (
+ <div key={key}>
+ <EditPollQuestion
+ id={key}
+ question={question}
+ onLabelChange={(label) => handleQuestionLabel(index, label)}
+ onHelptextChange={(helptext) => handleQuestionHelpText(index, helptext)}
+ onMultipleChoiceChange={(multipleChoice) => handleQuestionMultiChoice(index, multipleChoice)}
+ onMoveUp={index !== 0 ? () => handleQuestionMoveUp(index) : null}
+ onMoveDown={index < arr.length - 1 ? () => handleQuestionMoveDown(index) : null}
+ onDelete={() => handleQuestionDelete(index)}
+ errors={errors && errors[index] ? errors[index] : {}}
+ onHasOtherChoiceChange={(isOtherChoice) => handleChoiceIsOtherChoice(index, isOtherChoice)}
+ onChoiceLabelChange={(choiceIndex, label) => handleChoiceLabel(index, choiceIndex, label)}
+ onDeleteChoice={(choiceIndex) => handleChoiceDelete(index, choiceIndex)}
+ onAppendChoice={(hasOtherOption) => handleChoiceAppend(index, hasOtherOption)}
+ />
+ </div>
+ )
+ })
+ }
+ </FlipMove>
+ <Alert onClick={() => removeAlert()} {...alert} />
+ <div className="editpoll__actions-container">
+ <div className="editpoll__menu-container">
+ <EditPollDropdown
+ handleToggleMulti={() => handleQuestionAppend()}
+ handleToggleOpen={() => handleQuestionAppend({ isOpen: true })}
+ />
+ </div>
+
+ <div className="editpoll__menu-container">
+ <button type="submit" className="btn poll__btn--dark">
+ {django.gettext('Save')}
+ </button>
+ </div>
+ </div>
+ </form>
+ )
+}
diff --git a/adhocracy4/polls/assets/EditPollOpenQuestion.jsx b/adhocracy4/polls/assets/PollDashboard/EditPollOpenQuestion.jsx
similarity index 98%
rename from adhocracy4/polls/assets/EditPollOpenQuestion.jsx
rename to adhocracy4/polls/assets/PollDashboard/EditPollOpenQuestion.jsx
index 23e63b7f9..e5ecbf016 100644
--- a/adhocracy4/polls/assets/EditPollOpenQuestion.jsx
+++ b/adhocracy4/polls/assets/PollDashboard/EditPollOpenQuestion.jsx
@@ -1,6 +1,6 @@
import React, { useState } from 'react'
import django from 'django'
-import ErrorList from '../../static/ErrorList'
+import ErrorList from '../../../static/ErrorList'
import { HelptextForm } from './HelptextForm'
export const EditPollOpenQuestion = (props) => {
diff --git a/adhocracy4/polls/assets/EditPollQuestion.jsx b/adhocracy4/polls/assets/PollDashboard/EditPollQuestion.jsx
similarity index 99%
rename from adhocracy4/polls/assets/EditPollQuestion.jsx
rename to adhocracy4/polls/assets/PollDashboard/EditPollQuestion.jsx
index 6a7fa71b9..94de1958c 100644
--- a/adhocracy4/polls/assets/EditPollQuestion.jsx
+++ b/adhocracy4/polls/assets/PollDashboard/EditPollQuestion.jsx
@@ -1,7 +1,7 @@
import React, { useState } from 'react'
import { EditPollChoice } from './EditPollChoice'
import django from 'django'
-import ErrorList from '../../static/ErrorList'
+import ErrorList from '../../../static/ErrorList'
import { HelptextForm } from './HelptextForm'
const FlipMove = require('react-flip-move').default
diff --git a/adhocracy4/polls/assets/HelptextForm.jsx b/adhocracy4/polls/assets/PollDashboard/HelptextForm.jsx
similarity index 92%
rename from adhocracy4/polls/assets/HelptextForm.jsx
rename to adhocracy4/polls/assets/PollDashboard/HelptextForm.jsx
index 39cf1bd89..9abedc1a3 100644
--- a/adhocracy4/polls/assets/HelptextForm.jsx
+++ b/adhocracy4/polls/assets/PollDashboard/HelptextForm.jsx
@@ -1,6 +1,6 @@
import React from 'react'
import django from 'django'
-import ErrorList from '../../static/ErrorList'
+import ErrorList from '../../../static/ErrorList'
export const HelptextForm = (props) => {
return (
diff --git a/adhocracy4/polls/assets/CharCounter.jsx b/adhocracy4/polls/assets/PollDetail/CharCounter.jsx
similarity index 100%
rename from adhocracy4/polls/assets/CharCounter.jsx
rename to adhocracy4/polls/assets/PollDetail/CharCounter.jsx
diff --git a/adhocracy4/polls/assets/PollOpenQuestion.jsx b/adhocracy4/polls/assets/PollDetail/PollOpenQuestion.jsx
similarity index 97%
rename from adhocracy4/polls/assets/PollOpenQuestion.jsx
rename to adhocracy4/polls/assets/PollDetail/PollOpenQuestion.jsx
index c5496e273..aee0e7598 100644
--- a/adhocracy4/polls/assets/PollOpenQuestion.jsx
+++ b/adhocracy4/polls/assets/PollDetail/PollOpenQuestion.jsx
@@ -2,8 +2,7 @@ import React, { useState } from 'react'
import { CharCounter } from './CharCounter'
export const PollOpenQuestion = (props) => {
- const questionHelpText = props.question.help_text ? <div className="poll__help-text">{props.question.help_text}</div> : null
- const maxlength = 750
+ // | Function to define state
const getUserOpenAnswer = () => {
const userAnswerId = props.question.userAnswer
@@ -17,6 +16,8 @@ export const PollOpenQuestion = (props) => {
}
const [userAnswer, setUserAnswer] = useState(getUserOpenAnswer())
+ const questionHelpText = props.question.help_text ? <div className="poll__help-text">{props.question.help_text}</div> : null
+ const maxlength = 750
const handleOpenChange = (event) => {
setUserAnswer(event.target.value)
diff --git a/adhocracy4/polls/assets/PollQuestion.jsx b/adhocracy4/polls/assets/PollDetail/PollQuestion.jsx
similarity index 94%
rename from adhocracy4/polls/assets/PollQuestion.jsx
rename to adhocracy4/polls/assets/PollDetail/PollQuestion.jsx
index c9f72431a..134cd2887 100644
--- a/adhocracy4/polls/assets/PollQuestion.jsx
+++ b/adhocracy4/polls/assets/PollDetail/PollQuestion.jsx
@@ -1,9 +1,16 @@
import React, { useEffect, useState } from 'react'
import django from 'django'
import { CharCounter } from './CharCounter'
-import ErrorList from '../../static/ErrorList'
+import ErrorList from '../../../static/ErrorList'
+
+const translated = {
+ multiple: django.gettext('Multiple answers are possible.'),
+ other: django.gettext('other')
+}
export const PollQuestion = (props) => {
+ // | Function to define state
+
const getUserAnswer = () => {
const userAnswerId = props.question.other_choice_user_answer
const userAnswer = props.question.other_choice_answers.find(oc => oc.vote_id === userAnswerId)
@@ -15,11 +22,11 @@ export const PollQuestion = (props) => {
)
}
- const multiHelpText = props.question.multiple_choice ? <div className="poll__help-text">{django.gettext('Multiple answers are possible.')}</div> : null
- const questionHelpText = props.question.help_text ? <div className="poll__help-text">{props.question.help_text}</div> : null
const [userChoices, setUserChoices] = useState(props.question.userChoices)
const [otherChoiceAnswer, setOtherChoiceAnswer] = useState(getUserAnswer())
const [errors, setErrors] = useState()
+ const multiHelpText = props.question.multiple_choice ? <div className="poll__help-text">{translated.multiple}</div> : null
+ const questionHelpText = props.question.help_text ? <div className="poll__help-text">{props.question.help_text}</div> : null
const maxlength = 250
useEffect(() => {
@@ -81,7 +88,7 @@ export const PollQuestion = (props) => {
onChange={(event) => { handleSingleChange(event, choice.is_other_choice) }}
disabled={!props.question.authenticated || props.question.isReadOnly}
/>
- <span className="radio__text">{choice.is_other_choice ? django.gettext('other') : choice.label}</span>
+ <span className="radio__text">{choice.is_other_choice ? translated.other : choice.label}</span>
{choice.is_other_choice &&
<>
<input
@@ -120,7 +127,7 @@ export const PollQuestion = (props) => {
onChange={(event) => { handleMultiChange(event, choice.is_other_choice) }}
disabled={!props.question.authenticated || props.question.isReadOnly}
/>
- <span className="radio__text radio__text--checkbox">{choice.is_other_choice ? django.gettext('other') : choice.label}</span>
+ <span className="radio__text radio__text--checkbox">{choice.is_other_choice ? translated.other : choice.label}</span>
{choice.is_other_choice &&
<>
<input
diff --git a/adhocracy4/polls/assets/PollQuestions.jsx b/adhocracy4/polls/assets/PollDetail/PollQuestions.jsx
similarity index 98%
rename from adhocracy4/polls/assets/PollQuestions.jsx
rename to adhocracy4/polls/assets/PollDetail/PollQuestions.jsx
index 40a2aa32f..7918b4358 100644
--- a/adhocracy4/polls/assets/PollQuestions.jsx
+++ b/adhocracy4/polls/assets/PollDetail/PollQuestions.jsx
@@ -1,13 +1,14 @@
import React from 'react'
+import django from 'django'
+
import { PollQuestion } from './PollQuestion'
import { PollOpenQuestion } from './PollOpenQuestion'
-import Alert from '../../static/Alert'
-import django from 'django'
import PollResults from './PollResults'
-import { TermsOfUseCheckbox } from '../../static/TermsOfUseCheckbox'
-const api = require('adhocracy4').api
-const config = require('adhocracy4').config
+import Alert from '../../../static/Alert'
+import api from '../../../static/api'
+import config from '../../../static/config'
+import { TermsOfUseCheckbox } from '../../../static/TermsOfUseCheckbox'
const ALERT_SUCCESS = {
type: 'success',
diff --git a/adhocracy4/polls/assets/PollResults.jsx b/adhocracy4/polls/assets/PollDetail/PollResults.jsx
similarity index 100%
rename from adhocracy4/polls/assets/PollResults.jsx
rename to adhocracy4/polls/assets/PollDetail/PollResults.jsx
diff --git a/adhocracy4/polls/assets/PopperMenu.jsx b/adhocracy4/polls/assets/PopperMenu.jsx
deleted file mode 100644
index d2054b5d8..000000000
--- a/adhocracy4/polls/assets/PopperMenu.jsx
+++ /dev/null
@@ -1,91 +0,0 @@
-import React, { useState, useEffect, useRef, useImperativeHandle, forwardRef } from 'react'
-import { usePopper } from 'react-popper'
-
-const PopperMenu = (props, ref) => {
- const { children: { popperButton, popperMenuItems, popperConfig } } = props
- const referenceRef = useRef(null)
- const popperRef = useRef(null)
- const [visible, setVisible] = useState(false)
-
- let config = {
- placement: 'bottom-start'
- }
-
- popperConfig &&
- (config = { ...config, ...popperConfig })
-
- const popper = usePopper(
- referenceRef.current,
- popperRef.current,
- {
- ...config
- }
- )
-
- const { styles, attributes } = popper
- const containerStyleClass = props.containerStyleClass
- ? `popper-content--container ${props.containerStyleClass}`
- : 'popper-content--container'
-
- useEffect(() => {
- // listen for clicks and close dropdown on body
- document.addEventListener('mousedown', handleDocumentClick)
- return () => {
- document.removeEventListener('mousedown', handleDocumentClick)
- }
- }, [])
-
- const handleDocumentClick = (event) => {
- (referenceRef.current.contains(event.target) ||
- popperRef.current.contains(event.target)) ||
- setVisible(false)
- }
- const handleDropdownClick = (event) => {
- setVisible(!visible)
- }
-
- const handleClickAction = (menuItem) => {
- setVisible(false)
- menuItem.handleClick()
- popper.update()
- }
-
- useImperativeHandle(ref, () => ({
- instance: popper
- }))
-
- return (
- <>
- <button
- className={popperButton.styleClass ? popperButton.styleClass : ''}
- ref={referenceRef} onClick={handleDropdownClick}
- type="button"
- >
- {popperButton.icon && <i className={popperButton.icon} />} {popperButton.buttonText}
- </button>
- <div ref={popperRef} style={styles.popper} {...attributes.popper}>
- <div
- style={styles.offset}
- className={containerStyleClass}
- data-visible={visible}
- >
- <ul className="popper-container">
- {popperMenuItems.map((menuItem, idx) => (
- <li key={idx}>
- <button
- className={`${menuItem.styleClass ? menuItem.styleClass : ''} popper-menuitem__button`}
- type="button"
- onClick={() => handleClickAction(menuItem)}
- >
- {menuItem.text}
- </button>
- </li>
- ))}
- </ul>
- </div>
- </div>
- </>
- )
-}
-
-export default forwardRef(PopperMenu)
diff --git a/adhocracy4/polls/assets/__tests__/CharCounter.jest.jsx b/adhocracy4/polls/assets/__tests__/CharCounter.jest.jsx
index de2686881..9ecc2f453 100644
--- a/adhocracy4/polls/assets/__tests__/CharCounter.jest.jsx
+++ b/adhocracy4/polls/assets/__tests__/CharCounter.jest.jsx
@@ -3,7 +3,7 @@ import React from 'react'
import { render } from '@testing-library/react'
// component and related data to be tested
-import { CharCounter } from '../CharCounter'
+import { CharCounter } from '../PollDetail/CharCounter'
test('<CharCounter> component renders correctly', () => {
const tree = render(<CharCounter value="random" max={25} />)
diff --git a/adhocracy4/polls/assets/__tests__/EditPollChoice.jest.jsx b/adhocracy4/polls/assets/__tests__/EditPollChoice.jest.jsx
index 8d33cb2a2..35421d3dc 100644
--- a/adhocracy4/polls/assets/__tests__/EditPollChoice.jest.jsx
+++ b/adhocracy4/polls/assets/__tests__/EditPollChoice.jest.jsx
@@ -3,7 +3,7 @@ import React from 'react'
import { render, fireEvent } from '@testing-library/react'
// component and related data to be tested
-import { EditPollChoice } from '../EditPollChoice.jsx'
+import { EditPollChoice } from '../PollDashboard/EditPollChoice.jsx'
const CHOICE_OBJECT = {
id: 1,
diff --git a/adhocracy4/polls/assets/__tests__/EditPollOpenQuestion.jest.jsx b/adhocracy4/polls/assets/__tests__/EditPollOpenQuestion.jest.jsx
index 6e42e664a..819375941 100644
--- a/adhocracy4/polls/assets/__tests__/EditPollOpenQuestion.jest.jsx
+++ b/adhocracy4/polls/assets/__tests__/EditPollOpenQuestion.jest.jsx
@@ -3,7 +3,7 @@ import React from 'react'
import { render, fireEvent } from '@testing-library/react'
// component and related data to be tested
-import { EditPollOpenQuestion } from '../EditPollOpenQuestion.jsx'
+import { EditPollOpenQuestion } from '../PollDashboard/EditPollOpenQuestion.jsx'
import { QUESTION_OBJECT } from './__testdata__/QUESTION_OBJECT'
describe('<EditPollOpenQuestion> with...', () => {
diff --git a/adhocracy4/polls/assets/__tests__/EditPollQuestion.jest.jsx b/adhocracy4/polls/assets/__tests__/EditPollQuestion.jest.jsx
index f031ef938..df565e738 100644
--- a/adhocracy4/polls/assets/__tests__/EditPollQuestion.jest.jsx
+++ b/adhocracy4/polls/assets/__tests__/EditPollQuestion.jest.jsx
@@ -3,7 +3,7 @@ import React from 'react'
import { render, fireEvent } from '@testing-library/react'
// component and related data to be tested
-import { EditPollQuestion } from '../EditPollQuestion.jsx'
+import { EditPollQuestion } from '../PollDashboard/EditPollQuestion.jsx'
import { QUESTION_OBJECT } from './__testdata__/QUESTION_OBJECT'
describe('<EditPollQuestion> with...', () => {
diff --git a/adhocracy4/polls/assets/__tests__/PollQuestion.jest.jsx b/adhocracy4/polls/assets/__tests__/PollQuestion.jest.jsx
index 540fa7538..95bf3abf1 100644
--- a/adhocracy4/polls/assets/__tests__/PollQuestion.jest.jsx
+++ b/adhocracy4/polls/assets/__tests__/PollQuestion.jest.jsx
@@ -3,7 +3,7 @@ import React from 'react'
import { render, fireEvent } from '@testing-library/react'
// component and related data to be tested
-import { PollQuestion } from '../PollQuestion.jsx'
+import { PollQuestion } from '../PollDetail/PollQuestion.jsx'
import { QUESTION_OBJECT } from './__testdata__/QUESTION_OBJECT'
describe('render <PollQuestion> with...', () => {
diff --git a/adhocracy4/polls/assets/react_polls.jsx b/adhocracy4/polls/assets/react_polls.jsx
index 1876b493d..6af4a8fe9 100644
--- a/adhocracy4/polls/assets/react_polls.jsx
+++ b/adhocracy4/polls/assets/react_polls.jsx
@@ -1,8 +1,8 @@
import React from 'react'
import { createRoot } from 'react-dom/client'
-import { EditPollQuestions } from './EditPollQuestions'
-import PollQuestions from './PollQuestions'
+import { EditPollManagement } from './PollDashboard/EditPollManagement'
+import PollQuestions from './PollDetail/PollQuestions'
module.exports.renderPolls = function (element) {
const pollId = element.getAttribute('data-poll-id')
@@ -17,5 +17,5 @@ module.exports.renderPollManagement = function (element) {
const reloadOnSuccess = JSON.parse(element.getAttribute('data-reloadOnSuccess'))
const root = createRoot(element)
- root.render(<EditPollQuestions pollId={pollId} reloadOnSuccess={reloadOnSuccess} />)
+ root.render(<EditPollManagement pollId={pollId} reloadOnSuccess={reloadOnSuccess} />)
}
diff --git a/adhocracy4/polls/models.py b/adhocracy4/polls/models.py
index e88f93d81..fda89f75e 100644
--- a/adhocracy4/polls/models.py
+++ b/adhocracy4/polls/models.py
@@ -21,7 +21,7 @@ def annotate_vote_count(self):
answer_count=models.Count(
'answers__creator_id',
distinct=True),
- )
+ ).order_by('weight')
class ChoiceQuerySet(models.QuerySet):
diff --git a/adhocracy4/static/ErrorList.jsx b/adhocracy4/static/ErrorList.jsx
index 3ecc9e9e8..86614a9e8 100644
--- a/adhocracy4/static/ErrorList.jsx
+++ b/adhocracy4/static/ErrorList.jsx
@@ -3,7 +3,7 @@ import React from 'react'
const ErrorList = ({ errors, field }) => {
if (errors && errors[field]) {
return (
- <ul className="errorlist">
+ <ul className="errorlist" role="alert">
{errors[field].map(function (msg, index) {
return <li key={msg}>{msg}</li>
})}
diff --git a/package.json b/package.json
index 735f3606f..c02375de6 100644
--- a/package.json
+++ b/package.json
@@ -48,8 +48,8 @@
"lint-staged": "13.0.3",
"react": "18.2.0",
"react-dom": "18.2.0",
+ "react-flip-move": "3.0.4",
"react-markdown": "8.0.3",
- "react-popper": "2.3.0",
"react-slick": "0.29.0",
"shpjs": "4.0.4",
"slick-carousel": "git+https://github.com/liqd/slick.git#pm-2019-07-overwrites"
@@ -62,8 +62,8 @@
"leaflet.markercluster": "git+https://github.com/liqd/Leaflet.markercluster#5ed89b26922c51083fc9632a2c01425b9261a0f5",
"react": "18.2.0",
"react-dom": "18.2.0",
+ "react-flip-move": "3.0.4",
"react-markdown": "8.0.3",
- "react-popper": "2.3.0",
"react-slick": "0.29",
"shpjs": "4.0.4",
"slick-carousel": "git+https://github.com/liqd/slick.git#pm-2019-07-overwrites"
|
DataDog__dd-trace-py-906 | Scrolling for the left-side menu on the API docs is broken
Chromium, Ubuntu
Go to http://pypi.datadoghq.com/trace/docs/advanced_usage.html
Scroll up and down
If your browser window is short enough, you'll notice the left-side menu doesn't scroll with the page, leaving some parts inaccessible.
Video: [vokoscreen-2019-04-25_08-21-40.zip](https://github.com/DataDog/dd-trace-py/files/3117626/vokoscreen-2019-04-25_08-21-40.zip)
Since the API docs are generated from this repo, I figured I'd report the issue here.
| [
{
"content": "# -*- coding: utf-8 -*-\n#\n# ddtrace documentation build configuration file, created by\n# sphinx-quickstart on Thu Jul 7 17:25:05 2016.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in th... | [
{
"content": "# -*- coding: utf-8 -*-\n#\n# ddtrace documentation build configuration file, created by\n# sphinx-quickstart on Thu Jul 7 17:25:05 2016.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in th... | diff --git a/docs/conf.py b/docs/conf.py
index 0ed85b47747..5abb255baaf 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -145,7 +145,6 @@
#
html_theme_options = {
'description': 'Datadog\'s Python tracing client',
- 'fixed_sidebar': True,
}
# Add any paths that contain custom themes here, relative to this directory.
|
opendatacube__datacube-core-875 | If DB_PORT is not set, config process sets port to an empty string
I have an existing environment that sets up the datacube connection using this:
```
- DB_HOSTNAME=host.docker.internal
- DB_USERNAME=opendatacube
- DB_PASSWORD=opendatacubepassword
- DB_DATABASE=opendatacube
```
With the new changes that read configuration from environment variables in preference to the config file, the port is now required to be set with `DB_PORT=5432`.
The expected behaviour is that if the port is blank, it falls back to the default Postgres port.
https://github.com/opendatacube/datacube-core/blob/8481d907b198a1c8946326b8b70625a9a8523a12/datacube/config.py#L265
| [
{
"content": "# coding=utf-8\n\"\"\"\nUser configuration.\n\"\"\"\n\nimport os\nfrom pathlib import Path\nimport configparser\nfrom urllib.parse import unquote_plus, urlparse\nfrom typing import Optional, Iterable, Union, Any, Tuple, Dict\n\nPathLike = Union[str, 'os.PathLike[Any]']\n\n\nENVIRONMENT_VARNAME = '... | [
{
"content": "# coding=utf-8\n\"\"\"\nUser configuration.\n\"\"\"\n\nimport os\nfrom pathlib import Path\nimport configparser\nfrom urllib.parse import unquote_plus, urlparse\nfrom typing import Optional, Iterable, Union, Any, Tuple, Dict\n\nPathLike = Union[str, 'os.PathLike[Any]']\n\n\nENVIRONMENT_VARNAME = '... | diff --git a/datacube/config.py b/datacube/config.py
index 1bf4ce6371..fd10ff3c84 100755
--- a/datacube/config.py
+++ b/datacube/config.py
@@ -233,7 +233,7 @@ def parse_env_params() -> Dict[str, str]:
for k in DB_KEYS}
return {k: v
for k, v in params.items()
- if v is not None}
+ if v is not None and v != ""}
def _cfg_from_env_opts(opts: Dict[str, str],
diff --git a/tests/test_config.py b/tests/test_config.py
index 215310b4ea..330a331854 100644
--- a/tests/test_config.py
+++ b/tests/test_config.py
@@ -164,6 +164,16 @@ def check_env(**kw):
username='user',
password='pass@')
+ assert check_env(DB_DATABASE='db',
+ DB_HOSTNAME='host.tld',
+ DB_USERNAME='user',
+ DB_PORT='',
+ DB_PASSWORD='pass@') == dict(
+ database='db',
+ hostname='host.tld',
+ username='user',
+ password='pass@')
+
def test_cfg_from_env(monkeypatch):
def set_env(**kw):
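The one-line fix above (`if v is not None and v != ""`) amounts to treating an empty environment variable the same as an unset one. A standalone sketch of the idea — names mirror the diff, but this is an illustration, not datacube's actual module:

```python
import os

DB_KEYS = ("hostname", "port", "database", "username", "password")

def parse_env_params(env=None):
    """Collect DB_* settings; unset and empty variables are treated alike."""
    if env is None:
        env = dict(os.environ)
    params = {k: env.get("DB_" + k.upper()) for k in DB_KEYS}
    # `v != ""` (in addition to `v is not None`) is the fix: an empty
    # DB_PORT is dropped here, so the connection falls back to the
    # driver's default port instead of an empty string.
    return {k: v for k, v in params.items() if v is not None and v != ""}
```

With `DB_PORT` unset or set to the empty string, the returned mapping simply omits `port`, leaving the default (5432 for Postgres) to the database driver.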
|
openai__gym-2633 | [Bug Report] Empty print version warning
**Describe the bug**
When I import gym, there's an empty line printed.
It's because of this line: https://github.com/openai/gym/blob/master/gym/__init__.py#L30
Either it's a bug, because `notice` shouldn't be an empty string, or the check should be `if notice:`, which is false for both `None` and `""` (the empty string).
Currently it's cluttering the logs at best, or masking some other issue.
**Code example**
```python
import gym
```
**System Info**
Describe the characteristic of your environment:
Latest gym installed from pip, Ubuntu 20.04, Python 3.9.7
### Checklist
- [x] I have checked that there is no similar [issue](https://github.com/openai/gym/issues) in the repo (**required**)
| [
{
"content": "from gym import error\nfrom gym.version import VERSION as __version__\n\nfrom gym.core import (\n Env,\n Wrapper,\n ObservationWrapper,\n ActionWrapper,\n RewardWrapper,\n)\nfrom gym.spaces import Space\nfrom gym.envs import make, spec, register\nfrom gym import logger\nfrom gym imp... | [
{
"content": "from gym import error\nfrom gym.version import VERSION as __version__\n\nfrom gym.core import (\n Env,\n Wrapper,\n ObservationWrapper,\n ActionWrapper,\n RewardWrapper,\n)\nfrom gym.spaces import Space\nfrom gym.envs import make, spec, register\nfrom gym import logger\nfrom gym imp... | diff --git a/gym/__init__.py b/gym/__init__.py
index 71797c7e798..b44d1b419ad 100644
--- a/gym/__init__.py
+++ b/gym/__init__.py
@@ -26,7 +26,7 @@
# print version warning if necessary
notice = notices.notices.get(__version__)
- if notice is not None:
+ if notice:
print(notice, file=sys.stderr)
except Exception: # nosec
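The fix swaps `is not None` for a plain truthiness check. The difference is easy to show in isolation (the `notices` mapping here is hypothetical, not gym's real data):

```python
notices = {
    "0.21.0": "",                       # entry exists but carries no text
    "0.22.0": "Please upgrade to 0.23",
}

def version_notice(version):
    """Return a notice worth printing, or None."""
    notice = notices.get(version)
    # `if notice:` rejects both None (no entry) and "" (empty entry);
    # the buggy `if notice is not None:` let "" through, printing a
    # blank line to stderr on every import.
    return notice if notice else None
```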
|
feast-dev__feast-1585 | Bump fastavro version
**Is your feature request related to a problem? Please describe.**
The version of Fastavro that we're using is quite old and may soon become a liability. It's also causing version conflicts with packages that have already upgraded to the newer (1.x.x) versions.
**Describe the solution you'd like**
Bump Fastavro to 1.x.x
| [
{
"content": "# Copyright 2019 The Feast Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by a... | [
{
"content": "# Copyright 2019 The Feast Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by a... | diff --git a/sdk/python/setup.py b/sdk/python/setup.py
index e2bb02f10d0..2c40d7ec4ac 100644
--- a/sdk/python/setup.py
+++ b/sdk/python/setup.py
@@ -40,7 +40,7 @@
REQUIRED = [
"Click==7.*",
"colorama>=0.3.9",
- "fastavro>=0.22.11,<0.23",
+ "fastavro>=1.1.0",
"google-api-core>=1.23.0",
"googleapis-common-protos==1.52.*",
"grpcio>=1.34.0",
|
apache__airflow-12386 | [ldap] section in configuration is not applicable anymore in 2.0
**Apache Airflow version**: 2.0.0b* / master
**What happened**:
The `[ldap]` section in `airflow.cfg` is no longer applicable in 2.0 and `master`, because LDAP authentication (for the webserver and API) is handled by FAB, and its configuration lives in the `webserver_config.py` file.

**What you expected to happen**:
The `[ldap]` section should be removed from `airflow/config_templates/default_airflow.cfg` and `airflow/config_templates/config.yml` (and some other applicable files).
Otherwise, leaving this section in place will cause considerable confusion for users.
| [
{
"content": "# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n... | [
{
"content": "# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n... | diff --git a/airflow/config_templates/config.yml b/airflow/config_templates/config.yml
index df5a53abe7282..516bb6284e1c0 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -1747,84 +1747,6 @@
type: string
example: ~
default: "False"
-- name: ldap
- description: ~
- options:
- - name: uri
- description: |
- set this to ldaps://<your.ldap.server>:<port>
- version_added: ~
- type: string
- example: ~
- default: ""
- - name: user_filter
- description: ~
- version_added: ~
- type: string
- example: ~
- default: "objectClass=*"
- - name: user_name_attr
- description: ~
- version_added: ~
- type: string
- example: ~
- default: "uid"
- - name: group_member_attr
- description: ~
- version_added: ~
- type: string
- example: ~
- default: "memberOf"
- - name: superuser_filter
- description: ~
- version_added: ~
- type: string
- example: ~
- default: ""
- - name: data_profiler_filter
- description: ~
- version_added: ~
- type: string
- example: ~
- default: ""
- - name: bind_user
- description: ~
- version_added: ~
- type: string
- example: ~
- default: "cn=Manager,dc=example,dc=com"
- - name: bind_password
- description: ~
- version_added: ~
- type: string
- example: ~
- default: "insecure"
- - name: basedn
- description: ~
- version_added: ~
- type: string
- example: ~
- default: "dc=example,dc=com"
- - name: cacert
- description: ~
- version_added: ~
- type: string
- example: ~
- default: "/etc/ca/ldap_ca.crt"
- - name: search_scope
- description: ~
- version_added: ~
- type: string
- example: ~
- default: "LEVEL"
- - name: ignore_malformed_schema
- description: |
- This setting allows the use of LDAP servers that either return a
- broken schema, or do not return a schema.
- version_added: 1.10.3
- type: string
- example: ~
- default: "False"
- name: kerberos
description: ~
options:
diff --git a/airflow/config_templates/default_airflow.cfg b/airflow/config_templates/default_airflow.cfg
index 8a9a6a62b6ceb..cebbfd955489f 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -873,24 +873,6 @@ use_job_schedule = True
# Only has effect if schedule_interval is set to None in DAG
allow_trigger_in_future = False
-[ldap]
-# set this to ldaps://<your.ldap.server>:<port>
-uri =
-user_filter = objectClass=*
-user_name_attr = uid
-group_member_attr = memberOf
-superuser_filter =
-data_profiler_filter =
-bind_user = cn=Manager,dc=example,dc=com
-bind_password = insecure
-basedn = dc=example,dc=com
-cacert = /etc/ca/ldap_ca.crt
-search_scope = LEVEL
-
-# This setting allows the use of LDAP servers that either return a
-# broken schema, or do not return a schema.
-ignore_malformed_schema = False
-
[kerberos]
ccache = /tmp/airflow_krb5_ccache
diff --git a/airflow/configuration.py b/airflow/configuration.py
index 92790d1fb763b..338526b06c62d 100644
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -129,7 +129,6 @@ class AirflowConfigParser(ConfigParser): # pylint: disable=too-many-ancestors
('celery', 'result_backend'),
('atlas', 'password'),
('smtp', 'smtp_password'),
- ('ldap', 'bind_password'),
('kubernetes', 'git_password'),
}
diff --git a/docs/howto/set-config.rst b/docs/howto/set-config.rst
index 090a6f9352d44..3ba7d9fd2ab02 100644
--- a/docs/howto/set-config.rst
+++ b/docs/howto/set-config.rst
@@ -69,7 +69,6 @@ The following config options support this ``_cmd`` and ``_secret`` version:
* ``result_backend`` in ``[celery]`` section
* ``password`` in ``[atlas]`` section
* ``smtp_password`` in ``[smtp]`` section
-* ``bind_password`` in ``[ldap]`` section
* ``git_password`` in ``[kubernetes]`` section
The ``_cmd`` config options can also be set using a corresponding environment variable
diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt
index 0a1571a59c78f..eaf8c2e15d60c 100644
--- a/docs/spelling_wordlist.txt
+++ b/docs/spelling_wordlist.txt
@@ -478,7 +478,6 @@ backticks
balancer
balancers
baseOperator
-basedn
basestring
basetaskrunner
bashrc
diff --git a/tests/core/test_config_templates.py b/tests/core/test_config_templates.py
index 9c09c318c678c..42ba99133028a 100644
--- a/tests/core/test_config_templates.py
+++ b/tests/core/test_config_templates.py
@@ -45,7 +45,6 @@
'celery_broker_transport_options',
'dask',
'scheduler',
- 'ldap',
'kerberos',
'github_enterprise',
'admin',
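The test-template change above just removes `'ldap'` from the list of expected sections. The same invariant can be checked directly with the standard library (a sketch, not Airflow's actual test suite):

```python
import configparser

RETIRED_SECTIONS = {"ldap"}

def retired_sections_present(cfg_text):
    """Return retired section names still present in a rendered airflow.cfg."""
    parser = configparser.ConfigParser()
    parser.read_string(cfg_text)
    return sorted(RETIRED_SECTIONS.intersection(parser.sections()))
```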
|
archlinux__archinstall-555 | Version Bump in conf.py?
https://github.com/archlinux/archinstall/blob/a4033a7d3a94916f2b4972d212f9d0069fca39cd/docs/conf.py#L44
| [
{
"content": "import os\nimport re\nimport sys\n\nsys.path.insert(0, os.path.abspath('..'))\n\n\ndef process_docstring(app, what, name, obj, options, lines):\n\tspaces_pat = re.compile(r\"( {8})\")\n\tll = []\n\tfor line in lines:\n\t\tll.append(spaces_pat.sub(\" \", line))\n\tlines[:] = ll\n\n\ndef setup(ap... | [
{
"content": "import os\nimport re\nimport sys\n\nsys.path.insert(0, os.path.abspath('..'))\n\n\ndef process_docstring(app, what, name, obj, options, lines):\n\tspaces_pat = re.compile(r\"( {8})\")\n\tll = []\n\tfor line in lines:\n\t\tll.append(spaces_pat.sub(\" \", line))\n\tlines[:] = ll\n\n\ndef setup(ap... | diff --git a/docs/conf.py b/docs/conf.py
index 375ff434de..add1c5e749 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -41,7 +41,7 @@ def setup(app):
author = 'Anton Hvornum'
# The full version, including alpha/beta/rc tags
-release = 'v2.1.0'
+release = 'v2.3.0.dev0'
# -- General configuration ---------------------------------------------------
|
comic__grand-challenge.org-758 | grandchallenge.cases.tasks.build_images should use a separate queue
This process can take a long time
| [
{
"content": "# Django settings for comic project.\nimport glob\nimport os\nimport re\nimport uuid\nfrom datetime import timedelta\nfrom distutils.util import strtobool as strtobool_i\n\nfrom django.contrib.messages import constants as messages\nfrom django.core.exceptions import ImproperlyConfigured\n\nfrom co... | [
{
"content": "# Django settings for comic project.\nimport glob\nimport os\nimport re\nimport uuid\nfrom datetime import timedelta\nfrom distutils.util import strtobool as strtobool_i\n\nfrom django.contrib.messages import constants as messages\nfrom django.core.exceptions import ImproperlyConfigured\n\nfrom co... | diff --git a/app/config/settings.py b/app/config/settings.py
index 85a8a2cd6b..f2b24552f2 100644
--- a/app/config/settings.py
+++ b/app/config/settings.py
@@ -601,7 +601,8 @@ def strtobool(val) -> bool:
}
CELERY_TASK_ROUTES = {
- "grandchallenge.container_exec.tasks.execute_job": "evaluation"
+ "grandchallenge.container_exec.tasks.execute_job": "evaluation",
+ "grandchallenge.cases.tasks.build_images": "images",
}
# Set which template pack to use for forms
diff --git a/docker-compose.yml b/docker-compose.yml
index 5a2e841960..f722e77cbd 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -160,7 +160,7 @@ services:
<<: *protected_storage_credentials
<<: *protected_storage_connections
restart: always
- command: "celery -A config worker -l info -Q evaluation -c 1"
+ command: "celery -A config worker -l info -Q evaluation,images -c 1"
scale: 1
depends_on:
web:
|
weecology__retriever-950 | Check MySQL and Postgres credential files
In addition to allowing users to directly provide their MySQL and PostgreSQL credentials, it should also be possible for them to store these credentials in the usual places.
We should check information given by the user to the retriever first, and then fall back on the configuration files for usernames and passwords if they are not provided.
For PostgreSQL this is `~/.pgpass` with the format:
```
hostname:port:database:username:password
```
See: https://wiki.postgresql.org/wiki/Pgpass. `*`s can be used in place of any of the `:` separated values.
For MySQL this is `~/.my.cnf` with the format:
```
[client]
user = root
password = yourpassword
```
See: https://dev.mysql.com/doc/refman/5.5/en/option-files.html. `.my.cnf` can contain a lot of additional configuration information so we'll need to look explicitly for `user =` and `password =`.
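The `*` wildcard matching described for `~/.pgpass` can be sketched as follows. This is only an illustration of the fallback lookup, not the retriever's actual code: the function name and the line-based interface are made up for the example, and backslash-escaped `:` characters (which the real format allows) are ignored.

```python
def pgpass_lookup(lines, host, port, database, user):
    """Return the password from ~/.pgpass-style lines, honoring '*' wildcards.

    Each line has the form hostname:port:database:username:password; a '*'
    in any of the first four fields matches anything. (The real format also
    allows backslash-escaped ':' characters, which this sketch skips.)
    """
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        fields = line.split(":")
        if len(fields) != 5:
            continue
        pats, password = fields[:4], fields[4]
        wanted = (host, str(port), database, user)
        if all(pat in ("*", val) for pat, val in zip(pats, wanted)):
            return password
    return None  # no match: fall back to prompting the user
```

With credentials given directly to the retriever taking precedence, this lookup would only run when a value is missing.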
| [
{
"content": "from __future__ import print_function\nfrom builtins import str\nimport os\nfrom retriever.lib.models import Engine, no_cleanup\nfrom retriever import ENCODING\n\n\nclass engine(Engine):\n \"\"\"Engine instance for MySQL.\"\"\"\n name = \"MySQL\"\n abbreviation = \"mysql\"\n datatypes ... | [
{
"content": "from __future__ import print_function\nfrom builtins import str\nimport os\nfrom retriever.lib.models import Engine, no_cleanup\nfrom retriever import ENCODING\n\n\nclass engine(Engine):\n \"\"\"Engine instance for MySQL.\"\"\"\n name = \"MySQL\"\n abbreviation = \"mysql\"\n datatypes ... | diff --git a/docs/introduction.rst b/docs/introduction.rst
index 3458a4577..57f8c7fbf 100644
--- a/docs/introduction.rst
+++ b/docs/introduction.rst
@@ -241,6 +241,39 @@ The ``citation`` command show the citation for the retriever and for the scripts
**To create new, edit, delete scripts please read the documentation on scripts**
+
+Storing database connection details
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The retriever reads from the standard configuration files for the database
+management systems. If you want to store connection details they should be
+stored in those files. Make sure to secure these files appropriately.
+
+For postgreSQL, create or modify `~/.pgpass`. This is a file named `.pgpass`
+located in the users home directory. It should take the general form:
+
+``hostname:port:database:username:password``
+
+where each word is replaced with the correct information for your database
+connection or replaced with an ``*`` to apply to all values for that section.
+
+For MySQL, create or modify `~/.my.cnf`. This is a file named `.my.cnf` located
+in the users home directory. The relevant portion of this file for the retriever
+is the `client` section which should take the general form:
+
+::
+
+ [client]
+ host=hostname
+ port=port
+ user=username
+ password=password
+
+where each word to the right of the `=` is replaced with the correct information
+for your database connection. Remove or comment out the lines for any values you
+don't want to set.
+
+
Acknowledgments
~~~~~~~~~~~~~~~
diff --git a/retriever/engines/mysql.py b/retriever/engines/mysql.py
index a6c0db7eb..dbaf3a87b 100644
--- a/retriever/engines/mysql.py
+++ b/retriever/engines/mysql.py
@@ -116,4 +116,4 @@ def get_connection(self):
import pymysql.constants.CLIENT as client
args['client_flag'] = client.LOCAL_FILES
self.get_input()
- return dbapi.connect(**args)
+ return dbapi.connect(read_default_file='~/.my.cnf', **args)
|
deepchecks__deepchecks-1291 | [BUG] Tables have no style rendered on `check_result.save_as_html()`
The tables do not look good when exporting a single check as HTML.

```
from deepchecks.tabular.datasets.regression import avocado
from deepchecks.tabular.checks import TrainTestFeatureDrift
train, test = avocado.load_data(as_train_test=True)
result = TrainTestFeatureDrift().add_condition_drift_score_not_greater_than().run(train, test)
result.save_as_html()
```
| [
{
"content": "# ----------------------------------------------------------------------------\n# Copyright (C) 2021-2022 Deepchecks (https://www.deepchecks.com)\n#\n# This file is part of Deepchecks.\n# Deepchecks is distributed under the terms of the GNU Affero General\n# Public License (version 3 or later).\n#... | [
{
"content": "# ----------------------------------------------------------------------------\n# Copyright (C) 2021-2022 Deepchecks (https://www.deepchecks.com)\n#\n# This file is part of Deepchecks.\n# Deepchecks is distributed under the terms of the GNU Affero General\n# Public License (version 3 or later).\n#... | diff --git a/deepchecks/core/check_result.py b/deepchecks/core/check_result.py
index d9d26d9290..7426865571 100644
--- a/deepchecks/core/check_result.py
+++ b/deepchecks/core/check_result.py
@@ -130,6 +130,7 @@ def display_check(self, unique_id: str = None, as_widget: bool = False,
"""
if as_widget:
box = widgets.VBox()
+ box.add_class('rendered_html')
box_children = []
check_html = ''
if unique_id:
|
ray-project__ray-1471 | Travis test failures in test_catalog.py.
The Travis builds all seem to be failing in `test_catalog.py`.
I can reproduce some failures locally with `gym` version `0.9.5`.
Gym pushed a new version today, so that may be the issue https://pypi.python.org/pypi/gym.
For example,
```
$ python -m pytest python/ray/rllib/test/test_catalog.py
============================= test session starts ==============================
platform linux2 -- Python 2.7.14, pytest-3.3.2, py-1.5.2, pluggy-0.6.0
rootdir: /home/travis/build/robertnishihara/ray-private-travis/python, inifile:

collecting 0 items
collecting 5 items
collecting 5 items
collected 5 items

python/ray/rllib/test/test_catalog.py ...FF                              [100%]

=================================== FAILURES ===================================
____________________ ModelCatalogTest.testGymPreprocessors _____________________

self = <ray.rllib.test.test_catalog.ModelCatalogTest testMethod=testGymPreprocessors>

    def testGymPreprocessors(self):
        p1 = ModelCatalog.get_preprocessor(
            get_registry(), gym.make("CartPole-v0"))
        self.assertEqual(type(p1), NoPreprocessor)

        p2 = ModelCatalog.get_preprocessor(
>           get_registry(), gym.make("FrozenLake-v0"))

python/ray/rllib/test/test_catalog.py:41:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
python/ray/rllib/models/catalog.py:215: in get_preprocessor
    return preprocessor(env.observation_space, options)
python/ray/rllib/models/preprocessors.py:23: in __init__
    self._init()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <ray.rllib.models.preprocessors.OneHotPreprocessor object at 0x7fad2df67dd0>

    def _init(self):
>       assert self._obs_space.shape == ()
E       AssertionError

python/ray/rllib/models/preprocessors.py:81: AssertionError
----------------------------- Captured stdout call -----------------------------
Observation shape is (4,)
Not using any observation preprocessor.
Observation shape is (16,)
Using one-hot preprocessor for discrete envs.
----------------------------- Captured stderr call -----------------------------
[2018-01-25 07:26:43,537] Making new env: CartPole-v0
[2018-01-25 07:26:43,540] Making new env: FrozenLake-v0
------------------------------ Captured log call -------------------------------
registration.py 120 INFO Making new env: CartPole-v0
registration.py 120 INFO Making new env: FrozenLake-v0
____________________ ModelCatalogTest.testTuplePreprocessor ____________________

self = <ray.rllib.test.test_catalog.ModelCatalogTest testMethod=testTuplePreprocessor>

    def testTuplePreprocessor(self):
        ray.init()

        class TupleEnv(object):
            def __init__(self):
                self.observation_space = Tuple(
                    [Discrete(5), Box(0, 1, shape=(3,))])
        p1 = ModelCatalog.get_preprocessor(
>           get_registry(), TupleEnv())

python/ray/rllib/test/test_catalog.py:52:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
python/ray/rllib/models/catalog.py:215: in get_preprocessor
    return preprocessor(env.observation_space, options)
python/ray/rllib/models/preprocessors.py:23: in __init__
    self._init()
python/ray/rllib/models/preprocessors.py:112: in _init
    preprocessor = get_preprocessor(space)(space, self._options)
python/ray/rllib/models/preprocessors.py:23: in __init__
    self._init()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <ray.rllib.models.preprocessors.OneHotPreprocessor object at 0x7fad4ff234d0>

    def _init(self):
>       assert self._obs_space.shape == ()
E       AssertionError

python/ray/rllib/models/preprocessors.py:81: AssertionError
----------------------------- Captured stdout call -----------------------------
Waiting for redis server at 127.0.0.1:44545 to respond...
Waiting for redis server at 127.0.0.1:60007 to respond...
Starting local scheduler with the following resources: {'GPU': 0, 'CPU': 2}.
Failed to start the UI, you may need to run 'pip install jupyter'.
Observation shape is ((5,), (3,))
Using a TupleFlatteningPreprocessor
Creating sub-preprocessor for Discrete(5)
Observation shape is (5,)
Using one-hot preprocessor for discrete envs.
----------------------------- Captured stderr call -----------------------------
Allowing the Plasma store to use up to 3.13728GB of memory.
Starting object store with directory /dev/shm and huge page support disabled
Disconnecting client on fd 22
[INFO] (/home/travis/build/robertnishihara/ray-private-travis/src/local_scheduler/local_scheduler.cc:171) Killed worker pid 14098 which hadn't started yet.
[INFO] (/home/travis/build/robertnishihara/ray-private-travis/src/local_scheduler/local_scheduler.cc:171) Killed worker pid 14099 which hadn't started yet.
Disconnecting client on fd 20
Disconnecting client on fd 18
====================== 2 failed, 3 passed in 7.09 seconds ======================
travis_time:end:224e60d5:start=1516865197573618638,finish=1516865205120814512,duration=7547195874
The command "python -m pytest python/ray/rllib/test/test_catalog.py" exited with 1.
```
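The assertion fails because this gym release reports a non-empty `shape` for `Discrete` spaces (FrozenLake prints `(16,)` above). A shape-agnostic one-hot encoder only needs the space's `n`; here is a minimal sketch, with a stand-in `Discrete` class replacing `gym.spaces.Discrete` so the example is self-contained:

```python
class Discrete:
    """Minimal stand-in for gym.spaces.Discrete, just for this example."""
    def __init__(self, n):
        self.n = n


class OneHotPreprocessor:
    """One-hot encode a Discrete observation without asserting on .shape."""
    def __init__(self, obs_space):
        # Rely on .n, which is stable across gym versions, instead of
        # asserting obs_space.shape == ().
        self.shape = (obs_space.n,)

    def transform(self, observation):
        encoded = [0.0] * self.shape[0]
        encoded[observation] = 1.0
        return encoded
```

For example, `OneHotPreprocessor(Discrete(4)).transform(2)` yields `[0.0, 0.0, 1.0, 0.0]` regardless of what the space's `shape` attribute claims.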
| [
{
"content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nimport cv2\nimport numpy as np\nimport gym\n\nATARI_OBS_SHAPE = (210, 160, 3)\nATARI_RAM_OBS_SHAPE = (128,)\n\n\nclass Preprocessor(object):\n \"\"\"Defines an abstract observation pr... | [
{
"content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nimport cv2\nimport numpy as np\nimport gym\n\nATARI_OBS_SHAPE = (210, 160, 3)\nATARI_RAM_OBS_SHAPE = (128,)\n\n\nclass Preprocessor(object):\n \"\"\"Defines an abstract observation pr... | diff --git a/python/ray/rllib/models/preprocessors.py b/python/ray/rllib/models/preprocessors.py
index 01e9db16fbc3..f6ec1fa5e071 100644
--- a/python/ray/rllib/models/preprocessors.py
+++ b/python/ray/rllib/models/preprocessors.py
@@ -78,7 +78,6 @@ def transform(self, observation):
class OneHotPreprocessor(Preprocessor):
def _init(self):
- assert self._obs_space.shape == ()
self.shape = (self._obs_space.n,)
def transform(self, observation):
|
CTPUG__wafer-643 | Add support for Django 4.0
Currently failing tests (See #632)
| [
{
"content": "from glob import glob\nimport subprocess\n\nfrom setuptools import find_packages, setup\n\nREQUIRES = [\n 'Django>=2.2,<4',\n 'bleach',\n 'bleach-allowlist',\n 'diff-match-patch',\n 'django-bakery>=0.12.0',\n 'django-crispy-forms',\n 'django-markitup>=4.0.0',\n 'django-regi... | [
{
"content": "from glob import glob\nimport subprocess\n\nfrom setuptools import find_packages, setup\n\nREQUIRES = [\n 'Django>=2.2,<4',\n 'bleach',\n 'bleach-allowlist',\n 'diff-match-patch',\n 'django-bakery>=0.13.0',\n 'django-crispy-forms',\n 'django-markitup>=4.0.0',\n 'django-regi... | diff --git a/setup.py b/setup.py
index 87913769..223b9c61 100644
--- a/setup.py
+++ b/setup.py
@@ -8,7 +8,7 @@
'bleach',
'bleach-allowlist',
'diff-match-patch',
- 'django-bakery>=0.12.0',
+ 'django-bakery>=0.13.0',
'django-crispy-forms',
'django-markitup>=4.0.0',
'django-registration-redux',
|
Nitrate__Nitrate-564 | Remove Django 2.0
Django 2.0 is not supported and marked as insecure. Refer to https://docs.djangoproject.com/en/2.0/
| [
{
"content": "# -*- coding: utf-8 -*-\n\nfrom setuptools import setup, find_packages\n\n\nwith open('VERSION.txt', 'r') as f:\n pkg_version = f.read().strip()\n\n\ndef get_long_description():\n with open('README.rst', 'r') as f:\n return f.read()\n\n\ninstall_requires = [\n 'beautifulsoup4 >= 4.... | [
{
"content": "# -*- coding: utf-8 -*-\n\nfrom setuptools import setup, find_packages\n\n\nwith open('VERSION.txt', 'r') as f:\n pkg_version = f.read().strip()\n\n\ndef get_long_description():\n with open('README.rst', 'r') as f:\n return f.read()\n\n\ninstall_requires = [\n 'beautifulsoup4 >= 4.... | diff --git a/.travis.yml b/.travis.yml
index 18075323..2e06adea 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -4,11 +4,6 @@ services:
- docker
matrix:
include:
- # Django 2.0.x with Python 3.6 and 3.7
- - python: "3.6"
- env: DJANGO_REL="django>=2.0,<2.1"
- - python: "3.7"
- env: DJANGO_REL="django>=2.0,<2.1"
# Django 2.1.x with Python 3.6 and 3.7
# Also, run tests against Django 2.1.x, Python 3.6 and several database backends
- python: "3.6"
diff --git a/setup.py b/setup.py
index f0b2be6b..566a6623 100644
--- a/setup.py
+++ b/setup.py
@@ -14,7 +14,7 @@ def get_long_description():
install_requires = [
'beautifulsoup4 >= 4.1.1',
- 'django >= 2.0,<3.0',
+ 'django >= 2.1,<3.0',
'django-contrib-comments == 1.9.1',
'django-tinymce == 2.7.0',
'django-uuslug == 1.1.8',
|
pymodbus-dev__pymodbus-2065 | ModbusException 0x07 is missing in pdu.py
In pdu.py, the ModbusException NegativeAcknowledge is missing. Is it possible to add `NegativeAcknowledge = 0x07`?
```python
class ModbusExceptions:
    IllegalFunction = 0x01
    IllegalAddress = 0x02
    IllegalValue = 0x03
    SlaveFailure = 0x04
    Acknowledge = 0x05
    SlaveBusy = 0x06
    MemoryParityError = 0x08
    GatewayPathUnavailable = 0x0A
    GatewayNoResponse = 0x0B
```
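A sketch of the class with the missing code added. The `decode` helper is not part of pymodbus; it is a hypothetical addition here only to show that the codes round-trip to their names.

```python
class ModbusExceptions:
    """Modbus exception codes, including the missing NegativeAcknowledge."""
    IllegalFunction = 0x01
    IllegalAddress = 0x02
    IllegalValue = 0x03
    SlaveFailure = 0x04
    Acknowledge = 0x05
    SlaveBusy = 0x06
    NegativeAcknowledge = 0x07  # the code this issue asks to add
    MemoryParityError = 0x08
    GatewayPathUnavailable = 0x0A
    GatewayNoResponse = 0x0B

    @classmethod
    def decode(cls, code):
        """Map a numeric exception code back to its name, or None."""
        for name, value in vars(cls).items():
            if isinstance(value, int) and value == code:
                return name
        return None
```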
| [
{
"content": "\"\"\"Contains base classes for modbus request/response/error packets.\"\"\"\n\n__all__ = [\n \"ModbusRequest\",\n \"ModbusResponse\",\n \"ModbusExceptions\",\n \"ExceptionResponse\",\n \"IllegalFunctionRequest\",\n]\n\n# pylint: disable=missing-type-doc\nimport struct\n\nfrom pymod... | [
{
"content": "\"\"\"Contains base classes for modbus request/response/error packets.\"\"\"\n\n__all__ = [\n \"ModbusRequest\",\n \"ModbusResponse\",\n \"ModbusExceptions\",\n \"ExceptionResponse\",\n \"IllegalFunctionRequest\",\n]\n\n# pylint: disable=missing-type-doc\nimport struct\n\nfrom pymod... | diff --git a/pymodbus/pdu.py b/pymodbus/pdu.py
index d73b841cd..64c48b1e1 100644
--- a/pymodbus/pdu.py
+++ b/pymodbus/pdu.py
@@ -164,6 +164,7 @@ class ModbusExceptions: # pylint: disable=too-few-public-methods
SlaveFailure = 0x04
Acknowledge = 0x05
SlaveBusy = 0x06
+ NegativeAcknowledge = 0x07
MemoryParityError = 0x08
GatewayPathUnavailable = 0x0A
GatewayNoResponse = 0x0B
|
wandb__wandb-424 | Install issue on DLAMI images, conflict with PyYAML
wandb has a dependency conflict when installing on AWS Deep Learning images -- DLAMI v23
You can get around it with `pip install wandb --ignore-installed`, but perhaps wandb could also relax the PyYAML version requirement to make life easier (i.e., I can't put wandb in requirements.txt because of this).
```
(pytorch_p36) ubuntu@ip-172-31-28-233:~$ pip install wandb
Collecting wandb
Using cached https://files.pythonhosted.org/packages/6a/d1/af8371f39d9383f4f1e9ba76c8894f75c01d5eddf4ec57bd45952fefab74/wandb-0.8.3-py2.py3-none-any.whl
Collecting watchdog>=0.8.3 (from wandb)
Requirement already satisfied: psutil>=5.0.0 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from wandb) (5.4.5)
Collecting backports.tempfile>=1.0 (from wandb)
Using cached https://files.pythonhosted.org/packages/b4/5c/077f910632476281428fe254807952eb47ca78e720d059a46178c541e669/backports.tempfile-1.0-py2.py3-none-any.whl
Requirement already satisfied: requests>=2.0.0 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from wandb) (2.20.0)
Requirement already satisfied: sentry-sdk>=0.4.0 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from wandb) (0.9.5)
Requirement already satisfied: six>=1.10.0 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from wandb) (1.11.0)
Collecting shortuuid>=0.5.0 (from wandb)
Collecting gql>=0.1.0 (from wandb)
Requirement already satisfied: subprocess32>=3.5.3 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from wandb) (3.5.4)
Collecting GitPython>=1.0.0 (from wandb)
Using cached https://files.pythonhosted.org/packages/fe/e5/fafe827507644c32d6dc553a1c435cdf882e0c28918a5bab29f7fbebfb70/GitPython-2.1.11-py2.py3-none-any.whl
Requirement already satisfied: docker-pycreds>=0.4.0 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from wandb) (0.4.0)
Requirement already satisfied: nvidia-ml-py3>=7.352.0 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from wandb) (7.352.0)
Requirement already satisfied: Click>=7.0 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from wandb) (7.0)
Requirement already satisfied: python-dateutil>=2.6.1 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from wandb) (2.7.3)
Collecting PyYAML>=4.2b4 (from wandb)
Requirement already satisfied: argh>=0.24.1 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from watchdog>=0.8.3->wandb) (0.26.2)
Collecting pathtools>=0.1.1 (from watchdog>=0.8.3->wandb)
Collecting backports.weakref (from backports.tempfile>=1.0->wandb)
Using cached https://files.pythonhosted.org/packages/88/ec/f598b633c3d5ffe267aaada57d961c94fdfa183c5c3ebda2b6d151943db6/backports.weakref-1.0.post1-py2.py3-none-any.whl
Requirement already satisfied: urllib3<1.25,>=1.21.1 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from requests>=2.0.0->wandb) (1.23)
Requirement already satisfied: certifi>=2017.4.17 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from requests>=2.0.0->wandb) (2019.3.9)
Requirement already satisfied: idna<2.8,>=2.5 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from requests>=2.0.0->wandb) (2.6)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in ./anaconda3/envs/pytorch_p36/lib/python3.6/site-packages (from requests>=2.0.0->wandb) (3.0.4)
Collecting graphql-core>=0.5.0 (from gql>=0.1.0->wandb)
Using cached https://files.pythonhosted.org/packages/f1/88/a4a7bf8ab66c35b146e44d77a1f9fd2c36e0ec9fb1a51581608c16deb6e3/graphql_core-2.2-py2.py3-none-any.whl
Collecting promise>=0.4.0 (from gql>=0.1.0->wandb)
Collecting gitdb2>=2.0.0 (from GitPython>=1.0.0->wandb)
Using cached https://files.pythonhosted.org/packages/da/30/a407568aa8d8f25db817cf50121a958722f3fc5f87e3a6fba1f40c0633e3/gitdb2-2.0.5-py2.py3-none-any.whl
Collecting rx>=1.6.0 (from graphql-core>=0.5.0->gql>=0.1.0->wandb)
Using cached https://files.pythonhosted.org/packages/33/0f/5ef4ac78e2a538cc1b054eb86285fe0bf7a5dbaeaac2c584757c300515e2/Rx-1.6.1-py2.py3-none-any.whl
Collecting smmap2>=2.0.0 (from gitdb2>=2.0.0->GitPython>=1.0.0->wandb)
Using cached https://files.pythonhosted.org/packages/55/d2/866d45e3a121ee15a1dc013824d58072fd5c7799c9c34d01378eb262ca8f/smmap2-2.0.5-py2.py3-none-any.whl
thinc 6.12.1 has requirement msgpack<0.6.0,>=0.5.6, but you'll have msgpack 0.6.0 which is incompatible.
tensorflow 1.13.1 has requirement protobuf>=3.6.1, but you'll have protobuf 3.5.2 which is incompatible.
tensorboard 1.13.1 has requirement protobuf>=3.6.0, but you'll have protobuf 3.5.2 which is incompatible.
docker-compose 1.24.0 has requirement PyYAML<4.3,>=3.10, but you'll have pyyaml 5.1.1 which is incompatible.
Installing collected packages: PyYAML, pathtools, watchdog, backports.weakref, backports.tempfile, shortuuid, rx, promise, graphql-core, gql, smmap2, gitdb2, GitPython, wandb
Found existing installation: PyYAML 3.12
Cannot uninstall 'PyYAML'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
You are using pip version 10.0.1, however version 19.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
(pytorch_p36) ubuntu@ip-172-31-28-233:~$ echo $?
```
| [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\nfrom setuptools import setup\n\nwith open('README.md') as readme_file:\n readme = readme_file.read()\n\nrequirements = [\n 'backports.tempfile>=1.0',\n 'Click>=7.0',\n 'GitPython>=1.0.0',\n 'gql>=0.1.0',\n 'nvidia-ml-py3>=7.352.0'... | [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\nfrom setuptools import setup\n\nwith open('README.md') as readme_file:\n readme = readme_file.read()\n\nrequirements = [\n 'backports.tempfile>=1.0',\n 'Click>=7.0',\n 'GitPython>=1.0.0',\n 'gql>=0.1.0',\n 'nvidia-ml-py3>=7.352.0'... | diff --git a/setup.py b/setup.py
index 4f157098539..9613dc19495 100644
--- a/setup.py
+++ b/setup.py
@@ -17,7 +17,6 @@
'shortuuid>=0.5.0',
'six>=1.10.0',
'watchdog>=0.8.3',
- 'PyYAML>=4.2b4', # watchdog depends on pyyaml but doesnt specify safe version
'psutil>=5.0.0',
'sentry-sdk>=0.4.0',
'subprocess32>=3.5.3',
|
ceph__ceph-ansible-3614 | Python3 seems to break TASK [ceph-mon : create monitor initial keyring]
<!-- **Are you in the right place?**
1. For issues or feature requests, please create an issue in this repository.
2. Did you already search the existing open issues for anything similar? -->
**Bug Report**
# What happened:
Using stable-3.2 to control Fedora ARM 29 nodes, when I use Python3 on those ARM nodes; the firewall gets set up as expected but I get a failure on `TASK [ceph-mon : create monitor initial keyring]`.
To be able to run a copy of `site.yml.sample`, I have to use the default of Pyton2 on those Fedora ARM 29 nodes and can thus not configure the firewall (It is not ceph-ansible's problem that F29 offers no python2-firewall).
## details with Python3
While `ansible_python_interpreter=/usr/bin/python3` allows me to configure firewall (`configure_firewall: True`) it fails on `TASK [ceph-mon : create monitor initial keyring]`
```
TASK [ceph-mon : create monitor initial keyring] ****************************************************************************************
Saturday 02 February 2019 13:22:05 +0100 (0:00:00.578) 0:03:51.103 *****
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: TypeError: rstrip arg must be None or str
fatal: [odroid-hc2-00]: FAILED! => {"changed": false, "module_stderr": "Traceback (most recent call last):\n File \"/tmp/ansible_wt9j1z5d/ansible_module_ceph_key.py\", line 697, in <module>\n main()\n File \"/tmp/ansible_wt9j1z5d/ansible_module_ceph_key.py\", line 693, in main\n run_module()\n File \"/tmp/ansible_wt9j1z5d/ansible_module_ceph_key.py\", line 681, in run_module\n stdout=out.rstrip(b\"\\r\\n\"),\nTypeError: rstrip arg must be None or str\n", "module_stdout": "", "msg": "MODULE FAILURE", "rc": 1}
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: TypeError: rstrip arg must be None or str
fatal: [odroid-hc2-02]: FAILED! => {"changed": false, "module_stderr": "Traceback (most recent call last):\n File \"/tmp/ansible_fvc_9har/ansible_module_ceph_key.py\", line 697, in <module>\n main()\n File \"/tmp/ansible_fvc_9har/ansible_module_ceph_key.py\", line 693, in main\n run_module()\n File \"/tmp/ansible_fvc_9har/ansible_module_ceph_key.py\", line 681, in run_module\n stdout=out.rstrip(b\"\\r\\n\"),\nTypeError: rstrip arg must be None or str\n", "module_stdout": "", "msg": "MODULE FAILURE", "rc": 1}
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: TypeError: rstrip arg must be None or str
fatal: [odroid-hc2-01]: FAILED! => {"changed": false, "module_stderr": "Traceback (most recent call last):\n File \"/tmp/ansible_77ptji0m/ansible_module_ceph_key.py\", line 697, in <module>\n main()\n File \"/tmp/ansible_77ptji0m/ansible_module_ceph_key.py\", line 693, in main\n run_module()\n File \"/tmp/ansible_77ptji0m/ansible_module_ceph_key.py\", line 681, in run_module\n stdout=out.rstrip(b\"\\r\\n\"),\nTypeError: rstrip arg must be None or str\n", "module_stdout": "", "msg": "MODULE FAILURE", "rc": 1}
```
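The `TypeError` comes from mixing `str` and `bytes`: under Python 3 the captured command output is text, so `out.rstrip(b"\r\n")` passes a bytes argument to a str method (on Python 2 both were the same type, so it worked). A type-matching helper sidesteps it; this is a hypothetical sketch, not the actual ceph_key.py fix:

```python
def strip_trailing_newlines(out):
    """rstrip that works for both str and bytes command output.

    On Python 3 the argument to rstrip() must match the value's type,
    so pick the trailer based on what we were given.
    """
    trailer = b"\r\n" if isinstance(out, bytes) else "\r\n"
    return out.rstrip(trailer)
```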
## note on Python2
Without overriding the `ansible_python_interpreter`, I must set `configure_firewall: False` as there is no `python2-firewall.noarch` for Fedora 29. A copy of `site.yml.sample` runs through just fine with Python2 and I get a working cluster. Obviously I need to deal with the firewall myself.
```bash
[root@odroid-hc2-00 ~]# ceph -s
cluster:
id: d4fe8da4-bad1-4564-bfaa-358e1ab8e02c
health: HEALTH_OK
services:
mon: 3 daemons, quorum odroid-hc2-00,odroid-hc2-01,odroid-hc2-02
mgr: odroid-hc2-00(active), standbys: odroid-hc2-02, odroid-hc2-01
osd: 5 osds: 5 up, 5 in
data:
pools: 0 pools, 0 pgs
objects: 0 objects, 0B
usage: 5.01GiB used, 8.18TiB / 8.19TiB avail
pgs:
```
I verified with `ansible -m setup odroid-hc2-00|less` that Python 2 gets used in that case. `2.7.15` to be precise.
# What you expected to happen:
Being able to have `ceph-ansible` set up the firewall on Fedora 29 nodes. Ideally by being able to use `ansible_python_interpreter=/usr/bin/python3` (allowing the ansible firewall module to be used).
# How to reproduce it (minimal and precise):
1. Have a RHEL 7 x86_64 machine to run `ceph-ansible`. Be it `ceph-ansible-3.2.4-1.el7cp.noarch` or `branch stable-3.2` from `origin git@github.com:ceph/ceph-ansible.git`; I can reproduce the problem with both. (While I could have run ceph-ansible from one of the Fedora ARM 29 nodes, using a RHSM-registered RHEL7 VM simply made it easy for me to `yum install ceph-ansible`)
2. Have 5 OSD hosts, one disk each, running Fedora ARM 29 (mine are ODROID-HC2, sadly no RHEL7 for that platform)
3. `cp site.yml.sample site.yml`
4. `ansible-playbook site.yml`
# Share your group_vars files, inventory
This is my play cluster while learning Ceph, so there are `ceph_conf_overrides`, silly small journal sizes etc, don't mind those.
```bash
[ansible@ceph-ansible-rhel7 ceph-ansible]$ pwd
/usr/share/ceph-ansible
[ansible@ceph-ansible-rhel7 ceph-ansible]$ rpm -qf /usr/share/ceph-ansible
ceph-ansible-3.2.4-1.el7cp.noarch
```
`/etc/ansible/hosts` is as follows; obviously I toggle the `ansible_python_interpreter=…` line on or off while reproducing for this bug report. And yes, I just noticed I set the ansible_user needlessly twice ;-)
```ini
[ceph-arm-nodes]
odroid-hc2-[00:04]
[ceph-arm-nodes:vars]
ansible_user=ansible
#ansible_python_interpreter=/usr/bin/python3
[ceph-housenet]
ceph-ansible-rhel7
odroid-hc2-[00:04]
[ceph-housenet:vars]
ansible_user=ansible
[mons]
odroid-hc2-[00:02]
# MGRs are typically collocated with MONs
[mgrs]
odroid-hc2-[00:02]
[osds]
odroid-hc2-[00:04]
[clients]
ceph-ansible-rhel7
odroid-hc2-00
```
```bash
[ansible@ceph-ansible-rhel7 group_vars]$ diff all.yml all.yml.sample
45c45
< cluster: ceph
---
> #cluster: ceph
63d62
< #configure_firewall: False
110d108
< ntp_daemon_type: chronyd
139c137
< ceph_origin: distro
---
> ceph_origin: repository
197d194
< ceph_repository_type: cdn
301d297
< rbd_cache_writethrough_until_flush: "false"
305d300
< rbd_client_directories: false # as per CEPH125-RHCS3.0-en-1-20180517 pages 45 and 60
350,351d344
< monitor_interface: eth0
<
374d366
< journal_size: 1024 # As per CEPH125-RHCS3.0-en-1-20180517 page 45
377,378c369
< public_network: 192.168.50.0/24 # HouseNet
< cluster_network: "{{ public_network | regex_replace(' ', '') }}"
---
> #cluster_network: "{{ public_network | regex_replace(' ', '') }}"
528,537d518
< # Overrides from CEPH125-RHCS3.0-en-1-20180517
< ceph_conf_overrides:
< global:
< mon_osd_allow_primary_affinity: 1
< mon_clock_drift_allowed: 0.5
< mon_pg_warn_min_per_osd: 0
< mon_allow_pool_delete: true
< client:
< rbd_default_features: 1
<
585a567,570
>
> # this is only here for usage with the switch-from-non-containerized-to-containerized-ceph-daemons.yml playbook
> # do not ever change this here
> #switch_to_container: false
```
```bash
[ansible@ceph-ansible-rhel7 ceph-ansible]$ diff /usr/share/ceph-ansible/group_vars/osds.yml.sample /usr/share/ceph-ansible/group_vars/osds.yml
22a23
> copy_admin_key: true
46a48,49
> devices:
> - /dev/sda
61a65
> dmcrypt: True
89a94
> osd_scenario: non-collocated # collocated was as per CEPH125-RHCS3.0-en-1-20180517 page 36, this is for my fiddlings
131,133c136,137
< # - The devices in 'dedicated_devices' will get one partition for RocksDB DB, called 'block.db'
< # and one for RocksDB WAL, called 'block.wal'. To use a single partition for RocksDB and WAL together
< # set bluestore_wal_devices to [].
---
> # - The devices in 'dedicated_devices' will get 1 partition for RocksDB DB, called 'block.db'
> # and one for RocksDB WAL, called 'block.wal'
147a152,153
> dedicated_devices:
> - /dev/mmcblk0
156,157d161
< #
< # Set bluestore_wal_devices: [] to use the same partition for RocksDB and WAL.
```
```bash
[ansible@ceph-ansible-rhel7 ceph-ansible]$ diff /usr/share/ceph-ansible/group_vars/clients.yml.sample /usr/share/ceph-ansible/group_vars/clients.yml
18a19
> copy_admin_key: true
```
# Environment details
**Environment of RHEL7 x86_64 VM running `ceph-ansible`**:
* OS (e.g. from /etc/os-release): Red Hat Enterprise Linux Server release 7.6 (Maipo)
* Kernel (e.g. `uname -a`): Linux ceph-ansible-rhel7.internal.pcfe.net 3.10.0-862.el7.x86_64 #1 SMP Wed Mar 21 18:14:51 EDT 2018 x86_64 x86_64 x86_64 GNU/Linux
* Docker version if applicable (e.g. `docker version`): n/a
* Ansible version (e.g. `ansible-playbook --version`): ansible-playbook 2.6.12
config file = /usr/share/ceph-ansible/ansible.cfg
configured module search path = [u'/home/ansible/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /usr/bin/ansible-playbook
python version = 2.7.5 (default, Sep 12 2018, 05:31:16) [GCC 4.8.5 20150623 (Red Hat 4.8.5-36)]
* ceph-ansible version (e.g. `git head or tag or stable branch`): ceph-ansible-3.2.4-1.el7cp.noarch and `stable-3.2` from git both allow to reproduce the problem
* Ceph version (e.g. `ceph -v`): ceph version 12.2.8-52.el7cp (3af3ca15b68572a357593c261f95038d02f46201) luminous (stable)
**Environment of Fedora ARM 29 OSD nodes**:
* OS (e.g. from /etc/os-release): Fedora release 29 (Twenty Nine)
* Kernel (e.g. `uname -a`): Linux odroid-hc2-00.fritz.box 4.20.3-200.fc29.armv7hl #1 SMP Thu Jan 17 17:09:08 UTC 2019 armv7l armv7l armv7l GNU/Linux
* Docker version if applicable (e.g. `docker version`): n/a
* Ansible version (e.g. `ansible-playbook --version`): ansible-playbook 2.7.5
`-m setup` run on the RHEL7 box shows:
```
"ansible_python": {
"executable": "/usr/bin/python",
"has_sslcontext": true,
"type": "CPython",
"version": {
"major": 2,
"micro": 15,
"minor": 7,
"releaselevel": "final",
"serial": 0
},
"version_info": [
2,
7,
15,
"final",
0
]
},
* ceph-ansible version (e.g. `git head or tag or stable branch`):
* Ceph version (e.g. `ceph -v`): ceph version 12.2.10 (177915764b752804194937482a39e95e0ca3de94) luminous (stable)
# additional info
I do not expect this to get fixed in stable-3.2 (after all, the firewall config functionality in ceph-ansible is quite recent), but it would be nice if it were fixed in the next release.
| [
{
"content": "#!/usr/bin/python\n# Copyright 2018, Red Hat, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unles... | [
{
"content": "#!/usr/bin/python\n# Copyright 2018, Red Hat, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unles... | diff --git a/library/ceph_key.py b/library/ceph_key.py
index a5fd517c91..34a3b79b25 100644
--- a/library/ceph_key.py
+++ b/library/ceph_key.py
@@ -678,8 +678,8 @@ def run_module():
end=str(endd),
delta=str(delta),
rc=rc,
- stdout=out.rstrip(b"\r\n"),
- stderr=err.rstrip(b"\r\n"),
+ stdout=out.rstrip("\r\n"),
+ stderr=err.rstrip("\r\n"),
changed=True,
)
|
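The one-character change in the `ceph_key` diff above matters because Python 3 refuses to mix `str` and `bytes` in `rstrip`. A minimal standalone sketch (the variable `out` merely stands in for the module's captured command output):

```python
# In the patched module, `out` and `err` are text (str), so the strip
# characters must be a str too; passing bytes raises TypeError.
out = "ceph key output\r\n"

stripped = out.rstrip("\r\n")   # the corrected call: str chars for a str value
print(stripped)

try:
    out.rstrip(b"\r\n")         # the old call: str.rstrip(bytes) fails
    mixed_ok = True
except TypeError:
    mixed_ok = False
print(mixed_ok)
```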
pallets__click-2175 | `click.secho` is improperly typed
The `file` argument for `click.secho` is missing part of its type hint, causing the entire `secho` function to be untyped.
This is not flagged by mypy strict mode, but does in pyright strict mode.
---
Install pyright and click
```bash
python -m venv .venv && source .venv/bin/activate
pip install click pyright
```
Create a py file
```py
# main.py
import click
click.secho("hello")
```
Set pyright to strict mode
```toml
# pyproject.toml
[tool.pyright]
typeCheckingMode = "strict"
```
Run pyright
```bash
pyright main.py
```
Result:
```bash
error: Type of "secho" is partially unknown
Type of "secho" is "(message: Any | None = None, file: IO[Unknown] | None = None, nl: bool = True, err: bool = False, color: bool | None = None, **styles: Any) -> None" (reportUnknownMemberType)
```
---
The function should not produce a typing error. I will PR a fix for this momentarily.
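A hedged standalone sketch of why parameterizing the IO type resolves the error — `secho_fixed` is a hypothetical stand-in, not Click's actual function:

```python
import typing as t

def secho_fixed(
    message: t.Optional[t.Any] = None,
    file: t.Optional[t.IO[t.AnyStr]] = None,  # parameterized, not a bare t.IO
    nl: bool = True,
) -> None:
    """Hypothetical stand-in for click.termui.secho's signature."""

# Every annotation now resolves to a fully known type.
hints = t.get_type_hints(secho_fixed)
print(hints["file"])
```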
---
Environment:
- Python version: 3.10.1
- Click version: 8.0.3
| [
{
"content": "import inspect\nimport io\nimport itertools\nimport os\nimport sys\nimport typing as t\nfrom gettext import gettext as _\n\nfrom ._compat import isatty\nfrom ._compat import strip_ansi\nfrom ._compat import WIN\nfrom .exceptions import Abort\nfrom .exceptions import UsageError\nfrom .globals impor... | [
{
"content": "import inspect\nimport io\nimport itertools\nimport os\nimport sys\nimport typing as t\nfrom gettext import gettext as _\n\nfrom ._compat import isatty\nfrom ._compat import strip_ansi\nfrom ._compat import WIN\nfrom .exceptions import Abort\nfrom .exceptions import UsageError\nfrom .globals impor... | diff --git a/CHANGES.rst b/CHANGES.rst
index 6348d6f25..6c9a79327 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -14,6 +14,7 @@ Unreleased
- Fix a typo in the Bash completion script that affected file and
directory completion. If this script was generated by a previous
version, it should be regenerated. :issue:`2163`
+- Fix typing for ``secho`` file argument. :issue:`2174`
Version 8.0.3
diff --git a/src/click/termui.py b/src/click/termui.py
index a7a8d03cb..07b5257cc 100644
--- a/src/click/termui.py
+++ b/src/click/termui.py
@@ -624,7 +624,7 @@ def unstyle(text: str) -> str:
def secho(
message: t.Optional[t.Any] = None,
- file: t.Optional[t.IO] = None,
+ file: t.Optional[t.IO[t.AnyStr]] = None,
nl: bool = True,
err: bool = False,
color: t.Optional[bool] = None,
|
jupyterhub__zero-to-jupyterhub-k8s-403 | Allow making JupyterLab default thing to launch
Is there a way to make JupyterLab come up by default when new users connect?
Is there a way to get the JupyterHub control panel from JupyterLab?
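The chart change that answers the first question boils down to reading one config value in the hub's config script. A hedged sketch of that logic — `helm_values` and `spawner` are illustrative stand-ins, not the chart's real objects:

```python
# Sketch: apply a helm-style singleuser.default-url value only when it
# is set, so users land in JupyterLab ("/lab") instead of the classic
# notebook ("/tree") on launch.
helm_values = {"singleuser.default-url": "/lab"}

def get_config(key, default=None):
    return helm_values.get(key, default)

spawner = {"default_url": "/tree"}      # classic notebook by default
default_url = get_config("singleuser.default-url", None)
if default_url:
    spawner["default_url"] = default_url
print(spawner["default_url"])
```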
| [
{
"content": "import os\nimport glob\nfrom tornado.httpclient import AsyncHTTPClient\n\nfrom z2jh import get_config, get_secret\n\n# Configure JupyterHub to use the curl backend for making HTTP requests,\n# rather than the pure-python implementations. The default one starts\n# being too slow to make a large num... | [
{
"content": "import os\nimport glob\nfrom tornado.httpclient import AsyncHTTPClient\n\nfrom z2jh import get_config, get_secret\n\n# Configure JupyterHub to use the curl backend for making HTTP requests,\n# rather than the pure-python implementations. The default one starts\n# being too slow to make a large num... | diff --git a/doc/source/user-environment.rst b/doc/source/user-environment.rst
index 42cb34a6a2..fdf4b829e4 100644
--- a/doc/source/user-environment.rst
+++ b/doc/source/user-environment.rst
@@ -185,6 +185,35 @@ how to configure JupyterHub to build off of this image:
9. **Enjoy your new computing environment!** You should now have a live
computing environment built off of the Docker image we’ve created.
+Use JupyterLab by default
+-------------------------
+
+`JupyterLab <https://github.com/jupyterlab/jupyterlab>`_ is the next generation
+user interface for Project Jupyter. It can be used with JupyterHub, both as an
+optional interface and as a default.
+
+1. `Install JupyterLab <https://github.com/jupyterlab/jupyterlab#installation`_
+ in your user image.
+2. `Install JupyterLab Hub extension
+ <https://github.com/jupyterhub/jupyterlab-hub#installation>`_ in your user
+ image. This provides a nice UI for accessing JupyterHub control panel from
+ JupyterLab. You only need the `jupyter labextension` command.
+3. If you want users to launch automatically into JupyterLab instead of classic
+ notebook, use the following in your ``config.yaml``
+
+ .. code-block:: yaml
+ singleuser:
+ defaultUrl: "/lab"
+
+ This will put users into JupyterLab when they launch.
+4. Users can always classic Jupyter Notebook by replacing the ``/lab`` in the URL
+ after their server starts with ``/tree``. Similarly, you can access
+ JupyterLab even if it is not the default by replacing ``/tree`` in the URL
+ with ``/lab``
+
+.. note::
+ JupyterLab is just about to go into beta, so use with caution!
+
Set environment variables
-------------------------
diff --git a/images/hub/jupyterhub_config.py b/images/hub/jupyterhub_config.py
index deef3e1529..9b8d675859 100644
--- a/images/hub/jupyterhub_config.py
+++ b/images/hub/jupyterhub_config.py
@@ -248,6 +248,9 @@ def generate_user_name(spawner):
if cmd:
c.Spawner.cmd = cmd
+default_url = get_config('singleuser.default-url', None)
+if default_url:
+ c.Spawner.default_url = default_url
scheduler_strategy = get_config('singleuser.scheduler-strategy', 'spread')
diff --git a/jupyterhub/templates/hub/configmap.yaml b/jupyterhub/templates/hub/configmap.yaml
index 8f15f498f8..398fae2c3e 100644
--- a/jupyterhub/templates/hub/configmap.yaml
+++ b/jupyterhub/templates/hub/configmap.yaml
@@ -113,6 +113,9 @@ data:
{{ if .Values.singleuser.cmd -}}
singleuser.cmd: {{ .Values.singleuser.cmd | quote }}
{{- end }}
+ {{ if .Values.singleuser.defaultUrl }}
+ singleuser.default-url: {{ .Values.singleuser.defaultUrl | quote }}
+ {{- end }}
singleuser.uid: {{ .Values.singleuser.uid | quote }}
singleuser.fs-gid: {{ .Values.singleuser.fsGid | quote }}
diff --git a/jupyterhub/values.yaml b/jupyterhub/values.yaml
index efb5e8c80e..11b6dca487 100644
--- a/jupyterhub/values.yaml
+++ b/jupyterhub/values.yaml
@@ -143,6 +143,7 @@ singleuser:
limit:
guarantee: 1G
cmd: jupyterhub-singleuser
+ defaultUrl:
prePuller:
hook:
|
sql-machine-learning__elasticdl-368 | Better check for codec names
Currently, the codec name argument is not checked; a typo would result in the worker misinterpreting encoded data.
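One way to enforce this is argparse's built-in `choices` validation, which rejects a misspelled codec name at parse time instead of letting it reach the worker. A minimal standalone sketch (not the master's full argument parser):

```python
import argparse

# With `choices`, argparse validates the value before the program runs.
parser = argparse.ArgumentParser()
parser.add_argument("--codec-type", default=None, choices=["tf_example"],
                    help="Type of codec (tf_example or None)")

args = parser.parse_args(["--codec-type", "tf_example"])

try:
    parser.parse_args(["--codec-type", "tf_exmaple"])   # typo
    typo_accepted = True
except SystemExit:                                      # argparse exits on error
    typo_accepted = False
print(args.codec_type, typo_accepted)
```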
| [
{
"content": "import logging\nimport time\nimport argparse\nimport os\n\nimport grpc\nimport tensorflow as tf\n\ntf.enable_eager_execution()\n\nfrom concurrent import futures\nfrom recordio import File\nfrom elasticdl.proto import master_pb2_grpc\nfrom elasticdl.master.servicer import MasterServicer\nfrom elast... | [
{
"content": "import logging\nimport time\nimport argparse\nimport os\n\nimport grpc\nimport tensorflow as tf\n\ntf.enable_eager_execution()\n\nfrom concurrent import futures\nfrom recordio import File\nfrom elasticdl.proto import master_pb2_grpc\nfrom elasticdl.master.servicer import MasterServicer\nfrom elast... | diff --git a/elasticdl/master/main.py b/elasticdl/master/main.py
index d06686bbc..0f2b0b4c7 100644
--- a/elasticdl/master/main.py
+++ b/elasticdl/master/main.py
@@ -65,6 +65,7 @@ def _parse_args():
parser.add_argument(
"--codec-type",
default=None,
+ choices=["tf_example"],
help="Type of codec(tf_example or None)",
)
return parser.parse_args()
|
vega__altair-1844 | Fix simple typo: packge -> package
There is a small typo in setup.py.
Should read package rather than packge.
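For context, a common implementation of the helper that the typo'd docstring belongs to reads the version out of a source file with a regex. A hedged sketch, not Altair's actual code — `fake_init` stands in for a real `pkg/__init__.py`:

```python
import re

# Obtain the package version from a python file without importing it.
fake_init = "name = 'altair'\n__version__ = '4.0.0'\n"
match = re.search(r"""__version__\s*=\s*['"]([^'"]+)['"]""", fake_init)
version = match.group(1)
print(version)
```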
| [
{
"content": "import io\nimport os\nimport re\n\ntry:\n from setuptools import setup\nexcept ImportError:\n from distutils.core import setup\n\n#==============================================================================\n# Utilities\n#===================================================================... | [
{
"content": "import io\nimport os\nimport re\n\ntry:\n from setuptools import setup\nexcept ImportError:\n from distutils.core import setup\n\n#==============================================================================\n# Utilities\n#===================================================================... | diff --git a/setup.py b/setup.py
index 8c2e4e73b..da9afb201 100644
--- a/setup.py
+++ b/setup.py
@@ -27,7 +27,7 @@ def get_install_requirements(path):
def version(path):
- """Obtain the packge version from a python file e.g. pkg/__init__.py
+ """Obtain the package version from a python file e.g. pkg/__init__.py
See <https://packaging.python.org/en/latest/single_source_version.html>.
"""
|
pyca__cryptography-3803 | Signer/Verifier deprecation warning has wrong stacklevel
Seeing this with Cryptography 2.0:
```
.../python3.5/site-packages/cryptography/hazmat/backends/openssl/rsa.py:477: DeprecationWarning: signer and verifier have been deprecated. Please use sign and verify instead.
_warn_sign_verify_deprecated()
.../python3.5/site-packages/cryptography/hazmat/backends/openssl/rsa.py:382: DeprecationWarning: signer and verifier have been deprecated. Please use sign and verify instead.
_warn_sign_verify_deprecated()
```
I see a few open issues related to deprecations (e.g. #3794), but I'm not sure if any of them cover this particular message.
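A small self-contained demonstration of how `stacklevel` shifts the frame a warning is attributed to (function names here are illustrative, not cryptography's internals):

```python
import warnings

def _warn_deprecated():
    # stacklevel=3: attribute the warning to the caller of the public
    # API, skipping this helper (1) and the API function itself (2).
    warnings.warn("signer and verifier have been deprecated.",
                  DeprecationWarning, stacklevel=3)

def signer():            # stand-in for the public API entry point
    _warn_deprecated()

def user_code():         # the frame the warning should point at
    signer()

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    user_code()

# The reported line is the signer() call inside user_code, one line
# below the def statement.
points_at_user_code = caught[0].lineno == user_code.__code__.co_firstlineno + 1
print(points_at_user_code)
```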
| [
{
"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nimport warnings\n\nfrom cryptography import u... | [
{
"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nimport warnings\n\nfrom cryptography import u... | diff --git a/src/cryptography/hazmat/backends/openssl/utils.py b/src/cryptography/hazmat/backends/openssl/utils.py
index ff1b97458735..05d0fe589158 100644
--- a/src/cryptography/hazmat/backends/openssl/utils.py
+++ b/src/cryptography/hazmat/backends/openssl/utils.py
@@ -41,5 +41,5 @@ def _warn_sign_verify_deprecated():
"signer and verifier have been deprecated. Please use sign "
"and verify instead.",
utils.PersistentlyDeprecated,
- stacklevel=2
+ stacklevel=3
)
|
fossasia__open-event-server-9132 | Add the unique ticket code into the downlad CSV file
The CSV download file of the attendee list does not include the numbers on the QR Code. Please add this field "Ticket-ID".
The ticket ID has the following format: 135ccbd7-9b23-4a52-a7fd-326fec1b2c1c
Whereas the order has a format like this: #O1691408152-34896

Expected: The exported CSV should have a table column "Ticket ID" with the ticket ID number that is encoded in the QR code as well.

| [
{
"content": "import base64\nfrom dataclasses import dataclass\nfrom datetime import datetime\nfrom io import BytesIO\n\nimport qrcode\nfrom citext import CIText\n\nfrom app.api.helpers.storage import UPLOAD_PATHS, generate_hash\nfrom app.models import db\nfrom app.models.base import SoftDeletionModel\n\n\n@dat... | [
{
"content": "import base64\nfrom dataclasses import dataclass\nfrom datetime import datetime\nfrom io import BytesIO\n\nimport qrcode\nfrom citext import CIText\n\nfrom app.api.helpers.storage import UPLOAD_PATHS, generate_hash\nfrom app.models import db\nfrom app.models.base import SoftDeletionModel\n\n\n@dat... | diff --git a/app/models/ticket_holder.py b/app/models/ticket_holder.py
index 3501b54e7a..1d8f1371a3 100644
--- a/app/models/ticket_holder.py
+++ b/app/models/ticket_holder.py
@@ -108,7 +108,7 @@ def qr_code(self):
box_size=10,
border=0,
)
- qr.add_data(self.order.identifier + "-" + str(self.id))
+ qr.add_data(self.order.identifier)
qr.make(fit=True)
img = qr.make_image()
|
ansible-collections__community.vmware-1280 | community.vmware.vmware_guest_powerstate not finding VM by name
##### SUMMARY
When trying to control the power state of a VM by name, the module is unable to find the VM, despite the fact that the exact same parameters find the VM in other modules (such as vmware_guest_snapshot).
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
vmware_guest_powerstate
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```
ansible [core 2.12.2]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.8/site-packages/ansible
ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible
python version = 3.8.12 (default, Sep 21 2021, 00:10:52) [GCC 8.5.0 20210514 (Red Hat 8.5.0-3)]
jinja version = 2.10.3
```
##### COLLECTION VERSION
<!--- Paste verbatim output from "ansible-galaxy collection list <namespace>.<collection>" between the quotes
for example: ansible-galaxy collection list community.general
-->
```
# /root/.ansible/collections/ansible_collections
Collection Version
---------------- -------
community.vmware 2.1.0
[root@jumpserver snaprevert_test]# ansible-galaxy collection list community.vmware
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```
[root@jumpserver snaprevert_test]# ansible-config dump --only-changed
[root@jumpserver snaprevert_test]#
```
##### OS / ENVIRONMENT
```
NAME="CentOS Stream"
VERSION="8"
ID="centos"
ID_LIKE="rhel fedora"
VERSION_ID="8"
PLATFORM_ID="platform:el8"
PRETTY_NAME="CentOS Stream 8"
ANSI_COLOR="0;31"
CPE_NAME="cpe:/o:centos:centos:8"
HOME_URL="https://centos.org/"
BUG_REPORT_URL="https://bugzilla.redhat.com/"
REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 8"
REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
```
##### STEPS TO REPRODUCE
Running the playbook below you'll find that the vmware_guest_snapshot task will find the VM and perform action while the vmware_guest_powerstate will fail with "Unable to set power state for non-existing virtual machine" despite all parameters being identical.
```
---
- name: Test of snapshot revert
hosts: localhost
gather_facts: no
vars:
vcenter_hostname: 1.2.3.4
vcenter_username: administrator@vsphere.local
vcenter_password: FOO
datacenter_name: BAR
tasks:
- name: Revert to initial snapshot
community.vmware.vmware_guest_snapshot:
validate_certs: no
hostname: "{{ vcenter_hostname }}"
username: "{{ vcenter_username }}"
password: "{{ vcenter_password }}"
datacenter: "{{ datacenter_name }}"
folder: "/{{ datacenter_name }}/vm/Jumpserver_VMs/"
name: "jump_7216"
state: revert
snapshot_name: "Initial_Setup"
delegate_to: localhost
- name: Power on machine
community.vmware.vmware_guest_powerstate:
validate_certs: no
hostname: "{{ vcenter_hostname }}"
username: "{{ vcenter_username }}"
password: "{{ vcenter_password }}"
datacenter: "{{ datacenter_name }}"
folder: "/{{ datacenter_name }}/vm/Jumpserver_VMs/"
name: "jump_7216"
state: powered-on
delegate_to: localhost
```
##### EXPECTED RESULTS
I would expect vmware_guest_powerstate to find the VM just like vmware_guest_snapshot does.
##### ACTUAL RESULTS
Task fails with "non-existing virtual machine" error despite VM existing.
<!--- Paste verbatim command output between quotes -->
```PLAY [Test of snapshot revert] ********************************************************************************************************************************************************************************************************************************************************************************************************************************************
TASK [Revert to a snapshot] ***********************************************************************************************************************************************************************************************************************************************************************************************************************************************
changed: [localhost]
TASK [Power on machine] ****************************************************************************************************************************************************************************************************************************************************************************************************************************************************
fatal: [localhost]: FAILED! => {"changed": false, "msg": "Unable to set power state for non-existing virtual machine : 'jump_7216'"}
PLAY RECAP ****************************************************************************************************************************************************************************************************************************************************************************************************************************************************************
localhost : ok=1 changed=1 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
```
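The underlying fix is a one-line normalization of the `folder` parameter, matching what the other guest modules already do. A minimal standalone sketch:

```python
# Strip the trailing slash so "/BAR/vm/Jumpserver_VMs/" and
# "/BAR/vm/Jumpserver_VMs" resolve to the same inventory path before
# the VM lookup.
params = {"folder": "/BAR/vm/Jumpserver_VMs/"}

if params["folder"]:
    params["folder"] = params["folder"].rstrip("/")
print(params["folder"])
```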
| [
{
"content": "#!/usr/bin/python\n# -*- coding: utf-8 -*-\n#\n# Copyright: (c) 2017, Abhijeet Kasurde <akasurde@redhat.com>\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\n... | [
{
"content": "#!/usr/bin/python\n# -*- coding: utf-8 -*-\n#\n# Copyright: (c) 2017, Abhijeet Kasurde <akasurde@redhat.com>\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\n... | diff --git a/changelogs/fragments/1238-vmware_guest_powerstate-ignore_trailing_slash_in_folder.yml b/changelogs/fragments/1238-vmware_guest_powerstate-ignore_trailing_slash_in_folder.yml
new file mode 100644
index 0000000000..f73c29011e
--- /dev/null
+++ b/changelogs/fragments/1238-vmware_guest_powerstate-ignore_trailing_slash_in_folder.yml
@@ -0,0 +1,2 @@
+bugfixes:
+ - vmware_guest_powerstate - Ignore trailing `/` in `folder` parameter like other guest modules do (https://github.com/ansible-collections/community.vmware/issues/1238).
diff --git a/plugins/modules/vmware_guest_powerstate.py b/plugins/modules/vmware_guest_powerstate.py
index 4aafbec5de..712debf4e2 100644
--- a/plugins/modules/vmware_guest_powerstate.py
+++ b/plugins/modules/vmware_guest_powerstate.py
@@ -261,6 +261,9 @@ def main():
result = dict(changed=False,)
+ if module.params['folder']:
+ module.params['folder'] = module.params['folder'].rstrip('/')
+
pyv = PyVmomi(module)
# Check if the VM exists before continuing
diff --git a/tests/integration/targets/vmware_guest_powerstate/tasks/main.yml b/tests/integration/targets/vmware_guest_powerstate/tasks/main.yml
index 8fb3e8ad31..9c48b74e83 100644
--- a/tests/integration/targets/vmware_guest_powerstate/tasks/main.yml
+++ b/tests/integration/targets/vmware_guest_powerstate/tasks/main.yml
@@ -40,7 +40,7 @@
username: "{{ vcenter_username }}"
password: "{{ vcenter_password }}"
name: test_vm1
- folder: '{{ f0 }}'
+ folder: '{{ f0 }}/' # Test with a trailing / because of issue 1238
state: powered-off
register: poweroff_d1_c1_f0
|
oppia__oppia-7459 | Upgrade @typescript-eslint/eslint-plugin
`eslint-utils` is currently out of date; https://github.com/oppia/oppia/pull/7451 provides a temporary fix, but we need to upgrade the main package that requires `eslint-utils` to ensure that we have a long-term fix.
When fixing this, please make sure that the lint tests run successfully.
| [
{
"content": "# Copyright 2019 The Oppia Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n... | [
{
"content": "# Copyright 2019 The Oppia Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n... | diff --git a/.eslintrc b/.eslintrc
index 2a39be6f84aae..ab5cdb333ac38 100644
--- a/.eslintrc
+++ b/.eslintrc
@@ -149,6 +149,7 @@
"no-multi-str": [
"error"
],
+ "no-prototype-builtins": "off",
"no-redeclare": [
"off"
],
diff --git a/core/domain/feedback_jobs_one_off.py b/core/domain/feedback_jobs_one_off.py
index f92e661a17fd5..99788c22302f9 100644
--- a/core/domain/feedback_jobs_one_off.py
+++ b/core/domain/feedback_jobs_one_off.py
@@ -13,6 +13,7 @@
# limitations under the License.
"""One-off jobs for feedback models."""
+from __future__ import absolute_import # pylint: disable=import-only-modules
from core import jobs
from core.platform import models
diff --git a/core/domain/feedback_jobs_one_off_test.py b/core/domain/feedback_jobs_one_off_test.py
index f324fd80c1114..cdb625cf4d684 100644
--- a/core/domain/feedback_jobs_one_off_test.py
+++ b/core/domain/feedback_jobs_one_off_test.py
@@ -13,6 +13,7 @@
# limitations under the License.
"""Tests for Feedback-related jobs."""
+from __future__ import absolute_import # pylint: disable=import-only-modules
import ast
diff --git a/extensions/classifiers/SVMPredictionService.ts b/extensions/classifiers/SVMPredictionService.ts
index 35cc33bd3bcad..ca2bab119cfea 100644
--- a/extensions/classifiers/SVMPredictionService.ts
+++ b/extensions/classifiers/SVMPredictionService.ts
@@ -154,7 +154,7 @@ export class SVMPredictionService {
}
if (iter >= maxIter) {
- console.info('Exceeds maxIter in calculateMulticlassProbabilities');
+ console.warn('Exceeds maxIter in calculateMulticlassProbabilities');
}
return P;
diff --git a/package-lock.json b/package-lock.json
index f6550ace26216..29434cca09d4b 100644
--- a/package-lock.json
+++ b/package-lock.json
@@ -975,83 +975,55 @@
}
},
"@typescript-eslint/eslint-plugin": {
- "version": "1.13.0",
- "resolved": "https://registry.npmjs.org/@typescript-eslint/eslint-plugin/-/eslint-plugin-1.13.0.tgz",
- "integrity": "sha512-WQHCozMnuNADiqMtsNzp96FNox5sOVpU8Xt4meaT4em8lOG1SrOv92/mUbEHQVh90sldKSfcOc/I0FOb/14G1g==",
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/@typescript-eslint/eslint-plugin/-/eslint-plugin-2.0.0.tgz",
+ "integrity": "sha512-Mo45nxTTELODdl7CgpZKJISvLb+Fu64OOO2ZFc2x8sYSnUpFrBUW3H+H/ZGYmEkfnL6VkdtOSxgdt+Av79j0sA==",
"dev": true,
"requires": {
- "@typescript-eslint/experimental-utils": "1.13.0",
- "eslint-utils": "^1.3.1",
+ "@typescript-eslint/experimental-utils": "2.0.0",
+ "eslint-utils": "^1.4.0",
"functional-red-black-tree": "^1.0.1",
"regexpp": "^2.0.1",
- "tsutils": "^3.7.0"
- },
- "dependencies": {
- "@typescript-eslint/experimental-utils": {
- "version": "1.13.0",
- "resolved": "https://registry.npmjs.org/@typescript-eslint/experimental-utils/-/experimental-utils-1.13.0.tgz",
- "integrity": "sha512-zmpS6SyqG4ZF64ffaJ6uah6tWWWgZ8m+c54XXgwFtUv0jNz8aJAVx8chMCvnk7yl6xwn8d+d96+tWp7fXzTuDg==",
- "dev": true,
- "requires": {
- "@types/json-schema": "^7.0.3",
- "@typescript-eslint/typescript-estree": "1.13.0",
- "eslint-scope": "^4.0.0"
- }
- },
- "@typescript-eslint/typescript-estree": {
- "version": "1.13.0",
- "resolved": "https://registry.npmjs.org/@typescript-eslint/typescript-estree/-/typescript-estree-1.13.0.tgz",
- "integrity": "sha512-b5rCmd2e6DCC6tCTN9GSUAuxdYwCM/k/2wdjHGrIRGPSJotWMCe/dGpi66u42bhuh8q3QBzqM4TMA1GUUCJvdw==",
- "dev": true,
- "requires": {
- "lodash.unescape": "4.0.1",
- "semver": "5.5.0"
- }
- },
- "semver": {
- "version": "5.5.0",
- "resolved": "https://registry.npmjs.org/semver/-/semver-5.5.0.tgz",
- "integrity": "sha512-4SJ3dm0WAwWy/NVeioZh5AntkdJoWKxHxcmyP622fOkgHa4z3R0TdBJICINyaSDE6uNwVc8gZr+ZinwZAH4xIA==",
- "dev": true
- }
+ "tsutils": "^3.14.0"
}
},
"@typescript-eslint/experimental-utils": {
- "version": "1.12.0",
- "resolved": "https://registry.npmjs.org/@typescript-eslint/experimental-utils/-/experimental-utils-1.12.0.tgz",
- "integrity": "sha512-s0soOTMJloytr9GbPteMLNiO2HvJ+qgQkRNplABXiVw6vq7uQRvidkby64Gqt/nA7pys74HksHwRULaB/QRVyw==",
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/@typescript-eslint/experimental-utils/-/experimental-utils-2.0.0.tgz",
+ "integrity": "sha512-XGJG6GNBXIEx/mN4eTRypN/EUmsd0VhVGQ1AG+WTgdvjHl0G8vHhVBHrd/5oI6RRYBRnedNymSYWW1HAdivtmg==",
"dev": true,
"requires": {
- "@typescript-eslint/typescript-estree": "1.12.0",
+ "@types/json-schema": "^7.0.3",
+ "@typescript-eslint/typescript-estree": "2.0.0",
"eslint-scope": "^4.0.0"
}
},
"@typescript-eslint/parser": {
- "version": "1.12.0",
- "resolved": "https://registry.npmjs.org/@typescript-eslint/parser/-/parser-1.12.0.tgz",
- "integrity": "sha512-0uzbaa9ZLCA5yMWJywnJJ7YVENKGWVUhJDV5UrMoldC5HoI54W5kkdPhTfmtFKpPFp93MIwmJj0/61ztvmz5Dw==",
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/@typescript-eslint/parser/-/parser-2.0.0.tgz",
+ "integrity": "sha512-ibyMBMr0383ZKserIsp67+WnNVoM402HKkxqXGlxEZsXtnGGurbnY90pBO3e0nBUM7chEEOcxUhgw9aPq7fEBA==",
"dev": true,
"requires": {
"@types/eslint-visitor-keys": "^1.0.0",
- "@typescript-eslint/experimental-utils": "1.12.0",
- "@typescript-eslint/typescript-estree": "1.12.0",
+ "@typescript-eslint/experimental-utils": "2.0.0",
+ "@typescript-eslint/typescript-estree": "2.0.0",
"eslint-visitor-keys": "^1.0.0"
}
},
"@typescript-eslint/typescript-estree": {
- "version": "1.12.0",
- "resolved": "https://registry.npmjs.org/@typescript-eslint/typescript-estree/-/typescript-estree-1.12.0.tgz",
- "integrity": "sha512-nwN6yy//XcVhFs0ZyU+teJHB8tbCm7AIA8mu6E2r5hu6MajwYBY3Uwop7+rPZWUN/IUOHpL8C+iUPMDVYUU3og==",
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/@typescript-eslint/typescript-estree/-/typescript-estree-2.0.0.tgz",
+ "integrity": "sha512-NXbmzA3vWrSgavymlzMWNecgNOuiMMp62MO3kI7awZRLRcsA1QrYWo6q08m++uuAGVbXH/prZi2y1AWuhSu63w==",
"dev": true,
"requires": {
"lodash.unescape": "4.0.1",
- "semver": "5.5.0"
+ "semver": "^6.2.0"
},
"dependencies": {
"semver": {
- "version": "5.5.0",
- "resolved": "https://registry.npmjs.org/semver/-/semver-5.5.0.tgz",
- "integrity": "sha512-4SJ3dm0WAwWy/NVeioZh5AntkdJoWKxHxcmyP622fOkgHa4z3R0TdBJICINyaSDE6uNwVc8gZr+ZinwZAH4xIA==",
+ "version": "6.3.0",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.0.tgz",
+ "integrity": "sha512-b39TBaTSfV6yBrapU89p5fKekE2m/NwnDocOVruQFS1/veMgdzuPcnOM34M6CwxW8jH/lxEa5rBoDeUwu5HHTw==",
"dev": true
}
}
@@ -1267,9 +1239,9 @@
"dev": true
},
"acorn-jsx": {
- "version": "5.0.1",
- "resolved": "https://registry.npmjs.org/acorn-jsx/-/acorn-jsx-5.0.1.tgz",
- "integrity": "sha512-HJ7CfNHrfJLlNTzIEUTj43LNWGkqpRLxm3YjAlcD0ACydk9XynzYsCBHxut+iqt+1aBXkx9UP/w/ZqMr13XIzg==",
+ "version": "5.0.2",
+ "resolved": "https://registry.npmjs.org/acorn-jsx/-/acorn-jsx-5.0.2.tgz",
+ "integrity": "sha512-tiNTrP1MP0QrChmD2DdupCr6HWSFeKVw5d/dHTu4Y7rkAkRhU/Dt7dphAfIUyxtHpl/eBVip5uTNSpQJHylpAw==",
"dev": true
},
"adm-zip": {
@@ -4378,47 +4350,48 @@
}
},
"eslint": {
- "version": "5.16.0",
- "resolved": "https://registry.npmjs.org/eslint/-/eslint-5.16.0.tgz",
- "integrity": "sha512-S3Rz11i7c8AA5JPv7xAH+dOyq/Cu/VXHiHXBPOU1k/JAM5dXqQPt3qcrhpHSorXmrpu2g0gkIBVXAqCpzfoZIg==",
+ "version": "6.2.2",
+ "resolved": "https://registry.npmjs.org/eslint/-/eslint-6.2.2.tgz",
+ "integrity": "sha512-mf0elOkxHbdyGX1IJEUsNBzCDdyoUgljF3rRlgfyYh0pwGnreLc0jjD6ZuleOibjmnUWZLY2eXwSooeOgGJ2jw==",
"dev": true,
"requires": {
"@babel/code-frame": "^7.0.0",
- "ajv": "^6.9.1",
+ "ajv": "^6.10.0",
"chalk": "^2.1.0",
"cross-spawn": "^6.0.5",
"debug": "^4.0.1",
"doctrine": "^3.0.0",
- "eslint-scope": "^4.0.3",
- "eslint-utils": "^1.3.1",
- "eslint-visitor-keys": "^1.0.0",
- "espree": "^5.0.1",
+ "eslint-scope": "^5.0.0",
+ "eslint-utils": "^1.4.2",
+ "eslint-visitor-keys": "^1.1.0",
+ "espree": "^6.1.1",
"esquery": "^1.0.1",
"esutils": "^2.0.2",
"file-entry-cache": "^5.0.1",
"functional-red-black-tree": "^1.0.1",
- "glob": "^7.1.2",
+ "glob-parent": "^5.0.0",
"globals": "^11.7.0",
"ignore": "^4.0.6",
"import-fresh": "^3.0.0",
"imurmurhash": "^0.1.4",
- "inquirer": "^6.2.2",
- "js-yaml": "^3.13.0",
+ "inquirer": "^6.4.1",
+ "is-glob": "^4.0.0",
+ "js-yaml": "^3.13.1",
"json-stable-stringify-without-jsonify": "^1.0.1",
"levn": "^0.3.0",
- "lodash": "^4.17.11",
+ "lodash": "^4.17.14",
"minimatch": "^3.0.4",
"mkdirp": "^0.5.1",
"natural-compare": "^1.4.0",
"optionator": "^0.8.2",
- "path-is-inside": "^1.0.2",
"progress": "^2.0.0",
"regexpp": "^2.0.1",
- "semver": "^5.5.1",
- "strip-ansi": "^4.0.0",
- "strip-json-comments": "^2.0.1",
+ "semver": "^6.1.2",
+ "strip-ansi": "^5.2.0",
+ "strip-json-comments": "^3.0.1",
"table": "^5.2.3",
- "text-table": "^0.2.0"
+ "text-table": "^0.2.0",
+ "v8-compile-cache": "^2.0.3"
},
"dependencies": {
"debug": {
@@ -4430,6 +4403,22 @@
"ms": "^2.1.1"
}
},
+ "eslint-scope": {
+ "version": "5.0.0",
+ "resolved": "https://registry.npmjs.org/eslint-scope/-/eslint-scope-5.0.0.tgz",
+ "integrity": "sha512-oYrhJW7S0bxAFDvWqzvMPRm6pcgcnWc4QnofCAqRTRfQC0JcwenzGglTtsLyIuuWFfkqDG9vz67cnttSd53djw==",
+ "dev": true,
+ "requires": {
+ "esrecurse": "^4.1.0",
+ "estraverse": "^4.1.1"
+ }
+ },
+ "eslint-visitor-keys": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/eslint-visitor-keys/-/eslint-visitor-keys-1.1.0.tgz",
+ "integrity": "sha512-8y9YjtM1JBJU/A9Kc+SbaOV4y29sSWckBwMHa+FGtVj5gN/sbnKDf6xJUl+8g7FAij9LVaP8C24DUiH/f/2Z9A==",
+ "dev": true
+ },
"ignore": {
"version": "4.0.6",
"resolved": "https://registry.npmjs.org/ignore/-/ignore-4.0.6.tgz",
@@ -4457,6 +4446,12 @@
"resolved": "https://registry.npmjs.org/resolve-from/-/resolve-from-4.0.0.tgz",
"integrity": "sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g==",
"dev": true
+ },
+ "semver": {
+ "version": "6.3.0",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.0.tgz",
+ "integrity": "sha512-b39TBaTSfV6yBrapU89p5fKekE2m/NwnDocOVruQFS1/veMgdzuPcnOM34M6CwxW8jH/lxEa5rBoDeUwu5HHTw==",
+ "dev": true
}
}
},
@@ -4467,12 +4462,12 @@
"dev": true
},
"eslint-plugin-html": {
- "version": "5.0.5",
- "resolved": "https://registry.npmjs.org/eslint-plugin-html/-/eslint-plugin-html-5.0.5.tgz",
- "integrity": "sha512-v/33i3OD0fuXcRXexVyXXBOe4mLBLBQoF1UO1Uy9D+XLq4MC8K45GcQKfqjC/FnHAHp3pYUjpHHktYNCtShGmg==",
+ "version": "6.0.0",
+ "resolved": "https://registry.npmjs.org/eslint-plugin-html/-/eslint-plugin-html-6.0.0.tgz",
+ "integrity": "sha512-PQcGippOHS+HTbQCStmH5MY1BF2MaU8qW/+Mvo/8xTa/ioeMXdSP+IiaBw2+nh0KEMfYQKuTz1Zo+vHynjwhbg==",
"dev": true,
"requires": {
- "htmlparser2": "^3.10.0"
+ "htmlparser2": "^3.10.1"
}
},
"eslint-scope": {
@@ -4501,20 +4496,26 @@
"dev": true
},
"espree": {
- "version": "5.0.1",
- "resolved": "https://registry.npmjs.org/espree/-/espree-5.0.1.tgz",
- "integrity": "sha512-qWAZcWh4XE/RwzLJejfcofscgMc9CamR6Tn1+XRXNzrvUSSbiAjGOI/fggztjIi7y9VLPqnICMIPiGyr8JaZ0A==",
+ "version": "6.1.1",
+ "resolved": "https://registry.npmjs.org/espree/-/espree-6.1.1.tgz",
+ "integrity": "sha512-EYbr8XZUhWbYCqQRW0duU5LxzL5bETN6AjKBGy1302qqzPaCH10QbRg3Wvco79Z8x9WbiE8HYB4e75xl6qUYvQ==",
"dev": true,
"requires": {
- "acorn": "^6.0.7",
- "acorn-jsx": "^5.0.0",
- "eslint-visitor-keys": "^1.0.0"
+ "acorn": "^7.0.0",
+ "acorn-jsx": "^5.0.2",
+ "eslint-visitor-keys": "^1.1.0"
},
"dependencies": {
"acorn": {
- "version": "6.2.1",
- "resolved": "https://registry.npmjs.org/acorn/-/acorn-6.2.1.tgz",
- "integrity": "sha512-JD0xT5FCRDNyjDda3Lrg/IxFscp9q4tiYtxE1/nOzlKCk7hIRuYjhq1kCNkbPjMRMZuFq20HNQn1I9k8Oj0E+Q==",
+ "version": "7.0.0",
+ "resolved": "https://registry.npmjs.org/acorn/-/acorn-7.0.0.tgz",
+ "integrity": "sha512-PaF/MduxijYYt7unVGRuds1vBC9bFxbNf+VWqhOClfdgy7RlVkQqt610ig1/yxTgsDIfW1cWDel5EBbOy3jdtQ==",
+ "dev": true
+ },
+ "eslint-visitor-keys": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/eslint-visitor-keys/-/eslint-visitor-keys-1.1.0.tgz",
+ "integrity": "sha512-8y9YjtM1JBJU/A9Kc+SbaOV4y29sSWckBwMHa+FGtVj5gN/sbnKDf6xJUl+8g7FAij9LVaP8C24DUiH/f/2Z9A==",
"dev": true
}
}
@@ -7011,9 +7012,9 @@
"dev": true
},
"inquirer": {
- "version": "6.5.0",
- "resolved": "https://registry.npmjs.org/inquirer/-/inquirer-6.5.0.tgz",
- "integrity": "sha512-scfHejeG/lVZSpvCXpsB4j/wQNPM5JC8kiElOI0OUTwmc1RTpXr4H32/HOlQHcZiYl2z2VElwuCVDRG8vFmbnA==",
+ "version": "6.5.2",
+ "resolved": "https://registry.npmjs.org/inquirer/-/inquirer-6.5.2.tgz",
+ "integrity": "sha512-cntlB5ghuB0iuO65Ovoi8ogLHiWGs/5yNrtUcKjFhSSiVeAIVpD7koaSU9RM8mpXw5YDi9RdYXGQMaOURB7ycQ==",
"dev": true,
"requires": {
"ansi-escapes": "^3.2.0",
@@ -7029,23 +7030,6 @@
"string-width": "^2.1.0",
"strip-ansi": "^5.1.0",
"through": "^2.3.6"
- },
- "dependencies": {
- "ansi-regex": {
- "version": "4.1.0",
- "resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz",
- "integrity": "sha512-1apePfXM1UOSqw0o9IiFAovVz9M5S1Dg+4TrDwfMewQ6p/rmMueb7tWZjQ1rx4Loy1ArBggoqGpfqqdI4rondg==",
- "dev": true
- },
- "strip-ansi": {
- "version": "5.2.0",
- "resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-5.2.0.tgz",
- "integrity": "sha512-DuRs1gKbBqsMKIZlrffwlug8MHkcnpjs5VPmL1PAh+mA30U0DTotfDZ0d2UUsXpPmPmMMJ6W773MaA3J+lbiWA==",
- "dev": true,
- "requires": {
- "ansi-regex": "^4.1.0"
- }
- }
}
},
"interpret": {
@@ -12565,6 +12549,17 @@
"requires": {
"is-fullwidth-code-point": "^2.0.0",
"strip-ansi": "^4.0.0"
+ },
+ "dependencies": {
+ "strip-ansi": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-4.0.0.tgz",
+ "integrity": "sha1-qEeQIusaw2iocTibY1JixQXuNo8=",
+ "dev": true,
+ "requires": {
+ "ansi-regex": "^3.0.0"
+ }
+ }
}
},
"string.prototype.startswith": {
@@ -12595,12 +12590,20 @@
}
},
"strip-ansi": {
- "version": "4.0.0",
- "resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-4.0.0.tgz",
- "integrity": "sha1-qEeQIusaw2iocTibY1JixQXuNo8=",
+ "version": "5.2.0",
+ "resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-5.2.0.tgz",
+ "integrity": "sha512-DuRs1gKbBqsMKIZlrffwlug8MHkcnpjs5VPmL1PAh+mA30U0DTotfDZ0d2UUsXpPmPmMMJ6W773MaA3J+lbiWA==",
"dev": true,
"requires": {
- "ansi-regex": "^3.0.0"
+ "ansi-regex": "^4.1.0"
+ },
+ "dependencies": {
+ "ansi-regex": {
+ "version": "4.1.0",
+ "resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz",
+ "integrity": "sha512-1apePfXM1UOSqw0o9IiFAovVz9M5S1Dg+4TrDwfMewQ6p/rmMueb7tWZjQ1rx4Loy1ArBggoqGpfqqdI4rondg==",
+ "dev": true
+ }
}
},
"strip-bom": {
@@ -12628,9 +12631,9 @@
}
},
"strip-json-comments": {
- "version": "2.0.1",
- "resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-2.0.1.tgz",
- "integrity": "sha1-PFMZQukIwml8DsNEhYwobHygpgo=",
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-3.0.1.tgz",
+ "integrity": "sha512-VTyMAUfdm047mwKl+u79WIdrZxtFtn+nBxHeb844XBQ9uMNTuTHdx2hc5RiAJYqwTj3wc/xe5HLSdJSkJ+WfZw==",
"dev": true
},
"style-loader": {
diff --git a/package.json b/package.json
index 8dbb95f19114b..25cd376d3a773 100644
--- a/package.json
+++ b/package.json
@@ -56,8 +56,8 @@
"@types/q": "^1.5.1",
"@types/select2": "^4.0.48",
"@types/selenium-webdriver": "^4.0.0",
- "@typescript-eslint/eslint-plugin": "^1.13.0",
- "@typescript-eslint/parser": "^1.7.0",
+ "@typescript-eslint/eslint-plugin": "^2.0.0",
+ "@typescript-eslint/parser": "^2.0.0",
"ajv": "^6.10.0",
"angular": "1.6.6",
"angular-route": "1.6.6",
@@ -70,9 +70,9 @@
"css-loader": "^3.1.0",
"dotenv": "^7.0.0",
"enhanced-resolve": "^4.1.0",
- "eslint": "^5.16.0",
- "eslint-plugin-angular": "^4.0.0",
- "eslint-plugin-html": "^5.0.3",
+ "eslint": "^6.0.0",
+ "eslint-plugin-angular": "^4.0.1",
+ "eslint-plugin-html": "^6.0.0",
"fork-ts-checker-webpack-plugin": "^1.3.3",
"gulp": "^4.0.1",
"gulp-concat": "^2.6.1",
diff --git a/typings/custom-scope-defs.d.ts b/typings/custom-scope-defs.d.ts
index 627f535edcf10..04d5a683b0b64 100644
--- a/typings/custom-scope-defs.d.ts
+++ b/typings/custom-scope-defs.d.ts
@@ -15,7 +15,8 @@ interface ICustomScope extends ng.IScope {
onFileCleared?: (() => void);
droppedFile?: any;
- // custom-forms-directives/audio-file-uploader.directive.ts, image-uploader.directive.ts
+ // custom-forms-directives/audio-file-uploader.directive.ts,
+ // image-uploader.directive.ts
errorMessage?: string;
onFileChanged?: ((file: any, fileName?: string) => void);
@@ -36,7 +37,8 @@ interface ICustomScope extends ng.IScope {
getAlwaysEditable?: (() => boolean);
getIsEditable?: (() => boolean);
- // value-generator-editor.directive.ts, CopierDirective.ts, RandomSelectorDirective.ts
+ // value-generator-editor.directive.ts, CopierDirective.ts,
+ // RandomSelectorDirective.ts
generatorId?: string;
// value-generator-editor.directive.ts
diff --git a/typings/globals.d.ts b/typings/globals.d.ts
index 9fd2595152ac6..8ab9e1e7996ad 100644
--- a/typings/globals.d.ts
+++ b/typings/globals.d.ts
@@ -1,4 +1,4 @@
-// Using angular without declaration gives the following error:
+// Using angular without declaration gives the following error:
// 'angular' refers to a UMD global, but the current file is a module.
// Consider adding an import instead. To fix this, we need to mark
// angular as a global. Ref: https://stackoverflow.com/a/42035067
|
PyGithub__PyGithub-486 | GistFile.content is None If Gist haven't complete
If gist object haven't complete, files in this gist has no content.
I create an pull request using Just4test account.
| [
{
"content": "# -*- coding: utf-8 -*-\n\n# ########################## Copyrights and license ############################\n# #\n# Copyright 2012 Steve English <steve.english@navetas.com> #\n# Copyright 2012 Vincent ... | [
{
"content": "# -*- coding: utf-8 -*-\n\n# ########################## Copyrights and license ############################\n# #\n# Copyright 2012 Steve English <steve.english@navetas.com> #\n# Copyright 2012 Vincent ... | diff --git a/github/Gist.py b/github/Gist.py
index 8d75e1e10e..12e83c9547 100644
--- a/github/Gist.py
+++ b/github/Gist.py
@@ -88,7 +88,7 @@ def files(self):
"""
:type: dict of string to :class:`github.GistFile.GistFile`
"""
- self._completeIfNotSet(self._files)
+ self._completeIfNeeded()
return self._files.value
@property
|
projectmesa__mesa-1432 | `OSError: Int or String expected` when running boid_flockers example
**Describe the bug**
Running the `boid_flockers` example results in `OSError: Int or String expected`.
**Expected behavior**
Examples should be able to run without errors.
**Additional context**
This is likely due to a breaking change introduced through https://github.com/projectmesa/mesa/pull/1403: a new parameter `port` is added before `model_params` in `ModularServer.__init__()`, i.e.,
```diff
def __init__(
- self, model_cls, visualization_elements, name="Mesa Model", model_params=None
+ self,
+ model_cls,
+ visualization_elements,
+ name="Mesa Model",
+ port=None,
+ model_params=None,
):
```
As a result, in the `boid_flockers` example, `model_params` gets passed into `__init__()` as `port`:
```python
server = mesa.visualization.ModularServer(
BoidFlockers, [boid_canvas], "Boids", model_params
)
```
Examples such as `bank_reserves` are not affected:
```python
server = mesa.visualization.ModularServer(
BankReserves,
[canvas_element, chart_element],
"Bank Reserves Model",
model_params=model_params,
)
```
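The breakage can be reduced to a few lines (hypothetical stand-in signatures, not actual Mesa code): inserting `port` before `model_params` silently rebinds the fourth positional argument.

```python
# Signature before the change:
def init_v1(model_cls, elements, name="Mesa Model", model_params=None):
    return {"port": None, "model_params": model_params}

# Signature after the change, with `port` inserted before `model_params`:
def init_v2(model_cls, elements, name="Mesa Model", port=None, model_params=None):
    return {"port": port, "model_params": model_params}

params = {"population": 100}
# The same positional call against both signatures:
old = init_v1("BoidFlockers", [], "Boids", params)
new = init_v2("BoidFlockers", [], "Boids", params)
print(old["model_params"])  # {'population': 100}
print(new["model_params"])  # None -- params landed in `port` instead
print(new["port"])          # {'population': 100}
```

Callers that pass `model_params=` by keyword, as `bank_reserves` does, are unaffected.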
| [
{
"content": "\"\"\"\nModularServer\n=============\n\nA visualization server which renders a model via one or more elements.\n\nThe concept for the modular visualization server as follows:\nA visualization is composed of VisualizationElements, each of which defines how\nto generate some visualization from a mod... | [
{
"content": "\"\"\"\nModularServer\n=============\n\nA visualization server which renders a model via one or more elements.\n\nThe concept for the modular visualization server as follows:\nA visualization is composed of VisualizationElements, each of which defines how\nto generate some visualization from a mod... | diff --git a/mesa/visualization/ModularVisualization.py b/mesa/visualization/ModularVisualization.py
index cc6d727bb97..9aa2c4b53dd 100644
--- a/mesa/visualization/ModularVisualization.py
+++ b/mesa/visualization/ModularVisualization.py
@@ -257,8 +257,8 @@ def __init__(
model_cls,
visualization_elements,
name="Mesa Model",
- port=None,
model_params=None,
+ port=None,
):
"""
Args:
|
google__jax-1096 | jaxlib build w/ cuda: File not found during compilation
I'm compiling `jaxlib` with CUDA 10.0 on Ubuntu 18.04. The build fails with the following error:
```
$ python3 build/build.py --enable_cuda --cuda_path /usr/local/cuda-10.0/ --cudnn_path /usr/local/cuda-10.0/ --enable_march_native
[...]
ERROR: /home/clem/.cache/bazel/_bazel_clem/ffaac3f7c6ad1cb26f04f1933452eef6/external/nccl_archive/BUILD.bazel:53:1: error while parsing .d file: /h
ome/clem/.cache/bazel/_bazel_clem/ffaac3f7c6ad1cb26f04f1933452eef6/execroot/__main__/bazel-out/k8-opt/bin/external/nccl_archive/_objs/device_lib/pr
od_i32_reduce_scatter.cu.d (No such file or directory)
nvcc fatal : Could not open input file /tmp/tmpxft_00000004_00000000-6_prod_i32_reduce_scatter.cu.compute_35.cpp1.ii
Target //build:install_xla_in_source_tree failed to build
INFO: Elapsed time: 278.116s, Critical Path: 69.60s
INFO: 1281 processes: 1281 linux-sandbox.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully
Traceback (most recent call last):
File "build/build.py", line 331, in <module>
main()
File "build/build.py", line 326, in main
[":install_xla_in_source_tree", os.getcwd()])
File "build/build.py", line 50, in shell
output = subprocess.check_output(cmd)
File "/usr/lib/python3.6/subprocess.py", line 356, in check_output
**kwargs).stdout
File "/usr/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['./bazel-0.24.1-linux-x86_64', 'run', '--verbose_failures=true', '--config=opt', '--config=mkl_open_source
_only', '--config=cuda', ':install_xla_in_source_tree', '/home/clem/git/jax/build']' returned non-zero exit status 1.
```
Above this error message are only compiler warnings but no errors which could lead to some file not being created. Am I missing something? Or might there be a file name bug? Thanks a lot for your help!
---
I'm on a fresh Ubuntu 18.04.2 install with CUDA 10.0, cudnn and driver version 410.48.
[Full log](http://paste.ubuntu.com/p/tvXBHbr5gw/)
| [
{
"content": "#!/usr/bin/python\n#\n# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unles... | [
{
"content": "#!/usr/bin/python\n#\n# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unles... | diff --git a/build/build.py b/build/build.py
index f2428415c590..32fb7b38d180 100755
--- a/build/build.py
+++ b/build/build.py
@@ -187,6 +187,9 @@ def check_bazel_version(bazel_path, min_version, max_version):
build:cuda --crosstool_top=@local_config_cuda//crosstool:toolchain
build:cuda --define=using_cuda=true --define=using_cuda_nvcc=true
+
+build --spawn_strategy=standalone
+build --strategy=Genrule=standalone
"""
|
Parsl__parsl-534 | Fix import error
```
ImportError: cannot import name 'BashApp' from 'parsl.app.python' (/home/annawoodard/parsl/parsl/app/python.py)
```
It looks like I introduced this bug in 3d0e2d1e69ad27a133b0c40a42472ae43876d5f2.
| [
{
"content": "\"\"\"Definitions for the @App decorator and the App classes.\n\nThe App class encapsulates a generic leaf task that can be executed asynchronously.\n\"\"\"\nimport logging\nfrom inspect import getsource\nfrom hashlib import md5\nfrom inspect import signature\n\nfrom parsl.app.errors import Invali... | [
{
"content": "\"\"\"Definitions for the @App decorator and the App classes.\n\nThe App class encapsulates a generic leaf task that can be executed asynchronously.\n\"\"\"\nimport logging\nfrom inspect import getsource\nfrom hashlib import md5\nfrom inspect import signature\n\nfrom parsl.app.errors import Invali... | diff --git a/parsl/app/app.py b/parsl/app/app.py
index 6c7523f2b6..9b75a8ee88 100644
--- a/parsl/app/app.py
+++ b/parsl/app/app.py
@@ -181,7 +181,7 @@ def bash_app(function=None, data_flow_kernel=None, walltime=60, cache=False, exe
cache : bool
Enable caching of the app call. Default is False.
"""
- from parsl.app.python import BashApp
+ from parsl.app.bash import BashApp
def decorator(func):
def wrapper(f):
|
hylang__hy-161 | LIST-COMP breaks with certain variable names
Try compiling:
```
(list-comp (, i j) (i [-1 0 1] j [-1 0 1]))
```
With hy and you'll get some strange errors. If you replace "i" and "j" with "x" and "y" respectively, the same piece of code works as expected.
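Under the hood this comes down to Python's `complex()` accepting a bare `"j"` as `1j`, so the lexer's numeric-conversion cascade consumes the symbol before it can be treated as a variable. A rough sketch (hypothetical helper, mirroring the guard the fix adds):

```python
def resolve_atom(obj):
    # Try numeric interpretations first, as the lexer does.
    for conv in (int, float):
        try:
            return conv(obj)
        except ValueError:
            pass
    if obj != "j":  # the fix: a bare "j" must remain a symbol
        try:
            return complex(obj)
        except ValueError:
            pass
    return ("symbol", obj)

print(complex("j"))       # 1j -- why the bare symbol was being eaten
print(resolve_atom("j"))  # ('symbol', 'j')
print(resolve_atom("1j")) # 1j -- real complex literals still work
```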
| [
{
"content": "# Copyright (c) 2013 Paul Tagliamonte <paultag@debian.org>\n#\n# Permission is hereby granted, free of charge, to any person obtaining a\n# copy of this software and associated documentation files (the \"Software\"),\n# to deal in the Software without restriction, including without limitation\n# t... | [
{
"content": "# Copyright (c) 2013 Paul Tagliamonte <paultag@debian.org>\n#\n# Permission is hereby granted, free of charge, to any person obtaining a\n# copy of this software and associated documentation files (the \"Software\"),\n# to deal in the Software without restriction, including without limitation\n# t... | diff --git a/hy/lex/states.py b/hy/lex/states.py
index 8c8ffd4df..772f8c6a0 100644
--- a/hy/lex/states.py
+++ b/hy/lex/states.py
@@ -67,10 +67,11 @@ def _resolve_atom(obj):
except ValueError:
pass
- try:
- return HyComplex(obj)
- except ValueError:
- pass
+ if obj != "j":
+ try:
+ return HyComplex(obj)
+ except ValueError:
+ pass
table = {
"true": "True",
diff --git a/tests/lex/test_lex.py b/tests/lex/test_lex.py
index cc3a76072..590e51ee9 100644
--- a/tests/lex/test_lex.py
+++ b/tests/lex/test_lex.py
@@ -230,3 +230,12 @@ def test_hashbang():
""" Ensure we can escape things """
entry = tokenize("#!this is a comment\n")
assert entry == []
+
+
+def test_complex():
+ """Ensure we tokenize complex numbers properly"""
+ # This is a regression test for #143
+ entry = tokenize("(1j)")[0][0]
+ assert entry == HyComplex("1.0j")
+ entry = tokenize("(j)")[0][0]
+ assert entry == HySymbol("j")
diff --git a/tests/native_tests/language.hy b/tests/native_tests/language.hy
index 9722f5b9b..beb61c5ed 100644
--- a/tests/native_tests/language.hy
+++ b/tests/native_tests/language.hy
@@ -420,7 +420,8 @@
(assert (= (sorted (list-comp (* y 2) ((, x y) (.items {"1" 1 "2" 2}))))
[2 4]))
(assert (= (list-comp (, x y) (x (range 2) y (range 2)))
- [(, 0 0) (, 0 1) (, 1 0) (, 1 1)])))
+ [(, 0 0) (, 0 1) (, 1 0) (, 1 1)]))
+ (assert (= (list-comp j (j [1 2])) [1 2])))
(defn test-defn-order []
|
buildbot__buildbot-380 | Update flake8 to version 2.6.
This introduces a few new errors:
- `W503` line break before binary operator
I don't think this is a sensible choice and the codebase doesn't follow this convention.
- `E731` do not assign a lambda expression, use a def
This is used often in tests for functions that return canned values. I think turning them into `def`s obscures that.
- `E402` module level import not at top of file
I've fixed most of the occurrences of this, as they are fairly trivial.
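As a quick illustration of the E731 trade-off described above (hypothetical names):

```python
# flake8 E731 flags the lambda assignment; the argument here is that the
# lambda reads better for canned test values.
get_object_id = lambda: 42        # noqa: E731 -- canned value, lambda style

def get_object_id_def():          # the rewrite E731 demands
    return 42

print(get_object_id(), get_object_id_def())  # 42 42
```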
| [
{
"content": "# This file is part of Buildbot. Buildbot is free software: you can\n# redistribute it and/or modify it under the terms of the GNU General Public\n# License as published by the Free Software Foundation, version 2.\n#\n# This program is distributed in the hope that it will be useful, but WITHOUT\n... | [
{
"content": "# This file is part of Buildbot. Buildbot is free software: you can\n# redistribute it and/or modify it under the terms of the GNU General Public\n# License as published by the Free Software Foundation, version 2.\n#\n# This program is distributed in the hope that it will be useful, but WITHOUT\n... | diff --git a/master/buildbot/buildslave.py b/master/buildbot/buildslave.py
index 6fe79c1cd0bb..29f77df99f6c 100644
--- a/master/buildbot/buildslave.py
+++ b/master/buildbot/buildslave.py
@@ -233,6 +233,8 @@ def reconfigService(self, new_config):
return d
def stopService(self):
+ if self.registration:
+ self.registration.unregister()
self.stopMissingTimer()
return service.MultiService.stopService(self)
diff --git a/master/buildbot/test/fake/fakemaster.py b/master/buildbot/test/fake/fakemaster.py
index f6f10fa9755b..4494513912ca 100644
--- a/master/buildbot/test/fake/fakemaster.py
+++ b/master/buildbot/test/fake/fakemaster.py
@@ -16,6 +16,7 @@
import weakref
from twisted.internet import defer
from buildbot.test.fake import fakedb
+from buildbot.test.fake.pbmanager import FakePBManager
from buildbot import config
import mock
@@ -36,7 +37,7 @@ def mkref(x):
return d
-def make_master(master_id=fakedb.FakeBuildRequestsComponent.MASTER_ID):
+class FakeMaster(mock.Mock):
"""
Create a fake Master instance: a Mock with some convenience
implementations:
@@ -44,15 +45,19 @@ def make_master(master_id=fakedb.FakeBuildRequestsComponent.MASTER_ID):
- Non-caching implementation for C{self.caches}
"""
- fakemaster = mock.Mock(name="fakemaster")
+ def __init__(self, master_id=fakedb.FakeBuildRequestsComponent.MASTER_ID):
+ mock.Mock.__init__(self, name="fakemaster")
+ self._master_id = master_id
+ self.config = config.MasterConfig()
+ self.caches.get_cache = FakeCache
+ self.pbmanager = FakePBManager()
- # set up caches
- fakemaster.caches.get_cache = FakeCache
+ def getObjectId(self):
+ return defer.succeed(self._master_id)
- # and a getObjectId method
- fakemaster.getObjectId = (lambda : defer.succeed(master_id))
+ # work around http://code.google.com/p/mock/issues/detail?id=105
+ def _get_child_mock(self, **kw):
+ return mock.Mock(**kw)
- # and some config - this class's constructor is good enough to trust
- fakemaster.config = config.MasterConfig()
-
- return fakemaster
+# Leave this alias, in case we want to add more behavior later
+make_master = FakeMaster
diff --git a/master/buildbot/test/fake/pbmanager.py b/master/buildbot/test/fake/pbmanager.py
new file mode 100644
index 000000000000..e91b7e5ecdf2
--- /dev/null
+++ b/master/buildbot/test/fake/pbmanager.py
@@ -0,0 +1,47 @@
+# This file is part of Buildbot. Buildbot is free software: you can
+# redistribute it and/or modify it under the terms of the GNU General Public
+# License as published by the Free Software Foundation, version 2.
+#
+# This program is distributed in the hope that it will be useful, but WITHOUT
+# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
+# FOR A PARTICULAR PURPOSE. See the GNU General Public License for more
+# details.
+#
+# You should have received a copy of the GNU General Public License along with
+# this program; if not, write to the Free Software Foundation, Inc., 51
+# Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+#
+# Copyright Buildbot Team Members
+
+from twisted.application import service
+from twisted.internet import defer
+
+class FakePBManager(service.MultiService):
+
+ def __init__(self):
+ service.MultiService.__init__(self)
+ self.setName("fake-pbmanager")
+ self._registrations = []
+ self._unregistrations = []
+
+ def register(self, portstr, username, password, pfactory):
+ if (portstr, username) not in self._registrations:
+ reg = FakeRegistration(self, portstr, username)
+ self._registrations.append((portstr,username,password))
+ return reg
+ else:
+ raise KeyError, ("username '%s' is already registered on port %s"
+ % (username, portstr))
+
+ def _unregister(self, portstr, username):
+ self._unregistrations.append((portstr, username))
+ return defer.succeed(None)
+
+class FakeRegistration(object):
+ def __init__(self, pbmanager, portstr, username):
+ self._portstr = portstr
+ self._username = username
+ self._pbmanager = pbmanager
+
+ def unregister(self):
+ self._pbmanager._unregister(self._portstr, self._username)
diff --git a/master/buildbot/test/unit/test_buildslave.py b/master/buildbot/test/unit/test_buildslave.py
index 75f4d96c3b9d..f4068a7b5c67 100644
--- a/master/buildbot/test/unit/test_buildslave.py
+++ b/master/buildbot/test/unit/test_buildslave.py
@@ -17,7 +17,7 @@
from twisted.trial import unittest
from twisted.internet import defer
from buildbot import buildslave, config
-from buildbot.test.fake import fakemaster
+from buildbot.test.fake import fakemaster, pbmanager
class AbstractBuildSlave(unittest.TestCase):
@@ -68,15 +68,11 @@ def do_test_reconfigService(self, old, old_port, new, new_port):
old.master = master
if old_port:
self.old_registration = old.registration = \
- mock.Mock(name='old_registration')
+ pbmanager.FakeRegistration(master.pbmanager, old_port, old.slavename)
old.registered_port = old_port
old.missing_timer = mock.Mock(name='missing_timer')
old.startService()
- self.new_registration = mock.Mock(name='new_registration')
- master.pbmanager.register = mock.Mock(
- side_effect=lambda *args : self.new_registration)
-
new_config = mock.Mock()
new_config.slavePortnum = new_port
new_config.slaves = [ new ]
@@ -113,7 +109,7 @@ def test_reconfigService_attrs(self):
self.assertEqual(old.missing_timeout, 121)
self.assertEqual(old.properties.getProperty('a'), 'c')
self.assertEqual(old.keepalive_interval, 61)
- self.assertFalse(self.master.pbmanager.register.called)
+ self.assertEqual(self.master.pbmanager._registrations, [])
self.assertTrue(old.updateSlave.called)
@defer.deferredGenerator
@@ -136,7 +132,7 @@ def test_reconfigService_initial_registration(self):
yield wfd
wfd.getResult()
- self.assertTrue(self.master.pbmanager.register.called)
+ self.assertEqual(self.master.pbmanager._registrations, [('tcp:1234', 'bot', 'pass')])
@defer.deferredGenerator
def test_reconfigService_reregister_password(self):
@@ -149,8 +145,8 @@ def test_reconfigService_reregister_password(self):
wfd.getResult()
self.assertEqual(old.password, 'newpass')
- self.assertTrue(self.old_registration.unregister.called)
- self.assertTrue(self.master.pbmanager.register.called)
+ self.assertEqual(self.master.pbmanager._unregistrations, [('tcp:1234', 'bot')])
+ self.assertEqual(self.master.pbmanager._registrations, [('tcp:1234', 'bot', 'newpass')])
@defer.deferredGenerator
def test_reconfigService_reregister_port(self):
@@ -162,8 +158,25 @@ def test_reconfigService_reregister_port(self):
yield wfd
wfd.getResult()
- self.assertTrue(self.old_registration.unregister.called)
- self.assertTrue(self.master.pbmanager.register.called)
+ self.assertEqual(self.master.pbmanager._unregistrations, [('tcp:1234', 'bot')])
+ self.assertEqual(self.master.pbmanager._registrations, [('tcp:5678', 'bot', 'pass')])
+
+ @defer.inlineCallbacks
+ def test_stopService(self):
+ master = self.master = fakemaster.make_master()
+ slave = self.ConcreteBuildSlave('bot', 'pass')
+ slave.master = master
+ slave.startService()
+
+ config = mock.Mock()
+ config.slavePortnum = "tcp:1234"
+ config.slaves = [ slave ]
+
+ yield slave.reconfigService(config)
+ yield slave.stopService()
+
+ self.assertEqual(self.master.pbmanager._unregistrations, [('tcp:1234', 'bot')])
+ self.assertEqual(self.master.pbmanager._registrations, [('tcp:1234', 'bot', 'pass')])
# FIXME: Test that reconfig properly deals with
# 1) locks
|
feast-dev__feast-4085 | Remove numpy <1.25 dependency in setup.py
In setup.py, I can see that the dependency for pandas has already been updated from
"pandas>=1.4.3,<2" (which is still in the current PyPI version) to "pandas>=1.4.3,<3", but numpy hasn't, which will break the installation if I am using e.g. pandas 2.2.1, that requires numpy (>=1.26.0,<2)
## Problem
"numpy>=1.22,<1.25"
## Solution
"numpy>=1.22,<2"
## Steps to reproduce
poetry add git+https://github.com/feast-dev/feast.git
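The conflict can be shown with a naive version check (toy `(major, minor)` tuples, not a real resolver): no numpy release satisfies both `>=1.26,<2` (pandas 2.2.1) and `<1.25` (the old feast pin).

```python
def satisfies(version, lower, upper):
    # naive inclusive-lower / exclusive-upper check on (major, minor) tuples
    return lower <= version < upper

numpy_for_pandas_2 = (1, 26)  # pandas 2.2.1 requires numpy >=1.26,<2
print(satisfies(numpy_for_pandas_2, (1, 22), (1, 25)))  # False: old feast pin
print(satisfies(numpy_for_pandas_2, (1, 22), (2, 0)))   # True: relaxed pin
```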
| [
{
"content": "# Copyright 2019 The Feast Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by a... | [
{
"content": "# Copyright 2019 The Feast Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by a... | diff --git a/setup.py b/setup.py
index f94fb25bb55..b0e9c0c6af4 100644
--- a/setup.py
+++ b/setup.py
@@ -48,7 +48,7 @@
"Jinja2>=2,<4",
"jsonschema",
"mmh3",
- "numpy>=1.22,<1.25",
+ "numpy>=1.22,<2",
"pandas>=1.4.3,<3",
# Higher than 4.23.4 seems to cause a seg fault
"protobuf>=4.24.0,<5.0.0",
|
readthedocs__readthedocs.org-4853 | Confusing error message to end user
In https://github.com/rtfd/readthedocs.org/issues/4071#issuecomment-405939492 I realized that we are saying that we have a problem parsing the YAML file but the problem is in fact in one of the options set from the web admin dashboard.
Example:

There is no `requirements_file` entry in the YAML file (https://github.com/geopandas/geopandas/blob/master/readthedocs.yml) but it exists under the `Admin -> Advanced Settings` field form.
We need to improve this error to something more user-friendly that expresses the real error. It's not an error on parsing the YAML file. The file was parsed properly, but the problem is with one of the values from one of the fields.
| [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"Exceptions raised when building documentation.\"\"\"\n\nfrom __future__ import division, print_function, unicode_literals\n\nfrom django.utils.translation import ugettext_noop\n\n\nclass BuildEnvironmentException(Exception):\n message = None\n status_code = Non... | [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"Exceptions raised when building documentation.\"\"\"\n\nfrom __future__ import division, print_function, unicode_literals\n\nfrom django.utils.translation import ugettext_noop\n\n\nclass BuildEnvironmentException(Exception):\n message = None\n status_code = Non... | diff --git a/readthedocs/doc_builder/exceptions.py b/readthedocs/doc_builder/exceptions.py
index 4897fd41daa..ce2ce3d844b 100644
--- a/readthedocs/doc_builder/exceptions.py
+++ b/readthedocs/doc_builder/exceptions.py
@@ -43,7 +43,7 @@ class ProjectBuildsSkippedError(BuildEnvironmentError):
class YAMLParseError(BuildEnvironmentError):
GENERIC_WITH_PARSE_EXCEPTION = ugettext_noop(
- 'Problem parsing YAML configuration. {exception}',
+ 'Problem in your project\'s configuration. {exception}',
)
|
apache__airflow-9699 | TimeSensor triggers immediately when used over midnight (UTC)
**Apache Airflow version**: 1.10.10 (issue exists in current master as well)
**Environment**: does not seem relevant
**What happened**:
The TimeSensor does trigger if the current time is later than the defined trigger time. Looking at the [source code](https://github.com/apache/airflow/blob/master/airflow/sensors/time_sensor.py), the trigger rule is defined as
```
return timezone.utcnow().time() > self.target_time
```
This leads to problems when the DAG runs over midnight UTC. For example, suppose the following DAG:
```
with DAG('foo',
default_args={'start_date': datetime(2020, 7, 1, tzinfo=pendulum.timezone("Europe/Berlin"))},
schedule_interval="0 0 * * *") as dag:
# in summer, Europe/Berlin is two hours after UTC, hence:
time_04h00_local = TimeSensor(task_id="time_01h30", target_time=time(hour=2, minute=00))
```
This DAG will be triggered at 22:00 UTC. Then, according to the trigger rule:
```
22:00 UTC > 2:00 UTC
```
Hence, the TimeSensor will be triggered immediately.
**What you expected to happen**:
The TimeSensor should trigger at the following day if `target_time < next_execution_date.time()`
**Possible workarounds**:
One can always use the TimeDeltaSensor to achieve similar effects. This does result in code that is not as readable, though.
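The comparison can be reproduced with the standard library alone (a sketch of the check in `poke`, not Airflow code):

```python
from datetime import datetime, time, timezone

target_time = time(hour=2, minute=0)  # 02:00 UTC, intended for the *next* day
now = datetime(2020, 7, 1, 22, 0, tzinfo=timezone.utc)  # DAG starts 22:00 UTC

# The sensor's check: utcnow().time() > target_time
print(now.time() > target_time)  # True -> fires immediately instead of waiting
```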
| [
{
"content": "#\n# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (th... | [
{
"content": "#\n# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (th... | diff --git a/UPDATING.md b/UPDATING.md
index c5097aeb34521..b58eaf1bb9e70 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -1475,6 +1475,12 @@ arguments, please change `store_serialized_dags` to `read_dags_from_db`.
Similarly, if you were using `DagBag().store_serialized_dags` property, change it to
`DagBag().read_dags_from_db`.
+### TimeSensor will consider default_timezone setting.
+
+Previously `TimeSensor` always compared the `target_time` with the current time in UTC.
+
+Now it will compare `target_time` with the current time in the timezone set by `default_timezone` under the `core` section of the config.
+
## Airflow 1.10.11
diff --git a/airflow/sensors/time_sensor.py b/airflow/sensors/time_sensor.py
index 210dc00aad4ca..69feaaefafb2a 100644
--- a/airflow/sensors/time_sensor.py
+++ b/airflow/sensors/time_sensor.py
@@ -36,4 +36,4 @@ def __init__(self, target_time, *args, **kwargs):
def poke(self, context):
self.log.info('Checking if the time (%s) has come', self.target_time)
- return timezone.utcnow().time() > self.target_time
+ return timezone.make_naive(timezone.utcnow()).time() > self.target_time
|