in_source_id | issue | before_files | after_files | pr_diff |
|---|---|---|---|---|
python__mypy-2247 | join_types(UninhabitedType, t) should be t
For some reason `join_simple` has logic to move `UninhabitedType` to the second argument but `join_types` does not.
| [
{
"content": "\"\"\"Calculation of the least upper bound types (joins).\"\"\"\n\nfrom typing import List\n\nfrom mypy.types import (\n Type, AnyType, NoneTyp, Void, TypeVisitor, Instance, UnboundType,\n ErrorType, TypeVarType, CallableType, TupleType, ErasedType, TypeList,\n UnionType, FunctionLike, Ov... | [
{
"content": "\"\"\"Calculation of the least upper bound types (joins).\"\"\"\n\nfrom typing import List\n\nfrom mypy.types import (\n Type, AnyType, NoneTyp, Void, TypeVisitor, Instance, UnboundType,\n ErrorType, TypeVarType, CallableType, TupleType, ErasedType, TypeList,\n UnionType, FunctionLike, Ov... | diff --git a/mypy/join.py b/mypy/join.py
index c6d63331b233..c88ed2b7e58f 100644
--- a/mypy/join.py
+++ b/mypy/join.py
@@ -82,6 +82,9 @@ def join_types(s: Type, t: Type) -> Type:
if isinstance(s, NoneTyp) and not isinstance(t, NoneTyp):
s, t = t, s
+ if isinstance(s, UninhabitedType) and not isinstance(t, UninhabitedType):
+ s, t = t, s
+
# Use a visitor to handle non-trivial cases.
return t.accept(TypeJoinVisitor(s))
|
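The one-line normalization the patch adds can be sketched generically: before dispatching to the visitor, swap the arguments so the bottom type always lands in the second position. Below is a minimal sketch with a stand-in class, not mypy's real type objects or its `TypeJoinVisitor`:

```python
class Uninhabited:
    """Stand-in for mypy's UninhabitedType, the bottom type."""

def join_types(s, t):
    # Mirror the patch: move UninhabitedType into the second argument
    # so the join logic only ever has to handle one ordering.
    if isinstance(s, Uninhabited) and not isinstance(t, Uninhabited):
        s, t = t, s
    # join(t, bottom) == t: the bottom type is the identity of join.
    if isinstance(t, Uninhabited):
        return s
    # ... the real implementation dispatches to a visitor here ...
    return s  # placeholder for non-trivial joins

# With the swap in place the join is symmetric, as the issue requests:
assert join_types(Uninhabited(), "int") == "int"
assert join_types("int", Uninhabited()) == "int"
```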
djangopackages__djangopackages-959 | 🐛 package_updater is missing the `all` argument
**Describe the bug**
The `package_updater` management command is missing the `all` argument. This means we should at least be testing that we can invoke `--help` on this command too.
**To Reproduce**
```
root@web2:~# /usr/bin/docker compose -f /code/djangopackages/docker-compose.prod.yml run --rm django-a python manage.py package_updater
[+] Running 1/0
⠿ Container djangopackages-redis-1 Running 0.0s
Postgres is up - continuing...
Traceback (most recent call last):
File "/app/manage.py", line 10, in <module>
execute_from_command_line(sys.argv)
File "/usr/local/lib/python3.9/site-packages/django/core/management/__init__.py", line 446, in execute_from_command_line
utility.execute()
File "/usr/local/lib/python3.9/site-packages/django/core/management/__init__.py", line 440, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/usr/local/lib/python3.9/site-packages/djclick/adapter.py", line 68, in run_from_argv
exit_code = self.main(
File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1055, in main
rv = self.invoke(ctx)
File "/usr/local/lib/python3.9/site-packages/djclick/adapter.py", line 50, in invoke
return super(DjangoCommandMixin, self).invoke(ctx)
File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1404, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/usr/local/lib/python3.9/site-packages/click/core.py", line 760, in invoke
return __callback(*args, **kwargs)
TypeError: command() missing 1 required positional argument: 'all'
```
| [
{
"content": "import logging\nfrom time import sleep\n\nimport djclick as click\nfrom django.conf import settings\nfrom django.db.models import F\nfrom django.utils import timezone\nfrom github3 import login as github_login\nfrom github3.exceptions import NotFoundError, UnexpectedResponse\nfrom rich import prin... | [
{
"content": "import logging\nfrom time import sleep\n\nimport djclick as click\nfrom django.conf import settings\nfrom django.db.models import F\nfrom django.utils import timezone\nfrom github3 import login as github_login\nfrom github3.exceptions import NotFoundError, UnexpectedResponse\nfrom rich import prin... | diff --git a/package/management/commands/package_updater.py b/package/management/commands/package_updater.py
index 3d97c8c1c..a0f1ac24b 100644
--- a/package/management/commands/package_updater.py
+++ b/package/management/commands/package_updater.py
@@ -24,7 +24,7 @@ def __init__(self, error, title):
@click.command()
@click.option("--limit", default=None, type=int)
-def command(all, limit):
+def command(limit):
"""Updates all the GitHub Packages in the database."""
github = github_login(token=settings.GITHUB_TOKEN)
diff --git a/package/tests/test_package_updater.py b/package/tests/test_package_updater.py
new file mode 100644
index 000000000..4e4d85a96
--- /dev/null
+++ b/package/tests/test_package_updater.py
@@ -0,0 +1,9 @@
+import pytest
+
+from click import exceptions
+from django.core.management import call_command
+
+
+def test_package_updater_command(db):
+ with pytest.raises(exceptions.Exit):
+ call_command("package_updater", "--help")
|
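The crash happens because click invokes the command callback with exactly the parameters declared via `@click.option`/`@click.argument`; a function parameter with no matching declaration is never supplied. A minimal model of that invocation path, in plain Python with no click dependency:

```python
def invoke(callback, **declared_params):
    # click/djclick call the callback with only the params collected
    # from decorator declarations -- nothing fills undeclared params.
    return callback(**declared_params)

def command(all, limit):  # 'all' has no matching @click.option
    return limit

def fixed_command(limit):  # signature matches the declared options
    return limit

try:
    invoke(command, limit=10)
except TypeError as exc:
    print(exc)  # missing 1 required positional argument: 'all'

print(invoke(fixed_command, limit=10))  # → 10
```

This is why the PR simply drops `all` from the signature and adds a smoke test that `--help` can be invoked at all.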
encode__django-rest-framework-510 | Bug : JSON integer won't match integer in a ChoiceField
I have a Model with :
```
PENDING = 1
COMPLETE = 2
CANCELLED = 3
STATUS = (
(PENDING, 'Pending'),
(COMPLETE, 'Complete'),
(CANCELLED, 'Cancelled'),
)
(...)
status = models.PositiveIntegerField(default=COMPLETE, choices=STATUS)
```
And when I perform a PUT (update) on that model (using the default ModelSerializer) with the following JSON :
```
{"id":8,"status":3,"t_type":1,"description":"Transaction example"}
```
I get the following error message :
```
"status" : "Select a valid choice. 3 is not one of the available choices."
```
Which it clearly is.
| [
{
"content": "import copy\nimport datetime\nimport inspect\nimport re\nimport warnings\n\nfrom io import BytesIO\n\nfrom django.core import validators\nfrom django.core.exceptions import ObjectDoesNotExist, ValidationError\nfrom django.core.urlresolvers import resolve, get_script_prefix\nfrom django.conf import... | [
{
"content": "import copy\nimport datetime\nimport inspect\nimport re\nimport warnings\n\nfrom io import BytesIO\n\nfrom django.core import validators\nfrom django.core.exceptions import ObjectDoesNotExist, ValidationError\nfrom django.core.urlresolvers import resolve, get_script_prefix\nfrom django.conf import... | diff --git a/rest_framework/fields.py b/rest_framework/fields.py
index da588082c9..903c384e36 100644
--- a/rest_framework/fields.py
+++ b/rest_framework/fields.py
@@ -794,7 +794,7 @@ def valid_value(self, value):
if value == smart_unicode(k2):
return True
else:
- if value == smart_unicode(k):
+ if value == smart_unicode(k) or value == k:
return True
return False
diff --git a/rest_framework/tests/models.py b/rest_framework/tests/models.py
index 428bf130d0..807bcf9832 100644
--- a/rest_framework/tests/models.py
+++ b/rest_framework/tests/models.py
@@ -51,6 +51,10 @@ class Meta:
abstract = True
+class HasPositiveIntegerAsChoice(RESTFrameworkModel):
+ some_choices = ((1,'A'),(2,'B'),(3,'C'))
+ some_integer = models.PositiveIntegerField(choices=some_choices)
+
class Anchor(RESTFrameworkModel):
text = models.CharField(max_length=100, default='anchor')
diff --git a/rest_framework/tests/serializer.py b/rest_framework/tests/serializer.py
index 780177aa0c..7f2c27b05a 100644
--- a/rest_framework/tests/serializer.py
+++ b/rest_framework/tests/serializer.py
@@ -2,7 +2,7 @@
import pickle
from django.test import TestCase
from rest_framework import serializers
-from rest_framework.tests.models import (Album, ActionItem, Anchor, BasicModel,
+from rest_framework.tests.models import (HasPositiveIntegerAsChoice, Album, ActionItem, Anchor, BasicModel,
BlankFieldModel, BlogPost, Book, CallableDefaultValueModel, DefaultValueModel,
ManyToManyModel, Person, ReadOnlyManyToManyModel, Photo)
@@ -69,6 +69,11 @@ class Meta:
model = Album
fields = ['title'] # lists are also valid options
+class PositiveIntegerAsChoiceSerializer(serializers.ModelSerializer):
+ class Meta:
+ model = HasPositiveIntegerAsChoice
+ fields = ['some_integer']
+
class BasicTests(TestCase):
def setUp(self):
@@ -285,6 +290,12 @@ def test_default_modelfield_max_length_exceeded(self):
self.assertEquals(serializer.errors, {'info': [u'Ensure this value has at most 12 characters (it has 13).']})
+class PositiveIntegerAsChoiceTests(TestCase):
+ def test_positive_integer_in_json_is_correctly_parsed(self):
+ data = {'some_integer':1}
+ serializer = PositiveIntegerAsChoiceSerializer(data=data)
+ self.assertEquals(serializer.is_valid(), True)
+
class ModelValidationTests(TestCase):
def test_validate_unique(self):
"""
|
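The one-line fix compares the incoming value against both the stringified choice key and the raw key, so an integer arriving from JSON matches an integer choice. A standalone sketch of the validation loop (using `str` in place of Django's `smart_unicode`):

```python
def valid_value(value, choices):
    """Check that a submitted value is one of the configured choices."""
    for k, v in choices:
        if isinstance(v, (list, tuple)):
            # Grouped choices: ((group_label, ((k2, v2), ...)), ...)
            for k2, _v2 in v:
                if value == str(k2) or value == k2:
                    return True
        else:
            # Compare against both the coerced string and the raw key,
            # so a JSON integer matches an integer choice key.
            if value == str(k) or value == k:
                return True
    return False

STATUS = ((1, 'Pending'), (2, 'Complete'), (3, 'Cancelled'))
assert valid_value(3, STATUS)    # integer from JSON now matches
assert valid_value('3', STATUS)  # string form still matches
```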
facebookresearch__ParlAI-1821 | Obsolete download link for CLEVR Dataset
Apparently, the current link to CLEVR in the source code is "https://s3-us-west-1.amazonaws.com/clevr/CLEVR_v1.0.zip" that returns the message "All access to this object has been disabled"
When I try to execute the following line of code
`!python ~/ParlAI/examples/display_data.py -t clevr`
I obtain
```
[creating task(s): clevr]
[building data: /root/ParlAI/data/CLEVR]
[ downloading: https://s3-us-west-1.amazonaws.com/clevr/CLEVR_v1.0.zip to /root/ParlAI/data/CLEVR/CLEVR_v1.0.zip ]
Downloading CLEVR_v1.0.zip: 0.00B [00:00, ?B/s]
unpacking CLEVR_v1.0.zip
Traceback (most recent call last):
File "/root/ParlAI/parlai/core/agents.py", line 819, in _create_task_agents
task_agents = my_module.create_agent(opt)
AttributeError: module 'parlai.tasks.clevr.agents' has no attribute 'create_agent'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/root/ParlAI/examples/display_data.py", line 22, in <module>
display_data(opt)
File "/root/ParlAI/parlai/scripts/display_data.py", line 42, in display_data
world = create_task(opt, agent)
File "/root/ParlAI/parlai/core/worlds.py", line 1151, in create_task
world = create_task_world(opt, user_agents, default_world=default_world)
File "/root/ParlAI/parlai/core/worlds.py", line 1108, in create_task_world
opt, user_agents, default_world=default_world
File "/root/ParlAI/parlai/core/worlds.py", line 1068, in _get_task_world
task_agents = _create_task_agents(opt)
File "/root/ParlAI/parlai/core/agents.py", line 822, in _create_task_agents
return create_task_agent_from_taskname(opt)
File "/root/ParlAI/parlai/core/agents.py", line 776, in create_task_agent_from_taskname
task_agents = teacher_class(opt)
File "/root/ParlAI/parlai/tasks/clevr/agents.py", line 45, in __init__
data_path, self.images_path = _path(opt)
File "/root/ParlAI/parlai/tasks/clevr/agents.py", line 15, in _path
build(opt)
File "/root/ParlAI/parlai/tasks/clevr/build.py", line 28, in build
build_data.untar(dpath, fname)
File "/root/ParlAI/parlai/core/build_data.py", line 180, in untar
shutil.unpack_archive(fullpath, path)
File "/usr/lib/python3.6/shutil.py", line 983, in unpack_archive
func(filename, extract_dir, **kwargs)
File "/usr/lib/python3.6/shutil.py", line 883, in _unpack_zipfile
raise ReadError("%s is not a zip file" % filename)
shutil.ReadError: /root/ParlAI/data/CLEVR/CLEVR_v1.0.zip is not a zip file
```
I found the following working link on CLEVR webpage (https://cs.stanford.edu/people/jcjohns/clevr/):
https://dl.fbaipublicfiles.com/clevr/CLEVR_v1.0.zip
| [
{
"content": "#!/usr/bin/env python3\n\n# Copyright (c) Facebook, Inc. and its affiliates.\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\n# Download and build the data if it does not exist.\n\nimport parlai.core.build_data as build_... | [
{
"content": "#!/usr/bin/env python3\n\n# Copyright (c) Facebook, Inc. and its affiliates.\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\n# Download and build the data if it does not exist.\n\nimport parlai.core.build_data as build_... | diff --git a/parlai/tasks/clevr/build.py b/parlai/tasks/clevr/build.py
index 39b70209252..806c9fcf32b 100644
--- a/parlai/tasks/clevr/build.py
+++ b/parlai/tasks/clevr/build.py
@@ -22,7 +22,7 @@ def build(opt):
# Download the data.
fname = 'CLEVR_v1.0.zip'
- url = 'https://s3-us-west-1.amazonaws.com/clevr/'
+ url = 'https://dl.fbaipublicfiles.com/clevr/'
build_data.download(url + fname, dpath, fname)
build_data.untar(dpath, fname)
|
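The confusing `shutil.ReadError` arises because the download step saved the S3 "access disabled" error body to disk, and the unpack step then tried to treat it as a zip archive. A cheap guard is to validate the archive before unpacking; `untar_checked` here is a hypothetical helper, not ParlAI's actual `build_data` API:

```python
import os
import tempfile
import zipfile

def untar_checked(path):
    # Guard against the failure mode above: the "download" was actually
    # an error page saved to disk, not a zip archive.
    if not zipfile.is_zipfile(path):
        raise RuntimeError(
            f"{path} is not a zip file; the download URL may be stale"
        )
    # shutil.unpack_archive(path) would run here on a real archive

# Simulate a stale URL: the saved file contains an error body, not a zip.
fd, bad = tempfile.mkstemp(suffix=".zip")
os.write(fd, b"<Error>All access to this object has been disabled</Error>")
os.close(fd)
try:
    untar_checked(bad)
except RuntimeError as exc:
    print(exc)
finally:
    os.remove(bad)
```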
pypi__warehouse-13060 | OIDC publishers should be manageable within the admin app
Breakout of #11296: PyPI's admins should be able to administrate OIDC publishers (both full and "pending") from within the admin app/views.
| [
{
"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, softw... | [
{
"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, softw... | diff --git a/tests/unit/admin/views/test_projects.py b/tests/unit/admin/views/test_projects.py
index 826b9ac4837e..25eabbb6c199 100644
--- a/tests/unit/admin/views/test_projects.py
+++ b/tests/unit/admin/views/test_projects.py
@@ -19,6 +19,7 @@
from pyramid.httpexceptions import HTTPBadRequest, HTTPMovedPermanently, HTTPSeeOther
+from tests.common.db.oidc import GitHubPublisherFactory
from warehouse.admin.views import projects as views
from warehouse.packaging.models import Project, Role
from warehouse.search.tasks import reindex_project
@@ -84,6 +85,7 @@ def test_gets_project(self, db_request):
[RoleFactory(project=project) for _ in range(5)],
key=lambda x: (x.role_name, x.user.username),
)
+ oidc_publishers = [GitHubPublisherFactory(projects=[project]) for _ in range(5)]
db_request.matchdict["project_name"] = str(project.normalized_name)
result = views.project_detail(project, db_request)
@@ -92,6 +94,7 @@ def test_gets_project(self, db_request):
"releases": [],
"maintainers": roles,
"journal": journals[:30],
+ "oidc_publishers": oidc_publishers,
"ONE_MB": views.ONE_MB,
"MAX_FILESIZE": views.MAX_FILESIZE,
"MAX_PROJECT_SIZE": views.MAX_PROJECT_SIZE,
diff --git a/warehouse/admin/templates/admin/projects/detail.html b/warehouse/admin/templates/admin/projects/detail.html
index ab7c1433d824..26620eb89597 100644
--- a/warehouse/admin/templates/admin/projects/detail.html
+++ b/warehouse/admin/templates/admin/projects/detail.html
@@ -233,6 +233,36 @@ <h4 class="modal-title" id="exampleModalLabel">Remove role for {{ role.user.user
</div>
</div> <!-- .card #releases -->
+{% if oidc_publishers %}
+<div class="card card-info" id="oidc-publishers">
+ <div class="card-header">OpenID Connect Publishers</div>
+ <div class="card-body">
+ <div class="table-responsive p-0">
+ <table class="table table-hover table-striped">
+ <thead>
+ <tr>
+ <th>Publisher name</th>
+ <th>URL</th>
+ <th>repr</th>
+ </tr>
+ <tbody>
+ {% for pub in oidc_publishers %}
+ <tr>
+ <td>{{ pub.publisher_name }}</td>
+ <td><a href="{{ pub.publisher_url }}">{{ pub.publisher_url }}</a></td>
+ <td><code>{{ pub }}</code></td>
+ </tr>
+ {% endfor %}
+ </tbody>
+ </thead>
+ </table>
+ </div>
+ </div>
+</div> <!-- .card #oidc-publishers -->
+{% else %}
+No publishers configured.
+{% endif %}
+
<div class="card card-primary card-outline collapsed-card" id="journals">
<div class="card-header">
<h3 class="card-title">Journals</h3>
diff --git a/warehouse/admin/templates/admin/users/detail.html b/warehouse/admin/templates/admin/users/detail.html
index 7e34d0715fd0..db3930a8a11f 100644
--- a/warehouse/admin/templates/admin/users/detail.html
+++ b/warehouse/admin/templates/admin/users/detail.html
@@ -411,6 +411,39 @@ <h3 class="card-title">Projects</h3>
</div>
</div> <!-- .card -->
+ {% if user.pending_oidc_publishers %}
+ <div class="card">
+ <div class="card-header with-border">
+ <h3 class="card-title">Pending OpenID Connect Publishers</h3>
+ </div>
+
+ <div class="card-body">
+ <table class="table table-hover" id="pending-oidc-publishers">
+ <thead>
+ <tr>
+ <th scope="col">Project name</th>
+ <th scope="col">Publisher name</th>
+ <th scope="col">URL</th>
+ <th scope="col">repr</th>
+ </tr>
+ </thead>
+ <tbody>
+ {% for pub in user.pending_oidc_publishers %}
+ <tr>
+ <td>{{ pub.project_name }}</td>
+ <td>{{ pub.publisher_name }}</td>
+ <td><a href="{{ pub.publisher_url }}">{{ pub.publisher_url }}</a></td>
+ <td><code>{{ pub }}</code></td>
+ </tr>
+ {% endfor %}
+ </tbody>
+ </table>
+ </div>
+ </div> <!-- .card -->
+ {% else %}
+ No publishers configured.
+ {% endif %}
+
<div class="card">
<div class="card-header with-border">
<h3 class="card-title">Account activity</h3>
diff --git a/warehouse/admin/views/projects.py b/warehouse/admin/views/projects.py
index 625756550c24..8d76a35cf7ba 100644
--- a/warehouse/admin/views/projects.py
+++ b/warehouse/admin/views/projects.py
@@ -129,6 +129,7 @@ def project_detail(project, request):
"releases": releases,
"maintainers": maintainers,
"journal": journal,
+ "oidc_publishers": project.oidc_publishers,
"ONE_MB": ONE_MB,
"MAX_FILESIZE": MAX_FILESIZE,
"ONE_GB": ONE_GB,
|
ethereum__web3.py-3228 | Add API to iterate through all events in a contract
### What was wrong?
No easy way to get all events for a contract (while still parsing the results). See [this StackExchange question](https://ethereum.stackexchange.com/questions/54473/how-to-read-allevents-using-python-web3-theres-capability-in-web3-js). One option is to iterate over all the events, but it's a bit awkward right now. I think the easiest way is:
```py
from web3.contract import ContractEvent
filters = [
event.createFilter(fromBlock='latest')
for event in myContract.events
if isinstance(event, ContractEvent)
]
```
### How can it be fixed?
Some options:
- Implement `__iter__` on `Contract.events` to iterate through all events in the ABI (my favorite option, except that it's inconsistent with `contract.functions`, which is doing the wrong thing IMO)
- Add a new `Contract.all_events()` equivalent to `Contract.all_functions()`
Then the example changes to:
```py
filters = [
event.createFilter(fromBlock='latest')
for event in myContract.events
]
```
---
Of course, we could also implement `contract.create_filter()` like web3.js's `contract.events.allEvents`. I kind of like that the filters are event specific right now, though. I don't think it's too big a deal to require callers to write a filter loop on events.
| [
{
"content": "import copy\nfrom typing import (\n TYPE_CHECKING,\n Any,\n Callable,\n Dict,\n Iterable,\n List,\n Optional,\n Sequence,\n Type,\n cast,\n)\n\nfrom eth_typing import (\n ChecksumAddress,\n)\nfrom eth_utils import (\n combomethod,\n)\nfrom eth_utils.toolz import... | [
{
"content": "import copy\nfrom typing import (\n TYPE_CHECKING,\n Any,\n Callable,\n Dict,\n Iterable,\n List,\n Optional,\n Sequence,\n Type,\n cast,\n)\n\nfrom eth_typing import (\n ChecksumAddress,\n)\nfrom eth_utils import (\n combomethod,\n)\nfrom eth_utils.toolz import... | diff --git a/docs/web3.contract.rst b/docs/web3.contract.rst
index ee836ca349..f5082e29bb 100644
--- a/docs/web3.contract.rst
+++ b/docs/web3.contract.rst
@@ -944,6 +944,8 @@ For example:
Fetches all logs for a given event within the specified block range or block hash.
+ Returns a list of decoded event logs sorted by ``logIndex``.
+
``argument_filters`` is an optional dictionary argument that can be used to filter
for logs where the event's argument values match the values provided in the
dictionary. The keys must match the event argument names as they exist in the ABI.
diff --git a/newsfragments/3228.feature.rst b/newsfragments/3228.feature.rst
new file mode 100644
index 0000000000..8a9b0a9813
--- /dev/null
+++ b/newsfragments/3228.feature.rst
@@ -0,0 +1 @@
+Contract event ``get_logs`` results sorted by each ``ContractEvent`` ``logIndex``.
\ No newline at end of file
diff --git a/tests/conftest.py b/tests/conftest.py
index 6a4b502867..730ddb0053 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -46,9 +46,10 @@ def emitter_contract_data():
return EMITTER_CONTRACT_DATA
+# This class defines events for the EmitterContract and are used to construct
+# a fixture for contract event logs. Parameterized tests that utilize an `emitter`
+# contract fixture will use this data.
class LogFunctions:
- # These appear to be for a very specific test and this doesn't need to be updated
- # for every event in the emitter contract. That ends up breaking that test.
LogAnonymous = 0
LogNoArguments = 1
LogSingleArg = 2
@@ -74,6 +75,9 @@ def emitter_contract_event_ids():
return LogFunctions
+# This class defines topics for the EmitterContract and are used to construct
+# a fixture for contract event log topics. Parameterized tests that utilize
+# an `emitter` contract fixture will use this data.
class LogTopics:
LogAnonymous = event_signature_to_log_topic("LogAnonymous()")
LogNoArguments = event_signature_to_log_topic("LogNoArguments()")
diff --git a/tests/core/contracts/test_extracting_event_data.py b/tests/core/contracts/test_extracting_event_data.py
index 07e113dbf4..630a2c0471 100644
--- a/tests/core/contracts/test_extracting_event_data.py
+++ b/tests/core/contracts/test_extracting_event_data.py
@@ -331,6 +331,78 @@ def test_argument_extraction_strict_bytes_types(
assert event_data["event"] == "LogListArgs"
+def test_contract_event_get_logs_sorted_by_log_index(w3, emitter, request_mocker):
+ get_logs_response = [
+ {
+ "type": "mined",
+ "logIndex": 10,
+ "transactionIndex": 0,
+ "transactionHash": "0xaef7f312d863780b861d8c38984b2a33f77e9508810735e2b042143f7f189f83", # noqa: E501
+ "blockHash": "0x2200ec3324fdaca4ee2f4629489d2d06fb28108dae61b63b84ef39702e2b64e7", # noqa: E501
+ "blockNumber": 3,
+ "address": "0xF2E246BB76DF876Cef8b38ae84130F4F55De395b",
+ "data": "0x",
+ "topics": [
+ "0x1e86022f78f8d04f8e3dfd13a2bdb280403e6632877c0dbee5e4eeb259908a5c"
+ ],
+ },
+ {
+ "type": "mined",
+ "logIndex": 0,
+ "transactionIndex": 0,
+ "transactionHash": "0x61e57bb1b5af14ca1b0964a84fb640bf39927961f26311a6450475a749e00cbb", # noqa: E501
+ "blockHash": "0x73dd9a3b0f581689ebd67adea0debe05672a334c723379dc506fb71a666c1754", # noqa: E501
+ "blockNumber": 4,
+ "address": "0xF2E246BB76DF876Cef8b38ae84130F4F55De395b",
+ "data": "0x",
+ "topics": [
+ "0x1e86022f78f8d04f8e3dfd13a2bdb280403e6632877c0dbee5e4eeb259908a5c"
+ ],
+ },
+ {
+ "type": "mined",
+ "logIndex": 123,
+ "transactionIndex": 0,
+ "transactionHash": "0x61e57bb1b5af14ca1b0964a84fb640bf39927961f26311a6450475a749e00cbb", # noqa: E501
+ "blockHash": "0x73dd9a3b0f581689ebd67adea0debe05672a334c723379dc506fb71a666c1754", # noqa: E501
+ "blockNumber": 1,
+ "address": "0xF2E246BB76DF876Cef8b38ae84130F4F55De395b",
+ "data": "0x",
+ "topics": [
+ "0x1e86022f78f8d04f8e3dfd13a2bdb280403e6632877c0dbee5e4eeb259908a5c"
+ ],
+ },
+ {
+ "type": "mined",
+ "logIndex": 54,
+ "transactionIndex": 0,
+ "transactionHash": "0x61e57bb1b5af14ca1b0964a84fb640bf39927961f26311a6450475a749e00cbb", # noqa: E501
+ "blockHash": "0x73dd9a3b0f581689ebd67adea0debe05672a334c723379dc506fb71a666c1754", # noqa: E501
+ "blockNumber": 1,
+ "address": "0xF2E246BB76DF876Cef8b38ae84130F4F55De395b",
+ "data": "0x",
+ "topics": [
+ "0x1e86022f78f8d04f8e3dfd13a2bdb280403e6632877c0dbee5e4eeb259908a5c"
+ ],
+ },
+ ]
+
+ with request_mocker(w3, mock_results={"eth_getLogs": get_logs_response}):
+ logs = emitter.events.LogNoArguments().get_logs()
+
+ sorted_logs = sorted(
+ emitter.events.LogNoArguments().get_logs(),
+ key=lambda l: l["logIndex"],
+ )
+ sorted_logs = sorted(
+ emitter.events.LogNoArguments().get_logs(),
+ key=lambda l: l["blockNumber"],
+ )
+
+ assert len(logs) == 4
+ assert logs == sorted_logs
+
+
@pytest.mark.parametrize(
"contract_fn,event_name,call_args,expected_args,warning_msg,process_receipt",
(
diff --git a/web3/contract/contract.py b/web3/contract/contract.py
index c9e4ac8a7c..75dbc34c4d 100644
--- a/web3/contract/contract.py
+++ b/web3/contract/contract.py
@@ -192,7 +192,9 @@ def get_logs(
all_event_logs,
argument_filters,
)
- return filtered_logs
+ sorted_logs = sorted(filtered_logs, key=lambda e: e["logIndex"])
+ sorted_logs = sorted(sorted_logs, key=lambda e: e["blockNumber"])
+ return sorted_logs
@combomethod
def create_filter(
|
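The two-pass sort in the patch relies on Python's sort being stable: ordering by `logIndex` first and then by `blockNumber` yields logs grouped by block, with `logIndex` breaking ties within each block. A standalone sketch with toy log dicts:

```python
logs = [
    {"blockNumber": 3, "logIndex": 10},
    {"blockNumber": 4, "logIndex": 0},
    {"blockNumber": 1, "logIndex": 123},
    {"blockNumber": 1, "logIndex": 54},
]

# First pass orders by logIndex; the second, stable pass orders by
# blockNumber while preserving the logIndex order within each block.
ordered = sorted(logs, key=lambda e: e["logIndex"])
ordered = sorted(ordered, key=lambda e: e["blockNumber"])

print([(e["blockNumber"], e["logIndex"]) for e in ordered])
# → [(1, 54), (1, 123), (3, 10), (4, 0)]
```

A single pass with a composite key, `sorted(logs, key=lambda e: (e["blockNumber"], e["logIndex"]))`, produces the same ordering.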
saleor__saleor-1389 | Add robots meta tag and "nofollow" link attribute
1. Fragile pages should not be indexed by search engines.
```
<meta name="robots" content="nofollow, noindex">
```
- [x] Add above meta tag to order's confirmation page
2. Pages that bring little to no content value should not be crawled
```
<meta name="robots" content="nofollow">
```
- [x] Add above meta tag to sign in/sign up/cart pages
3. Add link attribute
- [x] Links pointing to above pages should have set attribute `rel="nofollow"`
| [
{
"content": "from __future__ import unicode_literals\n\nfrom django.template.response import TemplateResponse\nfrom django.contrib import messages\nfrom django.conf import settings\nfrom django.utils.translation import pgettext_lazy\nfrom impersonate.views import impersonate as orig_impersonate\n\nfrom ..dashb... | [
{
"content": "from __future__ import unicode_literals\n\nfrom django.template.response import TemplateResponse\nfrom django.contrib import messages\nfrom django.utils.translation import pgettext_lazy\nfrom impersonate.views import impersonate as orig_impersonate\n\nfrom ..dashboard.views import staff_member_req... | diff --git a/saleor/core/views.py b/saleor/core/views.py
index d08fb5f9a1e..90e13056d3e 100644
--- a/saleor/core/views.py
+++ b/saleor/core/views.py
@@ -2,7 +2,6 @@
from django.template.response import TemplateResponse
from django.contrib import messages
-from django.conf import settings
from django.utils.translation import pgettext_lazy
from impersonate.views import impersonate as orig_impersonate
diff --git a/templates/account/login.html b/templates/account/login.html
index bd6af86ce21..337126d0541 100644
--- a/templates/account/login.html
+++ b/templates/account/login.html
@@ -4,6 +4,10 @@
{% block title %}{% trans "Log in" context "Login page title" %} — {{ block.super }}{% endblock %}
+{% block meta_tags %}
+ <meta name="robots" content="nofollow">
+{% endblock meta_tags %}
+
{% block content %}
<div class="col-lg-10 col-sm-12 m-auto">
@@ -13,7 +17,7 @@
<h3>{% trans "Don't have an account yet?" context "Login form secondary title" %}</h3>
<img src="{% static 'images/pirate_login.png' %}"
srcset="{% static 'images/pirate_login.png' %} 1x, {% static 'images/pirate_login2x.png' %} 2x">
- <a href="{% url 'account_signup' %}" class="btn secondary narrow">
+ <a rel="nofollow" href="{% url 'account_signup' %}" class="btn secondary narrow">
{% trans "Register" context "Login form secondary action" %}
</a>
</div>
diff --git a/templates/account/partials/login_form.html b/templates/account/partials/login_form.html
index c6a5638e963..08e38a58e15 100644
--- a/templates/account/partials/login_form.html
+++ b/templates/account/partials/login_form.html
@@ -16,7 +16,7 @@
<button class="btn primary narrow">
{% trans "Log in" context "Login form primary action" %}
</button>
- <a class="link--styled" href="{% url 'account_reset_password' %}">
+ <a rel="nofollow" class="link--styled" href="{% url 'account_reset_password' %}">
{% trans "Forgot password?" context "Login form secondary link" %}
</a>
{% with available_backends=settings.available_backends %}
diff --git a/templates/account/password_reset.html b/templates/account/password_reset.html
index faaff4da92e..3c127b539d2 100644
--- a/templates/account/password_reset.html
+++ b/templates/account/password_reset.html
@@ -5,6 +5,10 @@
{% block title %}{% trans "Password reset" context "Password reset page title" %} — {{ block.super }}{% endblock %}
+{% block meta_tags %}
+ <meta name="robots" content="nofollow">
+{% endblock meta_tags %}
+
{% block content %}
<div class="row login__forgot-password">
<div class="col-md-8 m-auto text-center">
diff --git a/templates/account/signup.html b/templates/account/signup.html
index e859ed2e256..7e479cb1022 100644
--- a/templates/account/signup.html
+++ b/templates/account/signup.html
@@ -5,6 +5,10 @@
{% block title %}{% trans "Sign Up" context "Signup page title" %} — {{ block.super }}{% endblock %}
+{% block meta_tags %}
+ <meta name="robots" content="nofollow">
+{% endblock meta_tags %}
+
{% block content %}
<div class="col-lg-10 offset-lg-1 col-sm-12">
<div class="row login">
@@ -13,7 +17,7 @@
<h3>{% trans "Already have an account?" context "Signup form secondary title" %}</h3>
<img class="signup-img" src="{% static 'images/pirate_login.png' %}"
srcset="{% static 'images/pirate_login.png' %} 1x, {% static 'images/pirate_login2x.png' %} 2x">
- <p><a href="{% url 'account_login' %}" class="btn secondary narrow">
+ <p><a rel="nofollow" href="{% url 'account_login' %}" class="btn secondary narrow">
{% trans "Log in" context "Signup form secondary action" %}
</a></p>
</div>
diff --git a/templates/base.html b/templates/base.html
index 046e403ce99..d21157bb8de 100644
--- a/templates/base.html
+++ b/templates/base.html
@@ -18,6 +18,7 @@
{% render_bundle 'storefront' 'css' %}
{% block stylesheet %}{% endblock stylesheet %}
+ {% block meta_tags %}{% endblock meta_tags %}
<!-- Le HTML5 shim, for IE6-8 support of HTML5 elements -->
<!--[if lt IE 9]>
@@ -62,11 +63,11 @@
{% endif %}
{% else %}
<li>
- <a href="{% url "account_signup" %}">
+ <a rel="nofollow" href="{% url "account_signup" %}">
{% trans "Register" context "Main navigation item" %}</a>
</li>
<li>
- <a href="{% url "account_login" %}">
+ <a rel="nofollow" href="{% url "account_login" %}">
{% trans "Log in" context "Main navigation item" %}
</a>
</li>
@@ -108,7 +109,7 @@
</div>
<div class="col-2 col-md-4">
<div class="navbar__brand__cart float-right">
- <a class="cart__icon" href="{% url "cart:index" %}">
+ <a rel="nofollow" class="cart__icon" href="{% url "cart:index" %}">
<span class="cart-label d-none d-md-inline-block">
{% trans "Your Cart" context "Main navigation item" %}
</span>
@@ -184,7 +185,7 @@
<div class="col-md-3 col-sm-6">
<ul>
<li>
- <a href="{% url "cart:index" %}">
+ <a rel="nofollow" href="{% url "cart:index" %}">
{% trans "Your Cart" context "Main navigation item" %}
</a>
</li>
@@ -220,12 +221,12 @@
{% endif %}
{% else %}
<li>
- <a href="{% url "account_signup" %}">
+ <a rel="nofollow" href="{% url "account_signup" %}">
{% trans "Register" context "Main navigation item" %}
</a>
</li>
<li>
- <a href="{% url "account_login" %}">
+ <a rel="nofollow" href="{% url "account_login" %}">
{% trans "Log in" context "Main navigation item" %}
</a>
</li>
diff --git a/templates/cart/index.html b/templates/cart/index.html
index 177c66a2a59..0700648972f 100644
--- a/templates/cart/index.html
+++ b/templates/cart/index.html
@@ -10,10 +10,14 @@
{% block breadcrumb %}
<ul class="breadcrumbs list-unstyled">
<li><a href="/">{% trans "Home" context "Main navigation item" %}</a></li>
- <li><a href="{% url 'cart:index' %}">{% trans "Cart" context "Cart breadcrumb" %}</a></li>
+ <li><a rel="nofollow" href="{% url 'cart:index' %}">{% trans "Cart" context "Cart breadcrumb" %}</a></li>
</ul>
{% endblock breadcrumb %}
+{% block meta_tags %}
+ <meta name="robots" content="nofollow">
+{% endblock meta_tags %}
+
{% block content %}
<div class="alert alert-success d-block d-sm-none remove-product-alert">
{% trans "Product has been removed from cart" context "Cart message" %}
diff --git a/templates/order/details.html b/templates/order/details.html
index 8cf5c05a910..38a6a6ad6c0 100644
--- a/templates/order/details.html
+++ b/templates/order/details.html
@@ -27,6 +27,10 @@
{% endif %}
{% endblock breadcrumb %}
+{% block meta_tags %}
+ <meta name="robots" content="noindex, nofollow">
+{% endblock meta_tags %}
+
{% block content %}
{# This view is available by just knowing url, #}
{# so we don't show all details (like delivery address) #}
|
e-valuation__EvaP-1666 | Make Typescript code Prettier
We should add automated formatting for our typescript files. I think https://prettier.io/ is pretty good, but the choice is open for discussion. The formatting should be done in `manage.py format` and be checked in CI.
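A management-command formatter like the one discussed here is essentially a thin subprocess wrapper. A minimal stand-alone sketch (the `npx prettier` invocation and target path are taken from this discussion; the skip-if-missing guard is an added convenience, not part of the project's actual command):

```python
import shutil
import subprocess

def run_formatters(commands):
    """Run each formatter command, skipping tools that are not installed."""
    results = {}
    for cmd in commands:
        if shutil.which(cmd[0]) is None:
            # Tool missing locally; report it instead of crashing.
            results[cmd[0]] = "skipped (not installed)"
            continue
        proc = subprocess.run(cmd, check=False)
        results[cmd[0]] = proc.returncode
    return results

# Hypothetical invocation mirroring a `manage.py format` handler:
run_formatters([["npx", "prettier", "--write", "evap/static/ts/src"]])
```

Using `check=False` matches the pattern in the patch below: a formatting failure is reported by the tool itself rather than raising `CalledProcessError`.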
| [
{
"content": "import subprocess # nosec\n\nfrom django.core.management.base import BaseCommand\n\n\nclass Command(BaseCommand):\n args = \"\"\n help = \"Runs the code formatter\"\n requires_migrations_checks = False\n\n def handle(self, *args, **options):\n subprocess.run([\"black\", \"evap\... | [
{
"content": "import subprocess # nosec\n\nfrom django.core.management.base import BaseCommand\n\n\nclass Command(BaseCommand):\n args = \"\"\n help = \"Runs the code formatter\"\n requires_migrations_checks = False\n\n def handle(self, *args, **options):\n subprocess.run([\"black\", \"evap\... | diff --git a/.github/workflows/tests.yml b/.github/workflows/tests.yml
index 6f3f6abb19..544304bc01 100644
--- a/.github/workflows/tests.yml
+++ b/.github/workflows/tests.yml
@@ -69,22 +69,29 @@ jobs:
formatter:
runs-on: ubuntu-18.04
- container:
- image: python:3.7
-
name: Formatting
steps:
- name: Check out repository code
uses: actions/checkout@v2
- - name: Install dependencies
+ - uses: actions/setup-python@v2
+ with:
+ python-version: 3.7
+ - name: Install Python dependencies
run: pip install -r requirements-dev.txt
+ - name: Setup Node
+ uses: actions/setup-node@v2
+ - name: Install Node dependencies
+ run: npm ci
- name: Add localsettings
run: cp evap/settings_test.py evap/localsettings.py
- name: Check code formatting
run: black --check evap
- name: Check imports formatting
run: isort . --check --diff
+ - run: ls -laR evap/static/ts
+ - name: Check TypeScript formatting
+ run: npx prettier --list-different --loglevel debug --config evap/static/ts/.prettierrc.json evap/static/ts/src
backup-process:
diff --git a/.gitignore b/.gitignore
index a29c85d6db..b8388d2dc6 100644
--- a/.gitignore
+++ b/.gitignore
@@ -42,6 +42,7 @@ htmlcov
# pip puts editable packages here
src
+!evap/static/ts/.prettierrc.json
!evap/static/ts/src
# node modules
diff --git a/evap/evaluation/management/commands/format.py b/evap/evaluation/management/commands/format.py
index a1994d7d51..f513276cf2 100644
--- a/evap/evaluation/management/commands/format.py
+++ b/evap/evaluation/management/commands/format.py
@@ -11,3 +11,4 @@ class Command(BaseCommand):
def handle(self, *args, **options):
subprocess.run(["black", "evap"], check=False) # nosec
subprocess.run(["isort", "."], check=False) # nosec
+ subprocess.run(["npx", "prettier", "--write", "evap/static/ts/src"], check=False) # nosec
diff --git a/evap/evaluation/tests/test_commands.py b/evap/evaluation/tests/test_commands.py
index 4824fcd16e..f0f2347878 100644
--- a/evap/evaluation/tests/test_commands.py
+++ b/evap/evaluation/tests/test_commands.py
@@ -352,11 +352,12 @@ class TestFormatCommand(TestCase):
@patch("subprocess.run")
def test_formatters_called(self, mock_subprocess_run):
management.call_command("format")
- self.assertEqual(len(mock_subprocess_run.mock_calls), 2)
+ self.assertEqual(len(mock_subprocess_run.mock_calls), 3)
mock_subprocess_run.assert_has_calls(
[
call(["black", "evap"], check=False),
call(["isort", "."], check=False),
+ call(["npx", "prettier", "--write", "evap/static/ts/src"], check=False),
]
)
diff --git a/evap/static/ts/.prettierrc.json b/evap/static/ts/.prettierrc.json
new file mode 100644
index 0000000000..d59f981524
--- /dev/null
+++ b/evap/static/ts/.prettierrc.json
@@ -0,0 +1,6 @@
+{
+ "tabWidth": 4,
+ "arrowParens": "avoid",
+ "trailingComma": "all",
+ "printWidth": 120
+}
diff --git a/evap/static/ts/src/csrf-utils.ts b/evap/static/ts/src/csrf-utils.ts
index b221865a44..5300b1b03d 100644
--- a/evap/static/ts/src/csrf-utils.ts
+++ b/evap/static/ts/src/csrf-utils.ts
@@ -1,7 +1,8 @@
// based on: https://docs.djangoproject.com/en/3.1/ref/csrf/#ajax
function getCookie(name: string): string | null {
if (document.cookie !== "") {
- const cookie = document.cookie.split(";")
+ const cookie = document.cookie
+ .split(";")
.map(cookie => cookie.trim())
.find(cookie => cookie.substring(0, name.length + 1) === `${name}=`);
if (cookie) {
@@ -19,7 +20,7 @@ function isMethodCsrfSafe(method: string): boolean {
// setup ajax sending csrf token
$.ajaxSetup({
- beforeSend: function(xhr: JQuery.jqXHR, settings: JQuery.AjaxSettings) {
+ beforeSend: function (xhr: JQuery.jqXHR, settings: JQuery.AjaxSettings) {
const isMethodSafe = settings.method && isMethodCsrfSafe(settings.method);
if (!isMethodSafe && !this.crossDomain) {
xhr.setRequestHeader("X-CSRFToken", csrftoken);
diff --git a/evap/static/ts/src/datagrid.ts b/evap/static/ts/src/datagrid.ts
index 5262d356c5..8f0584c68d 100644
--- a/evap/static/ts/src/datagrid.ts
+++ b/evap/static/ts/src/datagrid.ts
@@ -1,27 +1,27 @@
declare const Sortable: typeof import("sortablejs");
interface Row {
- element: HTMLElement,
- searchWords: string[],
- filterValues: Map<string, string[]>,
- orderValues: Map<string, string | number>,
- isDisplayed: boolean,
+ element: HTMLElement;
+ searchWords: string[];
+ filterValues: Map<string, string[]>;
+ orderValues: Map<string, string | number>;
+ isDisplayed: boolean;
}
interface State {
- search: string,
- filter: Map<string, string[]>,
- order: [string, "asc" | "desc"][],
+ search: string;
+ filter: Map<string, string[]>;
+ order: [string, "asc" | "desc"][];
}
interface BaseParameters {
- storageKey: string,
- searchInput: HTMLInputElement,
+ storageKey: string;
+ searchInput: HTMLInputElement;
}
interface DataGridParameters extends BaseParameters {
- head: HTMLElement,
- container: HTMLElement
+ head: HTMLElement;
+ container: HTMLElement;
}
abstract class DataGrid {
@@ -33,7 +33,7 @@ abstract class DataGrid {
private delayTimer: any | null;
protected state: State;
- protected constructor({storageKey, head, container, searchInput}: DataGridParameters) {
+ protected constructor({ storageKey, head, container, searchInput }: DataGridParameters) {
this.storageKey = storageKey;
this.sortableHeaders = new Map();
head.querySelectorAll<HTMLElement>(".col-order").forEach(header => {
@@ -83,16 +83,19 @@ abstract class DataGrid {
private static NUMBER_REGEX = /^[+-]?\d+(?:[.,]\d*)?$/;
private fetchRows(): Row[] {
- let rows = [...this.container.children].map(row => row as HTMLElement).map(row => {
- const searchWords = this.findSearchableCells(row)
- .flatMap(element => DataGrid.searchWordsOf(element.textContent!));
- return {
- element: row,
- searchWords,
- filterValues: this.fetchRowFilterValues(row),
- orderValues: this.fetchRowOrderValues(row),
- } as Row;
- });
+ let rows = [...this.container.children]
+ .map(row => row as HTMLElement)
+ .map(row => {
+ const searchWords = this.findSearchableCells(row).flatMap(element =>
+ DataGrid.searchWordsOf(element.textContent!),
+ );
+ return {
+ element: row,
+ searchWords,
+ filterValues: this.fetchRowFilterValues(row),
+ orderValues: this.fetchRowOrderValues(row),
+ } as Row;
+ });
for (const column of this.sortableHeaders.keys()) {
const orderValues = rows.map(row => row.orderValues.get(column) as string);
const isNumericalColumn = orderValues.every(orderValue => DataGrid.NUMBER_REGEX.test(orderValue));
@@ -100,7 +103,7 @@ abstract class DataGrid {
rows.forEach(row => {
const numberString = (row.orderValues.get(column) as string).replace(",", ".");
row.orderValues.set(column, parseFloat(numberString));
- })
+ });
}
}
return rows;
@@ -173,9 +176,7 @@ abstract class DataGrid {
// Reflects changes to the rows to the DOM
protected renderToDOM() {
[...this.container.children].map(element => element as HTMLElement).forEach(element => element.remove());
- const elements = this.rows
- .filter(row => row.isDisplayed)
- .map(row => row.element);
+ const elements = this.rows.filter(row => row.isDisplayed).map(row => row.element);
this.container.append(...elements);
this.saveStateToStorage();
}
@@ -206,15 +207,15 @@ abstract class DataGrid {
}
interface TableGridParameters extends BaseParameters {
- table: HTMLTableElement,
- resetSearch: HTMLButtonElement,
+ table: HTMLTableElement;
+ resetSearch: HTMLButtonElement;
}
// Table based data grid which uses its head and body
export class TableGrid extends DataGrid {
private resetSearch: HTMLButtonElement;
- constructor({table, resetSearch, ...options}: TableGridParameters) {
+ constructor({ table, resetSearch, ...options }: TableGridParameters) {
super({
head: table.querySelector("thead")!,
container: table.querySelector("tbody")!,
@@ -252,13 +253,13 @@ export class TableGrid extends DataGrid {
}
interface EvaluationGridParameters extends TableGridParameters {
- filterButtons: HTMLButtonElement[],
+ filterButtons: HTMLButtonElement[];
}
export class EvaluationGrid extends TableGrid {
private filterButtons: HTMLButtonElement[];
- constructor({filterButtons, ...options}: EvaluationGridParameters) {
+ constructor({ filterButtons, ...options }: EvaluationGridParameters) {
super(options);
this.filterButtons = filterButtons;
}
@@ -295,8 +296,9 @@ export class EvaluationGrid extends TableGrid {
}
protected fetchRowFilterValues(row: HTMLElement): Map<string, string[]> {
- const evaluationState = [...row.querySelectorAll<HTMLElement>("[data-filter]")]
- .map(element => element.dataset.filter!);
+ const evaluationState = [...row.querySelectorAll<HTMLElement>("[data-filter]")].map(
+ element => element.dataset.filter!,
+ );
return new Map([["evaluationState", evaluationState]]);
}
@@ -315,13 +317,13 @@ export class EvaluationGrid extends TableGrid {
}
interface QuestionnaireParameters extends TableGridParameters {
- updateUrl: string,
+ updateUrl: string;
}
export class QuestionnaireGrid extends TableGrid {
private readonly updateUrl: string;
- constructor({updateUrl, ...options}: QuestionnaireParameters) {
+ constructor({ updateUrl, ...options }: QuestionnaireParameters) {
super(options);
this.updateUrl = updateUrl;
}
@@ -338,35 +340,41 @@ export class QuestionnaireGrid extends TableGrid {
}
const questionnaireIndices = this.rows.map((row, index) => [$(row.element).data("id"), index]);
$.post(this.updateUrl, Object.fromEntries(questionnaireIndices));
- }
+ },
});
}
private reorderRow(oldPosition: number, newPosition: number) {
- const displayedRows = this.rows.map((row, index) => ({row, index}))
- .filter(({row}) => row.isDisplayed);
+ const displayedRows = this.rows.map((row, index) => ({ row, index })).filter(({ row }) => row.isDisplayed);
this.rows.splice(displayedRows[oldPosition].index, 1);
this.rows.splice(displayedRows[newPosition].index, 0, displayedRows[oldPosition].row);
}
}
interface ResultGridParameters extends DataGridParameters {
- filterCheckboxes: Map<string, {selector: string, checkboxes: HTMLInputElement[]}>,
- sortColumnSelect: HTMLSelectElement,
- sortOrderCheckboxes: HTMLInputElement[],
- resetFilter: HTMLButtonElement,
- resetOrder: HTMLButtonElement,
+ filterCheckboxes: Map<string, { selector: string; checkboxes: HTMLInputElement[] }>;
+ sortColumnSelect: HTMLSelectElement;
+ sortOrderCheckboxes: HTMLInputElement[];
+ resetFilter: HTMLButtonElement;
+ resetOrder: HTMLButtonElement;
}
// Grid based data grid which has its container separated from its header
export class ResultGrid extends DataGrid {
- private readonly filterCheckboxes: Map<string, {selector: string, checkboxes: HTMLInputElement[]}>;
+ private readonly filterCheckboxes: Map<string, { selector: string; checkboxes: HTMLInputElement[] }>;
private sortColumnSelect: HTMLSelectElement;
private sortOrderCheckboxes: HTMLInputElement[];
private resetFilter: HTMLButtonElement;
private resetOrder: HTMLButtonElement;
- constructor({filterCheckboxes, sortColumnSelect, sortOrderCheckboxes, resetFilter, resetOrder, ...options}: ResultGridParameters) {
+ constructor({
+ filterCheckboxes,
+ sortColumnSelect,
+ sortOrderCheckboxes,
+ resetFilter,
+ resetOrder,
+ ...options
+ }: ResultGridParameters) {
super(options);
this.filterCheckboxes = filterCheckboxes;
this.sortColumnSelect = sortColumnSelect;
@@ -377,7 +385,7 @@ export class ResultGrid extends DataGrid {
public bindEvents() {
super.bindEvents();
- for (const [name, {checkboxes}] of this.filterCheckboxes.entries()) {
+ for (const [name, { checkboxes }] of this.filterCheckboxes.entries()) {
checkboxes.forEach(checkbox => {
checkbox.addEventListener("change", () => {
const values = checkboxes.filter(checkbox => checkbox.checked).map(elem => elem.value);
@@ -413,21 +421,23 @@ export class ResultGrid extends DataGrid {
const order = this.sortOrderCheckboxes.find(checkbox => checkbox.checked)!.value;
if (order === "asc" || order === "desc") {
if (column === "name-semester") {
- this.sort([["name", order], ["semester", order]]);
+ this.sort([
+ ["name", order],
+ ["semester", order],
+ ]);
} else {
this.sort([[column, order]]);
}
}
}
-
protected findSearchableCells(row: HTMLElement): HTMLElement[] {
return [...row.querySelectorAll<HTMLElement>(".evaluation-name, [data-col=responsible]")];
}
protected fetchRowFilterValues(row: HTMLElement): Map<string, string[]> {
let filterValues = new Map();
- for (const [name, {selector, checkboxes}] of this.filterCheckboxes.entries()) {
+ for (const [name, { selector, checkboxes }] of this.filterCheckboxes.entries()) {
// To store filter values independent of the language, use the corresponding id from the checkbox
const values = [...row.querySelectorAll(selector)]
.map(element => element.textContent!.trim())
@@ -438,12 +448,15 @@ export class ResultGrid extends DataGrid {
}
protected get defaultOrder(): [string, "asc" | "desc"][] {
- return [["name", "asc"], ["semester", "asc"]];
+ return [
+ ["name", "asc"],
+ ["semester", "asc"],
+ ];
}
protected reflectFilterStateOnInputs() {
super.reflectFilterStateOnInputs();
- for (const [name, {checkboxes}] of this.filterCheckboxes.entries()) {
+ for (const [name, { checkboxes }] of this.filterCheckboxes.entries()) {
checkboxes.forEach(checkbox => {
let isActive;
if (this.state.filter.has(name)) {
diff --git a/package-lock.json b/package-lock.json
index 206271ac7b..4b83b774ba 100644
--- a/package-lock.json
+++ b/package-lock.json
@@ -12,6 +12,7 @@
"@types/sortablejs": "^1.3.0",
"jest": "^27.3.1",
"jest-environment-puppeteer": "^6.0.0",
+ "prettier": "^2.4.1",
"puppeteer": "^10.4.0",
"sass": "1.32.13",
"ts-jest": "^27.0.7",
@@ -6426,6 +6427,18 @@
"node": ">= 0.8.0"
}
},
+ "node_modules/prettier": {
+ "version": "2.4.1",
+ "resolved": "https://registry.npmjs.org/prettier/-/prettier-2.4.1.tgz",
+ "integrity": "sha512-9fbDAXSBcc6Bs1mZrDYb3XKzDLm4EXXL9sC1LqKP5rZkT6KRr/rf9amVUcODVXgguK/isJz0d0hP72WeaKWsvA==",
+ "dev": true,
+ "bin": {
+ "prettier": "bin-prettier.js"
+ },
+ "engines": {
+ "node": ">=10.13.0"
+ }
+ },
"node_modules/pretty-format": {
"version": "26.6.2",
"resolved": "https://registry.npmjs.org/pretty-format/-/pretty-format-26.6.2.tgz",
@@ -12653,6 +12666,12 @@
"integrity": "sha1-IZMqVJ9eUv/ZqCf1cOBL5iqX2lQ=",
"dev": true
},
+ "prettier": {
+ "version": "2.4.1",
+ "resolved": "https://registry.npmjs.org/prettier/-/prettier-2.4.1.tgz",
+ "integrity": "sha512-9fbDAXSBcc6Bs1mZrDYb3XKzDLm4EXXL9sC1LqKP5rZkT6KRr/rf9amVUcODVXgguK/isJz0d0hP72WeaKWsvA==",
+ "dev": true
+ },
"pretty-format": {
"version": "26.6.2",
"resolved": "https://registry.npmjs.org/pretty-format/-/pretty-format-26.6.2.tgz",
diff --git a/package.json b/package.json
index ae7bee343a..ed91972473 100644
--- a/package.json
+++ b/package.json
@@ -7,10 +7,11 @@
"@types/sortablejs": "^1.3.0",
"jest": "^27.3.1",
"jest-environment-puppeteer": "^6.0.0",
+ "prettier": "^2.4.1",
"puppeteer": "^10.4.0",
+ "sass": "1.32.13",
"ts-jest": "^27.0.7",
- "typescript": "^4.4.4",
- "sass": "1.32.13"
+ "typescript": "^4.4.4"
},
"jest": {
"testRunner": "jest-jasmine2",
|
qtile__qtile-1522 | EzKey does not allow description
I think the [EzKey constructor](https://github.com/qtile/qtile/blob/master/libqtile/config.py#L155) does not allow a description (no `kwds` variable) although [Key constructor](https://github.com/qtile/qtile/blob/master/libqtile/config.py#L53) does.
Edit: Why do you set the description within a dictionary instead of having a constructor argument for it?
Edit 2: Forgot my versions and stuff:
| Item | Version |
|:---------:|:--------------:|
| Qtile (from official repositories) | 0.14.2 |
| ArchLinux | 5.4.6-arch3-1 |
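The fix being asked for is just forwarding keyword arguments through the subclass constructor. A stand-alone sketch of the pattern (class names and the keydef parsing are simplified stand-ins, not the actual libqtile implementations):

```python
class Key:
    def __init__(self, modifiers, key, *commands, desc=""):
        self.modifiers = modifiers
        self.key = key
        self.commands = commands
        self.desc = desc  # the description EzKey previously dropped

class EzKey(Key):
    def __init__(self, keydef, *commands, **kwargs):
        # Toy parse of an "M-x"-style keydef into (modifiers, key).
        *mods, key = keydef.split("-")
        # Forwarding **kwargs is the whole fix: desc now reaches Key.
        super().__init__(mods, key, *commands, **kwargs)

k = EzKey("M-x", "spawn", desc="launch something")
```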
| [
{
"content": "# Copyright (c) 2012-2015 Tycho Andersen\n# Copyright (c) 2013 xarvh\n# Copyright (c) 2013 horsik\n# Copyright (c) 2013-2014 roger\n# Copyright (c) 2013 Tao Sauvage\n# Copyright (c) 2014 ramnes\n# Copyright (c) 2014 Sean Vig\n# Copyright (c) 2014 Adi Sieker\n#\n# Permission is hereby granted, free... | [
{
"content": "# Copyright (c) 2012-2015 Tycho Andersen\n# Copyright (c) 2013 xarvh\n# Copyright (c) 2013 horsik\n# Copyright (c) 2013-2014 roger\n# Copyright (c) 2013 Tao Sauvage\n# Copyright (c) 2014 ramnes\n# Copyright (c) 2014 Sean Vig\n# Copyright (c) 2014 Adi Sieker\n#\n# Permission is hereby granted, free... | diff --git a/libqtile/config.py b/libqtile/config.py
index dacff71672..0e53fc66f2 100644
--- a/libqtile/config.py
+++ b/libqtile/config.py
@@ -152,9 +152,9 @@ def parse(self, spec):
class EzKey(EzConfig, Key):
- def __init__(self, keydef, *commands):
+ def __init__(self, keydef, *commands, **kwargs):
modkeys, key = self.parse(keydef)
- super().__init__(modkeys, key, *commands)
+ super().__init__(modkeys, key, *commands, **kwargs)
class EzClick(EzConfig, Click):
|
liberapay__liberapay.com-1156 | Support avatars from Gitlab
Quite a number of open-source projects are hosted on Gitlab, including mine.
With Libavatar [shutting down](https://blog.libravatar.org/posts/Libravatar.org_is_shutting_down_on_2018-09-01/), it'd be nice to have an alternative that doesn't require creating an account on a proprietary service. (While Mastodon isn't proprietary, it's still an unnecessary extra account to take care of.)
| [
{
"content": "# coding: utf8\nfrom __future__ import print_function, unicode_literals\n\nfrom collections import namedtuple, OrderedDict\nfrom datetime import date, datetime, timedelta\nfrom decimal import Decimal, ROUND_UP\nimport re\n\nfrom jinja2 import StrictUndefined\nfrom mangopay.utils import Money\nfrom... | [
{
"content": "# coding: utf8\nfrom __future__ import print_function, unicode_literals\n\nfrom collections import namedtuple, OrderedDict\nfrom datetime import date, datetime, timedelta\nfrom decimal import Decimal, ROUND_UP\nimport re\n\nfrom jinja2 import StrictUndefined\nfrom mangopay.utils import Money\nfrom... | diff --git a/liberapay/constants.py b/liberapay/constants.py
index 0d2c2efeb8..8a8b76770f 100644
--- a/liberapay/constants.py
+++ b/liberapay/constants.py
@@ -57,7 +57,9 @@ def with_vat(self):
"-_.")
AVATAR_QUERY = '?s=160&default=retro'
-AVATAR_SOURCES = 'libravatar bitbucket facebook github google mastodon twitter'.split()
+AVATAR_SOURCES = (
+ 'libravatar bitbucket facebook github gitlab google mastodon twitch twitter youtube'
+).split()
BIRTHDAY = date(2015, 5, 22)
|
deepset-ai__haystack-7396 | `HuggingFaceTGIChatGenerator` does not work properly in a Pipeline
**Describe the bug**
Reported on Discord, reproducible.
[Our example in docs](https://docs.haystack.deepset.ai/docs/huggingfacetgichatgenerator#in-a-pipeline) is broken.
**To Reproduce**
```python
from haystack.components.builders import DynamicChatPromptBuilder
from haystack.components.generators.chat import HuggingFaceTGIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret
from haystack import Pipeline
# no parameter init, we don't use any runtime template variables
prompt_builder = DynamicChatPromptBuilder()
llm = HuggingFaceTGIChatGenerator(model="meta-llama/Llama-2-70b-chat-hf", token=Secret.from_token("..."))
pipe = Pipeline()
pipe.add_component("prompt_builder", prompt_builder)
pipe.add_component("llm", llm)
pipe.connect("prompt_builder.prompt", "llm.messages")
location = "Berlin"
messages = [ChatMessage.from_system("Always respond in German even if some input data is in other languages."),
ChatMessage.from_user("Tell me about {{location}}")]
pipe.run(data={"prompt_builder": {"template_variables":{"location": location}, "prompt_source": messages}})
```
**Error message**
```
AttributeError Traceback (most recent call last)
[<ipython-input-6-4084a601c8bf>](https://localhost:8080/#) in <cell line: 13>()
11 pipe = Pipeline()
12 pipe.add_component("prompt_builder", prompt_builder)
---> 13 pipe.add_component("llm", llm)
14 pipe.connect("prompt_builder.prompt", "llm.messages")
15 location = "Berlin"
[/usr/local/lib/python3.10/dist-packages/haystack/core/pipeline/pipeline.py](https://localhost:8080/#) in add_component(self, name, instance)
291 name,
292 instance=instance,
--> 293 input_sockets=instance.__haystack_input__._sockets_dict, # type: ignore[attr-defined]
294 output_sockets=instance.__haystack_output__._sockets_dict, # type: ignore[attr-defined]
295 visits=0,
AttributeError: 'HuggingFaceTGIChatGenerator' object has no attribute '__haystack_input__'
```
**System:**
- Haystack version (commit or version number): 2.0.0
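The root cause is that the generator class was never passed through the `@component` class decorator, so the attributes the pipeline expects were never attached. A generic sketch of that failure mode (the attribute names mirror the traceback; the decorator body is illustrative, not Haystack's real implementation):

```python
def component(cls):
    """Illustrative class decorator that attaches pipeline metadata."""
    cls.__haystack_input__ = {}   # the real decorator builds socket objects here
    cls.__haystack_output__ = {}
    return cls

class Undecorated:  # like HuggingFaceTGIChatGenerator before the fix
    pass

@component
class Decorated:    # like it after the fix
    pass

hasattr(Undecorated, "__haystack_input__")  # False — triggers the AttributeError
hasattr(Decorated, "__haystack_input__")    # True
```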
| [
{
"content": "from dataclasses import asdict\nfrom typing import Any, Callable, Dict, Iterable, List, Optional\nfrom urllib.parse import urlparse\n\nfrom haystack import component, default_from_dict, default_to_dict, logging\nfrom haystack.dataclasses import ChatMessage, StreamingChunk\nfrom haystack.lazy_impor... | [
{
"content": "from dataclasses import asdict\nfrom typing import Any, Callable, Dict, Iterable, List, Optional\nfrom urllib.parse import urlparse\n\nfrom haystack import component, default_from_dict, default_to_dict, logging\nfrom haystack.dataclasses import ChatMessage, StreamingChunk\nfrom haystack.lazy_impor... | diff --git a/haystack/components/generators/chat/hugging_face_tgi.py b/haystack/components/generators/chat/hugging_face_tgi.py
index 3e388a008d..95adbe792d 100644
--- a/haystack/components/generators/chat/hugging_face_tgi.py
+++ b/haystack/components/generators/chat/hugging_face_tgi.py
@@ -16,6 +16,7 @@
logger = logging.getLogger(__name__)
+@component
class HuggingFaceTGIChatGenerator:
"""
Enables text generation using HuggingFace Hub hosted chat-based LLMs. This component is designed to seamlessly
diff --git a/releasenotes/notes/tgi-chat-missing-decorator-799b2a133ee4708c.yaml b/releasenotes/notes/tgi-chat-missing-decorator-799b2a133ee4708c.yaml
new file mode 100644
index 0000000000..5437a83b5d
--- /dev/null
+++ b/releasenotes/notes/tgi-chat-missing-decorator-799b2a133ee4708c.yaml
@@ -0,0 +1,5 @@
+---
+fixes:
+ - |
+ Add the `@component` decorator to `HuggingFaceTGIChatGenerator`.
+ The lack of this decorator made it impossible to use the `HuggingFaceTGIChatGenerator` in a pipeline.
|
falconry__falcon-981 | Doc site: On small screen height, sidebar ("Navigation") clips at bottom.
Using a laptop with 768 pixels height resolution.

| [
{
"content": "# -*- coding: utf-8 -*-\n#\n# Falcon documentation build configuration file, created by\n# sphinx-quickstart on Wed Mar 12 14:14:02 2014.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in thi... | [
{
"content": "# -*- coding: utf-8 -*-\n#\n# Falcon documentation build configuration file, created by\n# sphinx-quickstart on Wed Mar 12 14:14:02 2014.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in thi... | diff --git a/docs/_static/custom.css b/docs/_static/custom.css
index 34a4ad865..67d13d9e8 100644
--- a/docs/_static/custom.css
+++ b/docs/_static/custom.css
@@ -14,6 +14,10 @@
display: none;
}
+#logo {
+ position: relative;
+}
+
#logo a,
#logo a:hover {
border-bottom: none;
@@ -32,12 +36,12 @@
font-family: "Amethysta", "goudy old style", serif;
font-weight: bold;
- font-size: 18pt;
+ font-size: 16pt;
color: #444;
position: absolute;
- left: 9px;
- top: 2px;
+ left: 10px;
+ top: -14px;
/*margin: -4px -4px 0 0;*/
}
diff --git a/docs/conf.py b/docs/conf.py
index ad11a5e65..059a916da 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -152,7 +152,7 @@
'github_repo': 'falcon',
'github_button': False,
'github_banner': True,
- 'fixed_sidebar': True,
+ 'fixed_sidebar': False,
'show_powered_by': False,
'extra_nav_links': {
'Falcon Home': 'http://falconframework.org/',
|
yt-project__yt-4463 | BUG: setting a boolean parameter via the command line break runtime
### Bug report
**Bug summary**
**Code for reproduction**
```shell
$ yt config set --local yt colored_logs true && python -c "import yt"
```
**Actual outcome**
<!--The output produced by the above code, which may be a screenshot, console
output, etc.-->
```python-traceback
Traceback (most recent call last):
File "/Users/robcleme/.pyenv/versions/yt-dev/bin/yt", line 8, in <module>
sys.exit(run_main())
^^^^^^^^^^
File "/Users/robcleme/dev/yt-project/yt/yt/utilities/command_line.py", line 1615, in run_main
args.func(args)
File "/Users/robcleme/dev/yt-project/yt/yt/utilities/command_line.py", line 228, in run
self(args)
File "/Users/robcleme/dev/yt-project/yt/yt/utilities/command_line.py", line 1402, in __call__
set_config(args.section, args.option, args.value, self.config_file)
File "/Users/robcleme/dev/yt-project/yt/yt/utilities/configure.py", line 195, in set_config
CONFIG.set(section, *option_path, _cast_value_helper(value))
File "/Users/robcleme/dev/yt-project/yt/yt/utilities/configure.py", line 79, in set
self.config_root.upsert_from_list(
File "/Users/robcleme/dev/yt-project/yt/yt/utilities/configuration_tree.py", line 54, in upsert_from_list
next_node.upsert_from_list(next_keys, value, extra_data)
File "/Users/robcleme/dev/yt-project/yt/yt/utilities/configuration_tree.py", line 46, in upsert_from_list
leaf.value = value
^^^^^^^^^^
File "/Users/robcleme/dev/yt-project/yt/yt/utilities/configuration_tree.py", line 187, in value
raise TypeError(msg)
TypeError: Error when setting yt.colored_logs.
Tried to assign a value of type <class 'str'>, expected type <class 'bool'>.
This entry was last modified in file: /Users/robcleme/dev/yt-project/yt/yt.toml.
```
One way to patch this would be to special-case `true` and `false` to be interpreted as booleans when received from the command line.
| [
{
"content": "import os\nimport sys\nimport warnings\nfrom pathlib import Path\nfrom typing import Callable, List\n\nimport tomli_w\nfrom more_itertools import always_iterable\n\nfrom yt.utilities.configuration_tree import ConfigLeaf, ConfigNode\n\nif sys.version_info >= (3, 11):\n import tomllib\nelse:\n ... | [
{
"content": "import os\nimport sys\nimport warnings\nfrom pathlib import Path\nfrom typing import Callable, List\n\nimport tomli_w\nfrom more_itertools import always_iterable\n\nfrom yt.utilities.configuration_tree import ConfigLeaf, ConfigNode\n\nif sys.version_info >= (3, 11):\n import tomllib\nelse:\n ... | diff --git a/yt/utilities/configure.py b/yt/utilities/configure.py
index 7894b63314f..64a034b7ee7 100644
--- a/yt/utilities/configure.py
+++ b/yt/utilities/configure.py
@@ -161,9 +161,9 @@ def _repr_json_(self):
def _cast_bool_helper(value):
- if value == "True":
+ if value in ("true", "True", True):
return True
- elif value == "False":
+ elif value in ("false", "False", False):
return False
else:
raise ValueError("Cannot safely cast to bool")
|
pretalx__pretalx-381 | installation crashes when there are no config files
## Current Behavior
```
$ cd pretalx
$ pip-3.6 install . --user
(...)
File "<frozen importlib._bootstrap>", line 994, in _gcd_import
File "<frozen importlib._bootstrap>", line 971, in _find_and_load
File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/tmp/pip-xa87l9tk-build/pretalx/settings.py", line 460, in <module>
plugins=PLUGINS
File "/tmp/pip-xa87l9tk-build/pretalx/common/settings/utils.py", line 11, in log_initial
(f'Read from: {", ".join(config_files)}', False),
TypeError: can only join an iterable
```
if there are no config files at all, the installation crashes, because `config_files` is `None`.
## Your Environment
* Version used: master
* Operating System and version (desktop or mobile): FreeBSD
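The defensive pattern applied in the patch can be sketched with the standard-library parser: whatever `read()` hands back, normalizing a falsy result to an empty list keeps later calls like `", ".join(config_files)` safe when no config file exists:

```python
import configparser

def read_config_files(paths):
    config = configparser.ConfigParser()
    read_files = config.read(paths, encoding="utf-8")
    # Normalize: if nothing was read (falsy result), fall back to [].
    return config, read_files or []

config, files = read_config_files(["/nonexistent/pretalx.cfg"])
", ".join(files)  # safe even when no config file was found
```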
| [
{
"content": "import configparser\nimport os\nimport sys\n\nfrom pretalx.common.settings.utils import reduce_dict\n\nCONFIG = {\n 'filesystem': {\n 'base': {\n 'default': os.path.dirname(os.path.dirname(os.path.dirname(os.path.dirname(__file__)))),\n },\n 'logs': {\n ... | [
{
"content": "import configparser\nimport os\nimport sys\n\nfrom pretalx.common.settings.utils import reduce_dict\n\nCONFIG = {\n 'filesystem': {\n 'base': {\n 'default': os.path.dirname(os.path.dirname(os.path.dirname(os.path.dirname(__file__)))),\n },\n 'logs': {\n ... | diff --git a/src/pretalx/common/settings/config.py b/src/pretalx/common/settings/config.py
index f89488e514..70b8a56172 100644
--- a/src/pretalx/common/settings/config.py
+++ b/src/pretalx/common/settings/config.py
@@ -128,7 +128,7 @@ def read_config_files(config):
os.path.expanduser('~/.pretalx.cfg'),
'pretalx.cfg',
], encoding='utf-8')
- return config, config_files
+ return config, config_files or [] # .read() returns None, if there are no config files
def read_layer(layer_name, config):
|
TileDB-Inc__TileDB-Py-501 | Four components should be three components?
In the recently created example "writing_dense_rgb.py" there is this fragment:
https://github.com/TileDB-Inc/TileDB-Py/blob/75ddcf56ed80ba5e1a1237b7e527ec4fbd87abb9/examples/writing_dense_rgb.py#L56-L57
It says four int32 components where it seems like it should be three int32 components. After all the values of the attribute are RGB and not RGBA.
| [
{
"content": "# writing_dense_rgb.py\n#\n# LICENSE\n#\n# The MIT License\n#\n# Copyright (c) 2021 TileDB, Inc.\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restrict... | [
{
"content": "# writing_dense_rgb.py\n#\n# LICENSE\n#\n# The MIT License\n#\n# Copyright (c) 2021 TileDB, Inc.\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restrict... | diff --git a/examples/writing_dense_rgb.py b/examples/writing_dense_rgb.py
index 8a2dfd1b6b..20a0669b37 100644
--- a/examples/writing_dense_rgb.py
+++ b/examples/writing_dense_rgb.py
@@ -53,7 +53,7 @@ def create_array():
),
)
- # create multi-component attribute with four int32 components
+ # create multi-component attribute with three int32 components
attr = tiledb.Attr(dtype=np.dtype("i4, i4, i4"))
schema = tiledb.ArraySchema(domain=domain, sparse=False, attrs=[attr])
|
flairNLP__flair-1713 | encoding error while loading WIKINER_RUSSIAN
I am trying to load russian NER labelled data 'WIKINER_RUSSIAN' into Flair.
But getting some encoding error during loading.
My System spec:
Flair: 0.4.3
Python: 3.8.0

| [
{
"content": "import logging\nimport re\nimport os\nfrom pathlib import Path\nfrom typing import Union, Dict, List\n\nimport flair\nfrom flair.data import Corpus, FlairDataset, Sentence, Token\nfrom flair.datasets.base import find_train_dev_test_files\nfrom flair.file_utils import cached_path\n\nlog = logging.g... | [
{
"content": "import logging\nimport re\nimport os\nfrom pathlib import Path\nfrom typing import Union, Dict, List\n\nimport flair\nfrom flair.data import Corpus, FlairDataset, Sentence, Token\nfrom flair.datasets.base import find_train_dev_test_files\nfrom flair.file_utils import cached_path\n\nlog = logging.g... | diff --git a/flair/datasets/sequence_labeling.py b/flair/datasets/sequence_labeling.py
index 23c80881c0..fd2c859a1c 100644
--- a/flair/datasets/sequence_labeling.py
+++ b/flair/datasets/sequence_labeling.py
@@ -1310,6 +1310,7 @@ def _download_wikiner(language_code: str, dataset_name: str):
/ dataset_name
/ f"aij-wikiner-{lc}-wp3.train",
"w",
+ encoding="utf-8"
) as out:
for line in f:
line = line.decode("utf-8")
|
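The one-line fix above passes `encoding="utf-8"` to `open()`. A small illustrative sketch (file name and sample lines are made up) of why that matters: without it, `open()` uses the platform default encoding, which on Windows (e.g. cp1252) cannot represent Cyrillic text and raises the `UnicodeEncodeError` shown in the report.

```python
import os
import tempfile

# Sample WikiNER-style Cyrillic lines (illustrative, not real corpus data).
text = "Москва B-LOC\nроссийский O\n"

# Writing with an explicit encoding succeeds regardless of the
# platform's default encoding.
path = os.path.join(tempfile.mkdtemp(), "aij-wikiner-ru-wp3.train")
with open(path, "w", encoding="utf-8") as out:
    out.write(text)

# Reading it back with the same explicit encoding round-trips cleanly.
with open(path, encoding="utf-8") as f:
    restored = f.read()
```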
coala__coala-bears-900 | YapfBear: Make asciinema
@Mariatta are you interested?
| [
{
"content": "import sys\n\nfrom yapf.yapflib.yapf_api import FormatFile\n\nfrom coalib.bearlib import deprecate_settings\nfrom coalib.bearlib.spacing.SpacingHelper import SpacingHelper\nfrom coalib.bears.LocalBear import LocalBear\nfrom coalib.bears.requirements.PipRequirement import PipRequirement\nfrom coali... | [
{
"content": "import sys\n\nfrom yapf.yapflib.yapf_api import FormatFile\n\nfrom coalib.bearlib import deprecate_settings\nfrom coalib.bearlib.spacing.SpacingHelper import SpacingHelper\nfrom coalib.bears.LocalBear import LocalBear\nfrom coalib.bears.requirements.PipRequirement import PipRequirement\nfrom coali... | diff --git a/bears/python/YapfBear.py b/bears/python/YapfBear.py
index 03d686596b..d02c898f1f 100644
--- a/bears/python/YapfBear.py
+++ b/bears/python/YapfBear.py
@@ -23,6 +23,7 @@ class YapfBear(LocalBear):
AUTHORS_EMAILS = {'coala-devel@googlegroups.com'}
LICENSE = 'AGPL-3.0'
CAN_FIX = {'Formatting'}
+ ASCIINEMA_URL = 'https://asciinema.org/a/89021'
@deprecate_settings(indent_size='tab_width')
def run(self, filename, file,
|
getpelican__pelican-2521 | WARNING: Docutils has no localization for 'english'. Using 'en' instead.
1. pipenv install pelican markdown
2. pelican-quickstart
3. create an article in content
4. run pelican
**Expected**: Clean run and output created
**Observed**: Warning
> WARNING: Docutils has no localization for 'english'. Using 'en' instead.
When I change DEFAULT_LANG = 'English' in my settings to DEFAULT_LANG = 'en', it runs fine.
Should I PR that as a fix, or is there some reason it is 'English' and not 'en'?
| [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nfrom __future__ import print_function, unicode_literals\n\nimport argparse\nimport codecs\nimport locale\nimport os\nimport sys\n\nfrom jinja2 import Environment, FileSystemLoader\n\nimport pytz\n\ntry:\n import readline # NOQA\nexcept ImportErro... | [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nfrom __future__ import print_function, unicode_literals\n\nimport argparse\nimport codecs\nimport locale\nimport os\nimport sys\n\nfrom jinja2 import Environment, FileSystemLoader\n\nimport pytz\n\ntry:\n import readline # NOQA\nexcept ImportErro... | diff --git a/pelican/tools/pelican_quickstart.py b/pelican/tools/pelican_quickstart.py
index 529eeb527..4a6b8cbc3 100755
--- a/pelican/tools/pelican_quickstart.py
+++ b/pelican/tools/pelican_quickstart.py
@@ -34,7 +34,7 @@
# Don't fail on macosx: "unknown locale: UTF-8"
_DEFAULT_LANGUAGE = None
if _DEFAULT_LANGUAGE is None:
- _DEFAULT_LANGUAGE = 'English'
+ _DEFAULT_LANGUAGE = 'en'
else:
_DEFAULT_LANGUAGE = _DEFAULT_LANGUAGE.split('_')[0]
|
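A hedged sketch of the quickstart logic after the fix: the `else` branch in the diff derives a short code like `en` from a full locale name like `en_US`, and the fallback is now also a short ISO code (which docutils understands) rather than the long name `'English'`.

```python
import locale

# Mirror of pelican_quickstart's default-language logic (simplified):
# take the locale's language part, fall back to "en" when the locale
# cannot be determined (e.g. the C/POSIX locale).
lang, _ = locale.getdefaultlocale()          # e.g. ("en_US", "UTF-8") or (None, None)
default_language = lang.split("_")[0] if lang else "en"
```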
huggingface__huggingface_hub-790 | Support python=3.10
Python 3.10 has been out for a while, but we do not seem to test for it. What are the roadblocks to supporting 3.10 and perhaps deprecating 3.6? (Many packages now support 3.8–3.10 and no longer support older versions.)
Ping @LysandreJik @osanseviero maybe?
| [
{
"content": "from setuptools import find_packages, setup\n\n\ndef get_version() -> str:\n rel_path = \"src/huggingface_hub/__init__.py\"\n with open(rel_path, \"r\") as fp:\n for line in fp.read().splitlines():\n if line.startswith(\"__version__\"):\n delim = '\"' if '\"'... | [
{
"content": "from setuptools import find_packages, setup\n\n\ndef get_version() -> str:\n rel_path = \"src/huggingface_hub/__init__.py\"\n with open(rel_path, \"r\") as fp:\n for line in fp.read().splitlines():\n if line.startswith(\"__version__\"):\n delim = '\"' if '\"'... | diff --git a/.github/workflows/python-tests.yml b/.github/workflows/python-tests.yml
index 37df59bca1..71bd60054a 100644
--- a/.github/workflows/python-tests.yml
+++ b/.github/workflows/python-tests.yml
@@ -22,7 +22,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
- python-version: ["3.6", "3.9"]
+ python-version: ["3.7", "3.10"]
test_repository: ["Repository only", "Everything else"]
steps:
@@ -52,7 +52,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
- python-version: ["3.6", "3.9"]
+ python-version: ["3.7", "3.10"]
steps:
- uses: actions/checkout@v2
@@ -73,7 +73,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
- python-version: ["3.6", "3.9"]
+ python-version: ["3.7", "3.10"]
steps:
- uses: actions/checkout@v2
@@ -100,7 +100,7 @@ jobs:
- name: Set up Python
uses: actions/setup-python@v2
with:
- python-version: 3.9
+ python-version: "3.10"
- run: |
git config --global user.email "ci@dummy.com"
diff --git a/setup.py b/setup.py
index d552bc5e89..1e643e2ecc 100644
--- a/setup.py
+++ b/setup.py
@@ -69,7 +69,7 @@ def get_version() -> str:
"huggingface-cli=huggingface_hub.commands.huggingface_cli:main"
]
},
- python_requires=">=3.6.0",
+ python_requires=">=3.7.0",
install_requires=install_requires,
classifiers=[
"Intended Audience :: Developers",
|
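The `python_requires=">=3.7.0"` bump in the diff is what makes pip refuse to install the sdist on Python 3.6. An illustrative version gate (not huggingface_hub code) showing the same comparison:

```python
import sys

# Minimum interpreter version declared by python_requires=">=3.7.0".
MINIMUM = (3, 7)

# pip compares the running interpreter against python_requires before
# installing; this mirrors that check.
supported = sys.version_info[:2] >= MINIMUM
print(supported)
```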
CTFd__CTFd-1918 | Users in admin scoreboard show user position instead of team position
In teams mode on the admin panel, users are shown with their individual position on the scoreboard instead of their team's position. We should be showing both.
| [
{
"content": "from flask import render_template, request, url_for\nfrom sqlalchemy.sql import not_\n\nfrom CTFd.admin import admin\nfrom CTFd.models import Challenges, Tracking, Users\nfrom CTFd.utils import get_config\nfrom CTFd.utils.decorators import admins_only\nfrom CTFd.utils.modes import TEAMS_MODE\n\n\n... | [
{
"content": "from flask import render_template, request, url_for\nfrom sqlalchemy.sql import not_\n\nfrom CTFd.admin import admin\nfrom CTFd.models import Challenges, Tracking, Users\nfrom CTFd.utils import get_config\nfrom CTFd.utils.decorators import admins_only\nfrom CTFd.utils.modes import TEAMS_MODE\n\n\n... | diff --git a/CTFd/admin/users.py b/CTFd/admin/users.py
index 46f16c8af..f2a0c484d 100644
--- a/CTFd/admin/users.py
+++ b/CTFd/admin/users.py
@@ -88,8 +88,8 @@ def users_detail(user_id):
awards = user.get_awards(admin=True)
# Get user properties
- score = user.get_score(admin=True)
- place = user.get_place(admin=True)
+ score = user.account.get_score(admin=True)
+ place = user.account.get_place(admin=True)
return render_template(
"admin/users/user.html",
|
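A hypothetical, much-simplified sketch of the pattern behind the fix (class and method names mirror CTFd's models but are stand-ins): in teams mode a user's `account` resolves to their team, so the admin view should read score and place through `user.account` rather than the user directly.

```python
class Team:
    def get_place(self, admin=False):
        return 3  # the team's scoreboard position

class User:
    def __init__(self, team=None):
        # `account` is the team in teams mode; in users mode it is the
        # user itself, so the same attribute works in both modes.
        self.account = team if team is not None else self

    def get_place(self, admin=False):
        return 17  # the individual's position

member = User(Team())  # teams mode: place comes from the team
solo = User()          # users mode: place comes from the user
```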
DataDog__dd-trace-py-1468 | Cannot install ddtrace 0.38.0 with Python 3.8 without the wheels
Hi,
I cannot install ddtrace 0.38.0 without using the provided wheel. It was working with ddtrace version 0.37.1.
### Which version of dd-trace-py are you using?
0.38.0 with Python 3.8.3 on Linux (tried from my Archlinux and from a Docker container with Debian)
### How can we reproduce your problem?
Run `pip install --no-binary=:all: ddtrace`
### What is the result that you get?
```
Collecting ddtrace==0.38.0
Using cached ddtrace-0.38.0.tar.gz (887 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Installing backend dependencies ... done
Preparing wheel metadata ... done
Requirement already satisfied: msgpack>=0.5.0 in /home/yannick/.local/share/virtualenvs/core/lib/python3.8/site-packages (from ddtrace==0.38.0) (1.0.0)
Building wheels for collected packages: ddtrace
Building wheel for ddtrace (PEP 517) ... error
ERROR: Command errored out with exit status 1:
command: /home/yannick/.local/share/virtualenvs/core/bin/python /home/yannick/.local/share/virtualenvs/core/lib/python3.8/site-packages/pip/_vendor/pep517/_in_process.py build_wheel /tmp/tmp5caazvta
cwd: /tmp/pip-install-b0v_y4yt/ddtrace
Complete output (423 lines):
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-x86_64-3.8
creating build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/util.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/tracer.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/span.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/sampler.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/provider.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/pin.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/payload.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/monkey.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/helpers.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/filters.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/encoding.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/context.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/constants.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/compat.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/api.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/_worker.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/_hooks.py -> build/lib.linux-x86_64-3.8/ddtrace
copying ddtrace/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace
creating build/lib.linux-x86_64-3.8/ddtrace/vendor
copying ddtrace/vendor/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor
creating build/lib.linux-x86_64-3.8/ddtrace/utils
copying ddtrace/utils/wrappers.py -> build/lib.linux-x86_64-3.8/ddtrace/utils
copying ddtrace/utils/time.py -> build/lib.linux-x86_64-3.8/ddtrace/utils
copying ddtrace/utils/importlib.py -> build/lib.linux-x86_64-3.8/ddtrace/utils
copying ddtrace/utils/http.py -> build/lib.linux-x86_64-3.8/ddtrace/utils
copying ddtrace/utils/hook.py -> build/lib.linux-x86_64-3.8/ddtrace/utils
copying ddtrace/utils/formats.py -> build/lib.linux-x86_64-3.8/ddtrace/utils
copying ddtrace/utils/deprecation.py -> build/lib.linux-x86_64-3.8/ddtrace/utils
copying ddtrace/utils/config.py -> build/lib.linux-x86_64-3.8/ddtrace/utils
copying ddtrace/utils/attrdict.py -> build/lib.linux-x86_64-3.8/ddtrace/utils
copying ddtrace/utils/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/utils
creating build/lib.linux-x86_64-3.8/ddtrace/settings
copying ddtrace/settings/integration.py -> build/lib.linux-x86_64-3.8/ddtrace/settings
copying ddtrace/settings/http.py -> build/lib.linux-x86_64-3.8/ddtrace/settings
copying ddtrace/settings/exceptions.py -> build/lib.linux-x86_64-3.8/ddtrace/settings
copying ddtrace/settings/config.py -> build/lib.linux-x86_64-3.8/ddtrace/settings
copying ddtrace/settings/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/settings
creating build/lib.linux-x86_64-3.8/ddtrace/propagation
copying ddtrace/propagation/utils.py -> build/lib.linux-x86_64-3.8/ddtrace/propagation
copying ddtrace/propagation/http.py -> build/lib.linux-x86_64-3.8/ddtrace/propagation
copying ddtrace/propagation/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/propagation
creating build/lib.linux-x86_64-3.8/ddtrace/profiling
copying ddtrace/profiling/scheduler.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling
copying ddtrace/profiling/recorder.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling
copying ddtrace/profiling/profiler.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling
copying ddtrace/profiling/event.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling
copying ddtrace/profiling/auto.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling
copying ddtrace/profiling/_traceback.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling
copying ddtrace/profiling/_service.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling
copying ddtrace/profiling/_periodic.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling
copying ddtrace/profiling/_line2def.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling
copying ddtrace/profiling/_attr.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling
copying ddtrace/profiling/__main__.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling
copying ddtrace/profiling/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling
creating build/lib.linux-x86_64-3.8/ddtrace/profile
copying ddtrace/profile/scheduler.py -> build/lib.linux-x86_64-3.8/ddtrace/profile
copying ddtrace/profile/recorder.py -> build/lib.linux-x86_64-3.8/ddtrace/profile
copying ddtrace/profile/profiler.py -> build/lib.linux-x86_64-3.8/ddtrace/profile
copying ddtrace/profile/event.py -> build/lib.linux-x86_64-3.8/ddtrace/profile
copying ddtrace/profile/auto.py -> build/lib.linux-x86_64-3.8/ddtrace/profile
copying ddtrace/profile/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/profile
creating build/lib.linux-x86_64-3.8/ddtrace/opentracer
copying ddtrace/opentracer/utils.py -> build/lib.linux-x86_64-3.8/ddtrace/opentracer
copying ddtrace/opentracer/tracer.py -> build/lib.linux-x86_64-3.8/ddtrace/opentracer
copying ddtrace/opentracer/tags.py -> build/lib.linux-x86_64-3.8/ddtrace/opentracer
copying ddtrace/opentracer/span_context.py -> build/lib.linux-x86_64-3.8/ddtrace/opentracer
copying ddtrace/opentracer/span.py -> build/lib.linux-x86_64-3.8/ddtrace/opentracer
copying ddtrace/opentracer/settings.py -> build/lib.linux-x86_64-3.8/ddtrace/opentracer
copying ddtrace/opentracer/helpers.py -> build/lib.linux-x86_64-3.8/ddtrace/opentracer
copying ddtrace/opentracer/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/opentracer
creating build/lib.linux-x86_64-3.8/ddtrace/internal
copying ddtrace/internal/writer.py -> build/lib.linux-x86_64-3.8/ddtrace/internal
copying ddtrace/internal/rate_limiter.py -> build/lib.linux-x86_64-3.8/ddtrace/internal
copying ddtrace/internal/logger.py -> build/lib.linux-x86_64-3.8/ddtrace/internal
copying ddtrace/internal/import_hooks.py -> build/lib.linux-x86_64-3.8/ddtrace/internal
copying ddtrace/internal/hostname.py -> build/lib.linux-x86_64-3.8/ddtrace/internal
copying ddtrace/internal/context_manager.py -> build/lib.linux-x86_64-3.8/ddtrace/internal
copying ddtrace/internal/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/internal
creating build/lib.linux-x86_64-3.8/ddtrace/http
copying ddtrace/http/headers.py -> build/lib.linux-x86_64-3.8/ddtrace/http
copying ddtrace/http/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/http
creating build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/system.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/sql.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/redis.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/priority.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/net.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/mongo.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/memcached.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/kombu.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/http.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/errors.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/elasticsearch.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/db.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/consul.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/cassandra.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/aws.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
copying ddtrace/ext/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/ext
creating build/lib.linux-x86_64-3.8/ddtrace/contrib
copying ddtrace/contrib/util.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib
copying ddtrace/contrib/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib
creating build/lib.linux-x86_64-3.8/ddtrace/commands
copying ddtrace/commands/ddtrace_run.py -> build/lib.linux-x86_64-3.8/ddtrace/commands
copying ddtrace/commands/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/commands
creating build/lib.linux-x86_64-3.8/ddtrace/bootstrap
copying ddtrace/bootstrap/sitecustomize.py -> build/lib.linux-x86_64-3.8/ddtrace/bootstrap
copying ddtrace/bootstrap/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/bootstrap
creating build/lib.linux-x86_64-3.8/ddtrace/vendor/wrapt
copying ddtrace/vendor/wrapt/wrappers.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/wrapt
copying ddtrace/vendor/wrapt/setup.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/wrapt
copying ddtrace/vendor/wrapt/importer.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/wrapt
copying ddtrace/vendor/wrapt/decorators.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/wrapt
copying ddtrace/vendor/wrapt/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/wrapt
creating build/lib.linux-x86_64-3.8/ddtrace/vendor/six
copying ddtrace/vendor/six/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/six
creating build/lib.linux-x86_64-3.8/ddtrace/vendor/psutil
copying ddtrace/vendor/psutil/setup.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/psutil
copying ddtrace/vendor/psutil/_pswindows.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/psutil
copying ddtrace/vendor/psutil/_pssunos.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/psutil
copying ddtrace/vendor/psutil/_psposix.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/psutil
copying ddtrace/vendor/psutil/_psosx.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/psutil
copying ddtrace/vendor/psutil/_pslinux.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/psutil
copying ddtrace/vendor/psutil/_psbsd.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/psutil
copying ddtrace/vendor/psutil/_psaix.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/psutil
copying ddtrace/vendor/psutil/_compat.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/psutil
copying ddtrace/vendor/psutil/_common.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/psutil
copying ddtrace/vendor/psutil/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/psutil
creating build/lib.linux-x86_64-3.8/ddtrace/vendor/monotonic
copying ddtrace/vendor/monotonic/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/monotonic
creating build/lib.linux-x86_64-3.8/ddtrace/vendor/dogstatsd
copying ddtrace/vendor/dogstatsd/route.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/dogstatsd
copying ddtrace/vendor/dogstatsd/context_async.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/dogstatsd
copying ddtrace/vendor/dogstatsd/context.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/dogstatsd
copying ddtrace/vendor/dogstatsd/compat.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/dogstatsd
copying ddtrace/vendor/dogstatsd/base.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/dogstatsd
copying ddtrace/vendor/dogstatsd/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/dogstatsd
creating build/lib.linux-x86_64-3.8/ddtrace/vendor/debtcollector
copying ddtrace/vendor/debtcollector/updating.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/debtcollector
copying ddtrace/vendor/debtcollector/renames.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/debtcollector
copying ddtrace/vendor/debtcollector/removals.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/debtcollector
copying ddtrace/vendor/debtcollector/moves.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/debtcollector
copying ddtrace/vendor/debtcollector/_utils.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/debtcollector
copying ddtrace/vendor/debtcollector/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/debtcollector
creating build/lib.linux-x86_64-3.8/ddtrace/vendor/attr
copying ddtrace/vendor/attr/validators.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/attr
copying ddtrace/vendor/attr/filters.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/attr
copying ddtrace/vendor/attr/exceptions.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/attr
copying ddtrace/vendor/attr/converters.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/attr
copying ddtrace/vendor/attr/_version_info.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/attr
copying ddtrace/vendor/attr/_make.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/attr
copying ddtrace/vendor/attr/_funcs.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/attr
copying ddtrace/vendor/attr/_config.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/attr
copying ddtrace/vendor/attr/_compat.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/attr
copying ddtrace/vendor/attr/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/vendor/attr
creating build/lib.linux-x86_64-3.8/ddtrace/profiling/exporter
copying ddtrace/profiling/exporter/pprof_pb2.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling/exporter
copying ddtrace/profiling/exporter/pprof.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling/exporter
copying ddtrace/profiling/exporter/http.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling/exporter
copying ddtrace/profiling/exporter/file.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling/exporter
copying ddtrace/profiling/exporter/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling/exporter
creating build/lib.linux-x86_64-3.8/ddtrace/profiling/collector
copying ddtrace/profiling/collector/threading.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling/collector
copying ddtrace/profiling/collector/memory.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling/collector
copying ddtrace/profiling/collector/exceptions.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling/collector
copying ddtrace/profiling/collector/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling/collector
creating build/lib.linux-x86_64-3.8/ddtrace/profiling/bootstrap
copying ddtrace/profiling/bootstrap/sitecustomize.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling/bootstrap
copying ddtrace/profiling/bootstrap/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/profiling/bootstrap
creating build/lib.linux-x86_64-3.8/ddtrace/profile/exporter
copying ddtrace/profile/exporter/pprof_pb2.py -> build/lib.linux-x86_64-3.8/ddtrace/profile/exporter
copying ddtrace/profile/exporter/pprof.py -> build/lib.linux-x86_64-3.8/ddtrace/profile/exporter
copying ddtrace/profile/exporter/http.py -> build/lib.linux-x86_64-3.8/ddtrace/profile/exporter
copying ddtrace/profile/exporter/file.py -> build/lib.linux-x86_64-3.8/ddtrace/profile/exporter
copying ddtrace/profile/exporter/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/profile/exporter
creating build/lib.linux-x86_64-3.8/ddtrace/profile/collector
copying ddtrace/profile/collector/threading.py -> build/lib.linux-x86_64-3.8/ddtrace/profile/collector
copying ddtrace/profile/collector/stack.py -> build/lib.linux-x86_64-3.8/ddtrace/profile/collector
copying ddtrace/profile/collector/memory.py -> build/lib.linux-x86_64-3.8/ddtrace/profile/collector
copying ddtrace/profile/collector/exceptions.py -> build/lib.linux-x86_64-3.8/ddtrace/profile/collector
copying ddtrace/profile/collector/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/profile/collector
creating build/lib.linux-x86_64-3.8/ddtrace/profile/bootstrap
copying ddtrace/profile/bootstrap/sitecustomize.py -> build/lib.linux-x86_64-3.8/ddtrace/profile/bootstrap
copying ddtrace/profile/bootstrap/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/profile/bootstrap
creating build/lib.linux-x86_64-3.8/ddtrace/opentracer/propagation
copying ddtrace/opentracer/propagation/text.py -> build/lib.linux-x86_64-3.8/ddtrace/opentracer/propagation
copying ddtrace/opentracer/propagation/propagator.py -> build/lib.linux-x86_64-3.8/ddtrace/opentracer/propagation
copying ddtrace/opentracer/propagation/http.py -> build/lib.linux-x86_64-3.8/ddtrace/opentracer/propagation
copying ddtrace/opentracer/propagation/binary.py -> build/lib.linux-x86_64-3.8/ddtrace/opentracer/propagation
copying ddtrace/opentracer/propagation/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/opentracer/propagation
creating build/lib.linux-x86_64-3.8/ddtrace/internal/runtime
copying ddtrace/internal/runtime/tag_collectors.py -> build/lib.linux-x86_64-3.8/ddtrace/internal/runtime
copying ddtrace/internal/runtime/runtime_metrics.py -> build/lib.linux-x86_64-3.8/ddtrace/internal/runtime
copying ddtrace/internal/runtime/metric_collectors.py -> build/lib.linux-x86_64-3.8/ddtrace/internal/runtime
copying ddtrace/internal/runtime/container.py -> build/lib.linux-x86_64-3.8/ddtrace/internal/runtime
copying ddtrace/internal/runtime/constants.py -> build/lib.linux-x86_64-3.8/ddtrace/internal/runtime
copying ddtrace/internal/runtime/collector.py -> build/lib.linux-x86_64-3.8/ddtrace/internal/runtime
copying ddtrace/internal/runtime/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/internal/runtime
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/vertica
copying ddtrace/contrib/vertica/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/vertica
copying ddtrace/contrib/vertica/constants.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/vertica
copying ddtrace/contrib/vertica/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/vertica
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/tornado
copying ddtrace/contrib/tornado/template.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/tornado
copying ddtrace/contrib/tornado/stack_context.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/tornado
copying ddtrace/contrib/tornado/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/tornado
copying ddtrace/contrib/tornado/handlers.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/tornado
copying ddtrace/contrib/tornado/decorators.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/tornado
copying ddtrace/contrib/tornado/constants.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/tornado
copying ddtrace/contrib/tornado/compat.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/tornado
copying ddtrace/contrib/tornado/application.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/tornado
copying ddtrace/contrib/tornado/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/tornado
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/sqlite3
copying ddtrace/contrib/sqlite3/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/sqlite3
copying ddtrace/contrib/sqlite3/connection.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/sqlite3
copying ddtrace/contrib/sqlite3/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/sqlite3
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/sqlalchemy
copying ddtrace/contrib/sqlalchemy/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/sqlalchemy
copying ddtrace/contrib/sqlalchemy/engine.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/sqlalchemy
copying ddtrace/contrib/sqlalchemy/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/sqlalchemy
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/requests
copying ddtrace/contrib/requests/session.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/requests
copying ddtrace/contrib/requests/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/requests
copying ddtrace/contrib/requests/legacy.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/requests
copying ddtrace/contrib/requests/constants.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/requests
copying ddtrace/contrib/requests/connection.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/requests
copying ddtrace/contrib/requests/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/requests
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/rediscluster
copying ddtrace/contrib/rediscluster/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/rediscluster
copying ddtrace/contrib/rediscluster/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/rediscluster
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/redis
copying ddtrace/contrib/redis/util.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/redis
copying ddtrace/contrib/redis/tracers.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/redis
copying ddtrace/contrib/redis/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/redis
copying ddtrace/contrib/redis/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/redis
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/pyramid
copying ddtrace/contrib/pyramid/trace.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pyramid
copying ddtrace/contrib/pyramid/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pyramid
copying ddtrace/contrib/pyramid/constants.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pyramid
copying ddtrace/contrib/pyramid/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pyramid
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/pymysql
copying ddtrace/contrib/pymysql/tracers.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pymysql
copying ddtrace/contrib/pymysql/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pymysql
copying ddtrace/contrib/pymysql/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pymysql
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/pymongo
copying ddtrace/contrib/pymongo/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pymongo
copying ddtrace/contrib/pymongo/parse.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pymongo
copying ddtrace/contrib/pymongo/client.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pymongo
copying ddtrace/contrib/pymongo/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pymongo
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/pymemcache
copying ddtrace/contrib/pymemcache/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pymemcache
copying ddtrace/contrib/pymemcache/client.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pymemcache
copying ddtrace/contrib/pymemcache/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pymemcache
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/pylons
copying ddtrace/contrib/pylons/renderer.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pylons
copying ddtrace/contrib/pylons/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pylons
copying ddtrace/contrib/pylons/middleware.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pylons
copying ddtrace/contrib/pylons/constants.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pylons
copying ddtrace/contrib/pylons/compat.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pylons
copying ddtrace/contrib/pylons/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pylons
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/pylibmc
copying ddtrace/contrib/pylibmc/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pylibmc
copying ddtrace/contrib/pylibmc/client.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pylibmc
copying ddtrace/contrib/pylibmc/addrs.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pylibmc
copying ddtrace/contrib/pylibmc/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/pylibmc
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/psycopg
copying ddtrace/contrib/psycopg/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/psycopg
copying ddtrace/contrib/psycopg/connection.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/psycopg
copying ddtrace/contrib/psycopg/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/psycopg
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/mysqldb
copying ddtrace/contrib/mysqldb/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/mysqldb
copying ddtrace/contrib/mysqldb/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/mysqldb
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/mysql
copying ddtrace/contrib/mysql/tracers.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/mysql
copying ddtrace/contrib/mysql/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/mysql
copying ddtrace/contrib/mysql/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/mysql
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/mongoengine
copying ddtrace/contrib/mongoengine/trace.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/mongoengine
copying ddtrace/contrib/mongoengine/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/mongoengine
copying ddtrace/contrib/mongoengine/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/mongoengine
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/molten
copying ddtrace/contrib/molten/wrappers.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/molten
copying ddtrace/contrib/molten/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/molten
copying ddtrace/contrib/molten/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/molten
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/mako
copying ddtrace/contrib/mako/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/mako
copying ddtrace/contrib/mako/constants.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/mako
copying ddtrace/contrib/mako/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/mako
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/logging
copying ddtrace/contrib/logging/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/logging
copying ddtrace/contrib/logging/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/logging
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/kombu
copying ddtrace/contrib/kombu/utils.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/kombu
copying ddtrace/contrib/kombu/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/kombu
copying ddtrace/contrib/kombu/constants.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/kombu
copying ddtrace/contrib/kombu/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/kombu
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/jinja2
copying ddtrace/contrib/jinja2/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/jinja2
copying ddtrace/contrib/jinja2/constants.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/jinja2
copying ddtrace/contrib/jinja2/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/jinja2
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/httplib
copying ddtrace/contrib/httplib/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/httplib
copying ddtrace/contrib/httplib/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/httplib
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/grpc
copying ddtrace/contrib/grpc/utils.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/grpc
copying ddtrace/contrib/grpc/server_interceptor.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/grpc
copying ddtrace/contrib/grpc/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/grpc
copying ddtrace/contrib/grpc/constants.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/grpc
copying ddtrace/contrib/grpc/client_interceptor.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/grpc
copying ddtrace/contrib/grpc/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/grpc
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/gevent
copying ddtrace/contrib/gevent/provider.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/gevent
copying ddtrace/contrib/gevent/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/gevent
copying ddtrace/contrib/gevent/greenlet.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/gevent
copying ddtrace/contrib/gevent/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/gevent
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/futures
copying ddtrace/contrib/futures/threading.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/futures
copying ddtrace/contrib/futures/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/futures
copying ddtrace/contrib/futures/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/futures
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/flask_cache
copying ddtrace/contrib/flask_cache/utils.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/flask_cache
copying ddtrace/contrib/flask_cache/tracers.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/flask_cache
copying ddtrace/contrib/flask_cache/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/flask_cache
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/flask
copying ddtrace/contrib/flask/wrappers.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/flask
copying ddtrace/contrib/flask/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/flask
copying ddtrace/contrib/flask/middleware.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/flask
copying ddtrace/contrib/flask/helpers.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/flask
copying ddtrace/contrib/flask/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/flask
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/falcon
copying ddtrace/contrib/falcon/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/falcon
copying ddtrace/contrib/falcon/middleware.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/falcon
copying ddtrace/contrib/falcon/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/falcon
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/elasticsearch
copying ddtrace/contrib/elasticsearch/transport.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/elasticsearch
copying ddtrace/contrib/elasticsearch/quantize.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/elasticsearch
copying ddtrace/contrib/elasticsearch/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/elasticsearch
copying ddtrace/contrib/elasticsearch/elasticsearch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/elasticsearch
copying ddtrace/contrib/elasticsearch/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/elasticsearch
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/dogpile_cache
copying ddtrace/contrib/dogpile_cache/region.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/dogpile_cache
copying ddtrace/contrib/dogpile_cache/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/dogpile_cache
copying ddtrace/contrib/dogpile_cache/lock.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/dogpile_cache
copying ddtrace/contrib/dogpile_cache/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/dogpile_cache
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/django
copying ddtrace/contrib/django/utils.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/django
copying ddtrace/contrib/django/restframework.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/django
copying ddtrace/contrib/django/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/django
copying ddtrace/contrib/django/middleware.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/django
copying ddtrace/contrib/django/conf.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/django
copying ddtrace/contrib/django/compat.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/django
copying ddtrace/contrib/django/apps.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/django
copying ddtrace/contrib/django/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/django
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/dbapi
copying ddtrace/contrib/dbapi/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/dbapi
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/consul
copying ddtrace/contrib/consul/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/consul
copying ddtrace/contrib/consul/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/consul
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/celery
copying ddtrace/contrib/celery/utils.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/celery
copying ddtrace/contrib/celery/task.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/celery
copying ddtrace/contrib/celery/signals.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/celery
copying ddtrace/contrib/celery/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/celery
copying ddtrace/contrib/celery/constants.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/celery
copying ddtrace/contrib/celery/app.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/celery
copying ddtrace/contrib/celery/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/celery
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/cassandra
copying ddtrace/contrib/cassandra/session.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/cassandra
copying ddtrace/contrib/cassandra/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/cassandra
copying ddtrace/contrib/cassandra/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/cassandra
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/bottle
copying ddtrace/contrib/bottle/trace.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/bottle
copying ddtrace/contrib/bottle/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/bottle
copying ddtrace/contrib/bottle/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/bottle
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/botocore
copying ddtrace/contrib/botocore/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/botocore
copying ddtrace/contrib/botocore/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/botocore
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/boto
copying ddtrace/contrib/boto/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/boto
copying ddtrace/contrib/boto/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/boto
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/asyncio
copying ddtrace/contrib/asyncio/wrappers.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/asyncio
copying ddtrace/contrib/asyncio/provider.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/asyncio
copying ddtrace/contrib/asyncio/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/asyncio
copying ddtrace/contrib/asyncio/helpers.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/asyncio
copying ddtrace/contrib/asyncio/compat.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/asyncio
copying ddtrace/contrib/asyncio/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/asyncio
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/algoliasearch
copying ddtrace/contrib/algoliasearch/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/algoliasearch
copying ddtrace/contrib/algoliasearch/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/algoliasearch
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/aiopg
copying ddtrace/contrib/aiopg/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/aiopg
copying ddtrace/contrib/aiopg/connection.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/aiopg
copying ddtrace/contrib/aiopg/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/aiopg
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/aiohttp
copying ddtrace/contrib/aiohttp/template.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/aiohttp
copying ddtrace/contrib/aiohttp/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/aiohttp
copying ddtrace/contrib/aiohttp/middlewares.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/aiohttp
copying ddtrace/contrib/aiohttp/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/aiohttp
creating build/lib.linux-x86_64-3.8/ddtrace/contrib/aiobotocore
copying ddtrace/contrib/aiobotocore/patch.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/aiobotocore
copying ddtrace/contrib/aiobotocore/__init__.py -> build/lib.linux-x86_64-3.8/ddtrace/contrib/aiobotocore
running build_ext
building 'ddtrace.internal._rand' extension
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/ddtrace
creating build/temp.linux-x86_64-3.8/ddtrace/internal
gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -march=x86-64 -mtune=generic -O3 -pipe -fno-plt -fno-semantic-interposition -march=x86-64 -mtune=generic -O3 -pipe -fno-plt -march=x86-64 -mtune=generic -O3 -pipe -fno-plt -fPIC -I/usr/include/python3.8 -c ddtrace/internal/_rand.c -o build/temp.linux-x86_64-3.8/ddtrace/internal/_rand.o
gcc -pthread -shared -Wl,-O1,--sort-common,--as-needed,-z,relro,-z,now -fno-semantic-interposition -Wl,-O1,--sort-common,--as-needed,-z,relro,-z,now build/temp.linux-x86_64-3.8/ddtrace/internal/_rand.o -L/usr/lib -o build/lib.linux-x86_64-3.8/ddtrace/internal/_rand.cpython-38-x86_64-linux-gnu.so
building 'ddtrace.profiling.collector.stack' extension
creating build/temp.linux-x86_64-3.8/ddtrace/profiling
creating build/temp.linux-x86_64-3.8/ddtrace/profiling/collector
gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -march=x86-64 -mtune=generic -O3 -pipe -fno-plt -fno-semantic-interposition -march=x86-64 -mtune=generic -O3 -pipe -fno-plt -march=x86-64 -mtune=generic -O3 -pipe -fno-plt -fPIC -I/usr/include/python3.8 -c ddtrace/profiling/collector/stack.c -o build/temp.linux-x86_64-3.8/ddtrace/profiling/collector/stack.o -DPy_BUILD_CORE
ddtrace/profiling/collector/stack.c:619:10: fatal error: internal/pystate.h: No such file or directory
619 | #include <internal/pystate.h>
| ^~~~~~~~~~~~~~~~~~~~
compilation terminated.
error: command 'gcc' failed with exit status 1
----------------------------------------
ERROR: Failed building wheel for ddtrace
Failed to build ddtrace
ERROR: Could not build wheels for ddtrace which use PEP 517 and cannot be installed directly
```
### What is the result that you expected?
I should be able to install ddtrace without using the provided wheels, as I could with previous versions.
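The failing `#include <internal/pystate.h>` suggests a Cython-generated `stack.c` produced against an older CPython: that private header layout changed in Python 3.8, so reusing the shipped `.c` file cannot compile. The eventual fix (the `force=True` in the setup.py diff below) makes Cython regenerate its sources at build time. As a purely hypothetical local workaround, stale generated sources can be deleted from an unpacked sdist before building so they are re-cythonized for the running interpreter:

```python
import os

def remove_stale_c_sources(root: str) -> list:
    """Delete generated C files that still reference internal/pystate.h.

    Hypothetical helper: walks a source tree and removes any .c file left
    over from an old Cython run, so the next build regenerates it for the
    interpreter actually doing the compilation.
    """
    removed = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".c"):
                continue
            path = os.path.join(dirpath, name)
            with open(path, errors="ignore") as fh:
                if "internal/pystate.h" in fh.read():
                    os.remove(path)
                    removed.append(path)
    return removed
```

This only mirrors what the upstream `force=True` change achieves in-tree; it is a sketch, not the project's supported build path.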
| [
{
"content": "import os\nimport sys\n\nfrom setuptools import setup, find_packages\nfrom setuptools.command.test import test as TestCommand\n\n# ORDER MATTERS\n# Import this after setuptools or it will fail\nfrom Cython.Build import cythonize # noqa: I100\nimport Cython.Distutils\n\n\nHERE = os.path.dirname(os... | [
{
"content": "import os\nimport sys\n\nfrom setuptools import setup, find_packages\nfrom setuptools.command.test import test as TestCommand\n\n# ORDER MATTERS\n# Import this after setuptools or it will fail\nfrom Cython.Build import cythonize # noqa: I100\nimport Cython.Distutils\n\n\nHERE = os.path.dirname(os... | diff --git a/setup.py b/setup.py
index 0e7dd44fdcb..5caab5a5e57 100644
--- a/setup.py
+++ b/setup.py
@@ -173,6 +173,7 @@ def get_exts_for(name):
"PY_MINOR_VERSION": sys.version_info.minor,
"PY_MICRO_VERSION": sys.version_info.micro,
},
+ force=True,
)
+ get_exts_for("wrapt")
+ get_exts_for("psutil"),
diff --git a/tox.ini b/tox.ini
index dacb183bac8..0d0af1f161f 100644
--- a/tox.ini
+++ b/tox.ini
@@ -158,8 +158,7 @@ isolated_build = true
# meaning running on py3.x will fail
# https://stackoverflow.com/questions/57459123/why-do-i-need-to-run-tox-twice-to-test-a-python-package-with-c-extension
whitelist_externals=rm
-commands_pre=rm -f ddtrace/profiling/_build.c ddtrace/profiling/collector/stack.c ddtrace/profiling/collector/_traceback.c ddtrace/internal/_rand.c
- {envpython} {toxinidir}/setup.py develop
+commands_pre={envpython} {toxinidir}/setup.py develop
usedevelop =
# do not use develop mode with celery as running multiple python versions within
# same job will cause problem for tests that use ddtrace-run
|
kivy__python-for-android-2055 | Can't use AsyncImage with HTTPS URL (or any HTTPS URL with any request): fix is to manually load certifi
### Versions
* Python: 3
* OS: Android
* Kivy: 1.10.1
* Cython: 0.29.7
### Description
Trying to open an HTTPS URL fails with:
urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate
This actually happens with an AsyncImage, used like this:
```
AsyncImage:
source: 'https://i.goopics.net/27Odx.png'
```
Work perfectly on Windows, not on Android
### buildozer.spec
Command:
```
buildozer android debug
```
Spec file:
```
[app]
# (str) Title of your application
title = myapp
# (str) Package name
package.name = myapp
# (str) Package domain (needed for android/ios packaging)
package.domain = org.myapp
# (str) Source code where the main.py live
source.dir = ./kivy_app
# (list) Source files to include (let empty to include all the files)
source.include_exts = py,png,jpg,kv,atlas
# (list) List of inclusions using pattern matching
#source.include_patterns = assets/*,images/*.png
# (list) Source files to exclude (let empty to not exclude anything)
#source.exclude_exts = spec
# (list) List of directory to exclude (let empty to not exclude anything)
#source.exclude_dirs = tests, bin
# (list) List of exclusions using pattern matching
#source.exclude_patterns = license,images/*/*.jpg
# (str) Application versioning (method 1)
version = 0.2
# (str) Application versioning (method 2)
# version.regex = __version__ = ['"](.*)['"]
# version.filename = %(source.dir)s/main.py
# (list) Application requirements
# comma separated e.g. requirements = sqlite3,kivy
requirements = certifi,openssl,python3,kivy,android
# (str) Custom source folders for requirements
# Sets custom source for any requirements with recipes
# requirements.source.kivy = ../../kivy
# (list) Garden requirements
#garden_requirements =
# (str) Presplash of the application
#presplash.filename = %(source.dir)s/data/presplash.png
# (str) Icon of the application
#icon.filename = %(source.dir)s/data/icon.png
# (str) Supported orientation (one of landscape, sensorLandscape, portrait or all)
orientation = all
# (list) List of service to declare
#services = NAME:ENTRYPOINT_TO_PY,NAME2:ENTRYPOINT2_TO_PY
#
# OSX Specific
#
#
# author = © Copyright Info
# change the major version of python used by the app
osx.python_version = 3.7
# Kivy version to use
osx.kivy_version = 1.10.1
#
# Android specific
#
# (bool) Indicate if the application should be fullscreen or not
fullscreen = 0
# (string) Presplash background color (for new android toolchain)
# Supported formats are: #RRGGBB #AARRGGBB or one of the following names:
# red, blue, green, black, white, gray, cyan, magenta, yellow, lightgray,
# darkgray, grey, lightgrey, darkgrey, aqua, fuchsia, lime, maroon, navy,
# olive, purple, silver, teal.
#android.presplash_color = #FFFFFF
# (list) Permissions
android.permissions = INTERNET
# (int) Target Android API, should be as high as possible.
android.api = 27
# (int) Minimum API your APK will support.
android.minapi = 21
# (str) Android NDK version to use
android.ndk = 17c
# (int) Android NDK API to use. This is the minimum API your app will support, it should usually match android.minapi.
android.ndk_api = 21
# (bool) Use --private data storage (True) or --dir public storage (False)
#android.private_storage = True
# (str) Android NDK directory (if empty, it will be automatically downloaded.)
#android.ndk_path =
# (str) Android SDK directory (if empty, it will be automatically downloaded.)
#android.sdk_path =
# (str) ANT directory (if empty, it will be automatically downloaded.)
#android.ant_path =
# (bool) If True, then skip trying to update the Android sdk
# This can be useful to avoid excess Internet downloads or save time
# when an update is due and you just want to test/build your package
android.skip_update = False
# (bool) If True, then automatically accept SDK license
# agreements. This is intended for automation only. If set to False,
# the default, you will be shown the license when first running
# buildozer.
android.accept_sdk_license = True
# (str) Android entry point, default is ok for Kivy-based app
#android.entrypoint = org.renpy.android.PythonActivity
# (list) Pattern to whitelist for the whole project
#android.whitelist =
# (str) Path to a custom whitelist file
#android.whitelist_src =
# (str) Path to a custom blacklist file
#android.blacklist_src =
# (list) List of Java .jar files to add to the libs so that pyjnius can access
# their classes. Don't add jars that you do not need, since extra jars can slow
# down the build process. Allows wildcards matching, for example:
# OUYA-ODK/libs/*.jar
#android.add_jars = foo.jar,bar.jar,path/to/more/*.jar
# (list) List of Java files to add to the android project (can be java or a
# directory containing the files)
#android.add_src =
# (list) Android AAR archives to add (currently works only with sdl2_gradle
# bootstrap)
#android.add_aars =
# (list) Gradle dependencies to add (currently works only with sdl2_gradle
# bootstrap)
#android.gradle_dependencies =
# (list) Java classes to add as activities to the manifest.
#android.add_activites = com.example.ExampleActivity
# (str) python-for-android branch to use, defaults to master
#p4a.branch = master
# (str) OUYA Console category. Should be one of GAME or APP
# If you leave this blank, OUYA support will not be enabled
#android.ouya.category = GAME
# (str) Filename of OUYA Console icon. It must be a 732x412 png image.
#android.ouya.icon.filename = %(source.dir)s/data/ouya_icon.png
# (str) XML file to include as an intent filters in <activity> tag
#android.manifest.intent_filters =
# (str) launchMode to set for the main activity
#android.manifest.launch_mode = standard
# (list) Android additional libraries to copy into libs/armeabi
#android.add_libs_armeabi = libs/android/*.so
#android.add_libs_armeabi_v7a = libs/android-v7/*.so
#android.add_libs_x86 = libs/android-x86/*.so
#android.add_libs_mips = libs/android-mips/*.so
# (bool) Indicate whether the screen should stay on
# Don't forget to add the WAKE_LOCK permission if you set this to True
#android.wakelock = False
# (list) Android application meta-data to set (key=value format)
#android.meta_data =
# (list) Android library project to add (will be added in the
# project.properties automatically.)
#android.library_references =
# (list) Android shared libraries which will be added to AndroidManifest.xml using <uses-library> tag
#android.uses_library =
# (str) Android logcat filters to use
#android.logcat_filters = *:S python:D
# (bool) Copy library instead of making a libpymodules.so
#android.copy_libs = 1
# (str) The Android arch to build for, choices: armeabi-v7a, arm64-v8a, x86, x86_64
android.arch = armeabi-v7a
#
# Python for android (p4a) specific
#
# (str) python-for-android git clone directory (if empty, it will be automatically cloned from github)
#p4a.source_dir =
# (str) The directory in which python-for-android should look for your own build recipes (if any)
#p4a.local_recipes =
# (str) Filename to the hook for p4a
#p4a.hook =
# (str) Bootstrap to use for android builds
# p4a.bootstrap = sdl2
# (int) port number to specify an explicit --port= p4a argument (eg for bootstrap flask)
#p4a.port =
#
# iOS specific
#
# (str) Path to a custom kivy-ios folder
#ios.kivy_ios_dir = ../kivy-ios
# Alternately, specify the URL and branch of a git checkout:
ios.kivy_ios_url = https://github.com/kivy/kivy-ios
ios.kivy_ios_branch = master
# Another platform dependency: ios-deploy
# Uncomment to use a custom checkout
#ios.ios_deploy_dir = ../ios_deploy
# Or specify URL and branch
ios.ios_deploy_url = https://github.com/phonegap/ios-deploy
ios.ios_deploy_branch = 1.7.0
# (str) Name of the certificate to use for signing the debug version
# Get a list of available identities: buildozer ios list_identities
#ios.codesign.debug = "iPhone Developer: <lastname> <firstname> (<hexstring>)"
# (str) Name of the certificate to use for signing the release version
#ios.codesign.release = %(ios.codesign.debug)s
[buildozer]
# (int) Log level (0 = error only, 1 = info, 2 = debug (with command output))
log_level = 2
# (int) Display warning if buildozer is run as root (0 = False, 1 = True)
warn_on_root = 1
# (str) Path to build artifact storage, absolute or relative to spec file
# build_dir = ./.buildozer
# (str) Path to build output (i.e. .apk, .ipa) storage
# bin_dir = ./bin
# -----------------------------------------------------------------------------
# List as sections
#
# You can define all the "list" as [section:key].
# Each line will be considered as a option to the list.
# Let's take [app] / source.exclude_patterns.
# Instead of doing:
#
#[app]
#source.exclude_patterns = license,data/audio/*.wav,data/images/original/*
#
# This can be translated into:
#
#[app:source.exclude_patterns]
#license
#data/audio/*.wav
#data/images/original/*
#
# -----------------------------------------------------------------------------
# Profiles
#
# You can extend section / key with a profile
# For example, you want to deploy a demo version of your application without
# HD content. You could first change the title to add "(demo)" in the name
# and extend the excluded directories to remove the HD content.
#
#[app@demo]
#title = My Application (demo)
#
#[app:source.exclude_patterns@demo]
#images/hd/*
#
# Then, invoke the command line with the "demo" profile:
#
#buildozer --profile demo android debug
```
### Logs
```
05-27 19:29:05.842 23309 23355 I python : [ERROR ] [Loader ] Failed to load image <https://i.goopics.net/27Odx.png>
05-27 19:29:05.842 23309 23355 I python : Traceback (most recent call last):
05-27 19:29:05.842 23309 23355 I python : File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/urllib/request.py", line 1317, in do_open
05-27 19:29:05.842 23309 23355 I python :    File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/http/client.py", line 1229, in request
05-27 19:29:05.842 23309 23355 I python :    File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/http/client.py", line 1275, in _send_request
05-27 19:29:05.842 23309 23355 I python : File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/http/client.py", line 1224, in endheaders
05-27 19:29:05.842 23309 23355 I python : File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/http/client.py", line 1016, in _send_output
05-27 19:29:05.842 23309 23355 I python : File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/http/client.py", line 956, in send
05-27 19:29:05.842 23309 23355 I python :    File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/http/client.py", line 1392, in connect
05-27 19:29:05.842 23309 23355 I python :    File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/ssl.py", line 412, in wrap_socket
05-27 19:29:05.842 23309 23355 I python : File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/ssl.py", line 853, in _create
05-27 19:29:05.842 23309 23355 I python : File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/ssl.py", line 1117, in do_handshake
05-27 19:29:05.842 23309 23355 I python : ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1051)
05-27 19:29:05.842 23309 23355 I python :
05-27 19:29:05.842 23309 23355 I python : During handling of the above exception, another exception occurred:
05-27 19:29:05.842 23309 23355 I python :
05-27 19:29:05.842 23309 23355 I python : Traceback (most recent call last):
05-27 19:29:05.842 23309 23355 I python : File "/home/user/hostcwd/.buildozer/android/platform/build/build/python-installs/kydoo/kivy/loader.py", line 342, in _load_urllib
05-27 19:29:05.842 23309 23355 I python : File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/urllib/request.py", line 525, in open
05-27 19:29:05.842 23309 23355 I python :    File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/urllib/request.py", line 543, in _open
05-27 19:29:05.842 23309 23355 I python :    File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/urllib/request.py", line 503, in _call_chain
05-27 19:29:05.842 23309 23355 I python : File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/urllib/request.py", line 1360, in https_open
05-27 19:29:05.842 23309 23355 I python : File "/home/user/hostcwd/.buildozer/android/platform/build/build/other_builds/python3-libffi-openssl-sqlite3/armeabi-v7a__ndk_target_21/python3/Lib/urllib/request.py", line 1319, in do_open
05-27 19:29:05.842 23309 23355 I python : urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1051)>
```
I actually found a """solution""" using:
```
import ssl
try:
_create_unverified_https_context = ssl._create_unverified_context
except AttributeError:
# Legacy Python that doesn't verify HTTPS certificates by default
pass
else:
# Handle target environment that doesn't support HTTPS verification
ssl._create_default_https_context = _create_unverified_https_context
```
But using that in my main.py doesn't fix AsyncImage or any call in other py files.
Any ideas?
Thanks
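The change that landed in python-for-android's kivy recipe (shown in the diff below, which adds `certifi` to `python_depends`) ships a CA bundle inside the APK. With `certifi` in the app's requirements, verification can then stay enabled instead of being disabled; a hedged sketch, where the fallback branch is an assumption for environments without certifi:

```python
import os
import ssl

def default_https_context() -> ssl.SSLContext:
    """Build a verifying SSL context, preferring certifi's CA bundle.

    On Android there is no OpenSSL-readable system CA store, which is why
    the stock context raises CERTIFICATE_VERIFY_FAILED; certifi ships its
    own bundle. Falls back to platform defaults if certifi is missing.
    """
    try:
        import certifi
    except ImportError:
        return ssl.create_default_context()
    # Some libraries only consult the environment variable, so set it too.
    os.environ.setdefault("SSL_CERT_FILE", certifi.where())
    return ssl.create_default_context(cafile=certifi.where())

ctx = default_https_context()
```

Unlike the `_create_unverified_context` trick above, this keeps certificate checking on, so it also covers indirect users such as AsyncImage once `SSL_CERT_FILE` is set early in `main.py`.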
| [
{
"content": "from pythonforandroid.recipe import CythonRecipe\nfrom pythonforandroid.toolchain import current_directory, shprint\nfrom os.path import exists, join, basename\nimport sh\nimport glob\n\n\nclass KivyRecipe(CythonRecipe):\n version = '1.11.1'\n url = 'https://github.com/kivy/kivy/archive/{ver... | [
{
"content": "from pythonforandroid.recipe import CythonRecipe\nfrom pythonforandroid.toolchain import current_directory, shprint\nfrom os.path import exists, join, basename\nimport sh\nimport glob\n\n\nclass KivyRecipe(CythonRecipe):\n version = '1.11.1'\n url = 'https://github.com/kivy/kivy/archive/{ver... | diff --git a/pythonforandroid/recipes/kivy/__init__.py b/pythonforandroid/recipes/kivy/__init__.py
index 3106f25ce6..a93627a021 100644
--- a/pythonforandroid/recipes/kivy/__init__.py
+++ b/pythonforandroid/recipes/kivy/__init__.py
@@ -11,6 +11,7 @@ class KivyRecipe(CythonRecipe):
name = 'kivy'
depends = ['sdl2', 'pyjnius', 'setuptools']
+ python_depends = ['certifi']
def cythonize_build(self, env, build_dir='.'):
super(KivyRecipe, self).cythonize_build(env, build_dir=build_dir)
|
blaze__blaze-1560 | Use of deprecated `flask.ext.cors` results in a warning
``` python
In [1]: import blaze as bz
C:\Python\envs\py-dev\lib\site-packages\flask\exthook.py:71: ExtDeprecationWarning: Importing flask.ext.cors is deprecated, use flask_cors instead.
.format(x=modname), ExtDeprecationWarning
```
Looks like the culprit is:
https://github.com/blaze/blaze/blob/bcddeba0230743d040bc915804af2ff906ce4758/blaze/server/server.py#L22
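The fix is a one-line import change (see the diff below). For code that must keep running against both old and new environments, the usual migration shim tries the standalone package name first; whether `flask_cors` or the legacy `flask.ext.cors` path is importable depends on the environment, so this is only a sketch:

```python
import importlib

def load_cross_origin():
    """Import cross_origin from the modern path, then the deprecated one.

    Returns None when Flask-CORS is not installed at all, so callers can
    degrade gracefully; both module names are environment assumptions.
    """
    for modname in ("flask_cors", "flask.ext.cors"):
        try:
            return importlib.import_module(modname).cross_origin
        except ImportError:
            continue
    return None

cross_origin = load_cross_origin()
```

In blaze's case the shim is unnecessary since Flask-CORS has provided the `flask_cors` name for a long time, which is why the patch simply rewrites the import.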
| [
{
"content": "from __future__ import absolute_import, division, print_function\n\nimport sys\nimport logging\nfrom logging import Formatter\nfrom functools import wraps\nimport traceback\nimport collections\nfrom datetime import datetime\nimport errno\nimport functools\nfrom hashlib import md5\nimport os\nimpor... | [
{
"content": "from __future__ import absolute_import, division, print_function\n\nimport sys\nimport logging\nfrom logging import Formatter\nfrom functools import wraps\nimport traceback\nimport collections\nfrom datetime import datetime\nimport errno\nimport functools\nfrom hashlib import md5\nimport os\nimpor... | diff --git a/blaze/server/server.py b/blaze/server/server.py
index 1438cb0af..15abdc741 100644
--- a/blaze/server/server.py
+++ b/blaze/server/server.py
@@ -19,7 +19,7 @@
from datashape import discover, pprint
import flask
from flask import Blueprint, Flask, Response
-from flask.ext.cors import cross_origin
+from flask_cors import cross_origin
from werkzeug.http import parse_options_header
from toolz import valmap, compose
diff --git a/blaze/server/tests/test_server.py b/blaze/server/tests/test_server.py
index f32305340..6decbbea9 100644
--- a/blaze/server/tests/test_server.py
+++ b/blaze/server/tests/test_server.py
@@ -2,7 +2,7 @@
import pytest
pytest.importorskip('flask')
-pytest.importorskip('flask.ext.cors')
+pytest.importorskip('flask_cors')
from base64 import b64encode
from copy import copy
diff --git a/docs/source/whatsnew/0.12.0.txt b/docs/source/whatsnew/0.12.0.txt
new file mode 100644
index 000000000..278001bcf
--- /dev/null
+++ b/docs/source/whatsnew/0.12.0.txt
@@ -0,0 +1,45 @@
+Release 0.12.0
+-----------------
+
+:Release: 0.12.0
+
+New Expressions
+~~~~~~~~~~~~~~~
+
+None
+
+Improved Expressions
+~~~~~~~~~~~~~~~~~~~~
+
+None
+
+New Backends
+~~~~~~~~~~~~
+
+None
+
+Improved Backends
+~~~~~~~~~~~~~~~~~
+
+None
+
+Experimental Features
+~~~~~~~~~~~~~~~~~~~~~
+
+None
+
+API Changes
+~~~~~~~~~~~
+
+None
+
+Bug Fixes
+~~~~~~~~~
+
+* The ``flask.ext.cors`` import was updated to resolve a ``DeprecationWarning``
+(:issue:`1556`).
+
+Miscellaneous
+~~~~~~~~~~~~~
+
+None
|
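The PR above simply switches to the new `flask_cors` import path. An alternative pattern for surviving a dependency rename (not what the PR does; a hedged sketch with a hypothetical helper name, demonstrated on stdlib modules since `flask_cors` may not be installed here) is to try the candidate names in order:

```python
import importlib

def import_first_available(candidates):
    """Return the first module in `candidates` that imports successfully.

    Useful when a dependency moves, as flask.ext.cors became flask_cors,
    and code must keep working across both versions.
    """
    for name in candidates:
        try:
            return importlib.import_module(name)
        except ImportError:
            continue
    raise ImportError("no importable module among: %r" % (candidates,))

# Demonstrated with stdlib names rather than the flask packages:
mod = import_first_available(["not_a_real_module_xyz", "json"])
print(mod.__name__)  # -> json
```

The straight rename in the diff is simpler when the old path is fully deprecated; the fallback is only worth it while both names must be supported.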
twisted__twisted-11958 | expand mypy .* module overrides
**Is your feature request related to a problem? Please describe.**
We'd like to be able to delete a module from pyproject.toml to mark it as fully type annotated; however, having `.*` overrides with weaker type hinting prevents this.
**Describe the solution you'd like**
expand mypy .* module overrides
| [
{
"content": "# -*- test-case-name: twisted.words.test.test_jabberjid -*-\n#\n# Copyright (c) Twisted Matrix Laboratories.\n# See LICENSE for details.\n\n\"\"\"\nJabber Identifier support.\n\nThis module provides an object to represent Jabber Identifiers (JIDs) and\nparse string representations into them with p... | [
{
"content": "# -*- test-case-name: twisted.words.test.test_jabberjid -*-\n#\n# Copyright (c) Twisted Matrix Laboratories.\n# See LICENSE for details.\n\n\"\"\"\nJabber Identifier support.\n\nThis module provides an object to represent Jabber Identifiers (JIDs) and\nparse string representations into them with p... | diff --git a/pyproject.toml b/pyproject.toml
index 5eec9f57d8a..d4f872eaed0 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -318,14 +318,76 @@ no_implicit_reexport = false
allow_untyped_defs = true
check_untyped_defs = false
module = [
- 'twisted._threads.*',
+ 'twisted._threads.test.test_team',
+ 'twisted._threads.test.test_threadworker',
'twisted.application.app',
'twisted.application.internet',
'twisted.application.service',
'twisted.application.test.test_internet',
- 'twisted.conch.*',
- 'twisted.cred.*',
- 'twisted.enterprise.*',
+ 'twisted.conch.client.agent',
+ 'twisted.conch.client.default',
+ 'twisted.conch.client.direct',
+ 'twisted.conch.endpoints',
+ 'twisted.conch.insults.helper',
+ 'twisted.conch.insults.insults',
+ 'twisted.conch.insults.window',
+ 'twisted.conch.ls',
+ 'twisted.conch.manhole',
+ 'twisted.conch.manhole_tap',
+ 'twisted.conch.mixin',
+ 'twisted.conch.recvline',
+ 'twisted.conch.scripts.cftp',
+ 'twisted.conch.scripts.ckeygen',
+ 'twisted.conch.scripts.conch',
+ 'twisted.conch.scripts.tkconch',
+ 'twisted.conch.ssh.agent',
+ 'twisted.conch.ssh.channel',
+ 'twisted.conch.ssh.connection',
+ 'twisted.conch.ssh.factory',
+ 'twisted.conch.ssh.filetransfer',
+ 'twisted.conch.ssh.forwarding',
+ 'twisted.conch.ssh.keys',
+ 'twisted.conch.ssh.service',
+ 'twisted.conch.ssh.session',
+ 'twisted.conch.ssh.sexpy',
+ 'twisted.conch.ssh.transport',
+ 'twisted.conch.ssh.userauth',
+ 'twisted.conch.stdio',
+ 'twisted.conch.tap',
+ 'twisted.conch.telnet',
+ 'twisted.conch.test.loopback',
+ 'twisted.conch.test.test_agent',
+ 'twisted.conch.test.test_cftp',
+ 'twisted.conch.test.test_channel',
+ 'twisted.conch.test.test_checkers',
+ 'twisted.conch.test.test_ckeygen',
+ 'twisted.conch.test.test_conch',
+ 'twisted.conch.test.test_connection',
+ 'twisted.conch.test.test_default',
+ 'twisted.conch.test.test_endpoints',
+ 'twisted.conch.test.test_filetransfer',
+ 'twisted.conch.test.test_forwarding',
+ 'twisted.conch.test.test_helper',
+ 'twisted.conch.test.test_insults',
+ 'twisted.conch.test.test_keys',
+ 'twisted.conch.test.test_knownhosts',
+ 'twisted.conch.test.test_manhole',
+ 'twisted.conch.test.test_mixin',
+ 'twisted.conch.test.test_recvline',
+ 'twisted.conch.test.test_session',
+ 'twisted.conch.test.test_ssh',
+ 'twisted.conch.test.test_telnet',
+ 'twisted.conch.test.test_transport',
+ 'twisted.conch.test.test_userauth',
+ 'twisted.conch.test.test_window',
+ 'twisted.conch.ui.tkvt100',
+ 'twisted.conch.unix',
+ 'twisted.cred.checkers',
+ 'twisted.cred.strcred',
+ 'twisted.cred.test.test_cred',
+ 'twisted.cred.test.test_digestauth',
+ 'twisted.cred.test.test_strcred',
+ 'twisted.enterprise.adbapi',
'twisted.internet._baseprocess',
'twisted.internet._dumbwin32proc',
'twisted.internet._glibbase',
@@ -345,7 +407,9 @@ module = [
'twisted.internet.iocpreactor.reactor',
'twisted.internet.iocpreactor.udp',
'twisted.internet.kqreactor',
+ 'twisted.internet.posixbase',
'twisted.internet.process',
+ 'twisted.internet.protocol',
'twisted.internet.serialport',
'twisted.internet.test._posixifaces',
'twisted.internet.test.connectionmixins',
@@ -353,6 +417,7 @@ module = [
'twisted.internet.test.test_abstract',
'twisted.internet.test.test_address',
'twisted.internet.test.test_asyncioreactor',
+ 'twisted.internet.test.test_base',
'twisted.internet.test.test_baseprocess',
'twisted.internet.test.test_defer_await',
'twisted.internet.test.test_defer_yieldfrom',
@@ -381,6 +446,7 @@ module = [
'twisted.internet.test.test_udp_internals',
'twisted.internet.test.test_unix',
'twisted.internet.test.test_win32events',
+ 'twisted.internet.testing',
'twisted.internet.threads',
'twisted.internet.tksupport',
'twisted.internet.udp',
@@ -389,14 +455,64 @@ module = [
'twisted.internet.win32eventreactor',
'twisted.internet.wxreactor',
'twisted.internet.wxsupport',
- 'twisted.logger.*',
- 'twisted.mail.*',
- 'twisted.names.*',
- 'twisted.pair.*',
- 'twisted.persisted.*',
- 'twisted.plugin.*',
- 'twisted.plugins.*',
- 'twisted.positioning.*',
+ 'twisted.logger._json',
+ 'twisted.mail._cred',
+ 'twisted.mail._pop3client',
+ 'twisted.mail.alias',
+ 'twisted.mail.imap4',
+ 'twisted.mail.mail',
+ 'twisted.mail.maildir',
+ 'twisted.mail.pb',
+ 'twisted.mail.pop3',
+ 'twisted.mail.protocols',
+ 'twisted.mail.relay',
+ 'twisted.mail.relaymanager',
+ 'twisted.mail.scripts.mailmail',
+ 'twisted.mail.smtp',
+ 'twisted.mail.tap',
+ 'twisted.mail.test.pop3testserver',
+ 'twisted.mail.test.test_imap',
+ 'twisted.mail.test.test_mail',
+ 'twisted.mail.test.test_mailmail',
+ 'twisted.mail.test.test_options',
+ 'twisted.mail.test.test_pop3',
+ 'twisted.mail.test.test_pop3client',
+ 'twisted.mail.test.test_smtp',
+ 'twisted.names.authority',
+ 'twisted.names.cache',
+ 'twisted.names.client',
+ 'twisted.names.common',
+ 'twisted.names.dns',
+ 'twisted.names.hosts',
+ 'twisted.names.root',
+ 'twisted.names.secondary',
+ 'twisted.names.server',
+ 'twisted.names.srvconnect',
+ 'twisted.names.tap',
+ 'twisted.names.test.test_cache',
+ 'twisted.names.test.test_client',
+ 'twisted.names.test.test_common',
+ 'twisted.names.test.test_dns',
+ 'twisted.names.test.test_examples',
+ 'twisted.names.test.test_hosts',
+ 'twisted.names.test.test_names',
+ 'twisted.names.test.test_rootresolve',
+ 'twisted.names.test.test_server',
+ 'twisted.names.test.test_srvconnect',
+ 'twisted.names.test.test_tap',
+ 'twisted.pair.test.test_tuntap',
+ 'twisted.pair.testing',
+ 'twisted.pair.tuntap',
+ 'twisted.persisted._tokenize',
+ 'twisted.persisted.aot',
+ 'twisted.persisted.sob',
+ 'twisted.persisted.styles',
+ 'twisted.plugin',
+ 'twisted.plugins.cred_unix',
+ 'twisted.positioning._sentence',
+ 'twisted.positioning.nmea',
+ 'twisted.positioning.test.test_nmea',
+ 'twisted.positioning.test.test_sentence',
'twisted.protocols.amp',
'twisted.protocols.basic',
'twisted.protocols.finger',
@@ -413,10 +529,10 @@ module = [
'twisted.protocols.sip',
'twisted.protocols.socks',
'twisted.protocols.stateful',
- 'twisted.protocols.tls',
- 'twisted.protocols.wire',
'twisted.protocols.test.test_basic',
'twisted.protocols.test.test_tls',
+ 'twisted.protocols.tls',
+ 'twisted.protocols.wire',
'twisted.python.failure',
'twisted.python.formmethod',
'twisted.python.logfile',
@@ -443,13 +559,24 @@ module = [
'twisted.python.util',
'twisted.python.win32',
'twisted.python.zipstream',
- 'twisted.runner.procmon',
'twisted.runner.inetd',
- 'twisted.runner.test.test_procmon',
+ 'twisted.runner.procmon',
'twisted.runner.test.test_inetdconf',
- 'twisted.scripts.*',
- 'twisted.spread.*',
- 'twisted.tap.*',
+ 'twisted.runner.test.test_procmon',
+ 'twisted.scripts._twistd_unix',
+ 'twisted.scripts.test.test_scripts',
+ 'twisted.scripts.trial',
+ 'twisted.spread.banana',
+ 'twisted.spread.flavors',
+ 'twisted.spread.jelly',
+ 'twisted.spread.pb',
+ 'twisted.spread.publish',
+ 'twisted.spread.test.test_banana',
+ 'twisted.spread.test.test_jelly',
+ 'twisted.spread.test.test_pb',
+ 'twisted.spread.test.test_pbfailure',
+ 'twisted.spread.util',
+ 'twisted.tap.ftp',
'twisted.test.iosim',
'twisted.test.process_twisted',
'twisted.test.stdio_test_consumer',
@@ -487,6 +614,7 @@ module = [
'twisted.test.test_paths',
'twisted.test.test_pcp',
'twisted.test.test_persisted',
+ 'twisted.test.test_plugin',
'twisted.test.test_policies',
'twisted.test.test_postfix',
'twisted.test.test_process',
@@ -515,19 +643,149 @@ module = [
'twisted.test.test_unix',
'twisted.test.test_usage',
'twisted.test.testutils',
- 'twisted.trial.*',
- 'twisted.web.*',
- 'twisted.words.*',
- 'twisted.test.test_plugin',
- 'twisted.internet.testing',
- 'twisted.internet.test.test_base',
- 'twisted.internet.protocol',
- 'twisted.internet.posixbase',
+ 'twisted.trial._asynctest',
+ 'twisted.trial._dist.test.test_disttrial',
+ 'twisted.trial._dist.test.test_matchers',
+ 'twisted.trial._dist.test.test_stream',
+ 'twisted.trial._dist.test.test_worker',
+ 'twisted.trial._dist.test.test_workertrial',
+ 'twisted.trial._dist.workerreporter',
+ 'twisted.trial._synctest',
+ 'twisted.trial.reporter',
+ 'twisted.trial.runner',
+ 'twisted.trial.test.detests',
+ 'twisted.trial.test.erroneous',
+ 'twisted.trial.test.mockcustomsuite',
+ 'twisted.trial.test.mockcustomsuite2',
+ 'twisted.trial.test.mockcustomsuite3',
+ 'twisted.trial.test.skipping',
+ 'twisted.trial.test.suppression',
+ 'twisted.trial.test.test_assertions',
+ 'twisted.trial.test.test_asyncassertions',
+ 'twisted.trial.test.test_deferred',
+ 'twisted.trial.test.test_keyboard',
+ 'twisted.trial.test.test_loader',
+ 'twisted.trial.test.test_log',
+ 'twisted.trial.test.test_plugins',
+ 'twisted.trial.test.test_pyunitcompat',
+ 'twisted.trial.test.test_reporter',
+ 'twisted.trial.test.test_runner',
+ 'twisted.trial.test.test_script',
+ 'twisted.trial.test.test_suppression',
+ 'twisted.trial.test.test_testcase',
+ 'twisted.trial.test.test_tests',
+ 'twisted.trial.test.test_util',
+ 'twisted.trial.test.test_warning',
+ 'twisted.trial.test.weird',
+ 'twisted.trial.util',
+ 'twisted.web._auth.basic',
+ 'twisted.web._auth.wrapper',
+ 'twisted.web._http2',
+ 'twisted.web._newclient',
+ 'twisted.web._template_util',
+ 'twisted.web.client',
+ 'twisted.web.distrib',
+ 'twisted.web.domhelpers',
+ 'twisted.web.error',
+ 'twisted.web.http',
+ 'twisted.web.http_headers',
+ 'twisted.web.microdom',
+ 'twisted.web.proxy',
+ 'twisted.web.resource',
+ 'twisted.web.server',
+ 'twisted.web.soap',
+ 'twisted.web.static',
+ 'twisted.web.sux',
+ 'twisted.web.tap',
+ 'twisted.web.test.injectionhelpers',
+ 'twisted.web.test.requesthelper',
+ 'twisted.web.test.test_agent',
+ 'twisted.web.test.test_cgi',
+ 'twisted.web.test.test_distrib',
+ 'twisted.web.test.test_domhelpers',
+ 'twisted.web.test.test_http',
+ 'twisted.web.test.test_http2',
+ 'twisted.web.test.test_httpauth',
+ 'twisted.web.test.test_newclient',
+ 'twisted.web.test.test_pages',
+ 'twisted.web.test.test_proxy',
+ 'twisted.web.test.test_resource',
+ 'twisted.web.test.test_soap',
+ 'twisted.web.test.test_static',
+ 'twisted.web.test.test_tap',
+ 'twisted.web.test.test_util',
+ 'twisted.web.test.test_vhost',
+ 'twisted.web.test.test_web',
+ 'twisted.web.test.test_webclient',
+ 'twisted.web.test.test_wsgi',
+ 'twisted.web.test.test_xml',
+ 'twisted.web.test.test_xmlrpc',
+ 'twisted.web.twcgi',
+ 'twisted.web.wsgi',
+ 'twisted.web.xmlrpc',
+ 'twisted.words.im.basesupport',
+ 'twisted.words.im.ircsupport',
+ 'twisted.words.im.pbsupport',
+ 'twisted.words.protocols.irc',
+ 'twisted.words.protocols.jabber.client',
+ 'twisted.words.protocols.jabber.component',
+ 'twisted.words.protocols.jabber.error',
+ 'twisted.words.protocols.jabber.jstrports',
+ 'twisted.words.protocols.jabber.sasl',
+ 'twisted.words.protocols.jabber.xmlstream',
+ 'twisted.words.service',
+ 'twisted.words.test.test_basesupport',
+ 'twisted.words.test.test_domish',
+ 'twisted.words.test.test_irc',
+ 'twisted.words.test.test_irc_service',
+ 'twisted.words.test.test_jabberclient',
+ 'twisted.words.test.test_jabbercomponent',
+ 'twisted.words.test.test_jabberjstrports',
+ 'twisted.words.test.test_jabbersasl',
+ 'twisted.words.test.test_jabberxmlstream',
+ 'twisted.words.test.test_service',
+ 'twisted.words.test.test_xishutil',
+ 'twisted.words.test.test_xmlstream',
+ 'twisted.words.xish.domish',
+ 'twisted.words.xish.utility',
+ 'twisted.words.xish.xmlstream',
+ 'twisted.words.xish.xpath',
]
[[tool.mypy.overrides]]
allow_untyped_defs = true
module = [
+ 'twisted._threads._convenience',
+ 'twisted._threads._ithreads',
+ 'twisted._threads._memory',
+ 'twisted._threads._threadworker',
+ 'twisted._threads.test.test_convenience',
+ 'twisted._threads.test.test_memory',
+ 'twisted.conch.avatar',
+ 'twisted.conch.checkers',
+ 'twisted.conch.client.connect',
+ 'twisted.conch.client.knownhosts',
+ 'twisted.conch.client.options',
+ 'twisted.conch.error',
+ 'twisted.conch.insults.text',
+ 'twisted.conch.interfaces',
+ 'twisted.conch.manhole_ssh',
+ 'twisted.conch.openssh_compat.factory',
+ 'twisted.conch.ssh._kex',
+ 'twisted.conch.ssh.address',
+ 'twisted.conch.ssh.common',
+ 'twisted.conch.test.test_address',
+ 'twisted.conch.test.test_manhole_tap',
+ 'twisted.conch.test.test_openssh_compat',
+ 'twisted.conch.test.test_scripts',
+ 'twisted.conch.test.test_tap',
+ 'twisted.conch.test.test_text',
+ 'twisted.conch.test.test_unix',
+ 'twisted.conch.ui.ansi',
+ 'twisted.cred._digest',
+ 'twisted.cred.credentials',
+ 'twisted.cred.test.test_cramauth',
+ 'twisted.cred.test.test_simpleauth',
'twisted.internet._pollingfile',
'twisted.internet._posixserialport',
'twisted.internet._posixstdio',
@@ -537,7 +795,6 @@ module = [
'twisted.internet.epollreactor',
'twisted.internet.gireactor',
'twisted.internet.glib2reactor',
- 'twisted.internet.gtk3reactor',
'twisted.internet.iocpreactor.interfaces',
'twisted.internet.main',
'twisted.internet.pollreactor',
@@ -557,8 +814,38 @@ module = [
'twisted.internet.test.test_sigchld',
'twisted.internet.test.test_testing',
'twisted.internet.test.test_win32serialport',
- 'twisted.protocols.dict',
- 'twisted.python._pydoctor',
+ 'twisted.mail._except',
+ 'twisted.mail.bounce',
+ 'twisted.mail.interfaces',
+ 'twisted.mail.test.test_bounce',
+ 'twisted.mail.test.test_scripts',
+ 'twisted.names._rfc1982',
+ 'twisted.names.error',
+ 'twisted.names.resolve',
+ 'twisted.names.test.test_resolve',
+ 'twisted.names.test.test_rfc1982',
+ 'twisted.names.test.test_util',
+ 'twisted.pair.ethernet',
+ 'twisted.pair.ip',
+ 'twisted.pair.raw',
+ 'twisted.pair.rawudp',
+ 'twisted.pair.test.test_ethernet',
+ 'twisted.pair.test.test_ip',
+ 'twisted.pair.test.test_rawudp',
+ 'twisted.persisted._token',
+ 'twisted.persisted.crefutil',
+ 'twisted.persisted.dirdbm',
+ 'twisted.persisted.test.test_styles',
+ 'twisted.plugins.cred_anonymous',
+ 'twisted.plugins.cred_file',
+ 'twisted.plugins.cred_memory',
+ 'twisted.plugins.cred_sshkeys',
+ 'twisted.plugins.twisted_trial',
+ 'twisted.plugins.twisted_words',
+ 'twisted.positioning.base',
+ 'twisted.positioning.ipositioning',
+ 'twisted.positioning.test.receiver',
+ 'twisted.positioning.test.test_base',
'twisted.python._release',
'twisted.python._shellcomp',
'twisted.python._textattributes',
@@ -576,10 +863,14 @@ module = [
'twisted.python.roots',
'twisted.python.shortcut',
'twisted.python.syslog',
- 'twisted.python.test.test_pydoctor',
- 'twisted.python.test.test_systemd',
'twisted.runner.inetdconf',
'twisted.runner.inetdtap',
+ 'twisted.scripts._twistw',
+ 'twisted.scripts.htmlizer',
+ 'twisted.scripts.twistd',
+ 'twisted.spread.interfaces',
+ 'twisted.tap.portforward',
+ 'twisted.tap.socks',
'twisted.test.crash_test_dummy',
'twisted.test.mock_win32process',
'twisted.test.myrebuilder1',
@@ -589,7 +880,6 @@ module = [
'twisted.test.plugin_extra2',
'twisted.test.process_tester',
'twisted.test.ssl_helpers',
- 'twisted.test.test_dict',
'twisted.test.test_finger',
'twisted.test.test_formmethod',
'twisted.test.test_htb',
@@ -599,7 +889,60 @@ module = [
'twisted.test.test_rebuild',
'twisted.test.test_roots',
'twisted.test.test_shortcut',
- 'twisted.test.test_text'
+ 'twisted.test.test_text',
+ 'twisted.trial._asyncrunner',
+ 'twisted.trial._dist.distreporter',
+ 'twisted.trial._dist.disttrial',
+ 'twisted.trial._dist.functional',
+ 'twisted.trial._dist.options',
+ 'twisted.trial._dist.test.test_options',
+ 'twisted.trial._dist.worker',
+ 'twisted.trial._dist.workertrial',
+ 'twisted.trial.itrial',
+ 'twisted.trial.test',
+ 'twisted.trial.test.mockdoctest',
+ 'twisted.trial.test.moduleself',
+ 'twisted.trial.test.ordertests',
+ 'twisted.trial.test.packages',
+ 'twisted.trial.test.pyunitcases',
+ 'twisted.trial.test.sample',
+ 'twisted.trial.test.test_doctest',
+ 'twisted.trial.test.test_matchers',
+ 'twisted.trial.test.test_output',
+ 'twisted.trial.test.test_skip',
+ 'twisted.web._auth.digest',
+ 'twisted.web.demo',
+ 'twisted.web.html',
+ 'twisted.web.iweb',
+ 'twisted.web.rewrite',
+ 'twisted.web.script',
+ 'twisted.web.test._util',
+ 'twisted.web.test.test_client',
+ 'twisted.web.test.test_error',
+ 'twisted.web.test.test_html',
+ 'twisted.web.test.test_http_headers',
+ 'twisted.web.test.test_script',
+ 'twisted.web.test.test_web__responses',
+ 'twisted.web.vhost',
+ 'twisted.words.im.baseaccount',
+ 'twisted.words.im.basechat',
+ 'twisted.words.im.interfaces',
+ 'twisted.words.iwords',
+ 'twisted.words.protocols.jabber.ijabber',
+ 'twisted.words.protocols.jabber.jid',
+ 'twisted.words.protocols.jabber.sasl_mechanisms',
+ 'twisted.words.protocols.jabber.xmpp_stringprep',
+ 'twisted.words.tap',
+ 'twisted.words.test.test_basechat',
+ 'twisted.words.test.test_ircsupport',
+ 'twisted.words.test.test_jabbererror',
+ 'twisted.words.test.test_jabberjid',
+ 'twisted.words.test.test_jabbersaslmechanisms',
+ 'twisted.words.test.test_jabberxmppstringprep',
+ 'twisted.words.test.test_tap',
+ 'twisted.words.test.test_xmpproutertap',
+ 'twisted.words.test.test_xpath',
+ 'twisted.words.xmpproutertap',
]
[[tool.mypy.overrides]]
diff --git a/src/twisted/newsfragments/11957.misc b/src/twisted/newsfragments/11957.misc
new file mode 100644
index 00000000000..6c77563a40b
--- /dev/null
+++ b/src/twisted/newsfragments/11957.misc
@@ -0,0 +1 @@
+expand mypy .* module overrides
diff --git a/src/twisted/words/protocols/jabber/jid.py b/src/twisted/words/protocols/jabber/jid.py
index c263b36e47a..52e154fee4f 100644
--- a/src/twisted/words/protocols/jabber/jid.py
+++ b/src/twisted/words/protocols/jabber/jid.py
@@ -146,7 +146,7 @@ class JID:
def __init__(
self,
str: Union[str, None] = None,
- tuple: Union[Tuple[str, str, str], None] = None,
+ tuple: Union[Tuple[Union[str, None], str, Union[str, None]], None] = None,
):
if str:
user, host, res = parse(str)
diff --git a/src/twisted/words/test/test_jabberjid.py b/src/twisted/words/test/test_jabberjid.py
index 18c5cd4d708..c24f14f7192 100644
--- a/src/twisted/words/test/test_jabberjid.py
+++ b/src/twisted/words/test/test_jabberjid.py
@@ -128,21 +128,21 @@ def test_userhostJIDNoResource(self):
j = jid.JID("user@host")
self.assertIdentical(j, j.userhostJID())
- def test_fullHost(self):
+ def test_fullHost(self) -> None:
"""
Test giving a string representation of the JID with only a host part.
"""
j = jid.JID(tuple=(None, "host", None))
self.assertEqual("host", j.full())
- def test_fullHostResource(self):
+ def test_fullHostResource(self) -> None:
"""
Test giving a string representation of the JID with host, resource.
"""
j = jid.JID(tuple=(None, "host", "resource"))
self.assertEqual("host/resource", j.full())
- def test_fullUserHost(self):
+ def test_fullUserHost(self) -> None:
"""
Test giving a string representation of the JID with user, host.
"""
|
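The diff above replaces each `pkg.*` override with an explicit module list so entries can be deleted one at a time as modules become fully annotated. A rough sketch of how such an expansion could be generated (hypothetical helper; `fnmatch` only approximates mypy's own pattern semantics):

```python
from fnmatch import fnmatch

def expand_overrides(patterns, all_modules):
    """Expand wildcard override patterns like 'pkg.*' into the concrete
    module names they match, leaving exact names untouched."""
    expanded = []
    for pat in patterns:
        if "*" in pat:
            expanded.extend(m for m in all_modules if fnmatch(m, pat))
        else:
            expanded.append(pat)
    return sorted(set(expanded))

modules = [
    "twisted.cred.checkers",
    "twisted.cred.strcred",
    "twisted.enterprise.adbapi",
    "twisted.internet.tcp",
]
print(expand_overrides(["twisted.cred.*", "twisted.enterprise.adbapi"], modules))
# -> ['twisted.cred.checkers', 'twisted.cred.strcred', 'twisted.enterprise.adbapi']
```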
beetbox__beets-2240 | Slowdown with beet web - regression
There is a massive slowdown in queries with Python 2 and Python 3 when using the beet web interface.
When large queries are run, e.g. 'format:flac' on a flac-only library, the web interface answers queries 10x slower than before the regression. To clarify, the slowdown appears to be relative, proportional to the time the query originally took.
This is caused by a regression introduced in the following commit:
https://github.com/beetbox/beets/commit/5e8ac9e4a5d06de791fe051a419ba070bbdd5bec
beet stats
Tracks: 43913
Total time: 51.4 weeks
Approximate total size: 976.18 GiB
Artists: 7345
Albums: 12004
Album artists: 1800
| [
{
"content": "# -*- coding: utf-8 -*-\n# This file is part of beets.\n# Copyright 2016, Adrian Sampson.\n#\n# Permission is hereby granted, free of charge, to any person obtaining\n# a copy of this software and associated documentation files (the\n# \"Software\"), to deal in the Software without restriction, in... | [
{
"content": "# -*- coding: utf-8 -*-\n# This file is part of beets.\n# Copyright 2016, Adrian Sampson.\n#\n# Permission is hereby granted, free of charge, to any person obtaining\n# a copy of this software and associated documentation files (the\n# \"Software\"), to deal in the Software without restriction, in... | diff --git a/beetsplug/web/__init__.py b/beetsplug/web/__init__.py
index 07e68638b0..810de87183 100644
--- a/beetsplug/web/__init__.py
+++ b/beetsplug/web/__init__.py
@@ -37,7 +37,7 @@ def _rep(obj, expand=False):
out = dict(obj)
if isinstance(obj, beets.library.Item):
- out['path'] = obj.destination(fragment=True)
+ del out['path']
# Get the size (in bytes) of the backing file. This is useful
# for the Tomahawk resolver API.
|
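The one-line fix above avoids calling `obj.destination(...)` for every item in a large result set by dropping the `path` field from the serialized output. The idea, sketched with a plain dict and a hypothetical helper name (not beets' actual code):

```python
def rep(item, expensive_fields=("path",)):
    """Serialize a mapping-like object for a web response, dropping fields
    that are expensive to recompute for every row of a large query."""
    out = dict(item)
    for field in expensive_fields:
        out.pop(field, None)  # tolerate items that lack the field
    return out

item = {"id": 1, "title": "Song", "path": "/music/a.flac"}
print(rep(item))  # -> {'id': 1, 'title': 'Song'}
```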
piskvorky__gensim-2738 | keywords.py gives `IndexError: list index out of range` when `words` parameter is provided.
Really confused why I'm getting this error. Perhaps I'm making a silly mistake; I'm not familiar with gensim and NLP in general.
I'm running on Windows 10 Home 64-bit, conda version: 4.7.11, conda-build version: 2.18.8, Python version: 3.7.3.final.0.
My code is attempting to get keywords per sentence in a loop. To simplify matters I've isolated the following code that causes this, trying to get keywords from gensim's `keywords.py`.
```python
s = "Don’t dive right into solving without a plan (and somehow hope you can muddle your way through)."
keywords(s, words=4, scores=False, split=True, lemmatize=True)
```

The traceback:

```
File "C:\Users\username\Anaconda3\envs\gensim\lib\site-packages\gensim\summarization\keywords.py", line 521, in keywords
extracted_lemmas = _extract_tokens(graph.nodes(), pagerank_scores, ratio, words)
File "C:\Users\username\Anaconda3\envs\gensim\lib\site-packages\gensim\summarization\keywords.py", line 304, in _extract_tokens
return [(scores[lemmas[i]], lemmas[i],) for i in range(int(length))]
File "C:\Users\username\Anaconda3\envs\gensim\lib\site-packages\gensim\summarization\keywords.py", line 304, in <listcomp>
return [(scores[lemmas[i]], lemmas[i],) for i in range(int(length))]
IndexError: list index out of range
```
I've tried setting `scores=True`, `lemmatize=False`, and `split=False`, but the same error persists. I've also tried removing the parentheses and removing the apostrophe; the error persisted. What did work was removing the `words` parameter altogether, but it still shouldn't raise an error when the parameter is provided. Thanks for the help in advance!
| [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Licensed under the GNU LGPL v2.1 - http://www.gnu.org/licenses/lgpl.html\n\n\"\"\"This module contains functions to find keywords of the text and building graph on tokens from text.\n\nExamples\n--------\nExtract keywords from text\n\n.. sourcec... | [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Licensed under the GNU LGPL v2.1 - http://www.gnu.org/licenses/lgpl.html\n\n\"\"\"This module contains functions to find keywords of the text and building graph on tokens from text.\n\nExamples\n--------\nExtract keywords from text\n\n.. sourcec... | diff --git a/gensim/summarization/keywords.py b/gensim/summarization/keywords.py
index db7c8a0dc7..2c85cf0bfe 100644
--- a/gensim/summarization/keywords.py
+++ b/gensim/summarization/keywords.py
@@ -302,7 +302,7 @@ def _extract_tokens(lemmas, scores, ratio, words):
"""
lemmas.sort(key=lambda s: scores[s], reverse=True)
- length = len(lemmas) * ratio if words is None else words
+ length = len(lemmas) * ratio if words is None else min(words, len(lemmas))
return [(scores[lemmas[i]], lemmas[i],) for i in range(int(length))]
diff --git a/gensim/test/test_keywords.py b/gensim/test/test_keywords.py
index 6011c83df4..ffe2f32a8f 100644
--- a/gensim/test/test_keywords.py
+++ b/gensim/test/test_keywords.py
@@ -101,6 +101,12 @@ def test_text_keywords_without_graph_edges(self):
kwds = keywords(text, deacc=False, scores=True)
self.assertFalse(len(kwds))
+ def test_keywords_with_words_greater_than_lemmas(self):
+ # words parameter is greater than number of words in text variable
+ text = 'Test string small length'
+ kwds = keywords(text, words=5, split=True)
+ self.assertIsNotNone(kwds)
+
if __name__ == '__main__':
logging.basicConfig(format='%(asctime)s : %(levelname)s : %(message)s', level=logging.DEBUG)
|
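The fix above clamps the requested keyword count to the number of candidate lemmas, so asking for more keywords than a short text can supply no longer walks off the end of the list. A self-contained sketch of that clamping logic (simplified; not gensim's actual scoring):

```python
def extract_top_tokens(lemmas, scores, words):
    """Return up to `words` top-scoring lemmas, clamping the requested
    count to the number of lemmas actually available."""
    ordered = sorted(lemmas, key=lambda s: scores[s], reverse=True)
    length = min(words, len(ordered))  # the min() is the essence of the fix
    return [(scores[ordered[i]], ordered[i]) for i in range(length)]

scores = {"plan": 0.9, "dive": 0.7, "muddle": 0.4}
# Asking for more keywords than there are candidate lemmas no longer fails:
print(extract_top_tokens(list(scores), scores, words=5))
# -> [(0.9, 'plan'), (0.7, 'dive'), (0.4, 'muddle')]
```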
hydroshare__hydroshare-2769 | Add back Active, Date joined, and last login in mezzanine listing of users
In the 3/19/18 version of HydroShare, when an admin listed users, the fields listed were:
[screenshot: admin user listing in the 3/19/18 version]
At present when an admin lists users the fields are:
[screenshot: current admin user listing]
The fields Active, Date joined and last login are needed so that when there are problems with users creating and activating accounts (as occurred this week) an admin can list recent account creations and account creation attempts to assess the extent of the problem, and contact users that may have been impacted.
This regression was noted in https://github.com/hydroshare/hydroshare/pull/2677#issuecomment-374183106
| [
{
"content": "from django import forms\nfrom django.contrib.auth.admin import UserAdmin\nfrom django.contrib.auth.forms import UserCreationForm\nfrom django.contrib.auth.models import User\nfrom django.contrib.gis import admin\nfrom django.contrib.contenttypes.admin import GenericTabularInline\nfrom django.util... | [
{
"content": "from django import forms\nfrom django.contrib.auth.admin import UserAdmin\nfrom django.contrib.auth.forms import UserCreationForm\nfrom django.contrib.auth.models import User\nfrom django.contrib.gis import admin\nfrom django.contrib.contenttypes.admin import GenericTabularInline\nfrom django.util... | diff --git a/hs_core/admin.py b/hs_core/admin.py
index a0a7b4e7f1..797c926320 100755
--- a/hs_core/admin.py
+++ b/hs_core/admin.py
@@ -23,6 +23,10 @@ def __init__(self, *args, **kwargs):
'fields': ('email', 'username', 'password1', 'password2',)
}),
)
+UserAdmin.list_display = [
+ 'username', 'email', 'first_name', 'last_name', 'is_staff',
+ 'is_active', 'date_joined', 'last_login'
+]
class InlineResourceFiles(GenericTabularInline):
model = ResourceFile
diff --git a/theme/templates/resource-landing-page/title-section.html b/theme/templates/resource-landing-page/title-section.html
index 09f1ccdcb1..cd9b2f25c9 100644
--- a/theme/templates/resource-landing-page/title-section.html
+++ b/theme/templates/resource-landing-page/title-section.html
@@ -229,7 +229,7 @@ <h2 id="resource-title">{{ title }}</h2>
"@type": "Dataset",
"additionalType": ["http://schema.geolink.org/1.0/base/main#Dataset", "http://vivoweb.org/ontology/core#Dataset"],
"name": "{{ title }}",
- "description": "{{ cm.metadata.description }}",
+ "description": "{{ cm.metadata.description | escapejs }}",
"url": "https://www.hydroshare.org/resource/{{ cm.short_id }}/",
"version": "2017-06-04",
{% if cm.raccess.public %} "isAccessibleForFree": true, {% endif %}
@@ -294,7 +294,7 @@ <h2 id="resource-title">{{ title }}</h2>
"creator": {
"@id": "{{ cr.description }}",
"@type": "Person",
- "additionalType": "http://schema.geolink.org/1.0/base/main#Person", // Is this necessary?
+ "additionalType": "http://schema.geolink.org/1.0/base/main#Person",
"name": "{{ cr.name }}",
"url": "{{ cr.description }}/"
}
|
bokeh__bokeh-10311 | [BUG] Link in docs is not working for fill color property
https://docs.bokeh.org/en/latest/_modules/bokeh/core/property_mixins.html#FillProps
| [
{
"content": "#-----------------------------------------------------------------------------\n# Copyright (c) 2012 - 2020, Anaconda, Inc., and Bokeh Contributors.\n# All rights reserved.\n#\n# The full license is in the file LICENSE.txt, distributed with this software.\n#----------------------------------------... | [
{
"content": "#-----------------------------------------------------------------------------\n# Copyright (c) 2012 - 2020, Anaconda, Inc., and Bokeh Contributors.\n# All rights reserved.\n#\n# The full license is in the file LICENSE.txt, distributed with this software.\n#----------------------------------------... | diff --git a/bokeh/core/property_mixins.py b/bokeh/core/property_mixins.py
index 79b0b4577eb..85369400031 100644
--- a/bokeh/core/property_mixins.py
+++ b/bokeh/core/property_mixins.py
@@ -118,7 +118,7 @@ class SomeGlyph(Glyph):
- a 3-tuple of integers (r,g,b) between 0 and 255
- a 4-tuple of (r,g,b,a) where r,g,b are integers between 0..255 and a is between 0..1
-.. _CSS colors: http://www.w3schools.com/cssref/css_colornames.asp
+.. _CSS colors: https://www.w3schools.com/colors/colors_names.asp
"""
|
ray-project__ray-7665 | [Python] jsonschema included twice in setup.py requires list.
<!--Please include [tune], [rllib], [autoscaler] etc. in the issue title if relevant-->
### What is the problem?
`jsonschema` is included twice in the Python package [setup.py `requires` list](https://github.com/ray-project/ray/blob/master/python/setup.py#L176-L183). This is causing the usage of the Ray Python library within Bazel to fail during the analysis phase due to label duplication in the generated `py_library` target's `'deps'`:
```
ERROR: .../external/requirements_py3_pypi__ray_0_9_0_dev0/BUILD:6:1: Label '@requirements_py3_pypi__jsonschema_3_2_0//:pkg' is duplicated in the 'deps' attribute of rule 'pkg'
```
This bug was introduced in the [cluster json schema validator PR](https://github.com/ray-project/ray/pull/7261/files#diff-8cf6167d58ce775a08acafcfe6f40966).
*Ray version and other system information (Python version, TensorFlow version, OS):*
Ray master commit 90b553ed058a546e036374cd0919e00604892514 (most recent commit as of this issue filing)
### Reproduction (REQUIRED)
- [x] I have verified my script runs in a clean environment and reproduces the issue.
- [x] I have verified the issue also occurs with the [latest wheels](https://ray.readthedocs.io/en/latest/installation.html).
| [
{
"content": "from itertools import chain\nimport os\nimport re\nimport shutil\nimport subprocess\nimport sys\n\nfrom setuptools import setup, find_packages, Distribution\nimport setuptools.command.build_ext as _build_ext\n\n# Ideally, we could include these files by putting them in a\n# MANIFEST.in or using th... | [
{
"content": "from itertools import chain\nimport os\nimport re\nimport shutil\nimport subprocess\nimport sys\n\nfrom setuptools import setup, find_packages, Distribution\nimport setuptools.command.build_ext as _build_ext\n\n# Ideally, we could include these files by putting them in a\n# MANIFEST.in or using th... | diff --git a/python/setup.py b/python/setup.py
index 36af00e764bfb..a4edb34606ca9 100644
--- a/python/setup.py
+++ b/python/setup.py
@@ -180,7 +180,6 @@ def find_version(*filepath):
"packaging",
"pytest",
"pyyaml",
- "jsonschema",
"redis>=3.3.2",
# NOTE: Don't upgrade the version of six! Doing so causes installation
# problems. See https://github.com/ray-project/ray/issues/4169.
|
googleapis__google-cloud-python-1347 | Should we compare Entity._meanings in __eq__
/cc @tseaver @pcostell
| [
{
"content": "# Copyright 2014 Google Inc. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless... | [
{
"content": "# Copyright 2014 Google Inc. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless... | diff --git a/gcloud/datastore/entity.py b/gcloud/datastore/entity.py
index 7a25e648391a..c27db1fc76e8 100644
--- a/gcloud/datastore/entity.py
+++ b/gcloud/datastore/entity.py
@@ -98,6 +98,8 @@ def __eq__(self, other):
return False
return (self.key == other.key and
+ self._exclude_from_indexes == other._exclude_from_indexes and
+ self._meanings == other._meanings and
super(Entity, self).__eq__(other))
def __ne__(self, other):
diff --git a/gcloud/datastore/test_entity.py b/gcloud/datastore/test_entity.py
index 122b916ea58b..8cf7ee43856d 100644
--- a/gcloud/datastore/test_entity.py
+++ b/gcloud/datastore/test_entity.py
@@ -70,10 +70,21 @@ def test___eq_____ne___w_different_keys(self):
def test___eq_____ne___w_same_keys(self):
from gcloud.datastore.key import Key
+
+ name = 'foo'
+ value = 42
+ meaning = 9
+
key1 = Key(_KIND, _ID, dataset_id=_DATASET_ID)
- entity1 = self._makeOne(key=key1)
+ entity1 = self._makeOne(key=key1, exclude_from_indexes=(name,))
+ entity1[name] = value
+ entity1._meanings[name] = (meaning, value)
+
key2 = Key(_KIND, _ID, dataset_id=_DATASET_ID)
- entity2 = self._makeOne(key=key2)
+ entity2 = self._makeOne(key=key2, exclude_from_indexes=(name,))
+ entity2[name] = value
+ entity2._meanings[name] = (meaning, value)
+
self.assertTrue(entity1 == entity2)
self.assertFalse(entity1 != entity2)
@@ -140,6 +151,38 @@ def test___eq_____ne___w_same_keys_props_w_diff_entities_as_value(self):
self.assertFalse(entity1 == entity2)
self.assertTrue(entity1 != entity2)
+ def test__eq__same_value_different_exclude(self):
+ from gcloud.datastore.key import Key
+
+ name = 'foo'
+ value = 42
+ key = Key(_KIND, _ID, dataset_id=_DATASET_ID)
+
+ entity1 = self._makeOne(key=key, exclude_from_indexes=(name,))
+ entity1[name] = value
+
+ entity2 = self._makeOne(key=key, exclude_from_indexes=())
+ entity2[name] = value
+
+ self.assertFalse(entity1 == entity2)
+
+ def test__eq__same_value_different_meanings(self):
+ from gcloud.datastore.key import Key
+
+ name = 'foo'
+ value = 42
+ meaning = 9
+ key = Key(_KIND, _ID, dataset_id=_DATASET_ID)
+
+ entity1 = self._makeOne(key=key, exclude_from_indexes=(name,))
+ entity1[name] = value
+
+ entity2 = self._makeOne(key=key, exclude_from_indexes=(name,))
+ entity2[name] = value
+ entity2._meanings[name] = (meaning, value)
+
+ self.assertFalse(entity1 == entity2)
+
def test___repr___no_key_empty(self):
entity = self._makeOne()
self.assertEqual(repr(entity), '<Entity {}>')
|
espnet__espnet-2227 | espnet2 inference error without language model
If not using a language model, espnet2's `asr_inference.py` raises the following error.
```
  File "espnet2/espnet2/bin/asr_inference.py", line 152, in __init__
    self.lm_train_args = lm_train_args
UnboundLocalError: local variable 'lm_train_args' referenced before assignment
```
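This is the classic conditional-binding pitfall: the variable is only bound inside the language-model branch but referenced unconditionally afterwards. Below is a minimal reproduction plus the generic remedy of binding a default first (the actual PR instead simply deletes the unconditional `self.lm_train_args = lm_train_args` line, as the diff shows):

```python
def broken(use_lm=False):
    if use_lm:
        lm_train_args = {"lm": "loaded"}
    # Raises UnboundLocalError when use_lm is False: the name is compiled
    # as a local because of the assignment above, but this code path
    # never binds it.
    return lm_train_args

def fixed(use_lm=False):
    lm_train_args = None  # bind a default before the conditional
    if use_lm:
        lm_train_args = {"lm": "loaded"}
    return lm_train_args
```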
| [
{
"content": "#!/usr/bin/env python3\nimport argparse\nimport logging\nfrom pathlib import Path\nimport sys\nfrom typing import Optional\nfrom typing import Sequence\nfrom typing import Tuple\nfrom typing import Union\n\nimport numpy as np\nimport torch\nfrom typeguard import check_argument_types\nfrom typeguar... | [
{
"content": "#!/usr/bin/env python3\nimport argparse\nimport logging\nfrom pathlib import Path\nimport sys\nfrom typing import Optional\nfrom typing import Sequence\nfrom typing import Tuple\nfrom typing import Union\n\nimport numpy as np\nimport torch\nfrom typeguard import check_argument_types\nfrom typeguar... | diff --git a/espnet2/bin/asr_inference.py b/espnet2/bin/asr_inference.py
index 61e657b6eb2..6c99fe0f265 100755
--- a/espnet2/bin/asr_inference.py
+++ b/espnet2/bin/asr_inference.py
@@ -147,7 +147,6 @@ def __init__(
self.asr_model = asr_model
self.asr_train_args = asr_train_args
- self.lm_train_args = lm_train_args
self.converter = converter
self.tokenizer = tokenizer
self.beam_search = beam_search
|
dbt-labs__dbt-core-7080 | [CT-2225] [Bug] Suddenly getting ModuleNotFoundError: No module named 'pytz'
### Is this a new bug in dbt-core?
- [X] I believe this is a new bug in dbt-core
- [X] I have searched the existing issues, and I could not find an existing issue for this bug
### Current Behavior
I am installing dbt-bigquery with meltano (which installs it in an isolated *venv*).
Today when invoking `dbt deps` using `meltano invoke dbt-bigquery:deps` I am getting a stacktrace with
ModuleNotFoundError: No module named 'pytz'
### Expected Behavior
`pytz` should be found. I have noted that it is not included in the requirements. So while it's strange that it suddenly started failing, maybe it was more of an accident that it ever worked in the first place?
### Steps To Reproduce
With versions specified as
dbt-core~=1.3.0
dbt-bigquery~=1.3.0
invoking `dbt deps` should not throw a ModuleNotFoundError
### Relevant log output
```shell
Traceback (most recent call last):
File "/workspaces/elt/.meltano/transformers/dbt-bigquery/venv/bin/dbt", line 5, in <module>
from dbt.main import main
File "/workspaces/elt/.meltano/transformers/dbt-bigquery/venv/lib/python3.9/site-packages/dbt/main.py", line 24, in <module>
import dbt.task.build as build_task
File "/workspaces/elt/.meltano/transformers/dbt-bigquery/venv/lib/python3.9/site-packages/dbt/task/build.py", line 1, in <module>
from .run import RunTask, ModelRunner as run_model_runner
File "/workspaces/elt/.meltano/transformers/dbt-bigquery/venv/lib/python3.9/site-packages/dbt/task/run.py", line 8, in <module>
from .compile import CompileRunner, CompileTask
File "/workspaces/elt/.meltano/transformers/dbt-bigquery/venv/lib/python3.9/site-packages/dbt/task/compile.py", line 4, in <module>
from .runnable import GraphRunnableTask
File "/workspaces/elt/.meltano/transformers/dbt-bigquery/venv/lib/python3.9/site-packages/dbt/task/runnable.py", line 11, in <module>
from .printer import (
File "/workspaces/elt/.meltano/transformers/dbt-bigquery/venv/lib/python3.9/site-packages/dbt/task/printer.py", line 22, in <module>
from dbt.tracking import InvocationProcessor
File "/workspaces/elt/.meltano/transformers/dbt-bigquery/venv/lib/python3.9/site-packages/dbt/tracking.py", line 25, in <module>
import pytz
ModuleNotFoundError: No module named 'pytz'
```
### Environment
```markdown
- OS: Linux (fresh docker container inside virtual environment)
- Python: 3.9
- dbt: 1.3.1 (~=1.3.0)
```
### Which database adapter are you using with dbt?
other (mention it in "Additional Context")
### Additional Context
_No response_
| [
{
"content": "#!/usr/bin/env python\nimport os\nimport sys\n\nif sys.version_info < (3, 7, 2):\n print(\"Error: dbt does not support this version of Python.\")\n print(\"Please upgrade to Python 3.7.2 or higher.\")\n sys.exit(1)\n\n\nfrom setuptools import setup\n\ntry:\n from setuptools import find... | [
{
"content": "#!/usr/bin/env python\nimport os\nimport sys\n\nif sys.version_info < (3, 7, 2):\n print(\"Error: dbt does not support this version of Python.\")\n print(\"Please upgrade to Python 3.7.2 or higher.\")\n sys.exit(1)\n\n\nfrom setuptools import setup\n\ntry:\n from setuptools import find... | diff --git a/.changes/unreleased/Fixes-20230228-130318.yaml b/.changes/unreleased/Fixes-20230228-130318.yaml
new file mode 100644
index 00000000000..abcbee150a2
--- /dev/null
+++ b/.changes/unreleased/Fixes-20230228-130318.yaml
@@ -0,0 +1,6 @@
+kind: Fixes
+body: add pytz dependency
+time: 2023-02-28T13:03:18.353468+01:00
+custom:
+ Author: sdebruyn
+ Issue: "7077"
diff --git a/core/setup.py b/core/setup.py
index b2f58533fba..56454e9049c 100644
--- a/core/setup.py
+++ b/core/setup.py
@@ -65,6 +65,7 @@
"dbt-extractor~=0.4.1",
"typing-extensions>=3.7.4",
"werkzeug>=1,<3",
+ "pytz>=2015.7",
# the following are all to match snowflake-connector-python
"requests<3.0.0",
"idna>=2.5,<4",
diff --git a/dev-requirements.txt b/dev-requirements.txt
index 2701e4cab77..e13aa4628ea 100644
--- a/dev-requirements.txt
+++ b/dev-requirements.txt
@@ -14,7 +14,6 @@ pytest-dotenv
pytest-logbook
pytest-mock
pytest-xdist
-pytz
tox>=3.13
twine
types-colorama
|
open-mmlab__mmdetection-3553 | VOCDataset object has no attribute dataset
Thanks for your error report and we appreciate it a lot.
**Checklist**
1. I have searched related issues but cannot get the expected help.
2. The bug has not been fixed in the latest version.
**Describe the bug**
I tried to train my model on the Pascal VOC 2012 dataset and set the config for data as follows:
```python3
batch_size = 8
data = dict(
samples_per_gpu=batch_size,
workers_per_gpu=4,
train=dict(
type=dataset_type,
ann_file=data_root + 'VOC2012/ImageSets/Main/train.txt',
img_prefix=data_root + 'VOC2012/',
pipeline=train_pipeline,),
val=dict(
type=dataset_type,
ann_file=data_root + 'VOC2012/ImageSets/Main/val.txt',
img_prefix=data_root + 'VOC2012/',
pipeline=test_pipeline,),
)
evaluation=dict(interval=1, metric='mAP')
```
But during evaluation, it raised following error:
```shell
File "train.py", line 166, in <module>
main()
File "train.py", line 162, in main
meta=meta)
File "/home/lfc199471/mmdetection/mmdet/apis/train.py", line 128, in train_detector
runner.run(data_loaders, cfg.workflow, cfg.total_epochs)
File "/home/lfc199471/anaconda3/lib/python3.7/site-packages/mmcv/runner/epoch_based_runner.py", line 122, in run
epoch_runner(data_loaders[i], **kwargs)
File "/home/lfc199471/anaconda3/lib/python3.7/site-packages/mmcv/runner/epoch_based_runner.py", line 46, in train
self.call_hook('after_train_epoch')
File "/home/lfc199471/anaconda3/lib/python3.7/site-packages/mmcv/runner/base_runner.py", line 282, in call_hook
getattr(hook, fn_name)(self)
File "/home/lfc199471/mmdetection/mmdet/core/evaluation/eval_hooks.py", line 28, in after_train_epoch
self.evaluate(runner, results)
File "/home/lfc199471/mmdetection/mmdet/core/evaluation/eval_hooks.py", line 32, in evaluate
results, logger=runner.logger, **self.eval_kwargs)
File "/home/lfc199471/mmdetection/mmdet/datasets/voc.py", line 43, in evaluate
ds_name = self.dataset.CLASSES
AttributeError: 'VOCDataset' object has no attribute 'dataset'
```
I checked `voc.py` in `mmdet` and found that line 43 was
```python3
ds_name = self.dataset.CLASSES
```
but `VOCDataset` and its superclasses `XMLDataset` and `CustomDataset` don't have this attribute. Is this a bug, or did I make a mistake in the config?
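The underlying Python behaviour is easy to demonstrate with stand-in classes: `CLASSES` is a class attribute, so it is reachable directly as `self.CLASSES` through normal attribute lookup on the instance's class, whereas `self.dataset` is never set anywhere (the classes here are minimal stand-ins, not the real mmdetection code):

```python
class XMLDataset:
    CLASSES = ()

class VOCDataset(XMLDataset):
    CLASSES = ("aeroplane", "bicycle", "bird")

    def eval_dataset_name_broken(self):
        # Line 43 before the fix: there is no `dataset` attribute,
        # so this raises AttributeError at evaluation time.
        return self.dataset.CLASSES

    def eval_dataset_name_fixed(self):
        # Class attributes are resolved through the instance's class (MRO).
        return self.CLASSES
```

`VOCDataset().eval_dataset_name_fixed()` returns the class tuple, while the broken variant raises `AttributeError`, matching the traceback above.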
**Reproduction**
1. What command or script did you run?
```
python tools/train.py --gpus 1 configs/<my_config_file>
```
2. Did you make any modifications on the code or config? Did you understand what you have modified?
Yes, please see above.
3. What dataset did you use?
Pascal VOC 2012 detection
**Environment**
1. Please run `python mmdet/utils/collect_env.py` to collect necessary environment information and paste it here.
```shell
sys.platform: linux
Python: 3.7.6 (default, Jan 8 2020, 19:59:22) [GCC 7.3.0]
CUDA available: True
CUDA_HOME: /usr/local/cuda
NVCC: Cuda compilation tools, release 10.2, V10.2.89
GPU 0: Tesla P100-PCIE-16GB
GCC: gcc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0
PyTorch: 1.5.1
PyTorch compiling details: PyTorch built with:
- GCC 7.3
- C++ Version: 201402
- Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
- Intel(R) MKL-DNN v0.21.1 (Git Hash 7d2fd500bc78936d1d648ca713b901012f470dbc)
- OpenMP 201511 (a.k.a. OpenMP 4.5)
- NNPACK is enabled
- CPU capability usage: AVX2
- CUDA Runtime 10.2
- NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_37,code=compute_37
- CuDNN 7.6.5
- Magma 2.5.2
- Build settings: BLAS=MKL, BUILD_TYPE=Release, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -fopenmp -DNDEBUG -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DUSE_INTERNAL_THREADPOOL_IMPL -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-sign-compare -Wno-unused-parameter -Wno-unused-variable -Wno-unused-function -Wno-unused-result -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, USE_CUDA=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_STATIC_DISPATCH=OFF,
TorchVision: 0.6.0a0+35d732a
OpenCV: 4.2.0
MMCV: 0.6.1
MMDetection: 2.1.0+b44e78b
MMDetection Compiler: GCC 7.5
MMDetection CUDA Compiler: 10.2
```
2. You may add addition that may be helpful for locating the problem, such as
- How you installed PyTorch [e.g., pip, conda, source] : conda
If you need any log file or some source code from me, just let me know.
| [
{
"content": "from mmdet.core import eval_map, eval_recalls\nfrom .builder import DATASETS\nfrom .xml_style import XMLDataset\n\n\n@DATASETS.register_module()\nclass VOCDataset(XMLDataset):\n\n CLASSES = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car',\n 'cat', 'chair', 'cow', '... | [
{
"content": "from mmdet.core import eval_map, eval_recalls\nfrom .builder import DATASETS\nfrom .xml_style import XMLDataset\n\n\n@DATASETS.register_module()\nclass VOCDataset(XMLDataset):\n\n CLASSES = ('aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car',\n 'cat', 'chair', 'cow', '... | diff --git a/mmdet/datasets/voc.py b/mmdet/datasets/voc.py
index 9de96b1c774..87689b5e726 100644
--- a/mmdet/datasets/voc.py
+++ b/mmdet/datasets/voc.py
@@ -62,7 +62,7 @@ def evaluate(self,
if self.year == 2007:
ds_name = 'voc07'
else:
- ds_name = self.dataset.CLASSES
+ ds_name = self.CLASSES
mean_ap, _ = eval_map(
results,
annotations,
|
learningequality__kolibri-4343 | Enable ePUB plugin to run by default
### Observed behavior
The ePUB plugin is not enabled by default, which prevents importing & viewing ePUB files until the command `kolibri plugin kolibri.plugins.document_epub_render enable` is run.
### User-facing consequences
Inability to view and import ePUB files.
### Context
dev environment, tried on `develop` and `0.11.a7` branches
| [
{
"content": "\"\"\"\nKolibri configuration data\n==========================\n\n.. warning::\n Do not load any django.conf.settings stuff here. This configuration data\n precedes loading of settings, it is not part of the settings stack.\n\nTODO: We need to figure out our conf API. Do we store in ini/json... | [
{
"content": "\"\"\"\nKolibri configuration data\n==========================\n\n.. warning::\n Do not load any django.conf.settings stuff here. This configuration data\n precedes loading of settings, it is not part of the settings stack.\n\nTODO: We need to figure out our conf API. Do we store in ini/json... | diff --git a/kolibri/utils/conf.py b/kolibri/utils/conf.py
index dd308d001b1..0576a182b59 100644
--- a/kolibri/utils/conf.py
+++ b/kolibri/utils/conf.py
@@ -68,6 +68,7 @@
"kolibri.plugins.user",
"kolibri_exercise_perseus_plugin",
"kolibri.plugins.style_guide",
+ "kolibri.plugins.document_epub_render",
]
#: Everything in this list is added to django.conf.settings.INSTALLED_APPS
|
pwr-Solaar__Solaar-1003 | Please create an AppData file for Solaar
Please consider writing and installing an AppData file with the application description and some screenshots, else Solaar looks really bad in the GNOME and KDE Software Centers. We'd love to showcase more applications, but without the extra data file we can't. See http://people.freedesktop.org/~hughsient/appdata/ for details; thanks!
Richard
| [
{
"content": "#!/usr/bin/env python3\n\nfrom glob import glob as _glob\n\ntry:\n from setuptools import setup\nexcept ImportError:\n from distutils.core import setup\n\n# from solaar import NAME, __version__\n__version__ = '1.0.4'\nNAME = 'Solaar'\n\n\ndef _data_files():\n from os.path import dirname a... | [
{
"content": "#!/usr/bin/env python3\n\nfrom glob import glob as _glob\n\ntry:\n from setuptools import setup\nexcept ImportError:\n from distutils.core import setup\n\n# from solaar import NAME, __version__\n__version__ = '1.0.4'\nNAME = 'Solaar'\n\n\ndef _data_files():\n from os.path import dirname a... | diff --git a/setup.py b/setup.py
index 42e3cc7e3b..9fc93ae0bd 100755
--- a/setup.py
+++ b/setup.py
@@ -24,6 +24,7 @@ def _data_files():
yield 'share/applications', ['share/applications/solaar.desktop']
yield 'share/solaar/udev-rules.d', ['rules.d/42-logitech-unify-permissions.rules']
+ yield 'share/metainfo/io.github.pwr_solaar.solaar.metainfo.xml', ['share/solaar/metainfo.xml']
del _dirname
diff --git a/share/solaar/metainfo.xml b/share/solaar/metainfo.xml
new file mode 100644
index 0000000000..dfc65e87f3
--- /dev/null
+++ b/share/solaar/metainfo.xml
@@ -0,0 +1,43 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<component type="desktop-application">
+ <id>io.github.pwr_solaar.solaar</id>
+
+ <name>Solaar</name>
+ <summary>Solaar is a Linux manager for many Logitech keyboards, mice, and trackpads.</summary>
+
+ <metadata_license>CC-BY-4.0</metadata_license>
+ <project_license>GPL-2.0-only</project_license>
+
+ <recommends>
+ <control>pointing</control>
+ <control>keyboard</control>
+ <control>touch</control>
+ </recommends>
+
+ <description>
+ <p>
+ <em>
+ </em>Solaar<em>
+ </em> is a Linux manager for many Logitech keyboards, mice, and trackpads that connect wirelessly to a USB, Lightspeed, or Nano receiver, connect directly via a USB cable, or connect via Bluetooth. Solaar does not work with peripherals from other companies.
+ </p>
+ <p>
+ Solaar can be used as a GUI application or via its command-line interface. Both interfaces are able to list the connected devices and show information about each device, often including battery status. Solaar is able to pair and unpair devices with receivers as supported by the device and receiver. Solaar can also control some changeable features of devices, such as smooth scrolling or function key behavior.
+ </p>
+ <p>
+ Solaar's GUI normally uses an icon in the system tray and starts with its main window visible.
+ </p>
+</description>
+
+<launchable type="desktop-id">solaar.desktop</launchable>
+<screenshots>
+ <screenshot type="default">
+ <image>https://raw.githubusercontent.com/pwr-Solaar/Solaar/master/docs/Solaar-main-window-button-actions.png</image>
+ </screenshot>
+ <screenshot>
+ <image>https://raw.githubusercontent.com/pwr-Solaar/Solaar/master/docs/Solaar-main-window-receiver.png</image>
+ </screenshot>
+ <screenshot>
+ <image>https://raw.githubusercontent.com/pwr-Solaar/Solaar/master/docs/Solaar-menu.png</image>
+ </screenshot>
+</screenshots>
+</component>
|
fossasia__open-event-server-4135 | Unable to update the user info via patch request.
**I'm submitting a ...** (check one with "x")
- [x] bug report
- [ ] feature request
- [ ] support request => Please do not submit support requests here, instead ask your query in our Gitter channel at https://gitter.im/fossasia/open-event-orga-server
**Current behavior:**
I am trying to update the user's 'email' and 'phone' info by sending a PATCH request to 'https://open-event-api.herokuapp.com/users/<user_id>', but it's giving me 'Unknown error'. I also tried sending it via Postman with the same access token so that I can override the info, but I still get the same error. Following are the screenshots:



URL: https://open-event-api.herokuapp.com/v1/users/110
Request headers:
```
Content-Type: application/vnd.api+json
Authorization: JWT <Auth Key>
```
Response:
```
{
"errors": [
{
"detail": "Unknown error",
"source": {
"pointer": ""
},
"status": 500,
"title": "Unknown error"
}
],
"jsonapi": {
"version": "1.0"
}
}
```
Status code: 500
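The fix (diff below) replaces the write-once guard in the `email` setter with an update that also clears the verification flag. A standalone sketch of that property pattern, using a hypothetical minimal `User` rather than the real SQLAlchemy model:

```python
class User:
    """Hypothetical minimal model: editing the email forces re-verification."""

    def __init__(self, email=None):
        self._email = email
        self.is_verified = False

    @property
    def email(self):
        return self._email

    @email.setter
    def email(self, value):
        # Previously the setter raised AttributeError once _email was set,
        # which surfaced as the 500 above; now updates are allowed, but the
        # new address must be verified again.
        self._email = value
        self.is_verified = False
```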
| [
{
"content": "from datetime import datetime\nimport pytz\nimport random\nimport humanize\nfrom flask import url_for\nfrom sqlalchemy import event, desc\nfrom sqlalchemy.orm.exc import MultipleResultsFound, NoResultFound\nfrom flask.ext.scrypt import generate_password_hash, generate_random_salt\nfrom sqlalchemy.... | [
{
"content": "from datetime import datetime\nimport pytz\nimport random\nimport humanize\nfrom flask import url_for\nfrom sqlalchemy import event, desc\nfrom sqlalchemy.orm.exc import MultipleResultsFound, NoResultFound\nfrom flask.ext.scrypt import generate_password_hash, generate_random_salt\nfrom sqlalchemy.... | diff --git a/app/models/user.py b/app/models/user.py
index db06bc8f4b..a7debdb13b 100644
--- a/app/models/user.py
+++ b/app/models/user.py
@@ -106,10 +106,8 @@ def email(self, email):
:param email:
:return:
"""
- if self._email is None:
- self._email = email
- else:
- raise AttributeError("Email cannot be modified")
+ self._email = email
+ self.is_verified = False
# User Permissions
def can_publish_event(self):
|
sunpy__sunpy-3835 | Plot titles and x-labels overlapping in example
The plot titles and labels overlap in the 3rd image of https://docs.sunpy.org/en/latest/generated/gallery/acquiring_data/2011_06_07_sampledata_overview.html#sphx-glr-generated-gallery-acquiring-data-2011-06-07-sampledata-overview-py (see below). I'm guessing the tight-layout just needs tweaking.

| [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"\n========================\nSample data set overview\n========================\n\nAn overview of the coordinated sample data set.\n\"\"\"\nimport matplotlib.pyplot as plt\nimport astropy.units as u\n\nimport sunpy.map\nimport sunpy.timeseries\nimport sunpy.data.sampl... | [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"\n========================\nSample data set overview\n========================\n\nAn overview of the coordinated sample data set.\n\"\"\"\nimport matplotlib.pyplot as plt\nimport astropy.units as u\n\nimport sunpy.map\nimport sunpy.timeseries\nimport sunpy.data.sampl... | diff --git a/changelog/3835.doc.rst b/changelog/3835.doc.rst
new file mode 100644
index 00000000000..acb95ba1734
--- /dev/null
+++ b/changelog/3835.doc.rst
@@ -0,0 +1 @@
+Changed padding value of an example in the example gallery to fix the overlap of titles and x-label axes.
diff --git a/examples/acquiring_data/2011_06_07_sampledata_overview.py b/examples/acquiring_data/2011_06_07_sampledata_overview.py
index cdda728d649..b33ddb0469a 100644
--- a/examples/acquiring_data/2011_06_07_sampledata_overview.py
+++ b/examples/acquiring_data/2011_06_07_sampledata_overview.py
@@ -78,7 +78,7 @@
aia_1600_map.plot(clip_interval=(0.5, 99.9)*u.percent)
aia_1600_map.draw_grid()
-fig.tight_layout(pad=6.50)
+fig.tight_layout(pad=8.50)
plt.show()
###############################################################################
|
readthedocs__readthedocs.org-4811 | Delete untracked tags on fetch step
Currently, if the user deletes a tag, it needs to wipe the environment for this change be reflected in their version list.
There are some solutions to delete untracked tags (require more than 2 commands). But I found that the newest version of git has the `--prune-tags` option, which is used as `git fetch --prune --prune-tags` (`git >2.17`). We need to update git on the servers (we use 2.7.4) and change the fetch command. Or we can find a way to wipe the environment if we detect something like this case.
Raised in https://github.com/rtfd/readthedocs.org/pull/3913#issuecomment-396673349
| [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"Git-related utilities.\"\"\"\n\nfrom __future__ import (\n absolute_import, division, print_function, unicode_literals)\n\nimport csv\nimport logging\nimport os\nimport re\n\nimport git\nfrom builtins import str\nfrom django.core.exceptions import ValidationError\... | [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"Git-related utilities.\"\"\"\n\nfrom __future__ import (\n absolute_import, division, print_function, unicode_literals)\n\nimport csv\nimport logging\nimport os\nimport re\n\nimport git\nfrom builtins import str\nfrom django.core.exceptions import ValidationError\... | diff --git a/.travis.yml b/.travis.yml
index 7eff80a97e4..6f4bdf452ee 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -21,6 +21,8 @@ cache:
- ~/.cache/pip
- ~/.nvm/nvm.sh
- ~/.npm
+before_install:
+ - sudo apt-get install -y git
install:
- ./scripts/travis/install_elasticsearch.sh
- pip install tox-travis
diff --git a/docs/install.rst b/docs/install.rst
index 4d6f85d625b..d7234bf9283 100644
--- a/docs/install.rst
+++ b/docs/install.rst
@@ -13,7 +13,7 @@ since it will help you to avoid clutter in your system-wide libraries.
Additionally Read the Docs depends on:
-* `Git`_ (version >=2)
+* `Git`_ (version >=2.17.0)
* `Mercurial`_ (only if you need to work with mercurial repositories)
* `Pip`_ (version >1.5)
* `Redis`_
diff --git a/readthedocs/rtd_tests/tests/test_backend.py b/readthedocs/rtd_tests/tests/test_backend.py
index 3acae2d2036..239c2dc8f57 100644
--- a/readthedocs/rtd_tests/tests/test_backend.py
+++ b/readthedocs/rtd_tests/tests/test_backend.py
@@ -1,21 +1,33 @@
# -*- coding: utf-8 -*-
from __future__ import (
- absolute_import, division, print_function, unicode_literals)
+ absolute_import,
+ division,
+ print_function,
+ unicode_literals,
+)
+import os
from os.path import exists
+from tempfile import mkdtemp
import django_dynamic_fixture as fixture
import pytest
from django.contrib.auth.models import User
-from mock import Mock
+from mock import Mock, patch
from readthedocs.config import ALL
from readthedocs.projects.exceptions import RepositoryError
from readthedocs.projects.models import Feature, Project
from readthedocs.rtd_tests.base import RTDTestCase
from readthedocs.rtd_tests.utils import (
- create_git_tag, make_test_git, make_test_hg)
+ create_git_branch,
+ create_git_tag,
+ delete_git_branch,
+ delete_git_tag,
+ make_test_git,
+ make_test_hg,
+)
class TestGitBackend(RTDTestCase):
@@ -118,6 +130,51 @@ def test_check_invalid_submodule_urls(self):
repo.checkout('invalidsubmodule')
self.assertEqual(e.msg, RepositoryError.INVALID_SUBMODULES)
+ @patch('readthedocs.projects.models.Project.checkout_path')
+ def test_fetch_clean_tags_and_branches(self, checkout_path):
+ upstream_repo = self.project.repo
+ create_git_tag(upstream_repo, 'v01')
+ create_git_tag(upstream_repo, 'v02')
+ create_git_branch(upstream_repo, 'newbranch')
+
+ local_repo = os.path.join(mkdtemp(), 'local')
+ os.mkdir(local_repo)
+ checkout_path.return_value = local_repo
+
+ repo = self.project.vcs_repo()
+ repo.clone()
+
+ delete_git_tag(upstream_repo, 'v02')
+ delete_git_branch(upstream_repo, 'newbranch')
+
+ # We still have all branches and tags in the local repo
+ self.assertEqual(
+ set(['v01', 'v02']),
+ set(vcs.verbose_name for vcs in repo.tags)
+ )
+ self.assertEqual(
+ set([
+ 'relativesubmodule', 'invalidsubmodule',
+ 'master', 'submodule', 'newbranch',
+ ]),
+ set(vcs.verbose_name for vcs in repo.branches)
+ )
+
+ repo.checkout()
+
+ # We don't have the eliminated branches and tags in the local repo
+ self.assertEqual(
+ set(['v01']),
+ set(vcs.verbose_name for vcs in repo.tags)
+ )
+ self.assertEqual(
+ set([
+ 'relativesubmodule', 'invalidsubmodule',
+ 'master', 'submodule'
+ ]),
+ set(vcs.verbose_name for vcs in repo.branches)
+ )
+
class TestHgBackend(RTDTestCase):
def setUp(self):
diff --git a/readthedocs/vcs_support/backends/git.py b/readthedocs/vcs_support/backends/git.py
index 9b117799fb3..2959add5493 100644
--- a/readthedocs/vcs_support/backends/git.py
+++ b/readthedocs/vcs_support/backends/git.py
@@ -122,7 +122,9 @@ def validate_submodules(self, config):
return True, submodules.keys()
def fetch(self):
- code, _, _ = self.run('git', 'fetch', '--tags', '--prune')
+ code, _, _ = self.run(
+ 'git', 'fetch', '--tags', '--prune', '--prune-tags',
+ )
if code != 0:
raise RepositoryError
|
kivy__kivy-6322 | PermissionError is not available in Python2.7
<!--
The issue tracker is a tool to address bugs.
Please use the #support Discord channel at https://chat.kivy.org/ or Stack Overflow for
support questions, more information at https://git.io/vM1yQ.
Before opening a new issue, make sure you do the following:
* check that your issue isn't already filed: https://git.io/vM1iE
* prepare a short, runnable example that reproduces the issue
* reproduce the problem with the latest development version of Kivy
* double-check that the issue is indeed a bug and not a support request
-->
### Versions
* Python: 2.7
* OS: Any
* Kivy: 1.11.0rc1 (ef216431d5b2762480596ed4a2c93a5ecbd5a355)
* Kivy installation method: Installed following [official instructions](https://kivy.org/doc/stable/installation/installation-windows.html#use-development-kivy)
### Description
`PermissionError` isn't a builtin error in Python 2, so this line in `logger.py` will raise an error: https://github.com/kivy/kivy/blob/ef216431d5b2762480596ed4a2c93a5ecbd5a355/kivy/logger.py#L150
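The compatibility shim used by the fix can be exercised on its own: on Python 3 the `try`/`except NameError` leaves the builtin untouched, while on Python 2 it aliases the name to the OS-level exception classes that a denied `open()` actually raises. `try_open_for_write` is a hypothetical caller showing the alias in use:

```python
try:
    PermissionError
except NameError:  # Python 2: no such builtin; alias the OS-level errors
    PermissionError = (OSError, IOError)

def try_open_for_write(path):
    """Return a file object, or None when the OS denies access."""
    try:
        return open(path, "w")
    except PermissionError:
        return None
```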
| [
{
"content": "'''\nLogger object\n=============\n\nDifferents logging levels are available : trace, debug, info, warning, error\nand critical.\n\nExamples of usage::\n\n from kivy.logger import Logger\n\n Logger.info('title: This is a info message.')\n Logger.debug('title: This is a debug message.')\n\... | [
{
"content": "'''\nLogger object\n=============\n\nDifferents logging levels are available : trace, debug, info, warning, error\nand critical.\n\nExamples of usage::\n\n from kivy.logger import Logger\n\n Logger.info('title: This is a info message.')\n Logger.debug('title: This is a debug message.')\n\... | diff --git a/kivy/logger.py b/kivy/logger.py
index 48bb314fcd..bd40701b3d 100644
--- a/kivy/logger.py
+++ b/kivy/logger.py
@@ -65,6 +65,11 @@
__all__ = (
'Logger', 'LOG_LEVELS', 'COLORS', 'LoggerHistory', 'file_log_handler')
+try:
+ PermissionError
+except NameError: # Python 2
+ PermissionError = OSError, IOError
+
Logger = None
BLACK, RED, GREEN, YELLOW, BLUE, MAGENTA, CYAN, WHITE = list(range(8))
|
DataBiosphere__toil-2028 | Race condition in Mesos batch system
In (very!) rare cases, the Mesos driver thread can crash with a KeyError. Fortunately, the log file demonstrates exactly the interleaving that needs to happen to cause this error:
```
Launched Mesos task 1667171.
Queueing the job command: _toil_worker CactusHalGeneratorUpWrapper aws:us-west-2:birds-first-jobstore e257211f-7dde-4f82-b2d4-1162f85589fe with
job id: 1667173 ...
Launched Mesos task 1667172.
Got offer 99e660ab-d9e0-4a70-9eb0-588da54bd4b0-O5620082 for a non-preemptable slave with 12988.00 MiB memory, 31.00 core(s) and 368882.00 MiB o
f disk.
Preparing to launch Mesos task 1667173 using offer 99e660ab-d9e0-4a70-9eb0-588da54bd4b0-O5620082 ...
Offer 99e660ab-d9e0-4a70-9eb0-588da54bd4b0-O5620082 not suitable to run the tasks with requirements {'cores': 1, 'preemptable': True, 'disk': 2
147483648, 'memory': 4089446400}. Mesos offered 13618905088.0 memory, 31.0 cores and 3.86800812032e+11 of disk on a non-preemptable slave.
Offer 99e660ab-d9e0-4a70-9eb0-588da54bd4b0-O5620082 not suitable to run the tasks with requirements {'cores': 1, 'preemptable': True, 'disk': 2
147483648, 'memory': 100000000}. Mesos offered 13618905088.0 memory, 31.0 cores and 3.86800812032e+11 of disk on a non-preemptable slave.
Failed to call scheduler's resourceOffer
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/toil/batchSystems/mesos/batchSystem.py", line 468, in resourceOffers
self._updateStateToRunning(offer, runnableTasks)
File "/usr/local/lib/python2.7/dist-packages/toil/batchSystems/mesos/batchSystem.py", line 379, in _updateStateToRunning
resources = self.taskResources[resourceKey]
KeyError: 1667173
I0120 07:18:14.162212 21183 sched.cpp:2055] Asked to abort the driver
... queued
Issued job 'CactusHalGeneratorUpWrapper' e257211f-7dde-4f82-b2d4-1162f85589fe with job batch system ID: 1667173 and preemptability: True, cores: 1, disk: 2.0 G, and memory: 3.8 G
```
The `Queueing the job command... queued` messages come from `issueBatchJob` on the leader thread. The `Preparing Mesos task` and similar messages come from `resourceOffers` on the driver thread. And if we look at `issueBatchJob`, there's a subtle race:
```python
self.jobQueues.insertJob(job, jobType)
<---------- leader thread was paused here
self.taskResources[jobID] = job.resources
log.debug("... queued")
```
The job was made available for processing before it was entirely ready. If the leader thread gets interrupted *immediately* after putting the job in the queue, and `resourceOffers` gets a chance to run without interruption for a while, `taskResources` won't be filled in properly and the Mesos driver will crash.
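The fix is simply to swap the two statements so the resources are published before the job becomes visible on the queue. A minimal, self-contained sketch of that safe ordering (the names `issue_batch_job` and `driver_loop` are illustrative, not Toil's actual API):

```python
import queue
import threading

task_resources = {}        # read by the "driver" thread, written by the "leader"
job_queue = queue.Queue()  # jobs become visible to the driver via this queue

def issue_batch_job(job_id, resources):
    # Publish the resources *before* the job is enqueued, so any thread
    # that dequeues the job is guaranteed to find its resources entry.
    task_resources[job_id] = resources
    job_queue.put(job_id)

def driver_loop(results):
    job_id = job_queue.get()                # blocks until a job is visible
    results.append(task_resources[job_id])  # safe: entry must exist by now

results = []
driver = threading.Thread(target=driver_loop, args=(results,))
driver.start()
issue_batch_job(1667173, {"cores": 1, "memory": 4089446400})
driver.join()
```

With the original ordering (the `put` before the dict write), the driver can dequeue `job_id` before `task_resources[job_id]` exists and hit a `KeyError`, exactly as in the log above.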
| [
{
"content": "# Copyright (C) 2015-2016 Regents of the University of California\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-... | [
{
"content": "# Copyright (C) 2015-2016 Regents of the University of California\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-... | diff --git a/src/toil/batchSystems/mesos/batchSystem.py b/src/toil/batchSystems/mesos/batchSystem.py
index 6a671049aa..25da89b8d0 100644
--- a/src/toil/batchSystems/mesos/batchSystem.py
+++ b/src/toil/batchSystems/mesos/batchSystem.py
@@ -190,8 +190,8 @@ def issueBatchJob(self, jobNode):
# TODO: round all elements of resources
- self.jobQueues.insertJob(job, jobType)
self.taskResources[jobID] = job.resources
+ self.jobQueues.insertJob(job, jobType)
log.debug("... queued")
return jobID
|
ckan__ckan-8093 | readthedocs sphinx build failures
## CKAN version
master
## Describe the bug
infinite loop in build, looks like no tags are returned from `git log`?
### Steps to reproduce
check sphinx logs
### Expected behavior
build docs on rtd working
### Additional details
```python-traceback
Traceback (most recent call last):
File "/home/docs/checkouts/readthedocs.org/user_builds/ckan/envs/latest/lib/python3.10/site-packages/sphinx/config.py", line 358, in eval_config_file
exec(code, namespace) # NoQA: S102
File "/home/docs/checkouts/readthedocs.org/user_builds/ckan/checkouts/latest/doc/conf.py", line 388, in <module>
current_release_tag_value = get_current_release_tag()
File "/home/docs/checkouts/readthedocs.org/user_builds/ckan/checkouts/latest/doc/conf.py", line 211, in get_current_release_tag
return get_latest_release_tag()
File "/home/docs/checkouts/readthedocs.org/user_builds/ckan/checkouts/latest/doc/conf.py", line 228, in get_latest_release_tag
return get_latest_release_version()
File "/home/docs/checkouts/readthedocs.org/user_builds/ckan/checkouts/latest/doc/conf.py", line 237, in get_latest_release_version
version = get_latest_release_tag()[len('ckan-'):]
File "/home/docs/checkouts/readthedocs.org/user_builds/ckan/checkouts/latest/doc/conf.py", line 228, in get_latest_release_tag
return get_latest_release_version()
File "/home/docs/checkouts/readthedocs.org/user_builds/ckan/checkouts/latest/doc/conf.py", line 237, in get_latest_release_version
version = get_latest_release_tag()[len('ckan-'):]
…
```
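The traceback is an unguarded mutual recursion: when `git` reports no tags, each helper falls back to the other until the recursion limit is hit. A minimal sketch of a guarded version (illustrative only, not CKAN's actual `conf.py`):

```python
def get_latest_release_tag(tags):
    # Base case: if `git log` returned no tags (e.g. a shallow checkout
    # without `git fetch --tags`), stop instead of recursing forever.
    if not tags:
        return None
    return sorted(tags)[-1]

def get_latest_release_version(tags):
    tag = get_latest_release_tag(tags)
    if tag is None:
        return None
    return tag[len("ckan-"):]
```

The PR additionally runs `git fetch --tags` in a Read the Docs `post_checkout` job, so the no-tags base case is rarely hit in practice.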
| [
{
"content": "# -*- coding: utf-8 -*-\n#\n# CKAN documentation build configuration file, created by\n# sphinx-quickstart on Sun Oct 25 16:47:17 2009.\n#\n# This file is execfile()d with the current directory set to its containing dir.\n#\n# The contents of this file are pickled, so don't put values in the names... | [
{
"content": "# -*- coding: utf-8 -*-\n#\n# CKAN documentation build configuration file, created by\n# sphinx-quickstart on Sun Oct 25 16:47:17 2009.\n#\n# This file is execfile()d with the current directory set to its containing dir.\n#\n# The contents of this file are pickled, so don't put values in the names... | diff --git a/.readthedocs.yaml b/.readthedocs.yaml
index 1bd376c77b3..b8e02ef2197 100644
--- a/.readthedocs.yaml
+++ b/.readthedocs.yaml
@@ -9,9 +9,13 @@ version: 2
build:
os: ubuntu-22.04
apt_packages:
- - libmagic-dev
+ - libmagic-dev
+ - libmagic1
tools:
python: "3.10"
+ jobs:
+ post_checkout:
+ - git fetch --tags || true
sphinx:
configuration: doc/conf.py
diff --git a/doc/conf.py b/doc/conf.py
index 0e6325a8ec2..eca46d5929f 100644
--- a/doc/conf.py
+++ b/doc/conf.py
@@ -85,7 +85,9 @@
extensions = [
'sphinx.ext.autodoc', 'sphinx.ext.todo',
'sphinx.ext.autosummary', 'ckan.plugins.toolkit_sphinx_extension',
+ 'sphinx_rtd_theme',
]
+html_theme = 'sphinx_rtd_theme'
autodoc_member_order = 'bysource'
todo_include_todos = True
|
fossasia__open-event-server-4302 | Custom-forms: Change data.type in custom-form
**I'm submitting a ...** (check one with "x")
- [x] bug report
- [ ] feature request
- [ ] support request => Please do not submit support requests here, instead ask your query in our Gitter channel at https://gitter.im/fossasia/open-event-orga-server
**Current behavior:**
The type attribute is `custom_form` which leads to error 409 while making a request after #4300
**Expected behavior:**
The type attribute should be `custom-form`
@enigmaeth Can you please check?
| [
{
"content": "from flask_rest_jsonapi import ResourceDetail, ResourceList, ResourceRelationship\nfrom marshmallow_jsonapi.flask import Schema, Relationship\nfrom marshmallow_jsonapi import fields\nimport marshmallow.validate as validate\nfrom app.api.helpers.permissions import jwt_required\nfrom flask_rest_json... | [
{
"content": "from flask_rest_jsonapi import ResourceDetail, ResourceList, ResourceRelationship\nfrom marshmallow_jsonapi.flask import Schema, Relationship\nfrom marshmallow_jsonapi import fields\nimport marshmallow.validate as validate\nfrom app.api.helpers.permissions import jwt_required\nfrom flask_rest_json... | diff --git a/app/api/custom_forms.py b/app/api/custom_forms.py
index 5474779926..924b5a1ad7 100644
--- a/app/api/custom_forms.py
+++ b/app/api/custom_forms.py
@@ -24,7 +24,7 @@ class Meta:
"""
Meta class for CustomForm Schema
"""
- type_ = 'custom_form'
+ type_ = 'custom-form'
self_view = 'v1.custom_form_detail'
self_view_kwargs = {'id': '<id>'}
inflect = dasherize
diff --git a/docs/api/api_blueprint.apib b/docs/api/api_blueprint.apib
index 5457207c75..bccb44396e 100644
--- a/docs/api/api_blueprint.apib
+++ b/docs/api/api_blueprint.apib
@@ -16238,7 +16238,7 @@ Create a new Custom Form with event_id.
{
"data": {
- "type": "custom_form",
+ "type": "custom-form",
"relationships": {
"event": {
"data": {
@@ -16279,7 +16279,7 @@ Create a new Custom Form with event_id.
"is-fixed": false,
"type": "text"
},
- "type": "custom_form",
+ "type": "custom-form",
"id": 1,
"links": {
"self": "/v1/custom-forms/1"
@@ -16329,7 +16329,7 @@ Get a single custom form.
"is-included": false,
"type": "text"
},
- "type": "custom_form",
+ "type": "custom-form",
"id": 1,
"links": {
"self": "/v1/custom-forms/1"
@@ -16359,7 +16359,7 @@ Update a single custom form with `id`.
{
"data": {
- "type": "custom_form",
+ "type": "custom-form",
"attributes": {
"form": "form",
"field-identifier": "abc123",
@@ -16393,7 +16393,7 @@ Update a single custom form with `id`.
"is-included": false,
"type": "text"
},
- "type": "custom_form",
+ "type": "custom-form",
"id": 1,
"links": {
"self": "/v1/custom-forms/1"
@@ -16475,7 +16475,7 @@ Get a list of Custom Forms for an event.
"is-fixed": false,
"type": "text"
},
- "type": "custom_form",
+ "type": "custom-form",
"id": 1,
"links": {
"self": "/v1/custom-forms/1"
|
DataDog__dd-trace-py-1582 | ddtrace.Pin() for multiple grpc channels doesn't work
Thanks for taking the time to report an issue!
Before reporting an issue on dd-trace-py, please be sure to provide all
necessary information.
If you're hitting a bug, make sure that you're using the latest version of this
library.
### Which version of dd-trace-py are you using?
0.38.2
I didn't find anything related to this issue in the release notes of the releases after this version.
### Which version of the libraries are you using?
datadog==0.36.0
### How can we reproduce your problem?
Approach 1:
servers is a list of grpc server addresses
```
for server in servers:
    channel = grpc.insecure_channel(server)
    Pin.override(channel, service=server)
    # Do something with the channel
```
Since `Pin.override(grpc.Channel, service=server)` worked with one server, I also tried the following to see how it looks
Approach 2:
servers is a list of grpc server addresses
```
for server in servers:
    Pin.override(grpc.Channel, service=server)
    channel = grpc.insecure_channel(server)
    # Do something with the channel
```
### What is the result that you get?
In Approach 1, Pin.override did not set the service name correctly. Everywhere in Datadog, I could see it as `grpc-client` which is the default value.
In Approach 2, since I don't pass the channels corresponding to each server, all servers are overridden by Pin to the final server (probably because it's the last one in the loop)
### What is the result that you expected?
ddtrace.Pin() onto multiple grpc channels should work and I should be able to see the correct `service` in Datadog APM traces and Service Map
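The fix that landed resolves the pin from the channel instance (`Pin.get_from(channel)`) instead of a module-level key. A toy registry showing why per-instance lookup makes Approach 1 work (this is not ddtrace's real `Pin` implementation, just the lookup order the fix relies on):

```python
class Pin:
    _registry = {}

    @classmethod
    def override(cls, obj, service):
        # Attach a service name to a specific object (instance or class).
        cls._registry[id(obj)] = service

    @classmethod
    def get_from(cls, obj, default="grpc-client"):
        # Prefer the pin on the instance itself, then the one on its class,
        # then the built-in default the user reported seeing.
        key = id(obj)
        if key in cls._registry:
            return cls._registry[key]
        return cls._registry.get(id(type(obj)), default)

class Channel:  # stand-in for grpc.Channel
    pass

a, b = Channel(), Channel()
Pin.override(a, service="server-a")
Pin.override(b, service="server-b")
```

With only class-level pins (Approach 2), each `Pin.override(Channel, ...)` overwrites the previous one, which is exactly the "last server wins" behaviour reported above.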
| [
{
"content": "import os\n\nimport grpc\n\nfrom ddtrace.vendor.wrapt import wrap_function_wrapper as _w\nfrom ddtrace import config, Pin\n\nfrom ...utils.wrappers import unwrap as _u\n\nfrom . import constants\nfrom .client_interceptor import create_client_interceptor, intercept_channel\nfrom .server_interceptor... | [
{
"content": "import os\n\nimport grpc\n\nfrom ddtrace.vendor.wrapt import wrap_function_wrapper as _w\nfrom ddtrace import config, Pin\n\nfrom ...utils.wrappers import unwrap as _u\n\nfrom . import constants\nfrom .client_interceptor import create_client_interceptor, intercept_channel\nfrom .server_interceptor... | diff --git a/ddtrace/contrib/grpc/patch.py b/ddtrace/contrib/grpc/patch.py
index b815d8a7c4f..1d48914dd03 100644
--- a/ddtrace/contrib/grpc/patch.py
+++ b/ddtrace/contrib/grpc/patch.py
@@ -98,7 +98,7 @@ def _unpatch_server():
def _client_channel_interceptor(wrapped, instance, args, kwargs):
channel = wrapped(*args, **kwargs)
- pin = Pin.get_from(constants.GRPC_PIN_MODULE_CLIENT)
+ pin = Pin.get_from(channel)
if not pin or not pin.enabled():
return channel
|
jazzband__django-simple-history-1218 | Creating historical records for models with M2M fields to `"self"` causes `FieldError`
**Describe the bug**
*See title.*
**To Reproduce**
Steps to reproduce the behavior:
1. Given the following model:
```python
class Person(models.Model):
    relations = models.ManyToManyField("self")
    history = HistoricalRecords(m2m_fields=[relations])
```
2. Run the following code (which should also create a historical record for the `Person` object):
```python
Person.objects.create()
```
3. This will produce the following error:
```
django.core.exceptions.FieldError: Cannot resolve keyword 'person' into field. Choices are: from_person, from_person_id, id, to_person, to_person_id
```
**Expected behavior**
That a model object and associated historical record were successfully created, and that the error was not raised.
**Environment (please complete the following information):**
- OS: Windows 11 22H2
- Django Simple History Version: [the current `master` branch](https://github.com/jazzband/django-simple-history/tree/636bcbc46d473862c000101ef040e4eda693117f)
- Django Version: 4.1.6
- Database Version: SQLite 3.38.4
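The error message already hints at the cause: Django's auto-created through table for a self-referential M2M has `from_person`/`to_person` columns, not a plain `person` column, so deriving the filter key from the lowercased model name fails; the fix asks the field itself via `field.m2m_field_name()`. A tiny illustration of the naming difference (`through_filter_key` is a hypothetical helper, not django-simple-history code):

```python
def through_filter_key(model_name, to_self=False):
    # Normal M2M through tables get a FK named after the model ("person");
    # a ManyToManyField("self") instead gets "from_person"/"to_person".
    name = model_name.lower()
    return f"from_{name}" if to_self else name
```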
| [
{
"content": "import copy\nimport importlib\nimport uuid\nimport warnings\nfrom functools import partial\n\nfrom django.apps import apps\nfrom django.conf import settings\nfrom django.contrib import admin\nfrom django.contrib.auth import get_user_model\nfrom django.core.exceptions import ImproperlyConfigured, O... | [
{
"content": "import copy\nimport importlib\nimport uuid\nimport warnings\nfrom functools import partial\n\nfrom django.apps import apps\nfrom django.conf import settings\nfrom django.contrib import admin\nfrom django.contrib.auth import get_user_model\nfrom django.core.exceptions import ImproperlyConfigured, O... | diff --git a/AUTHORS.rst b/AUTHORS.rst
index 875e0173b..12eeb4383 100644
--- a/AUTHORS.rst
+++ b/AUTHORS.rst
@@ -90,6 +90,7 @@ Authors
- Lucas Wiman
- Maciej "RooTer" Urbański
- Marcelo Canina (`marcanuy <https://github.com/marcanuy>`_)
+- Marco Sirabella
- Mark Davidoff
- Martin Bachwerk
- Marty Alchin
diff --git a/CHANGES.rst b/CHANGES.rst
index 0e6660d5c..fbe4f979c 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -24,6 +24,8 @@ Unreleased
``HistoricalRecords.context.request``) under some circumstances (gh-1188)
- Made ``HistoryRequestMiddleware`` async-capable (gh-1209)
- Fixed error when setting ``table_name`` with ``inherit=True`` (gh-1195)
+- Fixed ``FieldError`` when creating historical records for many-to-many fields with
+ ``to="self"`` (gh-1218)
3.3.0 (2023-03-08)
------------------
diff --git a/simple_history/models.py b/simple_history/models.py
index 19080a4d4..4ad5c2e9c 100644
--- a/simple_history/models.py
+++ b/simple_history/models.py
@@ -670,7 +670,8 @@ def create_historical_record_m2ms(self, history_instance, instance):
insert_rows = []
- through_field_name = type(original_instance).__name__.lower()
+ # `m2m_field_name()` is part of Django's internal API
+ through_field_name = field.m2m_field_name()
rows = through_model.objects.filter(**{through_field_name: instance})
diff --git a/simple_history/tests/models.py b/simple_history/tests/models.py
index a41374d7d..5c1da32ad 100644
--- a/simple_history/tests/models.py
+++ b/simple_history/tests/models.py
@@ -200,6 +200,11 @@ class PollChildRestaurantWithManyToMany(PollParentWithManyToMany):
_history_m2m_fields = [restaurants]
+class PollWithSelfManyToMany(models.Model):
+ relations = models.ManyToManyField("self")
+ history = HistoricalRecords(m2m_fields=[relations])
+
+
class CustomAttrNameForeignKey(models.ForeignKey):
def __init__(self, *args, **kwargs):
self.attr_name = kwargs.pop("attr_name", None)
diff --git a/simple_history/tests/tests/test_models.py b/simple_history/tests/tests/test_models.py
index 2f98594a7..484df73f9 100644
--- a/simple_history/tests/tests/test_models.py
+++ b/simple_history/tests/tests/test_models.py
@@ -103,6 +103,7 @@
PollWithManyToManyCustomHistoryID,
PollWithManyToManyWithIPAddress,
PollWithNonEditableField,
+ PollWithSelfManyToMany,
PollWithSeveralManyToMany,
Province,
Restaurant,
@@ -1869,6 +1870,17 @@ def test_separation(self):
self.assertEqual(add.restaurants.all().count(), 0)
self.assertEqual(add.places.all().count(), 0)
+ def test_self_field(self):
+ poll1 = PollWithSelfManyToMany.objects.create()
+ poll2 = PollWithSelfManyToMany.objects.create()
+
+ self.assertEqual(poll1.history.all().count(), 1)
+
+ poll1.relations.add(poll2)
+ self.assertIn(poll2, poll1.relations.all())
+
+ self.assertEqual(poll1.history.all().count(), 2)
+
class ManyToManyWithSignalsTest(TestCase):
def setUp(self):
|
scverse__scanpy-1979 | small spelling mistake
In the file scanpy/_utils/__init__.py on master branch, line 412 says:
"Revieved a view of an AnnData. Making a copy."
probably meaning "received"
| [
{
"content": "\"\"\"Utility functions and classes\n\nThis file largely consists of the old _utils.py file. Over time, these functions\nshould be moved of this file.\n\"\"\"\nimport sys\nimport inspect\nimport warnings\nimport importlib.util\nfrom enum import Enum\nfrom pathlib import Path\nfrom weakref import W... | [
{
"content": "\"\"\"Utility functions and classes\n\nThis file largely consists of the old _utils.py file. Over time, these functions\nshould be moved of this file.\n\"\"\"\nimport sys\nimport inspect\nimport warnings\nimport importlib.util\nfrom enum import Enum\nfrom pathlib import Path\nfrom weakref import W... | diff --git a/scanpy/_utils/__init__.py b/scanpy/_utils/__init__.py
index 105ca8802a..fd169ff9d4 100644
--- a/scanpy/_utils/__init__.py
+++ b/scanpy/_utils/__init__.py
@@ -409,7 +409,7 @@ def sanitize_anndata(adata):
def view_to_actual(adata):
if adata.is_view:
warnings.warn(
- "Revieved a view of an AnnData. Making a copy.",
+ "Received a view of an AnnData. Making a copy.",
stacklevel=2,
)
adata._init_as_actual(adata.copy())
|
apple__coremltools-911 | cuda tensor parameter fail to convert to numpy in InternalTorchIRGraph
## 🐞Describe the bug
- If the input parameter type to a traced model is tensor.cuda(), ct.convert fails with the below error
- Torch
## Trace
```
File "/home/josh/anaconda3/lib/python3.7/site-packages/coremltools/converters/mil/frontend/torch/internal_graph.py", line 180, in __init__
value = param.detach().numpy()
TypeError: can't convert cuda:0 device type tensor to numpy. Use Tensor.cpu() to copy the tensor to host memory first.
```
Note, possible fix:
replace
` value = param.detach().numpy()
`
with
` value = param.cpu().detach().numpy()
`
## System environment (please complete the following information):
- coremltools version: 4.0b1
- OS: Linux
- How you install python: anaconda
- python version: 3.7.6
## Additional context
Add any other context about the problem here.
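The suggested one-line fix works because `.cpu()` is a no-op for tensors already on the host, so `param.detach().cpu().numpy()` is safe for both CPU and CUDA parameters. A torch-free sketch of that behaviour (`FakeTensor` is a stand-in mimicking `torch.Tensor`'s conversion rule, not real PyTorch):

```python
class FakeTensor:
    """Mimics the numpy-conversion rule of a torch tensor."""

    def __init__(self, data, device="cpu"):
        self.data, self.device = data, device

    def detach(self):
        return self

    def cpu(self):
        # No-op when already on the host; otherwise "copy" to the host.
        return self if self.device == "cpu" else FakeTensor(self.data, "cpu")

    def numpy(self):
        if self.device != "cpu":
            raise TypeError(
                f"can't convert {self.device} device type tensor to numpy. "
                "Use Tensor.cpu() to copy the tensor to host memory first."
            )
        return list(self.data)  # stand-in for an ndarray

param = FakeTensor([1.0, 2.0], device="cuda:0")
value = param.detach().cpu().numpy()  # works for CUDA and CPU tensors alike
```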
| [
{
"content": "# Copyright (c) 2020, Apple Inc. All rights reserved.\n#\n# Use of this source code is governed by a BSD-3-clause license that can be\n# found in the LICENSE.txt file or at https://opensource.org/licenses/BSD-3-Clause\n\nfrom collections import OrderedDict\nfrom itertools import islice\n\nimpor... | [
{
"content": "# Copyright (c) 2020, Apple Inc. All rights reserved.\n#\n# Use of this source code is governed by a BSD-3-clause license that can be\n# found in the LICENSE.txt file or at https://opensource.org/licenses/BSD-3-Clause\n\nfrom collections import OrderedDict\nfrom itertools import islice\n\nimpor... | diff --git a/coremltools/converters/mil/frontend/torch/internal_graph.py b/coremltools/converters/mil/frontend/torch/internal_graph.py
index 197e8848f..097519f89 100644
--- a/coremltools/converters/mil/frontend/torch/internal_graph.py
+++ b/coremltools/converters/mil/frontend/torch/internal_graph.py
@@ -246,7 +246,7 @@ def __init__(
# Add params
for name, param in params_dict.items():
- value = param.detach().numpy()
+ value = param.detach().cpu().numpy()
self.params[name] = value
# Add inputs
|
xonsh__xonsh-1181 | Configuration fails on Windows due to colon in filename
`xonfig wizard` fails on Windows because the temporary configuration file created has a colon in its filename.
The relevant output is:
```
Would you like to save this state, yes or no [default: no]? yes
filename [default='C:\\Users\\alowe\\.config\\xonsh\\config.json']:
Traceback (most recent call last):
File "c:\users\alowe\appdata\local\continuum\anaconda3\lib\shutil.py", line 538, in move
os.rename(src, real_dst)
OSError: [WinError 123] The filename, directory name, or volume label syntax is incorrect: 'C:\\Users\\alowe\\.config\\xonsh\\config.json' -> 'C:\\Users\\alowe\\.config\\xonsh\\config.2016-06-08T11:18:52.170226.json'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "c:\users\alowe\appdata\local\continuum\anaconda3\lib\site-packages\xonsh\xonfig.py", line 307, in _wizard
pv.visit()
File "c:\users\alowe\appdata\local\continuum\anaconda3\lib\site-packages\xonsh\wizard.py", line 481, in visit
rtn = super().visit(node)
File "c:\users\alowe\appdata\local\continuum\anaconda3\lib\site-packages\xonsh\wizard.py", line 302, in visit
rtn = meth(node)
File "c:\users\alowe\appdata\local\continuum\anaconda3\lib\site-packages\xonsh\wizard.py", line 538, in visit_wizard
self.visit(child)
File "c:\users\alowe\appdata\local\continuum\anaconda3\lib\site-packages\xonsh\wizard.py", line 481, in visit
rtn = super().visit(node)
File "c:\users\alowe\appdata\local\continuum\anaconda3\lib\site-packages\xonsh\wizard.py", line 302, in visit
rtn = meth(node)
File "c:\users\alowe\appdata\local\continuum\anaconda3\lib\site-packages\xonsh\wizard.py", line 623, in visit_save
backup_file(fname)
File "c:\users\alowe\appdata\local\continuum\anaconda3\lib\site-packages\xonsh\tools.py", line 1165, in backup_file
shutil.move(fname, newfname)
File "c:\users\alowe\appdata\local\continuum\anaconda3\lib\shutil.py", line 552, in move
copy_function(src, real_dst)
File "c:\users\alowe\appdata\local\continuum\anaconda3\lib\shutil.py", line 251, in copy2
copyfile(src, dst, follow_symlinks=follow_symlinks)
File "c:\users\alowe\appdata\local\continuum\anaconda3\lib\shutil.py", line 115, in copyfile
with open(dst, 'wb') as fdst:
OSError: [Errno 22] Invalid argument: 'C:\\Users\\alowe\\.config\\xonsh\\config.2016-06-08T11:18:52.170226.json'
```
| [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"Misc. xonsh tools.\n\nThe following implementations were forked from the IPython project:\n\n* Copyright (c) 2008-2014, IPython Development Team\n* Copyright (C) 2001-2007 Fernando Perez <fperez@colorado.edu>\n* Copyright (c) 2001, Janko Hauser <jhauser@zscout.de>\n*... | [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"Misc. xonsh tools.\n\nThe following implementations were forked from the IPython project:\n\n* Copyright (c) 2008-2014, IPython Development Team\n* Copyright (C) 2001-2007 Fernando Perez <fperez@colorado.edu>\n* Copyright (c) 2001, Janko Hauser <jhauser@zscout.de>\n*... | diff --git a/CHANGELOG.rst b/CHANGELOG.rst
index 2aa7aad2f8..d473624ee9 100644
--- a/CHANGELOG.rst
+++ b/CHANGELOG.rst
@@ -12,7 +12,9 @@ Current Developments
**Removed:** None
-**Fixed:** None
+**Fixed:**
+
+* Fixed xonfig wizard failing on Windows due to colon in created filename.
**Security:** None
@@ -48,7 +50,6 @@ v0.3.4
file.
-
v0.3.3
====================
**Added:**
diff --git a/xonsh/tools.py b/xonsh/tools.py
index 277b7aeb46..a12711bb20 100644
--- a/xonsh/tools.py
+++ b/xonsh/tools.py
@@ -1171,7 +1171,8 @@ def backup_file(fname):
import shutil
from datetime import datetime
base, ext = os.path.splitext(fname)
- newfname = base + '.' + datetime.now().isoformat() + ext
+ timestamp = datetime.now().strftime('%Y-%m-%d-%H-%M-%S-%f')
+ newfname = '%s.%s%s' % (base, timestamp, ext)
shutil.move(fname, newfname)
|
getmoto__moto-1462 | Add opsworks app mocks
Add the mocks of OpsWork create_app and describe_apps calls. This is part of #1477
| [
{
"content": "from __future__ import unicode_literals\nimport logging\n# logging.getLogger('boto').setLevel(logging.CRITICAL)\n\n__title__ = 'moto'\n__version__ = '1.2.0',\n\nfrom .acm import mock_acm # flake8: noqa\nfrom .apigateway import mock_apigateway, mock_apigateway_deprecated # flake8: noqa\nfrom .aut... | [
{
"content": "from __future__ import unicode_literals\nimport logging\n# logging.getLogger('boto').setLevel(logging.CRITICAL)\n\n__title__ = 'moto'\n__version__ = '1.2.0'\n\nfrom .acm import mock_acm # flake8: noqa\nfrom .apigateway import mock_apigateway, mock_apigateway_deprecated # flake8: noqa\nfrom .auto... | diff --git a/moto/__init__.py b/moto/__init__.py
index 9d292a3e1847..c38212b42f4c 100644
--- a/moto/__init__.py
+++ b/moto/__init__.py
@@ -3,7 +3,7 @@
# logging.getLogger('boto').setLevel(logging.CRITICAL)
__title__ = 'moto'
-__version__ = '1.2.0',
+__version__ = '1.2.0'
from .acm import mock_acm # flake8: noqa
from .apigateway import mock_apigateway, mock_apigateway_deprecated # flake8: noqa
|
ManimCommunity__manim-1335 | Add import statements to examples in documentation
See title.
The examples in the documentation should also include the `from manim import *` at the very least, and actually we could provide best-practice examples where we don't do a *-import, but rather import classes/functions separately.
This can of course be an iterative process: start with adding `from manim import *` first, and become more specific later.
| [
{
"content": "r\"\"\"\nA directive for including Manim videos in a Sphinx document\n===========================================================\n\nWhen rendering the HTML documentation, the ``.. manim::`` directive\nimplemented here allows to include rendered videos.\n\nIts basic usage that allows processing **... | [
{
"content": "r\"\"\"\nA directive for including Manim videos in a Sphinx document\n===========================================================\n\nWhen rendering the HTML documentation, the ``.. manim::`` directive\nimplemented here allows to include rendered videos.\n\nIts basic usage that allows processing **... | diff --git a/docs/source/manim_directive.py b/docs/source/manim_directive.py
index 4f152a928a..d049b6c1db 100644
--- a/docs/source/manim_directive.py
+++ b/docs/source/manim_directive.py
@@ -202,6 +202,7 @@ def run(self):
source_block = [
".. code-block:: python",
"",
+ " from manim import *\n",
*[" " + line for line in self.content],
]
source_block = "\n".join(source_block)
|
wagtail__wagtail-9923 | Search on listing views doesn't work unless the `?q=` param exists in the URL
<!--
Found a bug? Please fill out the sections below. 👍
-->
### Issue Summary
Possible regression in https://github.com/wagtail/wagtail/pull/9768
The `URLSearchParams.get()` returns `null` if the param doesn't exist, so the following code:
https://github.com/wagtail/wagtail/blob/a3f10acae17c892d843c419495e4204adb3ed991/client/src/entrypoints/admin/core.js#L270-L276
will crash during `currentQuery.trim()` when searching on the listing views (snippets, images, etc.) if the `?q=` param doesn't exist in the URL.
Might be a good time to add `required=False` in here as well:
https://github.com/wagtail/wagtail/blob/a3f10acae17c892d843c419495e4204adb3ed991/wagtail/admin/forms/search.py#L12
to remove this silly error when `q` is an empty string:
<img width="473" alt="image" src="https://user-images.githubusercontent.com/6379424/213499685-ce37c064-2635-434f-952f-e85fae4ab9af.png">
<!--
A summary of the issue.
-->
### Steps to Reproduce
1. Spin up bakerydemo
2. Open the images listing
3. Try to search
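The JS fix is a one-expression change: coalesce the possibly-`null` result of `URLSearchParams.get` to an empty string before calling `.trim()`. The same pitfall exists with Python's `dict.get`, which makes for a runnable parallel (illustrative, not Wagtail code):

```python
def query_changed(params, new_query):
    # Buggy version: params.get("q") is None when "q" is absent, and
    # None.strip() raises AttributeError -- the analogue of null.trim() in JS.
    # Fixed version coalesces first, like `searchParams.get('q') || ''`:
    current = (params.get("q") or "").strip()
    return current != new_query.strip()

print(query_changed({}, "hello"))               # True: no "q" param yet
print(query_changed({"q": "hello "}, "hello"))  # False: only whitespace differs
```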
| [
{
"content": "from django import forms\nfrom django.utils.translation import gettext as _\nfrom django.utils.translation import gettext_lazy\n\n\nclass SearchForm(forms.Form):\n def __init__(self, *args, **kwargs):\n placeholder = kwargs.pop(\"placeholder\", _(\"Search\"))\n super().__init__(*a... | [
{
"content": "from django import forms\nfrom django.utils.translation import gettext as _\nfrom django.utils.translation import gettext_lazy\n\n\nclass SearchForm(forms.Form):\n def __init__(self, *args, **kwargs):\n placeholder = kwargs.pop(\"placeholder\", _(\"Search\"))\n super().__init__(*a... | diff --git a/client/src/entrypoints/admin/core.js b/client/src/entrypoints/admin/core.js
index c9d86855cf13..9f0561a40d0d 100644
--- a/client/src/entrypoints/admin/core.js
+++ b/client/src/entrypoints/admin/core.js
@@ -270,7 +270,7 @@ $(() => {
const search = function () {
const newQuery = $input.val();
const searchParams = new URLSearchParams(window.location.search);
- const currentQuery = searchParams.get('q');
+ const currentQuery = searchParams.get('q') || '';
// only do the query if it has changed for trimmed queries
// for example - " " === "" and "firstword " ==== "firstword"
if (currentQuery.trim() !== newQuery.trim()) {
diff --git a/wagtail/admin/forms/search.py b/wagtail/admin/forms/search.py
index 4d6f85956aea..fb2303c302a2 100644
--- a/wagtail/admin/forms/search.py
+++ b/wagtail/admin/forms/search.py
@@ -9,4 +9,8 @@ def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.fields["q"].widget.attrs = {"placeholder": placeholder}
- q = forms.CharField(label=gettext_lazy("Search term"), widget=forms.TextInput())
+ q = forms.CharField(
+ label=gettext_lazy("Search term"),
+ widget=forms.TextInput(),
+ required=False,
+ )
diff --git a/wagtail/images/tests/test_admin_views.py b/wagtail/images/tests/test_admin_views.py
index c0928aed66fa..700021e3cbcb 100644
--- a/wagtail/images/tests/test_admin_views.py
+++ b/wagtail/images/tests/test_admin_views.py
@@ -44,6 +44,16 @@ def test_simple(self):
self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, "wagtailimages/images/index.html")
self.assertContains(response, "Add an image")
+ # The search box should not raise an error
+ self.assertNotContains(response, "This field is required.")
+
+ def test_empty_q(self):
+ response = self.get({"q": ""})
+ self.assertEqual(response.status_code, 200)
+ self.assertEqual(response.context["query_string"], "")
+ self.assertContains(response, "Add an image")
+ # The search box should not raise an error
+ self.assertNotContains(response, "This field is required.")
def test_search(self):
response = self.get({"q": "Hello"})
diff --git a/wagtail/snippets/tests/test_snippets.py b/wagtail/snippets/tests/test_snippets.py
index c17427a7bc1a..2988dce0154f 100644
--- a/wagtail/snippets/tests/test_snippets.py
+++ b/wagtail/snippets/tests/test_snippets.py
@@ -479,6 +479,23 @@ def test_simple(self):
self.assertIn(self.snippet_b, items)
self.assertIn(self.snippet_c, items)
+ # The search box should not raise an error
+ self.assertNotContains(response, "This field is required.")
+
+ def test_empty_q(self):
+ response = self.get({"q": ""})
+ self.assertEqual(response.status_code, 200)
+ self.assertTemplateUsed(response, "wagtailsnippets/snippets/type_index.html")
+
+ # All snippets should be in items
+ items = list(response.context["page_obj"].object_list)
+ self.assertIn(self.snippet_a, items)
+ self.assertIn(self.snippet_b, items)
+ self.assertIn(self.snippet_c, items)
+
+ # The search box should not raise an error
+ self.assertNotContains(response, "This field is required.")
+
def test_is_searchable(self):
self.assertTrue(self.get().context["is_searchable"])
|
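The Wagtail patch above makes the admin search field optional so an empty `q` no longer fails validation. A minimal pure-Python sketch of that required/optional distinction — the `CharField` below is a stand-in class for the demo, not Django's real `forms.CharField`:

```python
class CharField:
    # Stand-in for a form field; not Django's real forms.CharField.
    def __init__(self, required=True):
        self.required = required

    def clean(self, value):
        value = "" if value is None else str(value)
        if self.required and not value:
            raise ValueError("This field is required.")
        return value

# Before the fix: submitting the search form with q="" fails validation.
strict = CharField()
try:
    strict.clean("")
    errored = False
except ValueError:
    errored = True
assert errored

# After the fix (required=False): an empty q is valid and means "no filter".
relaxed = CharField(required=False)
assert relaxed.clean("") == ""
```

This is why the new tests assert that `"This field is required."` never appears in the rendered page, even when `q` is empty or absent.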
sanic-org__sanic-1397 | Logger does not work.
**Describe the bug**
The logger does not work at the current master commit (https://github.com/huge-success/sanic/commit/7d79a86d4dc48de11cd34e8ba12e41f3a9f9ff18).
**Code snippet**
```python
from sanic import Sanic
from sanic.log import logger
from sanic.response import text
app = Sanic()
@app.listener('before_server_start')
async def setup(app, loop):
logger.info('INFO')
@app.get('/')
async def test(request):
return text('hello world')
if __name__ == '__main__':
app.run()
```
There is no log output now.
**Expected behavior**
With the `0.8.3` release, it logs messages like:
```
[2018-11-05 17:34:47 +0800] [12112] [INFO] Goin' Fast @ http://127.0.0.1:8000
[2018-11-05 17:34:47 +0800] [12112] [INFO] INFO
[2018-11-05 17:34:47 +0800] [12112] [INFO] Starting worker [12112]
```
**Environment (please complete the following information):**
- OS: Ubuntu 18.04
- Version: https://github.com/huge-success/sanic/commit/7d79a86d4dc48de11cd34e8ba12e41f3a9f9ff18
**Additional context**
It seems that `getLogger()` does not get the correct logger at [line 56](https://github.com/huge-success/sanic/blob/master/sanic/log.py#L56) in `log.py`. The logger is trying to get a logger named `sanic.root`, but it does not exist. Renaming the logger `root` at [line 9](https://github.com/huge-success/sanic/blob/master/sanic/log.py#L9) should fix this bug.
| [
{
"content": "import logging\nimport sys\n\n\nLOGGING_CONFIG_DEFAULTS = dict(\n version=1,\n disable_existing_loggers=False,\n loggers={\n \"root\": {\"level\": \"INFO\", \"handlers\": [\"console\"]},\n \"sanic.error\": {\n \"level\": \"INFO\",\n \"handlers\": [\"err... | [
{
"content": "import logging\nimport sys\n\n\nLOGGING_CONFIG_DEFAULTS = dict(\n version=1,\n disable_existing_loggers=False,\n loggers={\n \"sanic.root\": {\"level\": \"INFO\", \"handlers\": [\"console\"]},\n \"sanic.error\": {\n \"level\": \"INFO\",\n \"handlers\": ... | diff --git a/sanic/log.py b/sanic/log.py
index cb8ca52475..08fc835d14 100644
--- a/sanic/log.py
+++ b/sanic/log.py
@@ -6,7 +6,7 @@
version=1,
disable_existing_loggers=False,
loggers={
- "root": {"level": "INFO", "handlers": ["console"]},
+ "sanic.root": {"level": "INFO", "handlers": ["console"]},
"sanic.error": {
"level": "INFO",
"handlers": ["error_console"],
diff --git a/tests/test_logging.py b/tests/test_logging.py
index 3af3f122db..95c55de0ca 100644
--- a/tests/test_logging.py
+++ b/tests/test_logging.py
@@ -49,7 +49,7 @@ def test_logging_defaults():
reset_logging()
app = Sanic("test_logging")
- for fmt in [h.formatter for h in logging.getLogger('root').handlers]:
+ for fmt in [h.formatter for h in logging.getLogger('sanic.root').handlers]:
assert fmt._fmt == LOGGING_CONFIG_DEFAULTS['formatters']['generic']['format']
for fmt in [h.formatter for h in logging.getLogger('sanic.error').handlers]:
@@ -68,7 +68,7 @@ def test_logging_pass_customer_logconfig():
app = Sanic("test_logging", log_config=modified_config)
- for fmt in [h.formatter for h in logging.getLogger('root').handlers]:
+ for fmt in [h.formatter for h in logging.getLogger('sanic.root').handlers]:
assert fmt._fmt == modified_config['formatters']['generic']['format']
for fmt in [h.formatter for h in logging.getLogger('sanic.error').handlers]:
@@ -82,7 +82,7 @@ def test_logging_pass_customer_logconfig():
def test_log_connection_lost(app, debug, monkeypatch):
""" Should not log Connection lost exception on non debug """
stream = StringIO()
- root = logging.getLogger('root')
+ root = logging.getLogger('sanic.root')
root.addHandler(logging.StreamHandler(stream))
monkeypatch.setattr(sanic.server, 'logger', root)
@@ -102,3 +102,15 @@ async def conn_lost(request):
assert 'Connection lost before response written @' in log
else:
assert 'Connection lost before response written @' not in log
+
+
+def test_logging_modified_root_logger_config():
+ reset_logging()
+
+ modified_config = LOGGING_CONFIG_DEFAULTS
+ modified_config['loggers']['sanic.root']['level'] = 'DEBUG'
+
+ app = Sanic("test_logging", log_config=modified_config)
+
+ assert logging.getLogger('sanic.root').getEffectiveLevel() == logging.DEBUG
+
|
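The fix renames the configured logger so its name matches what `logging.getLogger("sanic.root")` looks up. A stripped-down stdlib demonstration of that exact-name matching — this is not sanic's full `LOGGING_CONFIG_DEFAULTS`, just the relevant shape:

```python
import logging
import logging.config

# dictConfig matches entries under "loggers" by exact dotted name.  An
# entry named "root" configures a logger literally called "root"; it does
# NOT touch "sanic.root", the name sanic's code actually asks for.
broken = {
    "version": 1,
    "disable_existing_loggers": False,
    "loggers": {"root": {"level": "INFO", "handlers": []}},
}
logging.config.dictConfig(broken)
assert logging.getLogger("sanic.root").level == logging.NOTSET  # unconfigured

fixed = {
    "version": 1,
    "disable_existing_loggers": False,
    "loggers": {"sanic.root": {"level": "INFO", "handlers": []}},
}
logging.config.dictConfig(fixed)
assert logging.getLogger("sanic.root").level == logging.INFO  # picks up config
```

With the misnamed entry, `sanic.root` has no explicit level or handlers of its own, which is why the `logger.info(...)` calls in the report produce no output.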
dotkom__onlineweb4-810 | Active feedbacks bug
Minor bug where feedbacks where everyone answers does not get set to inactive.
| [
{
"content": "# -*- coding: utf-8 -*-\nimport datetime\nimport socket\nimport locale\nimport logging\n\nfrom django.utils import timezone\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.conf import settings\nfrom django.core.mail import EmailMessage\n\nfrom apps.events.models import Eve... | [
{
"content": "# -*- coding: utf-8 -*-\nimport datetime\nimport socket\nimport locale\nimport logging\n\nfrom django.utils import timezone\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.conf import settings\nfrom django.core.mail import EmailMessage\n\nfrom apps.events.models import Eve... | diff --git a/apps/feedback/mommy.py b/apps/feedback/mommy.py
index 7295634f5..efd8ef73f 100644
--- a/apps/feedback/mommy.py
+++ b/apps/feedback/mommy.py
@@ -45,7 +45,10 @@ def generate_message(feedback, logger):
#return if everyone has answered
if not not_responded:
+ feedback.active = False
+ feedback.save()
logger.info('Everyone has answered')
+ logger.info('Feedback set to innactive')
return message
|
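The patch adds the missing state change to the early-return branch. A condensed, hypothetical sketch of the fixed control flow (plain dicts stand in for the Django model; names are invented for illustration):

```python
# When nobody is left to remind, deactivate the feedback before returning
# instead of returning while it is still marked active.
def close_if_complete(feedback, not_responded):
    if not not_responded:           # everyone has answered
        feedback["active"] = False  # the step the patch adds (with .save())
        return None                 # nothing to send
    return "remind %d user(s)" % len(not_responded)

fb = {"active": True}
assert close_if_complete(fb, []) is None
assert fb["active"] is False        # no longer left dangling as active

fb2 = {"active": True}
assert close_if_complete(fb2, ["alice"]) == "remind 1 user(s)"
assert fb2["active"] is True
```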
sktime__sktime-3653 | [DOC] sktime docs should link clearly to example notebooks
It seems that the sktime doc user journey does not lead clearly to the example notebooks when starting on the doc page.
This should be investigated and reworked.
Related issue: https://github.com/alan-turing-institute/sktime/issues/2127
| [
{
"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\n\"\"\"Configuration file for the Sphinx documentation builder.\"\"\"\n\nimport os\nimport sys\nfrom importlib import import_module\n\nimport sktime\n\n# -- Path setup --------------------------------------------------------------\n\n# If extension... | [
{
"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\n\"\"\"Configuration file for the Sphinx documentation builder.\"\"\"\n\nimport os\nimport sys\nfrom importlib import import_module\n\nimport sktime\n\n# -- Path setup --------------------------------------------------------------\n\n# If extension... | diff --git a/docs/source/conf.py b/docs/source/conf.py
index 411941bbbae..976e3c9e332 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -55,6 +55,11 @@
# Recommended by sphinx_design when using the MyST Parser
myst_enable_extensions = ["colon_fence"]
+# Notebook thumbnails
+nbsphinx_thumbnails = {
+ "examples/02_classification": "examples/img/tsc.png",
+}
+
# Use bootstrap CSS from theme.
panels_add_bootstrap_css = False
diff --git a/docs/source/examples.rst b/docs/source/examples.rst
new file mode 100644
index 00000000000..590de105a6b
--- /dev/null
+++ b/docs/source/examples.rst
@@ -0,0 +1,72 @@
+.. _examples:
+
+==========
+Examples
+==========
+
+Forecasting
+=============
+
+.. nbgallery::
+ :glob:
+
+ examples/01_forecasting.ipynb
+ examples/01a_forecasting_sklearn.ipynb
+ examples/01b_forecasting_proba.ipynb
+ examples/forecasting/*
+
+Classification
+=============
+
+.. nbgallery::
+ :glob:
+
+ examples/02_classification.ipynb
+ examples/classification/*
+
+Regression
+=============
+
+To come!
+
+Clustering
+=============
+
+.. nbgallery::
+ :glob:
+
+ examples/clustering/*
+
+Annotation
+=============
+
+.. nbgallery::
+ :glob:
+
+ examples/annotation/*
+
+Transformation
+=============
+
+.. nbgallery::
+ :glob:
+
+ examples/transformation/*
+
+Data
+=============
+
+.. nbgallery::
+ :glob:
+
+ examples/AA_datatypes_and_datasets.ipynb
+ examples/data/*
+
+Other
+=============
+
+.. nbgallery::
+ :glob:
+
+ examples/04_benchmarking.ipynb
+ examples/other/*
diff --git a/docs/source/examples/annotation/segmentation_with_clasp.ipynb b/docs/source/examples/annotation/segmentation_with_clasp.ipynb
new file mode 120000
index 00000000000..3e51aa3e367
--- /dev/null
+++ b/docs/source/examples/annotation/segmentation_with_clasp.ipynb
@@ -0,0 +1 @@
+../../../../examples/segmentation_with_clasp.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/classification/02a_classification_multivariate_cnn.ipynb b/docs/source/examples/classification/02a_classification_multivariate_cnn.ipynb
new file mode 120000
index 00000000000..7bbb59cfa55
--- /dev/null
+++ b/docs/source/examples/classification/02a_classification_multivariate_cnn.ipynb
@@ -0,0 +1 @@
+../../../../examples/02a_classification_multivariate_cnn.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/classification/channel_selection.ipynb b/docs/source/examples/classification/channel_selection.ipynb
new file mode 120000
index 00000000000..23e3736743f
--- /dev/null
+++ b/docs/source/examples/classification/channel_selection.ipynb
@@ -0,0 +1 @@
+../../../../examples/channel_selection.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/classification/dictionary_based_classification.ipynb b/docs/source/examples/classification/dictionary_based_classification.ipynb
new file mode 120000
index 00000000000..66ff49c6cb6
--- /dev/null
+++ b/docs/source/examples/classification/dictionary_based_classification.ipynb
@@ -0,0 +1 @@
+../../../../examples/dictionary_based_classification.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/classification/early_classification.ipynb b/docs/source/examples/classification/early_classification.ipynb
new file mode 120000
index 00000000000..4cdfa912314
--- /dev/null
+++ b/docs/source/examples/classification/early_classification.ipynb
@@ -0,0 +1 @@
+../../../../examples/early_classification.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/classification/interval_based_classification.ipynb b/docs/source/examples/classification/interval_based_classification.ipynb
new file mode 120000
index 00000000000..745612878dd
--- /dev/null
+++ b/docs/source/examples/classification/interval_based_classification.ipynb
@@ -0,0 +1 @@
+../../../../examples/interval_based_classification.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/clustering/partition_based_clustering.ipynb b/docs/source/examples/clustering/partition_based_clustering.ipynb
new file mode 120000
index 00000000000..cafd15ab95e
--- /dev/null
+++ b/docs/source/examples/clustering/partition_based_clustering.ipynb
@@ -0,0 +1 @@
+../../../../examples/partition_based_clustering.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/data/loading_data.ipynb b/docs/source/examples/data/loading_data.ipynb
new file mode 120000
index 00000000000..f91d9400996
--- /dev/null
+++ b/docs/source/examples/data/loading_data.ipynb
@@ -0,0 +1 @@
+../../../../examples/loading_data.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/forecasting/01c_forecasting_hierarchical_global.ipynb b/docs/source/examples/forecasting/01c_forecasting_hierarchical_global.ipynb
new file mode 120000
index 00000000000..392fa386adb
--- /dev/null
+++ b/docs/source/examples/forecasting/01c_forecasting_hierarchical_global.ipynb
@@ -0,0 +1 @@
+../../../../examples/01c_forecasting_hierarchical_global.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/forecasting/window_splitters.ipynb b/docs/source/examples/forecasting/window_splitters.ipynb
new file mode 120000
index 00000000000..58c6aeb9659
--- /dev/null
+++ b/docs/source/examples/forecasting/window_splitters.ipynb
@@ -0,0 +1 @@
+../../../../examples/window_splitters.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/other/distances.ipynb b/docs/source/examples/other/distances.ipynb
new file mode 120000
index 00000000000..1a7021840a7
--- /dev/null
+++ b/docs/source/examples/other/distances.ipynb
@@ -0,0 +1 @@
+../../../../examples/distances.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/transformation/catch22.ipynb b/docs/source/examples/transformation/catch22.ipynb
new file mode 120000
index 00000000000..aa0770a2ccb
--- /dev/null
+++ b/docs/source/examples/transformation/catch22.ipynb
@@ -0,0 +1 @@
+../../../../examples/catch22.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/transformation/feature_extraction_with_tsfresh.ipynb b/docs/source/examples/transformation/feature_extraction_with_tsfresh.ipynb
new file mode 120000
index 00000000000..1703cc6e2d0
--- /dev/null
+++ b/docs/source/examples/transformation/feature_extraction_with_tsfresh.ipynb
@@ -0,0 +1 @@
+../../../../examples/feature_extraction_with_tsfresh.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/transformation/hidalgo_segmentation.ipynb b/docs/source/examples/transformation/hidalgo_segmentation.ipynb
new file mode 120000
index 00000000000..2d0eccb8dc7
--- /dev/null
+++ b/docs/source/examples/transformation/hidalgo_segmentation.ipynb
@@ -0,0 +1 @@
+../../../../examples/hidalgo_segmentation.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/transformation/interpolation.ipynb b/docs/source/examples/transformation/interpolation.ipynb
new file mode 120000
index 00000000000..9c906f2f888
--- /dev/null
+++ b/docs/source/examples/transformation/interpolation.ipynb
@@ -0,0 +1 @@
+../../../../examples/interpolation.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/transformation/minirocket.ipynb b/docs/source/examples/transformation/minirocket.ipynb
new file mode 120000
index 00000000000..84adf880da2
--- /dev/null
+++ b/docs/source/examples/transformation/minirocket.ipynb
@@ -0,0 +1 @@
+../../../../examples/minirocket.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/transformation/plateau_finder.ipynb b/docs/source/examples/transformation/plateau_finder.ipynb
new file mode 120000
index 00000000000..10226731968
--- /dev/null
+++ b/docs/source/examples/transformation/plateau_finder.ipynb
@@ -0,0 +1 @@
+../../../../examples/plateau_finder.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/transformation/rocket.ipynb b/docs/source/examples/transformation/rocket.ipynb
new file mode 120000
index 00000000000..d1189babf03
--- /dev/null
+++ b/docs/source/examples/transformation/rocket.ipynb
@@ -0,0 +1 @@
+../../../../examples/rocket.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/transformation/signature_method.ipynb b/docs/source/examples/transformation/signature_method.ipynb
new file mode 120000
index 00000000000..fc23d016352
--- /dev/null
+++ b/docs/source/examples/transformation/signature_method.ipynb
@@ -0,0 +1 @@
+../../../../examples/signature_method.ipynb
\ No newline at end of file
diff --git a/docs/source/examples/transformation/theta_transform.ipynb b/docs/source/examples/transformation/theta_transform.ipynb
new file mode 120000
index 00000000000..b845a69a588
--- /dev/null
+++ b/docs/source/examples/transformation/theta_transform.ipynb
@@ -0,0 +1 @@
+../../../../examples/theta_transform.ipynb
\ No newline at end of file
diff --git a/docs/source/index.rst b/docs/source/index.rst
index a3df10c16e6..2280163c645 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -59,6 +59,7 @@ Contents
get_involved
developers
about
+ examples
.. grid:: 1 2 2 2
:gutter: 3
diff --git a/examples/02_classification.ipynb b/examples/02_classification.ipynb
index 750f645c535..ff815c36fc3 100644
--- a/examples/02_classification.ipynb
+++ b/examples/02_classification.ipynb
@@ -2,54 +2,47 @@
"cells": [
{
"cell_type": "markdown",
- "metadata": {
- "collapsed": true,
- "pycharm": {
- "name": "#%% md\n"
- }
- },
"source": [
"# Time Series Classification with sktime\n",
"\n",
- "The Time Series Classification (TSC) task involves training a model from a collection of time series (real valued, ordered, data) in order to predict a target variable. For example, we might want to build a model that can predict whether a patient is sick based on the ECG reading, or predict whether a device will fail based on some sensor reading. This notebook gives a quick guide to get you started."
- ]
+ "The Time Series Classification (TSC) task involves training a model from a collection of time series (real valued, ordered, data) in order to predict a target variable. For example, we might want to build a model that can predict whether a patient is sick based on the ECG reading, or predict whether a device will fail based on some sensor reading. This notebook gives a quick guide to get you started.\n",
+ "\n",
+ "<img src=\"./img/tsc.png\" width=\"600\" alt=\"time series classification\"> [<i>​</i>](./img/tsc.png)"
+ ],
+ "metadata": {
+ "collapsed": false
+ }
},
{
"cell_type": "markdown",
- "metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%% md\n"
- }
- },
"source": [
"## Datasets and Problem Types\n",
"\n",
"The UCR/UEA [TSC dataset archive](https://timeseriesclassification.com/) contains a large number of example TSC problems that have been used thousands of times in the literature to assess TSC algorithms. These datasets have certain characteristics that influence what data structure we use to store them in memory.\n",
"\n",
- "Most datasets in the archive contain time series all the same length. For example, the [ArrowHead dataset](https://timeseriesclassification.com/description.php?Dataset=ArrowHead) dataset consists of outlines of the images of arrow heads. The classification of projectile points is an important topic in anthropology.\n",
+ "Most datasets in the archive contain time series all the same length. For example, the [ArrowHead dataset](https://timeseriesclassification.com/description.php?Dataset=ArrowHead) consists of outlines of the images of arrow heads. The classification of projectile points is an important topic in anthropology.\n",
"\n",
- "<img src=\"./img/arrow-heads.png\" width=\"400\" alt=\"arrow heads\">\n",
+ "<img src=\"./img/arrow-heads.png\" width=\"600\" alt=\"arrow heads\">\n",
"\n",
"The shapes of the projectile points are converted into a sequence using the angle-based method as described in this [blog post](https://izbicki.me/blog/converting-images-into-time-series-for-data-mining.html) about converting images into time series for data mining.\n",
"\n",
- "<img src=\"./img/from-shapes-to-time-series.png\" width=\"400\" alt=\"from shapes to time series\">\n",
+ "<img src=\"./img/from-shapes-to-time-series.png\" width=\"600\" alt=\"from shapes to time series\">\n",
"\n",
"Each instance consists of a single time series (i.e. the problem is univariate) of equal length and a class label based on shape distinctions such as the presence and location of a notch in the arrow. The data set consists of 210 instances, by default split into 36 train and 175 test instances. We refer to the collection of time series as $X$ and to the collection of class labels as $y$.\n",
"\n",
"Below, we store the data in a 3D dimensional (instance, variable, time point) numpy array for $X$, and a one dimensional (instance) numpy array for $y$. In TSC the variable portion is commonly referred to as the dimension of the time series instance.\n",
"\n",
"For the single problem loader load arrow head, set the return type to `numpy3D` to store $X$ in such a 3D ndarray. The data can also be returned in other formats, e.g., `pd-multiindex` (row-index hierarchical pandas), or `numpyflat` (2D numpy with rows=instances, columns=time points; alias is `numpy2d`). The full range of options are the `Panel` data format strings desribed in tutorial AA - datatypes and data loaders (see there)."
- ]
+ ],
+ "metadata": {
+ "collapsed": false
+ }
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
},
"outputs": [],
"source": [
@@ -68,10 +61,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
},
"outputs": [],
"source": [
@@ -147,10 +137,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
},
"outputs": [],
"source": [
@@ -175,10 +162,7 @@
{
"cell_type": "markdown",
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%% md\n"
- }
+ "collapsed": false
},
"source": [
"Some data sets have unequal length series. Two data sets with this characteristic are shipped with sktime: PLAID (univariate) and JapaneseVowels (multivariate). We cannot store unequal length series in numpy arrays. Instead, we use nested pandas data frames, where each cell is a pandas Series. This is the default return type for all single problem loaders."
@@ -188,10 +172,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
},
"outputs": [],
"source": [
@@ -211,10 +192,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
},
"outputs": [],
"source": [
@@ -259,10 +237,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
},
"outputs": [],
"source": [
@@ -292,10 +267,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
},
"outputs": [],
"source": [
@@ -320,10 +292,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
},
"outputs": [],
"source": [
@@ -354,10 +323,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
},
"outputs": [],
"source": [
@@ -464,10 +430,7 @@
"Parameter tuning using `sklearn` `GridSearchCV`, we tune the _k_ and distance measure for a K-NN classifier:"
],
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%% md\n"
- }
+ "collapsed": false
}
},
{
@@ -495,10 +458,7 @@
"Probability calibration with the `sklearn` `CalibratedClassifierCV`:"
],
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%% md\n"
- }
+ "collapsed": false
}
},
{
@@ -519,10 +479,7 @@
"accuracy_score(arrow_test_y, y_pred)"
],
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
}
},
{
@@ -533,10 +490,7 @@
{
"cell_type": "markdown",
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%% md\n"
- }
+ "collapsed": false
},
"source": [
"## Multivariate Classification\n",
@@ -547,10 +501,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
},
"outputs": [],
"source": [
@@ -575,19 +526,13 @@
"accuracy_score(motions_test_y, y_pred)"
],
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
}
},
{
"cell_type": "markdown",
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%% md\n"
- }
+ "collapsed": false
},
"source": [
"`sktime` offers two other ways of building estimators for multivariate time series problems:\n",
@@ -602,10 +547,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
},
"outputs": [],
"source": [
@@ -630,10 +572,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "collapsed": false,
- "pycharm": {
- "name": "#%%\n"
- }
+ "collapsed": false
},
"outputs": [],
"source": [
diff --git a/examples/img/tsc.png b/examples/img/tsc.png
new file mode 100644
index 00000000000..48c5df29586
Binary files /dev/null and b/examples/img/tsc.png differ
|
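Most of this patch is plumbing: per-topic `nbgallery` pages plus relative symlinks so Sphinx can find the notebooks under `docs/source/examples/`. A quick stdlib sanity check of how one of those `../../../../` link targets resolves — the directory layout below is fabricated for the demo, mirroring but not reading the real repository:

```python
import os
import tempfile

# A symlink target is interpreted relative to the directory containing the
# link, so four "../" segments climb from docs/source/examples/<topic>/
# back to the repository root before descending into examples/.
with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "docs", "source", "examples", "forecasting"))
    os.makedirs(os.path.join(root, "examples"))
    target = os.path.join(root, "examples", "window_splitters.ipynb")
    open(target, "w").close()

    link = os.path.join(root, "docs", "source", "examples", "forecasting",
                        "window_splitters.ipynb")
    os.symlink("../../../../examples/window_splitters.ipynb", link)

    resolved_ok = os.path.realpath(link) == os.path.realpath(target)

assert resolved_ok
```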
huggingface__peft-646 | importing peft with an old version of bitsandbytes causes an exception
### System Info
Importing peft with the bitsandbytes version "0.39.1" works. But when importing peft with the version "0.38.1", I get an exception : `AttributeError: module 'bitsandbytes.nn' has no attribute 'Linear4bit'`.
Indeed, the class `SVDLinear4bit` should be defined only if `is_bnb_4bit_available()`, not just if `is_bnb_available()`.
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder
- [ ] My own task or dataset (give details below)
### Reproduction
in a notebook :
!pip install 'bitsandbytes==0.38.1'
import peft
### Expected behavior
no exception
| [
{
"content": "import re\nimport warnings\nfrom dataclasses import dataclass, field\nfrom typing import Optional\n\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nfrom transformers.pytorch_utils import Conv1D\n\nfrom ..import_utils import is_bnb_4bit_available, is_bnb_available\nfrom ..uti... | [
{
"content": "import re\nimport warnings\nfrom dataclasses import dataclass, field\nfrom typing import Optional\n\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nfrom transformers.pytorch_utils import Conv1D\n\nfrom ..import_utils import is_bnb_4bit_available, is_bnb_available\nfrom ..uti... | diff --git a/src/peft/tuners/adalora.py b/src/peft/tuners/adalora.py
index d1ff7f2e4f..aff877adac 100644
--- a/src/peft/tuners/adalora.py
+++ b/src/peft/tuners/adalora.py
@@ -523,6 +523,9 @@ def forward(self, x: torch.Tensor):
result = result + output
return result
+
+if is_bnb_4bit_available():
+
class SVDLinear4bit(bnb.nn.Linear4bit, AdaLoraLayer):
# Low-rank matrix for SVD-based adaptation
def __init__(
|
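The fix moves the `SVDLinear4bit` definition under `if is_bnb_4bit_available():`. A minimal sketch of that guard pattern — define a subclass of an optional attribute only when the attribute exists, so a plain import never raises `AttributeError` on an old bitsandbytes. The namespaces below are stand-ins, not the real bitsandbytes modules:

```python
import types

def make_classes(nn_module):
    """Define optional wrappers only for attributes that exist."""
    created = {}
    if hasattr(nn_module, "Linear8bitLt"):        # like is_bnb_available()
        created["SVDLinear8bitLt"] = type(
            "SVDLinear8bitLt", (nn_module.Linear8bitLt,), {})
    if hasattr(nn_module, "Linear4bit"):          # like is_bnb_4bit_available()
        created["SVDLinear4bit"] = type(
            "SVDLinear4bit", (nn_module.Linear4bit,), {})
    return created

# Simulated bitsandbytes 0.38.x (no Linear4bit) and 0.39.x (has it).
old_bnb = types.SimpleNamespace(Linear8bitLt=type("Linear8bitLt", (), {}))
new_bnb = types.SimpleNamespace(Linear8bitLt=type("Linear8bitLt", (), {}),
                                Linear4bit=type("Linear4bit", (), {}))

assert set(make_classes(old_bnb)) == {"SVDLinear8bitLt"}   # old: no crash
assert set(make_classes(new_bnb)) == {"SVDLinear8bitLt", "SVDLinear4bit"}
```

Without the guard, evaluating `class SVDLinear4bit(bnb.nn.Linear4bit, ...)` at import time dereferences the missing attribute immediately, which is exactly the reported `AttributeError`.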
pyload__pyload-1733 | HOOK LinkdecrypterCom: 'LinkdecrypterComHook' object has no attribute 'COOKIES'
03.08.2015 20:46:43 INFO Free space: 6.48 TiB
630 03.08.2015 20:46:43 INFO Activating Accounts...
631 03.08.2015 20:46:43 INFO Activating Plugins...
632 03.08.2015 20:46:43 WARNING HOOK AntiStandby: Unable to change system power state | [Errno 2] No such file or directory
633 03.08.2015 20:46:43 WARNING HOOK AntiStandby: Unable to change display power state | [Errno 2] No such file or directory
634 03.08.2015 20:46:43 INFO HOOK XFileSharingPro: Handling any hoster I can!
635 03.08.2015 20:46:43 WARNING HOOK UpdateManager: Unable to retrieve server to get updates
636 03.08.2015 20:46:43 INFO HOOK XFileSharingPro: Handling any crypter I can!
637 03.08.2015 20:46:43 INFO pyLoad is up and running
638 03.08.2015 20:46:45 INFO HOOK LinkdecrypterCom: Reloading supported crypter list
639 03.08.2015 20:46:45 WARNING HOOK LinkdecrypterCom: 'LinkdecrypterComHook' object has no attribute 'COOKIES' | Waiting 1 minute and retry
640 03.08.2015 20:46:53 INFO HOOK ClickAndLoad: Proxy listening on 127.0.0.1:9666
641 03.08.2015 20:46:53 INFO HOOK LinkdecrypterCom: Reloading supported crypter list
642 03.08.2015 20:46:53 WARNING HOOK LinkdecrypterCom: 'LinkdecrypterComHook' object has no attribute 'COOKIES' | Waiting 1 minute and retry
643 03.08.2015 20:47:45 WARNING HOOK LinkdecrypterCom: 'LinkdecrypterComHook' object has no attribute 'COOKIES' | Waiting 1 minute and retry
644 03.08.2015 20:47:53 WARNING HOOK LinkdecrypterCom: 'LinkdecrypterComHook' object has no attribute 'COOKIES' | Waiting 1 minute and retry
| [
{
"content": "# -*- coding: utf-8 -*-\n\nimport re\n\nfrom module.plugins.internal.MultiHook import MultiHook\n\n\nclass LinkdecrypterComHook(MultiHook):\n __name__ = \"LinkdecrypterComHook\"\n __type__ = \"hook\"\n __version__ = \"1.07\"\n __status__ = \"testing\"\n\n __config__ = [(\"act... | [
{
"content": "# -*- coding: utf-8 -*-\n\nimport re\n\nfrom module.plugins.internal.MultiHook import MultiHook\n\n\nclass LinkdecrypterComHook(MultiHook):\n __name__ = \"LinkdecrypterComHook\"\n __type__ = \"hook\"\n __version__ = \"1.07\"\n __status__ = \"testing\"\n\n __config__ = [(\"act... | diff --git a/module/plugins/hooks/LinkdecrypterComHook.py b/module/plugins/hooks/LinkdecrypterComHook.py
index 6930afdb50..bf437fb6d8 100644
--- a/module/plugins/hooks/LinkdecrypterComHook.py
+++ b/module/plugins/hooks/LinkdecrypterComHook.py
@@ -21,6 +21,7 @@ class LinkdecrypterComHook(MultiHook):
__license__ = "GPLv3"
__authors__ = [("Walter Purcaro", "vuolter@gmail.com")]
+ COOKIES = False
def get_hosters(self):
list = re.search(r'>Supported\(\d+\)</b>: <i>(.[\w.\-, ]+)',
|
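The one-line fix supplies a class attribute that base-class code reads via `self.COOKIES`. A condensed sketch of the failure mode and the fix — the class and method names here are illustrative, not pyload's real `MultiHook` API:

```python
class MultiHookBase:
    def load_page(self):
        if self.COOKIES:                 # crashes if the subclass forgot it
            return "request with cookies"
        return "plain request"

class BrokenHook(MultiHookBase):
    pass                                 # no COOKIES -> AttributeError

class FixedHook(MultiHookBase):
    COOKIES = False                      # the one-line fix

try:
    BrokenHook().load_page()
    crashed = False
except AttributeError:
    crashed = True
assert crashed                           # "object has no attribute 'COOKIES'"
assert FixedHook().load_page() == "plain request"
```

Since attribute lookup falls back from instance to class, defining `COOKIES = False` on the subclass (or a default on the base) is enough to stop the retry loop seen in the log.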
networkx__networkx-6600 | Error in method description in ismags.py
The docstring of the `partition_to_color` method in ismags.py seems off to me. The description is not clear, and it's hard to understand what the method is supposed to do.
```python
def partition_to_color(partitions):
"""
Creates a dictionary with for every item in partition for every partition
in partitions the index of partition in partitions.
Parameters
----------
partitions: collections.abc.Sequence[collections.abc.Iterable]
As returned by :func:`make_partitions`.
Returns
-------
dict
"""
colors = {}
for color, keys in enumerate(partitions):
for key in keys:
colors[key] = color
return colors
```
I think the following description explains the method better.
```python
def partition_to_color(partitions):
"""
Creates a dictionary that maps each item in each partition to the index of
the partition it belongs to
"""
```
If the new description looks alright, I'll go ahead and make the changes.
| [
{
"content": "\"\"\"\n****************\nISMAGS Algorithm\n****************\n\nProvides a Python implementation of the ISMAGS algorithm. [1]_\n\nIt is capable of finding (subgraph) isomorphisms between two graphs, taking the\nsymmetry of the subgraph into account. In most cases the VF2 algorithm is\nfaster (at l... | [
{
"content": "\"\"\"\n****************\nISMAGS Algorithm\n****************\n\nProvides a Python implementation of the ISMAGS algorithm. [1]_\n\nIt is capable of finding (subgraph) isomorphisms between two graphs, taking the\nsymmetry of the subgraph into account. In most cases the VF2 algorithm is\nfaster (at l... | diff --git a/networkx/algorithms/isomorphism/ismags.py b/networkx/algorithms/isomorphism/ismags.py
index 4145be1150c..76fdee05c5a 100644
--- a/networkx/algorithms/isomorphism/ismags.py
+++ b/networkx/algorithms/isomorphism/ismags.py
@@ -184,8 +184,8 @@ def make_partitions(items, test):
def partition_to_color(partitions):
"""
- Creates a dictionary with for every item in partition for every partition
- in partitions the index of partition in partitions.
+ Creates a dictionary that maps each item in each partition to the index of
+ the partition to which it belongs.
Parameters
----------
|
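The clarified one-line description can be checked directly against the function quoted in the issue:

```python
def partition_to_color(partitions):
    """Map every item in every partition to the index of its partition."""
    colors = {}
    for color, keys in enumerate(partitions):
        for key in keys:
            colors[key] = color
    return colors

# Items "a" and "b" live in partition 0, "c" in partition 1.
assert partition_to_color([{"a", "b"}, {"c"}]) == {"a": 0, "b": 0, "c": 1}
```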
biolab__orange3-4619 | Table doesn't output the current tab selection when switching between tabs
**Describe the bug**
If the Table widget has multiple tabs and if on each tab there are some instances selected it doesn't change the output when a user switches between tabs. The widget doesn't output the current tab selection but it outputs the last selection made on some of the previous tabs and the user has to re-select instances.
**Orange version:**
3.25.dev
**Expected behavior**
The output should change every time the tab is changed.

| [
{
"content": "import sys\nimport threading\nimport io\nimport csv\nimport itertools\nimport concurrent.futures\n\nfrom collections import OrderedDict, namedtuple\nfrom typing import List, Tuple, Iterable\n\nfrom math import isnan\n\nimport numpy\nfrom scipy.sparse import issparse\n\nfrom AnyQt.QtWidgets import ... | [
{
"content": "import sys\nimport threading\nimport io\nimport csv\nimport itertools\nimport concurrent.futures\n\nfrom collections import OrderedDict, namedtuple\nfrom typing import List, Tuple, Iterable\n\nfrom math import isnan\n\nimport numpy\nfrom scipy.sparse import issparse\n\nfrom AnyQt.QtWidgets import ... | diff --git a/Orange/widgets/data/owtable.py b/Orange/widgets/data/owtable.py
index 7288937d537..abd2e0f0fb1 100644
--- a/Orange/widgets/data/owtable.py
+++ b/Orange/widgets/data/owtable.py
@@ -718,6 +718,7 @@ def _on_current_tab_changed(self, index):
view = self.tabs.widget(index)
if view is not None and view.model() is not None:
self.set_info(view.input_slot.summary)
+ self.update_selection()
else:
self.set_info(None)
|
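The one-line fix calls `update_selection()` whenever the current tab changes. A toolkit-free sketch of the underlying logic (all names here are hypothetical stand-ins, not Orange's API):

```python
class Tabs:
    """Each tab remembers its own selection; switching tabs must re-emit
    the active tab's selection instead of keeping the previous output."""
    def __init__(self):
        self.selections = {}  # tab index -> selected rows
        self.output = None

    def select(self, tab, rows):
        self.selections[tab] = rows
        self.output = rows

    def switch_to(self, tab):
        # Before the fix only the summary info was refreshed here, so
        # self.output kept the last tab's selection. The fix re-sends the
        # selection belonging to the newly active tab.
        self.output = self.selections.get(tab, [])

t = Tabs()
t.select(0, [1, 2])
t.select(1, [5])
t.switch_to(0)
assert t.output == [1, 2]  # current tab's selection, not the last one made
```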
encode__django-rest-framework-7158 | RemoteUserAuthentication.authenticate calls django.contrib.auth.authenticate without request argument
## Checklist
- [X] I have verified that that issue exists against the `master` branch of Django REST framework.
- [X] I have searched for similar issues in both open and closed tickets and cannot find a duplicate.
- [X] This is not a usage question. (Those should be directed to the [discussion group](https://groups.google.com/forum/#!forum/django-rest-framework) instead.)
- [X] This cannot be dealt with as a third party library. (We prefer new functionality to be [in the form of third party libraries](https://www.django-rest-framework.org/community/third-party-packages/#about-third-party-packages) where possible.)
- [X] I have reduced the issue to the simplest possible case.
- [x] I have included a failing test as a pull request. (If you are unable to do so we can still accept the issue.)
## Expected behavior
`user = authenticate(request=request, remote_user=request.META.get(self.header))`
## Actual behavior
`user = authenticate(remote_user=request.META.get(self.header))`
| [
{
"content": "\"\"\"\nProvides various authentication policies.\n\"\"\"\nimport base64\nimport binascii\n\nfrom django.contrib.auth import authenticate, get_user_model\nfrom django.middleware.csrf import CsrfViewMiddleware\nfrom django.utils.translation import gettext_lazy as _\n\nfrom rest_framework import HTT... | [
{
"content": "\"\"\"\nProvides various authentication policies.\n\"\"\"\nimport base64\nimport binascii\n\nfrom django.contrib.auth import authenticate, get_user_model\nfrom django.middleware.csrf import CsrfViewMiddleware\nfrom django.utils.translation import gettext_lazy as _\n\nfrom rest_framework import HTT... | diff --git a/rest_framework/authentication.py b/rest_framework/authentication.py
index 1e30728d34..1dfc23d7f9 100644
--- a/rest_framework/authentication.py
+++ b/rest_framework/authentication.py
@@ -220,6 +220,6 @@ class RemoteUserAuthentication(BaseAuthentication):
header = "REMOTE_USER"
def authenticate(self, request):
- user = authenticate(remote_user=request.META.get(self.header))
+ user = authenticate(request=request, remote_user=request.META.get(self.header))
if user and user.is_active:
return (user, None)
|
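The fix forwards `request` into Django's `authenticate()`, which passes it on to every backend. A stripped-down sketch of that dispatch (minimal stand-ins, not Django's real implementation):

```python
class RemoteUserBackend:
    def authenticate(self, request, remote_user=None):
        # Backends that inspect the request silently receive None when the
        # caller omits request=, which is the behaviour the issue reports.
        if remote_user:
            return {"username": remote_user, "request_seen": request is not None}

BACKENDS = [RemoteUserBackend()]

def authenticate(request=None, **credentials):
    """Mimics django.contrib.auth.authenticate's loop: each backend
    receives the request as its first argument."""
    for backend in BACKENDS:
        user = backend.authenticate(request, **credentials)
        if user is not None:
            return user

assert authenticate(request=object(), remote_user="alice")["request_seen"] is True
assert authenticate(remote_user="alice")["request_seen"] is False
```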
benoitc__gunicorn-1708 | gunicorn crashed on start with --reload flag
Setup: Vagrant, virtualenv, gunicorn 19.3.0:
The following command produces this stack:
`gunicorn -c /data/shared/api/gunicorn_config.py -b unix:/tmp/api-dev-gunicorn.sock --log-level INFO --reload wsgi:app`
```
Exception in thread Thread-1:
Traceback (most recent call last):
File "/home/vagrant/.pyenv/versions/2.7.6/lib/python2.7/threading.py", line 810, in __bootstrap_inner
self.run()
File "/data/virtualenv/default/lib/python2.7/site-packages/gunicorn/reloader.py", line 41, in run
for filename in self.get_files():
File "/data/virtualenv/default/lib/python2.7/site-packages/gunicorn/reloader.py", line 30, in get_files
if hasattr(module, '__file__')
File "/data/virtualenv/default/lib/python2.7/re.py", line 151, in sub
return _compile(pattern, flags).sub(repl, string, count)
TypeError: expected string or buffer
```
If I remove `--reload`, it boots up fine.
| [
{
"content": "# -*- coding: utf-8 -\n#\n# This file is part of gunicorn released under the MIT license.\n# See the NOTICE for more information.\n\nimport os\nimport os.path\nimport re\nimport sys\nimport time\nimport threading\n\n\nclass Reloader(threading.Thread):\n def __init__(self, extra_files=None, inte... | [
{
"content": "# -*- coding: utf-8 -\n#\n# This file is part of gunicorn released under the MIT license.\n# See the NOTICE for more information.\n\nimport os\nimport os.path\nimport re\nimport sys\nimport time\nimport threading\n\n\nclass Reloader(threading.Thread):\n def __init__(self, extra_files=None, inte... | diff --git a/gunicorn/reloader.py b/gunicorn/reloader.py
index b1ce743f9..4ab868e94 100644
--- a/gunicorn/reloader.py
+++ b/gunicorn/reloader.py
@@ -28,7 +28,7 @@ def get_files(self):
fnames = [
re.sub('py[co]$', 'py', module.__file__)
for module in list(sys.modules.values())
- if hasattr(module, '__file__')
+ if getattr(module, '__file__', None)
]
with self._extra_files_lock:
|
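The fix replaces `hasattr` with `getattr(..., None)` because a module can *have* a `__file__` attribute that is set to `None` (e.g. namespace packages), and `re.sub` then fails on the non-string. A small reproduction of both sides of the check:

```python
import re
import sys
import types

def get_module_files():
    # getattr with a None default skips both modules missing __file__ and
    # modules where __file__ is None; hasattr only catches the first case.
    return [
        re.sub(r"py[co]$", "py", module.__file__)
        for module in list(sys.modules.values())
        if getattr(module, "__file__", None)
    ]

# A synthetic module with __file__ = None reproduces the crash condition:
broken = types.ModuleType("broken")
broken.__file__ = None
assert hasattr(broken, "__file__")                 # old check lets it through
assert getattr(broken, "__file__", None) is None   # fixed check filters it out
assert all(isinstance(name, str) for name in get_module_files())
```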
ivy-llc__ivy-13420 | standard_gamma
| [
{
"content": "# local\nimport ivy\nfrom ivy.functional.frontends.numpy.func_wrapper import (\n to_ivy_arrays_and_back,\n from_zero_dim_arrays_to_scalar,\n)\n\n\n@to_ivy_arrays_and_back\n@from_zero_dim_arrays_to_scalar\ndef random_sample(size=None):\n return ivy.random_uniform(low=0.0, high=1.0, shape=s... | [
{
"content": "# local\nimport ivy\nfrom ivy.functional.frontends.numpy.func_wrapper import (\n to_ivy_arrays_and_back,\n from_zero_dim_arrays_to_scalar,\n)\n\n\n@to_ivy_arrays_and_back\n@from_zero_dim_arrays_to_scalar\ndef random_sample(size=None):\n return ivy.random_uniform(low=0.0, high=1.0, shape=s... | diff --git a/ivy/functional/frontends/numpy/random/functions.py b/ivy/functional/frontends/numpy/random/functions.py
index 4dc1c4a1e42f5..bbd25a45c5e3d 100644
--- a/ivy/functional/frontends/numpy/random/functions.py
+++ b/ivy/functional/frontends/numpy/random/functions.py
@@ -89,3 +89,9 @@ def shuffle(x, /):
@from_zero_dim_arrays_to_scalar
def standard_normal(size=None):
return ivy.random_normal(mean=0.0, std=1.0, shape=size, dtype="float64")
+
+
+@to_ivy_arrays_and_back
+@from_zero_dim_arrays_to_scalar
+def standard_gamma(alpha):
+ return ivy.gamma(alpha, beta=1.0, dtype="float64")
diff --git a/ivy_tests/test_ivy/test_frontends/test_numpy/test_random/test_functions.py b/ivy_tests/test_ivy/test_frontends/test_numpy/test_random/test_functions.py
index fb21c870aebfb..ab4e80c6d7210 100644
--- a/ivy_tests/test_ivy/test_frontends/test_numpy/test_random/test_functions.py
+++ b/ivy_tests/test_ivy/test_frontends/test_numpy/test_random/test_functions.py
@@ -349,3 +349,32 @@ def test_numpy_standard_normal(
test_values=False,
size=size,
)
+
+
+@handle_frontend_test(
+ fn_tree="numpy.random.standard_gamma",
+ dtype_and_x=helpers.dtype_and_values(
+ available_dtypes=helpers.get_dtypes("float"),
+ shape=st.tuples(st.integers(min_value=1, max_value=2)),
+ min_value=1,
+ max_value=100,
+ ),
+ test_with_out=st.just(False),
+)
+def test_numpy_standard_gamma(
+ dtype_and_x,
+ frontend,
+ test_flags,
+ fn_tree,
+ on_device,
+):
+ input_dtype, x = dtype_and_x
+ helpers.test_frontend_function(
+ input_dtypes=input_dtype,
+ test_flags=test_flags,
+ frontend=frontend,
+ fn_tree=fn_tree,
+ on_device=on_device,
+ alpha=x[0],
+ test_values=False,
+ )
|
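The new frontend function delegates to the general gamma distribution with the scale fixed to 1. A stdlib-only sketch of the same relationship (illustrative, not ivy's implementation):

```python
import random

def standard_gamma(alpha, size, seed=0):
    # standard_gamma(alpha) is gamma with shape k = alpha and scale theta = 1,
    # so its mean is simply alpha.
    rng = random.Random(seed)
    return [rng.gammavariate(alpha, 1.0) for _ in range(size)]

samples = standard_gamma(2.0, 20000)
mean = sum(samples) / len(samples)
assert abs(mean - 2.0) < 0.1  # mean of gamma(k, theta=1) is k
```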
fedora-infra__bodhi-1061 | Bodhi sends notifications to old address after e-mail change
I've changed my e-mail addresses in all locations I could think of:
- [fedmsg](https://apps.fedoraproject.org/notifications)
- [bugzilla](https://bugzilla.redhat.com/)
- [Fedora Admin](https://admin.fedoraproject.org/accounts/)
But I still get notifications from bodhi at my old address about updates I've commented on, see the message below for an example.
It looks like this message doesn't come from fedmsg; the mail doesn't have any X-Fedmsg header fields. If I click on "Manage Alerts" in bodhi, it shows my fedmsg settings (the new e-mail address).
I initially thought this is a caching issue. But I changed my address months ago and I still get notifications to my old address. In addition to that, I also get fedmsg style notifications to my new address, but only about my own comments.
I'm not sure if this is a bug or I forgot to change my address somewhere. If it's the latter, I would expect "Manage alerts" to point to the right location.
Example message:
> Return-Path: updates@fedoraproject.org
> Delivered-To: hofmann@kbsg.rwth-aachen.de
> Received: from mx-out-2.rwth-aachen.de (mx-out-2.rwth-aachen.de [134.130.5.187])
> (using TLSv1 with cipher DHE-RSA-CAMELLIA256-SHA (256/256 bits))
> (Client CN "mx-out-2.rwth-aachen.de", Issuer "RWTH Aachen CA" (verified OK))
> by lagrande.kbsg.rwth-aachen.de (Postfix) with ESMTPS id 7FA403FAD7
> for hofmann@kbsg.rwth-aachen.de; Fri, 2 Sep 2016 01:00:39 +0200 (CEST)
> X-IronPort-Anti-Spam-Filtered: true
> X-IronPort-Anti-Spam-Result: A0BnAQAJsshXhwK1hNFdHAEBBAEBgywBAQEBAXV8pHaRLIQRJIV4AoIkAQIBAQEBAQITAQEBCgsJCRkvhGICAQOBCSwPFg9IiGEOuwcBAQEBAQEEAQEBAQEBASCGLIIDhnABAQVkgXwLWIIvBZlQhiCJB4F3ToQPgw2GAIZwhViDeYMdEQqBTTw0hE2CHwEBAQ
> X-IPAS-Result: A0BnAQAJsshXhwK1hNFdHAEBBAEBgywBAQEBAXV8pHaRLIQRJIV4AoIkAQIBAQEBAQITAQEBCgsJCRkvhGICAQOBCSwPFg9IiGEOuwcBAQEBAQEEAQEBAQEBASCGLIIDhnABAQVkgXwLWIIvBZlQhiCJB4F3ToQPgw2GAIZwhViDeYMdEQqBTTw0hE2CHwEBAQ
> X-IronPort-AV: E=Sophos;i="5.30,268,1470693600";
> d="scan'208";a="456213363"
> Received: from bastion01.fedoraproject.org (HELO bastion.fedoraproject.org) ([209.132.181.2])
> by mx-2.rz.rwth-aachen.de with ESMTP; 02 Sep 2016 01:00:39 +0200
> Received: from bodhi03.phx2.fedoraproject.org (bodhi03.phx2.fedoraproject.org [10.5.126.115])
> by bastion01.phx2.fedoraproject.org (Postfix) with ESMTP id C7A8A6070D39
> for hofmann@kbsg.rwth-aachen.de; Thu, 1 Sep 2016 23:00:36 +0000 (UTC)
> From: updates@fedoraproject.org
> To: hofmann@kbsg.rwth-aachen.de
> X-Bodhi-Update-Builds: kernel-4.7.2-201.fc24
> In-Reply-To: bodhi-update-66003-labbott-F24@admin.fedoraproject.org
> X-Bodhi-Update-Pushed: True
> X-Bodhi-Update-Type: security
> X-Bodhi-Update-Release: F24
> References: bodhi-update-66003-labbott-F24@admin.fedoraproject.org
> X-Bodhi-Update-Status: testing
> X-Bodhi-Update-Request: stable
> X-Bodhi-Update-Submitter: labbott
> X-Bodhi-Update-Title: kernel-4.7.2-201.fc24
> X-Bodhi: fedoraproject.org
> Subject: [Fedora Update] [CRITPATH] [comment] kernel-4.7.2-201.fc24
> Message-Id: 20160901230036.C7A8A6070D39@bastion01.phx2.fedoraproject.org
> Date: Thu, 1 Sep 2016 23:00:36 +0000 (UTC)
> [message body skipped]
| [
{
"content": "# This program is free software; you can redistribute it and/or\n# modify it under the terms of the GNU General Public License\n# as published by the Free Software Foundation; either version 2\n# of the License, or (at your option) any later version.\n#\n# This program is distributed in the hope t... | [
{
"content": "# This program is free software; you can redistribute it and/or\n# modify it under the terms of the GNU General Public License\n# as published by the Free Software Foundation; either version 2\n# of the License, or (at your option) any later version.\n#\n# This program is distributed in the hope t... | diff --git a/bodhi/server/security.py b/bodhi/server/security.py
index e95947dbd9..c2633ba2df 100644
--- a/bodhi/server/security.py
+++ b/bodhi/server/security.py
@@ -91,9 +91,8 @@ def remember_me(context, request, info, *args, **kw):
db.add(user)
db.flush()
else:
- # We used to not track email addresses, so fill in the fields as people
- # log back in
- if not user.email:
+ # Update email address if the address changed
+ if user.email != email:
user.email = email
db.flush()
|
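The fix changes the sync condition from "fill in only when empty" to "update whenever it differs", so an address change at the identity provider propagates on the next login. A minimal sketch with stand-in objects (not Bodhi's actual models):

```python
class User:
    def __init__(self, email):
        self.email = email

class FakeDB:
    def __init__(self):
        self.flushed = 0
    def flush(self):
        self.flushed += 1

def sync_user_email(user, provider_email, db):
    # `if not user.email:` froze the first address ever seen; comparing the
    # two addresses keeps the cached copy in sync on every login.
    if user.email != provider_email:
        user.email = provider_email
        db.flush()

user, db = User("old@example.org"), FakeDB()
sync_user_email(user, "new@example.org", db)
assert user.email == "new@example.org" and db.flushed == 1
sync_user_email(user, "new@example.org", db)
assert db.flushed == 1  # unchanged address -> no redundant write
```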
huggingface__optimum-360 | AutoConfig is not imported in optimization.py
### System Info
```shell
optimium master : fb7e303d9254fcee194aa76f4a0b7fa9d9b140d0
```
### Who can help?
@echarlaix
### Information
- [X] The official example scripts
- [ ] My own modified scripts
### Tasks
- [X] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Try to optimize a model and get `NameError: name 'AutoConfig' is not defined` as it's not imported
### Expected behavior
No runtime error
I made a PR to fix this: https://github.com/huggingface/optimum/pull/360
| [
{
"content": "# Copyright 2021 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2... | [
{
"content": "# Copyright 2021 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2... | diff --git a/optimum/onnxruntime/optimization.py b/optimum/onnxruntime/optimization.py
index 2448f3b478..3b0eea7623 100644
--- a/optimum/onnxruntime/optimization.py
+++ b/optimum/onnxruntime/optimization.py
@@ -17,6 +17,7 @@
from typing import Callable, Dict, List, Optional, Tuple, Union
import transformers
+from transformers.models.auto.configuration_auto import AutoConfig
from onnx import load_model
from onnxruntime.transformers.fusion_options import FusionOptions
|
beeware__toga-569 | Error looking for icon for tutorial for 0.3.0.dev9
This is with Python 3.6.5 in a clean venv:
```
(.venv) PS C:\Users\_\Desktop\toga_tutorial> python .\helloworld.py
[Winforms] No valid icon format available for C:\Users\brcan\Desktop\toga_tutorial\.venv\lib\site-packages\toga\resources\tiberius; fall back on Tiberius instead
Unhandled Exception: Python.Runtime.PythonException: FileNotFoundException : Could not find file 'C:\Users\brcan\Desktop\toga_tutorial\.venv\lib\site-packages\toga\resources\tiberius.ico'.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy, Boolean useLongPath, Boolean checkHost)
at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share)
at System.Drawing.Icon..ctor(String fileName, Int32 width, Int32 height)
at Python.Runtime.Dispatcher.Dispatch(ArrayList args)
at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
at System.Threading.ThreadHelper.ThreadStart()
```
| [
{
"content": "#/usr/bin/env python\nimport io\nimport re\n\nfrom setuptools import setup, find_packages\n\nwith io.open('toga/__init__.py', encoding='utf8') as version_file:\n version_match = re.search(r\"^__version__ = ['\\\"]([^'\\\"]*)['\\\"]\", version_file.read(), re.M)\n if version_match:\n v... | [
{
"content": "#/usr/bin/env python\nimport io\nimport re\n\nfrom setuptools import setup, find_packages\n\nwith io.open('toga/__init__.py', encoding='utf8') as version_file:\n version_match = re.search(r\"^__version__ = ['\\\"]([^'\\\"]*)['\\\"]\", version_file.read(), re.M)\n if version_match:\n v... | diff --git a/src/core/setup.py b/src/core/setup.py
index a15d41397d..c4d67176e3 100644
--- a/src/core/setup.py
+++ b/src/core/setup.py
@@ -27,7 +27,7 @@
packages=find_packages(exclude='tests'),
python_requires='>=3.5',
package_data={
- 'toga': ['resources/*.icns', 'resources/*.png'],
+ 'toga': ['resources/*.icns', 'resources/*.ico', 'resources/*.png'],
},
include_package_data=True,
install_requires=[
|
evennia__evennia-2813 | [BUG - Develop] Can't `|` two SaverDicts
#### Describe the bug
When combining two attributes containing dict data, it fails with a traceback.
```
File "./TestGame/typeclasses/characters.py", line 30, in test_attr
return self.db.db_one | self.db.db_two
File "./evennia/evennia/utils/dbserialize.py", line 243, in __or__
return self._data | other
TypeError: unsupported operand type(s) for |: 'dict' and '_SaverDict
```
#### To Reproduce
Steps to reproduce the behavior:
1. Store dicts in two attributes or attribute properties.
2. Use the `|` operator on them
3. See error
#### Develop-branch commit
22fa2c6b8
| [
{
"content": "\"\"\"\nThis module handles serialization of arbitrary python structural data,\nintended primarily to be stored in the database. It also supports\nstoring Django model instances (which plain pickle cannot do).\n\nThis serialization is used internally by the server, notably for\nstoring data in Att... | [
{
"content": "\"\"\"\nThis module handles serialization of arbitrary python structural data,\nintended primarily to be stored in the database. It also supports\nstoring Django model instances (which plain pickle cannot do).\n\nThis serialization is used internally by the server, notably for\nstoring data in Att... | diff --git a/evennia/utils/dbserialize.py b/evennia/utils/dbserialize.py
index 11321d8dfd7..0b8b0e63b8d 100644
--- a/evennia/utils/dbserialize.py
+++ b/evennia/utils/dbserialize.py
@@ -243,6 +243,9 @@ def __gt__(self, other):
def __or__(self, other):
return self._data | other
+ def __ror__(self, other):
+ return self._data | other
+
@_save
def __setitem__(self, key, value):
self._data.__setitem__(key, self._convert_mutables(value))
|
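The fix adds `__ror__` so the wrapper also handles being on the right-hand side of `|`: `dict.__or__` returns `NotImplemented` for the unknown type, and Python then tries the reflected method. A standalone sketch of the pattern (hypothetical class, not Evennia's `_SaverDict`):

```python
class SaverDictSketch:
    def __init__(self, data):
        self._data = dict(data)

    def __or__(self, other):  # wrapper | other
        if isinstance(other, SaverDictSketch):
            other = other._data
        return self._data | other

    def __ror__(self, other):  # other | wrapper
        # dict | SaverDictSketch: dict.__or__ returns NotImplemented, so
        # Python falls back to this reflected method.
        return other | self._data

a = SaverDictSketch({"x": 1})
b = SaverDictSketch({"y": 2})
assert a | b == {"x": 1, "y": 2}
assert {"z": 3} | a == {"z": 3, "x": 1}
```

Note that `dict | dict` itself requires Python 3.9+.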
ipython__ipython-1991 | %page not working
```
%page
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-336-e5a187ccb094> in <module>()
----> 1 get_ipython().magic(u'page')
c:\python26\lib\site-packages\ipython-0.13.dev-py2.6.egg\IPython\core\interactiveshell.pyc in magic(self, arg_s)
2150 magic_name, _, magic_arg_s = arg_s.partition(' ')
2151 magic_name = magic_name.lstrip(prefilter.ESC_MAGIC)
-> 2152 return self.run_line_magic(magic_name, magic_arg_s)
2153
2154 #-------------------------------------------------------------------------
c:\python26\lib\site-packages\ipython-0.13.dev-py2.6.egg\IPython\core\interactiveshell.pyc in run_line_magic(self, magic_name, line)
2076 args.append(sys._getframe(stack_depth).f_locals)
2077 with self.builtin_trap:
-> 2078 result = fn(*args)
2079 return result
2080
c:\python26\lib\site-packages\ipython-0.13.dev-py2.6.egg\IPython\core\magics\basic.pyc in page(self, parameter_s)
c:\python26\lib\site-packages\ipython-0.13.dev-py2.6.egg\IPython\core\magic.pyc in <lambda>(f, *a, **k)
188 # but it's overkill for just that one bit of state.
189 def magic_deco(arg):
--> 190 call = lambda f, *a, **k: f(*a, **k)
191
192 if callable(arg):
c:\python26\lib\site-packages\ipython-0.13.dev-py2.6.egg\IPython\core\magics\basic.pyc in page(self, parameter_s)
186
187 oname = args and args or '_'
--> 188 info = self._ofind(oname)
189 if info['found']:
190 txt = (raw and str or pformat)( info['obj'] )
AttributeError: 'BasicMagics' object has no attribute '_ofind'
```
| [
{
"content": "\"\"\"Implementation of basic magic functions.\n\"\"\"\n#-----------------------------------------------------------------------------\n# Copyright (c) 2012 The IPython Development Team.\n#\n# Distributed under the terms of the Modified BSD License.\n#\n# The full license is in the file COPYING... | [
{
"content": "\"\"\"Implementation of basic magic functions.\n\"\"\"\n#-----------------------------------------------------------------------------\n# Copyright (c) 2012 The IPython Development Team.\n#\n# Distributed under the terms of the Modified BSD License.\n#\n# The full license is in the file COPYING... | diff --git a/IPython/core/magics/basic.py b/IPython/core/magics/basic.py
index 534a210a973..d7f35263238 100644
--- a/IPython/core/magics/basic.py
+++ b/IPython/core/magics/basic.py
@@ -185,7 +185,7 @@ def page(self, parameter_s=''):
raw = 'r' in opts
oname = args and args or '_'
- info = self._ofind(oname)
+ info = self.shell._ofind(oname)
if info['found']:
txt = (raw and str or pformat)( info['obj'] )
page.page(txt)
|
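The fix reaches `_ofind` through `self.shell` because magic implementations live on a separate object that only holds a reference to the interactive shell. A minimal sketch of that delegation (simplified stand-ins, not IPython's classes):

```python
class Shell:
    def __init__(self):
        self.user_ns = {"answer": 42}

    def _ofind(self, oname):
        found = oname in self.user_ns
        return {"found": found, "obj": self.user_ns.get(oname)}

class BasicMagics:
    def __init__(self, shell):
        self.shell = shell

    def page(self, oname):
        # self._ofind(...) raised AttributeError: the helper lives on the
        # shell object, so it must be reached via the back-reference.
        info = self.shell._ofind(oname)
        return info["obj"] if info["found"] else None

magics = BasicMagics(Shell())
assert magics.page("answer") == 42
assert not hasattr(magics, "_ofind")
```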
PennyLaneAI__pennylane-5623 | [BUG] `param_shift` with `broadcast=True` does not work with zero-length recipes
### Expected behavior
`param_shift` has feature parity between `broadcast=True` and `broadcast=False`
### Actual behavior
The (somewhat esoteric) example from the test suite below runs with `broadcast=False` but not with `broadcast=True`.
### Additional information
If we do not include the commented `assert` line, the gradient computation will raise an error, because the created tapes are not valid.
If we include the commented `assert` line in the code below, we see that too many tapes are being created when `broadcast=True`. This is because unnecessary tapes with `batch_size=0` are being created, which is the core bug making the tapes invalid.
### Source code
```shell
ops_with_custom_recipe = [1]
broadcast = True
dev = qml.device("default.qubit", wires=2)
x = [0.543, -0.654]
with qml.queuing.AnnotatedQueue() as q:
qml.RX(x[0], wires=[0])
qml.RX(x[1], wires=[0])
qml.expval(qml.PauliZ(0))
tape = qml.tape.QuantumScript.from_queue(q)
gradient_recipes = tuple(
[[-1e7, 1, 0], [1e7, 1, 0]] if i in ops_with_custom_recipe else None for i in range(2)
)
tapes, fn = qml.gradients.param_shift(tape, gradient_recipes=gradient_recipes, broadcast=broadcast)
num_ops_standard = (2 - len(ops_with_custom_recipe))
# assert len(tapes) == (1 if broadcast else 2) * num_ops_standard + (tape.num_params != num_ops_standard)
grad = fn(qml.execute(tapes, dev, None))
print(grad)
```
### Tracebacks
```shell
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Cell In[2], line 19
17 num_ops_standard = (2 - len(ops_with_custom_recipe))
18 # assert len(tapes) == (1 if broadcast else 2) * num_ops_standard + (tape.num_params != num_ops_standard)
---> 19 grad = fn(qml.execute(tapes, dev, None))
20 print(grad)
File ~/repos/pennylane/pennylane/workflow/execution.py:616, in execute(tapes, device, gradient_fn, interface, transform_program, config, grad_on_execution, gradient_kwargs, cache, cachesize, max_diff, override_shots, expand_fn, max_expansion, device_batch_transform, device_vjp)
614 # Exiting early if we do not need to deal with an interface boundary
615 if no_interface_boundary_required:
--> 616 results = inner_execute(tapes)
617 return post_processing(results)
619 _grad_on_execution = False
File ~/repos/pennylane/pennylane/workflow/execution.py:297, in _make_inner_execute.<locals>.inner_execute(tapes, **_)
294 transformed_tapes, transform_post_processing = transform_program(tapes)
296 if transformed_tapes:
--> 297 results = device_execution(transformed_tapes)
298 else:
299 results = ()
File ~/repos/pennylane/pennylane/devices/modifiers/simulator_tracking.py:30, in _track_execute.<locals>.execute(self, circuits, execution_config)
28 @wraps(untracked_execute)
29 def execute(self, circuits, execution_config=DefaultExecutionConfig):
---> 30 results = untracked_execute(self, circuits, execution_config)
31 if isinstance(circuits, QuantumScript):
32 batch = (circuits,)
File ~/repos/pennylane/pennylane/devices/modifiers/single_tape_support.py:32, in _make_execute.<locals>.execute(self, circuits, execution_config)
30 is_single_circuit = True
31 circuits = (circuits,)
---> 32 results = batch_execute(self, circuits, execution_config)
33 return results[0] if is_single_circuit else results
File ~/repos/pennylane/pennylane/devices/default_qubit.py:594, in DefaultQubit.execute(self, circuits, execution_config)
591 prng_keys = [self.get_prng_keys()[0] for _ in range(len(circuits))]
593 if max_workers is None:
--> 594 return tuple(
595 _simulate_wrapper(
596 c,
597 {
598 "rng": self._rng,
599 "debugger": self._debugger,
600 "interface": interface,
601 "state_cache": self._state_cache,
602 "prng_key": _key,
603 },
604 )
605 for c, _key in zip(circuits, prng_keys)
606 )
608 vanilla_circuits = [convert_to_numpy_parameters(c) for c in circuits]
609 seeds = self._rng.integers(2**31 - 1, size=len(vanilla_circuits))
File ~/repos/pennylane/pennylane/devices/default_qubit.py:595, in <genexpr>(.0)
591 prng_keys = [self.get_prng_keys()[0] for _ in range(len(circuits))]
593 if max_workers is None:
594 return tuple(
--> 595 _simulate_wrapper(
596 c,
597 {
598 "rng": self._rng,
599 "debugger": self._debugger,
600 "interface": interface,
601 "state_cache": self._state_cache,
602 "prng_key": _key,
603 },
604 )
605 for c, _key in zip(circuits, prng_keys)
606 )
608 vanilla_circuits = [convert_to_numpy_parameters(c) for c in circuits]
609 seeds = self._rng.integers(2**31 - 1, size=len(vanilla_circuits))
File ~/repos/pennylane/pennylane/devices/default_qubit.py:842, in _simulate_wrapper(circuit, kwargs)
841 def _simulate_wrapper(circuit, kwargs):
--> 842 return simulate(circuit, **kwargs)
File ~/repos/pennylane/pennylane/devices/qubit/simulate.py:292, in simulate(circuit, debugger, state_cache, **execution_kwargs)
290 if state_cache is not None:
291 state_cache[circuit.hash] = state
--> 292 return measure_final_state(circuit, state, is_state_batched, rng=rng, prng_key=meas_key)
File ~/repos/pennylane/pennylane/devices/qubit/simulate.py:213, in measure_final_state(circuit, state, is_state_batched, **execution_kwargs)
210 raise TypeError("Native mid-circuit measurements are only supported with finite shots.")
212 if len(circuit.measurements) == 1:
--> 213 return measure(circuit.measurements[0], state, is_state_batched=is_state_batched)
215 return tuple(
216 measure(mp, state, is_state_batched=is_state_batched) for mp in circuit.measurements
217 )
219 # finite-shot case
File ~/repos/pennylane/pennylane/devices/qubit/measure.py:233, in measure(measurementprocess, state, is_state_batched)
220 def measure(
221 measurementprocess: MeasurementProcess, state: TensorLike, is_state_batched: bool = False
222 ) -> TensorLike:
223 """Apply a measurement process to a state.
224
225 Args:
(...)
231 Tensorlike: the result of the measurement
232 """
--> 233 return get_measurement_function(measurementprocess, state)(
234 measurementprocess, state, is_state_batched
235 )
File ~/repos/pennylane/pennylane/devices/qubit/measure.py:72, in state_diagonalizing_gates(measurementprocess, state, is_state_batched)
70 wires = Wires(range(total_indices))
71 flattened_state = flatten_state(state, total_indices)
---> 72 return measurementprocess.process_state(flattened_state, wires)
File ~/repos/pennylane/pennylane/measurements/expval.py:142, in ExpectationMP.process_state(self, state, wire_order)
140 return qml.math.squeeze(self.eigvals())
141 with qml.queuing.QueuingManager.stop_recording():
--> 142 prob = qml.probs(wires=self.wires).process_state(state=state, wire_order=wire_order)
143 # In case of broadcasting, `prob` has two axes and this is a matrix-vector product
144 return self._calculate_expectation(prob)
File ~/repos/pennylane/pennylane/measurements/probs.py:238, in ProbabilityMP.process_state(self, state, wire_order)
236 prob = qml.math.transpose(prob, desired_axes)
237 # flatten and return probabilities
--> 238 return qml.math.reshape(prob, flat_shape)
File ~/venvs/dev/lib/python3.10/site-packages/autoray/autoray.py:80, in do(fn, like, *args, **kwargs)
31 """Do function named ``fn`` on ``(*args, **kwargs)``, peforming single
32 dispatch to retrieve ``fn`` based on whichever library defines the class of
33 the ``args[0]``, or the ``like`` keyword argument if specified.
(...)
77 <tf.Tensor: id=91, shape=(3, 3), dtype=float32>
78 """
79 backend = choose_backend(fn, *args, like=like, **kwargs)
---> 80 return get_lib_fn(backend, fn)(*args, **kwargs)
File ~/venvs/dev/lib/python3.10/site-packages/numpy/core/fromnumeric.py:285, in reshape(a, newshape, order)
200 @array_function_dispatch(_reshape_dispatcher)
201 def reshape(a, newshape, order='C'):
202 """
203 Gives a new shape to an array without changing its data.
204
(...)
283 [5, 6]])
284 """
--> 285 return _wrapfunc(a, 'reshape', newshape, order=order)
File ~/venvs/dev/lib/python3.10/site-packages/numpy/core/fromnumeric.py:59, in _wrapfunc(obj, method, *args, **kwds)
56 return _wrapit(obj, method, *args, **kwds)
58 try:
---> 59 return bound(*args, **kwds)
60 except TypeError:
61 # A TypeError occurs if the object does have such a method in its
62 # class, but its signature is not identical to that of NumPy's. This
(...)
66 # Call _wrapit from within the except clause to ensure a potential
67 # exception has a traceback chain.
68 return _wrapit(obj, method, *args, **kwds)
ValueError: cannot reshape array of size 0 into shape (0,newaxis)
```
### System information
```shell
pl dev
```
### Existing GitHub issues
- [X] I have searched existing GitHub issues to make sure the issue does not already exist.
| [
{
"content": "# Copyright 2018-2021 Xanadu Quantum Technologies Inc.\r\n\r\n# Licensed under the Apache License, Version 2.0 (the \"License\");\r\n# you may not use this file except in compliance with the License.\r\n# You may obtain a copy of the License at\r\n\r\n# http://www.apache.org/licenses/LICENSE-2... | [
{
"content": "# Copyright 2018-2021 Xanadu Quantum Technologies Inc.\r\n\r\n# Licensed under the Apache License, Version 2.0 (the \"License\");\r\n# you may not use this file except in compliance with the License.\r\n# You may obtain a copy of the License at\r\n\r\n# http://www.apache.org/licenses/LICENSE-2... | diff --git a/doc/releases/changelog-0.36.0.md b/doc/releases/changelog-0.36.0.md
index 77826722ffd..c81e7c5710b 100644
--- a/doc/releases/changelog-0.36.0.md
+++ b/doc/releases/changelog-0.36.0.md
@@ -564,8 +564,9 @@
[(#5610)](https://github.com/PennyLaneAI/pennylane/pull/5610)
* Using shot vectors with `param_shift(... broadcast=True)` caused a bug. This combination is no longer supported
- and will be added again in the next release.
+ and will be added again in the next release. Fixed a bug with custom gradient recipes that only consist of unshifted terms.
[(#5612)](https://github.com/PennyLaneAI/pennylane/pull/5612)
+ [(#5623)](https://github.com/PennyLaneAI/pennylane/pull/5623)
* Cast the keys of the `CountsMP` measurements returned `dynamic_one_shot` to the type produced by `MeasurementValue.concretize`.
[(#5587)](https://github.com/PennyLaneAI/pennylane/pull/5587)
diff --git a/pennylane/gradients/general_shift_rules.py b/pennylane/gradients/general_shift_rules.py
index 07e2b9fce56..cccb0f5a930 100644
--- a/pennylane/gradients/general_shift_rules.py
+++ b/pennylane/gradients/general_shift_rules.py
@@ -451,6 +451,9 @@ def generate_shifted_tapes(tape, index, shifts, multipliers=None, broadcast=Fals
the ``batch_size`` of the returned tape matches the length of ``shifts``.
"""
+ if len(shifts) == 0:
+ return tuple()
+
if multipliers is None:
multipliers = np.ones_like(shifts)
diff --git a/tests/gradients/parameter_shift/test_parameter_shift.py b/tests/gradients/parameter_shift/test_parameter_shift.py
index 7a0efd917d4..0dd03d8faac 100644
--- a/tests/gradients/parameter_shift/test_parameter_shift.py
+++ b/tests/gradients/parameter_shift/test_parameter_shift.py
@@ -536,7 +536,8 @@ def test_all_zero_diff_methods_multiple_returns_tape(self):
# tapes, _ = qml.gradients.param_shift(circuit.tape, broadcast=broadcast)
# assert tapes == []
- def test_with_gradient_recipes(self):
+ @pytest.mark.parametrize("broadcast", [True, False])
+ def test_with_gradient_recipes(self, broadcast):
"""Test that the function behaves as expected"""
with qml.queuing.AnnotatedQueue() as q:
@@ -549,18 +550,34 @@ def test_with_gradient_recipes(self):
tape = qml.tape.QuantumScript.from_queue(q)
tape.trainable_params = {0, 2}
gradient_recipes = ([[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]], [[1, 1, 1], [2, 2, 2], [3, 3, 3]])
- tapes, _ = qml.gradients.param_shift(tape, gradient_recipes=gradient_recipes)
+ tapes, _ = param_shift(tape, gradient_recipes=gradient_recipes, broadcast=broadcast)
- assert len(tapes) == 5
- assert [t.batch_size for t in tapes] == [None] * 5
- assert tapes[0].get_parameters(trainable_only=False) == [0.2 * 1.0 + 0.3, 2.0, 3.0, 4.0]
- assert tapes[1].get_parameters(trainable_only=False) == [0.5 * 1.0 + 0.6, 2.0, 3.0, 4.0]
- assert tapes[2].get_parameters(trainable_only=False) == [1.0, 2.0, 1 * 3.0 + 1, 4.0]
- assert tapes[3].get_parameters(trainable_only=False) == [1.0, 2.0, 2 * 3.0 + 2, 4.0]
- assert tapes[4].get_parameters(trainable_only=False) == [1.0, 2.0, 3 * 3.0 + 3, 4.0]
+ if broadcast:
+ assert len(tapes) == 2
+ assert [t.batch_size for t in tapes] == [2, 3]
+
+ shifted_batch = [0.2 * 1.0 + 0.3, 0.5 * 1.0 + 0.6]
+ tape_par = tapes[0].get_parameters(trainable_only=False)
+ assert np.allclose(tape_par[0], shifted_batch)
+ assert tape_par[1:] == [2.0, 3.0, 4.0]
+
+ shifted_batch = [1 * 3.0 + 1, 2 * 3.0 + 2, 3 * 3.0 + 3]
+ tape_par = tapes[1].get_parameters(trainable_only=False)
+ assert tape_par[:2] == [1.0, 2.0]
+ assert np.allclose(tape_par[2], shifted_batch)
+ assert tape_par[3:] == [4.0]
+ else:
+ assert len(tapes) == 5
+ assert [t.batch_size for t in tapes] == [None] * 5
+ assert tapes[0].get_parameters(trainable_only=False) == [0.2 * 1.0 + 0.3, 2.0, 3.0, 4.0]
+ assert tapes[1].get_parameters(trainable_only=False) == [0.5 * 1.0 + 0.6, 2.0, 3.0, 4.0]
+ assert tapes[2].get_parameters(trainable_only=False) == [1.0, 2.0, 1 * 3.0 + 1, 4.0]
+ assert tapes[3].get_parameters(trainable_only=False) == [1.0, 2.0, 2 * 3.0 + 2, 4.0]
+ assert tapes[4].get_parameters(trainable_only=False) == [1.0, 2.0, 3 * 3.0 + 3, 4.0]
+ @pytest.mark.parametrize("broadcast", [True, False])
@pytest.mark.parametrize("ops_with_custom_recipe", [[0], [1], [0, 1]])
- def test_recycled_unshifted_tape(self, ops_with_custom_recipe):
+ def test_recycled_unshifted_tape(self, ops_with_custom_recipe, broadcast):
"""Test that if the gradient recipe has a zero-shift component, then
the tape is executed only once using the current parameter
values."""
@@ -577,23 +594,28 @@ def test_recycled_unshifted_tape(self, ops_with_custom_recipe):
[[-1e7, 1, 0], [1e7, 1, 1e-7]] if i in ops_with_custom_recipe else None
for i in range(2)
)
- tapes, fn = qml.gradients.param_shift(tape, gradient_recipes=gradient_recipes)
+ tapes, fn = param_shift(tape, gradient_recipes=gradient_recipes, broadcast=broadcast)
- # two tapes per parameter that doesn't use a custom recipe,
+ # two (one with broadcast) tapes per parameter that doesn't use a custom recipe,
# one tape per parameter that uses custom recipe,
# plus one global call if at least one uses the custom recipe
- num_ops_standard_recipe = tape.num_params - len(ops_with_custom_recipe)
- assert len(tapes) == 2 * num_ops_standard_recipe + len(ops_with_custom_recipe) + 1
+ num_custom = len(ops_with_custom_recipe)
+ num_ops_standard_recipe = tape.num_params - num_custom
+ tapes_per_param = 1 if broadcast else 2
+ assert len(tapes) == tapes_per_param * num_ops_standard_recipe + num_custom + 1
# Test that executing the tapes and the postprocessing function works
grad = fn(qml.execute(tapes, dev, None))
assert qml.math.allclose(grad, -np.sin(x[0] + x[1]), atol=1e-5)
+ @pytest.mark.parametrize("broadcast", [False, True])
@pytest.mark.parametrize("ops_with_custom_recipe", [[0], [1], [0, 1]])
@pytest.mark.parametrize("multi_measure", [False, True])
- def test_custom_recipe_unshifted_only(self, ops_with_custom_recipe, multi_measure):
+ def test_custom_recipe_unshifted_only(self, ops_with_custom_recipe, multi_measure, broadcast):
"""Test that if the gradient recipe has a zero-shift component, then
the tape is executed only once using the current parameter
values."""
+ if multi_measure and broadcast:
+ pytest.skip("Multiple measurements are not supported with `broadcast=True` yet.")
dev = qml.device("default.qubit", wires=2)
x = [0.543, -0.654]
@@ -608,12 +630,13 @@ def test_custom_recipe_unshifted_only(self, ops_with_custom_recipe, multi_measur
gradient_recipes = tuple(
[[-1e7, 1, 0], [1e7, 1, 0]] if i in ops_with_custom_recipe else None for i in range(2)
)
- tapes, fn = qml.gradients.param_shift(tape, gradient_recipes=gradient_recipes)
+ tapes, fn = param_shift(tape, gradient_recipes=gradient_recipes, broadcast=broadcast)
- # two tapes per parameter that doesn't use a custom recipe,
+ # two (one with broadcast) tapes per parameter that doesn't use a custom recipe,
# plus one global (unshifted) call if at least one uses the custom recipe
num_ops_standard_recipe = tape.num_params - len(ops_with_custom_recipe)
- assert len(tapes) == 2 * num_ops_standard_recipe + int(
+ tapes_per_param = 1 if broadcast else 2
+ assert len(tapes) == tapes_per_param * num_ops_standard_recipe + int(
tape.num_params != num_ops_standard_recipe
)
# Test that executing the tapes and the postprocessing function works
@@ -662,8 +685,9 @@ def test_custom_recipe_mixing_unshifted_shifted(self, ops_with_custom_recipe):
assert qml.math.allclose(grad[0], -np.sin(x[0] + x[1]), atol=1e-5)
assert qml.math.allclose(grad[1], 0, atol=1e-5)
+ @pytest.mark.parametrize("broadcast", [True, False])
@pytest.mark.parametrize("y_wire", [0, 1])
- def test_f0_provided(self, y_wire):
+ def test_f0_provided(self, y_wire, broadcast):
"""Test that if the original tape output is provided, then
the tape is not executed additionally at the current parameter
values."""
@@ -677,7 +701,7 @@ def test_f0_provided(self, y_wire):
tape = qml.tape.QuantumScript.from_queue(q)
gradient_recipes = ([[-1e7, 1, 0], [1e7, 1, 1e7]],) * 2
f0 = dev.execute(tape)
- tapes, fn = qml.gradients.param_shift(tape, gradient_recipes=gradient_recipes, f0=f0)
+ tapes, fn = param_shift(tape, gradient_recipes=gradient_recipes, f0=f0, broadcast=broadcast)
# one tape per parameter that impacts the expval
assert len(tapes) == 2 if y_wire == 0 else 1
|
pypa__setuptools-936 | Graft with Asterisk broken after 28.4.0
28.4.0 is the last release where a directive such as `graft */data` was working. In releases after that, the pattern matches nothing and emits `warning: no directories found matching '*/data'`.
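The regression stems from passing the raw pattern straight to `distutils.filelist.findall`, which treats `*/data` as a literal directory name. A minimal standalone sketch of the glob-expanding approach the fix takes (illustrative function, not the actual `FileList.graft` method):

```python
import glob
import os

def graft(pattern):
    """Collect every file under each directory matching ``pattern``."""
    found = []
    for match_dir in glob.glob(pattern):  # expand e.g. '*/data' to real dirs
        for root, _dirs, files in os.walk(match_dir):
            found.extend(os.path.join(root, f) for f in files)
    return found
```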
| [
{
"content": "\"\"\"setuptools.command.egg_info\n\nCreate a distribution's .egg-info directory and contents\"\"\"\n\nfrom distutils.filelist import FileList as _FileList\nfrom distutils.errors import DistutilsInternalError\nfrom distutils.util import convert_path\nfrom distutils import log\nimport distutils.err... | [
{
"content": "\"\"\"setuptools.command.egg_info\n\nCreate a distribution's .egg-info directory and contents\"\"\"\n\nfrom distutils.filelist import FileList as _FileList\nfrom distutils.errors import DistutilsInternalError\nfrom distutils.util import convert_path\nfrom distutils import log\nimport distutils.err... | diff --git a/setuptools/command/egg_info.py b/setuptools/command/egg_info.py
index 5ab54dc70f..62bf00aaa9 100755
--- a/setuptools/command/egg_info.py
+++ b/setuptools/command/egg_info.py
@@ -429,7 +429,9 @@ def recursive_exclude(self, dir, pattern):
def graft(self, dir):
"""Include all files from 'dir/'."""
- found = distutils.filelist.findall(dir)
+ found = []
+ for match_dir in glob(dir):
+ found += distutils.filelist.findall(match_dir)
self.extend(found)
return bool(found)
diff --git a/setuptools/tests/test_manifest.py b/setuptools/tests/test_manifest.py
index cf39346a18..3b34c88813 100644
--- a/setuptools/tests/test_manifest.py
+++ b/setuptools/tests/test_manifest.py
@@ -206,6 +206,15 @@ def test_graft(self):
l('app/static/app.css'), l('app/static/app.css.map')])
assert files == self.get_files()
+ def test_graft_glob_syntax(self):
+ """Include the whole app/static/ directory."""
+ l = make_local_path
+ self.make_manifest("graft */static")
+ files = default_files | set([
+ l('app/static/app.js'), l('app/static/app.js.map'),
+ l('app/static/app.css'), l('app/static/app.css.map')])
+ assert files == self.get_files()
+
def test_graft_global_exclude(self):
"""Exclude all *.map files in the project."""
l = make_local_path
|
microsoft__Qcodes-940 | error when saving to drive other than current path
This is due to Windows' handling of drive letters. A minimal example:
``` python
import qcodes,os
datadir = r'd:\Temp'
qcodes.DataSet.default_io = qcodes.DiskIO(datadir)
p=qcodes.Parameter('p', set_cmd=None)
q=qcodes.Parameter('q', set_cmd=None)
ds=qcodes.Loop(p[0:10:1]).each(q).run() # fine
qcodes.DataSet.default_io = qcodes.DiskIO(r'c:\Temp')
ds=qcodes.Loop(p[0:10:1]).each(p).run() # error
```
This generates the error `ValueError: path is on mount 'd:', start on mount 'c:'`
Also see https://bugs.python.org/issue7195
| [
{
"content": "\"\"\"\nIO managers for QCodes.\n\nIO managers wrap whatever physical storage layer the user wants to use\nin an interface mimicking the built-in <open> context manager, with\nsome restrictions to minimize the overhead in creating new IO managers.\n\nThe main thing these managers need to implement... | [
{
"content": "\"\"\"\nIO managers for QCodes.\n\nIO managers wrap whatever physical storage layer the user wants to use\nin an interface mimicking the built-in <open> context manager, with\nsome restrictions to minimize the overhead in creating new IO managers.\n\nThe main thing these managers need to implement... | diff --git a/qcodes/data/io.py b/qcodes/data/io.py
index 4b7a82c8588..93ff782a28b 100644
--- a/qcodes/data/io.py
+++ b/qcodes/data/io.py
@@ -141,7 +141,7 @@ def to_location(self, path):
location (str): the location string corresponding to this path.
"""
if self.base_location:
- return os.path.relpath(path, self.base_location)
+ return os.path.join(self.base_location, path)
else:
return path
|
strawberry-graphql__strawberry-2481 | Increased CPU usage when subscribing with the graphql-transport-ws protocol
<!-- Provide a general summary of the bug in the title above. -->
<!--- This template is entirely optional and can be removed, but is here to help both you and us. -->
<!--- Anything on lines wrapped in comments like these will not show up in the final text. -->
## Describe the Bug
We have a Strawberry GraphQL server that we have been stress testing and running CPU performance tests on. We have found that there is a noticeable and consistent increase in the CPU usage of our server application when our client subscribes using the _graphql-transport-ws_ protocol compared to using the _graphql-ws_ protocol.
I have done a bit of investigating and further profiling using py-spy and discovered that the Strawberry code is creating a `NextMessage` object ([here](https://github.com/strawberry-graphql/strawberry/blob/db9c22a53205cd82330a9c84d44ac1ee2731eafb/strawberry/subscriptions/protocols/graphql_transport_ws/handlers.py#L261)) for each message, which it then converts to a dictionary ([here](https://github.com/strawberry-graphql/strawberry/blob/db9c22a53205cd82330a9c84d44ac1ee2731eafb/strawberry/subscriptions/protocols/graphql_transport_ws/handlers.py#L283)) using the `dataclasses` `asdict() `method ([here](https://github.com/strawberry-graphql/strawberry/blob/db9c22a53205cd82330a9c84d44ac1ee2731eafb/strawberry/subscriptions/protocols/graphql_transport_ws/types.py#L12)). Some internet research shows that this `asdict()` method is doing a `deepcopy` of everything within the class. I ran a few timing tests and the `asdict()` method takes an order of magnitude longer than doing a simple `.__dict__` on the object. This is only done in the _graphql-transport-ws_ implementation and not the _graphql-ws_ implementation which explains why there is a difference in CPU usage between the 2 protocols.
I do not believe that we need to be doing a deepcopy when turning the class into a dictionary. What's more, I wonder whether we need to even be creating the `NextMessage` object because as far as I can see, we create it and pass it to a function that immediately turns it into a dictionary. So why don't we just create it as a dictionary and send it instead. This would bypass having to do any sort of conversion costing time.
I.e. instead of line 261 and 262 ([here](https://github.com/strawberry-graphql/strawberry/blob/db9c22a53205cd82330a9c84d44ac1ee2731eafb/strawberry/subscriptions/protocols/graphql_transport_ws/handlers.py#L261)) which do:
```
next_message = NextMessage(id=operation_id, payload=next_payload)
await self.send_message(next_message)`
```
we could do something like:
```
next_message = {"id":operation_id, "payload": next_payload, "type": "next"}
await self.send_json(next_message)
```
When I ran the performance tests with the above change, the CPU usage dropped and was consistent with the _graphql-ws_ protocol performance.
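The cost difference is easy to reproduce: `dataclasses.asdict` recurses into every field and copies nested containers, while a hand-built dict simply references the existing payload. A minimal sketch (illustrative, independent of Strawberry's actual message classes):

```python
from dataclasses import asdict, dataclass
from typing import Any, Dict

@dataclass
class NextMessage:
    id: str
    payload: Dict[str, Any]
    type: str = "next"

    def as_dict(self) -> dict:
        # Shallow, constant-time conversion: no recursion, no deep copy.
        return {"id": self.id, "payload": self.payload, "type": self.type}

msg = NextMessage(id="op1", payload={"data": {"value": 1}})
deep = asdict(msg)       # recursively copies the nested payload
shallow = msg.as_dict()  # just references the existing payload object

assert deep == shallow                     # same contents
assert deep["payload"] is not msg.payload  # asdict copied the payload
assert shallow["payload"] is msg.payload   # as_dict did not
```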
## System Information
- Operating system: Centos 7
- Strawberry version (if applicable): 0.154.1
## Additional Context
I have created a simple demo Strawberry GraphQL server and Python client on GitHub, available at: https://github.com/rjwills28/strawberry_cpu_demo/tree/master.
Instructions on how to install and run are in the readme. It simulates the tests that we were running where we have a server providing subscription updates at 10Hz and a client that creates 100 different subscriptions. Follow the example in the readme to first run with the _graphql-ws_ protocol (command line argument (`-p 1`) and then with the _graphql-transport-ws_ protocol (`-p 2`). Run both a few times and you should see that the average CPU usage is on the whole higher for the latter protocol. Please let me know if you have any problems running this.
| [
{
"content": "from dataclasses import asdict, dataclass\nfrom typing import Any, Dict, List, Optional\n\nfrom graphql import GraphQLFormattedError\n\nfrom strawberry.unset import UNSET\n\n\n@dataclass\nclass GraphQLTransportMessage:\n def as_dict(self) -> dict:\n data = asdict(self)\n if getatt... | [
{
"content": "from dataclasses import asdict, dataclass\nfrom typing import Any, Dict, List, Optional\n\nfrom graphql import GraphQLFormattedError\n\nfrom strawberry.unset import UNSET\n\n\n@dataclass\nclass GraphQLTransportMessage:\n def as_dict(self) -> dict:\n data = asdict(self)\n if getatt... | diff --git a/RELEASE.md b/RELEASE.md
new file mode 100644
index 0000000000..d3ebb00904
--- /dev/null
+++ b/RELEASE.md
@@ -0,0 +1,5 @@
+Release type: patch
+
+This release fixes a bug in subscriptions using the graphql-transport-ws protocol
+where the conversion of the NextMessage object to a dictionary took an unnecessary
+amount of time leading to an increase in CPU usage.
diff --git a/strawberry/subscriptions/protocols/graphql_transport_ws/types.py b/strawberry/subscriptions/protocols/graphql_transport_ws/types.py
index 04f844e1a0..72033f7ff4 100644
--- a/strawberry/subscriptions/protocols/graphql_transport_ws/types.py
+++ b/strawberry/subscriptions/protocols/graphql_transport_ws/types.py
@@ -85,6 +85,9 @@ class NextMessage(GraphQLTransportMessage):
payload: Dict[str, Any] # TODO: shape like ExecutionResult
type: str = "next"
+ def as_dict(self) -> dict:
+ return {"id": self.id, "payload": self.payload, "type": self.type}
+
@dataclass
class ErrorMessage(GraphQLTransportMessage):
|
oobabooga__text-generation-webui-3014 | Error when downloading model from UI
### Describe the bug
I just downloaded the latest version of text-generation-webui on Ubuntu and started the UI, but it no longer allows me to download a model from the UI. I tried downloading 'anon8231489123/vicuna-13b-GPTQ-4bit-128g' but got the following error:
```
Traceback (most recent call last):
  File "/home/squirol/ben2/oobabooga_linux/text-generation-webui/server.py", line 134, in download_model_wrapper
    downloader = downloader_module.ModelDownloader()
TypeError: ModelDownloader.__init__() missing 1 required positional argument: 'max_retries'
```
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Reproduction
1. Launch web UI using ./start_linux.sh
2. Open browser to http://127.0.0.1:7860/
3. Enter 'anon8231489123/vicuna-13b-GPTQ-4bit-128g' and select Download in UI
4. View exception under Download button
### Screenshot

### Logs
```shell
N/A
```
### System Info
```shell
Ubuntu
NVIDIA
```
| [
{
"content": "'''\nDownloads models from Hugging Face to models/username_modelname.\n\nExample:\npython download-model.py facebook/opt-1.3b\n\n'''\n\nimport argparse\nimport base64\nimport datetime\nimport hashlib\nimport json\nimport os\nimport re\nimport sys\nfrom pathlib import Path\n\nimport requests\nimpor... | [
{
"content": "'''\nDownloads models from Hugging Face to models/username_modelname.\n\nExample:\npython download-model.py facebook/opt-1.3b\n\n'''\n\nimport argparse\nimport base64\nimport datetime\nimport hashlib\nimport json\nimport os\nimport re\nimport sys\nfrom pathlib import Path\n\nimport requests\nimpor... | diff --git a/download-model.py b/download-model.py
index 9ee7790664..2642c40545 100644
--- a/download-model.py
+++ b/download-model.py
@@ -23,7 +23,7 @@
class ModelDownloader:
- def __init__(self, max_retries):
+ def __init__(self, max_retries = 5):
self.s = requests.Session()
if max_retries:
self.s.mount('https://cdn-lfs.huggingface.co', HTTPAdapter(max_retries=max_retries))
|
GeotrekCE__Geotrek-admin-805 | ADMIN - Path segment looping back on itself
It is impossible to enter the CIRCUIT DES LACS correctly.
Saving often returns a 504 BAD GATEWAY. The itinerary was nevertheless modified, but differently from how it was entered. Needs further investigation.
| [
{
"content": "from django.utils.translation import ugettext_lazy as _\n\nimport floppyforms as forms\n\nfrom geotrek.common.forms import CommonForm\nfrom .models import Path\nfrom .helpers import PathHelper\nfrom .fields import TopologyField, SnappedLineStringField\n\n\nclass TopologyForm(CommonForm):\n \"\"... | [
{
"content": "from django.utils.translation import ugettext_lazy as _\n\nimport floppyforms as forms\n\nfrom geotrek.common.forms import CommonForm\nfrom .models import Path\nfrom .helpers import PathHelper\nfrom .fields import TopologyField, SnappedLineStringField\n\n\nclass TopologyForm(CommonForm):\n \"\"... | diff --git a/CHANGES b/CHANGES
index ed9dbb2e7a..1deda94e32 100644
--- a/CHANGES
+++ b/CHANGES
@@ -35,6 +35,7 @@ CHANGELOG
* Allow server host to capture pages (fixes #733)
* Adjust map capture according to geometry aspect ratio (fixes #627)
* Always show path layer in detail pages (fixes #781)
+* Fix restore of topology on loop paths (fixes #760)
0.19.1 (2013-07-15)
diff --git a/geotrek/core/forms.py b/geotrek/core/forms.py
index f429aad222..42eebfd7d7 100644
--- a/geotrek/core/forms.py
+++ b/geotrek/core/forms.py
@@ -44,7 +44,6 @@ class Meta(CommonForm.Meta):
fields = CommonForm.Meta.fields + ['topology']
MEDIA_JS = ("core/dijkstra.js",
- "core/leaflet-geomutils.js",
"core/multipath.js",
"core/topology_helper.js") + CommonForm.MEDIA_JS
diff --git a/geotrek/core/static/core/leaflet-geomutils.js b/geotrek/core/static/core/leaflet-geomutils.js
deleted file mode 100644
index aa00e31d47..0000000000
--- a/geotrek/core/static/core/leaflet-geomutils.js
+++ /dev/null
@@ -1,325 +0,0 @@
-L.GeomUtils = (function() {
- var self;
- return self = {
-
- // Calculate if a point p is between a and b
- isBetween: function(x, a, b, epsilon) {
- epsilon = epsilon || 0.5;
- var d = x.distanceTo(a) + x.distanceTo(b) - a.distanceTo(b);
- return d < epsilon;
- },
-
- // Use LatLng
- getPercentageDistanceFromPolyline: function(ll, polyline) {
- // Will test every point, considering a point is in a segment with an error of 2 meters
- return self.getPercentageDistance(ll, polyline.getLatLngs(), 5 /* in meters */, true);
- },
-
- // May be used for performance issue but you will loose precision
- getPercentageDistanceFromPolylineAsPoints: function(point, polyline) {
- return self.getPercentageDistance(point, polyline._parts[0], 5, true);
- },
-
- // You may pass latlng or point to this function
- getPercentageDistance: function(x, xs, epsilon, only_first, recurse) {
- var xs_len = 0.0
- , distance_found = false
- , closest_idx = null
- , distance = Number.MAX_VALUE;
-
- for (var i = 0; i < xs.length - 1; i++) {
- var x1 = xs[i], x2 = xs[i+1];
-
- // We iterate on each segment of the path
- if (!distance_found || !only_first) {
- if (self.isBetween(x, x1, x2, epsilon)) {
- distance_found = true;
- xdistance = xs_len + x.distanceTo(x1);
-
- if (only_first || xdistance < distance) {
- distance = xdistance;
- closest_idx = i;
- }
- }
- }
-
- xs_len += x1.distanceTo(x2);
- }
-
- if (!distance_found) {
- if (!recurse) {
- console.warn('Could not find ' + x + ' in ' + xs);
- return null;
- }
- // Try with closest point.
- var seg = L.GeomUtils.closestSegment(x, xs)
- , p = L.LineUtil.closestPointOnSegment(x, seg[0], seg[1]);
- return L.GeomUtils.getPercentageDistance(p, xs, epsilon, only_first, true);
- }
- var percent = Math.round((distance / xs_len)*10000)/10000;
- return { 'distance': percent, 'closest': closest_idx };
- },
-
- getLatLngFromPos: function(map, polyline, pos_list, equal_delta) {
- equal_delta === equal_delta === undefined ? 2 /*in meters*/ : equal_delta;
-
- // Safety check : should be ordered and 0.0 <= X <=1.0!
- $.each(pos_list, function(i, pos) {
- var prev_pos = pos[i - 1];
- var sorted = prev_pos === undefined ? true : pos > prev_pos;
- if (! (pos >= 0 && pos <= 1 && sorted)) {
- throw 'Wrong value: ' + pos_list;
- }
- });
-
- // Polyline related
- var polyline_lls = polyline.getLatLngs();
- var d_len = self.getDistances(polyline_lls)
- , polyline_len = d_len.length
- , polyline_distances = d_len.distances;
-
- // Simple situation... simple solution.
- if (pos_list.length == 1) {
- if (pos_list[0] == 0.0) return [self.cloneLatLng(polyline_lls[0])];
- if (pos_list[0] == 1.0) return [self.cloneLatLng(polyline_lls[polyline_lls.length-1])];
- }
-
- var ds = $.map(pos_list, function(pos) { return polyline_len * pos; });
-
- var res = [];
- var i;
-
- var current_distance = ds.shift()
- , current_geom = [];
-
- // If pos is 0.0, take first latlng
- if (current_distance == 0.0) {
- res.push(self.cloneLatLng(polyline_distances[0].x1));
- current_distance = ds.shift()
- }
-
- for (i = 0; i < polyline_distances.length; i++) {
- var dist = polyline_distances[i];
- var new_acc = dist.acc + dist.distance;
-
- var delta = Math.abs(current_distance - new_acc)
- var distance_equal = delta < equal_delta;
-
- if (distance_equal || current_distance < new_acc) {
- if (distance_equal) {
- // Same point
- res.push(self.cloneLatLng(dist.x2));
- } else {
- // current_distance < new_acc
- // New point
-
- var dist_from_point = current_distance - dist.acc;
- var ratio_dist = dist_from_point / dist.distance;
- var ll = self.getPointOnLine(map, ratio_dist, dist.x1, dist.x2);
-
- res.push(ll);
- }
-
- if (ds.length == 0) break;
- current_distance = ds.shift()
- }
- }
-
- if (res.length < 1) console.warn("Could not get LatLng from position " + pos_list);
- if (window.DEBUG) {
- console.log("Invert getLatLngFromPos("+ pos_list[0] + ") : " +
- JSON.stringify(self.getPercentageDistanceFromPolyline(res[0], polyline)));
- }
- return res;
- },
-
- cloneLatLng: function(latlng) {
- return new L.LatLng(latlng.lat, latlng.lng);
- },
-
- getPointOnLine: function(map, ratio_dist, ll1, ll2) {
- if (ratio_dist == 0.0) return ll1;
- if (ratio_dist == 1.0) return ll2;
- var zoom = map.getMaxZoom()
- , p1 = map.project(ll1, zoom)
- , p2 = map.project(ll2, zoom)
- , d = p1.distanceTo(p2);
-
- var x_new = p1.x + (p2.x - p1.x) * ratio_dist
- , y_new = p1.y + (p2.y - p1.y) * ratio_dist
- , ll_new = map.unproject(new L.Point(x_new, y_new), zoom);
- console.assert(!ll_new.equals(ll1) && !ll_new.equals(ll2), ratio_dist + ' got extremity (margin is ' + L.LatLng.MAX_MARGIN + ')');
- return ll_new;
- },
-
- getGradient: function(x1, y1, x2, y2) {
- var a = (y2 - y1) / (x2 - x1);
- var b = y1 - (a * x1);
- return {'a': a, 'b': b};
- },
-
- getDistances: function(xs) {
- var xs_len = 0.0, d, distances = [];
-
- for (var i = 0; i < xs.length - 1; i++) {
- var x1 = xs[i], x2 = xs[i+1];
- d = x1.distanceTo(x2);
-
- // acc: so far (without distance)
- distances.push({
- 'i1': i, 'i2': i+1,
- 'x1': x1, 'x2': x2,
- 'acc': xs_len, 'distance': d
- });
-
- xs_len += d
- }
- return {'length': xs_len, 'distances': distances};
- },
-
- // Calculate length (works for either points or latlngs)
- length: function(xs) {
- var xs_len = 0;
- for (var i = 0; i < xs.length - 1; i++) {
- xs_len += xs[i].distanceTo(xs[i+1]);
- }
- return xs_len;
- },
-
- distance: function (map, latlng1, latlng2) {
- return map.latLngToLayerPoint(latlng1).distanceTo(map.latLngToLayerPoint(latlng2));
- },
-
- distanceSegment: function (map, latlng, latlngA, latlngB) {
- var p = map.latLngToLayerPoint(latlng),
- p1 = map.latLngToLayerPoint(latlngA),
- p2 = map.latLngToLayerPoint(latlngB);
- return L.LineUtil.pointToSegmentDistance(p, p1, p2);
- },
-
- latlngOnSegment: function (map, latlng, latlngA, latlngB) {
- var maxzoom = map.getMaxZoom();
- var p = map.project(latlng, maxzoom),
- p1 = map.project(latlngA, maxzoom),
- p2 = map.project(latlngB, maxzoom);
- closest = L.LineUtil.closestPointOnSegment(p, p1, p2);
- return map.unproject(closest, maxzoom);
- },
-
- closestSegment: function (p, points) {
- var mindist = Number.MAX_VALUE
- , idx = 0;
- for (var i=0; i<points.length-1; i++) {
- var x = points[i]
- , d = p.distanceTo(x);
- if (d < mindist) {
- idx = i;
- }
- }
- return [points[idx], points[idx+1]];
- },
-
- closestOnLine: function (map, latlng, linestring) {
- return self.closestOnLatLngs(map, latlng, linestring.getLatLngs());
- },
-
- closestOnLatLngs: function (map, latlng, lls) {
- // Iterate on line segments
- var segmentmindist = Number.MAX_VALUE,
- ll = null;
- // Keep the closest point of all segments
- for (var j = 0; j < lls.length - 1; j++) {
- var p1 = lls[j],
- p2 = lls[j+1],
- d = self.distanceSegment(map, latlng, p1, p2);
- if (d < segmentmindist) {
- segmentmindist = d;
- ll = self.latlngOnSegment(map, latlng, p1, p2);
- }
- }
- return ll;
- },
-
- closest: function (map, marker, snaplist, snap_distance) {
- var mindist = Number.MAX_VALUE,
- chosen = null,
- point = null;
- var n = snaplist.length;
- // /!\ Careful with size of this list, iterated at every marker move!
- if (n>1000) console.warn("Snap list is very big : " + n + " objects!");
-
- // Iterate the whole snaplist
- for (var i = 0; i < n ; i++) {
- var object = snaplist[i],
- ll = null,
- distance = Number.MAX_VALUE;
- if (object.getLatLng) {
- // Single dimension, snap on points
- ll = object.getLatLng();
- distance = self.distance(map, marker.getLatLng(), ll);
- }
- else {
- ll = L.GeomUtils.closestOnLine(map, marker.getLatLng(), object);
- distance = L.GeomUtils.distance(map, marker.getLatLng(), ll);
- }
- // Keep the closest point of all objects
- if (distance < snap_distance && distance < mindist) {
- mindist = distance;
- chosen = object;
- point = ll;
- }
- }
- // Try to snap on line points (extremities and middle points)
- if (chosen && chosen.getLatLngs) {
- var mindist = snap_distance,
- linepoint = null;
- for (var i=0; i<chosen.getLatLngs().length; i++) {
- var lp = chosen.getLatLngs()[i],
- distance = L.GeomUtils.distance(map, point, lp);
- if (distance < mindist) {
- linepoint = lp;
- mindist = distance;
- }
- }
- if (linepoint) point = linepoint;
- }
- return [chosen, point];
- },
-
- isBefore: function (polyline, other) {
- var lls = polyline.getLatLngs(),
- ll_p = lls[lls.length - 1];
- if (!other) return false;
- var lls = other.getLatLngs()
- , ll_a = lls[0];
- return ll_p.equals(ll_a);
- },
-
- isAfter: function (polyline, other) {
- var ll_p = polyline.getLatLngs()[0];
- if (!other) return false;
- var lls = other.getLatLngs()
- , ll_b = lls[lls.length - 1];
- return ll_p.equals(ll_b);
- },
-
- isStartAtEdges: function (polyline, other) {
- /**
- * Returns true if the first point of the polyline
- * is equal to start or end of the other
- */
- var ll_p = polyline.getLatLngs()[0];
- if (!other) return false;
-
- var lls = other.getLatLngs()
- , ll_a = lls[0]
- , ll_b = lls[lls.length - 1];
-
- return ll_p.equals(ll_a) || ll_p.equals(ll_b);
- },
-
- lineReverse: function (line) {
- return L.polyline(line.getLatLngs().slice(0).reverse());
- }
- };
-})();
diff --git a/geotrek/core/static/core/multipath.js b/geotrek/core/static/core/multipath.js
index 8c28e6ed1a..49c3cc83f4 100644
--- a/geotrek/core/static/core/multipath.js
+++ b/geotrek/core/static/core/multipath.js
@@ -302,13 +302,12 @@ L.Handler.MultiPath = L.Handler.extend({
pop.toggleActivate();
- // If this was clicked, the marker should be close enought, snap it.
+ // If this was clicked, the marker should be close enough, snap it.
self.forceMarkerToLayer(marker, layer);
},
forceMarkerToLayer: function(marker, layer) {
- var self = this;
- var closest = L.GeomUtils.closestOnLine(self.map, marker.getLatLng(), layer);
+ var closest = L.GeometryUtil.closest(this.map, layer, marker.getLatLng());
marker.editing.updateClosest(marker, [layer, closest]);
},
@@ -436,7 +435,7 @@ L.Handler.MultiPath = L.Handler.extend({
*
* Each sub-topoogy is a way between markers. The first marker
* of the first sub-topology is the beginning, the last of the last is the end.
- * All others are intermediary points.
+ * All others are intermediary points (via markers)
*/
var self = this;
@@ -455,8 +454,8 @@ L.Handler.MultiPath = L.Handler.extend({
var start_layer = this.idToLayer(paths[0]);
var end_layer = this.idToLayer(paths[paths.length - 1]);
- var start_ll = L.GeomUtils.getLatLngFromPos(this.map, start_layer, [ first_pos ])[0];
- var end_ll = L.GeomUtils.getLatLngFromPos(this.map, end_layer, [ last_pos ])[0];
+ var start_ll = L.GeometryUtil.interpolateOnLine(this.map, start_layer, first_pos).latLng;
+ var end_ll = L.GeometryUtil.interpolateOnLine(this.map, end_layer, last_pos).latLng;
var state = {
start_ll: start_ll,
@@ -474,7 +473,7 @@ L.Handler.MultiPath = L.Handler.extend({
var pos2latlng = function (pos, layer) {
var used_pos = pos;
if (pos instanceof Array) {
- used_pos = pos[0];
+ used_pos = pos[1]; // Default is second position (think of last path of topology)
if (pos[0] == 0.0 && pos[1] != 1.0)
used_pos = pos[1];
if (pos[0] == 1.0 && pos[1] != 0.0)
@@ -485,11 +484,11 @@ L.Handler.MultiPath = L.Handler.extend({
used_pos = pos[0];
console.log("Chose " + used_pos + " for " + pos);
}
- var ll = L.GeomUtils.getLatLngFromPos(self.map, layer, [ used_pos ])[0];
- if (!ll) {
+ var interpolated = L.GeometryUtil.interpolateOnLine(self.map, layer, used_pos);
+ if (!interpolated) {
throw ('Could not interpolate ' + used_pos + ' on layer ' + layer.properties.pk);
}
- return ll;
+ return interpolated.latLng;
};
for (var i=0; i<topo.length; i++) {
@@ -738,7 +737,7 @@ Geotrek.PointOnPolyline = function (marker) {
// if valid
this.ll = null;
this.polyline = null;
- this.length = null;
+ this.path_length = null;
this.percent_distance = null;
this._activated = false;
@@ -753,10 +752,10 @@ Geotrek.PointOnPolyline = function (marker) {
this.ll = e.location;
this.polyline = e.object;
- this.length = L.GeomUtils.length(this.polyline.getLatLngs());
- var dd = L.GeomUtils.getPercentageDistanceFromPolyline(this.ll, this.polyline);
+ this.path_length = L.GeometryUtil.length(this.polyline);
+ var dd = L.GeometryUtil.locateOnLine(this.polyline._map, this.polyline, this.ll);
if (dd) {
- this.percent_distance = dd.distance;
+ this.percent_distance = dd;
this.events.fire('valid');
}
},
@@ -809,8 +808,8 @@ Geotrek.PointOnPolyline.prototype.addToGraph = function(graph) {
// To which nodes dist start_point/end_point corresponds ?
// The edge.nodes_id are ordered, it corresponds to polylines: coords[0] and coords[coords.length - 1]
- var dist_start_point = this.percent_distance * this.length
- , dist_end_point = (1 - this.percent_distance) * this.length
+ var dist_start_point = this.percent_distance * this.path_length
+ , dist_end_point = (1 - this.percent_distance) * this.path_length
;
var new_node_id = Geotrek.getNextId();
diff --git a/geotrek/core/tests/topology.py b/geotrek/core/tests/topology.py
index ea649a5134..68195cd576 100644
--- a/geotrek/core/tests/topology.py
+++ b/geotrek/core/tests/topology.py
@@ -424,6 +424,8 @@ def test_return_path_serialized(self):
(7, 10, 0), (5, 10, 0), (5, 0, 0),
(7.5, 0, 0)))
+
+class TopologyLoopTests(TestCase):
def test_simple_loop(self):
"""
==========
@@ -521,11 +523,14 @@ def test_spoon_loop_2(self):
(17, 5, 0), (20, 5, 0), # extra point due middle aggregation
(20, 0, 0), (16, 0, 0), (10, 0, 0), (3, 0, 0)))
- # Deserializing should work too
- topod = Topology.deserialize("""
- [{"positions":{"0":[0.3,1],"1":[0, 0.4]},"paths":[%(pk1)s,%(pk2)s]},
- {"positions":{"0":[0.4, 0.8]},"paths":[%(pk2)s]},
- {"positions":{"0":[0.8,1],"1":[1,0.3]},"paths":[%(pk2)s,%(pk1)s]}]""" % {'pk1': p1.pk, 'pk2': p2.pk})
+ # De/Serializing should work too
+ serialized = """
+ [{"kind": "TOPOLOGY","positions":{"0":[0.3,1],"1":[0, 0.4]},"paths":[%(pk1)s,%(pk2)s],"offset": 0.0},
+ {"kind": "TOPOLOGY","positions":{"0":[0.4, 0.8]},"paths":[%(pk2)s],"offset": 0.0},
+ {"kind": "TOPOLOGY","positions":{"0":[0.8,1],"1":[1,0.3]},"paths":[%(pk2)s,%(pk1)s],"offset": 0.0}]""" % {'pk1': p1.pk, 'pk2': p2.pk}
+
+ self.assertEqual(json.loads(serialized), json.loads(topo.serialize()))
+ topod = Topology.deserialize(serialized)
self.assertEqual(topo.geom, topod.geom)
self.assertEqual(len(topod.aggregations.all()), 7)
|
buildbot__buildbot-5219 | Gerrit Change Event Monitor: assertion error: codebase cannot be None
I am getting this stacktrace in the log with Buildbot 2.6.0. I do not see this problem with Buildbot 2.5.1.
2020-02-05 09:11:59-0500 [-] Unhandled Error
Traceback (most recent call last):
File "/home/buildbot/build-venv/lib/python3.6/site-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks
result = g.send(result)
File "/home/buildbot/build-venv/lib/python3.6/site-packages/buildbot/changes/gerritchangesource.py", line 180, in addChange
self.master.db.sourcestamps.findOrCreateId(**stampdict))
File "/home/buildbot/build-venv/lib/python3.6/site-packages/twisted/internet/defer.py", line 1613, in unwindGenerator
return _cancellableInlineCallbacks(gen)
File "/home/buildbot/build-venv/lib/python3.6/site-packages/twisted/internet/defer.py", line 1529, in _cancellableInlineCallbacks
_inlineCallbacks(None, g, status)
--- <exception caught here> ---
File "/home/buildbot/build-venv/lib/python3.6/site-packages/buildbot/changes/gerritchangesource.py", line 343, in outReceived
yield self.change_source.lineReceived(line)
File "/home/buildbot/build-venv/lib/python3.6/site-packages/buildbot/changes/gerritchangesource.py", line 253, in addChangeFromEvent
'properties': properties})
File "/home/buildbot/build-venv/lib/python3.6/site-packages/buildbot/changes/gerritchangesource.py", line 180, in addChange
self.master.db.sourcestamps.findOrCreateId(**stampdict))
File "/home/buildbot/build-venv/lib/python3.6/site-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks
result = g.send(result)
File "/home/buildbot/build-venv/lib/python3.6/site-packages/buildbot/db/sourcestamps.py", line 58, in findOrCreateId
assert codebase is not None, "codebase cannot be None"
builtins.AssertionError: codebase cannot be None
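The eventual fix (visible in the diff below) simply supplies a default codebase when building the sourcestamp dict. A minimal sketch of that idea, using a hypothetical `make_stampdict` helper for illustration (the real code builds this dict inline in `addChange`):

```python
def make_stampdict(chdict):
    # Hypothetical helper mirroring the fix: the sourcestamp dict must
    # always carry a non-None codebase, defaulting to the empty string,
    # so db.sourcestamps.findOrCreateId's assertion cannot fire.
    return {
        "branch": chdict.get("branch"),
        "revision": chdict.get("revision"),
        "repository": chdict.get("repository", ""),
        "project": chdict.get("project", ""),
        "codebase": chdict.get("codebase") or "",  # never None
    }

stamp = make_stampdict({"branch": "master", "revision": "abc"})
assert stamp["codebase"] is not None
```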
| [
{
"content": "# This file is part of Buildbot. Buildbot is free software: you can\n# redistribute it and/or modify it under the terms of the GNU General Public\n# License as published by the Free Software Foundation, version 2.\n#\n# This program is distributed in the hope that it will be useful, but WITHOUT\n... | [
{
"content": "# This file is part of Buildbot. Buildbot is free software: you can\n# redistribute it and/or modify it under the terms of the GNU General Public\n# License as published by the Free Software Foundation, version 2.\n#\n# This program is distributed in the hope that it will be useful, but WITHOUT\n... | diff --git a/master/buildbot/changes/gerritchangesource.py b/master/buildbot/changes/gerritchangesource.py
index fd50922571c2..c800e3f290d2 100644
--- a/master/buildbot/changes/gerritchangesource.py
+++ b/master/buildbot/changes/gerritchangesource.py
@@ -171,6 +171,7 @@ def addChange(self, chdict):
"patch_comment": chdict["comments"],
"repository": chdict["repository"],
"project": chdict["project"],
+ "codebase": '',
}
stampid, found_existing = yield(
diff --git a/master/buildbot/newsfragments/handle-default-codebase-in-gerrit.bugfix b/master/buildbot/newsfragments/handle-default-codebase-in-gerrit.bugfix
new file mode 100644
index 000000000000..6e68e22c54c4
--- /dev/null
+++ b/master/buildbot/newsfragments/handle-default-codebase-in-gerrit.bugfix
@@ -0,0 +1,2 @@
+Work around incomplete support for codebases in GerritChangeSource (:issue:`5190`). This avoids an internal assertion when the configuration file does not specify any codebases.
+
diff --git a/master/buildbot/test/fakedb/sourcestamps.py b/master/buildbot/test/fakedb/sourcestamps.py
index eb9404d03925..c5f118b3ef9f 100644
--- a/master/buildbot/test/fakedb/sourcestamps.py
+++ b/master/buildbot/test/fakedb/sourcestamps.py
@@ -98,6 +98,11 @@ def findOrCreateId(self, branch=None, revision=None, repository=None,
patch_body=None, patch_level=None,
patch_author=None, patch_comment=None,
patch_subdir=None):
+
+ assert codebase is not None, "codebase cannot be None"
+ assert project is not None, "project cannot be None"
+ assert repository is not None, "repository cannot be None"
+
if patch_body:
patchid = len(self.patches) + 1
while patchid in self.patches:
|
kubeflow__pipelines-4319 | allow output artifact store configuration (vs hard coded)
it seems like the output artifacts are always stored in a specific minio service, port, namespace, bucket, secrets, etc (`minio-service.kubeflow:9000`).
see: https://github.com/kubeflow/pipelines/blob/f40a22a3f4a8e06d20cf3e3f425b5058d5c87e0b/sdk/python/kfp/compiler/_op_to_template.py#L148
it would be great to make it flexible, e.g. allow using S3, or change namespace or bucket names.
I suggest making it configurable; I can do such a PR if we agree it's needed.
flexible pipeline service (host) path in client SDK
when creating an SDK `Client()` the path to `ml-pipeline` API service is loaded from a hard coded value (`ml-pipeline.kubeflow.svc.cluster.local:8888`) which indicate a specific k8s namespace. it can be valuable to load that default value from an env variable, i.e. changing the line in `_client.py` from:
`config.host = host if host else Client.IN_CLUSTER_DNS_NAME`
to:
`config.host = host or os.environ.get('ML_PIPELINE_DNS_NAME',Client.IN_CLUSTER_DNS_NAME)`
also note that when a user provides the `host` parameter, the IPython output points to the API server and not to the UI service (see the logic in `_get_url_prefix()`); it seems like a potential bug
if it's acceptable I can submit a PR for the line change above
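A minimal sketch of the proposed fallback, wrapped in a hypothetical `resolve_host` helper for illustration (the actual change is a one-liner in `_client.py`):

```python
import os

IN_CLUSTER_DNS_NAME = 'ml-pipeline.kubeflow.svc.cluster.local:8888'

def resolve_host(host=None, environ=os.environ):
    # Proposed precedence: explicit argument > env variable > in-cluster default
    return host or environ.get('ML_PIPELINE_DNS_NAME', IN_CLUSTER_DNS_NAME)

assert resolve_host('custom:8080', {}) == 'custom:8080'
assert resolve_host(None, {'ML_PIPELINE_DNS_NAME': 'pipeline.other:80'}) == 'pipeline.other:80'
assert resolve_host(None, {}) == IN_CLUSTER_DNS_NAME
```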
| [
{
"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicab... | [
{
"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicab... | diff --git a/sdk/python/kfp/_client.py b/sdk/python/kfp/_client.py
index 6565a273f22..fd8d056306c 100644
--- a/sdk/python/kfp/_client.py
+++ b/sdk/python/kfp/_client.py
@@ -346,6 +346,8 @@ def get_pipeline_id(self, name):
]
})
result = self._pipelines_api.list_pipelines(filter=pipeline_filter)
+ if result.pipelines is None:
+ return None
if len(result.pipelines)==1:
return result.pipelines[0].id
elif len(result.pipelines)>1:
|
holoviz__panel-1064 | outdated param dependency
it seems panel 0.8 uses `CalendarDateRange` from param. This [was introduced in param 1.9.2](https://github.com/holoviz/param/releases/tag/v1.9.2), but the param dependency is still at >=1.9.0
https://github.com/holoviz/panel/blob/master/setup.py#L93
This can lead to errors like
```
param.CalendarDateRange: DateRangeSlider,
AttributeError: module 'param' has no attribute 'CalendarDateRange'
```
when upgrading to panel 0.8.0.
Will make a simple PR to fix this
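The fix is just raising the floor of the dependency pin; a hypothetical excerpt of what the `setup.py` requirement ends up as:

```python
# Bump the minimum so pip refuses param versions that predate
# CalendarDateRange (which first appeared in param 1.9.2)
install_requires = [
    'param >=1.9.2',
]

assert 'param >=1.9.2' in install_requires
```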
| [
{
"content": "#!/usr/bin/env python\n\nimport os\nimport shutil\nimport sys\nimport json\n\nfrom setuptools import setup, find_packages\nfrom setuptools.command.develop import develop\nfrom setuptools.command.install import install\nfrom setuptools.command.sdist import sdist\n\nimport pyct.build\n\n\ndef get_se... | [
{
"content": "#!/usr/bin/env python\n\nimport os\nimport shutil\nimport sys\nimport json\n\nfrom setuptools import setup, find_packages\nfrom setuptools.command.develop import develop\nfrom setuptools.command.install import install\nfrom setuptools.command.sdist import sdist\n\nimport pyct.build\n\n\ndef get_se... | diff --git a/setup.py b/setup.py
index bc598ae72d..14848474a0 100644
--- a/setup.py
+++ b/setup.py
@@ -142,7 +142,7 @@ def run(self):
# non-python dependencies). Note that setup_requires isn't used
# because it doesn't work well with pip.
extras_require['build'] = [
- 'param >=1.9.0',
+ 'param >=1.9.2',
'pyct >=0.4.4',
'setuptools >=30.3.0',
'bokeh >=1.4.0',
|
holoviz__panel-2616 | --autoreload raises AttributeError: 'NoneType' object has no attribute 'stop'
I'm on the current Panel master. When I `panel serve 'script.py' --autoreload` this code
```python
import panel as pn
pn.extension()
import numpy as np
import holoviews as hv
from holoviews import opts, streams
from holoviews.plotting.links import DataLink
hv.extension('bokeh')
curve = hv.Curve(np.random.randn(10).cumsum()).opts(responsive=True, line_width=6)
table = hv.Table(curve).opts(editable=True)
component=pn.pane.HoloViews(table, height=500, sizing_mode="stretch_both")
pn.template.FastListTemplate(title="Table", main=[component]).servable()
```
and then change the code, I get the error
```bash
2021-08-04 06:40:44,760 Error thrown from periodic callback:
2021-08-04 06:40:44,763 Traceback (most recent call last):
File "c:\repos\private\panel_docker\panel\.venv\lib\site-packages\tornado\gen.py", line 526, in callback
result_list.append(f.result())
File "c:\repos\private\panel_docker\panel\.venv\lib\site-packages\bokeh\server\session.py", line 67, in _needs_document_lock_wrapper
result = func(self, *args, **kwargs)
File "c:\repos\private\panel_docker\panel\.venv\lib\site-packages\bokeh\server\session.py", line 195, in with_document_locked
return func(*args, **kwargs)
File "c:\repos\private\panel_docker\panel\.venv\lib\site-packages\bokeh\document\document.py", line 1212, in wrapper
return doc._with_self_as_curdoc(invoke)
File "c:\repos\private\panel_docker\panel\.venv\lib\site-packages\bokeh\document\document.py", line 1198, in _with_self_as_curdoc
return f()
File "c:\repos\private\panel_docker\panel\.venv\lib\site-packages\bokeh\document\document.py", line 1211, in invoke
return f(*args, **kwargs)
File "c:\repos\private\panel_docker\panel\panel\io\callbacks.py", line 72, in _periodic_callback
self.callback()
File "c:\repos\private\panel_docker\panel\panel\io\reload.py", line 155, in _reload_on_update
_check_file(modify_times, path)
File "c:\repos\private\panel_docker\panel\panel\io\reload.py", line 134, in _check_file
_reload(module)
File "c:\repos\private\panel_docker\panel\panel\io\reload.py", line 117, in _reload
cb.stop()
File "c:\repos\private\panel_docker\panel\panel\io\callbacks.py", line 134, in stop
self._cb.stop()
AttributeError: 'NoneType' object has no attribute 'stop'
```
I believe this would be a major issue if 0.12.1 were released before fixing this @philippjfr
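The fix turns the bare `else` in `PeriodicCallback.stop()` into an `elif self._cb:` guard. A toy sketch (not the real Panel class) showing why the guard matters when `_cb` is still `None` under autoreload:

```python
class PeriodicCallbackSketch:
    """Toy stand-in for panel.io.callbacks.PeriodicCallback.stop()."""

    def __init__(self, doc=None, cb=None):
        self._doc = doc
        self._cb = cb  # may never have been started, i.e. still None

    def stop(self):
        if self._doc:
            self._doc.remove_periodic_callback(self._cb)
        elif self._cb:  # was a bare `else`, crashing when _cb is None
            self._cb.stop()
        self._cb = None

# Previously this raised AttributeError: 'NoneType' object has no attribute 'stop'
PeriodicCallbackSketch().stop()
```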
| [
{
"content": "\"\"\"\nDefines callbacks to be executed on a thread or by scheduling it\non a running bokeh server.\n\"\"\"\nimport time\nimport param\n\nfrom bokeh.io import curdoc as _curdoc\n\nfrom ..util import edit_readonly\nfrom .state import state\n\n\nclass PeriodicCallback(param.Parameterized):\n \"\... | [
{
"content": "\"\"\"\nDefines callbacks to be executed on a thread or by scheduling it\non a running bokeh server.\n\"\"\"\nimport time\nimport param\n\nfrom bokeh.io import curdoc as _curdoc\n\nfrom ..util import edit_readonly\nfrom .state import state\n\n\nclass PeriodicCallback(param.Parameterized):\n \"\... | diff --git a/panel/io/callbacks.py b/panel/io/callbacks.py
index b6ceb263f0..0176a1f7bd 100644
--- a/panel/io/callbacks.py
+++ b/panel/io/callbacks.py
@@ -130,7 +130,7 @@ def stop(self):
self._timeout = None
if self._doc:
self._doc.remove_periodic_callback(self._cb)
- else:
+ elif self._cb:
self._cb.stop()
self._cb = None
doc = self._doc or _curdoc()
|
qtile__qtile-2254 | Qtile loggin with default config
Hi, today when I logged in to my Arch Linux with qtile, it opened with the default config. I saw another post with a similar problem, but the fix didn't work. This is the qtile log:
```
2021-02-22 13:35:55,667 WARNING libqtile lifecycle.py:_atexit():L38 Qtile will now terminate
2021-02-22 13:36:01,032 WARNING libqtile floating.py:__init__():L109 Non-config.Match objects in float_rules are deprecated
2021-02-22 13:36:01,032 ERROR libqtile confreader.py:load():L106 Could not import config file '/home/sailentk/.config/qtile/config.py'
Traceback (most recent call last):
File "/usr/lib/python3.9/site-packages/libqtile/confreader.py", line 101, in load
config = __import__(name) # noqa: F811
File "/home/sailentk/.config/qtile/config.py", line 9, in <module>
from settings.widgets import widget_defaults, extension_defaults
File "/home/sailentk/.config/qtile/settings/widgets.py", line 64, in <module>
widget.Pacman(**base(bg='color4'), update_interval=1800),
File "/usr/lib/python3.9/site-packages/libqtile/utils.py", line 226, in __getattr__
raise AttributeError
AttributeError
2021-02-22 13:36:01,033 ERROR libqtile manager.py:load_config():L107 Error while reading config file (Traceback (most recent call last):
File "/usr/lib/python3.9/site-packages/libqtile/confreader.py", line 101, in load
config = __import__(name) # noqa: F811
File "/home/sailentk/.config/qtile/config.py", line 9, in <module>
from settings.widgets import widget_defaults, extension_defaults
File "/home/sailentk/.config/qtile/settings/widgets.py", line 64, in <module>
widget.Pacman(**base(bg='color4'), update_interval=1800),
File "/usr/lib/python3.9/site-packages/libqtile/utils.py", line 226, in __getattr__
raise AttributeError
AttributeError
)
Traceback (most recent call last):
File "/usr/lib/python3.9/site-packages/libqtile/confreader.py", line 101, in load
config = __import__(name) # noqa: F811
File "/home/sailentk/.config/qtile/config.py", line 9, in <module>
from settings.widgets import widget_defaults, extension_defaults
File "/home/sailentk/.config/qtile/settings/widgets.py", line 64, in <module>
widget.Pacman(**base(bg='color4'), update_interval=1800),
File "/usr/lib/python3.9/site-packages/libqtile/utils.py", line 226, in __getattr__
raise AttributeError
AttributeError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.9/site-packages/libqtile/core/manager.py", line 104, in load_config
self.config.load()
File "/usr/lib/python3.9/site-packages/libqtile/confreader.py", line 108, in load
raise ConfigError(tb)
libqtile.confreader.ConfigError: Traceback (most recent call last):
File "/usr/lib/python3.9/site-packages/libqtile/confreader.py", line 101, in load
config = __import__(name) # noqa: F811
File "/home/sailentk/.config/qtile/config.py", line 9, in <module>
from settings.widgets import widget_defaults, extension_defaults
File "/home/sailentk/.config/qtile/settings/widgets.py", line 64, in <module>
widget.Pacman(**base(bg='color4'), update_interval=1800),
File "/usr/lib/python3.9/site-packages/libqtile/utils.py", line 226, in __getattr__
raise AttributeError
AttributeError
```
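The resolution was a config migration renaming the removed `Pacman` widget to its replacement `CheckUpdates`. A toy pure-regex illustration of that rename (the real migration in the diff below uses bowler's `select_class`/`rename`):

```python
import re

def pacman_to_checkupdates(config_text):
    # Toy illustration only: rewrite whole-word Pacman references to
    # CheckUpdates in a config source string
    return re.sub(r'\bPacman\b', 'CheckUpdates', config_text)

old = "from libqtile.widget import Pacman\nbar.Bar([Pacman()])"
new = pacman_to_checkupdates(old)
assert "CheckUpdates()" in new
assert "Pacman" not in new
```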
| [
{
"content": "# Copyright (c) 2021, Tycho Andersen. All rights reserved.\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the ... | [
{
"content": "# Copyright (c) 2021, Tycho Andersen. All rights reserved.\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the ... | diff --git a/libqtile/scripts/migrate.py b/libqtile/scripts/migrate.py
index 9d7931cbe9..98b1a2ace0 100644
--- a/libqtile/scripts/migrate.py
+++ b/libqtile/scripts/migrate.py
@@ -59,10 +59,19 @@ def threaded_poll_text_rename(config):
)
+def pacman_to_checkupdates(config):
+ return (
+ bowler.Query(config)
+ .select_class("Pacman")
+ .rename("CheckUpdates")
+ )
+
+
MIGRATIONS = [
client_name_updated,
tile_master_windows_rename,
threaded_poll_text_rename,
+ pacman_to_checkupdates,
]
diff --git a/test/test_migrate.py b/test/test_migrate.py
index f4fc4cc4fc..84f007440f 100644
--- a/test/test_migrate.py
+++ b/test/test_migrate.py
@@ -119,3 +119,21 @@ class MyWidget(ThreadPoolText):
""")
check_migrate(orig, expected)
+
+
+def test_pacman():
+ orig = textwrap.dedent("""
+ from libqtile import bar
+ from libqtile.widget import Pacman
+
+ bar.Bar([Pacman()])
+ """)
+
+ expected = textwrap.dedent("""
+ from libqtile import bar
+ from libqtile.widget import CheckUpdates
+
+ bar.Bar([CheckUpdates()])
+ """)
+
+ check_migrate(orig, expected)
|
translate__translate-3435 | multistring needs a __hash__ method
In old ttk you could do something like
``` python
foo = multistring("foo")
foodict = {foo: "bar"}
assert 'foo' in foodict
```
It seems this no longer works - not sure why, but a `__hash__` method that returns `hash(str(self))` should fix the problem, I believe
@claudep @julen any thoughts on this?
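For illustration, a toy stand-in for `multistring` (not the real translate class) showing that a `__hash__` based on `str(self)` restores dict-key lookups in both directions:

```python
class multistring(str):
    """Toy stand-in for translate.misc.multistring with the proposed fix."""

    def __new__(cls, string=""):
        if isinstance(string, list):
            if not string:
                raise ValueError("multistring must contain at least one string")
            newstring = super().__new__(cls, string[0])
            newstring.strings = list(string)
        else:
            newstring = super().__new__(cls, string)
            newstring.strings = [string]
        return newstring

    def __hash__(self):
        # Hash only the primary string, so it matches hash("foo")
        return hash(str(self))

foo = multistring(["foo", "bar"])
foodict = {foo: "baz"}
assert "foo" in foodict        # plain-str lookup finds the multistring key
assert foo in {"foo": "baz"}   # and the reverse direction works too
```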
| [
{
"content": "# -*- coding: utf-8 -*-\n#\n# Copyright 2006 Zuza Software Foundation\n#\n# This file is part of translate.\n#\n# translate is free software; you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation; either versio... | [
{
"content": "# -*- coding: utf-8 -*-\n#\n# Copyright 2006 Zuza Software Foundation\n#\n# This file is part of translate.\n#\n# translate is free software; you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation; either versio... | diff --git a/translate/misc/multistring.py b/translate/misc/multistring.py
index c32a957266..87e6a9ec79 100644
--- a/translate/misc/multistring.py
+++ b/translate/misc/multistring.py
@@ -82,7 +82,7 @@ def cmp_compat(s1, s2):
return cmp_compat(str(type(self)), str(type(otherstring)))
def __hash__(self):
- return hash(''.join(self.strings))
+ return hash(str(self))
def __ne__(self, otherstring):
return self.__cmp__(otherstring) != 0
diff --git a/translate/misc/test_multistring.py b/translate/misc/test_multistring.py
index 1ca2e431fd..31d8c1319c 100644
--- a/translate/misc/test_multistring.py
+++ b/translate/misc/test_multistring.py
@@ -97,3 +97,12 @@ def test_list_coercion(self):
assert six.text_type([t(u"tést")]) == u"[multistring(['tést'])]"
else:
assert six.text_type([t(u"tést")]) == u"[multistring([u't\\xe9st'])]"
+
+ def test_multistring_hash(self):
+ t = multistring.multistring
+ foo = t([u"foo", u"bar"])
+ foodict = {foo: "baz"}
+ assert u"foo" in foodict
+ foodict2 = {"foo": "baz"}
+ assert foo in foodict2
+ assert hash(str(foo)) == hash(foo)
|
mozilla__bugbug-200 | Use 'product' and 'component' features in the models
b7369ea8bf282941ce4b378ad5ad3c832db20668 introduced the features, but we are still not using them.
| [
{
"content": "# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nimport xgboost\nfrom imblearn.over_sampling import BorderlineSMOTE\nf... | [
{
"content": "# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nimport xgboost\nfrom imblearn.over_sampling import BorderlineSMOTE\nf... | diff --git a/bugbug/models/bug.py b/bugbug/models/bug.py
index d55345cc74..f0ee0b4b68 100644
--- a/bugbug/models/bug.py
+++ b/bugbug/models/bug.py
@@ -39,6 +39,8 @@ def __init__(self, lemmatization=False):
bug_features.blocked_bugs_number(),
bug_features.ever_affected(),
bug_features.affected_then_unaffected(),
+ bug_features.product(),
+ bug_features.component(),
]
cleanup_functions = [
|
bookwyrm-social__bookwyrm-3239 | OWASP Core Rule Set 913101
**Describe the bug**
BookWyrm's user agent is blocked by an OWASP-compliant web application firewall (WAF) for violating rule 913101. No other fediverse applications violate this rule.
**To Reproduce**
This issue is not reproducible between normal servers and clients.
**Expected behavior**
The WAF allows communication.
**Screenshots**
`python-requests/2.31.0 (BookWyrm/0.6.6; +https://bookwyrm.social/)`
```
[Thu Nov 09 04:13:56.824444 2023] [security2:error] [pid 2117:tid 140508772919040] [client 143.110.147.80:53962] [client 143.110.147.80] ModSecurity: Warning. Matched phrase "python-requests" at REQUEST_HEADERS:User-Agent. [file "/usr/apache/conf/waf/rules/REQUEST-913-SCANNER-DETECTION.conf"] [line "143"] [id "913101"] [msg "Found User-Agent associated with scripting/generic HTTP client"] [data "Matched Data: python-requests found within REQUEST_HEADERS:User-Agent: python-requests/2.31.0 (bookwyrm/0.6.6; +https://bookwyrm.social/)"] [severity "CRITICAL"] [ver "OWASP_CRS/3.3.3"] [tag "application-multi"] [tag "language-multi"] [tag "platform-multi"] [tag "attack-reputation-scripting"] [tag "OWASP_CRS"] [tag "capec/1000/118/224/541/310"] [tag "PCI/6.5.10"] [tag "paranoia-level/2"] [hostname "muri.network"] [uri "/users/Yae"] [unique_id "ZUxchFymkmHm47qNPINTzgAAAKI"]
[Thu Nov 09 04:13:56.824875 2023] [security2:error] [pid 2117:tid 140508772919040] [client 143.110.147.80:53962] [client 143.110.147.80] ModSecurity: Access denied with code 403 (phase 2). Operator GE matched 5 at TX:anomaly_score. [file "/usr/apache/conf/waf/rules/REQUEST-949-BLOCKING-EVALUATION.conf"] [line "94"] [id "949110"] [msg "Inbound Anomaly Score Exceeded (Total Score: 5)"] [severity "CRITICAL"] [ver "OWASP_CRS/3.3.3"] [tag "application-multi"] [tag "language-multi"] [tag "platform-multi"] [tag "attack-generic"] [hostname "muri.network"] [uri "/users/Yae"] [unique_id "ZUxchFymkmHm47qNPINTzgAAAKI"]
[Thu Nov 09 04:13:56.825023 2023] [security2:error] [pid 2117:tid 140508772919040] [client 143.110.147.80:53962] [client 143.110.147.80] ModSecurity: Warning. Operator GE matched 5 at TX:inbound_anomaly_score. [file "/usr/apache/conf/waf/rules/RESPONSE-980-CORRELATION.conf"] [line "92"] [id "980130"] [msg "Inbound Anomaly Score Exceeded (Total Inbound Score: 5 - SQLI=0,XSS=0,RFI=0,LFI=0,RCE=0,PHPI=0,HTTP=0,SESS=0): individual paranoia level scores: 0, 5, 0, 0"] [ver "OWASP_CRS/3.3.3"] [tag "event-correlation"] [hostname "muri.network"] [uri "/users/Yae"] [unique_id "ZUxchFymkmHm47qNPINTzgAAAKI"]
```
**Instance**
bookwyrm.social and all other servers.
**Additional context**
The Bookwyrm server security staff may allow rule 913101 for communication between Bookwyrm servers.
This becomes a problem when Bookwyrm servers send requests to other fediverse application servers.
Bookwyrm sends requests that violate 913101 to most fediverse servers, while most fediverse servers themselves do not violate 913101.
Security staff on other fediverse servers are unlikely to exempt 913101 in their WAFs, which limits Bookwyrm's federated communication.
---
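The fix drops the `python-requests/…` prefix from the agent string in favour of an app-branded one. A sketch with assumed placeholder values for `VERSION` and `DOMAIN`:

```python
VERSION = "0.6.6"            # assumed value for illustration
DOMAIN = "bookwyrm.social"   # assumed value for illustration

# App-branded agent: no "python-requests" token, so OWASP CRS rule
# 913101 (scripting/generic HTTP client detection) no longer matches
USER_AGENT = f"BookWyrm (BookWyrm/{VERSION}; +https://{DOMAIN}/)"

assert "python-requests" not in USER_AGENT
assert USER_AGENT.startswith("BookWyrm")
```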
| [
{
"content": "\"\"\" bookwyrm settings and configuration \"\"\"\nimport os\nfrom typing import AnyStr\n\nfrom environs import Env\n\n\nimport requests\nfrom django.utils.translation import gettext_lazy as _\nfrom django.core.exceptions import ImproperlyConfigured\n\n\n# pylint: disable=line-too-long\n\nenv = En... | [
{
"content": "\"\"\" bookwyrm settings and configuration \"\"\"\nimport os\nfrom typing import AnyStr\n\nfrom environs import Env\n\n\nimport requests\nfrom django.utils.translation import gettext_lazy as _\nfrom django.core.exceptions import ImproperlyConfigured\n\n\n# pylint: disable=line-too-long\n\nenv = En... | diff --git a/bookwyrm/settings.py b/bookwyrm/settings.py
index cc941da849..adc9bd0ef1 100644
--- a/bookwyrm/settings.py
+++ b/bookwyrm/settings.py
@@ -347,8 +347,7 @@
USE_TZ = True
-agent = requests.utils.default_user_agent()
-USER_AGENT = f"{agent} (BookWyrm/{VERSION}; +https://{DOMAIN}/)"
+USER_AGENT = f"BookWyrm (BookWyrm/{VERSION}; +https://{DOMAIN}/)"
# Imagekit generated thumbnails
ENABLE_THUMBNAIL_GENERATION = env.bool("ENABLE_THUMBNAIL_GENERATION", False)
|
Kinto__kinto-1342 | `kinto create-user` command should fallback to KINTO_INI env variable for the config file;
| [
{
"content": "import argparse\nimport os\nimport sys\nimport logging\nimport logging.config\n\nfrom kinto.core import scripts\nfrom kinto.plugins.accounts.scripts import create_user\nfrom pyramid.scripts import pserve\nfrom pyramid.paster import bootstrap\nfrom kinto import __version__\nfrom kinto.config import... | [
{
"content": "import argparse\nimport os\nimport sys\nimport logging\nimport logging.config\n\nfrom kinto.core import scripts\nfrom kinto.plugins.accounts.scripts import create_user\nfrom pyramid.scripts import pserve\nfrom pyramid.paster import bootstrap\nfrom kinto import __version__\nfrom kinto.config import... | diff --git a/CHANGELOG.rst b/CHANGELOG.rst
index f5f507aa2..74ec541ae 100644
--- a/CHANGELOG.rst
+++ b/CHANGELOG.rst
@@ -8,7 +8,8 @@ This document describes changes between each past release.
**Bug fixes**
-- Fix `create-user` command for PostgreSQL backend (#1340)
+- Use the ``KINTO_INI`` env variable to findout the configuration file. (#1339)
+- Fix ``create-user`` command for PostgreSQL backend (#1340)
- Make sure ``create-user`` command updates password (#1336)
diff --git a/docs/commandline.rst b/docs/commandline.rst
index 48b750acc..e56d2080c 100644
--- a/docs/commandline.rst
+++ b/docs/commandline.rst
@@ -5,8 +5,10 @@ Command Line
When Kinto is installed, a command ``kinto`` becomes available.
-It accepts a ``--ini`` parameter, whose default value is ``config/kinto.ini``,
-and a set of «sub commands» are available.
+It accepts a ``--ini`` parameter, whose default value is
+``config/kinto.ini`` or the ``KINTO_INI`` env variable if defined.
+
+A set of «sub commands» are available.
::
diff --git a/kinto/__main__.py b/kinto/__main__.py
index 8d53ed682..471742cfe 100644
--- a/kinto/__main__.py
+++ b/kinto/__main__.py
@@ -11,7 +11,7 @@
from kinto import __version__
from kinto.config import init
-DEFAULT_CONFIG_FILE = 'config/kinto.ini'
+DEFAULT_CONFIG_FILE = os.getenv('KINTO_INI', 'config/kinto.ini')
DEFAULT_PORT = 8888
DEFAULT_LOG_LEVEL = logging.INFO
DEFAULT_LOG_FORMAT = '%(levelname)-5.5s %(message)s'
|
horovod__horovod-1693 | horovodrun convenience script does not account for 'OpenRTE' in the output of mpirun --version
**Environment:**
1. Framework: (TensorFlow, PyTorch)
2. Framework version: 1.14.0
3. Horovod version: 0.16.4
4. MPI version: 3.1.4/4.0.1
5. CUDA version: 10.1
6. NCCL version: 2.4.8
7. Python version: 3.6
8. OS and version: Ubuntu, Docker
9. GCC version:5.4.0
**Checklist:**
1. Did you search issues to find if somebody asked this question before?
Yes, hasn't been specifically asked
4. Did you check if your question is answered in the [troubleshooting guide](https://github.com/horovod/horovod/blob/master/docs/troubleshooting.rst)?
**Bug report:**
1. horovodrun outputs the following, when using with Open MPI 4.0.1.
```
horovodrun -np 1 -H localhost:1 python pytorch_mnist.py
Open MPI not found in output of mpirun --version.
Traceback (most recent call last):
File "/opt/conda/bin/horovodrun", line 21, in <module>
run.run()
File "/opt/conda/lib/python3.6/site-packages/horovod/run/run.py", line 448, in run
'horovodrun convenience script currently only supports '
Exception: horovodrun convenience script currently only supports Open MPI.
Choose one of:
1. Install Open MPI 4.0.0+ and re-install Horovod (use --no-cache-dir pip option).
2. Run distributed training script using the standard way provided by your MPI distribution (usually mpirun, srun, or jsrun).
root@3da487b92c3d:/horovod/examples# mpirun --version
mpirun.real (OpenRTE) 4.0.1
Report bugs to http://www.open-mpi.org/community/help/
```
2. When Open MPI is installed as follows:
```
RUN wget https://www.open-mpi.org/software/ompi/v4.0/downloads/openmpi-$OPEN_MPI_VERSION.tar.gz \
&& gunzip -c openmpi-$OPEN_MPI_VERSION.tar.gz | tar xf - \
&& cd openmpi-$OPEN_MPI_VERSION \
&& ./configure --prefix=/home/.openmpi \
&& make all install \
&& cd .. \
&& rm openmpi-$OPEN_MPI_VERSION.tar.gz \
&& rm -rf openmpi-$OPEN_MPI_VERSION
```
3. The horovodrun check expects 'OpenMPI' to be present in the output of `mpirun --version`. [[link](https://github.com/horovod/horovod/blob/master/horovod/run/mpi_run.py)]. However, when installed as above, OpenMPI has the following in output:
```
root@3b5149353790:/horovod/examples# mpirun --version
mpirun.real (OpenRTE) 4.0.1
Report bugs to http://www.open-mpi.org/community/help/
```
4. Either Open MPI was installed incorrectly (in which case, can the Horovod documentation clarify how to install it correctly?), or the horovodrun convenience script does not account for the presence of 'OpenRTE' in the output of `mpirun --version`.
I'm unable to understand when 'OpenRTE' is visible in `mpirun --version` and when it isn't. I saw the option --enable-orterun-prefix-by-default, but I'm not using it to build Open MPI.
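A sketch of the relaxed version check the fix introduces, as a hypothetical standalone predicate (the real code lives in `_get_mpi_implementation_flags` in `horovod/run/mpi_run.py`):

```python
def is_open_mpi(version_output):
    # Accept both branding variants that mpirun --version can emit:
    # "mpirun (Open MPI) x.y.z" and "mpirun.real (OpenRTE) x.y.z"
    return 'Open MPI' in version_output or 'OpenRTE' in version_output

assert is_open_mpi('mpirun (Open MPI) 4.0.1')
assert is_open_mpi('mpirun.real (OpenRTE) 4.0.1')
assert not is_open_mpi('Intel(R) MPI Library for Linux* OS')
```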
| [
{
"content": "# Copyright 2019 Uber Technologies, Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\... | [
{
"content": "# Copyright 2019 Uber Technologies, Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\... | diff --git a/horovod/run/mpi_run.py b/horovod/run/mpi_run.py
index 9fbc55e085..18c41ca747 100644
--- a/horovod/run/mpi_run.py
+++ b/horovod/run/mpi_run.py
@@ -49,7 +49,7 @@ def _get_mpi_implementation_flags():
output.close()
if exit_code == 0:
- if 'Open MPI' in output_msg:
+ if 'Open MPI' in output_msg or 'OpenRTE' in output_msg:
return list(_OMPI_FLAGS)
elif 'IBM Spectrum MPI' in output_msg:
return list(_SMPI_FLAGS)
|
pulp__pulpcore-3381 | Export is not locking on the exported repositories
SSIA
| [
{
"content": "from django_filters.rest_framework import filters\n\nfrom drf_spectacular.utils import extend_schema\nfrom rest_framework import mixins\n\nfrom pulpcore.app.models import (\n Export,\n Exporter,\n FilesystemExport,\n FilesystemExporter,\n Publication,\n PulpExport,\n PulpExpor... | [
{
"content": "from django_filters.rest_framework import filters\n\nfrom drf_spectacular.utils import extend_schema\nfrom rest_framework import mixins\n\nfrom pulpcore.app.models import (\n Export,\n Exporter,\n FilesystemExport,\n FilesystemExporter,\n Publication,\n PulpExport,\n PulpExpor... | diff --git a/CHANGES/3370.bugfix b/CHANGES/3370.bugfix
new file mode 100644
index 0000000000..7653714719
--- /dev/null
+++ b/CHANGES/3370.bugfix
@@ -0,0 +1 @@
+Insured that pulp-export correctly locks repos-being-exported.
diff --git a/pulpcore/app/viewsets/exporter.py b/pulpcore/app/viewsets/exporter.py
index 3918874387..099722f093 100644
--- a/pulpcore/app/viewsets/exporter.py
+++ b/pulpcore/app/viewsets/exporter.py
@@ -146,6 +146,7 @@ def create(self, request, exporter_pk):
task = dispatch(
pulp_export,
exclusive_resources=[exporter],
+ shared_resources=exporter.repositories.all(),
kwargs={"exporter_pk": str(exporter.pk), "params": request.data},
)
|
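The `shared_resources` addition matters because Pulp's tasking system serializes tasks that contend for the same resources: a task holding a resource exclusively blocks any other task that uses it at all. A rough model of that conflict rule (not Pulp's actual scheduler code; dict shapes are made up for illustration):

```python
def conflicts(task_a, task_b):
    """Two tasks conflict if either holds exclusively a resource
    the other uses at all (shared or exclusive)."""
    def uses(task):
        return set(task['exclusive']) | set(task['shared'])
    return bool(set(task_a['exclusive']) & uses(task_b)
                or set(task_b['exclusive']) & uses(task_a))

# An export now shares the repos it reads, so a sync (exclusive on the
# repo) can no longer run underneath it.
export = {'exclusive': ['exporter-1'], 'shared': ['repo-1', 'repo-2']}
sync = {'exclusive': ['repo-1'], 'shared': []}
other = {'exclusive': ['repo-3'], 'shared': []}

print(conflicts(export, sync))   # True: sync rewrites repo-1 mid-export
print(conflicts(export, other))  # False: unrelated repository
```

Before the fix, `export` declared no claim on its repositories at all, so the two tasks above would never have been seen as conflicting.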
bokeh__bokeh-8730 | Delay between autoload.js and websocket request
#### ALL software version info (bokeh, python, notebook, OS, browser, any other relevant packages)
bokeh 1.0.2
python 3.6.5
OS CentOS-7.5.1804
#### Description of expected behavior and the observed behavior
For whatever reason, it appears that on some requests, there can be a significant delay between the autoload.js request and the subsequent websocket connection. Normally, this process takes no more than 1-2 seconds:
```
doc.session_context.request.arguments: {'bokeh-autoload-element': [b'1088'], 'bokeh-app-path': [b'/graphs/enviz_graphs'], 'bokeh-absolute-url': [b'https://*redacted*/graphs/enviz_graphs'], 'processor_id': [b'83,187,196,1114,206,335,536,212,214,1173,217,250,252,256,265,876,268,298,999']}
2019-01-18 22:44:45,794 root_url should end with a /, adding one
2019-01-18 22:44:45,797 200 GET /graphs/enviz_graphs/autoload.js?bokeh-autoload-element=1089&bokeh-app-path=/graphs/enviz_graphs&bokeh-absolute-url=https://*redacted*/graphs/enviz_graphs&processor_id=83%2C187%2C196%2C1114%2C206%2C335%2C536%2C212%2C214%2C1173%2C217%2C250%2C252%2C256%2C265%2C876%2C268%2C298%2C999 (10.50.1.159) 398.52ms
2019-01-18 22:44:47,291 101 GET /graphs/enviz_graphs/ws?bokeh-protocol-version=1.0&bokeh-session-id=ImqIQZ1sbiZS4KsAOocVHGFgUGfJJLwHxG44Irv9Xls9&pid=83,187,196,1114,206,335,536,212,214,1173,217,250,252,256,265,876,268,298,999 (10.50.1.159) 0.56ms
2019-01-18 22:44:47,291 WebSocket connection opened
2019-01-18 22:44:47,291 Receiver created for Protocol('1.0')
2019-01-18 22:44:47,291 ProtocolHandler created for Protocol('1.0')
2019-01-18 22:44:47,291 ServerConnection created
2019-01-18 22:44:47,350 Sending pull-doc-reply from session 'ImqIQZ1sbiZS4KsAOocVHGFgUGfJJLwHxG44Irv9Xls9'
```
Notice the autoload request at 22:44:45 and the ws request at 22:44:47. (2 seconds)
However, sometimes the ws request can arrive nearly a minute later:
```
doc.session_context.request.arguments: {'bokeh-autoload-element': [b'1090'], 'bokeh-app-path': [b'/graphs/enviz_graphs'], 'bokeh-absolute-url': [b'https://*redacted*/graphs/enviz_graphs'], 'processor_id': [b'83,187,196,1114,206,335,536,212,214,1173,217,250,252,256,265,876,268,298,300,1347,1350,1352,284,307,1115,1229,999,92,']}
2019-01-18 22:45:10,741 root_url should end with a /, adding one
2019-01-18 22:45:10,745 200 GET /graphs/enviz_graphs/autoload.js?bokeh-autoload-element=1090&bokeh-app-path=/graphs/enviz_graphs&bokeh-absolute-url=https://*redacted*/graphs/enviz_graphs&processor_id=83%2C187%2C196%2C1114%2C206%2C335%2C536%2C212%2C214%2C1173%2C217%2C250%2C252%2C256%2C265%2C876%2C268%2C298%2C300%2C1347%2C1350%2C1352%2C284%2C307%2C1115%2C1229%2C999%2C92%2C (10.50.1.159) 392.75ms
2019-01-18 22:45:35,357 Scheduling 1 sessions to discard
2019-01-18 22:45:35,357 Discarding session '1fz6E0KuyuaCaCscdyKLyI2YJze38csKNckNQkotkrE8' last in use 24616.089113235474 milliseconds ago
2019-01-18 22:45:35,358 Deleting 1 modules for <bokeh.document.document.Document object at 0x7f8bb89f8438>
2019-01-18 22:45:50,352 [pid 11775] 1 clients connected
2019-01-18 22:45:50,352 [pid 11775] /enviz_graphs has 1 sessions with 0 unused
2019-01-18 22:46:05,562 101 GET /graphs/enviz_graphs/ws?bokeh-protocol-version=1.0&bokeh-session-id=1fz6E0KuyuaCaCscdyKLyI2YJze38csKNckNQkotkrE8&pid=83,187,196,1114,206,335,536,212,214,1173,217,250,252,256,265,876,268,298,300,1347,1350,1352,284,307,1115,1229,999,92, (10.50.1.159) 0.58ms
2019-01-18 22:46:05,562 WebSocket connection opened
doc.session_context.request.arguments: {'pid': [b'83,187,196,1114,206,335,536,212,214,1173,217,250,252,256,265,876,268,298,300,1347,1350,1352,284,307,1115,1229,999,92,']}
2019-01-18 22:46:05,563 Error running application handler <bokeh.application.handlers.directory.DirectoryHandler object at 0x7f8bb8f75cf8>: local variable 'current_pids' referenced before assignment
File "env_frontend.py", line 30, in modify_doc:
if len(current_pids) < 1: Traceback (most recent call last):
File "/*redacted*/enviz/venv/lib64/python3.6/site-packages/bokeh/application/handlers/code_runner.py", line 180, in run
exec(self._code, module.__dict__)
File "/*redacted*/enviz/venv/lib/python3.6/site-packages/enviz_graphs/main.py", line 7, in <module>
modify_doc(doc)
File "/*redacted*/enviz/venv/lib/python3.6/site-packages/enviz_graphs/env_frontend.py", line 30, in modify_doc
if len(current_pids) < 1:
UnboundLocalError: local variable 'current_pids' referenced before assignment
2019-01-18 22:46:05,563 Receiver created for Protocol('1.0')
2019-01-18 22:46:05,563 ProtocolHandler created for Protocol('1.0')
2019-01-18 22:46:05,563 ServerConnection created
2019-01-18 22:46:05,631 Sending pull-doc-reply from session '1fz6E0KuyuaCaCscdyKLyI2YJze38csKNckNQkotkrE8'
```
Notice the autoload request at 22:45:10 and the ws request at 22:46:05. (55 seconds)
In that gap, it appears that the session created by the autoload request was discarded as unused at 22:45:35. (We have the default 15000 ms timeout for that.)
In both cases, the request for autoload.js takes less than 400 ms, so the slowdown seems like it would be in the browser, though I don't yet have any browser profiling that caught it.
Then, when the ws request comes in, it tries to create a new session, but fails to run our module, as the correct keys aren't in doc.session_context.request.arguments.
After this, every request to the bokeh server fails at requesting autoload until we restart the server, as it appears that doc.session_context.request.arguments is always None after that.
#### Complete, minimal, self-contained example code that reproduces the issue
N/A
#### Stack traceback and/or browser JavaScript console output
N/A
#### Screenshots or screencasts of the bug in action
N/A
| [
{
"content": "#-----------------------------------------------------------------------------\n# Copyright (c) 2012 - 2019, Anaconda, Inc., and Bokeh Contributors.\n# All rights reserved.\n#\n# The full license is in the file LICENSE.txt, distributed with this software.\n#----------------------------------------... | [
{
"content": "#-----------------------------------------------------------------------------\n# Copyright (c) 2012 - 2019, Anaconda, Inc., and Bokeh Contributors.\n# All rights reserved.\n#\n# The full license is in the file LICENSE.txt, distributed with this software.\n#----------------------------------------... | diff --git a/bokeh/application/handlers/directory.py b/bokeh/application/handlers/directory.py
index 58c050aedfc..5f79257ab06 100644
--- a/bokeh/application/handlers/directory.py
+++ b/bokeh/application/handlers/directory.py
@@ -176,7 +176,7 @@ def modify_document(self, doc):
if they are found.
'''
- if self.failed:
+ if self._lifecycle_handler.failed:
return
# Note: we do NOT copy self._theme, which assumes the Theme
# class is immutable (has no setters)
|
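The failure chain above — autoload creates a session, the browser stalls, the server discards the session, then the websocket arrives for a session that no longer exists — hinges on the unused-session sweep. A simplified model of that discard rule (the real server's knob is `--unused-session-lifetime`, default 15000 ms; the session field names here are made up):

```python
def sessions_to_discard(sessions, now_ms, unused_lifetime_ms=15000):
    """Return ids of sessions with no live connections whose last use
    is older than the unused-session lifetime."""
    return [sid for sid, s in sessions.items()
            if s['connections'] == 0
            and now_ms - s['last_used_ms'] > unused_lifetime_ms]

sessions = {
    'fast':  {'last_used_ms': 99_000, 'connections': 1},  # ws connected
    'stale': {'last_used_ms': 70_000, 'connections': 0},  # autoload only
}
print(sessions_to_discard(sessions, now_ms=100_000))  # ['stale']
```

A session that only ever served the autoload request has zero connections, so a long enough browser-side delay before the websocket request guarantees it gets swept.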
facebookresearch__ParlAI-3351 | BERT classifier doesn't work under distributed_train
The default tokenization is `re`; I think it's building the dictionary along the way...
**Logs**
Please paste the command line output:
```
ValueError: Dictionaries should be pre-built before distributed train.
ValueError: Dictionaries should be pre-built before distributed train.
```
| [
{
"content": "#!/usr/bin/env python3\n\n# Copyright (c) Facebook, Inc. and its affiliates.\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\nfrom parlai.core.dict import DictionaryAgent\nfrom parlai.zoo.bert.build import download\nfrom... | [
{
"content": "#!/usr/bin/env python3\n\n# Copyright (c) Facebook, Inc. and its affiliates.\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\nfrom parlai.core.dict import DictionaryAgent\nfrom parlai.zoo.bert.build import download\nfrom... | diff --git a/parlai/agents/bert_ranker/bert_dictionary.py b/parlai/agents/bert_ranker/bert_dictionary.py
index 1711024073f..268a12fd490 100644
--- a/parlai/agents/bert_ranker/bert_dictionary.py
+++ b/parlai/agents/bert_ranker/bert_dictionary.py
@@ -24,6 +24,9 @@ class BertDictionaryAgent(DictionaryAgent):
Allow to use the Torch Agent with the wordpiece dictionary of Hugging Face.
"""
+ def is_prebuit(self):
+ return True
+
def __init__(self, opt):
super().__init__(opt)
# initialize from vocab path
|
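The shape of the fix: distributed train refuses to build a dictionary on the fly, but BERT's wordpiece vocab ships with the pretrained model, so its dictionary agent can simply declare itself pre-built. A stripped-down model of that guard — not ParlAI's code (the real classes live in `parlai.core.dict` and `parlai.agents.bert_ranker`, and the actual PR spells the override `is_prebuit`):

```python
class DictionaryAgent:
    def is_prebuilt(self):
        # By default, a dictionary must be built before distributed train.
        return False

    def load_for_distributed_train(self):
        if not self.is_prebuilt():
            raise ValueError(
                'Dictionaries should be pre-built before distributed train.')
        return 'ok'

class BertDictionaryAgent(DictionaryAgent):
    def is_prebuilt(self):
        # The wordpiece vocab ships with the pretrained model.
        return True

print(BertDictionaryAgent().load_for_distributed_train())  # ok
```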
pre-commit__pre-commit-1259 | [FR][bug?] pre-commit hook repo self-test
Given a repo with `.pre-commit-hooks.yaml` defined (like https://github.com/ansible/ansible-lint), I want to integrate testing of the hooks declared in it.
I can do `pre-commit try-repo https://github.com/ansible/ansible-lint.git` but this hits the remote which I want to avoid. I know that Git itself can work with local fs paths (like `/path/to/.git`) perfectly fine.
So I tried:
<details>
<summary>
<code>$ <kbd>pre-commit try-repo .git -vvv</kbd></code>
</summary>
```console
➜ pre-commit try-repo .git -vvv
[WARNING] Creating temporary repo with uncommitted changes...
An unexpected error has occurred: CalledProcessError: Command: ('/usr/bin/git', 'add', '-u')
Return code: 128
Expected return code: 0
Output: (none)
Errors:
fatal: this operation must be run in a work tree
Check the log at ~/.cache/pre-commit/pre-commit.log
```
</details>
The log doesn't reveal anything more than the fact that the Git command failed.
<details>
<summary>
<code>$ <kbd>cat ~/.cache/pre-commit/pre-commit.log</kbd></code>
</summary>
```console
An unexpected error has occurred: CalledProcessError: Command: ('/usr/bin/git', 'add', '-u')
Return code: 128
Expected return code: 0
Output: (none)
Errors:
fatal: this operation must be run in a work tree
Traceback (most recent call last):
File "~/.pyenv/versions/3.7.1/lib/python3.7/site-packages/pre_commit/error_handler.py", line 46, in error_handler
yield
File "~/.pyenv/versions/3.7.1/lib/python3.7/site-packages/pre_commit/main.py", line 296, in main
return try_repo(args)
File "~/.pyenv/versions/3.7.1/lib/python3.7/site-packages/pre_commit/commands/try_repo.py", line 55, in try_repo
repo, ref = _repo_ref(tempdir, args.repo, args.ref)
File "~/.pyenv/versions/3.7.1/lib/python3.7/site-packages/pre_commit/commands/try_repo.py", line 45, in _repo_ref
cmd_output('git', 'add', '-u', cwd=repo, env=env)
File "~/.pyenv/versions/3.7.1/lib/python3.7/site-packages/pre_commit/util.py", line 153, in cmd_output
returncode, cmd, retcode, output=(stdout, stderr),
pre_commit.util.CalledProcessError: Command: ('/usr/bin/git', 'add', '-u')
Return code: 128
Expected return code: 0
Output: (none)
Errors:
fatal: this operation must be run in a work tree
```
</details>
It must be pretty easy to fix.
| [
{
"content": "from __future__ import unicode_literals\n\nimport logging\nimport os.path\nimport sys\n\nfrom pre_commit.util import cmd_output\nfrom pre_commit.util import cmd_output_b\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef zsplit(s):\n s = s.strip('\\0')\n if s:\n return s.split('\\0')\... | [
{
"content": "from __future__ import unicode_literals\n\nimport logging\nimport os.path\nimport sys\n\nfrom pre_commit.util import cmd_output\nfrom pre_commit.util import cmd_output_b\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef zsplit(s):\n s = s.strip('\\0')\n if s:\n return s.split('\\0')\... | diff --git a/pre_commit/git.py b/pre_commit/git.py
index c8faf60f7..136cefef5 100644
--- a/pre_commit/git.py
+++ b/pre_commit/git.py
@@ -141,7 +141,7 @@ def has_diff(*args, **kwargs):
repo = kwargs.pop('repo', '.')
assert not kwargs, kwargs
cmd = ('git', 'diff', '--quiet', '--no-ext-diff') + args
- return cmd_output_b(*cmd, cwd=repo, retcode=None)[0]
+ return cmd_output_b(*cmd, cwd=repo, retcode=None)[0] == 1
def has_core_hookpaths_set():
diff --git a/tests/commands/try_repo_test.py b/tests/commands/try_repo_test.py
index 536eb9bc4..1849c70a5 100644
--- a/tests/commands/try_repo_test.py
+++ b/tests/commands/try_repo_test.py
@@ -98,6 +98,15 @@ def test_try_repo_relative_path(cap_out, tempdir_factory):
assert not try_repo(try_repo_opts(relative_repo, hook='bash_hook'))
+def test_try_repo_bare_repo(cap_out, tempdir_factory):
+ repo = make_repo(tempdir_factory, 'modified_file_returns_zero_repo')
+ with cwd(git_dir(tempdir_factory)):
+ _add_test_file()
+ bare_repo = os.path.join(repo, '.git')
+ # previously crashed attempting modification changes
+ assert not try_repo(try_repo_opts(bare_repo, hook='bash_hook'))
+
+
def test_try_repo_specific_revision(cap_out, tempdir_factory):
repo = make_repo(tempdir_factory, 'script_hooks_repo')
ref = git.head_rev(repo)
|
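The fix works because `git diff --quiet` uses exit status 0 for "no differences", 1 for "differences", and other codes for errors — such as 128 for "fatal: this operation must be run in a work tree" on a bare repo. The old code returned the raw status, whose truthiness lumped 128 in with 1, so an error looked like uncommitted changes. A sketch of the corrected predicate (pre-commit's real function shells out to git; here the exit status is passed in):

```python
def has_diff(returncode):
    """Interpret the exit status of `git diff --quiet --no-ext-diff`."""
    # 0 -> trees identical, 1 -> trees differ,
    # 128 etc. -> git error (e.g. running against a bare repository)
    return returncode == 1

print(has_diff(0))    # False
print(has_diff(1))    # True
print(has_diff(128))  # False: an error is not "has changes"
```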
urllib3__urllib3-1782 | iterating a closed response improperly produces data
Consider the following script:
```
import urllib3
http = urllib3.PoolManager()
resp = http.request("GET", "https://www.python.org")
resp.close()
for d in resp:
print(repr(d))
```
With urllib3 1.25.7, this program prints `b''`. With urllib3 1.24.3, one sees:
```
Traceback (most recent call last):
File "example.py", line 6, in <module>
for d in resp:
ValueError: I/O operation on closed file.
```
The latter is in line with what I expect.
| [
{
"content": "from __future__ import absolute_import\nfrom contextlib import contextmanager\nimport zlib\nimport io\nimport logging\nfrom socket import timeout as SocketTimeout\nfrom socket import error as SocketError\n\ntry:\n import brotli\nexcept ImportError:\n brotli = None\n\nfrom ._collections impor... | [
{
"content": "from __future__ import absolute_import\nfrom contextlib import contextmanager\nimport zlib\nimport io\nimport logging\nfrom socket import timeout as SocketTimeout\nfrom socket import error as SocketError\n\ntry:\n import brotli\nexcept ImportError:\n brotli = None\n\nfrom ._collections impor... | diff --git a/src/urllib3/response.py b/src/urllib3/response.py
index adc321e713..6090a7350f 100644
--- a/src/urllib3/response.py
+++ b/src/urllib3/response.py
@@ -792,7 +792,7 @@ def geturl(self):
return self._request_url
def __iter__(self):
- buffer = [b""]
+ buffer = []
for chunk in self.stream(decode_content=True):
if b"\n" in chunk:
chunk = chunk.split(b"\n")
diff --git a/test/test_response.py b/test/test_response.py
index c6a9c3ad04..dccce8e56b 100644
--- a/test/test_response.py
+++ b/test/test_response.py
@@ -859,8 +859,9 @@ def test_geturl_retries(self):
@pytest.mark.parametrize(
["payload", "expected_stream"],
[
- (b"", [b""]),
+ (b"", []),
(b"\n", [b"\n"]),
+ (b"\n\n\n", [b"\n", b"\n", b"\n"]),
(b"abc\ndef", [b"abc\n", b"def"]),
(b"Hello\nworld\n\n\n!", [b"Hello\n", b"world\n", b"\n", b"\n", b"!"]),
],
|
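The one-character diff is easiest to see in isolation: seeding the line buffer with `[b""]` made the iterator yield a spurious `b''` even when the stream produced nothing, which is how a closed response appeared to "produce" data. A standalone version of the corrected splitter, mirroring the logic of `HTTPResponse.__iter__` but over a plain list of chunks:

```python
def iter_lines(chunks):
    """Yield newline-terminated lines from an iterable of byte chunks."""
    buffer = []  # the fix: start empty, not [b""]
    for chunk in chunks:
        if b"\n" in chunk:
            parts = chunk.split(b"\n")
            yield b"".join(buffer) + parts[0] + b"\n"
            for part in parts[1:-1]:
                yield part + b"\n"
            buffer = [parts[-1]] if parts[-1] else []
        else:
            buffer.append(chunk)
    if buffer:
        yield b"".join(buffer)

print(list(iter_lines([])))                  # [] (was [b''] with the old seed)
print(list(iter_lines([b"abc\nde", b"f"])))  # [b'abc\n', b'def']
```

With an empty stream — exactly what a closed response yields — the old `[b""]` seed survived to the final `if buffer:` check and was emitted as one empty chunk.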
openfun__richie-2306 | Frontend - Rename teacher dashboard menu entry
## Feature Request
**Is your feature request related to a problem or unsupported use case? Please describe.**
Currently, we name the dashboard dedicated to managing trainings the "teacher dashboard". After a demonstration, it appears this term is confusing, as users other than teachers have to use this dashboard (university members, course leaders...).
**Describe the solution you'd like**
We should rename this entry to something like "Administration dashboard" or "Training dashboard"?
Furthermore, we could display this entry apart from the others to explicitly show it is an extra entry not available to all users.
| [
{
"content": "\"\"\"\nDjango settings for richie project.\n\"\"\"\n\nimport json\nimport os\n\nfrom django.utils.translation import gettext_lazy as _\n\n# pylint: disable=ungrouped-imports\nimport sentry_sdk\nfrom configurations import Configuration, values\nfrom sentry_sdk.integrations.django import DjangoInte... | [
{
"content": "\"\"\"\nDjango settings for richie project.\n\"\"\"\n\nimport json\nimport os\n\nfrom django.utils.translation import gettext_lazy as _\n\n# pylint: disable=ungrouped-imports\nimport sentry_sdk\nfrom configurations import Configuration, values\nfrom sentry_sdk.integrations.django import DjangoInte... | diff --git a/sandbox/settings.py b/sandbox/settings.py
index e83e9c0fc0..98fcb07910 100644
--- a/sandbox/settings.py
+++ b/sandbox/settings.py
@@ -284,7 +284,7 @@ class Base(StyleguideMixin, DRFMixin, RichieCoursesConfigurationMixin, Configura
"href": _("/dashboard/"),
},
"dashboard_teacher": {
- "label": _("Teacher dashboard"),
+ "label": _("Course administration"),
"href": _("/dashboard/teacher"),
},
}
diff --git a/src/frontend/js/widgets/UserLogin/components/UserMenu/DesktopUserMenu.tsx b/src/frontend/js/widgets/UserLogin/components/UserMenu/DesktopUserMenu.tsx
index d8ed22e4dd..8f2c71d03e 100644
--- a/src/frontend/js/widgets/UserLogin/components/UserMenu/DesktopUserMenu.tsx
+++ b/src/frontend/js/widgets/UserLogin/components/UserMenu/DesktopUserMenu.tsx
@@ -1,6 +1,7 @@
import { FC } from 'react';
import { defineMessages, FormattedMessage } from 'react-intl';
import { useSelect } from 'downshift';
+import classNames from 'classnames';
import { location } from 'utils/indirection/window';
import { UserHelper } from 'utils/UserHelper';
import { UserMenuProps } from '.';
@@ -36,6 +37,21 @@ export const DesktopUserMenu: FC<UserMenuProps> = ({ user }) => {
},
});
+ const teacherDasbhoardUrl = user.urls.find((link) => {
+ return link.key === 'dashboard_teacher';
+ });
+ let menuLinkList;
+ if (teacherDasbhoardUrl) {
+ menuLinkList = [
+ teacherDasbhoardUrl,
+ ...user.urls.filter((link) => {
+ return link.key !== 'dashboard_teacher';
+ }),
+ ];
+ } else {
+ menuLinkList = user.urls;
+ }
+
return (
<div className="user-menu user-menu--desktop selector">
<label {...getLabelProps()} className="offscreen">
@@ -52,8 +68,14 @@ export const DesktopUserMenu: FC<UserMenuProps> = ({ user }) => {
className={`selector__list ${isOpen ? '' : 'selector__list--is-closed'}`}
>
{isOpen &&
- user.urls.map((link, index) => (
- <li key={link.key} {...getItemProps({ item: link, index })}>
+ menuLinkList.map((link, index) => (
+ <li
+ key={link.key}
+ {...getItemProps({ item: link, index })}
+ className={classNames({
+ 'selector__list__item--bordered': link.key === 'dashboard_teacher',
+ })}
+ >
{typeof link.action === 'string' ? (
<a
className={`selector__list__link ${
diff --git a/src/frontend/scss/objects/_selector.scss b/src/frontend/scss/objects/_selector.scss
index 280ba4861f..917d75e9c4 100644
--- a/src/frontend/scss/objects/_selector.scss
+++ b/src/frontend/scss/objects/_selector.scss
@@ -57,6 +57,12 @@
margin-left: calc(3rem - 12px);
}
+ &__item {
+ &--bordered:not(:last-child) {
+ border-bottom: $onepixel solid r-theme-val(topbar, item-divider-border);
+ }
+ }
+
&__link {
@include button-reset-style();
background: r-theme-val(selector, base-background);
|
ckan__ckan-5478 | routes manual reference URL in comment is broken
**CKAN version**
latest
**Describe the bug**
The url in [comment ](https://github.com/ckan/ckan/blob/0f87337fd937a15545ed761367b5d27d888e3803/ckan/config/routing.py#L6) is broken.
**Steps to reproduce**
Steps to reproduce the behavior:
Open a browser and go to "http://routes.groovie.org/docs/"

**Expected behavior**
A valid documentation reference.
| [
{
"content": "# encoding: utf-8\n\"\"\"Routes configuration\n\nThe more specific and detailed routes should be defined first so they\nmay take precedent over the more generic routes. For more information\nrefer to the routes manual at http://routes.groovie.org/docs/\n\n\"\"\"\nimport re\n\nfrom routes.mapper im... | [
{
"content": "# encoding: utf-8\n\"\"\"Routes configuration\n\nThe more specific and detailed routes should be defined first so they\nmay take precedent over the more generic routes. For more information\nrefer to the routes manual at https://routes.readthedocs.io/en/latest/\n\n\"\"\"\nimport re\n\nfrom routes.... | diff --git a/ckan/config/routing.py b/ckan/config/routing.py
index f4632a2643a..af723c55448 100644
--- a/ckan/config/routing.py
+++ b/ckan/config/routing.py
@@ -3,7 +3,7 @@
The more specific and detailed routes should be defined first so they
may take precedent over the more generic routes. For more information
-refer to the routes manual at http://routes.groovie.org/docs/
+refer to the routes manual at https://routes.readthedocs.io/en/latest/
"""
import re
|
abey79__vpype-607 | Default to QT_QPA_PLATFORM=xcb on Linux/Wayland
If we detect a linux box running on wayland, we should force Qt to use the xcb platform as the wayland backend doesn't work properly with moderngl.
This may be a good way to detect Wayland:
```
XDG_SESSION_TYPE=wayland
```
Relevant discussions:
- https://github.com/abey79/vsketch/issues/353
- https://discord.com/channels/550302843777712148/696045774970028062/1072436292798926868
| [
{
"content": "from .viewer import *\n",
"path": "vpype_viewer/qtviewer/__init__.py"
}
] | [
{
"content": "def _check_wayland():\n \"\"\"Fix QT env variable on Wayland-based systems.\n\n See https://github.com/abey79/vpype/issues/596\n \"\"\"\n import os\n import sys\n\n if sys.platform.startswith(\"linux\"):\n if os.environ.get(\"XDG_SESSION_TYPE\", \"\") == \"wayland\":\n ... | diff --git a/CHANGELOG.md b/CHANGELOG.md
index 7c88f7ee..8f30b192 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -12,6 +12,7 @@ Release date: UNRELEASED
### Bug fixes
* Fixed a design issue with the `read` command where disjoints groups of digit in layer names would be used to determine layer IDs. Only the first contiguous group of digit is used, so a layer named "01-layer1" would now have layer ID of 1 instead of 11 (#606)
+* Fixed an issue on Wayland-based Linux distribution where using the viewer (e.g. with the `show` command) would crash (#607)
### API changes
diff --git a/vpype_viewer/qtviewer/__init__.py b/vpype_viewer/qtviewer/__init__.py
index d8bfbc32..8f8f143e 100644
--- a/vpype_viewer/qtviewer/__init__.py
+++ b/vpype_viewer/qtviewer/__init__.py
@@ -1 +1,18 @@
+def _check_wayland():
+ """Fix QT env variable on Wayland-based systems.
+
+ See https://github.com/abey79/vpype/issues/596
+ """
+ import os
+ import sys
+
+ if sys.platform.startswith("linux"):
+ if os.environ.get("XDG_SESSION_TYPE", "") == "wayland":
+ if "QT_QPA_PLATFORM" not in os.environ:
+ os.environ["QT_QPA_PLATFORM"] = "xcb"
+
+
+_check_wayland()
+
+
from .viewer import *
|
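The shipped fix mutates `os.environ` at import time; for illustration, the same decision can be factored into a pure function (a hypothetical helper, not part of vpype):

```python
def qt_platform_override(platform, environ):
    """Return the QT_QPA_PLATFORM value to force, or None to leave Qt alone."""
    if not platform.startswith("linux"):
        return None
    if environ.get("XDG_SESSION_TYPE", "") != "wayland":
        return None
    if "QT_QPA_PLATFORM" in environ:
        return None  # respect an explicit user choice
    return "xcb"

print(qt_platform_override("linux", {"XDG_SESSION_TYPE": "wayland"}))   # xcb
print(qt_platform_override("darwin", {"XDG_SESSION_TYPE": "wayland"}))  # None
```

Note the guard on an existing `QT_QPA_PLATFORM`: a user who deliberately set the Wayland backend is not overridden.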
fonttools__fonttools-2274 | When parsing MVAR with lazy=True recordSize is wrong
Reproduction:
```
from fontTools import ttLib
import io
import sys
file_path = sys.argv[1]
fontdata = open(file_path, "rb").read()
font = ttLib.TTFont(io.BytesIO(fontdata), lazy=True)
mvar = font["MVAR"].table
print(mvar.ValueRecord.recordSize)
for rec in mvar.ValueRecord:
print(rec.ValueTag, "->", rec.VarIdx)
```
Running this against the latest version of recursive gives:
16
hcrn -> 65538
sbxo -> 65536
stro -> 131072
undo -> 1
xhgt -> 0
Ê -> 732
@ -> 1073741824
-> 0
-> 16384
Record size should be 8.
| [
{
"content": "from fontTools.misc.py23 import Tag, bytesjoin\nfrom .DefaultTable import DefaultTable\nimport sys\nimport array\nimport struct\nimport logging\n\nlog = logging.getLogger(__name__)\n\nclass OverflowErrorRecord(object):\n\tdef __init__(self, overflowTuple):\n\t\tself.tableType = overflowTuple[0]\n\... | [
{
"content": "from fontTools.misc.py23 import Tag, bytesjoin\nfrom .DefaultTable import DefaultTable\nimport sys\nimport array\nimport struct\nimport logging\n\nlog = logging.getLogger(__name__)\n\nclass OverflowErrorRecord(object):\n\tdef __init__(self, overflowTuple):\n\t\tself.tableType = overflowTuple[0]\n\... | diff --git a/Lib/fontTools/ttLib/tables/otBase.py b/Lib/fontTools/ttLib/tables/otBase.py
index 3c07f9e11a..24c6197006 100644
--- a/Lib/fontTools/ttLib/tables/otBase.py
+++ b/Lib/fontTools/ttLib/tables/otBase.py
@@ -571,7 +571,7 @@ def getRecordSize(cls, reader):
countValue = 1
if conv.repeat:
if conv.repeat in reader:
- countValue = reader[conv.repeat]
+ countValue = reader[conv.repeat] + conv.aux
else:
return NotImplemented
totalSize += size * countValue
diff --git a/Tests/ttLib/tables/M_V_A_R_test.py b/Tests/ttLib/tables/M_V_A_R_test.py
index 3972d8c302..a8b092e0ed 100644
--- a/Tests/ttLib/tables/M_V_A_R_test.py
+++ b/Tests/ttLib/tables/M_V_A_R_test.py
@@ -8,8 +8,8 @@
MVAR_DATA = deHexStr(
'0001 0000 ' # 0: version=1.0
'0000 0008 ' # 4: reserved=0, valueRecordSize=8
- '0007 ' # 8: valueRecordCount=7
- '0044 ' # 10: offsetToItemVariationStore=68
+ '0009 ' # 8: valueRecordCount=9
+ '0054 ' # 10: offsetToItemVariationStore=84
'6861 7363 ' # 12: ValueRecord.valueTag="hasc"
'0000 ' # 16: ValueRecord.deltaSetOuterIndex
'0003 ' # 18: ValueRecord.deltaSetInnerIndex
@@ -31,30 +31,36 @@
'7370 796F ' # 60: ValueRecord.valueTag="spyo"
'0000 ' # 64: ValueRecord.deltaSetOuterIndex
'0002 ' # 66: ValueRecord.deltaSetInnerIndex
- '0001 ' # 68: VarStore.format=1
- '0000 000C ' # 70: VarStore.offsetToVariationRegionList=12
- '0001 ' # 74: VarStore.itemVariationDataCount=1
- '0000 0016 ' # 76: VarStore.itemVariationDataOffsets[0]=22
- '0001 ' # 80: VarRegionList.axisCount=1
- '0001 ' # 82: VarRegionList.regionCount=1
- '0000 ' # 84: variationRegions[0].regionAxes[0].startCoord=0.0
- '4000 ' # 86: variationRegions[0].regionAxes[0].peakCoord=1.0
- '4000 ' # 88: variationRegions[0].regionAxes[0].endCoord=1.0
- '0004 ' # 90: VarData.ItemCount=4
- '0001 ' # 92: VarData.NumShorts=1
- '0001 ' # 94: VarData.VarRegionCount=1
- '0000 ' # 96: VarData.VarRegionIndex[0]=0
- 'FF38 ' # 98: VarData.deltaSets[0]=-200
- 'FFCE ' # 100: VarData.deltaSets[0]=-50
- '0064 ' # 102: VarData.deltaSets[0]=100
- '00C8 ' # 104: VarData.deltaSets[0]=200
+ '7465 7374 ' # 68: ValueRecord.valueTag="test"
+ '0000 ' # 72: ValueRecord.deltaSetOuterIndex
+ '0002 ' # 74: ValueRecord.deltaSetInnerIndex
+ '7465 7332 ' # 76: ValueRecord.valueTag="tes2"
+ '0000 ' # 78: ValueRecord.deltaSetOuterIndex
+ '0002 ' # 82: ValueRecord.deltaSetInnerIndex
+ '0001 ' # 84: VarStore.format=1
+ '0000 000C ' # 86: VarStore.offsetToVariationRegionList=12
+ '0001 ' # 90: VarStore.itemVariationDataCount=1
+ '0000 0016 ' # 92: VarStore.itemVariationDataOffsets[0]=22
+ '0001 ' # 96: VarRegionList.axisCount=1
+ '0001 ' # 98: VarRegionList.regionCount=1
+ '0000 ' # 100: variationRegions[0].regionAxes[0].startCoord=0.0
+ '4000 ' # 102: variationRegions[0].regionAxes[0].peakCoord=1.0
+ '4000 ' # 104: variationRegions[0].regionAxes[0].endCoord=1.0
+ '0004 ' # 106: VarData.ItemCount=4
+ '0001 ' # 108: VarData.NumShorts=1
+ '0001 ' # 110: VarData.VarRegionCount=1
+ '0000 ' # 112: VarData.VarRegionIndex[0]=0
+ 'FF38 ' # 114: VarData.deltaSets[0]=-200
+ 'FFCE ' # 116: VarData.deltaSets[0]=-50
+ '0064 ' # 118: VarData.deltaSets[0]=100
+ '00C8 ' # 120: VarData.deltaSets[0]=200
)
MVAR_XML = [
'<Version value="0x00010000"/>',
'<Reserved value="0"/>',
'<ValueRecordSize value="8"/>',
- '<!-- ValueRecordCount=7 -->',
+ '<!-- ValueRecordCount=9 -->',
'<VarStore Format="1">',
' <Format value="1"/>',
' <VarRegionList>',
@@ -108,6 +114,14 @@
' <ValueTag value="spyo"/>',
' <VarIdx value="2"/>',
'</ValueRecord>',
+ '<ValueRecord index="7">',
+ ' <ValueTag value="test"/>',
+ ' <VarIdx value="2"/>',
+ '</ValueRecord>',
+ '<ValueRecord index="8">',
+ ' <ValueTag value="tes2"/>',
+ ' <VarIdx value="2"/>',
+ '</ValueRecord>',
]
@@ -123,6 +137,13 @@ def test_decompile_toXML(self):
mvar.decompile(MVAR_DATA, font)
self.assertEqual(getXML(mvar.toXML), MVAR_XML)
+
+ def test_decompile_toXML_lazy(self):
+ mvar = newTable('MVAR')
+ font = TTFont(lazy=True)
+ mvar.decompile(MVAR_DATA, font)
+ self.assertEqual(getXML(mvar.toXML), MVAR_XML)
+
def test_compile_fromXML(self):
mvar = newTable('MVAR')
font = TTFont()
|
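The garbage tags in the reproduction output ('Ê', '@', blanks) are exactly what reading 8-byte records with a 16-byte stride looks like: every other record is skipped and the index fields land on tag bytes. A toy reproduction with plain `struct` (not fontTools' converter machinery, which is where the actual `recordSize` miscalculation lived):

```python
import struct

# Each MVAR ValueRecord is 8 bytes: a 4-byte tag plus two uint16 indices.
def parse_value_records(data, record_size):
    records = []
    for off in range(0, len(data) - 7, record_size):
        tag, outer, inner = struct.unpack_from(">4sHH", data, off)
        records.append((tag.decode("latin-1"), (outer << 16) | inner))
    return records

data = b"".join(struct.pack(">4sHH", t, 0, i)
                for t, i in [(b"hasc", 3), (b"hdsc", 3), (b"hcrn", 2), (b"sbxo", 0)])

print(parse_value_records(data, 8))   # all four records, sane tags
print(parse_value_records(data, 16))  # wrong stride: every other record lost
```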
SeldonIO__MLServer-1172 | Star imports from `mlserver.codecs` not working
For example:
```python
from mlserver.codecs import *
```
Throws an error:
```python
Traceback (most recent call last):
File "/home/janis/.conda/envs/py310/lib/python3.10/site-packages/IPython/core/interactiveshell.py", line 3460, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "<ipython-input-2-b8cc62508f29>", line 1, in <module>
from mlserver.codecs import *
AttributeError: module 'mlserver.codecs' has no attribute 'StringRequestCodec'
```
This is likely because `__all__` is out-of-date with the actual imports. I haven't tested other sub-packages, but it might be worth looking at these.
P.S. I'm not a big fan of `__all__` and star imports in particular; the main issue is that the existence of `__all__` gives rise to two public APIs which may diverge (as has happened in this case).
| [
{
"content": "from .numpy import NumpyCodec, NumpyRequestCodec\nfrom .pandas import PandasCodec\nfrom .string import StringCodec\nfrom .base64 import Base64Codec\nfrom .datetime import DatetimeCodec\nfrom .errors import CodecError\nfrom .decorator import decode_args\nfrom .base import (\n InputCodec,\n Re... | [
{
"content": "from .numpy import NumpyCodec, NumpyRequestCodec\nfrom .pandas import PandasCodec\nfrom .string import StringCodec, StringRequestCodec\nfrom .base64 import Base64Codec\nfrom .datetime import DatetimeCodec\nfrom .errors import CodecError\nfrom .decorator import decode_args\nfrom .base import (\n ... | diff --git a/mlserver/codecs/__init__.py b/mlserver/codecs/__init__.py
index 47f6a1880..99211dd32 100644
--- a/mlserver/codecs/__init__.py
+++ b/mlserver/codecs/__init__.py
@@ -1,6 +1,6 @@
from .numpy import NumpyCodec, NumpyRequestCodec
from .pandas import PandasCodec
-from .string import StringCodec
+from .string import StringCodec, StringRequestCodec
from .base64 import Base64Codec
from .datetime import DatetimeCodec
from .errors import CodecError
|
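The failure mode is reproducible without MLServer: `from module import *` validates every name listed in `__all__` against the module, so a stale entry raises exactly this `AttributeError`. A self-contained demonstration using a stand-in module (the module name here is made up):

```python
import sys
import types

# Build a stand-in module whose __all__ lists a name it does not define.
mod = types.ModuleType("demo_codecs")
mod.StringCodec = type("StringCodec", (), {})
mod.__all__ = ["StringCodec", "StringRequestCodec"]  # second entry is stale
sys.modules["demo_codecs"] = mod

ns = {}
try:
    exec("from demo_codecs import *", ns)
    error = None
except AttributeError as exc:
    error = str(exc)

print(error)  # module 'demo_codecs' has no attribute 'StringRequestCodec'
```

The fix in the PR is simply to import `StringRequestCodec` in `mlserver/codecs/__init__.py` so the module again defines everything `__all__` promises.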
saulpw__visidata-591 | [wishlist] pasting data into input(): should not react to newlines
I've come up against this a few times now where I've accidentally pasted multi-line data into the regex search input. This does two things:
1. When a newline character is pasted, it is taken as an Enter, which completes the command
2. If there are any characters after the newline, these are entered as key commands which could potentially mess up the table
Recreate:
Open a sheet
Copy some text to search for. This will include a new line character
Use / to open the input prompt
Paste the text
I often do this when I copy data from a cell using zY (the cell data may be partially hidden). This cell value may contain a newline character that I'm unaware of; upon pasting into the input field, it searches for everything up to the newline, but then enters a bunch of unintended key combinations from the following line.
| [
{
"content": "from contextlib import suppress\nimport collections\nimport curses\n\nimport visidata\n\nfrom visidata import EscapeException, ExpectedException, clipdraw, Sheet, VisiData\nfrom visidata import vd, status, error, warning, fail, options, theme, colors\nfrom visidata import launchExternalEditor, sus... | [
{
"content": "from contextlib import suppress\nimport collections\nimport curses\n\nimport visidata\n\nfrom visidata import EscapeException, ExpectedException, clipdraw, Sheet, VisiData\nfrom visidata import vd, status, error, warning, fail, options, theme, colors\nfrom visidata import launchExternalEditor, sus... | diff --git a/visidata/_input.py b/visidata/_input.py
index 677d14b4c..1f67ef44b 100644
--- a/visidata/_input.py
+++ b/visidata/_input.py
@@ -232,6 +232,13 @@ def editText(vd, y, x, w, record=True, display=True, **kwargs):
status('"%s"' % v)
if record and vd.cmdlog:
vd.setLastArgs(v)
+
+ # clear keyboard buffer upon exit from input()
+ # input() stops when it reaches an ENTER, and we do not want the expressions
+ # that follow to register as keystrokes
+ # see issue#585
+ curses.flushinp()
+
return v
|
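The `curses.flushinp()` call in the fix above discards any keystrokes still queued when `input()` returns. A toy model of the buffer behavior in plain Python, no curses involved (`edit_text` and the buffer shape are illustrative, not visidata's API):

```python
def edit_text(keystroke_buffer):
    """Consume keys until ENTER, like a line editor reading pasted input."""
    value = []
    while keystroke_buffer:
        key = keystroke_buffer.pop(0)
        if key == "\n":
            break  # ENTER completes the command...
        value.append(key)
    # ...so flush whatever the paste left behind (the curses.flushinp()
    # analogue); otherwise the leftover characters replay as key commands
    keystroke_buffer.clear()
    return "".join(value)

buf = list("search term\nstray keys")
print(edit_text(buf))  # only the first line becomes the input value
```

Without the `clear()`, `"stray keys"` would remain in the buffer and be interpreted as commands, which is exactly the reported symptom.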
ivy-llc__ivy-13703 | ptp
| [
{
"content": "# local\n\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes\nfrom ivy.functional.frontends.jax.func_wrapper import (\n to_ivy_arrays_and_back,\n handle_jax_dtype,\n)\nfrom ivy.functional.frontends.jax.numpy import promote_types_of_jax_inputs\n\n\n@to_ivy_arrays_and_back\ndef ... | [
{
"content": "# local\n\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes\nfrom ivy.functional.frontends.jax.func_wrapper import (\n to_ivy_arrays_and_back,\n handle_jax_dtype,\n)\nfrom ivy.functional.frontends.jax.numpy import promote_types_of_jax_inputs\n\n\n@to_ivy_arrays_and_back\ndef ... | diff --git a/ivy/functional/frontends/jax/numpy/statistical.py b/ivy/functional/frontends/jax/numpy/statistical.py
index 0a4778ab4d3b6..3fa96b1baf311 100644
--- a/ivy/functional/frontends/jax/numpy/statistical.py
+++ b/ivy/functional/frontends/jax/numpy/statistical.py
@@ -420,3 +420,10 @@ def std(a, axis=None, dtype=None, out=None, ddof=0, keepdims=False, *, where=Non
@to_ivy_arrays_and_back
def corrcoef(x, y=None, rowvar=True):
return ivy.corrcoef(x, y=y, rowvar=rowvar)
+
+
+@to_ivy_arrays_and_back
+def ptp(a, axis=None, out=None, keepdims=False):
+ x = ivy.max(a, axis=axis, keepdims=keepdims)
+ y = ivy.min(a, axis=axis, keepdims=keepdims)
+ return ivy.subtract(x, y)
diff --git a/ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_statistical.py b/ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_statistical.py
index b79dd658548f2..d627ff4fe382a 100644
--- a/ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_statistical.py
+++ b/ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_statistical.py
@@ -847,3 +847,31 @@ def test_jax_numpy_corrcoef(
y=x[1],
rowvar=rowvar,
)
+
+
+# ptp
+@handle_frontend_test(
+ fn_tree="jax.numpy.ptp",
+ dtype_and_x_axis_dtype=_get_castable_dtypes_values(allow_nan=False),
+ keep_dims=st.booleans(),
+)
+def test_jax_numpy_ptp(
+ dtype_and_x_axis_dtype,
+ frontend,
+ test_flags,
+ fn_tree,
+ on_device,
+ keep_dims,
+):
+ input_dtypes, x, axis, dtype = dtype_and_x_axis_dtype
+ np_frontend_helpers.test_frontend_function(
+ input_dtypes=input_dtypes,
+ frontend=frontend,
+ test_flags=test_flags,
+ fn_tree=fn_tree,
+ on_device=on_device,
+ a=x[0],
+ axis=axis,
+ out=None,
+ keepdims=keep_dims
+ )
|
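Stripped of the axis/keepdims machinery, `ptp` ("peak to peak") as implemented in the diff above is just the maximum minus the minimum. A pure-Python sketch for a flat sequence:

```python
def ptp(values):
    """Peak-to-peak range: spread between the largest and smallest value."""
    return max(values) - min(values)

print(ptp([2, 7, 4, 9, 1]))  # prints 8, i.e. 9 - 1
```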
scverse__scanpy-2248 | read_10x_h5() `genome` argument appears recently broken for 10x v2 format
- [x] I have checked that this issue has not already been reported.
- [x] I have confirmed this bug exists on the latest version of scanpy.
- [x] (optional) I have confirmed this bug exists on the master branch of scanpy.
---
To reproduce this issue:
1. download the public 10x dataset here (https://cf.10xgenomics.com/samples/cell-exp/2.1.0/hgmm_12k/hgmm_12k_raw_gene_bc_matrices_h5.h5)
2. run the following
```python
import scanpy as sc
adata_human = sc.read_10x_h5('hgmm_12k_raw_gene_bc_matrices_h5.h5', genome='hg19')
adata_mouse = sc.read_10x_h5('hgmm_12k_raw_gene_bc_matrices_h5.h5', genome='mm10')
assert (adata_human.X != adata_mouse.X).sum() > 0, 'these count matrices are equal'
```
which produces the assertion error. We see that the loaded data is the same regardless of `'genome'` argument. A look at the file itself shows this is not the case (notice the number of gene names, which are different for hg19 and mm10):

#### Versions
Also I think I can say confidently that this was working fine as of scanpy 1.8.1
<details>
-----
anndata 0.8.0
scanpy 1.9.1
-----
PIL 8.1.0
appnope 0.1.2
backcall 0.2.0
cached_property 1.5.2
cellbender NA
cffi 1.14.5
colorcet 3.0.0
cycler 0.10.0
cython_runtime NA
dateutil 2.8.1
decorator 5.0.9
fontTools 4.33.3
h5py 3.2.0
igraph 0.9.10
ipykernel 5.5.5
ipython_genutils 0.2.0
ipywidgets 7.6.3
jedi 0.18.0
joblib 1.0.1
kiwisolver 1.3.1
leidenalg 0.8.10
llvmlite 0.38.0
lxml 4.8.0
matplotlib 3.5.1
matplotlib_inline NA
mkl 2.3.0
mpl_toolkits NA
natsort 7.1.1
numba 0.55.1
numexpr 2.7.3
numpy 1.19.2
packaging 20.9
pandas 1.2.3
param 1.12.1
parso 0.8.2
pexpect 4.8.0
pickleshare 0.7.5
pkg_resources NA
prompt_toolkit 3.0.18
psutil 5.8.0
ptyprocess 0.7.0
pycparser 2.20
pygments 2.8.0
pynndescent 0.5.6
pyparsing 2.4.7
pytz 2021.1
scipy 1.6.1
seaborn 0.11.2
session_info 1.0.0
six 1.15.0
sklearn 0.24.1
skmisc 0.1.4
sphinxcontrib NA
statsmodels 0.12.2
storemagic NA
tables 3.6.1
texttable 1.6.4
tornado 6.1
tqdm 4.55.1
traitlets 5.0.5
typing_extensions NA
umap 0.5.3
wcwidth 0.2.5
yaml 6.0
zipp NA
zmq 22.0.3
-----
IPython 7.23.1
jupyter_client 6.1.12
jupyter_core 4.7.1
notebook 6.4.0
-----
Python 3.7.9 (default, Aug 31 2020, 07:22:35) [Clang 10.0.0 ]
Darwin-20.6.0-x86_64-i386-64bit
-----
</details>
| [
{
"content": "\"\"\"Reading and Writing\n\"\"\"\nfrom pathlib import Path, PurePath\nfrom typing import Union, Dict, Optional, Tuple, BinaryIO\n\nimport h5py\nimport json\nimport numpy as np\nimport pandas as pd\nfrom matplotlib.image import imread\nimport anndata\nfrom anndata import (\n AnnData,\n read_... | [
{
"content": "\"\"\"Reading and Writing\n\"\"\"\nfrom pathlib import Path, PurePath\nfrom typing import Union, Dict, Optional, Tuple, BinaryIO\n\nimport h5py\nimport json\nimport numpy as np\nimport pandas as pd\nfrom matplotlib.image import imread\nimport anndata\nfrom anndata import (\n AnnData,\n read_... | diff --git a/scanpy/readwrite.py b/scanpy/readwrite.py
index f2aef41297..5ff556ddc5 100644
--- a/scanpy/readwrite.py
+++ b/scanpy/readwrite.py
@@ -219,7 +219,7 @@ def _read_legacy_10x_h5(filename, *, genome=None, start=None):
)
dsets = {}
- _collect_datasets(dsets, f)
+ _collect_datasets(dsets, f[genome])
# AnnData works with csr matrices
# 10x stores the transposed data, so we do the transposition right away
diff --git a/scanpy/tests/_data/10x_data/1.2.0/multiple_genomes.h5 b/scanpy/tests/_data/10x_data/1.2.0/multiple_genomes.h5
new file mode 100644
index 0000000000..3d04d4e909
Binary files /dev/null and b/scanpy/tests/_data/10x_data/1.2.0/multiple_genomes.h5 differ
diff --git a/scanpy/tests/test_read_10x.py b/scanpy/tests/test_read_10x.py
index e9cf188a4b..0f4373334a 100644
--- a/scanpy/tests/test_read_10x.py
+++ b/scanpy/tests/test_read_10x.py
@@ -73,6 +73,23 @@ def test_read_10x_h5_v1():
assert_anndata_equal(spec_genome_v1, nospec_genome_v1)
+def test_read_10x_h5_v2_multiple_genomes():
+ genome1_v1 = sc.read_10x_h5(
+ ROOT / '1.2.0' / 'multiple_genomes.h5',
+ genome='hg19_chr21',
+ )
+ genome2_v1 = sc.read_10x_h5(
+ ROOT / '1.2.0' / 'multiple_genomes.h5',
+ genome='another_genome',
+ )
+ # the test data are such that X is the same shape for both "genomes",
+ # but the values are different
+ assert (genome1_v1.X != genome2_v1.X).sum() > 0, (
+ 'loading data from two different genomes in 10x v2 format. '
+ 'should be different, but is the same. '
+ )
+
+
def test_read_10x_h5():
spec_genome_v3 = sc.read_10x_h5(
ROOT / '3.0.0' / 'filtered_feature_bc_matrix.h5',
|
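The one-line fix above restricts the dataset walk to the requested genome's group instead of the whole file. A sketch of the failure mode with plain nested dicts standing in for the HDF5 hierarchy (`collect_datasets` mimics scanpy's internal `_collect_datasets`; the dict layout is illustrative):

```python
def collect_datasets(dsets, group):
    # walk the hierarchy, recording every leaf dataset under its name
    for key, value in group.items():
        if isinstance(value, dict):
            collect_datasets(dsets, value)
        else:
            dsets[key] = value

f = {"hg19": {"data": [1, 2]}, "mm10": {"data": [3, 4]}}

buggy = {}
collect_datasets(buggy, f)          # walks both genomes; the last one wins
fixed = {}
collect_datasets(fixed, f["hg19"])  # the fix: walk only the chosen group

print(buggy["data"], fixed["data"])  # prints [3, 4] [1, 2]
```

Because both genome groups contain identically named datasets, walking the whole file silently overwrites one genome's arrays with the other's, which is why both `genome=` values returned the same matrix.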
paperless-ngx__paperless-ngx-6474 | [BUG] E-Mail-Filter matching on non-Emails
### Description
I am using an Email-Rule to receive invoices. Two different rules for two different users.
To assign them to a user, I add a workflow that uses the E-Mail-Rule-Filter and then assigns the owner and the storage location.
For whatever reason this is applied to every scanned document (which is consumed via the consumption folder).
The Mail Rule:

The Workflow:

### Steps to reproduce
1. Add a mail rule to collect a document from mail
2. Add a workflow which filters on the mail rule from step 1.
3. Add a document to the consumption folder.
### Webserver logs
```bash
...
[2024-04-20 11:05:11,431] [INFO] [paperless.management.consumer] Adding /usr/src/paperless/consume/20240420_100319.pdf to the task queue.
[2024-04-20 11:05:12,235] [DEBUG] [paperless.tasks] Skipping plugin CollatePlugin
[2024-04-20 11:05:12,236] [DEBUG] [paperless.tasks] Skipping plugin BarcodePlugin
[2024-04-20 11:05:12,236] [DEBUG] [paperless.tasks] Executing plugin WorkflowTriggerPlugin
[2024-04-20 11:05:13,475] [INFO] [paperless.matching] Document matched WorkflowTrigger 6 from Workflow: Pauschal zu Finkman
[2024-04-20 11:05:13,678] [INFO] [paperless.matching] Document did not match Workflow: Tag Kontoauszug
[2024-04-20 11:05:13,678] [DEBUG] [paperless.matching] ('Document path /usr/src/paperless/consume/20240420_100319.pdf does not match Sven/Kontoauszug/*',)
[2024-04-20 11:05:13,686] [INFO] [paperless.matching] Document did not match Workflow: Lohnabrechnung
[2024-04-20 11:05:13,686] [DEBUG] [paperless.matching] ('Document path /usr/src/paperless/consume/20240420_100319.pdf does not match Sven/Lohnabrechnung/*',)
[2024-04-20 11:05:13,693] [INFO] [paperless.matching] Document did not match Workflow: Karo Import
[2024-04-20 11:05:13,694] [DEBUG] [paperless.matching] ('Document path /usr/src/paperless/consume/20240420_100319.pdf does not match */Karo/*',)
[2024-04-20 11:05:13,698] [INFO] [paperless.matching] Document did not match Workflow: Kinder zu alle
[2024-04-20 11:05:13,699] [DEBUG] [paperless.matching] No matching triggers with type 1 found
[2024-04-20 11:05:13,703] [INFO] [paperless.matching] Document did not match Workflow: Landstuhl zu ImmoGbr
[2024-04-20 11:05:13,704] [DEBUG] [paperless.matching] No matching triggers with type 1 found
[2024-04-20 11:05:13,710] [INFO] [paperless.matching] Document matched WorkflowTrigger 7 from Workflow: Karo Email Import zu Karo Speicherordner
[2024-04-20 11:05:13,730] [INFO] [paperless.tasks] WorkflowTriggerPlugin completed with: Applying WorkflowAction 6 from Workflow: Pauschal zu Finkman
Applying WorkflowAction 7 from Workflow: Karo Email Import zu Karo Speicherordner
[2024-04-20 11:05:14,030] [INFO] [paperless.consumer] Consuming 20240420_100319.pdf
...
```
### Browser logs
_No response_
### Paperless-ngx version
2.7.2
### Host OS
Synology
### Installation method
Docker - official image
### Browser
Safari
### Configuration changes
_No response_
### Other
_No response_
### Please confirm the following
- [X] I believe this issue is a bug that affects all users of Paperless-ngx, not something specific to my installation.
- [X] I have already searched for relevant existing issues and discussions before opening this report.
- [X] I have updated the title field above with a concise description.
| [
{
"content": "import logging\nimport re\nfrom fnmatch import fnmatch\nfrom typing import Union\n\nfrom documents.classifier import DocumentClassifier\nfrom documents.data_models import ConsumableDocument\nfrom documents.data_models import DocumentSource\nfrom documents.models import Correspondent\nfrom document... | [
{
"content": "import logging\nimport re\nfrom fnmatch import fnmatch\nfrom typing import Union\n\nfrom documents.classifier import DocumentClassifier\nfrom documents.data_models import ConsumableDocument\nfrom documents.data_models import DocumentSource\nfrom documents.models import Correspondent\nfrom document... | diff --git a/src/documents/matching.py b/src/documents/matching.py
index 6ffa1b3aac8..586ca3a6a6e 100644
--- a/src/documents/matching.py
+++ b/src/documents/matching.py
@@ -269,8 +269,7 @@ def consumable_document_matches_workflow(
# Document mail rule vs trigger mail rule
if (
- document.mailrule_id is not None
- and trigger.filter_mailrule is not None
+ trigger.filter_mailrule is not None
and document.mailrule_id != trigger.filter_mailrule.pk
):
reason = (
|
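The two-line diff above changes the predicate so that a document with no mail rule at all (`mailrule_id is None`, i.e. a consume-folder document) now fails a mail-rule filter instead of silently passing it. The logic, isolated as plain functions (names are illustrative):

```python
def fails_filter_buggy(doc_mailrule_id, trigger_mailrule_pk):
    # old condition: a document without a mail rule can never fail the filter
    return (doc_mailrule_id is not None
            and trigger_mailrule_pk is not None
            and doc_mailrule_id != trigger_mailrule_pk)

def fails_filter_fixed(doc_mailrule_id, trigger_mailrule_pk):
    # new condition: any mismatch fails, including "no mail rule at all"
    return (trigger_mailrule_pk is not None
            and doc_mailrule_id != trigger_mailrule_pk)

# consume-folder document (no mail rule) against a trigger filtering rule 7
print(fails_filter_buggy(None, 7), fails_filter_fixed(None, 7))  # False True
```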
mitmproxy__mitmproxy-2067 | brotli encode/decode crash
##### Steps to reproduce the problem:
1. load google.com in browser
2. press `enter` on `GET https://www.google.com/ HTTP/2.0`
3. press `z` to select encoding in either `Request` or `Response`
4. press `b` to select brotli
##### Any other comments? What have you tried so far?
```
Traceback (most recent call last):
File "/home/whackashoe/code/mitmproxy/mitmproxy/tools/console/master.py", line 281, in run
self.loop.run()
File "/home/whackashoe/code/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py", line 278, in run
self._run()
File "/home/whackashoe/code/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py", line 376, in _run
self.event_loop.run()
File "/home/whackashoe/code/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py", line 682, in run
self._loop()
File "/home/whackashoe/code/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py", line 719, in _loop
self._watch_files[fd]()
File "/home/whackashoe/code/mitmproxy/venv/lib/python3.5/site-packages/urwid/raw_display.py", line 393, in <lambda>
event_loop, callback, self.get_available_raw_input())
File "/home/whackashoe/code/mitmproxy/venv/lib/python3.5/site-packages/urwid/raw_display.py", line 493, in parse_input
callback(processed, processed_codes)
File "/home/whackashoe/code/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py", line 403, in _update
self.process_input(keys)
File "/home/whackashoe/code/mitmproxy/venv/lib/python3.5/site-packages/urwid/main_loop.py", line 503, in process_input
k = self._topmost_widget.keypress(self.screen_size, k)
File "/home/whackashoe/code/mitmproxy/mitmproxy/tools/console/window.py", line 84, in keypress
k = super().keypress(size, k)
File "/home/whackashoe/code/mitmproxy/venv/lib/python3.5/site-packages/urwid/container.py", line 1116, in keypress
return self.footer.keypress((maxcol,),key)
File "/home/whackashoe/code/mitmproxy/mitmproxy/tools/console/statusbar.py", line 155, in keypress
return self.master.ab.keypress(*args, **kwargs)
File "/home/whackashoe/code/mitmproxy/mitmproxy/tools/console/statusbar.py", line 108, in keypress
self.prompt_execute(k)
File "/home/whackashoe/code/mitmproxy/mitmproxy/tools/console/statusbar.py", line 133, in prompt_execute
msg = p(txt)
File "/home/whackashoe/code/mitmproxy/mitmproxy/tools/console/statusbar.py", line 31, in __call__
return self.callback(txt, *self.args)
File "/home/whackashoe/code/mitmproxy/mitmproxy/tools/console/flowview.py", line 686, in encode_callback
conn.encode(encoding_map[key])
File "/home/whackashoe/code/mitmproxy/mitmproxy/net/http/message.py", line 245, in encode
raise ValueError("Invalid content encoding {}".format(repr(e)))
ValueError: Invalid content encoding 'brotli'
```
Here is a patch which suppresses the error:
```
diff --git a/mitmproxy/tools/console/flowview.py b/mitmproxy/tools/console/flowview.py
index a97a9b3..650ef42 100644
--- a/mitmproxy/tools/console/flowview.py
+++ b/mitmproxy/tools/console/flowview.py
@@ -683,5 +683,9 @@ class FlowView(tabs.Tabs):
"d": "deflate",
"b": "brotli",
}
- conn.encode(encoding_map[key])
+ try:
+ conn.encode(encoding_map[key])
+ except ValueError:
+ pass
+
signals.flow_change.send(self, flow = self.flow)
```
##### System information
```
$ mitmproxy --version
Mitmproxy version: 3.0.0 (2.0.0dev0020-0x2aecffd)
Python version: 3.5.0
Platform: Linux-3.13.0-107-generic-x86_64-with-Ubuntu-14.04-trusty
SSL version: OpenSSL 1.0.2k 26 Jan 2017
Linux distro: Ubuntu 14.04 trusty
```
| [
{
"content": "import math\nimport os\nimport sys\nfrom functools import lru_cache\nfrom typing import Optional, Union # noqa\n\nimport urwid\n\nfrom mitmproxy import contentviews\nfrom mitmproxy import exceptions\nfrom mitmproxy import export\nfrom mitmproxy import http\nfrom mitmproxy.net.http import Headers\... | [
{
"content": "import math\nimport os\nimport sys\nfrom functools import lru_cache\nfrom typing import Optional, Union # noqa\n\nimport urwid\n\nfrom mitmproxy import contentviews\nfrom mitmproxy import exceptions\nfrom mitmproxy import export\nfrom mitmproxy import http\nfrom mitmproxy.net.http import Headers\... | diff --git a/mitmproxy/tools/console/flowview.py b/mitmproxy/tools/console/flowview.py
index a97a9b3156..90cca1c5ac 100644
--- a/mitmproxy/tools/console/flowview.py
+++ b/mitmproxy/tools/console/flowview.py
@@ -681,7 +681,7 @@ def encode_callback(self, key, conn):
encoding_map = {
"z": "gzip",
"d": "deflate",
- "b": "brotli",
+ "b": "br",
}
conn.encode(encoding_map[key])
signals.flow_change.send(self, flow = self.flow)
|
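The root cause is the token name: the registered HTTP `Content-Encoding` value for Brotli is `br`, not `brotli`, so `Message.encode` rejected the latter. A minimal stub showing the behavior (the `Conn` class here is a stand-in, not mitmproxy's API):

```python
VALID_ENCODINGS = {"identity", "gzip", "deflate", "br"}

class Conn:
    """Stand-in for a mitmproxy Request/Response with an encode() method."""
    def __init__(self):
        self.encoding = "identity"

    def encode(self, name):
        # mirrors the rejection of unknown content-coding tokens
        if name not in VALID_ENCODINGS:
            raise ValueError(f"Invalid content encoding {name!r}")
        self.encoding = name

encoding_map = {"z": "gzip", "d": "deflate", "b": "br"}  # "br", not "brotli"
conn = Conn()
conn.encode(encoding_map["b"])
print(conn.encoding)  # prints br
```

Fixing the token is preferable to the try/except workaround in the report, since swallowing the `ValueError` would leave the `b` key doing nothing at all.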
getnikola__nikola-3437 | The post_list plugin prevents 'else' functionality in templates
<!--
Before creating an issue:
* make sure you are using an up-to-date version of Nikola
* search for existing issues that might be related
Make sure to:
* provide information about your environment (below)
* include all the output you get, and any other information related to your problem
Nikola v7.6.4, as provided by Ubuntu, is NOT SUPPORTED.
If you are using this version, you should upgrade: https://getnikola.com/getting-started.html
-->
### Environment
**Python Version:**
3.7.8
**Nikola Version:**
8.1.1
**Operating System:**
Mac OS Catalina (10.15.5) / Ubuntu 19.10
### Description:
In the default template for the `post-list` plugin, namely `post_list_directive.tmpl`
```jinja
{% if posts %}
<ul class="post-list">
...
```
Which suggests that there is some possibility that the template will be called with no posts.
While in `list_post.tmpl`, which you can also use with `post-list`, we have this:
```jinja
{% if posts %}
<ul class="postlist">
{% for post in posts %}
<li><time class="listdate" datetime="{{ post.formatted_date('webiso') }}" title="{{ post.formatted_date(date_format)|e }}">{{ post.formatted_date(date_format)|e }}</time> <a href="{{ post.permalink() }}" class="listtitle">{{ post.title()|e }}</a></li>
{% endfor %}
</ul>
{% else %}
<p>{{ messages("No posts found.") }}</p>
{% endif %}
```
Which is obviously expected to be able to handle the situation when there are no posts.
However, when the plugin returns no posts, the `else` block is not executed. In fact, it appears that the template is not called at all when no posts are returned.
This is because of these lines in `post_list.py`, at around lines 221-222:
```python
if not posts:
return '', []
```
It seems that because the empty values are returned, processing is not passed to the template. Removing those lines fixes the problem and allows the template's `else` clause to work.
I can't see that this change breaks anything else, so I'll submit a pull request for it, unless someone has an objection.
| [
{
"content": "# -*- coding: utf-8 -*-\n\n# Copyright © 2013-2020 Udo Spallek, Roberto Alsina and others.\n\n# Permission is hereby granted, free of charge, to any\n# person obtaining a copy of this software and associated\n# documentation files (the \"Software\"), to deal in the\n# Software without restriction,... | [
{
"content": "# -*- coding: utf-8 -*-\n\n# Copyright © 2013-2020 Udo Spallek, Roberto Alsina and others.\n\n# Permission is hereby granted, free of charge, to any\n# person obtaining a copy of this software and associated\n# documentation files (the \"Software\"), to deal in the\n# Software without restriction,... | diff --git a/CHANGES.txt b/CHANGES.txt
index 6d599abaa6..dd742c785e 100644
--- a/CHANGES.txt
+++ b/CHANGES.txt
@@ -9,6 +9,8 @@ Features
Bugfixes
--------
+* Allow else clause in post-list plugin. (Issue #3436)
+
New in v8.1.1
=============
diff --git a/nikola/plugins/shortcode/post_list.py b/nikola/plugins/shortcode/post_list.py
index b71e523626..462984a576 100644
--- a/nikola/plugins/shortcode/post_list.py
+++ b/nikola/plugins/shortcode/post_list.py
@@ -218,9 +218,6 @@ def handler(self, start=None, stop=None, reverse=False, tags=None, require_all_t
posts += [post]
- if not posts:
- return '', []
-
template_deps = site.template_system.template_deps(template)
if state:
# Register template as a dependency (Issue #2391)
|
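Removing the early `return '', []` means the template is always invoked, even with an empty post list, so its `{% else %}` branch can render the "No posts found." message. A minimal stand-in for the handler/template pair (pure Python instead of Jinja; names are illustrative):

```python
def render_template(posts):
    # stand-in for list_post.tmpl: an {% if posts %} ... {% else %} block
    if posts:
        return "<ul>" + "".join(f"<li>{p}</li>" for p in posts) + "</ul>"
    return "<p>No posts found.</p>"

def handler_buggy(posts):
    if not posts:
        return ""              # early return: the template never runs
    return render_template(posts)

def handler_fixed(posts):
    return render_template(posts)  # template decides; else branch included

print(handler_buggy([]))   # prints an empty string
print(handler_fixed([]))   # prints <p>No posts found.</p>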
getredash__redash-464 | Error running query: datetime.time(13, 52, 27) is not JSON serializable
My table schema:
``` sql
CREATE TABLE F_entrances (
id SERIAL PRIMARY KEY,
timeOfEntrance time,
customerId int REFERENCES D_customers
);
```
(and yes, I committed the horrible sin of camel_case vs underScore. I'll be fixing that soonish)
The query
``` sql
SELECT
timeofentrance
FROM F_entrances
```
Gives me the error `Error running query: datetime.time(13, 52, 27) is not JSON serializable`. I worked around it with `to_char` but this seems to be a problem at the [Python layer](http://stackoverflow.com/a/11875813/1216976).
| [
{
"content": "import cStringIO\nimport csv\nimport codecs\nimport decimal\nimport datetime\nimport json\nimport re\nimport hashlib\nimport sqlparse\nimport pytz\n\nCOMMENTS_REGEX = re.compile(\"/\\*.*?\\*/\")\n\n\nclass SQLMetaData(object):\n TABLE_SELECTION_KEYWORDS = ('FROM', 'JOIN', 'LEFT JOIN', 'FULL JOI... | [
{
"content": "import cStringIO\nimport csv\nimport codecs\nimport decimal\nimport datetime\nimport json\nimport re\nimport hashlib\nimport sqlparse\nimport pytz\n\nCOMMENTS_REGEX = re.compile(\"/\\*.*?\\*/\")\n\n\nclass SQLMetaData(object):\n TABLE_SELECTION_KEYWORDS = ('FROM', 'JOIN', 'LEFT JOIN', 'FULL JOI... | diff --git a/redash/utils.py b/redash/utils.py
index 41b0d813f8..41d23372f2 100644
--- a/redash/utils.py
+++ b/redash/utils.py
@@ -95,7 +95,7 @@ def default(self, o):
if isinstance(o, decimal.Decimal):
return float(o)
- if isinstance(o, datetime.date):
+ if isinstance(o, (datetime.date, datetime.time, datetime.timedelta)):
return o.isoformat()
super(JSONEncoder, self).default(o)
|
napari__napari-1088 | ListModel.append does not check type
## 🐛 Bug
in working on layer groups, I found a strange lack of type checking when appending to a `ListModel` (which inherits from `TypedList`). [`ListModel.append`](https://github.com/napari/napari/blob/59ed366e9d492a2389c451468fd8b9f96508b4e2/napari/utils/list/_model.py#L59) jumps right over `TypedList.append`
https://github.com/napari/napari/blob/59ed366e9d492a2389c451468fd8b9f96508b4e2/napari/utils/list/_model.py#L58-L60
... and if you try to something that is not a `Layer` to a `LayerList`, it works fine up until throwing an error (unrelated to typing) in `components.layerlist._add`. Is that supposed to be `TypedList.append(self, obj)`? or was that intentional?
| [
{
"content": "from ...utils.event import EmitterGroup\n\nfrom ._multi import MultiIndexList\nfrom ._typed import TypedList\n\n\nclass ListModel(MultiIndexList, TypedList):\n \"\"\"List with events, tuple-indexing, typing, and filtering.\n\n Parameters\n ----------\n basetype : type\n Type of ... | [
{
"content": "from ...utils.event import EmitterGroup\n\nfrom ._multi import MultiIndexList\nfrom ._typed import TypedList\n\n\nclass ListModel(MultiIndexList, TypedList):\n \"\"\"List with events, tuple-indexing, typing, and filtering.\n\n Parameters\n ----------\n basetype : type\n Type of ... | diff --git a/napari/components/_tests/test_layers_list.py b/napari/components/_tests/test_layers_list.py
index 96b273f0def..d19c975c970 100644
--- a/napari/components/_tests/test_layers_list.py
+++ b/napari/components/_tests/test_layers_list.py
@@ -1,6 +1,7 @@
from napari.components import LayerList
from napari.layers import Image
import numpy as np
+import pytest
def test_empty_layers_list():
@@ -27,6 +28,10 @@ def test_adding_layer():
layer = Image(np.random.random((10, 10)))
layers.append(layer)
+ # LayerList should err if you add anything other than a layer
+ with pytest.raises(TypeError):
+ layers.append('something')
+
assert len(layers) == 1
diff --git a/napari/utils/list/_model.py b/napari/utils/list/_model.py
index daaad2ba34b..ad90c2f1be0 100644
--- a/napari/utils/list/_model.py
+++ b/napari/utils/list/_model.py
@@ -56,7 +56,7 @@ def insert(self, index, obj):
self.events.added(item=obj, index=self.__locitem__(index))
def append(self, obj):
- super(TypedList, self).append(obj)
+ TypedList.append(self, obj)
self.events.added(item=obj, index=len(self) - 1)
def pop(self, key):
|
ansible__molecule-2716 | Add a config option to run podman as root
# Issue Type
- Feature request
# Molecule and Ansible details
```
ansible 2.9.9
molecule 3.0.4
```
Molecule installation method (one of):
- pip
Ansible installation method (one of):
- OS package
# Desired Behavior
Podman allows running containers in a rootless mode, although this introduces some limitations.
For example this makes it difficult to test ansible roles which rely on a working systemd instance running in the container.
It would therefore be nice if it would be possible to specify whether podman should be executed as root or as the current user in the molecule config.
| [
{
"content": "# Copyright (c) 2015-2018 Cisco Systems, Inc.\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to\n# deal in the Software without restriction, including without limitation the\n# right... | [
{
"content": "# Copyright (c) 2015-2018 Cisco Systems, Inc.\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to\n# deal in the Software without restriction, including without limitation the\n# right... | diff --git a/molecule/model/schema_v3.py b/molecule/model/schema_v3.py
index ffc5ace19c..0d7b59030e 100644
--- a/molecule/model/schema_v3.py
+++ b/molecule/model/schema_v3.py
@@ -358,6 +358,7 @@ def pre_validate_base_schema(env, keep_string):
"cgroup_manager": {"type": "string"},
"storage_opt": {"type": "string"},
"storage_driver": {"type": "string"},
+ "rootless": {"type": "boolean"},
},
},
}
diff --git a/molecule/provisioner/ansible/playbooks/podman/create.yml b/molecule/provisioner/ansible/playbooks/podman/create.yml
index 068863f554..36f945442c 100644
--- a/molecule/provisioner/ansible/playbooks/podman/create.yml
+++ b/molecule/provisioner/ansible/playbooks/podman/create.yml
@@ -4,6 +4,7 @@
connection: local
gather_facts: false
no_log: "{{ molecule_no_log }}"
+ become: "{{ not (item.rootless|default(true)) }}"
tasks:
- name: Log into a container registry
diff --git a/molecule/provisioner/ansible/playbooks/podman/destroy.yml b/molecule/provisioner/ansible/playbooks/podman/destroy.yml
index 7da2dc60b3..71389dd7a7 100644
--- a/molecule/provisioner/ansible/playbooks/podman/destroy.yml
+++ b/molecule/provisioner/ansible/playbooks/podman/destroy.yml
@@ -4,6 +4,7 @@
connection: local
gather_facts: false
no_log: "{{ molecule_no_log }}"
+ become: "{{ not (item.rootless|default(true)) }}"
tasks:
- name: Destroy molecule instance(s)
shell: podman container exists {{ item.name }} && podman rm -f {{ item.name }} || true
diff --git a/molecule/test/resources/playbooks/podman/create.yml b/molecule/test/resources/playbooks/podman/create.yml
index d2c04a8b3e..1af6685eb5 100644
--- a/molecule/test/resources/playbooks/podman/create.yml
+++ b/molecule/test/resources/playbooks/podman/create.yml
@@ -3,6 +3,7 @@
hosts: localhost
connection: local
gather_facts: false
+ become: "{{ not (item.rootless|default(true)) }}"
tasks:
- name: Log into a container registry
command: >
diff --git a/molecule/test/resources/playbooks/podman/destroy.yml b/molecule/test/resources/playbooks/podman/destroy.yml
index 7da2dc60b3..71389dd7a7 100644
--- a/molecule/test/resources/playbooks/podman/destroy.yml
+++ b/molecule/test/resources/playbooks/podman/destroy.yml
@@ -4,6 +4,7 @@
connection: local
gather_facts: false
no_log: "{{ molecule_no_log }}"
+ become: "{{ not (item.rootless|default(true)) }}"
tasks:
- name: Destroy molecule instance(s)
shell: podman container exists {{ item.name }} && podman rm -f {{ item.name }} || true
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.