| in_source_id | issue | before_files | after_files | pr_diff |
|---|---|---|---|---|
netbox-community__netbox-14828 | Only one event rule triggers for a content type
### Deployment Type
Self-hosted
### NetBox Version
v3.7.0
### Python Version
3.11
### Steps to Reproduce
Setup:
1. Create a webhook: Name = Test, URL = http://127.0.0.1:9000
2. Create event rule 1:
- Name = Rule 1
- Content types = Prefix
- select Updates
- Condition = `{ "and": [{"attr": "status.value", "value": "deprecated"}]}`
- Action type = Webhook
- Webhook = Test
3. Create event rule 2:
- Name = Rule 2
- Content types = Prefix
- select Updates
- Condition = `{ "and": [{"attr": "status.value", "value": "active"}]}`
- Action type = Webhook
- Webhook = Test
4. Start webhook receiver (`python manage.py webhook_receiver`), or observe the webhooks happen in some other way
(Sorry, couldn't figure out the correct condition syntax without using the "and" operator)
Demo:
5. Create a prefix, like 10.1.2.0/24, status = Active (the defaults)
6. Edit the prefix: change its status to **Deprecated**
7. Edit the prefix again: change its status to **Active**
### Expected Behavior
Webhook is run **twice**: first when prefix status was changed to **Deprecated** (step 6), second when changed to **Active** again (step 7).
### Observed Behavior
Webhook is run **only once**: in step 6, but not in step 7.
Additionally: If Rule 1 is disabled, and steps 6 and 7 are executed again, now the webhook is run in step 7.
Looks like only the first enabled event rule is run for a specific object type.
| [
{
"content": "import logging\n\nfrom django.conf import settings\nfrom django.contrib.auth import get_user_model\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.core.exceptions import ObjectDoesNotExist\nfrom django.utils import timezone\nfrom django.utils.module_loading import import_s... | [
{
"content": "import logging\n\nfrom django.conf import settings\nfrom django.contrib.auth import get_user_model\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.core.exceptions import ObjectDoesNotExist\nfrom django.utils import timezone\nfrom django.utils.module_loading import import_s... | diff --git a/netbox/extras/events.py b/netbox/extras/events.py
index 6d0654929fc..90cca83cd07 100644
--- a/netbox/extras/events.py
+++ b/netbox/extras/events.py
@@ -81,7 +81,7 @@ def process_event_rules(event_rules, model_name, event, data, username, snapshot
# Evaluate event rule conditions (if any)
if not event_rule.eval_conditions(data):
- return
+ continue
# Webhooks
if event_rule.action_type == EventRuleActionChoices.WEBHOOK:
|
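The one-line diff above is the whole fix: inside the loop over event rules, a failed condition must skip to the next rule, not abort the whole function. A minimal, self-contained sketch of the bug (hypothetical rule structure, not NetBox's actual models):

```python
# Minimal sketch: with ``return`` in place of ``continue``, the first rule
# whose condition fails would abort evaluation of ALL later rules.
def process_event_rules(event_rules, data):
    fired = []
    for rule in event_rules:
        # Evaluate this rule's condition (if any)
        if not rule["condition"](data):
            continue  # the old ``return`` here silently dropped Rule 2
        fired.append(rule["name"])  # stand-in for dispatching the webhook
    return fired

rules = [
    {"name": "Rule 1", "condition": lambda d: d["status"] == "deprecated"},
    {"name": "Rule 2", "condition": lambda d: d["status"] == "active"},
]
```

With the prefix set back to Active, Rule 1's condition fails first; `continue` lets Rule 2 still fire, matching the expected behavior in the report.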
deeppavlov__DeepPavlov-79 | What is "'Chainer' object has no attribute 'infer'"?
2018-03-04 14:09:23,638 (util.py:64 WorkerThread2) ERROR - TeleBot: "AttributeError occurred, args=("'Chainer' object has no attribute 'infer'",)
Traceback (most recent call last):
File "/Users/developer/DeepPavlov/lib/python3.6/site-packages/telebot/util.py", line 58, in run
task(*args, **kwargs)
File "/Users/developer/Project/DeepPavlov/telegram_utils/telegram_ui.py", line 48, in handle_inference
pred = model.infer(context)
AttributeError: 'Chainer' object has no attribute 'infer'
"
2018-03-04 14:09:23.638 ERROR in 'TeleBot'['util'] at line 64: AttributeError occurred, args=("'Chainer' object has no attribute 'infer'",)
Traceback (most recent call last):
File "/Users/developer/DeepPavlov/lib/python3.6/site-packages/telebot/util.py", line 58, in run
task(*args, **kwargs)
File "/Users/developer/Project/DeepPavlov/telegram_utils/telegram_ui.py", line 48, in handle_inference
pred = model.infer(context)
AttributeError: 'Chainer' object has no attribute 'infer'
Traceback (most recent call last):
File "deep.py", line 60, in <module>
main()
File "deep.py", line 56, in main
interact_model_by_telegram(pipeline_config_path, token)
File "/Users/developer/Project/DeepPavlov/telegram_utils/telegram_ui.py", line 58, in interact_model_by_telegram
init_bot_for_model(token, model)
File "/Users/developer/Project/DeepPavlov/telegram_utils/telegram_ui.py", line 52, in init_bot_for_model
bot.polling()
File "/Users/developer/DeepPavlov/lib/python3.6/site-packages/telebot/__init__.py", line 264, in polling
self.__threaded_polling(none_stop, interval, timeout)
File "/Users/developer/DeepPavlov/lib/python3.6/site-packages/telebot/__init__.py", line 288, in __threaded_polling
self.worker_pool.raise_exceptions()
File "/Users/developer/DeepPavlov/lib/python3.6/site-packages/telebot/util.py", line 107, in raise_exceptions
six.reraise(self.exc_info[0], self.exc_info[1], self.exc_info[2])
File "/Users/developer/DeepPavlov/lib/python3.6/site-packages/six.py", line 693, in reraise
raise value
File "/Users/developer/DeepPavlov/lib/python3.6/site-packages/telebot/util.py", line 58, in run
task(*args, **kwargs)
File "/Users/developer/Project/DeepPavlov/telegram_utils/telegram_ui.py", line 48, in handle_inference
pred = model.infer(context)
AttributeError: 'Chainer' object has no attribute 'infer'
Telegram interface bug
Alexander Seliverstov, [04.03.18 15:20]
/start
jhfirufoiug_bot, [04.03.18 15:20]
Welcome to DeepPavlov inference bot!
Alexander Seliverstov, [04.03.18 15:20]
Hi
jhfirufoiug_bot, [04.03.18 15:20]
['Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?']
Alexander Seliverstov, [04.03.18 15:20]
I want cheap russian food
jhfirufoiug_bot, [04.03.18 15:20]
['Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. 
You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?', 'Hello, welcome to the Cambridge restaurant system. You can ask for restaurants by area, price range or food type. How may I help you?']
| [
{
"content": "\"\"\"\nCopyright 2017 Neural Networks and Deep Learning lab, MIPT\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\nUn... | [
{
"content": "\"\"\"\nCopyright 2017 Neural Networks and Deep Learning lab, MIPT\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\nUn... | diff --git a/README.md b/README.md
index 300f903e81..85db4b47c3 100644
--- a/README.md
+++ b/README.md
@@ -3,6 +3,7 @@
# <center>DeepPavlov</center>
### *We are in a really early Alpha release. You have to be ready for hard adventures.*
+### *If you have updated to version 0.0.2 - please re-download all pretrained models*
An open-source conversational AI library, built on TensorFlow and Keras, and designed for
* NLP and dialog systems research
* implementation and evaluation of complex conversational systems
@@ -21,7 +22,7 @@ and AI-application developers with:
| Component | Description |
| --------- | ----------- |
-| [Slot filling component](deeppavlov/models/ner/README.md) | is based on neural Named Entity Recognition network and fuzzy Levenshtein search to extract normalized slot values from the text. The NER network component reproduces architecture from the paper [Application of a Hybrid Bi-LSTM-CRF model to the task of Russian Named Entity Recognition](https://arxiv.org/pdf/1709.09686.pdf), which is inspired by LSTM+CRF architecture from https://arxiv.org/pdf/1603.01360.pdf. |
+| [Slot filling and NER componenst](deeppavlov/models/ner/README.md) | Based on neural Named Entity Recognition network and fuzzy Levenshtein search to extract normalized slot values from the text. The NER component reproduces architecture from the paper [Application of a Hybrid Bi-LSTM-CRF model to the task of Russian Named Entity Recognition](https://arxiv.org/pdf/1709.09686.pdf), which is inspired by Bi-LSTM+CRF architecture from https://arxiv.org/pdf/1603.01360.pdf. |
| [Intent classification component](deeppavlov/models/classifiers/intents/README.md) | Based on shallow-and-wide Convolutional Neural Network architecture from [Kim Y. Convolutional neural networks for sentence classification – 2014](https://arxiv.org/pdf/1408.5882). The model allows multilabel classification of sentences. |
| [Automatic spelling correction component](deeppavlov/models/spellers/error_model/README.md) | Based on [An Improved Error Model for Noisy Channel Spelling Correction by Eric Brill and Robert C. Moore](http://www.aclweb.org/anthology/P00-1037) and uses statistics based error model, a static dictionary and an ARPA language model to correct spelling errors. |
| **Skill** | |
@@ -37,7 +38,7 @@ View video demo of deploy goal-oriented bot and slot-filling model with Telegram
* Run goal-oriented bot with Telegram interface:
```
- python deep.py interactbot configs/go_bot/config.json -t <TELEGRAM_TOKEN>
+ python deep.py interactbot configs/go_bot/gobot_dstc2.json -t <TELEGRAM_TOKEN>
```
* Run goal-oriented bot with console interface:
```
diff --git a/telegram_utils/telegram_ui.py b/telegram_utils/telegram_ui.py
index 3841847438..cf86ee8947 100644
--- a/telegram_utils/telegram_ui.py
+++ b/telegram_utils/telegram_ui.py
@@ -45,8 +45,8 @@ def handle_inference(message):
chat_id = message.chat.id
context = message.text
- pred = model.infer(context)
- reply_message = str(pred)
+ pred = model([context])
+ reply_message = str(pred[0])
bot.send_message(chat_id, reply_message)
bot.polling()
|
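The traceback and the diff boil down to an interface change: a `Chainer` pipeline is no longer queried via `.infer()`; it is called directly on a batch of inputs. A hypothetical stand-in (not the real DeepPavlov class) showing the calling convention the fixed `telegram_ui.py` relies on:

```python
# Hypothetical stand-in for DeepPavlov's Chainer: callable on a batch,
# returning one prediction per input; it deliberately has no ``infer``.
class Chainer:
    def __call__(self, batch):
        return [f"reply to: {text}" for text in batch]

model = Chainer()
context = "I want cheap russian food"
pred = model([context])       # new style: wrap the single message in a batch
reply_message = str(pred[0])  # unwrap the single prediction for the reply
```

Calling `model.infer(context)` on such an object raises exactly the `AttributeError` shown in the logs.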
lutris__lutris-559 | Lutris shortcuts broken
See: https://forums.lutris.net/t/desktop-shortcut-not-work-for-any-game/456
| [
{
"content": "import os\nimport re\nimport concurrent.futures\nfrom urllib.parse import urlparse, parse_qsl\n\nfrom lutris import settings\nfrom lutris import api\nfrom lutris.util.log import logger\nfrom lutris.util.http import Request\n\nBANNER = \"banner\"\nICON = \"icon\"\n\n\ndef get_icon_path(game, icon_t... | [
{
"content": "import os\nimport re\nimport concurrent.futures\nfrom urllib.parse import urlparse, parse_qsl\n\nfrom lutris import settings\nfrom lutris import api\nfrom lutris.util.log import logger\nfrom lutris.util.http import Request\n\nBANNER = \"banner\"\nICON = \"icon\"\n\n\ndef get_icon_path(game, icon_t... | diff --git a/lutris/util/resources.py b/lutris/util/resources.py
index 52530544ce..65eb9e40e2 100644
--- a/lutris/util/resources.py
+++ b/lutris/util/resources.py
@@ -107,6 +107,8 @@ def parse_installer_url(url):
game_slug = parsed_url.path
if not game_slug:
return False
+ if game_slug.startswith('lutris:'):
+ game_slug = game_slug[7:]
revision = None
if parsed_url.query:
query = dict(parse_qsl(parsed_url.query))
|
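The broken shortcuts came from desktop launchers passing URLs of the form `lutris:lutris:<slug>`, where `urlparse` leaves the second `lutris:` in the path. A sketch of the patched parsing, simplified from `lutris/util/resources.py` (the `revision` handling is kept for context):

```python
from urllib.parse import urlparse, parse_qsl

def parse_installer_url(url):
    parsed_url = urlparse(url)
    game_slug = parsed_url.path
    if not game_slug:
        return False
    # The fix: strip a duplicated "lutris:" scheme left in the path by
    # desktop shortcuts that pass "lutris:lutris:<slug>".
    if game_slug.startswith('lutris:'):
        game_slug = game_slug[7:]
    revision = None
    if parsed_url.query:
        revision = dict(parse_qsl(parsed_url.query)).get('revision')
    return {'game_slug': game_slug, 'revision': revision}
```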
xonsh__xonsh-1890 | No output from scp command
While running scp in xonsh, the progress is not shown:
https://asciinema.org/a/322p80uvb0pjyaic2e51iqmhq
I'm using version 3f45378
| [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"Module for caching command & alias names as well as for predicting whether\na command will be able to be run in the background.\n\nA background predictor is a function that accepect a single argument list\nand returns whethere or not the process can be run in the bac... | [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"Module for caching command & alias names as well as for predicting whether\na command will be able to be run in the background.\n\nA background predictor is a function that accepect a single argument list\nand returns whethere or not the process can be run in the bac... | diff --git a/news/scp_predict_false.rst b/news/scp_predict_false.rst
new file mode 100644
index 0000000000..ac4523f110
--- /dev/null
+++ b/news/scp_predict_false.rst
@@ -0,0 +1,13 @@
+**Added:** None
+
+**Changed:** None
+
+**Deprecated:** None
+
+**Removed:** None
+
+**Fixed:**
+
+* Fix ``scp`` progress not being outputted
+
+**Security:** None
diff --git a/xonsh/commands_cache.py b/xonsh/commands_cache.py
index 92176c23a6..0dee3febdf 100644
--- a/xonsh/commands_cache.py
+++ b/xonsh/commands_cache.py
@@ -258,6 +258,7 @@ def default_threadable_predictors():
'less': predict_help_ver,
'man': predict_help_ver,
'more': predict_help_ver,
+ 'scp': predict_false,
'sh': predict_shell,
'ssh': predict_false,
'startx': predict_false,
|
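The fix registers `scp` in xonsh's threadable-predictor table so it runs unthreaded, like `ssh`, and its progress meter reaches the terminal. A simplified sketch of how such a table is consulted (hypothetical helper names; the real table lives in `xonsh/commands_cache.py`):

```python
# Predictors map a command name to a function deciding whether that command
# can safely run in a background thread; unknown commands default to True.
def predict_false(args):
    return False

def predict_true(args):
    return True

threadable_predictors = {
    'ssh': predict_false,
    'scp': predict_false,  # the fix: scp's progress meter needs the real tty
}

def predict_threadable(cmd):
    predictor = threadable_predictors.get(cmd[0], predict_true)
    return predictor(cmd[1:])
```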
geopandas__geopandas-785 | VIS/PERF: only flatten geometries when there are actually multi-geometries
Currently we always loop through all geometries in `_flatten_multi_geoms` when plotting geometries, even when there are no Multi-Point/LineString/Polygons.
Checking in advance whether any multi-geometries are present, and thus whether flattening is needed at all, would improve plotting performance for many single geometries.
| [
{
"content": "from __future__ import print_function\nfrom distutils.version import LooseVersion\nimport warnings\n\nimport numpy as np\nimport pandas as pd\n\ndef _flatten_multi_geoms(geoms, colors=None):\n \"\"\"\n Returns Series like geoms and colors, except that any Multi geometries\n are split into... | [
{
"content": "from __future__ import print_function\nfrom distutils.version import LooseVersion\nimport warnings\n\nimport numpy as np\nimport pandas as pd\n\ndef _flatten_multi_geoms(geoms, colors=None):\n \"\"\"\n Returns Series like geoms and colors, except that any Multi geometries\n are split into... | diff --git a/geopandas/plotting.py b/geopandas/plotting.py
index becf65540f..02763e67f5 100644
--- a/geopandas/plotting.py
+++ b/geopandas/plotting.py
@@ -26,6 +26,9 @@ def _flatten_multi_geoms(geoms, colors=None):
colors = [None] * len(geoms)
components, component_colors = [], []
+
+ if not geoms.geom_type.str.startswith('Multi').any():
+ return geoms, colors
# precondition, so zip can't short-circuit
assert len(geoms) == len(colors)
|
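The added guard uses the vectorized `geom_type` string accessor, so the early exit itself is cheap. The same check can be seen in isolation with a plain pandas Series standing in for the `geom_type` of a GeoSeries:

```python
import pandas as pd

# Stand-in for GeoSeries.geom_type: one type string per geometry.
geom_type = pd.Series(['Point', 'LineString', 'Polygon'])

# Cheap vectorized test: flattening is only needed if any geometry
# is a Multi* type (MultiPoint, MultiLineString, MultiPolygon).
needs_flattening = geom_type.str.startswith('Multi').any()
```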
buildbot__buildbot-3106 | Tarball on pypi lacks secrets/providers subdirectory
https://pypi.python.org/pypi/buildbot/0.9.5
The tarball provided there lacks `master/buildbot/secrets/providers` (it does contain the `secrets` package itself, though). Hence unit tests won't run completely.
```
]$ ls -al buildbot/secrets/
total 9
drwxr-xr-x 2 502 staff 5 Mar 20 14:54 .
drwxr-xr-x 24 502 staff 41 Mar 20 14:54 ..
-rw-r--r-- 1 502 staff 705 Mar 20 11:28 __init__.py
-rw-r--r-- 1 502 staff 1621 Mar 20 11:28 manager.py
-rw-r--r-- 1 502 staff 1713 Mar 20 11:28 secret.py
```
| [
{
"content": "#!/usr/bin/env python\n#\n# This file is part of Buildbot. Buildbot is free software: you can\n# redistribute it and/or modify it under the terms of the GNU General Public\n# License as published by the Free Software Foundation, version 2.\n#\n# This program is distributed in the hope that it wil... | [
{
"content": "#!/usr/bin/env python\n#\n# This file is part of Buildbot. Buildbot is free software: you can\n# redistribute it and/or modify it under the terms of the GNU General Public\n# License as published by the Free Software Foundation, version 2.\n#\n# This program is distributed in the hope that it wil... | diff --git a/master/setup.py b/master/setup.py
index 05b97e576a8a..eac3ea4c4c8a 100755
--- a/master/setup.py
+++ b/master/setup.py
@@ -180,6 +180,7 @@ def define_plugin_entries(groups):
"buildbot.schedulers",
"buildbot.scripts",
"buildbot.secrets",
+ "buildbot.secrets.providers",
"buildbot.statistics",
"buildbot.statistics.storage_backends",
"buildbot.status",
|
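The patch adds the missing subpackage to a hand-maintained list in `master/setup.py`. A common way to avoid this class of omission is `setuptools.find_packages`, which discovers nested packages from the filesystem; a self-contained demonstration against a throwaway tree (illustrative only, not buildbot's actual setup):

```python
import os
import tempfile
from setuptools import find_packages

# Build a tiny fake source tree mirroring the missing package path.
root = tempfile.mkdtemp()
for pkg in ('buildbot', 'buildbot/secrets', 'buildbot/secrets/providers'):
    os.makedirs(os.path.join(root, pkg), exist_ok=True)
    with open(os.path.join(root, pkg, '__init__.py'), 'w'):
        pass

# find_packages walks the tree, so newly added subpackages
# cannot be forgotten the way a manual list can.
packages = find_packages(where=root)
```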
pypi__warehouse-8210 | Ordering at /stats is incorrect for TestPyPI
The list at https://test.pypi.org/stats/ is incorrect and lists projects with 0 bytes first:
<img width="736" alt="Screen Shot 2020-07-01 at 10 06 59 AM" src="https://user-images.githubusercontent.com/294415/86260240-a06f1b00-bb82-11ea-8940-4cc7aced95f6.png">
The 0-byte projects should not be included in that list.
---
**Good First Issue**: This issue is good for first time contributors. If you've already contributed to Warehouse, work on [another issue without this label](https://github.com/pypa/warehouse/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen+-label%3A%22good+first+issue%22) instead. If there is not a corresponding pull request for this issue, it is up for grabs. For directions for getting set up, see our [Getting Started Guide](https://warehouse.pypa.io/development/getting-started/).
If you are working on this issue and have questions, feel free to ask them here, in the [`#pypa-dev` chat channel on Freenode](https://webchat.freenode.net/?channels=%23pypa-dev), or on the [distutils-sig.python.org mailing list](https://mail.python.org/mailman3/lists/distutils-sig.python.org/).
**Screenshot Required**: *If your pull request makes a visual change*, include a screenshot of your update. This helps our team give you feedback faster.
| [
{
"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, softw... | [
{
"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, softw... | diff --git a/warehouse/views.py b/warehouse/views.py
index bcd126a68a8f..9ea8e60252d3 100644
--- a/warehouse/views.py
+++ b/warehouse/views.py
@@ -399,7 +399,7 @@ def stats(request):
top_100_packages = (
request.db.query(Project)
.with_entities(Project.name, Project.total_size)
- .order_by(Project.total_size.desc())
+ .order_by(Project.total_size.desc().nullslast())
.limit(100)
.all()
)
|
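PostgreSQL puts NULLs first in a descending sort, so projects whose `total_size` was never computed floated to the top; `.nullslast()` pushes them to the end. The two orderings can be mimicked in plain Python with a two-part sort key (illustrative only, not the SQLAlchemy query):

```python
# Each row: (project_name, total_size); None mimics SQL NULL.
rows = [('a-project', 1024), ('empty-project', None), ('big-project', 2048)]

# "ORDER BY total_size DESC" in Postgres: NULLs sort first (broken order).
broken = sorted(rows, key=lambda r: (r[1] is None, r[1] or 0), reverse=True)

# "ORDER BY total_size DESC NULLS LAST": non-NULL sizes descending,
# NULL rows appended at the end.
fixed = sorted(rows, key=lambda r: (r[1] is None, -(r[1] or 0)))
```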
learningequality__kolibri-7702 | Subquery in `FaclityViewset` `last_synced` annotation can return multiple results
### Observed behavior
This subquery from the `TransferSession` table can return multiple results, which causes the `/api/auth/facility` endpoint to break
https://github.com/learningequality/kolibri/blob/release-v0.14.x/kolibri/core/auth/api.py#L359
### Expected behavior
The `/api/auth/facility` endpoint doesn't break because of this query. The subquery should be limited to a returning a single value.
| [
{
"content": "from __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport time\nfrom datetime import datetime\nfrom datetime import timedelta\nfrom itertools import groupby\nfrom uuid import uuid4\n\nfrom django.contrib.auth import authenticat... | [
{
"content": "from __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport time\nfrom datetime import datetime\nfrom datetime import timedelta\nfrom itertools import groupby\nfrom uuid import uuid4\n\nfrom django.contrib.auth import authenticat... | diff --git a/kolibri/core/auth/api.py b/kolibri/core/auth/api.py
index 695786901bb..5c36a4f269d 100644
--- a/kolibri/core/auth/api.py
+++ b/kolibri/core/auth/api.py
@@ -366,7 +366,7 @@ def annotate_queryset(self, queryset):
)
)
.order_by("-last_activity_timestamp")
- .values("last_activity_timestamp")
+ .values("last_activity_timestamp")[:1]
)
)
)
|
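The `[:1]` slice turns the annotation into a valid scalar subquery: the database is told to return at most one row (the most recent), instead of erroring when a facility has several transfer sessions. The SQL equivalent can be demonstrated with sqlite3 (a stand-in schema, not Kolibri's actual tables):

```python
import sqlite3

con = sqlite3.connect(':memory:')
con.execute('CREATE TABLE transfersession '
            '(facility TEXT, last_activity_timestamp INTEGER)')
con.executemany('INSERT INTO transfersession VALUES (?, ?)',
                [('f1', 1), ('f1', 2), ('f2', 3)])

# A scalar subquery should yield at most one row: ORDER BY ... LIMIT 1
# mirrors the ``.order_by(...)[:1]`` added to the Django queryset.
last_synced = con.execute(
    "SELECT (SELECT last_activity_timestamp FROM transfersession "
    "WHERE facility = 'f1' "
    "ORDER BY last_activity_timestamp DESC LIMIT 1)"
).fetchone()[0]
```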
nautobot__nautobot-4946 | Computed fields not return by GraphQL query
### Environment
* Nautobot version (Docker tag too if applicable): 2.0.3
* Python version: 3.10.12
* Database platform, version: postgres, 14.9
* Middleware(s):
Having issues accessing computed_fields in version 2.0.3. Computed fields are visible in the UI; however, a GraphQL query (from the UI) throws the following message: "CustomFieldModel.get_computed_field() got an unexpected keyword argument 'slug'"
Query:
```
{
  devices {
    name
    cpf_olt_name
  }
}
```
Have stopped/started Nautobot and rebooted the system. This was working prior to upgrading from 1.5. Has anyone else seen this? Was told to submit a bug by Glenn M in the Nautobot Slack channel.
### Steps to Reproduce
1. Upgraded from Nautobot 1.5. to 2.0.3
2. Created a new computed field with the template:
   ```
   {% if obj.type == "xgs-pon" %}
   olt-{{ obj.mac_address }}
   {% endif %}
   ```
3. Attempted GraphQL queries for new and pre-upgrade computed fields from the web interface and with pynautobot.
### Expected Behavior
Computed field key:value is returned
### Observed Behavior
Received this error after a query from the web interface:
“CustomFieldModel.get_computed_field() got an unexpected keyword argument ‘slug’”
| [
{
"content": "\"\"\"Library of generators for GraphQL.\"\"\"\n\nimport logging\n\nimport graphene\nimport graphene_django_optimizer as gql_optimizer\nfrom graphql import GraphQLError\n\nfrom nautobot.core.graphql.types import OptimizedNautobotObjectType\nfrom nautobot.core.graphql.utils import str_to_var_name, ... | [
{
"content": "\"\"\"Library of generators for GraphQL.\"\"\"\n\nimport logging\n\nimport graphene\nimport graphene_django_optimizer as gql_optimizer\nfrom graphql import GraphQLError\n\nfrom nautobot.core.graphql.types import OptimizedNautobotObjectType\nfrom nautobot.core.graphql.utils import str_to_var_name, ... | diff --git a/changes/4851.fixed b/changes/4851.fixed
new file mode 100644
index 00000000000..c45c803acd0
--- /dev/null
+++ b/changes/4851.fixed
@@ -0,0 +1 @@
+Fixed an exception when trying to access computed fields via GraphQL.
diff --git a/nautobot/core/graphql/generators.py b/nautobot/core/graphql/generators.py
index bf9251e87e2..24d9cdf2972 100644
--- a/nautobot/core/graphql/generators.py
+++ b/nautobot/core/graphql/generators.py
@@ -116,7 +116,7 @@ def generate_computed_field_resolver(name, resolver_name):
"""
def resolve_computed_field(self, info, **kwargs):
- return self.get_computed_field(slug=name)
+ return self.get_computed_field(key=name)
resolve_computed_field.__name__ = resolver_name
return resolve_computed_field
|
liberapay__liberapay.com-2027 | Limit the input field for team names
The first step when creating a team at https://liberapay.com/about/teams is selecting a name. I tried to enter a name and it was rejected because it was too long: the name input field currently accepts 30 characters. I suggest reducing it to the maximum number of characters allowed for team names.
| [
{
"content": "from decimal import Decimal\nfrom ipaddress import ip_network\nimport json\nimport logging\nfrom operator import itemgetter\nimport os\nimport re\nimport socket\nfrom tempfile import mkstemp\nfrom time import time\nimport traceback\n\nimport babel.localedata\nfrom babel.messages.pofile import read... | [
{
"content": "from decimal import Decimal\nfrom ipaddress import ip_network\nimport json\nimport logging\nfrom operator import itemgetter\nimport os\nimport re\nimport socket\nfrom tempfile import mkstemp\nfrom time import time\nimport traceback\n\nimport babel.localedata\nfrom babel.messages.pofile import read... | diff --git a/Dockerfile b/Dockerfile
deleted file mode 100644
index 43897bd92b..0000000000
--- a/Dockerfile
+++ /dev/null
@@ -1,16 +0,0 @@
-FROM debian:10
-
-RUN apt-get update && \
- apt-get install -y build-essential python3-pip \
- libpq-dev libffi-dev python3-dev postgresql-client && \
- rm -rf /var/lib/apt/lists/*
-
-
-COPY requirements_*.txt /tmp/
-
-RUN pip3 install --require-hashes -r /tmp/requirements_base.txt \
- -r /tmp/requirements_tests.txt \
- -r /tmp/requirements_dev.txt
-
-COPY . /app
-WORKDIR /app
diff --git a/emails/payin_failed.spt b/emails/payin_failed.spt
index 0948f9b463..93e1955c47 100644
--- a/emails/payin_failed.spt
+++ b/emails/payin_failed.spt
@@ -1,7 +1,7 @@
{{ _("Your payment has failed") }}
[---] text/html
-% if payin.off_session
+% if payin.off_session|default(False)
<p>{{ _(
"The automatic payment of {money_amount} initiated today has failed.",
money_amount=payin.amount,
diff --git a/emails/payin_succeeded.spt b/emails/payin_succeeded.spt
index e125530cbb..c11f3a7bc7 100644
--- a/emails/payin_succeeded.spt
+++ b/emails/payin_succeeded.spt
@@ -1,7 +1,7 @@
{{ _("Your payment has succeeded") }}
[---] text/html
-% if payin.off_session
+% if payin.off_session|default(False)
<p>{{ _(
"The automatic payment of {money_amount} initiated today has succeeded.",
money_amount=payin.amount,
diff --git a/liberapay/wireup.py b/liberapay/wireup.py
index 8e36ac7f47..f722ffea0c 100644
--- a/liberapay/wireup.py
+++ b/liberapay/wireup.py
@@ -265,6 +265,7 @@ class AppConf:
smtp_password=str,
smtp_use_tls=bool,
stripe_callback_secret=str,
+ stripe_connect_callback_secret=str,
stripe_connect_id=str,
stripe_publishable_key=str,
stripe_secret_key=str,
diff --git a/style/base/base.scss b/style/base/base.scss
index ebe6295410..5475728c74 100644
--- a/style/base/base.scss
+++ b/style/base/base.scss
@@ -177,9 +177,24 @@ div.account {
}
button.close {
+ margin-left: 5px;
padding: 0 5px;
}
+button.corner-icon {
+ background: transparent;
+ border: none;
+ color: #000;
+ font-size: 18px;
+ line-height: 24px;
+ margin-left: 5px;
+ opacity: 0.2;
+ padding: 0 5px;
+}
+button.corner-icon:hover {
+ opacity: 0.5;
+}
+
img.platform-icon {
height: 16px;
}
diff --git a/www/%username/payment/index.spt b/www/%username/payment/index.spt
index 246d169f09..7d4b9cd922 100644
--- a/www/%username/payment/index.spt
+++ b/www/%username/payment/index.spt
@@ -8,19 +8,49 @@ participant = get_participant(state, restrict=True)
if request.method == 'POST':
account_pk = request.body.get_int('account_pk')
- account = website.db.one("""
- UPDATE payment_accounts
- SET is_current = NULL
- WHERE participant = %s
- AND pk = %s
- RETURNING *
- """, (participant.id, account_pk))
- if account and account.provider == 'stripe':
- try:
- stripe.oauth.OAuth.deauthorize(stripe_user_id=account.id)
- except stripe.oauth_error.InvalidClientError as e:
- if "This application is not connected to stripe account" not in str(e):
- website.warning("unexpected error message: " + str(e))
+ action = request.body.get_choice('action', ('disconnect', 'refresh'), default='disconnect')
+ if action == 'disconnect':
+ account = website.db.one("""
+ UPDATE payment_accounts
+ SET is_current = NULL
+ WHERE participant = %s
+ AND pk = %s
+ RETURNING *
+ """, (participant.id, account_pk))
+ if account and account.provider == 'stripe':
+ try:
+ stripe.oauth.OAuth.deauthorize(stripe_user_id=account.id)
+ except stripe.oauth_error.InvalidClientError as e:
+ if "This application is not connected to stripe account" not in str(e):
+ website.warning("unexpected error message: " + str(e))
+ elif action == 'refresh':
+ account = website.db.one("""
+ SELECT *
+ FROM payment_accounts
+ WHERE participant = %s
+ AND pk = %s
+ """, (participant.id, account_pk))
+ if not account:
+ raise response.invalid_input(account_pk, 'account_pk', 'body')
+ if account.provider == 'stripe':
+ stripe_account = stripe.Account.retrieve(account.id)
+ website.db.run("""
+ UPDATE payment_accounts
+ SET country = %(country)s
+ , default_currency = %(default_currency)s
+ , charges_enabled = %(charges_enabled)s
+ , display_name = %(display_name)s
+ WHERE provider = 'stripe'
+ AND id = %(account_id)s
+ """, dict(
+ country=stripe_account.country,
+ default_currency=stripe_account.default_currency.upper(),
+ charges_enabled=stripe_account.charges_enabled,
+ display_name=stripe_account.settings.dashboard.display_name,
+ account_id=stripe_account.id,
+ ))
+ else:
+ raise response.error(400, f"refresh isn't implemented for provider {account.provider}")
response.redirect(request.path.raw)
accounts = website.db.all("""
@@ -99,12 +129,14 @@ subhead = _("Payment Processors")
"bank accounts.)"
) }}</p>
% if stripe_accounts
- <form action="" method="POST">
- <input type="hidden" name="csrf_token" value="{{ csrf_token }}" />
% for account in stripe_accounts
- <div class="card card-default">
- <button class="close pull-right" name="account_pk" value="{{ account.pk }}"
- title="{{ _('Disconnect') }}">×</button>
+ <form class="card card-default" action="" method="POST">
+ <input type="hidden" name="csrf_token" value="{{ csrf_token }}" />
+ <input type="hidden" name="account_pk" value="{{ account.pk }}" />
+ <button class="corner-icon fa fa-close" name="action" value="disconnect"
+ title="{{ _('Disconnect') }}"></button>
+ <button class="corner-icon fa fa-refresh" name="action" value="refresh"
+ title="{{ _('Refresh') }}"></button>
% if account.display_name
<h4>{{ account.display_name }}</h4>
{{ _("Account ID: {0}", account.id) }}<br>
@@ -129,9 +161,8 @@ subhead = _("Payment Processors")
fontawesome("external-link") }} {{ _(
"Manage this {platform} account", platform="Stripe"
) }}</a>
- </div>
+ </form>
% endfor
- </form>
<br>
% elif country in locale.countries
% if country in constants.PAYOUT_COUNTRIES['stripe']
diff --git a/www/about/teams.spt b/www/about/teams.spt
index 2c5a996f55..a4aa242641 100644
--- a/www/about/teams.spt
+++ b/www/about/teams.spt
@@ -92,7 +92,7 @@ title = _("Teams")
<input type="hidden" name="csrf_token" value="{{ csrf_token }}" />
<div class="form-group">
- <input class="form-control" name="name" size=30
+ <input class="form-control" name="name" size=30 maxlength="{{ constants.USERNAME_MAX_SIZE }}"
placeholder="{{ _('Name of the team') }}" />
</div>
<div class="form-group">
diff --git a/www/callbacks/stripe.spt b/www/callbacks/stripe.spt
index d83c9f9027..094cdbcbad 100644
--- a/www/callbacks/stripe.spt
+++ b/www/callbacks/stripe.spt
@@ -19,10 +19,12 @@ PRODUCTION = website.env.instance_type == 'production'
request.allow('POST')
payload = request.body_bytes
sig = request.headers[b'Stripe-Signature'].decode('ascii', 'replace')
+if 'connect' in request.qs:
+ secret = website.app_conf.stripe_connect_callback_secret
+else:
+ secret = website.app_conf.stripe_callback_secret
try:
- event = stripe.Webhook.construct_event(
- payload, sig, website.app_conf.stripe_callback_secret
- )
+ event = stripe.Webhook.construct_event(payload, sig, secret)
except ValueError as e:
raise response.error(400, str(e))
except stripe.error.SignatureVerificationError:
|
rootpy__rootpy-791 | Unable to set errorlow for a graph
There is a bug in `rootpy.plotting.graph`: setting `exl` or `eyl` through a `GraphPoint` throws an `AttributeError`:
```python
from rootpy.plotting import Graph
g = Graph(name="test",type='asymm')
g[0] = (1,1)
g[0].y.error_hi = 0.1
g[0].y.error_low = 0.1
```
| [
{
"content": "from __future__ import absolute_import\n\nimport math\nimport numbers\nfrom operator import add, sub\n\nimport ROOT\n\nfrom .. import log; log = log[__name__]\nfrom .. import QROOT\nfrom ..extern.six.moves import range\nfrom ..base import NamelessConstructorObject\nfrom ..decorators import snake_c... | [
{
"content": "from __future__ import absolute_import\n\nimport math\nimport numbers\nfrom operator import add, sub\n\nimport ROOT\n\nfrom .. import log; log = log[__name__]\nfrom .. import QROOT\nfrom ..extern.six.moves import range\nfrom ..base import NamelessConstructorObject\nfrom ..decorators import snake_c... | diff --git a/rootpy/plotting/graph.py b/rootpy/plotting/graph.py
index 7cd5f0b8..b62ba6eb 100644
--- a/rootpy/plotting/graph.py
+++ b/rootpy/plotting/graph.py
@@ -93,7 +93,7 @@ def error_low(self, val):
if self.isdefault: return
getattr(
self.graph_,
- 'voidSetPointE{0}low'.format(self.axis_.upper())
+ 'SetPointE{0}low'.format(self.axis_.upper())
)(self.index_, val)
|
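The fix in the diff above boils down to dynamic attribute dispatch: the setter built the method name with a stray `void` prefix, so the `getattr` lookup failed. An illustrative sketch with a stand-in class (not rootpy's actual `Graph`):

```python
class Graph:
    """Stand-in for a ROOT graph exposing SetPointE{X,Y}low setters."""
    def SetPointEYlow(self, index, val):
        self._eyl = val

g = Graph()
axis = 'y'

# Fixed lookup: the format string resolves to a method that exists
getattr(g, 'SetPointE{0}low'.format(axis.upper()))(0, 0.1)
print(g._eyl)  # 0.1

# Buggy lookup: the stray 'void' prefix names a method that does not exist
try:
    getattr(g, 'voidSetPointE{0}low'.format(axis.upper()))
except AttributeError as exc:
    print('AttributeError:', exc)
```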
helmholtz-analytics__heat-406 | Recent CI runs failing with NetCDF: HDF error
**Description**
Recent CI (and local) runs of our tests fail with messages like
```
E RuntimeError: NetCDF: HDF error
netCDF4/_netCDF4.pyx:1887: RuntimeError
During handling of the above exception, another exception occurred:
self = <heat.core.tests.test_io.TestIO testMethod=test_save_netcdf>
def test_save_netcdf(self):
# netcdf support is optional
if not ht.io.supports_netcdf():
return
# local unsplit data
local_data = ht.arange(100)
> ht.save_netcdf(local_data, self.NETCDF_OUT_PATH, self.NETCDF_VARIABLE)
heat/core/tests/test_io.py:373:
```
**To Reproduce**
Steps to reproduce the behavior:
1. Which module/class/function is affected?
heat/core/tests/test_io.py
2. What are the circumstances under which the bug appears?
ANY, just run from current master
3. What is the exact error message/erroneous behavior?
cf. above.
**Expected behavior**
Tests should run successfully.
**Illustrative**
https://travis-ci.com/helmholtz-analytics/heat/builds/135270829
**Version Info**
Topic branch, but master would be affected as well after a rebuild.
**Additional comments**
The fix will be to pin the NetCDF dependency to <=1.5.2. Problems start to occur with 1.5.3.
| [
{
"content": "from setuptools import setup\nimport sys\n\nsys.path.append(\"./heat/core\")\nimport version\n\nprint(version, dir(version))\n\nwith open(\"README.md\", \"r\") as handle:\n long_description = handle.read()\n\n# with open('./heat/core/version.py') as handle:\n# exec(handle.read())\n# pri... | [
{
"content": "from setuptools import setup\nimport sys\n\nsys.path.append(\"./heat/core\")\nimport version\n\nprint(version, dir(version))\n\nwith open(\"README.md\", \"r\") as handle:\n long_description = handle.read()\n\n# with open('./heat/core/version.py') as handle:\n# exec(handle.read())\n# pri... | diff --git a/setup.py b/setup.py
index bedeb05b67..5ab6519be3 100644
--- a/setup.py
+++ b/setup.py
@@ -35,7 +35,7 @@
install_requires=["mpi4py>=3.0.0", "numpy>=1.13.0", "torch==1.3.0"],
extras_require={
"hdf5": ["h5py>=2.8.0"],
- "netcdf": ["netCDF4>=1.4.0"],
+ "netcdf": ["netCDF4>=1.4.0,<=1.5.2"],
"dev": ["pre-commit>=1.18.3"],
},
)
|
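The pin `netCDF4>=1.4.0,<=1.5.2` excludes 1.5.3, the first release that triggers the HDF error. A minimal sketch of how such a bound behaves, using a naive dotted-version comparison (illustrative only; real resolvers apply PEP 440 semantics, e.g. via `packaging.version`):

```python
def vtuple(v):
    """Naive dotted-version parser; enough for plain X.Y.Z strings."""
    return tuple(int(part) for part in v.split("."))

lower, upper = vtuple("1.4.0"), vtuple("1.5.2")

for candidate in ("1.4.0", "1.5.2", "1.5.3"):
    ok = lower <= vtuple(candidate) <= upper
    print(candidate, "allowed" if ok else "excluded")
```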
ansible__awx-13645 | Websocket not working at non-root path
### Please confirm the following
- [X] I agree to follow this project's [code of conduct](https://docs.ansible.com/ansible/latest/community/code_of_conduct.html).
- [X] I have checked the [current issues](https://github.com/ansible/awx/issues) for duplicates.
- [X] I understand that AWX is open source software provided for free and that I might not receive a timely response.
### Summary
Changes from #11342 and #652 are incomplete
### AWX version
21.0.0
### Select the relevant components
- [X] UI
- [ ] API
- [ ] Docs
### Installation method
kubernetes
### Modifications
no
### Ansible version
_No response_
### Operating system
_No response_
### Web browser
_No response_
### Steps to reproduce
Deploy AWX with custom `ingress_path: /awx`
### Expected results
websocket should work
### Actual results
`2022-05-17 08:46:41,031 ERROR [-] daphne.ws_protocol [Failure instance: Traceback: <class 'ValueError'>: No route found for path 'awx/websocket/'.
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/autobahn/websocket/protocol.py:2841:processHandshake
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/txaio/tx.py:366:as_future
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/twisted/internet/defer.py:151:maybeDeferred
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/daphne/ws_protocol.py:72:onConnect
--- <exception caught here> ---
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/twisted/internet/defer.py:151:maybeDeferred
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/daphne/server.py:201:create_application
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/channels/routing.py:54:__call__
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/channels/sessions.py:47:__call__
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/channels/sessions.py:145:__call__
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/channels/sessions.py:169:__init__
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/channels/middleware.py:31:__call__
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/channels/routing.py:150:__call__
]
2022-05-17 08:46:41,031 ERROR [Failure instance: Traceback: <class 'ValueError'>: No route found for path 'awx/websocket/'.
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/autobahn/websocket/protocol.py:2841:processHandshake
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/txaio/tx.py:366:as_future
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/twisted/internet/defer.py:151:maybeDeferred
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/daphne/ws_protocol.py:72:onConnect
--- <exception caught here> ---
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/twisted/internet/defer.py:151:maybeDeferred
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/daphne/server.py:201:create_application
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/channels/routing.py:54:__call__
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/channels/sessions.py:47:__call__
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/channels/sessions.py:145:__call__
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/channels/sessions.py:169:__init__
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/channels/middleware.py:31:__call__
/var/lib/awx/venv/awx/lib64/python3.9/site-packages/channels/routing.py:150:__call__`
### Additional information
It seems that issue is in https://github.com/ansible/awx/blob/48b016802c517ff04d1cff4c43e64f17bb77a7a8/awx/main/routing.py
```
websocket_urlpatterns = [
re_path(r'websocket/$', consumers.EventConsumer),
re_path(r'websocket/broadcast/$', consumers.BroadcastConsumer),
]
```
From https://docs.djangoproject.com/en/4.0/ref/urls/:
When a route ends with $ the whole requested URL, matching against path_info, must match the regular expression pattern (re.fullmatch() is used).
Replacing with
```
websocket_urlpatterns = [
re_path(r'websocket/', consumers.EventConsumer),
re_path(r'websocket/broadcast/', consumers.BroadcastConsumer),
]
```
solves the issue
| [
{
"content": "import redis\nimport logging\n\nfrom django.conf import settings\nfrom django.urls import re_path\n\nfrom channels.auth import AuthMiddlewareStack\nfrom channels.routing import ProtocolTypeRouter, URLRouter\n\nfrom . import consumers\n\n\nlogger = logging.getLogger('awx.main.routing')\n\n\nclass A... | [
{
"content": "import redis\nimport logging\n\nfrom django.conf import settings\nfrom django.urls import re_path\n\nfrom channels.auth import AuthMiddlewareStack\nfrom channels.routing import ProtocolTypeRouter, URLRouter\n\nfrom . import consumers\n\n\nlogger = logging.getLogger('awx.main.routing')\n\n\nclass A... | diff --git a/awx/main/routing.py b/awx/main/routing.py
index c96505b7e120..100347f64e55 100644
--- a/awx/main/routing.py
+++ b/awx/main/routing.py
@@ -27,8 +27,8 @@ def __init__(self, *args, **kwargs):
websocket_urlpatterns = [
- re_path(r'websocket/', consumers.EventConsumer.as_asgi()),
- re_path(r'websocket/broadcast/', consumers.BroadcastConsumer.as_asgi()),
+ re_path(r'websocket/$', consumers.EventConsumer.as_asgi()),
+ re_path(r'websocket/broadcast/$', consumers.BroadcastConsumer.as_asgi()),
]
application = AWXProtocolTypeRouter(
|
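The behavior difference described in the issue comes down to anchored versus unanchored regex matching: with a trailing `$` the whole path, including the `awx/` root-path prefix, must match, while without it a substring match suffices. An illustrative sketch with plain `re` (not Django's or Channels' actual resolver internals):

```python
import re

# Path as seen by the router when AWX is served under a non-root prefix
path = "awx/websocket/"

# Anchored pattern: fullmatch against the whole path fails because of the prefix
print(re.fullmatch(r"websocket/$", path))   # None -> "No route found"

# Unanchored pattern: a substring search still finds the route
print(re.search(r"websocket/", path))
```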
encode__httpx-1255 | Docs incorrectly reference `HTTPError.message` attribute
### Checklist
<!-- Please make sure you check all these items before submitting your bug report. -->
- [x] The bug is reproducible against the latest release and/or `master`.
- [x] There are no similar issues or pull requests to fix it yet.
### Describe the bug
<!-- A clear and concise description of what the bug is. -->
The documentation indicates that message field can be used in the HTTPError:
https://github.com/encode/httpx/blob/master/httpx/_exceptions.py#L54
```
try:
response = httpx.get("https://www.example.com")
response.raise_for_status()
except httpx.HTTPError as exc:
print(f"HTTP Exception for {exc.request.url} - {exc.message}")
```
But there is not such field:
```
AttributeError: 'HTTPStatusError' object has no attribute 'message'
```
### To reproduce
Execute the example from the doc
### Expected behavior
Print the string without raising any exceptions
### Actual behavior
AttributeError is raised
### Possible fixes
1. Update the documentation to use `str(exc)` instead of `exc.message`
or
2. Set `self.message` field in HTTPError
| [
{
"content": "\"\"\"\nOur exception hierarchy:\n\n* HTTPError\n x RequestError\n + TransportError\n - TimeoutException\n · ConnectTimeout\n · ReadTimeout\n · WriteTimeout\n · PoolTimeout\n - NetworkError\n · ConnectError\n · ReadError\n · WriteError... | [
{
"content": "\"\"\"\nOur exception hierarchy:\n\n* HTTPError\n x RequestError\n + TransportError\n - TimeoutException\n · ConnectTimeout\n · ReadTimeout\n · WriteTimeout\n · PoolTimeout\n - NetworkError\n · ConnectError\n · ReadError\n · WriteError... | diff --git a/httpx/_exceptions.py b/httpx/_exceptions.py
index 4d6837778a..260d14ee5f 100644
--- a/httpx/_exceptions.py
+++ b/httpx/_exceptions.py
@@ -55,7 +55,7 @@ class HTTPError(Exception):
response = httpx.get("https://www.example.com")
response.raise_for_status()
except httpx.HTTPError as exc:
- print(f"HTTP Exception for {exc.request.url} - {exc.message}")
+ print(f"HTTP Exception for {exc.request.url} - {exc}")
```
"""
|
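The first suggested fix relies on `str(exc)` returning the exception's message, which any `Exception` subclass already provides. A minimal sketch (the `HTTPError` below is a local stand-in, not httpx's class):

```python
class HTTPError(Exception):
    """Stand-in for httpx.HTTPError; str(exc) yields the message passed in."""

try:
    raise HTTPError("server responded with 404")
except HTTPError as exc:
    # The f-string implicitly calls str(exc); no .message attribute needed
    message = f"HTTP Exception - {exc}"
    print(message)  # HTTP Exception - server responded with 404
```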
ivy-llc__ivy-23027 | solve
### Bug Explanation
The `paddle.linalg.solve` tests are failing. The tests and the front-end function are not implemented properly.
The test should generate two matrices of shape [ *, M, M ] and [ *, M, K ], but the written test just generates two matrices of the same shape, and the function arguments are mismatched, returning
`TypeError: solve() got an unexpected keyword argument 'x'`
### Steps to Reproduce Bug
Run : `pytest ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_linalg.py::test_paddle_solve`
### Environment
MacOs : 13.5
### Ivy Version
0.0.0.0.0
### Backend
- [ ] NumPy
- [ ] TensorFlow
- [ ] PyTorch
- [ ] JAX
### Device
Mac M1
| [
{
"content": "# global\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes, with_supported_dtypes\nfrom ivy.functional.frontends.paddle import promote_types_of_paddle_inputs\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@with_supported_dtypes({\"2... | [
{
"content": "# global\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes, with_supported_dtypes\nfrom ivy.functional.frontends.paddle import promote_types_of_paddle_inputs\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@with_supported_dtypes({\"2... | diff --git a/ivy/functional/frontends/paddle/tensor/linalg.py b/ivy/functional/frontends/paddle/tensor/linalg.py
index 4ae10e9824324..34eff9474cb1b 100644
--- a/ivy/functional/frontends/paddle/tensor/linalg.py
+++ b/ivy/functional/frontends/paddle/tensor/linalg.py
@@ -183,10 +183,10 @@ def qr(x, mode="reduced", name=None):
# solve
-@with_unsupported_dtypes({"2.5.1 and below": ("float16", "bfloat16")}, "paddle")
+@with_supported_dtypes({"2.5.1 and below": ("float32", "float64")}, "paddle")
@to_ivy_arrays_and_back
-def solve(x1, x2, name=None):
- return ivy.solve(x1, x2)
+def solve(x, y, name=None):
+ return ivy.solve(x, y)
# transpose
diff --git a/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_linalg.py b/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_linalg.py
index 1290c944bef8d..9712e8fede1d4 100644
--- a/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_linalg.py
+++ b/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_linalg.py
@@ -12,6 +12,7 @@
)
from ivy_tests.test_ivy.test_frontends.test_tensorflow.test_linalg import (
+ _get_first_matrix,
_get_second_matrix,
_get_cholesky_matrix,
)
@@ -872,36 +873,35 @@ def test_paddle_qr(
# solve
@handle_frontend_test(
- fn_tree="paddle.solve",
- dtype_x=helpers.dtype_and_values(
- available_dtypes=helpers.get_dtypes("float"),
- num_arrays=2,
- shared_dtype=True,
- min_value=-10,
- max_value=10,
- ),
- aliases=["paddle.tensor.linalg.solve"],
+ fn_tree="paddle.tensor.linalg.solve",
+ aliases=["paddle.linalg.solve"],
+ x=_get_first_matrix(),
+ y=_get_second_matrix(),
test_with_out=st.just(False),
)
def test_paddle_solve(
*,
- dtype_x,
+ x,
+ y,
frontend,
- test_flags,
backend_fw,
+ test_flags,
fn_tree,
on_device,
):
- input_dtype, x = dtype_x
+ input_dtype1, x1 = x
+ input_dtype2, x2 = y
helpers.test_frontend_function(
- input_dtypes=input_dtype,
- frontend=frontend,
+ input_dtypes=[input_dtype1, input_dtype2],
backend_to_test=backend_fw,
+ frontend=frontend,
test_flags=test_flags,
fn_tree=fn_tree,
on_device=on_device,
- x=x[0],
- y=x[1],
+ rtol=1e-3,
+ atol=1e-3,
+ x=x1,
+ y=x2,
)
|
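The shape requirement the test should honor, [ *, M, M ] coefficients against [ *, M, K ] right-hand sides, can be checked with plain NumPy (illustrative; the frontend ultimately dispatches to the backend's solve):

```python
import numpy as np

rng = np.random.default_rng(0)
M, K = 3, 2
A = rng.standard_normal((M, M))   # square coefficient matrix [M, M]
b = rng.standard_normal((M, K))   # right-hand side [M, K]

x = np.linalg.solve(A, b)         # solution has shape [M, K]
print(x.shape)
assert np.allclose(A @ x, b)      # residual check: A @ x recovers b
```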
DataDog__dd-trace-py-5408 | Throws exception when reloading module in REPL (specifically with iPython), even when `DD_TRACE_ENABLED=false`
<!--
Thanks for taking the time for reporting an issue!
Before reporting an issue on dd-trace-py, please be sure to provide all
necessary information.
If you're hitting a bug, make sure that you're using the latest version of this
library.
-->
### Summary of problem
The exception includes this stack trace:
```python
[autoreload of my_module failed: Traceback (most recent call last):
File "/home/ory/.pyenv/versions/3.10.7/envs/my_module/lib/python3.10/site-packages/IPython/extensions/autoreload.py", line 261, in check
superreload(m, reload, self.old_objects)
File "/home/ory/.pyenv/versions/3.10.7/envs/my_module/lib/python3.10/site-packages/IPython/extensions/autoreload.py", line 459, in superreload
module = reload(module)
File "/home/ory/.pyenv/versions/3.10.7/lib/python3.10/importlib/__init__.py", line 166, in reload
spec = module.__spec__ = _bootstrap._find_spec(name, pkgpath, target)
File "<frozen importlib._bootstrap>", line 945, in _find_spec
File "/home/ory/.pyenv/versions/3.10.7/envs/my_module/lib/python3.10/site-packages/ddtrace/internal/module.py", line 368, in find_spec
spec = find_spec(fullname)
File "/home/ory/.pyenv/versions/3.10.7/lib/python3.10/importlib/util.py", line 111, in find_spec
raise ValueError('{}.__spec__ is not set'.format(name)) from None
ValueError: my_module.__spec__ is not set
```
### Which version of dd-trace-py are you using?
1.8.0
### Which version of pip are you using?
22.2.2
| [
{
"content": "from collections import defaultdict\nfrom os.path import abspath\nfrom os.path import expanduser\nfrom os.path import isdir\nfrom os.path import isfile\nfrom os.path import join\nimport sys\nfrom types import ModuleType\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Defau... | [
{
"content": "from collections import defaultdict\nfrom os.path import abspath\nfrom os.path import expanduser\nfrom os.path import isdir\nfrom os.path import isfile\nfrom os.path import join\nimport sys\nfrom types import ModuleType\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Defau... | diff --git a/ddtrace/internal/module.py b/ddtrace/internal/module.py
index e608e648487..b0668518ffc 100644
--- a/ddtrace/internal/module.py
+++ b/ddtrace/internal/module.py
@@ -404,7 +404,12 @@ def find_spec(self, fullname, path=None, target=None):
self._finding.add(fullname)
try:
- spec = find_spec(fullname)
+ try:
+ # Best effort
+ spec = find_spec(fullname)
+ except Exception:
+ return None
+
if spec is None:
return None
diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt
index 2289730a2aa..37e5cdb4051 100644
--- a/docs/spelling_wordlist.txt
+++ b/docs/spelling_wordlist.txt
@@ -1,5 +1,7 @@
AArch
AnyCallable
+autoreload
+autoreloading
CPython
Fargate
Firehose
diff --git a/releasenotes/notes/fix-internal-module-spec-best-effort-adbb7c32399d7317.yaml b/releasenotes/notes/fix-internal-module-spec-best-effort-adbb7c32399d7317.yaml
new file mode 100644
index 00000000000..5975bb9514e
--- /dev/null
+++ b/releasenotes/notes/fix-internal-module-spec-best-effort-adbb7c32399d7317.yaml
@@ -0,0 +1,5 @@
+---
+fixes:
+ - |
+ Prevent exceptions when autoreloading modules that directly or indirectly
+ import ddtrace with the iPython autoreload extension.
|
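The failure mode is reproducible without ddtrace: `importlib.util.find_spec` raises `ValueError` for a module present in `sys.modules` whose `__spec__` is unset or `None`, and the fix simply treats such errors as "no spec found". A minimal sketch (`safe_find_spec` is a hypothetical helper mirroring the best-effort guard in the patch):

```python
import importlib.util
import sys
import types

# Simulate a module left without a usable __spec__ (as iPython autoreload can)
ghost = types.ModuleType("ghost_mod")
sys.modules["ghost_mod"] = ghost  # a fresh ModuleType has no meaningful spec

def safe_find_spec(name):
    """Best-effort lookup: treat any failure as 'no spec found'."""
    try:
        return importlib.util.find_spec(name)
    except Exception:
        return None

try:
    importlib.util.find_spec("ghost_mod")
except ValueError as exc:
    print("unguarded:", exc)  # e.g. "ghost_mod.__spec__ is None" / "is not set"

print("guarded:", safe_find_spec("ghost_mod"))
```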
pytorch__pytorch-4656 | Bug in CosineAnnealingLR in Python 2 (?)
The learning rate only takes two values, `base_lr` or `eta_min`, if the `T_max` parameter passed in the constructor is an integer. This is probably because `self.last_epoch / self.T_max` evaluates to an integer in Python 2.
| [
{
"content": "import math\nfrom bisect import bisect_right\nfrom .optimizer import Optimizer\n\n\nclass _LRScheduler(object):\n def __init__(self, optimizer, last_epoch=-1):\n if not isinstance(optimizer, Optimizer):\n raise TypeError('{} is not an Optimizer'.format(\n type(o... | [
{
"content": "import math\nfrom bisect import bisect_right\nfrom .optimizer import Optimizer\n\n\nclass _LRScheduler(object):\n def __init__(self, optimizer, last_epoch=-1):\n if not isinstance(optimizer, Optimizer):\n raise TypeError('{} is not an Optimizer'.format(\n type(o... | diff --git a/test/test_optim.py b/test/test_optim.py
index 66a78fd8119c42..f05b219a42fccb 100644
--- a/test/test_optim.py
+++ b/test/test_optim.py
@@ -503,7 +503,7 @@ def test_cos_anneal_lr(self):
epochs = 10
eta_min = 1e-10
single_targets = [eta_min + (0.05 - eta_min) *
- (1 + math.cos(x / epochs * math.pi)) / 2
+ (1 + math.cos(math.pi * x / epochs)) / 2
for x in range(epochs)]
targets = [single_targets, list(map(lambda x: x * epochs, single_targets))]
scheduler = CosineAnnealingLR(self.opt, T_max=epochs, eta_min=eta_min)
diff --git a/torch/optim/lr_scheduler.py b/torch/optim/lr_scheduler.py
index 33b7ebf34db103..0b125c459c6658 100644
--- a/torch/optim/lr_scheduler.py
+++ b/torch/optim/lr_scheduler.py
@@ -194,7 +194,7 @@ def __init__(self, optimizer, T_max, eta_min=0, last_epoch=-1):
def get_lr(self):
return [self.eta_min + (base_lr - self.eta_min) *
- (1 + math.cos(self.last_epoch / self.T_max * math.pi)) / 2
+ (1 + math.cos(math.pi * self.last_epoch / self.T_max)) / 2
for base_lr in self.base_lrs]
|
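In Python 2, `self.last_epoch / self.T_max` with two ints floor-divides to 0 (or 1), collapsing the cosine argument to 0 or pi, hence only `base_lr` or `eta_min`; reordering to `math.pi * self.last_epoch / self.T_max` forces float arithmetic first. A sketch of the intended schedule (run under Python 3, where `/` is always true division):

```python
import math

base_lr, eta_min, T_max = 0.05, 1e-10, 10

def cosine_lr(epoch):
    # eta_min + (base_lr - eta_min) * (1 + cos(pi * t / T_max)) / 2
    return eta_min + (base_lr - eta_min) * (1 + math.cos(math.pi * epoch / T_max)) / 2

lrs = [cosine_lr(t) for t in range(T_max + 1)]
print(lrs[0], lrs[T_max])  # starts at base_lr, anneals down to eta_min
assert all(a > b for a, b in zip(lrs, lrs[1:]))  # strictly decreasing
```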
Qiskit__qiskit-2448 | No module named 'vcr': requirement is missing (vcrpy)
<!-- ⚠️ If you do not respect this template, your issue will be closed -->
<!-- ⚠️ Make sure to browse the opened and closed issues -->
### Information
- **Qiskit Terra version**: 0.10.1
- **Python version**: 3.7.3
- **Operating system**: windows 10
### What is the current behavior?
Fresh qiskit installation inside a new environment on windows 10.
In one of the terra tutorials (using_the_transpiler), `from qiskit.test.mock import FakeTokyo` fails with `ModuleNotFoundError: No module named 'vcr'`
### Suggested solutions
'pip install vcrpy'
`vcrpy` needs to be added to the requirements.
| [
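Until the dependency is either listed or removed, a common pattern for optional test-only dependencies is to guard the import and degrade gracefully; a hedged sketch (not Qiskit's actual code):

```python
try:
    import vcr  # optional: only needed for recorded-HTTP tests
    HAS_VCR = True
except ImportError:
    HAS_VCR = False

def recorded_test():
    # Skip instead of crashing when the optional dependency is missing
    if not HAS_VCR:
        print("vcrpy not installed; skipping recorded-HTTP test")
        return
    print("running with", vcr.__name__)

recorded_test()
```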
{
"content": "# -*- coding: utf-8 -*-\n# This code is part of Qiskit.\n#\n# (C) Copyright IBM 2017.\n#\n# This code is licensed under the Apache License, Version 2.0. You may\n# obtain a copy of this license in the LICENSE.txt file in the root directory\n# of this source tree or at http://www.apache.org/license... | [
{
"content": "# -*- coding: utf-8 -*-\n# This code is part of Qiskit.\n#\n# (C) Copyright IBM 2017.\n#\n# This code is licensed under the Apache License, Version 2.0. You may\n# obtain a copy of this license in the LICENSE.txt file in the root directory\n# of this source tree or at http://www.apache.org/license... | diff --git a/.pylintrc b/.pylintrc
index ae9061c82f2a..6040fbfed61f 100644
--- a/.pylintrc
+++ b/.pylintrc
@@ -21,7 +21,7 @@ persistent=yes
# List of plugins (as comma separated values of python modules names) to load,
# usually to register additional checkers.
load-plugins=pylint.extensions.docparams, # enable checking of docstring args
- pylint.extensions.docstyle, # basic docstring stle checks
+ pylint.extensions.docstyle, # basic docstring style checks
pylintfileheader # Check license comments
file-header=(?:(?:#[^\n]*)?\n)*# This code is part of Qiskit.\n#\n# \(C\) Copyright IBM [0-9, -]*.\n#\n# This code is licensed under the Apache License, Version 2.0. You may\n# obtain a copy of this license in the LICENSE.txt file in the root directory\n# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.\n#\n# Any modifications or derivative works of this code must retain this\n# copyright notice, and modified files need to carry a notice indicating\n# that they have been altered from the originals.\n
diff --git a/CHANGELOG.rst b/CHANGELOG.rst
index d9344065b780..2d495534781e 100644
--- a/CHANGELOG.rst
+++ b/CHANGELOG.rst
@@ -22,7 +22,8 @@ The format is based on `Keep a Changelog`_.
Deprecated
----------
-- The gates `U` and `CX` are being deprecated in favor of `u3` and `cx`.
+- The gates ``U`` and ``CX`` are being deprecated in favor of ``u3`` and ``cx``.
+- The decorator ``requires_qe_access`` is being deprecated in favor of ``online_test``.
Added
-----
@@ -40,6 +41,8 @@ Changed
- When adding a register to a circuit, an error will now be raised if a register
of the same name is already present. Previously, an error would only be raised
if the same register was added twice.
+- Qubits and classical bits are not represented as a tuples anymore, but as
+ instances of ``Qubit`` and ``Clbit`` respectively.
Removed
-------
diff --git a/Makefile b/Makefile
index 79399d949809..4dabcdaae082 100644
--- a/Makefile
+++ b/Makefile
@@ -32,7 +32,7 @@ else
CONCURRENCY := $(shell echo "$(NPROCS) 2" | awk '{printf "%.0f", $$1 / $$2}')
endif
-.PHONY: env lint test test_record test_mock test_ci
+.PHONY: env lint test test_ci
# Dependencies need to be installed on the Anaconda virtual environment.
env:
@@ -55,13 +55,6 @@ style:
test:
python3 -m unittest discover -s test -v
-test_mock:
- env QISKIT_TESTS=mock_online python3 -m unittest discover -s test -v
-
-test_recording:
- -rm test/cassettes/*
- env QISKIT_TESTS=rec python3 -m unittest discover -s test -v
-
test_ci:
echo "Detected $(NPROCS) CPUs running with $(CONCURRENCY) workers"
stestr run --concurrency $(CONCURRENCY)
diff --git a/qiskit/test/__init__.py b/qiskit/test/__init__.py
index 820df145f4e0..4eb20ffa2326 100644
--- a/qiskit/test/__init__.py
+++ b/qiskit/test/__init__.py
@@ -15,6 +15,6 @@
"""Functionality and helpers for testing Qiskit."""
from .base import QiskitTestCase
-from .decorators import requires_aer_provider, requires_qe_access, slow_test
+from .decorators import requires_aer_provider, online_test, slow_test, requires_qe_access
from .reference_circuits import ReferenceCircuits
from .utils import Path
diff --git a/qiskit/test/decorators.py b/qiskit/test/decorators.py
index e55cf681de48..bf8917cca37e 100644
--- a/qiskit/test/decorators.py
+++ b/qiskit/test/decorators.py
@@ -18,11 +18,13 @@
import os
import sys
import unittest
+from warnings import warn
-from .utils import Path
-from .http_recorder import http_recorder
+from qiskit.util import _has_connection
from .testing_options import get_test_options
+HAS_NET_CONNECTION = None
+
def is_aer_provider_available():
"""Check if the C++ simulator can be instantiated.
@@ -137,7 +139,27 @@ def _get_credentials(test_object, test_options):
def requires_qe_access(func):
- """Decorator that signals that the test uses the online API:
+ """Deprecated in favor of `online_test`"""
+ warn("`requires_qe_access` is going to be replaced in favor of `online_test`",
+ DeprecationWarning)
+
+ @functools.wraps(func)
+ def _wrapper(self, *args, **kwargs):
+ if TEST_OPTIONS['skip_online']:
+ raise unittest.SkipTest('Skipping online tests')
+
+ credentials = _get_credentials(self, TEST_OPTIONS)
+ self.using_ibmq_credentials = credentials.is_ibmq()
+ kwargs.update({'qe_token': credentials.token,
+ 'qe_url': credentials.url})
+
+ return func(self, *args, **kwargs)
+
+ return _wrapper
+
+
+def online_test(func):
+ """Decorator that signals that the test uses the network (and the online API):
It involves:
* determines if the test should be skipped by checking environment
@@ -159,23 +181,24 @@ def requires_qe_access(func):
@functools.wraps(func)
def _wrapper(self, *args, **kwargs):
+ # To avoid checking the connection in each test
+ global HAS_NET_CONNECTION # pylint: disable=global-statement
+
if TEST_OPTIONS['skip_online']:
raise unittest.SkipTest('Skipping online tests')
+ if HAS_NET_CONNECTION is None:
+ HAS_NET_CONNECTION = _has_connection('qiskit.org', 443)
+
+ if not HAS_NET_CONNECTION:
+ raise unittest.SkipTest("Test requires internet connection.")
+
credentials = _get_credentials(self, TEST_OPTIONS)
self.using_ibmq_credentials = credentials.is_ibmq()
kwargs.update({'qe_token': credentials.token,
'qe_url': credentials.url})
- decorated_func = func
- if TEST_OPTIONS['rec'] or TEST_OPTIONS['mock_online']:
- # For recording or for replaying existing cassettes, the test
- # should be decorated with @use_cassette.
- vcr_mode = 'new_episodes' if TEST_OPTIONS['rec'] else 'none'
- decorated_func = http_recorder(
- vcr_mode, Path.CASSETTES.value).use_cassette()(decorated_func)
-
- return decorated_func(self, *args, **kwargs)
+ return func(self, *args, **kwargs)
return _wrapper
diff --git a/qiskit/test/http_recorder.py b/qiskit/test/http_recorder.py
deleted file mode 100644
index 83a18880c459..000000000000
--- a/qiskit/test/http_recorder.py
+++ /dev/null
@@ -1,289 +0,0 @@
-# -*- coding: utf-8 -*-
-
-# This code is part of Qiskit.
-#
-# (C) Copyright IBM 2017, 2018.
-#
-# This code is licensed under the Apache License, Version 2.0. You may
-# obtain a copy of this license in the LICENSE.txt file in the root directory
-# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.
-#
-# Any modifications or derivative works of this code must retain this
-# copyright notice, and modified files need to carry a notice indicating
-# that they have been altered from the originals.
-
-"""Utilities (based on VCRpy) to record remote requests and allow testing offline/cached."""
-
-import json
-from contextlib import suppress
-from vcr.persisters.filesystem import FilesystemPersister
-from vcr import VCR
-
-
-class IdRemoverPersister(FilesystemPersister):
- """VCR Persister for Qiskit.
-
- IdRemoverPersister is a VCR persister. This is, it implements a way to save and load cassettes.
- This persister in particular inherits load_cassette from FilesystemPersister (basically, it
- loads a standard cassette in the standard way from the file system). On the saving side, it
- replaces some fields in the JSON content of the responses with dummy values.
- """
-
- @staticmethod
- def get_responses_with(string_to_find, cassette_dict):
- """Filters the requests from cassette_dict
-
- Args:
- string_to_find (str): request path
- cassette_dict (dict): a VCR cassette dictionary
-
- Returns:
- Request: VCR's representation of a request.
- """
- return [response for response, request in
- zip(cassette_dict['responses'], cassette_dict['requests'])
- if string_to_find in request.path]
-
- @staticmethod
- def get_new_id(field, path, id_tracker, type_=str):
- """Creates a new dummy id (or value) for replacing an existing id (or value).
-
- Args:
- field (str): field name is used, in same cases, to create a dummy value.
- path (str): path of the request is used, in same cases, to create a dummy value.
- id_tracker (dict): a map of already assigned ids and generated ids.
- type_ (type): type of the value.
-
- Returns:
- str: that is used to replace a value.
- """
-
- if type_ == float:
- return 0.42
- if type_ == int:
- return 42
- dummy_name = 'dummy%s%s' % (path.replace('/', ''), field)
- count = len(list(filter(lambda x: str(x).startswith(dummy_name), id_tracker.values())))
- return "%s%02d" % (dummy_name, count + 1)
-
- @staticmethod
- def get_matching_dicts(data_dict, map_list):
- """Find subdicts that are described in map_list.
-
- Args:
- data_dict (dict): in which the map_list is going to be searched.
- map_list (list): the list of nested keys to find in the data_dict
-
- Returns:
- list: a list of dictionaries, each of them matches map_list.
- """
- ret = []
- if not map_list:
- return ret
- if isinstance(data_dict, list):
- for sub_data_dict in data_dict:
- ret.extend(IdRemoverPersister.get_matching_dicts(sub_data_dict, map_list))
- if isinstance(data_dict, dict):
- if map_list[0] in data_dict.keys():
- if len(map_list) == 1:
- return [data_dict]
- else:
- ret.extend(
- IdRemoverPersister.get_matching_dicts(data_dict[map_list[0]], map_list[1:]))
- return ret
-
- @staticmethod
- def remove_id_in_a_json(jsonobj, field, path, id_tracker):
- """Replaces ids with dummy values in a json.
-
- Replaces in jsonobj (in-place) the field with dummy value (which is constructed with
- id_tracker, if it was already replaced, or path, if it needs to be created).
-
- Args:
- jsonobj (dict): json dictionary from the response body
- field (str): string with the field in the response to by replaced
- path (str): request path
- id_tracker (dict): a dictionary of the ids already assigned.
- """
-
- map_list = field.split('.')
- for matching_dict in IdRemoverPersister.get_matching_dicts(jsonobj, map_list):
- with suppress(KeyError):
- old_id = matching_dict[map_list[-1]]
- if old_id not in id_tracker:
- new_id = IdRemoverPersister.get_new_id(field, path, id_tracker, type(old_id))
- id_tracker[old_id] = new_id
- matching_dict[map_list[-1]] = id_tracker[old_id]
-
- @staticmethod
- def remove_ids_in_a_response(response, fields, path, id_tracker):
- """Replaces ids with dummy values in a response.
-
- Replaces in response (in-place) the fields with dummy values (which is constructed with
- id_tracker, if it was already replaced, or path, if it needs to be created).
-
- Args:
- response (dict): dictionary of the response body
- fields (list): list of fields in the response to by replaced
- path (str): request path
- id_tracker (dict): a dictionary of the ids already assigned.
- """
- body = json.loads(response['body']['string'].decode('utf-8'))
- for field in fields:
- IdRemoverPersister.remove_id_in_a_json(body, field, path, id_tracker)
- response['body']['string'] = json.dumps(body).encode('utf-8')
-
- @staticmethod
- def remove_ids(ids2remove, cassette_dict):
- """Replaces ids with dummy values in a cassette.
-
- Replaces in cassette_dict (in-place) the fields defined by ids2remove with dummy values.
- Internally, it used a map (id_tracker) between real values and dummy values to keep
- consistency during the renaming.
-
- Args:
- ids2remove (dict): {request_path: [json_fields]}
- cassette_dict (dict): a VCR cassette dictionary.
- """
-
- id_tracker = {} # {old_id: new_id}
- for path, fields in ids2remove.items():
- responses = IdRemoverPersister.get_responses_with(path, cassette_dict)
- for response in responses:
- IdRemoverPersister.remove_ids_in_a_response(response, fields, path, id_tracker)
- for old_id, new_id in id_tracker.items():
- if isinstance(old_id, str):
- for request in cassette_dict['requests']:
- request.uri = request.uri.replace(old_id, new_id)
-
- @staticmethod
- def save_cassette(cassette_path, cassette_dict, serializer):
- """Extends FilesystemPersister.save_cassette
-
- Extends FilesystemPersister.save_cassette. Replaces particular values (defined by
- ids2remove) which are replaced by a dummy value. The full manipulation is in
- cassette_dict, before saving it using FilesystemPersister.save_cassette
-
- Args:
- cassette_path (str): the file location where the cassette will be saved.
- cassette_dict (dict): a VCR cassette dictionary. This is the information that will
- be dump in cassette_path, using serializer.
- serializer (callable): the serializer for dumping cassette_dict in cassette_path.
- """
- ids2remove = {'/api/users/loginWithToken': ['id',
- 'userId',
- 'created'],
- '/api/Jobs': ['id',
- 'userId',
- 'creationDate',
- 'qasms.executionId',
- 'qasms.result.date',
- 'qasms.result.data.time',
- 'qasms.result.data.additionalData.seed'],
- '/api/Backends': ['internalId',
- 'topologyId'],
- '/api/Backends/ibmqx5/queue/status': ['lengthQueue'],
- '/api/Backends/ibmqx4/queue/status': ['lengthQueue']}
- IdRemoverPersister.remove_ids(ids2remove, cassette_dict)
- super(IdRemoverPersister, IdRemoverPersister).save_cassette(cassette_path,
- cassette_dict,
- serializer)
-
-
-def http_recorder(vcr_mode, cassette_dir):
- """Creates a VCR object in vcr_mode mode.
-
- Args:
- vcr_mode (string): the parameter for record_mode.
- cassette_dir (string): path to the cassettes.
-
- Returns:
- VCR: a VCR object.
- """
- my_vcr = VCR(
- cassette_library_dir=cassette_dir,
- record_mode=vcr_mode,
- match_on=['method', 'scheme', 'host', 'port', 'path', 'unordered_query'],
- filter_headers=['x-qx-client-application', 'User-Agent'],
- filter_query_parameters=[('access_token', 'dummyapiusersloginWithTokenid01')],
- filter_post_data_parameters=[('apiToken', 'apiToken_dummy')],
- decode_compressed_response=True,
- before_record_response=_purge_headers_cb(['Date',
- ('Set-Cookie', 'dummy_cookie'),
- 'X-Global-Transaction-ID',
- 'Etag',
- 'Content-Security-Policy',
- 'X-Content-Security-Policy',
- 'X-Webkit-Csp',
- 'content-length']))
- my_vcr.register_matcher('unordered_query', _unordered_query_matcher)
- my_vcr.register_persister(IdRemoverPersister)
- return my_vcr
-
-
-def _purge_headers_cb(headers):
- """Remove headers from the response.
-
- Args:
- headers (list): headers to remove from the response
-
- Returns:
- callable: for been used in before_record_response VCR constructor.
- """
- header_list = []
- for item in headers:
- if not isinstance(item, tuple):
- item = (item, None)
- header_list.append(item[0:2]) # ensure the tuple is a pair
-
- def before_record_response_cb(response):
- """Purge headers from response.
-
- Args:
- response (dict): a VCR response
-
- Returns:
- dict: a VCR response
- """
- for (header, value) in header_list:
- with suppress(KeyError):
- if value:
- response['headers'][header] = value
- else:
- del response['headers'][header]
- return response
-
- return before_record_response_cb
-
-
-def _unordered_query_matcher(request1, request2):
- """A VCR matcher that ignores the order of values in the query string.
-
- A VCR matcher (a la VCR.matcher) that ignores the order of the values in the query string.
- Useful for filter params, for example.
-
- Args:
- request1 (Request): a VCR request
- request2 (Request): a VCR request
-
- Returns:
- bool: True if they match.
- """
- if request1.query == request2.query:
- return True
-
- dict1 = dict(request1.query)
- dict2 = dict(request2.query)
-
- if dict1 == dict2:
- return True
-
- if dict1.keys() != dict2.keys():
- return False
-
- for key, value in dict1.items():
- with suppress(ValueError):
- dict1[key] = json.loads(value)
- dict2[key] = json.loads(dict2[key])
-
- return dict1 == dict2
diff --git a/qiskit/test/utils.py b/qiskit/test/utils.py
index a861fa7a7ca2..929e3a6bddda 100644
--- a/qiskit/test/utils.py
+++ b/qiskit/test/utils.py
@@ -33,8 +33,6 @@ class Path(Enum):
EXAMPLES = os.path.normpath(os.path.join(SDK, '..', 'examples'))
# Schemas path: qiskit/schemas
SCHEMAS = os.path.normpath(os.path.join(SDK, 'schemas'))
- # VCR cassettes path: qiskit/test/cassettes/
- CASSETTES = os.path.normpath(os.path.join(TEST, '..', 'cassettes'))
# Sample QASMs path: qiskit/test/python/qasm
QASMS = os.path.normpath(os.path.join(TEST, 'qasm'))
diff --git a/qiskit/util.py b/qiskit/util.py
index ca6998629203..90ad9f30abf8 100644
--- a/qiskit/util.py
+++ b/qiskit/util.py
@@ -97,7 +97,7 @@ def _has_connection(hostname, port):
"""
try:
host = socket.gethostbyname(hostname)
- socket.create_connection((host, port), 2)
+ socket.create_connection((host, port), 2).close()
return True
except Exception: # pylint: disable=broad-except
return False
diff --git a/requirements-dev.txt b/requirements-dev.txt
index 737f11814f22..4dedc3cd967f 100644
--- a/requirements-dev.txt
+++ b/requirements-dev.txt
@@ -9,7 +9,6 @@ pydot
pylint>=2.3,<2.4
pylintfileheader>=0.0.2
stestr>=2.0.0
-vcrpy
PyGithub
wheel
cython>=0.27.1
diff --git a/test/cassettes/test_backend_monitor b/test/cassettes/test_backend_monitor
deleted file mode 100644
index 087f4fcf861f..000000000000
--- a/test/cassettes/test_backend_monitor
+++ /dev/null
@@ -1,381 +0,0 @@
-interactions:
-- request:
- body: apiToken=apiToken_dummy
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- Content-Length: ['137']
- Content-Type: [application/x-www-form-urlencoded]
- method: POST
- uri: https://quantumexperience.ng.bluemix.net/api/users/loginWithToken
- response:
- body: {string: '{"created": "dummyapiusersloginWithTokencreated01", "userId":
- "dummyapiusersloginWithTokenuserId01", "id": "dummyapiusersloginWithTokenid01",
- "ttl": 1209600}'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/v/1?access_token=dummyapiusersloginWithTokenid01
- response:
- body: {string: '[{"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmqx4",
- "max_shots": 8192, "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "simulator": false, "sample_name": "raven", "max_experiments": 75,
- "local": false, "backend_version": "1.0.0", "n_qubits": 5, "basis_gates":
- ["u1", "u2", "u3", "cx", "id"], "conditional": false, "url": "None", "gates":
- [{"qasm_def": "gate id q { U(0,0,0) q; }", "parameters": [], "coupling_map":
- [[0], [1], [2], [3], [4]], "name": "id"}, {"qasm_def": "gate u1(lambda) q
- { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3],
- [4]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda) q { U(theta,phi,lambda)
- q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }",
- "parameters": [], "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "name": "cx"}], "description": "5 qubit device", "open_pulse": false,
- "memory": true, "credits_required": true, "allow_q_object": true, "n_registers":
- 1}, {"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmq_16_melbourne",
- "max_shots": 8192, "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10],
- [5, 4], [5, 6], [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10],
- [11, 12], [12, 2], [13, 1], [13, 12]], "simulator": false, "sample_name":
- "albatross", "max_experiments": 75, "local": false, "backend_version": "1.0.0",
- "n_qubits": 14, "basis_gates": ["u1", "u2", "u3", "cx", "id"], "conditional":
- false, "url": "None", "gates": [{"qasm_def": "gate id q { U(0,0,0) q; }",
- "parameters": [], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "id"}, {"qasm_def": "gate u1(lambda)
- q { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]], "name": "u1"},
- {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda) q; }", "parameters":
- ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map":
- [[0], [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]],
- "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [],
- "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10], [5, 4], [5, 6],
- [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10], [11, 12], [12,
- 2], [13, 1], [13, 12]], "name": "cx"}], "description": "14 qubit device",
- "open_pulse": false, "memory": false, "credits_required": true, "allow_q_object":
- true, "n_registers": 1}, {"backend_name": "ibmq_qasm_simulator", "backend_version":
- "0.1.547", "simulator": true, "max_experiments": 300, "local": false, "max_shots":
- 8192, "n_qubits": 32, "basis_gates": ["u1", "u2", "u3", "cx"], "conditional":
- true, "gates": [{"qasm_def": "gate u1(lambda) q { U(0,0,lambda) q; }", "parameters":
- ["lambda"], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "name":
- "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [], "name":
- "cx"}], "open_pulse": false, "memory": true, "allow_q_object": true}]'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/v/1?access_token=dummyapiusersloginWithTokenid01
- response:
- body: {string: '[{"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmqx4",
- "max_shots": 8192, "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "simulator": false, "sample_name": "raven", "max_experiments": 75,
- "local": false, "backend_version": "1.0.0", "n_qubits": 5, "basis_gates":
- ["u1", "u2", "u3", "cx", "id"], "conditional": false, "url": "None", "gates":
- [{"qasm_def": "gate id q { U(0,0,0) q; }", "parameters": [], "coupling_map":
- [[0], [1], [2], [3], [4]], "name": "id"}, {"qasm_def": "gate u1(lambda) q
- { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3],
- [4]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda) q { U(theta,phi,lambda)
- q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }",
- "parameters": [], "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "name": "cx"}], "description": "5 qubit device", "open_pulse": false,
- "memory": true, "credits_required": true, "allow_q_object": true, "n_registers":
- 1}, {"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmq_16_melbourne",
- "max_shots": 8192, "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10],
- [5, 4], [5, 6], [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10],
- [11, 12], [12, 2], [13, 1], [13, 12]], "simulator": false, "sample_name":
- "albatross", "max_experiments": 75, "local": false, "backend_version": "1.0.0",
- "n_qubits": 14, "basis_gates": ["u1", "u2", "u3", "cx", "id"], "conditional":
- false, "url": "None", "gates": [{"qasm_def": "gate id q { U(0,0,0) q; }",
- "parameters": [], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "id"}, {"qasm_def": "gate u1(lambda)
- q { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]], "name": "u1"},
- {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda) q; }", "parameters":
- ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map":
- [[0], [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]],
- "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [],
- "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10], [5, 4], [5, 6],
- [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10], [11, 12], [12,
- 2], [13, 1], [13, 12]], "name": "cx"}], "description": "14 qubit device",
- "open_pulse": false, "memory": false, "credits_required": true, "allow_q_object":
- true, "n_registers": 1}, {"backend_name": "ibmq_qasm_simulator", "backend_version":
- "0.1.547", "simulator": true, "max_experiments": 300, "local": false, "max_shots":
- 8192, "n_qubits": 32, "basis_gates": ["u1", "u2", "u3", "cx"], "conditional":
- true, "gates": [{"qasm_def": "gate u1(lambda) q { U(0,0,lambda) q; }", "parameters":
- ["lambda"], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "name":
- "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [], "name":
- "cx"}], "open_pulse": false, "memory": true, "allow_q_object": true}]'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/ibmqx4/queue/status
- response:
- body: {string: '{"state": true, "status": "active", "backend_version": "1.0.0",
- "lengthQueue": 42}'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/v/1?access_token=dummyapiusersloginWithTokenid01
- response:
- body: {string: '[{"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmqx4",
- "max_shots": 8192, "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "simulator": false, "sample_name": "raven", "max_experiments": 75,
- "local": false, "backend_version": "1.0.0", "n_qubits": 5, "basis_gates":
- ["u1", "u2", "u3", "cx", "id"], "conditional": false, "url": "None", "gates":
- [{"qasm_def": "gate id q { U(0,0,0) q; }", "parameters": [], "coupling_map":
- [[0], [1], [2], [3], [4]], "name": "id"}, {"qasm_def": "gate u1(lambda) q
- { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3],
- [4]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda) q { U(theta,phi,lambda)
- q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }",
- "parameters": [], "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "name": "cx"}], "description": "5 qubit device", "open_pulse": false,
- "memory": true, "credits_required": true, "allow_q_object": true, "n_registers":
- 1}, {"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmq_16_melbourne",
- "max_shots": 8192, "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10],
- [5, 4], [5, 6], [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10],
- [11, 12], [12, 2], [13, 1], [13, 12]], "simulator": false, "sample_name":
- "albatross", "max_experiments": 75, "local": false, "backend_version": "1.0.0",
- "n_qubits": 14, "basis_gates": ["u1", "u2", "u3", "cx", "id"], "conditional":
- false, "url": "None", "gates": [{"qasm_def": "gate id q { U(0,0,0) q; }",
- "parameters": [], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "id"}, {"qasm_def": "gate u1(lambda)
- q { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]], "name": "u1"},
- {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda) q; }", "parameters":
- ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map":
- [[0], [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]],
- "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [],
- "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10], [5, 4], [5, 6],
- [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10], [11, 12], [12,
- 2], [13, 1], [13, 12]], "name": "cx"}], "description": "14 qubit device",
- "open_pulse": false, "memory": false, "credits_required": true, "allow_q_object":
- true, "n_registers": 1}, {"backend_name": "ibmq_qasm_simulator", "backend_version":
- "0.1.547", "simulator": true, "max_experiments": 300, "local": false, "max_shots":
- 8192, "n_qubits": 32, "basis_gates": ["u1", "u2", "u3", "cx"], "conditional":
- true, "gates": [{"qasm_def": "gate u1(lambda) q { U(0,0,lambda) q; }", "parameters":
- ["lambda"], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "name":
- "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [], "name":
- "cx"}], "open_pulse": false, "memory": true, "allow_q_object": true}]'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/ibmqx4/properties?access_token=dummyapiusersloginWithTokenid01&version=1
- response:
- body: {string: '{"backend_name": "ibmqx4", "backend_version": "1.0.0", "qubits":
- [[{"date": "2019-02-05T02:15:59Z", "unit": "\u00b5s", "name": "T1", "value":
- 40.23580728497557}, {"date": "2019-02-05T02:16:42Z", "unit": "\u00b5s", "name":
- "T2", "value": 28.053796455696148}, {"date": "2019-02-05T02:54:28Z", "unit":
- "GHz", "name": "frequency", "value": 5.2498546876946115}, {"date": "2019-02-05T02:15:40Z",
- "unit": "", "name": "readout_error", "value": 0.08450000000000002}], [{"date":
- "2019-02-04T10:14:24Z", "unit": "\u00b5s", "name": "T1", "value": 67.39179431370326},
- {"date": "2019-02-05T02:17:25Z", "unit": "\u00b5s", "name": "T2", "value":
- 10.763705758897595}, {"date": "2019-02-05T02:54:28Z", "unit": "GHz", "name":
- "frequency", "value": 5.295772109235333}, {"date": "2019-02-05T02:15:40Z",
- "unit": "", "name": "readout_error", "value": 0.07925000000000004}], [{"date":
- "2019-02-05T02:15:59Z", "unit": "\u00b5s", "name": "T1", "value": 43.393200515665946},
- {"date": "2019-02-05T02:18:06Z", "unit": "\u00b5s", "name": "T2", "value":
- 27.3542760576606}, {"date": "2019-02-05T02:54:28Z", "unit": "GHz", "name":
- "frequency", "value": 5.3533332743669355}, {"date": "2019-02-05T02:15:40Z",
- "unit": "", "name": "readout_error", "value": 0.03699999999999992}], [{"date":
- "2019-02-05T02:15:59Z", "unit": "\u00b5s", "name": "T1", "value": 55.541546751266985},
- {"date": "2019-02-05T02:17:25Z", "unit": "\u00b5s", "name": "T2", "value":
- 14.846271380938276}, {"date": "2019-02-05T02:54:28Z", "unit": "GHz", "name":
- "frequency", "value": 5.434936834513384}, {"date": "2019-02-05T02:15:40Z",
- "unit": "", "name": "readout_error", "value": 0.03875000000000006}], [{"date":
- "2019-02-05T02:15:59Z", "unit": "\u00b5s", "name": "T1", "value": 53.860484623965284},
- {"date": "2019-02-05T02:16:42Z", "unit": "\u00b5s", "name": "T2", "value":
- 4.983364732947786}, {"date": "2019-02-05T02:54:28Z", "unit": "GHz", "name":
- "frequency", "value": 5.175855462568935}, {"date": "2019-02-05T02:15:40Z",
- "unit": "", "name": "readout_error", "value": 0.26875000000000004}]], "gates":
- [{"parameters": [{"date": "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error",
- "value": 0}], "qubits": [0], "gate": "u1"}, {"parameters": [{"date": "2019-02-05T10:57:11Z",
- "unit": "", "name": "gate_error", "value": 0.0006867731322012238}], "qubits":
- [0], "gate": "u2"}, {"parameters": [{"date": "2019-02-05T10:57:11Z", "unit":
- "", "name": "gate_error", "value": 0.0013735462644024476}], "qubits": [0],
- "gate": "u3"}, {"parameters": [{"date": "2019-02-05T10:57:11Z", "unit": "",
- "name": "gate_error", "value": 0}], "qubits": [1], "gate": "u1"}, {"parameters":
- [{"date": "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error", "value":
- 0.00128782749692391}], "qubits": [1], "gate": "u2"}, {"parameters": [{"date":
- "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error", "value": 0.00257565499384782}],
- "qubits": [1], "gate": "u3"}, {"parameters": [{"date": "2019-02-05T10:57:11Z",
- "unit": "", "name": "gate_error", "value": 0}], "qubits": [2], "gate": "u1"},
- {"parameters": [{"date": "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error",
- "value": 0.00128782749692391}], "qubits": [2], "gate": "u2"}, {"parameters":
- [{"date": "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error", "value":
- 0.00257565499384782}], "qubits": [2], "gate": "u3"}, {"parameters": [{"date":
- "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error", "value": 0}], "qubits":
- [3], "gate": "u1"}, {"parameters": [{"date": "2019-02-05T10:57:11Z", "unit":
- "", "name": "gate_error", "value": 0.001803112096824766}], "qubits": [3],
- "gate": "u2"}, {"parameters": [{"date": "2019-02-05T10:57:11Z", "unit": "",
- "name": "gate_error", "value": 0.003606224193649532}], "qubits": [3], "gate":
- "u3"}, {"parameters": [{"date": "2019-02-05T10:57:11Z", "unit": "", "name":
- "gate_error", "value": 0}], "qubits": [4], "gate": "u1"}, {"parameters": [{"date":
- "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error", "value": 0.006444645993361475}],
- "qubits": [4], "gate": "u2"}, {"parameters": [{"date": "2019-02-05T10:57:11Z",
- "unit": "", "name": "gate_error", "value": 0.01288929198672295}], "qubits":
- [4], "gate": "u3"}, {"parameters": [{"date": "2019-02-05T02:25:32Z", "unit":
- "", "name": "gate_error", "value": 0.03594617578113263}], "qubits": [1, 0],
- "name": "CX1_0", "gate": "cx"}, {"parameters": [{"date": "2019-02-05T02:31:04Z",
- "unit": "", "name": "gate_error", "value": 0.03205473341614962}], "qubits":
- [2, 0], "name": "CX2_0", "gate": "cx"}, {"parameters": [{"date": "2019-02-05T02:36:21Z",
- "unit": "", "name": "gate_error", "value": 0.048500617566183984}], "qubits":
- [2, 1], "name": "CX2_1", "gate": "cx"}, {"parameters": [{"date": "2019-02-05T02:41:40Z",
- "unit": "", "name": "gate_error", "value": 0.07474221943376097}], "qubits":
- [3, 2], "name": "CX3_2", "gate": "cx"}, {"parameters": [{"date": "2019-02-05T02:47:44Z",
- "unit": "", "name": "gate_error", "value": 0.07660114123887399}], "qubits":
- [3, 4], "name": "CX3_4", "gate": "cx"}, {"parameters": [{"date": "2019-02-04T10:53:35Z",
- "unit": "", "name": "gate_error", "value": 0.06824929220587475}], "qubits":
- [4, 2], "name": "CX4_2", "gate": "cx"}], "last_update_date": "2019-02-05T02:54:28.000Z",
- "general": []}'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-version: 1
diff --git a/test/cassettes/test_backend_overview b/test/cassettes/test_backend_overview
deleted file mode 100644
index af512f726732..000000000000
--- a/test/cassettes/test_backend_overview
+++ /dev/null
@@ -1,976 +0,0 @@
-interactions:
-- request:
- body: apiToken=apiToken_dummy
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- Content-Length: ['137']
- Content-Type: [application/x-www-form-urlencoded]
- method: POST
- uri: https://quantumexperience.ng.bluemix.net/api/users/loginWithToken
- response:
- body: {string: '{"created": "dummyapiusersloginWithTokencreated01", "userId":
- "dummyapiusersloginWithTokenuserId01", "id": "dummyapiusersloginWithTokenid01",
- "ttl": 1209600}'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/v/1?access_token=dummyapiusersloginWithTokenid01
- response:
- body: {string: '[{"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmqx4",
- "max_shots": 8192, "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "simulator": false, "sample_name": "raven", "max_experiments": 75,
- "local": false, "backend_version": "1.0.0", "n_qubits": 5, "basis_gates":
- ["u1", "u2", "u3", "cx", "id"], "conditional": false, "url": "None", "gates":
- [{"qasm_def": "gate id q { U(0,0,0) q; }", "parameters": [], "coupling_map":
- [[0], [1], [2], [3], [4]], "name": "id"}, {"qasm_def": "gate u1(lambda) q
- { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3],
- [4]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda) q { U(theta,phi,lambda)
- q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }",
- "parameters": [], "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "name": "cx"}], "description": "5 qubit device", "open_pulse": false,
- "memory": true, "credits_required": true, "allow_q_object": true, "n_registers":
- 1}, {"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmq_16_melbourne",
- "max_shots": 8192, "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10],
- [5, 4], [5, 6], [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10],
- [11, 12], [12, 2], [13, 1], [13, 12]], "simulator": false, "sample_name":
- "albatross", "max_experiments": 75, "local": false, "backend_version": "1.0.0",
- "n_qubits": 14, "basis_gates": ["u1", "u2", "u3", "cx", "id"], "conditional":
- false, "url": "None", "gates": [{"qasm_def": "gate id q { U(0,0,0) q; }",
- "parameters": [], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "id"}, {"qasm_def": "gate u1(lambda)
- q { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]], "name": "u1"},
- {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda) q; }", "parameters":
- ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map":
- [[0], [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]],
- "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [],
- "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10], [5, 4], [5, 6],
- [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10], [11, 12], [12,
- 2], [13, 1], [13, 12]], "name": "cx"}], "description": "14 qubit device",
- "open_pulse": false, "memory": false, "credits_required": true, "allow_q_object":
- true, "n_registers": 1}, {"backend_name": "ibmq_qasm_simulator", "backend_version":
- "0.1.547", "simulator": true, "max_experiments": 300, "local": false, "max_shots":
- 8192, "n_qubits": 32, "basis_gates": ["u1", "u2", "u3", "cx"], "conditional":
- true, "gates": [{"qasm_def": "gate u1(lambda) q { U(0,0,lambda) q; }", "parameters":
- ["lambda"], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "name":
- "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [], "name":
- "cx"}], "open_pulse": false, "memory": true, "allow_q_object": true}]'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/v/1?access_token=dummyapiusersloginWithTokenid01
- response:
- body: {string: '[{"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmqx4",
- "max_shots": 8192, "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "simulator": false, "sample_name": "raven", "max_experiments": 75,
- "local": false, "backend_version": "1.0.0", "n_qubits": 5, "basis_gates":
- ["u1", "u2", "u3", "cx", "id"], "conditional": false, "url": "None", "gates":
- [{"qasm_def": "gate id q { U(0,0,0) q; }", "parameters": [], "coupling_map":
- [[0], [1], [2], [3], [4]], "name": "id"}, {"qasm_def": "gate u1(lambda) q
- { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3],
- [4]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda) q { U(theta,phi,lambda)
- q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }",
- "parameters": [], "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "name": "cx"}], "description": "5 qubit device", "open_pulse": false,
- "memory": true, "credits_required": true, "allow_q_object": true, "n_registers":
- 1}, {"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmq_16_melbourne",
- "max_shots": 8192, "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10],
- [5, 4], [5, 6], [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10],
- [11, 12], [12, 2], [13, 1], [13, 12]], "simulator": false, "sample_name":
- "albatross", "max_experiments": 75, "local": false, "backend_version": "1.0.0",
- "n_qubits": 14, "basis_gates": ["u1", "u2", "u3", "cx", "id"], "conditional":
- false, "url": "None", "gates": [{"qasm_def": "gate id q { U(0,0,0) q; }",
- "parameters": [], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "id"}, {"qasm_def": "gate u1(lambda)
- q { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]], "name": "u1"},
- {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda) q; }", "parameters":
- ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map":
- [[0], [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]],
- "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [],
- "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10], [5, 4], [5, 6],
- [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10], [11, 12], [12,
- 2], [13, 1], [13, 12]], "name": "cx"}], "description": "14 qubit device",
- "open_pulse": false, "memory": false, "credits_required": true, "allow_q_object":
- true, "n_registers": 1}, {"backend_name": "ibmq_qasm_simulator", "backend_version":
- "0.1.547", "simulator": true, "max_experiments": 300, "local": false, "max_shots":
- 8192, "n_qubits": 32, "basis_gates": ["u1", "u2", "u3", "cx"], "conditional":
- true, "gates": [{"qasm_def": "gate u1(lambda) q { U(0,0,lambda) q; }", "parameters":
- ["lambda"], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "name":
- "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [], "name":
- "cx"}], "open_pulse": false, "memory": true, "allow_q_object": true}]'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/ibmqx4/queue/status
- response:
- body: {string: '{"state": true, "status": "active", "backend_version": "1.0.0",
- "lengthQueue": 42}'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/v/1?access_token=dummyapiusersloginWithTokenid01
- response:
- body: {string: '[{"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmqx4",
- "max_shots": 8192, "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "simulator": false, "sample_name": "raven", "max_experiments": 75,
- "local": false, "backend_version": "1.0.0", "n_qubits": 5, "basis_gates":
- ["u1", "u2", "u3", "cx", "id"], "conditional": false, "url": "None", "gates":
- [{"qasm_def": "gate id q { U(0,0,0) q; }", "parameters": [], "coupling_map":
- [[0], [1], [2], [3], [4]], "name": "id"}, {"qasm_def": "gate u1(lambda) q
- { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3],
- [4]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda) q { U(theta,phi,lambda)
- q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }",
- "parameters": [], "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "name": "cx"}], "description": "5 qubit device", "open_pulse": false,
- "memory": true, "credits_required": true, "allow_q_object": true, "n_registers":
- 1}, {"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmq_16_melbourne",
- "max_shots": 8192, "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10],
- [5, 4], [5, 6], [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10],
- [11, 12], [12, 2], [13, 1], [13, 12]], "simulator": false, "sample_name":
- "albatross", "max_experiments": 75, "local": false, "backend_version": "1.0.0",
- "n_qubits": 14, "basis_gates": ["u1", "u2", "u3", "cx", "id"], "conditional":
- false, "url": "None", "gates": [{"qasm_def": "gate id q { U(0,0,0) q; }",
- "parameters": [], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "id"}, {"qasm_def": "gate u1(lambda)
- q { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]], "name": "u1"},
- {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda) q; }", "parameters":
- ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map":
- [[0], [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]],
- "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [],
- "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10], [5, 4], [5, 6],
- [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10], [11, 12], [12,
- 2], [13, 1], [13, 12]], "name": "cx"}], "description": "14 qubit device",
- "open_pulse": false, "memory": false, "credits_required": true, "allow_q_object":
- true, "n_registers": 1}, {"backend_name": "ibmq_qasm_simulator", "backend_version":
- "0.1.547", "simulator": true, "max_experiments": 300, "local": false, "max_shots":
- 8192, "n_qubits": 32, "basis_gates": ["u1", "u2", "u3", "cx"], "conditional":
- true, "gates": [{"qasm_def": "gate u1(lambda) q { U(0,0,lambda) q; }", "parameters":
- ["lambda"], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "name":
- "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [], "name":
- "cx"}], "open_pulse": false, "memory": true, "allow_q_object": true}]'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/ibmq_16_melbourne/queue/status
- response:
- body: {string: '{"state": true, "status": "active", "backend_version": "1.0.0",
- "lengthQueue": 40}'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/v/1?access_token=dummyapiusersloginWithTokenid01
- response:
- body: {string: '[{"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmqx4",
- "max_shots": 8192, "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "simulator": false, "sample_name": "raven", "max_experiments": 75,
- "local": false, "backend_version": "1.0.0", "n_qubits": 5, "basis_gates":
- ["u1", "u2", "u3", "cx", "id"], "conditional": false, "url": "None", "gates":
- [{"qasm_def": "gate id q { U(0,0,0) q; }", "parameters": [], "coupling_map":
- [[0], [1], [2], [3], [4]], "name": "id"}, {"qasm_def": "gate u1(lambda) q
- { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3],
- [4]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda) q { U(theta,phi,lambda)
- q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }",
- "parameters": [], "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "name": "cx"}], "description": "5 qubit device", "open_pulse": false,
- "memory": true, "credits_required": true, "allow_q_object": true, "n_registers":
- 1}, {"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmq_16_melbourne",
- "max_shots": 8192, "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10],
- [5, 4], [5, 6], [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10],
- [11, 12], [12, 2], [13, 1], [13, 12]], "simulator": false, "sample_name":
- "albatross", "max_experiments": 75, "local": false, "backend_version": "1.0.0",
- "n_qubits": 14, "basis_gates": ["u1", "u2", "u3", "cx", "id"], "conditional":
- false, "url": "None", "gates": [{"qasm_def": "gate id q { U(0,0,0) q; }",
- "parameters": [], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "id"}, {"qasm_def": "gate u1(lambda)
- q { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]], "name": "u1"},
- {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda) q; }", "parameters":
- ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map":
- [[0], [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]],
- "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [],
- "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10], [5, 4], [5, 6],
- [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10], [11, 12], [12,
- 2], [13, 1], [13, 12]], "name": "cx"}], "description": "14 qubit device",
- "open_pulse": false, "memory": false, "credits_required": true, "allow_q_object":
- true, "n_registers": 1}, {"backend_name": "ibmq_qasm_simulator", "backend_version":
- "0.1.547", "simulator": true, "max_experiments": 300, "local": false, "max_shots":
- 8192, "n_qubits": 32, "basis_gates": ["u1", "u2", "u3", "cx"], "conditional":
- true, "gates": [{"qasm_def": "gate u1(lambda) q { U(0,0,lambda) q; }", "parameters":
- ["lambda"], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "name":
- "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [], "name":
- "cx"}], "open_pulse": false, "memory": true, "allow_q_object": true}]'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/ibmq_16_melbourne/queue/status
- response:
- body: {string: '{"state": true, "status": "active", "backend_version": "1.0.0",
- "lengthQueue": 40}'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/v/1?access_token=dummyapiusersloginWithTokenid01
- response:
- body: {string: '[{"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmqx4",
- "max_shots": 8192, "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "simulator": false, "sample_name": "raven", "max_experiments": 75,
- "local": false, "backend_version": "1.0.0", "n_qubits": 5, "basis_gates":
- ["u1", "u2", "u3", "cx", "id"], "conditional": false, "url": "None", "gates":
- [{"qasm_def": "gate id q { U(0,0,0) q; }", "parameters": [], "coupling_map":
- [[0], [1], [2], [3], [4]], "name": "id"}, {"qasm_def": "gate u1(lambda) q
- { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3],
- [4]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda) q { U(theta,phi,lambda)
- q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }",
- "parameters": [], "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "name": "cx"}], "description": "5 qubit device", "open_pulse": false,
- "memory": true, "credits_required": true, "allow_q_object": true, "n_registers":
- 1}, {"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmq_16_melbourne",
- "max_shots": 8192, "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10],
- [5, 4], [5, 6], [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10],
- [11, 12], [12, 2], [13, 1], [13, 12]], "simulator": false, "sample_name":
- "albatross", "max_experiments": 75, "local": false, "backend_version": "1.0.0",
- "n_qubits": 14, "basis_gates": ["u1", "u2", "u3", "cx", "id"], "conditional":
- false, "url": "None", "gates": [{"qasm_def": "gate id q { U(0,0,0) q; }",
- "parameters": [], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "id"}, {"qasm_def": "gate u1(lambda)
- q { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]], "name": "u1"},
- {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda) q; }", "parameters":
- ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map":
- [[0], [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]],
- "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [],
- "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10], [5, 4], [5, 6],
- [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10], [11, 12], [12,
- 2], [13, 1], [13, 12]], "name": "cx"}], "description": "14 qubit device",
- "open_pulse": false, "memory": false, "credits_required": true, "allow_q_object":
- true, "n_registers": 1}, {"backend_name": "ibmq_qasm_simulator", "backend_version":
- "0.1.547", "simulator": true, "max_experiments": 300, "local": false, "max_shots":
- 8192, "n_qubits": 32, "basis_gates": ["u1", "u2", "u3", "cx"], "conditional":
- true, "gates": [{"qasm_def": "gate u1(lambda) q { U(0,0,lambda) q; }", "parameters":
- ["lambda"], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "name":
- "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [], "name":
- "cx"}], "open_pulse": false, "memory": true, "allow_q_object": true}]'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/ibmqx4/queue/status
- response:
- body: {string: '{"state": true, "status": "active", "backend_version": "1.0.0",
- "lengthQueue": 42}'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/v/1?access_token=dummyapiusersloginWithTokenid01
- response:
- body: {string: '[{"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmqx4",
- "max_shots": 8192, "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "simulator": false, "sample_name": "raven", "max_experiments": 75,
- "local": false, "backend_version": "1.0.0", "n_qubits": 5, "basis_gates":
- ["u1", "u2", "u3", "cx", "id"], "conditional": false, "url": "None", "gates":
- [{"qasm_def": "gate id q { U(0,0,0) q; }", "parameters": [], "coupling_map":
- [[0], [1], [2], [3], [4]], "name": "id"}, {"qasm_def": "gate u1(lambda) q
- { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3],
- [4]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda) q { U(theta,phi,lambda)
- q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }",
- "parameters": [], "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "name": "cx"}], "description": "5 qubit device", "open_pulse": false,
- "memory": true, "credits_required": true, "allow_q_object": true, "n_registers":
- 1}, {"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmq_16_melbourne",
- "max_shots": 8192, "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10],
- [5, 4], [5, 6], [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10],
- [11, 12], [12, 2], [13, 1], [13, 12]], "simulator": false, "sample_name":
- "albatross", "max_experiments": 75, "local": false, "backend_version": "1.0.0",
- "n_qubits": 14, "basis_gates": ["u1", "u2", "u3", "cx", "id"], "conditional":
- false, "url": "None", "gates": [{"qasm_def": "gate id q { U(0,0,0) q; }",
- "parameters": [], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "id"}, {"qasm_def": "gate u1(lambda)
- q { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]], "name": "u1"},
- {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda) q; }", "parameters":
- ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map":
- [[0], [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]],
- "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [],
- "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10], [5, 4], [5, 6],
- [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10], [11, 12], [12,
- 2], [13, 1], [13, 12]], "name": "cx"}], "description": "14 qubit device",
- "open_pulse": false, "memory": false, "credits_required": true, "allow_q_object":
- true, "n_registers": 1}, {"backend_name": "ibmq_qasm_simulator", "backend_version":
- "0.1.547", "simulator": true, "max_experiments": 300, "local": false, "max_shots":
- 8192, "n_qubits": 32, "basis_gates": ["u1", "u2", "u3", "cx"], "conditional":
- true, "gates": [{"qasm_def": "gate u1(lambda) q { U(0,0,lambda) q; }", "parameters":
- ["lambda"], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "name":
- "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [], "name":
- "cx"}], "open_pulse": false, "memory": true, "allow_q_object": true}]'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/ibmq_16_melbourne/properties?access_token=dummyapiusersloginWithTokenid01&version=1
- response:
- body: {string: '{"backend_name": "ibmq_16_melbourne", "backend_version": "1.0.0",
- "qubits": [[{"date": "2019-02-08T07:24:46Z", "unit": "\u00b5s", "name": "T1",
- "value": 75.96106177842839}, {"date": "2019-02-08T07:26:00Z", "unit": "\u00b5s",
- "name": "T2", "value": 20.8057158459146}, {"date": "2019-02-08T09:22:04Z",
- "unit": "GHz", "name": "frequency", "value": 5.1000759392113855}, {"date":
- "2019-02-08T07:24:20Z", "unit": "", "name": "readout_error", "value": 0.0867}],
- [{"date": "2019-02-08T07:24:46Z", "unit": "\u00b5s", "name": "T1", "value":
- 48.35498967351742}, {"date": "2019-02-08T07:27:01Z", "unit": "\u00b5s", "name":
- "T2", "value": 106.19700447796879}, {"date": "2019-02-08T09:22:04Z", "unit":
- "GHz", "name": "frequency", "value": 5.238659501129794}, {"date": "2019-02-08T07:24:20Z",
- "unit": "", "name": "readout_error", "value": 0.07750000000000001}], [{"date":
- "2019-02-08T07:24:46Z", "unit": "\u00b5s", "name": "T1", "value": 83.33016929069103},
- {"date": "2019-02-08T07:28:02Z", "unit": "\u00b5s", "name": "T2", "value":
- 143.9968400517331}, {"date": "2019-02-08T09:22:04Z", "unit": "GHz", "name":
- "frequency", "value": 5.03300771358076}, {"date": "2019-02-08T07:24:20Z",
- "unit": "", "name": "readout_error", "value": 0.08389999999999997}], [{"date":
- "2019-02-08T07:24:46Z", "unit": "\u00b5s", "name": "T1", "value": 61.35153237405397},
- {"date": "2019-02-08T07:29:05Z", "unit": "\u00b5s", "name": "T2", "value":
- 59.591728676307696}, {"date": "2019-02-08T09:22:04Z", "unit": "GHz", "name":
- "frequency", "value": 4.896170097816411}, {"date": "2019-02-08T07:24:20Z",
- "unit": "", "name": "readout_error", "value": 0.2329}], [{"date": "2019-02-08T07:24:46Z",
- "unit": "\u00b5s", "name": "T1", "value": 58.37651453120168}, {"date": "2019-02-08T07:26:00Z",
- "unit": "\u00b5s", "name": "T2", "value": 39.26840578288146}, {"date": "2019-02-08T09:22:04Z",
- "unit": "GHz", "name": "frequency", "value": 5.0272302387915655}, {"date":
- "2019-02-08T07:24:20Z", "unit": "", "name": "readout_error", "value": 0.02300000000000002}],
- [{"date": "2019-02-08T07:24:46Z", "unit": "\u00b5s", "name": "T1", "value":
- 22.72871799765391}, {"date": "2019-02-08T07:27:01Z", "unit": "\u00b5s", "name":
- "T2", "value": 37.79789351697824}, {"date": "2019-02-08T09:22:04Z", "unit":
- "GHz", "name": "frequency", "value": 5.067144859702533}, {"date": "2019-02-08T07:24:20Z",
- "unit": "", "name": "readout_error", "value": 0.05010000000000003}], [{"date":
- "2019-02-07T07:31:24Z", "unit": "\u00b5s", "name": "T1", "value": 55.96731254275887},
- {"date": "2019-02-08T07:28:02Z", "unit": "\u00b5s", "name": "T2", "value":
- 50.2573329320607}, {"date": "2019-02-08T09:22:04Z", "unit": "GHz", "name":
- "frequency", "value": 4.92380186054606}, {"date": "2019-02-08T07:24:20Z",
- "unit": "", "name": "readout_error", "value": 0.03970000000000007}], [{"date":
- "2019-02-08T07:24:46Z", "unit": "\u00b5s", "name": "T1", "value": 43.23735783723179},
- {"date": "2019-02-08T07:29:05Z", "unit": "\u00b5s", "name": "T2", "value":
- 97.48037449118453}, {"date": "2019-02-08T09:22:04Z", "unit": "GHz", "name":
- "frequency", "value": 4.974517320613159}, {"date": "2019-02-08T07:24:20Z",
- "unit": "", "name": "readout_error", "value": 0.053300000000000014}], [{"date":
- "2019-02-08T07:24:46Z", "unit": "\u00b5s", "name": "T1", "value": 56.847309252980956},
- {"date": "2019-02-08T07:26:00Z", "unit": "\u00b5s", "name": "T2", "value":
- 92.92386230529024}, {"date": "2019-02-08T09:22:04Z", "unit": "GHz", "name":
- "frequency", "value": 4.7397852983601725}, {"date": "2019-02-08T07:24:20Z",
- "unit": "", "name": "readout_error", "value": 0.06499999999999995}], [{"date":
- "2019-02-08T07:24:46Z", "unit": "\u00b5s", "name": "T1", "value": 33.31011991217951},
- {"date": "2019-02-08T07:28:02Z", "unit": "\u00b5s", "name": "T2", "value":
- 75.07465442414373}, {"date": "2019-02-08T09:22:04Z", "unit": "GHz", "name":
- "frequency", "value": 4.963366733569522}, {"date": "2019-02-08T07:24:20Z",
- "unit": "", "name": "readout_error", "value": 0.12109999999999999}], [{"date":
- "2019-02-08T07:24:46Z", "unit": "\u00b5s", "name": "T1", "value": 40.028390636704025},
- {"date": "2019-02-08T07:27:01Z", "unit": "\u00b5s", "name": "T2", "value":
- 49.201566672903326}, {"date": "2019-02-08T09:22:04Z", "unit": "GHz", "name":
- "frequency", "value": 4.945087520892438}, {"date": "2019-02-08T07:24:20Z",
- "unit": "", "name": "readout_error", "value": 0.05059999999999998}], [{"date":
- "2019-02-08T07:24:46Z", "unit": "\u00b5s", "name": "T1", "value": 65.67291748177534},
- {"date": "2019-02-08T07:28:02Z", "unit": "\u00b5s", "name": "T2", "value":
- 119.6563074422006}, {"date": "2019-02-08T09:22:04Z", "unit": "GHz", "name":
- "frequency", "value": 5.005281828430722}, {"date": "2019-02-08T07:24:20Z",
- "unit": "", "name": "readout_error", "value": 0.10749999999999993}], [{"date":
- "2019-02-08T07:24:46Z", "unit": "\u00b5s", "name": "T1", "value": 78.93072850344592},
- {"date": "2019-02-08T07:27:01Z", "unit": "\u00b5s", "name": "T2", "value":
- 75.83757478264616}, {"date": "2019-02-08T09:22:04Z", "unit": "GHz", "name":
- "frequency", "value": 4.760146329316313}, {"date": "2019-02-08T07:24:20Z",
- "unit": "", "name": "readout_error", "value": 0.05049999999999999}], [{"date":
- "2019-02-08T07:24:46Z", "unit": "\u00b5s", "name": "T1", "value": 27.14948595011101},
- {"date": "2019-02-08T07:26:00Z", "unit": "\u00b5s", "name": "T2", "value":
- 53.26526727648976}, {"date": "2019-02-08T09:22:04Z", "unit": "GHz", "name":
- "frequency", "value": 4.9684746329685225}, {"date": "2019-02-08T07:24:20Z",
- "unit": "", "name": "readout_error", "value": 0.04730000000000001}]], "gates":
- [{"parameters": [{"date": "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error",
- "value": 0}], "qubits": [0], "gate": "u1"}, {"parameters": [{"date": "2019-02-09T07:42:54Z",
- "unit": "", "name": "gate_error", "value": 0.001679428505927727}], "qubits":
- [0], "gate": "u2"}, {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit":
- "", "name": "gate_error", "value": 0.003358857011855454}], "qubits": [0],
- "gate": "u3"}, {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit": "",
- "name": "gate_error", "value": 0}], "qubits": [1], "gate": "u1"}, {"parameters":
- [{"date": "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value":
- 0.006080477938646023}], "qubits": [1], "gate": "u2"}, {"parameters": [{"date":
- "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value": 0.012160955877292046}],
- "qubits": [1], "gate": "u3"}, {"parameters": [{"date": "2019-02-09T07:42:54Z",
- "unit": "", "name": "gate_error", "value": 0}], "qubits": [2], "gate": "u1"},
- {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error",
- "value": 0.004801001301362573}], "qubits": [2], "gate": "u2"}, {"parameters":
- [{"date": "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value":
- 0.009602002602725146}], "qubits": [2], "gate": "u3"}, {"parameters": [{"date":
- "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value": 0}], "qubits":
- [3], "gate": "u1"}, {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit":
- "", "name": "gate_error", "value": 0.0019100464965977615}], "qubits": [3],
- "gate": "u2"}, {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit": "",
- "name": "gate_error", "value": 0.003820092993195523}], "qubits": [3], "gate":
- "u3"}, {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit": "", "name":
- "gate_error", "value": 0}], "qubits": [4], "gate": "u1"}, {"parameters": [{"date":
- "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value": 0.0016789859784202}],
- "qubits": [4], "gate": "u2"}, {"parameters": [{"date": "2019-02-09T07:42:54Z",
- "unit": "", "name": "gate_error", "value": 0.0033579719568404}], "qubits":
- [4], "gate": "u3"}, {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit":
- "", "name": "gate_error", "value": 0}], "qubits": [5], "gate": "u1"}, {"parameters":
- [{"date": "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value":
- 0.0024183102041727134}], "qubits": [5], "gate": "u2"}, {"parameters": [{"date":
- "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value": 0.004836620408345427}],
- "qubits": [5], "gate": "u3"}, {"parameters": [{"date": "2019-02-09T07:42:54Z",
- "unit": "", "name": "gate_error", "value": 0}], "qubits": [6], "gate": "u1"},
- {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error",
- "value": 0.0014604602446032233}], "qubits": [6], "gate": "u2"}, {"parameters":
- [{"date": "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value":
- 0.0029209204892064466}], "qubits": [6], "gate": "u3"}, {"parameters": [{"date":
- "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value": 0}], "qubits":
- [7], "gate": "u1"}, {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit":
- "", "name": "gate_error", "value": 0.0031350322639342454}], "qubits": [7],
- "gate": "u2"}, {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit": "",
- "name": "gate_error", "value": 0.006270064527868491}], "qubits": [7], "gate":
- "u3"}, {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit": "", "name":
- "gate_error", "value": 0}], "qubits": [8], "gate": "u1"}, {"parameters": [{"date":
- "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value": 0.0024551154718702173}],
- "qubits": [8], "gate": "u2"}, {"parameters": [{"date": "2019-02-09T07:42:54Z",
- "unit": "", "name": "gate_error", "value": 0.0049102309437404346}], "qubits":
- [8], "gate": "u3"}, {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit":
- "", "name": "gate_error", "value": 0}], "qubits": [9], "gate": "u1"}, {"parameters":
- [{"date": "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value":
- 0.003478010867496828}], "qubits": [9], "gate": "u2"}, {"parameters": [{"date":
- "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value": 0.006956021734993656}],
- "qubits": [9], "gate": "u3"}, {"parameters": [{"date": "2019-02-09T07:42:54Z",
- "unit": "", "name": "gate_error", "value": 0}], "qubits": [10], "gate": "u1"},
- {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error",
- "value": 0.002305088055136073}], "qubits": [10], "gate": "u2"}, {"parameters":
- [{"date": "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value":
- 0.004610176110272146}], "qubits": [10], "gate": "u3"}, {"parameters": [{"date":
- "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value": 0}], "qubits":
- [11], "gate": "u1"}, {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit":
- "", "name": "gate_error", "value": 0.0014647255665026226}], "qubits": [11],
- "gate": "u2"}, {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit": "",
- "name": "gate_error", "value": 0.0029294511330052453}], "qubits": [11], "gate":
- "u3"}, {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit": "", "name":
- "gate_error", "value": 0}], "qubits": [12], "gate": "u1"}, {"parameters":
- [{"date": "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value":
- 0.0034429776451645466}], "qubits": [12], "gate": "u2"}, {"parameters": [{"date":
- "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value": 0.006885955290329093}],
- "qubits": [12], "gate": "u3"}, {"parameters": [{"date": "2019-02-09T07:42:54Z",
- "unit": "", "name": "gate_error", "value": 0}], "qubits": [13], "gate": "u1"},
- {"parameters": [{"date": "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error",
- "value": 0.006289145077929659}], "qubits": [13], "gate": "u2"}, {"parameters":
- [{"date": "2019-02-09T07:42:54Z", "unit": "", "name": "gate_error", "value":
- 0.012578290155859317}], "qubits": [13], "gate": "u3"}, {"parameters": [{"date":
- "2019-02-08T08:22:17Z", "unit": "", "name": "gate_error", "value": 0.03992150159559102}],
- "qubits": [1, 0], "name": "CX1_0", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T08:25:30Z", "unit": "", "name": "gate_error", "value": 0.02898713750680537}],
- "qubits": [1, 2], "name": "CX1_2", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T08:29:06Z", "unit": "", "name": "gate_error", "value": 0.03636241333882234}],
- "qubits": [2, 3], "name": "CX2_3", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T08:32:22Z", "unit": "", "name": "gate_error", "value": 0.030334963393280623}],
- "qubits": [4, 3], "name": "CX4_3", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T08:35:36Z", "unit": "", "name": "gate_error", "value": 0.03885429984423519}],
- "qubits": [4, 10], "name": "CX4_10", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T08:38:49Z", "unit": "", "name": "gate_error", "value": 0.05486455397851017}],
- "qubits": [5, 4], "name": "CX5_4", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T08:42:15Z", "unit": "", "name": "gate_error", "value": 0.0637315588872944}],
- "qubits": [5, 6], "name": "CX5_6", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T08:46:04Z", "unit": "", "name": "gate_error", "value": 0.06629596672877747}],
- "qubits": [5, 9], "name": "CX5_9", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T08:49:36Z", "unit": "", "name": "gate_error", "value": 0.038006868775274066}],
- "qubits": [6, 8], "name": "CX6_8", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T08:53:04Z", "unit": "", "name": "gate_error", "value": 0.033951635142591974}],
- "qubits": [7, 8], "name": "CX7_8", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T08:56:16Z", "unit": "", "name": "gate_error", "value": 0.041215682504731455}],
- "qubits": [9, 8], "name": "CX9_8", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T09:00:12Z", "unit": "", "name": "gate_error", "value": 0.046416114232033484}],
- "qubits": [9, 10], "name": "CX9_10", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-06T09:11:14Z", "unit": "", "name": "gate_error", "value": 0.04309999280284493}],
- "qubits": [11, 3], "name": "CX11_3", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T09:03:39Z", "unit": "", "name": "gate_error", "value": 0.034499732994308}],
- "qubits": [11, 10], "name": "CX11_10", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T09:06:56Z", "unit": "", "name": "gate_error", "value": 0.0574353496208172}],
- "qubits": [11, 12], "name": "CX11_12", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T09:13:31Z", "unit": "", "name": "gate_error", "value": 0.05926881901365755}],
- "qubits": [12, 2], "name": "CX12_2", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T09:18:08Z", "unit": "", "name": "gate_error", "value": 0.1110388544989424}],
- "qubits": [13, 1], "name": "CX13_1", "gate": "cx"}, {"parameters": [{"date":
- "2019-02-08T09:22:04Z", "unit": "", "name": "gate_error", "value": 0.04073863017965598}],
- "qubits": [13, 12], "name": "CX13_12", "gate": "cx"}], "last_update_date":
- "2019-02-08T09:22:04.000Z", "general": []}'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/v/1?access_token=dummyapiusersloginWithTokenid01
- response:
- body: {string: '[{"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmqx4",
- "max_shots": 8192, "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "simulator": false, "sample_name": "raven", "max_experiments": 75,
- "local": false, "backend_version": "1.0.0", "n_qubits": 5, "basis_gates":
- ["u1", "u2", "u3", "cx", "id"], "conditional": false, "url": "None", "gates":
- [{"qasm_def": "gate id q { U(0,0,0) q; }", "parameters": [], "coupling_map":
- [[0], [1], [2], [3], [4]], "name": "id"}, {"qasm_def": "gate u1(lambda) q
- { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3],
- [4]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda) q { U(theta,phi,lambda)
- q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map": [[0], [1],
- [2], [3], [4]], "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }",
- "parameters": [], "coupling_map": [[1, 0], [2, 0], [2, 1], [3, 2], [3, 4],
- [4, 2]], "name": "cx"}], "description": "5 qubit device", "open_pulse": false,
- "memory": true, "credits_required": true, "allow_q_object": true, "n_registers":
- 1}, {"online_date": "2018-11-06T05:00:00Z", "backend_name": "ibmq_16_melbourne",
- "max_shots": 8192, "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10],
- [5, 4], [5, 6], [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10],
- [11, 12], [12, 2], [13, 1], [13, 12]], "simulator": false, "sample_name":
- "albatross", "max_experiments": 75, "local": false, "backend_version": "1.0.0",
- "n_qubits": 14, "basis_gates": ["u1", "u2", "u3", "cx", "id"], "conditional":
- false, "url": "None", "gates": [{"qasm_def": "gate id q { U(0,0,0) q; }",
- "parameters": [], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "id"}, {"qasm_def": "gate u1(lambda)
- q { U(0,0,lambda) q; }", "parameters": ["lambda"], "coupling_map": [[0], [1],
- [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]], "name": "u1"},
- {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda) q; }", "parameters":
- ["phi", "lambda"], "coupling_map": [[0], [1], [2], [3], [4], [5], [6], [7],
- [8], [9], [10], [11], [12], [13]], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "coupling_map":
- [[0], [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13]],
- "name": "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [],
- "coupling_map": [[1, 0], [1, 2], [2, 3], [4, 3], [4, 10], [5, 4], [5, 6],
- [5, 9], [6, 8], [7, 8], [9, 8], [9, 10], [11, 3], [11, 10], [11, 12], [12,
- 2], [13, 1], [13, 12]], "name": "cx"}], "description": "14 qubit device",
- "open_pulse": false, "memory": false, "credits_required": true, "allow_q_object":
- true, "n_registers": 1}, {"backend_name": "ibmq_qasm_simulator", "backend_version":
- "0.1.547", "simulator": true, "max_experiments": 300, "local": false, "max_shots":
- 8192, "n_qubits": 32, "basis_gates": ["u1", "u2", "u3", "cx"], "conditional":
- true, "gates": [{"qasm_def": "gate u1(lambda) q { U(0,0,lambda) q; }", "parameters":
- ["lambda"], "name": "u1"}, {"qasm_def": "gate u2(phi,lambda) q { U(pi/2,phi,lambda)
- q; }", "parameters": ["phi", "lambda"], "name": "u2"}, {"qasm_def": "u3(theta,phi,lambda)
- q { U(theta,phi,lambda) q; }", "parameters": ["theta", "phi", "lambda"], "name":
- "u3"}, {"qasm_def": "gate cx q1,q2 { CX q1,q2; }", "parameters": [], "name":
- "cx"}], "open_pulse": false, "memory": true, "allow_q_object": true}]'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-- request:
- body: null
- headers:
- Accept: ['*/*']
- Accept-Encoding: ['gzip, deflate']
- Connection: [keep-alive]
- method: GET
- uri: https://quantumexperience.ng.bluemix.net/api/Backends/ibmqx4/properties?access_token=dummyapiusersloginWithTokenid01&version=1
- response:
- body: {string: '{"backend_name": "ibmqx4", "backend_version": "1.0.0", "qubits":
- [[{"date": "2019-02-05T02:15:59Z", "unit": "\u00b5s", "name": "T1", "value":
- 40.23580728497557}, {"date": "2019-02-05T02:16:42Z", "unit": "\u00b5s", "name":
- "T2", "value": 28.053796455696148}, {"date": "2019-02-05T02:54:28Z", "unit":
- "GHz", "name": "frequency", "value": 5.2498546876946115}, {"date": "2019-02-05T02:15:40Z",
- "unit": "", "name": "readout_error", "value": 0.08450000000000002}], [{"date":
- "2019-02-04T10:14:24Z", "unit": "\u00b5s", "name": "T1", "value": 67.39179431370326},
- {"date": "2019-02-05T02:17:25Z", "unit": "\u00b5s", "name": "T2", "value":
- 10.763705758897595}, {"date": "2019-02-05T02:54:28Z", "unit": "GHz", "name":
- "frequency", "value": 5.295772109235333}, {"date": "2019-02-05T02:15:40Z",
- "unit": "", "name": "readout_error", "value": 0.07925000000000004}], [{"date":
- "2019-02-05T02:15:59Z", "unit": "\u00b5s", "name": "T1", "value": 43.393200515665946},
- {"date": "2019-02-05T02:18:06Z", "unit": "\u00b5s", "name": "T2", "value":
- 27.3542760576606}, {"date": "2019-02-05T02:54:28Z", "unit": "GHz", "name":
- "frequency", "value": 5.3533332743669355}, {"date": "2019-02-05T02:15:40Z",
- "unit": "", "name": "readout_error", "value": 0.03699999999999992}], [{"date":
- "2019-02-05T02:15:59Z", "unit": "\u00b5s", "name": "T1", "value": 55.541546751266985},
- {"date": "2019-02-05T02:17:25Z", "unit": "\u00b5s", "name": "T2", "value":
- 14.846271380938276}, {"date": "2019-02-05T02:54:28Z", "unit": "GHz", "name":
- "frequency", "value": 5.434936834513384}, {"date": "2019-02-05T02:15:40Z",
- "unit": "", "name": "readout_error", "value": 0.03875000000000006}], [{"date":
- "2019-02-05T02:15:59Z", "unit": "\u00b5s", "name": "T1", "value": 53.860484623965284},
- {"date": "2019-02-05T02:16:42Z", "unit": "\u00b5s", "name": "T2", "value":
- 4.983364732947786}, {"date": "2019-02-05T02:54:28Z", "unit": "GHz", "name":
- "frequency", "value": 5.175855462568935}, {"date": "2019-02-05T02:15:40Z",
- "unit": "", "name": "readout_error", "value": 0.26875000000000004}]], "gates":
- [{"parameters": [{"date": "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error",
- "value": 0}], "qubits": [0], "gate": "u1"}, {"parameters": [{"date": "2019-02-05T10:57:11Z",
- "unit": "", "name": "gate_error", "value": 0.0006867731322012238}], "qubits":
- [0], "gate": "u2"}, {"parameters": [{"date": "2019-02-05T10:57:11Z", "unit":
- "", "name": "gate_error", "value": 0.0013735462644024476}], "qubits": [0],
- "gate": "u3"}, {"parameters": [{"date": "2019-02-05T10:57:11Z", "unit": "",
- "name": "gate_error", "value": 0}], "qubits": [1], "gate": "u1"}, {"parameters":
- [{"date": "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error", "value":
- 0.00128782749692391}], "qubits": [1], "gate": "u2"}, {"parameters": [{"date":
- "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error", "value": 0.00257565499384782}],
- "qubits": [1], "gate": "u3"}, {"parameters": [{"date": "2019-02-05T10:57:11Z",
- "unit": "", "name": "gate_error", "value": 0}], "qubits": [2], "gate": "u1"},
- {"parameters": [{"date": "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error",
- "value": 0.00128782749692391}], "qubits": [2], "gate": "u2"}, {"parameters":
- [{"date": "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error", "value":
- 0.00257565499384782}], "qubits": [2], "gate": "u3"}, {"parameters": [{"date":
- "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error", "value": 0}], "qubits":
- [3], "gate": "u1"}, {"parameters": [{"date": "2019-02-05T10:57:11Z", "unit":
- "", "name": "gate_error", "value": 0.001803112096824766}], "qubits": [3],
- "gate": "u2"}, {"parameters": [{"date": "2019-02-05T10:57:11Z", "unit": "",
- "name": "gate_error", "value": 0.003606224193649532}], "qubits": [3], "gate":
- "u3"}, {"parameters": [{"date": "2019-02-05T10:57:11Z", "unit": "", "name":
- "gate_error", "value": 0}], "qubits": [4], "gate": "u1"}, {"parameters": [{"date":
- "2019-02-05T10:57:11Z", "unit": "", "name": "gate_error", "value": 0.006444645993361475}],
- "qubits": [4], "gate": "u2"}, {"parameters": [{"date": "2019-02-05T10:57:11Z",
- "unit": "", "name": "gate_error", "value": 0.01288929198672295}], "qubits":
- [4], "gate": "u3"}, {"parameters": [{"date": "2019-02-05T02:25:32Z", "unit":
- "", "name": "gate_error", "value": 0.03594617578113263}], "qubits": [1, 0],
- "name": "CX1_0", "gate": "cx"}, {"parameters": [{"date": "2019-02-05T02:31:04Z",
- "unit": "", "name": "gate_error", "value": 0.03205473341614962}], "qubits":
- [2, 0], "name": "CX2_0", "gate": "cx"}, {"parameters": [{"date": "2019-02-05T02:36:21Z",
- "unit": "", "name": "gate_error", "value": 0.048500617566183984}], "qubits":
- [2, 1], "name": "CX2_1", "gate": "cx"}, {"parameters": [{"date": "2019-02-05T02:41:40Z",
- "unit": "", "name": "gate_error", "value": 0.07474221943376097}], "qubits":
- [3, 2], "name": "CX3_2", "gate": "cx"}, {"parameters": [{"date": "2019-02-05T02:47:44Z",
- "unit": "", "name": "gate_error", "value": 0.07660114123887399}], "qubits":
- [3, 4], "name": "CX3_4", "gate": "cx"}, {"parameters": [{"date": "2019-02-04T10:53:35Z",
- "unit": "", "name": "gate_error", "value": 0.06824929220587475}], "qubits":
- [4, 2], "name": "CX4_2", "gate": "cx"}], "last_update_date": "2019-02-05T02:54:28.000Z",
- "general": []}'}
- headers:
- Access-Control-Allow-Credentials: ['true']
- Access-Control-Allow-Origin: ['https://quantumexperience.mybluemix.net/']
- Cache-Control: ['no-store, no-cache, must-revalidate, proxy-revalidate']
- Connection: [Keep-Alive]
- Content-Type: [application/json; charset=utf-8]
- Expires: ['0']
- Pragma: [no-cache]
- Set-Cookie: dummy_cookie
- Strict-Transport-Security: [max-age=86400]
- Surrogate-Control: [no-store]
- Transfer-Encoding: [chunked]
- Vary: ['Origin, Accept-Encoding']
- X-Backside-Transport: [OK OK]
- X-Content-Type-Options: [nosniff]
- X-Download-Options: [noopen]
- X-Frame-Options: [SAMEORIGIN]
- X-Xss-Protection: [1; mode=block]
- status: {code: 200, message: OK}
-version: 1
diff --git a/test/python/compiler/test_transpiler.py b/test/python/compiler/test_transpiler.py
index d0b26f63c358..1f4939656d1e 100644
--- a/test/python/compiler/test_transpiler.py
+++ b/test/python/compiler/test_transpiler.py
@@ -16,6 +16,7 @@
import math
import unittest
+from unittest.mock import patch
from qiskit import QuantumRegister, ClassicalRegister, QuantumCircuit
from qiskit import BasicAer
@@ -33,8 +34,6 @@
class TestTranspile(QiskitTestCase):
"""Test transpile function."""
- barrier_pass = BarrierBeforeFinalMeasurements()
-
def test_pass_manager_none(self):
"""Test passing the default (None) pass manager to the transpiler.
@@ -430,16 +429,16 @@ def test_parameterized_circuit_for_device(self):
self.assertEqual(expected_qc, transpiled_qc)
- @unittest.mock.patch.object(BarrierBeforeFinalMeasurements, 'run', wraps=barrier_pass.run)
- def test_final_measurement_barrier_for_devices(self, mock_pass):
+ def test_final_measurement_barrier_for_devices(self):
"""Verify BarrierBeforeFinalMeasurements pass is called in default pipeline for devices."""
circ = QuantumCircuit.from_qasm_file(self._get_resource_path('example.qasm', Path.QASMS))
layout = Layout.generate_trivial_layout(*circ.qregs)
- transpile(circ, coupling_map=FakeRueschlikon().configuration().coupling_map,
- initial_layout=layout)
-
- self.assertTrue(mock_pass.called)
+ orig_pass = BarrierBeforeFinalMeasurements()
+ with patch.object(BarrierBeforeFinalMeasurements, 'run', wraps=orig_pass.run) as mock_pass:
+ transpile(circ, coupling_map=FakeRueschlikon().configuration().coupling_map,
+ initial_layout=layout)
+ self.assertTrue(mock_pass.called)
def test_do_not_run_cxdirection_with_symmetric_cm(self):
"""When the coupling map is symmetric, do not run CXDirection."""
@@ -451,8 +450,8 @@ def test_do_not_run_cxdirection_with_symmetric_cm(self):
coupling_map.append([node1, node2])
coupling_map.append([node2, node1])
- cxdir_pass = CXDirection(CouplingMap(coupling_map))
- with unittest.mock.patch.object(CXDirection, 'run', wraps=cxdir_pass.run) as mock_pass:
+ orig_pass = CXDirection(CouplingMap(coupling_map))
+ with patch.object(CXDirection, 'run', wraps=orig_pass.run) as mock_pass:
transpile(circ, coupling_map=coupling_map, initial_layout=layout)
self.assertFalse(mock_pass.called)
diff --git a/test/python/tools/jupyter/test_notebooks.py b/test/python/tools/jupyter/test_notebooks.py
index 72176874b02c..f985e383e9e4 100644
--- a/test/python/tools/jupyter/test_notebooks.py
+++ b/test/python/tools/jupyter/test_notebooks.py
@@ -20,7 +20,7 @@
import nbformat
from nbconvert.preprocessors import ExecutePreprocessor
from qiskit.tools.visualization import HAS_MATPLOTLIB
-from qiskit.test import (Path, QiskitTestCase, requires_qe_access, slow_test)
+from qiskit.test import (Path, QiskitTestCase, online_test, slow_test)
# Timeout (in seconds) for a single notebook.
@@ -64,7 +64,7 @@ def test_jupyter_jobs_pbars(self):
'notebooks/test_pbar_status.ipynb'))
@unittest.skipIf(not HAS_MATPLOTLIB, 'matplotlib not available.')
- @requires_qe_access
+ @online_test
@slow_test
def test_backend_tools(self, qe_token, qe_url):
"""Test Jupyter backend tools."""
diff --git a/test/python/tools/monitor/test_backend_monitor.py b/test/python/tools/monitor/test_backend_monitor.py
index d17a14f95633..cb0c6b92e862 100644
--- a/test/python/tools/monitor/test_backend_monitor.py
+++ b/test/python/tools/monitor/test_backend_monitor.py
@@ -19,17 +19,13 @@
from io import StringIO
from qiskit.tools.monitor import backend_overview, backend_monitor
-from qiskit.test import QiskitTestCase, requires_qe_access
-from qiskit.util import _has_connection
-# Check if internet connection exists
-HAS_NET_CONNECTION = _has_connection('qiskit.org', 443)
+from qiskit.test import QiskitTestCase, online_test
class TestBackendOverview(QiskitTestCase):
"""Tools test case."""
- @unittest.skipIf(not HAS_NET_CONNECTION, "requries internet connection.")
- @requires_qe_access
+ @online_test
def test_backend_overview(self, qe_token, qe_url):
"""Test backend_overview"""
from qiskit import IBMQ # pylint: disable: import-error
@@ -42,8 +38,7 @@ def test_backend_overview(self, qe_token, qe_url):
self.assertIn('Avg. T1:', stdout)
self.assertIn('Num. Qubits:', stdout)
- @unittest.skipIf(not HAS_NET_CONNECTION, "requries internet connection.")
- @requires_qe_access
+ @online_test
def test_backend_monitor(self, qe_token, qe_url):
"""Test backend_monitor"""
from qiskit import IBMQ # pylint: disable: import-error
diff --git a/tox.ini b/tox.ini
index b546a4dcfa1c..ac2dcb16792f 100644
--- a/tox.ini
+++ b/tox.ini
@@ -17,14 +17,6 @@ deps = -r{toxinidir}/requirements.txt
commands =
stestr run {posargs}
-[testenv:online-mock]
-setenv = {[testenv]setenv}
- QISKIT_TESTS=mock_online
-
-[testenv:recording]
-setenv = {[testenv]setenv}
- QISKIT_TESTS=rec
-
[testenv:lint]
commands =
pycodestyle --max-line-length=100 qiskit test
|
scrapy__scrapy-2929 | LinkExtractor is not ignoring .m4v extension (video) by default
By chance I found out that LinkExtractor is not ignoring the video extension m4v in the same way it is ignoring other video formats.
https://en.wikipedia.org/wiki/M4V
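The default filter comes from scrapy's `IGNORED_EXTENSIONS` denylist, which listed `m4a` but not `m4v`. A minimal, self-contained sketch of how such an extension filter behaves (the `is_ignored` helper and the shortened list are illustrative, not scrapy's actual code):

```python
from urllib.parse import urlparse

# Shortened stand-in for scrapy.linkextractors.IGNORED_EXTENSIONS, which
# listed 'm4a' but was missing 'm4v' before the fix.
IGNORED_EXTENSIONS = {"3gp", "avi", "mov", "mp4", "m4a", "m4v", "wmv"}

def is_ignored(url: str) -> bool:
    # Hypothetical helper: drop a link if its path ends in a denied extension.
    ext = urlparse(url).path.rpartition(".")[2].lower()
    return ext in IGNORED_EXTENSIONS

print(is_ignored("http://example.com/clip.m4v"))   # True
print(is_ignored("http://example.com/page.html"))  # False
```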
| [
{
"content": "\"\"\"\nscrapy.linkextractors\n\nThis package contains a collection of Link Extractors.\n\nFor more info see docs/topics/link-extractors.rst\n\"\"\"\nimport re\n\nfrom six.moves.urllib.parse import urlparse\nfrom parsel.csstranslator import HTMLTranslator\nfrom w3lib.url import canonicalize_url\n\... | [
{
"content": "\"\"\"\nscrapy.linkextractors\n\nThis package contains a collection of Link Extractors.\n\nFor more info see docs/topics/link-extractors.rst\n\"\"\"\nimport re\n\nfrom six.moves.urllib.parse import urlparse\nfrom parsel.csstranslator import HTMLTranslator\nfrom w3lib.url import canonicalize_url\n\... | diff --git a/scrapy/linkextractors/__init__.py b/scrapy/linkextractors/__init__.py
index 8676c3b926d..2d7115cc504 100644
--- a/scrapy/linkextractors/__init__.py
+++ b/scrapy/linkextractors/__init__.py
@@ -28,7 +28,7 @@
# video
'3gp', 'asf', 'asx', 'avi', 'mov', 'mp4', 'mpg', 'qt', 'rm', 'swf', 'wmv',
- 'm4a',
+ 'm4a', 'm4v',
# office suites
'xls', 'xlsx', 'ppt', 'pptx', 'pps', 'doc', 'docx', 'odt', 'ods', 'odg',
|
open-mmlab__mmocr-663 | locals() should not be modified
```python
args = locals()
[args.pop(x, None) for x in ['kwargs', 'self']]
```
https://github.com/open-mmlab/mmocr/blob/b04775fd78ac89e32d38ef6fbd5493dedbfd76f4/mmocr/utils/ocr.py#L414
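The safe pattern is to copy the mapping first: the language reference says the mapping returned by `locals()` should not be modified. A minimal stand-in for the argument-capture logic (the `readtext_args` signature is hypothetical, stripped down from `readtext`):

```python
def readtext_args(img=None, output=None, **kwargs):
    # Copy before popping: the mapping returned by locals() should not be
    # modified directly, per the Python language reference.
    args = locals().copy()
    for key in ("kwargs", "self"):
        args.pop(key, None)
    return args

print(readtext_args(img="demo.jpg"))  # {'img': 'demo.jpg', 'output': None}
```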
| [
{
"content": "#!/usr/bin/env python\n# Copyright (c) OpenMMLab. All rights reserved.\nimport copy\nimport os\nimport warnings\nfrom argparse import ArgumentParser, Namespace\nfrom pathlib import Path\n\nimport mmcv\nimport numpy as np\nimport torch\nfrom mmcv.image.misc import tensor2imgs\nfrom mmcv.runner impo... | [
{
"content": "#!/usr/bin/env python\n# Copyright (c) OpenMMLab. All rights reserved.\nimport copy\nimport os\nimport warnings\nfrom argparse import ArgumentParser, Namespace\nfrom pathlib import Path\n\nimport mmcv\nimport numpy as np\nimport torch\nfrom mmcv.image.misc import tensor2imgs\nfrom mmcv.runner impo... | diff --git a/mmocr/utils/ocr.py b/mmocr/utils/ocr.py
index 1c5f342e1..5631b1ada 100755
--- a/mmocr/utils/ocr.py
+++ b/mmocr/utils/ocr.py
@@ -411,7 +411,7 @@ def readtext(self,
merge=False,
merge_xdist=20,
**kwargs):
- args = locals()
+ args = locals().copy()
[args.pop(x, None) for x in ['kwargs', 'self']]
args = Namespace(**args)
|
ibis-project__ibis-4271 | bug(impala) A delimited table should be explicitly stored as textfile
https://github.com/ibis-project/ibis/blob/88ffe3367cb6a34936e578f6a9b68dc30d559507/ibis/backends/impala/ddl.py#L67
When the cluster's default file format is set to Parquet, this will cause an exception. The table should be explicitly stored as a text file, for example:
```python
if self.lineterminator is not None:
yield f"LINES TERMINATED BY '{self.lineterminator}'"
yield 'STORED AS TEXTFILE'
yield f"LOCATION '{self.path}'"
```
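The effect of the proposed clause ordering can be sketched with a plain generator (illustrative only; the real generator lives in `ibis/backends/impala/ddl.py`):

```python
def delimited_table_clauses(lineterminator, path):
    # Emit STORED AS TEXTFILE explicitly: without it, a cluster whose
    # default file format is Parquet applies Parquet to the delimited table.
    if lineterminator is not None:
        yield f"LINES TERMINATED BY '{lineterminator}'"
    yield "STORED AS TEXTFILE"
    yield f"LOCATION '{path}'"

for clause in delimited_table_clauses("|", "/tmp/data"):
    print(clause)
# LINES TERMINATED BY '|'
# STORED AS TEXTFILE
# LOCATION '/tmp/data'
```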
| [
{
"content": "# Copyright 2014 Cloudera Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable... | [
{
"content": "# Copyright 2014 Cloudera Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable... | diff --git a/ibis/backends/impala/ddl.py b/ibis/backends/impala/ddl.py
index 2df5daf4bf6a..8762c0c99d81 100644
--- a/ibis/backends/impala/ddl.py
+++ b/ibis/backends/impala/ddl.py
@@ -91,6 +91,7 @@ def to_ddl(self):
if self.lineterminator is not None:
yield f"LINES TERMINATED BY '{self.lineterminator}'"
+ yield 'STORED AS TEXTFILE'
yield f"LOCATION '{self.path}'"
if self.na_rep is not None:
diff --git a/ibis/backends/impala/tests/test_ddl_compilation.py b/ibis/backends/impala/tests/test_ddl_compilation.py
index 3c57116e90d5..6a6c75f27d44 100644
--- a/ibis/backends/impala/tests/test_ddl_compilation.py
+++ b/ibis/backends/impala/tests/test_ddl_compilation.py
@@ -407,6 +407,7 @@ def test_create_table_delimited():
FIELDS TERMINATED BY '|'
ESCAPED BY '\\'
LINES TERMINATED BY '\0'
+STORED AS TEXTFILE
LOCATION '{}'""".format(
path
)
|
pyca__cryptography-1530 | Release Automation Fixes for Seventh Release
The release script is not properly waiting for the wheel job it starts to finish before downloading. This causes it to download previous releases and attempt to upload them.
| [
{
"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nimport getpass\nimport os\nimport time\n\nimp... | [
{
"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nimport getpass\nimport os\nimport time\n\nimp... | diff --git a/tasks.py b/tasks.py
index 2dd005ba2594..c109f14974b8 100644
--- a/tasks.py
+++ b/tasks.py
@@ -17,6 +17,9 @@
def wait_for_build_completed(session):
+ # Wait 3 seconds before actually checking if the build is complete, to
+ # ensure that it had time to really start.
+ time.sleep(3)
while True:
response = session.get(
"{0}/lastBuild/api/json/".format(JENKINS_URL),
|
ckan__ckan-6110 | Plugin order for translations is reversed
**CKAN version**
2.8, 2.9, master
**Describe the bug**
If the developer has multiple plugins implementing ITranslation interface and has the same translation keys in them, the last plugin wins.
**Steps to reproduce**
Create two plugins with ITranslation interface and the same translation key.
Translations from last plugin will be used.
**Expected behavior**
Translations from the first plugin should be used as the common convention is that the first plugin wins.
**Additional details**
https://github.com/vrk-kpa/ckanext-forcetranslation we made this couple years ago to circumvent this. Simple plugin which allows to choose which plugins translations to use. Related bug in https://github.com/ckan/ckanext-harvest/issues/266 which in essence is caused by the same thing.
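The "first plugin wins" convention can be restored by merging catalogs in reverse order, which is what the fix does with `reversed(list(PluginImplementations(ITranslation)))`. A toy illustration with plain dicts standing in for translation catalogs:

```python
def merge_plugin_translations(catalogs):
    # dict.update lets later entries win, so apply catalogs in *reverse*
    # registration order: the first plugin's keys end up overriding the rest.
    merged = {}
    for catalog in reversed(list(catalogs)):
        merged.update(catalog)
    return merged

plugins = [{"greeting": "from plugin 1"}, {"greeting": "from plugin 2"}]
print(merge_plugin_translations(plugins))  # {'greeting': 'from plugin 1'}
```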
| [
{
"content": "# encoding: utf-8\n\nimport os\nimport sys\nimport re\nimport time\nimport inspect\nimport itertools\nimport pkgutil\n\nfrom flask import Blueprint, send_from_directory\nfrom flask.ctx import _AppCtxGlobals\nfrom flask.sessions import SessionInterface\nfrom flask_multistatic import MultiStaticFlas... | [
{
"content": "# encoding: utf-8\n\nimport os\nimport sys\nimport re\nimport time\nimport inspect\nimport itertools\nimport pkgutil\n\nfrom flask import Blueprint, send_from_directory\nfrom flask.ctx import _AppCtxGlobals\nfrom flask.sessions import SessionInterface\nfrom flask_multistatic import MultiStaticFlas... | diff --git a/ckan/config/middleware/flask_app.py b/ckan/config/middleware/flask_app.py
index 2f15bcfe271..abc82b7cf9d 100644
--- a/ckan/config/middleware/flask_app.py
+++ b/ckan/config/middleware/flask_app.py
@@ -238,7 +238,7 @@ def ungettext_alias():
(_ckan_i18n_dir, u'ckan')
] + [
(p.i18n_directory(), p.i18n_domain())
- for p in PluginImplementations(ITranslation)
+ for p in reversed(list(PluginImplementations(ITranslation)))
]
i18n_dirs, i18n_domains = zip(*pairs)
|
dask__dask-3076 | Slicing list/array issue
Someone reported this slicing issue:

```python
da.zeros(5,chunks=5)[:,None][:,[0]*2].compute()
```
leads to:
```
Traceback (most recent call last):
File "<ipython-input-154-e9a840dad4ae>", line 1, in <module>
da.zeros(5,chunks=5)[:,None][:,[0]*2].compute()
File "/dask/dask/base.py", line 135, in compute
(result,) = compute(self, traverse=False, **kwargs)
File "/dask/dask/base.py", line 329, in compute
dsk = collections_to_dsk(variables, optimize_graph, **kwargs)
File "/dask/dask/base.py", line 240, in collections_to_dsk
for opt, (dsk, keys) in groups.items()))
File "/dask/dask/base.py", line 240, in <genexpr>
for opt, (dsk, keys) in groups.items()))
File "/dask/dask/array/optimization.py", line 46, in optimize
dsk5 = optimize_slices(dsk4)
File "/dask/dask/array/optimization.py", line 134, in optimize_slices
c_index = fuse_slice(b_index, a_index)
File "/dask/dask/array/optimization.py", line 285, in fuse_slice
result.append(fuse_slice(a[i], b[j])) # Common case
File "/dask/dask/array/optimization.py", line 223, in fuse_slice
if a is None and b == slice(None, None):
ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()
```
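The failing check is `b == slice(None, None)` when `b` is a NumPy array: `==` on an array compares elementwise, and the resulting array has no single truth value inside `if`. Guarding with `isinstance` first, as the fix does, avoids the comparison entirely:

```python
import numpy as np

def is_full_slice(b):
    # Check isinstance first: comparing an ndarray to a slice with == would
    # produce an array result, which has no single truth value in an if-test.
    return isinstance(b, slice) and b == slice(None, None)

print(is_full_slice(slice(None)))       # True
print(is_full_slice(np.array([0, 0])))  # False
```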
| [
{
"content": "from __future__ import absolute_import, division, print_function\n\nfrom operator import getitem\n\nimport numpy as np\n\nfrom .core import getter, getter_nofancy, getter_inline\nfrom ..compatibility import zip_longest\nfrom ..core import flatten, reverse_dict\nfrom ..optimize import cull, fuse, i... | [
{
"content": "from __future__ import absolute_import, division, print_function\n\nfrom operator import getitem\n\nimport numpy as np\n\nfrom .core import getter, getter_nofancy, getter_inline\nfrom ..compatibility import zip_longest\nfrom ..core import flatten, reverse_dict\nfrom ..optimize import cull, fuse, i... | diff --git a/dask/array/optimization.py b/dask/array/optimization.py
index bf49a32c94b..06d11ed1fa8 100644
--- a/dask/array/optimization.py
+++ b/dask/array/optimization.py
@@ -220,7 +220,7 @@ def fuse_slice(a, b):
None
"""
# None only works if the second side is a full slice
- if a is None and b == slice(None, None):
+ if a is None and isinstance(b, slice) and b == slice(None, None):
return None
# Replace None with 0 and one in start and step
diff --git a/dask/array/tests/test_optimization.py b/dask/array/tests/test_optimization.py
index 3e43c5ba176..6c203ca0873 100644
--- a/dask/array/tests/test_optimization.py
+++ b/dask/array/tests/test_optimization.py
@@ -139,6 +139,9 @@ def test_fuse_slice():
with pytest.raises(NotImplementedError):
fuse_slice(slice(10, 15, 2), -1)
+ # Regression test for #3076
+ with pytest.raises(NotImplementedError):
+ fuse_slice(None, np.array([0, 0]))
def test_fuse_slice_with_lists():
diff --git a/docs/source/changelog.rst b/docs/source/changelog.rst
index f2014106dfb..0f9c0a3b2e6 100644
--- a/docs/source/changelog.rst
+++ b/docs/source/changelog.rst
@@ -10,6 +10,7 @@ Array
- Update error handling when len is called with empty chunks (:issue:`3058`) `Xander Johnson`_
- Fixes a metadata bug with ``store``'s ``return_stored`` option (:pr:`3064`) `John A Kirkham`_
+- Fix a bug in ``optimization.fuse_slice`` to properly handle when first input is ``None`` (:pr:`3076`) `James Bourbeau`_
DataFrame
+++++++++
|
airctic__icevision-733 | Better Record __repr__ to show ClassMap when it is stored internally
## 🚀 Feature
**Is your feature request related to a problem? Please describe.**
It would be more informative to show the `class_map` when a `Record` object is storing it.
Take for example a record loaded from the main repo:
```python
from icevision.all import *
data_dir = Path("~/icevision/samples/")
class_map = icedata.coco.class_map()
parser = parsers.COCOMaskParser(annotations_filepath=data_dir/'annotations.json', img_dir=data_dir/'images')
records = parser.parse(data_splitter=SingleSplitSplitter())[0]
record = records[0]
print(record)
## Output:
BaseRecord
common:
- Filepath: /Users/rahulsomani/git/icevision-orig/samples/images/000000343934.jpg
- Image: None
- Image size ImgSize(width=640, height=480)
- Image ID: 0
detection:
- Masks: <EncodedRLEs with 1 objects>
- Labels: [4]
- Areas: [43522.80595]
- BBoxes: [<BBox (xmin:175.14, ymin:175.68, xmax:496.21999999999997, ymax:415.68)>]
- Is Crowds: [0]
```
This record internally has access to the `class_map` via `record.detection.class_map`, which is great, but not known when you print the record. Additionally, if you print `record.components`, you get:
```python
{<icevision.core.record_components.AreasRecordComponent at 0x7fbb5b54a4d0>,
<icevision.core.record_components.BBoxesRecordComponent at 0x7fbb5b54acd0>,
<icevision.core.record_components.FilepathRecordComponent at 0x7fbb5b54a690>,
<icevision.core.record_components.InstancesLabelsRecordComponent at 0x7fbb5b54a7d0>,
<icevision.core.record_components.IsCrowdsRecordComponent at 0x7fbb5b54ad90>,
<icevision.core.record_components.MasksRecordComponent at 0x7fbb5b54a150>,
<icevision.core.record_components.RecordIDRecordComponent at 0x7fbb5b54a9d0>,
<icevision.core.record_components.SizeRecordComponent at 0x7fbb5b54a810>}
```
I'd have expected `ClassMapRecordComponent` to be in there as well?
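A record's printout is assembled from each component's `_repr()` lines, so the missing output traces back to the class-map component lacking a `_repr`. A toy sketch of that aggregation (class names and layout are illustrative, not IceVision's real implementation):

```python
class SizeComponent:
    def _repr(self):
        return ["Image size: (640, 480)"]

class ClassMapComponent:
    def __init__(self, class_map):
        self.class_map = class_map

    def _repr(self):
        # Without a _repr, this component contributes nothing to print(record).
        return [f"Class Map: {self.class_map}"]

class Record:
    def __init__(self, components):
        self.components = components

    def __repr__(self):
        lines = [ln for c in self.components for ln in c._repr()]
        return "BaseRecord\n" + "\n".join(f"  - {ln}" for ln in lines)

record = Record([SizeComponent(), ClassMapComponent({0: "background", 4: "airplane"})])
print(record)
```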
| [
{
"content": "__all__ = [\n \"RecordComponent\",\n \"ClassMapRecordComponent\",\n \"RecordIDRecordComponent\",\n \"ImageRecordComponent\",\n \"FilepathRecordComponent\",\n \"SizeRecordComponent\",\n \"BaseLabelsRecordComponent\",\n \"InstancesLabelsRecordComponent\",\n \"Classificatio... | [
{
"content": "__all__ = [\n \"RecordComponent\",\n \"ClassMapRecordComponent\",\n \"RecordIDRecordComponent\",\n \"ImageRecordComponent\",\n \"FilepathRecordComponent\",\n \"SizeRecordComponent\",\n \"BaseLabelsRecordComponent\",\n \"InstancesLabelsRecordComponent\",\n \"Classificatio... | diff --git a/icevision/core/record_components.py b/icevision/core/record_components.py
index 65238d214..db2c0e6a5 100644
--- a/icevision/core/record_components.py
+++ b/icevision/core/record_components.py
@@ -80,6 +80,9 @@ def __init__(self, task):
def set_class_map(self, class_map: ClassMap):
self.class_map = class_map
+ def _repr(self) -> List[str]:
+ return [f"Class Map: {self.class_map}"]
+
def as_dict(self) -> dict:
return {"class_map": self.class_map}
|
microsoft__MLOS-477 | SMAC optimizer messes up mlos_bench logging
SMAC optimizer completely overrides our logging setup and installs its own formatter, output handler, and so on. As a result, as soon as SMAC optimizer is initialized, mlos_bench stops writing to its log file, and all logging goes to stdout, in a different format, and at a different log level (always INFO). We need to find a way to make SMAC use our logger instead of setting up its own from scratch.
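The eventual fix passes `logging_level=False` to the SMAC facade so it reuses the existing logger. More generally, library code should avoid installing handlers or calling `logging.basicConfig()`; the standard pattern is a per-module logger with only a `NullHandler` (names here are illustrative):

```python
import logging

def get_library_logger(name="mlos_bench.demo"):
    # Library code should only attach a NullHandler and must never call
    # logging.basicConfig() or replace root handlers: that leaves the
    # embedding application's log file, format, and level untouched.
    logger = logging.getLogger(name)
    if not any(isinstance(h, logging.NullHandler) for h in logger.handlers):
        logger.addHandler(logging.NullHandler())
    return logger

log = get_library_logger()
print(isinstance(log.handlers[0], logging.NullHandler))  # True
```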
| [
{
"content": "#\n# Copyright (c) Microsoft Corporation.\n# Licensed under the MIT License.\n#\n\"\"\"\nContains the wrapper class for SMAC Bayesian optimizers.\nSee Also: <https://automl.github.io/SMAC3/main/index.html>\n\"\"\"\n\nfrom pathlib import Path\nfrom typing import Dict, List, Optional, TYPE_CHECKING\... | [
{
"content": "#\n# Copyright (c) Microsoft Corporation.\n# Licensed under the MIT License.\n#\n\"\"\"\nContains the wrapper class for SMAC Bayesian optimizers.\nSee Also: <https://automl.github.io/SMAC3/main/index.html>\n\"\"\"\n\nfrom pathlib import Path\nfrom typing import Dict, List, Optional, TYPE_CHECKING\... | diff --git a/mlos_bench/mlos_bench/tests/config/cli/mock-bench.jsonc b/mlos_bench/mlos_bench/tests/config/cli/mock-bench.jsonc
index cff09de12ba..d4179baa8e2 100644
--- a/mlos_bench/mlos_bench/tests/config/cli/mock-bench.jsonc
+++ b/mlos_bench/mlos_bench/tests/config/cli/mock-bench.jsonc
@@ -13,7 +13,6 @@
],
"environment": "environments/mock/mock_env.jsonc",
- "storage": "storage/in-memory.jsonc",
"tunable_values": [
"tunable-values/tunable-values-example.jsonc"
diff --git a/mlos_bench/mlos_bench/tests/config/cli/mock-opt.jsonc b/mlos_bench/mlos_bench/tests/config/cli/mock-opt.jsonc
new file mode 100644
index 00000000000..4d30d58d5e0
--- /dev/null
+++ b/mlos_bench/mlos_bench/tests/config/cli/mock-opt.jsonc
@@ -0,0 +1,28 @@
+//
+// Copyright (c) Microsoft Corporation.
+// Licensed under the MIT License.
+//
+
+// *** DO *NOT* CHANGE! This config is used for tests! ***
+{
+ "$schema": "https://raw.githubusercontent.com/microsoft/MLOS/main/mlos_bench/mlos_bench/config/schemas/cli/cli-schema.json",
+
+ "config_path": [
+ // relative to the root of the repo (for now), where this is expected to be executed from
+ "mlos_bench/mlos_bench/tests/config",
+ "mlos_bench/mlos_bench/config"
+ ],
+
+ "environment": "environments/mock/mock_env.jsonc",
+ "optimizer": "optimizers/mlos_core_default_opt.jsonc",
+
+ // "globals": ["global_config.json"],
+
+ "experiment_id": "MockExperiment",
+ "trial_id": 1,
+
+ "teardown": true,
+
+ // "log_file": "mock-opt.log",
+ "log_level": "DEBUG"
+}
diff --git a/mlos_bench/mlos_bench/tests/launcher_test.py b/mlos_bench/mlos_bench/tests/launcher_test.py
index 32a917d782e..8850477b8ca 100644
--- a/mlos_bench/mlos_bench/tests/launcher_test.py
+++ b/mlos_bench/mlos_bench/tests/launcher_test.py
@@ -6,6 +6,8 @@
Unit tests to check the main CLI launcher.
"""
import os
+import re
+from typing import List
import pytest
@@ -37,27 +39,73 @@ def local_exec_service() -> LocalExecService:
}))
-def test_launch_main_app(root_path: str,
- local_exec_service: LocalExecService) -> None:
+def _launch_main_app(root_path: str, local_exec_service: LocalExecService,
+ cli_config: str, re_expected: List[str]) -> None:
"""
- Run mlos_bench command-line application with mock config and check the results in the log.
+ Run mlos_bench command-line application with given config
+ and check the results in the log.
"""
with local_exec_service.temp_dir_context() as temp_dir:
- log_path = path_join(temp_dir, "mock-bench.log")
- cmd = "./mlos_bench/mlos_bench/run.py" + \
- " --config mlos_bench/mlos_bench/tests/config/cli/mock-bench.jsonc" + \
- f" --log_file '{log_path}'"
- (return_code, _stdout, _stderr) = local_exec_service.local_exec([cmd], cwd=root_path)
-
+ # Test developers note: for local debugging,
+ # uncomment the following line to use a known file path that can be examined:
+ # temp_dir = '/tmp'
+ log_path = path_join(temp_dir, "mock-test.log")
+ (return_code, _stdout, _stderr) = local_exec_service.local_exec(
+ [f"./mlos_bench/mlos_bench/run.py {cli_config} --log_file '{log_path}'"],
+ cwd=root_path)
assert return_code == 0
- with open(log_path, "rt", encoding="utf-8") as fh_out:
- best_score_lines = [
- ln.strip() for ln in fh_out.readlines()
- if " INFO Env: Mock environment best score: " in ln
- ]
- assert len([
- ln for ln in best_score_lines
- if " best score: 65.67" in ln
- ]) == 1
+ try:
+ iter_expected = iter(re_expected)
+ re_log = re.compile(next(iter_expected))
+ with open(log_path, "rt", encoding="utf-8") as fh_out:
+ for ln in fh_out:
+ if re_log.match(ln):
+ re_log = re.compile(next(iter_expected))
+ assert False, f"Pattern not found: '{re_log.pattern}'"
+ except StopIteration:
+ pass # Success: all patterns found
+
+
+_RE_DATE = r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}"
+
+
+def test_launch_main_app_bench(root_path: str, local_exec_service: LocalExecService) -> None:
+ """
+ Run mlos_bench command-line application with mock benchmark config
+ and check the results in the log.
+ """
+ _launch_main_app(
+ root_path, local_exec_service,
+ "--config mlos_bench/mlos_bench/tests/config/cli/mock-bench.jsonc",
+ [
+ f"^{_RE_DATE} run\\.py:\\d+ " +
+ r"_optimize INFO Env: Mock environment best score: 65\.67\d+\s*$",
+ ]
+ )
+
+
+def test_launch_main_app_opt(root_path: str, local_exec_service: LocalExecService) -> None:
+ """
+ Run mlos_bench command-line application with mock optimization config
+ and check the results in the log.
+ """
+ _launch_main_app(
+ root_path, local_exec_service,
+ "--config mlos_bench/mlos_bench/tests/config/cli/mock-opt.jsonc --max_iterations 3",
+ [
+ # Iteration 1: Expect first value to be the baseline
+ f"^{_RE_DATE} mlos_core_optimizer\\.py:\\d+ " +
+ r"register DEBUG Score: 65\.67\d+ Dataframe:\s*$",
+ # Iteration 2: The result may not always be deterministic
+ f"^{_RE_DATE} mlos_core_optimizer\\.py:\\d+ " +
+ r"register DEBUG Score: \d+\.\d+ Dataframe:\s*$",
+ # Iteration 3: non-deterministic (depends on the optimizer)
+ f"^{_RE_DATE} mlos_core_optimizer\\.py:\\d+ " +
+ r"register DEBUG Score: \d+\.\d+ Dataframe:\s*$",
+ # Final result: baseline is the optimum for the mock environment
+ f"^{_RE_DATE} run\\.py:\\d+ " +
+ r"_optimize INFO Env: Mock environment best score: 65\.67\d+\s*$",
+ ]
+ )
diff --git a/mlos_core/mlos_core/optimizers/bayesian_optimizers/smac_optimizer.py b/mlos_core/mlos_core/optimizers/bayesian_optimizers/smac_optimizer.py
index 867abd73c66..bb222b78e18 100644
--- a/mlos_core/mlos_core/optimizers/bayesian_optimizers/smac_optimizer.py
+++ b/mlos_core/mlos_core/optimizers/bayesian_optimizers/smac_optimizer.py
@@ -124,6 +124,7 @@ def __init__(self, *, # pylint: disable=too-many-locals
random_design=random_design,
config_selector=config_selector,
overwrite=True,
+ logging_level=False, # Use the existing logger
)
def __del__(self) -> None:
|
WeblateOrg__weblate-1655 | File download is outdated
### Steps to reproduce
1. Edit string.
2. Download the original translation file (without conversion).
### Actual behaviour
The file does not have recent changes.
### Expected behaviour
All changes should be reflected.
### Server configuration
Current master
| [
{
"content": "# -*- coding: utf-8 -*-\n#\n# Copyright © 2012 - 2017 Michal Čihař <michal@cihar.com>\n#\n# This file is part of Weblate <https://weblate.org/>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the ... | [
{
"content": "# -*- coding: utf-8 -*-\n#\n# Copyright © 2012 - 2017 Michal Čihař <michal@cihar.com>\n#\n# This file is part of Weblate <https://weblate.org/>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the ... | diff --git a/weblate/trans/tests/test_files.py b/weblate/trans/tests/test_files.py
index 816cb79a628f..0a3cce423e3d 100644
--- a/weblate/trans/tests/test_files.py
+++ b/weblate/trans/tests/test_files.py
@@ -360,6 +360,7 @@ def test_export(self):
)
)
self.assertContains(response, 'Weblate Hello World 2016')
+ self.assertContains(response, 'Nazdar svete!')
self.assertEqual(
response['Content-Disposition'],
'attachment; filename=test-test-cs.po'
diff --git a/weblate/trans/views/helper.py b/weblate/trans/views/helper.py
index ac817a219989..1c5c9b4df663 100644
--- a/weblate/trans/views/helper.py
+++ b/weblate/trans/views/helper.py
@@ -123,6 +123,10 @@ def download_translation_file(translation, fmt=None):
)
)
+ # Force flushing pending units
+ author = translation.get_last_author(True)
+ translation.update_units(author)
+
srcfilename = translation.get_filename()
# Construct file name (do not use real filename as it is usually not
|
dask__distributed-2975 | dask.distributed.progress no longer callable in 2.3.0?
We've used the progress() function from dask.distributed a bunch in the past to display a progress bar in JupyterLab, but it seems to have stopped working after upgrading to Dask 2.3.0:
```
from dask.distributed import Client, progress
import dask.dataframe as dd
df = dd.demo.make_timeseries('2010', '2016',
{'value': float, 'name': str, 'id': int},
freq='10s', partition_freq='7d', seed=1)
df = df.persist()
progress(df)
```
Executing this in a single cell in JupyterLab (with an existing Dask cluster already running) results in:
```
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-1-16af814d7204> in <module>
7
8 df = df.persist()
----> 9 progress(df)
TypeError: 'module' object is not callable
```
Let me know if I can provide any more info. Thanks!
| [
{
"content": "from . import config\nfrom dask.config import config\nfrom .actor import Actor, ActorFuture\nfrom .core import connect, rpc\nfrom .deploy import LocalCluster, Adaptive, SpecCluster\nfrom .diagnostics import progress\nfrom .client import (\n Client,\n Executor,\n CompatibleExecutor,\n w... | [
{
"content": "from . import config\nfrom dask.config import config\nfrom .actor import Actor, ActorFuture\nfrom .core import connect, rpc\nfrom .deploy import LocalCluster, Adaptive, SpecCluster\nfrom .diagnostics.progressbar import progress\nfrom .client import (\n Client,\n Executor,\n CompatibleExec... | diff --git a/distributed/__init__.py b/distributed/__init__.py
index ca36613c815..d79993dfef7 100644
--- a/distributed/__init__.py
+++ b/distributed/__init__.py
@@ -3,7 +3,7 @@
from .actor import Actor, ActorFuture
from .core import connect, rpc
from .deploy import LocalCluster, Adaptive, SpecCluster
-from .diagnostics import progress
+from .diagnostics.progressbar import progress
from .client import (
Client,
Executor,
|
numpy__numpy-15672 | FIXME in `numpy/__init__.py` related to `numpy.lib` imports
There is a FIXME comment in `numpy/__init__.py` that doesn't seem to have a corresponding issue on GitHub, at least not one that I noticed with a cursory search of the issues.
https://github.com/numpy/numpy/blob/eb167a3fe540780f397a14817f54a95333fbcc6c/numpy/__init__.py#L140-L145
There is additional code in `numpy/__init__.py` related to this FIXME:
https://github.com/numpy/numpy/blob/eb167a3fe540780f397a14817f54a95333fbcc6c/numpy/__init__.py#L178-L184
My intent is getting this into the issue tracker so that it can be discussed/documented and synced up with the code comments. If there is an existing issue that I missed, I'd recommend updating the comment in `numpy/__init__.py` to point there.
| [
{
"content": "\"\"\"\nNumPy\n=====\n\nProvides\n 1. An array object of arbitrary homogeneous items\n 2. Fast mathematical operations over arrays\n 3. Linear Algebra, Fourier Transforms, Random Number Generation\n\nHow to use the documentation\n----------------------------\nDocumentation is available in two f... | [
{
"content": "\"\"\"\nNumPy\n=====\n\nProvides\n 1. An array object of arbitrary homogeneous items\n 2. Fast mathematical operations over arrays\n 3. Linear Algebra, Fourier Transforms, Random Number Generation\n\nHow to use the documentation\n----------------------------\nDocumentation is available in two f... | diff --git a/numpy/__init__.py b/numpy/__init__.py
index c5c58b0200ca..ba35224e6ad5 100644
--- a/numpy/__init__.py
+++ b/numpy/__init__.py
@@ -141,7 +141,8 @@
from .core import *
from . import compat
from . import lib
- # FIXME: why have numpy.lib if everything is imported here??
+ # NOTE: to be revisited following future namespace cleanup.
+ # See gh-14454 and gh-15672 for discussion.
from .lib import *
from . import linalg
|
PrefectHQ__prefect-10669 | Email notifications example form input needs adjusted
### First check
- [X] I added a descriptive title to this issue.
- [X] I used the GitHub search to find a similar issue and didn't find it.
- [X] I searched the Prefect documentation for this issue.
- [X] I checked that this issue is related to Prefect and not one of its dependencies.
### Bug summary
Here's the default example in a Prefect server UI.
<img width="1185" alt="Screenshot 2023-09-07 at 1 51 05 PM" src="https://github.com/PrefectHQ/prefect/assets/7703961/270dcce8-c9f8-484f-aeab-11dfebdabb92">
But this doesn't work.
<img width="1187" alt="Screenshot 2023-09-07 at 1 51 24 PM" src="https://github.com/PrefectHQ/prefect/assets/7703961/d4e38e4f-9aa3-4c04-8615-3c5cf2144e50">
Quotes are required around the address.
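This is consistent with the field being parsed as JSON (an assumption suggested by the corrected placeholder): a bare address is not a valid JSON value, while a quoted one is a JSON string:

```python
import json

# Quoted: a valid JSON string, as the corrected placeholder shows.
print(json.loads('"recipient1@gmail.com"'))  # recipient1@gmail.com

# Unquoted: not a JSON value at all, so the input is rejected.
try:
    json.loads("recipient1@gmail.com")
except json.JSONDecodeError as exc:
    print("rejected:", exc.msg)
```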
### Reproduction
```python3
see above
```
### Error
_No response_
### Versions
```Text
Version: 2.12.1
API version: 0.8.4
Python version: 3.10.8
Git commit: f5eed67c
Built: Fri, Sep 1, 2023
4:01 PM
OS/Arch: darwin/arm64
Profile: local
Server type: ephemeral
Server:
Database: sqlite
SQLite version: 3.40.0
```
### Additional context
_No response_
| [
{
"content": "from abc import ABC\nfrom typing import Dict, List, Optional\n\nfrom pydantic import AnyHttpUrl, Field, SecretStr\nfrom typing_extensions import Literal\n\nfrom prefect.blocks.abstract import NotificationBlock\nfrom prefect.blocks.fields import SecretDict\nfrom prefect.events.instrument import ins... | [
{
"content": "from abc import ABC\nfrom typing import Dict, List, Optional\n\nfrom pydantic import AnyHttpUrl, Field, SecretStr\nfrom typing_extensions import Literal\n\nfrom prefect.blocks.abstract import NotificationBlock\nfrom prefect.blocks.fields import SecretDict\nfrom prefect.events.instrument import ins... | diff --git a/src/prefect/blocks/notifications.py b/src/prefect/blocks/notifications.py
index 52cfc6e3ad03..86d7c1d6ff53 100644
--- a/src/prefect/blocks/notifications.py
+++ b/src/prefect/blocks/notifications.py
@@ -749,7 +749,7 @@ class SendgridEmail(AbstractAppriseNotificationBlock):
default=...,
title="Recipient emails",
description="Email ids of all recipients.",
- example="recipient1@gmail.com",
+ example='"recipient1@gmail.com"',
)
def block_initialization(self) -> None:
|
tobymao__sqlglot-2598 | sqlglot corrupts date_format spec for MySQL
**Before you file an issue**
> - Make sure you specify the "read" dialect eg. parse_one(sql, read="spark")
Yes, `read='mysql'`
> - Check if the issue still exists on main
Yes
**Fully reproducible code snippet**
> Please include a fully reproducible code snippet or the input sql, dialect, and expected output.
```
In [19]: import sqlglot
In [20]: sqlglot.parse_one("date_format(now(), '%Y-%m-%d %H:%i:00.0000')", read='mysql').sql(dialect='mysql')
Out[20]: "DATE_FORMAT(NOW(), '%Y-%m-%d %H:%M:00.0000')"
```
sqlglot uses the `%M` specifier for minutes, but in MySQL `%i` should be used.
**Official Documentation**
> Please include links to official SQL documentation related to your issue.
https://dev.mysql.com/doc/refman/8.0/en/date-and-time-functions.html#function_date-format
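sqlglot's generic time format is strftime-based, so generating MySQL requires translating specifiers rather than copying them through. A toy translator showing the minute case (the mapping table is illustrative and far from complete):

```python
# Illustrative strftime -> MySQL DATE_FORMAT mapping; in MySQL '%i' is
# minutes while '%M' is the month name, so passing '%M' through unchanged
# corrupts the format string.
PY_TO_MYSQL = {"%Y": "%Y", "%m": "%m", "%d": "%d", "%H": "%H", "%M": "%i", "%S": "%s"}

def to_mysql(fmt: str) -> str:
    out, i = [], 0
    while i < len(fmt):
        token = fmt[i:i + 2]
        if token in PY_TO_MYSQL:
            out.append(PY_TO_MYSQL[token])
            i += 2
        else:
            out.append(fmt[i])
            i += 1
    return "".join(out)

print(to_mysql("%Y-%m-%d %H:%M:00.0000"))  # %Y-%m-%d %H:%i:00.0000
```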
| [
{
"content": "import typing as t\n\n# The generic time format is based on python time.strftime.\n# https://docs.python.org/3/library/time.html#time.strftime\nfrom sqlglot.trie import TrieResult, in_trie, new_trie\n\n\ndef format_time(\n string: str, mapping: t.Dict[str, str], trie: t.Optional[t.Dict] = None\... | [
{
"content": "import typing as t\n\n# The generic time format is based on python time.strftime.\n# https://docs.python.org/3/library/time.html#time.strftime\nfrom sqlglot.trie import TrieResult, in_trie, new_trie\n\n\ndef format_time(\n string: str, mapping: t.Dict[str, str], trie: t.Optional[t.Dict] = None\... | diff --git a/sqlglot/time.py b/sqlglot/time.py
index c286ec1e8c..50ec2ec3f0 100644
--- a/sqlglot/time.py
+++ b/sqlglot/time.py
@@ -42,6 +42,10 @@ def format_time(
end -= 1
chars = sym
sym = None
+ else:
+ chars = chars[0]
+ end = start + 1
+
start += len(chars)
chunks.append(chars)
current = trie
diff --git a/tests/dialects/test_mysql.py b/tests/dialects/test_mysql.py
index 45bb763b8a..ab246a3d6a 100644
--- a/tests/dialects/test_mysql.py
+++ b/tests/dialects/test_mysql.py
@@ -123,6 +123,7 @@ def test_ddl(self):
self.validate_identity("ALTER TABLE test_table ALTER COLUMN test_column SET DEFAULT 1")
def test_identity(self):
+ self.validate_identity("SELECT DATE_FORMAT(NOW(), '%Y-%m-%d %H:%i:00.0000')")
self.validate_identity("SELECT @var1 := 1, @var2")
self.validate_identity("UNLOCK TABLES")
self.validate_identity("LOCK TABLES `app_fields` WRITE")
|
OpenEnergyPlatform__oeplatform-483 | Aliases are not resolved in column matching
Aliases in FROM clauses cause an error when parsing column elements. There must be a structure that keeps track of aliases during parsing so they can be resolved appropriately.
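One way to structure this is an alias map built while walking the FROM clause, consulted whenever a qualified column is parsed. A minimal sketch (names are illustrative, not the parser's real API):

```python
def resolve_column(ref: str, aliases: dict) -> str:
    # Map an alias prefix collected from the FROM clause back to the real
    # table name before emitting the column reference.
    if "." in ref:
        prefix, name = ref.split(".", 1)
        return f"{aliases.get(prefix, prefix)}.{name}"
    return ref

aliases = {"m": "measurements"}  # e.g. FROM measurements AS m
print(resolve_column("m.value", aliases))  # measurements.value
print(resolve_column("value", aliases))    # value
```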
| [
{
"content": "###########\n# Parsers #\n###########\nimport decimal\nimport re\nfrom datetime import datetime, date\n\nimport geoalchemy2 # Although this import seems unused is has to be here\nimport sqlalchemy as sa\nfrom sqlalchemy import (\n Column,\n MetaData,\n Table,\n and_,\n not_,\n c... | [
{
"content": "###########\n# Parsers #\n###########\nimport decimal\nimport re\nfrom datetime import datetime, date\n\nimport geoalchemy2 # Although this import seems unused is has to be here\nimport sqlalchemy as sa\nfrom sqlalchemy import (\n Column,\n MetaData,\n Table,\n and_,\n not_,\n c... | diff --git a/api/parser.py b/api/parser.py
index 3c76c36ef..37cb06463 100644
--- a/api/parser.py
+++ b/api/parser.py
@@ -357,7 +357,10 @@ def parse_column(d, mapper):
if is_literal:
return literal_column(name)
else:
- return column(name)
+ if table_name is not None:
+ return literal_column(table_name + "." + name)
+ else:
+ return column(name)
def parse_type(dt_string, **kwargs):
diff --git a/api/tests/test_regression/test_issue_482.py b/api/tests/test_regression/test_issue_482.py
new file mode 100644
index 000000000..57e361b0a
--- /dev/null
+++ b/api/tests/test_regression/test_issue_482.py
@@ -0,0 +1,132 @@
+import json
+
+import requests
+
+from api.tests import APITestCase
+
+from ..util import load_content_as_json
+
+
+class TestAliasesTracking(APITestCase):
+ def setUp(self):
+ self._structure_data = {
+ "constraints": [
+ {
+ "constraint_type": "PRIMARY KEY",
+ "constraint_parameter": "id",
+ "reference_table": None,
+ "reference_column": None,
+ }
+ ],
+ "columns": [
+ {
+ "name": "id",
+ "data_type": "bigserial",
+ "is_nullable": False,
+ "character_maximum_length": None,
+ },
+ {
+ "name": "name",
+ "data_type": "character varying",
+ "is_nullable": True,
+ "character_maximum_length": 123,
+ },
+ ],
+ }
+
+ resp = self.__class__.client.put(
+ "/api/v0/schema/{schema}/tables/{table}/".format(
+ schema=self.test_schema, table=self.test_table
+ ),
+ data=json.dumps({"query": self._structure_data}),
+ HTTP_AUTHORIZATION="Token %s" % self.__class__.token,
+ content_type="application/json",
+ )
+
+ # Check HTTP-response (201 = Successful create)
+ self.assertEqual(
+ resp.status_code, 201, resp.json().get("reason", "No reason returned")
+ )
+
+ resp = self.__class__.client.post(
+ "/api/v0/schema/{schema}/tables/{table}/rows/new".format(
+ schema=self.test_schema, table=self.test_table
+ ),
+ data=json.dumps({"query": [{"name": "Hans"}, {"name": "Petra"}]}),
+ HTTP_AUTHORIZATION="Token %s" % self.__class__.token,
+ content_type="application/json",
+ )
+
+ # Check HTTP-response (201 = Successful create)
+ self.assertEqual(
+ resp.status_code,
+ 201,
+ load_content_as_json(resp).get("reason", "No reason returned"),
+ )
+
+ def test_aliases_in_form_clauses(self):
+ data = {
+ "query": {
+ "fields": [dict(type="column", column="id", table="a")],
+ "where": [
+ {
+ "type": "operator",
+ "operator": "=",
+ "operands": [
+ {"type": "column", "column": "name", "table": "a"},
+ {"type": "value", "value": "Hans"},
+ ],
+ }
+ ],
+ "from": {
+ "type": "join",
+ "left": {
+ "type": "table",
+ "table": self.test_table,
+ "schema": self.test_schema,
+ "alias": "a"
+ },
+ "right": {
+ "type": "table",
+ "table": self.test_table,
+ "schema": self.test_schema,
+ "alias": "b"
+ },
+ "on":[
+ {
+ "type": "operator",
+ "operator": "=",
+ "operands": [
+ {"type": "column", "column": "id", "table": "a"},
+ {"type": "column", "column": "id", "table": "b"},
+ ],
+ }
+ ]
+ }
+ }
+ }
+
+ resp = self.__class__.client.post(
+ "/api/v0/advanced/search",
+ data=json.dumps(data),
+ HTTP_AUTHORIZATION="Token %s" % self.__class__.token,
+ content_type="application/json",
+ )
+
+ self.check_api_post(
+ "/api/v0/advanced/search", data=data, expected_result=[[1]]
+ )
+
+ def tearDown(self):
+ resp = self.__class__.client.delete(
+ "/api/v0/schema/{schema}/tables/{table}/".format(
+ schema=self.test_schema, table=self.test_table
+ ),
+ HTTP_AUTHORIZATION="Token %s" % self.__class__.token,
+ content_type="application/json",
+ )
+
+ # Check HTTP-response (200 = Successful request)
+ self.assertEqual(
+ resp.status_code, 200, resp.json().get("reason", "No reason returned")
+ )
diff --git a/versions/changelogs/current.md b/versions/changelogs/current.md
index a601a060f..830f0834e 100644
--- a/versions/changelogs/current.md
+++ b/versions/changelogs/current.md
@@ -6,4 +6,5 @@
### Bugs
* API: Fixed negation in where clauses
-* API: Fixed metadata tooltips
\ No newline at end of file
+* API: Fixed metadata tooltips
+* API: Fixed alias handling (#482)
\ No newline at end of file
|
googleapis__google-api-python-client-1221 | Published package is missing discovery files in discovery_cache
Many thanks to @wyk9787 for noticing this and reaching out.
All calls to `discovery.build()` using `2.0.0` fail with "unknown api name or version".
```python
from googleapiclient import discovery
client = discovery.build("cloudprofiler", "v2")
```
This is because the published package has no `discovery_cache/documents` directory.
1. `python3 -m venv env`
2. `source env/bin/activate`
3. `python3 -m pip install google-api-python-client`
4. `ls env/lib/python*/site-packages/googleapiclient/discovery_cache`
```
busunkim@busunkim:~/github$ ls env/lib/python*/site-packages/googleapiclient/discovery_cache
appengine_memcache.py base.py file_cache.py __init__.py __pycache__
```
| [
{
"content": "# Copyright 2014 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unles... | [
{
"content": "# Copyright 2014 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unles... | diff --git a/setup.py b/setup.py
index 894018a1b1e..d3ef571a9a3 100644
--- a/setup.py
+++ b/setup.py
@@ -60,7 +60,7 @@
install_requires=install_requires,
python_requires=">=3.6",
packages=packages,
- package_data={},
+ package_data={"googleapiclient": ["discovery_cache/documents/*.json"]},
license="Apache 2.0",
keywords="google api client",
classifiers=[
|
buildbot__buildbot-3490 | UnboundLocalError in mq/base.py on master shutdown
Hello,
We're using buildbot in multi-master mode and got this stacktrace on one of the master when shutting it down:
```
2017-07-17 12:33:29+0000 [-] Waiting for 1 build(s) to finish
2017-07-17 12:33:29+0000 [-] Builder <Builder 'u'sql-monitor-bitbucket_scality_ring-monitor_ring_frequent-prod-frontend-0'' at 140555339856784> has 1 builds running
2017-07-17 12:33:29+0000 [-] Not shutting down, there are 1 builds running
2017-07-17 12:33:29+0000 [-] Trying shutdown sequence again
2017-07-17 12:33:30+0000 [-] <Build sql-monitor-bitbucket_scality_ring-monitor_ring_frequent-prod-frontend-0 number:32108L results:exception>: stopping build: Master Shutdown 5
2017-07-17 12:33:30+0000 [-] Unhandled error in Deferred:
2017-07-17 12:33:30+0000 [-] Unhandled Error
Traceback (most recent call last):
File "/root/bitbucket/scality/ring/venv/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = g.send(result)
File "/root/bitbucket/scality/ring/venv/local/lib/python2.7/site-packages/buildbot/process/botmaster.py", line 105, in cleanShutdown
l.append(build.waitUntilFinished())
File "/root/bitbucket/scality/ring/venv/local/lib/python2.7/site-packages/buildbot/process/build.py", line 687, in waitUntilFinished
lambda: self.finished)
File "/root/bitbucket/scality/ring/venv/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1445, in unwindGenerator
return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
File "/root/bitbucket/scality/ring/venv/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
result = g.send(result)
File "/root/bitbucket/scality/ring/venv/local/lib/python2.7/site-packages/buildbot/mq/base.py", line 40, in waitUntilEvent
defer.returnValue(res)
exceptions.UnboundLocalError: local variable 'res' referenced before assignment
```
Looking at the code at the end of `waitUntilEvent()`:
```
if not check:
res = yield d
yield buildCompleteConsumer.stopConsuming
defer.returnValue(res)
```
If the check returned false, we try to return a value (`res`) that was never defined.
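Stripped of the Twisted/deferred machinery, this is the classic shape below: a local is bound on only one branch, so the other branch raises `UnboundLocalError`. The function names are illustrative; the upstream fix binds `res = None` on the other branch (and also calls `stopConsuming()` instead of merely referencing the method):

```python
def wait_until_event_buggy(check):
    # `res` is only bound on the branch where we actually wait.
    if not check:
        res = "event-result"
    return res  # UnboundLocalError when check is truthy


def wait_until_event_fixed(check):
    res = None  # bind on every path, as the upstream fix does
    if not check:
        res = "event-result"
    return res


try:
    wait_until_event_buggy(True)
except UnboundLocalError as exc:
    print(exc)

print(wait_until_event_fixed(True))   # None
print(wait_until_event_fixed(False))  # event-result
```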
| [
{
"content": "# This file is part of Buildbot. Buildbot is free software: you can\n# redistribute it and/or modify it under the terms of the GNU General Public\n# License as published by the Free Software Foundation, version 2.\n#\n# This program is distributed in the hope that it will be useful, but WITHOUT\n... | [
{
"content": "# This file is part of Buildbot. Buildbot is free software: you can\n# redistribute it and/or modify it under the terms of the GNU General Public\n# License as published by the Free Software Foundation, version 2.\n#\n# This program is distributed in the hope that it will be useful, but WITHOUT\n... | diff --git a/master/buildbot/mq/base.py b/master/buildbot/mq/base.py
index 2379dead364c..b27f462492a4 100644
--- a/master/buildbot/mq/base.py
+++ b/master/buildbot/mq/base.py
@@ -36,7 +36,9 @@ def waitUntilEvent(self, filter, check_callback):
# we only wait if the check callback return true
if not check:
res = yield d
- yield buildCompleteConsumer.stopConsuming
+ else:
+ res = None
+ yield buildCompleteConsumer.stopConsuming()
defer.returnValue(res)
diff --git a/master/buildbot/newsfragments/3478.bugfix b/master/buildbot/newsfragments/3478.bugfix
new file mode 100644
index 000000000000..5b79c0e6da39
--- /dev/null
+++ b/master/buildbot/newsfragments/3478.bugfix
@@ -0,0 +1 @@
+Fix exception when shutting down a master (:issue:`3478`)
|
liberapay__liberapay.com-195 | Twitter API chokes on at-sign
https://liberapay.com/on/twitter/@korben/ returns a 500. sentry#35, public link: https://sentry.changaco.oy.lc/share/issue/322e3335/.
| [
{
"content": "from __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom datetime import timedelta\nimport json\nimport uuid\nimport xml.etree.ElementTree as ET\n\nfrom six.moves.urllib.parse import urlsplit, urlunsplit\n\nfrom aspen import Response\nfrom aspen.utils import utcno... | [
{
"content": "from __future__ import absolute_import, division, print_function, unicode_literals\n\nfrom datetime import timedelta\nimport json\nimport uuid\nimport xml.etree.ElementTree as ET\n\nfrom six.moves.urllib.parse import urlsplit, urlunsplit\n\nfrom aspen import Response\nfrom aspen.utils import utcno... | diff --git a/liberapay/models/account_elsewhere.py b/liberapay/models/account_elsewhere.py
index cc3f674596..81361dfb4c 100644
--- a/liberapay/models/account_elsewhere.py
+++ b/liberapay/models/account_elsewhere.py
@@ -242,6 +242,8 @@ def get_account_elsewhere(website, state, api_lookup=True):
uid = uid[1:]
else:
key = 'user_name'
+ if uid[:1] == '@':
+ uid = uid[1:]
try:
account = AccountElsewhere._from_thing(key, platform.name, uid)
except UnknownAccountElsewhere:
diff --git a/tests/py/test_elsewhere.py b/tests/py/test_elsewhere.py
index c156dcc191..dfc79c1c63 100644
--- a/tests/py/test_elsewhere.py
+++ b/tests/py/test_elsewhere.py
@@ -116,6 +116,19 @@ def test_user_page_shows_pledges(self, get_user_info):
r = self.client.GET('/on/github/alice/')
assert str(amount) in r.body, r.body.decode('utf8')
+ @mock.patch('liberapay.elsewhere._base.Platform.get_user_info')
+ def test_user_page_doesnt_fail_on_at_sign(self, get_user_info):
+ def f(k, v, *a):
+ if (k, v) == ('user_name', 'alice'):
+ return UserInfo(
+ platform='twitter', user_id='0', user_name='alice',
+ is_team=False
+ )
+ raise Exception
+ get_user_info.side_effect = f
+ response = self.client.GET('/on/twitter/@alice/')
+ assert response.code == 200
+
def test_user_pages_not_found(self):
user_name = 'adhsjakdjsdkjsajdhksda'
error = "There doesn't seem to be a user named %s on %s."
|
tornadoweb__tornado-3167 | Tornado 6.2 release readiness
I'm creating this issue to collect feedback on the 6.2 betas. For the folks who have tried them, do you think the release is ready to go or are there still more changes to be made?
Tagging @minrk and @graingert as authors of relevant PRs, although I'd welcome feedback from anyone interested in this release.
| [
{
"content": "#\n# Copyright 2009 Facebook\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may\n# not use this file except in compliance with the License. You may obtain\n# a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicab... | [
{
"content": "#\n# Copyright 2009 Facebook\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"); you may\n# not use this file except in compliance with the License. You may obtain\n# a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicab... | diff --git a/docs/releases/v6.2.0.rst b/docs/releases/v6.2.0.rst
index a2277b9267..57b76ecb14 100644
--- a/docs/releases/v6.2.0.rst
+++ b/docs/releases/v6.2.0.rst
@@ -1,8 +1,8 @@
What's new in Tornado 6.2.0
===========================
-Jun XX, 2022
-------------
+Jul 3, 2022
+-----------
Deprecation notice
~~~~~~~~~~~~~~~~~~
@@ -75,6 +75,8 @@ General changes
has been unnecessary since Python 3.2 added a logger of last resort.
- The `.IOLoop` constructor now accepts an ``asyncio_loop`` keyword argument to
initialize with a specfied asyncio event loop.
+- It is now possible to construct an `.IOLoop` on one thread (with
+ ``make_current=False``) and start it on a different thread.
`tornado.iostream`
~~~~~~~~~~~~~~~~~~
diff --git a/tornado/__init__.py b/tornado/__init__.py
index 43fe83cb3d..39d7c44bf4 100644
--- a/tornado/__init__.py
+++ b/tornado/__init__.py
@@ -22,5 +22,5 @@
# is zero for an official release, positive for a development branch,
# or negative for a release candidate or beta (after the base version
# number has been incremented)
-version = "6.2b2"
-version_info = (6, 2, 0, -98)
+version = "6.2"
+version_info = (6, 2, 0, 0)
|
mkdocs__mkdocs-708 | Remove unicode_literals import in the CLI code
Looks like this can cause some issues with Click; I've not seen any, but we should probably remove it anyway or we will start to get warnings from Click 5.0.
https://github.com/mitsuhiko/click/commit/5f337705f68bdfa66d7c7a9fe7fc5d6bfd48db94
| [
{
"content": "#!/usr/bin/env python\n# coding: utf-8\n\nfrom __future__ import unicode_literals\nimport logging\nimport click\nimport socket\n\nfrom mkdocs import __version__\nfrom mkdocs import utils\nfrom mkdocs import exceptions\nfrom mkdocs.config import load_config\nfrom mkdocs.commands import build, gh_de... | [
{
"content": "#!/usr/bin/env python\n# coding: utf-8\n\nfrom __future__ import unicode_literals\nimport logging\nimport click\nimport socket\n\nfrom mkdocs import __version__\nfrom mkdocs import utils\nfrom mkdocs import exceptions\nfrom mkdocs.config import load_config\nfrom mkdocs.commands import build, gh_de... | diff --git a/mkdocs/__main__.py b/mkdocs/__main__.py
index 6d76f97235..7f788141d2 100644
--- a/mkdocs/__main__.py
+++ b/mkdocs/__main__.py
@@ -14,6 +14,11 @@
log = logging.getLogger(__name__)
+# Disable the warning that Click displays (as of Click version 5.0) when users
+# use unicode_literals in Python 2.
+# See http://click.pocoo.org/dev/python3/#unicode-literals for more details.
+click.disable_unicode_literals_warning = True
+
class State(object):
''' Maintain logging level.'''
|
alltheplaces__alltheplaces-2638 | Spider costco is broken
During the global build at 2021-08-18-14-42-26, spider **costco** failed with **0 features** and **2 errors**.
Here's [the log](https://data.alltheplaces.xyz/runs/2021-08-18-14-42-26/logs/costco.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-08-18-14-42-26/output/costco.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-08-18-14-42-26/output/costco.geojson))
| [
{
"content": "# -*- coding: utf-8 -*-\nimport scrapy\nimport json\nimport re\nfrom urllib.parse import urlencode\n\nfrom locations.items import GeojsonPointItem\n\nDAYS_NAME = {\n 'm': 'Mo',\n 'mon': 'Mo',\n 't': 'Tu',\n 'w': 'We',\n 's': 'Th',\n 'f': 'Fr',\n 'f ': 'Fr',\n 'sun': 'Su',\n... | [
{
"content": "# -*- coding: utf-8 -*-\nimport scrapy\nimport json\nimport re\nfrom urllib.parse import urlencode\n\nfrom locations.items import GeojsonPointItem\n\nDAYS_NAME = {\n 'm': 'Mo',\n 'mon': 'Mo',\n 't': 'Tu',\n 'w': 'We',\n 's': 'Th',\n 'f': 'Fr',\n 'f ': 'Fr',\n 'sun': 'Su',\n... | diff --git a/locations/spiders/costco.py b/locations/spiders/costco.py
index 355fb6314e6..28162637fee 100644
--- a/locations/spiders/costco.py
+++ b/locations/spiders/costco.py
@@ -28,7 +28,7 @@ class CostcoSpider(scrapy.Spider):
'https://www.costco.com/warehouse-locations',
)
custom_settings = {
- 'USER_AGENT': 'Mozilla/5.0',
+ 'USER_AGENT': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.159 Safari/537.36',
}
download_delay = 0.5
|
cloud-custodian__cloud-custodian-5545 | core - source get_resources should early exit on empty set
Some of the service APIs will return all resources if we issue an API call with an empty set of ids. When using get_resources we should explicitly check for an empty set and return early, in the base describe source and/or the query resource manager's get_resources method.
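A minimal sketch of the proposed guard, with a stand-in `describe_instances` that mimics a service API returning everything when the id filter is empty (the names are illustrative, not the actual c7n API):

```python
def describe_instances(ids=None):
    """Stand-in for a cloud API that returns *all* resources when the
    id filter is empty -- the behaviour the guard protects against."""
    all_resources = [{"id": "i-1"}, {"id": "i-2"}, {"id": "i-3"}]
    if not ids:
        return all_resources
    return [r for r in all_resources if r["id"] in ids]


def get_resources(ids):
    if not ids:   # early exit: an empty id set must mean "nothing",
        return [] # not "everything the API decides to return"
    return describe_instances(ids)


print(get_resources([]))       # []
print(get_resources(["i-2"]))  # [{'id': 'i-2'}]
```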
| [
{
"content": "# Copyright 2016-2017 Capital One Services, LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless requi... | [
{
"content": "# Copyright 2016-2017 Capital One Services, LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless requi... | diff --git a/c7n/query.py b/c7n/query.py
index b0e6faa0997..3eff822c2b1 100644
--- a/c7n/query.py
+++ b/c7n/query.py
@@ -491,6 +491,8 @@ def _get_cached_resources(self, ids):
return None
def get_resources(self, ids, cache=True, augment=True):
+ if not ids:
+ return []
if cache:
resources = self._get_cached_resources(ids)
if resources is not None:
|
quantumlib__Cirq-3689 | Add to heatmap visualization tests
In the `test_colorbar` test there is a comment about testing that the position size and pad arguments are respected.
| [
{
"content": "# Copyright 2019 The Cirq Developers\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by... | [
{
"content": "# Copyright 2019 The Cirq Developers\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by... | diff --git a/cirq/vis/heatmap.py b/cirq/vis/heatmap.py
index 392db174213..87033f27806 100644
--- a/cirq/vis/heatmap.py
+++ b/cirq/vis/heatmap.py
@@ -38,7 +38,7 @@ def _get_qubit_row_col(qubit: QubitCoordinate) -> Tuple[int, int]:
if isinstance(qubit, grid_qubit.GridQubit):
return qubit.row, qubit.col
elif isinstance(qubit, tuple):
- return qubit[0], qubit[1]
+ return int(qubit[0]), int(qubit[1])
def relative_luminance(color: np.ndarray) -> float:
diff --git a/cirq/vis/heatmap_test.py b/cirq/vis/heatmap_test.py
index c85bc8c4eea..574efbc525b 100644
--- a/cirq/vis/heatmap_test.py
+++ b/cirq/vis/heatmap_test.py
@@ -13,7 +13,9 @@
# limitations under the License.
"""Tests for Heatmap."""
+import pathlib
import string
+from tempfile import mkdtemp
import numpy as np
import pytest
@@ -226,20 +228,72 @@ def test_urls(ax, test_GridQubit):
assert mesh.get_urls() == expected_urls
-def test_colorbar(ax):
+@pytest.mark.parametrize(
+ 'position,size,pad',
+ [
+ ('right', "5%", "2%"),
+ ('right', "5%", "10%"),
+ ('right', "20%", "2%"),
+ ('right', "20%", "10%"),
+ ('left', "5%", "2%"),
+ ('left', "5%", "10%"),
+ ('left', "20%", "2%"),
+ ('left', "20%", "10%"),
+ ('top', "5%", "2%"),
+ ('top', "5%", "10%"),
+ ('top', "20%", "2%"),
+ ('top', "20%", "10%"),
+ ('bottom', "5%", "2%"),
+ ('bottom', "5%", "10%"),
+ ('bottom', "20%", "2%"),
+ ('bottom', "20%", "10%"),
+ ],
+)
+def test_colorbar(ax, position, size, pad):
qubits = ((0, 5), (8, 1), (7, 0), (13, 5), (1, 6), (3, 2), (2, 8))
values = np.random.random(len(qubits))
test_value_map = {qubit: value for qubit, value in zip(qubits, values)}
random_heatmap = heatmap.Heatmap(test_value_map).unset_colorbar()
fig1, ax1 = plt.subplots()
random_heatmap.plot(ax1)
- random_heatmap.set_colorbar()
+ random_heatmap.set_colorbar(position=position, size=size, pad=pad)
fig2, ax2 = plt.subplots()
random_heatmap.plot(ax2)
+ # We need to call savefig() explicitly for updating axes position since the figure
+ # object has been altered in the HeatMap._plot_colorbar function.
+ tmp_dir = mkdtemp()
+ fig2.savefig(pathlib.Path(tmp_dir) / 'tmp.png')
+
# Check that the figure has one more object in it when colorbar is on.
assert len(fig2.get_children()) == len(fig1.get_children()) + 1
- # TODO: Make this is a more thorough test, e.g., we should test that the
- # position, size and pad arguments are respected.
- # Github issue: https://github.com/quantumlib/Cirq/issues/2969
+ fig_pos = fig2.get_axes()[0].get_position()
+ colorbar_pos = fig2.get_axes()[1].get_position()
+
+ origin_axes_size = (
+ fig_pos.xmax - fig_pos.xmin
+ if position in ["left", "right"]
+ else fig_pos.ymax - fig_pos.ymin
+ )
+ expected_pad = int(pad.replace("%", "")) / 100 * origin_axes_size
+ expected_size = int(size.replace("%", "")) / 100 * origin_axes_size
+
+ if position == "right":
+ pad_distance = colorbar_pos.xmin - fig_pos.xmax
+ colorbar_size = colorbar_pos.xmax - colorbar_pos.xmin
+ elif position == "left":
+ pad_distance = fig_pos.xmin - colorbar_pos.xmax
+ colorbar_size = colorbar_pos.xmax - colorbar_pos.xmin
+ elif position == "top":
+ pad_distance = colorbar_pos.ymin - fig_pos.ymax
+ colorbar_size = colorbar_pos.ymax - colorbar_pos.ymin
+ elif position == "bottom":
+ pad_distance = fig_pos.ymin - colorbar_pos.ymax
+ colorbar_size = colorbar_pos.ymax - colorbar_pos.ymin
+
+ assert np.isclose(colorbar_size, expected_size)
+ assert np.isclose(pad_distance, expected_pad)
+
+ plt.close(fig1)
+ plt.close(fig2)
|
searx__searx-2130 | Allow server admins to choose default search method
Currently, the default search method used by Searx is `POST`, which breaks compatibility with Firefox containers. Since FF's query to `/opensearch.xml` does not include the cookies, the user's preferred method is not reflected and they're forced to create a custom search engine with the correct URL formatting and method.
A good solution is to make the method parameter configurable in `settings.yml`. This was already mentioned in #703 and looks fairly easy to implement. Let me know if you want me to open a PR.
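A sketch of the suggested change: read the default from the parsed settings with a `POST` fallback. The `settings` dict below is an illustrative stand-in for searx's parsed `settings.yml`, not its actual loader:

```python
settings = {"server": {"base_url": False, "method": "GET"}}  # stand-in for parsed settings.yml


def default_search_method(settings):
    # Fall back to POST when the admin has not set server.method.
    method = settings.get("server", {}).get("method", "POST")
    if method not in ("GET", "POST"):
        raise ValueError("invalid server.method: %s" % method)
    return method


print(default_search_method(settings))        # GET
print(default_search_method({"server": {}}))  # POST
```

Keeping `POST` as the fallback preserves the existing behaviour for deployments that never touch the new setting.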
| [
{
"content": "# SPDX-License-Identifier: AGPL-3.0-or-later\n\"\"\"Searx preferences implementation.\n\"\"\"\n\n# pylint: disable=useless-object-inheritance\n\nfrom base64 import urlsafe_b64encode, urlsafe_b64decode\nfrom zlib import compress, decompress\nfrom sys import version\n\nfrom searx import settings, au... | [
{
"content": "# SPDX-License-Identifier: AGPL-3.0-or-later\n\"\"\"Searx preferences implementation.\n\"\"\"\n\n# pylint: disable=useless-object-inheritance\n\nfrom base64 import urlsafe_b64encode, urlsafe_b64decode\nfrom zlib import compress, decompress\nfrom sys import version\n\nfrom searx import settings, au... | diff --git a/searx/preferences.py b/searx/preferences.py
index f70aee37aa..34da1b7c68 100644
--- a/searx/preferences.py
+++ b/searx/preferences.py
@@ -348,7 +348,7 @@ def __init__(self, themes, categories, engines, plugins):
}
),
'method': EnumStringSetting(
- 'POST',
+ settings['server'].get('method', 'POST'),
choices=('GET', 'POST')
),
'safesearch': MapSetting(
diff --git a/searx/settings.yml b/searx/settings.yml
index 63685be8bf..68fd0ee6f5 100644
--- a/searx/settings.yml
+++ b/searx/settings.yml
@@ -16,6 +16,7 @@ server:
base_url : False # Set custom base_url. Possible values: False or "https://your.custom.host/location/"
image_proxy : False # Proxying image results through searx
http_protocol_version : "1.0" # 1.0 and 1.1 are supported
+ method: "POST" # POST queries are more secure as they don't show up in history but may cause problems when using Firefox containers
ui:
static_path : "" # Custom static path - leave it blank if you didn't change
|
optuna__optuna-5055 | Use `__future__.annotations` everywhere in the Optuna code base
### Motivation
Optuna drops Python 3.6 from v3.1, so we can use `__future__.annotations`, which simplifies the code base. See [PEP 563](https://peps.python.org/pep-0563/), [PEP 584](https://peps.python.org/pep-0584/), [PEP 585](https://peps.python.org/pep-0585/), and [PEP 604](https://peps.python.org/pep-0604/) for more details. This issue suggests using the module to simplify the code base.
### Suggestion
Use `__future__.annotations` for each file and simplify the type annotations. The list of classes whose type annotations can be simplified is [here](https://peps.python.org/pep-0585/#implementation). The list of files where `__future__.annotations` can be used is as follows. To reduce review costs and to encourage more contributors to work on it, please, as a rule, fix one file per PR.
- [x] optuna/_convert_positional_args.py
- [x] optuna/visualization/_optimization_history.py
- [x] optuna/visualization/_hypervolume_history.py
- [x] optuna/visualization/_edf.py
- [x] optuna/visualization/_pareto_front.py
- [x] optuna/visualization/matplotlib/_optimization_history.py
- [x] optuna/visualization/matplotlib/_hypervolume_history.py
- [x] optuna/visualization/matplotlib/_edf.py
- [x] optuna/visualization/matplotlib/_pareto_front.py
- [x] optuna/visualization/matplotlib/_contour.py
- [x] optuna/visualization/_utils.py
- [x] optuna/logging.py
- [ ] optuna/storages/_base.py
- [ ] optuna/storages/_cached_storage.py
- [ ] optuna/storages/__init__.py
- [ ] optuna/storages/_heartbeat.py
- [ ] optuna/storages/_in_memory.py
- [ ] optuna/storages/_rdb/models.py
- [ ] optuna/storages/_rdb/storage.py
- [ ] optuna/storages/_rdb/alembic/versions/v3.0.0.c.py
- [ ] optuna/storages/_rdb/alembic/versions/v3.0.0.d.py
- [ ] optuna/storages/_rdb/alembic/versions/v3.0.0.a.py
- [ ] optuna/storages/_journal/file.py
- [ ] optuna/storages/_journal/redis.py
- [ ] optuna/storages/_journal/storage.py
- [ ] optuna/storages/_journal/base.py
- [ ] optuna/study/_dataframe.py
- [ ] optuna/study/_optimize.py
- [ ] optuna/study/_tell.py
- [ ] optuna/study/_multi_objective.py
- [ ] optuna/study/_frozen.py
- [ ] optuna/study/study.py
- [ ] optuna/study/_study_summary.py
- [ ] optuna/search_space/group_decomposed.py
- [ ] optuna/search_space/intersection.py
- [ ] optuna/_typing.py
- [ ] optuna/_deprecated.py
- [ ] optuna/pruners/_hyperband.py
- [ ] optuna/pruners/_patient.py
- [ ] optuna/pruners/_successive_halving.py
- [ ] optuna/pruners/_percentile.py
- [ ] optuna/pruners/_threshold.py
- [ ] optuna/trial/_base.py
- [ ] optuna/trial/_fixed.py
- [ ] optuna/trial/_trial.py
- [ ] optuna/trial/_frozen.py
- [ ] optuna/integration/cma.py
- [ ] optuna/integration/shap.py
- [ ] optuna/integration/lightgbm.py
- [ ] optuna/integration/pytorch_distributed.py
- [ ] optuna/integration/_lightgbm_tuner/optimize.py
- [ ] optuna/integration/_lightgbm_tuner/alias.py
- [ ] optuna/integration/mlflow.py
- [ ] optuna/integration/wandb.py
- [ ] optuna/integration/catboost.py
- [ ] optuna/integration/skopt.py
- [ ] optuna/integration/botorch.py
- [ ] optuna/integration/dask.py
- [x] optuna/integration/sklearn.py
- [ ] optuna/integration/tensorboard.py
- [ ] optuna/terminator/callback.py
- [ ] optuna/terminator/terminator.py
- [ ] optuna/terminator/improvement/_preprocessing.py
- [ ] optuna/terminator/improvement/gp/botorch.py
- [ ] optuna/terminator/improvement/gp/base.py
- [ ] optuna/terminator/improvement/evaluator.py
- [ ] optuna/importance/_base.py
- [ ] optuna/importance/_mean_decrease_impurity.py
- [ ] optuna/importance/__init__.py
- [ ] optuna/importance/_fanova/_fanova.py
- [ ] optuna/importance/_fanova/_evaluator.py
- [ ] optuna/importance/_fanova/_tree.py
- [ ] optuna/_imports.py
- [ ] optuna/testing/tempfile_pool.py
- [ ] optuna/testing/threading.py
- [ ] optuna/testing/distributions.py
- [ ] optuna/testing/samplers.py
- [ ] optuna/testing/storages.py
- [ ] optuna/distributions.py
- [ ] optuna/cli.py
- [ ] optuna/multi_objective/visualization/_pareto_front.py
- [ ] optuna/multi_objective/trial.py
- [ ] optuna/multi_objective/samplers/_base.py
- [ ] optuna/multi_objective/samplers/_nsga2.py
- [ ] optuna/multi_objective/samplers/_adapter.py
- [ ] optuna/multi_objective/samplers/_random.py
- [ ] optuna/multi_objective/samplers/_motpe.py
- [ ] optuna/multi_objective/study.py
- [ ] optuna/_experimental.py
- [ ] optuna/samplers/_base.py
- [ ] optuna/samplers/nsgaii/_crossovers/_undx.py
- [ ] optuna/samplers/nsgaii/_crossovers/_spx.py
- [ ] optuna/samplers/nsgaii/_crossovers/_sbx.py
- [ ] optuna/samplers/nsgaii/_crossovers/_vsbx.py
- [ ] optuna/samplers/nsgaii/_sampler.py
- [ ] optuna/samplers/nsgaii/_crossover.py
- [ ] optuna/samplers/_search_space/intersection.py
- [ ] optuna/samplers/_qmc.py
- [ ] optuna/samplers/_tpe/probability_distributions.py
- [ ] optuna/samplers/_tpe/_truncnorm.py
- [ ] optuna/samplers/_tpe/multi_objective_sampler.py
- [ ] optuna/samplers/_tpe/parzen_estimator.py
- [ ] optuna/samplers/_tpe/sampler.py
- [ ] optuna/samplers/_random.py
- [ ] optuna/samplers/_cmaes.py
- [ ] optuna/samplers/_partial_fixed.py
- [ ] optuna/samplers/_brute_force.py
- [ ] optuna/samplers/_nsgaiii.py
- [ ] optuna/samplers/_grid.py
- [ ] optuna/_hypervolume/wfg.py
- [ ] optuna/_hypervolume/hssp.py
- [ ] optuna/progress_bar.py
- [ ] optuna/_transform.py
- [ ] optuna/_callbacks.py
- [ ] tests/multi_objective_tests/test_study.py
- [ ] tests/multi_objective_tests/samplers_tests/test_motpe.py
- [ ] tests/multi_objective_tests/samplers_tests/test_nsga2.py
- [ ] tests/multi_objective_tests/test_trial.py
- [ ] tests/multi_objective_tests/visualization_tests/test_pareto_front.py
- [ ] tests/trial_tests/test_frozen.py
- [ ] tests/trial_tests/test_trials.py
- [ ] tests/trial_tests/test_trial.py
- [ ] tests/pruners_tests/test_percentile.py
- [ ] tests/pruners_tests/test_median.py
- [ ] tests/pruners_tests/test_patient.py
- [ ] tests/pruners_tests/test_successive_halving.py
- [ ] tests/study_tests/test_optimize.py
- [ ] tests/study_tests/test_study.py
- [ ] tests/hypervolume_tests/test_hssp.py
- [x] tests/integration_tests/test_skopt.py
- [x] tests/integration_tests/test_pytorch_lightning.py
- [ ] tests/integration_tests/test_shap.py
- [ ] tests/integration_tests/test_cma.py
- [ ] tests/integration_tests/test_pytorch_distributed.py
- [ ] tests/integration_tests/lightgbm_tuner_tests/test_optimize.py
- [ ] tests/integration_tests/lightgbm_tuner_tests/test_alias.py
- [ ] tests/integration_tests/test_botorch.py
- [ ] tests/integration_tests/test_mlflow.py
- [ ] tests/integration_tests/test_mxnet.py
- [ ] tests/integration_tests/test_wandb.py
- [ ] tests/importance_tests/fanova_tests/test_tree.py
- [ ] tests/importance_tests/test_mean_decrease_impurity.py
- [ ] tests/importance_tests/test_fanova.py
- [ ] tests/importance_tests/test_init.py
- [ ] tests/test_convert_positional_args.py
- [ ] tests/test_deprecated.py
- [ ] tests/storages_tests/test_journal.py
- [ ] tests/storages_tests/test_heartbeat.py
- [ ] tests/storages_tests/test_storages.py
- [ ] tests/storages_tests/rdb_tests/test_storage.py
- [ ] tests/storages_tests/rdb_tests/create_db.py
- [ ] tests/storages_tests/test_with_server.py
- [ ] tests/samplers_tests/test_grid.py
- [ ] tests/samplers_tests/tpe_tests/test_parzen_estimator.py
- [ ] tests/samplers_tests/tpe_tests/test_multi_objective_sampler.py
- [ ] tests/samplers_tests/tpe_tests/test_sampler.py
- [ ] tests/samplers_tests/test_cmaes.py
- [ ] tests/samplers_tests/test_samplers.py
- [x] tests/samplers_tests/test_nsgaii.py
- [x] tests/samplers_tests/test_nsgaiii.py
- [ ] tests/samplers_tests/test_qmc.py
- [ ] tests/test_distributions.py
- [ ] tests/test_multi_objective.py
- [ ] tests/test_cli.py
- [ ] tests/visualization_tests/test_hypervolume_history.py
- [ ] tests/visualization_tests/test_pareto_front.py
- [ ] tests/terminator_tests/improvement_tests/test_evaluator.py
- [ ] benchmarks/kurobako/problems/wfg/transformation_functions.py
- [ ] benchmarks/bayesmark/report_bayesmark.py
- [ ] benchmarks/bayesmark/optuna_optimizer.py
### Additional context (optional)
The above list is generated by the following script.
<details>
<summary>script</summary>
```python
import os
import pathlib

PATTERNS = [
    "from typing import Union",
    "from typing import Optional",
    "from typing import Tuple",
    "from typing import List",
    "from typing import Dict",
    "from typing import Set",
    "from typing import FrozenSet",
    "from typing import Type",
    "from typing import Sequence",
]


def get_filenames_to_be_simplified(dir_path):
    ret = []
    for f in os.listdir(dir_path):
        file_path = os.path.join(dir_path, f)
        if not os.path.isfile(file_path):
            ret.extend(get_filenames_to_be_simplified(file_path))
        else:
            try:
                with open(file_path) as fd:
                    contents = fd.read()
                if any(s in contents for s in PATTERNS):
                    ret.append(str(file_path))
            except UnicodeDecodeError:
                pass
    return ret


def main():
    dirs = ["optuna", "tests", "benchmarks"]
    for dir_name in dirs:
        filenames = get_filenames_to_be_simplified(pathlib.Path(dir_name))
        for filename in filenames:
            print(f"- [ ] {filename}")


if __name__ == "__main__":
    main()
```
</details>
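For context, the simplification this checklist tracks is replacing `typing.Optional`/`typing.Union` imports with PEP 604 union syntax. A minimal before/after sketch (hypothetical function names, illustrative only):

```python
from typing import Optional


def old_style(x: Optional[int]) -> Optional[str]:
    return None if x is None else str(x)


# After adding `from __future__ import annotations` at the top of a module,
# annotations may use the PEP 604 spelling even on pre-3.10 interpreters
# (shown here as strings so the sketch runs anywhere):
def new_style(x: "int | None") -> "str | None":
    return None if x is None else str(x)


print(old_style(3), new_style(None))  # 3 None
```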
| [
{
"content": "import abc\nfrom typing import Optional\n\nfrom optuna._experimental import experimental_class\nfrom optuna.study.study import Study\nfrom optuna.terminator.erroreval import BaseErrorEvaluator\nfrom optuna.terminator.erroreval import CrossValidationErrorEvaluator\nfrom optuna.terminator.erroreval ... | [
{
"content": "from __future__ import annotations\n\nimport abc\nfrom typing import Optional\n\nfrom optuna._experimental import experimental_class\nfrom optuna.study.study import Study\nfrom optuna.terminator.erroreval import BaseErrorEvaluator\nfrom optuna.terminator.erroreval import CrossValidationErrorEvalua... | diff --git a/optuna/terminator/callback.py b/optuna/terminator/callback.py
index 74c484d0e8..dc75f23aa4 100644
--- a/optuna/terminator/callback.py
+++ b/optuna/terminator/callback.py
@@ -1,3 +1,5 @@
+from __future__ import annotations
+
from typing import Optional
from optuna._experimental import experimental_class
diff --git a/optuna/terminator/terminator.py b/optuna/terminator/terminator.py
index 4970fd2661..7536079f0d 100644
--- a/optuna/terminator/terminator.py
+++ b/optuna/terminator/terminator.py
@@ -1,3 +1,5 @@
+from __future__ import annotations
+
import abc
from typing import Optional
|
piskvorky__gensim-2154 | ZeroDivisionError: float division by zero
Getting error : ZeroDivisionError: float division by zero
https://github.com/RaRe-Technologies/gensim/blob/9481915915bf61aa6e4e719a2f26d509677e6779/gensim/summarization/pagerank_weighted.py#L53

| [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Licensed under the GNU LGPL v2.1 - http://www.gnu.org/licenses/lgpl.html\n\n\"\"\"This module contains functions to find keywords of the text and building graph on tokens from text.\n\nExamples\n--------\nExtract keywords from text\n\n>>> from g... | [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Licensed under the GNU LGPL v2.1 - http://www.gnu.org/licenses/lgpl.html\n\n\"\"\"This module contains functions to find keywords of the text and building graph on tokens from text.\n\nExamples\n--------\nExtract keywords from text\n\n>>> from g... | diff --git a/gensim/summarization/keywords.py b/gensim/summarization/keywords.py
index 4074088a04..9f43158146 100644
--- a/gensim/summarization/keywords.py
+++ b/gensim/summarization/keywords.py
@@ -512,6 +512,9 @@ def keywords(text, ratio=0.2, words=None, split=False, scores=False, pos_filter=
_remove_unreachable_nodes(graph)
+ if not graph.edges():
+ return _format_results([], [], split, scores)
+
# Ranks the tokens using the PageRank algorithm. Returns dict of lemma -> score
pagerank_scores = _pagerank(graph)
diff --git a/gensim/test/test_keywords.py b/gensim/test/test_keywords.py
index c8fae400da..79df82fba6 100644
--- a/gensim/test/test_keywords.py
+++ b/gensim/test/test_keywords.py
@@ -95,6 +95,12 @@ def test_text_keywords_with_small_graph(self):
kwds = keywords(text, words=1, split=True)
self.assertTrue(len(kwds))
+ def test_text_keywords_without_graph_edges(self):
+ # regression test, we get graph with no edges on this text
+ text = 'Sitio construcción. Estaremos línea.'
+ kwds = keywords(text, deacc=False, scores=True)
+ self.assertFalse(len(kwds))
+
if __name__ == '__main__':
logging.basicConfig(format='%(asctime)s : %(levelname)s : %(message)s', level=logging.DEBUG)
|
vispy__vispy-2059 | Transforms broken in SceneCanvas with glfw backend
Hello. I'm using vispy 0.5.1 on Debian 8 with a PyQt5 backend. When I run my app, the SceneCanvas is stuck with an identity transform (which looks broken, obviously) until I resize the window or use a mouse key in the window. This also happens with the example https://github.com/vispy/vispy/blob/master/examples/demo/scene/oscilloscope.py when not using fullscreen.
My workaround was to call SceneCanvas' _update_transforms() right after creating the canvas.
| [
{
"content": "# -*- coding: utf-8 -*-\n# Copyright (c) Vispy Development Team. All Rights Reserved.\n# Distributed under the (new) BSD License. See LICENSE.txt for more info.\n\"\"\"vispy backend for glfw.\"\"\"\n\n# To install GLFW on Ubuntu, use sudo apt-get install libglfw3.\n# On OSX, consider using brew.\n... | [
{
"content": "# -*- coding: utf-8 -*-\n# Copyright (c) Vispy Development Team. All Rights Reserved.\n# Distributed under the (new) BSD License. See LICENSE.txt for more info.\n\"\"\"vispy backend for glfw.\"\"\"\n\n# To install GLFW on Ubuntu, use sudo apt-get install libglfw3.\n# On OSX, consider using brew.\n... | diff --git a/vispy/app/backends/_glfw.py b/vispy/app/backends/_glfw.py
index f662fbabd1..6c59d96dc9 100644
--- a/vispy/app/backends/_glfw.py
+++ b/vispy/app/backends/_glfw.py
@@ -289,6 +289,7 @@ def __init__(self, *args, **kwargs):
self._next_key_text = {}
self._vispy_canvas.set_current()
self._vispy_canvas.events.initialize()
+ self._on_resize(self._id, size[0], size[1])
def _vispy_warmup(self):
etime = time() + 0.25
|
bookwyrm-social__bookwyrm-1864 | Invalid table limit error
**Describe the bug**
When running a fresh dev instance I get an `Invalid table limit` error, coming from `initdb.py`. Not sure if something is broken in the latest main branch, or I need to update my configuration.
**To Reproduce**
Steps to reproduce the behavior:
1. fetch latest `main` branch
2. `./bw-dev resetdb`
3. Get error (see below)
**Expected behavior**
BookWyrm resets database and new install works without errors.
**Screenshots**
```
Applying sessions.0001_initial... OK
+ execweb python manage.py initdb
+ docker-compose exec web python manage.py initdb
Traceback (most recent call last):
File "/app/manage.py", line 18, in <module>
execute_from_command_line(sys.argv)
File "/usr/local/lib/python3.9/site-packages/django/core/management/__init__.py", line 419, in execute_from_command_line
utility.execute()
File "/usr/local/lib/python3.9/site-packages/django/core/management/__init__.py", line 413, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/usr/local/lib/python3.9/site-packages/django/core/management/base.py", line 354, in run_from_argv
self.execute(*args, **cmd_options)
File "/usr/local/lib/python3.9/site-packages/django/core/management/base.py", line 398, in execute
output = self.handle(*args, **options)
File "/app/bookwyrm/management/commands/initdb.py", line 168, in handle
raise Exception("Invalid table limit:", limit)
Exception: ('Invalid table limit:', None)
```
**Instance**
local development, current `main` branch.
**Additional context**
I initially started getting this error on a branch I was working on, but it's occurring on the latest `main` branch without any changes.
---
**Desktop (please complete the following information):**
- OS: MacOS
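The traceback comes from validating the table limit even when no limit was passed. A minimal sketch of the unguarded check and the truthiness guard that fixes it (hypothetical helper, not the actual `initdb.py`; table names abbreviated):

```python
def check_limit(limit, tables=("group", "user", "settings")):
    # Validate only when a limit was actually requested; the unguarded
    # `if limit not in tables:` rejects the default limit=None.
    if limit and limit not in tables:
        raise Exception("Invalid table limit:", limit)


check_limit(None)     # default case: no error
check_limit("group")  # valid table: no error
```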
| [
{
"content": "\"\"\" What you need in the database to make it work \"\"\"\nfrom django.core.management.base import BaseCommand\nfrom django.contrib.auth.models import Group, Permission\nfrom django.contrib.contenttypes.models import ContentType\n\nfrom bookwyrm import models\n\n\ndef init_groups():\n \"\"\"p... | [
{
"content": "\"\"\" What you need in the database to make it work \"\"\"\nfrom django.core.management.base import BaseCommand\nfrom django.contrib.auth.models import Group, Permission\nfrom django.contrib.contenttypes.models import ContentType\n\nfrom bookwyrm import models\n\n\ndef init_groups():\n \"\"\"p... | diff --git a/bookwyrm/management/commands/initdb.py b/bookwyrm/management/commands/initdb.py
index 37dd66af4d..b54055744e 100644
--- a/bookwyrm/management/commands/initdb.py
+++ b/bookwyrm/management/commands/initdb.py
@@ -164,7 +164,7 @@ def handle(self, *args, **options):
"settings",
"linkdomain",
]
- if limit not in tables:
+ if limit and limit not in tables:
raise Exception("Invalid table limit:", limit)
if not limit or limit == "group":
|
horovod__horovod-2651 | Wrong default for horovod.tensorflow.keras.allreduce(average...
In Horovod 0.21.1 the default for `average` in `allreduce` is still `True` leading to
> ValueError: The op parameter supersedes average. Please provide only one of them.
when using `op=...` (only).
This is only in `horovod.tensorflow.keras`, not in `horovod.tensorflow`.
BTW: In TF2, is there any benefit of using `horovod.tensorflow.keras` over `horovod.tensorflow` when not disabling eager execution (which in my tests is pretty much unfeasible)
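The quoted `ValueError` comes from a mutual-exclusion check between `op` and `average`. A sketch of why the default matters (hypothetical function; the error message is taken from the report):

```python
def resolve(op=None, average=None):
    # With average defaulting to True, every call passing only op=... would
    # trip this check; defaulting average to None restores the intent.
    if op is not None and average is not None:
        raise ValueError("The op parameter supersedes average. "
                         "Please provide only one of them.")
    return op if op is not None else ("avg" if average else "sum")


print(resolve(op="sum"))  # sum
```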
| [
{
"content": "# Copyright 2018 Uber Technologies, Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\... | [
{
"content": "# Copyright 2018 Uber Technologies, Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\... | diff --git a/CHANGELOG.md b/CHANGELOG.md
index 27d8ecad07..3033704106 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -22,6 +22,8 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
- Fixed `local_rank` support for Ray. ([#2596](https://github.com/horovod/horovod/pull/2596))
+- Fixed wrong default for horovod.tensorflow.keras.allreduce average ([#2627](https://github.com/horovod/horovod/pull/2627))
+
## [v0.21.1] - 2021-01-06
### Added
diff --git a/horovod/tensorflow/keras/__init__.py b/horovod/tensorflow/keras/__init__.py
index fdf4bca91d..aaf7ac66cd 100644
--- a/horovod/tensorflow/keras/__init__.py
+++ b/horovod/tensorflow/keras/__init__.py
@@ -119,7 +119,7 @@ def broadcast_global_variables(root_rank):
return _impl.broadcast_global_variables(K, root_rank)
-def allreduce(value, name=None, average=True,
+def allreduce(value, name=None, average=None,
prescale_factor=1.0,
postscale_factor=1.0,
op=None,
|
e-valuation__EvaP-1321 | Evaluation preview button visibility
As a teaching assistant, I might be a contributor to a given course and therefore get my own feedback in the main evaluation. If that course also has an exam evaluation, I see it listed on my "own evaluations" page with the option to preview the questionnaire. However, since I am not responsible for the course, I lack the access rights to preview the linked page, resulting in an error.
I would like either not to have the preview button (the page already knows while rendering that I am not a contributor, as shown by the corresponding icon next to the exam evaluation title) or to be given the rights to preview the questionnaire.
| [
{
"content": "from django.forms import TypedChoiceField\nfrom django.template import Library\n\nfrom evap.evaluation.models import BASE_UNIPOLAR_CHOICES\nfrom evap.evaluation.tools import STATES_ORDERED, STATE_DESCRIPTIONS\nfrom evap.rewards.tools import can_reward_points_be_used_by\nfrom evap.student.forms imp... | [
{
"content": "from django.forms import TypedChoiceField\nfrom django.template import Library\n\nfrom evap.evaluation.models import BASE_UNIPOLAR_CHOICES\nfrom evap.evaluation.tools import STATES_ORDERED, STATE_DESCRIPTIONS\nfrom evap.rewards.tools import can_reward_points_be_used_by\nfrom evap.student.forms imp... | diff --git a/evap/contributor/templates/contributor_index.html b/evap/contributor/templates/contributor_index.html
index 0769b1d562..04724bcecb 100644
--- a/evap/contributor/templates/contributor_index.html
+++ b/evap/contributor/templates/contributor_index.html
@@ -180,10 +180,12 @@
</a>
{% endif %}
{% endif %}
- <a href="{% url 'contributor:evaluation_preview' evaluation.id %}" class="btn btn-sm btn-light"
- data-toggle="tooltip" data-placement="top" title="{% trans 'Preview' %}">
- <span class="fas fa-eye"></span>
- </a>
+ {% if evaluation|is_user_responsible_or_contributor_or_delegate:user %}
+ <a href="{% url 'contributor:evaluation_preview' evaluation.id %}" class="btn btn-sm btn-light"
+ data-toggle="tooltip" data-placement="top" title="{% trans 'Preview' %}">
+ <span class="fas fa-eye"></span>
+ </a>
+ {% endif %}
{% elif evaluation.state != 'published' and evaluation.is_single_result %}
<div class="d-flex" data-toggle="tooltip" data-placement="left" title="{% trans 'You will receive an email when the results are published.' %}">
{% include 'distribution_bar_disabled.html' with icon="fas fa-hourglass-half" weight=evaluation.weight weight_sum=evaluation.course.evaluation_weight_sum %}
diff --git a/evap/evaluation/templatetags/evaluation_filters.py b/evap/evaluation/templatetags/evaluation_filters.py
index b582d17402..3dfcaae851 100644
--- a/evap/evaluation/templatetags/evaluation_filters.py
+++ b/evap/evaluation/templatetags/evaluation_filters.py
@@ -99,6 +99,10 @@ def is_user_editor_or_delegate(evaluation, user):
return evaluation.is_user_editor_or_delegate(user)
+@register.filter
+def is_user_responsible_or_contributor_or_delegate(evaluation, user):
+ return evaluation.is_user_responsible_or_contributor_or_delegate(user)
+
@register.filter
def message_class(level):
return {
|
mitmproxy__mitmproxy-1336 | Snapshots have weird filename due to py3
##### Steps to reproduce the problem:
1. Look at https://snapshots.mitmproxy.org/v0.18/
```
mitmproxy-0.18dev0636-0xg588dad1-py2-none-any.whl 26-Jun-2016 21:55 985K
mitmproxy-0.18dev0759-0xg22c0db3-win32.zip 09-Jul-2016 20:45 35M
mitmproxy-0.18dev0775-0xb'g6bb267c'-osx.tar.gz 10-Jul-2016 09:29 35M
mitmproxy-0.18dev0775-0xb'g6bb267c'-py2.py3-non..> 10-Jul-2016 09:28 986K
pathod-0.18dev0759-0xg22c0db3-win32.zip 09-Jul-2016 20:46 12M
pathod-0.18dev0775-0xb'g6bb267c'-osx.tar.gz 10-Jul-2016 09:28 13M
```
##### What is the expected behavior?
Have proper filename.
##### What went wrong?
Some files contain e.g. `0xb'g6bb267c'`, others contain `0xg22c0db3`. The `0x` part is ok, but the other thing looks like a bytes vs. str problem introduced by creating snapshots with Python 3.
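The `0xb'g6bb267c'` artifact is exactly what `str.format` produces when handed `bytes` on Python 3; a minimal reproduction:

```python
commit = b"g6bb267c"  # e.g. subprocess output from `git describe` (bytes)

print("0x{}".format(commit))           # 0xb'g6bb267c'  <- the broken filename
print("0x{}".format(commit.decode()))  # 0xg6bb267c     <- the intended one
```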
| [
{
"content": "#!/usr/bin/env python\nfrom __future__ import absolute_import, print_function, division\nfrom os.path import join\nimport contextlib\nimport os\nimport shutil\nimport subprocess\nimport re\nimport shlex\nimport runpy\nimport zipfile\nimport tarfile\nimport platform\nimport click\nimport pysftp\nim... | [
{
"content": "#!/usr/bin/env python\nfrom __future__ import absolute_import, print_function, division\nfrom os.path import join\nimport contextlib\nimport os\nimport shutil\nimport subprocess\nimport re\nimport shlex\nimport runpy\nimport zipfile\nimport tarfile\nimport platform\nimport click\nimport pysftp\nim... | diff --git a/release/rtool.py b/release/rtool.py
index 04e1249d07..4e43eaefd2 100755
--- a/release/rtool.py
+++ b/release/rtool.py
@@ -76,7 +76,7 @@ def get_snapshot_version():
return "{version}dev{tag_dist:04}-0x{commit}".format(
version=get_version(), # this should already be the next version
tag_dist=tag_dist,
- commit=commit
+ commit=commit.decode()
)
|
cloud-custodian__cloud-custodian-3597 | cli metrics subcommand and azure - throws errors
@kapilt what should the expected behavior be here?
```
(cloud-custodian) $ custodian metrics policies/policy.yml
2019-02-20 11:19:18,346: custodian.azure.session:INFO Creating session with Azure CLI Authentication
2019-02-20 11:19:18,347: custodian.azure.session:INFO Session using Subscription ID: <my sub redacted>
2019-02-20 11:19:18,347: custodian.commands:INFO Getting <Policy resource: azure.resourcegroup name: delete-empty-resource-groups region: > metrics
Traceback (most recent call last):
File "/Users/andyluong/Projects/forks/cloud-custodian/bin/custodian", line 11, in <module>
load_entry_point('c7n', 'console_scripts', 'custodian')()
File "/Users/andyluong/Projects/forks/cloud-custodian/c7n/cli.py", line 368, in main
command(config)
File "/Users/andyluong/Projects/forks/cloud-custodian/c7n/commands.py", line 136, in _load_policies
return f(options, list(policies))
File "/Users/andyluong/Projects/forks/cloud-custodian/c7n/commands.py", line 491, in metrics_cmd
data[p.name] = p.get_metrics(start, end, options.period)
File "/Users/andyluong/Projects/forks/cloud-custodian/c7n/policy.py", line 912, in get_metrics
return mode.get_metrics(start, end, period)
File "/Users/andyluong/Projects/forks/cloud-custodian/c7n/policy.py", line 170, in get_metrics
client = session.client('cloudwatch')
File "/Users/andyluong/Projects/forks/cloud-custodian/tools/c7n_azure/c7n_azure/session.py", line 148, in client
service_name, client_name = client.rsplit('.', 1)
ValueError: not enough values to unpack (expected 2, got 1)
```
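The unpack error at the bottom of the traceback reproduces without any cloud SDK: AWS client names like `cloudwatch` contain no dot, while the Azure session expects dotted `module.Class` names (the example name below is hypothetical):

```python
# Dotted Azure-style names split cleanly into two parts:
service_name, client_name = "compute.ComputeManagementClient".rsplit(".", 1)
print(service_name, client_name)  # compute ComputeManagementClient

# An AWS-style name has no dot, so rsplit yields a single element, and the
# two-name unpacking raises "not enough values to unpack (expected 2, got 1)":
print("cloudwatch".rsplit(".", 1))  # ['cloudwatch']
```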
| [
{
"content": "# Copyright 2015-2017 Capital One Services, LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless requi... | [
{
"content": "# Copyright 2015-2017 Capital One Services, LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless requi... | diff --git a/c7n/commands.py b/c7n/commands.py
index 58cd81041dc..5140d5f2ee8 100644
--- a/c7n/commands.py
+++ b/c7n/commands.py
@@ -487,6 +487,8 @@ def _metrics_get_endpoints(options):
@policy_command
def metrics_cmd(options, policies):
+ log.warning("metrics command is deprecated, and will be removed in future")
+ policies = [p for p in policies if p.provider_name == 'aws']
start, end = _metrics_get_endpoints(options)
data = {}
for p in policies:
|
google__mobly-518 | The kill signal param in `stop_standing_subprocess` is never used
https://github.com/google/mobly/blob/master/mobly/utils.py#L318
| [
{
"content": "# Copyright 2016 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicab... | [
{
"content": "# Copyright 2016 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicab... | diff --git a/mobly/utils.py b/mobly/utils.py
index b4f85362..a9f065cd 100644
--- a/mobly/utils.py
+++ b/mobly/utils.py
@@ -316,7 +316,7 @@ def start_standing_subprocess(cmd, shell=False):
return proc
-def stop_standing_subprocess(proc, kill_signal=signal.SIGTERM):
+def stop_standing_subprocess(proc):
"""Stops a subprocess started by start_standing_subprocess.
Before killing the process, we check if the process is running, if it has
|
DDMAL__CantusDB-1182 | Source create/edit: Provenance autocompletes should be icontains
...rather than istartswith. "Berne" should find "Abdij ban Berne", etc.
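A Django-free sketch of the difference between the two lookups, using the example string from the report:

```python
name = "Abdij ban Berne"
q = "Berne"

print(name.lower().startswith(q.lower()))  # False -> istartswith misses it
print(q.lower() in name.lower())           # True  -> icontains finds it
```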
| [
{
"content": "import csv\nfrom typing import Optional, Union\nfrom django.http.response import JsonResponse\nfrom django.http import HttpResponse, HttpResponseNotFound\nfrom django.shortcuts import render, redirect\nfrom django.urls.base import reverse\nfrom articles.models import Article\nfrom main_app.models ... | [
{
"content": "import csv\nfrom typing import Optional, Union\nfrom django.http.response import JsonResponse\nfrom django.http import HttpResponse, HttpResponseNotFound\nfrom django.shortcuts import render, redirect\nfrom django.urls.base import reverse\nfrom articles.models import Article\nfrom main_app.models ... | diff --git a/django/cantusdb_project/main_app/views/views.py b/django/cantusdb_project/main_app/views/views.py
index 61d06a15c..cbcba9c17 100644
--- a/django/cantusdb_project/main_app/views/views.py
+++ b/django/cantusdb_project/main_app/views/views.py
@@ -1063,7 +1063,7 @@ def get_queryset(self):
return Provenance.objects.none()
qs = Provenance.objects.all().order_by("name")
if self.q:
- qs = qs.filter(name__istartswith=self.q)
+ qs = qs.filter(name__icontains=self.q)
return qs
|
microsoft__ptvsd-362 | PTVSD fails to run on windows
```
Traceback (most recent call last):
File "C:\Users\karth\.vscode\extensions\ms-python.python-2018.3.1\pythonFiles\experimental\ptvsd_launcher.py", line 96,
in <module>
vspd.debug(filename, port_num, debug_id, debug_options, run_as)
File "c:\git\ptvsd\ptvsd\debugger.py", line 36, in debug
run(address, filename, *args, **kwargs)
File "c:\git\ptvsd\ptvsd\__main__.py", line 37, in run_file
run(argv, addr, **kwargs)
File "c:\git\ptvsd\ptvsd\__main__.py", line 85, in _run
daemon = _install(_pydevd, addr, **kwargs)
File "c:\git\ptvsd\ptvsd\pydevd_hooks.py", line 52, in install
daemon = Daemon(**kwargs)
File "c:\git\ptvsd\ptvsd\daemon.py", line 53, in __init__
self.install_exit_handlers()
File "c:\git\ptvsd\ptvsd\daemon.py", line 91, in install_exit_handlers
signal.SIGHUP: [],
AttributeError: module 'signal' has no attribute 'SIGHUP'
```
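`signal.SIGHUP` is POSIX-only, so any handler table that names it must be guarded on Windows. A sketch of the pattern (hypothetical helper; the actual fix simply skips signal-handler installation on Windows):

```python
import platform
import signal


def exit_signals():
    # SIGHUP does not exist in the signal module on Windows, so referencing
    # it unconditionally raises AttributeError there, as in the traceback.
    if platform.system() == "Windows":
        return [signal.SIGTERM]
    return [signal.SIGHUP, signal.SIGINT, signal.SIGTERM]
```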
| [
{
"content": "import atexit\nimport os\nimport platform\nimport signal\nimport sys\n\nfrom ptvsd import wrapper\nfrom ptvsd.socket import close_socket\n\n\ndef _wait_on_exit():\n if sys.__stdout__ is not None:\n try:\n import msvcrt\n except ImportError:\n sys.__stdout__.w... | [
{
"content": "import atexit\nimport os\nimport platform\nimport signal\nimport sys\n\nfrom ptvsd import wrapper\nfrom ptvsd.socket import close_socket\n\n\ndef _wait_on_exit():\n if sys.__stdout__ is not None:\n try:\n import msvcrt\n except ImportError:\n sys.__stdout__.w... | diff --git a/ptvsd/daemon.py b/ptvsd/daemon.py
index 7ab4622d5..6dc4681fc 100644
--- a/ptvsd/daemon.py
+++ b/ptvsd/daemon.py
@@ -168,6 +168,9 @@ def handler():
self._atexit_handlers.append(handler)
def _add_signal_handlers(self):
+ if platform.system() == 'Windows':
+ return
+
def handler(signum, frame):
if not self._closed:
self.close()
|
opendatacube__datacube-core-262 | Error reading rainfall grids
### Expected behaviour
Return an xarray Dataset like the following:
```python
<xarray.Dataset>
Dimensions: (latitude: 1, longitude: 1, time: 366)
Coordinates:
* time (time) datetime64[ns] 2000-01-01 2000-01-02 2000-01-03 ...
* latitude (latitude) float64 -27.52
* longitude (longitude) float64 132.1
Data variables:
rainfall (time, latitude, longitude) float32 0.0 0.0 7.44684e-13 0.0 ...
Attributes:
crs: EPSG:4326
```
Data Cube, version 1.3.2
GDAL 2.1.3, released 2017/20/01
rasterio, version 1.0a8
And the conda environment at NCI is:
/g/data/v10/public/modules/agdc-py3-env/20170427
### Actual behaviour
Fails hard with the first file:
```python
Error opening source dataset: NETCDF:/g/data/rr5/agcd/0_05/rainfall/daily/2000/rr.2000010120000101.grid.nc:rain_day
```
Execution then continues on to access the crs of the object, which is None.
```python
/g/data/v10/public/modules/agdc-py3/1.5.0/lib/python3.6/site-packages/datacube/storage/storage.py in _rasterio_crs_wkt(src)
62 if str(rasterio.__version__) >= '0.36.0':
63 def _rasterio_crs_wkt(src):
---> 64 return str(src.crs.wkt)
65 else:
66 def _rasterio_crs_wkt(src):
AttributeError: 'NoneType' object has no attribute 'wkt'
```
### Steps to reproduce the behaviour
```python
import datacube
dc = datacube.Datacube()
rain = dc.load(product='bom_rainfall_grids', longitude=132.1, latitude=-27.5, time=('2000-1-1', '2001-1-1'))
```
### Environment information
* Which ``datacube --version`` are you using?
Open Data Cube core, version 1.5.0
* What datacube deployment/environment are you running against?
GDAL 2.2.1, released 2017/06/23
rasterio, version 1.0a9
The conda environment being used at NCI is:
/g/data/v10/public/modules/agdc-py3-env/20170710
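The `AttributeError` arises because newer rasterio returns `None` for `src.crs` on datasets without a CRS. A standalone sketch of the guard the fix adds, using a stand-in for the rasterio dataset object:

```python
class FakeSource:
    """Stand-in for a rasterio dataset whose file carries no CRS."""
    crs = None


def crs_wkt(src):
    # Guard before dereferencing: src.crs may be None instead of an
    # empty CRS object on rasterio >= 1.0a9.
    if src.crs:
        return str(src.crs.wkt)
    return ""  # no CRS on the file; callers fall back to overrides


print(repr(crs_wkt(FakeSource())))  # ''
```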
| [
{
"content": "# coding=utf-8\n\"\"\"\nCreate/store dataset data into storage units based on the provided storage mappings.\n\nImportant functions are:\n\n* :func:`reproject_and_fuse`\n* :func:`read_from_source`\n\n\"\"\"\nfrom __future__ import absolute_import, division, print_function\n\nimport logging\nimport... | [
{
"content": "# coding=utf-8\n\"\"\"\nCreate/store dataset data into storage units based on the provided storage mappings.\n\nImportant functions are:\n\n* :func:`reproject_and_fuse`\n* :func:`read_from_source`\n\n\"\"\"\nfrom __future__ import absolute_import, division, print_function\n\nimport logging\nimport... | diff --git a/datacube/storage/storage.py b/datacube/storage/storage.py
index 5c19d2d587..f525c9e612 100644
--- a/datacube/storage/storage.py
+++ b/datacube/storage/storage.py
@@ -61,7 +61,10 @@ def _rasterio_resampling_method(resampling):
if str(rasterio.__version__) >= '0.36.0':
def _rasterio_crs_wkt(src):
- return str(src.crs.wkt)
+ if src.crs:
+ return str(src.crs.wkt)
+ else:
+ return ''
else:
def _rasterio_crs_wkt(src):
return str(src.crs_wkt)
diff --git a/docs/about/whats_new.rst b/docs/about/whats_new.rst
index 70ac62026f..527b1aa025 100644
--- a/docs/about/whats_new.rst
+++ b/docs/about/whats_new.rst
@@ -5,8 +5,16 @@
What's New
==========
-v1.5.0 (????)
--------------
+v1.5.1 Purpler Unicorn (13 July 2017)
+-------------------------------------
+
+ - Fix bug #261. Unable to load Australian Rainfall Grid Data. This was as a
+ result of the CRS/Transformation override functionality being broken when
+ using the latest `rasterio` version `1.0a9`
+
+
+v1.5.0 Purple Unicorn (9 July 2017)
+-----------------------------------
Usability Improvements
~~~~~~~~~~~~~~~~~~~~~~
diff --git a/tests/conftest.py b/tests/conftest.py
index e35c66be13..55feb0c83c 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -20,6 +20,12 @@ def example_gdal_path(data_folder):
return str(os.path.join(data_folder, 'sample_tile_151_-29.tif'))
+@pytest.fixture
+def no_crs_gdal_path(data_folder):
+ """Return the pathname of a GDAL file that doesn't contain a valid CRS."""
+ return str(os.path.join(data_folder, 'no_crs_ds.tif'))
+
+
@pytest.fixture
def data_folder():
return os.path.join(os.path.split(os.path.realpath(__file__))[0], 'data')
diff --git a/tests/data/no_crs_ds.tif b/tests/data/no_crs_ds.tif
new file mode 100644
index 0000000000..a35f8b0ad7
Binary files /dev/null and b/tests/data/no_crs_ds.tif differ
diff --git a/tests/storage/test_storage.py b/tests/storage/test_storage.py
index 1e673d4d69..38ae3fc8fc 100644
--- a/tests/storage/test_storage.py
+++ b/tests/storage/test_storage.py
@@ -4,7 +4,7 @@
import mock
import netCDF4
-import numpy
+import numpy as np
import pytest
import rasterio.warp
import xarray
@@ -12,7 +12,7 @@
import datacube
from datacube.model import Dataset, DatasetType, MetadataType
-from datacube.storage.storage import OverrideBandDataSource
+from datacube.storage.storage import OverrideBandDataSource, RasterFileDataSource
from datacube.storage.storage import write_dataset_to_netcdf, reproject_and_fuse, read_from_source, Resampling, \
DatasetSource
from datacube.utils import geometry
@@ -30,7 +30,7 @@ def test_write_dataset_to_netcdf(tmpnetcdf_filename):
dataset[name] = (name, coord.values, {'units': coord.units, 'crs': geobox.crs})
dataset['B10'] = (geobox.dimensions,
- numpy.arange(10000, dtype='int16').reshape(geobox.shape),
+ np.arange(10000, dtype='int16').reshape(geobox.shape),
{'nodata': 0, 'units': '1', 'crs': geobox.crs})
write_dataset_to_netcdf(dataset, tmpnetcdf_filename, global_attributes={'foo': 'bar'},
@@ -57,7 +57,7 @@ def test_write_dataset_to_netcdf(tmpnetcdf_filename):
# dataset[name] = (name, coord.values, {'units': coord.units, 'crs': geobox.crs})
#
# dataset['B10'] = (geobox.dimensions,
-# numpy.arange(11000, dtype='int16').reshape(geobox.shape),
+# np.arange(11000, dtype='int16').reshape(geobox.shape),
# {'nodata': 0, 'units': '1', 'crs': geobox.crs})
#
# write_dataset_to_netcdf(dataset, tmpnetcdf_filename, global_attributes={'foo': 'bar'},
@@ -70,7 +70,7 @@ def test_write_dataset_to_netcdf(tmpnetcdf_filename):
# assert source.transform.almost_equals(affine)
# assert (source.read() == dataset['B10']).all()
#
-# dest = numpy.empty((60, 50))
+# dest = np.empty((60, 50))
# source.reproject(dest, affine, geobox.crs, 0, Resampling.nearest)
# assert (dest == dataset['B10'][:60, :50]).all()
#
@@ -80,7 +80,7 @@ def test_write_dataset_to_netcdf(tmpnetcdf_filename):
# source.reproject(dest, affine * Affine.translation(-10, -10), geobox.crs, 0, Resampling.nearest)
# assert (dest[10:, 10:] == dataset['B10'][:50, :40]).all()
#
-# dest = numpy.empty((200, 200))
+# dest = np.empty((200, 200))
# source.reproject(dest, affine, geobox.crs, 0, Resampling.nearest)
# assert (dest[:100, :110] == dataset['B10']).all()
#
@@ -111,7 +111,7 @@ def test_first_source_is_priority_in_reproject_and_fuse():
source2 = _mock_datasetsource([[2, 2], [2, 2]], crs=crs, shape=shape)
sources = [source1, source2]
- output_data = numpy.full(shape, fill_value=no_data, dtype='int16')
+ output_data = np.full(shape, fill_value=no_data, dtype='int16')
reproject_and_fuse(sources, output_data, dst_transform=identity, dst_projection=crs, dst_nodata=no_data)
assert (output_data == 1).all()
@@ -126,7 +126,7 @@ def test_second_source_used_when_first_is_empty():
source2 = _mock_datasetsource([[2, 2], [2, 2]], crs=crs, shape=shape)
sources = [source1, source2]
- output_data = numpy.full(shape, fill_value=no_data, dtype='int16')
+ output_data = np.full(shape, fill_value=no_data, dtype='int16')
reproject_and_fuse(sources, output_data, dst_transform=identity, dst_projection=crs, dst_nodata=no_data)
assert (output_data == 2).all()
@@ -141,7 +141,7 @@ def test_mixed_result_when_first_source_partially_empty():
source2 = _mock_datasetsource([[2, 2], [2, 2]], crs=crs, shape=shape)
sources = [source1, source2]
- output_data = numpy.full(shape, fill_value=no_data, dtype='int16')
+ output_data = np.full(shape, fill_value=no_data, dtype='int16')
reproject_and_fuse(sources, output_data, dst_transform=identity, dst_projection=crs, dst_nodata=no_data)
assert (output_data == [[1, 1], [2, 2]]).all()
@@ -154,7 +154,7 @@ def _mock_datasetsource(value, crs=None, shape=(2, 2)):
rio_reader.crs = crs
rio_reader.transform = identity
rio_reader.shape = shape
- rio_reader.read.return_value = numpy.array(value)
+ rio_reader.read.return_value = np.array(value)
# Use the following if a reproject were to be required
# def fill_array(dest, *args, **kwargs):
@@ -175,7 +175,7 @@ def test_read_from_broken_source():
rio_reader = source1.open.return_value.__enter__.return_value
rio_reader.read.side_effect = OSError('Read or write failed')
- output_data = numpy.full(shape, fill_value=no_data, dtype='int16')
+ output_data = np.full(shape, fill_value=no_data, dtype='int16')
# Check exception is raised
with pytest.raises(OSError):
@@ -212,17 +212,17 @@ def __init__(self):
self.nodata = -999
self.shape = (613, 597)
- self.data = numpy.full(self.shape, self.nodata, dtype='int16')
- self.data[:512, :512] = numpy.arange(512) + numpy.arange(512).reshape((512, 1))
+ self.data = np.full(self.shape, self.nodata, dtype='int16')
+ self.data[:512, :512] = np.arange(512) + np.arange(512).reshape((512, 1))
def read(self, window=None, out_shape=None):
data = self.data
if window:
data = self.data[slice(*window[0]), slice(*window[1])]
if out_shape:
- xidx = ((numpy.arange(out_shape[1]) + 0.5) * (data.shape[1] / out_shape[1]) - 0.5).round().astype('int')
- yidx = ((numpy.arange(out_shape[0]) + 0.5) * (data.shape[0] / out_shape[0]) - 0.5).round().astype('int')
- data = data[numpy.meshgrid(yidx, xidx, indexing='ij')]
+ xidx = ((np.arange(out_shape[1]) + 0.5) * (data.shape[1] / out_shape[1]) - 0.5).round().astype('int')
+ yidx = ((np.arange(out_shape[0]) + 0.5) * (data.shape[0] / out_shape[0]) - 0.5).round().astype('int')
+ data = data[np.meshgrid(yidx, xidx, indexing='ij')]
return data
def reproject(self, dest, dst_transform, dst_crs, dst_nodata, resampling, **kwargs):
@@ -239,7 +239,7 @@ def reproject(self, dest, dst_transform, dst_crs, dst_nodata, resampling, **kwar
def assert_same_read_results(source, dst_shape, dst_dtype, dst_transform, dst_nodata, dst_projection, resampling):
- expected = numpy.empty(dst_shape, dtype=dst_dtype)
+ expected = np.empty(dst_shape, dtype=dst_dtype)
with source.open() as src:
rasterio.warp.reproject(src.data,
expected,
@@ -251,7 +251,7 @@ def assert_same_read_results(source, dst_shape, dst_dtype, dst_transform, dst_no
dst_nodata=dst_nodata,
resampling=resampling)
- result = numpy.empty(dst_shape, dtype=dst_dtype)
+ result = np.empty(dst_shape, dtype=dst_dtype)
with datacube.set_options(reproject_threads=1):
read_from_source(source,
result,
@@ -260,7 +260,7 @@ def assert_same_read_results(source, dst_shape, dst_dtype, dst_transform, dst_no
dst_projection=dst_projection,
resampling=resampling)
- assert numpy.isclose(result, expected, atol=0, rtol=0.05, equal_nan=True).all()
+ assert np.isclose(result, expected, atol=0, rtol=0.05, equal_nan=True).all()
return result
@@ -420,8 +420,6 @@ def fake_open():
def test_read_raster_with_custom_crs_and_transform(example_gdal_path):
- import numpy as np
-
with rasterio.open(example_gdal_path) as src:
band = rasterio.band(src, 1)
crs = geometry.CRS('EPSG:3577')
@@ -444,6 +442,22 @@ def test_read_raster_with_custom_crs_and_transform(example_gdal_path):
assert (dest1 == dest2).all()
+def test_read_from_file_with_missing_crs(no_crs_gdal_path):
+ """
+ We need to be able to read from data files even when GDAL can't automatically gather all the metdata.
+
+ The :class:`RasterFileDataSource` is able to override the nodata, CRS and transform attributes if necessary.
+ """
+ crs = geometry.CRS('EPSG:4326')
+ nodata = -999
+ transform = Affine(0.01, 0.0, 111.975,
+ 0.0, 0.01, -9.975)
+ data_source = RasterFileDataSource(no_crs_gdal_path, bandnumber=1, nodata=nodata, crs=crs, transform=transform)
+ with data_source.open() as src:
+ dest1 = src.read()
+ assert dest1.shape == (10, 10)
+
+
_EXAMPLE_METADATA_TYPE = MetadataType(
{
'name': 'eo',
|
kserve__kserve-864 | explanations no longer working with 0.3.0
I am following these steps with 0.3.0 of kfserving: https://github.com/kubeflow/kfserving/tree/master/docs/samples/explanation/alibi/income
When I execute the curl request for explain, I get a 500 error and the container logs show the below. I'm guessing the [update to master](https://github.com/kubeflow/kfserving/pull/803) means that the explainer models have also been updated, and so they no longer work with 0.3.0 (the latest release version).
```
[E 200605 17:15:14 web:1792] Uncaught exception POST /v1/models/income:explain (127.0.0.1)
HTTPServerRequest(protocol='http', host='income-explainer-default.default.svc.cluster.local', method='POST', uri='/v1/models/income:explain', version='HTTP/1.1', remote_ip='127.0.0.1')
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/tornado/web.py", line 1701, in _execute
result = method(*self.path_args, **self.path_kwargs)
File "/kfserving/kfserving/handlers/http.py", line 61, in post
response = model.explain(request)
File "/alibiexplainer/alibiexplainer/explainer.py", line 74, in explain
explanation = self.wrapper.explain(request["instances"])
File "/alibiexplainer/alibiexplainer/anchor_tabular.py", line 89, in explain
anchor_exp = self.anchors_tabular.explain(arr[0], **self.kwargs)
File "/usr/local/lib/python3.7/site-packages/alibi/explainers/anchor_tabular.py", line 803, in explain
for sampler in self.samplers:
AttributeError: 'AnchorTabular' object has no attribute 'samplers'
[E 200605 17:15:14 web:2250] 500 POST /v1/models/income:explain (127.0.0.1) 58.80ms
[I 200605 17:18:22 anchor_tabular:83] Arr shape ((1, 12),)
[E 200605 17:18:22 web:1792] Uncaught exception POST /v1/models/income:explain (127.0.0.1)
HTTPServerRequest(protocol='http', host='income-explainer-default.default.svc.cluster.local', method='POST', uri='/v1/models/income:explain', version='HTTP/1.1', remote_ip='127.0.0.1')
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/tornado/web.py", line 1701, in _execute
result = method(*self.path_args, **self.path_kwargs)
File "/kfserving/kfserving/handlers/http.py", line 61, in post
response = model.explain(request)
File "/alibiexplainer/alibiexplainer/explainer.py", line 74, in explain
explanation = self.wrapper.explain(request["instances"])
File "/alibiexplainer/alibiexplainer/anchor_tabular.py", line 89, in explain
anchor_exp = self.anchors_tabular.explain(arr[0], **self.kwargs)
File "/usr/local/lib/python3.7/site-packages/alibi/explainers/anchor_tabular.py", line 803, in explain
for sampler in self.samplers:
AttributeError: 'AnchorTabular' object has no attribute 'samplers'
[E 200605 17:18:22 web:2250] 500 POST /v1/models/income:explain (127.0.0.1) 31.17ms
```
Presumably it would work on master. Does that sound right @cliveseldon? If so, maybe we should just close this.
| [
{
"content": "# Copyright 2019 kubeflow.org.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applica... | [
{
"content": "# Copyright 2019 kubeflow.org.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applica... | diff --git a/python/alibiexplainer.Dockerfile b/python/alibiexplainer.Dockerfile
index 762a305ef20..5ef19232bae 100644
--- a/python/alibiexplainer.Dockerfile
+++ b/python/alibiexplainer.Dockerfile
@@ -5,8 +5,5 @@ COPY kfserving kfserving
COPY third_party third_party
RUN pip install --upgrade pip && pip install -e ./kfserving
-RUN git clone https://github.com/SeldonIO/alibi.git && \
- cd alibi && \
- pip install .
RUN pip install -e ./alibiexplainer
ENTRYPOINT ["python", "-m", "alibiexplainer"]
diff --git a/python/alibiexplainer/setup.py b/python/alibiexplainer/setup.py
index 8eba40cdbb0..3591cb3e1b1 100644
--- a/python/alibiexplainer/setup.py
+++ b/python/alibiexplainer/setup.py
@@ -33,7 +33,7 @@
packages=find_packages("alibiexplainer"),
install_requires=[
"kfserving>=0.3.0",
- "alibi>=0.3",
+ "alibi==0.3.2",
"scikit-learn>=0.20.3",
"argparse>=1.4.0",
"requests>=2.22.0",
diff --git a/python/pytorch.Dockerfile b/python/pytorch.Dockerfile
index b8b80db83f2..60ac34895e8 100644
--- a/python/pytorch.Dockerfile
+++ b/python/pytorch.Dockerfile
@@ -11,7 +11,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
libpng-dev && \
rm -rf /var/lib/apt/lists/*
-RUN curl -o ~/miniconda.sh -O https://repo.continuum.io/miniconda/Miniconda3-4.2.12-Linux-x86_64.sh && \
+RUN curl -L -o ~/miniconda.sh -O https://repo.continuum.io/miniconda/Miniconda3-4.2.12-Linux-x86_64.sh && \
chmod +x ~/miniconda.sh && \
~/miniconda.sh -b -p /opt/conda && \
rm ~/miniconda.sh && \
|
celery__celery-5898 | Python 3.9 compatibility issue regarding usage of threading.Thread.isAlive
<!--
Please fill this template entirely and do not erase parts of it.
We reserve the right to close without a response
bug reports which are incomplete.
-->
# Checklist
<!--
To check an item on the list replace [ ] with [x].
-->
- [x] I have verified that the issue exists against the `master` branch of Celery.
- [ ] This has already been asked to the [discussion group](https://groups.google.com/forum/#!forum/celery-users) first.
- [ ] I have read the relevant section in the
[contribution guide](http://docs.celeryproject.org/en/latest/contributing.html#other-bugs)
on reporting bugs.
- [x] I have checked the [issues list](https://github.com/celery/celery/issues?q=is%3Aissue+label%3A%22Issue+Type%3A+Bug+Report%22+-label%3A%22Category%3A+Documentation%22)
for similar or identical bug reports.
- [x] I have checked the [pull requests list](https://github.com/celery/celery/pulls?q=is%3Apr+label%3A%22PR+Type%3A+Bugfix%22+-label%3A%22Category%3A+Documentation%22)
for existing proposed fixes.
- [x] I have checked the [commit log](https://github.com/celery/celery/commits/master)
to find out if the bug was already fixed in the master branch.
- [ ] I have included all related issues and possible duplicate issues
in this issue (If there are none, check this box anyway).
## Mandatory Debugging Information
- [x] I have verified that the issue exists against the `master` branch of Celery.
## Optional Debugging Information
`isAlive` was deprecated and removed in Python 3.9. Celery still uses it, so the deprecation warning will become an error on Python 3.9.
https://travis-ci.org/celery/celery/jobs/628813003#L3262-L3263
Relevant CPython PR: https://github.com/python/cpython/pull/15225
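The rename in the fix below is mechanical: `Thread.isAlive()` was only a camelCase alias of `Thread.is_alive()`, and the alias is gone in 3.9. A minimal check of the surviving spelling:

```python
import threading

t = threading.Thread(target=lambda: None)
t.start()
t.join()

# is_alive() is the spelling that exists on every supported Python;
# after join() the thread has finished, so it reports False.
print(t.is_alive())
```

On Python 3.9 and later, calling `t.isAlive()` instead raises `AttributeError`, which is exactly the failure the issue anticipates.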
| [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"Scheduler for Python functions.\n\n.. note::\n This is used for the thread-based worker only,\n not for amqp/redis/sqs/qpid where :mod:`kombu.asynchronous.timer` is used.\n\"\"\"\nfrom __future__ import absolute_import, print_function, unicode_literals\n\nimpor... | [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"Scheduler for Python functions.\n\n.. note::\n This is used for the thread-based worker only,\n not for amqp/redis/sqs/qpid where :mod:`kombu.asynchronous.timer` is used.\n\"\"\"\nfrom __future__ import absolute_import, print_function, unicode_literals\n\nimpor... | diff --git a/celery/utils/timer2.py b/celery/utils/timer2.py
index 58de4ac278b..87f29b36891 100644
--- a/celery/utils/timer2.py
+++ b/celery/utils/timer2.py
@@ -102,7 +102,7 @@ def stop(self):
self.running = False
def ensure_started(self):
- if not self.running and not self.isAlive():
+ if not self.running and not self.is_alive():
if self.on_start:
self.on_start(self)
self.start()
|
aio-libs__aiohttp-3295 | request.content.iter_chunks() stalls when content is empty.
## Long story short
The title says it all.
Shouldn't [EmptyStreamReader.readchunk](https://github.com/aio-libs/aiohttp/blob/master/aiohttp/streams.py#L470) on its face `return (b'', True)` and not `return (b'', False)`?
Without it, the special EOS chunk is never sent...
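To see why the flag matters, here is a toy reader (not aiohttp's actual classes) following the issue's reading of the `readchunk()` contract, where `(b'', True)` is the end-of-stream chunk; an iterator that waits for that chunk then terminates immediately on an empty stream:

```python
import asyncio

class EmptyReader:
    # Stand-in for EmptyStreamReader with the proposed return value.
    async def readchunk(self):
        return (b"", True)

async def iter_chunks(reader):
    # Toy version of iter_chunks(): yield until the EOS chunk arrives.
    while True:
        data, end_of_chunk = await reader.readchunk()
        if data == b"" and end_of_chunk:
            return
        yield data, end_of_chunk

async def main():
    chunks = [chunk async for chunk in iter_chunks(EmptyReader())]
    print(len(chunks))  # 0 chunks, loop exits cleanly

asyncio.run(main())
```

With the old `(b'', False)` return, the loop above would receive the same non-EOS empty chunk forever — the stall the issue describes.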
| [
{
"content": "import asyncio\nimport collections\nfrom typing import List # noqa\nfrom typing import Awaitable, Callable, Optional, Tuple\n\nfrom .base_protocol import BaseProtocol\nfrom .helpers import BaseTimerContext, set_exception, set_result\nfrom .log import internal_logger\n\n\ntry: # pragma: no cover\... | [
{
"content": "import asyncio\nimport collections\nfrom typing import List # noqa\nfrom typing import Awaitable, Callable, Optional, Tuple\n\nfrom .base_protocol import BaseProtocol\nfrom .helpers import BaseTimerContext, set_exception, set_result\nfrom .log import internal_logger\n\n\ntry: # pragma: no cover\... | diff --git a/CHANGES/3186.bugfix b/CHANGES/3186.bugfix
new file mode 100644
index 00000000000..e434938b3c1
--- /dev/null
+++ b/CHANGES/3186.bugfix
@@ -0,0 +1 @@
+Return empty bytes with end-of-chunk marker in empty stream reader.
diff --git a/aiohttp/streams.py b/aiohttp/streams.py
index 20f38f06039..7cdb97abc61 100644
--- a/aiohttp/streams.py
+++ b/aiohttp/streams.py
@@ -482,7 +482,7 @@ async def readany(self) -> bytes:
return b''
async def readchunk(self) -> Tuple[bytes, bool]:
- return (b'', False)
+ return (b'', True)
async def readexactly(self, n: int) -> bytes:
raise asyncio.streams.IncompleteReadError(b'', n)
diff --git a/tests/test_streams.py b/tests/test_streams.py
index 66a96a23379..fbb760c80d0 100644
--- a/tests/test_streams.py
+++ b/tests/test_streams.py
@@ -750,7 +750,7 @@ async def test_empty_stream_reader() -> None:
assert await s.read() == b''
assert await s.readline() == b''
assert await s.readany() == b''
- assert await s.readchunk() == (b'', False)
+ assert await s.readchunk() == (b'', True)
with pytest.raises(asyncio.IncompleteReadError):
await s.readexactly(10)
assert s.read_nowait() == b''
|
canonical__snapcraft-4329 | SNAPCRAFT_BUILD_ENVIRONMENT apparently takes precedence over "--use-lxd"
### Bug Description
Despite what is printed here, https://snapcraft.io/docs/build-providers, if I set the env var SNAPCRAFT_BUILD_ENVIRONMENT=host, that appears to take priority over using "--use-lxd" on the command line.
### To Reproduce
Set env var SNAPCRAFT_BUILD_ENVIRONMENT=host, then try overriding that with "--use-lxd".
### Environment
Ubuntu 22.04
### snapcraft.yaml
```shell
Not relevant.
```
### Relevant log output
```shell
Build is done on host in destructive mode.
```
### Additional context
_No response_
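The precedence bug is easiest to see as a boolean condition. This sketch (hypothetical helper name; logic taken from the PR diff below) shows the fixed ordering, where an explicit `--use-lxd` short-circuits the environment-variable escape hatch:

```python
import os

def runs_in_provider(use_lxd: bool, managed_mode: bool, destructive_mode: bool) -> bool:
    # Fixed logic: --use-lxd wins before the env var is even consulted.
    return use_lxd or (
        not managed_mode
        and not destructive_mode
        and os.getenv("SNAPCRAFT_BUILD_ENVIRONMENT") != "host"
    )

os.environ["SNAPCRAFT_BUILD_ENVIRONMENT"] = "host"
print(runs_in_provider(use_lxd=True, managed_mode=False, destructive_mode=False))   # True: flag wins
print(runs_in_provider(use_lxd=False, managed_mode=False, destructive_mode=False))  # False: env var honoured
```

Before the fix, the condition lacked the leading `use_lxd or`, so setting the variable silently forced a host build even with the flag present.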
| [
{
"content": "# -*- Mode:Python; indent-tabs-mode:nil; tab-width:4 -*-\n#\n# Copyright 2022-2023 Canonical Ltd.\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License version 3 as\n# published by the Free Software Foundation.\n#\n# This... | [
{
"content": "# -*- Mode:Python; indent-tabs-mode:nil; tab-width:4 -*-\n#\n# Copyright 2022-2023 Canonical Ltd.\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License version 3 as\n# published by the Free Software Foundation.\n#\n# This... | diff --git a/snapcraft/parts/lifecycle.py b/snapcraft/parts/lifecycle.py
index 3b70b876e1..6fb3ceb3c6 100644
--- a/snapcraft/parts/lifecycle.py
+++ b/snapcraft/parts/lifecycle.py
@@ -251,7 +251,7 @@ def _run_command(
permanent=True,
)
- if (
+ if parsed_args.use_lxd or (
not managed_mode
and not parsed_args.destructive_mode
and not os.getenv("SNAPCRAFT_BUILD_ENVIRONMENT") == "host"
diff --git a/tests/spread/providers/use-lxd/task.yaml b/tests/spread/providers/use-lxd/task.yaml
new file mode 100644
index 0000000000..e636b1e922
--- /dev/null
+++ b/tests/spread/providers/use-lxd/task.yaml
@@ -0,0 +1,29 @@
+summary: Test --use-lxd takes priority over environment variables
+systems:
+ - ubuntu-22.04*
+
+prepare: |
+ snapcraft init
+
+restore: |
+ rm -rf ./*.snap
+
+execute: |
+ export SNAPCRAFT_BUILD_ENVIRONMENT="host"
+
+ snapcraft pull --use-lxd
+
+ if [[ -d parts ]]; then
+ echo "snapcraft did not run inside a lxd instance"
+ exit 1
+ fi
+
+ unset SNAPCRAFT_BUILD_ENVIRONMENT
+ export SNAPCRAFT_MANAGED_MODE=1
+
+ snapcraft pull --use-lxd
+
+ if [[ -d parts ]]; then
+ echo "snapcraft did not run inside a lxd instance"
+ exit 1
+ fi
diff --git a/tests/unit/parts/test_lifecycle.py b/tests/unit/parts/test_lifecycle.py
index ed1d7b9c27..e770f2f7c3 100644
--- a/tests/unit/parts/test_lifecycle.py
+++ b/tests/unit/parts/test_lifecycle.py
@@ -306,11 +306,33 @@ def test_lifecycle_run_command_step(
assert run_mock.mock_calls == [call(step, **call_args)]
+@pytest.mark.parametrize("managed_mode", [True, False])
+@pytest.mark.parametrize("build_env", [None, "host", "multipass", "lxd", "other"])
@pytest.mark.parametrize("cmd", ["pack", "snap"])
-def test_lifecycle_run_command_pack(cmd, snapcraft_yaml, project_vars, new_dir, mocker):
+def test_lifecycle_run_local_destructive_mode(
+ managed_mode,
+ build_env,
+ cmd,
+ snapcraft_yaml,
+ project_vars,
+ new_dir,
+ mocker,
+ monkeypatch,
+):
+ """Run the lifecycle locally when destructive_mode is True."""
project = Project.unmarshal(snapcraft_yaml(base="core22"))
+ run_in_provider_mock = mocker.patch("snapcraft.parts.lifecycle._run_in_provider")
run_mock = mocker.patch("snapcraft.parts.PartsLifecycle.run")
pack_mock = mocker.patch("snapcraft.pack.pack_snap")
+ mocker.patch("snapcraft.utils.is_managed_mode", return_value=managed_mode)
+ mocker.patch(
+ "snapcraft.utils.get_managed_environment_home_path",
+ return_value=new_dir / "home",
+ )
+ if build_env:
+ monkeypatch.setenv("SNAPCRAFT_BUILD_ENVIRONMENT", build_env)
+ else:
+ monkeypatch.delenv("SNAPCRAFT_BUILD_ENVIRONMENT", raising=False)
parts_lifecycle._run_command(
cmd,
@@ -333,10 +355,11 @@ def test_lifecycle_run_command_pack(cmd, snapcraft_yaml, project_vars, new_dir,
),
)
+ assert run_in_provider_mock.mock_calls == []
assert run_mock.mock_calls == [call("prime", shell=False, shell_after=False)]
assert pack_mock.mock_calls[:1] == [
call(
- new_dir / "prime",
+ new_dir / "home/prime" if managed_mode else new_dir / "prime",
output=None,
compression="xz",
name="mytest",
@@ -346,10 +369,20 @@ def test_lifecycle_run_command_pack(cmd, snapcraft_yaml, project_vars, new_dir,
]
+@pytest.mark.parametrize("destructive_mode", [True, False])
+@pytest.mark.parametrize("build_env", [None, "host", "multipass", "lxd", "other"])
@pytest.mark.parametrize("cmd", ["pack", "snap"])
-def test_lifecycle_pack_destructive_mode(
- cmd, snapcraft_yaml, project_vars, new_dir, mocker
+def test_lifecycle_run_local_managed_mode(
+ destructive_mode,
+ build_env,
+ cmd,
+ snapcraft_yaml,
+ project_vars,
+ new_dir,
+ mocker,
+ monkeypatch,
):
+ """Run the lifecycle locally when managed_mode is True."""
project = Project.unmarshal(snapcraft_yaml(base="core22"))
run_in_provider_mock = mocker.patch("snapcraft.parts.lifecycle._run_in_provider")
run_mock = mocker.patch("snapcraft.parts.PartsLifecycle.run")
@@ -359,6 +392,10 @@ def test_lifecycle_pack_destructive_mode(
"snapcraft.utils.get_managed_environment_home_path",
return_value=new_dir / "home",
)
+ if build_env:
+ monkeypatch.setenv("SNAPCRAFT_BUILD_ENVIRONMENT", build_env)
+ else:
+ monkeypatch.delenv("SNAPCRAFT_BUILD_ENVIRONMENT", raising=False)
parts_lifecycle._run_command(
cmd,
@@ -372,7 +409,7 @@ def test_lifecycle_pack_destructive_mode(
output=None,
debug=False,
enable_manifest=False,
- destructive_mode=True,
+ destructive_mode=destructive_mode,
shell=False,
shell_after=False,
use_lxd=False,
@@ -395,17 +432,30 @@ def test_lifecycle_pack_destructive_mode(
]
+@pytest.mark.parametrize("managed_mode", [True, False])
+@pytest.mark.parametrize("destructive_mode", [True, False])
@pytest.mark.parametrize("cmd", ["pack", "snap"])
-def test_lifecycle_pack_managed(cmd, snapcraft_yaml, project_vars, new_dir, mocker):
+def test_lifecycle_run_local_build_env(
+ managed_mode,
+ destructive_mode,
+ cmd,
+ monkeypatch,
+ snapcraft_yaml,
+ project_vars,
+ new_dir,
+ mocker,
+):
+ """Run the lifecycle locally when the build environment is 'host'."""
project = Project.unmarshal(snapcraft_yaml(base="core22"))
run_in_provider_mock = mocker.patch("snapcraft.parts.lifecycle._run_in_provider")
run_mock = mocker.patch("snapcraft.parts.PartsLifecycle.run")
pack_mock = mocker.patch("snapcraft.pack.pack_snap")
- mocker.patch("snapcraft.utils.is_managed_mode", return_value=True)
+ mocker.patch("snapcraft.utils.is_managed_mode", return_value=managed_mode)
mocker.patch(
"snapcraft.utils.get_managed_environment_home_path",
return_value=new_dir / "home",
)
+ monkeypatch.setenv("SNAPCRAFT_BUILD_ENVIRONMENT", "host")
parts_lifecycle._run_command(
cmd,
@@ -422,7 +472,7 @@ def test_lifecycle_pack_managed(cmd, snapcraft_yaml, project_vars, new_dir, mock
build_for=None,
enable_manifest=False,
manifest_image_information=None,
- destructive_mode=False,
+ destructive_mode=destructive_mode,
shell=False,
shell_after=False,
use_lxd=False,
@@ -435,7 +485,7 @@ def test_lifecycle_pack_managed(cmd, snapcraft_yaml, project_vars, new_dir, mock
assert run_mock.mock_calls == [call("prime", shell=False, shell_after=False)]
assert pack_mock.mock_calls[:1] == [
call(
- new_dir / "home/prime",
+ new_dir / "home/prime" if managed_mode else new_dir / "prime",
output=None,
compression="xz",
name="mytest",
@@ -445,12 +495,21 @@ def test_lifecycle_pack_managed(cmd, snapcraft_yaml, project_vars, new_dir, mock
]
+@pytest.mark.parametrize("build_env", [None, "lxd", "multipass", "other"])
@pytest.mark.parametrize("cmd", ["pack", "snap"])
-def test_lifecycle_pack_not_managed(cmd, snapcraft_yaml, new_dir, mocker):
+def test_lifecycle_run_in_provider_by_default(
+ build_env, cmd, snapcraft_yaml, new_dir, mocker, monkeypatch
+):
+ """Run lifecycle in a provider when not in managed_mode, not in destructive_mode,
+ and the build environment is not 'host'."""
project = Project.unmarshal(snapcraft_yaml(base="core22"))
run_in_provider_mock = mocker.patch("snapcraft.parts.lifecycle._run_in_provider")
run_mock = mocker.patch("snapcraft.parts.PartsLifecycle.run")
mocker.patch("snapcraft.utils.is_managed_mode", return_value=False)
+ if build_env:
+ monkeypatch.setenv("SNAPCRAFT_BUILD_ENVIRONMENT", build_env)
+ else:
+ monkeypatch.delenv("SNAPCRAFT_BUILD_ENVIRONMENT", raising=False)
parts_lifecycle._run_command(
cmd,
@@ -484,6 +543,78 @@ def test_lifecycle_pack_not_managed(cmd, snapcraft_yaml, new_dir, mocker):
]
+@pytest.mark.parametrize("managed_mode", [True, False])
+@pytest.mark.parametrize("destructive_mode", [True, False])
+@pytest.mark.parametrize("build_env", [None, "host", "lxd", "multipass", "other"])
+@pytest.mark.parametrize("cmd", ["pack", "snap"])
+def test_lifecycle_run_in_provider_use_lxd(
+ managed_mode,
+ destructive_mode,
+ build_env,
+ cmd,
+ mocker,
+ monkeypatch,
+ new_dir,
+ project_vars,
+ snapcraft_yaml,
+):
+ """Run the lifecycle in a provider when `use_lxd` is true."""
+ project = Project.unmarshal(snapcraft_yaml(base="core22"))
+ run_in_provider_mock = mocker.patch("snapcraft.parts.lifecycle._run_in_provider")
+ run_mock = mocker.patch("snapcraft.parts.PartsLifecycle.run")
+ mocker.patch("snapcraft.pack.pack_snap")
+ mocker.patch(
+ "snapcraft.utils.get_managed_environment_home_path",
+ return_value=new_dir / "home",
+ )
+ mocker.patch("snapcraft.utils.is_managed_mode", return_value=managed_mode)
+ if build_env:
+ monkeypatch.setenv("SNAPCRAFT_BUILD_ENVIRONMENT", build_env)
+ else:
+ monkeypatch.delenv("SNAPCRAFT_BUILD_ENVIRONMENT", raising=False)
+
+ parts_lifecycle._run_command(
+ cmd,
+ project=project,
+ parse_info={},
+ assets_dir=Path(),
+ start_time=datetime.now(),
+ parallel_build_count=8,
+ parsed_args=argparse.Namespace(
+ directory=None,
+ output=None,
+ debug=False,
+ enable_manifest=False,
+ destructive_mode=destructive_mode,
+ shell=False,
+ shell_after=False,
+ use_lxd=True,
+ ua_token=None,
+ parts=[],
+ ),
+ )
+
+ assert run_mock.mock_calls == []
+ assert run_in_provider_mock.mock_calls == [
+ call(
+ project,
+ cmd,
+ argparse.Namespace(
+ directory=None,
+ output=None,
+ debug=False,
+ enable_manifest=False,
+ destructive_mode=destructive_mode,
+ shell=False,
+ shell_after=False,
+ use_lxd=True,
+ ua_token=None,
+ parts=[],
+ ),
+ )
+ ]
+
+
@pytest.mark.parametrize("cmd", ["pack", "snap"])
def test_lifecycle_pack_metadata_error(cmd, snapcraft_yaml, new_dir, mocker):
project = Project.unmarshal(snapcraft_yaml(base="core22"))
@@ -1703,6 +1834,7 @@ def test_lifecycle_write_metadata(
parsed_args = argparse.Namespace(
debug=False,
destructive_mode=True,
+ use_lxd=False,
enable_manifest=True,
ua_token=None,
parts=[],
|
pypa__pip-1390 | pip doesn't detect a venv created virtual environment as a virtual environment
The venv integration in Python 3.4 fails if PIP_REQUIREVIRTUALENV is set (http://bugs.python.org/issue19734)
I'm currently working around this by forcibly clearing the setting in the test, but the PIP_REQUIREVIRTUALENV check should pass when sys.prefix and sys.base_prefix are different.
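PEP 405 venvs (Python 3.3+) don't set `sys.real_prefix` — that is virtualenv's marker; instead, `sys.prefix` diverges from `sys.base_prefix`. A detection helper covering both cases, mirroring the fix below:

```python
import sys

def running_under_virtualenv() -> bool:
    # Classic virtualenv marks itself with sys.real_prefix.
    if hasattr(sys, "real_prefix"):
        return True
    # PEP 405 venv (python -m venv): prefix differs from base_prefix.
    # getattr keeps this safe on interpreters predating base_prefix.
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)

print(running_under_virtualenv())
```

The printed value depends on where the interpreter runs, so no expected output is shown here.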
| [
{
"content": "\"\"\"Locations where we look for configs, install stuff, etc\"\"\"\n\nimport sys\nimport site\nimport os\nimport tempfile\nfrom distutils.command.install import install, SCHEME_KEYS\nimport getpass\nfrom pip.backwardcompat import get_python_lib, get_path_uid, user_site\nimport pip.exceptions\n\n\... | [
{
"content": "\"\"\"Locations where we look for configs, install stuff, etc\"\"\"\n\nimport sys\nimport site\nimport os\nimport tempfile\nfrom distutils.command.install import install, SCHEME_KEYS\nimport getpass\nfrom pip.backwardcompat import get_python_lib, get_path_uid, user_site\nimport pip.exceptions\n\n\... | diff --git a/pip/locations.py b/pip/locations.py
index 61699434665..1d402651689 100644
--- a/pip/locations.py
+++ b/pip/locations.py
@@ -34,7 +34,12 @@ def running_under_virtualenv():
Return True if we're running inside a virtualenv, False otherwise.
"""
- return hasattr(sys, 'real_prefix')
+ if hasattr(sys, 'real_prefix'):
+ return True
+ elif sys.prefix != getattr(sys, "base_prefix", sys.prefix):
+ return True
+
+ return False
def virtualenv_no_global():
|
paperless-ngx__paperless-ngx-3554 | [BUG] Mail rule action "Move to specified folder" not working
### Description
If mail rule action is "Move to specified folder", the mail.log contains:
```
[2023-06-06 23:50:00,229] [DEBUG] [paperless_mail] Processing mail account T-Online (Thorsten)
[2023-06-06 23:50:00,342] [DEBUG] [paperless_mail] GMAIL Label Support: False
[2023-06-06 23:50:00,343] [DEBUG] [paperless_mail] AUTH=PLAIN Support: False
[2023-06-06 23:50:00,389] [DEBUG] [paperless_mail] Account T-Online (Thorsten): Processing 1 rule(s)
[2023-06-06 23:50:00,395] [DEBUG] [paperless_mail] Rule T-Online (Thorsten).Paperless Consume Ordner: Selecting folder INBOX.Paperless
[2023-06-06 23:50:00,410] [DEBUG] [paperless_mail] Rule T-Online (Thorsten).Paperless Consume Ordner: Searching folder with criteria None
[2023-06-06 23:50:00,422] [ERROR] [paperless_mail] Rule T-Online (Thorsten).Paperless Consume Ordner: Error while processing rule: SEARCH command error: BAD [b'Error in IMAP command SEARCH: Unknown argument NONE (0.001 + 0.000 secs).']
Traceback (most recent call last):
File "/usr/src/paperless/src/paperless_mail/mail.py", line 495, in handle_mail_account
total_processed_files += self._handle_mail_rule(
File "/usr/src/paperless/src/paperless_mail/mail.py", line 562, in _handle_mail_rule
for message in messages:
File "/usr/local/lib/python3.9/site-packages/imap_tools/mailbox.py", line 170, in fetch
nums = tuple((reversed if reverse else iter)(self.numbers(criteria, charset)))[limit_range]
File "/usr/local/lib/python3.9/site-packages/imap_tools/mailbox.py", line 108, in numbers
search_result = self.client.search(charset, encoded_criteria)
File "/usr/local/lib/python3.9/imaplib.py", line 732, in search
typ, dat = self._simple_command(name, 'CHARSET', charset, *criteria)
File "/usr/local/lib/python3.9/imaplib.py", line 1230, in _simple_command
return self._command_complete(name, self._command(name, *args))
File "/usr/local/lib/python3.9/imaplib.py", line 1055, in _command_complete
raise self.error('%s command error: %s %s' % (name, typ, data))
imaplib.IMAP4.error: SEARCH command error: BAD [b'Error in IMAP command SEARCH: Unknown argument NONE (0.001 + 0.000 secs).']
```
It seems that this action is the only one not working, since it's the only one with `Searching folder with criteria None`, and this seems to be the root cause for the error.
If mail rule action is set to e.g "Flag the mail, don't process flaged mails", then the mail.log looks fine:
```
[2023-06-06 23:40:00,273] [DEBUG] [paperless_mail] Processing mail account T-Online (Thorsten)
[2023-06-06 23:40:00,425] [DEBUG] [paperless_mail] GMAIL Label Support: False
[2023-06-06 23:40:00,426] [DEBUG] [paperless_mail] AUTH=PLAIN Support: False
[2023-06-06 23:40:00,468] [DEBUG] [paperless_mail] Account T-Online (Thorsten): Processing 1 rule(s)
[2023-06-06 23:40:00,474] [DEBUG] [paperless_mail] Rule T-Online (Thorsten).Paperless Consume Ordner: Selecting folder INBOX.Paperless
[2023-06-06 23:40:00,489] [DEBUG] [paperless_mail] Rule T-Online (Thorsten).Paperless Consume Ordner: Searching folder with criteria (UNKEYWORD PaperlessConsumed)
[2023-06-06 23:40:00,502] [DEBUG] [paperless_mail] Rule T-Online (Thorsten).Paperless Consume Ordner: Processed 0 matching mail(s)
```
### Steps to reproduce
1. Create mail rule, choose action "Move to specified folder"
2. Wait ten minutes
3. Check mail.log
### Webserver logs
```bash
See error description
```
### Browser logs
_No response_
### Paperless-ngx version
1.15.0
### Host OS
Synology
### Installation method
Docker - official image
### Browser
_No response_
### Configuration changes
_No response_
### Other
_No response_
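The root cause is that with no filters configured, the rule's search criteria ends up as `None`, which is then forwarded to the IMAP `SEARCH` command. A simplified sketch of the fix (a plain dict merge stands in for `imap_tools.AND`):

```python
def make_criterias(rule_query, criterias):
    # Simplified from paperless_mail.mail.make_criterias: when neither the
    # rule query nor the computed criteria add constraints, fall back to
    # the IMAP catch-all "ALL" instead of returning None.
    if isinstance(rule_query, dict):
        if rule_query or criterias:
            return {**rule_query, **criterias}
        return "ALL"
    return (rule_query, criterias)

# A "move to specified folder" rule with no filters previously hit the
# empty/empty case and produced None here.
print(make_criterias({}, {}))  # ALL
```

`"ALL"` is a valid IMAP search key matching every message, so the folder is searched without constraints rather than crashing.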
| [
{
"content": "import datetime\nimport itertools\nimport logging\nimport os\nimport tempfile\nimport traceback\nfrom datetime import date\nfrom datetime import timedelta\nfrom fnmatch import fnmatch\nfrom typing import Dict\nfrom typing import List\nfrom typing import Union\n\nimport magic\nimport pathvalidate\n... | [
{
"content": "import datetime\nimport itertools\nimport logging\nimport os\nimport tempfile\nimport traceback\nfrom datetime import date\nfrom datetime import timedelta\nfrom fnmatch import fnmatch\nfrom typing import Dict\nfrom typing import List\nfrom typing import Union\n\nimport magic\nimport pathvalidate\n... | diff --git a/src/paperless_mail/mail.py b/src/paperless_mail/mail.py
index b525ef91d3f..bfb306e5abc 100644
--- a/src/paperless_mail/mail.py
+++ b/src/paperless_mail/mail.py
@@ -384,6 +384,8 @@ def make_criterias(rule: MailRule, supports_gmail_labels: bool):
if isinstance(rule_query, dict):
if len(rule_query) or len(criterias):
return AND(**rule_query, **criterias)
+ else:
+ return "ALL"
else:
return AND(rule_query, **criterias)
diff --git a/src/paperless_mail/tests/test_mail.py b/src/paperless_mail/tests/test_mail.py
index e69dbbef8ff..82b874fd808 100644
--- a/src/paperless_mail/tests/test_mail.py
+++ b/src/paperless_mail/tests/test_mail.py
@@ -721,6 +721,31 @@ def test_handle_mail_account_move(self):
self.assertEqual(len(self.bogus_mailbox.messages), 2)
self.assertEqual(len(self.bogus_mailbox.messages_spam), 1)
+ def test_handle_mail_account_move_no_filters(self):
+ account = MailAccount.objects.create(
+ name="test",
+ imap_server="",
+ username="admin",
+ password="secret",
+ )
+
+ _ = MailRule.objects.create(
+ name="testrule",
+ account=account,
+ action=MailRule.MailAction.MOVE,
+ action_parameter="spam",
+ maximum_age=0,
+ )
+
+ self.assertEqual(len(self.bogus_mailbox.messages), 3)
+ self.assertEqual(len(self.bogus_mailbox.messages_spam), 0)
+
+ self.mail_account_handler.handle_mail_account(account)
+ self.apply_mail_actions()
+
+ self.assertEqual(len(self.bogus_mailbox.messages), 0)
+ self.assertEqual(len(self.bogus_mailbox.messages_spam), 3)
+
def test_handle_mail_account_tag(self):
account = MailAccount.objects.create(
name="test",
|
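The one-line fix in the diff above adds an explicit fallback when a rule has no query terms and no other criteria. A minimal sketch of that control flow in plain Python — `build_criteria` is a hypothetical stand-in for `make_criterias`, without the actual `imap_tools` query objects:

```python
def build_criteria(rule_query: dict, criterias: dict):
    # Mirrors the patched branch: with no query terms and no other
    # criteria, fall back to the IMAP catch-all search key "ALL"
    # instead of returning nothing (which made the search select no mail).
    if rule_query or criterias:
        return {**rule_query, **criterias}
    return "ALL"
```

With both inputs empty, the function now yields `"ALL"`, which matches the new `test_handle_mail_account_move_no_filters` expectation that a filter-less rule processes every message.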
ansible__ansible-modules-core-2723 | pip module "requirements" parameter documentation is incomplete.
For the "requirements" parameter, the comment "The path to a pip requirements file" is incomplete. I am left with the following questions (I am a very new Ansible user):
- Is this a local or remote path?
- If local, is there a way to refer to the path relatively? There doesn't appear to be, and if there is, it is not documented.
- If the path is local and must be absolute, that should be stated clearly instead of being inferred from the example (which uses an absolute path, making the role unmovable, which seems broken).
| [
{
"content": "#!/usr/bin/python\n# -*- coding: utf-8 -*-\n\n# (c) 2012, Matt Wright <matt@nobien.net>\n#\n# This file is part of Ansible\n#\n# Ansible is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, ... | [
{
"content": "#!/usr/bin/python\n# -*- coding: utf-8 -*-\n\n# (c) 2012, Matt Wright <matt@nobien.net>\n#\n# This file is part of Ansible\n#\n# Ansible is free software: you can redistribute it and/or modify\n# it under the terms of the GNU General Public License as published by\n# the Free Software Foundation, ... | diff --git a/packaging/language/pip.py b/packaging/language/pip.py
index 6d325282770..d896c5b9ed5 100755
--- a/packaging/language/pip.py
+++ b/packaging/language/pip.py
@@ -44,7 +44,8 @@
default: null
requirements:
description:
- - The path to a pip requirements file
+ - The path to a pip requirements file, which should be local to the remote system.
+ File can be specified as a relative path if using the chdir option.
required: false
default: null
virtualenv:
|
lutris__lutris-1723 | Saving game settings causes a traceback
```
Traceback (most recent call last):
File "/mnt/extrastorage/lutris/lutris/gui/lutriswindow.py", line 666, in on_game_updated
self.view.set_selected_game(game.id)
File "/mnt/extrastorage/lutris/lutris/gui/views/list.py", line 123, in set_selected_game
row = self.get_row_by_id(game_id, filtered=True)
AttributeError: 'GameListView' object has no attribute 'get_row_by_id'
```
`GameListView` does not seem to provide that method.
After the traceback occurs, Lutris will try to update the default wine prefix (`~/.wine`) rather than the correct one and must be restarted.
| [
{
"content": "# pylint: disable=no-member\nfrom gi.repository import Gtk, Pango\nfrom lutris import settings\nfrom lutris.gui.views.base import GameView\nfrom lutris.gui.views import (\n COL_NAME,\n COL_ICON,\n COL_YEAR,\n COL_RUNNER_HUMAN_NAME,\n COL_PLATFORM,\n COL_LASTPLAYED,\n COL_LASTP... | [
{
"content": "# pylint: disable=no-member\nfrom gi.repository import Gtk, Pango\nfrom lutris import settings\nfrom lutris.gui.views.base import GameView\nfrom lutris.gui.views import (\n COL_NAME,\n COL_ICON,\n COL_YEAR,\n COL_RUNNER_HUMAN_NAME,\n COL_PLATFORM,\n COL_LASTPLAYED,\n COL_LASTP... | diff --git a/lutris/gui/views/list.py b/lutris/gui/views/list.py
index 5f7e9b7f81..14ff5bd340 100644
--- a/lutris/gui/views/list.py
+++ b/lutris/gui/views/list.py
@@ -120,7 +120,7 @@ def select(self):
self.set_cursor(self.current_path[0])
def set_selected_game(self, game_id):
- row = self.get_row_by_id(game_id, filtered=True)
+ row = self.game_store.get_row_by_id(game_id, filtered=True)
if row:
self.set_cursor(row.path)
|
bridgecrewio__checkov-3226 | Parsing does not work with terraform files being encoded with UTF-8/BOM
**Example**
Create an empty file with Unix line endings and save it in UTF-8 encoding with a BOM. Running checkov will fail.
Removing the BOM resolves the issue.
**Stacktrace**
```
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/lark/lexer.py", line 536, in lex
token = self.root_lexer.next_token(lexer_state, parser_state)
File "/usr/local/lib/python3.8/dist-packages/lark/lexer.py", line 466, in next_token
raise UnexpectedCharacters(lex_state.text, line_ctr.char_pos, line_ctr.line, line_ctr.column,
lark.exceptions.UnexpectedCharacters: <unprintable UnexpectedCharacters object>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/checkov/terraform/parser.py", line 742, in _load_or_die_quietly
raw_data = hcl2.load(f)
```
| [
{
"content": "from __future__ import annotations\n\nimport json\nimport logging\nimport os\nimport re\nfrom collections.abc import Sequence\nfrom copy import deepcopy\nfrom json import dumps, loads\nfrom pathlib import Path\nfrom typing import Optional, Dict, Mapping, Set, Tuple, Callable, Any, List, Type\n\nim... | [
{
"content": "from __future__ import annotations\n\nimport json\nimport logging\nimport os\nimport re\nfrom collections.abc import Sequence\nfrom copy import deepcopy\nfrom json import dumps, loads\nfrom pathlib import Path\nfrom typing import Optional, Dict, Mapping, Set, Tuple, Callable, Any, List, Type\n\nim... | diff --git a/checkov/terraform/parser.py b/checkov/terraform/parser.py
index c5bb19dd0d..bfd0d7d036 100644
--- a/checkov/terraform/parser.py
+++ b/checkov/terraform/parser.py
@@ -735,7 +735,7 @@ def _load_or_die_quietly(
try:
logging.debug(f"Parsing {file_path}")
- with open(file_path, "r") as f:
+ with open(file_path, "r", encoding="utf-8-sig") as f:
if file_name.endswith(".json"):
return json.load(f)
else:
diff --git a/tests/terraform/parser/resources/file_bom/with_bom.tf b/tests/terraform/parser/resources/file_bom/with_bom.tf
new file mode 100644
index 0000000000..55a50a84bf
--- /dev/null
+++ b/tests/terraform/parser/resources/file_bom/with_bom.tf
@@ -0,0 +1,3 @@
+resource "aws_s3_bucket" "example" {
+ bucket = "example"
+}
diff --git a/tests/terraform/parser/resources/file_bom/without_bom.tf b/tests/terraform/parser/resources/file_bom/without_bom.tf
new file mode 100644
index 0000000000..a478590748
--- /dev/null
+++ b/tests/terraform/parser/resources/file_bom/without_bom.tf
@@ -0,0 +1,3 @@
+resource "aws_s3_bucket" "example" {
+ bucket = "example"
+}
diff --git a/tests/terraform/parser/test_parser_internals.py b/tests/terraform/parser/test_parser_internals.py
index fd13abdda3..ea80cceb94 100644
--- a/tests/terraform/parser/test_parser_internals.py
+++ b/tests/terraform/parser/test_parser_internals.py
@@ -1,9 +1,56 @@
-import unittest
+from pathlib import Path
-from checkov.terraform import parser
+from checkov.common.util.parser_utils import eval_string
+from checkov.terraform.parser import _load_or_die_quietly
-class TestParserInternals(unittest.TestCase):
- def test_eval_string_to_list(self):
- expected = ["a", "b", "c"]
- assert parser.eval_string('["a", "b", "c"]') == expected
+def test_eval_string_to_list():
+ # given
+ expected = ["a", "b", "c"]
+
+ # when
+ actual = eval_string('["a", "b", "c"]')
+
+ assert actual == expected
+
+
+def test__load_or_die_quietly_with_bom():
+ # given
+ test_file = Path(__file__).parent / "resources/file_bom/with_bom.tf"
+ parsing_errors = {}
+
+ # when
+ definition = _load_or_die_quietly(file=test_file, parsing_errors=parsing_errors)
+
+ # then
+ assert not parsing_errors
+ assert definition == {
+ "resource": [
+ {
+ "aws_s3_bucket": {
+ "example": {"bucket": ["example"], "__start_line__": 1, "__end_line__": 3},
+ },
+ }
+ ]
+ }
+
+
+def test__load_or_die_quietly_without_bom():
+ # given
+ test_file = Path(__file__).parent / "resources/file_bom/without_bom.tf"
+ parsing_errors = {}
+
+ # when
+ definition = _load_or_die_quietly(file=test_file, parsing_errors=parsing_errors)
+
+ # then
+ assert not parsing_errors
+ assert definition == {
+ "resource": [
+ {
+ "aws_s3_bucket": {
+ "example": {"bucket": ["example"], "__start_line__": 1, "__end_line__": 3},
+ },
+ }
+ ]
+ }
|
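The `utf-8-sig` codec used in the fix transparently strips a leading byte-order mark on read, while plain `utf-8` leaves the BOM in the text as `\ufeff` and trips up the HCL parser. A small standalone demonstration (the temporary file here is illustrative, not part of the checkov test suite):

```python
import os
import tempfile

# Write a file that starts with a UTF-8 BOM, as some Windows editors do.
fd, path = tempfile.mkstemp(suffix=".tf")
with os.fdopen(fd, "w", encoding="utf-8-sig") as f:
    f.write('resource "aws_s3_bucket" "example" {}\n')

# Plain utf-8 keeps the BOM as a \ufeff character at the start...
with open(path, encoding="utf-8") as f:
    raw = f.read()

# ...while utf-8-sig strips it before the parser ever sees it.
with open(path, encoding="utf-8-sig") as f:
    clean = f.read()

os.remove(path)
```

Since `utf-8-sig` also decodes files that have no BOM at all, switching `_load_or_die_quietly` to it is safe for both variants, which is what the paired `with_bom.tf` / `without_bom.tf` tests verify.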
celery__celery-4203 | -u option does not exist
## Steps to reproduce
Start -> celery -A application worker -l info
## Actual behavior
RuntimeWarning: You're running the worker with superuser privileges: this is
absolutely not recommended!
Please specify a different user using the -u option.
User information: uid=0 euid=0 gid=0 egid=0
uid=uid, euid=euid, gid=gid, egid=egid
## Fixes
When displaying the help menu -> celery -A application worker -l info -h
There is no longer a -u option, and the warning should be changed to reference the --uid / --gid options instead.
| [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"Platforms.\n\nUtilities dealing with platform specifics: signals, daemonization,\nusers, groups, and so on.\n\"\"\"\nfrom __future__ import absolute_import, print_function, unicode_literals\n\nimport atexit\nimport errno\nimport math\nimport numbers\nimport os\nimpor... | [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"Platforms.\n\nUtilities dealing with platform specifics: signals, daemonization,\nusers, groups, and so on.\n\"\"\"\nfrom __future__ import absolute_import, print_function, unicode_literals\n\nimport atexit\nimport errno\nimport math\nimport numbers\nimport os\nimpor... | diff --git a/celery/platforms.py b/celery/platforms.py
index bd7ae58ea9f..7620e1d8210 100644
--- a/celery/platforms.py
+++ b/celery/platforms.py
@@ -86,7 +86,7 @@
You're running the worker with superuser privileges: this is
absolutely not recommended!
-Please specify a different user using the -u option.
+Please specify a different user using the --uid option.
User information: uid={uid} euid={euid} gid={gid} egid={egid}
"""
|
facebookresearch__CompilerGym-592 | Is running CompilerGym intended to leave cache directories behind?
## ❓ Questions and Help
Not sure if this is a bug or not, so submitting as a question. Running a CompilerGym experiment leaves behind many cache directories. When running a large experiment, this can create problems through the sheer number of directories in `COMPILER_GYM_CACHE`. I expected the `COMPILER_GYM_CACHE` to not have anything after the experiment exited cleanly.
Is there a way to avoid the experiments leaving the directories behind?
## Steps to reproduce
Running the following on my machine leaves behind about 270 cache directories.
```python
import compiler_gym
import compiler_gym.wrappers
from ray import tune
from ray.rllib.agents.ppo import PPOTrainer
def make_env(env_config):
env = compiler_gym.make(env_config['cgym_id'])
env = compiler_gym.wrappers.TimeLimit(env, env_config['timelimit'])
dataset = env.datasets[env_config['dataset']]
env = compiler_gym.wrappers.CycleOverBenchmarks(
env, dataset.benchmarks())
return env
config = {
"env_config": {
"cgym_id": "llvm-autophase-ic-v0",
"timelimit": 45,
"dataset": "benchmark://cbench-v1",
},
"env": "CompilerGym",
}
stop = {
"timesteps_total": 10_000,
}
tune.register_env("CompilerGym", make_env)
tune.run(
PPOTrainer,
config=config,
stop=stop,
name='cgym_cache_dir_demo',
)
```
## Environment
Please fill in this checklist:
- CompilerGym: 0.2.2
- How you installed CompilerGym (conda, pip, source): pip
- OS: Ubuntu 20.04.1 LTS (x86_64)
- Python version: 3.9.7
- Build command you used (if compiling from source): N/A
- GCC/clang version (if compiling from source): N/A
- Bazel version (if compiling from source): N/A
- Versions of any other relevant libraries: ray: 1.10.0, gym: 0.20.0
| [
{
"content": "#! /usr/bin/env python3\n#\n# Copyright (c) Facebook, Inc. and its affiliates.\n#\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\n\"\"\"An example CompilerGym service in python.\"\"\"\nimport os\nimport sys\nfrom concu... | [
{
"content": "#! /usr/bin/env python3\n#\n# Copyright (c) Facebook, Inc. and its affiliates.\n#\n# This source code is licensed under the MIT license found in the\n# LICENSE file in the root directory of this source tree.\n\"\"\"An example CompilerGym service in python.\"\"\"\nimport os\nimport sys\nfrom concu... | diff --git a/compiler_gym/envs/llvm/service/BUILD b/compiler_gym/envs/llvm/service/BUILD
index 6b1d21c6b..294f037e9 100644
--- a/compiler_gym/envs/llvm/service/BUILD
+++ b/compiler_gym/envs/llvm/service/BUILD
@@ -63,6 +63,7 @@ cc_binary(
name = "compiler_gym-llvm-service-prelinked",
srcs = ["RunService.cc"],
deps = [
+ ":BenchmarkFactory",
":LlvmSession",
"//compiler_gym/service/runtime:cc_runtime",
],
diff --git a/compiler_gym/envs/llvm/service/Benchmark.cc b/compiler_gym/envs/llvm/service/Benchmark.cc
index 0ad8be841..7d18e73a3 100644
--- a/compiler_gym/envs/llvm/service/Benchmark.cc
+++ b/compiler_gym/envs/llvm/service/Benchmark.cc
@@ -162,9 +162,12 @@ Benchmark::Benchmark(const std::string& name, std::unique_ptr<llvm::LLVMContext>
needsRecompile_(true) {}
void Benchmark::close() {
+ VLOG(3) << "Closing benchmark " << name() << " with scratch directory "
+ << scratchDirectory().string();
sys::error_code ec;
fs::remove_all(scratchDirectory(), ec);
CHECK(!ec) << "Failed to delete scratch directory: " << scratchDirectory().string();
+ VLOG(3) << "Closed benchmark " << name();
}
std::unique_ptr<Benchmark> Benchmark::clone(const fs::path& workingDirectory) const {
diff --git a/compiler_gym/envs/llvm/service/BenchmarkFactory.cc b/compiler_gym/envs/llvm/service/BenchmarkFactory.cc
index 1108f2069..638aa651e 100644
--- a/compiler_gym/envs/llvm/service/BenchmarkFactory.cc
+++ b/compiler_gym/envs/llvm/service/BenchmarkFactory.cc
@@ -48,6 +48,7 @@ void BenchmarkFactory::close() {
for (auto& entry : benchmarks_) {
entry.second.close();
}
+ benchmarks_.clear();
}
Status BenchmarkFactory::getBenchmark(const BenchmarkProto& benchmarkMessage,
diff --git a/compiler_gym/envs/llvm/service/CMakeLists.txt b/compiler_gym/envs/llvm/service/CMakeLists.txt
index c267b40d7..62f23f9f7 100644
--- a/compiler_gym/envs/llvm/service/CMakeLists.txt
+++ b/compiler_gym/envs/llvm/service/CMakeLists.txt
@@ -21,6 +21,7 @@ cg_cc_binary(
"RunService.cc"
DEPS
::LlvmSession
+ ::BenchmarkFactory
compiler_gym::service::runtime::cc_runtime
)
diff --git a/compiler_gym/envs/llvm/service/RunService.cc b/compiler_gym/envs/llvm/service/RunService.cc
index f271fd89a..508ae456e 100644
--- a/compiler_gym/envs/llvm/service/RunService.cc
+++ b/compiler_gym/envs/llvm/service/RunService.cc
@@ -2,6 +2,7 @@
//
// This source code is licensed under the MIT license found in the
// LICENSE file in the root directory of this source tree.
+#include "compiler_gym/envs/llvm/service/BenchmarkFactory.h"
#include "compiler_gym/envs/llvm/service/LlvmSession.h"
#include "compiler_gym/service/runtime/Runtime.h"
#include "llvm/InitializePasses.h"
@@ -59,5 +60,17 @@ void initLlvm() {
int main(int argc, char** argv) {
initLlvm();
- createAndRunCompilerGymService<LlvmSession>(argc, argv, usage);
+ const auto ret = createAndRunCompilerGymService<LlvmSession>(argc, argv, usage);
+
+ // NOTE(github.com/facebookresearch/CompilerGym/issues/582): We need to make
+ // sure that BenchmarkFactory::close() is called on the global singleton
+ // instance, so that the temporary scratch directories are tidied up.
+ //
+ // TODO(github.com/facebookresearch/CompilerGym/issues/591): Once the runtime
+ // has been refactored to support intra-session mutable state, this singleton
+ // can be replaced by a member variable that is closed on
+ // CompilerGymServiceContext::shutdown().
+ BenchmarkFactory::getSingleton(FLAGS_working_dir).close();
+
+ return ret;
}
diff --git a/compiler_gym/service/runtime/CreateAndRunCompilerGymServiceImpl.h b/compiler_gym/service/runtime/CreateAndRunCompilerGymServiceImpl.h
index 6a4f1b2c1..e22d6b85e 100644
--- a/compiler_gym/service/runtime/CreateAndRunCompilerGymServiceImpl.h
+++ b/compiler_gym/service/runtime/CreateAndRunCompilerGymServiceImpl.h
@@ -51,7 +51,7 @@ void setGrpcChannelOptions(grpc::ServerBuilder& builder);
// createAndRunCompilerGymServiceImpl(argc, argv, "usage string");
// }
template <typename CompilationSessionType>
-[[noreturn]] void createAndRunCompilerGymServiceImpl(int argc, char** argv, const char* usage) {
+[[nodiscard]] int createAndRunCompilerGymServiceImpl(int argc, char** argv, const char* usage) {
// Register a signal handler for SIGTERM that will set the shutdown_signal
// future value.
std::signal(SIGTERM, shutdown_handler);
@@ -62,7 +62,7 @@ template <typename CompilationSessionType>
gflags::ParseCommandLineFlags(&argc, &argv, /*remove_flags=*/true);
if (argc > 1) {
std::cerr << "ERROR: unknown command line argument '" << argv[1] << '\'';
- exit(1);
+ return 1;
}
// Set up the working and logging directories.
@@ -129,15 +129,16 @@ template <typename CompilationSessionType>
VLOG(2) << "Shutting down the RPC service";
server->Shutdown();
serverThread.join();
+ VLOG(2) << "Service closed";
if (service.sessionCount()) {
LOG(ERROR) << "ERROR: Killing a service with " << service.sessionCount()
<< (service.sessionCount() > 1 ? " active sessions!" : " active session!")
<< std::endl;
- exit(6);
+ return 6;
}
- exit(0);
+ return 0;
}
} // namespace compiler_gym::runtime
diff --git a/compiler_gym/service/runtime/Runtime.h b/compiler_gym/service/runtime/Runtime.h
index ef154bb1c..42d162eb2 100644
--- a/compiler_gym/service/runtime/Runtime.h
+++ b/compiler_gym/service/runtime/Runtime.h
@@ -20,20 +20,20 @@ namespace compiler_gym::runtime {
* #include "my_compiler_service/MyCompilationSession.h"
*
* int main(int argc, char** argv) {
- * createAndRunCompilerGymService<MyCompilationSession>(
+ * return createAndRunCompilerGymService<MyCompilationSession>(
* argc, argc, "My compiler service"
* );
* }
* \endcode
*
- * This function never returns.
- *
* @tparam CompilationSessionType A sublass of CompilationSession that provides
* implementations of the abstract methods.
+ *
+ * @return An integer return code.
*/
template <typename CompilationSessionType>
-[[noreturn]] void createAndRunCompilerGymService(int argc, char** argv, const char* usage) {
- createAndRunCompilerGymServiceImpl<CompilationSessionType>(argc, argv, usage);
+[[nodiscard]] int createAndRunCompilerGymService(int argc, char** argv, const char* usage) {
+ return createAndRunCompilerGymServiceImpl<CompilationSessionType>(argc, argv, usage);
}
} // namespace compiler_gym::runtime
diff --git a/compiler_gym/service/runtime/create_and_run_compiler_gym_service.py b/compiler_gym/service/runtime/create_and_run_compiler_gym_service.py
index 94e410465..3b53acde1 100644
--- a/compiler_gym/service/runtime/create_and_run_compiler_gym_service.py
+++ b/compiler_gym/service/runtime/create_and_run_compiler_gym_service.py
@@ -130,6 +130,7 @@ def main(argv):
logging.info("Shutting down the RPC service")
server.stop(60).wait()
server_thread.join()
+ logging.info("Service closed")
if len(service.sessions):
print(
diff --git a/examples/example_compiler_gym_service/service_cc/ExampleService.cc b/examples/example_compiler_gym_service/service_cc/ExampleService.cc
index 99cacb769..76de9635d 100644
--- a/examples/example_compiler_gym_service/service_cc/ExampleService.cc
+++ b/examples/example_compiler_gym_service/service_cc/ExampleService.cc
@@ -133,5 +133,5 @@ class ExampleCompilationSession final : public CompilationSession {
} // namespace
int main(int argc, char** argv) {
- runtime::createAndRunCompilerGymService<ExampleCompilationSession>(argc, argv, usage);
+ return runtime::createAndRunCompilerGymService<ExampleCompilationSession>(argc, argv, usage);
}
|
urllib3__urllib3-2843 | flaky and pytest-memray incompatible
### Subject
```
______________________________________________________________________________________________________ TestHTTPProxyManager.test_forwarding_proxy_request_timeout[https-https-True] ______________________________________________________________________________________________________
Traceback (most recent call last):
File "/home/graingert/projects/urllib3/.nox/test-3-11/lib/python3.11/site-packages/pytest_memray/plugin.py", line 122, in wrapper
result: object | None = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/graingert/projects/urllib3/.nox/test-3-11/lib/python3.11/site-packages/pytest_memray/plugin.py", line 121, in wrapper
with Tracker(result_file):
File "src/memray/_memray.pyx", line 404, in memray._memray.Tracker.__enter__
RuntimeError: No more than one Tracker instance can be active at the same time
```
caused by a flaky test:
```
===Flaky Test Report===
test_forwarding_proxy_request_timeout[https-https-True] failed (1 runs remaining out of 2).
<class 'AssertionError'>
assert <class 'urllib3.exceptions.ProxyError'> == ReadTimeoutError
+ where <class 'urllib3.exceptions.ProxyError'> = type(ProxyError('Unable to connect to proxy', ReadTimeoutError("HTTPSConnectionPool(host='240.0.0.0', port=443): Read timed out. (read timeout=0.01)")))
+ where ProxyError('Unable to connect to proxy', ReadTimeoutError("HTTPSConnectionPool(host='240.0.0.0', port=443): Read timed out. (read timeout=0.01)")) = MaxRetryError('HTTPSConnectionPool(host=\'240.0.0.0\', port=443): Max retries exceeded with url: https://240.0.0.0 (Caused by ProxyError(\'Unable to connect to proxy\', ReadTimeoutError("HTTPSConnectionPool(host=\'240.0.0.0\', port=443): Read timed out. (read timeout=0.01)")))').reason
+ where MaxRetryError('HTTPSConnectionPool(host=\'240.0.0.0\', port=443): Max retries exceeded with url: https://240.0.0.0 (Caused by ProxyError(\'Unable to connect to proxy\', ReadTimeoutError("HTTPSConnectionPool(host=\'240.0.0.0\', port=443): Read timed out. (read timeout=0.01)")))') = <ExceptionInfo MaxRetryError('HTTPSConnectionPool(host=\'240.0.0.0\', port=443): Max retries exceeded with url: https://240.0.0.0 (Ca...proxy\', ReadTimeoutError("HTTPSConnectionPool(host=\'240.0.0.0\', port=443): Read timed out. (read timeout=0.01)")))') tblen=10>.value
[<TracebackEntry /home/graingert/projects/urllib3/test/with_dummyserver/test_proxy_poolmanager.py:484>]
test_forwarding_proxy_request_timeout[https-https-True] failed; it passed 0 out of the required 1 times.
<class 'RuntimeError'>
No more than one Tracker instance can be active at the same time
[<TracebackEntry /home/graingert/projects/urllib3/.nox/test-3-11/lib/python3.11/site-packages/pytest_memray/plugin.py:122>, <TracebackEntry /home/graingert/projects/urllib3/.nox/test-3-11/lib/python3.11/site-packages/pytest_memray/plugin.py:121>, <TracebackEntry src/memray/_memray.pyx:404>]
```
see also https://github.com/bloomberg/pytest-memray/issues/53
| [
{
"content": "from __future__ import annotations\n\nimport os\nimport shutil\nimport subprocess\nimport sys\n\nimport nox\n\nSOURCE_FILES = [\n \"docs/\",\n \"dummyserver/\",\n \"src/\",\n \"test/\",\n \"noxfile.py\",\n \"setup.py\",\n]\n\n\ndef tests_impl(\n session: nox.Session,\n extr... | [
{
"content": "from __future__ import annotations\n\nimport os\nimport shutil\nimport subprocess\nimport sys\n\nimport nox\n\nSOURCE_FILES = [\n \"docs/\",\n \"dummyserver/\",\n \"src/\",\n \"test/\",\n \"noxfile.py\",\n \"setup.py\",\n]\n\n\ndef tests_impl(\n session: nox.Session,\n extr... | diff --git a/dev-requirements.txt b/dev-requirements.txt
index 2d1ef23faf..84840cd0f2 100644
--- a/dev-requirements.txt
+++ b/dev-requirements.txt
@@ -4,7 +4,6 @@ PySocks==1.7.1
pytest==7.2.0
pytest-timeout==2.1.0
pytest-freezegun==0.4.2
-flaky==3.7.0
trustme==0.9.0
cryptography==39.0.0
backports.zoneinfo==0.2.1;python_version<"3.9"
diff --git a/noxfile.py b/noxfile.py
index 81414f42e2..b0bec11470 100644
--- a/noxfile.py
+++ b/noxfile.py
@@ -56,7 +56,6 @@ def tests_impl(
"-ra",
f"--color={'yes' if 'GITHUB_ACTIONS' in os.environ else 'auto'}",
"--tb=native",
- "--no-success-flaky-report",
"--durations=10",
"--strict-config",
"--strict-markers",
diff --git a/test/with_dummyserver/test_chunked_transfer.py b/test/with_dummyserver/test_chunked_transfer.py
index 707b59f306..c2dc12e769 100644
--- a/test/with_dummyserver/test_chunked_transfer.py
+++ b/test/with_dummyserver/test_chunked_transfer.py
@@ -13,9 +13,6 @@
from urllib3.util import SKIP_HEADER
from urllib3.util.retry import Retry
-# Retry failed tests
-pytestmark = pytest.mark.flaky
-
class TestChunkedTransfer(SocketDummyServerTestCase):
def start_chunked_handler(self) -> None:
diff --git a/test/with_dummyserver/test_connectionpool.py b/test/with_dummyserver/test_connectionpool.py
index aea46c8935..13ad811d06 100644
--- a/test/with_dummyserver/test_connectionpool.py
+++ b/test/with_dummyserver/test_connectionpool.py
@@ -35,8 +35,6 @@
from .. import INVALID_SOURCE_ADDRESSES, TARPIT_HOST, VALID_SOURCE_ADDRESSES
from ..port_helpers import find_unused_port
-pytestmark = pytest.mark.flaky
-
def wait_for_socket(ready_event: Event) -> None:
ready_event.wait()
diff --git a/test/with_dummyserver/test_https.py b/test/with_dummyserver/test_https.py
index ac0fa9419c..7678bfbed1 100644
--- a/test/with_dummyserver/test_https.py
+++ b/test/with_dummyserver/test_https.py
@@ -47,10 +47,6 @@
from .. import has_alpn
-# Retry failed tests
-pytestmark = pytest.mark.flaky
-
-
TLSv1_CERTS = DEFAULT_CERTS.copy()
TLSv1_CERTS["ssl_version"] = getattr(ssl, "PROTOCOL_TLSv1", None)
diff --git a/test/with_dummyserver/test_no_ssl.py b/test/with_dummyserver/test_no_ssl.py
index 12e07839ee..6529636c3b 100644
--- a/test/with_dummyserver/test_no_ssl.py
+++ b/test/with_dummyserver/test_no_ssl.py
@@ -5,16 +5,11 @@
"""
from __future__ import annotations
-import pytest
-
import urllib3
from dummyserver.testcase import HTTPDummyServerTestCase, HTTPSDummyServerTestCase
from ..test_no_ssl import TestWithoutSSL
-# Retry failed tests
-pytestmark = pytest.mark.flaky
-
class TestHTTPWithoutSSL(HTTPDummyServerTestCase, TestWithoutSSL):
def test_simple(self) -> None:
diff --git a/test/with_dummyserver/test_poolmanager.py b/test/with_dummyserver/test_poolmanager.py
index 2c0f1002aa..c4f1947037 100644
--- a/test/with_dummyserver/test_poolmanager.py
+++ b/test/with_dummyserver/test_poolmanager.py
@@ -14,9 +14,6 @@
from urllib3.poolmanager import PoolManager
from urllib3.util.retry import Retry
-# Retry failed tests
-pytestmark = pytest.mark.flaky
-
class TestPoolManager(HTTPDummyServerTestCase):
@classmethod
diff --git a/test/with_dummyserver/test_proxy_poolmanager.py b/test/with_dummyserver/test_proxy_poolmanager.py
index a0566ecea3..171cb23b67 100644
--- a/test/with_dummyserver/test_proxy_poolmanager.py
+++ b/test/with_dummyserver/test_proxy_poolmanager.py
@@ -37,9 +37,6 @@
from .. import TARPIT_HOST, requires_network
-# Retry failed tests
-pytestmark = pytest.mark.flaky
-
class TestHTTPProxyManager(HTTPDummyProxyTestCase):
@classmethod
diff --git a/test/with_dummyserver/test_socketlevel.py b/test/with_dummyserver/test_socketlevel.py
index df005e5a40..56cd224ec0 100644
--- a/test/with_dummyserver/test_socketlevel.py
+++ b/test/with_dummyserver/test_socketlevel.py
@@ -60,9 +60,6 @@
else:
StrOrBytesPath = object
-# Retry failed tests
-pytestmark = pytest.mark.flaky
-
class TestCookies(SocketDummyServerTestCase):
def test_multi_setcookie(self) -> None:
|
mne-tools__mne-bids-pipeline-289 | `ValueError: n_jobs must be an integer` when calling freesurfer
Hi,
When I run `python run.py freesurfer --config=~/hMT+/config.py`, I get the following error traceback:
```
Traceback (most recent call last):
File "/home/merlin/PhD/mne-bids-pipeline/run.py", line 194, in <module>
fire.Fire(process)
File "/home/merlin/miniconda3/envs/mne-bids/lib/python3.9/site-packages/fire/core.py", line 141, in Fire
component_trace = _Fire(component, args, parsed_flag_args, context, name)
File "/home/merlin/miniconda3/envs/mne-bids/lib/python3.9/site-packages/fire/core.py", line 466, in _Fire
component, remaining_args = _CallAndUpdateTrace(
File "/home/merlin/miniconda3/envs/mne-bids/lib/python3.9/site-packages/fire/core.py", line 681, in _CallAndUpdateTrace
component = fn(*varargs, **kwargs)
File "/home/merlin/PhD/mne-bids-pipeline/run.py", line 189, in process
_run_script(script_path, config, root_dir, subject, session, task, run)
File "/home/merlin/PhD/mne-bids-pipeline/run.py", line 98, in _run_script
runpy.run_path(script_path, run_name='__main__')
File "/home/merlin/miniconda3/envs/mne-bids/lib/python3.9/runpy.py", line 268, in run_path
return _run_module_code(code, init_globals, run_name,
File "/home/merlin/miniconda3/envs/mne-bids/lib/python3.9/runpy.py", line 97, in _run_module_code
_run_code(code, mod_globals, init_globals,
File "/home/merlin/miniconda3/envs/mne-bids/lib/python3.9/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/home/merlin/PhD/mne-bids-pipeline/scripts/freesurfer/recon_all.py", line 112, in <module>
fire.Fire(main)
File "/home/merlin/miniconda3/envs/mne-bids/lib/python3.9/site-packages/fire/core.py", line 141, in Fire
component_trace = _Fire(component, args, parsed_flag_args, context, name)
File "/home/merlin/miniconda3/envs/mne-bids/lib/python3.9/site-packages/fire/core.py", line 466, in _Fire
component, remaining_args = _CallAndUpdateTrace(
File "/home/merlin/miniconda3/envs/mne-bids/lib/python3.9/site-packages/fire/core.py", line 681, in _CallAndUpdateTrace
component = fn(*varargs, **kwargs)
File "/home/merlin/PhD/mne-bids-pipeline/scripts/freesurfer/recon_all.py", line 94, in main
parallel, run_func, _ = parallel_func(run_recon, n_jobs=n_jobs)
File "<decorator-gen-42>", line 24, in parallel_func
File "/home/merlin/miniconda3/envs/mne-bids/lib/python3.9/site-packages/mne/parallel.py", line 112, in parallel_func
n_jobs = check_n_jobs(n_jobs)
File "/home/merlin/miniconda3/envs/mne-bids/lib/python3.9/site-packages/mne/parallel.py", line 159, in check_n_jobs
raise ValueError('n_jobs must be an integer')
ValueError: n_jobs must be an integer
```
Checking with pdb, it seems that `recon_all` is called with `n_jobs = "freesurfer"`. I'm not sure why that is.
```
19:36:39 Using custom configuration: /home/merlin/hMT+/config.py
19:36:39 [Step-01] Running: Initializing output directories.
19:36:39 [Step-01] Initializing output directories.
19:36:39 [Step-01] Completed: Initializing output directories.
2021-04-14 19:36:39 INFO Successfully finished running: init_derivatives_dir
2021-04-14 19:36:39 INFO Now running: on_all
> /home/merlin/PhD/mne-bids-pipeline/scripts/freesurfer/recon_all.py(88)main()
87
---> 88 logger.info('Running FreeSurfer')
89
ipdb> n_jobs
'freesurfer'
ipdb>
```
It might be that the config needs to be changed, but I can't figure out how.
| [
{
"content": "#!/usr/bin/env python\n\nimport os\nimport shutil\nimport sys\nfrom pathlib import Path\nimport logging\nfrom typing import Union\n\nimport fire\n\nfrom mne.utils import run_subprocess\nfrom mne.parallel import parallel_func\n\nimport config\n\nPathLike = Union[str, Path]\nlogger = logging.getLogg... | [
{
"content": "#!/usr/bin/env python\n\nimport os\nimport shutil\nimport sys\nfrom pathlib import Path\nimport logging\nfrom typing import Union\n\nimport fire\n\nfrom mne.utils import run_subprocess\nfrom mne.parallel import parallel_func\n\nimport config\n\nPathLike = Union[str, Path]\nlogger = logging.getLogg... | diff --git a/scripts/freesurfer/recon_all.py b/scripts/freesurfer/recon_all.py
index 311be181c..f9186c9ae 100755
--- a/scripts/freesurfer/recon_all.py
+++ b/scripts/freesurfer/recon_all.py
@@ -60,7 +60,7 @@ def run_recon(root_dir, subject, fs_bids_app) -> None:
run_subprocess(cmd, env=env, verbose=logger.level)
-def main(n_jobs: int = 1) -> None:
+def main(*, n_jobs: int = 1) -> None:
"""Run freesurfer recon-all command on BIDS dataset.
The command allows to run the freesurfer recon-all
|
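The root cause is how a CLI front end binds stray positional tokens: the string `"freesurfer"` ended up bound to the `n_jobs` parameter. The fix adds a bare `*` to the signature, making `n_jobs` keyword-only so such calls fail immediately instead of deep inside `check_n_jobs`. A minimal sketch of the difference (function names here are illustrative, not the pipeline's actual entry point):

```python
import inspect

def main_positional(n_jobs: int = 1) -> None:
    """Accepts n_jobs positionally -- a stray CLI token can bind here."""
    if not isinstance(n_jobs, int):
        raise ValueError("n_jobs must be an integer")

def main_keyword_only(*, n_jobs: int = 1) -> None:
    """The bare `*` makes n_jobs keyword-only; positional tokens are rejected."""
    if not isinstance(n_jobs, int):
        raise ValueError("n_jobs must be an integer")

# A stray positional argument silently binds to n_jobs and only fails later:
try:
    main_positional("freesurfer")
except ValueError as exc:
    print("positional:", exc)          # fails far from the call site, as reported

# With a keyword-only parameter the bad call is rejected up front:
try:
    main_keyword_only("freesurfer")
except TypeError:
    print("keyword-only: rejected at the call site")

# The only valid spelling now mirrors `recon_all.py --n_jobs=2`:
main_keyword_only(n_jobs=2)

# Introspection confirms the parameter kind:
param = inspect.signature(main_keyword_only).parameters["n_jobs"]
assert param.kind is inspect.Parameter.KEYWORD_ONLY
```

The same one-character change is what the diff above applies to `main`.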
ephios-dev__ephios-384 | Cannot delete section
As a planner, I cannot delete an existing section from a shift with the section_based signup method
| [
{
"content": "import uuid\nfrom functools import cached_property\nfrom itertools import groupby\nfrom operator import itemgetter\n\nfrom django import forms\nfrom django.contrib import messages\nfrom django.core.exceptions import ValidationError\nfrom django.shortcuts import redirect\nfrom django.template.loade... | [
{
"content": "import uuid\nfrom functools import cached_property\nfrom itertools import groupby\nfrom operator import itemgetter\n\nfrom django import forms\nfrom django.contrib import messages\nfrom django.core.exceptions import ValidationError\nfrom django.shortcuts import redirect\nfrom django.template.loade... | diff --git a/ephios/plugins/basesignup/signup/section_based.py b/ephios/plugins/basesignup/signup/section_based.py
index 82a006e49..a99000858 100644
--- a/ephios/plugins/basesignup/signup/section_based.py
+++ b/ephios/plugins/basesignup/signup/section_based.py
@@ -134,6 +134,7 @@ def clean_sections(self):
for key in ("title", "qualifications", "min_count", "uuid")
}
for form in self.sections_formset
+ if not form.cleaned_data.get("DELETE")
]
return sections
|
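The fix follows the standard Django formset pattern: with `can_delete=True`, each form carries a `DELETE` flag in its `cleaned_data`, and any aggregation over the formset must skip flagged forms, otherwise deleted sections are silently resurrected. A sketch of that pattern with plain dicts standing in for cleaned form data (illustrative, not ephios's actual code):

```python
# Each dict stands in for one form's cleaned_data after formset validation.
cleaned_forms = [
    {"title": "Triage", "min_count": 2, "DELETE": False},
    {"title": "Transport", "min_count": 1, "DELETE": True},   # marked for deletion
    {"title": "Treatment", "min_count": 3},                   # extra form, flag absent
]

# Buggy aggregation: keeps every form, so deleted sections come back.
sections_buggy = [{k: f[k] for k in ("title", "min_count")} for f in cleaned_forms]

# Fixed aggregation: skip any form whose DELETE flag is set
# (.get() because extra/unchanged forms may lack the key entirely).
sections_fixed = [
    {k: f[k] for k in ("title", "min_count")}
    for f in cleaned_forms
    if not f.get("DELETE")
]

print([s["title"] for s in sections_buggy])  # ['Triage', 'Transport', 'Treatment']
print([s["title"] for s in sections_fixed])  # ['Triage', 'Treatment']
```

The one-line `if not form.cleaned_data.get("DELETE")` in the diff is exactly this guard.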
nautobot__nautobot-2730 | Changelog Filter "Object Type" - The results could not be loaded.
### Environment
* Nautobot version (Docker tag too if applicable): d3bb49d5c396 (v1.4.7)
* Python version: 3.9
* Database platform, version: postgres
* Middleware(s):
### Steps to Reproduce
1. Open Changelog
2. Click into "Object Type" in the "Search" field
3.
### Expected Behavior
A list of object types should be displayed
### Observed Behavior

| [
{
"content": "import django_filters\nfrom django.contrib.auth import get_user_model\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.db.models import Q\nfrom django.forms import IntegerField\n\nfrom nautobot.dcim.models import DeviceRole, DeviceType, Location, Platform, Region, Site\nfro... | [
{
"content": "import django_filters\nfrom django.contrib.auth import get_user_model\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.db.models import Q\nfrom django.forms import IntegerField\n\nfrom nautobot.dcim.models import DeviceRole, DeviceType, Location, Platform, Region, Site\nfro... | diff --git a/changes/2684.fixed b/changes/2684.fixed
new file mode 100644
index 00000000000..d4a31df7285
--- /dev/null
+++ b/changes/2684.fixed
@@ -0,0 +1 @@
+Fixed "The results could not be loaded" when filtering `ContentTypes` in the UI.
diff --git a/nautobot/extras/filters.py b/nautobot/extras/filters.py
index e500274f5c0..d0121f3b5ec 100644
--- a/nautobot/extras/filters.py
+++ b/nautobot/extras/filters.py
@@ -416,6 +416,13 @@ class Meta:
class ContentTypeFilterSet(BaseFilterSet):
+ q = SearchFilter(
+ filter_predicates={
+ "app_label": "icontains",
+ "model": "icontains",
+ },
+ )
+
class Meta:
model = ContentType
fields = ["id", "app_label", "model"]
diff --git a/nautobot/extras/tests/test_filters.py b/nautobot/extras/tests/test_filters.py
index d07fdfe3ba8..974ff9b7ebd 100644
--- a/nautobot/extras/tests/test_filters.py
+++ b/nautobot/extras/tests/test_filters.py
@@ -2,6 +2,7 @@
from django.contrib.auth import get_user_model
from django.contrib.contenttypes.models import ContentType
+from django.db.models import Q
from nautobot.dcim.filters import DeviceFilterSet
from nautobot.dcim.models import (
@@ -24,6 +25,7 @@
from nautobot.extras.filters import (
ComputedFieldFilterSet,
ConfigContextFilterSet,
+ ContentTypeFilterSet,
CustomLinkFilterSet,
ExportTemplateFilterSet,
GitRepositoryFilterSet,
@@ -312,6 +314,28 @@ def test_search(self):
self.assertEqual(self.filterset(params, self.queryset).qs.values_list("pk", flat=True)[0], value)
+class ContentTypeFilterSetTestCase(FilterTestCases.FilterTestCase):
+ queryset = ContentType.objects.order_by("app_label", "model")
+ filterset = ContentTypeFilterSet
+
+ def test_app_label(self):
+ params = {"app_label": ["dcim"]}
+ self.assertQuerysetEqual(self.filterset(params, self.queryset).qs, self.queryset.filter(app_label="dcim"))
+
+ def test_model(self):
+ params = {"model": ["device", "virtualmachine"]}
+ self.assertQuerysetEqual(
+ self.filterset(params, self.queryset).qs, self.queryset.filter(model__in=["device", "virtualmachine"])
+ )
+
+ def test_search(self):
+ params = {"q": "circ"}
+ self.assertQuerysetEqual(
+ self.filterset(params, self.queryset).qs,
+ self.queryset.filter(Q(app_label__icontains="circ") | Q(model__icontains="circ")),
+ )
+
+
class CustomLinkTestCase(FilterTestCases.FilterTestCase):
queryset = CustomLink.objects.all()
filterset = CustomLinkFilterSet
|
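The added `q` filter ORs one case-insensitive substring lookup per predicate, so a single search box matches either `app_label` or `model`. The queryset logic this produces can be sketched without Django, using tuples as stand-in ContentType rows (names below are illustrative):

```python
# Rows stand in for ContentType records: (app_label, model).
CONTENT_TYPES = [
    ("dcim", "device"),
    ("circuits", "circuit"),
    ("virtualization", "virtualmachine"),
]

def search_content_types(q: str):
    """Mirror SearchFilter(filter_predicates={"app_label": "icontains",
    "model": "icontains"}): OR together case-insensitive substring
    matches on both columns."""
    needle = q.lower()
    return [
        (app_label, model)
        for app_label, model in CONTENT_TYPES
        if needle in app_label.lower() or needle in model.lower()
    ]

print(search_content_types("circ"))  # [('circuits', 'circuit')]
print(search_content_types("m"))     # matches via 'dcim' and 'virtualmachine'
```

This is the same `Q(app_label__icontains=...) | Q(model__icontains=...)` shape the new test case asserts.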
wagtail__wagtail-11430 | Issue: Usage of an instead of a.
Usage of an instead of a in
client/src/includes/tabs.js: * Set url to have tab an tab hash at the end.
and in many more locations.

I have already changed the simple an to a in most of them, I hope it helps.
| [
{
"content": "import uuid\nfrom typing import Dict\n\nfrom django.apps import apps\nfrom django.conf import settings\nfrom django.core import checks\nfrom django.db import migrations, models, transaction\nfrom django.db.models.signals import pre_save\nfrom django.dispatch import receiver\nfrom django.utils impo... | [
{
"content": "import uuid\nfrom typing import Dict\n\nfrom django.apps import apps\nfrom django.conf import settings\nfrom django.core import checks\nfrom django.db import migrations, models, transaction\nfrom django.db.models.signals import pre_save\nfrom django.dispatch import receiver\nfrom django.utils impo... | diff --git a/CONTRIBUTORS.md b/CONTRIBUTORS.md
index 811caf992cf0..ccb59504ba97 100644
--- a/CONTRIBUTORS.md
+++ b/CONTRIBUTORS.md
@@ -780,6 +780,7 @@
* Nikhil S Kalburgi
* Salvo Polizzi
* Badr Fourane
+* Vaishnav Dasari
## Translators
diff --git a/client/src/includes/tabs.js b/client/src/includes/tabs.js
index 3281acf32b62..f8db67d28083 100644
--- a/client/src/includes/tabs.js
+++ b/client/src/includes/tabs.js
@@ -294,7 +294,7 @@ class Tabs {
}
/**
- * Set url to have tab an tab hash at the end
+ * Set url to have a tab hash at the end
*/
setURLHash(tabId) {
if (
diff --git a/wagtail/admin/tests/pages/test_bulk_actions/test_bulk_publish.py b/wagtail/admin/tests/pages/test_bulk_actions/test_bulk_publish.py
index 86d2d019dcee..21a0d235d845 100644
--- a/wagtail/admin/tests/pages/test_bulk_actions/test_bulk_publish.py
+++ b/wagtail/admin/tests/pages/test_bulk_actions/test_bulk_publish.py
@@ -71,12 +71,12 @@ def setUp(self):
def test_publish_view(self):
"""
- This tests that the publish view responds with an publish confirm page
+ This tests that the publish view responds with a publish confirm page
"""
# Request confirm publish page
response = self.client.get(self.url)
- # # Check that the user received an publish confirm page
+ # # Check that the user received a publish confirm page
self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(
response, "wagtailadmin/pages/bulk_actions/confirm_bulk_publish.html"
@@ -206,12 +206,12 @@ def hook_func(request, action_type, pages, action_class_instance):
def test_publish_descendants_view(self):
"""
- This tests that the publish view responds with an publish confirm page that does not contain the form field 'include_descendants'
+ This tests that the publish view responds with a publish confirm page that does not contain the form field 'include_descendants'
"""
# Get publish page for page with no descendants
response = self.client.get(self.url)
- # Check that the user received an publish confirm page
+ # Check that the user received a publish confirm page
self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(
response, "wagtailadmin/pages/bulk_actions/confirm_bulk_publish.html"
@@ -315,12 +315,12 @@ def setUp(self):
def test_publish_descendants_view(self):
"""
- This tests that the publish view responds with an publish confirm page that contains the form field 'include_descendants'
+ This tests that the publish view responds with a publish confirm page that contains the form field 'include_descendants'
"""
# Get publish page
response = self.client.get(self.url)
- # Check that the user received an publish confirm page
+ # Check that the user received a publish confirm page
self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(
response, "wagtailadmin/pages/bulk_actions/confirm_bulk_publish.html"
diff --git a/wagtail/admin/tests/test_collections_views.py b/wagtail/admin/tests/test_collections_views.py
index b0ef5ed7bc86..223e0d1f1e1f 100644
--- a/wagtail/admin/tests/test_collections_views.py
+++ b/wagtail/admin/tests/test_collections_views.py
@@ -535,7 +535,7 @@ def test_page_shows_delete_link_only_if_delete_permitted(self):
# Retrieve edit form and check fields
response = self.get(collection_id=self.marketing_sub_collection.id)
self.assertNotContains(response, "Delete collection")
- # Add delete permission to parent collection an try again
+ # Add delete permission to parent collection and try again
GroupCollectionPermission.objects.create(
group=self.marketing_group,
collection=self.marketing_collection,
diff --git a/wagtail/contrib/search_promotions/tests.py b/wagtail/contrib/search_promotions/tests.py
index 0af4ff2cbc9d..f0c4ad601d49 100644
--- a/wagtail/contrib/search_promotions/tests.py
+++ b/wagtail/contrib/search_promotions/tests.py
@@ -483,7 +483,7 @@ class TestSearchPromotionsEditView(WagtailTestUtils, TestCase):
def setUp(self):
self.user = self.login()
- # Create an search pick to edit
+ # Create a search pick to edit
self.query = Query.get("Hello")
self.search_pick = self.query.editors_picks.create(
page_id=1, sort_order=0, description="Root page"
@@ -645,7 +645,7 @@ class TestSearchPromotionsDeleteView(WagtailTestUtils, TestCase):
def setUp(self):
self.login()
- # Create an search pick to delete
+ # Create a search pick to delete
self.query = Query.get("Hello")
self.search_pick = self.query.editors_picks.create(
page_id=1, description="Root page"
diff --git a/wagtail/models/i18n.py b/wagtail/models/i18n.py
index 474e9732f02c..68dcd596bcf0 100644
--- a/wagtail/models/i18n.py
+++ b/wagtail/models/i18n.py
@@ -474,7 +474,7 @@ def set_locale_on_new_instance(sender, instance, **kwargs):
return
# If this is a fixture load, use the global default Locale
- # as the page tree is probably in an flux
+ # as the page tree is probably in flux
if kwargs["raw"]:
instance.locale = Locale.get_default()
return
|
ivy-llc__ivy-13655 | Add torch.Tensor.mul
| [
{
"content": "# global\n\n# local\nimport ivy\nimport ivy.functional.frontends.torch as torch_frontend\nimport ivy.functional.frontends.torch.nn.functional as torch_frontend_nn\nfrom ivy.functional.frontends.numpy.creation_routines.from_existing_data import (\n array as np_frontend_array,\n)\nfrom ivy.func_w... | [
{
"content": "# global\n\n# local\nimport ivy\nimport ivy.functional.frontends.torch as torch_frontend\nimport ivy.functional.frontends.torch.nn.functional as torch_frontend_nn\nfrom ivy.functional.frontends.numpy.creation_routines.from_existing_data import (\n array as np_frontend_array,\n)\nfrom ivy.func_w... | diff --git a/ivy/functional/frontends/torch/tensor.py b/ivy/functional/frontends/torch/tensor.py
index e652125384f9b..ce9f3630d2753 100644
--- a/ivy/functional/frontends/torch/tensor.py
+++ b/ivy/functional/frontends/torch/tensor.py
@@ -890,3 +890,6 @@ def cumprod(self, dim, dtype):
@with_unsupported_dtypes({"1.11.0 and below": ("bfloat16",)}, "torch")
def exp(self, *, out=None):
return torch_frontend.exp(self._ivy_array)
+
+ def mul(self, other, *, out=None):
+ return torch_frontend.mul(self._ivy_array, other)
diff --git a/ivy_tests/test_ivy/test_frontends/test_torch/test_tensor.py b/ivy_tests/test_ivy/test_frontends/test_torch/test_tensor.py
index 5fa8523d009fb..d8f30a5c4058a 100644
--- a/ivy_tests/test_ivy/test_frontends/test_torch/test_tensor.py
+++ b/ivy_tests/test_ivy/test_frontends/test_torch/test_tensor.py
@@ -6026,3 +6026,39 @@ def test_torch_instance_exp(
frontend=frontend,
on_device=on_device,
)
+
+
+# mul
+@handle_frontend_method(
+ class_tree=CLASS_TREE,
+ init_tree="torch.tensor",
+ method_name="mul",
+ dtype_and_x=helpers.dtype_and_values(
+ available_dtypes=helpers.get_dtypes("numeric"),
+ num_arrays=2,
+ ),
+)
+def test_torch_instance_mul(
+ dtype_and_x,
+ frontend_method_data,
+ init_flags,
+ method_flags,
+ frontend,
+ on_device,
+):
+ input_dtype, x = dtype_and_x
+ helpers.test_frontend_method(
+ init_input_dtypes=input_dtype,
+ init_all_as_kwargs_np={
+ "data": x[0],
+ },
+ method_input_dtypes=input_dtype,
+ method_all_as_kwargs_np={
+ "other": x[1],
+ },
+ frontend_method_data=frontend_method_data,
+ init_flags=init_flags,
+ method_flags=method_flags,
+ frontend=frontend,
+ on_device=on_device,
+ )
|
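The PR follows the frontend-tensor convention: an instance method is a thin wrapper that forwards the backing array (`self._ivy_array`) plus the arguments to the module-level frontend function. That delegation pattern can be sketched with toy classes (plain lists instead of real arrays; not ivy's actual internals):

```python
def frontend_mul(a, b):
    """Module-level frontend function (stand-in for torch_frontend.mul)."""
    return Tensor([x * y for x, y in zip(a, b)])

class Tensor:
    """Toy frontend tensor wrapping a plain list as its backing array."""
    def __init__(self, data):
        self._ivy_array = list(data)

    def mul(self, other, *, out=None):
        # The instance method just forwards the backing array to the
        # frontend function -- the shape of the one-line addition above.
        other = other._ivy_array if isinstance(other, Tensor) else other
        return frontend_mul(self._ivy_array, other)

x = Tensor([1, 2, 3])
y = x.mul(Tensor([10, 20, 30]))
print(y._ivy_array)  # [10, 40, 90]
```

Keeping the method a one-liner means broadcasting, dtype promotion, and `out=` handling all live in one place, the frontend function.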
napari__napari-680 | Napari viewer closes unexpectedly when pressing the 3D button
## 🐛 Bug
Hi guys,
I use the following code to automatically display a 3D stack with 2 channels and 10 timepoints in Napari. This works and the viewer opens up just fine.
When I then press the 3D button, the viewer is gone and I get a long error stack; see below:
Any help is welcome.
Sebi

```
def show_napari(array, metadata, verbose=True):
import napari
with napari.gui_qt():
# create scalefcator with all ones
scalefactors = [1] * len(array.shape)
# initialize the napari viewer
viewer = napari.Viewer()
if metadata['ImageType'] == 'czi':
# find position of dimensions
posZ = metadata['Axes'].find('Z')
posC = metadata['Axes'].find('C')
posT = metadata['Axes'].find('T')
# get the scalefactors from the metadata
scalef = get_scalefactor(metadata)
scalefactors[posZ] = scalef['zx']
if verbose:
print('Dim PosT : ', posT)
print('Dim PosZ : ', posZ)
print('Dim PosC : ', posC)
print('Scale Factors XYZ: ', scalefactors)
# add all channels as layers
for ch in range(metadata['SizeC']):
chname = metadata['Channels'][ch]
# cut out channel
channel = array.take(ch, axis=posC)
print(channel.shape)
# actually show the image array
print('Adding Channel: ', chname)
viewer.add_image(channel, name=chname, scale=scalefactors)
```
Error message:
```
(1, 10, 2, 15, 256, 256, 1)
BTCZYX0
(1, 10, 2, 15, 256, 256)
Dim PosT : 1
Dim PosZ : 3
Dim PosC : 2
Scale Factors XYZ: [1, 1, 1, 3.516, 1, 1]
(1, 10, 15, 256, 256)
Adding Channel: AF555
(1, 10, 15, 256, 256)
Adding Channel: AF488
WARNING: Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\_qt\qt_viewer_buttons.py", line 173, in <lambda>
lambda state=self: self.change_ndisplay(state)
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\_qt\qt_viewer_buttons.py", line 178, in change_ndisplay
self.viewer.dims.ndisplay = 3
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\components\dims.py", line 205, in ndisplay
self.events.ndisplay()
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\util\event.py", line 508, in __call__
self._invoke_callback(cb, event)
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\util\event.py", line 529, in _invoke_callback
cb_event=(cb, event),
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\util\event.py", line 523, in _invoke_callback
cb(event)
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\components\viewer_model.py", line 89, in <lambda>
self.dims.events.ndisplay.connect(lambda e: self._update_layers())
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\components\viewer_model.py", line 1018, in _update_layers
layer.dims.ndisplay = self.dims.ndisplay
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\components\dims.py", line 205, in ndisplay
self.events.ndisplay()
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\util\event.py", line 508, in __call__
self._invoke_callback(cb, event)
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\util\event.py", line 529, in _invoke_callback
cb_event=(cb, event),
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\util\event.py", line 523, in _invoke_callback
cb(event)
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\layers\base\base.py", line 174, in <lambda>
self.dims.events.ndisplay.connect(lambda e: self._update_dims())
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\layers\base\base.py", line 372, in _update_dims
self._set_view_slice()
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\layers\image\image.py", line 502, in _set_view_slice
self.events.set_data()
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\util\event.py", line 508, in __call__
self._invoke_callback(cb, event)
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\util\event.py", line 529, in _invoke_callback
cb_event=(cb, event),
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\util\event.py", line 523, in _invoke_callback
cb(event)
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\_vispy\vispy_base_layer.py", line 46, in <lambda>
self.layer.events.set_data.connect(lambda e: self._on_data_change())
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\_vispy\vispy_image_layer.py", line 59, in _on_data_change
self._on_display_change()
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\_vispy\vispy_image_layer.py", line 48, in _on_display_change
self.reset()
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\_vispy\vispy_image_layer.py", line 216, in reset
self._reset_base()
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\_vispy\vispy_base_layer.py", line 165, in _reset_base
self._on_scale_change()
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\_vispy\vispy_image_layer.py", line 123, in _on_scale_change
self.layer.position = self._transform_position(self._position)
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\napari\_vispy\vispy_base_layer.py", line 153, in _transform_position
transform.map(list(position))[: len(self.layer.dims.displayed)]
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\vispy\visuals\transforms\chain.py", line 148, in map
coords = tr.map(coords)
File "<decorator-gen-4>", line 2, in imap
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\vispy\visuals\transforms\_util.py", line 111, in arg_to_vec4
arg = as_vec4(arg)
File "C:\ProgramData\Anaconda3\envs\imageanalysis\lib\site-packages\vispy\visuals\transforms\_util.py", line 81, in as_vec4
% obj.shape)
TypeError: not all arguments converted during string formatting
```
| [
{
"content": "from vispy.visuals.transforms import STTransform\nfrom abc import ABC, abstractmethod\n\n\nclass VispyBaseLayer(ABC):\n \"\"\"Base object for individual layer views\n\n Meant to be subclassed.\n\n Parameters\n ----------\n layer : napari.layers.Layer\n Layer model.\n node ... | [
{
"content": "from vispy.visuals.transforms import STTransform\nfrom abc import ABC, abstractmethod\n\n\nclass VispyBaseLayer(ABC):\n \"\"\"Base object for individual layer views\n\n Meant to be subclassed.\n\n Parameters\n ----------\n layer : napari.layers.Layer\n Layer model.\n node ... | diff --git a/napari/_vispy/vispy_base_layer.py b/napari/_vispy/vispy_base_layer.py
index b315736b206..1ab86e1b452 100644
--- a/napari/_vispy/vispy_base_layer.py
+++ b/napari/_vispy/vispy_base_layer.py
@@ -39,7 +39,7 @@ def __init__(self, layer, node):
self.layer = layer
self.node = node
- self._position = (0,) * self.layer.ndim
+ self._position = (0,) * self.layer.dims.ndisplay
self.camera = None
self.layer.events.refresh.connect(lambda e: self.node.update())
diff --git a/napari/tests/test_advanced.py b/napari/tests/test_advanced.py
index 81de93d0c47..22e7a0ef63a 100644
--- a/napari/tests/test_advanced.py
+++ b/napari/tests/test_advanced.py
@@ -36,6 +36,34 @@ def test_4D_5D_images(qtbot):
viewer.window.close()
+def test_5D_image_3D_rendering(qtbot):
+ """Test 3D rendering of a 5D image."""
+ np.random.seed(0)
+ viewer = Viewer()
+ view = viewer.window.qt_viewer
+ qtbot.addWidget(view)
+
+ # add 4D image data
+ data = np.random.random((2, 10, 12, 13, 14))
+ viewer.add_image(data)
+ assert np.all(viewer.layers[0].data == data)
+ assert len(viewer.layers) == 1
+ assert viewer.dims.ndim == 5
+ assert viewer.dims.ndisplay == 2
+ assert viewer.layers[0]._data_view.ndim == 2
+ assert view.dims.nsliders == viewer.dims.ndim
+ assert np.sum(view.dims._displayed_sliders) == 3
+
+ # switch to 3D rendering
+ viewer.dims.ndisplay = 3
+ assert viewer.dims.ndisplay == 3
+ assert viewer.layers[0]._data_view.ndim == 3
+ assert np.sum(view.dims._displayed_sliders) == 2
+
+ # Close the viewer
+ viewer.window.close()
+
+
def test_change_image_dims(qtbot):
"""Test changing the dims and shape of an image layer in place and checking
the numbers of sliders and their ranges changes appropriately.
|
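The crash comes down to vispy's `as_vec4` only accepting coordinates of length at most 4: the view stored a cursor position sized to the layer's full dimensionality (5 for this stack), and switching to 3D rendering tried to map that 5-vector through a scene transform. The fix sizes the position to the number of *displayed* dimensions instead. A sketch of the length constraint (the `as_vec4` below is a simplified stand-in, not vispy's real implementation):

```python
def as_vec4(coords):
    """Simplified stand-in for vispy's as_vec4: pads a coordinate up to
    homogeneous length 4, but cannot handle more than 4 components."""
    if len(coords) > 4:
        raise TypeError(f"coordinate of length {len(coords)} cannot become a vec4")
    return list(coords) + [0.0] * (3 - len(coords)) + [1.0]

ndim, ndisplay = 5, 3  # a 5D layer rendered with 3 displayed dimensions

# Buggy initialisation: position sized to the full data dimensionality.
position_buggy = (0,) * ndim
try:
    as_vec4(position_buggy)
except TypeError as exc:
    print("buggy:", exc)

# Fixed initialisation: position sized to the displayed dimensions only,
# i.e. (0,) * self.layer.dims.ndisplay in the diff above.
position_fixed = (0,) * ndisplay
print("fixed:", as_vec4(position_fixed))  # [0, 0, 0, 1.0]
```

This is why the regression test renders a 5D image in 3D: any position longer than four components would reproduce the `TypeError`.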
sktime__sktime-4010 | [BUG] Tensorflow failing on macOS
When following the development environment setup guide in the sktime documentation, the conda installation was not able to run properly on macOS due to a missing tensorflow-macos dependency.
To reproduce the issue run on macOS:
```shell
$ conda create -n sktime-dev python=3.8
$ conda activate sktime-dev
$ pip install -e ."[all_extras,dev]"
$ make test
zsh: illegal hardware instruction
```
A factor that can complicate things is that tensorflow-macos cannot be installed using Python 3.8.
Possible solutions include adding a warning to the sktime documentation or adjusting pyproject.toml to tackle this issue.
The expected behavior would be to run successfully all tests after initial installation.
| [
{
"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\n\"\"\"Configuration file for the Sphinx documentation builder.\"\"\"\n\nimport os\nimport sys\nfrom importlib import import_module\n\nimport sktime\n\n# -- Path setup --------------------------------------------------------------\n\n# If extension... | [
{
"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\n\"\"\"Configuration file for the Sphinx documentation builder.\"\"\"\n\nimport os\nimport sys\nfrom importlib import import_module\n\nimport sktime\n\n# -- Path setup --------------------------------------------------------------\n\n# If extension... | diff --git a/.all-contributorsrc b/.all-contributorsrc
index 87bcf4d0fd1..016c6a1081f 100644
--- a/.all-contributorsrc
+++ b/.all-contributorsrc
@@ -1937,6 +1937,16 @@
"contributions": [
"doc"
]
+ },
+ {
+ "login": "dainelli98",
+ "name": "Daniel Martín Martínez",
+ "avatar_url": "https://avatars.githubusercontent.com/dainelli98",
+ "profile": "https://www.linkedin.com/in/daniel-martin-martinez",
+ "contributions": [
+ "doc",
+ "bug"
+ ]
}
]
}
diff --git a/docs/source/conf.py b/docs/source/conf.py
index a659e87ac68..998e31c161c 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -42,6 +42,7 @@
extensions = [
"sphinx.ext.autodoc",
"sphinx.ext.autosummary",
+ "sphinx.ext.autosectionlabel",
"numpydoc",
"sphinx.ext.intersphinx",
"sphinx.ext.linkcode", # link to GitHub source code via linkcode_resolve()
diff --git a/docs/source/installation.rst b/docs/source/installation.rst
index a89d3b5e46b..22f6fb4ce43 100644
--- a/docs/source/installation.rst
+++ b/docs/source/installation.rst
@@ -37,6 +37,11 @@ To install ``sktime`` with maximum dependencies, including soft dependencies, in
pip install sktime[all_extras]
+.. warning::
+ Some of the dependencies included in ``all_extras`` do not work on mac ARM-based processors, such
+ as M1, M2, M1Pro, M1Max or M1Ultra. This may cause an error during installation. More details can
+ be found in the :ref:`troubleshooting section<Dependency error on mac ARM>` below.
+
Installing sktime from conda
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -171,7 +176,7 @@ In the ``anaconda prompt`` terminal:
2. Create new environment with python 3.8: :code:`conda create -n sktime-dev python=3.8`
.. warning::
- If you already have an environment called "sktime-dev" from a previous attempt you will first need to remove this
+ If you already have an environment called "sktime-dev" from a previous attempt you will first need to remove this.
3. Activate the environment: :code:`conda activate sktime-dev`
@@ -214,6 +219,28 @@ your environment is activated and linked to whatever IDE you are using. If you
Notebooks, follow `these instructions <https://janakiev.com/blog/jupyter-virtual-envs/>`_ for
adding your virtual environment as a new kernel for your notebook.
+Installing ``all_extras`` on mac with ARM processor
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+If you are using a mac with an ARM processor, you may encounter an error when installing
+``sktime[all_extras]``. This is due to the fact that some libraries included in ``all_extras``
+are not compatible with ARM-based processors.
+
+The workaround is not to install some of the packages in ``all_extras`` and install ARM compatible
+replacements for others:
+
+* Do not install the following packages:
+ * ``esig``
+ * ``prophet``
+ * ``tsfresh``
+ * ``tslearn``
+* Replace ``tensorflow`` package with the following packages:
+ * ``tensorflow-macos``
+ * ``tensorflow-metal`` (optional)
+
+Also, ARM-based processors have issues when installing packages distributed as source distributions
+instead of Python wheels. To avoid this issue when installing a package you can try installing it
+through conda or use a prior version of the package that was distributed as a wheel.
+
Other Startup Resources
-----------------------
diff --git a/estimator_overview_table.md b/estimator_overview_table.md
new file mode 100644
index 00000000000..e69de29bb2d
|
OpenMined__PySyft-4035 | Return invalid dtype when MPC is applied to Other Dtype Tensor
## Description
When MPC sharing is applied to an int tensor, the result should keep the int dtype, but a float dtype is returned.
## How to Reproduce
```python
x = torch.tensor([1, 2, 3])
print(x.dtype) # torch.int64
x = x.share(bob, alice, crypto_provider=theo)
print(x.dtype) # torch.float32 # should be torch.int64
print(x.get().dtype) # torch.int64
```
## Expected Behavior
should be `torch.int64`
## Screenshots

## System Information
- OS: MAC
- OS Version: Catalina
- Language Version: Python3.7
- Package Manager Version: Conda 4.8.3
- Browser (if applicable): [e.g. Google Chrome]
- Browser Version (if applicable): [e.g. 81.0.4044.138]
## Additional Context
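
The dtype mismatch lives only in the wrapper: the fix below threads `self.dtype` through `wrap()`. As a toy illustration (not syft's actual implementation), additive secret sharing over a modular ring is integer arithmetic end to end, so the shared value has no reason to become float:

```python
import random

Q = 2 ** 62  # ring size; a stand-in for syft's internal field


def share(value, n_workers=3):
    """Split an int into additive shares that sum to value mod Q."""
    shares = [random.randrange(Q) for _ in range(n_workers - 1)]
    shares.append((value - sum(shares)) % Q)
    return shares


def reconstruct(shares):
    """Recombine shares; everything stays an int throughout."""
    return sum(shares) % Q


secret = 42
shares = share(secret)
assert all(isinstance(s, int) for s in shares)  # shares stay integers
assert reconstruct(shares) == secret
```

Because the underlying arithmetic is integral, the only place a float dtype could appear is the wrapper tensor, which is what the one-line fix addresses.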
| [
{
"content": "from typing import Union, List\nimport weakref\nimport warnings\n\nimport torch\n\nimport syft\nfrom syft.generic.frameworks.hook import hook_args\nfrom syft.generic.frameworks.overload import overloaded\nfrom syft.frameworks.torch.tensors.interpreters.paillier import PaillierTensor\nfrom syft.mes... | [
{
"content": "from typing import Union, List\nimport weakref\nimport warnings\n\nimport torch\n\nimport syft\nfrom syft.generic.frameworks.hook import hook_args\nfrom syft.generic.frameworks.overload import overloaded\nfrom syft.frameworks.torch.tensors.interpreters.paillier import PaillierTensor\nfrom syft.mes... | diff --git a/syft/frameworks/torch/tensors/interpreters/native.py b/syft/frameworks/torch/tensors/interpreters/native.py
index f847e3d9b13..11ebeac534f 100644
--- a/syft/frameworks/torch/tensors/interpreters/native.py
+++ b/syft/frameworks/torch/tensors/interpreters/native.py
@@ -945,7 +945,7 @@ def share(
shared_tensor = syft.AutogradTensor().on(shared_tensor, wrap=False)
if not no_wrap:
- shared_tensor = shared_tensor.wrap()
+ shared_tensor = shared_tensor.wrap(type=self.dtype)
return shared_tensor
diff --git a/test/torch/tensors/test_additive_shared.py b/test/torch/tensors/test_additive_shared.py
index 6bd9bfb9b5d..4384976beb4 100644
--- a/test/torch/tensors/test_additive_shared.py
+++ b/test/torch/tensors/test_additive_shared.py
@@ -41,6 +41,7 @@ def test_share_get(workers, protocol, dtype, n_workers):
t = torch.tensor([1, 2, 3])
x = t.share(*share_holders[:n_workers], **kwargs)
+ assert t.dtype == x.dtype
x = x.get()
assert (x == t).all()
|
quantumlib__Cirq-4780 | Fix deprecation warning for newly added `ClassicallyControlledOperation`
**Description of the issue**
The following deprecation warning is emitted on running the json serialization test and should be fixed.
```python
~/quantum/Cirq/cirq-core/cirq/protocols/json_serialization.py:283: DeprecationWarning: Found 'cirq_type': 'ClassicallyControlledOperation' in _json_dict_. Custom values of this field are not permitted, and will produce an error starting in Cirq v0.15.
```
**Cirq version**
0.14dev
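
The fix is to stop emitting `cirq_type` from `_json_dict_`; the JSON protocol supplies that key itself from the class name. A minimal sketch with a hypothetical class (not a real Cirq type):

```python
class MyGate:
    """Hypothetical serializable value (not a real Cirq class)."""

    def __init__(self, exponent):
        self.exponent = exponent

    def _json_dict_(self):
        # Return only the value's attributes; the serializer adds
        # 'cirq_type' itself, and custom values of that key are deprecated.
        return {'exponent': self.exponent}


assert 'cirq_type' not in MyGate(0.5)._json_dict_()
```

If the dictionary keys do not match the initializer arguments, a `_from_json_dict_` class method is also needed, as the updated docs note.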
| [
{
"content": "# Copyright 2021 The Cirq Developers\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by... | [
{
"content": "# Copyright 2021 The Cirq Developers\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by... | diff --git a/cirq-core/cirq/ops/classically_controlled_operation.py b/cirq-core/cirq/ops/classically_controlled_operation.py
index 74a4c3dbb53..3ac6f18bb93 100644
--- a/cirq-core/cirq/ops/classically_controlled_operation.py
+++ b/cirq-core/cirq/ops/classically_controlled_operation.py
@@ -171,7 +171,6 @@ def _circuit_diagram_info_(
def _json_dict_(self) -> Dict[str, Any]:
return {
- 'cirq_type': self.__class__.__name__,
'conditions': self._conditions,
'sub_operation': self._sub_operation,
}
diff --git a/docs/dev/serialization.md b/docs/dev/serialization.md
index 48d285b971a..4b8983e3b79 100644
--- a/docs/dev/serialization.md
+++ b/docs/dev/serialization.md
@@ -89,10 +89,8 @@ There are several steps needed to support an object's serialization and deserial
and pass `cirq-core/cirq/protocols/json_serialization_test.py`:
1. The object should have a `_json_dict_` method that returns a dictionary
-containing a `"cirq_type"` key as well as keys for each of the value's
-attributes. If these keys do not match the names of the class' initializer
-arguments, a `_from_json_dict_` class method must also be defined.
-Typically the `"cirq_type"` will be the name of your class.
+containing keys for each of the value's attributes. If these keys do not match the names of
+the class' initializer arguments, a `_from_json_dict_` class method must also be defined.
2. In `class_resolver_dictionary` within the packages's `json_resolver_cache.py` file,
for each serializable class, the `cirq_type` of the class should be mapped to the imported class
|
django__channels-1951 | HttpCommunicator does not raise exception from consumer
If `WebsocketCommunicator` encounters an error, it shows the exception raised by the underlying consumer. In contrast, `HttpCommunicator` just shows a `TimeoutError`, which is not useful for debugging.
Example tests:
```py
from channels.generic.http import AsyncHttpConsumer
from channels.generic.websocket import AsyncWebsocketConsumer
from channels.testing import HttpCommunicator
from channels.testing import WebsocketCommunicator
from django.test import TestCase
class HttpConsumer(AsyncHttpConsumer):
async def handle(self, body):
1 / 0
class WebsocketConsumer(AsyncWebsocketConsumer):
async def connect(self):
1 / 0
class ConsumerTests(TestCase):
async def test_http(self):
communicator = HttpCommunicator(HttpConsumer.as_asgi(), "GET", "/")
await communicator.get_response()
async def test_websocket(self):
communicator = WebsocketCommunicator(WebsocketConsumer.as_asgi(), "/")
connected, subprotocol = await communicator.connect()
```
Output:
```
$ python manage.py test
Found 2 test(s).
Creating test database for alias 'default'...
Destroying old test database for alias 'default'...
System check identified no issues (0 silenced).
<Task finished name='Task-2' coro=<HttpConsumer() done, defined at /Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/channels/consumer.py:92> result=None>
E<Task finished name='Task-7' coro=<WebsocketConsumer() done, defined at /Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/channels/consumer.py:92> exception=ZeroDivisionError('division by zero')>
E
======================================================================
ERROR: test_http (example.tests.ConsumerTests)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/testing.py", line 74, in receive_output
return await self.output_queue.get()
File "/Users/chainz/.pyenv/versions/3.10.8/lib/python3.10/asyncio/queues.py", line 159, in get
await getter
asyncio.exceptions.CancelledError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/sync.py", line 218, in __call__
return call_result.result()
File "/Users/chainz/.pyenv/versions/3.10.8/lib/python3.10/concurrent/futures/_base.py", line 451, in result
return self.__get_result()
File "/Users/chainz/.pyenv/versions/3.10.8/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
raise self._exception
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/sync.py", line 284, in main_wrap
result = await self.awaitable(*args, **kwargs)
File "/Users/chainz/tmp/channelstest/example/tests.py", line 21, in test_http
await communicator.get_response()
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/channels/testing/http.py", line 42, in get_response
response_start = await self.receive_output(timeout)
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/testing.py", line 86, in receive_output
raise e
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/testing.py", line 73, in receive_output
async with async_timeout(timeout):
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/timeout.py", line 65, in __aexit__
self._do_exit(exc_type)
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/timeout.py", line 102, in _do_exit
raise asyncio.TimeoutError
asyncio.exceptions.TimeoutError
======================================================================
ERROR: test_websocket (example.tests.ConsumerTests)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/testing.py", line 74, in receive_output
return await self.output_queue.get()
File "/Users/chainz/.pyenv/versions/3.10.8/lib/python3.10/asyncio/queues.py", line 159, in get
await getter
asyncio.exceptions.CancelledError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/testing.py", line 73, in receive_output
async with async_timeout(timeout):
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/timeout.py", line 65, in __aexit__
self._do_exit(exc_type)
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/timeout.py", line 102, in _do_exit
raise asyncio.TimeoutError
asyncio.exceptions.TimeoutError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/sync.py", line 218, in __call__
return call_result.result()
File "/Users/chainz/.pyenv/versions/3.10.8/lib/python3.10/concurrent/futures/_base.py", line 451, in result
return self.__get_result()
File "/Users/chainz/.pyenv/versions/3.10.8/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
raise self._exception
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/sync.py", line 284, in main_wrap
result = await self.awaitable(*args, **kwargs)
File "/Users/chainz/tmp/channelstest/example/tests.py", line 25, in test_websocket
connected, subprotocol = await communicator.connect()
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/channels/testing/websocket.py", line 36, in connect
response = await self.receive_output(timeout)
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/asgiref/testing.py", line 79, in receive_output
self.future.result()
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/channels/consumer.py", line 94, in app
return await consumer(scope, receive, send)
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/channels/consumer.py", line 62, in __call__
await await_many_dispatch([receive], self.dispatch)
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/channels/utils.py", line 50, in await_many_dispatch
await dispatch(result)
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/channels/consumer.py", line 73, in dispatch
await handler(message)
File "/Users/chainz/tmp/channelstest/venv/lib/python3.10/site-packages/channels/generic/websocket.py", line 173, in websocket_connect
await self.connect()
File "/Users/chainz/tmp/channelstest/example/tests.py", line 15, in connect
1 / 0
ZeroDivisionError: division by zero
----------------------------------------------------------------------
Ran 2 tests in 2.016s
FAILED (errors=2)
Destroying test database for alias 'default'...
```
Versions: Channels 4.0.0, Django 4.1.3, Python 3.10.8.
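
The root cause is that `AsyncHttpConsumer.http_request` raised `StopConsumer()` from inside its `finally:` block, which replaces any in-flight exception from `handle()`; the diff de-indents that `raise`. A minimal stdlib sketch of the difference, with hypothetical function names:

```python
class StopConsumer(Exception):
    """Stand-in for channels.exceptions.StopConsumer."""


def buggy():
    # a raise inside ``finally`` discards the in-flight ZeroDivisionError
    try:
        1 / 0  # stands in for a failing handle()
    finally:
        raise StopConsumer()


def fixed():
    # cleanup still runs, but StopConsumer is raised only on success
    try:
        1 / 0
    finally:
        pass  # cleanup, e.g. await self.disconnect()
    raise StopConsumer()


def raised_by(fn):
    try:
        fn()
    except BaseException as exc:
        return type(exc).__name__


assert raised_by(buggy) == 'StopConsumer'       # original error is masked
assert raised_by(fixed) == 'ZeroDivisionError'  # original error surfaces
```

With the exception masked by `StopConsumer`, the consumer task finishes "successfully", the test communicator's queue never receives output, and the caller sees only a `TimeoutError`.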
| [
{
"content": "from channels.consumer import AsyncConsumer\n\nfrom ..exceptions import StopConsumer\n\n\nclass AsyncHttpConsumer(AsyncConsumer):\n \"\"\"\n Async HTTP consumer. Provides basic primitives for building asynchronous\n HTTP endpoints.\n \"\"\"\n\n def __init__(self, *args, **kwargs):\n... | [
{
"content": "from channels.consumer import AsyncConsumer\n\nfrom ..exceptions import StopConsumer\n\n\nclass AsyncHttpConsumer(AsyncConsumer):\n \"\"\"\n Async HTTP consumer. Provides basic primitives for building asynchronous\n HTTP endpoints.\n \"\"\"\n\n def __init__(self, *args, **kwargs):\n... | diff --git a/channels/generic/http.py b/channels/generic/http.py
index 8bbf35236..909e85704 100644
--- a/channels/generic/http.py
+++ b/channels/generic/http.py
@@ -81,7 +81,7 @@ async def http_request(self, message):
await self.handle(b"".join(self.body))
finally:
await self.disconnect()
- raise StopConsumer()
+ raise StopConsumer()
async def http_disconnect(self, message):
"""
diff --git a/tests/test_generic_http.py b/tests/test_generic_http.py
index 85ecdd041..bfb889c0e 100644
--- a/tests/test_generic_http.py
+++ b/tests/test_generic_http.py
@@ -38,6 +38,19 @@ async def handle(self, body):
assert response["headers"] == [(b"Content-Type", b"application/json")]
+@pytest.mark.asyncio
+async def test_error():
+ class TestConsumer(AsyncHttpConsumer):
+ async def handle(self, body):
+ raise AssertionError("Error correctly raised")
+
+ communicator = HttpCommunicator(TestConsumer(), "GET", "/")
+ with pytest.raises(AssertionError) as excinfo:
+ await communicator.get_response(timeout=0.05)
+
+ assert str(excinfo.value) == "Error correctly raised"
+
+
@pytest.mark.asyncio
async def test_per_scope_consumers():
"""
|
sql-machine-learning__elasticdl-1666 | Parse arguments(flag) in ps/server.go
| [
{
"content": "import argparse\nfrom itertools import chain\n\nfrom elasticdl.python.common.constants import DistributionStrategy\nfrom elasticdl.python.common.log_utils import default_logger as logger\n\nMODEL_SPEC_GROUP = [\n \"dataset_fn\",\n \"eval_metrics_fn\",\n \"model_def\",\n \"model_params\... | [
{
"content": "import argparse\nfrom itertools import chain\n\nfrom elasticdl.python.common.constants import DistributionStrategy\nfrom elasticdl.python.common.log_utils import default_logger as logger\n\nMODEL_SPEC_GROUP = [\n \"dataset_fn\",\n \"eval_metrics_fn\",\n \"model_def\",\n \"model_params\... | diff --git a/elasticdl/pkg/main/main.go b/elasticdl/pkg/main/main.go
index 0f59d5f88..18e1598c5 100644
--- a/elasticdl/pkg/main/main.go
+++ b/elasticdl/pkg/main/main.go
@@ -9,8 +9,22 @@ import (
)
var (
- // TODO: parse more args
- port = flag.Int("port", 2222, "The server port")
+ jobName = flag.String("job_name", "", "ElasticDL job name")
+ namespace = flag.String("namespace", "", "The name of the Kubernetes namespace where ElasticDL pods will be created")
+ masterAddr = flag.String("master_addr", "localhost:50001", "The master pod address")
+ port = flag.Int("port", 2222, "The server port")
+ useAsync = flag.Bool("use_async", false, "true for asynchronous SGD, false for synchronous SGD")
+	gradsToWait           = flag.Int("grads_to_wait", 1, "Number of gradients to wait before updating model")
+ lrStalenessModulation = flag.Bool("lr_staleness_modulation", false, "If True, PS will modulate the learning rate with staleness")
+	syncVersionTolerance  = flag.Int("sync_version_tolerance", 0, "The maximum model version difference between reported gradients and PS that synchronous SGD can accept")
+ evaluationSteps = flag.Int("evaluation_steps", 0, "Evaluate the model every this many steps. If 0, evaluation is disabled")
+ numPsPods = flag.Int("num_ps_pods", 1, "Number of PS pod")
+ psID = flag.Int("ps_id", 0, "PS id")
+ numWorkers = flag.Int("num_workers", 1, "Number of workers")
+ checkpointDirForInit = flag.String("checkpoint_dir_for_init", "", "The checkpoint directory to initialize the training model")
+ checkpointDir = flag.String("checkpoint_dir", "", "The directory to store the checkpoint file")
+ checkpointSteps = flag.Int("checkpoint_steps", 0, "Save checkpoint every this many steps. If 0, no checkpoints to save")
+ keepCheckpointMax = flag.Int("keep_checkpoint_max", 3, "The maximum number of recent checkpoint files to keep. If 0, keep all")
)
func main() {
diff --git a/elasticdl/python/common/args.py b/elasticdl/python/common/args.py
index 2864686b5..1a98ce6da 100644
--- a/elasticdl/python/common/args.py
+++ b/elasticdl/python/common/args.py
@@ -333,7 +333,7 @@ def add_train_params(parser):
parser=parser,
name="--lr_staleness_modulation",
default=False,
- help="If True, master will modulate the learning rate with staleness "
+ help="If True, PS will modulate the learning rate with staleness "
"in asynchronous SGD",
)
|
docker__docker-py-3200 | Can't create config object
Much like https://github.com/docker/docker-py/issues/2025, the config model is failing to create a new object due to a 'name' KeyError
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "docker\models\configs.py", line 10, in __repr__
return f"<{self.__class__.__name__}: '{self.name}'>"
File "docker\models\configs.py", line 14, in name
return self.attrs['Spec']['Name']
```
This https://github.com/docker/docker-py/pull/2793 appears to be the fix that was implemented and should likely be implemented for configs as well (if not other models that might have this issue)
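
The create endpoint returns only `{'ID': ...}`, so `prepare_model` builds a `Config` whose `attrs` lack `Spec.Name`, and `__repr__` raises. The fix mirrors the secrets fix from #2793: backfill the name from the create kwargs. A minimal sketch of that pattern:

```python
def create_config(api_response, name):
    """Backfill the name the API response omits (sketch of the PR's approach)."""
    obj = dict(api_response)  # e.g. {'ID': 'sekvs771242jfdjnvfuds8232'}
    # Insert 'Spec' if missing, then record the name the caller supplied.
    obj.setdefault('Spec', {})['Name'] = name
    return obj


obj = create_config({'ID': 'abc123'}, name='super_config')
assert obj['Spec']['Name'] == 'super_config'
assert obj['ID'] == 'abc123'
```

`setdefault` keeps any `Spec` data a future API version might return while guaranteeing the `Name` key exists for `__repr__`.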
| [
{
"content": "from ..api import APIClient\nfrom .resource import Model, Collection\n\n\nclass Config(Model):\n \"\"\"A config.\"\"\"\n id_attribute = 'ID'\n\n def __repr__(self):\n return f\"<{self.__class__.__name__}: '{self.name}'>\"\n\n @property\n def name(self):\n return self.a... | [
{
"content": "from ..api import APIClient\nfrom .resource import Model, Collection\n\n\nclass Config(Model):\n \"\"\"A config.\"\"\"\n id_attribute = 'ID'\n\n def __repr__(self):\n return f\"<{self.__class__.__name__}: '{self.name}'>\"\n\n @property\n def name(self):\n return self.a... | diff --git a/docker/models/configs.py b/docker/models/configs.py
index 3588c8b5d..5ef137784 100644
--- a/docker/models/configs.py
+++ b/docker/models/configs.py
@@ -30,6 +30,7 @@ class ConfigCollection(Collection):
def create(self, **kwargs):
obj = self.client.api.create_config(**kwargs)
+ obj.setdefault("Spec", {})["Name"] = kwargs.get("name")
return self.prepare_model(obj)
create.__doc__ = APIClient.create_config.__doc__
diff --git a/tests/unit/fake_api.py b/tests/unit/fake_api.py
index 0524becdc..03e53cc64 100644
--- a/tests/unit/fake_api.py
+++ b/tests/unit/fake_api.py
@@ -19,6 +19,8 @@
FAKE_NODE_ID = '24ifsmvkjbyhk'
FAKE_SECRET_ID = 'epdyrw4tsi03xy3deu8g8ly6o'
FAKE_SECRET_NAME = 'super_secret'
+FAKE_CONFIG_ID = 'sekvs771242jfdjnvfuds8232'
+FAKE_CONFIG_NAME = 'super_config'
# Each method is prefixed with HTTP method (get, post...)
# for clarity and readability
@@ -512,6 +514,11 @@ def post_fake_secret():
response = {'ID': FAKE_SECRET_ID}
return status_code, response
+def post_fake_config():
+ status_code = 200
+ response = {'ID': FAKE_CONFIG_ID}
+ return status_code, response
+
# Maps real api url to fake response callback
prefix = 'http+docker://localhost'
@@ -630,4 +637,6 @@ def post_fake_secret():
post_fake_network_disconnect,
f'{prefix}/{CURRENT_VERSION}/secrets/create':
post_fake_secret,
+ f'{prefix}/{CURRENT_VERSION}/configs/create':
+ post_fake_config,
}
diff --git a/tests/unit/fake_api_client.py b/tests/unit/fake_api_client.py
index 95cf63b49..797994216 100644
--- a/tests/unit/fake_api_client.py
+++ b/tests/unit/fake_api_client.py
@@ -37,6 +37,7 @@ def make_fake_api_client(overrides=None):
'create_host_config.side_effect': api_client.create_host_config,
'create_network.return_value': fake_api.post_fake_network()[1],
'create_secret.return_value': fake_api.post_fake_secret()[1],
+ 'create_config.return_value': fake_api.post_fake_config()[1],
'exec_create.return_value': fake_api.post_fake_exec_create()[1],
'exec_start.return_value': fake_api.post_fake_exec_start()[1],
'images.return_value': fake_api.get_fake_images()[1],
diff --git a/tests/unit/models_configs_test.py b/tests/unit/models_configs_test.py
new file mode 100644
index 000000000..6960397ff
--- /dev/null
+++ b/tests/unit/models_configs_test.py
@@ -0,0 +1,10 @@
+import unittest
+
+from .fake_api_client import make_fake_client
+from .fake_api import FAKE_CONFIG_NAME
+
+class CreateConfigsTest(unittest.TestCase):
+ def test_create_config(self):
+ client = make_fake_client()
+ config = client.configs.create(name="super_config", data="config")
+ assert config.__repr__() == "<Config: '{}'>".format(FAKE_CONFIG_NAME)
|
googleapis__google-cloud-python-10168 | PubSub: declaratively drop Python 3.4 support
The README and the language classifiers in `setup.py` both only claim support for Python 3.5+ (and 2.7), but not Python 3.4. However, the `python_requires` in `setup.py` does not reflect that, and does not prevent installing the library in Python 3.4.
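
For reference, `python_requires` clauses like `!=3.4.*` exclude an entire minor series. A toy matcher for just those exclusion clauses (pip applies the full PEP 440 rules, so this is illustrative only):

```python
import re


def excluded_by_python_requires(
    version,
    spec=">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*",
):
    """Toy check of the ``!=X.Y.*`` clauses only, not full PEP 440."""
    for clause in spec.split(','):
        m = re.fullmatch(r'!=(\d+)\.(\d+)\.\*', clause)
        if m and version.startswith(f"{m.group(1)}.{m.group(2)}."):
            return True
    return False


assert excluded_by_python_requires("3.4.10")     # now rejected
assert not excluded_by_python_requires("3.5.2")  # still allowed
```

With the added clause, pip running under Python 3.4 refuses to install new releases instead of installing a library the README never claimed to support.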
| [
{
"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicabl... | [
{
"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicabl... | diff --git a/pubsub/setup.py b/pubsub/setup.py
index e26fb4b75778..69f19b3db72e 100644
--- a/pubsub/setup.py
+++ b/pubsub/setup.py
@@ -84,7 +84,7 @@
namespace_packages=namespaces,
install_requires=dependencies,
extras_require=extras,
- python_requires=">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*",
+ python_requires=">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*",
include_package_data=True,
zip_safe=False,
)
|
agconti__cookiecutter-django-rest-150 | Set APPEND_SLASH to False
Since Django cannot preserve POST data across a `301` redirect, `POST`s to an API at `api/v1/resource` will fail with a `400` Bad Request when the configured route is `api/v1/resource/`. With `APPEND_SLASH` set to `False`, the same `POST`s fail with a `404`, letting the developer know they have forgotten the trailing slash.
While it's convenient to redirect `GET` requests, from the perspective of consuming an API it's preferable to get direct, straightforward errors.
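
Concretely, the change is a single setting in the shared configuration class (fragment of `config/common.py`, not standalone code):

```python
class Common(Configuration):
    # Fail fast: a missing trailing slash now yields a 404 instead of a
    # 301 redirect that cannot carry the POST body.
    APPEND_SLASH = False
```

The 404 points directly at the mistyped URL rather than surfacing as a confusing 400 after the redirect.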
| [
{
"content": "import os\nfrom os.path import join\n\nfrom configurations import Configuration, values\n\nBASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))\n\n\nclass Common(Configuration):\n\n INSTALLED_APPS = (\n 'django.contrib.admin',\n 'django.contrib.auth',\n 'd... | [
{
"content": "import os\nfrom os.path import join\n\nfrom configurations import Configuration, values\n\nBASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))\n\n\nclass Common(Configuration):\n\n INSTALLED_APPS = (\n 'django.contrib.admin',\n 'django.contrib.auth',\n 'd... | diff --git a/{{cookiecutter.github_repository_name}}/{{cookiecutter.app_name}}/config/common.py b/{{cookiecutter.github_repository_name}}/{{cookiecutter.app_name}}/config/common.py
index 9c95c1466..afda9ce22 100755
--- a/{{cookiecutter.github_repository_name}}/{{cookiecutter.app_name}}/config/common.py
+++ b/{{cookiecutter.github_repository_name}}/{{cookiecutter.app_name}}/config/common.py
@@ -86,6 +86,7 @@ class Common(Configuration):
DATABASES = values.DatabaseURLValue('postgres://localhost/{{cookiecutter.app_name}}')
# General
+ APPEND_SLASH = False
TIME_ZONE = 'UTC'
LANGUAGE_CODE = 'en-us'
SITE_ID = 1
|
MAKENTNU__web-204 | Fix delete permissions for course registration
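The diff below corrects two permission references. The underlying rule is that Django derives default permission codenames as `<action>_<lowercased model class name>`, with no extra underscores inserted:

```python
def default_codename(action, model_class_name):
    """How Django builds default permission codenames (simplified)."""
    return f"{action}_{model_class_name.lower()}"


# 'Printer3DCourse' lowercases to 'printer3dcourse' -- not 'printer3d_course'
assert default_codename("delete", "Printer3DCourse") == "delete_printer3dcourse"
```

The template check additionally needs the app label prefix (`perms.make_queue.delete_printer3dcourse`), which the second hunk of the diff restores.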
| [
{
"content": "import io\n\nimport xlsxwriter\nfrom django.contrib.auth.mixins import PermissionRequiredMixin\nfrom django.db.models import Q\nfrom django.http import HttpResponse\nfrom django.shortcuts import redirect\nfrom django.urls import reverse\nfrom django.views.generic import TemplateView, View, CreateV... | [
{
"content": "import io\n\nimport xlsxwriter\nfrom django.contrib.auth.mixins import PermissionRequiredMixin\nfrom django.db.models import Q\nfrom django.http import HttpResponse\nfrom django.shortcuts import redirect\nfrom django.urls import reverse\nfrom django.views.generic import TemplateView, View, CreateV... | diff --git a/make_queue/templates/make_queue/course/course_panel.html b/make_queue/templates/make_queue/course/course_panel.html
index d9fe53cee..c83f0bca7 100644
--- a/make_queue/templates/make_queue/course/course_panel.html
+++ b/make_queue/templates/make_queue/course/course_panel.html
@@ -105,7 +105,7 @@ <h4>
<a href="{% url "edit_course_registration" registration.pk %}">
<i class="ui yellow pencil icon"></i>
</a>
- {% if perms.delete_printer3dcourse %}
+ {% if perms.make_queue.delete_printer3dcourse %}
<a class="delete confirm" href="{% url "delete_course_registration" registration.pk %}">
<i class="ui red trash icon"></i>
</a>
diff --git a/make_queue/views/admin/course.py b/make_queue/views/admin/course.py
index 4abae218c..f1aa7e57b 100644
--- a/make_queue/views/admin/course.py
+++ b/make_queue/views/admin/course.py
@@ -58,7 +58,7 @@ def get_success_url(self):
class DeleteRegistrationView(PermissionRequiredMixin, DeleteView):
model = Printer3DCourse
permission_required = (
- "make_queue.delete_printer3d_course",
+ "make_queue.delete_printer3dcourse",
)
def get_success_url(self):
|
ivy-llc__ivy-17429 | empty_like
| [
{
"content": "# global\r\nimport ivy\r\nfrom ivy.func_wrapper import with_unsupported_dtypes\r\nfrom .tensor import Tensor\r\nfrom ivy.functional.frontends.paddle.func_wrapper import (\r\n to_ivy_arrays_and_back,\r\n)\r\n\r\n\r\n@to_ivy_arrays_and_back\r\ndef to_tensor(data, /, *, dtype=None, place=None, sto... | [
{
"content": "# global\r\nimport ivy\r\nfrom ivy.func_wrapper import with_unsupported_dtypes\r\nfrom .tensor import Tensor\r\nfrom ivy.functional.frontends.paddle.func_wrapper import (\r\n to_ivy_arrays_and_back,\r\n)\r\n\r\n\r\n@to_ivy_arrays_and_back\r\ndef to_tensor(data, /, *, dtype=None, place=None, sto... | diff --git a/ivy/functional/frontends/paddle/tensor/creation.py b/ivy/functional/frontends/paddle/tensor/creation.py
index 43939f0aab490..da5b22abd7a9a 100644
--- a/ivy/functional/frontends/paddle/tensor/creation.py
+++ b/ivy/functional/frontends/paddle/tensor/creation.py
@@ -71,3 +71,8 @@ def empty(shape, dtype=None):
@to_ivy_arrays_and_back
def eye(num_rows, num_columns=None, dtype=None, name=None):
return ivy.eye(num_rows, num_columns, dtype=dtype)
+
+
+@to_ivy_arrays_and_back
+def empty_like(x, dtype=None, name=None):
+ return ivy.empty_like(x, dtype=dtype)
diff --git a/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_paddle_creation.py b/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_paddle_creation.py
index e19ce390bb7c7..d534514fed0cb 100644
--- a/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_paddle_creation.py
+++ b/ivy_tests/test_ivy/test_frontends/test_paddle/test_tensor/test_paddle_creation.py
@@ -332,3 +332,31 @@ def test_paddle_eye(
num_columns=num_columns,
dtype=dtypes[0],
)
+
+
+# empty_like
+@handle_frontend_test(
+ fn_tree="paddle.empty_like",
+ dtype_and_x=helpers.dtype_and_values(available_dtypes=helpers.get_dtypes("valid")),
+ dtype=helpers.get_dtypes("valid", full=False),
+ test_with_out=st.just(False),
+)
+def test_paddle_empty_like(
+ dtype_and_x,
+ dtype,
+ test_flags,
+ frontend,
+ fn_tree,
+ on_device,
+):
+ input_dtype, x = dtype_and_x
+ helpers.test_frontend_function(
+ input_dtypes=input_dtype,
+ frontend=frontend,
+ test_flags=test_flags,
+ fn_tree=fn_tree,
+ on_device=on_device,
+ test_values=False,
+ x=x[0],
+ dtype=dtype[0],
+ )
|
pyinstaller__pyinstaller-2225 | missing hidden import for skimage
When packaging an application that imports skimage.feature (and nothing else), the app would not run due to an ImportError on the "transform" module. This can be fixed by adding one item to the hiddenimports in the hook-skimage.transform.py file (bolded below):
> hiddenimports = ['skimage.draw.draw',
> 'skimage._shared.geometry',
> 'skimage.filters.rank.core_cy',
> **'skimage._shared.transform'**]
>
> datas = collect_data_files('skimage')
PyInstaller 3.2, Windows 7 64 bit, Python 2.7.12, Anaconda 4.1.1 distribution.
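
Applied to the hook, the reported fix is a one-line addition (hook fragment; it requires PyInstaller at build time, so it is not standalone runnable):

```python
# PyInstaller/hooks/hook-skimage.transform.py
from PyInstaller.utils.hooks import collect_data_files

hiddenimports = ['skimage.draw.draw',
                 'skimage._shared.geometry',
                 'skimage._shared.transform',  # the previously missing module
                 'skimage.filters.rank.core_cy']

datas = collect_data_files('skimage')
```

Hidden imports are needed here because these modules are loaded by compiled Cython code, which PyInstaller's static analysis cannot follow.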
| [
{
"content": "#-----------------------------------------------------------------------------\n# Copyright (c) 2014-2016, PyInstaller Development Team.\n#\n# Distributed under the terms of the GNU General Public License with exception\n# for distributing bootloader.\n#\n# The full license is in the file COPYING.... | [
{
"content": "#-----------------------------------------------------------------------------\n# Copyright (c) 2014-2016, PyInstaller Development Team.\n#\n# Distributed under the terms of the GNU General Public License with exception\n# for distributing bootloader.\n#\n# The full license is in the file COPYING.... | diff --git a/PyInstaller/hooks/hook-skimage.transform.py b/PyInstaller/hooks/hook-skimage.transform.py
index 8768c0c1f0..8c2b452094 100644
--- a/PyInstaller/hooks/hook-skimage.transform.py
+++ b/PyInstaller/hooks/hook-skimage.transform.py
@@ -12,6 +12,7 @@
# 64-bit
hiddenimports = ['skimage.draw.draw',
'skimage._shared.geometry',
+ 'skimage._shared.transform',
'skimage.filters.rank.core_cy']
datas = collect_data_files('skimage')
|
cupy__cupy-3468 | Remove mock from test requirements?
I'm learning how to write mock tests, and I noticed things like `import mock` are workarounds to support PY27 and older PY3. Since CuPy now supports PY35+ only and `mock` is part of the standard Python library, I suppose this line is no longer needed:
https://github.com/cupy/cupy/blob/74dcb4172578a0771e06f4e44b10b5f73f68fb59/setup.py#L39
and all `import mock` can be replaced by `from unittest import mock`?
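
Since Python 3.3, `mock` ships in the standard library, so the replacement is mechanical. A small self-contained example of the stdlib usage (hypothetical function names); note the diff below also swaps `assert_called_once()` for a `call_count` check, presumably because that helper only exists from Python 3.6:

```python
from unittest import mock  # stdlib since Python 3.3; no external 'mock' needed


def fetch_status(client):
    """Hypothetical function under test."""
    return client.get('/status')


client = mock.Mock()
client.get.return_value = 200
assert fetch_status(client) == 200
# Portable to Python 3.5, where Mock.assert_called_once() is unavailable:
assert client.get.call_count == 1
```

Dropping the external `mock` package from test requirements removes one dependency without changing any test behavior.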
| [
{
"content": "#!/usr/bin/env python\n\nimport os\nfrom setuptools import setup, find_packages\nimport sys\n\nimport cupy_setup_build\n\n\nif sys.version_info[:3] == (3, 5, 0):\n if not int(os.getenv('CUPY_PYTHON_350_FORCE', '0')):\n msg = \"\"\"\nCuPy does not work with Python 3.5.0.\n\nWe strongly re... | [
{
"content": "#!/usr/bin/env python\n\nimport os\nfrom setuptools import setup, find_packages\nimport sys\n\nimport cupy_setup_build\n\n\nif sys.version_info[:3] == (3, 5, 0):\n if not int(os.getenv('CUPY_PYTHON_350_FORCE', '0')):\n msg = \"\"\"\nCuPy does not work with Python 3.5.0.\n\nWe strongly re... | diff --git a/setup.py b/setup.py
index 2dba2314e41..dfa680ee986 100644
--- a/setup.py
+++ b/setup.py
@@ -36,7 +36,6 @@
'test': [
'pytest<4.2.0', # 4.2.0 is slow collecting tests and times out on CI.
'attrs<19.2.0', # pytest 4.1.1 does not run with attrs==19.2.0
- 'mock',
],
'doctest': [
'matplotlib',
diff --git a/tests/cupy_tests/core_tests/fusion_tests/test_function.py b/tests/cupy_tests/core_tests/fusion_tests/test_function.py
index 79e5acaee75..6ca8cb57659 100644
--- a/tests/cupy_tests/core_tests/fusion_tests/test_function.py
+++ b/tests/cupy_tests/core_tests/fusion_tests/test_function.py
@@ -1,7 +1,6 @@
import threading
import unittest
-
-import mock
+from unittest import mock
import cupy
from cupy import testing
@@ -200,7 +199,7 @@ def check(self, xp, func, expected_name, is_elementwise):
with mock.patch(target_full_name) as kernel:
func(a, b, c)
- kernel.assert_called_once()
+ assert kernel.call_count == 1
self.assertEqual(kernel.call_args[1]['name'], expected_name)
# Test there's no error in computation (without mock)
diff --git a/tests/cupy_tests/cuda_tests/test_compiler.py b/tests/cupy_tests/cuda_tests/test_compiler.py
index 59aab999232..6d8a21cdd1c 100644
--- a/tests/cupy_tests/cuda_tests/test_compiler.py
+++ b/tests/cupy_tests/cuda_tests/test_compiler.py
@@ -1,7 +1,6 @@
import pickle
import unittest
-
-import mock
+from unittest import mock
import cupy
from cupy.cuda import compiler
diff --git a/tests/cupy_tests/cuda_tests/test_profile.py b/tests/cupy_tests/cuda_tests/test_profile.py
index 626128877a0..1467822a76c 100644
--- a/tests/cupy_tests/cuda_tests/test_profile.py
+++ b/tests/cupy_tests/cuda_tests/test_profile.py
@@ -1,6 +1,5 @@
import unittest
-
-import mock
+from unittest import mock
from cupy import cuda
diff --git a/tests/cupy_tests/prof_tests/test_range.py b/tests/cupy_tests/prof_tests/test_range.py
index f1ab075108a..4cec205cfb3 100644
--- a/tests/cupy_tests/prof_tests/test_range.py
+++ b/tests/cupy_tests/prof_tests/test_range.py
@@ -1,6 +1,5 @@
import unittest
-
-import mock
+from unittest import mock
from cupy import cuda
from cupy import prof
diff --git a/tests/cupy_tests/random_tests/test_sample.py b/tests/cupy_tests/random_tests/test_sample.py
index b99627e609c..d3dacb887a3 100644
--- a/tests/cupy_tests/random_tests/test_sample.py
+++ b/tests/cupy_tests/random_tests/test_sample.py
@@ -1,5 +1,5 @@
-import mock
import unittest
+from unittest import mock
import numpy
diff --git a/tests/cupy_tests/test_init.py b/tests/cupy_tests/test_init.py
index ea06ef0f63a..c8e01086871 100644
--- a/tests/cupy_tests/test_init.py
+++ b/tests/cupy_tests/test_init.py
@@ -4,8 +4,8 @@
import sys
import tempfile
import unittest
+from unittest import mock
-import mock
import numpy
import cupy
diff --git a/tests/cupyx_tests/scipy_tests/sparse_tests/test_construct.py b/tests/cupyx_tests/scipy_tests/sparse_tests/test_construct.py
index 55cca5df142..951dae01917 100644
--- a/tests/cupyx_tests/scipy_tests/sparse_tests/test_construct.py
+++ b/tests/cupyx_tests/scipy_tests/sparse_tests/test_construct.py
@@ -1,7 +1,7 @@
import re
import unittest
+from unittest import mock
-import mock
import numpy
import pytest
try:
diff --git a/tests/cupyx_tests/test_optimize.py b/tests/cupyx_tests/test_optimize.py
index 0e33d013a7f..28927f62e40 100644
--- a/tests/cupyx_tests/test_optimize.py
+++ b/tests/cupyx_tests/test_optimize.py
@@ -1,7 +1,8 @@
-import mock
-import pytest
import tempfile
import unittest
+from unittest import mock
+
+import pytest
import cupy
from cupy import testing
diff --git a/tests/cupyx_tests/test_runtime.py b/tests/cupyx_tests/test_runtime.py
index bdaed711ef7..0df56e7e757 100644
--- a/tests/cupyx_tests/test_runtime.py
+++ b/tests/cupyx_tests/test_runtime.py
@@ -1,6 +1,5 @@
import unittest
-
-import mock
+from unittest import mock
import cupy
import cupyx
diff --git a/tests/cupyx_tests/test_time.py b/tests/cupyx_tests/test_time.py
index bccf78c99fc..79973a35fa1 100644
--- a/tests/cupyx_tests/test_time.py
+++ b/tests/cupyx_tests/test_time.py
@@ -1,5 +1,5 @@
-import mock
import unittest
+from unittest import mock
import numpy
|
mkdocs__mkdocs-904 | Error while executing gh-deploy
I've successfully deployed a MkDocs site using the gh-deploy command. When I try to deploy some additional changes to my master branch, I get the following error:
```
c:\docs>mkdocs gh-deploy --clean
INFO - Cleaning site directory
INFO - Building documentation to directory: c:\docs\site
INFO - Copying 'c:\docs\site' to 'gh-pages' branch and pushing to GitHub.
Traceback (most recent call last):
File "C:\Python34\lib\runpy.py", line 170, in _run_module_as_main
"__main__", mod_spec)
File "C:\Python34\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "c:\Python34\Scripts\mkdocs.exe\__main__.py", line 9, in <module>
File "C:\Python34\lib\site-packages\click\core.py", line 664, in __call__
return self.main(*args, **kwargs)
File "C:\Python34\lib\site-packages\click\core.py", line 644, in main
rv = self.invoke(ctx)
File "C:\Python34\lib\site-packages\click\core.py", line 991, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "C:\Python34\lib\site-packages\click\core.py", line 837, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "C:\Python34\lib\site-packages\click\core.py", line 464, in invoke
return callback(*args, **kwargs)
File "C:\Python34\lib\site-packages\mkdocs\cli.py", line 186, in gh_deploy_command
gh_deploy.gh_deploy(config, message=message)
File "C:\Python34\lib\site-packages\mkdocs\gh_deploy.py", line 69, in gh_deploy
remote_branch)
File "C:\Python34\lib\site-packages\mkdocs\utils\ghp_import.py", line 163, in ghp_import
if not try_rebase(remote, branch):
File "C:\Python34\lib\site-packages\mkdocs\utils\ghp_import.py", line 78, in try_rebase
if sp.call(cmd) != 0:
File "C:\Python34\lib\subprocess.py", line 537, in call
with Popen(*popenargs, **kwargs) as p:
File "C:\Python34\lib\subprocess.py", line 859, in __init__
restore_signals, start_new_session)
File "C:\Python34\lib\subprocess.py", line 1086, in _execute_child
args = list2cmdline(args)
File "C:\Python34\lib\subprocess.py", line 663, in list2cmdline
needquote = (" " in arg) or ("\t" in arg) or not arg
TypeError: 'str' does not support the buffer interface
```
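The root cause is the usual Python 3 bytes/str split: `Popen.communicate()` returns `bytes`, and `list2cmdline` on Windows fails when a `bytes` value is mixed into a command list of `str` arguments. A minimal sketch of the failing pattern and the decode that fixes it (illustrative, not the actual ghp_import code; the child command here is a stand-in):

```python
import subprocess
import sys

# On Python 3, stdout captured from a child process is bytes.
p = subprocess.Popen(
    [sys.executable, "-c", "print('abc123')"],  # stand-in for `git rev-parse`
    stdout=subprocess.PIPE,
)
rev, _ = p.communicate()
assert isinstance(rev, bytes)

# Mixing `rev` (bytes) into a list of str arguments is what blows up
# in list2cmdline on Windows; decoding first keeps the list uniform.
ref = rev.strip().decode("utf-8")
cmd = ["git", "update-ref", "refs/heads/gh-pages", ref]
assert all(isinstance(arg, str) for arg in cmd)
```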
| [
{
"content": "#! /usr/bin/env python\n#\n# This file is part of the ghp-import package released under\n# the Tumbolia Public License.\n\n# Tumbolia Public License\n\n# Copyright 2013, Paul Davis <paul.joseph.davis@gmail.com>\n\n# Copying and distribution of this file, with or without ... | [
{
"content": "#! /usr/bin/env python\n#\n# This file is part of the ghp-import package released under\n# the Tumbolia Public License.\n\n# Tumbolia Public License\n\n# Copyright 2013, Paul Davis <paul.joseph.davis@gmail.com>\n\n# Copying and distribution of this file, with or without ... | diff --git a/mkdocs/utils/ghp_import.py b/mkdocs/utils/ghp_import.py
index c7cc85c091..d6f543563f 100644
--- a/mkdocs/utils/ghp_import.py
+++ b/mkdocs/utils/ghp_import.py
@@ -74,7 +74,7 @@ def try_rebase(remote, branch):
(rev, _) = p.communicate()
if p.wait() != 0:
return True
- cmd = ['git', 'update-ref', 'refs/heads/%s' % branch, rev.strip()]
+ cmd = ['git', 'update-ref', 'refs/heads/%s' % branch, dec(rev.strip())]
if sp.call(cmd) != 0:
return False
return True
|
ansible-collections__community.aws-1207 | ec2_customer_gateway: bgp_asn is not required
### Summary
The ec2_customer_gateway module has incorrect documentation for the bgp_asn parameter.
It says the ASN must be passed when state=present, but the code defaults to 65000 if the parameter is absent. See the ensure_cgw_present() method:
```
def ensure_cgw_present(self, bgp_asn, ip_address):
if not bgp_asn:
bgp_asn = 65000
response = self.ec2.create_customer_gateway(
DryRun=False,
Type='ipsec.1',
PublicIp=ip_address,
BgpAsn=bgp_asn,
)
    return response
```
### Issue Type
Documentation Report
### Component Name
ec2_customer_gateway
### Ansible Version
```console (paste below)
$ ansible --version
ansible [core 2.12.4]
config file = None
configured module search path = ['/home/neil/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/neil/.local/share/virtualenvs/community.aws-uRL047Ho/lib/python3.10/site-packages/ansible
ansible collection location = /home/neil/.ansible/collections:/usr/share/ansible/collections
executable location = /home/neil/.local/share/virtualenvs/community.aws-uRL047Ho/bin/ansible
python version = 3.10.1 (main, Jan 10 2022, 00:00:00) [GCC 11.2.1 20211203 (Red Hat 11.2.1-7)]
jinja version = 3.1.1
libyaml = True
```
### Collection Versions
```console (paste below)
$ ansible-galaxy collection list
```
### Configuration
```console (paste below)
$ ansible-config dump --only-changed
```
### OS / Environment
main branch, as of 2022-04-18.
### Additional Information
Suggested rewording:
```
options:
bgp_asn:
description:
- Border Gateway Protocol (BGP) Autonomous System Number (ASN), defaults to 65000.
type: int
```
### Code of Conduct
- [X] I agree to follow the Ansible Code of Conduct
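The fallback the reworded docs describe can be checked in isolation. A hedged sketch of the default logic quoted above (return value simplified; the real method calls `create_customer_gateway` on the EC2 client):

```python
# Simplified from the module source quoted above: omitting bgp_asn
# silently falls back to ASN 65000, so the parameter is not required.
def ensure_cgw_present(bgp_asn, ip_address):
    if not bgp_asn:
        bgp_asn = 65000
    return {"Type": "ipsec.1", "PublicIp": ip_address, "BgpAsn": bgp_asn}

assert ensure_cgw_present(None, "203.0.113.1")["BgpAsn"] == 65000
assert ensure_cgw_present(64512, "203.0.113.1")["BgpAsn"] == 64512
```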
| [
{
"content": "#!/usr/bin/python\n#\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\nDOCUMENTATION = '''\n---\nmodule: ec2_customer_gateway\nversion_added: 1.0.0\nshort_desc... | [
{
"content": "#!/usr/bin/python\n#\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\nDOCUMENTATION = '''\n---\nmodule: ec2_customer_gateway\nversion_added: 1.0.0\nshort_desc... | diff --git a/plugins/modules/ec2_customer_gateway.py b/plugins/modules/ec2_customer_gateway.py
index 9c00783a58a..f07e92f4f7c 100644
--- a/plugins/modules/ec2_customer_gateway.py
+++ b/plugins/modules/ec2_customer_gateway.py
@@ -23,7 +23,8 @@
options:
bgp_asn:
description:
- - Border Gateway Protocol (BGP) Autonomous System Number (ASN), required when I(state=present).
+ - Border Gateway Protocol (BGP) Autonomous System Number (ASN).
+ - Defaults to C(65000) if not specified when I(state=present).
type: int
ip_address:
description:
|
biolab__orange3-text-358 | Guardian: Fix failing tests on Travis
##### Text version
<!-- From menu _Options→Add-ons→Orange3-Text_ or code `orangecontrib.text.version.full_version` -->
0.3.0
##### Orange version
<!-- From menu _Help→About→Version_ or code `Orange.version.full_version` -->
3.15.dev
##### Expected behavior
Tests pass.
##### Actual behavior
Guardian tests are failing.
##### Steps to reproduce the behavior
##### Additional info (worksheets, data, screenshots, ...)
Fix tests.
| [
{
"content": "\"\"\" This module fetches data from The Guardian API.\n\nTo use first create :class:`TheGuardianCredentials`:\n\n >>> from orangecontrib.text.guardian import TheGuardianCredentials\n >>> credentials = TheGuardianCredentials('<your-api-key>')\n\nThen create :class:`TheGuardianAPI` object and... | [
{
"content": "\"\"\" This module fetches data from The Guardian API.\n\nTo use first create :class:`TheGuardianCredentials`:\n\n >>> from orangecontrib.text.guardian import TheGuardianCredentials\n >>> credentials = TheGuardianCredentials('<your-api-key>')\n\nThen create :class:`TheGuardianAPI` object and... | diff --git a/orangecontrib/text/guardian.py b/orangecontrib/text/guardian.py
index 56177f642..d7222d41e 100644
--- a/orangecontrib/text/guardian.py
+++ b/orangecontrib/text/guardian.py
@@ -155,7 +155,7 @@ def search(self, query, from_date=None, to_date=None, max_documents=None,
if __name__ == '__main__':
- credentials = TheGuardianCredentials('')
+ credentials = TheGuardianCredentials('test')
print(credentials.valid)
api = TheGuardianAPI(credentials=credentials)
c = api.search('refugees', max_documents=10)
|
ipython__ipython-8798 | `pip install ipython[all]` ignores platform dependent dependencies
If I try to run `pip install ipython[all]` on my python install on windows (Win 7 64-bit, WinPython 2.7.10), it fails with the following:
```
C:\Python\WinPython-64bit-2.7.10.1\python-2.7.10.amd64>pip install --upgrade ipy
thon[all]
Requirement already up-to-date: ipython[all] in c:\python\winpython-64bit-2.7.10
.1\python-2.7.10.amd64\lib\site-packages
Requirement already up-to-date: decorator in c:\python\winpython-64bit-2.7.10.1\
python-2.7.10.amd64\lib\site-packages (from ipython[all])
Requirement already up-to-date: simplegeneric>0.8 in c:\python\winpython-64bit-2
.7.10.1\python-2.7.10.amd64\lib\site-packages (from ipython[all])
Requirement already up-to-date: traitlets in c:\python\winpython-64bit-2.7.10.1\
python-2.7.10.amd64\lib\site-packages (from ipython[all])
Requirement already up-to-date: pickleshare in c:\python\winpython-64bit-2.7.10.
1\python-2.7.10.amd64\lib\site-packages (from ipython[all])
Requirement already up-to-date: nose>=0.10.1 in c:\python\winpython-64bit-2.7.10
.1\python-2.7.10.amd64\lib\site-packages (from ipython[all])
Collecting ipyparallel (from ipython[all])
Downloading ipyparallel-4.0.2-py2.py3-none-any.whl (164kB)
100% |################################| 167kB 718kB/s
Requirement already up-to-date: notebook in c:\python\winpython-64bit-2.7.10.1\p
ython-2.7.10.amd64\lib\site-packages (from ipython[all])
Requirement already up-to-date: requests in c:\python\winpython-64bit-2.7.10.1\p
ython-2.7.10.amd64\lib\site-packages (from ipython[all])
Requirement already up-to-date: nbformat in c:\python\winpython-64bit-2.7.10.1\p
ython-2.7.10.amd64\lib\site-packages (from ipython[all])
Collecting pyreadline>=2 (from ipython[all])
Downloading pyreadline-2.0.zip (108kB)
100% |################################| 110kB 2.0MB/s
Requirement already up-to-date: nbconvert in c:\python\winpython-64bit-2.7.10.1\
python-2.7.10.amd64\lib\site-packages (from ipython[all])
Collecting testpath (from ipython[all])
Downloading testpath-0.2-py2.py3-none-any.whl
Requirement already up-to-date: ipykernel in c:\python\winpython-64bit-2.7.10.1\
python-2.7.10.amd64\lib\site-packages (from ipython[all])
Requirement already up-to-date: numpydoc in c:\python\winpython-64bit-2.7.10.1\p
ython-2.7.10.amd64\lib\site-packages (from ipython[all])
Requirement already up-to-date: qtconsole in c:\python\winpython-64bit-2.7.10.1\
python-2.7.10.amd64\lib\site-packages (from ipython[all])
Requirement already up-to-date: Sphinx>=1.1 in c:\python\winpython-64bit-2.7.10.
1\python-2.7.10.amd64\lib\site-packages (from ipython[all])
Collecting mock (from ipython[all])
Downloading mock-1.3.0-py2.py3-none-any.whl (56kB)
100% |################################| 57kB 2.9MB/s
Collecting gnureadline (from ipython[all])
Downloading gnureadline-6.3.3.tar.gz (2.5MB)
100% |################################| 2.5MB 128kB/s
Complete output from command python setup.py egg_info:
Error: this module is not meant to work on Windows (try pyreadline instead)
----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in <userpath>\ap
pdata\local\temp\pip-build-scarmp\gnureadline
```
pip is v7.1.2.
Now, I wasn't able to figure out from pip and setuptools docs how `[all]` is supposed to work, so I'm not sure if this is an issue with the ipython setup file, setuptools, or pip, but I figured this would be the best place to start.
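For context on the mechanism: setuptools treats an `extras_require` key containing `:` as a conditional (environment-marker) dependency rather than a user-installable extra, e.g. `":sys_platform != 'win32'"` mapping to `gnureadline`. Building `all` by unioning every value therefore drags platform-specific packages onto the wrong platform. A minimal, illustrative sketch of the aggregation fix (the dependency names below are just examples):

```python
# Keys containing ':' are setuptools conditional dependencies, not
# real extras, so an aggregated "all" extra must skip them.
extras_require = {
    "test": ["nose"],
    ":sys_platform != 'win32'": ["gnureadline"],
    ":sys_platform == 'win32'": ["pyreadline"],
}

everything = set()
for key, deps in extras_require.items():
    if ":" not in key:          # genuine extras only
        everything.update(deps)
extras_require["all"] = sorted(everything)

assert extras_require["all"] == ["nose"]   # no readline variants leak in
```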
| [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\"\"\"Setup script for IPython.\n\nUnder Posix environments it works like a typical setup.py script.\nUnder Windows, the command sdist is not supported, since IPython\nrequires utilities which are not available under Windows.\"\"\"\n\n#--------------... | [
{
"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\"\"\"Setup script for IPython.\n\nUnder Posix environments it works like a typical setup.py script.\nUnder Windows, the command sdist is not supported, since IPython\nrequires utilities which are not available under Windows.\"\"\"\n\n#--------------... | diff --git a/setup.py b/setup.py
index dd0e3727a9c..e871b6fe2d1 100755
--- a/setup.py
+++ b/setup.py
@@ -221,8 +221,9 @@ def run(self):
install_requires.append('pexpect')
everything = set()
-for deps in extras_require.values():
- everything.update(deps)
+for key, deps in extras_require.items():
+ if ':' not in key:
+ everything.update(deps)
extras_require['all'] = everything
if 'setuptools' in sys.modules:
|
Parsl__parsl-186 | Allow `DataFuture` to be initialized with a `str` file object
[Here](https://github.com/Parsl/parsl/blob/master/parsl/app/futures.py#L77) we check if `file_obj` is `str`. Now that `File` is subclassed from `str`, this will always evaluate as `True`.
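The point can be demonstrated in a few lines (a simplified stand-in for parsl's `File` class): `isinstance` is true for subclasses, so distinguishing a plain path string needs an explicit exclusion:

```python
class File(str):  # simplified stand-in for parsl's File class
    pass

f = File("/tmp/out.txt")
assert isinstance(f, str)  # True for the subclass too: the reported bug

def wrap(file_obj):
    # Only wrap genuine plain strings; pass File objects through.
    if isinstance(file_obj, str) and not isinstance(file_obj, File):
        file_obj = File(file_obj)
    return file_obj

assert type(wrap("/tmp/a.txt")) is File   # plain str gets wrapped
assert wrap(f) is f                       # File passes through unchanged
```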
| [
{
"content": "\"\"\"This module implements DataFutures.\n\nWe have two basic types of futures:\n 1. DataFutures which represent data objects\n 2. AppFutures which represent the futures on App/Leaf tasks.\n\"\"\"\nimport os\nimport logging\nfrom concurrent.futures import Future\n\nfrom parsl.dataflow.futur... | [
{
"content": "\"\"\"This module implements DataFutures.\n\nWe have two basic types of futures:\n 1. DataFutures which represent data objects\n 2. AppFutures which represent the futures on App/Leaf tasks.\n\"\"\"\nimport os\nimport logging\nfrom concurrent.futures import Future\n\nfrom parsl.dataflow.futur... | diff --git a/parsl/app/futures.py b/parsl/app/futures.py
index 41b0c7946b..9b13e13aec 100644
--- a/parsl/app/futures.py
+++ b/parsl/app/futures.py
@@ -74,7 +74,7 @@ def __init__(self, fut, file_obj, parent=None, tid=None):
"""
super().__init__()
self._tid = tid
- if isinstance(file_obj, str):
+ if isinstance(file_obj, str) and not isinstance(file_obj, File):
self.file_obj = File(file_obj)
else:
self.file_obj = file_obj
|
microsoft__botbuilder-python-1190 | No module named 'botbuilder.ai.qna.dialogs' - Python QnA Sample 49
## Version
botbuilder-ai - 4.9.1
## Describe the bug
I was trying out the QnA Maker sample 49.qnamaker-all-features. I've configured my QnA KB and also config.py with the necessary info. However, the module botbuilder.ai.qna.dialogs does not seem to exist. I've manually verified that the class QnAMakerDialog does not exist either.
> from botbuilder.ai.qna.dialogs import QnAMakerDialog
## To Reproduce
Steps to reproduce the behavior:
1. Download the sample 49.qnamaker-all-features
2. Install the necessary requirements and configure QnAMaker.
3. Run python app.py in the folder
## Expected behavior
The sample should've run successfully.
[bug]
| [
{
"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nimport os\nfrom setuptools import setup\n\nREQUIRES = [\n \"azure-cognitiveservices-language-luis==0.2.0\",\n \"botbuilder-schema>=4.7.1\",\n \"botbuilder-core>=4.7.1\",\n \"aiohttp==3.6.2... | [
{
"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License.\n\nimport os\nfrom setuptools import setup\n\nREQUIRES = [\n \"azure-cognitiveservices-language-luis==0.2.0\",\n \"botbuilder-schema>=4.7.1\",\n \"botbuilder-core>=4.7.1\",\n \"aiohttp==3.6.2... | diff --git a/libraries/botbuilder-ai/setup.py b/libraries/botbuilder-ai/setup.py
index 72f112a5a..65b7a8d85 100644
--- a/libraries/botbuilder-ai/setup.py
+++ b/libraries/botbuilder-ai/setup.py
@@ -39,6 +39,7 @@
"botbuilder.ai.luis",
"botbuilder.ai.qna.models",
"botbuilder.ai.qna.utils",
+ "botbuilder.ai.qna.dialogs",
],
install_requires=REQUIRES + TESTS_REQUIRES,
tests_require=TESTS_REQUIRES,
|
OpenNMT__OpenNMT-tf-953 | An issue with SequenceRecordInputter ?
I tried to create a SequenceClassifier model that used SequenceRecordInputter as part of a ParallelInputter; it produced an error before finishing the first training step. After isolating the problem, the SequenceRecordInputter dataset generation appears to be the source:
```python3
import numpy as np
from opennmt import encoders, inputters, models, Runner
vectors = []
for i in range(1000):
vectors.append(np.random.rand(np.random.randint(1, 9), 16))
inputters.create_sequence_records(vectors, "train.records")
with open("train_labels.txt", "w") as f:
f.write("\n".join(np.random.randint(0, 2, 1000).astype("str")))
with open("labels_vocab.txt", "w") as f:
f.write("\n".join(["0", "1"]))
model = models.SequenceClassifier(
inputters.SequenceRecordInputter(16),
encoders.SelfAttentionEncoder(
num_layers=2, num_units=16, num_heads=4, ffn_inner_dim=64
),
)
config = {
"model_dir": ".",
"data": {
"target_vocabulary": "labels_vocab.txt",
"train_features_file": "train.records",
"train_labels_file": "train_labels.txt",
},
"params": {"optimizer": "Adam", "learning_rate": 0.001},
"train": {"batch_size": 1, "max_step": 2},
}
runner = Runner(model, config, auto_config=False)
runner.train()
```
Error text
```
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
Input In [12], in <cell line: 1>()
----> 1 runner.train()
File ~/.local/lib/python3.9/site-packages/opennmt/runner.py:281, in Runner.train(self, num_devices, with_eval, checkpoint_path, hvd, return_summary, fallback_to_cpu, continue_from_checkpoint)
278 else:
279 trainer = training_util.Trainer(model, optimizer, checkpoint=checkpoint)
--> 281 summary = trainer(
282 dataset_fn,
283 max_step=train_config.get("max_step"),
284 accum_steps=accum_steps,
285 report_steps=train_config.get("save_summary_steps", 100),
286 save_steps=train_config.get("save_checkpoints_steps", 5000),
287 evaluator=evaluator,
288 eval_steps=eval_config.get("steps", 5000),
289 moving_average_decay=train_config.get("moving_average_decay"),
290 )
292 average_last_checkpoints = train_config.get("average_last_checkpoints", 0)
293 if checkpoint is None:
File ~/.local/lib/python3.9/site-packages/opennmt/training.py:109, in Trainer.__call__(self, dataset, max_step, accum_steps, report_steps, save_steps, evaluator, eval_steps, moving_average_decay)
107 step = None
108 moving_average = None
--> 109 for i, loss in enumerate(
110 self._steps(dataset, accum_steps=accum_steps, report_steps=report_steps)
111 ):
112 if i == 0:
113 self._log_model_info()
File ~/.local/lib/python3.9/site-packages/opennmt/training.py:221, in Trainer._steps(self, dataset, accum_steps, report_steps)
209 def _steps(self, dataset, accum_steps=1, report_steps=None):
210 """Returns a generator over training steps (i.e. parameters update).
211
212 Args:
(...)
219 A generator that yields a loss value to report for this step.
220 """
--> 221 dataset = self._finalize_dataset(dataset)
222 iterator = iter(dataset)
224 # We define 2 separate functions to support gradient accumulation:
225 # * forward: compute and accumulate the gradients
226 # * step: apply the gradients
227 # When gradient accumulation is disabled, the forward function also applies the gradients.
File ~/.local/lib/python3.9/site-packages/opennmt/training.py:206, in Trainer._finalize_dataset(self, dataset)
196 """Returns the final dataset instance to be used for training.
197
198 Args:
(...)
203 A ``tf.data.Dataset``.
204 """
205 if callable(dataset):
--> 206 dataset = dataset(tf.distribute.InputContext())
207 return dataset
File ~/.local/lib/python3.9/site-packages/opennmt/runner.py:220, in Runner.train.<locals>.<lambda>(input_context)
216 batch_type = train_config["batch_type"]
217 batch_size_multiple = 8 if mixed_precision and batch_type == "tokens" else 1
219 dataset_fn = (
--> 220 lambda input_context: model.examples_inputter.make_training_dataset(
221 data_config["train_features_file"],
222 data_config.get("train_labels_file"),
223 train_config["batch_size"],
224 batch_type=batch_type,
225 batch_size_multiple=batch_size_multiple,
226 shuffle_buffer_size=train_config["sample_buffer_size"],
227 length_bucket_width=train_config["length_bucket_width"],
228 maximum_features_length=train_config.get("maximum_features_length"),
229 maximum_labels_length=train_config.get("maximum_labels_length"),
230 single_pass=train_config.get("single_pass", False),
231 num_shards=input_context.num_input_pipelines,
232 shard_index=input_context.input_pipeline_id,
233 prefetch_buffer_size=train_config.get("prefetch_buffer_size"),
234 cardinality_multiple=input_context.num_replicas_in_sync,
235 weights=data_config.get("train_files_weights"),
236 batch_autotune_mode=train_config.get("batch_autotune_mode"),
237 )
238 )
240 checkpoint = None
241 evaluator = None
File ~/.local/lib/python3.9/site-packages/opennmt/inputters/inputter.py:834, in ExampleInputterAdapter.make_training_dataset(self, features_file, labels_file, batch_size, batch_type, batch_multiplier, batch_size_multiple, shuffle_buffer_size, length_bucket_width, maximum_features_length, maximum_labels_length, single_pass, num_shards, shard_index, num_threads, prefetch_buffer_size, cardinality_multiple, weights, batch_autotune_mode)
832 if weights is not None:
833 dataset = (dataset, weights)
--> 834 dataset = dataset_util.training_pipeline(
835 batch_size,
836 batch_type=batch_type,
837 batch_multiplier=batch_multiplier,
838 batch_size_multiple=batch_size_multiple,
839 transform_fns=transform_fns,
840 length_bucket_width=length_bucket_width,
841 features_length_fn=features_length_fn,
842 labels_length_fn=labels_length_fn,
843 single_pass=single_pass,
844 num_shards=num_shards,
845 shard_index=shard_index,
846 num_threads=num_threads,
847 dataset_size=self.get_dataset_size(data_files),
848 shuffle_buffer_size=shuffle_buffer_size,
849 prefetch_buffer_size=prefetch_buffer_size,
850 cardinality_multiple=cardinality_multiple,
851 )(dataset)
852 return dataset
File ~/.local/lib/python3.9/site-packages/opennmt/data/dataset.py:637, in training_pipeline.<locals>._pipeline(dataset)
635 if labels_length_fn is not None:
636 length_fn.append(labels_length_fn)
--> 637 dataset = dataset.apply(
638 batch_sequence_dataset(
639 batch_size,
640 batch_type=batch_type,
641 batch_multiplier=batch_multiplier,
642 batch_size_multiple=batch_size_multiple,
643 length_bucket_width=length_bucket_width,
644 length_fn=length_fn,
645 )
646 )
647 dataset = dataset.apply(filter_irregular_batches(batch_multiplier))
648 if not single_pass:
File ~/.local/lib/python3.9/site-packages/tensorflow/python/data/ops/dataset_ops.py:2270, in DatasetV2.apply(self, transformation_func)
2248 def apply(self, transformation_func):
2249 """Applies a transformation function to this dataset.
2250
2251 `apply` enables chaining of custom `Dataset` transformations, which are
(...)
2268 dataset.
2269 """
-> 2270 dataset = transformation_func(self)
2271 if not isinstance(dataset, DatasetV2):
2272 raise TypeError(
2273 f"`transformation_func` must return a `tf.data.Dataset` object. "
2274 f"Got {type(dataset)}.")
File ~/.local/lib/python3.9/site-packages/opennmt/data/dataset.py:482, in batch_sequence_dataset.<locals>.<lambda>(dataset)
475 else:
476 raise ValueError(
477 "Invalid batch type: '{}'; should be 'examples' or 'tokens'".format(
478 batch_type
479 )
480 )
--> 482 return lambda dataset: dataset.group_by_window(_key_func, _reduce_func, **kwargs)
File ~/.local/lib/python3.9/site-packages/tensorflow/python/data/ops/dataset_ops.py:2823, in DatasetV2.group_by_window(self, key_func, reduce_func, window_size, window_size_func, name)
2819 window_size_func = constant_window_func
2821 assert window_size_func is not None
-> 2823 return _GroupByWindowDataset(
2824 self, key_func, reduce_func, window_size_func, name=name)
File ~/.local/lib/python3.9/site-packages/tensorflow/python/data/ops/dataset_ops.py:5683, in _GroupByWindowDataset.__init__(self, input_dataset, key_func, reduce_func, window_size_func, name)
5681 """See `group_by_window()` for details."""
5682 self._input_dataset = input_dataset
-> 5683 self._make_key_func(key_func, input_dataset)
5684 self._make_reduce_func(reduce_func, input_dataset)
5685 self._make_window_size_func(window_size_func)
File ~/.local/lib/python3.9/site-packages/tensorflow/python/data/ops/dataset_ops.py:5721, in _GroupByWindowDataset._make_key_func(self, key_func, input_dataset)
5718 def key_func_wrapper(*args):
5719 return ops.convert_to_tensor(key_func(*args), dtype=dtypes.int64)
-> 5721 self._key_func = structured_function.StructuredFunctionWrapper(
5722 key_func_wrapper, self._transformation_name(), dataset=input_dataset)
5723 if not self._key_func.output_structure.is_compatible_with(
5724 tensor_spec.TensorSpec([], dtypes.int64)):
5725 raise ValueError(f"Invalid `key_func`. `key_func` must return a single "
5726 f"`tf.int64` scalar tensor but its return type is "
5727 f"{self._key_func.output_structure}.")
File ~/.local/lib/python3.9/site-packages/tensorflow/python/data/ops/structured_function.py:271, in StructuredFunctionWrapper.__init__(self, func, transformation_name, dataset, input_classes, input_shapes, input_types, input_structure, add_to_graph, use_legacy_function, defun_kwargs)
264 warnings.warn(
265 "Even though the `tf.config.experimental_run_functions_eagerly` "
266 "option is set, this option does not apply to tf.data functions. "
267 "To force eager execution of tf.data functions, please use "
268 "`tf.data.experimental.enable_debug_mode()`.")
269 fn_factory = trace_tf_function(defun_kwargs)
--> 271 self._function = fn_factory()
272 # There is no graph to add in eager mode.
273 add_to_graph &= not context.executing_eagerly()
File ~/.local/lib/python3.9/site-packages/tensorflow/python/eager/function.py:2567, in Function.get_concrete_function(self, *args, **kwargs)
2558 def get_concrete_function(self, *args, **kwargs):
2559 """Returns a `ConcreteFunction` specialized to inputs and execution context.
2560
2561 Args:
(...)
2565 or `tf.Tensor` or `tf.TensorSpec`.
2566 """
-> 2567 graph_function = self._get_concrete_function_garbage_collected(
2568 *args, **kwargs)
2569 graph_function._garbage_collector.release() # pylint: disable=protected-access
2570 return graph_function
File ~/.local/lib/python3.9/site-packages/tensorflow/python/eager/function.py:2533, in Function._get_concrete_function_garbage_collected(self, *args, **kwargs)
2531 args, kwargs = None, None
2532 with self._lock:
-> 2533 graph_function, _ = self._maybe_define_function(args, kwargs)
2534 seen_names = set()
2535 captured = object_identity.ObjectIdentitySet(
2536 graph_function.graph.internal_captures)
File ~/.local/lib/python3.9/site-packages/tensorflow/python/eager/function.py:2711, in Function._maybe_define_function(self, args, kwargs)
2708 cache_key = self._function_cache.generalize(cache_key)
2709 (args, kwargs) = cache_key._placeholder_value() # pylint: disable=protected-access
-> 2711 graph_function = self._create_graph_function(args, kwargs)
2712 self._function_cache.add(cache_key, cache_key_deletion_observer,
2713 graph_function)
2715 return graph_function, filtered_flat_args
File ~/.local/lib/python3.9/site-packages/tensorflow/python/eager/function.py:2627, in Function._create_graph_function(self, args, kwargs)
2622 missing_arg_names = [
2623 "%s_%d" % (arg, i) for i, arg in enumerate(missing_arg_names)
2624 ]
2625 arg_names = base_arg_names + missing_arg_names
2626 graph_function = ConcreteFunction(
-> 2627 func_graph_module.func_graph_from_py_func(
2628 self._name,
2629 self._python_function,
2630 args,
2631 kwargs,
2632 self.input_signature,
2633 autograph=self._autograph,
2634 autograph_options=self._autograph_options,
2635 arg_names=arg_names,
2636 capture_by_value=self._capture_by_value),
2637 self._function_attributes,
2638 spec=self.function_spec,
2639 # Tell the ConcreteFunction to clean up its graph once it goes out of
2640 # scope. This is not the default behavior since it gets used in some
2641 # places (like Keras) where the FuncGraph lives longer than the
2642 # ConcreteFunction.
2643 shared_func_graph=False)
2644 return graph_function
File ~/.local/lib/python3.9/site-packages/tensorflow/python/framework/func_graph.py:1141, in func_graph_from_py_func(name, python_func, args, kwargs, signature, func_graph, autograph, autograph_options, add_control_dependencies, arg_names, op_return_value, collections, capture_by_value, acd_record_initial_resource_uses)
1138 else:
1139 _, original_func = tf_decorator.unwrap(python_func)
-> 1141 func_outputs = python_func(*func_args, **func_kwargs)
1143 # invariant: `func_outputs` contains only Tensors, CompositeTensors,
1144 # TensorArrays and `None`s.
1145 func_outputs = nest.map_structure(
1146 convert, func_outputs, expand_composites=True)
File ~/.local/lib/python3.9/site-packages/tensorflow/python/data/ops/structured_function.py:248, in StructuredFunctionWrapper.__init__.<locals>.trace_tf_function.<locals>.wrapped_fn(*args)
242 @eager_function.defun_with_attributes(
243 input_signature=structure.get_flat_tensor_specs(
244 self._input_structure),
245 autograph=False,
246 attributes=defun_kwargs)
247 def wrapped_fn(*args): # pylint: disable=missing-docstring
--> 248 ret = wrapper_helper(*args)
249 ret = structure.to_tensor_list(self._output_structure, ret)
250 return [ops.convert_to_tensor(t) for t in ret]
File ~/.local/lib/python3.9/site-packages/tensorflow/python/data/ops/structured_function.py:177, in StructuredFunctionWrapper.__init__.<locals>.wrapper_helper(*args)
175 if not _should_unpack(nested_args):
176 nested_args = (nested_args,)
--> 177 ret = autograph.tf_convert(self._func, ag_ctx)(*nested_args)
178 if _should_pack(ret):
179 ret = tuple(ret)
File ~/.local/lib/python3.9/site-packages/tensorflow/python/autograph/impl/api.py:689, in convert.<locals>.decorator.<locals>.wrapper(*args, **kwargs)
687 try:
688 with conversion_ctx:
--> 689 return converted_call(f, args, kwargs, options=options)
690 except Exception as e: # pylint:disable=broad-except
691 if hasattr(e, 'ag_error_metadata'):
File ~/.local/lib/python3.9/site-packages/tensorflow/python/autograph/impl/api.py:377, in converted_call(f, args, kwargs, caller_fn_scope, options)
374 return _call_unconverted(f, args, kwargs, options)
376 if not options.user_requested and conversion.is_allowlisted(f):
--> 377 return _call_unconverted(f, args, kwargs, options)
379 # internal_convert_user_code is for example turned off when issuing a dynamic
380 # call conversion from generated code while in nonrecursive mode. In that
381 # case we evidently don't want to recurse, but we still have to convert
382 # things like builtins.
383 if not options.internal_convert_user_code:
File ~/.local/lib/python3.9/site-packages/tensorflow/python/autograph/impl/api.py:458, in _call_unconverted(f, args, kwargs, options, update_cache)
455 return f.__self__.call(args, kwargs)
457 if kwargs is not None:
--> 458 return f(*args, **kwargs)
459 return f(*args)
File ~/.local/lib/python3.9/site-packages/tensorflow/python/data/ops/dataset_ops.py:5719, in _GroupByWindowDataset._make_key_func.<locals>.key_func_wrapper(*args)
5718 def key_func_wrapper(*args):
-> 5719 return ops.convert_to_tensor(key_func(*args), dtype=dtypes.int64)
File ~/.local/lib/python3.9/site-packages/opennmt/data/dataset.py:442, in batch_sequence_dataset.<locals>._key_func(*args)
437 raise ValueError(
438 "%d length functions were passed but this dataset contains "
439 "%d parallel elements" % (len(length_fns), len(args))
440 )
441 # Take the highest bucket id.
--> 442 bucket_id = tf.reduce_max(
443 [
444 _get_bucket_id(features, length_fn)
445 for features, length_fn in zip(args, length_fns)
446 ]
447 )
448 return tf.cast(bucket_id, tf.int64)
File ~/.local/lib/python3.9/site-packages/tensorflow/python/util/traceback_utils.py:153, in filter_traceback.<locals>.error_handler(*args, **kwargs)
151 except Exception as e:
152 filtered_tb = _process_traceback_frames(e.__traceback__)
--> 153 raise e.with_traceback(filtered_tb) from None
154 finally:
155 del filtered_tb
File ~/.local/lib/python3.9/site-packages/tensorflow/python/ops/array_ops.py:1506, in _autopacking_helper(list_or_tuple, dtype, name)
1504 if isinstance(elem, core.Tensor):
1505 if dtype is not None and elem.dtype.base_dtype != dtype:
-> 1506 raise TypeError(f"Cannot convert a list containing a tensor of dtype "
1507 f"{elem.dtype} to {dtype} (Tensor is: {elem!r})")
1508 converted_elems.append(elem)
1509 must_pack = True
TypeError: Cannot convert a list containing a tensor of dtype <dtype: 'int32'> to <dtype: 'int64'> (Tensor is: <tf.Tensor 'Const_1:0' shape=() dtype=int32>)
```
| [
{
"content": "\"\"\"Define inputters reading from TFRecord files.\"\"\"\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom opennmt.data import dataset as dataset_util\nfrom opennmt.inputters.inputter import Inputter\n\n\nclass SequenceRecordInputter(Inputter):\n \"\"\"Inputter that reads ``tf.train.Seque... | [
{
"content": "\"\"\"Define inputters reading from TFRecord files.\"\"\"\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom opennmt.data import dataset as dataset_util\nfrom opennmt.inputters.inputter import Inputter\n\n\nclass SequenceRecordInputter(Inputter):\n \"\"\"Inputter that reads ``tf.train.Seque... | diff --git a/opennmt/inputters/record_inputter.py b/opennmt/inputters/record_inputter.py
index dc5d68909..26fbd5cdb 100644
--- a/opennmt/inputters/record_inputter.py
+++ b/opennmt/inputters/record_inputter.py
@@ -48,7 +48,7 @@ def make_features(self, element=None, features=None, training=None):
},
)
tensor = feature_lists["values"]
- features["length"] = lengths["values"]
+ features["length"] = tf.cast(lengths["values"], tf.int32)
features["tensor"] = tf.cast(tensor, self.dtype)
return features
diff --git a/opennmt/tests/inputter_test.py b/opennmt/tests/inputter_test.py
index 48036665c..f273d40c5 100644
--- a/opennmt/tests/inputter_test.py
+++ b/opennmt/tests/inputter_test.py
@@ -770,6 +770,7 @@ def testSequenceRecordBatch(self):
features = next(iter(dataset))
lengths = features["length"]
tensors = features["tensor"]
+ self.assertEqual(lengths.dtype, tf.int32)
self.assertAllEqual(lengths, [3, 6, 1])
for length, tensor, expected_vector in zip(lengths, tensors, vectors):
self.assertAllClose(tensor[:length], expected_vector)
|
mitmproxy__mitmproxy-1801 | Make Travis Great Again (Master Edition)
Working on improving the speed and accuracy of Travis's testing.
| [
{
"content": "import os\nimport select\nimport socket\nimport sys\nimport threading\nimport time\nimport traceback\n\nimport binascii\n\nfrom typing import Optional # noqa\n\nfrom mitmproxy.utils import strutils\n\nimport certifi\nfrom backports import ssl_match_hostname\nimport OpenSSL\nfrom OpenSSL import SS... | [
{
"content": "import os\nimport select\nimport socket\nimport sys\nimport threading\nimport time\nimport traceback\n\nimport binascii\n\nfrom typing import Optional # noqa\n\nfrom mitmproxy.utils import strutils\n\nimport certifi\nfrom backports import ssl_match_hostname\nimport OpenSSL\nfrom OpenSSL import SS... | diff --git a/.travis.yml b/.travis.yml
index 0df3289967..c078e30ac0 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -1,15 +1,6 @@
sudo: false
language: python
-addons:
- apt:
- sources:
- # Debian sid currently holds OpenSSL 1.0.2
- # change this with future releases!
- - debian-sid
- packages:
- - libssl-dev
-
env:
global:
- CI_DEPS=codecov>=2.0.5
@@ -25,9 +16,21 @@ matrix:
language: generic
env: TOXENV=py35 BDIST=1
- python: 3.5
- env: TOXENV=py35 BDIST=1
+ env: TOXENV=py35 OPENSSL_OLD
+ addons:
+ apt:
+ packages:
+ - libssl-dev
- python: 3.5
- env: TOXENV=py35 NO_ALPN=1
+ env: TOXENV=py35 BDIST=1 OPENSSL_ALPN
+ addons:
+ apt:
+ sources:
+ # Debian sid currently holds OpenSSL 1.1.0
+ # change this with future releases!
+ - debian-sid
+ packages:
+ - libssl-dev
- python: 3.5
env: TOXENV=docs
git:
@@ -39,10 +42,8 @@ install:
- |
if [[ $TRAVIS_OS_NAME == "osx" ]]
then
- brew update || brew update # try again if it fails
- brew upgrade
- brew reinstall openssl
- brew reinstall pyenv
+ brew update || brew update
+ brew outdated pyenv || brew upgrade pyenv
eval "$(pyenv init -)"
env PYTHON_CONFIGURE_OPTS="--enable-framework" pyenv install --skip-existing 3.5.2
pyenv global 3.5.2
@@ -52,8 +53,8 @@ install:
- pip install tox
script:
+ - tox -- --cov mitmproxy --cov pathod -v
- |
- tox -- --cov mitmproxy --cov pathod -v
if [[ $BDIST == "1" ]]
then
git fetch --unshallow --tags
@@ -80,3 +81,4 @@ cache:
directories:
- $HOME/.pyenv
- $HOME/.cache/pip
+ # - $HOME/build/mitmproxy/mitmproxy/.tox
diff --git a/mitmproxy/net/tcp.py b/mitmproxy/net/tcp.py
index ac78e70d40..11cabf07e1 100644
--- a/mitmproxy/net/tcp.py
+++ b/mitmproxy/net/tcp.py
@@ -30,10 +30,7 @@
socket_fileobject = socket.SocketIO
EINTR = 4
-if os.environ.get("NO_ALPN"):
- HAS_ALPN = False
-else:
- HAS_ALPN = SSL._lib.Cryptography_HAS_ALPN
+HAS_ALPN = SSL._lib.Cryptography_HAS_ALPN
# To enable all SSL methods use: SSLv23
# then add options to disable certain methods
diff --git a/test/conftest.py b/test/conftest.py
new file mode 100644
index 0000000000..3d129ecf70
--- /dev/null
+++ b/test/conftest.py
@@ -0,0 +1,14 @@
+import pytest
+import OpenSSL
+import mitmproxy.net.tcp
+
+
+requires_alpn = pytest.mark.skipif(
+ not mitmproxy.net.tcp.HAS_ALPN,
+ reason='requires OpenSSL with ALPN support')
+
+
+@pytest.fixture()
+def disable_alpn(monkeypatch):
+ monkeypatch.setattr(mitmproxy.net.tcp, 'HAS_ALPN', False)
+ monkeypatch.setattr(OpenSSL.SSL._lib, 'Cryptography_HAS_ALPN', False)
diff --git a/test/mitmproxy/net/test_tcp.py b/test/mitmproxy/net/test_tcp.py
index cf3d30f7c8..fe44973bdc 100644
--- a/test/mitmproxy/net/test_tcp.py
+++ b/test/mitmproxy/net/test_tcp.py
@@ -6,7 +6,7 @@
import os
import threading
import mock
-
+import pytest
from OpenSSL import SSL
from mitmproxy import certs
@@ -15,6 +15,7 @@
from mitmproxy import exceptions
from . import tservers
+from ...conftest import requires_alpn
class EchoHandler(tcp.BaseHandler):
@@ -526,40 +527,47 @@ def test_timeout(self):
tutils.raises(exceptions.TcpTimeout, c.rfile.read, 10)
+class TestCryptographyALPN:
+
+ def test_has_alpn(self):
+ if 'OPENSSL_ALPN' in os.environ:
+ assert tcp.HAS_ALPN
+ assert SSL._lib.Cryptography_HAS_ALPN
+ elif 'OPENSSL_OLD' in os.environ:
+ assert not tcp.HAS_ALPN
+ assert not SSL._lib.Cryptography_HAS_ALPN
+
+
class TestALPNClient(tservers.ServerTestBase):
handler = ALPNHandler
ssl = dict(
alpn_select=b"bar"
)
- if tcp.HAS_ALPN:
- def test_alpn(self):
- c = tcp.TCPClient(("127.0.0.1", self.port))
- with c.connect():
- c.convert_to_ssl(alpn_protos=[b"foo", b"bar", b"fasel"])
- assert c.get_alpn_proto_negotiated() == b"bar"
- assert c.rfile.readline().strip() == b"bar"
-
- def test_no_alpn(self):
- c = tcp.TCPClient(("127.0.0.1", self.port))
- with c.connect():
- c.convert_to_ssl()
- assert c.get_alpn_proto_negotiated() == b""
- assert c.rfile.readline().strip() == b"NONE"
+ @requires_alpn
+ @pytest.mark.parametrize('has_alpn,alpn_protos, expected_negotiated, expected_response', [
+ (True, [b"foo", b"bar", b"fasel"], b'bar', b'bar'),
+ (True, [], b'', b'NONE'),
+ (True, None, b'', b'NONE'),
+ (False, [b"foo", b"bar", b"fasel"], b'', b'NONE'),
+ (False, [], b'', b'NONE'),
+ (False, None, b'', b'NONE'),
+ ])
+ def test_alpn(self, monkeypatch, has_alpn, alpn_protos, expected_negotiated, expected_response):
+ monkeypatch.setattr(tcp, 'HAS_ALPN', has_alpn)
+ monkeypatch.setattr(SSL._lib, 'Cryptography_HAS_ALPN', has_alpn)
- else:
- def test_none_alpn(self):
- c = tcp.TCPClient(("127.0.0.1", self.port))
- with c.connect():
- c.convert_to_ssl(alpn_protos=[b"foo", b"bar", b"fasel"])
- assert c.get_alpn_proto_negotiated() == b""
- assert c.rfile.readline() == b"NONE"
+ c = tcp.TCPClient(("127.0.0.1", self.port))
+ with c.connect():
+ c.convert_to_ssl(alpn_protos=alpn_protos)
+ assert c.get_alpn_proto_negotiated() == expected_negotiated
+ assert c.rfile.readline().strip() == expected_response
class TestNoSSLNoALPNClient(tservers.ServerTestBase):
handler = ALPNHandler
- def test_no_ssl_no_alpn(self):
+ def test_no_ssl_no_alpn(self, disable_alpn):
c = tcp.TCPClient(("127.0.0.1", self.port))
with c.connect():
assert c.get_alpn_proto_negotiated() == b""
diff --git a/test/mitmproxy/protocol/test_http2.py b/test/mitmproxy/protocol/test_http2.py
index d135cf0870..8e8ba6448a 100644
--- a/test/mitmproxy/protocol/test_http2.py
+++ b/test/mitmproxy/protocol/test_http2.py
@@ -1,7 +1,6 @@
# coding=utf-8
-import pytest
import os
import tempfile
import traceback
@@ -17,6 +16,7 @@
from mitmproxy.net.http import http1, http2
from .. import tservers
+from ...conftest import requires_alpn
import logging
logging.getLogger("hyper.packages.hpack.hpack").setLevel(logging.WARNING)
@@ -27,11 +27,6 @@
logging.getLogger("PIL.PngImagePlugin").setLevel(logging.WARNING)
-requires_alpn = pytest.mark.skipif(
- not mitmproxy.net.tcp.HAS_ALPN,
- reason='requires OpenSSL with ALPN support')
-
-
# inspect the log:
# for msg in self.proxy.tmaster.tlog:
# print(msg)
diff --git a/test/mitmproxy/test_dump.py b/test/mitmproxy/test_dump.py
index e331637d9d..c6b15c845c 100644
--- a/test/mitmproxy/test_dump.py
+++ b/test/mitmproxy/test_dump.py
@@ -51,14 +51,14 @@ def test_error(self):
assert "error" in o.tfile.getvalue()
def test_replay(self):
- o = dump.Options(server_replay=["nonexistent"], replay_kill_extra=True)
+ o = dump.Options(http2=False, server_replay=["nonexistent"], replay_kill_extra=True)
tutils.raises(exceptions.OptionsError, dump.DumpMaster, o, proxy.DummyServer())
with tutils.tmpdir() as t:
p = os.path.join(t, "rep")
self.flowfile(p)
- o = dump.Options(server_replay=[p], replay_kill_extra=True)
+ o = dump.Options(http2=False, server_replay=[p], replay_kill_extra=True)
o.verbosity = 0
o.flow_detail = 0
m = dump.DumpMaster(o, proxy.DummyServer())
@@ -66,13 +66,13 @@ def test_replay(self):
self.cycle(m, b"content")
self.cycle(m, b"content")
- o = dump.Options(server_replay=[p], replay_kill_extra=False)
+ o = dump.Options(http2=False, server_replay=[p], replay_kill_extra=False)
o.verbosity = 0
o.flow_detail = 0
m = dump.DumpMaster(o, proxy.DummyServer())
self.cycle(m, b"nonexistent")
- o = dump.Options(client_replay=[p], replay_kill_extra=False)
+ o = dump.Options(http2=False, client_replay=[p], replay_kill_extra=False)
o.verbosity = 0
o.flow_detail = 0
m = dump.DumpMaster(o, proxy.DummyServer())
diff --git a/test/pathod/test_pathoc.py b/test/pathod/test_pathoc.py
index 69baae545e..274e2be7f0 100644
--- a/test/pathod/test_pathoc.py
+++ b/test/pathod/test_pathoc.py
@@ -1,8 +1,8 @@
import io
from mock import Mock
+import pytest
from mitmproxy.net import http
-from mitmproxy.net import tcp
from mitmproxy.net.http import http1
from mitmproxy import exceptions
@@ -11,6 +11,7 @@
from mitmproxy.test import tutils
from . import tservers
+from ..conftest import requires_alpn
def test_response():
@@ -211,45 +212,57 @@ class TestDaemonHTTP2(PathocTestDaemon):
ssl = True
explain = False
- if tcp.HAS_ALPN:
-
- def test_http2(self):
- c = pathoc.Pathoc(
- ("127.0.0.1", self.d.port),
- fp=None,
- ssl=True,
- use_http2=True,
- )
- assert isinstance(c.protocol, HTTP2StateProtocol)
-
- c = pathoc.Pathoc(
- ("127.0.0.1", self.d.port),
- )
- assert c.protocol == http1
-
- def test_http2_alpn(self):
- c = pathoc.Pathoc(
- ("127.0.0.1", self.d.port),
- fp=None,
- ssl=True,
- use_http2=True,
- http2_skip_connection_preface=True,
- )
-
- tmp_convert_to_ssl = c.convert_to_ssl
- c.convert_to_ssl = Mock()
- c.convert_to_ssl.side_effect = tmp_convert_to_ssl
- with c.connect():
- _, kwargs = c.convert_to_ssl.call_args
- assert set(kwargs['alpn_protos']) == set([b'http/1.1', b'h2'])
-
- def test_request(self):
- c = pathoc.Pathoc(
- ("127.0.0.1", self.d.port),
- fp=None,
- ssl=True,
- use_http2=True,
- )
+ @requires_alpn
+ def test_http2(self):
+ c = pathoc.Pathoc(
+ ("127.0.0.1", self.d.port),
+ fp=None,
+ ssl=True,
+ use_http2=True,
+ )
+ assert isinstance(c.protocol, HTTP2StateProtocol)
+
+ c = pathoc.Pathoc(
+ ("127.0.0.1", self.d.port),
+ )
+ assert c.protocol == http1
+
+ @requires_alpn
+ def test_http2_alpn(self):
+ c = pathoc.Pathoc(
+ ("127.0.0.1", self.d.port),
+ fp=None,
+ ssl=True,
+ use_http2=True,
+ http2_skip_connection_preface=True,
+ )
+
+ tmp_convert_to_ssl = c.convert_to_ssl
+ c.convert_to_ssl = Mock()
+ c.convert_to_ssl.side_effect = tmp_convert_to_ssl
+ with c.connect():
+ _, kwargs = c.convert_to_ssl.call_args
+ assert set(kwargs['alpn_protos']) == set([b'http/1.1', b'h2'])
+
+ @requires_alpn
+ def test_request(self):
+ c = pathoc.Pathoc(
+ ("127.0.0.1", self.d.port),
+ fp=None,
+ ssl=True,
+ use_http2=True,
+ )
+ with c.connect():
+ resp = c.request("get:/p/200")
+ assert resp.status_code == 200
+
+ def test_failing_request(self, disable_alpn):
+ c = pathoc.Pathoc(
+ ("127.0.0.1", self.d.port),
+ fp=None,
+ ssl=True,
+ use_http2=True,
+ )
+ with pytest.raises(NotImplementedError):
with c.connect():
- resp = c.request("get:/p/200")
- assert resp.status_code == 200
+ c.request("get:/p/200")
diff --git a/test/pathod/test_pathod.py b/test/pathod/test_pathod.py
index 6a4e1c6239..1e34af23d9 100644
--- a/test/pathod/test_pathod.py
+++ b/test/pathod/test_pathod.py
@@ -1,11 +1,14 @@
import io
+import pytest
+
from pathod import pathod
from mitmproxy.net import tcp
from mitmproxy import exceptions
from mitmproxy.test import tutils
from . import tservers
+from ..conftest import requires_alpn
class TestPathod:
@@ -257,8 +260,11 @@ class TestHTTP2(tservers.DaemonTests):
ssl = True
nohang = True
- if tcp.HAS_ALPN:
+ @requires_alpn
+ def test_http2(self):
+ r, _ = self.pathoc(["GET:/"], ssl=True, use_http2=True)
+ assert r[0].status_code == 800
- def test_http2(self):
+ def test_no_http2(self, disable_alpn):
+ with pytest.raises(NotImplementedError):
r, _ = self.pathoc(["GET:/"], ssl=True, use_http2=True)
- assert r[0].status_code == 800
diff --git a/test/pathod/test_protocols_http2.py b/test/pathod/test_protocols_http2.py
index d77702a3a8..8531887b18 100644
--- a/test/pathod/test_protocols_http2.py
+++ b/test/pathod/test_protocols_http2.py
@@ -11,6 +11,8 @@
from pathod.protocols.http2 import HTTP2StateProtocol, TCPHandler
+from ..conftest import requires_alpn
+
class TestTCPHandlerWrapper:
def test_wrapped(self):
@@ -66,37 +68,35 @@ def test_perform_connection_preface_server(self, mock_client_method, mock_server
assert mock_server_method.called
+@requires_alpn
class TestCheckALPNMatch(net_tservers.ServerTestBase):
handler = EchoHandler
ssl = dict(
alpn_select=b'h2',
)
- if tcp.HAS_ALPN:
-
- def test_check_alpn(self):
- c = tcp.TCPClient(("127.0.0.1", self.port))
- with c.connect():
- c.convert_to_ssl(alpn_protos=[b'h2'])
- protocol = HTTP2StateProtocol(c)
- assert protocol.check_alpn()
+ def test_check_alpn(self):
+ c = tcp.TCPClient(("127.0.0.1", self.port))
+ with c.connect():
+ c.convert_to_ssl(alpn_protos=[b'h2'])
+ protocol = HTTP2StateProtocol(c)
+ assert protocol.check_alpn()
+@requires_alpn
class TestCheckALPNMismatch(net_tservers.ServerTestBase):
handler = EchoHandler
ssl = dict(
alpn_select=None,
)
- if tcp.HAS_ALPN:
-
- def test_check_alpn(self):
- c = tcp.TCPClient(("127.0.0.1", self.port))
- with c.connect():
- c.convert_to_ssl(alpn_protos=[b'h2'])
- protocol = HTTP2StateProtocol(c)
- with raises(NotImplementedError):
- protocol.check_alpn()
+ def test_check_alpn(self):
+ c = tcp.TCPClient(("127.0.0.1", self.port))
+ with c.connect():
+ c.convert_to_ssl(alpn_protos=[b'h2'])
+ protocol = HTTP2StateProtocol(c)
+ with raises(NotImplementedError):
+ protocol.check_alpn()
class TestPerformServerConnectionPreface(net_tservers.ServerTestBase):
diff --git a/tox.ini b/tox.ini
index 3f8040d736..dc76cb704e 100644
--- a/tox.ini
+++ b/tox.ini
@@ -8,7 +8,7 @@ basepython = python3.5
deps =
{env:CI_DEPS:}
-rrequirements.txt
-passenv = CODECOV_TOKEN CI CI_* TRAVIS TRAVIS_* APPVEYOR APPVEYOR_* SNAPSHOT_*
+passenv = CODECOV_TOKEN CI CI_* TRAVIS TRAVIS_* APPVEYOR APPVEYOR_* SNAPSHOT_* OPENSSL_*
setenv = HOME = {envtmpdir}
commands =
mitmdump --sysinfo
|
ansible-collections__community.aws-1206 | ec2_customer_gateway: bgp_asn is not required
### Summary
The ec2_customer_gateway module has incorrect documentation for the bgp_asn parameter.
It says the ASN must be passed when state=present, but the code defaults to 65000 if the parameter is absent. See the ensure_cgw_present() method:
```
def ensure_cgw_present(self, bgp_asn, ip_address):
if not bgp_asn:
bgp_asn = 65000
response = self.ec2.create_customer_gateway(
DryRun=False,
Type='ipsec.1',
PublicIp=ip_address,
BgpAsn=bgp_asn,
)
return response
```
### Issue Type
Documentation Report
### Component Name
ec2_customer_gateway
### Ansible Version
```console (paste below)
$ ansible --version
ansible [core 2.12.4]
config file = None
configured module search path = ['/home/neil/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/neil/.local/share/virtualenvs/community.aws-uRL047Ho/lib/python3.10/site-packages/ansible
ansible collection location = /home/neil/.ansible/collections:/usr/share/ansible/collections
executable location = /home/neil/.local/share/virtualenvs/community.aws-uRL047Ho/bin/ansible
python version = 3.10.1 (main, Jan 10 2022, 00:00:00) [GCC 11.2.1 20211203 (Red Hat 11.2.1-7)]
jinja version = 3.1.1
libyaml = True
```
### Collection Versions
```console (paste below)
$ ansible-galaxy collection list
```
### Configuration
```console (paste below)
$ ansible-config dump --only-changed
```
### OS / Environment
main branch, as of 2022-04-18.
### Additional Information
Suggested rewording:
```
options:
bgp_asn:
description:
- Border Gateway Protocol (BGP) Autonomous System Number (ASN), defaults to 65000.
type: int
```
### Code of Conduct
- [X] I agree to follow the Ansible Code of Conduct
| [
{
"content": "#!/usr/bin/python\n#\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\nDOCUMENTATION = '''\n---\nmodule: ec2_customer_gateway\nversion_added: 1.0.0\nshort_desc... | [
{
"content": "#!/usr/bin/python\n#\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\nDOCUMENTATION = '''\n---\nmodule: ec2_customer_gateway\nversion_added: 1.0.0\nshort_desc... | diff --git a/plugins/modules/ec2_customer_gateway.py b/plugins/modules/ec2_customer_gateway.py
index 9c00783a58a..f07e92f4f7c 100644
--- a/plugins/modules/ec2_customer_gateway.py
+++ b/plugins/modules/ec2_customer_gateway.py
@@ -23,7 +23,8 @@
options:
bgp_asn:
description:
- - Border Gateway Protocol (BGP) Autonomous System Number (ASN), required when I(state=present).
+ - Border Gateway Protocol (BGP) Autonomous System Number (ASN).
+ - Defaults to C(65000) if not specified when I(state=present).
type: int
ip_address:
description:
|
kedro-org__kedro-1977 | pickle.PickleDataSet docstring examples are incorrect
## Description
Kind of a small issue but the "advanced" example in the [pickle.PickleDataSet API docs](https://kedro.readthedocs.io/en/stable/kedro.extras.datasets.pickle.PickleDataSet.html) is wrong.
`compression` is not a valid [`joblib.dump`](https://joblib.readthedocs.io/en/latest/generated/joblib.dump.html) parameter (it should simply be `compress`) and [`joblib.load`](https://joblib.readthedocs.io/en/latest/generated/joblib.load.html) does not require a `compression` kwarg at all since it can automagically discover the correct compression algorithm used.
## Context
Even if it's a trivial issue, I stumbled upon it and I hope to fix it so that future users will not have to go to the joblib docs to find the problem.
## Possible Alternatives
I'm working on a trivial fix; I'm going to open a PR as soon as possible.
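For reference, a sketch of what the corrected catalog entry looks like (the dataset name and filepath here are illustrative; only `save_args.compress` is needed, since `joblib.load` detects the compression algorithm on its own):

```yaml
test_model:
  type: pickle.PickleDataSet
  filepath: data/07_model_output/test_model.pkl
  backend: joblib
  credentials: s3_credentials
  save_args:
    compress: lz4
```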
| [
{
"content": "\"\"\"``PickleDataSet`` loads/saves data from/to a Pickle file using an underlying\nfilesystem (e.g.: local, S3, GCS). The underlying functionality is supported by\nthe specified backend library passed in (defaults to the ``pickle`` library), so it\nsupports all allowed options for loading and sav... | [
{
"content": "\"\"\"``PickleDataSet`` loads/saves data from/to a Pickle file using an underlying\nfilesystem (e.g.: local, S3, GCS). The underlying functionality is supported by\nthe specified backend library passed in (defaults to the ``pickle`` library), so it\nsupports all allowed options for loading and sav... | diff --git a/docs/source/deployment/aws_sagemaker.md b/docs/source/deployment/aws_sagemaker.md
index bf4904d5a0..c062252580 100644
--- a/docs/source/deployment/aws_sagemaker.md
+++ b/docs/source/deployment/aws_sagemaker.md
@@ -111,9 +111,9 @@ s3:
### Update the project settings
-Now you need to tell Kedro to use the [`TemplatedConfigLoader`](/kedro.config.TemplatedConfigLoader) instead of the default `ConfigLoader` class by setting the `CONFIG_LOADER_CLASS` accordingly.
+Now you need to tell Kedro to use the [`TemplatedConfigLoader`](/kedro.config.TemplatedConfigLoader) instead of the default `ConfigLoader` class by setting the `CONFIG_LOADER_CLASS` accordingly.
-You also need to point Kedro to your `globals.yml` file.
+You also need to point Kedro to your `globals.yml` file.
To make both changes, open the `src/kedro_tutorial/settings.py` file and set the `CONFIG_LOADER_CLASS` and `CONFIG_LOADER_ARGS` variables:
diff --git a/kedro/extras/datasets/pickle/pickle_dataset.py b/kedro/extras/datasets/pickle/pickle_dataset.py
index f565edd37d..b52ee9ced3 100644
--- a/kedro/extras/datasets/pickle/pickle_dataset.py
+++ b/kedro/extras/datasets/pickle/pickle_dataset.py
@@ -42,9 +42,7 @@ class PickleDataSet(AbstractVersionedDataSet[Any, Any]):
>>> backend: joblib
>>> credentials: s3_credentials
>>> save_args:
- >>> compression: lz4
- >>> load_args:
- >>> compression: lz4
+ >>> compress: lz4
Example using Python API:
::
|
googleapis__python-bigquery-426 | _from_api_repr_scalar fails, if parameter value is None
Tested on latest (2.6.0) version, using python 3.8 on linux.
If an `ArrayQueryParameter` containing at least one `None` value is added to `query_parameters`, then accessing `job.query_parameters` (on the result after submitting the job) raises `'NoneType' object has no attribute 'mode'`.
This is because:
- `_from_api_repr_scalar` is called on `ArrayQueryParameter`
- `_QUERY_PARAMS_FROM_JSON[array_type](value, None) for value in values`, the `value` is None
- this corresponds to `_int_from_json(None, None)`
- `_not_null(None, None)` is called
- `return value is not None or field.mode != "NULLABLE"` raises exception
Stack trace:
```
AttributeError: 'NoneType' object has no attribute 'mode'
(snip)
File "(snip)/site-packages/google/cloud/bigquery/job/query.py", line 632, in query_parameters
return self._configuration.query_parameters
File "(snip)/site-packages/google/cloud/bigquery/job/query.py", line 314, in query_parameters
return _from_api_repr_query_parameters(prop)
File "(snip)/site-packages/google/cloud/bigquery/job/query.py", line 79, in _from_api_repr_query_parameters
return [_query_param_from_api_repr(mapping) for mapping in resource]
File "(snip)/site-packages/google/cloud/bigquery/job/query.py", line 79, in <listcomp>
return [_query_param_from_api_repr(mapping) for mapping in resource]
File "(snip)/site-packages/google/cloud/bigquery/query.py", line 632, in _query_param_from_api_repr
return klass.from_api_repr(resource)
File "(snip)/site-packages/google/cloud/bigquery/query.py", line 257, in from_api_repr
return cls._from_api_repr_scalar(resource)
File "(snip)/site-packages/google/cloud/bigquery/query.py", line 239, in _from_api_repr_scalar
converted = [
File "(snip)/site-packages/google/cloud/bigquery/query.py", line 240, in <listcomp>
_QUERY_PARAMS_FROM_JSON[array_type](value, None) for value in values
File "(snip)/site-packages/google/cloud/bigquery/_helpers.py", line 48, in _int_from_json
if _not_null(value, field):
File "(snip)/site-packages/google/cloud/bigquery/_helpers.py", line 43, in _not_null
return value is not None or field.mode != "NULLABLE"
```
| [
{
"content": "# Copyright 2015 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicabl... | [
{
"content": "# Copyright 2015 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicabl... | diff --git a/google/cloud/bigquery/_helpers.py b/google/cloud/bigquery/_helpers.py
index 716c8a394..100136108 100644
--- a/google/cloud/bigquery/_helpers.py
+++ b/google/cloud/bigquery/_helpers.py
@@ -40,7 +40,7 @@
def _not_null(value, field):
"""Check whether 'value' should be coerced to 'field' type."""
- return value is not None or field.mode != "NULLABLE"
+ return value is not None or (field is not None and field.mode != "NULLABLE")
def _int_from_json(value, field):
diff --git a/tests/unit/test_query.py b/tests/unit/test_query.py
index a7c639ed1..cf268daf1 100644
--- a/tests/unit/test_query.py
+++ b/tests/unit/test_query.py
@@ -383,6 +383,16 @@ def test_from_api_repr_wo_values(self):
self.assertEqual(param.array_type, "INT64")
self.assertEqual(param.values, [])
+ def test_from_api_repr_w_none_values(self):
+ RESOURCE = {
+ "parameterType": {"type": "ARRAY", "arrayType": {"type": "INT64"}},
+ "parameterValue": {"arrayValues": [{"value": "1"}, {"value": None}]},
+ }
+ klass = self._get_target_class()
+ param = klass.from_api_repr(RESOURCE)
+ self.assertEqual(param.array_type, "INT64")
+ self.assertEqual(param.values, [1, None])
+
def test_from_api_repr_w_struct_type(self):
from google.cloud.bigquery.query import StructQueryParameter
|
Gallopsled__pwntools-2083 | Remote SSH debugging is broken due to missing qemu_port
When GDB is invoked on a remote host (via `gdb.debug(..., ssh=shell)`) the following error is thrown. It's not particularly helpful -- perhaps we should double-check that `gdbserver` is available first?
This error only occurs when debugging a cross-arch binary on a remote host.
```
Traceback (most recent call last):
File "exploit.py", line 66, in <module>
io = start()
File "exploit.py", line 46, in start
return remote(argv, *a, **kw)
File "exploit.py", line 37, in remote
return gdb.debug([remote_path] + argv, gdbscript=gdbscript, ssh=shell, *a, **kw)
File "/home/pwntools/pwntools/pwnlib/context/__init__.py", line 1449, in setter
return function(*a, **kw)
File "/home/pwntools/pwntools/pwnlib/gdb.py", line 454, in debug
port = qemu_port
UnboundLocalError: local variable 'qemu_port' referenced before assignment
```
This error can be reproduced as follows:
```
$ pwn template --user level3 --pass c1aXb9E2OrgybHXE --path /levels/level03 --host ioarm.netgarage.org --port 2201 > exploit.py
$ python exploit.py GDB
```
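The underlying control-flow bug can be distilled into a standalone sketch (function names and port values are hypothetical stand-ins for the logic in `pwnlib/gdb.py`): `qemu_port` is only assigned on the local-qemu path, yet the remote-ssh cross-arch case still reaches the `else` branch that reads it. The eventual fix routes the `ssh` case through the gdbserver port lookup as well.

```python
def choose_port_buggy(ssh, native):
    """Mimics the branch structure that raises UnboundLocalError."""
    if not ssh and not native:
        qemu_port = 12345      # only set when a local qemu stub is launched
    if native:
        return 41414           # stand-in for _gdbserver_port(...)
    return qemu_port           # ssh + cross-arch: never assigned

def choose_port_fixed(ssh, native):
    if not ssh and not native:
        qemu_port = 12345
    if ssh or native:          # the fix: remote ssh also uses the gdbserver port
        return 41414
    return qemu_port

# Remote cross-arch debugging hits the unassigned variable:
try:
    choose_port_buggy(ssh=True, native=False)
except UnboundLocalError:
    pass  # qemu_port was never assigned on this path

assert choose_port_fixed(ssh=True, native=False) == 41414
assert choose_port_fixed(ssh=False, native=False) == 12345
```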
| [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"\nDuring exploit development, it is frequently useful to debug the\ntarget binary under GDB.\n\nPwntools makes this easy-to-do with a handful of helper routines, designed\nto make your exploit-debug-update cycles much faster.\n\nUseful Functions\n----------------\n\n... | [
{
"content": "# -*- coding: utf-8 -*-\n\"\"\"\nDuring exploit development, it is frequently useful to debug the\ntarget binary under GDB.\n\nPwntools makes this easy-to-do with a handful of helper routines, designed\nto make your exploit-debug-update cycles much faster.\n\nUseful Functions\n----------------\n\n... | diff --git a/pwnlib/gdb.py b/pwnlib/gdb.py
index 45019dc5d..7e14cc9f1 100644
--- a/pwnlib/gdb.py
+++ b/pwnlib/gdb.py
@@ -568,7 +568,7 @@ def debug(args, gdbscript=None, exe=None, ssh=None, env=None, sysroot=None, api=
gdbserver.executable = exe
# Find what port we need to connect to
- if context.native or (context.os == 'android'):
+ if ssh or context.native or (context.os == 'android'):
port = _gdbserver_port(gdbserver, ssh)
else:
port = qemu_port
|
google__flax-3785 | [struct.dataclass] Consider adding optional `kw_only` arguments
I often run into the following issue:
```python
from flax import struct
class Foo(struct.PyTreeNode):
bar: int = struct.field(pytree_node=False, default=1)
class Baz(Foo):
qux: str
```
Since `qux` does not have a default value, I get:
```
Fields without default values cannot appear after fields with default values
```
Can we consider adding a simple wrapper to `dataclasses.dataclass(kw_only=True)`?
It should be easy for the `struct.dataclass`, we can maybe have another object for inheritance, like `PyTreeNodeKwOnly`?
| [
{
"content": "# Copyright 2024 The Flax Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by ap... | [
{
"content": "# Copyright 2024 The Flax Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by ap... | diff --git a/flax/struct.py b/flax/struct.py
index 7a8283a9d..29dbb9c2f 100644
--- a/flax/struct.py
+++ b/flax/struct.py
@@ -227,8 +227,8 @@ class PyTreeNode:
>>> model_grad = jax.grad(loss_fn)(model)
"""
- def __init_subclass__(cls):
- dataclass(cls) # pytype: disable=wrong-arg-types
+ def __init_subclass__(cls, **kwargs):
+ dataclass(cls, **kwargs) # pytype: disable=wrong-arg-types
def __init__(self, *args, **kwargs):
# stub for pytype
diff --git a/tests/struct_test.py b/tests/struct_test.py
index da517c739..8ab3119d0 100644
--- a/tests/struct_test.py
+++ b/tests/struct_test.py
@@ -18,7 +18,7 @@
from typing import Any
import jax
-from absl.testing import absltest
+from absl.testing import absltest, parameterized
from jax._src.tree_util import prefix_errors
from flax import struct
@@ -34,7 +34,7 @@ class Point:
meta: Any = struct.field(pytree_node=False)
-class StructTest(absltest.TestCase):
+class StructTest(parameterized.TestCase):
def test_no_extra_fields(self):
p = Point(x=1, y=2, meta={})
with self.assertRaises(dataclasses.FrozenInstanceError):
@@ -93,24 +93,68 @@ class A(struct.PyTreeNode):
a: int
# TODO(marcuschiam): Uncomment when Flax upgrades to Python 3.10.
- # def test_kw_only(self):
- # @struct.dataclass
- # class A:
- # a: int = 1
-
- # with self.assertRaisesRegex(TypeError, "non-default argument 'b' follows default argument"):
+ # @parameterized.parameters(
+ # {'mode': 'dataclass'},
+ # {'mode': 'pytreenode'},
+ # )
+ # def test_kw_only(self, mode):
+ # if mode == 'dataclass':
# @struct.dataclass
+ # class A:
+ # a: int = 1
+
+ # @functools.partial(struct.dataclass, kw_only=True)
# class B(A):
# b: int
+ # elif mode == 'pytreenode':
+ # class A(struct.PyTreeNode):
+ # a: int = 1
- # @functools.partial(struct.dataclass, kw_only=True)
- # class B(A):
- # b: int
+ # class B(A, struct.PyTreeNode, kw_only=True):
+ # b: int
# obj = B(b=2)
# self.assertEqual(obj.a, 1)
# self.assertEqual(obj.b, 2)
+ # with self.assertRaisesRegex(TypeError, "non-default argument 'b' follows default argument"):
+ # if mode == 'dataclass':
+ # @struct.dataclass
+ # class B(A):
+ # b: int
+ # elif mode == 'pytreenode':
+ # class B(A, struct.PyTreeNode):
+ # b: int
+
+ # TODO(marcuschiam): Uncomment when Flax upgrades to Python 3.10.
+ # @parameterized.parameters(
+ # {'mode': 'dataclass'},
+ # {'mode': 'pytreenode'},
+ # )
+ # def test_mutable(self, mode):
+ # if mode == 'dataclass':
+ # @struct.dataclass
+ # class A:
+ # a: int = 1
+
+ # @functools.partial(struct.dataclass, frozen=False)
+ # class B:
+ # b: int = 1
+ # elif mode == 'pytreenode':
+ # class A(struct.PyTreeNode):
+ # a: int = 1
+
+ # class B(struct.PyTreeNode, frozen=False):
+ # b: int = 1
+
+ # obj = A()
+ # with self.assertRaisesRegex(dataclasses.FrozenInstanceError, "cannot assign to field 'a'"):
+ # obj.a = 2
+
+ # obj = B()
+ # obj.b = 2
+ # self.assertEqual(obj.b, 2)
+
if __name__ == '__main__':
absltest.main()
|
secdev__scapy-1417 | No /dev/bpf handle is available !
I'm running on mac High Sierra 10.13.4.
After downloading from https://github.com/secdev/scapy/archive/v2.4.0.zip and unzipping it, I ran each of the following as root:
run_scapy, run_scapy2 and run_scapy_py3
within each repl I ran:
```send(IP(dst="2.2.2.2", src="1.1.1.1"))```
and the traceback was the same:
```
Traceback (most recent call last):
File "<console>", line 1, in <module>
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/sendrecv.py", line 302, in send
realtime=realtime, return_packets=return_packets)
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/sendrecv.py", line 276, in __gen_send
s.send(p)
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/arch/bpf/supersocket.py", line 345, in send
frame = raw(self.guessed_cls()/pkt)
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/compat.py", line 96, in raw
return bytes(x)
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/packet.py", line 345, in __bytes__
return self.build()
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/packet.py", line 444, in build
p = self.do_build()
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/packet.py", line 426, in do_build
pkt = self.self_build()
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/packet.py", line 407, in self_build
p = f.addfield(self, p, val)
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/fields.py", line 80, in addfield
return s+struct.pack(self.fmt, self.i2m(pkt,val))
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/layers/l2.py", line 109, in i2m
return MACField.i2m(self, pkt, self.i2h(pkt, x))
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/layers/l2.py", line 101, in i2h
x = conf.neighbor.resolve(pkt,pkt.payload)
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/layers/l2.py", line 49, in resolve
return self.resolvers[k](l2inst,l3inst)
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/layers/inet.py", line 821, in inet_register_l3
return getmacbyip(l3.dst)
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/layers/l2.py", line 84, in getmacbyip
nofilter=1)
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/sendrecv.py", line 434, in srp1
ans, _ = srp(*args, **kargs)
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/sendrecv.py", line 416, in srp
s = conf.L2socket(promisc=promisc, iface=iface, filter=filter, nofilter=nofilter, type=type)
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/arch/bpf/supersocket.py", line 58, in __init__
(self.ins, self.dev_bpf) = get_dev_bpf()
File "/Users/idobn/dev/research/something/playground/tmp/scapy-2.4.0/scapy/arch/bpf/core.py", line 98, in get_dev_bpf
raise Scapy_Exception("No /dev/bpf handle is available !")
scapy.error.Scapy_Exception: No /dev/bpf handle is available !
```
after looking at some of the past issues it appears similar to this one: [#1015](https://github.com/secdev/scapy/issues/1015)
however it was solved some time ago...
Update:
The above was ran while I had wireshark running, after quitting wireshark the error stopped.
| [
{
"content": "# Guillaume Valadon <guillaume@valadon.net>\n\n\"\"\"\nScapy *BSD native support - core\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom scapy.config import conf\nfrom scapy.error import Scapy_Exception, warning\nfrom scapy.data import ARPHDR_LOOPBACK, ARPHDR_ETHER\nfrom scapy.arch.common i... | [
{
"content": "# Guillaume Valadon <guillaume@valadon.net>\n\n\"\"\"\nScapy *BSD native support - core\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom scapy.config import conf\nfrom scapy.error import Scapy_Exception, warning\nfrom scapy.data import ARPHDR_LOOPBACK, ARPHDR_ETHER\nfrom scapy.arch.common i... | diff --git a/scapy/arch/bpf/core.py b/scapy/arch/bpf/core.py
index 79f1e17ab69..7e32e4d5a03 100644
--- a/scapy/arch/bpf/core.py
+++ b/scapy/arch/bpf/core.py
@@ -88,7 +88,7 @@ def get_dev_bpf():
"""Returns an opened BPF file object"""
# Get the first available BPF handle
- for bpf in range(0, 8):
+ for bpf in range(256):
try:
fd = os.open("/dev/bpf%i" % bpf, os.O_RDWR)
return (fd, bpf)
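The scanning logic the diff widens can be sketched as a pure function (the real code calls `os.open` on `/dev/bpf%i`; here the busy handles are simulated so the example runs anywhere):

```python
def first_free_bpf(busy, limit=256):
    """Return the index of the first available /dev/bpf handle.

    `busy` simulates handles already held by other processes
    (e.g. Wireshark); the old limit of 8 made the scan give up
    while a sniffer held all of the low-numbered devices.
    """
    for bpf in range(limit):
        if bpf not in busy:
            return bpf
    raise RuntimeError("No /dev/bpf handle is available !")

busy = set(range(8))  # e.g. Wireshark holding bpf0..bpf7
# first_free_bpf(busy, limit=8) would raise -- the pre-fix behavior
print(first_free_bpf(busy, limit=256))  # → 8
```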
|
learningequality__kolibri-4935 | users should not be able to get 1000% on an exam, unfortunately
### Observed behavior
reported by @jtamiace re: @radinamatic's apparent good luck:

### Expected behavior
exams are scored between 0 and 100
### User-facing consequences
????
### Errors and logs
unknown
### Steps to reproduce
see http://kolibribeta.learningequality.org/coach/#/fa4cbfeda32c0c0fbf1832fc1ddd10c3/reports/learners
### Context
k 0.12.0 alpha 7
| [
{
"content": "from django.db.models import Max\nfrom django.db.models import Sum\nfrom django.shortcuts import get_object_or_404\nfrom rest_framework import serializers\nfrom rest_framework import viewsets\nfrom rest_framework.response import Response\n\nfrom kolibri.core.auth import models as auth_models\nfrom... | [
{
"content": "from django.db.models import Max\nfrom django.db.models import Sum\nfrom django.shortcuts import get_object_or_404\nfrom rest_framework import serializers\nfrom rest_framework import viewsets\nfrom rest_framework.response import Response\n\nfrom kolibri.core.auth import models as auth_models\nfrom... | diff --git a/kolibri/plugins/coach/assets/src/modules/classSummary/__test__/sampleServerResponse.js b/kolibri/plugins/coach/assets/src/modules/classSummary/__test__/sampleServerResponse.js
index 7758f81365c..d8487072b51 100644
--- a/kolibri/plugins/coach/assets/src/modules/classSummary/__test__/sampleServerResponse.js
+++ b/kolibri/plugins/coach/assets/src/modules/classSummary/__test__/sampleServerResponse.js
@@ -77,6 +77,8 @@ export default {
{ exercise_id: '2a722a9e57575148bc55deed7550ed62', question_id: '3' },
],
groups: ['c4625c3fef6b7d918e9417d92e482e6f'],
+ data_version_model: 1,
+ question_count: 3,
},
{
id: 'd7033a1cb888493763dc9b5f3ab2505b',
@@ -87,6 +89,8 @@ export default {
{ exercise_id: 'eadec7f803994b6eb8f401237ec0f777', question_id: 'B' },
],
groups: ['8d2e8c66c05004657d676155dd0b305d'],
+ data_version_model: 1,
+ question_count: 2,
},
{
id: '4018bcea43cee3d05811b641fca0b152',
@@ -98,6 +102,8 @@ export default {
{ exercise_id: '3a655a4b8adb5114a571dfd0c75cbc19', question_id: '12' },
],
groups: ['7c20f664b6a5c43d64b0cdd3161be513'],
+ data_version_model: 1,
+ question_count: 3,
},
{
id: '97316f077d470b45e912096edb534076',
@@ -109,6 +115,8 @@ export default {
{ exercise_id: '9baf781e43b0514085cc205176b0ee71', question_id: 'z' },
],
groups: [],
+ data_version_model: 1,
+ question_count: 3,
},
],
exam_learner_status: [
diff --git a/kolibri/plugins/coach/assets/src/modules/classSummary/__test__/sampleState.js b/kolibri/plugins/coach/assets/src/modules/classSummary/__test__/sampleState.js
index fd3fbeae748..a8dab4003e8 100644
--- a/kolibri/plugins/coach/assets/src/modules/classSummary/__test__/sampleState.js
+++ b/kolibri/plugins/coach/assets/src/modules/classSummary/__test__/sampleState.js
@@ -153,6 +153,8 @@ export default {
{ exercise_id: '2a722a9e57575148bc55deed7550ed62', question_id: '3' },
],
groups: ['c4625c3fef6b7d918e9417d92e482e6f'],
+ data_version_model: 1,
+ question_count: 3,
},
d7033a1cb888493763dc9b5f3ab2505b: {
id: 'd7033a1cb888493763dc9b5f3ab2505b',
@@ -163,6 +165,8 @@ export default {
{ exercise_id: 'eadec7f803994b6eb8f401237ec0f777', question_id: 'B' },
],
groups: ['8d2e8c66c05004657d676155dd0b305d'],
+ data_version_model: 1,
+ question_count: 2,
},
'4018bcea43cee3d05811b641fca0b152': {
id: '4018bcea43cee3d05811b641fca0b152',
@@ -174,6 +178,8 @@ export default {
{ exercise_id: '3a655a4b8adb5114a571dfd0c75cbc19', question_id: '12' },
],
groups: ['7c20f664b6a5c43d64b0cdd3161be513'],
+ data_version_model: 1,
+ question_count: 3,
},
'97316f077d470b45e912096edb534076': {
id: '97316f077d470b45e912096edb534076',
@@ -185,6 +191,8 @@ export default {
{ exercise_id: '9baf781e43b0514085cc205176b0ee71', question_id: 'z' },
],
groups: [],
+ data_version_model: 1,
+ question_count: 3,
},
},
examLearnerStatusMap: {
diff --git a/kolibri/plugins/coach/assets/src/modules/classSummary/index.js b/kolibri/plugins/coach/assets/src/modules/classSummary/index.js
index af3ac95f501..ead5d87db6a 100644
--- a/kolibri/plugins/coach/assets/src/modules/classSummary/index.js
+++ b/kolibri/plugins/coach/assets/src/modules/classSummary/index.js
@@ -39,6 +39,7 @@ function defaultState() {
* question_sources: [{exercise_id, question_id}, ...],
* groups: [id, ...],
* data_model_version,
+ * question_count,
* }
* }
*/
@@ -264,8 +265,7 @@ export default {
if (status.num_correct === null) {
status.score = null;
} else {
- status.score =
- (1.0 * status.num_correct) / examMap[status.exam_id].question_sources.length;
+ status.score = (1.0 * status.num_correct) / examMap[status.exam_id].question_count;
}
});
summary.content_learner_status.forEach(status => {
diff --git a/kolibri/plugins/coach/class_summary_api.py b/kolibri/plugins/coach/class_summary_api.py
index eefe42e1969..73e6792ae18 100644
--- a/kolibri/plugins/coach/class_summary_api.py
+++ b/kolibri/plugins/coach/class_summary_api.py
@@ -188,7 +188,7 @@ class ExamSerializer(serializers.ModelSerializer):
class Meta:
model = Exam
- fields = ("id", "title", "active", "question_sources", "groups", "data_model_version")
+ fields = ("id", "title", "active", "question_sources", "groups", "data_model_version", "question_count")
class ContentSerializer(serializers.ModelSerializer):
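The corrected score computation can be sketched in isolation (plain Python rather than the Vuex store code, with hypothetical names):

```python
def exam_score(num_correct, question_count):
    """Score in [0, 1]: correct answers over the exam's total question count.

    The bug divided by len(question_sources), which after the exam data
    model changed no longer equalled the number of questions in the exam,
    so scores far above 100% were possible.
    """
    if num_correct is None:
        return None
    return num_correct / question_count

print(exam_score(9, 10))     # → 0.9
print(exam_score(None, 10))  # → None
```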
|
ansible-collections__community.aws-1197 | ec2_customer_gateway: bgp_asn is not required
### Summary
The ec2_customer_gateway module has incorrect documentation for the bgp_asn parameter.
It says the ASN must be passed when state=present, but the code defaults to 65000 if the parameter is absent. See the ensure_cgw_present() method:
```
def ensure_cgw_present(self, bgp_asn, ip_address):
if not bgp_asn:
bgp_asn = 65000
response = self.ec2.create_customer_gateway(
DryRun=False,
Type='ipsec.1',
PublicIp=ip_address,
BgpAsn=bgp_asn,
)
    return response
```
### Issue Type
Documentation Report
### Component Name
ec2_customer_gateway
### Ansible Version
```console (paste below)
$ ansible --version
ansible [core 2.12.4]
config file = None
configured module search path = ['/home/neil/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/neil/.local/share/virtualenvs/community.aws-uRL047Ho/lib/python3.10/site-packages/ansible
ansible collection location = /home/neil/.ansible/collections:/usr/share/ansible/collections
executable location = /home/neil/.local/share/virtualenvs/community.aws-uRL047Ho/bin/ansible
python version = 3.10.1 (main, Jan 10 2022, 00:00:00) [GCC 11.2.1 20211203 (Red Hat 11.2.1-7)]
jinja version = 3.1.1
libyaml = True
```
### Collection Versions
```console (paste below)
$ ansible-galaxy collection list
```
### Configuration
```console (paste below)
$ ansible-config dump --only-changed
```
### OS / Environment
main branch, as of 2022-04-18.
### Additional Information
Suggested rewording:
```
options:
bgp_asn:
description:
- Border Gateway Protocol (BGP) Autonomous System Number (ASN), defaults to 65000.
type: int
```
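The defaulting behavior described above can be sketched as a standalone function (hypothetical helper, not the module's actual method signature):

```python
def resolve_bgp_asn(bgp_asn=None):
    """Mirror ensure_cgw_present(): fall back to ASN 65000 when unset."""
    if not bgp_asn:
        bgp_asn = 65000
    return bgp_asn

print(resolve_bgp_asn())       # → 65000
print(resolve_bgp_asn(64512))  # → 64512
```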
### Code of Conduct
- [X] I agree to follow the Ansible Code of Conduct
| [
{
"content": "#!/usr/bin/python\n#\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\nDOCUMENTATION = '''\n---\nmodule: ec2_customer_gateway\nversion_added: 1.0.0\nshort_desc... | [
{
"content": "#!/usr/bin/python\n#\n# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)\n\nfrom __future__ import absolute_import, division, print_function\n__metaclass__ = type\n\n\nDOCUMENTATION = '''\n---\nmodule: ec2_customer_gateway\nversion_added: 1.0.0\nshort_desc... | diff --git a/plugins/modules/ec2_customer_gateway.py b/plugins/modules/ec2_customer_gateway.py
index 9c00783a58a..f07e92f4f7c 100644
--- a/plugins/modules/ec2_customer_gateway.py
+++ b/plugins/modules/ec2_customer_gateway.py
@@ -23,7 +23,8 @@
options:
bgp_asn:
description:
- - Border Gateway Protocol (BGP) Autonomous System Number (ASN), required when I(state=present).
+ - Border Gateway Protocol (BGP) Autonomous System Number (ASN).
+ - Defaults to C(65000) if not specified when I(state=present).
type: int
ip_address:
description:
|
TabbycatDebate__tabbycat-1883 | Should we set DEFAULT_AUTO_FIELD?
Related to the Django 3.2 upgrade. Just thought this should be a conscious documented discussion rather than an informal one, since there seem to be (minor but nontrivial) consequences.
https://docs.djangoproject.com/en/3.2/releases/3.2/#customizing-type-of-auto-created-primary-keys:
> Maintaining the historical behavior, the default value for `DEFAULT_AUTO_FIELD` is `AutoField`. Starting with 3.2 new projects are generated with `DEFAULT_AUTO_FIELD` set to `BigAutoField`. Also, new apps are generated with `AppConfig.default_auto_field` set to `BigAutoField`. In a future Django release the default value of `DEFAULT_AUTO_FIELD` will be changed to `BigAutoField`.
But migrations aren't seamless. https://docs.djangoproject.com/en/3.2/ref/settings/#std:setting-DEFAULT_AUTO_FIELD:
> Unfortunately, the primary keys of existing auto-created through tables cannot currently be updated by the migrations framework.
>
> This means that if you switch the value of `DEFAULT_AUTO_FIELD` and then generate migrations, the primary keys of the related models will be updated, as will the foreign keys from the through table, but the primary key of the auto-created through table will not be migrated.
To me the path of least resistance would be to set `DEFAULT_AUTO_FIELD` to `AutoField`, and kick the `BigAutoField` can down the road until (maybe) migrations work for it without manually added code, or until it becomes necessary. I can't imagine hitting 2 billion entries (what I presume the `AutoField` limit would be) in a table in a Tabbycat instance any time soon. But there's nothing prohibitive about `BigAutoField` migration, if others would prefer to get this change out of the way.
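The capacity difference at stake is easy to check — Django's `AutoField` is backed by a 32-bit signed integer and `BigAutoField` by a 64-bit one:

```python
autofield_max = 2**31 - 1     # Django AutoField: 32-bit signed int
bigautofield_max = 2**63 - 1  # Django BigAutoField: 64-bit signed int

print(autofield_max)     # → 2147483647  (~2.1 billion rows)
print(bigautofield_max)  # → 9223372036854775807
```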
| [
{
"content": "import os\n\nfrom django.contrib.messages import constants as messages\nfrom django.utils.translation import gettext_lazy as _\n\n\nBASE_DIR = os.path.dirname(os.path.abspath(os.path.join(__file__, os.pardir)))\nMEDIA_ROOT = os.path.join(BASE_DIR, 'media')\n\n# ====================================... | [
{
"content": "import os\n\nfrom django.contrib.messages import constants as messages\nfrom django.utils.translation import gettext_lazy as _\n\n\nBASE_DIR = os.path.dirname(os.path.abspath(os.path.join(__file__, os.pardir)))\nMEDIA_ROOT = os.path.join(BASE_DIR, 'media')\n\n# ====================================... | diff --git a/tabbycat/settings/core.py b/tabbycat/settings/core.py
index fcee78da7da..34bbc91b105 100644
--- a/tabbycat/settings/core.py
+++ b/tabbycat/settings/core.py
@@ -301,6 +301,8 @@
},
}
+DEFAULT_AUTO_FIELD = 'django.db.models.AutoField'
+
# ==============================================================================
# Channels
# ==============================================================================
|