| message | diff |
|---|---|
Stop trying to test Elasticsearch 6.8.0 on ARM
* Stop trying to test 6.8.0 on ARM
This will allow developers with Apple Silicon hardware to still run
integration tests.
* Bump metrics store version for Apple Silicon hardware | @@ -19,6 +19,7 @@ import errno
import functools
import json
import os
+import platform
import random
import socket
import subprocess
@@ -30,7 +31,10 @@ from esrally import client, config, version
from esrally.utils import process
CONFIG_NAMES = ["in-memory-it", "es-it"]
-DISTRIBUTIONS = ["6.8.0", "8.4.0"]
+DISTRIBUTION... |
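The diff is cut off before the new `DISTRIBUTIONS` value. A hypothetical sketch of the intent (skip Elasticsearch 6.8.0 on ARM hosts, where it cannot run) using the newly imported `platform` module; the function name and filtering rule are assumptions, not the commit's actual code:

```python
import platform

ALL_DISTRIBUTIONS = ["6.8.0", "8.4.0"]

def supported_distributions(machine=None):
    """Drop Elasticsearch 6.8.0 on ARM hosts, keep everything elsewhere."""
    machine = machine or platform.machine()
    if machine in ("arm64", "aarch64"):
        return [d for d in ALL_DISTRIBUTIONS if not d.startswith("6.")]
    return ALL_DISTRIBUTIONS
```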
Serialize Notification now also returns sender email address
sent_by_email_address field was added because sometimes two
people at one institution have the same name, and then the email
address, which is unique, is more useful. | @@ -1413,6 +1413,12 @@ class Notification(db.Model):
else:
return None
+ def get_created_by_email_address(self):
+ if self.created_by:
+ return self.created_by.email_address
+ else:
+ return None
+
def serialize_for_csv(self):
created_at_in_bst = convert_utc_to_bst(self.created_at)
serialized = {
@@ -1424,6 +1430,7 @@ ... |
update tests/test_poly_spaces.py for Bernstein basis
update _gen_common_data(), test_partition_of_unity(), test_continuity() | @@ -89,7 +89,8 @@ def _gen_common_data(orders, gels, report):
from sfepy.discrete.common.global_interp import get_ref_coors
bases = ([ii for ii in combine([['2_4', '3_8'],
- ['lagrange', 'serendipity', 'lobatto']])]
+ ['lagrange', 'serendipity', 'bernstein',
+ 'lobatto']])]
+ [ii for ii in combine([['2_3', '3_4'],
['la... |
Use same Travis build badge for Linux and MacOS
Before, it looked like Linux was untested. Only a very careful reading revealed that the build
badge for MacOS also applied to Linux. | @@ -61,8 +61,9 @@ For example, pyfakefs will not work with [`lxml`](http://lxml.de/). In this cas
pyfakefs is currently automatically tested:
* On Linux, with Python 2.7, and 3.4 to 3.7, using [Travis](https://travis-ci.org/jmcgeheeiv/pyfakefs)
-* On MacOS, with Python 2.7, 3.6 and 3.7, also using [Travis](https://trav... |
Set run model default timestamp to 0
Without setting to 0, the finished at field could be null if the argo workflow is already evicted from the cluster.
This results in errors when parsing the table.
Alternatively we can use sql.NullInt64 type to parse the sql but that's less elegant. | @@ -20,10 +20,10 @@ type Run struct {
Name string `gorm:"column:Name; not null;"` /* The name of the K8s resource. Follow regex '[a-z0-9]([-a-z0-9]*[a-z0-9])?'*/
StorageState string `gorm:"column:StorageState; not null;"`
Namespace string `gorm:"column:Namespace; not null;"`
- Description string `gorm:"column:Descripti... |
Update 'title' and 'managedby' entries
Update yaml file to better reflect recent developments regarding A2D2. | -Name: "A2D2: AEV Autonomous Driving Dataset"
+Name: "A2D2: Audi Autonomous Driving Dataset"
Description:
An open multi-sensor dataset for autonomous driving research.
This dataset comprises semantically segmented images, semantic
@@ -8,8 +8,8 @@ Description:
active research and development in AI, computer vision, and
... |
Sometimes gfy fails.
KeyError
'gfyItem' | @@ -121,12 +121,15 @@ def get_url(submission, mp4_instead_gif=True):
elif 'gfycat.com' in urlparse(url).netloc:
client = GfycatClient()
rname = re.findall(r'gfycat.com\/(?:detail\/)?(\w*)', url)[0]
+ try:
urls = client.query_gfy(rname)['gfyItem']
logging.warning('Gfy url!')
if mp4_instead_gif:
return TYPE_GIF, urls['mp... |
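The truncated diff wraps the Gfycat lookup in `try`. A minimal sketch of the failure mode and the guard; the response shape and helper name are assumptions for illustration:

```python
def extract_gfy_item(response):
    """Return the 'gfyItem' payload, or None when Gfycat omits it."""
    try:
        return response['gfyItem']
    except KeyError:
        return None

print(extract_gfy_item({'gfyItem': {'mp4Url': 'u'}}))  # {'mp4Url': 'u'}
print(extract_gfy_item({}))                            # None
```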
Making boost compiling again with emscripten 1.38.29
We had to move the "hack" of transforming the bc files into .a files
back into the packaging step. I think b2 changed how it packages the
files, so now that is only possible during packaging. | @@ -440,26 +440,6 @@ class BoostConan(ConanFile):
# self.run("%s --show-libraries" % b2_exe)
self.run(full_command)
- arch = self.settings.get_safe('arch')
- if arch.startswith("asm.js"):
- self._create_emscripten_libs()
-
- def _create_emscripten_libs(self):
- # Boost Build doesn't create the libraries, but it gets cl... |
fix(doc): update gnome-settings-daemon to org.gnome.SettingsDaemon
Since Gnome 3.23.2, gnome-settings-daemon was split into helper daemons (source: https://gitlab.gnome.org/GNOME/gnome-settings-daemon/blob/master/NEWS).
The old config led to an immediate logout after login in Qtile. | @@ -42,6 +42,16 @@ This adds a new entry "Qtile GNOME" to GDM's login screen.
The custom session for gnome-session.
+For Gnome >= 3.23.2 (Ubuntu >= 17.04, Fedora >= 26, etc.)
+::
+
+ $ cat /usr/share/gnome-session/sessions/qtile.session
+ [GNOME Session]
+ Name=Qtile session
+ RequiredComponents=qtile;org.gnome.Setting... |
utils: don't assume all exceptions have .strerror
Now that we're calling the actual sensors code, there's a bug :). Not all
Python exceptions have strerror. | @@ -152,8 +152,8 @@ def catch_exception_and_warn(warning=Warning, return_on_exception=None,
try:
return_value = func(*args, **kwargs)
except excepts as err:
- logger.warning(err.strerror)
- warnings.warn(err.strerror, warning)
+ logger.warning(str(err))
+ warnings.warn(str(err), warning)
return return_value
return wrap... |
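Only OS-level exceptions such as `OSError` carry a `.strerror` attribute; `str(err)` works for every exception, which is why the patch switches to it. A quick illustration:

```python
err_os = OSError(2, "No such file or directory")
print(err_os.strerror)               # OS errors do define .strerror

err_val = ValueError("bad input")
print(hasattr(err_val, "strerror"))  # False: .strerror would raise AttributeError
print(str(err_val))                  # "bad input"; str() is safe for any exception
```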
BUG FIX in scenario.py: Utility object is not JSON serializable
Celery workers cannot pass Python objects between tasks. Because the Utility object used to be in dfm.available_techs, it went unnoticed that dfm.util was not being deleted until I posted a job to a running API instance (locally). | @@ -240,7 +240,7 @@ def setup_scenario(self, run_uuid, data, raw_post):
dfm_dict = vars(dfm) # serialize for celery
# delete python objects, which are not serializable
- for k in ['storage', 'site', 'elec_tariff', 'pvs', 'pvnms', 'load'] + dfm.available_techs:
+ for k in ['storage', 'site', 'elec_tariff', 'pvs', 'pvnms... |
Fix links in README for java client | @@ -71,8 +71,8 @@ ListArtifacts.Response listArtifacts(String runUuid, String path)
### Java Usage
-For a simple example see [QuickStartDriver.java](src/main/java/org/mlflow/client/samples/QuickStartDriver.java).
-For full examples of API coverage see the [tests](src/test/java/org/mlflow/client) such as [ApiClientTest.... |
yaml: Update load methods to use Text rather than str
Yaml loading accepts bytes and unicode, either directly or via IO.
For Python 3, bytes and str work fine, but for Python 2 code this is redundant and limited.
Text instead of str should make type checks more accurate. | @@ -26,14 +26,14 @@ def scan(stream, Loader=...): ...
def parse(stream, Loader=...): ...
def compose(stream, Loader=...): ...
def compose_all(stream, Loader=...): ...
-def load(stream: Union[bytes, IO[bytes], str, IO[str]], Loader=...) -> Any: ...
-def load_all(stream: Union[bytes, IO[bytes], str, IO[str]], Loader=...)... |
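`typing.Text` is an alias for `str` on Python 3 and for `unicode` on Python 2, which is what makes the stub accurate for both. A simplified sketch of the updated signature (the `Loader` default and body are elided in the real stub):

```python
from typing import IO, Any, Text, Union

def load(stream: Union[bytes, IO[bytes], Text, IO[Text]], Loader=None) -> Any:
    ...  # stub body; the real loader lives in PyYAML

print(Text is str)  # True on Python 3: the alias collapses to str
```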
osd: fix automatic prepare when auto_discover
Use `devices` variable instead of `ansible_devices`, otherwise it means
we are not using the devices which have been 'auto discovered' | docker run --net=host \
--pid=host \
--privileged=true \
- --name=ceph-osd-prepare-{{ ansible_hostname }}-{{ item.key }} \
+ --name=ceph-osd-prepare-{{ ansible_hostname }}-{{ item.split('/')[-1] }} \
-v /etc/ceph:/etc/ceph \
-v /var/lib/ceph/:/var/lib/ceph/ \
-v /dev:/dev \
-e DEBUG=verbose \
-e CLUSTER={{ cluster }} \... |
Add interaction check to command tree
In some cases, it's desirable for our command tree to only process a
subset of incoming interactions, such as in a multi-process deployment. | @@ -971,6 +971,17 @@ class CommandTree(Generic[ClientT]):
await ctx_menu.on_error(interaction, e)
await self.on_error(interaction, ctx_menu, e)
+ async def interaction_check(self, interaction: Interaction, /) -> bool:
+ """|coro|
+
+ A global check to determine if an :class:`~discord.Interaction` should
+ be processed ... |
Update tests
We want to ensure that targeting invalid minions does *not* allow us to
run any sort of shenanigans on the minions in question! This ensures
that valid commands (i.e. file.touch in this case) cannot inadvertently
be sent to minions that we don't have ACL for. | @@ -112,12 +112,14 @@ publisher_acl:
bob:
- '*1':
- test.*
+ - file.touch
external_auth:
pam:
bob:
- '*1':
- test.*
+ - file.touch
nodegroups:
second_string: "minion_*2"
@@ -139,12 +141,14 @@ publisher_acl:
bob:
- '*1':
- test.*
+ - file.touch
external_auth:
pam:
bob:
- '*1':
- test.*
+ - file.touch
"""
)
@@ -491,6 +49... |
[GH 3611] `YOUR_ENV_VAR` is leaking from helm defaults into user clusters
Summary:
[helm] `YOUR_ENV_VAR` is leaking from helm defaults into user clusters
Test Plan: integration
Reviewers: nate | @@ -274,8 +274,7 @@ pipelineRun:
# env:
# ENV_ONE: one
# ENV_TWO: two
- env:
- YOUR_ENV_VAR: ""
+ env: {}
####################################################################################################
# Scheduler: Configuration for the scheduler
|
tippy: Transfer subs-sort tooltip to tippyjs.
As Zulip is transferring its tooltips to tippy, the
tooltips for subs sort options are transferred to
use tippy instead of title. Placement is bottom.
Refer | import $ from "jquery";
import _ from "lodash";
+import tippy from "tippy.js";
import render_subscription from "../templates/subscription.hbs";
import render_subscription_settings from "../templates/subscription_settings.hbs";
@@ -564,19 +565,19 @@ export function setup_page(callback) {
const sort_toggler = components.... |
Removed the "providing_args" argument - this is deprecated in Django 3.1
and was purely for documentation, so no replacement is needed | @@ -10,5 +10,5 @@ run_weekly_jobs = Signal()
run_monthly_jobs = Signal()
run_yearly_jobs = Signal()
-pre_command = Signal(providing_args=["args", "kwargs"])
-post_command = Signal(providing_args=["args", "kwargs", "outcome"])
+pre_command = Signal()
+post_command = Signal()
|
Fixed errors in time-service startup with local Redis
Periods were trying to get configured before Redis is started | @@ -71,11 +71,6 @@ sed -i 's@/{S3_URL}@'$S3_URL'@g' /etc/onearth/config/layers/*/*.yaml # in case t
sed -i 's@{S3_URL}@'$S3_URL'@g' /etc/onearth/config/layers/*/*/*.yaml
sed -i 's@{S3_URL}@'$S3_URL'@g' /etc/onearth/config/layers/*/*.yaml
-# Load custom time period configurations
-for i in /etc/onearth/config/endpoint/e... |
fix Dispatcher.release_key
del bucket['key'] raises KeyError: 'key' | @@ -1367,7 +1367,7 @@ class Dispatcher(DataMixin, ContextInstanceMixin):
bucket = await self.storage.get_bucket(chat=chat_id, user=user_id)
if bucket and key in bucket:
- del bucket['key']
+ del bucket[key]
await self.storage.set_bucket(chat=chat_id, user=user_id, bucket=bucket)
return True
return False
|
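The bug is the quoted subscript: `del bucket['key']` deletes the literal string `'key'` rather than the value held in the variable `key`. A minimal reproduction:

```python
bucket = {"state": "waiting"}
key = "state"

try:
    del bucket['key']     # bug: looks up the literal string 'key'
except KeyError as exc:
    print(exc)            # 'key' is not a key in the dict

del bucket[key]           # fix: deletes the entry the variable names
print(bucket)             # {}
```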
Multiple slashes processing modified
Instead of deleting empty keys at the put() stage, multiple consecutive slashes are now replaced with a single slash while running parse(). | """ Config class"""
+import re
class Config:
""" Class for configs that can be represented as nested dicts with easy indexing by slashes """
@@ -141,7 +142,6 @@ class Config:
variable = variable.strip('/')
if '/' in variable:
var = variable.split('/')
- var = list(filter(('').__ne__, var)) #remove empty keys
prefix = v... |
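Collapsing runs of slashes with `re` up front makes the later empty-key filter unnecessary. A sketch of the idea; the helper name is an assumption, and the exact call site is truncated above:

```python
import re

def normalize_key(variable):
    """Replace runs of consecutive slashes with one and strip the ends."""
    return re.sub('/+', '/', variable).strip('/')

print(normalize_key('//a///b/c/'))  # a/b/c
```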
Check for what DVLALetterTemplate is called with
Extends the test to make sure that the thing that builds each line of
the file is getting called with the right template, personalisation and
numeric ID. Will be helpful the more complicated the call to the
template gets. | @@ -971,17 +971,29 @@ def test_build_dvla_file(sample_letter_template, mocker):
create_notification(template=job.template, job=job)
create_notification(template=job.template, job=job)
- mocked = mocker.patch("app.celery.tasks.s3upload")
- mocker.patch("app.celery.tasks.LetterDVLATemplate.__str__", return_value="dvla|st... |
MappingProjection: instantiate function for MATRIX PState instead of just setting
- Fixes | @@ -694,11 +694,17 @@ class MappingProjection(PathwayProjection_Base):
matrix = get_matrix(self._parameter_states[MATRIX].value)
initial_rate = matrix * 0.0
- self._parameter_states[MATRIX].function = AccumulatorIntegrator(owner=self._parameter_states[MATRIX],
+ # KDM 7/11/19: instead of simply setting the function, we... |
[DOCKER] Update lint to reflect the latest state
Pins mypy version. | //
// NOTE: these lines are scanned by docker/dev_common.sh. Please update the regex as needed. -->
-ci_lint = "tlcpack/ci-lint:v0.65"
+ci_lint = "tlcpack/ci-lint:v0.66"
ci_gpu = "tlcpack/ci-gpu:v0.75"
ci_cpu = "tlcpack/ci-cpu:v0.74"
ci_wasm = "tlcpack/ci-wasm:v0.71"
|
ONLY_USER works when cache is disabled
The _get_user_attempts function now checks for AXES_ONLY_USER_FAILURES,
and only includes the IP when AXES_ONLY_USER_FAILURES = False. | @@ -171,11 +171,18 @@ def _get_user_attempts(request):
)
if not attempts:
- params = {'ip_address': ip, 'trusted': False}
+ params = {'trusted': False}
+
+ if AXES_ONLY_USER_FAILURES:
+ params['username'] = username
+ elif LOCK_OUT_BY_COMBINATION_USER_AND_IP:
+ params['username'] = username
+ params['ip_address'] = ip
... |
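The diff is truncated before the final branch. A hypothetical sketch of the lookup-parameter logic; the trailing `else` (IP only) is an assumption based on the pre-patch behaviour, not visible in the diff:

```python
def build_lookup_params(username, ip,
                        only_user_failures=False,
                        lock_by_user_and_ip=False):
    """Sketch of the attempt-lookup filter described above."""
    params = {'trusted': False}
    if only_user_failures:
        params['username'] = username
    elif lock_by_user_and_ip:
        params['username'] = username
        params['ip_address'] = ip
    else:
        # assumed fallback: pre-patch behaviour filtered by IP alone
        params['ip_address'] = ip
    return params
```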
Update develop_guide.md
edit a sentence | @@ -23,7 +23,7 @@ Parameter object is the only way to pass user-define runtime parameters to the d
In order to define a usable parameter object, three steps will be needed.
a. Open a new python file, rename it as xxx_param.py where xxx stands for your module'name, putting it in folder federatedm/param/.
- The class obj... |
improve `TestUpscaleDouble`
- implement cardinality test
- use `assert_close`
- fix gradcheck test
- enable fast mode | @@ -200,27 +200,33 @@ class TestUpscaleDouble(BaseTester):
x = self.prepare_data(shape, device, dtype)
assert kornia.geometry.transform.upscale_double(x) is not None
- def test_exception(self, device, dtype):
+ def test_exception(self):
with pytest.raises(TypeError):
- assert kornia.geometry.transform.upscale_double(No... |
[Core] fix test_object_directory_failure flakiness
There could be a race where the task finishes execution before the owner is killed. | @@ -664,12 +664,12 @@ def test_object_directory_failure(ray_start_cluster):
def task(x):
pass
+ cluster.remove_node(node_to_kill, allow_graceful=False)
tasks = []
repeat = 3
for i in range(num_nodes):
for _ in range(repeat):
tasks.append(task.options(resources={str(i): 1}).remote(obj))
- cluster.remove_node(node_to_kil... |
svtplay: kanaler will work again
fixes: | @@ -41,13 +41,8 @@ class Svtplay(Service, MetadataThumbMixin):
urldata = self.get_urldata()
if parse.path[:8] == "/kanaler":
- match = re.search('data-video-id="([\\w-]+)"', urldata)
-
- if not match:
- yield ServiceError("Can't find video info.")
- return
-
- _url = urljoin(URL_VIDEO_API, match.group(1))
+ ch = "ch-{}... |
Update molecule tag in examples.rst
##### SUMMARY
Updating to the latest version, since in the documentation it is still using an old version. Also, quay.io/ansible/molecule:latest pulls a 2.20 image, not the new v3.
##### ISSUE TYPE
Docs Pull Request
+label: docsite_pr | @@ -21,7 +21,7 @@ follows.
-v "$(pwd)":/tmp/$(basename "${PWD}"):ro \
-v /var/run/docker.sock:/var/run/docker.sock \
-w /tmp/$(basename "${PWD}") \
- quay.io/ansible/molecule:2.20 \
+ quay.io/ansible/molecule:3.0.8 \
molecule test
.. _`quay.io`: https://quay.io/repository/ansible/molecule
|
nfs: add missing | bool filters
To address this warning:
```
[DEPRECATION WARNING]: evaluating nfs_ganesha_dev as a bare variable, this
behaviour will go away and you might need to add |bool to the expression in the
future
``` | gpgkey: "{{ ceph_stable_key }}"
baseurl: "{{ ceph_mirror }}/nfs-ganesha/rpm-{{ nfs_ganesha_stable_branch }}/{{ ceph_release }}/$basearch"
when:
- - nfs_ganesha_stable
+ - nfs_ganesha_stable | bool
- ceph_repository == 'community'
- name: red hat based systems - dev repo related tasks
group: root
backup: yes
when:
- - n... |
Fix zpsp to have element names as its keys
symbol shows the POTCAR label, e.g. "Ca_pv", while element shows its name, "Ca". | @@ -297,7 +297,7 @@ class Critic2Caller:
if potcar_path:
potcar = Potcar.from_file(potcar_path)
- zpsp = {p.symbol: p.zval for p in potcar}
+ zpsp = {p.element: p.zval for p in potcar}
if not zpsp:
|
Update INSTALLATION.md
Clarify Python versions. | If Python isn't already available on your system, detailed instructions by platform can be found in the [Python Setup and Usage][using python] section of the official documentation.
Real Python also offers a [very helpful guide][helpful guide] to installation on various platforms, including iOS and Android.
-Exercism t... |
Remove NASA APOD official and unofficial
NASA APOD (official API) is already included in NASA
NASA APOD (unofficial API) is a mirror of NASA APOD | @@ -1348,8 +1348,6 @@ API | Description | Auth | HTTPS | CORS |
| [Minor Planet Center](http://www.asterank.com/mpc) | Asterank.com Information | No | No | Unknown |
| [NASA](https://api.nasa.gov) | NASA data, including imagery | No | Yes | No |
| [NASA ADS](https://ui.adsabs.harvard.edu/help/api/api-docs.html) | NASA ... |
Allow for skipping custom env.d file checks
Currently if the env.d check fails and halts the upgrade, re-running
the run-upgrade.sh script doesn't pass the skip-tags flag to
Ansible. This prompts the user to set the env variable if they want
to further skip checks and then passes the skip-tags as necessary. | layout in {{ repo_root_dir }}/inventory/env.d. The difference between these files
should be carefully reviewed to understand whether the changes are still necessary
and applicable to the environment. If all the user-space env.d files are necessary,
- then please re-run this playbook with the CLI option '--skip-tags cus... |
Integrating MaskedAdagrad
Summary: Pull Request resolved:
Test Plan: unit test | @@ -519,7 +519,8 @@ class AdagradOptimizer(Optimizer):
def __init__(self, alpha=0.01, epsilon=1e-4, decay=1, policy="fixed",
sparse_dedup_aggregator=None, rowWise=False, engine='',
lars=None, output_effective_lr=False,
- output_effective_lr_and_update=False, **kwargs):
+ output_effective_lr_and_update=False,
+ mask_ten... |
utilities: Add fastpath when checking if object is compatible with itself
This is a common situation in tests that assign the same object to e.g.
the default variable, and then to the variable for execution. | @@ -459,6 +459,12 @@ def iscompatible(candidate, reference=None, **kargs):
# ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()
pass
+ # If the two are the same thing, can settle it right here
+ # This is a common pattern for tests that use the same structure
+ # as ... |
Use default Redis port in RedisStore constructor
Summary: TSIA | @@ -25,7 +25,7 @@ namespace rendezvous {
class RedisStore : public Store {
public:
- RedisStore(const std::string& host, int port);
+ explicit RedisStore(const std::string& host, int port = 6379);
virtual ~RedisStore();
virtual void set(const std::string& key, const std::vector<char>& data)
|
Fixed docstring errors in function
* Renamed function parameters to avoid name conflicts in documentation
build | @@ -358,14 +358,14 @@ RHEL6 = "Red Hat Enterprise Linux Server release 6.5 (Santiago)"
RHEL7 = "Red Hat Enterprise Linux Server release 7.0 (Maipo)"
-def redhat_release(major, minor=""):
+def redhat_release(rel_major, rel_minor=""):
"""
Helper function to construct a redhat-release string for a specific RHEL
major and ... |
Several fixes; more informative logging at the beginning of training.
Now buckets never exceed max_batch_len and ordering is maintained when shuffle is false | @@ -12,7 +12,6 @@ import logging
from operator import itemgetter
from torch.utils.data import RandomSampler, DistributedSampler, Sampler
import numpy as np
-import math
from typing import List
from speechbrain.dataio.dataset import DynamicItemDataset
@@ -334,7 +333,7 @@ class DynamicBatchSampler(Sampler):
):
self._data... |
refactor: test_ui_tools: Move reactions_view outputs into parameters.
Add expected_text and expected_attributes to test parameters, from test
body. | @@ -2736,8 +2736,9 @@ class TestMessageBox:
# FIXME This is the same parametrize as MsgInfoView:test_height_reactions
@pytest.mark.parametrize(
- "to_vary_in_each_message",
+ "to_vary_in_each_message, expected_text, expected_attributes",
[
+ case(
{
"reactions": [
{
@@ -2780,11 +2781,26 @@ class TestMessageBox:
},
"rea... |
Reset circle ci cache keys.
Testing Circle CI before submitting PR. | @@ -71,27 +71,27 @@ jobs:
- restore_cache:
name: Restore /opt/conda from cache
keys:
- - v0-opt-conda-{{ checksum "~/python_version.md5" }}
+ - v11-opt-conda-{{ checksum "~/python_version.md5" }}
- restore_cache:
name: Restore virtualenv from cache
keys:
- - v0-python-venv-{{ checksum "~/python_version.md5" }}
+ - v11-... |
Add illustration of connect via hostname, password, etc.
When using this library, I could not find a demonstration of how to connect with hostname and password without reading the source code. | @@ -86,6 +86,8 @@ Simple consumer:
async def main(loop):
+ # Connecting with the given parameters is also possible.
+ # aio_pika.connect_robust(host="host", login="login", password="password")
connection = await aio_pika.connect_robust(
"amqp://guest:guest@127.0.0.1/", loop=loop
)
|
Escape markdown in faulty source commands
Closes | @@ -2,7 +2,7 @@ import inspect
from pathlib import Path
from typing import Optional, Tuple, Union
-from discord import Embed
+from discord import Embed, utils
from discord.ext import commands
from bot.bot import Bot
@@ -36,7 +36,7 @@ class SourceConverter(commands.Converter):
return argument.lower()
raise commands.BadA... |
TST: pass `solver` kwarg to function `synth.is_realizable`
because `solver` is an argument given to the function
`multiple_env_actions_check`. | @@ -411,7 +411,7 @@ def multiple_env_actions_check(solver='omega'):
moore=False,
plus_one=False,
qinit='\A \E')
- r = synth.is_realizable(solver, specs, sys=sys)
+ r = synth.is_realizable(specs, sys=sys, solver=solver)
assert r
# slightly relax assumption
specs = spec.GRSpec(
@@ -419,7 +419,7 @@ def multiple_env_action... |
Update __init__.py
fix check for pyarrow | @@ -37,7 +37,7 @@ from .utils import *
from .utils.tqdm_utils import disable_progress_bar
-if int(pyarrow.__version__.split(".")[1]) < 16 or int(pyarrow.__version__.split(".")[0]) > 0:
+if int(pyarrow.__version__.split(".")[1]) < 16 and int(pyarrow.__version__.split(".")[0]) == 0:
raise ImportWarning(
"To use `nlp`, th... |
Update hyperspectral_tutorial.md
update to apply_mask param | @@ -207,10 +207,10 @@ Binary mask after [filtering objects by the region of interest](roi_objects.md)
# Apply the mask of the leaf to the entire datacube, and store it where the datacube is stored.
# Inputs:
- # rgb_img - RGB image data or hyperspectral image data
+ # img - RGB image data or hyperspectral image data
# ... |
Node schema: split remote and local
With this change, the schema structure is clear. The following changes
will update node classes. | @@ -587,33 +587,27 @@ class Capability(NodeSpace):
self.node_count = 1
-@dataclass_json()
+@dataclass_json(undefined=Undefined.INCLUDE)
@dataclass
-class LocalNode(TypedSchema):
- type: str = field(
- default=constants.ENVIRONMENTS_NODES_LOCAL,
- metadata=metadata(
- required=True,
- validate=validate.OneOf([constants.... |
Fix calls to disconnect after logout
Introduced by | @@ -606,9 +606,6 @@ class TelegramBaseClient(abc.ABC):
# You don't need to use this if you used "with client"
await client.disconnect()
"""
- if self.session is None:
- return # already logged out and disconnected
-
if self.loop.is_running():
# Disconnect may be called from an event handler, which would
# cancel itself... |
Add TODOs for all pending review feedback
Should be incorporated before merging to master. | @@ -23,6 +23,9 @@ class RemoteState(object):
key = self._cache_key(resource)
if key in self._cache:
return self._cache[key]
+ # TODO: This code will likely be refactored and pulled into
+ # per-resource classes so the RemoteState object doesn't need
+ # to know about every type of resource.
if isinstance(resource, mode... |
Don't crash when receiving updates prior to login
Fixes and enables | @@ -397,6 +397,9 @@ class UpdateMethods:
# Some updates require our own ID, so we must make sure
# that the event builder has offline access to it. Calling
# `get_me()` will cache it under `self._self_input_peer`.
+ #
+ # It will return `None` if we haven't logged in yet which is
+ # fine, we will just retry next time ... |
fix printing a node header (a kind wasn't being printed)
Summary: Pull Request resolved: | @@ -244,22 +244,20 @@ std::ostream &Node::print(std::ostream &out, size_t level,
auto* pyOp = static_cast<const ::torch::jit::PythonOp*>(this);
out << "^" << pyOp->name();
pyOp->writeScalars(out);
- } else if (print_attributes) {
- if (hasAttribute(attr::Subgraph) && groups) {
+ } else if (hasAttribute(attr::Subgraph) ... |
Fixed i386 build
TypeError: longs are not supported for this option | @@ -383,7 +383,7 @@ class HttpCurlTimeoutLoaderTestCase(DummyAsyncHttpClientTestCase):
config = Config()
config.HTTP_LOADER_CURL_ASYNC_HTTP_CLIENT = True
config.HTTP_LOADER_CURL_LOW_SPEED_TIME = 1
- config.HTTP_LOADER_CURL_LOW_SPEED_LIMIT = 1000000000000
+ config.HTTP_LOADER_CURL_LOW_SPEED_LIMIT = 1000000000
ctx = Cont... |
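On 32-bit builds, pycurl options are C `long`s capped at 2**31 - 1, so the old constant overflowed (hence the `TypeError`). The two values against that cap:

```python
INT32_MAX = 2**31 - 1             # largest value a 32-bit signed long holds

print(1000000000000 > INT32_MAX)  # True: old value overflows on i386
print(1000000000 <= INT32_MAX)    # True: new value fits
```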
Reset filters if a new search term is searched for.
Disable filter selection when nothing to filter by. | :label="$tr('resourceType')"
:options="contentKindFilterOptions"
:inline="true"
+ :disabled="!contentKindFilterOptions.length"
class="filter"
v-model="contentKindFilterSelection"
/>
:label="$tr('channels')"
:options="channelFilterOptions"
:inline="true"
+ :disabled="!channelFilterOptions.length"
class="filter"
v-model=... |
[bugfix] Remove test_bluwiki test_T235768_failure
Neither nor exists
(any longer). Remove this test. | @@ -88,10 +88,6 @@ class StandardVersionSiteTestCase(SiteDetectionTestCase):
"""Test detection of MediaWiki sites for en.wikifur.com."""
self.assertSite('https://en.wikifur.com/wiki/$1')
- def test_bluwiki(self):
- """Test detection of MediaWiki sites for bluwiki.com."""
- self.assertSite('http://bluwiki.com/go/$1')
-
... |
Fix bug causing default flag argument to fail.
`",".join` fails when passed a list of integers | @@ -115,7 +115,8 @@ def evaluate_task_on_model(task: str, model: str):
container_cmd.append(json_file)
if FLAGS.json_shots:
- container_cmd.append(f"--json_shots={','.join(FLAGS.json_shots)}")
+ json_shots = [str(shot) for shot in FLAGS.json_shots]
+ container_cmd.append(f"--json_shots={','.join(json_shots)}")
if FLAGS... |
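`str.join` requires an iterable of strings, so a list of integer shot counts raises `TypeError`; the fix converts each element first. A minimal reproduction:

```python
shots = [0, 1, 5]

try:
    ",".join(shots)              # fails: join needs strings, not ints
except TypeError as exc:
    print(type(exc).__name__)    # TypeError

joined = ",".join(str(s) for s in shots)
print(joined)                    # 0,1,5
```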
Fix, add Missing Scipy v1.6.0 Implicit Imports to ImplicitImports Plugin
* Adding missing implicit imports used in scipy v1.6.0
Not sure from which version of scipy this is needed, but on my machine it's version v1.6.0 | @@ -770,8 +770,12 @@ class NuitkaPluginPopularImplicitImports(NuitkaPluginBase):
yield "scipy.sparse.csgraph._validation"
elif full_name == "scipy._lib":
yield "scipy._lib.messagestream"
+ elif full_name == "scipy.spatial":
+ yield "scipy.spatial.transform"
+ elif full_name == "scipy.spatial.transform":
+ yield "scipy.... |
enhancement: add a utility function for memoization
add a utility function for memoization; it will be used to cache some lists
of instances of processor classes, for example, later. | from __future__ import absolute_import
import collections
+import functools
import glob
import itertools
import os.path
@@ -448,4 +449,30 @@ def filter_options(keys, options):
"""
return dict((k, options[k]) for k in keys if k in options)
+
+def memoize(fnc):
+ """memoization function.
+
+ >>> import random
+ >>> imax ... |
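The diff is cut off before the implementation. A minimal sketch of such a `memoize` decorator, caching on positional arguments as the visible docstring hints; the details are assumptions, not the commit's actual code:

```python
import functools

def memoize(fnc):
    """Cache results of `fnc`, keyed on its (hashable) positional args."""
    cache = {}

    @functools.wraps(fnc)
    def wrapper(*args):
        if args not in cache:
            cache[args] = fnc(*args)
        return cache[args]

    return wrapper

calls = []

@memoize
def square(x):
    calls.append(x)  # record real computations to show caching
    return x * x

print(square(3), square(3))  # 9 9
print(len(calls))            # 1: second call served from the cache
```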
changing simple_least_costs to simple_dispatch
The simple_least_costs example is not executable in the latest oemof version, so I suggest changing it to simple_dispatch or similar. | @@ -248,7 +248,7 @@ Execute an example with different solver (default: 'cbc').
.. code:: console
- oemof_examples simple_least_costs
- oemof_examples simple_least_costs -s glpk
+ oemof_examples simple_dispatch
+ oemof_examples simple_dispatch -s glpk
If you want to run solph examples you need to have a solver installed... |
docs: Note need to log out and in again on push notifs setup.
This often surprises people, so mention it up front.
(Also it'd probably be good to add some code to make this step
unnecessary.) | @@ -32,6 +32,10 @@ follows:
Note that if you installed Zulip older than 1.6, you'll need to add
the line (it won't be there to uncomment).
+4. If you or your users have already set up the Zulip mobile app,
+ you'll each need to log out and log back in again in order to start
+ getting push notifications.
+
That should ... |
fix return type of tfds.load in doc
Was referring to `tfds.data.Dataset` instead of `tf.data.Dataset`. | @@ -291,7 +291,7 @@ def load(
Returns:
ds: `tf.data.Dataset`, the dataset requested, or if `split` is None, a
- `dict<key: tfds.Split, value: tfds.data.Dataset>`. If `batch_size=-1`,
+ `dict<key: tfds.Split, value: tf.data.Dataset>`. If `batch_size=-1`,
these will be full datasets as `tf.Tensor`s.
ds_info: `tfds.core.D... |
Allow running manage.py without arguments
This is useful for listing all available management commands and it is
how Django's bash completion script works | @@ -22,7 +22,7 @@ SITE_NAME = basename(SETTINGS_ROOT)
SITE_ID = 1
# Useful flag for special-casing shell operations
-SHELL = sys.argv[1] in ["shell", "dbshell"]
+SHELL = len(sys.argv) > 1 and sys.argv[1] in ["shell", "dbshell"]
# Add our project to our pythonpath, this way we don't need to type our project
# name in ou... |
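Bare `sys.argv[1]` raises `IndexError` when `manage.py` runs with no arguments; short-circuiting `and` guards the index. The predicate in isolation:

```python
def is_shell(argv):
    """True for `manage.py shell` / `dbshell`; safe for bare `manage.py`."""
    return len(argv) > 1 and argv[1] in ["shell", "dbshell"]

print(is_shell(["manage.py"]))           # False (old code raised IndexError)
print(is_shell(["manage.py", "shell"]))  # True
```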
Worked in suggested edits for PR
see | @@ -3,15 +3,10 @@ Messaging Basics
--------------
-.. |Main menu icon| image:: ../../images/sidebar_main_menu_icon.png
- :width: 33
- :alt: Main menu
-
**Write messages** using the text input box at the bottom of the screen.
Press ENTER to send a message. Use SHIFT+ENTER to create a new
-line without sending a message.... |
Remove mox from nova/tests/unit/consoleauth/test_consoleauth.py
Partially-Implements: blueprint remove-mox-pike | @@ -19,7 +19,6 @@ Tests for Consoleauth Code.
"""
import mock
-from mox3 import mox
from oslo_utils import timeutils
import six
@@ -32,6 +31,8 @@ from nova import test
class ConsoleauthTestCase(test.NoDBTestCase):
"""Test Case for consoleauth."""
+ rpcapi = 'nova.compute.rpcapi.ComputeAPI.'
+
def setUp(self):
super(Con... |
fix args not being passed
In function coherence_function_g2 | @@ -457,6 +457,8 @@ def coherence_function_g2(H, state0, taulist, c_ops, a_op, solver="me", args={},
`me` or `mc`.
a_op : Qobj
operator A.
+ args : dict
+ Dictionary of arguments to be passed to solver.
solver : str
choice of solver (`me` for master-equation and
`es` for exponential series).
@@ -478,7 +480,7 @@ def coh... |
Fix autodiff of nll_loss
Summary: Pull Request resolved: | @@ -129,7 +129,7 @@ bool isDifferentiable(Node* n) {
if (n->matches(
"aten::nll_loss(Tensor self, Tensor target, Tensor? weight, int reduction, int ignore_index) -> Tensor")) {
// TODO(asuhan): support weight
- return n->namedInput(attr::weight)->node()->kind() == prim::Undefined;
+ return n->namedInput(attr::weight)->... |
Don't print newline when decrypting
This is just used by humans at the moment so we haven't cared about the
whitespace that much. I've updated a runbook where I tell people they
can "recreate" a secret using this as the stdin. We need to not print
the newline in that case. | @@ -201,10 +201,12 @@ def paasta_secret(args):
print_paasta_helper(secret_path, args.secret_name, args.shared)
elif args.action == "decrypt":
- print(decrypt_secret(
+ print(
+ decrypt_secret(
secret_provider=secret_provider,
secret_name=args.secret_name,
- ))
+ ), end='',
+ )
else:
print("Unknown action")
sys.exit(1)
|
Update ua.txt
Covering this case more reliably. | @@ -1871,11 +1871,12 @@ jndi:iiop
# Reference: https://twitter.com/BillDemirkapi/status/1470055644740923398
# Reference: https://twitter.com/VessOnSecurity/status/1470373438363734026
# Reference: https://twitter.com/gwillem/status/1470395476570746885
+# Reference: https://twitter.com/11xuxx/status/1471236310299906050
#... |
Use GetModelConstraints from ModelConfigFacade
previously it was ClientFacade.GetModelConstraints | @@ -2012,7 +2012,7 @@ class Model:
:returns: A ``dict`` of constraints.
"""
constraints = {}
- client_facade = client.ClientFacade.from_connection(self.connection())
+ client_facade = client.ModelConfigFacade.from_connection(self.connection())
result = await client_facade.GetModelConstraints()
# GetModelConstraints ret... |
hyperparams -> hyperparameters
Thanks | @@ -25,12 +25,12 @@ def _prepare_study_with_trials(no_trials=False, less_than_two=False, with_c_d=Tr
Args:
no_trials: If ``False``, create a study with no trials.
- less_than_two: If ``True``, create a study with two/four hyperparams where
+ less_than_two: If ``True``, create a study with two/four hyperparameters where... |
Fix issue labeler
It didn't work yet, this is why | @@ -11,5 +11,5 @@ jobs:
with:
repo-token: "${{ secrets.GITHUB_TOKEN }}"
configuration-path: .github/labeler.yml
- not-before: 2022-08-07T00:00:00Z
+ include-title: 1
enable-versioned-regex: 0
|
Fix rename of H7 RM0455 OCTOSPI peripheral
Previous modification did not work as this peripheral is derivedFrom OCTOSPI in
the original SVD | @@ -12,8 +12,6 @@ _modify:
name: FDCAN2
DAC:
name: DAC1
- OCTOSPI1_CONTROL_REGISTER:
- name: OCTOSPI1
# The SVD is just quite different to the RM for all these registers.
# We'll go with the RM convention even though it is inconsistent too.
@@ -313,6 +311,8 @@ AXI:
# Work around the DMA_STR? interrupt mess in the SVD.
... |
AC: fix a divide-by-zero warning in AverageMeter
We're dividing by `increment`, so that's what we should be checking. This was
probably a copy-paste error from `evaluate`. | @@ -41,7 +41,7 @@ class AverageMeter:
loss = float(loss)
else:
loss = loss.astype(float)
- return np.divide(loss, increment, out=np.zeros_like(loss), where=self.total_count != 0)
+ return np.divide(loss, increment, out=np.zeros_like(loss), where=increment != 0)
def evaluate(self):
if self.total_count is None:
|
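The fix above hinges on NumPy's masked division: `where=` controls which elements are computed, and masked positions keep the value from `out`. A minimal standalone sketch (the values here are hypothetical, not from the repo) of why the mask must test the actual divisor:

```python
import numpy as np

loss = np.array([3.0, 0.0, 5.0])
increment = np.array([3, 0, 5])

# Masked division: where increment == 0, the element is not computed
# and the zero from `out` is kept, avoiding a divide-by-zero warning.
avg = np.divide(loss, increment, out=np.zeros_like(loss), where=increment != 0)
print(avg)  # [1. 0. 1.]
```

Masking on an unrelated array (as the buggy `self.total_count != 0` did) would still divide by zero wherever the two masks disagree.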
Add `TRANSITIONING` to possible values in `current_transport_state`
In a loop that async calls get_current_transport_info and prints it during a change from stopped to playing a uri, I noticed that Sonos can also return `TRANSITIONING` as a possible transport state. | @@ -1246,7 +1246,7 @@ class SoCo(_SocoSingletonBase):
Returns:
A dictionary containing the following information about the speakers
playing state
- current_transport_state (PLAYING, PAUSED_PLAYBACK, STOPPED),
+ current_transport_state (PLAYING, TRANSITIONING, PAUSED_PLAYBACK, STOPPED),
current_trasnport_status (OK, ?),... |
Fix tx receipt `status`
It can be `None` or missing for old transactions | @@ -128,7 +128,7 @@ class EthereumTxManager(models.Manager):
if ethereum_tx.block is None:
ethereum_tx.block = EthereumBlock.objects.get_or_create_from_block(block, current_block_number=current_block_number)
ethereum_tx.gas_used = tx_receipt['gasUsed']
- ethereum_tx.status = tx_receipt['status']
+ ethereum_tx.status = ... |
[internal] Add output to confirm immutable input race condition.
[ci skip-build-wheels] | @@ -44,6 +44,15 @@ impl ImmutableInputs {
let digest_str = digest.hash.to_hex();
let path = self.workdir.path().join(digest_str);
+ if let Ok(meta) = tokio::fs::metadata(&path).await {
+ // TODO: If this error triggers, it indicates that we have previously checked out this
+ // directory, either due to a race condition... |
CVE number was assigned
As stated. | id: wp-plugin-marmoset-viewer-xss
info:
- name: Wordpress Plugin Marmoset Viewer XSS
+ name: Wordpress Plugin Marmoset Viewer XSS [CVE-2021-24495]
author: johnjhacking
severity: medium
tags: wordpress,xss
|
urplay: sometimes it adds country code several times
this happens when you download all subtitles. | @@ -52,7 +52,7 @@ class Urplay(Service, OpenGraphThumbMixin):
label = stream["tt"]["language"]
if stream["tt"]["scope"] != "complete":
label = "{}-{}".format(label, stream["tt"]["scope"])
- yield subtitle(copy.copy(self.config), "tt", stream["tt"]["location"], label, output=self.output)
+ yield subtitle(copy.copy(self.... |
[query/service] retry entire partition when we encounter transient errors in compiled code
Ideally, a stream would be able to recover from a transient error by
seeking, but until we have that functionality, this avoids having
one failure out of 5000 (which I have now seen twice).
Example: | @@ -138,7 +138,9 @@ object Worker {
var result: Array[Byte] = null
var userError: HailException = null
try {
+ retryTransientErrors {
result = f(context, htc, theHailClassLoader, fs)
+ }
} catch {
case err: HailException => userError = err
}
|
Typo fix in text sentiment tutorial
Trivial typo fix in docs | @@ -101,7 +101,7 @@ label_pipeline = lambda x: int(x) - 1
#
# Before sending to the model, ``collate_fn`` function works on a batch of samples generated from ``DataLoader``. The input to ``collate_fn`` is a batch of data with the batch size in ``DataLoader``, and ``collate_fn`` processes them according to the data proc... |
StandardNodeGadget : Remove outdated LRUCache workaround
LRUCache now rethrows the previous exception when `get()` is called again for a second time. | @@ -87,16 +87,7 @@ class StandardNodeGadget::ErrorGadget : public Gadget
void addError( PlugPtr plug, const std::string &error )
{
PlugEntry &entry = m_errors[plug];
- if( entry.error.empty() || !boost::ends_with( error, "Previous attempt to get item failed." ) )
- {
- // Update the error message. Unfortunately the IEC... |
packaging: Use cpu directory by default
Default to using the `cpu` directory since the base one is not reliable | @@ -34,7 +34,7 @@ setup_cuda() {
# First, compute version suffixes. By default, assume no version suffixes
export VERSION_SUFFIX=""
export PYTORCH_VERSION_SUFFIX=""
- export WHEEL_DIR=""
+ export WHEEL_DIR="cpu/"
# Wheel builds need suffixes (but not if they're on OS X, which never has suffix)
if [[ "$BUILD_TYPE" == "w... |
Added .explain() method to QueryBuilder.
This provides a way to access the query execution plan. | @@ -1600,6 +1600,16 @@ class QueryBuilder(ObservesEvents):
sql = grammar.compile(self._action, qmark=False).to_sql()
return sql
+ def explain(self):
+ """Explains the Query execution plan.
+
+ Returns:
+ Collection
+ """
+ sql = self.to_sql()
+ explanation = self.statement(f'EXPLAIN {sql}')
+ return explanation
+
def r... |
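The `.explain()` method above works by prefixing the compiled SQL with `EXPLAIN` and running it as a plain statement. A self-contained sketch of the same idea using SQLite's `EXPLAIN QUERY PLAN` variant (the table and query are hypothetical, and the plan rows returned differ from what a Masonite `Collection` would hold):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

sql = "SELECT * FROM users WHERE id = 1"
# Prefixing the query asks the engine for its execution plan
# instead of the result rows.
plan = conn.execute(f"EXPLAIN QUERY PLAN {sql}").fetchall()
for row in plan:
    print(row)
```

The exact plan text varies by SQLite version, but a primary-key lookup like this reports an index search rather than a full table scan.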
update data owner upload dataset notebook to use dataset url
update utils to use dataset url instead of participation number | @@ -58,20 +58,10 @@ def split_into_train_test_val_sets(data, test=0.10, val=0.10):
return data_dict
-def load_data_as_df(
- participation_number, total_participants, file_path="./MedNIST.pkl"
-):
+def load_data_as_df(file_path="./MedNIST.pkl"):
df = pd.read_pickle(file_path)
df.sort_values("patient_id", inplace=True, i... |
Fix a bug about variable spelling errors
The variable subsampling_factors in class CustomConverterMulEnc was incorrectly written as subsamping_factors, resulting in inconsistency. | @@ -324,12 +324,12 @@ class CustomConverterMulEnc(object):
"""
- def __init__(self, subsamping_factors=[1, 1], dtype=torch.float32):
+ def __init__(self, subsampling_factors=[1, 1], dtype=torch.float32):
"""Initialize the converter."""
- self.subsamping_factors = subsamping_factors
+ self.subsampling_factors = subsampl... |
update mpca.py docstring
update docstring for two additional attributions shape_in and shape_out | @@ -72,8 +72,10 @@ class MPCA(BaseEstimator, TransformerMixin):
max_iter (int, optional): max number of iteration. Defaults to 1.
Attributes:
- proj_mats: a list of transposed projection matrices
- idx_order: the ordering index of projected (and vectorised) features in decreasing variance
+ proj_mats (list): a list of ... |
Add UUID to batch jobs
This was somehow lost in the monorepo-ization, was merged into batch as | @@ -119,7 +119,8 @@ class Job(object):
metadata = kube.client.V1ObjectMeta(generate_name = 'job-{}-'.format(self.id),
labels = {
'app': 'batch-job',
- 'hail.is/batch-instance': instance_id
+ 'hail.is/batch-instance': instance_id,
+ 'uuid': uuid.uuid4().hex
}),
spec = pod_spec)
|
tests: Speed up transport config tests by avoiding interpreter discovery
Reduced execution time of tests/ansible/integration/transport_config/all.yml
from 11 minutes to 49 seconds. | # integration/transport_config
# Hosts with twiddled configs that need to be checked somehow.
+[transport_config:children]
+transport_config_undiscover
+tc_python_path
-# tansport()
+[transport_config_undiscover:children]
+tc_become
+tc_become_method
+tc_become_pass
+tc_become_user
+tc_password
+tc_port
+tc_remote_addr... |
Reword somewhat confusing "top or bottom" description
Relates to | {% if stats.interesting %}
{% for measure in stats.interesting %}
{% if forloop.first %}
- <p>Over the last three months, we found that this {{ bookmark.org_type }} was in the top or bottom 10% on
+ <p>Over the last three months, we found that this {{ bookmark.org_type }} deviated a long way from the median practice on... |
Clarify degree to which DDT is implemented
This commit reflects the fact that `first_derivative` is almost but
not quite a drop-in replacement for DDT. Using red for this instance
seemed too pessimistic, so I opted for the blue indication. | @@ -367,10 +367,10 @@ blue is uncertain of parity, and white is unevaluated.
<tr>
<td class="tg-implemented">DDT(S)</td>
<td class="tg-implemented">Time derivative</td>
- <td class="tg-implemented"><a href="api/generated/generated/metpy.calc.first_derivative.html">metpy.calc.first_derivative</a></td>
- <td></td>
- <td ... |
Add BaseException.__getitem__ & __getslice__
Refs python/mypy#4215
Fixes false positive
error: Value of type Exception is not indexable | @@ -874,6 +874,9 @@ class BaseException(object):
args = ... # type: Tuple[Any, ...]
message = ... # type: str
def __init__(self, *args: object, **kwargs: object) -> None: ...
+ def __getitem__(self, i: int) -> Any: ...
+ def __getslice__(self, start: int, stop: int) -> Tuple[Any, ...]: ...
+
class GeneratorExit(BaseExc... |
Update README_EN.md
update link for figure 3.4 | @@ -75,7 +75,7 @@ The meaning of the expression is the increase or decrease of the document $U_i$
Based on the above inference, the RankNet network structure is constructed, which is composed of several layers of hidden layers and full connected layers. As shown in the figure, the document features are used in the hidd... |
packages/dcos-image-deps: remove enum34 dependency
That dependency only makes sense when executing Python code
in a CPython interpreter older than version 3.4. | "url": "https://pypi.python.org/packages/cf/23/ef729d6ef3a9d19732d480eaaf94a72799a99a38ed25eda10f8e68ffd408/azure_storage-0.32.0-py3-none-any.whl",
"sha1": "3cc425a291fe6359f6d786aa040059004082795d"
},
- "enum34": {
- "kind": "url",
- "url": "https://pypi.python.org/packages/af/42/cb9355df32c69b553e72a2e28daee25d1611d2... |
Fix frontend entrypoint
it's either bash:
```
[[ ! -f "/certs/cert.crt" || ! -f "/certs/key.key" ]]
```
or shell:
```
[ ! -f "/certs/cert.crt" ] || [ ! -f "/certs/key.key" ]
```
but not a combination of both. | @@ -6,7 +6,7 @@ then
exit 0
fi
-if [ ! -f "/certs/cert.crt" || ! -f "/certs/key.key" ]; then
+if [ ! -f "/certs/cert.crt" ] || [ ! -f "/certs/key.key" ]; then
echo "No certificates found. Generating self signed"
openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout /certs/key.key -out /certs/cert.crt -subj "/C=US... |
Update README.md
add Checklist in list of attack recipes | @@ -109,6 +109,8 @@ Attacks on classification tasks, like sentiment classification and entailment:
- **alzantot**: Genetic algorithm attack from (["Generating Natural Language Adversarial Examples" (Alzantot et al., 2018)](https://arxiv.org/abs/1804.07998)).
- **bae**: BERT masked language model transformation attack f... |
RegexBlock error_message typo in docs
In the documention an example had a singular | @@ -132,7 +132,7 @@ A single-line text input that validates a string against a regex expression. The
.. code-block:: python
- blocks.RegexBlock(regex=r'^[0-9]{3}$', error_message={
+ blocks.RegexBlock(regex=r'^[0-9]{3}$', error_messages={
'invalid': "Not a valid library card number."
})
|
Support 'applications' key in bundles
Fixes | @@ -1658,8 +1658,9 @@ class BundleHandler(object):
apps, args = [], []
default_series = bundle.get('series')
+ apps_dict = bundle.get('applications', bundle.get('services', {}))
for app_name in self.applications:
- app_dict = bundle['services'][app_name]
+ app_dict = apps_dict[app_name]
charm_dir = os.path.abspath(os.p... |
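The fallback pattern in this fix — prefer the newer `applications` key but still accept the legacy `services` key — is a chained `dict.get`. A small sketch with hypothetical bundle dicts:

```python
new_style = {"applications": {"mysql": {"charm": "cs:mysql"}}}
old_style = {"services": {"mysql": {"charm": "cs:mysql"}}}

def apps(bundle):
    # Try the modern key first; fall back to the legacy one,
    # and finally to an empty mapping so iteration is always safe.
    return bundle.get("applications", bundle.get("services", {}))

print(apps(new_style) == apps(old_style))  # True
```

Note that the inner `bundle.get("services", {})` is evaluated eagerly even when `applications` is present; that is harmless here but worth remembering when the fallback is expensive to compute.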
Suppress stderr for expected errors
Messages caused confusion for some users | @@ -77,7 +77,7 @@ function setup_target_org() {
if [ -z "$CF_ORG" ]; then
CF_ORG={{ context.name }}-org
fi
- if ! $CF org $CF_ORG >/dev/null; then
+ if ! $CF org $CF_ORG >/dev/null 2>/dev/null; then
cf create-org $CF_ORG
ignore_failure=`$CF set-quota $CF_ORG runaway`
fi
@@ -88,7 +88,7 @@ function setup_target_space() {... |
Updated vae readme
fix broken link with regards to sigma vae samples | @@ -82,7 +82,7 @@ python run.py -config ./configs/vanilla_vae.yaml
[5]: https://github.com/probml/pyprobml/blob/master/scripts/vae/assets/info_vae_recon.png
[6]: https://github.com/probml/pyprobml/blob/master/scripts/vae/assets/logcosh_vae_recon.png
[7]: https://github.com/probml/pyprobml/blob/master/scripts/vae/assets... |