| id | content |
|---|---|
codereview_python_data_453 | ----------
angle : float
Rotation angle in degrees.
- axis : array_like or tuple of 2 AtomGroups
- Rotation axis vector. If a tuple is given the axis will be
- determined by the difference vector of the centroid for both
- AtomGroups.
point... |
codereview_python_data_456 | try:
state = ext_handler.properties.state
- self.get_artifact_error_state.reset()
if self.last_etag == etag:
if self.log_etag:
ext_handler_i.logger.verbose("Incarnation {0} did not change, not processing GoalState", etag)
the error sta... |
codereview_python_data_461 | @pyqtSlot(usertypes.KeyMode)
def _on_mode_entered(self, mode):
self._tab.run_js_async(
- javascript.assemble('caret', 'setInitialCursor'))
@pyqtSlot(usertypes.KeyMode)
def _on_mode_left(self):
You do `self._tab.run_js_async(javascript.assemble('caret', ...)))` a lot here. Why no... |
codereview_python_data_465 | err = "No atoms found in obj argument"
with pytest.raises(TypeError, match=err):
c = ParmEdConverter()
- c.convert("we still don't support emojis :(")
Do we not? I thought py3+ was UTF-8 compliant?
err = "No atoms found in obj argument"
with pytest.raises(TypeError, match=err):
... |
codereview_python_data_466 | forseti_config_file_path (str): Path to Forseti configuration file.
log_level (str): Sets the threshold for Forseti's logger.
enable_console_log (bool): Enable console logging.
- enable_debug_mode (bool): Enable console logging.
max_workers (int): maximum number of workers for... |
codereview_python_data_468 | 'google-auth-httplib2==0.0.3',
'Jinja2==2.10.1',
'jmespath==0.9.3',
- 'mailjet-rest==1.3.3',
'netaddr==0.7.19',
'pyyaml==4.2b4',
'python-graph-core==1.8.2',
I recommend that we move this to be optional, as other users might not need it. Can you look at `OPTIONAL_PACKAGES` section, arou... |
codereview_python_data_470 | iters_num = 17
num_workers = 4
for prefetch_queue_depths in ((3, 1, 1), (1, 3, 1), (1, 1, 3), (1, 1, 1), (3, 3, 3)):
- for cycle_policies in (("raise", "raise"), ("quiet", "raise"), ("raise", "quiet"), ("quiet", "quiet")):
for epoch_sizes in ((8, 4, 6), (8, 6, 4), (4, 6, 8), (1, 1, 1)... |
codereview_python_data_474 | "in minutes from current time. Prevents infinite loop when stop is none")
minutes_interval = luigi.IntParameter(
- default=5,
description="separation between events in minutes"
)
This seems way too arbitrary. Why not just `default=1`?
"in minutes from... |
codereview_python_data_482 | if config.val.qt.force_platform is not None:
os.environ['QT_QPA_PLATFORM'] = config.val.qt.force_platform
if config.val.qt.highdpi:
os.environ['QT_AUTO_SCREEN_SCALE_FACTOR'] = '1'
You'll still need to do this (but with the new setting name), otherwise the window decoration shows up again on... |
codereview_python_data_490 | def tick(self):
self.health_record.heartbeat()
self.cell = self.get_meta_cell()
- inventory.refresh_inventory()
now = time.time() * 1000
`refresh_inventory` will trigger a packet sent to the server, which is not the same behavior as the original.
def tick(self):
self.health_rec... |
codereview_python_data_494 | from datetime import date
import os
-from unittest.mock import patch, mock_open
# pylint: disable=redefined-outer-name
import pytest
No blocker here but another way to write this could be: ``` mock_join = "test_api_report_yamls/complex_metadata.yaml" monkeypatch.setattr(os.path, "join", Mock(return_value=mock_join... |
codereview_python_data_504 | return console_handler, ram_handler
-def change_loglevel(level):
- value = LOG_LEVELS[level.upper()]
- console_handler.setLevel(value)
-
-
def _init_formatters(level, color, force_color, json_logging):
"""Init log formatters.
I'd do this inside the command, no point in adding a new function in `log.py... |
codereview_python_data_505 | <h1>Error 503 Backend is unhealthy</h1>
<p>Backend is unhealthy</p>
<h3>Guru Mediation:</h3>
- <p>Details: cache-sea4482-SEA 1645521621 1839665831</p>
<hr>
<p>Varnish cache server</p>
</body>
Getting the below error for this import Unable to register plugins: cannot import name 'DEFAULT_... |
codereview_python_data_521 | # -*- TRACE -*-
try:
try:
- from celery.concurrency.future import get_future_executor
except RuntimeError:
R = retval = fun(*args, **kwargs)
state = SUCCESS
What happens ... |
codereview_python_data_525 | self._molecule.driver.testinfra_args,
self._molecule.verifier_options)
testinfra_options['env'] = ansible.env
- testinfra_options['debug'] = self._molecule.args.get('debug', False)
if self._molecule.args.get('sudo'):
testinfra_options['sudo'] = True
Would... |
codereview_python_data_531 | # If title is empty, it couldn't be generated.
if not title:
return WorkerResult.SUCCESS
- self._update_title(title, _platform)
- if(self.terminal is True):
self._log_on_terminal(title)
return WorkerResult.SUCCESS
Parenthesis are not necessary.
... |
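The reviewer's point in the row above is that `if(self.terminal is True):` is doubly redundant in Python: the parentheses are C-style noise and the `is True` comparison is unnecessary for a boolean flag. A minimal sketch (the class and attribute names here are hypothetical stand-ins for the worker in the diff):

```python
class TitleWorker:
    """Hypothetical stand-in for the worker being reviewed above."""

    def __init__(self, terminal):
        self.terminal = terminal
        self.logged = False

    def maybe_log(self, title):
        # Idiomatic: no parentheses around the condition, no "is True"
        if self.terminal:
            self.logged = True
        return self.logged
```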
codereview_python_data_535 | ' number of rows is unknown. Make sure there is at least'
' one column in the frame so number of rows can be inferred.' % name)
if self.initializer is None:
- self.set_initializer()
# TODO(minjie): directly init data on the targer devi... |
codereview_python_data_538 | if __name__ == "__main__":
from Bio._utils import run_doctest
This line doesn't do anything - you import the function `run_doctest` but don't use it: ``` python from Bio._utils import run_doctest ```
if __name__ == "__main__":
from Bio._utils import run_doctest
+ run_doctest(verbose=0) |
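The fix above wires the previously unused import into an actual call. `Bio._utils.run_doctest` is Biopython-internal, but the same pattern can be sketched with the standard-library `doctest` module (the `gc_fraction` function here is an illustrative assumption, not Biopython code):

```python
import doctest

def gc_fraction(seq):
    """Return the fraction of G/C letters in a DNA string.

    >>> gc_fraction("ATGC")
    0.5
    """
    return (seq.count("G") + seq.count("C")) / len(seq)

# Equivalent in spirit to run_doctest(verbose=0): without this call,
# importing doctest alone does nothing, which was the reviewer's point.
result = doctest.testmod(verbose=False)
```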
codereview_python_data_545 | run_keras_single_device('cpu', 0)
-@raises(Exception, "TF device and DALI device mismatch. TF*: CPU, DALI*: GPU for output*")
def test_keras_wrong_placement_gpu():
with tf.device('cpu:0'):
model = keras_model()
You don't need to add trailing `*` at the end of glob pattern.
run_keras_single_de... |
codereview_python_data_552 | yield {
'resource_id': violation.resource_id,
'resource_type': violation.resource_type,
- 'resource_name': violation.resource_name,
'full_name': violation.full_name,
'rule_index': violation.rule_index,
'rule... |
codereview_python_data_553 | headers["Content-Type"] = "application/json"
self.log.debug("Request: %s %s %s", log_method, url, data[:self.logger_limit] if data else None)
- with log_std_streams(logger=self.log):
- response = self.http_request(method=log_method, url=url, data=data,
- ... |
codereview_python_data_560 | family=task.task_family,
module=task.task_module,
retry_policy_dict=_get_retry_policy_dict(task),
- deps_retry_policy_dicts=deps_retry_policy_dicts)
def _validate_dependency(self, dependency):
if isinstance(depen... |
codereview_python_data_561 | Parameters
----------
row_labels : list, optional
- Indices of rows to select.
row_positions : list-like of ints, optional
Numeric indices of rows to select.
col_labels : list, optional
- Indices of columns to select.
col_positions... |
codereview_python_data_563 | from google.cloud.security.common.data_access import csv_writer
from google.cloud.security.common.data_access import firewall_rule_dao
-from google.cloud.security.common.gcp_type.resource import ResourceType
from google.cloud.security.common.gcp_type import resource_util
from google.cloud.security.scanner.audit imp... |
codereview_python_data_565 | ordered = []
newscripts = []
for s in scripts:
- if s[-2:] != "py":
- continue
if s in current:
ordered.append(current[s])
else:
Any reason to hard-code and enforce only `.py` files? I could ... |
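Beyond the hard-coding concern, the diff's test `s[-2:] != "py"` is also subtly wrong: it keeps any name whose last two characters happen to be `py`. A sketch of a safer check (the helper name is an assumption):

```python
import os

def is_python_script(name):
    # s[-2:] != "py" would also keep names like "happy";
    # checking the real extension avoids that false positive.
    return os.path.splitext(name)[1] == ".py"
```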
codereview_python_data_567 | testcase.get_metadata('last_tested_crash_revision') or
testcase.crash_revision)
fuzzer_display = get_fuzzer_display(testcase)
- fuzzer_name = fuzzer_display.name or ''
- fuzz_target = fuzzer_display.target or ''
- engine = (fuzzer_display.engine or '').lower()
- sanitizer = environment.get_memory_t... |
codereview_python_data_569 | """
universe = MDAnalysis.Universe(topology_path)
for element in elements:
- assert element in universe._topology[topology_section]
def test_all_bonds():
If you've gone to the effort of writing down all the bond names and passing them this far, you may as well include some sort of detailed error... |
codereview_python_data_574 | def validateFloat(value):
- return isinstance(value, float)
def validateInteger(value):
What happens if we pass an "integer" value to a parameter that expects a float? I suspect that the evaluator (since it doesn't interact with the param definition at all) will pass it in as an integer type and this will fail.
d... |
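The reviewer's suspicion is well-founded: `isinstance(3, float)` is `False`, so an integer passed to a float parameter would fail this validator. A sketch of the failure mode and one possible lenient variant (the lenient helper is an assumption, not the project's actual fix):

```python
def validate_float(value):
    # The diff's check: rejects ints, because isinstance(3, float) is False
    return isinstance(value, float)

def validate_float_lenient(value):
    # One possible fix: accept ints too, but not bools
    # (bool is a subclass of int in Python)
    return isinstance(value, (int, float)) and not isinstance(value, bool)
```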
codereview_python_data_576 | if self.is_cpp_class_scope and function and function.scope is self:
# for C++ classes we can have both member and non-member operators
# and we really want to consider both
- outer_scope = self.outer_scope
- while outer_scope and not outer_scope.is_module_scope:
... |
codereview_python_data_579 | """List members by prefix.
Args:
- member_name_prefix(str): the prefix of member_name to query
Returns:
proto: the returned proto message of list_members
Same thing, it's not clear exactly what the `prefix` is here. So it would be nice to have a short example.
... |
codereview_python_data_585 | # https://github.com/mitmproxy/mitmproxy/issues/2197
if hf.request.http_version == "HTTP/2.0":
hf.request.http_version = "HTTP/1.1"
- host = hf.request.headers.pop(":authority", hf.request.pretty_host)
- hf.request.headers.insert(0, "host", host)
... |
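The diff above downgrades an HTTP/2 flow for replay: HTTP/2 carries the host in the `:authority` pseudo-header, while HTTP/1.1 expects a leading `host` header. A plain-dict sketch of that logic (mitmproxy's real `Headers` type behaves differently; this only illustrates the transformation):

```python
def downgrade_to_http1(headers, pretty_host, http_version):
    """Move ":authority" (falling back to the pretty host) into a
    leading "host" header when rewriting HTTP/2.0 to HTTP/1.1."""
    headers = dict(headers)
    if http_version == "HTTP/2.0":
        host = headers.pop(":authority", pretty_host)
        headers = {"host": host, **headers}  # "host" first, as insert(0, ...)
        return "HTTP/1.1", headers
    return http_version, headers
```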
codereview_python_data_588 | )
return None
- evm_assets = (
- AssetType.ETHEREUM_TOKEN,
- AssetType.POLYGON_TOKEN,
- AssetType.XDAI_TOKEN,
- AssetType.AVALANCHE_TOKEN,
- )
if asset_type in evm_assets:
cursor.execute(
Move that to a constant. ... |
codereview_python_data_589 | import dgl
-import sys
-import random
import time
import numpy as np
from multiprocessing import Process
What's the type of `g` here? `GraphStore`?
import dgl
import time
import numpy as np
from multiprocessing import Process |
codereview_python_data_594 | from bzt.modules.siege import SiegeExecutor, DataLogReader
from tests import BZTestCase
from tests.mocks import EngineEmul
-from bzt.utils import is_windows
-def tool_name():
- if is_windows():
- return 'siege.bat'
- else:
- return 'siege.sh'
def get_res_path(resource):
Look how it's done every... |
codereview_python_data_599 | # Generate autosummary pages. Output should be set with: `:toctree: pythonapi/`
autosummary_generate = ['Python-API.rst']
-# Add any paths that contain templates here, relative to this directory.
-templates_path = ['_templates']
-# The suffix(es) of source filenames.
-# You can specify multiple suffix as a list of st... |
codereview_python_data_614 | os.environ,
)
- create_args(x, zipline.extension_args)
-
def extract_option_object(option):
"""Convert a click.option call into a click.Option object.
should we parse the args before loading the extension? I could imagine the extension code wanting access to this to find resources
os.e... |
codereview_python_data_615 | @singledispatch
def get_validator_set(conn, height):
- """Get validator set at which are not synced"""
raise NotImplementedError
The docstring is not very clear.
@singledispatch
def get_validator_set(conn, height):
+ """Get validator set for a given `height`, if `height` is not specified
+ then retu... |
codereview_python_data_618 | worker_chunk = chunk_size + (minibatch_i < remainder)
if worker_chunk == 0:
break
- sample_slice = sample_range.get_slice(queued_no, queued_no + worker_chunk)
minibatch = TaskArgs(minibatch_i, sample_range=sample_slice)
minibatches.append(m... |
codereview_python_data_624 | def run_tasks(services):
loop = asyncio.get_event_loop()
loop.create_task(build_docs())
- loop.run_until_complete(app_svc.validate_requirements())
loop.run_until_complete(data_svc.restore_state())
loop.run_until_complete(RestApi(services).enable())
loop.run_until_complete(app_svc.register_c... |
codereview_python_data_626 | if not self._schema.IsDeprecatedArg(arg_name):
continue
meta = self._schema.DeprecatedArgMeta(arg_name)
with warnings.catch_warnings():
warnings.simplefilter("default")
Do I understand it correctly that now Python only issues ... |
codereview_python_data_632 | for entry in py_entries:
if entry.is_cglobal:
code.put_var_gotref(entry)
- code.put_decref_set(entry.cname, entry.type, "Py_None")
else:
code.put_var_xdecref_clear(entry)
This kind of code doesn't seem to appear anywhere else, but shou... |
codereview_python_data_638 | model_name = request.handle
try:
self.modeller.delete_model(model_name)
- success = model_pb2.DeleteModelReply.Status.Value('SUCCESS')
- reply = model_pb2.DeleteModelReply(status=success)
except Exception:
LOGGER.exception('Unable to delete mode... |
codereview_python_data_640 | # See the License for the specific language governing permissions and
# limitations under the License.
-"""Creates a Cloud SQL instance template for forseti_inventory."""
def GenerateConfig(context):
Update the pydoc as we're not creating Cloud SQL.
# See the License for the specific language governing permission... |
codereview_python_data_642 | try:
stream = openfunction(filename, mode=mode)
except (IOError, OSError) as err:
if errno.errorcode[err.errno] in ['ENOENT', 'EACCES']:
six.reraise(*sys.exc_info())
return None
Aha I missed this change. So the error is getting raised here, when it should (apparently) ... |
codereview_python_data_643 | def get_ann_info(self, idx):
"""Get annotation of concatenated dataset by index.
- This is needed by MixUp.
-
Args:
idx (int): Index of data.
line 83-92 can be encapsulated into a function like get_sample_idx and we can use it in many places.
def get_ann_info(self, idx)... |
codereview_python_data_644 | matrix = [column] * nd
out_types = [ltype.int] * nd + [ltype.int]
out_value = [list(range(div))] * nd + \
- [[nrows // div for i in range(div)]]
d_in = dt.Frame(matrix)
d_members = aggregate(d_in, min_rows=0, nd_max_bins=div, seed=1,
could be `[nrows // div] * div` too
mat... |
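The reviewer's shorter spelling is equivalent here because every element is the same immutable int. A quick check (values chosen arbitrarily for illustration):

```python
nrows, div = 12, 4
by_comprehension = [nrows // div for i in range(div)]
by_multiplication = [nrows // div] * div  # reviewer's shorter spelling

# Caveat worth remembering: list * n repeats references, so this shortcut
# is only safe for immutable elements like the ints used here.
```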
codereview_python_data_656 | defaults_tuple = TupleNode(
self.pos,
args=[
- arg.default.arg if hasattr(arg.default, "arg") else arg.default
for arg in default_args
]
)
Haven't... |
codereview_python_data_665 | if asset in self.sources_map:
# go find this asset in our custom sources
try:
- return self.sources_map[asset].loc[self.current_day].\
- loc[column]
except:
log.error(
"Could not find price for asset=... |
codereview_python_data_670 | to the number of vertices minus one), making it possible to assign a
meaningful value to all graphs.
Parameters
----------
G : NetworkX graph
This would not work for `weighted` graphs, is `harmonic_diameter` defined for a weighted graph?
to the number of vertices minus one), making it po... |
codereview_python_data_677 | return self.has_run
def run(self):
- if self.set_tracking_url is not None:
- self.set_tracking_url(tracking_url)
self.has_run = True
a = A()
Wait. Do the user really have to do this check before using the tracking_url? Can't we s... |
codereview_python_data_678 | iterator = tf_v1.data.make_initializable_iterator(daliset)
images, labels = iterator.get_next()
- images = tf_v1.reshape(images, [BATCH_SIZE, IMAGE_SIZE*IMAGE_SIZE])
labels = tf_v1.reshape(
tf_v1.one_hot(labels, NUM_CLASSES),
[BATCH_SIZE, NUM_CLASSES])
out of... |
codereview_python_data_681 | Returns:
user_recommendations_top_artist: list of recommended recordings of top artist.
- user_recommendations_top_artist: list of recommended recordings of similar artist.
"""
top_artists_recordings = top_artists_candidate_set.select('user_id', 'recording_id') \
... |
codereview_python_data_684 | "PDB_CHECK_RIGHTHAND_PA", # for testing right handedness of principal_axes
"MMTF_NOCRYST", # File with meaningless CRYST1 record (Issue #2679, PR #2685)
"FHIAIMS", # to test FHIAIMS coordinate files
- "SDF_molecule" # MDL SDFile for rdkit
"PDBX", # PDBxfile
]
```suggestion "SDF_molecule", # M... |
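The suggestion above fixes a missing comma, which matters because adjacent string literals concatenate silently in Python, turning two list entries into one. A minimal demonstration:

```python
files_buggy = [
    "SDF_molecule"  # missing comma: adjacent literals concatenate
    "PDBX",
]
files_fixed = [
    "SDF_molecule",  # MDL SDFile for rdkit
    "PDBX",          # PDBx file
]
```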
codereview_python_data_688 | .concat_map(lambda tx: tx['outputs']['public_keys'])
.reduce(lambda l, r: l + r), multi=True))
- # secondary index on inputs/transaction links (txid, cid)
connection.run(
r.db(dbname)
.table('bigchain')
Maybe use `output id` instead of `cid`.... |
codereview_python_data_703 | name='recommended_dict', probability=0.10, manually_enable=False)
VALUE_PROFILE_STRATEGY = Strategy(
name='value_profile', probability=0.33, manually_enable=False)
PEACH_GRAMMAR_MUTATION_STRATEGY = Strategy(
name='peach_grammar_mutation', probability=0.10, manually_enable=True)
This is fine to start, ... |
codereview_python_data_712 | """
Replace all variables with facts from the combo to build a single test variant
"""
- score, rewards, combo_set_id, combo_link_id = 0, [], set(), set()
for var in combo:
score += (score + var['score'])
rewards.append(var['id'])
declare the set and... |
codereview_python_data_717 | self, data,
print_example=False,
is_final=False,
- expected_failure=None,
):
text_repr = [None]
if self.settings.deadline is None:
Cleaner to just refer to `self.collector` in the next few lines without the assignment.
self, data,
print_exampl... |
codereview_python_data_720 | # Services
def debug(self, lvl, msg):
if self.debug_level >= lvl:
- if conf.interactive:
- log_interactive.debug(msg)
- else:
- print(msg)
def send(self, pkt):
if self.state.state in self.interception_points:
Is there a reason why we ... |
codereview_python_data_723 | new_db_name = db_name + '_new'
old_path = os.path.join(db_dir, db_name)
new_path = os.path.join(db_dir, new_db_name)
- new_seqno_db_name = config.stateTsDbName + '_new'
- # new_seq_no_path = os.path.join(db_dir, new_seqno_db_name)
try:
dest_seq_no_db_storage = initKeyValueStorage(conf... |
codereview_python_data_726 | )
def get_config(self):
- super_config = super().get_config()
- super_config = {k: v for k, v in super_config.items() if k != "scale_fn"}
- return {**super_config}
@tf.keras.utils.register_keras_serializable(package="Addons")
Would you mind making these serializations explicit for e... |
codereview_python_data_730 | out_size = self.roi_layers[0].output_size
num_levels = len(feats)
roi_feats = feats[0].new_zeros(
- rois.size(0), self.out_channels, *out_size)
# TODO: remove this when parrots supports
if torch.__version__ == 'parrots':
roi_feats.requires_grad = True... |
codereview_python_data_732 | -# Copyright 2020 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
Are you missing something in the config dictionary e.g. like `use_bias` or initializes?
+# Copyright 2019 The TensorF... |
codereview_python_data_733 | apigateway_models_Stage_init_orig(self, name=None, deployment_id=None, variables=None, description='',
cacheClusterEnabled=False, cacheClusterSize=None)
- if cacheClusterSize or cacheClusterEnabled:
self['cacheClusterStatus'] = 'AVAILABLE'
apigateway_models.Stage.__init__ ... |
codereview_python_data_735 | if batch_norm:
self.bn = nn.BatchNorm(in_channels=out_feat)
- def message(self, edges):
- r"""The message computation function
- """
- theta_x = self.theta(edges.dst['x'] - edges.src['x'])
- phi_x = self.phi(edges.dst['x'])
- return {'e': theta_x + phi... |
codereview_python_data_739 | -from networkx.algorithms.applications import *
from networkx.algorithms.assortativity import *
from networkx.algorithms.block import *
from networkx.algorithms.boundary import *
Maybe this would be better in a `networkx.algorithms.tsp` module; all the other algorithms modules are named for the type of problem they... |
codereview_python_data_745 | serialization='pickle',
)
- market_data = ('^GSPC_benchmark.csv', 'treasury_curves.csv')
for data in market_data:
update_modified_time(
cls.tmpdir.getpath(
Shall we make a global name for the default benchmark symbol, so we don't have to change it in ... |
codereview_python_data_762 | with open(QemuProcess.LOG_PATH) as f:
# Strip non-printable characters at beginning of qemu log
qemu_log = ''.join(c for c in f.read() if c in string.printable)
- # Only report the tail of the log; otherwise we would only end up seeing
- # the beginning of it once the logging libr... |
codereview_python_data_766 | -def hey(self, stimulus):
if _is_silence(stimulus):
return 'Fine. Be that way!'
elif _is_shouting(stimulus):
You need to remove the `self`.
+def hey(stimulus):
if _is_silence(stimulus):
return 'Fine. Be that way!'
elif _is_shouting(stimulus): |
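Completing the picture for the fix above: once `hey` is a module-level function rather than a method, it takes no `self`. A runnable sketch (the shouting and fallback responses are assumptions, since the row truncates before them):

```python
def _is_silence(stimulus):
    return not stimulus.strip()

def _is_shouting(stimulus):
    return stimulus.isupper()

def hey(stimulus):
    # Module-level function: no `self` parameter needed
    if _is_silence(stimulus):
        return 'Fine. Be that way!'
    elif _is_shouting(stimulus):
        return 'Whoa, chill out!'  # assumed response; not shown in the row
    return 'Whatever.'            # assumed fallback
```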
codereview_python_data_769 | from abc import abstractmethod
-from pprint import pformat
from bzt.engine import EngineModule
from bzt.utils import BetterDict, iteritems
class FunctionalAggregator(EngineModule):
def __init__(self):
super(FunctionalAggregator, self).__init__()
self.underlings = []
It's reporter's job to re... |
codereview_python_data_770 | class Meta(DashboardTable.Meta):
model = Category
fields = ('name', 'description', 'is_public')
- sequenze = ('name', 'description', '...', 'is_public', 'actions')
class AttributeOptionGroupTable(DashboardTable):
`sequenze` should be `sequence`. Also please check the failing lint errors.... |
codereview_python_data_782 | return self._process(el, key)
if isinstance(self.p.operation, ElementOperation):
return OperationCallable(dynamic_operation, inputs=[map_obj],
operation=self.p.operation)
else:
- return Callable(dynamic_operation, inputs=[m... |
codereview_python_data_783 | 'scripts/enable_bls',
'scripts/create_dirs.sh',
'scripts/indy_old_cli_export_dids',
- 'scripts/setup_indy_node_iptable']
)
missing 's' at the end of file name
'scripts/enable_bls',
'scripts/create_dirs.sh',
'scripts/indy_... |
codereview_python_data_787 | try:
from nvidia.dali.plugin.pytorch import DALIClassificationIterator, LastBatchPolicy
- from nvidia.dali.pipeline import pipeline
import nvidia.dali.types as types
import nvidia.dali.fn as fn
except ImportError:
have we decided that we want to replace examples?
try:
from nvidia.dali.plugin... |
codereview_python_data_789 | msg_aggregator=self.msg_aggregator,
)
- def _initialize_uniswap(self, premium: Optional[Premium]) -> None:
- self.eth_modules['uniswap'] = Uniswap(
- ethereum_manager=self.ethereum,
- database=self.database,
- premium=premium,
- msg_aggregator=... |
codereview_python_data_795 | def test_hostname(host):
- assert host.check_output('hostname -s') == 'instance'
def test_etc_molecule_directory(host):
Why is this flipped? Looks unrelated and our pattern is `expected == returned`.
def test_hostname(host):
+ assert 'instance' == host.check_output('hostname -s')
def test_etc_molecule_direc... |
codereview_python_data_796 | if check_exception_type:
assert isinstance(
md_e.value, type(pd_e)
- ), "Got Modin Exception type {}, but pandas Exception type {}".format(
type(md_e.value), type(pd_e)
)
if raising_exceptions:
This... |
codereview_python_data_799 | try:
from Bio.Align import _aligners
except ImportError as e:
- new_exc = ImportError("""{}: you should not import directly from the
- biopython source directory; please exit the source
- tree and re-launch your code from there""".format(e))
new_exc.__cau... |
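The truncated `new_exc.__cau...` above is almost certainly setting `__cause__`; assigning to `__cause__` also sets `__suppress_context__`, hiding the noisy chained traceback. A self-contained sketch (the module name and wording are illustrative, not Biopython's exact code):

```python
def load_accelerated():
    try:
        import _aligners_missing  # hypothetical C extension; not importable here
    except ImportError as e:
        new_exc = ImportError(
            "{}: you should not import directly from the source tree".format(e))
        # Assigning __cause__ suppresses the "During handling..." context
        new_exc.__cause__ = None
        raise new_exc

try:
    load_accelerated()
except ImportError as err:
    caught = err
```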
codereview_python_data_807 | # TODO(crbug.com/920355): Reenable this when fork mode works with ChromeOS's
# MSAN.
- memory_tool = environment.get_memory_tool_name(
- environment.get_value('JOB_NAME'))
if memory_tool == 'MSAN' and environment.is_chromeos_system_job():
return False
nit: move environment.get_value('JOB_NAME') to ... |
codereview_python_data_809 | verbose_name = _('Catalogue reviews')
include_urls_in_parent = True
- hidable_feature_name = 'reviews'
-
def ready(self):
self.detail_view = get_class('catalogue.reviews.views', 'ProductReviewDetail')
self.create_view = get_class('catalogue.reviews.views', 'CreateProductReview')
You... |
codereview_python_data_810 | return graph_data
if readonly:
gidx = GraphIndex(None, multigraph, readonly)
else:
handle = _CAPI_DGLGraphCreateMutable(multigraph)
To avoid this awkward `None`, the solution is to first process all the graph data. There are functions to convert different types of graph data to `sr... |
codereview_python_data_819 | Using the unnormalized Laplacion, the layout shows possible clusters of
nodes which are an approximation of the ratio cut. The positions are the
entries of the second and third eigenvectors corresponding to the
- eigenvalues in ascending order.
Parameters
----------
maybe add: starting from... |
codereview_python_data_822 | if not self.is_sig_count_accepted(request, auth_constraint):
return False, "Not enough signatures"
if not self.is_owner_accepted(auth_constraint, auth_action):
- if auth_action.txn_type == NYM:
- return False, "{} can not touch verkey field since only the owner c... |
codereview_python_data_824 | from qutebrowser.utils import docutils
from qutebrowser.browser import pdfjs
-from end2end.features.test_scroll_bdd import check_scrolled, check_not_scrolled
-
bdd.scenarios('misc.feature')
Hmm, I'd really expect this to work, and yet it doesn't. I'll investigate later, though it might get Monday until I get the ti... |
codereview_python_data_825 | self.module = module
self.params = params
self.description = description
- self.stopping_conditions = []
- if stopping_conditions:
- self.stopping_conditions = [Fact(trait, value) for sc in stopping_conditions for trait, value in
- ... |
codereview_python_data_828 | Tensor from which to copy
`arr` : mxnet.nd.NDArray
Destination of the copy
- `cuda_stream` : Any value that can be cast to cudaStream_t
CUDA stream to be used for the copy
(if not provided, an internal user stream will be selected)
... |
codereview_python_data_831 | for index, reporter in enumerate(reporting):
reporter = ensure_is_dict(reporting, index, "module")
cls = reporter.get('module', ValueError())
- if cls != 'blazemeter':
new_reporting.append(reporter)
self.engine.config[Reporter.REP] = new_reporting
... |
codereview_python_data_856 | ]
-def laplacian_spectrum(G, weight="weight",
- =False):
"""Returns eigenvalues of the Laplacian of G
Parameters
looks like a syntax/typo error here at the top of spectrum.py.
]
+def laplacian_spectrum(G, weight="weight", signless=False):
"""Returns eigenvalues of the Laplacian of... |
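The fixed signature adds a `signless=False` flag, which presumably selects the signless Laplacian Q = D + A instead of the standard L = D - A. A stdlib-only sketch of the two matrices for a tiny path graph 0-1-2 (how NetworkX actually wires the flag is an assumption here):

```python
# Adjacency of the path graph 0-1-2, and per-node degrees
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]
deg = [sum(row) for row in A]
n = len(A)

# Standard Laplacian L = D - A; signless variant Q = D + A
L = [[(deg[i] if i == j else 0) - A[i][j] for j in range(n)] for i in range(n)]
Q = [[(deg[i] if i == j else 0) + A[i][j] for j in range(n)] for i in range(n)]
```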
codereview_python_data_859 | upper_saturation: Number = 1,
lower_value: Number = 1,
upper_value: Number = 1,
- seed: Optional[str] = None,
name: Optional[str] = None,
) -> tf.Tensor:
"""Adjust hue, saturation, value of an RGB image randomly in YIQ color
```suggestion seed: Optional[int] = None, ```
upper_saturat... |
codereview_python_data_862 | assert len(parts.dml_ctes) == 1
cte = next(iter(parts.dml_ctes.values()))[0]
relctx.add_type_rel_overlay(
- ir_stmt.subject.typeref, 'unIon', cte,
dml_stmts=dml_stack, path_id=ir_stmt.subject.path_id, ctx=ctx)
elif isinstance(ir_stmt, irast.DeleteStmt):
r... |
codereview_python_data_865 | http_error_msg = u'%s Server Error: %s for url: %s' % (self.status_code, reason, self.url)
if http_error_msg:
- if isinstance(body_text, basestring):
- http_error_msg += u' Response Body: %s' % body_text
raise HTTPError(http_error_msg, response=self)
def... |
codereview_python_data_868 | self.rule_name = rule_name
self.rule_index = rule_index
self.rule = rule
def rule_requirements(self):
"""Used to create violation reason.
Break down the code to check if violation is returned or not and then form the violation reason.
self.rule_name = rule_name
... |
codereview_python_data_869 | data, feature_name, categorical_feature, self.pandas_categorical = _data_from_pandas(data, feature_name, categorical_feature, self.pandas_categorical)
label = _label_from_pandas(label)
self.data_has_header = False
- """process for args"""
params = {} if params is None else par... |
codereview_python_data_872 | """
if isinstance(u, groups.UpdatingAtomGroup):
- raise TypeError("""UpdatingAtomGroups are not valid for MSD
- computation""")
self.u = u
super(EinsteinMSD, self).__init__(self.u.universe.trajectory, **kwargs)
Why have a newline in your error message? Maybe sugge... |
codereview_python_data_876 | 'creating-managing-organization')
MESSAGE_RUN_FREQUENCY = (
- 'Forseti will run once every 8 hours, you can update the run '
'frequency in the server deployment template field "run-frequency"'
- ' or edit the cron job scheduled on the server VM directly.')
# Questions templates
QUESTION_ENABLE_WRITE_... |
codereview_python_data_888 | # These need to happen after the other imports.
from . algorithm import TradingAlgorithm
from . import api
-import zipline.extensions as ext
# PERF: Fire a warning if calendars were instantiated during zipline import.
# Having calendars doesn't break anything per-se, but it makes zipline imports
Shall we make thi... |
codereview_python_data_894 | def crop_func(image):
return function(image, layout=self.data_layout, shape=self.data_shape)
- self.crop = ops.PythonFunction(function = crop_func, output_layouts[data_layout])
def define_graph(self):
self.data = self.inputs()
```suggestion self.crop = ops.PythonFunction(func... |
codereview_python_data_895 | def get_dhcp_pid(self):
return self._get_dhcp_pid(["pidof", "dhcpcd"])
- def restart_if(self, ifname, unused_retries=None, unused_wait=None):
logger.info('restarting {} (sort of, actually SIGHUPing dhcpcd)'.format(ifname))
pid = self.get_dhcp_pid()
if pid != None: # pylint: d... |
codereview_python_data_904 | - Component properties
- Transplant methods
"""
- self._Group._add_prop(attr)
try:
self._classes[attr.level]._add_prop(attr)
The reason for this try-except isn't clear to me. What happens without it?
- Component properties
- Transplant meth... |
codereview_python_data_906 | warnings = Column(Text(16777215))
def __init__(self, *args, **kwargs):
- """Args:
*args (list): Arguments.
**kwargs (dict): Arguments.
"""
Please add a proper title, rather than `Args` ``` """Initialize Args: *args (list): Arguments. **kwargs (dict): Arguments. ```
... |
codereview_python_data_907 | str: GCP project id
str: GCP Authenticated user
bool: Whether or not the installer is running in cloudshell
- bool: Whether or not authenticated user is a service account
"""
return_code, out, err = utils.run_command(
['gcloud', 'info', '--format=json'])
Remove as t... |