Dataset schema:
- content_type: stringclasses (8 values)
- main_lang: stringclasses (7 values)
- message: stringlengths (1 to 50)
- sha: stringlengths (40 to 40)
- patch: stringlengths (52 to 962k)
- file_count: int64 (1 to 300)
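The `patch` column stores each commit's diff in a line-marker format: every line begins with `<ide>` (unchanged context), `<add>` (added), or `<del>` (removed), and file headers use `<ide><path>`. As a minimal sketch of how such a patch could be converted to conventional unified-diff prefixes (the helper name and the exact output format are illustrative assumptions, not part of the dataset):

```python
# Sketch: convert a marker-prefixed patch (<ide>/<add>/<del>, as seen in the
# `patch` column below) into unified-diff-style lines. The function name and
# the "+++" header convention are assumptions for illustration.

PREFIXES = {"<ide>": " ", "<add>": "+", "<del>": "-"}

def patch_to_unified(patch: str) -> str:
    out = []
    for line in patch.splitlines():
        for tag, prefix in PREFIXES.items():
            if line.startswith(tag):
                body = line[len(tag):]
                if body.startswith("<path>"):
                    # File header line, e.g. "<ide><path>src/Foo.php"
                    out.append("+++ " + body[len("<path>"):])
                else:
                    # Content lines carry one separator space after the tag
                    out.append(prefix + body.removeprefix(" "))
                break
        else:
            out.append(line)  # pass through any unmarked line unchanged
    return "\n".join(out)
```

Applied over the records below, this recovers an ordinary-looking diff per file touched by the commit.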
content_type: PHP
main_lang: PHP
message: fix function names
sha: 525d786dffce89935c52240ddbcae7d7645c51b7
patch:
<ide><path>src/Mailer/Message.php <ide> public function serialize(): string <ide> $array = $this->jsonSerialize(); <ide> array_walk_recursive($array, function (&$item, $key) { <ide> if ($item instanceof SimpleXMLElement) { <del> $item = jsondecodeForHeader(jsonencode((array)$item), true); <add> $item = json_decode(json_encode((array)$item), true); <ide> } <ide> }); <ide>
file_count: 1
content_type: Text
main_lang: Text
message: update documentations for embedding and unit-norm
sha: a46e50a3227b0840d159537907ed2ea82f598ff1
patch:
<ide><path>docs/sources/constraints.md <ide> model.add(Dense(64, 64, W_constraint = maxnorm(2))) <ide> ## Available constraints <ide> <ide> - __maxnorm__(m=2): maximum-norm constraint <del>- __nonneg__(): non-negativity constraint <ide>\ No newline at end of file <add>- __nonneg__(): non-negativity constraint <add>- __unitnorm__(): unit-norm constraint, enforces the matrix to have unit norm along the last axis <ide>\ No newline at end of file <ide><path>docs/sources/layers/embeddings.md <ide> ## Embedding <ide> <ide> ```python <del>keras.layers.embeddings.Embedding(input_dim, output_dim, init='uniform', weights=None) <add>keras.layers.embeddings.Embedding(input_dim, output_dim, init='uniform', weights=None, W_regularizer=None, W_constraint=None) <ide> ``` <ide> <ide> Turn positive integers (indexes) into denses vectors of fixed size, <ide> eg. `[[4], [20]] -> [[0.25, 0.1], [0.6, -0.2]]` <ide> - __output_dim__: int >= 0. Dimension of the dense embedding. <ide> - __init__: name of initialization function for the weights of the layer (see: [initializations](../initializations.md)), or alternatively, Theano function to use for weights initialization. This parameter is only relevant if you don't pass a `weights` argument. <ide> - __weights__: list of numpy arrays to set as initial weights. The list should have 1 element, of shape `(input_dim, output_dim)`. <add> - __W_regularizer__: instance of the [regularizers](../regularizers.md) module (eg. L1 or L2 regularization), applied to the embedding matrix. <add> - __W_constraint__: instance of the [constraints](../constraints.md) module (eg. maxnorm, nonneg), applied to the embedding matrix. <ide> <ide> <ide> ## WordContextProduct
file_count: 2
content_type: Java
main_lang: Java
message: add marbles for observable (12/06)
sha: 0bab46d4a974fd0514fc2dc97414d259d1e0689e
patch:
<ide><path>src/main/java/io/reactivex/Observable.java <ide> public final <R> Observable<R> concatMap(Function<? super T, ? extends Observabl <ide> * one at a time and emits their values in order <ide> * while delaying any error from either this or any of the inner ObservableSources <ide> * till all of them terminate. <del> * <add> * <p> <add> * <img width="640" height="347" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/concatMapDelayError.o.png" alt=""> <ide> * <dl> <ide> * <dt><b>Scheduler:</b></dt> <ide> * <dd>{@code concatMapDelayError} does not operate by default on a particular {@link Scheduler}.</dd> <ide> public final <R> Observable<R> concatMapDelayError(Function<? super T, ? extends <ide> * one at a time and emits their values in order <ide> * while delaying any error from either this or any of the inner ObservableSources <ide> * till all of them terminate. <del> * <add> * <p> <add> * <img width="640" height="347" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/concatMapDelayError.o.png" alt=""> <ide> * <dl> <ide> * <dt><b>Scheduler:</b></dt> <ide> * <dd>{@code concatMapDelayError} does not operate by default on a particular {@link Scheduler}.</dd> <ide> public final <R> Observable<R> concatMapDelayError(Function<? super T, ? extends <ide> * Eager concatenation means that once a subscriber subscribes, this operator subscribes to all of the <ide> * source ObservableSources. The operator buffers the values emitted by these ObservableSources and then drains them in <ide> * order, each one after the previous one completes. <add> * <p> <add> * <img width="640" height="360" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/concatMapEager.o.png" alt=""> <ide> * <dl> <ide> * <dt><b>Scheduler:</b></dt> <ide> * <dd>This method does not operate by default on a particular {@link Scheduler}.</dd> <ide> public final <R> Observable<R> concatMapEager(Function<? super T, ? 
extends Obse <ide> * Eager concatenation means that once a subscriber subscribes, this operator subscribes to all of the <ide> * source ObservableSources. The operator buffers the values emitted by these ObservableSources and then drains them in <ide> * order, each one after the previous one completes. <add> * <p> <add> * <img width="640" height="360" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/concatMapEager.o.png" alt=""> <ide> * <dl> <ide> * <dt><b>Scheduler:</b></dt> <ide> * <dd>This method does not operate by default on a particular {@link Scheduler}.</dd> <ide> public final <R> Observable<R> concatMapEager(Function<? super T, ? extends Obse <ide> * Eager concatenation means that once a subscriber subscribes, this operator subscribes to all of the <ide> * source ObservableSources. The operator buffers the values emitted by these ObservableSources and then drains them in <ide> * order, each one after the previous one completes. <add> * <p> <add> * <img width="640" height="390" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/concatMapEagerDelayError.o.png" alt=""> <ide> * <dl> <ide> * <dt><b>Scheduler:</b></dt> <ide> * <dd>This method does not operate by default on a particular {@link Scheduler}.</dd> <ide> public final <R> Observable<R> concatMapEagerDelayError(Function<? super T, ? ex <ide> * Eager concatenation means that once a subscriber subscribes, this operator subscribes to all of the <ide> * source ObservableSources. The operator buffers the values emitted by these ObservableSources and then drains them in <ide> * order, each one after the previous one completes. 
<add> * <p> <add> * <img width="640" height="390" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/concatMapEagerDelayError.o.png" alt=""> <ide> * <dl> <ide> * <dt><b>Scheduler:</b></dt> <ide> * <dd>This method does not operate by default on a particular {@link Scheduler}.</dd> <ide> public final <R> Observable<R> concatMapEagerDelayError(Function<? super T, ? ex <ide> /** <ide> * Maps each element of the upstream Observable into CompletableSources, subscribes to them one at a time in <ide> * order and waits until the upstream and all CompletableSources complete. <add> * <p> <add> * <img width="640" height="505" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/concatMapCompletable.o.png" alt=""> <ide> * <dl> <ide> * <dt><b>Scheduler:</b></dt> <ide> * <dd>{@code concatMapCompletable} does not operate by default on a particular {@link Scheduler}.</dd> <ide> public final Completable concatMapCompletable(Function<? super T, ? extends Comp <ide> /** <ide> * Maps each element of the upstream Observable into CompletableSources, subscribes to them one at a time in <ide> * order and waits until the upstream and all CompletableSources complete. <add> * <p> <add> * <img width="640" height="505" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/concatMapCompletable.o.png" alt=""> <ide> * <dl> <ide> * <dt><b>Scheduler:</b></dt> <ide> * <dd>{@code concatMapCompletable} does not operate by default on a particular {@link Scheduler}.</dd> <ide> public final Completable concatMapCompletable(Function<? super T, ? extends Comp <ide> /** <ide> * Returns an Observable that concatenate each item emitted by the source ObservableSource with the values in an <ide> * Iterable corresponding to that item that is generated by a selector. 
<add> * <p> <add> * <img width="640" height="275" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/concatMapIterable.o.png" alt=""> <ide> * <ide> * <dl> <ide> * <dt><b>Scheduler:</b></dt> <ide> public final <U> Observable<U> concatMapIterable(final Function<? super T, ? ext <ide> /** <ide> * Returns an Observable that concatenate each item emitted by the source ObservableSource with the values in an <ide> * Iterable corresponding to that item that is generated by a selector. <add> * <p> <add> * <img width="640" height="275" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/concatMapIterable.o.png" alt=""> <ide> * <ide> * <dl> <ide> * <dt><b>Scheduler:</b></dt> <ide> public final Observable<T> distinctUntilChanged(BiPredicate<? super T, ? super T <ide> * Calls the specified consumer with the current item after this item has been emitted to the downstream. <ide> * <p>Note that the {@code onAfterNext} action is shared between subscriptions and as such <ide> * should be thread-safe. <add> * <p> <add> * <img width="640" height="360" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/doAfterNext.o.png" alt=""> <ide> * <dl> <ide> * <dt><b>Scheduler:</b></dt> <ide> * <dd>{@code doAfterNext} does not operate by default on a particular {@link Scheduler}.</dd> <ide> public final Observable<T> doAfterTerminate(Action onFinally) { <ide> * is executed once per subscription. <ide> * <p>Note that the {@code onFinally} action is shared between subscriptions and as such <ide> * should be thread-safe. <add> * <p> <add> * <img width="640" height="281" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/doFinally.o.png" alt=""> <ide> * <dl> <ide> * <dt><b>Scheduler:</b></dt> <ide> * <dd>{@code doFinally} does not operate by default on a particular {@link Scheduler}.</dd>
file_count: 1
content_type: PHP
main_lang: PHP
message: update docblock typo
sha: be8af6910def484ad8ee70d1b3025fc59223f2e3
patch:
<ide><path>src/Illuminate/Foundation/Http/FormRequest.php <ide> protected function passesAuthorization() <ide> * <ide> * @return mixed <ide> * <del> * @throws \\Illuminate\Http\Exception\HttpResponseExceptio <add> * @throws \\Illuminate\Http\Exception\HttpResponseException <ide> */ <ide> protected function failedAuthorization() <ide> {
file_count: 1
content_type: Python
main_lang: Python
message: add support for basic dataset inputs
sha: f0e215987c51a4f0e98c96a726276465be2f3071
patch:
<ide><path>keras/utils/dataset_utils.py <ide> <ide> <ide> <add> <ide> import tensorflow.compat.v2 as tf <ide> # pylint: disable=g-classes-have-attributes <ide> <ide> import multiprocessing <ide> import os <ide> import time <ide> import warnings <del>import math <add>from random import Random <ide> <ide> import numpy as np <ide> from tensorflow.python.util.tf_export import keras_export <ide> def split_dataset(dataset, <ide> Returns: <ide> A tuple of two `tf.data.Dataset` objects: the left and right splits. <ide> """ <del> # TODO (prakashsellathurai) : integrate unit test. <ide> <ide> if not isinstance(dataset,(tf.data.Dataset,list,tuple)): <ide> raise TypeError('`dataset` must be either a tf.data.Dataset object' <del> 'or a list/tuple of arrays',f'Got {type(dataset)}') <add> f' or a list/tuple of arrays. Received : {type(dataset)}') <ide> <ide> if right_size is None and left_size is None: <del> raise ValueError('Both `left_size` and `right_size`cannot be `None`' <del> 'atleast specify one with valid value') <add> raise ValueError('you must specify either `left_size` or `right_size`' <add> ' Received: `left_size`= None, and `right_size`=None') <ide> <del> dataset_as_list = [] <del> <del> if isinstance(dataset,tf.data.Dataset): <del> data_size_warning_flag = False <del> start_time = time.time() <del> i = 0 <del> for datum in list(dataset): <del> cur_time = time.time() <del> # warns user if the dataset is too large to iterate within 10s <del> if int(cur_time - start_time) > 10 and not data_size_warning_flag: <del> warnings.warn('Takes too long time to process the `dataset`,' <del> 'this function is only for small datasets' <del> 'that fits within the memory') <del> data_size_warning_flag = True <del> dataset_as_list.append(datum) <del> elif isinstance(dataset,list): <del> dataset_as_list = dataset.copy() <del> elif isinstance(dataset,tuple): <del> dataset_as_list = list(zip(*dataset)) <add> dataset_as_list = _convert_dataset_to_list(dataset) <add> <add> if seed is 
None: <add> seed = np.random.randint(1e6) <ide> <ide> if shuffle: <del> if seed: <del> np.random.seed(seed) <del> np.random.shuffle(dataset_as_list) <add> Random(seed).shuffle(dataset_as_list) <ide> <del> total_size = len(dataset_as_list) <add> total_length = len(dataset_as_list) <ide> <del> left_size,right_size = convert_dataset_split_sizes( <del> left_size,right_size,total_size <add> left_size,right_size = _rescale_dataset_split_sizes( <add> left_size,right_size,total_length <ide> ) <ide> <ide> left_dataset = dataset_as_list[:left_size] <del> right_dataset = dataset_as_list[left_size:] <add> right_dataset = dataset_as_list[-right_size:] <ide> <ide> left_dataset = tf.data.Dataset.from_tensor_slices(left_dataset) <ide> right_dataset = tf.data.Dataset.from_tensor_slices(right_dataset) <ide> def split_dataset(dataset, <ide> <ide> return left_dataset, right_dataset <ide> <del>def convert_dataset_split_sizes(left_size,right_size,total_size): <del> """Helper function to convert left_size/right_size relative to dataset's size <add>def _convert_dataset_to_list(dataset,data_size_warning_flag = True): <add> """Helper function to convert a tf.data.Dataset object or a list/tuple of numpy.ndarrays to a list <add> """ <add> <add> if isinstance(dataset,tuple): <add> dataset_iterator = list(zip(*dataset)) <add> elif isinstance(dataset,list): <add> dataset_iterator = dataset.copy() <add> elif isinstance(dataset,tf.data.Dataset): <add> dataset_iterator = list(dataset) <add> else: <add> raise TypeError('`dataset` must be either a tf.data.Dataset object' <add> f' or a list/tuple of arrays. 
Received : {type(dataset)}' <add> ) <add> <add> <add> dataset_as_list = [] <add> start_time = time.time() <add> i = 0 <add> for i,datum in enumerate(dataset_iterator): <add> if data_size_warning_flag: <add> if i % 10 == 0: <add> cur_time = time.time() <add> # warns user if the dataset is too large to iterate within 10s <add> if int(cur_time - start_time) > 10 and data_size_warning_flag: <add> warnings.warn('Takes too long time to process the `dataset`,' <add> 'this function is only for small datasets ' <add> '(e.g. < 10,000 samples).' <add> ) <add> data_size_warning_flag = False <add> <add> dataset_as_list.append(datum) <add> <add> return dataset_as_list <add> <add>def _rescale_dataset_split_sizes(left_size,right_size,total_length): <add> """Helper function to rescale left_size/right_size args relative <add> to dataset's size <ide> """ <ide> <ide> left_size_type = type(left_size) <ide> right_size_type = type(right_size) <ide> <add> if ((left_size is not None and left_size_type not in [int,float]) and <add> (right_size is not None and right_size_type not in [int,float])): <add> raise TypeError('Invalid `left_size` and `right_size` Types. ' <add> 'Expected: integer or float or None. ' <add> f' Received: {left_size_type} and {right_size_type}') <ide> <ide> if left_size is not None and left_size_type not in [int,float]: <del> raise ValueError(f'Invalid `left_size` type Got {left_size_type}' <del> 'It should be one of float,int or None') <add> raise TypeError(f'Invalid `left_size` Type. Received: {left_size_type}. ' <add> ' Expected: int or float or None') <add> <ide> if right_size is not None and right_size_type not in [int,float]: <del> raise ValueError(f'Invalid `right_size` type Got {right_size_type}' <del> 'It should be one of float,int or None') <add> raise TypeError(f'Invalid `right_size` Type. Received: {right_size_type}.' 
<add> ' Expected: int or float or None') <ide> <add> if left_size == 0 and right_size == 0: <add> raise ValueError('Invalid `left_size` and `right_size` values. ' <add> 'You must specify either `left_size` or `right_size` with ' <add> f'value greater than 0 and less than {total_length} ' <add> 'or a float within range [0,1] to split the dataset' <add> f'Received: `left_size`={left_size}, ' <add> f'`right_size`={right_size}') <ide> <ide> if (left_size_type == int <del> and (left_size <= 0 or left_size>= total_size) <add> and (left_size <= 0 or left_size>= total_length) <ide> or left_size_type == float <ide> and (left_size <= 0 or left_size>= 1) ): <del> raise ValueError('`left_size` should be either a positive integer' <del> f'and smaller than {total_size} or a float ' <del> 'within the range `[0, 1]`') <add> raise ValueError('`left_size` should be either a positive integer ' <add> f'and smaller than {total_length} or a float ' <add> 'within the range `[0, 1]`. Received: left_size=' <add> f'{left_size}') <ide> <ide> if (right_size_type == int <del> and (right_size <= 0 or right_size>= total_size) <add> and (right_size <= 0 or right_size>= total_length) <ide> or right_size_type == float <ide> and (right_size <= 0 or right_size>= 1)): <ide> raise ValueError('`right_size` should be either a positive integer ' <del> f'and smaller than {total_size} or' <del> 'a float within the range `[0, 1]`') <add> f'and smaller than {total_length} or ' <add> 'a float within the range `[0, 1]`. Received: right_size=' <add> f'{right_size}') <ide> <ide> if right_size_type == left_size_type == float and right_size + left_size > 1: <ide> raise ValueError('sum of `left_size` and `right_size`' <del> ' should be within `[0,1]`' <del> f'Got {right_size + left_size} ,' <add> ' should be within `[0,1]`.' 
<add> f'Received: {right_size + left_size} ,' <ide> 'reduce the `left_size` or `right_size`') <ide> <ide> if left_size_type == float: <del> left_size = math.ceil(left_size*total_size) <add> left_size = round(left_size*total_length) <ide> else: <ide> left_size = float(left_size) <ide> <ide> if right_size_type == float: <del> right_size = math.ceil(right_size*total_size) <add> right_size = round(right_size*total_length) <ide> else: <ide> right_size = float(right_size) <ide> <ide> <ide> if left_size is None: <del> left_size = total_size - right_size <add> left_size = total_length - right_size <ide> elif right_size is None: <del> right_size = total_size - left_size <add> right_size = total_length - left_size <ide> <del> if left_size + right_size > total_size: <add> if left_size + right_size > total_length: <ide> raise ValueError('The sum of `left_size` and `right_size`' <del> f' should be smaller than the samples {total_size} ' <add> f' should be smaller than the samples {total_length} ' <ide> ' reduce `left_size` or `right_size` ' ) <ide> <ide> <del> if left_size == 0: <del> raise ValueError(f'with dataset of length={total_size}' <del> '`left_size`={left_size} and `right_size`={right_size} ' <del> 'resulting left dataset split will be empty, ' <del> 'adjust any of the aforementioned parameters') <add> for split,side in [(left_size,'left'),(right_size,'right')]: <add> if split == 0: <add> raise ValueError(f'with dataset of length={total_length} ' <add> '`left_size`={left_size} and `right_size`={right_size}, ' <add> f'resulting {side} dataset split will be empty. 
' <add> 'Adjust any of the aforementioned parameters') <ide> <ide> left_size,right_size = int(left_size) ,int(right_size) <ide> return left_size,right_size <ide><path>keras/utils/dataset_utils_test.py <ide> <ide> import tensorflow.compat.v2 as tf <ide> <add>import numpy as np <add> <ide> from keras.utils import dataset_utils <ide> <ide> <del>class TestSplitDataset(tf.test.TestCase): <add>class SplitDatasetTest(tf.test.TestCase): <ide> <del> def test_invalid_dataset_cases(self): <add> def test_with_list_dataset(self): <add> dataset = [np.ones(shape=(10,10,10)) for _ in range(10)] <add> left_dataset,right_dataset = dataset_utils.split_dataset(dataset, <add> left_size=5, <add> right_size=5) <add> self.assertEqual(len(left_dataset), len(right_dataset)) <add> self.assertIsInstance(left_dataset, tf.data.Dataset) <add> self.assertIsInstance(left_dataset, tf.data.Dataset) <add> <add> dataset = [np.ones(shape=(10,10,10)) for _ in range(10)] <add> left_dataset,right_dataset = dataset_utils.split_dataset(dataset, <add> left_size=0.6, <add> right_size=0.4) <add> self.assertEqual(len(left_dataset), 6) <add> self.assertEqual(len(right_dataset), 4) <add> <add> <add> def test_with_tuple_dataset(self): <add> dataset = (np.ones(shape=(10,10,10)),np.zeros(shape=(10,10,10))) <add> left_dataset,right_dataset = dataset_utils.split_dataset(dataset, <add> left_size=0.75, <add> right_size=0.25) <add> self.assertLen(left_dataset, 8) <add> self.assertLen(right_dataset, 2) <add> <add> left_dataset,right_dataset = dataset_utils.split_dataset(dataset, <add> left_size=0.35, <add> right_size=0.65) <add> self.assertLen(left_dataset, 4) <add> self.assertLen(right_dataset, 6) <add> self.assertIsInstance(left_dataset, tf.data.Dataset) <add> self.assertIsInstance(right_dataset, tf.data.Dataset) <add> <add> <add> def test_with_invalid_dataset(self): <ide> with self.assertRaises(TypeError): <del> dataset_utils.split_dataset(dataset=None, left_size=5) <del> <add> dataset_utils.split_dataset(dataset=None, 
left_size=5) <ide> with self.assertRaises(TypeError): <del> dataset_utils.split_dataset(dataset=1, left_size=5) <del> <add> dataset_utils.split_dataset(dataset=1, left_size=5) <ide> with self.assertRaises(TypeError): <del> dataset_utils.split_dataset(dataset=float(1.2), left_size=5) <del> <add> dataset_utils.split_dataset(dataset=float(1.2), left_size=5) <ide> with self.assertRaises(TypeError): <del> dataset_utils.split_dataset(dataset=dict({})) <del> <add> dataset_utils.split_dataset(dataset=dict({}), left_size=5) <ide> with self.assertRaises(TypeError): <del> dataset_utils.split_dataset(dataset=float('INF')) <add> dataset_utils.split_dataset(dataset=float('INF'), left_size=5) <ide> <del> def test_valid_left_size_cases(self): <add> def test_with_valid_left_and_right_sizes(self): <ide> <ide> dataset = [1,2,3] <del> splitted_dataset = dataset_utils.split_dataset(dataset, left_size=1,right_size=2) <add> splitted_dataset = dataset_utils.split_dataset(dataset, <add> left_size=1, <add> right_size=2) <ide> assert(len(splitted_dataset) == 2) <ide> left_dataset,right_dataset = splitted_dataset <ide> self.assertEqual(len(left_dataset), 1) <ide> self.assertEqual(len(right_dataset), 2) <ide> self.assertEqual(list(left_dataset), [1]) <ide> self.assertEqual(list(right_dataset), [2,3]) <del> <ide> <del> def test_invalid_left_and_right_case(self): <add> <add> dataset = [1,2,3,4,5,6,7,8,9,10] <add> splitted_dataset = dataset_utils.split_dataset(dataset, <add> left_size=0.1, <add> right_size=0.9) <add> assert(len(splitted_dataset) == 2) <add> left_dataset,right_dataset = splitted_dataset <add> self.assertEqual(len(left_dataset), 1 ) <add> self.assertEqual(len(right_dataset), 9 ) <add> self.assertEqual(list(left_dataset), [1]) <add> self.assertEqual(list(right_dataset), [2,3,4,5,6,7,8,9,10]) <add> <add> dataset = [1,2,3,4,5,6,7,8,9,10] <add> splitted_dataset = dataset_utils.split_dataset(dataset, <add> left_size=2, <add> right_size=5) <add> assert(len(splitted_dataset) == 2) <add> 
left_dataset,right_dataset = splitted_dataset <add> self.assertEqual(len(left_dataset), 2 ) <add> self.assertEqual(len(right_dataset), 5 ) <add> self.assertEqual(list(left_dataset), [1,2]) <add> self.assertEqual(list(right_dataset), [6,7,8,9,10]) <add> <add> def test_with_float_left_and_right_sizes(self): <add> dataset = tf.data.Dataset.from_tensor_slices(np.array([[0.1,0.2,0.3], <add> [0.4,0.5,0.6], <add> [0.7,0.8,0.9]])) <add> left_dataset,right_dataset = dataset_utils.split_dataset(dataset, <add> left_size=0.8, <add> right_size=0.2) <add> self.assertEqual(len(left_dataset), 2) <add> self.assertEqual(len(right_dataset), 1) <add> <add> def test_with_invalid_float_left_and_right_sizes(self): <ide> with self.assertRaises(ValueError): <del> dataset_utils.split_dataset(dataset=[1,2,3], left_size=None) <add> dataset = [np.ones(shape=(200, 32,32)), np.zeros(shape=(200, 32,32))] <add> dataset_utils.split_dataset(dataset, left_size=0.8,right_size=0.2) <add> with self.assertRaises(ValueError): <add> dataset = [1] <add> dataset_utils.split_dataset(dataset, left_size=0.8,right_size=0.2) <add> <add> <ide> <add> def test_with_None_and_zero_left_and_right_size(self): <add> with self.assertRaises(ValueError): <add> dataset_utils.split_dataset(dataset=[1,2,3], left_size=None) <ide> with self.assertRaises(ValueError): <ide> dataset_utils.split_dataset([1,2,3], left_size=None,right_size=None) <del> <ide> with self.assertRaises(ValueError): <ide> dataset_utils.split_dataset([1,2,3], left_size=3,right_size=None) <add> with self.assertRaises(ValueError): <add> dataset_utils.split_dataset([1,2], left_size=3,right_size=None) <add> with self.assertRaises(ValueError): <add> dataset_utils.split_dataset([1,2], left_size=0,right_size=0) <add> <add> def test_with_invalid_left_and_right_size_types(self): <add> with self.assertRaises(TypeError): <add> dataset_utils.split_dataset([1,2], left_size='1',right_size='1') <add> with self.assertRaises(TypeError): <add> 
dataset_utils.split_dataset([1,2], left_size=0,right_size='1') <add> with self.assertRaises(TypeError): <add> dataset_utils.split_dataset([1,2], left_size='100',right_size=None) <add> with self.assertRaises(TypeError): <add> dataset_utils.split_dataset([1,2], right_size='1') <add> with self.assertRaises(TypeError): <add> dataset_utils.split_dataset([1,2], left_size=0.5,right_size='1') <add> <ide> <ide> <ide>
file_count: 2
content_type: Javascript
main_lang: Javascript
message: remove unused code in doc generation tool
sha: 5315304e715e28ab160f9d71c6260f96ba7b637a
patch:
<ide><path>tools/doc/html.js <ide> function versionSort(a, b) { <ide> <ide> function buildToc({ filename, apilinks }) { <ide> return (tree, file) => { <del> const startIncludeRefRE = /^\s*<!-- \[start-include:(.+)\] -->\s*$/; <del> const endIncludeRefRE = /^\s*<!-- \[end-include:.+\] -->\s*$/; <del> const realFilenames = [filename]; <ide> const idCounters = Object.create(null); <ide> let toc = ''; <ide> let depth = 0; <ide> <ide> visit(tree, null, (node) => { <del> // Keep track of the current filename for comment wrappers of inclusions. <del> if (node.type === 'html') { <del> const [, includedFileName] = node.value.match(startIncludeRefRE) || []; <del> if (includedFileName !== undefined) <del> realFilenames.unshift(includedFileName); <del> else if (endIncludeRefRE.test(node.value)) <del> realFilenames.shift(); <del> } <del> <ide> if (node.type !== 'heading') return; <ide> <ide> if (node.depth - depth > 1) { <ide> function buildToc({ filename, apilinks }) { <ide> } <ide> <ide> depth = node.depth; <del> const realFilename = path.basename(realFilenames[0], '.md'); <add> const realFilename = path.basename(filename, '.md'); <ide> const headingText = file.contents.slice( <ide> node.children[0].position.start.offset, <ide> node.position.end.offset).trim();
file_count: 1
content_type: Text
main_lang: Text
message: add devsnek to collaborators
sha: eacc88c3a6ef68a88a3a4b819b375c698c05fd9a
patch:
<ide><path>README.md <ide> For more information about the governance of the Node.js project, see <ide> **Daniel Bevenius** &lt;daniel.bevenius@gmail.com&gt; <ide> * [DavidCai1993](https://github.com/DavidCai1993) - <ide> **David Cai** &lt;davidcai1993@yahoo.com&gt; (he/him) <add>* [devsnek](https://github.com/devsnek) - <add>**Gus Caplan** &lt;me@gus.host&gt; (he/him) <ide> * [edsadr](https://github.com/edsadr) - <ide> **Adrian Estrada** &lt;edsadr@gmail.com&gt; (he/him) <ide> * [eljefedelrodeodeljefe](https://github.com/eljefedelrodeodeljefe) -
file_count: 1
content_type: Ruby
main_lang: Ruby
message: use `delegate` to call the methods to `@conn`
sha: 7ba2cd06a215f4f1f48e61957dda9ca4a880d0a4
patch:
<ide><path>activerecord/lib/active_record/connection_adapters/abstract/schema_creation.rb <ide> def visit_AddColumn(o) <ide> "ADD #{accept(o)}" <ide> end <ide> <add> delegate :quote_column_name, :quote_table_name, :quote_default_expression, :type_to_sql, to: :@conn <add> private :quote_column_name, :quote_table_name, :quote_default_expression, :type_to_sql <add> <ide> private <ide> <ide> def visit_AlterTable(o) <ide> def column_options(o) <ide> column_options <ide> end <ide> <del> def quote_column_name(name) <del> @conn.quote_column_name name <del> end <del> <del> def quote_table_name(name) <del> @conn.quote_table_name name <del> end <del> <del> def type_to_sql(type, limit, precision, scale) <del> @conn.type_to_sql type.to_sym, limit, precision, scale <del> end <del> <ide> def add_column_options!(sql, options) <ide> sql << " DEFAULT #{quote_default_expression(options[:default], options[:column])}" if options_include_default?(options) <ide> # must explicitly check for :null to allow change_column to work on migrations <ide> def add_column_options!(sql, options) <ide> sql <ide> end <ide> <del> def quote_default_expression(value, column) <del> @conn.quote_default_expression(value, column) <del> end <del> <ide> def options_include_default?(options) <ide> options.include?(:default) && !(options[:null] == false && options[:default].nil?) <ide> end
file_count: 1
content_type: Ruby
main_lang: Ruby
message: pass `debug?` and `verbose?` to `cleaner`
sha: add10377b893d8e970bc4122a32f0dfcd1091481
patch:
<ide><path>Library/Homebrew/cleaner.rb <ide> # * sets permissions on executables <ide> # * removes unresolved symlinks <ide> class Cleaner <add> extend Predicable <add> <add> attr_predicate :verbose?, :debug? <add> <ide> # Create a cleaner for the given formula <del> def initialize(f) <add> def initialize(f, verbose: false, debug: false) <ide> @f = f <add> @verbose = verbose <add> @debug = debug <ide> end <ide> <ide> # Clean the keg of formula @f <ide> def prune <ide> # actual files gets removed correctly. <ide> dirs.reverse_each do |d| <ide> if d.children.empty? <del> puts "rmdir: #{d} (empty)" if Homebrew.args.verbose? <add> puts "rmdir: #{d} (empty)" if verbose? <ide> d.rmdir <ide> end <ide> end <ide> def clean_dir(d) <ide> else <ide> 0444 <ide> end <del> if Homebrew.args.debug? <add> if debug? <ide> old_perms = path.stat.mode & 0777 <ide> odebug "Fixing #{path} permissions from #{old_perms.to_s(8)} to #{perms.to_s(8)}" if perms != old_perms <ide> end <ide><path>Library/Homebrew/formula_installer.rb <ide> def fix_dynamic_linkage(keg) <ide> <ide> def clean <ide> ohai "Cleaning" if verbose? <del> Cleaner.new(formula).clean <add> Cleaner.new(formula, verbose: verbose?, debug: debug?).clean <ide> rescue Exception => e # rubocop:disable Lint/RescueException <ide> opoo "The cleaning step did not complete successfully" <ide> puts "Still, the installation was successful, so we will link it into your prefix"
file_count: 2
content_type: PHP
main_lang: PHP
message: fix typo in parser.php
sha: 459290c88e8e5ba07fcfe2bece01c063fd2ab178
patch:
<ide><path>src/Illuminate/Console/Parser.php <ide> protected static function parameters(array $tokens) <ide> if (!Str::startsWith($token, '--')) { <ide> $arguments[] = static::parseArgument($token); <ide> } else { <del> $options [] = static::parseOption(ltrim($token, '-')); <add> $options[] = static::parseOption(ltrim($token, '-')); <ide> } <ide> } <ide>
file_count: 1
content_type: Javascript
main_lang: Javascript
message: remove weird handlers added by wabt
sha: 67e92aa05491e7278289c575818243aec9342a73
patch:
<ide><path>test/setupTestFramework.js <ide> if(process.env.DEBUG_INFO) { <ide> // It leaks an Error object on construction <ide> // so it leaks the whole stack trace <ide> require("wast-loader"); <add>process.removeAllListeners("uncaughtException"); <add>process.removeAllListeners("unhandledRejection"); <ide>
file_count: 1
content_type: PHP
main_lang: PHP
message: remove problematic test
sha: e719d2df07e64433671fe3a639b1016f0998cc94
patch:
<ide><path>tests/TestCase/ORM/QueryRegressionTest.php <ide> public function testAssociationSubQueryNoOffset() { <ide> $query = $table->find('translations')->limit(10)->offset(1); <ide> $result = $query->toArray(); <ide> $this->assertCount(2, $result); <del> <del> $query = $table->find('translations')->having(['Articles.id >' => 1]); <del> $result = $query->toArray(); <del> $this->assertCount(2, $result); <ide> } <ide> <ide> }
file_count: 1
content_type: Javascript
main_lang: Javascript
message: remove unused args and comparison fix
sha: 55d202a346cb5cbf4f641051559addb93fd68a07
patch:
<ide><path>test/internet/test-dgram-multicast-multi-process.js <ide> if (common.inFreeBSDJail) { <ide> return; <ide> } <ide> <del>function launchChildProcess(index) { <add>function launchChildProcess() { <ide> const worker = fork(__filename, ['child']); <ide> workers[worker.pid] = worker; <ide> <ide> worker.messagesReceived = []; <ide> <ide> // Handle the death of workers. <del> worker.on('exit', function(code, signal) { <add> worker.on('exit', function(code) { <ide> // Don't consider this the true death if the worker has finished <ide> // successfully or if the exit code is 0. <ide> if (worker.isDone || code === 0) { <ide> if (process.argv[2] === 'child') { <ide> <ide> process.send({ message: buf.toString() }); <ide> <del> if (receivedMessages.length == messages.length) { <add> if (receivedMessages.length === messages.length) { <ide> // .dropMembership() not strictly needed but here as a sanity check. <ide> listenSocket.dropMembership(LOCAL_BROADCAST_HOST); <ide> process.nextTick(function() {
1
Python
Python
fix valueerror in chord with single task header
fe2c47d4e62c36d3b78b57ad41518fbf6748a708
<ide><path>celery/canvas.py <ide> def apply_async(self, args=(), kwargs={}, task_id=None, <ide> if len(self.tasks) == 1: <ide> # chord([A], B) can be optimized as A | B <ide> # - Issue #3323 <del> return (self.tasks[0].set(task_id=task_id) | body).apply_async( <add> return (self.tasks[0] | body).set(task_id=task_id).apply_async( <ide> args, kwargs, **options) <ide> # chord([A, B, ...], C) <ide> return self.run(tasks, body, args, task_id=task_id, **options)
1
Ruby
Ruby
remove unused variables
1a799f04e6af870c03aa02f2c3a9db6e0bd55bca
<ide><path>Library/Homebrew/cmd/audit.rb <ide> def audit_file <ide> end <ide> <ide> def audit_deps <del> problems = [] <del> <ide> # Don't depend_on aliases; use full name <ide> aliases = Formula.aliases <ide> f.deps.select { |d| aliases.include? d.name }.each do |d| <ide> def audit_deps <ide> def audit_conflicts <ide> f.conflicts.each do |req| <ide> begin <del> conflict_f = Formula.factory req.formula <del> rescue <add> Formula.factory req.formula <add> rescue FormulaUnavailableError <ide> problem "Can't find conflicting formula \"#{req.formula}\"." <ide> end <ide> end <ide><path>Library/Homebrew/cmd/cleanup.rb <ide> def cleanup <ide> HOMEBREW_CELLAR.children.each do |rack| <ide> begin <ide> cleanup_formula rack.basename.to_s if rack.directory? <del> rescue FormulaUnavailableError => e <add> rescue FormulaUnavailableError <ide> # Don't complain about Cellar folders that are from DIY installs <ide> # instead of core formulae. <ide> end <ide><path>Library/Homebrew/cmd/update.rb <ide> def update <ide> master_updater.pull! <ide> report.merge!(master_updater.report) <ide> <del> new_files = [] <ide> Dir["Library/Taps/*"].each do |tapd| <ide> next unless File.directory?(tapd) <ide> <ide><path>Library/Homebrew/formula.rb <ide> def brew <ide> # we allow formulae to do anything they want to the Ruby process <ide> # so load any deps before this point! And exit asap afterwards <ide> yield self <del> rescue RuntimeError, SystemCallError => e <add> rescue RuntimeError, SystemCallError <ide> %w(config.log CMakeCache.txt).each do |fn| <ide> (HOMEBREW_LOGS/name).install(fn) if File.file?(fn) <ide> end <ide> def self.each <ide> names.each do |name| <ide> yield begin <ide> Formula.factory(name) <del> rescue => e <add> rescue <ide> # Don't let one broken formula break commands. But do complain. 
<ide> onoe "Failed to import: #{name}" <ide> next <ide> def system cmd, *args <ide> mkdir_p(logd) <ide> <ide> rd, wr = IO.pipe <del> pid = fork do <add> fork do <ide> rd.close <ide> $stdout.reopen wr <ide> $stderr.reopen wr <ide> def system cmd, *args <ide> raise ErrorDuringExecution <ide> end <ide> end <del> rescue ErrorDuringExecution => e <add> rescue ErrorDuringExecution <ide> raise BuildError.new(self, cmd, args, $?) <ide> ensure <ide> f.close if f and not f.closed? <ide><path>Library/Homebrew/formula_installer.rb <ide> def finish <ide> if f.keg_only? <ide> begin <ide> Keg.new(f.prefix).optlink <del> rescue Exception => e <add> rescue Exception <ide> onoe "Failed to create: #{f.opt_prefix}" <ide> puts "Things that depend on #{f} will probably not build." <ide> end <ide> def build <ide> <ide> Tab.create(f, build_argv).write # INSTALL_RECEIPT.json <ide> <del> rescue Exception => e <add> rescue Exception <ide> ignore_interrupts do <ide> # any exceptions must leave us with nothing installed <ide> f.prefix.rmtree if f.prefix.directory? <ide><path>Library/Homebrew/test/test_formula.rb <ide> def test_mirror_support <ide> HOMEBREW_CACHE.mkpath unless HOMEBREW_CACHE.exist? <ide> nostdout do <ide> f = TestBallWithMirror.new <del> tarball, downloader = f.fetch <add> _, downloader = f.fetch <ide> assert_equal f.url, "file:///#{TEST_FOLDER}/bad_url/testball-0.1.tbz" <ide> assert_equal downloader.url, "file:///#{TEST_FOLDER}/tarballs/testball-0.1.tbz" <ide> end <ide><path>Library/Homebrew/test/test_versions.rb <ide> def test_no_version <ide> end <ide> <ide> def test_bad_version <del> assert_raises(RuntimeError) { f = TestBadVersion.new } <add> assert_raises(RuntimeError) { TestBadVersion.new } <ide> end <ide> <ide> def test_version_all_dots
7
PHP
PHP
add hascasterclass interface
327a1cefc07e8f307b5de1a0d6abf27d42cdbc15
<ide><path>src/Illuminate/Contracts/Database/Eloquent/HasCasterClass.php <add><?php <add> <add>namespace Illuminate\Contracts\Database\Eloquent; <add> <add>interface HasCasterClass <add>{ <add> /** <add> * Get the caster class for this class <add> * <add> * @return string <add> */ <add> public static function getCasterClass(); <add>} <ide><path>src/Illuminate/Database/Eloquent/Concerns/HasAttributes.php <ide> use Carbon\CarbonInterface; <ide> use DateTimeInterface; <ide> use Illuminate\Contracts\Database\Eloquent\CastsInboundAttributes; <add>use Illuminate\Contracts\Database\Eloquent\HasCasterClass; <ide> use Illuminate\Contracts\Support\Arrayable; <ide> use Illuminate\Database\Eloquent\JsonEncodingException; <ide> use Illuminate\Database\Eloquent\Relations\Relation; <ide> class_exists($class = $this->parseCasterClass($this->getCasts()[$key])) && <ide> */ <ide> protected function resolveCasterClass($key) <ide> { <del> if (strpos($castType = $this->getCasts()[$key], ':') === false) { <add> $castType = $this->getCasts()[$key]; <add> <add> if (is_subclass_of($castType, HasCasterClass::class)) { <add> $castType = $castType::getCasterClass(); <add> } <add> <add> if (strpos($castType, ':') === false) { <ide> return new $castType; <ide> } <ide> <ide><path>tests/Integration/Database/DatabaseEloquentModelCustomCastingTest.php <ide> <ide> use Illuminate\Contracts\Database\Eloquent\CastsAttributes; <ide> use Illuminate\Contracts\Database\Eloquent\CastsInboundAttributes; <add>use Illuminate\Contracts\Database\Eloquent\HasCasterClass; <ide> use Illuminate\Database\Eloquent\Model; <ide> <ide> /** <ide> public function testBasicCustomCasting() <ide> $model->syncOriginal(); <ide> $model->options = ['foo' => 'bar']; <ide> $this->assertTrue($model->isDirty('options')); <add> <add> $model = new TestEloquentModelWithCustomCast; <add> <add> $model->setRawAttributes([ <add> 'address_line_one' => '110 Kingsbrook St.', <add> 'address_line_two' => 'My Childhood House', <add> ]); <add> 
<add> $this->assertSame('110 Kingsbrook St.', $model->address_with_caster->lineOne); <add> $this->assertSame('My Childhood House', $model->address_with_caster->lineTwo); <add> <add> $this->assertSame('110 Kingsbrook St.', $model->toArray()['address_line_one']); <add> $this->assertSame('My Childhood House', $model->toArray()['address_line_two']); <add> <add> $model->address_with_caster->lineOne = '117 Spencer St.'; <add> <add> $this->assertFalse(isset($model->toArray()['address'])); <add> $this->assertSame('117 Spencer St.', $model->toArray()['address_line_one']); <add> $this->assertSame('My Childhood House', $model->toArray()['address_line_two']); <add> <add> $this->assertSame('117 Spencer St.', json_decode($model->toJson(), true)['address_line_one']); <add> $this->assertSame('My Childhood House', json_decode($model->toJson(), true)['address_line_two']); <add> <add> $model->address_with_caster = null; <add> <add> $this->assertNull($model->toArray()['address_line_one']); <add> $this->assertNull($model->toArray()['address_line_two']); <ide> } <ide> <ide> public function testOneWayCasting() <ide> class TestEloquentModelWithCustomCast extends Model <ide> 'other_password' => HashCaster::class.':md5', <ide> 'uppercase' => UppercaseCaster::class, <ide> 'options' => JsonCaster::class, <add> 'address_with_caster' => AddressWithCaster::class, <ide> ]; <ide> } <ide> <ide> public function __construct($lineOne, $lineTwo) <ide> $this->lineTwo = $lineTwo; <ide> } <ide> } <add> <add>class AddressWithCaster extends Address implements HasCasterClass <add>{ <add> public static function getCasterClass() <add> { <add> return AddressCaster::class; <add> } <add>}
3
Java
Java
fix appendix typos in contentdisposition
6884a3ac56074672d6649148c99f0beb73a4d260
<ide><path>spring-web/src/main/java/org/springframework/http/ContentDisposition.java <ide> public Charset getCharset() { <ide> /** <ide> * Return the value of the {@literal size} parameter, or {@code null} if not defined. <ide> * @deprecated since 5.2.3 as per <del> * <a href="https://tools.ietf.org/html/rfc6266#appendix-B">RFC 6266, Apendix B</a>, <add> * <a href="https://tools.ietf.org/html/rfc6266#appendix-B">RFC 6266, Appendix B</a>, <ide> * to be removed in a future release. <ide> */ <ide> @Deprecated <ide> public Long getSize() { <ide> /** <ide> * Return the value of the {@literal creation-date} parameter, or {@code null} if not defined. <ide> * @deprecated since 5.2.3 as per <del> * <a href="https://tools.ietf.org/html/rfc6266#appendix-B">RFC 6266, Apendix B</a>, <add> * <a href="https://tools.ietf.org/html/rfc6266#appendix-B">RFC 6266, Appendix B</a>, <ide> * to be removed in a future release. <ide> */ <ide> @Deprecated <ide> public ZonedDateTime getCreationDate() { <ide> /** <ide> * Return the value of the {@literal modification-date} parameter, or {@code null} if not defined. <ide> * @deprecated since 5.2.3 as per <del> * <a href="https://tools.ietf.org/html/rfc6266#appendix-B">RFC 6266, Apendix B</a>, <add> * <a href="https://tools.ietf.org/html/rfc6266#appendix-B">RFC 6266, Appendix B</a>, <ide> * to be removed in a future release. <ide> */ <ide> @Deprecated <ide> public ZonedDateTime getModificationDate() { <ide> /** <ide> * Return the value of the {@literal read-date} parameter, or {@code null} if not defined. <ide> * @deprecated since 5.2.3 as per <del> * <a href="https://tools.ietf.org/html/rfc6266#appendix-B">RFC 6266, Apendix B</a>, <add> * <a href="https://tools.ietf.org/html/rfc6266#appendix-B">RFC 6266, Appendix B</a>, <ide> * to be removed in a future release. <ide> */ <ide> @Deprecated
1
Python
Python
add image height and width to onnx dynamic axes
6519150c315bdcd415bbd115cec11e839f3eb866
<ide><path>src/transformers/models/beit/configuration_beit.py <ide> class BeitOnnxConfig(OnnxConfig): <ide> def inputs(self) -> Mapping[str, Mapping[int, str]]: <ide> return OrderedDict( <ide> [ <del> ("pixel_values", {0: "batch", 1: "num_channels"}), <add> ("pixel_values", {0: "batch", 1: "num_channels", 2: "height", 3: "width"}), <ide> ] <ide> ) <ide> <ide><path>src/transformers/models/clip/configuration_clip.py <ide> def inputs(self) -> Mapping[str, Mapping[int, str]]: <ide> return OrderedDict( <ide> [ <ide> ("input_ids", {0: "batch", 1: "sequence"}), <del> ("pixel_values", {0: "batch"}), <add> ("pixel_values", {0: "batch", 1: "num_channels", 2: "height", 3: "width"}), <ide> ("attention_mask", {0: "batch", 1: "sequence"}), <ide> ] <ide> ) <ide><path>src/transformers/models/convnext/configuration_convnext.py <ide> class ConvNextOnnxConfig(OnnxConfig): <ide> def inputs(self) -> Mapping[str, Mapping[int, str]]: <ide> return OrderedDict( <ide> [ <del> ("pixel_values", {0: "batch", 1: "num_channels"}), <add> ("pixel_values", {0: "batch", 1: "num_channels", 2: "height", 3: "width"}), <ide> ] <ide> ) <ide> <ide><path>src/transformers/models/data2vec/configuration_data2vec_vision.py <ide> class Data2VecVisionOnnxConfig(OnnxConfig): <ide> def inputs(self) -> Mapping[str, Mapping[int, str]]: <ide> return OrderedDict( <ide> [ <del> ("pixel_values", {0: "batch", 1: "num_channels"}), <add> ("pixel_values", {0: "batch", 1: "num_channels", 2: "height", 3: "width"}), <ide> ] <ide> ) <ide> <ide><path>src/transformers/models/deit/configuration_deit.py <ide> class DeiTOnnxConfig(OnnxConfig): <ide> def inputs(self) -> Mapping[str, Mapping[int, str]]: <ide> return OrderedDict( <ide> [ <del> ("pixel_values", {0: "batch", 1: "num_channels"}), <add> ("pixel_values", {0: "batch", 1: "num_channels", 2: "height", 3: "width"}), <ide> ] <ide> ) <ide> <ide><path>src/transformers/models/detr/configuration_detr.py <ide> class DetrOnnxConfig(OnnxConfig): <ide> def inputs(self) -> Mapping[str, 
Mapping[int, str]]: <ide> return OrderedDict( <ide> [ <del> ("pixel_values", {0: "batch", 1: "num_channels"}), <add> ("pixel_values", {0: "batch", 1: "num_channels", 2: "height", 3: "width"}), <ide> ("pixel_mask", {0: "batch"}), <ide> ] <ide> ) <ide><path>src/transformers/models/layoutlmv3/configuration_layoutlmv3.py <ide> def inputs(self) -> Mapping[str, Mapping[int, str]]: <ide> ("input_ids", {0: "batch", 1: "sequence"}), <ide> ("attention_mask", {0: "batch", 1: "sequence"}), <ide> ("bbox", {0: "batch", 1: "sequence"}), <del> ("pixel_values", {0: "batch", 1: "sequence"}), <add> ("pixel_values", {0: "batch", 1: "num_channels", 2: "height", 3: "width"}), <ide> ] <ide> ) <ide> else: <ide><path>src/transformers/models/levit/configuration_levit.py <ide> class LevitOnnxConfig(OnnxConfig): <ide> def inputs(self) -> Mapping[str, Mapping[int, str]]: <ide> return OrderedDict( <ide> [ <del> ("pixel_values", {0: "batch", 1: "num_channels"}), <add> ("pixel_values", {0: "batch", 1: "num_channels", 2: "height", 3: "width"}), <ide> ] <ide> ) <ide> <ide><path>src/transformers/models/mobilevit/configuration_mobilevit.py <ide> class MobileViTOnnxConfig(OnnxConfig): <ide> <ide> @property <ide> def inputs(self) -> Mapping[str, Mapping[int, str]]: <del> return OrderedDict([("pixel_values", {0: "batch", 1: "num_channels"})]) <add> return OrderedDict([("pixel_values", {0: "batch", 1: "num_channels", 2: "height", 3: "width"})]) <ide> <ide> @property <ide> def outputs(self) -> Mapping[str, Mapping[int, str]]: <ide><path>src/transformers/models/resnet/configuration_resnet.py <ide> class ResNetOnnxConfig(OnnxConfig): <ide> def inputs(self) -> Mapping[str, Mapping[int, str]]: <ide> return OrderedDict( <ide> [ <del> ("pixel_values", {0: "batch", 1: "num_channels"}), <add> ("pixel_values", {0: "batch", 1: "num_channels", 2: "height", 3: "width"}), <ide> ] <ide> ) <ide> <ide><path>src/transformers/models/vit/configuration_vit.py <ide> class ViTOnnxConfig(OnnxConfig): <ide> def inputs(self) -> 
Mapping[str, Mapping[int, str]]: <ide> return OrderedDict( <ide> [ <del> ("pixel_values", {0: "batch", 1: "num_channels"}), <add> ("pixel_values", {0: "batch", 1: "num_channels", 2: "height", 3: "width"}), <ide> ] <ide> ) <ide>
11
Python
Python
remove unused variable [ci skip]
091a9b522a0a66e438c334d93766a836f8457cd2
<ide><path>spacy/ml/tb_framework.py <ide> def init(model, X=None, Y=None): <ide> <ide> <ide> def resize_output(model, new_nO): <del> tok2vec = model.get_ref("tok2vec") <ide> lower = model.get_ref("lower") <ide> upper = model.get_ref("upper") <ide> if not model.attrs["has_upper"]:
1
PHP
PHP
fix return type of session store save method
9293fec485fdd2a176ce27f26b31fde6bb7eed54
<ide><path>src/Illuminate/Contracts/Session/Session.php <ide> public function start(); <ide> /** <ide> * Save the session data to storage. <ide> * <del> * @return bool <add> * @return void <ide> */ <ide> public function save(); <ide> <ide><path>src/Illuminate/Session/Store.php <ide> protected function prepareForUnserialize($data) <ide> /** <ide> * Save the session data to storage. <ide> * <del> * @return bool <add> * @return void <ide> */ <ide> public function save() <ide> {
2
PHP
PHP
add dropmorphs to blueprint
692f792dc09885ec6191c4aa92ec3859711a3c32
<ide><path>src/Illuminate/Database/Schema/Blueprint.php <ide> public function dropRememberToken() <ide> $this->dropColumn('remember_token'); <ide> } <ide> <add> /** <add> * Indicate that the polymorphic columns should be dropped. <add> * <add> * @param string $name <add> * <add> * @return void <add> */ <add> public function dropMorphs($name) <add> { <add> $this->dropColumn("{$name}_type", "{$name}_id"); <add> } <add> <ide> /** <ide> * Rename the table to a given name. <ide> * <ide><path>tests/Database/DatabaseMySqlSchemaGrammarTest.php <ide> public function testDropTimestampsTz() <ide> $this->assertEquals('alter table `users` drop `created_at`, drop `updated_at`', $statements[0]); <ide> } <ide> <add> public function testDropMorphs() <add> { <add> $blueprint = new Blueprint('photos'); <add> $blueprint->dropMorphs('imageable'); <add> $statements = $blueprint->toSql($this->getConnection(), $this->getGrammar()); <add> <add> $this->assertCount(1, $statements); <add> $this->assertEquals('alter table `photos` drop `imageable_type`, drop `imageable_id`', $statements[0]); <add> } <add> <ide> public function testRenameTable() <ide> { <ide> $blueprint = new Blueprint('users'); <ide><path>tests/Database/DatabasePostgresSchemaGrammarTest.php <ide> public function testDropTimestampsTz() <ide> $this->assertEquals('alter table "users" drop column "created_at", drop column "updated_at"', $statements[0]); <ide> } <ide> <add> public function testDropMorphs() <add> { <add> $blueprint = new Blueprint('photos'); <add> $blueprint->dropMorphs('imageable'); <add> $statements = $blueprint->toSql($this->getConnection(), $this->getGrammar()); <add> <add> $this->assertCount(1, $statements); <add> $this->assertEquals('alter table "photos" drop column "imageable_type", drop column "imageable_id"', $statements[0]); <add> } <add> <ide> public function testRenameTable() <ide> { <ide> $blueprint = new Blueprint('users'); <ide><path>tests/Database/DatabaseSqlServerSchemaGrammarTest.php <ide> public 
function testDropTimestampsTz() <ide> $this->assertEquals('alter table "users" drop column "created_at", "updated_at"', $statements[0]); <ide> } <ide> <add> public function testDropMorphs() <add> { <add> $blueprint = new Blueprint('photos'); <add> $blueprint->dropMorphs('imageable'); <add> $statements = $blueprint->toSql($this->getConnection(), $this->getGrammar()); <add> <add> $this->assertCount(1, $statements); <add> $this->assertEquals('alter table "photos" drop column "imageable_type", "imageable_id"', $statements[0]); <add> } <add> <ide> public function testRenameTable() <ide> { <ide> $blueprint = new Blueprint('users');
4
Ruby
Ruby
raise a helpful error message on #mount misuse
afa68eb1766d8893a1bb79bf989061f3d8f98049
<ide><path>actionpack/lib/action_dispatch/routing/mapper.rb <ide> def mount(app, options = nil) <ide> if options <ide> path = options.delete(:at) <ide> else <add> unless Hash === app <add> raise ArgumentError, "must be called with mount point" <add> end <add> <ide> options = app <ide> app, path = options.find { |k, v| k.respond_to?(:call) } <ide> options.delete(app) if app <ide><path>actionpack/test/dispatch/mapper_test.rb <ide> def test_map_wildcard_with_format_true <ide> mapper.get '/*path', :to => 'pages#show', :format => true <ide> assert_equal '/*path.:format', fakeset.conditions.first[:path_info] <ide> end <add> <add> def test_raising_helpful_error_on_invalid_arguments <add> fakeset = FakeSet.new <add> mapper = Mapper.new fakeset <add> app = lambda { |env| [200, {}, [""]] } <add> assert_raises ArgumentError do <add> mapper.mount app <add> end <add> end <ide> end <ide> end <ide> end
2
Javascript
Javascript
add test for objecttomap util
3adf072eeadc77e30c0d09e95240e657c490c848
<ide><path>test/objectToMap.unittest.js <add>/* globals describe it */ <add> <add>require("should"); <add> <add>var objectToMap = require("../lib/util/objectToMap"); <add> <add>describe("objectToMap", function() { <add> it("should convert a plain object into a Map successfully", function() { <add> const map = objectToMap({ <add> foo: "bar", <add> bar: "baz" <add> }); <add> <add> map.get("foo").should.eql("bar"); <add> map.get("bar").should.eql("baz"); <add> }); <add>});
1
Javascript
Javascript
remove react.autobind for real
a42fd30fc2795909324c497350d0877a969b0cc7
<ide><path>src/core/React.js <ide> var React = { <ide> initializeTouchEvents: function(shouldUseTouch) { <ide> ReactMount.useTouchEvents = shouldUseTouch; <ide> }, <del> autoBind: ReactCompositeComponent.autoBind, <ide> createClass: ReactCompositeComponent.createClass, <ide> constructAndRenderComponent: ReactMount.constructAndRenderComponent, <ide> constructAndRenderComponentByID: ReactMount.constructAndRenderComponentByID, <ide><path>src/core/ReactCompositeComponent.js <ide> var ReactCompositeComponent = { <ide> return componentClass instanceof Function && <ide> 'componentConstructor' in componentClass && <ide> componentClass.componentConstructor instanceof Function; <del> }, <del> <del> /** <del> * TODO: Delete this when all callers have been updated to rely on this <del> * behavior being the default. <del> * <del> * Backwards compatible stub for what is now the default behavior. <del> * @param {function} method Method to be bound. <del> * @public <del> */ <del> autoBind: function(method) { <del> if (__DEV__) { <del> console.warn( <del> 'React.autoBind() is now deprecated. All React component methods ' + <del> 'are auto bound by default, so React.autoBind() is a no-op. It ' + <del> 'will be removed in the next version of React' <del> ); <del> } <del> return method; <ide> } <ide> }; <ide>
2
Text
Text
fix images in changes
bb13745eb08cdf08a68ebd67ea00ae2c14e4962d
<ide><path>CHANGES.md <ide> var color = d3.scaleOrdinal(d3.schemeCategory10); <ide> <ide> [Sequential scales](https://github.com/d3/d3-scale/blob/master/README.md#scaleSequential), are a new class of scales with a fixed output [interpolator](https://github.com/d3/d3-scale/blob/master/README.md#sequential_interpolator) instead of a [range](https://github.com/d3/d3-scale/blob/master/README.md#continuous_range). Typically these scales are used to implement continuous sequential or diverging color schemes. Inspired by Matplotlib’s new [perceptually-motived colormaps](https://bids.github.io/colormap/), 4.0 now features [viridis](https://github.com/d3/d3-scale/blob/master/README.md#interpolateViridis), [inferno](https://github.com/d3/d3-scale/blob/master/README.md#interpolateInferno), [magma](https://github.com/d3/d3-scale/blob/master/README.md#interpolateMagma), [plasma](https://github.com/d3/d3-scale/blob/master/README.md#interpolatePlasma) interpolators for use with sequential scales. Using [d3.quantize](https://github.com/d3/d3-interpolate/blob/master/README.md#quantize), these interpolators can also be applied to [quantile](https://github.com/d3/d3-scale/blob/master/README.md#quantile-scales), [quantize](https://github.com/d3/d3-scale/blob/master/README.md#quantize-scales) and [threshold](https://github.com/d3/d3-scale/blob/master/README.md#threshold-scales) scales. 
<ide> <del>[<img src="https://raw.githubusercontent.com/d3/d3-scale/master/img/viridis.png" width="100%" height="40" alt="viridis">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateViridis) <del>[<img src="https://raw.githubusercontent.com/d3/d3-scale/master/img/inferno.png" width="100%" height="40" alt="inferno">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateInferno) <del>[<img src="https://raw.githubusercontent.com/d3/d3-scale/master/img/magma.png" width="100%" height="40" alt="magma">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateMagma) <del>[<img src="https://raw.githubusercontent.com/d3/d3-scale/master/img/plasma.png" width="100%" height="40" alt="plasma">](https://github.com/d3/d3-scale/blob/master/README.md#interpolatePlasma) <add>[<img src="https://raw.githubusercontent.com/d3/d3-scale/v1.0.0/img/viridis.png" width="100%" height="40" alt="viridis">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateViridis) <add>[<img src="https://raw.githubusercontent.com/d3/d3-scale/v1.0.0/img/inferno.png" width="100%" height="40" alt="inferno">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateInferno) <add>[<img src="https://raw.githubusercontent.com/d3/d3-scale/v1.0.0/img/magma.png" width="100%" height="40" alt="magma">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateMagma) <add>[<img src="https://raw.githubusercontent.com/d3/d3-scale/v1.0.0/img/plasma.png" width="100%" height="40" alt="plasma">](https://github.com/d3/d3-scale/blob/master/README.md#interpolatePlasma) <ide> <ide> 4.0 also ships new Cubehelix schemes, including [Dave Green’s default](https://github.com/d3/d3-scale/blob/master/README.md#interpolateCubehelixDefault) and a [cyclical rainbow](https://github.com/d3/d3-scale/blob/master/README.md#interpolateRainbow) inspired by [Matteo Niccoli](https://mycarta.wordpress.com/2013/02/21/perceptual-rainbow-palette-the-method/): <ide> <del>[<img 
src="https://raw.githubusercontent.com/d3/d3-scale/master/img/cubehelix.png" width="100%" height="40" alt="cubehelix">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateCubehelixDefault) <del>[<img src="https://raw.githubusercontent.com/d3/d3-scale/master/img/rainbow.png" width="100%" height="40" alt="rainbow">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateRainbow) <del>[<img src="https://raw.githubusercontent.com/d3/d3-scale/master/img/warm.png" width="100%" height="40" alt="warm">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateWarm) <del>[<img src="https://raw.githubusercontent.com/d3/d3-scale/master/img/cool.png" width="100%" height="40" alt="cool">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateCool) <add>[<img src="https://raw.githubusercontent.com/d3/d3-scale/v1.0.0/img/cubehelix.png" width="100%" height="40" alt="cubehelix">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateCubehelixDefault) <add>[<img src="https://raw.githubusercontent.com/d3/d3-scale/v1.0.0/img/rainbow.png" width="100%" height="40" alt="rainbow">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateRainbow) <add>[<img src="https://raw.githubusercontent.com/d3/d3-scale/v1.0.0/img/warm.png" width="100%" height="40" alt="warm">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateWarm) <add>[<img src="https://raw.githubusercontent.com/d3/d3-scale/v1.0.0/img/cool.png" width="100%" height="40" alt="cool">](https://github.com/d3/d3-scale/blob/master/README.md#interpolateCool) <ide> <ide> For even more sequential and categorical color schemes, see [d3-scale-chromatic](https://github.com/d3/d3-scale-chromatic). <ide>
1
Python
Python
simplify deploy_node and get rid of **kwargs
155c0f8bebbd9162ef847ffbbd831c979f3f659e
<ide><path>libcloud/compute/base.py <ide> def create_node(self, **kwargs): <ide> raise NotImplementedError( <ide> 'create_node not implemented for this driver') <ide> <del> def deploy_node(self, **kwargs): <add> def deploy_node(self, deploy, ssh_username='root', ssh_alternate_usernames=None, <add> ssh_port=22, ssh_timeout=10, ssh_key=None, auth=None, <add> timeout=SSH_CONNECT_TIMEOUT, max_tries=3, ssh_interface='public_ips', <add> **create_node_kwargs): <ide> # type: (...) -> Node <ide> """ <ide> Create a new node, and start deployment. <ide> def deploy_node(self, **kwargs): <ide> raise RuntimeError('paramiko is not installed. You can install ' + <ide> 'it using pip: pip install paramiko') <ide> <del> if 'auth' in kwargs: <del> auth = kwargs['auth'] <add> if auth: <ide> if not isinstance(auth, (NodeAuthSSHKey, NodeAuthPassword)): <ide> raise NotImplementedError( <ide> 'If providing auth, only NodeAuthSSHKey or' <ide> 'NodeAuthPassword is supported') <del> elif 'ssh_key' in kwargs: <add> elif ssh_key: <ide> # If an ssh_key is provided we can try deploy_node <ide> pass <ide> elif 'create_node' in self.features: <ide> def deploy_node(self, **kwargs): <ide> # NOTE 2: Some drivers which use password based SSH authentication <ide> # rely on password being stored on the "auth" argument and that's why <ide> # we also propagate that argument to "create_node()" method. 
<del> create_node_kwargs = dict([(key, value) for key, value in <del> kwargs.items() if key <del> not in DEPLOY_NODE_KWARGS]) <del> <ide> try: <del> node = self.create_node(**create_node_kwargs) <add> node = self.create_node(auth=auth, **create_node_kwargs) <ide> except TypeError as e: <ide> msg_1_re = (r'create_node\(\) missing \d+ required ' <ide> 'positional arguments.*') <ide> msg_2_re = r'create_node\(\) takes at least \d+ arguments.*' <ide> if re.match(msg_1_re, str(e)) or re.match(msg_2_re, str(e)): <del> node = self.create_node(**kwargs) <add> node = self.create_node(deploy=deploy, <add> ssh_username=ssh_username, <add> ssh_alternate_usernames=ssh_alternate_usernames, <add> ssh_port=ssh_port, <add> ssh_timeout=ssh_timeout, <add> ssh_key=ssh_key, <add> auth=auth, <add> timeout=timeout, <add> max_tries=max_tries, <add> ssh_interface=ssh_interface, <add> **create_node_kwargs) <ide> else: <ide> raise e <ide> <del> max_tries = kwargs.get('max_tries', 3) <del> <ide> password = None <del> if 'auth' in kwargs: <del> if isinstance(kwargs['auth'], NodeAuthPassword): <del> password = kwargs['auth'].password <add> if auth: <add> if isinstance(auth, NodeAuthPassword): <add> password = auth.password <ide> elif 'password' in node.extra: <ide> password = node.extra['password'] <ide> <del> ssh_interface = kwargs.get('ssh_interface', 'public_ips') <add> wait_timeout = timeout or NODE_ONLINE_WAIT_TIMEOUT <ide> <ide> # Wait until node is up and running and has IP assigned <ide> try: <ide> node, ip_addresses = self.wait_until_running( <ide> nodes=[node], <ide> wait_period=3, <del> timeout=float(kwargs.get('timeout', NODE_ONLINE_WAIT_TIMEOUT)), <add> timeout=wait_timeout, <ide> ssh_interface=ssh_interface)[0] <ide> except Exception as e: <ide> raise DeploymentError(node=node, original_exception=e, driver=self) <ide> <del> ssh_username = kwargs.get('ssh_username', 'root') <del> ssh_alternate_usernames = kwargs.get('ssh_alternate_usernames', []) <del> ssh_port = 
kwargs.get('ssh_port', 22) <del> ssh_timeout = kwargs.get('ssh_timeout', 10) <del> ssh_key_file = kwargs.get('ssh_key', None) <del> timeout = kwargs.get('timeout', SSH_CONNECT_TIMEOUT) <add> ssh_alternate_usernames = ssh_alternate_usernames or [] <add> deploy_timeout = timeout or SSH_CONNECT_TIMEOUT <ide> <ide> deploy_error = None <ide> <ide> for username in ([ssh_username] + ssh_alternate_usernames): <ide> try: <ide> self._connect_and_run_deployment_script( <del> task=kwargs['deploy'], node=node, <add> task=deploy, node=node, <ide> ssh_hostname=ip_addresses[0], ssh_port=ssh_port, <ide> ssh_username=username, ssh_password=password, <del> ssh_key_file=ssh_key_file, ssh_timeout=ssh_timeout, <del> timeout=timeout, max_tries=max_tries) <add> ssh_key_file=ssh_key, ssh_timeout=ssh_timeout, <add> timeout=deploy_timeout, max_tries=max_tries) <ide> except Exception as e: <ide> # Try alternate username <ide> # Todo: Need to fix paramiko so we can catch a more specific
1
PHP
PHP
apply fixes from styleci
51e910670daeeb2f9bbd80f4ccf8e48c3d0abbdb
<ide><path>src/Illuminate/Database/Eloquent/Concerns/QueriesRelationships.php <ide> public function whereRelation($relation, $column, $operator = null, $value = nul <ide> <ide> return $this->when( <ide> $relations->count() == 1, <del> function($query) use ($relations, $column, $operator, $value) { <add> function ($query) use ($relations, $column, $operator, $value) { <ide> $query->whereHas($relations->first(), function ($query) use ($column, $operator, $value) { <ide> $query->where($column, $operator, $value); <ide> }); <ide> }, <del> function($query) use ($relations, $column, $operator, $value) { <add> function ($query) use ($relations, $column, $operator, $value) { <ide> $query->whereHas($relations->first(), function ($query) use ($relations, $column, $operator, $value) { <ide> $relations->shift(); <ide> <ide> function($query) use ($relations, $column, $operator, $value) { <ide> public function orWhereRelation($relation, $column, $operator = null, $value = null) <ide> { <ide> $relations = collect(explode('.', $relation)); <del> <add> <ide> return $this->when( <ide> $relations->count() == 1, <del> function($query) use ($relations, $column, $operator, $value) { <add> function ($query) use ($relations, $column, $operator, $value) { <ide> $query->orWhereHas($relations->first(), function ($query) use ($column, $operator, $value) { <ide> $query->where($column, $operator, $value); <ide> }); <ide> }, <del> function($query) use ($relations, $column, $operator, $value) { <add> function ($query) use ($relations, $column, $operator, $value) { <ide> $query->orWhereHas($relations->first(), function ($query) use ($relations, $column, $operator, $value) { <ide> $relations->shift(); <ide>
1
Text
Text
fix typo in docs
fe5b72e7bb4f7c043409196cb98d46796f2dc390
<ide><path>docs/sources/reference/api/docker_remote_api_v1.1.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide><path>docs/sources/reference/api/docker_remote_api_v1.10.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide> Status Codes: <ide> <ide> - **200** – no error <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images <ide><path>docs/sources/reference/api/docker_remote_api_v1.11.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide> Status Codes: <ide> <ide> - **200** – no error <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images <ide><path>docs/sources/reference/api/docker_remote_api_v1.12.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide> Status Codes: <ide> <ide> - **200** – no error <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images <ide><path>docs/sources/reference/api/docker_remote_api_v1.13.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – 
server error <ide> <ide> ### Remove an image <ide> Status Codes: <ide> <ide> - **200** – no error <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images <ide><path>docs/sources/reference/api/docker_remote_api_v1.14.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide> Status Codes: <ide> <ide> - **200** – no error <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images <ide><path>docs/sources/reference/api/docker_remote_api_v1.15.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide> Status Codes: <ide> <ide> - **200** – no error <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images <ide><path>docs/sources/reference/api/docker_remote_api_v1.2.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide> Status Codes: <ide> <ide> - **204** – no error <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images <ide><path>docs/sources/reference/api/docker_remote_api_v1.3.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide> 
Status Codes: <ide> <ide> - **200** – no error <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images <ide><path>docs/sources/reference/api/docker_remote_api_v1.4.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide> Status Codes: <ide> <ide> - **200** – no error <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images <ide><path>docs/sources/reference/api/docker_remote_api_v1.5.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide> Status Codes: <ide> <ide> - **200** – no error <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images <ide><path>docs/sources/reference/api/docker_remote_api_v1.6.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide> Status Codes: <ide> <ide> - **200** – no error <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images <ide><path>docs/sources/reference/api/docker_remote_api_v1.7.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide> Status Codes: <ide> <ide> - **200** – no error <ide> - 
**404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images <ide><path>docs/sources/reference/api/docker_remote_api_v1.8.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide> Status Codes: <ide> <ide> - **200** – no error <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images <ide><path>docs/sources/reference/api/docker_remote_api_v1.9.md <ide> Status Codes: <ide> - **201** – no error <ide> - **400** – bad parameter <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Remove an image <ide> Status Codes: <ide> <ide> - **200** – no error <ide> - **404** – no such image <del>- **409** – conflic <add>- **409** – conflict <ide> - **500** – server error <ide> <ide> ### Search images
15
Go
Go
prevent deadlock on attempt to use own net
f30d1c1835618eadea5d0a68d1301dffd9f09b22
<ide><path>daemon/container.go <ide> func (container *Container) getNetworkedContainer() (*Container, error) { <ide> if err != nil { <ide> return nil, err <ide> } <add> if container == nc { <add> return nil, fmt.Errorf("cannot join own network") <add> } <ide> if !nc.IsRunning() { <ide> return nil, fmt.Errorf("cannot join network of a non running container: %s", parts[1]) <ide> } <ide><path>integration-cli/docker_cli_run_test.go <ide> func (s *DockerSuite) TestContainerNetworkMode(c *check.C) { <ide> } <ide> } <ide> <add>func (s *DockerSuite) TestContainerNetworkModeToSelf(c *check.C) { <add> cmd := exec.Command(dockerBinary, "run", "--name=me", "--net=container:me", "busybox", "true") <add> out, _, err := runCommandWithOutput(cmd) <add> if err == nil || !strings.Contains(out, "cannot join own network") { <add> c.Fatalf("using container net mode to self should result in an error") <add> } <add>} <add> <ide> func (s *DockerSuite) TestRunModePidHost(c *check.C) { <ide> testRequires(c, NativeExecDriver, SameHostDaemon) <ide>
2
Python
Python
fix flake8 lint in tests
030e71b2624ad6f8d5458b3820efe3ef815318c6
<ide><path>t/unit/backends/test_base.py <ide> class ExpectedException(Exception): <ide> callback.keys.return_value = [] <ide> task = self.app.tasks[callback.task] = Mock() <ide> b.fail_from_current_stack = Mock() <del> group = self.patching('celery.group') <add> self.patching('celery.group') <ide> with patch.object( <ide> b, "_call_task_errbacks", side_effect=ExpectedException() <ide> ) as mock_call_errbacks: <ide><path>t/unit/tasks/test_tasks.py <ide> import socket <ide> import tempfile <ide> from datetime import datetime, timedelta <del>from unittest.mock import ANY, MagicMock, Mock, call, patch, sentinel <add>from unittest.mock import ANY, MagicMock, Mock, patch, sentinel <ide> <ide> import pytest <ide> from case import ContextMock <ide> def test_autoretry_backoff(self, randrange): <ide> <ide> assert task.iterations == 4 <ide> retry_call_countdowns = [ <del> call[1]['countdown'] for call in fake_retry.call_args_list <add> call_[1]['countdown'] for call_ in fake_retry.call_args_list <ide> ] <ide> assert retry_call_countdowns == [1, 2, 4, 8] <ide> <ide> def test_autoretry_backoff_jitter(self, randrange): <ide> <ide> assert task.iterations == 4 <ide> retry_call_countdowns = [ <del> call[1]['countdown'] for call in fake_retry.call_args_list <add> call_[1]['countdown'] for call_ in fake_retry.call_args_list <ide> ] <ide> assert retry_call_countdowns == [0, 1, 3, 7] <ide> <ide> def test_retry_backoff_from_base(self): <ide> <ide> assert task.iterations == 6 <ide> retry_call_countdowns = [ <del> call[1]['countdown'] for call in fake_retry.call_args_list <add> call_[1]['countdown'] for call_ in fake_retry.call_args_list <ide> ] <ide> assert retry_call_countdowns == [1, 2, 4, 8, 16, 32] <ide> <ide> def test_retry_backoff_max_from_base(self): <ide> <ide> assert task.iterations == 6 <ide> retry_call_countdowns = [ <del> call[1]['countdown'] for call in fake_retry.call_args_list <add> call_[1]['countdown'] for call_ in fake_retry.call_args_list <ide> ] <ide> assert 
retry_call_countdowns == [1, 2, 4, 8, 16, 32] <ide> <ide> def test_override_retry_backoff_max_from_base(self): <ide> <ide> assert task.iterations == 6 <ide> retry_call_countdowns = [ <del> call[1]['countdown'] for call in fake_retry.call_args_list <add> call_[1]['countdown'] for call_ in fake_retry.call_args_list <ide> ] <ide> assert retry_call_countdowns == [1, 2, 4, 8, 16, 16] <ide> <ide> def test_retry_backoff_jitter_from_base(self): <ide> <ide> assert task.iterations == 6 <ide> retry_call_countdowns = [ <del> call[1]['countdown'] for call in fake_retry.call_args_list <add> call_[1]['countdown'] for call_ in fake_retry.call_args_list <ide> ] <ide> assert retry_call_countdowns == [1, 2, 4, 8, 16, 32] <ide> <ide> def test_override_backoff_jitter_from_base(self, randrange): <ide> <ide> assert task.iterations == 6 <ide> retry_call_countdowns = [ <del> call[1]['countdown'] for call in fake_retry.call_args_list <add> call_[1]['countdown'] for call_ in fake_retry.call_args_list <ide> ] <ide> assert retry_call_countdowns == [0, 1, 3, 7, 15, 31] <ide> <ide><path>t/unit/utils/test_functional.py <ide> import collections <ide> <ide> import pytest <del>import pytest_subtests <add>import pytest_subtests # noqa: F401 <ide> from kombu.utils.functional import lazy <ide> <ide> from celery.utils.functional import (DummyContext, first, firstmethod,
3
Python
Python
change vector width
66252f3e7124d6114908cedf2c8e7f7b75fcf0f3
<ide><path>bin/parser/train_ud.py <ide> def main(lang_name, train_loc, dev_loc, model_dir, clusters_loc=None): <ide> for tag in tags: <ide> assert tag in vocab.morphology.tag_map, repr(tag) <ide> tagger = Tagger(vocab) <del> encoder = TokenVectorEncoder(vocab) <add> encoder = TokenVectorEncoder(vocab, width=128) <ide> parser = DependencyParser(vocab, actions=actions, features=features, L1=0.0) <ide> <ide> Xs, ys = organize_data(vocab, train_sents)
1
Javascript
Javascript
use latest selenium on sl
6898ac59858a4748c08804e0cb0444e4834e6d00
<ide><path>karma-shared.conf.js <ide> module.exports = function(config, specificOptions) { <ide> testName: specificOptions.testName || 'AngularJS', <ide> startConnect: true, <ide> options: { <del> 'selenium-version': '2.37.0' <add> 'selenium-version': '2.41.0' <ide> } <ide> }, <ide>
1
Go
Go
check loop devices of existing pool
bebf53443981c70a6a714ea518dc966a0e2b6558
<ide><path>daemon/graphdriver/devmapper/deviceset.go <ide> func determineDriverCapabilities(version string) error { <ide> return nil <ide> } <ide> <add>// Determine the major and minor number of loopback device <add>func getDeviceMajorMinor(file *os.File) (uint64, uint64, error) { <add> stat, err := file.Stat() <add> if err != nil { <add> return 0, 0, err <add> } <add> <add> dev := stat.Sys().(*syscall.Stat_t).Rdev <add> majorNum := major(dev) <add> minorNum := minor(dev) <add> <add> logrus.Debugf("[devmapper]: Major:Minor for device: %s is:%v:%v", file.Name(), majorNum, minorNum) <add> return majorNum, minorNum, nil <add>} <add> <add>// Given a file which is backing file of a loop back device, find the <add>// loopback device name and its major/minor number. <add>func getLoopFileDeviceMajMin(filename string) (string, uint64, uint64, error) { <add> file, err := os.Open(filename) <add> if err != nil { <add> logrus.Debugf("[devmapper]: Failed to open file %s", filename) <add> return "", 0, 0, err <add> } <add> <add> defer file.Close() <add> loopbackDevice := devicemapper.FindLoopDeviceFor(file) <add> if loopbackDevice == nil { <add> return "", 0, 0, fmt.Errorf("[devmapper]: Unable to find loopback mount for: %s", filename) <add> } <add> defer loopbackDevice.Close() <add> <add> Major, Minor, err := getDeviceMajorMinor(loopbackDevice) <add> if err != nil { <add> return "", 0, 0, err <add> } <add> return loopbackDevice.Name(), Major, Minor, nil <add>} <add> <add>// Get the major/minor numbers of thin pool data and metadata devices <add>func (devices *DeviceSet) getThinPoolDataMetaMajMin() (uint64, uint64, uint64, uint64, error) { <add> var params, poolDataMajMin, poolMetadataMajMin string <add> <add> _, _, _, params, err := devicemapper.GetTable(devices.getPoolName()) <add> if err != nil { <add> return 0, 0, 0, 0, err <add> } <add> <add> if _, err = fmt.Sscanf(params, "%s %s", &poolMetadataMajMin, &poolDataMajMin); err != nil { <add> return 0, 0, 0, 0, err <add> } <add> 
<add> logrus.Debugf("[devmapper]: poolDataMajMin=%s poolMetaMajMin=%s\n", poolDataMajMin, poolMetadataMajMin) <add> <add> poolDataMajMinorSplit := strings.Split(poolDataMajMin, ":") <add> poolDataMajor, err := strconv.ParseUint(poolDataMajMinorSplit[0], 10, 32) <add> if err != nil { <add> return 0, 0, 0, 0, err <add> } <add> <add> poolDataMinor, err := strconv.ParseUint(poolDataMajMinorSplit[1], 10, 32) <add> if err != nil { <add> return 0, 0, 0, 0, err <add> } <add> <add> poolMetadataMajMinorSplit := strings.Split(poolMetadataMajMin, ":") <add> poolMetadataMajor, err := strconv.ParseUint(poolMetadataMajMinorSplit[0], 10, 32) <add> if err != nil { <add> return 0, 0, 0, 0, err <add> } <add> <add> poolMetadataMinor, err := strconv.ParseUint(poolMetadataMajMinorSplit[1], 10, 32) <add> if err != nil { <add> return 0, 0, 0, 0, err <add> } <add> <add> return poolDataMajor, poolDataMinor, poolMetadataMajor, poolMetadataMinor, nil <add>} <add> <add>func (devices *DeviceSet) loadThinPoolLoopBackInfo() error { <add> poolDataMajor, poolDataMinor, poolMetadataMajor, poolMetadataMinor, err := devices.getThinPoolDataMetaMajMin() <add> if err != nil { <add> return err <add> } <add> <add> dirname := devices.loopbackDir() <add> <add> // data device has not been passed in. So there should be a data file <add> // which is being mounted as loop device. <add> if devices.dataDevice == "" { <add> datafilename := path.Join(dirname, "data") <add> dataLoopDevice, dataMajor, dataMinor, err := getLoopFileDeviceMajMin(datafilename) <add> if err != nil { <add> return err <add> } <add> <add> // Compare the two <add> if poolDataMajor == dataMajor && poolDataMinor == dataMinor { <add> devices.dataDevice = dataLoopDevice <add> devices.dataLoopFile = datafilename <add> } <add> <add> } <add> <add> // metadata device has not been passed in. So there should be a <add> // metadata file which is being mounted as loop device. 
<add> if devices.metadataDevice == "" { <add> metadatafilename := path.Join(dirname, "metadata") <add> metadataLoopDevice, metadataMajor, metadataMinor, err := getLoopFileDeviceMajMin(metadatafilename) <add> if err != nil { <add> return err <add> } <add> if poolMetadataMajor == metadataMajor && poolMetadataMinor == metadataMinor { <add> devices.metadataDevice = metadataLoopDevice <add> devices.metadataLoopFile = metadatafilename <add> } <add> } <add> <add> return nil <add>} <add> <ide> func (devices *DeviceSet) initDevmapper(doInit bool) error { <ide> // give ourselves to libdm as a log handler <ide> devicemapper.LogInit(devices) <ide> func (devices *DeviceSet) initDevmapper(doInit bool) error { <ide> } <ide> } <ide> <add> // Pool already exists and caller did not pass us a pool. That means <add> // we probably created pool earlier and could not remove it as some <add> // containers were still using it. Detect some of the properties of <add> // pool, like is it using loop devices. <add> if info.Exists != 0 && devices.thinPoolDevice == "" { <add> if err := devices.loadThinPoolLoopBackInfo(); err != nil { <add> logrus.Debugf("Failed to load thin pool loopback device information:%v", err) <add> return err <add> } <add> } <add> <ide> // If we didn't just create the data or metadata image, we need to <ide> // load the transaction id and migrate old metadata <ide> if !createdLoopback { <ide><path>pkg/devicemapper/devmapper.go <ide> func GetStatus(name string) (uint64, uint64, string, string, error) { <ide> return start, length, targetType, params, nil <ide> } <ide> <add>func GetTable(name string) (uint64, uint64, string, string, error) { <add> task, err := TaskCreateNamed(DeviceTable, name) <add> if task == nil { <add> logrus.Debugf("GetTable: Error TaskCreateNamed: %s", err) <add> return 0, 0, "", "", err <add> } <add> if err := task.Run(); err != nil { <add> logrus.Debugf("GetTable: Error Run: %s", err) <add> return 0, 0, "", "", err <add> } <add> <add> devinfo, err := 
task.GetInfo() <add> if err != nil { <add> logrus.Debugf("GetTable: Error GetInfo: %s", err) <add> return 0, 0, "", "", err <add> } <add> if devinfo.Exists == 0 { <add> logrus.Debugf("GetTable: Non existing device %s", name) <add> return 0, 0, "", "", fmt.Errorf("Non existing device %s", name) <add> } <add> <add> _, start, length, targetType, params := task.GetNextTarget(unsafe.Pointer(nil)) <add> return start, length, targetType, params, nil <add>} <add> <ide> func SetTransactionId(poolName string, oldId uint64, newId uint64) error { <ide> task, err := TaskCreateNamed(DeviceTargetMsg, poolName) <ide> if task == nil {
2
Text
Text
remove section about amending commits in pr guide
94c9f629370a9ca75ef175a559dd55774637f99a
<ide><path>doc/guides/contributing/pull-requests.md <ide> $ git push --force-with-lease origin my-branch <ide> ``` <ide> <ide> **Important:** The `git push --force-with-lease` command is one of the few ways <del>to delete history in `git`. Before you use it, make sure you understand the <del>risks. If in doubt, you can always ask for guidance in the pull request. <del> <del>If you happen to make a mistake in any of your commits, do not worry. You can <del>amend the last commit (for example if you want to change the commit log). <del> <del>```text <del>$ git add any/changed/files <del>$ git commit --amend <del>$ git push --force-with-lease origin my-branch <del>``` <add>to delete history in `git`. It also complicates the review process, as it won't <add>allow reviewers to get a quick glance on what changed. Before you use it, make <add>sure you understand the risks. If in doubt, you can always ask for guidance in <add>the pull request. <ide> <ide> There are a number of more advanced mechanisms for managing commits using <ide> `git rebase` that can be used, but are beyond the scope of this guide.
1
Python
Python
drop another case of u'', this time from #530 code
99848b0321a07d7f05f2333f635756f45e058088
<ide><path>rest_framework/tests/serializer.py <ide> def test_field_ctor(self): <ide> """ <ide> This is check that ctor supports both label and help_text. <ide> """ <del> self.assertEquals(u'Label', fields.Field(label='Label', help_text='Help').label) <del> self.assertEquals(u'Help', fields.CharField(label='Label', help_text='Help').help_text) <del> self.assertEquals(u'Label', relations.ManyHyperlinkedRelatedField(view_name='fake', label='Label', help_text='Help').label) <add> self.assertEquals('Label', fields.Field(label='Label', help_text='Help').label) <add> self.assertEquals('Help', fields.CharField(label='Label', help_text='Help').help_text) <add> self.assertEquals('Label', relations.ManyHyperlinkedRelatedField(view_name='fake', label='Label', help_text='Help').label) <ide> <ide> <ide> class AttributeMappingOnAutogeneratedFieldsTests(TestCase):
1
Mixed
Python
add german bert model to code, update readme
16af9ff7b0a22abfaff24de3ae00e695d7c25dd9
<ide><path>README.md <ide> where <ide> - `bert-base-multilingual-uncased`: (Orig, not recommended) 102 languages, 12-layer, 768-hidden, 12-heads, 110M parameters <ide> - `bert-base-multilingual-cased`: **(New, recommended)** 104 languages, 12-layer, 768-hidden, 12-heads, 110M parameters <ide> - `bert-base-chinese`: Chinese Simplified and Traditional, 12-layer, 768-hidden, 12-heads, 110M parameters <add> - `bert-base-german-cased`: Trained on German data only, 12-layer, 768-hidden, 12-heads, 110M parameters [Performance Evaluation](https://deepset.ai/german-bert) <ide> - `openai-gpt`: OpenAI English model, 12-layer, 768-hidden, 12-heads, 110M parameters <ide> - `transfo-xl-wt103`: Transformer-XL English model trained on wikitext-103, 18-layer, 1024-hidden, 16-heads, 257M parameters <ide> - `gpt2`: OpenAI GPT-2 English model, 12-layer, 768-hidden, 12-heads, 117M parameters <ide><path>pytorch_pretrained_bert/modeling.py <ide> 'bert-base-multilingual-uncased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz", <ide> 'bert-base-multilingual-cased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz", <ide> 'bert-base-chinese': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz", <add> 'bert-base-german-cased': "https://int-deepset-models-bert.s3.eu-central-1.amazonaws.com/pytorch/bert-base-german-cased.tar.gz", <ide> } <ide> BERT_CONFIG_NAME = 'bert_config.json' <ide> TF_WEIGHTS_NAME = 'model.ckpt' <ide><path>pytorch_pretrained_bert/tokenization.py <ide> 'bert-base-multilingual-uncased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt", <ide> 'bert-base-multilingual-cased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt", <ide> 'bert-base-chinese': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt", <add> 'bert-base-german-cased': 
"https://int-deepset-models-bert.s3.eu-central-1.amazonaws.com/pytorch/bert-base-german-cased-vocab.txt", <ide> } <ide> PRETRAINED_VOCAB_POSITIONAL_EMBEDDINGS_SIZE_MAP = { <ide> 'bert-base-uncased': 512, <ide> 'bert-base-multilingual-uncased': 512, <ide> 'bert-base-multilingual-cased': 512, <ide> 'bert-base-chinese': 512, <add> 'bert-base-german-cased': 512, <ide> } <ide> VOCAB_NAME = 'vocab.txt' <ide>
3
Javascript
Javascript
add index field to each fiber
cccc4ae2c61282938920ff57633df9e1513cc52a
<ide><path>src/renderers/shared/fiber/ReactChildFiber.js <ide> function deleteChild( <ide> returnFiber : Fiber, <ide> childToDelete : Fiber <ide> ) { <add> if (!shouldTrackSideEffects) { <add> // Noop. <add> return; <add> } <add> <ide> // TODO: Add this child to the side-effect queue for deletion. <ide> } <ide> <ide> function deleteRemainingChildren( <ide> returnFiber : Fiber, <ide> currentFirstChild : ?Fiber <ide> ) { <add> if (!shouldTrackSideEffects) { <add> // Noop. <add> return null; <add> } <ide> // TODO: Add these children to the side-effect queue for deletion. <ide> return null; <ide> } <ide> function mapAndDeleteRemainingChildren( <ide> // we will then undo the deletion as we restore children. Implicit (null) keys <ide> // don't get added to this set. <ide> const existingChildren : Map<string, Fiber> = new Map(); <del> // TODO: This also needs to store the "previous index of this node". That lets <del> // us determine whether something needs to be a placement. It might be best to <del> // just store this on the fiber itself since that lets us use the "implicit" <del> // index resolution mechanism without adding null values to the linked list. <ide> let existingChild = currentFirstChild; <ide> while (existingChild) { <ide> if (existingChild.key !== null) { <ide> function mapAndDeleteRemainingChildren( <ide> function ChildReconciler(shouldClone, shouldTrackSideEffects) { <ide> <ide> function useFiber(fiber : Fiber, priority : PriorityLevel) { <del> // We currently set sibling to null here because it is easy to forget to do <del> // before returning it. <add> // We currently set sibling to null and index to 0 here because it is easy <add> // to forget to do before returning it. E.g. for the single child case. 
<ide> if (shouldClone) { <ide> const clone = cloneFiber(fiber, priority); <add> clone.index = 0; <ide> clone.sibling = null; <ide> return clone; <ide> } else { <ide> if (fiber.pendingWorkPriority === NoWork || <ide> fiber.pendingWorkPriority > priority) { <ide> fiber.pendingWorkPriority = priority; <ide> } <add> fiber.index = 0; <ide> fiber.sibling = null; <ide> return fiber; <ide> } <ide> } <ide> <add> function placeChild(newFiber : Fiber, lastPlacedIndex : number, newIndex : number) { <add> newFiber.index = newIndex; <add> if (!shouldTrackSideEffects) { <add> // Noop. <add> return lastPlacedIndex; <add> } <add> const current = newFiber.alternate; <add> if (current) { <add> const oldIndex = current.index; <add> if (oldIndex < lastPlacedIndex) { <add> // This is a move. <add> // TODO: Schedule a move side-effect for this child. <add> return lastPlacedIndex; <add> } else { <add> // This item can stay in place. <add> return oldIndex; <add> } <add> } else { <add> // This is an insertion. <add> // TODO: Schedule an insertion side-effect for this child. <add> return lastPlacedIndex; <add> } <add> } <add> <ide> function updateTextNode( <ide> returnFiber : Fiber, <ide> current : ?Fiber, <ide> function ChildReconciler(shouldClone, shouldTrackSideEffects) { <ide> <ide> function updateSlot( <ide> returnFiber : Fiber, <del> oldFiber : Fiber, <add> oldFiber : ?Fiber, <ide> newChild : any, <ide> priority : PriorityLevel <ide> ) : ?Fiber { <ide> // Update the fiber if the keys match, otherwise return null. <ide> <add> const key = oldFiber ? oldFiber.key : null; <add> <ide> if (typeof newChild === 'string' || typeof newChild === 'number') { <ide> // Text nodes doesn't have keys. If the previous node is implicitly keyed <ide> // we can continue to replace it without aborting even if it is not a text <ide> // node. 
<del> if (oldFiber.key !== null) { <add> if (key !== null) { <ide> return null; <ide> } <ide> return updateTextNode(returnFiber, oldFiber, '' + newChild, priority); <ide> function ChildReconciler(shouldClone, shouldTrackSideEffects) { <ide> if (typeof newChild === 'object' && newChild !== null) { <ide> switch (newChild.$$typeof) { <ide> case REACT_ELEMENT_TYPE: { <del> if (newChild.key === oldFiber.key) { <add> if (newChild.key === key) { <ide> return updateElement( <ide> returnFiber, <ide> oldFiber, <ide> function ChildReconciler(shouldClone, shouldTrackSideEffects) { <ide> } <ide> <ide> case REACT_COROUTINE_TYPE: { <del> if (newChild.key === oldFiber.key) { <add> if (newChild.key === key) { <ide> return updateCoroutine( <ide> returnFiber, <ide> oldFiber, <ide> function ChildReconciler(shouldClone, shouldTrackSideEffects) { <ide> } <ide> <ide> case REACT_YIELD_TYPE: { <del> if (newChild.key === oldFiber.key) { <add> if (newChild.key === key) { <ide> return updateYield( <ide> returnFiber, <ide> oldFiber, <ide> function ChildReconciler(shouldClone, shouldTrackSideEffects) { <ide> if (isArray(newChild) || getIteratorFn(newChild)) { <ide> // Fragments doesn't have keys so if the previous key is implicit we can <ide> // update it. <del> if (oldFiber.key !== null) { <add> if (key !== null) { <ide> return null; <ide> } <ide> return updateFragment(returnFiber, oldFiber, newChild, priority); <ide> function ChildReconciler(shouldClone, shouldTrackSideEffects) { <ide> let previousNewFiber : ?Fiber = null; <ide> <ide> let oldFiber = currentFirstChild; <add> let lastPlacedIndex = 0; <ide> let newIdx = 0; <add> let nextOldFiber = null; <ide> for (; oldFiber && newIdx < newChildren.length; newIdx++) { <del> const nextOldFiber = oldFiber.sibling; // In-case we mutate this fiber. 
<del> const newFiber = updateSlot(returnFiber, oldFiber, newChildren[newIdx], priority); <add> if (oldFiber) { <add> if (oldFiber.index > newIdx) { <add> nextOldFiber = oldFiber; <add> oldFiber = null; <add> } else { <add> nextOldFiber = oldFiber.sibling; <add> } <add> } <add> const newFiber = updateSlot( <add> returnFiber, <add> oldFiber, <add> newChildren[newIdx], <add> priority <add> ); <ide> if (!newFiber) { <add> // TODO: This breaks on empty slots like null children. That's <add> // unfortunate because it triggers the slow path all the time. We need <add> // a better way to communicate whether this was a miss or null, <add> // boolean, undefined, etc. <ide> break; <ide> } <add> lastPlacedIndex = placeChild(newFiber, lastPlacedIndex, newIdx); <ide> if (!previousNewFiber) { <ide> // TODO: Move out of the loop. This only happens for the first run. <ide> resultingFirstChild = newFiber; <ide> function ChildReconciler(shouldClone, shouldTrackSideEffects) { <ide> for (; newIdx < newChildren.length; newIdx++) { <ide> // TODO: Since the mutation of existing fibers can happen at any order <ide> // we might break the link before we're done with it. :( <del> const nextOldFiber = oldFiber ? oldFiber.sibling : null; <add> if (oldFiber) { <add> if (oldFiber.index > newIdx) { <add> nextOldFiber = oldFiber; <add> oldFiber = null; <add> } else { <add> nextOldFiber = oldFiber.sibling; <add> } <add> } <ide> const newFiber = updateFromMap( <ide> existingChildren, <ide> returnFiber, <ide> function ChildReconciler(shouldClone, shouldTrackSideEffects) { <ide> priority <ide> ); <ide> if (newFiber) { <add> lastPlacedIndex = placeChild(newFiber, lastPlacedIndex, newIdx); <ide> if (!previousNewFiber) { <ide> resultingFirstChild = newFiber; <ide> } else { <ide> function ChildReconciler(shouldClone, shouldTrackSideEffects) { <ide> } <ide> // We will keep traversing the oldFiber in order, in case the new child <ide> // has a null key that we'll need to match in the same slot. 
<del> if (oldFiber) { <del> oldFiber = nextOldFiber; <del> } <add> oldFiber = nextOldFiber; <ide> } <ide> <ide> // TODO: Add deletion side-effects to the returnFiber's side-effects. <ide><path>src/renderers/shared/fiber/ReactFiber.js <ide> export type Fiber = Instance & { <ide> // Singly Linked List Tree Structure. <ide> child: ?Fiber, <ide> sibling: ?Fiber, <add> index: number, <ide> <ide> // The ref last used to attach this node. <ide> // I'll avoid adding an owner field for prod and model that as functions. <ide> var createFiber = function(tag : TypeOfWork, key : null | string) : Fiber { <ide> <ide> child: null, <ide> sibling: null, <add> index: 0, <ide> <ide> ref: null, <ide> <ide> exports.cloneFiber = function(fiber : Fiber, priorityLevel : PriorityLevel) : Fi <ide> alt.stateNode = fiber.stateNode; <ide> alt.child = fiber.child; <ide> alt.sibling = fiber.sibling; // This should always be overridden. TODO: null <add> alt.index = fiber.index; // This should always be overridden. <ide> alt.ref = fiber.ref; <ide> // pendingProps is here for symmetry but is unnecessary in practice for now. <ide> // TODO: Pass in the new pendingProps as an argument maybe? <ide><path>src/renderers/shared/fiber/__tests__/ReactTopLevelFragment-test.js <ide> describe('ReactTopLevelFragment', function() { <ide> <ide> }); <ide> <add> it('preserves state if an implicit key slot switches from/to null', function() { <add> <add> var instance = null; <add> <add> class Stateful extends React.Component { <add> render() { <add> instance = this; <add> return <div>World</div>; <add> } <add> } <add> <add> function Fragment({ condition }) { <add> return condition ? 
[null, <Stateful />] : <add> [<div>Hello</div>, <Stateful />]; <add> } <add> ReactNoop.render(<Fragment />); <add> ReactNoop.flush(); <add> <add> var instanceA = instance; <add> <add> expect(instanceA).not.toBe(null); <add> <add> ReactNoop.render(<Fragment condition={true} />); <add> ReactNoop.flush(); <add> <add> var instanceB = instance; <add> <add> expect(instanceB).toBe(instanceA); <add> <add> ReactNoop.render(<Fragment condition={false} />); <add> ReactNoop.flush(); <add> <add> var instanceC = instance; <add> <add> expect(instanceC === instanceA).toBe(true); <add> <add> }); <add> <add> it('should preserve state in a reorder', function() { <add> <add> var instance = null; <add> <add> class Stateful extends React.Component { <add> render() { <add> instance = this; <add> return <div>Hello</div>; <add> } <add> } <add> <add> function Fragment({ condition }) { <add> return condition ? [[<div key="b">World</div>, <Stateful key="a" />]] : <add> [[<Stateful key="a" />, <div key="b">World</div>], <div />]; <add> } <add> ReactNoop.render(<Fragment />); <add> ReactNoop.flush(); <add> <add> var instanceA = instance; <add> <add> expect(instanceA).not.toBe(null); <add> <add> ReactNoop.render(<Fragment condition={true} />); <add> ReactNoop.flush(); <add> <add> var instanceB = instance; <add> <add> expect(instanceB).toBe(instanceA); <add> <add> }); <add> <ide> });
3
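The React patch above adds an `index` field to fibers and threads a `lastPlacedIndex` through `placeChild` so keyed children can be reordered without remounting. Below is a rough standalone sketch of that move heuristic (not React's actual code; fiber effect flags, priorities, and deletions are omitted, and the names are illustrative): a reused child stays in place if its old index is at or past the rightmost child already kept, otherwise it is marked as a move.

```javascript
// Rough sketch of the "lastPlacedIndex" heuristic used by keyed child
// reconciliation. oldIndexByKey maps each key to its position in the
// previous render; newKeys is the order requested by the next render.
// Returns the set of keys that would need to be moved in the host tree.
function keysToMove(oldIndexByKey, newKeys) {
  let lastPlacedIndex = 0;
  const moves = new Set();
  for (const key of newKeys) {
    const oldIndex = oldIndexByKey.get(key);
    if (oldIndex === undefined) {
      // Brand new child: it is inserted, not moved.
      continue;
    }
    if (oldIndex < lastPlacedIndex) {
      // This child used to sit before one we already kept in place,
      // so it has to move.
      moves.add(key);
    } else {
      // Child can stay put; later children must be placed after it.
      lastPlacedIndex = oldIndex;
    }
  }
  return moves;
}
```

For old order `a, b, c` and new order `c, a, b`, the heuristic keeps `c` in place and moves `a` and `b` after it, rather than moving `c` to the front.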
Ruby
Ruby
test the happy path for recursive yields too
fe7d77cc01ae652a15b2c1896f677a56133bc0f1
<ide><path>activesupport/test/share_lock_test.rb <ide> def test_manual_incompatible_yield <ide> threads.each(&:kill) if threads <ide> end <ide> <add> def test_manual_recursive_yield <add> ready = Concurrent::CyclicBarrier.new(2) <add> done = Concurrent::CyclicBarrier.new(2) <add> do_nesting = Concurrent::CountDownLatch.new <add> <add> threads = [ <add> Thread.new do <add> @lock.sharing do <add> ready.wait <add> @lock.exclusive(purpose: :x) {} <add> done.wait <add> end <add> end, <add> <add> Thread.new do <add> @lock.sharing do <add> @lock.yield_shares(compatible: [:x]) do <add> @lock.sharing do <add> ready.wait <add> do_nesting.wait <add> @lock.yield_shares(compatible: [:x, :y]) do <add> done.wait <add> end <add> end <add> end <add> end <add> end <add> ] <add> <add> assert_threads_stuck threads <add> do_nesting.count_down <add> <add> assert_threads_not_stuck threads <add> end <add> <ide> def test_manual_recursive_yield_cannot_expand_outer_compatible <ide> ready = Concurrent::CyclicBarrier.new(2) <ide> do_compatible_nesting = Concurrent::CountDownLatch.new
1
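The ShareLock test above exercises nested `yield_shares` calls: the `exclusive(purpose: :x)` thread stays blocked until every sharing thread has yielded its share with a `compatible` list naming `:x`. Below is a very loose, single-threaded model of just that compatibility rule (the real `ActiveSupport::Concurrency::ShareLock` coordinates actual threads with a monitor and condition variables; the shape of `sharingHolders` here is purely illustrative).

```javascript
// Loose model of ShareLock's compatibility check. Each holder represents
// one thread currently inside `sharing`; `yieldedCompatible` is null while
// the thread holds its share, or the `compatible` list it passed to
// `yield_shares`. An exclusive(purpose) request may proceed only once
// every other holder has yielded with a list that includes that purpose.
function exclusiveAllowed(sharingHolders, purpose) {
  return sharingHolders.every(
    holder =>
      holder.yieldedCompatible !== null &&
      holder.yieldedCompatible.includes(purpose)
  );
}
```

This mirrors the test's flow: before `do_nesting.count_down`, the nested `sharing` block holds an unyielded share, so the threads stay stuck; once the inner `yield_shares(compatible: [:x, :y])` runs, both yields name `:x` and the exclusive proceeds.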
Javascript
Javascript
extract some of the tidy up changes from 19278
61dd00db24bec6305bd72908d3617b9f2a5183da
<ide><path>packages/react-dom/src/client/ReactDOMComponent.js <ide> import { <ide> enableDeprecatedFlareAPI, <ide> enableTrustedTypesIntegration, <ide> } from 'shared/ReactFeatureFlags'; <del>import {listenToReactPropEvent} from '../events/DOMModernPluginEventSystem'; <add>import {listenToReactEvent} from '../events/DOMModernPluginEventSystem'; <ide> import {getEventListenerMap} from './ReactDOMComponentTree'; <ide> <ide> let didWarnInvalidHydration = false; <ide> export function ensureListeningTo( <ide> 'ensureListeningTo(): received a container that was not an element node. ' + <ide> 'This is likely a bug in React.', <ide> ); <del> listenToReactPropEvent( <del> reactPropEvent, <del> ((rootContainerElement: any): Element), <del> ); <add> listenToReactEvent(reactPropEvent, ((rootContainerElement: any): Element)); <ide> } <ide> <ide> function getOwnerDocumentFromRootContainer( <ide><path>packages/react-dom/src/client/ReactDOMEventHandle.js <ide> import { <ide> } from './ReactDOMComponentTree'; <ide> import {ELEMENT_NODE} from '../shared/HTMLNodeType'; <ide> import { <del> listenToTopLevelEvent, <add> listenToNativeEvent, <ide> addEventTypeToDispatchConfig, <ide> } from '../events/DOMModernPluginEventSystem'; <ide> <ide> import {HostRoot, HostPortal} from 'react-reconciler/src/ReactWorkTags'; <ide> import { <ide> PLUGIN_EVENT_SYSTEM, <del> IS_TARGET_PHASE_ONLY, <add> IS_EVENT_HANDLE_NON_MANAGED_NODE, <ide> } from '../events/EventSystemFlags'; <ide> <ide> import { <ide> function isReactScope(target: EventTarget | ReactScopeInstance): boolean { <ide> <ide> function createEventHandleListener( <ide> type: DOMTopLevelEventType, <del> capture: boolean, <add> isCapturePhaseListener: boolean, <ide> callback: (SyntheticEvent<EventTarget>) => void, <ide> ): ReactDOMEventHandleListener { <ide> return { <ide> callback, <del> capture, <add> capture: isCapturePhaseListener, <ide> type, <ide> }; <ide> } <ide> <ide> function registerEventOnNearestTargetContainer( <ide> targetFiber: 
Fiber, <ide> topLevelType: DOMTopLevelEventType, <del> passive: boolean | void, <del> priority: EventPriority | void, <del> capture: boolean, <add> isPassiveListener: boolean | void, <add> listenerPriority: EventPriority | void, <add> isCapturePhaseListener: boolean, <ide> ): void { <ide> // If it is, find the nearest root or portal and make it <ide> // our event handle target container. <ide> function registerEventOnNearestTargetContainer( <ide> ); <ide> } <ide> const listenerMap = getEventListenerMap(targetContainer); <del> listenToTopLevelEvent( <add> listenToNativeEvent( <ide> topLevelType, <ide> targetContainer, <ide> listenerMap, <ide> PLUGIN_EVENT_SYSTEM, <del> capture, <del> passive, <del> priority, <add> isCapturePhaseListener, <add> isPassiveListener, <add> listenerPriority, <ide> ); <ide> } <ide> <ide> function registerReactDOMEvent( <ide> target: EventTarget | ReactScopeInstance, <ide> topLevelType: DOMTopLevelEventType, <del> passive: boolean | void, <del> capture: boolean, <del> priority: EventPriority | void, <add> isPassiveListener: boolean | void, <add> isCapturePhaseListener: boolean, <add> listenerPriority: EventPriority | void, <ide> ): void { <ide> // Check if the target is a DOM element. 
<ide> if ((target: any).nodeType === ELEMENT_NODE) { <ide> function registerReactDOMEvent( <ide> registerEventOnNearestTargetContainer( <ide> targetFiber, <ide> topLevelType, <del> passive, <del> priority, <del> capture, <add> isPassiveListener, <add> listenerPriority, <add> isCapturePhaseListener, <ide> ); <ide> } else if (enableScopeAPI && isReactScope(target)) { <ide> const scopeTarget = ((target: any): ReactScopeInstance); <ide> function registerReactDOMEvent( <ide> registerEventOnNearestTargetContainer( <ide> targetFiber, <ide> topLevelType, <del> passive, <del> priority, <del> capture, <add> isPassiveListener, <add> listenerPriority, <add> isCapturePhaseListener, <ide> ); <ide> } else if (isValidEventTarget(target)) { <ide> const eventTarget = ((target: any): EventTarget); <ide> const listenerMap = getEventListenerMap(eventTarget); <del> listenToTopLevelEvent( <add> listenToNativeEvent( <ide> topLevelType, <ide> eventTarget, <ide> listenerMap, <del> PLUGIN_EVENT_SYSTEM | IS_TARGET_PHASE_ONLY, <del> capture, <del> passive, <del> priority, <add> PLUGIN_EVENT_SYSTEM | IS_EVENT_HANDLE_NON_MANAGED_NODE, <add> isCapturePhaseListener, <add> isPassiveListener, <add> listenerPriority, <ide> ); <ide> } else { <ide> invariant( <ide> export function createEventHandle( <ide> ): ReactDOMEventHandle { <ide> if (enableCreateEventHandleAPI) { <ide> const topLevelType = ((type: any): DOMTopLevelEventType); <del> let capture = false; <del> let passive = undefined; // Undefined means to use the browser default <del> let priority; <add> let isCapturePhaseListener = false; <add> let isPassiveListener = undefined; // Undefined means to use the browser default <add> let listenerPriority; <ide> <ide> if (options != null) { <ide> const optionsCapture = options.capture; <ide> const optionsPassive = options.passive; <ide> const optionsPriority = options.priority; <ide> <ide> if (typeof optionsCapture === 'boolean') { <del> capture = optionsCapture; <add> isCapturePhaseListener = 
optionsCapture; <ide> } <ide> if (typeof optionsPassive === 'boolean') { <del> passive = optionsPassive; <add> isPassiveListener = optionsPassive; <ide> } <ide> if (typeof optionsPriority === 'number') { <del> priority = optionsPriority; <add> listenerPriority = optionsPriority; <ide> } <ide> } <del> if (priority === undefined) { <del> priority = getEventPriorityForListenerSystem(topLevelType); <add> if (listenerPriority === undefined) { <add> listenerPriority = getEventPriorityForListenerSystem(topLevelType); <ide> } <ide> <ide> const registeredReactDOMEvents = new PossiblyWeakSet(); <ide> export function createEventHandle( <ide> ); <ide> if (!registeredReactDOMEvents.has(target)) { <ide> registeredReactDOMEvents.add(target); <del> registerReactDOMEvent(target, topLevelType, passive, capture, priority); <add> registerReactDOMEvent( <add> target, <add> topLevelType, <add> isPassiveListener, <add> isCapturePhaseListener, <add> listenerPriority, <add> ); <ide> // Add the event to our known event types list. 
<ide> addEventTypeToDispatchConfig(topLevelType); <ide> } <ide> const listener = createEventHandleListener( <ide> topLevelType, <del> capture, <add> isCapturePhaseListener, <ide> callback, <ide> ); <ide> let targetListeners = getEventHandlerListeners(target); <ide><path>packages/react-dom/src/client/ReactDOMHostConfig.js <ide> import { <ide> } from 'shared/ReactFeatureFlags'; <ide> import {HostComponent, HostText} from 'react-reconciler/src/ReactWorkTags'; <ide> import {TOP_BEFORE_BLUR, TOP_AFTER_BLUR} from '../events/DOMTopLevelEventTypes'; <del>import {listenToReactPropEvent} from '../events/DOMModernPluginEventSystem'; <add>import {listenToReactEvent} from '../events/DOMModernPluginEventSystem'; <ide> <ide> export type Type = string; <ide> export type Props = { <ide> export function makeOpaqueHydratingObject( <ide> } <ide> <ide> export function preparePortalMount(portalInstance: Instance): void { <del> listenToReactPropEvent('onMouseEnter', portalInstance); <add> listenToReactEvent('onMouseEnter', portalInstance); <ide> } <ide> <ide> export function prepareScopeUpdate( <ide><path>packages/react-dom/src/events/DOMModernPluginEventSystem.js <ide> <ide> import type {TopLevelType, DOMTopLevelEventType} from './TopLevelEventTypes'; <ide> import type {EventSystemFlags} from './EventSystemFlags'; <del>import type { <del> AnyNativeEvent, <del> DispatchQueue, <del> DispatchQueueItem, <del> DispatchQueueItemPhase, <del> DispatchQueueItemPhaseEntry, <del>} from './PluginModuleType'; <add>import type {AnyNativeEvent} from './PluginModuleType'; <ide> import type {ReactSyntheticEvent} from './ReactSyntheticEventType'; <ide> import type { <ide> ElementListenerMap, <ide> import { <ide> LEGACY_FB_SUPPORT, <ide> IS_REPLAYED, <ide> IS_CAPTURE_PHASE, <del> IS_TARGET_PHASE_ONLY, <add> IS_EVENT_HANDLE_NON_MANAGED_NODE, <ide> } from './EventSystemFlags'; <ide> <ide> import { <ide> import * as ModernEnterLeaveEventPlugin from './plugins/ModernEnterLeaveEventPlu <ide> import * as 
ModernSelectEventPlugin from './plugins/ModernSelectEventPlugin'; <ide> import * as ModernSimpleEventPlugin from './plugins/ModernSimpleEventPlugin'; <ide> <add>type DispatchListener = {| <add> instance: null | Fiber, <add> listener: Function, <add> currentTarget: EventTarget, <add>|}; <add> <add>type DispatchEntry = {| <add> event: ReactSyntheticEvent, <add> listeners: Array<DispatchListener>, <add>|}; <add> <add>export type DispatchQueue = Array<DispatchEntry>; <add> <ide> // TODO: remove top-level side effect. <ide> ModernSimpleEventPlugin.registerEvents(); <ide> ModernEnterLeaveEventPlugin.registerEvents(); <ide> function extractEvents( <ide> nativeEvent: AnyNativeEvent, <ide> nativeEventTarget: null | EventTarget, <ide> eventSystemFlags: EventSystemFlags, <del> targetContainer: null | EventTarget, <add> targetContainer: EventTarget, <ide> ) { <ide> // TODO: we should remove the concept of a "SimpleEventPlugin". <ide> // This is the basic functionality of the event system. All <ide> function executeDispatch( <ide> <ide> function processDispatchQueueItemsInOrder( <ide> event: ReactSyntheticEvent, <del> phase: DispatchQueueItemPhase, <add> dispatchListeners: Array<DispatchListener>, <ide> inCapturePhase: boolean, <ide> ): void { <ide> let previousInstance; <ide> if (inCapturePhase) { <del> for (let i = phase.length - 1; i >= 0; i--) { <del> const {instance, currentTarget, listener} = phase[i]; <add> for (let i = dispatchListeners.length - 1; i >= 0; i--) { <add> const {instance, currentTarget, listener} = dispatchListeners[i]; <ide> if (instance !== previousInstance && event.isPropagationStopped()) { <ide> return; <ide> } <ide> executeDispatch(event, listener, currentTarget); <ide> previousInstance = instance; <ide> } <ide> } else { <del> for (let i = 0; i < phase.length; i++) { <del> const {instance, currentTarget, listener} = phase[i]; <add> for (let i = 0; i < dispatchListeners.length; i++) { <add> const {instance, currentTarget, listener} = 
dispatchListeners[i]; <ide> if (instance !== previousInstance && event.isPropagationStopped()) { <ide> return; <ide> } <ide> export function processDispatchQueue( <ide> ): void { <ide> const inCapturePhase = (eventSystemFlags & IS_CAPTURE_PHASE) !== 0; <ide> for (let i = 0; i < dispatchQueue.length; i++) { <del> const dispatchQueueItem: DispatchQueueItem = dispatchQueue[i]; <del> const {event, phase} = dispatchQueueItem; <del> processDispatchQueueItemsInOrder(event, phase, inCapturePhase); <add> const {event, listeners} = dispatchQueue[i]; <add> processDispatchQueueItemsInOrder(event, listeners, inCapturePhase); <ide> // Modern event system doesn't use pooling. <ide> } <ide> // This would be a good time to rethrow if any of the event handlers threw. <ide> function shouldUpgradeListener( <ide> ); <ide> } <ide> <del>export function listenToTopLevelEvent( <add>export function listenToNativeEvent( <ide> topLevelType: DOMTopLevelEventType, <ide> target: EventTarget, <ide> listenerMap: ElementListenerMap, <ide> eventSystemFlags: EventSystemFlags, <del> capture: boolean, <del> passive?: boolean, <add> isCapturePhaseListener: boolean, <add> isPassiveListener?: boolean, <ide> priority?: EventPriority, <ide> ): void { <ide> // TOP_SELECTION_CHANGE needs to be attached to the document <ide> export function listenToTopLevelEvent( <ide> target = (target: any).ownerDocument || target; <ide> listenerMap = getEventListenerMap(target); <ide> } <del> const listenerMapKey = getListenerMapKey(topLevelType, capture); <add> const listenerMapKey = getListenerMapKey( <add> topLevelType, <add> isCapturePhaseListener, <add> ); <ide> const listenerEntry = ((listenerMap.get( <ide> listenerMapKey, <ide> ): any): ElementListenerMapEntry | void); <del> const shouldUpgrade = shouldUpgradeListener(listenerEntry, passive); <add> const shouldUpgrade = shouldUpgradeListener(listenerEntry, isPassiveListener); <ide> <ide> // If the listener entry is empty or we should upgrade, then <ide> // we need to 
trap an event listener onto the target. <ide> export function listenToTopLevelEvent( <ide> removeTrappedEventListener( <ide> target, <ide> topLevelType, <del> capture, <add> isCapturePhaseListener, <ide> ((listenerEntry: any): ElementListenerMapEntry).listener, <ide> ); <ide> } <del> if (capture) { <add> if (isCapturePhaseListener) { <ide> eventSystemFlags |= IS_CAPTURE_PHASE; <ide> } <ide> const listener = addTrappedEventListener( <ide> target, <ide> topLevelType, <ide> eventSystemFlags, <del> capture, <add> isCapturePhaseListener, <ide> false, <del> passive, <add> isPassiveListener, <ide> priority, <ide> ); <del> listenerMap.set(listenerMapKey, {passive, listener}); <add> listenerMap.set(listenerMapKey, {passive: isPassiveListener, listener}); <ide> } <ide> } <ide> <ide> function isCaptureRegistrationName(registrationName: string): boolean { <ide> return registrationName.substr(len - 7) === 'Capture'; <ide> } <ide> <del>export function listenToReactPropEvent( <add>export function listenToReactEvent( <ide> reactPropEvent: string, <ide> rootContainerElement: Element, <ide> ): void { <ide> export function listenToReactPropEvent( <ide> const dependency = dependencies[i]; <ide> const capture = <ide> capturePhaseEvents.has(dependency) || registrationCapturePhase; <del> listenToTopLevelEvent( <add> listenToNativeEvent( <ide> dependency, <ide> rootContainerElement, <ide> listenerMap, <ide> function addTrappedEventListener( <ide> targetContainer: EventTarget, <ide> topLevelType: DOMTopLevelEventType, <ide> eventSystemFlags: EventSystemFlags, <del> capture: boolean, <add> isCapturePhaseListener: boolean, <ide> isDeferredListenerForLegacyFBSupport?: boolean, <del> passive?: boolean, <del> priority?: EventPriority, <add> isPassiveListener?: boolean, <add> listenerPriority?: EventPriority, <ide> ): any => void { <ide> let listener = createEventListenerWrapperWithPriority( <ide> targetContainer, <ide> topLevelType, <ide> eventSystemFlags, <del> priority, <add> 
listenerPriority, <ide> ); <ide> // If passive option is not supported, then the event will be <ide> // active and not passive. <del> if (passive === true && !passiveBrowserEventsSupported) { <del> passive = false; <add> if (isPassiveListener === true && !passiveBrowserEventsSupported) { <add> isPassiveListener = false; <ide> } <ide> <ide> targetContainer = <ide> function addTrappedEventListener( <ide> targetContainer, <ide> rawEventName, <ide> unsubscribeListener, <del> capture, <add> isCapturePhaseListener, <ide> ); <ide> } <ide> }; <ide> } <del> if (capture) { <del> if (enableCreateEventHandleAPI && passive !== undefined) { <add> if (isCapturePhaseListener) { <add> if (enableCreateEventHandleAPI && isPassiveListener !== undefined) { <ide> unsubscribeListener = addEventCaptureListenerWithPassiveFlag( <ide> targetContainer, <ide> rawEventName, <ide> listener, <del> passive, <add> isPassiveListener, <ide> ); <ide> } else { <ide> unsubscribeListener = addEventCaptureListener( <ide> function addTrappedEventListener( <ide> ); <ide> } <ide> } else { <del> if (enableCreateEventHandleAPI && passive !== undefined) { <add> if (enableCreateEventHandleAPI && isPassiveListener !== undefined) { <ide> unsubscribeListener = addEventBubbleListenerWithPassiveFlag( <ide> targetContainer, <ide> rawEventName, <ide> listener, <del> passive, <add> isPassiveListener, <ide> ); <ide> } else { <ide> unsubscribeListener = addEventBubbleListener( <ide> export function dispatchEventForPluginEventSystem( <ide> targetContainer: EventTarget, <ide> ): void { <ide> let ancestorInst = targetInst; <del> if (eventSystemFlags & IS_TARGET_PHASE_ONLY) { <add> if (eventSystemFlags & IS_EVENT_HANDLE_NON_MANAGED_NODE) { <ide> // For TargetEvent nodes (i.e. 
document, window) <ide> ancestorInst = null; <ide> } else { <ide> export function dispatchEventForPluginEventSystem( <ide> ); <ide> } <ide> <del>function createDispatchQueueItemPhaseEntry( <add>function createDispatchListener( <ide> instance: null | Fiber, <ide> listener: Function, <ide> currentTarget: EventTarget, <del>): DispatchQueueItemPhaseEntry { <add>): DispatchListener { <ide> return { <ide> instance, <ide> listener, <ide> currentTarget, <ide> }; <ide> } <ide> <del>function createDispatchQueueItem( <add>function createDispatchEntry( <ide> event: ReactSyntheticEvent, <del> phase: DispatchQueueItemPhase, <del>): DispatchQueueItem { <add> listeners: Array<DispatchListener>, <add>): DispatchEntry { <ide> return { <ide> event, <del> phase, <add> listeners, <ide> }; <ide> } <ide> <ide> export function accumulateSinglePhaseListeners( <ide> ): void { <ide> const bubbled = event._reactName; <ide> const captured = bubbled !== null ? bubbled + 'Capture' : null; <del> const phase: DispatchQueueItemPhase = []; <add> const listeners: Array<DispatchListener> = []; <ide> <ide> // If we are not handling EventTarget only phase, then we're doing the <ide> // usual two phase accumulation using the React fiber tree to pick up <ide> export function accumulateSinglePhaseListeners( <ide> lastHostComponent = currentTarget; <ide> // For Event Handle listeners <ide> if (enableCreateEventHandleAPI) { <del> const listeners = getEventHandlerListeners(currentTarget); <add> const eventHandlerlisteners = getEventHandlerListeners(currentTarget); <ide> <del> if (listeners !== null) { <del> const listenersArr = Array.from(listeners); <del> for (let i = 0; i < listenersArr.length; i++) { <del> const listener = listenersArr[i]; <del> const {callback, capture, type} = listener; <add> if (eventHandlerlisteners !== null) { <add> const eventHandlerlistenersArr = Array.from(eventHandlerlisteners); <add> for (let i = 0; i < eventHandlerlistenersArr.length; i++) { <add> const { <add> callback, <add> 
capture: isCapturePhaseListener, <add> type, <add> } = eventHandlerlistenersArr[i]; <ide> if (type === targetType) { <del> if (capture && inCapturePhase) { <del> phase.push( <del> createDispatchQueueItemPhaseEntry( <del> instance, <del> callback, <del> currentTarget, <del> ), <add> if (isCapturePhaseListener && inCapturePhase) { <add> listeners.push( <add> createDispatchListener(instance, callback, currentTarget), <ide> ); <del> } else if (!capture) { <del> const entry = createDispatchQueueItemPhaseEntry( <add> } else if (!isCapturePhaseListener) { <add> const entry = createDispatchListener( <ide> instance, <ide> callback, <ide> currentTarget, <ide> ); <ide> if (shouldEmulateTwoPhase) { <del> phase.unshift(entry); <add> listeners.unshift(entry); <ide> } else if (!inCapturePhase) { <del> phase.push(entry); <add> listeners.push(entry); <ide> } <ide> } <ide> } <ide> export function accumulateSinglePhaseListeners( <ide> if (captured !== null && inCapturePhase) { <ide> const captureListener = getListener(instance, captured); <ide> if (captureListener != null) { <del> phase.push( <del> createDispatchQueueItemPhaseEntry( <del> instance, <del> captureListener, <del> currentTarget, <del> ), <add> listeners.push( <add> createDispatchListener(instance, captureListener, currentTarget), <ide> ); <ide> } <ide> } <ide> if (bubbled !== null) { <ide> const bubbleListener = getListener(instance, bubbled); <ide> if (bubbleListener != null) { <del> const entry = createDispatchQueueItemPhaseEntry( <add> const entry = createDispatchListener( <ide> instance, <ide> bubbleListener, <ide> currentTarget, <ide> ); <ide> if (shouldEmulateTwoPhase) { <del> phase.unshift(entry); <add> listeners.unshift(entry); <ide> } else if (!inCapturePhase) { <del> phase.push(entry); <add> listeners.push(entry); <ide> } <ide> } <ide> } <ide> export function accumulateSinglePhaseListeners( <ide> lastHostComponent !== null <ide> ) { <ide> const reactScopeInstance = stateNode; <del> const listeners = 
getEventHandlerListeners(reactScopeInstance); <add> const eventHandlerlisteners = getEventHandlerListeners( <add> reactScopeInstance, <add> ); <ide> const lastCurrentTarget = ((lastHostComponent: any): Element); <ide> <del> if (listeners !== null) { <del> const listenersArr = Array.from(listeners); <del> for (let i = 0; i < listenersArr.length; i++) { <del> const listener = listenersArr[i]; <del> const {callback, capture, type} = listener; <add> if (eventHandlerlisteners !== null) { <add> const eventHandlerlistenersArr = Array.from(eventHandlerlisteners); <add> for (let i = 0; i < eventHandlerlistenersArr.length; i++) { <add> const { <add> callback, <add> capture: isCapturePhaseListener, <add> type, <add> } = eventHandlerlistenersArr[i]; <ide> if (type === targetType) { <del> if (capture && inCapturePhase) { <del> phase.push( <del> createDispatchQueueItemPhaseEntry( <del> instance, <del> callback, <del> lastCurrentTarget, <del> ), <add> if (isCapturePhaseListener && inCapturePhase) { <add> listeners.push( <add> createDispatchListener(instance, callback, lastCurrentTarget), <ide> ); <del> } else if (!capture) { <del> const entry = createDispatchQueueItemPhaseEntry( <add> } else if (!isCapturePhaseListener) { <add> const entry = createDispatchListener( <ide> instance, <ide> callback, <ide> lastCurrentTarget, <ide> ); <ide> if (shouldEmulateTwoPhase) { <del> phase.unshift(entry); <add> listeners.unshift(entry); <ide> } else if (!inCapturePhase) { <del> phase.push(entry); <add> listeners.push(entry); <ide> } <ide> } <ide> } <ide> export function accumulateSinglePhaseListeners( <ide> } <ide> instance = instance.return; <ide> } <del> if (phase.length !== 0) { <del> dispatchQueue.push(createDispatchQueueItem(event, phase)); <add> if (listeners.length !== 0) { <add> dispatchQueue.push(createDispatchEntry(event, listeners)); <ide> } <ide> } <ide> <ide> export function accumulateTwoPhaseListeners( <ide> ): void { <ide> const bubbled = event._reactName; <ide> const captured = 
bubbled !== null ? bubbled + 'Capture' : null; <del> const phase: DispatchQueueItemPhase = []; <add> const listeners: Array<DispatchListener> = []; <ide> let instance = targetFiber; <ide> <ide> // Accumulate all instances and listeners via the target -> root path. <ide> export function accumulateTwoPhaseListeners( <ide> if (captured !== null) { <ide> const captureListener = getListener(instance, captured); <ide> if (captureListener != null) { <del> phase.unshift( <del> createDispatchQueueItemPhaseEntry( <del> instance, <del> captureListener, <del> currentTarget, <del> ), <add> listeners.unshift( <add> createDispatchListener(instance, captureListener, currentTarget), <ide> ); <ide> } <ide> } <ide> if (bubbled !== null) { <ide> const bubbleListener = getListener(instance, bubbled); <ide> if (bubbleListener != null) { <del> phase.push( <del> createDispatchQueueItemPhaseEntry( <del> instance, <del> bubbleListener, <del> currentTarget, <del> ), <add> listeners.push( <add> createDispatchListener(instance, bubbleListener, currentTarget), <ide> ); <ide> } <ide> } <ide> } <ide> instance = instance.return; <ide> } <del> if (phase.length !== 0) { <del> dispatchQueue.push(createDispatchQueueItem(event, phase)); <add> if (listeners.length !== 0) { <add> dispatchQueue.push(createDispatchEntry(event, listeners)); <ide> } <ide> } <ide> <ide> function accumulateEnterLeaveListenersForEvent( <ide> event: ReactSyntheticEvent, <ide> target: Fiber, <ide> common: Fiber | null, <del> capture: boolean, <add> inCapturePhase: boolean, <ide> ): void { <ide> const registrationName = event._reactName; <ide> if (registrationName === undefined) { <ide> return; <ide> } <del> const phase: DispatchQueueItemPhase = []; <add> const listeners: Array<DispatchListener> = []; <ide> <ide> let instance = target; <ide> while (instance !== null) { <ide> function accumulateEnterLeaveListenersForEvent( <ide> } <ide> if (tag === HostComponent && stateNode !== null) { <ide> const currentTarget = stateNode; <del> 
if (capture) { <add> if (inCapturePhase) { <ide> const captureListener = getListener(instance, registrationName); <ide> if (captureListener != null) { <del> phase.unshift( <del> createDispatchQueueItemPhaseEntry( <del> instance, <del> captureListener, <del> currentTarget, <del> ), <add> listeners.unshift( <add> createDispatchListener(instance, captureListener, currentTarget), <ide> ); <ide> } <del> } else if (!capture) { <add> } else if (!inCapturePhase) { <ide> const bubbleListener = getListener(instance, registrationName); <ide> if (bubbleListener != null) { <del> phase.push( <del> createDispatchQueueItemPhaseEntry( <del> instance, <del> bubbleListener, <del> currentTarget, <del> ), <add> listeners.push( <add> createDispatchListener(instance, bubbleListener, currentTarget), <ide> ); <ide> } <ide> } <ide> } <ide> instance = instance.return; <ide> } <del> if (phase.length !== 0) { <del> dispatchQueue.push(createDispatchQueueItem(event, phase)); <add> if (listeners.length !== 0) { <add> dispatchQueue.push(createDispatchEntry(event, listeners)); <ide> } <ide> } <ide> <ide> export function accumulateEnterLeaveTwoPhaseListeners( <ide> } <ide> } <ide> <del>export function accumulateEventHandleTargetListeners( <add>export function accumulateEventHandleNonManagedNodeListeners( <ide> dispatchQueue: DispatchQueue, <ide> event: ReactSyntheticEvent, <ide> currentTarget: EventTarget, <ide> inCapturePhase: boolean, <ide> ): void { <del> const phase: DispatchQueueItemPhase = []; <add> const listeners: Array<DispatchListener> = []; <ide> <ide> const eventListeners = getEventHandlerListeners(currentTarget); <ide> if (eventListeners !== null) { <ide> export function accumulateEventHandleTargetListeners( <ide> <ide> for (let i = 0; i < listenersArr.length; i++) { <ide> const listener = listenersArr[i]; <del> const {callback, capture, type} = listener; <add> const {callback, capture: isCapturePhaseListener, type} = listener; <ide> if (type === targetType) { <del> if (inCapturePhase 
&& capture) { <del> phase.push( <del> createDispatchQueueItemPhaseEntry(null, callback, currentTarget), <del> ); <del> } else if (!inCapturePhase && !capture) { <del> phase.push( <del> createDispatchQueueItemPhaseEntry(null, callback, currentTarget), <del> ); <add> if (inCapturePhase && isCapturePhaseListener) { <add> listeners.push(createDispatchListener(null, callback, currentTarget)); <add> } else if (!inCapturePhase && !isCapturePhaseListener) { <add> listeners.push(createDispatchListener(null, callback, currentTarget)); <ide> } <ide> } <ide> } <ide> } <del> if (phase.length !== 0) { <del> dispatchQueue.push(createDispatchQueueItem(event, phase)); <add> if (listeners.length !== 0) { <add> dispatchQueue.push(createDispatchEntry(event, listeners)); <ide> } <ide> } <ide> <ide><path>packages/react-dom/src/events/EventSystemFlags.js <ide> export type EventSystemFlags = number; <ide> <ide> export const PLUGIN_EVENT_SYSTEM = 1; <ide> export const RESPONDER_EVENT_SYSTEM = 1 << 1; <del>export const IS_TARGET_PHASE_ONLY = 1 << 2; <add>export const IS_EVENT_HANDLE_NON_MANAGED_NODE = 1 << 2; <ide> export const IS_CAPTURE_PHASE = 1 << 3; <ide> export const IS_PASSIVE = 1 << 4; <ide> export const PASSIVE_NOT_SUPPORTED = 1 << 5; <ide><path>packages/react-dom/src/events/PluginModuleType.js <ide> * @flow <ide> */ <ide> <del>import type {Fiber} from 'react-reconciler/src/ReactInternalTypes'; <del>import type {ReactSyntheticEvent} from './ReactSyntheticEventType'; <del> <ide> export type AnyNativeEvent = Event | KeyboardEvent | MouseEvent | TouchEvent; <ide> <ide> export type PluginName = string; <ide> <ide> export type EventSystemFlags = number; <del> <del>export type DispatchQueueItemPhaseEntry = {| <del> instance: null | Fiber, <del> listener: Function, <del> currentTarget: EventTarget, <del>|}; <del> <del>export type DispatchQueueItemPhase = Array<DispatchQueueItemPhaseEntry>; <del> <del>export type DispatchQueueItem = {| <del> event: ReactSyntheticEvent, <del> phase: 
DispatchQueueItemPhase, <del>|}; <del> <del>export type DispatchQueue = Array<DispatchQueueItem>; <ide><path>packages/react-dom/src/events/ReactDOMEventReplaying.js <ide> import { <ide> } from './DOMTopLevelEventTypes'; <ide> import {IS_REPLAYED, PLUGIN_EVENT_SYSTEM} from './EventSystemFlags'; <ide> import { <del> listenToTopLevelEvent, <add> listenToNativeEvent, <ide> capturePhaseEvents, <ide> } from './DOMModernPluginEventSystem'; <ide> import {addResponderEventSystemEvent} from './DeprecatedDOMEventResponderSystem'; <ide> function trapReplayableEventForContainer( <ide> listenerMap: ElementListenerMap, <ide> ) { <ide> const capture = capturePhaseEvents.has(topLevelType); <del> listenToTopLevelEvent( <add> listenToNativeEvent( <ide> topLevelType, <ide> ((container: any): Element), <ide> listenerMap, <ide><path>packages/react-dom/src/events/plugins/ModernChangeEventPlugin.js <ide> */ <ide> import type {AnyNativeEvent} from '../PluginModuleType'; <ide> import type {TopLevelType} from '../TopLevelEventTypes'; <del>import type {DispatchQueue} from '../PluginModuleType'; <add>import type {DispatchQueue} from '../DOMModernPluginEventSystem'; <ide> import type {EventSystemFlags} from '../EventSystemFlags'; <ide> <ide> import {registerTwoPhaseEvent} from '../EventRegistry'; <ide><path>packages/react-dom/src/events/plugins/ModernSimpleEventPlugin.js <ide> <ide> import type {TopLevelType} from '../../events/TopLevelEventTypes'; <ide> import type {Fiber} from 'react-reconciler/src/ReactInternalTypes'; <del>import type { <del> AnyNativeEvent, <del> DispatchQueue, <del>} from '../../events/PluginModuleType'; <add>import type {AnyNativeEvent} from '../../events/PluginModuleType'; <add>import type {DispatchQueue} from '../DOMModernPluginEventSystem'; <ide> import type {EventSystemFlags} from '../EventSystemFlags'; <ide> <ide> import SyntheticEvent from '../../events/SyntheticEvent'; <ide> import { <ide> } from '../DOMEventProperties'; <ide> import { <ide> 
accumulateSinglePhaseListeners, <del> accumulateEventHandleTargetListeners, <add> accumulateEventHandleNonManagedNodeListeners, <ide> } from '../DOMModernPluginEventSystem'; <del>import {IS_TARGET_PHASE_ONLY} from '../EventSystemFlags'; <add>import {IS_EVENT_HANDLE_NON_MANAGED_NODE} from '../EventSystemFlags'; <ide> import SyntheticAnimationEvent from '../SyntheticAnimationEvent'; <ide> import SyntheticClipboardEvent from '../SyntheticClipboardEvent'; <ide> import SyntheticFocusEvent from '../SyntheticFocusEvent'; <ide> function extractEvents( <ide> nativeEvent: AnyNativeEvent, <ide> nativeEventTarget: null | EventTarget, <ide> eventSystemFlags: EventSystemFlags, <del> targetContainer: null | EventTarget, <add> targetContainer: EventTarget, <ide> ): void { <ide> const reactName = topLevelEventsToReactNames.get(topLevelType); <ide> if (reactName === undefined) { <ide> function extractEvents( <ide> const inCapturePhase = (eventSystemFlags & IS_CAPTURE_PHASE) !== 0; <ide> if ( <ide> enableCreateEventHandleAPI && <del> eventSystemFlags !== undefined && <del> eventSystemFlags & IS_TARGET_PHASE_ONLY && <del> targetContainer != null <add> eventSystemFlags & IS_EVENT_HANDLE_NON_MANAGED_NODE <ide> ) { <del> accumulateEventHandleTargetListeners( <add> accumulateEventHandleNonManagedNodeListeners( <ide> dispatchQueue, <ide> event, <ide> targetContainer, <ide><path>packages/react-native-renderer/src/legacy-events/EventPluginRegistry.js <ide> import type { <ide> AnyNativeEvent, <ide> PluginName, <ide> LegacyPluginModule, <del> ModernPluginModule, <ide> } from './PluginModuleType'; <ide> <ide> import invariant from 'shared/invariant'; <ide> <ide> type NamesToPlugins = { <del> [key: PluginName]: <del> | LegacyPluginModule<AnyNativeEvent> <del> | ModernPluginModule<AnyNativeEvent>, <add> [key: PluginName]: LegacyPluginModule<AnyNativeEvent>, <ide> ..., <ide> }; <ide> type EventPluginOrder = null | Array<PluginName>; <ide> function recomputePluginOrdering(): void { <ide> */ <ide> 
function publishEventForPlugin( <ide> dispatchConfig: DispatchConfig, <del> pluginModule: <del> | LegacyPluginModule<AnyNativeEvent> <del> | ModernPluginModule<AnyNativeEvent>, <add> pluginModule: LegacyPluginModule<AnyNativeEvent>, <ide> eventName: string, <ide> ): boolean { <ide> invariant( <ide> function publishEventForPlugin( <ide> */ <ide> function publishRegistrationName( <ide> registrationName: string, <del> pluginModule: <del> | LegacyPluginModule<AnyNativeEvent> <del> | ModernPluginModule<AnyNativeEvent>, <add> pluginModule: LegacyPluginModule<AnyNativeEvent>, <ide> eventName: string, <ide> ): void { <ide> invariant( <ide> export function injectEventPluginsByName( <ide> recomputePluginOrdering(); <ide> } <ide> } <del> <del>export function injectEventPlugins( <del> eventPlugins: [ModernPluginModule<AnyNativeEvent>], <del>): void { <del> for (let i = 0; i < eventPlugins.length; i++) { <del> const pluginModule = eventPlugins[i]; <del> plugins.push(pluginModule); <del> const publishedEvents = pluginModule.eventTypes; <del> for (const eventName in publishedEvents) { <del> publishEventForPlugin( <del> publishedEvents[eventName], <del> pluginModule, <del> eventName, <del> ); <del> } <del> } <del>} <ide><path>packages/react-native-renderer/src/legacy-events/PluginModuleType.js <ide> export type LegacyPluginModule<NativeEvent> = { <ide> ) => ?ReactSyntheticEvent, <ide> tapMoveThreshold?: number, <ide> }; <del> <del>export type DispatchQueueItemPhaseEntry = {| <del> instance: null | Fiber, <del> listener: Function, <del> currentTarget: EventTarget, <del>|}; <del> <del>export type DispatchQueueItemPhase = Array<DispatchQueueItemPhaseEntry>; <del> <del>export type DispatchQueueItem = {| <del> event: ReactSyntheticEvent, <del> phase: DispatchQueueItemPhase, <del>|}; <del> <del>export type DispatchQueue = Array<DispatchQueueItem>; <del> <del>export type ModernPluginModule<NativeEvent> = { <del> eventTypes: EventTypes, <del> extractEvents: ( <del> dispatchQueue: 
DispatchQueue, <del> topLevelType: TopLevelType, <del> targetInst: null | Fiber, <del> nativeTarget: NativeEvent, <del> nativeEventTarget: null | EventTarget, <del> eventSystemFlags: number, <del> container: null | EventTarget, <del> ) => void, <del>};
11
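The React DOM patch above accumulates event listeners for a dispatch queue, separating capture-phase from bubble-phase listeners so capture fires outermost-first and bubble innermost-first. A minimal Python sketch of that two-phase accumulation; the dict-based node structure here is a hypothetical stand-in, not React's actual fiber API:

```python
def accumulate_two_phase_listeners(target):
    """Walk from target up to the root, collecting listeners for both phases.

    Each node is assumed to be a dict with "parent", "capture" and "bubble"
    listener lists -- an illustrative stand-in for React's fiber structure.
    """
    capture, bubble = [], []
    node = target
    while node is not None:
        # prepend capture listeners so the outermost node ends up first
        for listener in node.get("capture", []):
            capture.insert(0, listener)
        # append bubble listeners so the innermost node stays first
        for listener in node.get("bubble", []):
            bubble.append(listener)
        node = node.get("parent")
    # dispatch order: capture phase (root -> target), then bubble (target -> root)
    return capture + bubble


root = {"parent": None, "capture": ["root-capture"], "bubble": ["root-bubble"]}
child = {"parent": root, "capture": ["child-capture"], "bubble": ["child-bubble"]}
order = accumulate_two_phase_listeners(child)
# order: ["root-capture", "child-capture", "child-bubble", "root-bubble"]
```

This mirrors the standard DOM dispatch order the patch's `inCapturePhase` / `isCapturePhaseListener` checks implement, without modeling the rest of the plugin system.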
Text
Text
add v3.28.9 to changelog
6c259fcafa2de6fae3810a6761d04a847f5bd49a
<ide><path>CHANGELOG.md <ide># Ember Changelog <add> <add>## v3.28.9 (April 19, 2022) <add> <add>- [#20028](https://github.com/emberjs/ember.js/pull/20028) Fix a memory leak in the Router Service class <ide> <ide> ### v4.4.0-beta.1 (March 24, 2022) <ide>
1
Go
Go
add unit tests to cli/command/formatter/stats.go
51fb35f0b0aa9f9d2fcc00dd08cafc588ee864d2
<ide><path>cli/command/formatter/stats.go <ide> func (c *containerStatsContext) MemUsage() string { <ide> func (c *containerStatsContext) MemPerc() string { <ide> header := memPercHeader <ide> c.AddHeader(header) <del> if c.s.IsInvalid { <add> if c.s.IsInvalid || c.s.OSType == winOSType { <ide> return fmt.Sprintf("--") <ide> } <ide> return fmt.Sprintf("%.2f%%", c.s.MemoryPercentage) <ide><path>cli/command/formatter/stats_test.go <add>package formatter <add> <add>import ( <add> "bytes" <add> "testing" <add> <add> "github.com/docker/docker/pkg/stringid" <add> "github.com/docker/docker/pkg/testutil/assert" <add>) <add> <add>func TestContainerStatsContext(t *testing.T) { <add> containerID := stringid.GenerateRandomID() <add> <add> var ctx containerStatsContext <add> tt := []struct { <add> stats StatsEntry <add> expValue string <add> expHeader string <add> call func() string <add> }{ <add> {StatsEntry{Name: containerID}, containerID, containerHeader, ctx.Container}, <add> {StatsEntry{CPUPercentage: 5.5}, "5.50%", cpuPercHeader, ctx.CPUPerc}, <add> {StatsEntry{CPUPercentage: 5.5, IsInvalid: true}, "--", cpuPercHeader, ctx.CPUPerc}, <add> {StatsEntry{NetworkRx: 0.31, NetworkTx: 12.3}, "0.31 B / 12.3 B", netIOHeader, ctx.NetIO}, <add> {StatsEntry{NetworkRx: 0.31, NetworkTx: 12.3, IsInvalid: true}, "--", netIOHeader, ctx.NetIO}, <add> {StatsEntry{BlockRead: 0.1, BlockWrite: 2.3}, "0.1 B / 2.3 B", blockIOHeader, ctx.BlockIO}, <add> {StatsEntry{BlockRead: 0.1, BlockWrite: 2.3, IsInvalid: true}, "--", blockIOHeader, ctx.BlockIO}, <add> {StatsEntry{MemoryPercentage: 10.2}, "10.20%", memPercHeader, ctx.MemPerc}, <add> {StatsEntry{MemoryPercentage: 10.2, IsInvalid: true}, "--", memPercHeader, ctx.MemPerc}, <add> {StatsEntry{MemoryPercentage: 10.2, OSType: "windows"}, "--", memPercHeader, ctx.MemPerc}, <add> {StatsEntry{Memory: 24, MemoryLimit: 30}, "24 B / 30 B", memUseHeader, ctx.MemUsage}, <add> {StatsEntry{Memory: 24, MemoryLimit: 30, IsInvalid: true}, "-- / --", memUseHeader, 
ctx.MemUsage}, <add> {StatsEntry{Memory: 24, MemoryLimit: 30, OSType: "windows"}, "24 B", winMemUseHeader, ctx.MemUsage}, <add> {StatsEntry{PidsCurrent: 10}, "10", pidsHeader, ctx.PIDs}, <add> {StatsEntry{PidsCurrent: 10, IsInvalid: true}, "--", pidsHeader, ctx.PIDs}, <add> {StatsEntry{PidsCurrent: 10, OSType: "windows"}, "--", pidsHeader, ctx.PIDs}, <add> } <add> <add> for _, te := range tt { <add> ctx = containerStatsContext{s: te.stats} <add> if v := te.call(); v != te.expValue { <add> t.Fatalf("Expected %q, got %q", te.expValue, v) <add> } <add> <add> h := ctx.FullHeader() <add> if h != te.expHeader { <add> t.Fatalf("Expected %q, got %q", te.expHeader, h) <add> } <add> } <add>} <add> <add>func TestContainerStatsContextWrite(t *testing.T) { <add> tt := []struct { <add> context Context <add> expected string <add> }{ <add> { <add> Context{Format: "{{InvalidFunction}}"}, <add> `Template parsing error: template: :1: function "InvalidFunction" not defined <add>`, <add> }, <add> { <add> Context{Format: "{{nil}}"}, <add> `Template parsing error: template: :1:2: executing "" at <nil>: nil is not a command <add>`, <add> }, <add> { <add> Context{Format: "table {{.MemUsage}}"}, <add> `MEM USAGE / LIMIT <add>20 B / 20 B <add>-- / -- <add>`, <add> }, <add> { <add> Context{Format: "{{.Container}} {{.CPUPerc}}"}, <add> `container1 20.00% <add>container2 -- <add>`, <add> }, <add> } <add> <add> for _, te := range tt { <add> stats := []StatsEntry{ <add> { <add> Name: "container1", <add> CPUPercentage: 20, <add> Memory: 20, <add> MemoryLimit: 20, <add> MemoryPercentage: 20, <add> NetworkRx: 20, <add> NetworkTx: 20, <add> BlockRead: 20, <add> BlockWrite: 20, <add> PidsCurrent: 2, <add> IsInvalid: false, <add> OSType: "linux", <add> }, <add> { <add> Name: "container2", <add> CPUPercentage: 30, <add> Memory: 30, <add> MemoryLimit: 30, <add> MemoryPercentage: 30, <add> NetworkRx: 30, <add> NetworkTx: 30, <add> BlockRead: 30, <add> BlockWrite: 30, <add> PidsCurrent: 3, <add> IsInvalid: 
true, <add> OSType: "linux", <add> }, <add> } <add> var out bytes.Buffer <add> te.context.Output = &out <add> err := ContainerStatsWrite(te.context, stats) <add> if err != nil { <add> assert.Error(t, err, te.expected) <add> } else { <add> assert.Equal(t, out.String(), te.expected) <add> } <add> } <add>} <add> <add>func TestContainerStatsContextWriteWindows(t *testing.T) { <add> tt := []struct { <add> context Context <add> expected string <add> }{ <add> { <add> Context{Format: "table {{.MemUsage}}"}, <add> `PRIV WORKING SET <add>20 B <add>-- / -- <add>`, <add> }, <add> { <add> Context{Format: "{{.Container}} {{.CPUPerc}}"}, <add> `container1 20.00% <add>container2 -- <add>`, <add> }, <add> { <add> Context{Format: "{{.Container}} {{.MemPerc}} {{.PIDs}}"}, <add> `container1 -- -- <add>container2 -- -- <add>`, <add> }, <add> } <add> <add> for _, te := range tt { <add> stats := []StatsEntry{ <add> { <add> Name: "container1", <add> CPUPercentage: 20, <add> Memory: 20, <add> MemoryLimit: 20, <add> MemoryPercentage: 20, <add> NetworkRx: 20, <add> NetworkTx: 20, <add> BlockRead: 20, <add> BlockWrite: 20, <add> PidsCurrent: 2, <add> IsInvalid: false, <add> OSType: "windows", <add> }, <add> { <add> Name: "container2", <add> CPUPercentage: 30, <add> Memory: 30, <add> MemoryLimit: 30, <add> MemoryPercentage: 30, <add> NetworkRx: 30, <add> NetworkTx: 30, <add> BlockRead: 30, <add> BlockWrite: 30, <add> PidsCurrent: 3, <add> IsInvalid: true, <add> OSType: "windows", <add> }, <add> } <add> var out bytes.Buffer <add> te.context.Output = &out <add> err := ContainerStatsWrite(te.context, stats) <add> if err != nil { <add> assert.Error(t, err, te.expected) <add> } else { <add> assert.Equal(t, out.String(), te.expected) <add> } <add> } <add>} <add> <add>func TestContainerStatsContextWriteWithNoStats(t *testing.T) { <add> var out bytes.Buffer <add> <add> contexts := []struct { <add> context Context <add> expected string <add> }{ <add> { <add> Context{ <add> Format: "{{.Container}}", 
<add> Output: &out, <add> }, <add> "", <add> }, <add> { <add> Context{ <add> Format: "table {{.Container}}", <add> Output: &out, <add> }, <add> "CONTAINER\n", <add> }, <add> { <add> Context{ <add> Format: "table {{.Container}}\t{{.CPUPerc}}", <add> Output: &out, <add> }, <add> "CONTAINER CPU %\n", <add> }, <add> } <add> <add> for _, context := range contexts { <add> ContainerStatsWrite(context.context, []StatsEntry{}) <add> assert.Equal(t, context.expected, out.String()) <add> // Clean buffer <add> out.Reset() <add> } <add>}
2
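The Go tests added above are table-driven: a slice of cases pairs inputs with expected values and one loop exercises them all. The same pattern, applied to the patch's actual `MemPerc` change (Windows and invalid stats both render `--`), sketched in Python with illustrative stand-ins rather than Docker's API:

```python
def mem_perc(memory_percentage, is_invalid=False, os_type="linux"):
    # mirrors containerStatsContext.MemPerc after the fix:
    # Windows containers and invalid stats both show "--"
    if is_invalid or os_type == "windows":
        return "--"
    return f"{memory_percentage:.2f}%"


cases = [
    # (keyword arguments, expected rendering)
    ({"memory_percentage": 10.2}, "10.20%"),
    ({"memory_percentage": 10.2, "is_invalid": True}, "--"),
    ({"memory_percentage": 10.2, "os_type": "windows"}, "--"),
]

for kwargs, expected in cases:
    assert mem_perc(**kwargs) == expected, (kwargs, expected)
```

Each new behavior gets one row in the table, which is why the commit could cover the Windows case by adding a single `StatsEntry{..., OSType: "windows"}` entry.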
Java
Java
add beanfactorycontribution for bean registrations
ec6a19fc6b37ef03d2667100a2ee9de1488c902d
<ide><path>spring-beans/src/main/java/org/springframework/beans/factory/generator/BeanDefinitionGenerationException.java <add>/* <add> * Copyright 2002-2022 the original author or authors. <add> * <add> * Licensed under the Apache License, Version 2.0 (the "License"); <add> * you may not use this file except in compliance with the License. <add> * You may obtain a copy of the License at <add> * <add> * https://www.apache.org/licenses/LICENSE-2.0 <add> * <add> * Unless required by applicable law or agreed to in writing, software <add> * distributed under the License is distributed on an "AS IS" BASIS, <add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. <add> * See the License for the specific language governing permissions and <add> * limitations under the License. <add> */ <add> <add>package org.springframework.beans.factory.generator; <add> <add>import org.springframework.beans.factory.config.BeanDefinition; <add> <add>/** <add> * Thrown when a bean definition could not be generated. <add> * <add> * @author Stephane Nicoll <add> * @since 6.0 <add> */ <add>@SuppressWarnings("serial") <add>public class BeanDefinitionGenerationException extends RuntimeException { <add> <add> private final String beanName; <add> <add> private final BeanDefinition beanDefinition; <add> <add> public BeanDefinitionGenerationException(String beanName, BeanDefinition beanDefinition, String message, Throwable cause) { <add> super(message, cause); <add> this.beanName = beanName; <add> this.beanDefinition = beanDefinition; <add> } <add> <add> public BeanDefinitionGenerationException(String beanName, BeanDefinition beanDefinition, String message) { <add> super(message); <add> this.beanName = beanName; <add> this.beanDefinition = beanDefinition; <add> } <add> <add> /** <add> * Return the bean name that could not be generated. 
<add> * @return the bean name <add> */ <add> public String getBeanName() { <add> return this.beanName; <add> } <add> <add> /** <add> * Return the bean definition that could not be generated. <add> * @return the bean definition <add> */ <add> public BeanDefinition getBeanDefinition() { <add> return this.beanDefinition; <add> } <add> <add>} <ide><path>spring-beans/src/main/java/org/springframework/beans/factory/generator/BeanDefinitionsContribution.java <add>/* <add> * Copyright 2002-2022 the original author or authors. <add> * <add> * Licensed under the Apache License, Version 2.0 (the "License"); <add> * you may not use this file except in compliance with the License. <add> * You may obtain a copy of the License at <add> * <add> * https://www.apache.org/licenses/LICENSE-2.0 <add> * <add> * Unless required by applicable law or agreed to in writing, software <add> * distributed under the License is distributed on an "AS IS" BASIS, <add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. <add> * See the License for the specific language governing permissions and <add> * limitations under the License. <add> */ <add> <add>package org.springframework.beans.factory.generator; <add> <add>import java.util.ArrayList; <add>import java.util.HashMap; <add>import java.util.List; <add>import java.util.Map; <add>import java.util.function.Consumer; <add> <add>import org.springframework.beans.factory.support.DefaultListableBeanFactory; <add>import org.springframework.beans.factory.support.RootBeanDefinition; <add>import org.springframework.core.io.support.SpringFactoriesLoader; <add> <add>/** <add> * A {@link BeanFactoryContribution} that generates the bean definitions of a <add> * bean factory, using {@link BeanRegistrationContributionProvider} to use <add> * appropriate customizations if necessary. <add> * <add> * <p>{@link BeanRegistrationContributionProvider} can be ordered, with the default <add> * implementation always coming last. 
<add> * <add> * @author Stephane Nicoll <add> * @since 6.0 <add> * @see DefaultBeanRegistrationContributionProvider <add> */ <add>public class BeanDefinitionsContribution implements BeanFactoryContribution { <add> <add> private final DefaultListableBeanFactory beanFactory; <add> <add> private final List<BeanRegistrationContributionProvider> contributionProviders; <add> <add> private final Map<String, BeanFactoryContribution> contributions; <add> <add> BeanDefinitionsContribution(DefaultListableBeanFactory beanFactory, <add> List<BeanRegistrationContributionProvider> contributionProviders) { <add> this.beanFactory = beanFactory; <add> this.contributionProviders = contributionProviders; <add> this.contributions = new HashMap<>(); <add> } <add> <add> public BeanDefinitionsContribution(DefaultListableBeanFactory beanFactory) { <add> this(beanFactory, initializeProviders(beanFactory)); <add> } <add> <add> private static List<BeanRegistrationContributionProvider> initializeProviders(DefaultListableBeanFactory beanFactory) { <add> List<BeanRegistrationContributionProvider> providers = new ArrayList<>(SpringFactoriesLoader.loadFactories( <add> BeanRegistrationContributionProvider.class, beanFactory.getBeanClassLoader())); <add> providers.add(new DefaultBeanRegistrationContributionProvider(beanFactory)); <add> return providers; <add> } <add> <add> @Override <add> public void applyTo(BeanFactoryInitialization initialization) { <add> writeBeanDefinitions(initialization); <add> } <add> <add> private void writeBeanDefinitions(BeanFactoryInitialization initialization) { <add> for (String beanName : this.beanFactory.getBeanDefinitionNames()) { <add> handleMergedBeanDefinition(beanName, beanDefinition -> { <add> BeanFactoryContribution registrationContribution = getBeanRegistrationContribution( <add> beanName, beanDefinition); <add> registrationContribution.applyTo(initialization); <add> }); <add> } <add> } <add> <add> private BeanFactoryContribution 
getBeanRegistrationContribution( <add> String beanName, RootBeanDefinition beanDefinition) { <add> return this.contributions.computeIfAbsent(beanName, name -> { <add> for (BeanRegistrationContributionProvider provider : this.contributionProviders) { <add> BeanFactoryContribution contribution = provider.getContributionFor( <add> beanName, beanDefinition); <add> if (contribution != null) { <add> return contribution; <add> } <add> } <add> throw new BeanRegistrationContributionNotFoundException(beanName, beanDefinition); <add> }); <add> } <add> <add> private void handleMergedBeanDefinition(String beanName, Consumer<RootBeanDefinition> consumer) { <add> RootBeanDefinition beanDefinition = (RootBeanDefinition) this.beanFactory.getMergedBeanDefinition(beanName); <add> try { <add> consumer.accept(beanDefinition); <add> } <add> catch (BeanDefinitionGenerationException ex) { <add> throw ex; <add> } <add> catch (Exception ex) { <add> String msg = String.format("Failed to handle bean with name '%s' and type '%s'", <add> beanName, beanDefinition.getResolvableType()); <add> throw new BeanDefinitionGenerationException(beanName, beanDefinition, msg, ex); <add> } <add> } <add> <add>} <ide><path>spring-beans/src/main/java/org/springframework/beans/factory/generator/BeanRegistrationContributionNotFoundException.java <add>/* <add> * Copyright 2002-2022 the original author or authors. <add> * <add> * Licensed under the Apache License, Version 2.0 (the "License"); <add> * you may not use this file except in compliance with the License. <add> * You may obtain a copy of the License at <add> * <add> * https://www.apache.org/licenses/LICENSE-2.0 <add> * <add> * Unless required by applicable law or agreed to in writing, software <add> * distributed under the License is distributed on an "AS IS" BASIS, <add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
<add> * See the License for the specific language governing permissions and <add> * limitations under the License. <add> */ <add> <add>package org.springframework.beans.factory.generator; <add> <add>import org.springframework.beans.factory.config.BeanDefinition; <add> <add>/** <add> * Thrown when no suitable {@link BeanFactoryContribution} can be provided <add> * for the registration of a given bean definition. <add> * <add> * @author Stephane Nicoll <add> * @since 6.0 <add> */ <add>@SuppressWarnings("serial") <add>public class BeanRegistrationContributionNotFoundException extends BeanDefinitionGenerationException { <add> <add> public BeanRegistrationContributionNotFoundException(String beanName, BeanDefinition beanDefinition) { <add> super(beanName, beanDefinition, String.format( <add> "No suitable contribution found for bean with name '%s' and type '%s'", <add> beanName, beanDefinition.getResolvableType())); <add> } <add> <add>} <ide><path>spring-beans/src/test/java/org/springframework/beans/factory/generator/BeanDefinitionsContributionTests.java <add>/* <add> * Copyright 2002-2022 the original author or authors. <add> * <add> * Licensed under the Apache License, Version 2.0 (the "License"); <add> * you may not use this file except in compliance with the License. <add> * You may obtain a copy of the License at <add> * <add> * https://www.apache.org/licenses/LICENSE-2.0 <add> * <add> * Unless required by applicable law or agreed to in writing, software <add> * distributed under the License is distributed on an "AS IS" BASIS, <add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. <add> * See the License for the specific language governing permissions and <add> * limitations under the License. 
<add> */ <add> <add>package org.springframework.beans.factory.generator; <add> <add>import java.util.List; <add> <add>import org.junit.jupiter.api.Test; <add>import org.mockito.ArgumentMatchers; <add>import org.mockito.BDDMockito; <add>import org.mockito.Mockito; <add> <add>import org.springframework.aot.generator.DefaultGeneratedTypeContext; <add>import org.springframework.aot.generator.GeneratedType; <add>import org.springframework.aot.generator.GeneratedTypeContext; <add>import org.springframework.beans.factory.support.BeanDefinitionBuilder; <add>import org.springframework.beans.factory.support.DefaultListableBeanFactory; <add>import org.springframework.beans.factory.support.RootBeanDefinition; <add>import org.springframework.javapoet.ClassName; <add>import org.springframework.javapoet.support.CodeSnippet; <add> <add>import static org.assertj.core.api.Assertions.assertThat; <add>import static org.assertj.core.api.Assertions.assertThatThrownBy; <add> <add>/** <add> * Tests for {@link BeanDefinitionsContribution}. 
<add> * <add> * @author Stephane Nicoll <add> */ <add>class BeanDefinitionsContributionTests { <add> <add> @Test <add> void contributeThrowsContributionNotFoundIfNoContributionIsAvailable() { <add> DefaultListableBeanFactory beanFactory = new DefaultListableBeanFactory(); <add> beanFactory.registerBeanDefinition("test", new RootBeanDefinition()); <add> BeanDefinitionsContribution contribution = new BeanDefinitionsContribution(beanFactory, <add> List.of(Mockito.mock(BeanRegistrationContributionProvider.class))); <add> BeanFactoryInitialization initialization = new BeanFactoryInitialization(createGenerationContext()); <add> assertThatThrownBy(() -> contribution.applyTo(initialization)) <add> .isInstanceOfSatisfying(BeanRegistrationContributionNotFoundException.class, ex -> { <add> assertThat(ex.getBeanName()).isEqualTo("test"); <add> assertThat(ex.getBeanDefinition()).isSameAs(beanFactory.getMergedBeanDefinition("test")); <add> }); <add> } <add> <add> @Test <add> void contributeThrowsBeanRegistrationExceptionIfContributionThrowsException() { <add> DefaultListableBeanFactory beanFactory = new DefaultListableBeanFactory(); <add> beanFactory.registerBeanDefinition("test", new RootBeanDefinition()); <add> BeanFactoryContribution testContribution = Mockito.mock(BeanFactoryContribution.class); <add> IllegalStateException testException = new IllegalStateException(); <add> BDDMockito.willThrow(testException).given(testContribution).applyTo(ArgumentMatchers.any(BeanFactoryInitialization.class)); <add> BeanDefinitionsContribution contribution = new BeanDefinitionsContribution(beanFactory, <add> List.of(new TestBeanRegistrationContributionProvider("test", testContribution))); <add> BeanFactoryInitialization initialization = new BeanFactoryInitialization(createGenerationContext()); <add> assertThatThrownBy(() -> contribution.applyTo(initialization)) <add> .isInstanceOfSatisfying(BeanDefinitionGenerationException.class, ex -> { <add> 
assertThat(ex.getBeanName()).isEqualTo("test"); <add> assertThat(ex.getBeanDefinition()).isSameAs(beanFactory.getMergedBeanDefinition("test")); <add> assertThat(ex.getCause()).isEqualTo(testException); <add> }); <add> } <add> <add> @Test <add> void contributeGeneratesBeanDefinitionsInOrder() { <add> DefaultListableBeanFactory beanFactory = new DefaultListableBeanFactory(); <add> beanFactory.registerBeanDefinition("counter", BeanDefinitionBuilder <add> .rootBeanDefinition(Integer.class, "valueOf").addConstructorArgValue(42).getBeanDefinition()); <add> beanFactory.registerBeanDefinition("name", BeanDefinitionBuilder <add> .rootBeanDefinition(String.class).addConstructorArgValue("Hello").getBeanDefinition()); <add> CodeSnippet code = contribute(beanFactory, createGenerationContext()); <add> assertThat(code.getSnippet()).isEqualTo(""" <add> BeanDefinitionRegistrar.of("counter", Integer.class).withFactoryMethod(Integer.class, "valueOf", int.class) <add> .instanceSupplier((instanceContext) -> instanceContext.create(beanFactory, (attributes) -> Integer.valueOf(attributes.get(0)))).customize((bd) -> bd.getConstructorArgumentValues().addIndexedArgumentValue(0, 42)).register(beanFactory); <add> BeanDefinitionRegistrar.of("name", String.class).withConstructor(String.class) <add> .instanceSupplier((instanceContext) -> instanceContext.create(beanFactory, (attributes) -> new String(attributes.get(0, String.class)))).customize((bd) -> bd.getConstructorArgumentValues().addIndexedArgumentValue(0, "Hello")).register(beanFactory); <add> """); <add> } <add> <add> private CodeSnippet contribute(DefaultListableBeanFactory beanFactory, GeneratedTypeContext generationContext) { <add> BeanDefinitionsContribution contribution = new BeanDefinitionsContribution(beanFactory); <add> BeanFactoryInitialization initialization = new BeanFactoryInitialization(generationContext); <add> contribution.applyTo(initialization); <add> return CodeSnippet.of(initialization.toCodeBlock()); <add> } <add> <add> 
private GeneratedTypeContext createGenerationContext() { <add> return new DefaultGeneratedTypeContext("com.example", packageName -> <add> GeneratedType.of(ClassName.get(packageName, "Test"))); <add> } <add> <add> static class TestBeanRegistrationContributionProvider implements BeanRegistrationContributionProvider { <add> <add> private final String beanName; <add> <add> private final BeanFactoryContribution contribution; <add> <add> public TestBeanRegistrationContributionProvider(String beanName, BeanFactoryContribution contribution) { <add> this.beanName = beanName; <add> this.contribution = contribution; <add> } <add> <add> @Override <add> public BeanFactoryContribution getContributionFor(String beanName, RootBeanDefinition beanDefinition) { <add> return (beanName.equals(this.beanName) ? this.contribution : null); <add> } <add> } <add> <add>}
4
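`BeanDefinitionsContribution` above resolves a contribution per bean by asking an ordered list of providers, caching the first non-null answer via `computeIfAbsent`, and failing loudly when no provider matches. A hedged Python sketch of that chain-of-responsibility lookup; the provider and bean names are illustrative, not Spring's API:

```python
def make_resolver(providers):
    """Return a cached first-match lookup over an ordered provider list."""
    cache = {}

    def resolve(bean_name, bean_definition):
        # mirrors contributions.computeIfAbsent: first provider wins, result cached
        if bean_name not in cache:
            for provider in providers:
                contribution = provider(bean_name, bean_definition)
                if contribution is not None:
                    cache[bean_name] = contribution
                    break
            else:
                # analogous to BeanRegistrationContributionNotFoundException
                raise LookupError(f"no contribution for bean {bean_name!r}")
        return cache[bean_name]

    return resolve


# specialized provider answers only for one bean; default always answers,
# like DefaultBeanRegistrationContributionProvider appended last
special_provider = lambda name, bd: "special" if name == "counter" else None
default_provider = lambda name, bd: f"default:{name}"

resolve = make_resolver([special_provider, default_provider])
```

Ordering matters: because the default provider is appended last, it only handles beans no specialized provider claimed, which is the behavior `initializeProviders` sets up.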
Mixed
Python
add solution for project euler problem 67
11a15cc5842bb44a81bc8ee56af8f25d92a74287
<ide><path>DIRECTORY.md <ide> * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_065/sol1.py) <ide> * Problem 067 <ide> * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_067/sol1.py) <add> * [Sol2](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_067/sol2.py) <ide> * Problem 069 <ide> * [Sol1](https://github.com/TheAlgorithms/Python/blob/master/project_euler/problem_069/sol1.py) <ide> * Problem 070 <ide><path>project_euler/problem_067/sol2.py <add>""" <add>Problem Statement: <add>By starting at the top of the triangle below and moving to adjacent numbers on <add>the row below, the maximum total from top to bottom is 23. <add>3 <add>7 4 <add>2 4 6 <add>8 5 9 3 <add>That is, 3 + 7 + 4 + 9 = 23. <add>Find the maximum total from top to bottom in triangle.txt (right click and <add>'Save Link/Target As...'), a 15K text file containing a triangle with <add>one-hundred rows. <add>""" <add>import os <add> <add> <add>def solution() -> int: <add> """ <add> Finds the maximum total in a triangle as described by the problem statement <add> above. <add> >>> solution() <add> 7273 <add> """ <add> script_dir = os.path.dirname(os.path.realpath(__file__)) <add> triangle_path = os.path.join(script_dir, "triangle.txt") <add> <add> with open(triangle_path) as in_file: <add> triangle = [[int(i) for i in line.split()] for line in in_file] <add> <add> while len(triangle) != 1: <add> last_row = triangle.pop() <add> curr_row = triangle[-1] <add> for j in range(len(last_row) - 1): <add> curr_row[j] += max(last_row[j], last_row[j + 1]) <add> return triangle[0][0] <add> <add> <add>if __name__ == "__main__": <add> print(solution())
2
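The `sol2.py` patch above solves Project Euler 67 with a bottom-up dynamic program: pop the last row and fold it into the row above by taking the larger of the two adjacent sums. A self-contained restatement of that loop, run on the 4-row example from the docstring, where the maximum path 3 + 7 + 4 + 9 gives 23:

```python
def max_triangle_total(triangle):
    """Bottom-up DP over a triangle of ints, as in the sol2.py patch."""
    rows = [list(row) for row in triangle]  # copy so the caller's input is untouched
    while len(rows) > 1:
        last_row = rows.pop()
        curr_row = rows[-1]
        # each cell absorbs the better of its two children below
        for j in range(len(last_row) - 1):
            curr_row[j] += max(last_row[j], last_row[j + 1])
    return rows[0][0]


example = [[3], [7, 4], [2, 4, 6], [8, 5, 9, 3]]
print(max_triangle_total(example))  # 23, matching 3 + 7 + 4 + 9
```

Each pass shrinks the triangle by one row, so the whole 100-row input from `triangle.txt` is handled in O(n²) cell updates, versus the 2⁹⁹ paths a brute-force walk would enumerate.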
Javascript
Javascript
improve hmr plugin with multiple runtimes
6b4ce6e0118d0f18f8e554f21f3ffbfd965118ad
<ide><path>lib/HotModuleReplacementPlugin.js <ide> const { <ide> toConstantDependency <ide> } = require("./javascript/JavascriptParserHelpers"); <ide> const { find } = require("./util/SetHelpers"); <add>const TupleSet = require("./util/TupleSet"); <ide> const { compareModulesById } = require("./util/comparators"); <ide> <ide> /** @typedef {import("./Chunk")} Chunk */ <ide> class HotModuleReplacementPlugin { <ide> //#endregion <ide> <ide> let hotIndex = 0; <del> const fullHashModuleHashes = {}; <del> const moduleHashes = {}; <add> const fullHashChunkModuleHashes = {}; <add> const chunkModuleHashes = {}; <ide> <ide> compilation.hooks.record.tap( <ide> "HotModuleReplacementPlugin", <ide> class HotModuleReplacementPlugin { <ide> const chunkGraph = compilation.chunkGraph; <ide> records.hash = compilation.hash; <ide> records.hotIndex = hotIndex; <del> records.fullHashModuleHashes = fullHashModuleHashes; <del> records.moduleHashes = moduleHashes; <add> records.fullHashChunkModuleHashes = fullHashChunkModuleHashes; <add> records.chunkModuleHashes = chunkModuleHashes; <ide> records.chunkHashs = {}; <ide> for (const chunk of compilation.chunks) { <ide> records.chunkHashs[chunk.id] = chunk.hash; <ide> class HotModuleReplacementPlugin { <ide> } <ide> } <ide> ); <del> /** @type {Set<Module>} */ <del> const updatedModules = new Set(); <del> /** @type {Set<Module>} */ <del> const lazyHashedModules = new Set(); <add> /** @type {TupleSet<[Module, Chunk]>} */ <add> const updatedModules = new TupleSet(); <add> /** @type {TupleSet<[Module, Chunk]>} */ <add> const lazyHashedModules = new TupleSet(); <ide> compilation.hooks.fullHash.tap("HotModuleReplacementPlugin", hash => { <ide> const chunkGraph = compilation.chunkGraph; <ide> const records = compilation.records; <ide> for (const chunk of compilation.chunks) { <del> const modules = chunkGraph.getChunkFullHashModulesIterable(chunk); <del> if (modules !== undefined) { <del> for (const module of modules) { <del> 
lazyHashedModules.add(module); <add> /** @type {Set<Module>} */ <add> const lazyHashedModulesInThisChunk = new Set(); <add> const fullHashModules = chunkGraph.getChunkFullHashModulesIterable( <add> chunk <add> ); <add> if (fullHashModules !== undefined) { <add> for (const module of fullHashModules) { <add> lazyHashedModules.add(module, chunk); <add> lazyHashedModulesInThisChunk.add(module); <ide> } <ide> } <del> } <del> if (records.moduleHashes && records.fullHashModuleHashes) { <del> for (const module of compilation.modules) { <del> const identifier = module.identifier(); <del> const hash = chunkGraph.getModuleHash(module, undefined); <del> if (lazyHashedModules.has(module)) { <del> if (records.fullHashModuleHashes[identifier] !== hash) { <del> updatedModules.add(module); <add> const modules = chunkGraph.getChunkModulesIterable(chunk); <add> if (modules !== undefined) { <add> if ( <add> records.chunkModuleHashes && <add> records.fullHashChunkModuleHashes <add> ) { <add> for (const module of modules) { <add> const key = `${chunk.id}|${module.identifier()}`; <add> const hash = chunkGraph.getModuleHash(module, chunk.runtime); <add> if (lazyHashedModulesInThisChunk.has(module)) { <add> if (records.fullHashChunkModuleHashes[key] !== hash) { <add> updatedModules.add(module, chunk); <add> } <add> fullHashChunkModuleHashes[key] = hash; <add> } else { <add> if (records.chunkModuleHashes[key] !== hash) { <add> updatedModules.add(module, chunk); <add> } <add> chunkModuleHashes[key] = hash; <add> } <ide> } <del> fullHashModuleHashes[identifier] = hash; <ide> } else { <del> if (records.moduleHashes[identifier] !== hash) { <del> updatedModules.add(module); <add> for (const module of modules) { <add> const key = `${chunk.id}|${module.identifier()}`; <add> const hash = chunkGraph.getModuleHash(module, chunk.runtime); <add> if (lazyHashedModulesInThisChunk.has(module)) { <add> fullHashChunkModuleHashes[key] = hash; <add> } else { <add> chunkModuleHashes[key] = hash; <add> } <ide> 
} <del> moduleHashes[identifier] = hash; <del> } <del> } <del> } else { <del> for (const module of compilation.modules) { <del> const identifier = module.identifier(); <del> const hash = chunkGraph.getModuleHash(module, undefined); <del> if (lazyHashedModules.has(module)) { <del> fullHashModuleHashes[identifier] = hash; <del> } else { <del> moduleHashes[identifier] = hash; <ide> } <ide> } <ide> } <add> <ide> hotIndex = records.hotIndex || 0; <ide> if (updatedModules.size > 0) hotIndex++; <ide> <ide> class HotModuleReplacementPlugin { <ide> const records = compilation.records; <ide> if (records.hash === compilation.hash) return; <ide> if ( <del> !records.moduleHashes || <add> !records.chunkModuleHashes || <ide> !records.chunkHashs || <ide> !records.chunkModuleIds <ide> ) { <ide> return; <ide> } <del> for (const module of lazyHashedModules) { <del> const identifier = module.identifier(); <del> const hash = chunkGraph.getModuleHash(module, undefined); <del> if (records.moduleHashes[identifier] !== hash) { <del> updatedModules.add(module); <add> for (const [module, chunk] of lazyHashedModules) { <add> const key = `${chunk.id}|${module.identifier()}`; <add> const hash = chunkGraph.getModuleHash(module, chunk.runtime); <add> if (records.chunkModuleHashes[key] !== hash) { <add> updatedModules.add(module, chunk); <ide> } <del> moduleHashes[identifier] = hash; <add> chunkModuleHashes[key] = hash; <ide> } <ide> const hotUpdateMainContent = { <ide> c: [], <ide> class HotModuleReplacementPlugin { <ide> const chunkId = currentChunk.id; <ide> const newModules = chunkGraph <ide> .getChunkModules(currentChunk) <del> .filter(module => updatedModules.has(module)); <add> .filter(module => updatedModules.has(module, currentChunk)); <ide> const newRuntimeModules = Array.from( <ide> chunkGraph.getChunkRuntimeModulesIterable(currentChunk) <del> ).filter(module => updatedModules.has(module)); <add> ).filter(module => updatedModules.has(module, currentChunk)); <ide> const fullHashModules = 
chunkGraph.getChunkFullHashModulesIterable( <ide> currentChunk <ide> ); <ide> const newFullHashModules = <ide> fullHashModules && <ide> Array.from(fullHashModules).filter(module => <del> updatedModules.has(module) <add> updatedModules.has(module, currentChunk) <ide> ); <ide> /** @type {Set<number|string>} */ <ide> const allModules = new Set(); <ide> class HotModuleReplacementPlugin { <ide> const hotUpdateChunk = new HotUpdateChunk(); <ide> ChunkGraph.setChunkGraphForChunk(hotUpdateChunk, chunkGraph); <ide> hotUpdateChunk.id = chunkId; <add> hotUpdateChunk.runtime = currentChunk.runtime; <ide> chunkGraph.attachModules(hotUpdateChunk, newModules); <ide> chunkGraph.attachRuntimeModules( <ide> hotUpdateChunk, <ide><path>lib/hmr/JavascriptHotModuleReplacement.runtime.js <ide> module.exports = function () { <ide> } <ide> <ide> // call accept handlers <del> var error = null; <ide> for (var outdatedModuleId in outdatedDependencies) { <ide> if ($hasOwnProperty$(outdatedDependencies, outdatedModuleId)) { <ide> var module = $moduleCache$[outdatedModuleId]; <ide> module.exports = function () { <ide> }); <ide> } <ide> if (!options.ignoreErrored) { <del> if (!error) error = err; <add> reportError(err); <ide> } <ide> } <ide> } <ide><path>lib/util/TupleSet.js <add>/* <add> MIT License http://www.opensource.org/licenses/mit-license.php <add> Author Tobias Koppers @sokra <add>*/ <add> <add>"use strict"; <add> <add>/** <add> * @template {any[]} T <add> */ <add>class TupleSet { <add> constructor() { <add> this._map = new Map(); <add> this.size = 0; <add> } <add> <add> /** <add> * @param {T} args tuple <add> * @returns {void} <add> */ <add> add(...args) { <add> let map = this._map; <add> for (let i = 0; i < args.length - 2; i++) { <add> const arg = args[i]; <add> const innerMap = map.get(arg); <add> if (innerMap === undefined) { <add> map.set(arg, (map = new Map())); <add> } else { <add> map = innerMap; <add> } <add> } <add> <add> const beforeLast = args[args.length - 2]; <add> let set 
= map.get(beforeLast); <add> if (set === undefined) { <add> map.set(beforeLast, (set = new Set())); <add> } <add> <add> const last = args[args.length - 1]; <add> this.size -= set.size; <add> set.add(last); <add> this.size += set.size; <add> } <add> <add> /** <add> * @param {T} args tuple <add> * @returns {boolean} true, if the tuple is in the Set <add> */ <add> has(...args) { <add> /** @type {Map<any, any>} */ <add> let map = this._map; <add> for (let i = 0; i < args.length - 2; i++) { <add> const arg = args[i]; <add> map = map.get(arg); <add> if (map === undefined) { <add> return false; <add> } <add> } <add> <add> const beforeLast = args[args.length - 2]; <add> let set = map.get(beforeLast); <add> if (set === undefined) { <add> return false; <add> } <add> <add> const last = args[args.length - 1]; <add> return set.has(last); <add> } <add> <add> [Symbol.iterator]() { <add> const iteratorStack = []; <add> const tuple = []; <add> let currentSetIterator = undefined; <add> <add> const next = it => { <add> const result = it.next(); <add> if (result.done) { <add> if (iteratorStack.length === 0) return false; <add> tuple.pop(); <add> return next(iteratorStack.pop()); <add> } <add> const [key, value] = result.value; <add> iteratorStack.push(it); <add> tuple.push(key); <add> if (value instanceof Set) { <add> currentSetIterator = value[Symbol.iterator](); <add> return true; <add> } else { <add> return next(value[Symbol.iterator]()); <add> } <add> }; <add> <add> next(this._map[Symbol.iterator]()); <add> <add> return { <add> next() { <add> while (currentSetIterator) { <add> const result = currentSetIterator.next(); <add> if (result.done) { <add> tuple.pop(); <add> if (!next(iteratorStack.pop())) { <add> currentSetIterator = undefined; <add> } <add> } else { <add> return { <add> done: false, <add> value: tuple.concat(result.value) <add> }; <add> } <add> } <add> return { done: true, value: undefined }; <add> } <add> }; <add> } <add>} <add> <add>module.exports = TupleSet;
3
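The `TupleSet` added in the patch above backs the new `(module, chunk)` bookkeeping in `HotModuleReplacementPlugin`: instead of a flat `Set<Module>`, updated modules are tracked per chunk via nested maps ending in a `Set`. The sketch below illustrates that core idea, specialized to 2-tuples for brevity; `PairSet` and the module/chunk names are illustrative, not part of webpack's API.

```javascript
// Minimal 2-tuple variant of the nested Map<any, Set<any>> structure
// that TupleSet generalizes to arbitrary tuple lengths.
class PairSet {
	constructor() {
		this._map = new Map();
		this.size = 0;
	}

	// Insert the pair (a, b); duplicates are ignored, size stays exact.
	add(a, b) {
		let set = this._map.get(a);
		if (set === undefined) this._map.set(a, (set = new Set()));
		this.size -= set.size;
		set.add(b);
		this.size += set.size;
	}

	// true iff the exact pair (a, b) was added.
	has(a, b) {
		const set = this._map.get(a);
		return set !== undefined && set.has(b);
	}

	// Yield pairs in insertion order (Map/Set iteration order is defined).
	*[Symbol.iterator]() {
		for (const [a, set] of this._map)
			for (const b of set) yield [a, b];
	}
}

const updatedModules = new PairSet();
updatedModules.add("module-a", "chunk-1");
updatedModules.add("module-a", "chunk-2");
updatedModules.add("module-a", "chunk-1"); // duplicate, no effect
console.log(updatedModules.size); // → 2
console.log(updatedModules.has("module-a", "chunk-1")); // → true
console.log(updatedModules.has("chunk-1", "module-a")); // → false (order matters)
```

This is why the plugin can now ask "was this module updated *in this chunk*?" — `updatedModules.has(module, currentChunk)` — rather than treating a module as globally updated.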
Javascript
Javascript
add padding to glb chunks
38de4ccca5ecc0df5779cbcee7a3c2f4e4f6cd64
<ide><path>examples/js/exporters/GLTFExporter.js <ide> THREE.GLTFExporter.prototype = { <ide> * @param {string} text <ide> * @return {ArrayBuffer} <ide> */ <del> function stringToArrayBuffer( text ) { <add> function stringToArrayBuffer( text, padded ) { <add> if ( padded ) { <add> <add> var pad = getPaddedBufferSize( text.length ) - text.length; <add> <add> for ( var i = 0; i < pad; i++ ) { <add> <add> text += ' '; <add> <add> } <add> <add> } <ide> <ide> if ( window.TextEncoder !== undefined ) { <ide> <ide> THREE.GLTFExporter.prototype = { <ide> <ide> } <ide> <add> /** <add> * Returns a buffer aligned to 4-byte boundary. <add> * <add> * @param {ArrayBuffer} arrayBuffer Buffer to pad <add> * @returns {ArrayBuffer} The same buffer if it's already aligned to 4-byte boundary or a new buffer <add> */ <add> function getPaddedArrayBuffer( arrayBuffer ) { <add> <add> var paddedLength = getPaddedBufferSize( arrayBuffer.byteLength ); <add> <add> if (paddedLength !== arrayBuffer.byteLength ) { <add> <add> var paddedBuffer = new ArrayBuffer( paddedLength ); <add> new Uint8Array( paddedBuffer ).set(new Uint8Array(arrayBuffer)); <add> return paddedBuffer; <add> <add> } <add> <add> return arrayBuffer; <add> <add> } <add> <ide> /** <ide> * Process a buffer to append to the default one. <ide> * @param {THREE.BufferAttribute} attribute Attribute to store <ide> THREE.GLTFExporter.prototype = { <ide> reader.onloadend = function () { <ide> <ide> // Binary chunk. <del> var binaryChunk = reader.result; <add> var binaryChunk = getPaddedArrayBuffer( reader.result ); <ide> var binaryChunkPrefix = new DataView( new ArrayBuffer( GLB_CHUNK_PREFIX_BYTES ) ); <ide> binaryChunkPrefix.setUint32( 0, binaryChunk.byteLength, true ); <ide> binaryChunkPrefix.setUint32( 4, GLB_CHUNK_TYPE_BIN, true ); <ide> <ide> // JSON chunk. <ide> delete outputJSON.buffers[ 0 ].uri; // Omitted URI indicates use of binary chunk. 
<del> var jsonChunk = stringToArrayBuffer( JSON.stringify( outputJSON ) ); <add> var jsonChunk = stringToArrayBuffer( JSON.stringify( outputJSON ), true ); <ide> var jsonChunkPrefix = new DataView( new ArrayBuffer( GLB_CHUNK_PREFIX_BYTES ) ); <ide> jsonChunkPrefix.setUint32( 0, jsonChunk.byteLength, true ); <ide> jsonChunkPrefix.setUint32( 4, GLB_CHUNK_TYPE_JSON, true );
1
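The GLTFExporter patch above pads both GLB chunks to 4-byte boundaries, as the glTF 2.0 binary container requires. `getPaddedBufferSize` is referenced but not shown in the diff; the sketch below assumes the standard round-up-to-a-multiple-of-4 definition, and mirrors the padding logic the patch adds.

```javascript
// Assumed implementation of the helper the patch relies on:
// round a byte length up to the next multiple of 4.
function getPaddedBufferSize(bufferSize) {
	return Math.ceil(bufferSize / 4) * 4;
}

// Same logic as the patch: return the buffer unchanged if already
// aligned, otherwise copy it into a zero-padded, aligned buffer.
function getPaddedArrayBuffer(arrayBuffer) {
	const paddedLength = getPaddedBufferSize(arrayBuffer.byteLength);
	if (paddedLength !== arrayBuffer.byteLength) {
		const paddedBuffer = new ArrayBuffer(paddedLength);
		new Uint8Array(paddedBuffer).set(new Uint8Array(arrayBuffer));
		return paddedBuffer;
	}
	return arrayBuffer;
}

console.log(getPaddedBufferSize(5)); // → 8
console.log(getPaddedArrayBuffer(new ArrayBuffer(6)).byteLength); // → 8
console.log(getPaddedArrayBuffer(new ArrayBuffer(8)).byteLength); // → 8
```

The JSON chunk is handled the same way in the patch, except it is padded with spaces (a valid JSON filler) rather than zero bytes, via the new `padded` flag on `stringToArrayBuffer`.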
PHP
PHP
modify release methods of the cache lock
dca45985ab1b2353615cef14d9d4d70ba7b1cd15
<ide><path>src/Illuminate/Cache/DynamoDbLock.php <ide> public function acquire() <ide> /** <ide> * Release the lock. <ide> * <del> * @return void <add> * @return bool <ide> */ <ide> public function release() <ide> { <ide> if ($this->isOwnedByCurrentProcess()) { <del> $this->dynamo->forget($this->name); <add> return $this->dynamo->forget($this->name); <ide> } <add> <add> return false; <ide> } <ide> <ide> /** <ide><path>src/Illuminate/Cache/Lock.php <ide> abstract public function acquire(); <ide> /** <ide> * Release the lock. <ide> * <del> * @return void <add> * @return bool <ide> */ <ide> abstract public function release(); <ide> <ide><path>src/Illuminate/Cache/MemcachedLock.php <ide> public function acquire() <ide> /** <ide> * Release the lock. <ide> * <del> * @return void <add> * @return bool <ide> */ <ide> public function release() <ide> { <ide> if ($this->isOwnedByCurrentProcess()) { <del> $this->memcached->delete($this->name); <add> return $this->memcached->delete($this->name); <ide> } <add> <add> return false; <ide> } <ide> <ide> /** <ide><path>src/Illuminate/Cache/RedisLock.php <ide> public function acquire() <ide> /** <ide> * Release the lock. <ide> * <del> * @return int <add> * @return bool <ide> */ <ide> public function release() <ide> { <del> return $this->redis->eval(LuaScripts::releaseLock(), 1, $this->name, $this->owner); <add> return (bool) $this->redis->eval(LuaScripts::releaseLock(), 1, $this->name, $this->owner); <ide> } <ide> <ide> /**
4
Text
Text
apply sentence case to release doc headers
dd731e26b8b65d6bd0a3eb0922b87b486c2ce68d
<ide><path>doc/guides/releases.md <del># Node.js Release Process <add># Node.js release process <ide> <ide> This document describes the technical aspects of the Node.js release process. <ide> The intended audience is those who have been authorized by the Node.js <ide> Technical Steering Committee (TSC) to create, promote, and sign <ide> official release builds for Node.js, hosted on <https://nodejs.org/>. <ide> <del>## Table of Contents <add>## Table of contents <ide> <ide> * [Who can make a release?](#who-can-make-a-release) <del> * [1. Jenkins Release Access](#1-jenkins-release-access) <del> * [2. <nodejs.org> Access](#2-nodejsorg-access) <del> * [3. A Publicly Listed GPG Key](#3-a-publicly-listed-gpg-key) <add> * [1. Jenkins release access](#1-jenkins-release-access) <add> * [2. <nodejs.org> access](#2-nodejsorg-access) <add> * [3. A publicly listed GPG key](#3-a-publicly-listed-gpg-key) <ide> * [How to create a release](#how-to-create-a-release) <ide> * [0. Pre-release steps](#0-pre-release-steps) <ide> * [1. Update the staging branch](#1-update-the-staging-branch) <ide> * [2. Create a new branch for the release](#2-create-a-new-branch-for-the-release) <ide> * [3. Update `src/node_version.h`](#3-update-srcnode_versionh) <del> * [4. Update the Changelog](#4-update-the-changelog) <del> * [5. Create Release Commit](#5-create-release-commit) <del> * [6. Propose Release on GitHub](#6-propose-release-on-github) <del> * [7. Ensure that the Release Branch is Stable](#7-ensure-that-the-release-branch-is-stable) <del> * [8. Produce a Nightly Build _(optional)_](#8-produce-a-nightly-build-optional) <del> * [9. Produce Release Builds](#9-produce-release-builds) <del> * [10. Test the Build](#10-test-the-build) <del> * [11. Tag and Sign the Release Commit](#11-tag-and-sign-the-release-commit) <del> * [12. Set Up For the Next Release](#12-set-up-for-the-next-release) <del> * [13. 
Cherry-pick the Release Commit to `master`](#13-cherry-pick-the-release-commit-to-master) <add> * [4. Update the changelog](#4-update-the-changelog) <add> * [5. Create release commit](#5-create-release-commit) <add> * [6. Propose release on GitHub](#6-propose-release-on-github) <add> * [7. Ensure that the release branch is stable](#7-ensure-that-the-release-branch-is-stable) <add> * [8. Produce a nightly build _(optional)_](#8-produce-a-nightly-build-optional) <add> * [9. Produce release builds](#9-produce-release-builds) <add> * [10. Test the build](#10-test-the-build) <add> * [11. Tag and sign the release commit](#11-tag-and-sign-the-release-commit) <add> * [12. Set up for the next release](#12-set-up-for-the-next-release) <add> * [13. Cherry-pick the release commit to `master`](#13-cherry-pick-the-release-commit-to-master) <ide> * [14. Push the release tag](#14-push-the-release-tag) <del> * [15. Promote and Sign the Release Builds](#15-promote-and-sign-the-release-builds) <del> * [16. Check the Release](#16-check-the-release) <del> * [17. Create a Blog Post](#17-create-a-blog-post) <add> * [15. Promote and sign the release builds](#15-promote-and-sign-the-release-builds) <add> * [16. Check the release](#16-check-the-release) <add> * [17. Create a blog post](#17-create-a-blog-post) <ide> * [18. Create the release on GitHub](#18-create-the-release-on-github) <ide> * [19. Cleanup](#19-cleanup) <ide> * [20. Announce](#20-announce) <ide> * [21. Celebrate](#21-celebrate) <del>* [LTS Releases](#lts-releases) <del>* [Major Releases](#major-releases) <add>* [LTS releases](#lts-releases) <add>* [Major releases](#major-releases) <ide> <ide> ## Who can make a release? <ide> <ide> Release authorization is given by the Node.js TSC. Once authorized, an <ide> individual must have the following: <ide> <del>### 1. Jenkins Release Access <add>### 1. 
Jenkins release access <ide> <ide> There are three relevant Jenkins jobs that should be used for a release flow: <ide> <ide> a manual step once they are ready (see below). <ide> The [Node.js build team](https://github.com/nodejs/build) is able to provide <ide> this access to individuals authorized by the TSC. <ide> <del>### 2. <nodejs.org> Access <add>### 2. <nodejs.org> access <ide> <ide> The _dist_ user on nodejs.org controls the assets available in <ide> <https://nodejs.org/download/>. <https://nodejs.org/dist/> is an alias for <ide> server as the _dist_ user. The <ide> [Node.js build team](https://github.com/nodejs/build) is able to provide this <ide> access to individuals authorized by the TSC. <ide> <del>### 3. A Publicly Listed GPG Key <add>### 3. A publicly-listed GPG key <ide> <ide> A `SHASUMS256.txt` file is produced for every promoted build, nightly, and <ide> releases. Additionally for releases, this file is signed by the individual <ide> It is current TSC policy to bump major version when ABI changes. If you <ide> see a need to bump `NODE_MODULE_VERSION` then you should consult the TSC. <ide> Commits may need to be reverted or a major version bump may need to happen. <ide> <del>### 4. Update the Changelog <add>### 4. Update the changelog <ide> <ide> #### Step 1: Collect the formatted list of changes <ide> <ide> must be assigned a number (e.g. `DEP0012`). This assignment should <ide> occur when the PR is landed, but a check will be made when the release build is <ide> run. <ide> <del>### 5. Create Release Commit <add>### 5. Create release commit <ide> <ide> The `CHANGELOG.md`, `doc/changelogs/CHANGELOG_Vx.md`, `src/node_version.h`, and <ide> `REPLACEME` changes should be the final commit that will be tagged for the <ide> Notable changes: <ide> * Copy the notable changes list here, reformatted for plain-text <ide> ``` <ide> <del>### 6. Propose Release on GitHub <add>### 6. 
Propose release on GitHub <ide> <ide> Push the release branch to `nodejs/node`, not to your own fork. This allows <ide> release branches to more easily be passed between members of the release team if <ide> good place to @-mention the relevant contributors. <ide> After opening the PR, update the release commit to include `PR-URL` metadata and <ide> force-push the proposal. <ide> <del>### 7. Ensure that the Release Branch is Stable <add>### 7. Ensure that the release branch is stable <ide> <ide> Run a **[`node-test-pull-request`](https://ci.nodejs.org/job/node-test-pull-request/)** <ide> test run to ensure that the build is stable and the HEAD commit is ready for <ide> purpose. Run it once with the base `vx.x` branch as a reference and with the <ide> proposal branch to check if new regressions could be introduced in the <ide> ecosystem. <ide> <del>### 8. Produce a Nightly Build _(optional)_ <add>### 8. Produce a nightly build _(optional)_ <ide> <ide> If there is a reason to produce a test release for the purpose of having others <ide> try out installers or specifics of builds, produce a nightly build using <ide> enter a proper length commit SHA, enter a date string, and select "nightly" for <ide> This is particularly recommended if there has been recent work relating to the <ide> macOS or Windows installers as they are not tested in any way by CI. <ide> <del>### 9. Produce Release Builds <add>### 9. Produce release builds <ide> <ide> Use **[iojs+release](https://ci-release.nodejs.org/job/iojs+release/)** to <ide> produce release artifacts. Enter the commit that you want to build from and <ide> can use the <ide> build in the release CI to re-run the build only for ARMv6. When launching the <ide> build make sure to use the same commit hash as for the original release. <ide> <del>### 10. Test the Build <add>### 10. Test the build <ide> <ide> Jenkins collects the artifacts from the builds, allowing you to download and <ide> install the new build. 
Make sure that the build appears correct. Check the <ide> version numbers, and perform some basic checks to confirm that all is well with <ide> the build before moving forward. <ide> <del>### 11. Tag and Sign the Release Commit <add>### 11. Tag and sign the release commit <ide> <ide> Once you have produced builds that you're happy with, create a new tag. By <ide> waiting until this stage to create tags, you can discard a proposed release if <ide> $ git secure-tag <vx.y.z> <commit-sha> -sm "YYYY-MM-DD Node.js vx.y.z (<release- <ide> The tag **must** be signed using the GPG key that's listed for you on the <ide> project README. <ide> <del>### 12. Set Up For the Next Release <add>### 12. Set up for the next release <ide> <ide> On release proposal branch, edit `src/node_version.h` again and: <ide> <ide> $ git rebase v1.x <ide> $ git push upstream v1.x-staging <ide> ``` <ide> <del>### 13. Cherry-pick the Release Commit to `master` <add>### 13. Cherry-pick the release commit to `master` <ide> <ide> ```console <ide> $ git checkout master <ide> $ git push <remote> <vx.y.z> <ide> *Note*: Please do not push the tag unless you are ready to complete the <ide> remainder of the release steps. <ide> <del>### 15. Promote and Sign the Release Builds <add>### 15. Promote and sign the release builds <ide> <ide> **The same individual who signed the release tag must be the one <ide> to promote the builds as the `SHASUMS256.txt` file needs to be signed with the <ide> be prompted to re-sign `SHASUMS256.txt`. <ide> **It is possible to only sign a release by running `./tools/release.sh -s <ide> vX.Y.Z`.** <ide> <del>### 16. Check the Release <add>### 16. Check the release <ide> <ide> Your release should be available at `https://nodejs.org/dist/vx.y.z/` and <ide> <https://nodejs.org/dist/latest/>. Check that the appropriate files are in <ide> have the right internal version strings. Check that the API docs are available <ide> at <https://nodejs.org/api/>. 
Check that the release catalog files are correct <ide> at <https://nodejs.org/dist/index.tab> and <https://nodejs.org/dist/index.json>. <ide> <del>### 17. Create a Blog Post <add>### 17. Create a blog post <ide> <ide> There is an automatic build that is kicked off when you promote new builds, so <ide> within a few minutes nodejs.org will be listing your new version as the latest <ide> _In whatever form you do this..._ <ide> <ide> ## LTS Releases <ide> <del>### Marking a Release Line as LTS <add>### Marking a release line as LTS <ide> <ide> To mark a release line as LTS, the following changes must be made to <ide> `src/node_version.h`: <ide> existing labels for that release line, such as `vN.x`. <ide> If the release is transitioning from Active LTS to Maintenance, the <ide> `backport-requested-vN.x` label must be deleted. <ide> <del>## Major Releases <add>## Major releases <ide> <ide> The process for cutting a new Node.js major release has a number of differences <ide> from cutting a minor or patch release. <ide> The release date for the next major release should be announced immediately <ide> following the current release (e.g. the release date for 13.0.0 should be <ide> announced immediately following the release of 12.0.0). <ide> <del>### Release Branch <add>### Release branch <ide> <ide> Approximately three months before a major release, new `vN.x` and <ide> `vN.x-staging` branches (where `N` indicates the major release) should be <ide> The label description can be copied from existing labels of previous releases. <ide> The label color must be the same for all new labels, but different from the <ide> labels of previous releases. <ide> <del>### Release Proposal <add>### Release proposal <ide> <ide> A draft release proposal should be created two months before the release. A <ide> separate `vN.x-proposal` branch should be created that tracks the `vN.x` <ide> Notify the `@nodejs/npm` team in the release proposal PR to inform them of the <ide> upcoming release. 
`npm` maintains a list of [supported versions](https://github.com/npm/cli/blob/latest/lib/utils/unsupported.js#L3) <ide> that will need updating to include the new major release. <ide> <del>### Test Releases and Release Candidates <add>### Test releases and release candidates <ide> <ide> Test builds should be generated from the `vN.x-proposal` branch starting at <ide> about 6 weeks before the release.
1
Ruby
Ruby
use original_paths over envs; reject nil path
02164a35dbc2320e9f4eb2e27ff952e0a157b6fa
<ide><path>Library/Homebrew/brew.rb <ide> args = Homebrew::CLI::Parser.new.parse(ARGV.dup.freeze, ignore_invalid_options: true) <ide> Context.current = args.context <ide> <del> path = PATH.new(ENV["PATH"]) <del> homebrew_path = PATH.new(ENV["HOMEBREW_PATH"]) <add> path = PATH.new(ENV.fetch("PATH")) <add> homebrew_path = PATH.new(ENV.fetch("HOMEBREW_PATH")) <ide> <ide> # Add shared wrappers. <ide> path.prepend(HOMEBREW_SHIMS_PATH/"shared") <ide><path>Library/Homebrew/cask/artifact/installer.rb <ide> def install_phase(command: nil, **_) <ide> executable_path, <ide> **args, <ide> env: { "PATH" => PATH.new( <del> HOMEBREW_PREFIX/"bin", HOMEBREW_PREFIX/"sbin", ENV["PATH"] <add> HOMEBREW_PREFIX/"bin", HOMEBREW_PREFIX/"sbin", ENV.fetch("PATH") <ide> ) }, <ide> ) <ide> end <ide><path>Library/Homebrew/caveats.rb <ide> def keg_only_text(skip_reason: false) <ide> <ide> s << " #{Utils::Shell.export_value("CPPFLAGS", "-I#{f.opt_include}")}\n" if f.include.directory? <ide> <del> if which("pkg-config", ENV["HOMEBREW_PATH"]) && <add> if which("pkg-config", ORIGINAL_PATHS) && <ide> ((f.lib/"pkgconfig").directory? || (f.share/"pkgconfig").directory?) <ide> s << <<~EOS <ide> <ide> def keg <ide> <ide> def function_completion_caveats(shell) <ide> return unless keg <del> return unless which(shell.to_s, ENV["HOMEBREW_PATH"]) <add> return unless which(shell.to_s, ORIGINAL_PATHS) <ide> <ide> completion_installed = keg.completion_installed?(shell) <ide> functions_installed = keg.functions_installed?(shell) <ide><path>Library/Homebrew/cleanup.rb <ide> def cleanup_lockfiles(*lockfiles) <ide> end <ide> <ide> def cleanup_portable_ruby <del> rubies = [which("ruby"), which("ruby", ENV["HOMEBREW_PATH"])].compact <add> rubies = [which("ruby"), which("ruby", ORIGINAL_PATHS)].compact <ide> system_ruby = Pathname.new("/usr/bin/ruby") <ide> rubies << system_ruby if system_ruby.exist? 
<ide> <ide><path>Library/Homebrew/cmd/log.rb <ide> def log <ide> <ide> # As this command is simplifying user-run commands then let's just use a <ide> # user path, too. <del> ENV["PATH"] = ENV["HOMEBREW_PATH"] <add> ENV["PATH"] = PATH.new(ORIGINAL_PATHS).to_s <ide> <ide> if args.no_named? <ide> git_log HOMEBREW_REPOSITORY, args: args <ide><path>Library/Homebrew/commands.rb <ide> def external_ruby_v2_cmd_path(cmd) <ide> <ide> # Ruby commands which are run by being `require`d. <ide> def external_ruby_cmd_path(cmd) <del> which("brew-#{cmd}.rb", PATH.new(ENV["PATH"]).append(Tap.cmd_directories)) <add> which("brew-#{cmd}.rb", PATH.new(ENV.fetch("PATH")).append(Tap.cmd_directories)) <ide> end <ide> <ide> def external_cmd_path(cmd) <del> which("brew-#{cmd}", PATH.new(ENV["PATH"]).append(Tap.cmd_directories)) <add> which("brew-#{cmd}", PATH.new(ENV.fetch("PATH")).append(Tap.cmd_directories)) <ide> end <ide> <ide> def path(cmd) <ide><path>Library/Homebrew/dev-cmd/bump-cask-pr.rb <ide> def bump_cask_pr <ide> <ide> # As this command is simplifying user-run commands then let's just use a <ide> # user path, too. <del> ENV["PATH"] = ENV["HOMEBREW_PATH"] <add> ENV["PATH"] = PATH.new(ORIGINAL_PATHS).to_s <ide> <ide> # Use the user's browser, too. <ide> ENV["BROWSER"] = Homebrew::EnvConfig.browser <ide><path>Library/Homebrew/dev-cmd/bump-formula-pr.rb <ide> def bump_formula_pr <ide> <ide> # As this command is simplifying user-run commands then let's just use a <ide> # user path, too. <del> ENV["PATH"] = ENV["HOMEBREW_PATH"] <add> ENV["PATH"] = PATH.new(ORIGINAL_PATHS).to_s <ide> <ide> # Use the user's browser, too. <ide> ENV["BROWSER"] = Homebrew::EnvConfig.browser <ide><path>Library/Homebrew/dev-cmd/bump-revision.rb <ide> def bump_revision <ide> <ide> # As this command is simplifying user-run commands then let's just use a <ide> # user path, too. 
<del> ENV["PATH"] = ENV["HOMEBREW_PATH"] <add> ENV["PATH"] = PATH.new(ORIGINAL_PATHS).to_s <ide> <ide> args.named.to_formulae.each do |formula| <ide> current_revision = formula.revision <ide><path>Library/Homebrew/dev-cmd/sh.rb <ide> def sh <ide> ENV.setup_build_environment <ide> if superenv?(args.env) <ide> # superenv stopped adding brew's bin but generally users will want it <del> ENV["PATH"] = PATH.new(ENV["PATH"]).insert(1, HOMEBREW_PREFIX/"bin") <add> ENV["PATH"] = PATH.new(ENV.fetch("PATH")).insert(1, HOMEBREW_PREFIX/"bin") <ide> end <ide> <ide> ENV["VERBOSE"] = "1" if args.verbose? <ide><path>Library/Homebrew/dev-cmd/update-test.rb <ide> def update_test <ide> safe_system "git", "reset", "--hard", start_commit <ide> <ide> # update ENV["PATH"] <del> ENV["PATH"] = PATH.new(ENV["PATH"]).prepend(curdir/"bin") <add> ENV["PATH"] = PATH.new(ENV.fetch("PATH")).prepend(curdir/"bin") <ide> <ide> # run brew help to install portable-ruby (if needed) <ide> quiet_system "brew", "help" <ide><path>Library/Homebrew/download_strategy.rb <ide> def source_modified_time <ide> private <ide> <ide> def env <del> { "PATH" => PATH.new("/usr/bin", Formula["cvs"].opt_bin, ENV["PATH"]) } <add> { "PATH" => PATH.new("/usr/bin", Formula["cvs"].opt_bin, ENV.fetch("PATH")) } <ide> end <ide> <ide> sig { returns(String) } <ide> def last_commit <ide> private <ide> <ide> def env <del> { "PATH" => PATH.new(Formula["mercurial"].opt_bin, ENV["PATH"]) } <add> { "PATH" => PATH.new(Formula["mercurial"].opt_bin, ENV.fetch("PATH")) } <ide> end <ide> <ide> sig { returns(String) } <ide> def last_commit <ide> <ide> def env <ide> { <del> "PATH" => PATH.new(Formula["bazaar"].opt_bin, ENV["PATH"]), <add> "PATH" => PATH.new(Formula["bazaar"].opt_bin, ENV.fetch("PATH")), <ide> "BZR_HOME" => HOMEBREW_TEMP, <ide> } <ide> end <ide> def repo_valid? 
<ide> private <ide> <ide> def env <del> { "PATH" => PATH.new(Formula["fossil"].opt_bin, ENV["PATH"]) } <add> { "PATH" => PATH.new(Formula["fossil"].opt_bin, ENV.fetch("PATH")) } <ide> end <ide> <ide> sig { returns(String) } <ide><path>Library/Homebrew/extend/ENV/std.rb <ide> def setup_build_environment(formula: nil, cc: nil, build_bottle: false, bottle_a <ide> <ide> self["HOMEBREW_ENV"] = "std" <ide> <del> PATH.new(ENV["HOMEBREW_PATH"]).reverse_each { |p| prepend_path "PATH", p } <add> ORIGINAL_PATHS.reverse_each { |p| prepend_path "PATH", p } <ide> prepend_path "PATH", HOMEBREW_SHIMS_PATH/"shared" <ide> <ide> # Set the default pkg-config search path, overriding the built-in paths <ide><path>Library/Homebrew/formula.rb <ide> def update_head_version <ide> return unless head.downloader.cached_location.exist? <ide> <ide> path = if ENV["HOMEBREW_ENV"] <del> ENV["PATH"] <add> ENV.fetch("PATH") <ide> else <del> ENV["HOMEBREW_PATH"] <add> PATH.new(ORIGINAL_PATHS) <ide> end <ide> <ide> with_env(PATH: path) do <ide> def run_post_install <ide> TMP: HOMEBREW_TEMP, <ide> _JAVA_OPTIONS: "-Djava.io.tmpdir=#{HOMEBREW_TEMP}", <ide> HOMEBREW_PATH: nil, <del> PATH: ENV["HOMEBREW_PATH"], <add> PATH: PATH.new(ORIGINAL_PATHS), <ide> } <ide> <ide> with_env(new_env) do <ide> def run_test(keep_tmp: false) <ide> TEMP: HOMEBREW_TEMP, <ide> TMP: HOMEBREW_TEMP, <ide> TERM: "dumb", <del> PATH: PATH.new(ENV["PATH"], HOMEBREW_PREFIX/"bin"), <add> PATH: PATH.new(ENV.fetch("PATH"), HOMEBREW_PREFIX/"bin"), <ide> HOMEBREW_PATH: nil, <ide> }.merge(common_stage_test_env) <ide> test_env[:_JAVA_OPTIONS] += " -Djava.io.tmpdir=#{HOMEBREW_TEMP}" <ide><path>Library/Homebrew/formula_cellar_checks.rb <ide> def check_cpuid_instruction(formula) <ide> objdump = Formula["llvm"].opt_bin/"llvm-objdump" if Formula["llvm"].any_version_installed? <ide> objdump ||= Formula["binutils"].opt_bin/"objdump" if Formula["binutils"].any_version_installed? 
<ide> objdump ||= which("objdump") <del> objdump ||= which("objdump", ENV["HOMEBREW_PATH"]) <add> objdump ||= which("objdump", ORIGINAL_PATHS) <ide> <ide> unless objdump <ide> return <<~EOS <ide><path>Library/Homebrew/global.rb <ide> def auditing? <ide> <ide> require "PATH" <ide> <del>ENV["HOMEBREW_PATH"] ||= ENV["PATH"] <del>ORIGINAL_PATHS = PATH.new(ENV["HOMEBREW_PATH"]).map do |p| <add>ENV["HOMEBREW_PATH"] ||= ENV.fetch("PATH") <add>ORIGINAL_PATHS = PATH.new(ENV.fetch("HOMEBREW_PATH")).map do |p| <ide> Pathname.new(p).expand_path <ide> rescue <ide> nil <ide><path>Library/Homebrew/requirement.rb <ide> def modify_build_environment(env: nil, cc: nil, build_bottle: false, bottle_arch <ide> parent = satisfied_result_parent <ide> return unless parent <ide> return if ["#{HOMEBREW_PREFIX}/bin", "#{HOMEBREW_PREFIX}/bin"].include?(parent.to_s) <del> return if PATH.new(ENV["PATH"]).include?(parent.to_s) <add> return if PATH.new(ENV.fetch("PATH")).include?(parent.to_s) <ide> <ide> ENV.prepend_path("PATH", parent) <ide> end <ide><path>Library/Homebrew/test/cask/artifact/installer_spec.rb <ide> expect(command).to receive(:run!).with( <ide> executable, <ide> a_hash_including( <del> env: { "PATH" => PATH.new("#{HOMEBREW_PREFIX}/bin", "#{HOMEBREW_PREFIX}/sbin", ENV["PATH"]) }, <add> env: { "PATH" => PATH.new("#{HOMEBREW_PREFIX}/bin", "#{HOMEBREW_PREFIX}/sbin", ENV.fetch("PATH")) }, <ide> ), <ide> ) <ide> <ide><path>Library/Homebrew/test/cmd/custom-external-command_spec.rb <ide> SH <ide> FileUtils.chmod "+x", file <ide> <del> expect { brew cmd, "PATH" => "#{path}#{File::PATH_SEPARATOR}#{ENV["PATH"]}" } <add> expect { brew cmd, "PATH" => "#{path}#{File::PATH_SEPARATOR}#{ENV.fetch("PATH")}" } <ide> .to output("I am #{cmd}.\n").to_stdout <ide> .and not_to_output.to_stderr <ide> .and be_a_success <ide><path>Library/Homebrew/test/diagnostic_checks_spec.rb <ide> FileUtils.chmod 0755, anaconda <ide> FileUtils.chmod 0755, python <ide> <del> ENV["PATH"] = 
"#{path}#{File::PATH_SEPARATOR}#{ENV["PATH"]}" <add> ENV["PATH"] = "#{path}#{File::PATH_SEPARATOR}#{ENV.fetch("PATH")}" <ide> <ide> expect(checks.check_for_anaconda).to match("Anaconda") <ide> end <ide> <ide> specify "#check_user_path_3" do <ide> sbin = HOMEBREW_PREFIX/"sbin" <del> ENV["HOMEBREW_PATH"] = <add> (sbin/"something").mkpath <add> <add> homebrew_path = <ide> "#{HOMEBREW_PREFIX}/bin#{File::PATH_SEPARATOR}" + <ide> ENV["HOMEBREW_PATH"].gsub(/(?:^|#{Regexp.escape(File::PATH_SEPARATOR)})#{Regexp.escape(sbin)}/, "") <del> (sbin/"something").mkpath <add> stub_const("ORIGINAL_PATHS", PATH.new(homebrew_path).map { |path| Pathname.new(path).expand_path }.compact) <ide> <ide> expect(checks.check_user_path_1).to be_nil <ide> expect(checks.check_user_path_2).to be_nil <ide> file = "#{path}/foo-config" <ide> FileUtils.touch file <ide> FileUtils.chmod 0755, file <del> ENV["HOMEBREW_PATH"] = ENV["PATH"] = <del> "#{path}#{File::PATH_SEPARATOR}#{ENV["PATH"]}" <add> ENV["PATH"] = "#{path}#{File::PATH_SEPARATOR}#{ENV.fetch("PATH")}" <ide> <ide> expect(checks.check_for_config_scripts) <ide> .to match('"config" scripts exist') <ide><path>Library/Homebrew/test/spec_helper.rb <ide> skip "Subversion is not installed." unless quiet_system svn_shim, "--version" <ide> <ide> svn_shim_path = Pathname(Utils.popen_read(svn_shim, "--homebrew=print-path").chomp.presence) <del> svn_paths = PATH.new(ENV["PATH"]) <add> svn_paths = PATH.new(ENV.fetch("PATH")) <ide> svn_paths.prepend(svn_shim_path.dirname) <ide> <ide> if OS.mac? <ide> svnadmin = which("svnadmin", svn_paths) <ide> skip "svnadmin is not installed." 
unless svnadmin
<ide>
<del> ENV["PATH"] = PATH.new(ENV["PATH"])
<add> ENV["PATH"] = PATH.new(ENV.fetch("PATH"))
<ide> .append(svn.dirname)
<ide> .append(svnadmin.dirname)
<ide> end
<ide><path>Library/Homebrew/test/support/helper/spec/shared_context/integration_test.rb
<ide> def brew(*args)
<ide> env["PATH"],
<ide> (HOMEBREW_LIBRARY_PATH/"test/support/helper/cmd").realpath.to_s,
<ide> (HOMEBREW_PREFIX/"bin").realpath.to_s,
<del> ENV["PATH"],
<add> ENV.fetch("PATH"),
<ide> ].compact.join(File::PATH_SEPARATOR)
<ide>
<ide> env.merge!(
<ide><path>Library/Homebrew/test/utils_spec.rb
<ide> def esc(code)
<ide>
<ide> describe "#with_env" do
<ide> it "sets environment variables within the block" do
<del> expect(ENV["PATH"]).not_to eq("/bin")
<add> expect(ENV.fetch("PATH")).not_to eq("/bin")
<ide> with_env(PATH: "/bin") do
<del> expect(ENV["PATH"]).to eq("/bin")
<add> expect(ENV.fetch("PATH", nil)).to eq("/bin")
<ide> end
<ide> end
<ide>
<ide> it "restores ENV after the block" do
<ide> with_env(PATH: "/bin") do
<del> expect(ENV["PATH"]).to eq("/bin")
<add> expect(ENV.fetch("PATH", nil)).to eq("/bin")
<ide> end
<del> expect(ENV["PATH"]).not_to eq("/bin")
<add> path = ENV.fetch("PATH", nil)
<add> expect(path).not_to be_nil
<add> expect(path).not_to eq("/bin")
<ide> end
<ide>
<ide> it "restores ENV if an exception is raised" do
<ide> def esc(code)
<ide> end
<ide> }.to raise_error(StandardError)
<ide>
<del> expect(ENV["PATH"]).not_to eq("/bin")
<add> path = ENV.fetch("PATH", nil)
<add> expect(path).not_to be_nil
<add> expect(path).not_to eq("/bin")
<ide> end
<ide> end
<ide>
<ide><path>Library/Homebrew/unpack_strategy/cab.rb
<ide> def self.can_extract?(path)
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<ide> system_command!
"cabextract",
 args: ["-d", unpack_dir, "--", path],
<del> env: { "PATH" => PATH.new(Formula["cabextract"].opt_bin, ENV["PATH"]) },
<add> env: { "PATH" => PATH.new(Formula["cabextract"].opt_bin, ENV.fetch("PATH")) },
<ide> verbose: verbose
<ide> end
<ide>
<ide><path>Library/Homebrew/unpack_strategy/fossil.rb
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<ide> system_command! "fossil",
<ide> args: ["open", path, *args],
<ide> chdir: unpack_dir,
<del> env: { "PATH" => PATH.new(Formula["fossil"].opt_bin, ENV["PATH"]) },
<add> env: { "PATH" => PATH.new(Formula["fossil"].opt_bin, ENV.fetch("PATH")) },
<ide> verbose: verbose
<ide> end
<ide> end
<ide><path>Library/Homebrew/unpack_strategy/generic_unar.rb
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<ide> "-force-overwrite", "-quiet", "-no-directory",
<ide> "-output-directory", unpack_dir, "--", path
<ide> ],
<del> env: { "PATH" => PATH.new(Formula["unar"].opt_bin, ENV["PATH"]) },
<add> env: { "PATH" => PATH.new(Formula["unar"].opt_bin, ENV.fetch("PATH")) },
<ide> verbose: verbose
<ide> end
<ide> end
<ide><path>Library/Homebrew/unpack_strategy/lha.rb
<ide> def dependencies
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<ide> system_command! "lha",
<ide> args: ["xq2w=#{unpack_dir}", path],
<del> env: { "PATH" => PATH.new(Formula["lha"].opt_bin, ENV["PATH"]) },
<add> env: { "PATH" => PATH.new(Formula["lha"].opt_bin, ENV.fetch("PATH")) },
<ide> verbose: verbose
<ide> end
<ide> end
<ide><path>Library/Homebrew/unpack_strategy/lzip.rb
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<ide> quiet_flags = verbose ? [] : ["-q"]
<ide> system_command!
"lzip",
 args: ["-d", *quiet_flags, unpack_dir/basename],
<del> env: { "PATH" => PATH.new(Formula["lzip"].opt_bin, ENV["PATH"]) },
<add> env: { "PATH" => PATH.new(Formula["lzip"].opt_bin, ENV.fetch("PATH")) },
<ide> verbose: verbose
<ide> end
<ide> end
<ide><path>Library/Homebrew/unpack_strategy/lzma.rb
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<ide> quiet_flags = verbose ? [] : ["-q"]
<ide> system_command! "unlzma",
<ide> args: [*quiet_flags, "--", unpack_dir/basename],
<del> env: { "PATH" => PATH.new(Formula["xz"].opt_bin, ENV["PATH"]) },
<add> env: { "PATH" => PATH.new(Formula["xz"].opt_bin, ENV.fetch("PATH")) },
<ide> verbose: verbose
<ide> end
<ide> end
<ide><path>Library/Homebrew/unpack_strategy/mercurial.rb
<ide> def self.can_extract?(path)
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<ide> system_command! "hg",
<ide> args: ["--cwd", path, "archive", "--subrepos", "-y", "-t", "files", unpack_dir],
<del> env: { "PATH" => PATH.new(Formula["mercurial"].opt_bin, ENV["PATH"]) },
<add> env: { "PATH" => PATH.new(Formula["mercurial"].opt_bin, ENV.fetch("PATH")) },
<ide> verbose: verbose
<ide> end
<ide> end
<ide><path>Library/Homebrew/unpack_strategy/p7zip.rb
<ide> def dependencies
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<ide> system_command! "7zr",
<ide> args: ["x", "-y", "-bd", "-bso0", path, "-o#{unpack_dir}"],
<del> env: { "PATH" => PATH.new(Formula["p7zip"].opt_bin, ENV["PATH"]) },
<add> env: { "PATH" => PATH.new(Formula["p7zip"].opt_bin, ENV.fetch("PATH")) },
<ide> verbose: verbose
<ide> end
<ide> end
<ide><path>Library/Homebrew/unpack_strategy/rar.rb
<ide> def dependencies
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<ide> system_command!
"unrar",
 args: ["x", "-inul", path, unpack_dir],
<del> env: { "PATH" => PATH.new(Formula["unrar"].opt_bin, ENV["PATH"]) },
<add> env: { "PATH" => PATH.new(Formula["unrar"].opt_bin, ENV.fetch("PATH")) },
<ide> verbose: verbose
<ide> end
<ide> end
<ide><path>Library/Homebrew/unpack_strategy/xz.rb
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<ide> quiet_flags = verbose ? [] : ["-q"]
<ide> system_command! "unxz",
<ide> args: [*quiet_flags, "-T0", "--", unpack_dir/basename],
<del> env: { "PATH" => PATH.new(Formula["xz"].opt_bin, ENV["PATH"]) },
<add> env: { "PATH" => PATH.new(Formula["xz"].opt_bin, ENV.fetch("PATH")) },
<ide> verbose: verbose
<ide> end
<ide> end
<ide><path>Library/Homebrew/unpack_strategy/zip.rb
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<ide> quiet_flags = verbose ? [] : ["-qq"]
<ide> result = system_command! "unzip",
<ide> args: [*quiet_flags, "-o", path, "-d", unpack_dir],
<del> env: { "PATH" => PATH.new(unzip&.opt_bin, ENV["PATH"]) },
<add> env: { "PATH" => PATH.new(unzip&.opt_bin, ENV.fetch("PATH")) },
<ide> verbose: verbose,
<ide> print_stderr: false
<ide>
<ide><path>Library/Homebrew/unpack_strategy/zstd.rb
<ide> def extract_to_dir(unpack_dir, basename:, verbose:)
<ide> quiet_flags = verbose ? [] : ["-q"]
<ide> system_command!
"unzstd",
 args: [*quiet_flags, "-T0", "--rm", "--", unpack_dir/basename],
<del> env: { "PATH" => PATH.new(Formula["zstd"].opt_bin, ENV["PATH"]) },
<add> env: { "PATH" => PATH.new(Formula["zstd"].opt_bin, ENV.fetch("PATH")) },
<ide> verbose: verbose
<ide> end
<ide> end
<ide><path>Library/Homebrew/utils.rb
<ide> def interactive_shell(f = nil)
<ide> end
<ide>
<ide> def with_homebrew_path(&block)
<del> with_env(PATH: PATH.new(ENV["HOMEBREW_PATH"]), &block)
<add> with_env(PATH: PATH.new(ORIGINAL_PATHS), &block)
<ide> end
<ide>
<ide> def with_custom_locale(locale, &block)
<ide> def quiet_system(cmd, *args)
<ide> end
<ide> end
<ide>
<del> def which(cmd, path = ENV["PATH"])
<add> def which(cmd, path = ENV.fetch("PATH"))
<ide> PATH.new(path).each do |p|
<ide> begin
<ide> pcmd = File.expand_path(cmd, p)
<ide> def which(cmd, path = ENV["PATH"])
<ide> nil
<ide> end
<ide>
<del> def which_all(cmd, path = ENV["PATH"])
<add> def which_all(cmd, path = ENV.fetch("PATH"))
<ide> PATH.new(path).map do |p|
<ide> begin
<ide> pcmd = File.expand_path(cmd, p)
<ide> def which_editor
<ide>
<ide> # Find Atom, Sublime Text, VS Code, Textmate, BBEdit / TextWrangler, or vim
<ide> editor = %w[atom subl code mate edit vim].find do |candidate|
<del> candidate if which(candidate, ENV["HOMEBREW_PATH"])
<add> candidate if which(candidate, ORIGINAL_PATHS)
<ide> end
<ide> editor ||= "vim"
<ide>
<ide> def ensure_executable!(name, formula_name = nil, reason: "")
<ide>
<ide> executable = [
<ide> which(name),
<del> which(name, ENV["HOMEBREW_PATH"]),
<add> which(name, ORIGINAL_PATHS),
<ide> HOMEBREW_PREFIX/"bin/#{name}",
<ide> ].compact.first
<ide> return executable if executable.exist?
<ide> def ensure_executable!(name, formula_name = nil, reason: "") <ide> end <ide> <ide> def paths <del> @paths ||= PATH.new(ENV["HOMEBREW_PATH"]).map do |p| <del> File.expand_path(p).chomp("/") <del> rescue ArgumentError <del> onoe "The following PATH component is invalid: #{p}" <del> end.uniq.compact <add> @paths ||= ORIGINAL_PATHS.uniq.map(&:to_s) <ide> end <ide> <ide> def parse_author!(author) <ide><path>Library/Homebrew/utils/git.rb <ide> def setup_gpg! <ide> gnupg_bin = HOMEBREW_PREFIX/"opt/gnupg/bin" <ide> return unless gnupg_bin.directory? <ide> <del> ENV["PATH"] = PATH.new(ENV["PATH"]) <add> ENV["PATH"] = PATH.new(ENV.fetch("PATH")) <ide> .prepend(gnupg_bin) <ide> end <ide>
37
Python
Python
update some warning messages
53c0faa5535c4fb782271d21ca1bbb2191c5f19c
<ide><path>keras/legacy/interfaces.py <ide> def lstm_args_preprocessor(args, kwargs): <ide> kwargs.pop('forget_bias_init') <ide> warnings.warn('The `forget_bias_init` argument ' <ide> 'has been ignored. Use `unit_forget_bias=True` ' <del> 'instead to intialize with ones') <add> 'instead to intialize with ones.') <ide> return args, kwargs, converted <ide> <ide> legacy_recurrent_support = generate_legacy_interface( <ide> def convlstm2d_args_preprocessor(args, kwargs): <ide> else: <ide> warnings.warn('The `forget_bias_init` argument ' <ide> 'has been ignored. Use `unit_forget_bias=True` ' <del> 'instead to intialize with ones') <add> 'instead to intialize with ones.') <ide> args, kwargs, _converted = conv2d_args_preprocessor(args, kwargs) <ide> return args, kwargs, converted + _converted <ide> <ide> def zeropadding2d_args_preprocessor(args, kwargs): <ide> kwargs['padding'] = ((top_pad, bottom_pad), (left_pad, right_pad)) <ide> warnings.warn('The `padding` argument in the Keras 2 API no longer' <ide> 'accepts dict types. You can now input argument as: ' <del> '`padding`=(top_pad, bottom_pad, left_pad, right_pad)') <add> '`padding=(top_pad, bottom_pad, left_pad, right_pad)`.') <ide> elif len(args) == 2 and isinstance(args[1], dict): <ide> if set(args[1].keys()) <= {'top_pad', 'bottom_pad', <ide> 'left_pad', 'right_pad'}: <ide> def zeropadding2d_args_preprocessor(args, kwargs): <ide> args = (args[0], ((top_pad, bottom_pad), (left_pad, right_pad))) <ide> warnings.warn('The `padding` argument in the Keras 2 API no longer' <ide> 'accepts dict types. You can now input argument as: ' <del> '`padding`=((top_pad, bottom_pad), (left_pad, right_pad))') <add> '`padding=((top_pad, bottom_pad), (left_pad, right_pad))`') <ide> return args, kwargs, converted <ide> <ide> legacy_zeropadding2d_support = generate_legacy_interface(
1
Javascript
Javascript
add generator test cases
54e8068b87a4f0c2fb999534971c137c9154e748
<ide><path>test/cases/parsing/issue-11353/async_generator_function.js <add>"use strict"; <add> <add>export default async function* asyncIdMaker(start = 1, end = 5){ <add> for (let i = start; i <= end; i++) { <add> <add> // yay, can use await! <add> await new Promise(resolve => setTimeout(resolve, 1000)); <add> <add> yield i; <add> } <add>} <ide><path>test/cases/parsing/issue-11353/generator_function.js <add>"use strict"; <add> <add>export default function* idMaker(){ <add> var index = 0; <add> while(true) <add> yield index++; <add>} <ide><path>test/cases/parsing/issue-11353/index.js <add>import generator from "./generator_function.js"; <add>import asyncGenerator from "./async_generator_function"; <add> <add>it('should correctly import generator function', () => { <add> expect(typeof generator).toBe("function"); <add>}); <add> <add>it('should correctly build the correct function string', () => { <add> expect(generator.toString().indexOf('function* ')).toBe(0); // 0 <add>}); <add> <add>it('should correctly provide the generator function interface', () => { <add> var gen = generator(); // "Generator { }" <add> expect(gen.next().value).toBe(0); // 0 <add> expect(gen.next().value).toBe(1); // 0 <add> expect(gen.next().value).toBe(2); // 0 <add>}); <add> <add>it('should correctly import async generator function', () => { <add> expect(typeof asyncGenerator).toBe("function"); <add>}); <add> <add>it('should correctly build the correct async function string', () => { <add> expect(asyncGenerator.toString().indexOf('async function* ')).toBe(0); // 0 <add>}); <add> <add>it('should correctly provide the async generator function interface', async () => { <add> let gen = asyncGenerator(1, 5); <add> let start = 0; <add> for await (let value of gen) { <add> start += 1; <add> expect(value).toBe(start); <add> } <add>});
3
PHP
PHP
fix coding standards
be2a2523456a1414bf667ea482015d9d21918697
<ide><path>lib/Cake/Test/Case/Console/Command/Task/ControllerTaskTest.php <ide> if (!$imported) { <ide> define('ARTICLE_MODEL_CREATED', true); <ide> <del> /** <del> * Class BakeArticle <del> */ <add>/** <add> * Class BakeArticle <add> */ <ide> class BakeArticle extends Model { <ide> <ide> public $name = 'BakeArticle'; <ide><path>lib/Cake/Test/Case/TestSuite/ControllerTestCaseTest.php <ide> class AppController extends Controller { <ide> */ <ide> if (!class_exists('PostsController')) { <ide> <del> /** <del> * Class PostsController <del> * <del> * @package Cake.Test.Case.TestSuite <del> */ <add>/** <add> * Class PostsController <add> * <add> * @package Cake.Test.Case.TestSuite <add> */ <ide> class PostsController extends AppController { <ide> <ide> /**
2
Text
Text
add docs on meta inheritance. closes
78ac332f18c51bb151ae32f1f3d207595b0b3ca2
<ide><path>docs/api-guide/serializers.md <ide> Alternative representations include serializing using hyperlinks, serializing co <ide> <ide> For full details see the [serializer relations][relations] documentation. <ide> <add>## Inheritance of the 'Meta' class <add> <add>The inner `Meta` class on serializers is not inherited from parent classes by default. This is the same behaviour as with Django's `Model` and `ModelForm` classes. If you want the `Meta` class to inherit from a parent class you must do so explicitly. For example: <add> <add> class AccountSerializer(MyBaseSerializer): <add> class Meta(MyBaseSerializer.Meta): <add> model = Account <add> <add>Typically we would recommend *not* using inheritance on inner Meta classes, but instead declaring all options explicitly. <add> <ide> --- <ide> <ide> # HyperlinkedModelSerializer
1
Javascript
Javascript
fix condition where data is lost
dbe645f11460a8985f5f6e07f9ed829bee43e101
<ide><path>lib/internal/http2/core.js
<ide> function onStreamClose(code) {
<ide>
<ide> if (state.fd !== undefined)
<ide> tryClose(state.fd);
<del> stream.push(null);
<del> stream[kMaybeDestroy](null, code);
<add>
<add> // Defer destroy we actually emit end.
<add> if (stream._readableState.endEmitted || code !== NGHTTP2_NO_ERROR) {
<add> // If errored or ended, we can destroy immediately.
<add> stream[kMaybeDestroy](null, code);
<add> } else {
<add> // Wait for end to destroy.
<add> stream.on('end', stream[kMaybeDestroy]);
<add> // Push a null so the stream can end whenever the client consumes
<add> // it completely.
<add> stream.push(null);
<add>
<add> // Same as net.
<add> if (stream.readableLength === 0) {
<add> stream.read(0);
<add> }
<add> }
<ide> }
<ide>
<ide> // Receives a chunk of data for a given stream and forwards it on
<ide> function onStreamRead(nread, buf) {
<ide> }
<ide> return;
<ide> }
<add>
<ide> // Last chunk was received. End the readable side.
<ide> debug(`Http2Stream ${stream[kID]} [Http2Session ` +
<ide> `${sessionName(stream[kSession][kType])}]: ending readable.`);
<del> stream.push(null);
<del> stream[kMaybeDestroy]();
<add>
<add> // defer this until we actually emit end
<add> if (stream._readableState.endEmitted) {
<add> stream[kMaybeDestroy]();
<add> } else {
<add> stream.on('end', stream[kMaybeDestroy]);
<add> stream.push(null);
<add> stream.read(0);
<add> }
<ide> }
<ide>
<ide> // Called when the remote peer settings have been updated.
<ide> class Http2Stream extends Duplex {
<ide> session[kMaybeDestroy]();
<ide> process.nextTick(emit, this, 'close', code);
<ide> callback(err);
<del> }
<ide>
<add> }
<ide> // The Http2Stream can be destroyed if it has closed and if the readable
<ide> // side has received the final chunk.
<ide> [kMaybeDestroy](error, code = NGHTTP2_NO_ERROR) {
<del> if (error == null) {
<del> if (code === NGHTTP2_NO_ERROR &&
<del> (!this._readableState.ended ||
<del> !this._writableState.ended ||
<del> this._writableState.pendingcb > 0 ||
<del> !this.closed)) {
<del> return;
<del> }
<add> if (error || code !== NGHTTP2_NO_ERROR) {
<add> this.destroy(error);
<add> return;
<add> }
<add>
<add> // TODO(mcollina): remove usage of _*State properties
<add> if (this._readableState.ended &&
<add> this._writableState.ended &&
<add> this._writableState.pendingcb === 0 &&
<add> this.closed) {
<add> this.destroy();
<add> // This should return, but eslint complains.
<add> // return
<ide> }
<del> this.destroy(error);
<ide> }
<ide> }
<ide>
<ide><path>test/parallel/test-http2-compat-short-stream-client-server.js
<add>'use strict';
<add>
<add>const common = require('../common');
<add>if (!common.hasCrypto)
<add> common.skip('missing crypto');
<add>const assert = require('assert');
<add>const http2 = require('http2');
<add>const { Readable } = require('stream');
<add>
<add>const server = http2.createServer(common.mustCall((req, res) => {
<add> res.setHeader('content-type', 'text/html');
<add> const input = new Readable({
<add> read() {
<add> this.push('test');
<add> this.push(null);
<add> }
<add> });
<add> input.pipe(res);
<add>}));
<add>
<add>server.listen(0, common.mustCall(() => {
<add> const port = server.address().port;
<add> const client = http2.connect(`http://localhost:${port}`);
<add>
<add> const req = client.request();
<add>
<add> req.on('response', common.mustCall((headers) => {
<add> assert.strictEqual(headers[':status'], 200);
<add> assert.strictEqual(headers['content-type'], 'text/html');
<add> }));
<add>
<add> let data = '';
<add>
<add> const notCallClose = common.mustNotCall();
<add>
<add> setTimeout(() => {
<add> req.setEncoding('utf8');
<add> req.removeListener('close', notCallClose);
<add> req.on('close', common.mustCall(() => {
<add> server.close();
<add>
client.close(); <add> })); <add> req.on('data', common.mustCallAtLeast((d) => data += d)); <add> req.on('end', common.mustCall(() => { <add> assert.strictEqual(data, 'test'); <add> })); <add> }, common.platformTimeout(100)); <add> <add> req.on('close', notCallClose); <add>})); <ide><path>test/parallel/test-http2-short-stream-client-server.js <add>'use strict'; <add> <add>const common = require('../common'); <add>if (!common.hasCrypto) <add> common.skip('missing crypto'); <add>const assert = require('assert'); <add>const http2 = require('http2'); <add>const { Readable } = require('stream'); <add> <add>const server = http2.createServer(); <add>server.on('stream', common.mustCall((stream) => { <add> stream.respond({ <add> ':status': 200, <add> 'content-type': 'text/html' <add> }); <add> const input = new Readable({ <add> read() { <add> this.push('test'); <add> this.push(null); <add> } <add> }); <add> input.pipe(stream); <add>})); <add> <add> <add>server.listen(0, common.mustCall(() => { <add> const port = server.address().port; <add> const client = http2.connect(`http://localhost:${port}`); <add> <add> const req = client.request(); <add> <add> req.on('response', common.mustCall((headers) => { <add> assert.strictEqual(headers[':status'], 200); <add> assert.strictEqual(headers['content-type'], 'text/html'); <add> })); <add> <add> let data = ''; <add> <add> const notCallClose = common.mustNotCall(); <add> <add> setTimeout(() => { <add> req.setEncoding('utf8'); <add> req.removeListener('close', notCallClose); <add> req.on('close', common.mustCall(() => { <add> server.close(); <add> client.close(); <add> })); <add> req.on('data', common.mustCallAtLeast((d) => data += d)); <add> req.on('end', common.mustCall(() => { <add> assert.strictEqual(data, 'test'); <add> })); <add> }, common.platformTimeout(100)); <add> <add> req.on('close', notCallClose); <add>}));
3
Javascript
Javascript
use a fresh meta to get the current value
7ed1ed7dd1f2bb00a350cb78e3d590167816cee7
<ide><path>packages/ember-metal/lib/watching.js <ide> var switchToWatched = function(obj, keyName, meta) { <ide> enumerable: true, <ide> set: mandatorySetter, <ide> get: function(key) { <del> return meta.values[keyName]; <add> return metaFor(this).values[keyName]; <ide> } <ide> }; <ide>
1
Text
Text
use jquery to modify the entire page
efe39a5da82c57967c5c17e127b1b6d7a8c65deb
<ide><path>guide/english/certifications/front-end-libraries/jquery/use-jquery-to-modify-the-entire-page/index.md <ide> Add the classes `animated` and `hinge` to your `body` element. <ide> - [.addClass()](https://api.jquery.com/addClass/e) <ide> <ide> ### Solution: <del>```javascript <add>```html <ide> <script> <ide> $("body").addClass("animated hinge"); <add> $(document).ready(function() { <add> $("#target1").css("color", "red"); <add> $("#target1").prop("disabled", true); <add> $("#target4").remove(); <add> $("#target2").appendTo("#right-well"); <add> $("#target5").clone().appendTo("#left-well"); <add> $("#target1").parent().css("background-color", "red"); <add> $("#right-well").children().css("color", "orange"); <add> $("#left-well").children().css("color", "green"); <add> $(".target:nth-child(2)").addClass("animated bounce"); <add> $(".target:even").addClass("animated shake"); <add> $("body").addClass("animated hinge"); <ide> }); <ide> </script> <del>``` <ide> <add><div class="container-fluid"> <add> <h3 class="text-primary text-center">jQuery Playground</h3> <add> <div class="row"> <add> <div class="col-xs-6"> <add> <h4>#left-well</h4> <add> <div class="well" id="left-well"> <add> <button class="btn btn-default target" id="target1">#target1</button> <add> <button class="btn btn-default target" id="target2">#target2</button> <add> <button class="btn btn-default target" id="target3">#target3</button> <add> </div> <add> </div> <add> <div class="col-xs-6"> <add> <h4>#right-well</h4> <add> <div class="well" id="right-well"> <add> <button class="btn btn-default target" id="target4">#target4</button> <add> <button class="btn btn-default target" id="target5">#target5</button> <add> <button class="btn btn-default target" id="target6">#target6</button> <add> </div> <add> </div> <add> </div> <add></div> <add>```
1
Javascript
Javascript
increase test coverage for os.js
7069e633caf90a77fced619689171a92059ee29a
<ide><path>test/parallel/test-os.js
<ide> const path = require('path');
<ide> const { inspect } = require('util');
<ide>
<ide> const is = {
<add> number: (value, key) => {
<add> assert(!isNaN(value), `${key} should not be NaN`);
<add> assert.strictEqual(typeof value, 'number');
<add> },
<ide> string: (value) => { assert.strictEqual(typeof value, 'string'); },
<del> number: (value) => { assert.strictEqual(typeof value, 'number'); },
<ide> array: (value) => { assert.ok(Array.isArray(value)); },
<ide> object: (value) => {
<ide> assert.strictEqual(typeof value, 'object');
<ide> assert.notStrictEqual(value, null);
<ide> }
<ide> };
<ide>
<add>const flatten = (arr) =>
<add> arr.reduce((acc, c) =>
<add> acc.concat(Array.isArray(c) ? flatten(c) : c), []);
<add>
<ide> process.env.TMPDIR = '/tmpdir';
<ide> process.env.TMP = '/tmp';
<ide> process.env.TEMP = '/temp';
<ide> if (!common.isSunOS) {
<ide> assert.ok(os.totalmem() > 0);
<ide> }
<ide>
<del>
<ide> const interfaces = os.networkInterfaces();
<ide> switch (platform) {
<del> case 'linux':
<del> {
<del> const filter =
<del> (e) => e.address === '127.0.0.1' && e.netmask === '255.0.0.0';
<add> case 'linux': {
<add> const filter = (e) =>
<add> e.address === '127.0.0.1' &&
<add> e.netmask === '255.0.0.0';
<add>
<ide> const actual = interfaces.lo.filter(filter);
<del> const expected = [{ address: '127.0.0.1', netmask: '255.0.0.0',
<del> mac: '00:00:00:00:00:00', family: 'IPv4',
<del> internal: true, cidr: '127.0.0.1/8' }];
<add> const expected = [{
<add> address: '127.0.0.1',
<add> netmask: '255.0.0.0',
<add> mac: '00:00:00:00:00:00',
<add> family: 'IPv4',
<add> internal: true,
<add> cidr: '127.0.0.1/8'
<add> }];
<ide> assert.deepStrictEqual(actual, expected);
<ide> break;
<ide> }
<del> case 'win32':
<del> {
<del> const filter = (e) => e.address === '127.0.0.1';
<add> case 'win32': {
<add> const filter = (e) =>
<add> e.address === '127.0.0.1';
<add>
<ide> const actual = interfaces['Loopback Pseudo-Interface 1'].filter(filter);
<del> const expected = [{ address: '127.0.0.1', netmask: '255.0.0.0',
<del> mac: '00:00:00:00:00:00', family: 'IPv4',
<del> internal: true, cidr: '127.0.0.1/8' }];
<add> const expected = [{
<add> address: '127.0.0.1',
<add> netmask: '255.0.0.0',
<add> mac: '00:00:00:00:00:00',
<add> family: 'IPv4',
<add> internal: true,
<add> cidr: '127.0.0.1/8'
<add> }];
<ide> assert.deepStrictEqual(actual, expected);
<ide> break;
<ide> }
<ide> }
<del>function flatten(arr) {
<del> return arr.reduce(
<del> (acc, c) => acc.concat(Array.isArray(c) ? flatten(c) : c),
<del> []
<del> );
<del>}
<ide> const netmaskToCIDRSuffixMap = new Map(Object.entries({
<ide> '255.0.0.0': 8,
<ide> '255.255.255.0': 24,
<ide> 'ffff:ffff:ffff:ffff::': 64,
<ide> 'ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff': 128
<ide> }));
<add>
<ide> flatten(Object.values(interfaces))
<ide> .map((v) => ({ v, mask: netmaskToCIDRSuffixMap.get(v.netmask) }))
<ide> .forEach(({ v, mask }) => {
<ide> flatten(Object.values(interfaces))
<ide> });
<ide>
<ide> const EOL = os.EOL;
<del>assert.ok(EOL.length > 0);
<del>
<add>if (common.isWindows) {
<add> assert.strictEqual(EOL, '\r\n');
<add>} else {
<add> assert.strictEqual(EOL, '\n');
<add>}
<ide>
<ide> const home = os.homedir();
<del>
<ide> is.string(home);
<ide> assert.ok(home.includes(path.sep));
<ide>
<ide> assert.ok(pwd.homedir.includes(path.sep));
<ide> assert.strictEqual(pwd.username, pwdBuf.username.toString('utf8'));
<ide> assert.strictEqual(pwd.homedir, pwdBuf.homedir.toString('utf8'));
<ide>
<del>// Test that the Symbol.toPrimitive functions work correctly
<del>[
<del> [`${os.hostname}`, os.hostname()],
<del> [`${os.homedir}`, os.homedir()],
<del> [`${os.release}`, os.release()],
<del> [`${os.type}`, os.type()],
<del> [`${os.endianness}`, os.endianness()]
<del>].forEach((set) => assert.strictEqual(set[0], set[1]));
<add>assert.strictEqual(`${os.hostname}`, os.hostname());
<add>assert.strictEqual(`${os.homedir}`, os.homedir());
<add>assert.strictEqual(`${os.release}`,
os.release()); <add>assert.strictEqual(`${os.type}`, os.type()); <add>assert.strictEqual(`${os.endianness}`, os.endianness()); <add>assert.strictEqual(`${os.tmpdir}`, os.tmpdir()); <add>assert.strictEqual(`${os.arch}`, os.arch()); <add>assert.strictEqual(`${os.platform}`, os.platform()); <add> <add>assert.strictEqual(+os.totalmem, os.totalmem()); <add> <add>// Assert that the following values are coercible to numbers. <add>is.number(+os.uptime, 'uptime'); <add>is.number(os.uptime(), 'uptime'); <add> <add>is.number(+os.freemem, 'freemem'); <add>is.number(os.freemem(), 'freemem');
1
Text
Text
fix typo in cli.md
bc89048f3d2f8886d708494c25560a02552df94d
<ide><path>doc/api/cli.md <ide> Generating V8 snapshots takes time and memory (both memory managed by the <ide> V8 heap and native memory outside the V8 heap). The bigger the heap is, <ide> the more resources it needs. Node.js will adjust the V8 heap to accommondate <ide> the additional V8 heap memory overhead, and try its best to avoid using up <del>all the memory avialable to the process. When the process uses <add>all the memory available to the process. When the process uses <ide> more memory than the system deems appropriate, the process may be terminated <ide> abruptly by the system, depending on the system configuration. <ide>
1
PHP
PHP
give submit button sensible default
4c36ed37b2f92fe63dce1cb16cd169c578417f70
<ide><path>src/Illuminate/Html/FormBuilder.php <ide> protected function getSelectOption($display, $value, $selected) <ide> { <ide> return $this->optionGroup($display, $value, $selected); <ide> } <del> <add> <ide> return $this->option($display, $value, $selected); <ide> } <ide> <ide> protected function checkable($type, $name, $value, $checked, $options) <ide> * @param array $options <ide> * @return string <ide> */ <del> public function submit($value = null, $options = array()) <add> public function submit($value = 'Submit', $options = array()) <ide> { <ide> return $this->input('submit', null, $value, $options); <ide> }
1
Text
Text
update scriptable tooltip context docs
9a042672a74d71fb0238bfaa2310b237d106604e
<ide><path>docs/docs/general/options.md <ide> There are multiple levels of context objects: <ide> * `data` <ide> * `scale` <ide> * `tick` <add> * `tooltip` <ide> <ide> Each level inherits its parent(s) and any contextual information stored in the parent is available through the child. <ide> <ide> In addition to [scale](#scale) <ide> * `tick`: the associated tick object <ide> * `index`: tick index <ide> * `type`: `'tick'` <add> <add>### tooltip <add> <add>In addition to [chart](#chart) <add> <add>* `tooltip`: the tooltip object <add>* `tooltipItems`: the items the tooltip is displaying
1
Java
Java
update javadoc for supported jboss vfs version
34eb4dba3a4e1915989c29713a2b3551d3d5520d
<ide><path>spring-core/src/main/java/org/springframework/core/io/VfsResource.java
<ide> /*
<del> * Copyright 2002-2012 the original author or authors.
<add> * Copyright 2002-2014 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide> import org.springframework.util.Assert;
<ide>
<ide> /**
<del> * VFS based {@link Resource} implementation.
<del> * Supports the corresponding VFS API versions on JBoss AS 5.x as well as 6.x and 7.x.
<add> * JBoss VFS based {@link Resource} implementation.
<add> *
<add> * <p>As of Spring 4.0, this class supports VFS 3.x on JBoss AS 6+ (package
<add> * {@code org.jboss.vfs}) and is in particular compatible with JBoss AS 7 and
<add> * WildFly 8.
<ide> *
<ide> * @author Ales Justin
<ide> * @author Juergen Hoeller
<ide> public class VfsResource extends AbstractResource {
<ide> private final Object resource;
<ide>
<ide>
<del> public VfsResource(Object resources) {
<del> Assert.notNull(resources, "VirtualFile must not be null");
<del> this.resource = resources;
<add> public VfsResource(Object resource) {
<add> Assert.notNull(resource, "VirtualFile must not be null");
<add> this.resource = resource;
<ide> }
<ide>
<ide>
<ide><path>spring-core/src/main/java/org/springframework/core/io/VfsUtils.java
<ide> /*
<del> * Copyright 2002-2013 the original author or authors.
<add> * Copyright 2002-2014 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide>
<ide> /**
<ide> * Utility for detecting and accessing JBoss VFS in the classpath.
<del> * As of Spring 4.0, supports VFS 3.x on JBoss AS 6+ (package {@code org.jboss.vfs})
<del> * and is in particular compatible with JBoss AS 7 and WildFly 8.
<add> * <add> * <p>As of Spring 4.0, this class supports VFS 3.x on JBoss AS 6+ (package <add> * {@code org.jboss.vfs}) and is in particular compatible with JBoss AS 7 and <add> * WildFly 8. <ide> * <ide> * <p>Thanks go to Marius Bogoevici for the initial patch. <ide> * <b>Note:</b> This is an internal class and should not be used outside the framework.
2
Text
Text
improve grammar in readme
031ee97b12b99cac0bfe80f29c876aaddb01a4cc
<ide><path>README.md
<ide> The **[#redux channel](https://discord.gg/0ZcbPKXt5bZ6au5t)** of the **[Reactifl
<ide>
<ide> Redux is a valuable tool for organizing your state, but you should also consider whether it's appropriate for your situation. Don't use Redux just because someone said you should - take some time to understand the potential benefits and tradeoffs of using it.
<ide>
<del>Here's some suggestions on when it makes sense to use Redux:
<add>Here are some suggestions on when it makes sense to use Redux:
<ide> * You have reasonable amounts of data changing over time
<ide> * You need a single source of truth for your state
<ide> * You find that keeping all your state in a top-level component is no longer sufficient
<ide> Yes, these guidelines are subjective and vague, but this is for good reason. The
<ide>
<ide> ## Developer Experience
<ide>
<del>Dan Abramov (author of Redux) wrote Redux while working on his React Europe talk called [“Hot Reloading with Time Travel”](https://www.youtube.com/watch?v=xsSnOQynTHs). His goal was to create a state management library with minimal API but completely predictable behavior, so it is possible to implement logging, hot reloading, time travel, universal apps, record and replay, without any buy-in from the developer.
<add>Dan Abramov (author of Redux) wrote Redux while working on his React Europe talk called [“Hot Reloading with Time Travel”](https://www.youtube.com/watch?v=xsSnOQynTHs). His goal was to create a state management library with a minimal API but completely predictable behavior. Redux makes it possible to implement logging, hot reloading, time travel, universal apps, record and replay, without any buy-in from the developer.
<ide>
<ide> ## Influences
<ide>
<ide> Redux evolves the ideas of [Flux](http://facebook.github.io/flux/), but avoids its complexity by taking cues from [Elm](https://github.com/evancz/elm-architecture-tutorial/).
<del>Whether you have used them or not, Redux only takes a few minutes to get started with. <add>Even if you haven't used Flux or Elm, Redux only takes a few minutes to get started with. <ide> <ide> ## Installation <ide> <ide> This assumes you are using [npm](https://www.npmjs.com/) as your package manager <ide> <ide> If you're not, you can [access these files on unpkg](https://unpkg.com/redux/), download them, or point your package manager to them. <ide> <del>Most commonly people consume Redux as a collection of [CommonJS](http://webpack.github.io/docs/commonjs.html) modules. These modules are what you get when you import `redux` in a [Webpack](https://webpack.js.org/), [Browserify](http://browserify.org/), or a Node environment. If you like to live on the edge and use [Rollup](http://rollupjs.org), we support that as well. <add>Most commonly, people consume Redux as a collection of [CommonJS](http://webpack.github.io/docs/commonjs.html) modules. These modules are what you get when you import `redux` in a [Webpack](https://webpack.js.org/), [Browserify](http://browserify.org/), or a Node environment. If you like to live on the edge and use [Rollup](http://rollupjs.org), we support that as well. <ide> <ide> If you don't use a module bundler, it's also fine. The `redux` npm package includes precompiled production and development [UMD](https://github.com/umdjs/umd) builds in the [`dist` folder](https://unpkg.com/redux/dist/). They can be used directly without a bundler and are thus compatible with many popular JavaScript module loaders and environments. For example, you can drop a UMD build as a [`<script>` tag](https://unpkg.com/redux/dist/redux.js) on the page, or [tell Bower to install it](https://github.com/reactjs/redux/pull/1181#issuecomment-167361975). The UMD builds make Redux available as a `window.Redux` global variable. <ide>
1
Javascript
Javascript
improve collada "duplicate node warning"
18317890f5cef98f0c7be068d2e0fd064f93de01
<ide><path>examples/js/loaders/ColladaLoader.js <ide> THREE.ColladaLoader.prototype = { <ide> <ide> } <ide> <del> if (hasNode(data.id)) { <del> console.warn("Duplicate id ", data.id, "ignoring") <add> if ( hasNode( data.id ) ) { <add> <add> console.warn( 'THREE.ColladaLoader: There is already a node with ID %s. Exclude current node from further processing.', data.id ); <add> <ide> } else { <add> <ide> library.nodes[ data.id ] = data; <add> <ide> } <ide> <ide> return data;
1
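The guard in the ColladaLoader patch above depends on loader internals (`hasNode`, `library.nodes`) that are not shown in full. A standalone sketch of the same duplicate-id pattern, with simplified stand-ins for those internals (the names here are illustrative, not three.js API):

```javascript
// Simplified stand-ins for the loader's internals shown in the diff above.
const library = { nodes: {} };

function hasNode(id) {
  return library.nodes[id] !== undefined;
}

function registerNode(data) {
  if (hasNode(data.id)) {
    // Keep the first node and warn about the duplicate, as the patch does.
    console.warn('There is already a node with ID %s. Excluding it from further processing.', data.id);
  } else {
    library.nodes[data.id] = data;
  }
  return data;
}

registerNode({ id: 'a', name: 'first' });
registerNode({ id: 'a', name: 'duplicate' }); // warns; 'first' is kept
console.log(library.nodes['a'].name); // -> first
```

The first registration wins; later nodes with the same id only produce a warning, which matches the behavior the patch introduces.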
Text
Text
add whatwg file urls in fs module
c1b3b95939884b20a6312822091b9fec2ef9f31f
<ide><path>doc/api/fs.md <ide> Error: EISDIR: illegal operation on a directory, read <ide> <stack trace.> <ide> ``` <ide> <add>## WHATWG URL object support <add><!-- YAML <add>added: v7.6.0 <add>--> <add> <add>> Stability: 1 - Experimental <add> <add>For most `fs` module functions, the `path` or `filename` argument may be passed <add>as a WHATWG [`URL`][] object. Only [`URL`][] objects using the `file:` protocol <add>are supported. <add> <add>```js <add>const fs = require('fs'); <add>const { URL } = require('url'); <add>const fileUrl = new URL('file:///tmp/hello'); <add> <add>fs.readFileSync(fileUrl); <add>``` <add> <add>*Note*: `file:` URLs are always absolute paths. <add> <add>Using WHATWG [`URL`][] objects might introduce platform-specific behaviors. <add> <add>On Windows, `file:` URLs with a hostname convert to UNC paths, while `file:` <add>URLs with drive letters convert to local absolute paths. `file:` URLs without a <add>hostname or a drive letter will result in a throw: <add> <add>```js <add>// On Windows: <add> <add>// - WHATWG file URLs with hostname convert to UNC path <add>// file://hostname/p/a/t/h/file => \\hostname\p\a\t\h\file <add>fs.readFileSync(new URL('file://hostname/p/a/t/h/file')); <add> <add>// - WHATWG file URLs with drive letters convert to absolute path <add>// file:///C:/tmp/hello => C:\tmp\hello <add>fs.readFileSync(new URL('file:///C:/tmp/hello')); <add> <add>// - WHATWG file URLs without hostname must have a drive letter <add>fs.readFileSync(new URL('file:///notdriveletter/p/a/t/h/file')); <add>fs.readFileSync(new URL('file:///c/p/a/t/h/file')); <add>// TypeError [ERR_INVALID_FILE_URL_PATH]: File URL path must be absolute <add>``` <add> <add>*Note*: `file:` URLs with drive letters must use `:` as a separator just after <add>the drive letter. Using another separator will result in a throw.
<add> <add>On all other platforms, `file:` URLs with a hostname are unsupported and will <add>result in a throw: <add> <add>```js <add>// On other platforms: <add> <add>// - WHATWG file URLs with hostname are unsupported <add>// file://hostname/p/a/t/h/file => throw! <add>fs.readFileSync(new URL('file://hostname/p/a/t/h/file')); <add>// TypeError [ERR_INVALID_FILE_URL_PATH]: must be absolute <add> <add>// - WHATWG file URLs convert to absolute path <add>// file:///tmp/hello => /tmp/hello <add>fs.readFileSync(new URL('file:///tmp/hello')); <add>``` <add> <add>A `file:` URL having encoded slash characters will result in a throw on all <add>platforms: <add> <add>```js <add>// On Windows <add>fs.readFileSync(new URL('file:///C:/p/a/t/h/%2F')); <add>fs.readFileSync(new URL('file:///C:/p/a/t/h/%2f')); <add>/* TypeError [ERR_INVALID_FILE_URL_PATH]: File URL path must not include encoded <add>\ or / characters */ <add> <add>// On POSIX <add>fs.readFileSync(new URL('file:///p/a/t/h/%2F')); <add>fs.readFileSync(new URL('file:///p/a/t/h/%2f')); <add>/* TypeError [ERR_INVALID_FILE_URL_PATH]: File URL path must not include encoded <add>/ characters */ <add>``` <add>On Windows, `file:` URLs having encoded backslash will result in a throw: <add> <add>```js <add>// On Windows <add>fs.readFileSync(new URL('file:///C:/path/%5C')); <add>fs.readFileSync(new URL('file:///C:/path/%5c')); <add>/* TypeError [ERR_INVALID_FILE_URL_PATH]: File URL path must not include encoded <add>\ or / characters */ <add>``` <add> <ide> ## Buffer API <ide> <!-- YAML <ide> added: v6.0.0 <ide> argument to `fs.createWriteStream()`. If `path` is passed as a string, then <ide> ## fs.access(path[, mode], callback) <ide> <!-- YAML <ide> added: v0.11.15 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. 
<ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `mode` {integer} <ide> * `callback` {Function} <ide> <ide> process. <ide> ## fs.accessSync(path[, mode]) <ide> <!-- YAML <ide> added: v0.11.15 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `mode` {integer} <ide> <ide> Synchronous version of [`fs.access()`][]. This throws if any accessibility <ide> The synchronous version of [`fs.appendFile()`][]. Returns `undefined`. <ide> <!-- YAML <ide> added: v0.1.30 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> it will emit a deprecation warning. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `mode` {integer} <ide> * `callback` {Function} <ide> <ide> to the completion callback. <ide> ## fs.chmodSync(path, mode) <ide> <!-- YAML <ide> added: v0.6.7 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `mode` {integer} <ide> <ide> Synchronous chmod(2). Returns `undefined`. <ide> Synchronous chmod(2). Returns `undefined`. 
<ide> <!-- YAML <ide> added: v0.1.97 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> it will emit a deprecation warning. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `uid` {integer} <ide> * `gid` {integer} <ide> * `callback` {Function} <ide> to the completion callback. <ide> ## fs.chownSync(path, uid, gid) <ide> <!-- YAML <ide> added: v0.1.97 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `uid` {integer} <ide> * `gid` {integer} <ide> <ide> operations. The specific constants currently defined are described in <ide> <!-- YAML <ide> added: v0.1.31 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using <add> `file:` protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7831 <ide> description: The passed `options` object will never be modified. <ide> changes: <ide> description: The passed `options` object can be a string now. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `options` {string|Object} <ide> * `flags` {string} <ide> * `encoding` {string} <ide> If `options` is a string, then it specifies the encoding. 
<ide> <!-- YAML <ide> added: v0.1.31 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using <add> `file:` protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7831 <ide> description: The passed `options` object will never be modified. <ide> changes: <ide> description: The passed `options` object can be a string now. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `options` {string|Object} <ide> * `flags` {string} <ide> * `defaultEncoding` {string} <ide> If `options` is a string, then it specifies the encoding. <ide> ## fs.exists(path, callback) <ide> <!-- YAML <ide> added: v0.0.2 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using <add> `file:` protocol. Support is currently still *experimental*. <ide> deprecated: v1.0.0 <ide> --> <ide> <ide> > Stability: 0 - Deprecated: Use [`fs.stat()`][] or [`fs.access()`][] instead. <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `callback` {Function} <ide> <ide> Test whether or not the given path exists by checking with the file system. <ide> process. <ide> ## fs.existsSync(path) <ide> <!-- YAML <ide> added: v0.1.21 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using <add> `file:` protocol. Support is currently still *experimental*. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> <ide> Synchronous version of [`fs.exists()`][]. <ide> Returns `true` if the file exists, `false` otherwise. <ide> Synchronous lchown(2). Returns `undefined`. 
<ide> <!-- YAML <ide> added: v0.1.31 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `existingPath` and `newPath` parameters can be WHATWG <add> `URL` objects using `file:` protocol. Support is currently <add> still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> it will emit a deprecation warning. <ide> --> <ide> <del>* `existingPath` {string|Buffer} <del>* `newPath` {string|Buffer} <add>* `existingPath` {string|Buffer|URL} <add>* `newPath` {string|Buffer|URL} <ide> * `callback` {Function} <ide> <ide> Asynchronous link(2). No arguments other than a possible exception are given to <ide> the completion callback. <ide> ## fs.linkSync(existingPath, newPath) <ide> <!-- YAML <ide> added: v0.1.31 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `existingPath` and `newPath` parameters can be WHATWG <add> `URL` objects using `file:` protocol. Support is currently <add> still *experimental*. <ide> --> <ide> <del>* `existingPath` {string|Buffer} <del>* `newPath` {string|Buffer} <add>* `existingPath` {string|Buffer|URL} <add>* `newPath` {string|Buffer|URL} <ide> <ide> Synchronous link(2). Returns `undefined`. <ide> <ide> ## fs.lstat(path, callback) <ide> <!-- YAML <ide> added: v0.1.30 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> it will emit a deprecation warning. 
<ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `callback` {Function} <ide> <ide> Asynchronous lstat(2). The callback gets two arguments `(err, stats)` where <ide> not the file that it refers to. <ide> ## fs.lstatSync(path) <ide> <!-- YAML <ide> added: v0.1.30 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> <ide> Synchronous lstat(2). Returns an instance of [`fs.Stats`][]. <ide> <ide> ## fs.mkdir(path[, mode], callback) <ide> <!-- YAML <ide> added: v0.1.8 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> it will emit a deprecation warning. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `mode` {integer} <ide> * `callback` {Function} <ide> <ide> to the completion callback. `mode` defaults to `0o777`. <ide> ## fs.mkdirSync(path[, mode]) <ide> <!-- YAML <ide> added: v0.1.21 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `mode` {integer} <ide> <ide> Synchronous mkdir(2). Returns `undefined`. <ide> object with an `encoding` property specifying the character encoding to use. 
<ide> ## fs.open(path, flags[, mode], callback) <ide> <!-- YAML <ide> added: v0.0.2 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `flags` {string|number} <ide> * `mode` {integer} <ide> * `callback` {Function} <ide> fs.open('<directory>', 'a+', (err, fd) => { <ide> ## fs.openSync(path, flags[, mode]) <ide> <!-- YAML <ide> added: v0.1.21 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `flags` {string|number} <ide> * `mode` {integer} <ide> <ide> The callback is given the three arguments, `(err, bytesRead, buffer)`. <ide> <!-- YAML <ide> added: v0.1.8 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> changes: <ide> description: The `options` parameter was added. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `options` {string|Object} <ide> * `encoding` {string} default = `'utf8'` <ide> * `callback` {Function} <ide> the filenames returned will be passed as `Buffer` objects. 
<ide> ## fs.readdirSync(path[, options]) <ide> <!-- YAML <ide> added: v0.1.21 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `options` {string|Object} <ide> * `encoding` {string} default = `'utf8'` <ide> <ide> object with an `encoding` property specifying the character encoding to use for <ide> the filenames passed to the callback. If the `encoding` is set to `'buffer'`, <ide> the filenames returned will be passed as `Buffer` objects. <ide> <del>## fs.readFile(file[, options], callback) <add>## fs.readFile(path[, options], callback) <ide> <!-- YAML <ide> added: v0.1.29 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> changes: <ide> description: The `file` parameter can be a file descriptor now. <ide> --> <ide> <del>* `file` {string|Buffer|integer} filename or file descriptor <add>* `path` {string|Buffer|URL|integer} filename or file descriptor <ide> * `options` {Object|string} <ide> * `encoding` {string|null} default = `null` <ide> * `flag` {string} default = `'r'` <ide> fs.readFile('/etc/passwd', 'utf8', callback); <ide> <ide> Any specified file descriptor has to support reading. 
<ide> <del>_Note: If a file descriptor is specified as the `file`, it will not be closed <add>_Note: If a file descriptor is specified as the `path`, it will not be closed <ide> automatically._ <ide> <del>## fs.readFileSync(file[, options]) <add>## fs.readFileSync(path[, options]) <ide> <!-- YAML <ide> added: v0.1.8 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> - version: v5.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/3163 <ide> description: The `file` parameter can be a file descriptor now. <ide> --> <ide> <del>* `file` {string|Buffer|integer} filename or file descriptor <add>* `path` {string|Buffer|URL|integer} filename or file descriptor <ide> * `options` {Object|string} <ide> * `encoding` {string|null} default = `null` <ide> * `flag` {string} default = `'r'` <ide> string. Otherwise it returns a buffer. <ide> <!-- YAML <ide> added: v0.1.31 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> it will emit a deprecation warning. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `options` {string|Object} <ide> * `encoding` {string} default = `'utf8'` <ide> * `callback` {Function} <ide> the link path returned will be passed as a `Buffer` object. 
<ide> ## fs.readlinkSync(path[, options]) <ide> <!-- YAML <ide> added: v0.1.31 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `options` {string|Object} <ide> * `encoding` {string} default = `'utf8'` <ide> <ide> Synchronous version of [`fs.read()`][]. Returns the number of `bytesRead`. <ide> <!-- YAML <ide> added: v0.1.31 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using <add> `file:` protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> changes: <ide> description: The `cache` parameter was removed. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `options` {string|Object} <ide> * `encoding` {string} default = `'utf8'` <ide> * `callback` {Function} <ide> the path returned will be passed as a `Buffer` object. <ide> <!-- YAML <ide> added: v0.1.31 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using <add> `file:` protocol. Support is currently still *experimental*. <ide> - version: v6.4.0 <ide> pr-url: https://github.com/nodejs/node/pull/7899 <ide> description: Calling `realpathSync` now works again for various edge cases <ide> changes: <ide> description: The `cache` parameter was removed. 
<ide> --> <ide> <del>* `path` {string|Buffer}; <add>* `path` {string|Buffer|URL} <ide> * `options` {string|Object} <ide> * `encoding` {string} default = `'utf8'` <ide> <ide> will be passed as a `Buffer` object. <ide> <!-- YAML <ide> added: v0.0.2 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `oldPath` and `newPath` parameters can be WHATWG `URL` <add> objects using `file:` protocol. Support is currently still <add> *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> it will emit a deprecation warning. <ide> --> <ide> <del>* `oldPath` {string|Buffer} <del>* `newPath` {string|Buffer} <add>* `oldPath` {string|Buffer|URL} <add>* `newPath` {string|Buffer|URL} <ide> * `callback` {Function} <ide> <ide> Asynchronous rename(2). No arguments other than a possible exception are given <ide> to the completion callback. <ide> ## fs.renameSync(oldPath, newPath) <ide> <!-- YAML <ide> added: v0.1.21 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `oldPath` and `newPath` parameters can be WHATWG `URL` <add> objects using `file:` protocol. Support is currently still <add> *experimental*. <ide> --> <ide> <del>* `oldPath` {string|Buffer} <del>* `newPath` {string|Buffer} <add>* `oldPath` {string|Buffer|URL} <add>* `newPath` {string|Buffer|URL} <ide> <ide> Synchronous rename(2). Returns `undefined`. <ide> <ide> ## fs.rmdir(path, callback) <ide> <!-- YAML <ide> added: v0.0.2 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameters can be a WHATWG `URL` object using <add> `file:` protocol. Support is currently still *experimental*. 
<ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> it will emit a deprecation warning. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `callback` {Function} <ide> <ide> Asynchronous rmdir(2). No arguments other than a possible exception are given <ide> to the completion callback. <ide> ## fs.rmdirSync(path) <ide> <!-- YAML <ide> added: v0.1.21 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameters can be a WHATWG `URL` object using <add> `file:` protocol. Support is currently still *experimental*. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> <ide> Synchronous rmdir(2). Returns `undefined`. <ide> <ide> ## fs.stat(path, callback) <ide> <!-- YAML <ide> added: v0.0.2 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> it will emit a deprecation warning. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `callback` {Function} <ide> <ide> Asynchronous stat(2). The callback gets two arguments `(err, stats)` where <ide> is recommended. <ide> ## fs.statSync(path) <ide> <!-- YAML <ide> added: v0.1.21 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. 
<ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> <ide> Synchronous stat(2). Returns an instance of [`fs.Stats`][]. <ide> <ide> ## fs.symlink(target, path[, type], callback) <ide> <!-- YAML <ide> added: v0.1.31 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `target` and `path` parameters can be WHATWG `URL` objects <add> using `file:` protocol. Support is currently still <add> *experimental*. <ide> --> <ide> <del>* `target` {string|Buffer} <del>* `path` {string|Buffer} <add>* `target` {string|Buffer|URL} <add>* `path` {string|Buffer|URL} <ide> * `type` {string} <ide> * `callback` {Function} <ide> <ide> It creates a symbolic link named "new-port" that points to "foo". <ide> ## fs.symlinkSync(target, path[, type]) <ide> <!-- YAML <ide> added: v0.1.31 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `target` and `path` parameters can be WHATWG `URL` objects <add> using `file:` protocol. Support is currently still <add> *experimental*. <ide> --> <ide> <del>* `target` {string|Buffer} <del>* `path` {string|Buffer} <add>* `target` {string|Buffer|URL} <add>* `path` {string|Buffer|URL} <ide> * `type` {string} <ide> <ide> Synchronous symlink(2). Returns `undefined`. <ide> passed as the first argument. In this case, `fs.ftruncateSync()` is called. <ide> <!-- YAML <ide> added: v0.0.2 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> it will emit a deprecation warning. 
<ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `callback` {Function} <ide> <ide> Asynchronous unlink(2). No arguments other than a possible exception are given <ide> to the completion callback. <ide> ## fs.unlinkSync(path) <ide> <!-- YAML <ide> added: v0.1.21 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> <ide> Synchronous unlink(2). Returns `undefined`. <ide> <ide> when possible._ <ide> <!-- YAML <ide> added: v0.4.2 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7897 <ide> description: The `callback` parameter is no longer optional. Not passing <ide> changes: <ide> time specifiers. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `atime` {integer} <ide> * `mtime` {integer} <ide> * `callback` {Function} <ide> follow these rules: <ide> <!-- YAML <ide> added: v0.4.2 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `path` parameter can be a WHATWG `URL` object using `file:` <add> protocol. Support is currently still *experimental*. <ide> - version: v4.1.0 <ide> pr-url: https://github.com/nodejs/node/pull/2387 <ide> description: Numeric strings, `NaN` and `Infinity` are now allowed <ide> time specifiers. <ide> --> <ide> <del>* `path` {string|Buffer} <add>* `path` {string|Buffer|URL} <ide> * `atime` {integer} <ide> * `mtime` {integer} <ide> <ide> Synchronous version of [`fs.utimes()`][]. 
Returns `undefined`. <ide> <!-- YAML <ide> added: v0.5.10 <ide> changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `filename` parameter can be a WHATWG `URL` object using <add> `file:` protocol. Support is currently still *experimental*. <ide> - version: v7.0.0 <ide> pr-url: https://github.com/nodejs/node/pull/7831 <ide> description: The passed `options` object will never be modified. <ide> --> <ide> <del>* `filename` {string|Buffer} <add>* `filename` {string|Buffer|URL} <ide> * `options` {string|Object} <ide> * `persistent` {boolean} Indicates whether the process should continue to run <ide> as long as files are being watched. default = `true` <ide> fs.watch('somedir', (eventType, filename) => { <ide> ## fs.watchFile(filename[, options], listener) <ide> <!-- YAML <ide> added: v0.1.31 <add>changes: <add> - version: v7.6.0 <add> pr-url: https://github.com/nodejs/node/pull/10739 <add> description: The `filename` parameter can be a WHATWG `URL` object using <add> `file:` protocol. Support is currently still *experimental*. <ide> --> <ide> <del>* `filename` {string|Buffer} <add>* `filename` {string|Buffer|URL} <ide> * `options` {Object} <ide> * `persistent` {boolean} <ide> * `interval` {integer} <ide> The following constants are meant for use with the [`fs.Stats`][] object's <ide> [`AHAFS`]: https://www.ibm.com/developerworks/aix/library/au-aix_event_infrastructure/ <ide> [Common System Errors]: errors.html#errors_common_system_errors <ide> [`Uint8Array`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array <add>[`URL`]: url.html#url_the_whatwg_url_api
1
Go
Go
add comment to put()
a1851a6d3e69d67bfc4e6bfdec85ba58293351b9
<ide><path>graphdriver/btrfs/btrfs.go
<ide> func (d *Driver) Get(id string) (string, error) {
<ide> }
<ide> 
<ide> func (d *Driver) Put(id string) {
<add>	// Get() creates no runtime resources (like e.g. mounts)
<add>	// so this doesn't need to do anything.
<ide> }
<ide> 
<ide> func (d *Driver) Exists(id string) bool {
1
Text
Text
add keytool for windows details
c2fd4d48bf7052722b4712a5ddcc1a9d0cbf3bc2
<ide><path>docs/SignedAPKAndroid.md
<ide> Android requires that all apps be digitally signed with a certificate before the
<ide> 
<ide> ### Generating a signing key
<ide> 
<del>You can generate a private signing key using `keytool`.
<add>You can generate a private signing key using `keytool`. On Windows `keytool` must be run from `C:\Program Files\Java\jdkx.x.x_x\bin`.
<ide> 
<ide>     $ keytool -genkey -v -keystore my-release-key.keystore -alias my-key-alias -keyalg RSA -keysize 2048 -validity 10000
<ide> 
1
Javascript
Javascript
remove suffix _ir
261ff39f6df936af16b0ebdf1bff49a3ffaaa798
<ide><path>src/canvas.js
<ide> var CanvasGraphics = (function CanvasGraphicsClosure() {
<ide>     'setStrokeColor': true,
<ide>     'setStrokeColorN': true,
<ide>     'setFillColor': true,
<del>    'setFillColorN_IR': true,
<add>    'setFillColorN': true,
<ide>     'setStrokeGray': true,
<ide>     'setFillGray': true,
<ide>     'setStrokeRGBColor': true,
<ide> var CanvasGraphics = (function CanvasGraphicsClosure() {
<ide>       this.ctx.strokeStyle = color;
<ide>       this.current.strokeColor = color;
<ide>     },
<del>    getColorN_IR_Pattern: function canvasGraphicsGetColorN_IR_Pattern(IR, cs) {
<add>    getColorN_Pattern: function canvasGraphicsGetColorN_Pattern(IR, cs) {
<ide>       if (IR[0] == 'TilingPattern') {
<ide>         var args = IR[1];
<ide>         var base = cs.base;
<ide> var CanvasGraphics = (function CanvasGraphicsClosure() {
<ide>       }
<ide>       return pattern;
<ide>     },
<del>    setStrokeColorN_IR: function canvasGraphicsSetStrokeColorN(/*...*/) {
<add>    setStrokeColorN: function canvasGraphicsSetStrokeColorN(/*...*/) {
<ide>       var cs = this.current.strokeColorSpace;
<ide> 
<ide>       if (cs.name == 'Pattern') {
<del>        this.current.strokeColor = this.getColorN_IR_Pattern(arguments, cs);
<add>        this.current.strokeColor = this.getColorN_Pattern(arguments, cs);
<ide>       } else {
<ide>         this.setStrokeColor.apply(this, arguments);
<ide>       }
<ide> var CanvasGraphics = (function CanvasGraphicsClosure() {
<ide>       this.ctx.fillStyle = color;
<ide>       this.current.fillColor = color;
<ide>     },
<del>    setFillColorN_IR: function canvasGraphicsSetFillColorN(/*...*/) {
<add>    setFillColorN: function canvasGraphicsSetFillColorN(/*...*/) {
<ide>       var cs = this.current.fillColorSpace;
<ide> 
<ide>       if (cs.name == 'Pattern') {
<del>        this.current.fillColor = this.getColorN_IR_Pattern(arguments, cs);
<add>        this.current.fillColor = this.getColorN_Pattern(arguments, cs);
<ide>       } else {
<ide>         this.setFillColor.apply(this, arguments);
<ide>       }
<ide><path>src/evaluator.js
<ide> var PartialEvaluator = (function PartialEvaluatorClosure() {
<ide>           // TODO figure out how to type-check vararg functions
<ide> 
<ide>           if ((cmd == 'SCN' || cmd == 'scn') && !args[args.length - 1].code) {
<del>            // Use the IR version for setStroke/FillColorN.
<del>            fn += '_IR';
<del>
<ide>             // compile tiling patterns
<ide>             var patternName = args[args.length - 1];
<ide>             // SCN/scn applies patterns along with normal colors
2
PHP
PHP
fix return type
a41e27d6f7d3e23580a6a3ca8a8ffe90fb5f1819
<ide><path>src/Event/EventManager.php
<ide> protected function _attachSubscriber(EventListenerInterface $subscriber): void
<ide>      *
<ide>      * @param array $function the array taken from a handler definition for an event
<ide>      * @param \Cake\Event\EventListenerInterface $object The handler object
<del>     * @return callable
<add>     * @return array
<add>     * @psalm-return array{callable, array}
<ide>      */
<del>    protected function _extractCallable(array $function, EventListenerInterface $object)
<add>    protected function _extractCallable(array $function, EventListenerInterface $object): array
<ide>     {
<add>        /** @var callable $method */
<ide>         $method = $function['callable'];
<ide>         $options = $function;
<ide>         unset($options['callable']);
<ide>         if (is_string($method)) {
<add>            /** @var callable $method */
<ide>             $method = [$object, $method];
<ide>         }
<ide> 
<del>        /** @var callable $callable */
<del>        $callable = [$method, $options];
<del>
<del>        return $callable;
<add>        return [$method, $options];
<ide>     }
<ide> 
<ide>     /**
1
Java
Java
add stompprotocolhandler tests
39ff1e2c5375868a92318251f3015114e8b49d0d
<ide><path>spring-messaging/src/main/java/org/springframework/messaging/simp/stomp/StompProtocolHandler.java
<ide> import org.springframework.messaging.handler.websocket.SubProtocolHandler;
<ide> import org.springframework.messaging.simp.SimpMessageType;
<ide> import org.springframework.messaging.simp.handler.MutableUserQueueSuffixResolver;
<add>import org.springframework.messaging.simp.handler.SimpleUserQueueSuffixResolver;
<ide> import org.springframework.messaging.support.MessageBuilder;
<ide> import org.springframework.util.Assert;
<ide> import org.springframework.web.socket.CloseStatus;
<ide> public class StompProtocolHandler implements SubProtocolHandler {
<ide> 
<ide> 	private final StompMessageConverter stompMessageConverter = new StompMessageConverter();
<ide> 
<del>	private MutableUserQueueSuffixResolver queueSuffixResolver;
<add>	private MutableUserQueueSuffixResolver queueSuffixResolver = new SimpleUserQueueSuffixResolver();
<ide> 
<ide> 
<ide> 	/**
<ide><path>spring-messaging/src/test/java/org/springframework/messaging/simp/config/WebSocketMessageBrokerConfigurationTests.java
<ide> import org.springframework.beans.factory.annotation.Autowired;
<ide> import org.springframework.context.annotation.Bean;
<ide> import org.springframework.context.annotation.Configuration;
<del>import org.springframework.messaging.Message;
<ide> import org.springframework.messaging.SubscribableChannel;
<ide> import org.springframework.messaging.handler.annotation.MessageMapping;
<ide> import org.springframework.messaging.simp.AbstractWebSocketIntegrationTests;
<ide> import org.springframework.messaging.simp.JettyTestServer;
<ide> import org.springframework.messaging.simp.stomp.StompCommand;
<del>import org.springframework.messaging.simp.stomp.StompHeaderAccessor;
<del>import org.springframework.messaging.simp.stomp.StompMessageConverter;
<del>import org.springframework.messaging.support.MessageBuilder;
<add>import org.springframework.messaging.simp.stomp.StompTextMessageBuilder;
<ide> import org.springframework.messaging.support.channel.ExecutorSubscribableChannel;
<ide> import org.springframework.stereotype.Controller;
<ide> import org.springframework.web.context.support.AnnotationConfigWebApplicationContext;
<ide> public void sendMessage() throws Exception {
<ide> 		this.server.init(cxt);
<ide> 		this.server.start();
<ide> 
<del>		StompHeaderAccessor headers = StompHeaderAccessor.create(StompCommand.SEND);
<del>		headers.setDestination("/app/foo");
<del>		Message<byte[]> message = MessageBuilder.withPayloadAndHeaders(new byte[0], headers).build();
<del>		byte[] bytes = new StompMessageConverter().fromMessage(message);
<del>		final TextMessage webSocketMessage = new TextMessage(new String(bytes));
<add>		final TextMessage textMessage = StompTextMessageBuilder.create(StompCommand.SEND)
<add>				.headers("destination:/app/foo").build();
<ide> 
<ide> 		WebSocketHandler clientHandler = new TextWebSocketHandlerAdapter() {
<ide> 			@Override
<ide> 			public void afterConnectionEstablished(WebSocketSession session) throws Exception {
<del>				session.sendMessage(webSocketMessage);
<add>				session.sendMessage(textMessage);
<ide> 			}
<ide> 		};
<ide> 
<ide><path>spring-messaging/src/test/java/org/springframework/messaging/simp/stomp/StompMessageConverterTests.java
<ide> import org.springframework.messaging.MessageHeaders;
<ide> import org.springframework.messaging.simp.SimpMessageHeaderAccessor;
<ide> import org.springframework.messaging.simp.SimpMessageType;
<add>import org.springframework.web.socket.TextMessage;
<ide> 
<ide> import static org.junit.Assert.*;
<ide> 
<ide> public void setup() {
<ide> 		this.converter = new StompMessageConverter();
<ide> 	}
<ide> 
<del>	@SuppressWarnings("unchecked")
<ide> 	@Test
<ide> 	public void connectFrame() throws Exception {
<ide> 
<del>		String accept = "accept-version:1.1\n";
<del>		String host = "host:github.org\n";
<del>		String frame = "\n\n\nCONNECT\n" + accept + host + "\n";
<del>		Message<byte[]> message = (Message<byte[]>) this.converter.toMessage(frame.getBytes("UTF-8"));
<add>		String accept = "accept-version:1.1";
<add>		String host = "host:github.org";
<add>
<add>		TextMessage textMessage = StompTextMessageBuilder.create(StompCommand.CONNECT)
<add>				.headers(accept, host).build();
<add>
<add>		@SuppressWarnings("unchecked")
<add>		Message<byte[]> message = (Message<byte[]>) this.converter.toMessage(textMessage.getPayload());
<ide> 
<ide> 		assertEquals(0, message.getPayload().length);
<ide> 
<ide> public void connectFrame() throws Exception {
<ide> 	@Test
<ide> 	public void connectWithEscapes() throws Exception {
<ide> 
<del>		String accept = "accept-version:1.1\n";
<del>		String host = "ho\\c\\ns\\rt:st\\nomp.gi\\cthu\\b.org\n";
<del>		String frame = "CONNECT\n" + accept + host + "\n";
<add>		String accept = "accept-version:1.1";
<add>		String host = "ho\\c\\ns\\rt:st\\nomp.gi\\cthu\\b.org";
<add>
<add>		TextMessage textMessage = StompTextMessageBuilder.create(StompCommand.CONNECT)
<add>				.headers(accept, host).build();
<add>
<ide> 		@SuppressWarnings("unchecked")
<del>		Message<byte[]> message = (Message<byte[]>) this.converter.toMessage(frame.getBytes("UTF-8"));
<add>		Message<byte[]> message = (Message<byte[]>) this.converter.toMessage(textMessage.getPayload());
<ide> 
<ide> 		assertEquals(0, message.getPayload().length);
<ide> 
<ide><path>spring-messaging/src/test/java/org/springframework/messaging/simp/stomp/StompProtocolHandlerTests.java
<add>/*
<add> * Copyright 2002-2013 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>package org.springframework.messaging.simp.stomp;
<add>
<add>import java.util.Arrays;
<add>import java.util.HashSet;
<add>
<add>import org.junit.Before;
<add>import org.junit.Test;
<add>import org.mockito.ArgumentCaptor;
<add>import org.mockito.Mockito;
<add>import org.springframework.messaging.Message;
<add>import org.springframework.messaging.MessageChannel;
<add>import org.springframework.web.socket.TextMessage;
<add>import org.springframework.web.socket.support.TestPrincipal;
<add>import org.springframework.web.socket.support.TestWebSocketSession;
<add>
<add>import static org.junit.Assert.*;
<add>import static org.mockito.Mockito.*;
<add>
<add>/**
<add> * Test fixture for {@link StompProtocolHandler} tests.
<add> *
<add> * @author Rossen Stoyanchev
<add> */
<add>public class StompProtocolHandlerTests {
<add>
<add>	private StompProtocolHandler stompHandler;
<add>
<add>	private TestWebSocketSession session;
<add>
<add>	private MessageChannel channel;
<add>
<add>	private ArgumentCaptor<Message> messageCaptor;
<add>
<add>
<add>	@Before
<add>	public void setup() {
<add>		this.stompHandler = new StompProtocolHandler();
<add>		this.channel = Mockito.mock(MessageChannel.class);
<add>		this.messageCaptor = ArgumentCaptor.forClass(Message.class);
<add>
<add>		this.session = new TestWebSocketSession();
<add>		this.session.setId("s1");
<add>		this.session.setPrincipal(new TestPrincipal("joe"));
<add>	}
<add>
<add>	@Test
<add>	public void handleConnect() {
<add>
<add>		TextMessage textMessage = StompTextMessageBuilder.create(StompCommand.CONNECT).headers(
<add>				"login:guest", "passcode:guest", "accept-version:1.1,1.0", "heart-beat:10000,10000").build();
<add>
<add>		this.stompHandler.handleMessageFromClient(this.session, textMessage, this.channel);
<add>
<add>		verify(this.channel).send(this.messageCaptor.capture());
<add>		Message<?> actual = this.messageCaptor.getValue();
<add>		assertNotNull(actual);
<add>
<add>		StompHeaderAccessor headers = StompHeaderAccessor.wrap(actual);
<add>		assertEquals(StompCommand.CONNECT, headers.getCommand());
<add>		assertEquals("s1", headers.getSessionId());
<add>		assertEquals("joe", headers.getUser().getName());
<add>		assertEquals("guest", headers.getLogin());
<add>		assertEquals("PROTECTED", headers.getPasscode());
<add>		assertArrayEquals(new long[] {10000, 10000}, headers.getHeartbeat());
<add>		assertEquals(new HashSet<>(Arrays.asList("1.1","1.0")), headers.getAcceptVersion());
<add>
<add>		// Check CONNECTED reply
<add>
<add>		assertEquals(1, this.session.getSentMessages().size());
<add>		textMessage = (TextMessage) this.session.getSentMessages().get(0);
<add>		Message<?> message = new StompMessageConverter().toMessage(textMessage.getPayload());
<add>		StompHeaderAccessor replyHeaders = StompHeaderAccessor.wrap(message);
<add>
<add>		assertEquals(StompCommand.CONNECTED, replyHeaders.getCommand());
<add>		assertEquals("1.1", replyHeaders.getVersion());
<add>		assertArrayEquals(new long[] {0, 0}, replyHeaders.getHeartbeat());
<add>		assertEquals("joe", replyHeaders.getNativeHeader("user-name").get(0));
<add>		assertEquals("s1", replyHeaders.getNativeHeader("queue-suffix").get(0));
<add>	}
<add>
<add>}
<ide><path>spring-messaging/src/test/java/org/springframework/messaging/simp/stomp/StompTextMessageBuilder.java
<add>/*
<add> * Copyright 2002-2013 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>package org.springframework.messaging.simp.stomp;
<add>
<add>import java.util.ArrayList;
<add>import java.util.Arrays;
<add>import java.util.List;
<add>
<add>import org.springframework.web.socket.TextMessage;
<add>
<add>
<add>/**
<add> * A builder for creating WebSocket messages with STOMP frame content.
<add> *
<add> * @author Rossen Stoyanchev
<add> */
<add>public class StompTextMessageBuilder {
<add>
<add>	private StompCommand command;
<add>
<add>	private final List<String> headerLines = new ArrayList<String>();
<add>
<add>	private String body;
<add>
<add>
<add>	private StompTextMessageBuilder(StompCommand command) {
<add>		this.command = command;
<add>	}
<add>
<add>	public static StompTextMessageBuilder create(StompCommand command) {
<add>		return new StompTextMessageBuilder(command);
<add>	}
<add>
<add>	public StompTextMessageBuilder headers(String... headerLines) {
<add>		this.headerLines.addAll(Arrays.asList(headerLines));
<add>		return this;
<add>	}
<add>
<add>	public StompTextMessageBuilder body(String body) {
<add>		this.body = body;
<add>		return this;
<add>	}
<add>
<add>	public TextMessage build() {
<add>		StringBuilder sb = new StringBuilder(this.command.name()).append("\n");
<add>		for (String line : this.headerLines) {
<add>			sb.append(line).append("\n");
<add>		}
<add>		sb.append("\n");
<add>		if (this.body != null) {
<add>			sb.append(this.body);
<add>		}
<add>		sb.append("\u0000");
<add>		return new TextMessage(sb.toString());
<add>	}
<add>
<add>}
<ide><path>spring-websocket/src/test/java/org/springframework/web/socket/server/config/WebSocketConfigurationTests.java
<ide> package org.springframework.web.socket.server.config;
<ide> 
<ide> import java.util.Arrays;
<add>import java.util.concurrent.CountDownLatch;
<add>import java.util.concurrent.TimeUnit;
<ide> 
<ide> import org.junit.Test;
<ide> import org.junit.runner.RunWith;
<ide> import org.junit.runners.Parameterized;
<ide> import org.junit.runners.Parameterized.Parameters;
<del>import org.mockito.Mockito;
<ide> import org.springframework.beans.factory.annotation.Autowired;
<ide> import org.springframework.context.annotation.Bean;
<ide> import org.springframework.context.annotation.Configuration;
<ide> import org.springframework.web.context.support.AnnotationConfigWebApplicationContext;
<ide> import org.springframework.web.socket.AbstractWebSocketIntegrationTests;
<ide> import org.springframework.web.socket.JettyTestServer;
<del>import org.springframework.web.socket.WebSocketHandler;
<ide> import org.springframework.web.socket.WebSocketSession;
<add>import org.springframework.web.socket.adapter.WebSocketHandlerAdapter;
<ide> import org.springframework.web.socket.client.jetty.JettyWebSocketClient;
<ide> import org.springframework.web.socket.server.HandshakeHandler;
<ide> import org.springframework.web.socket.sockjs.transport.handler.WebSocketTransportHandler;
<ide> 
<del>import static org.mockito.Matchers.*;
<del>import static org.mockito.Mockito.*;
<add>import static org.junit.Assert.*;
<ide> 
<ide> 
<ide> /**
<ide> public void registerWebSocketHandler() throws Exception {
<ide> 		this.server.init(cxt);
<ide> 		this.server.start();
<ide> 
<del>		WebSocketHandler clientHandler = Mockito.mock(WebSocketHandler.class);
<del>		WebSocketHandler serverHandler = cxt.getBean(WebSocketHandler.class);
<add>		this.webSocketClient.doHandshake(new WebSocketHandlerAdapter(), getWsBaseUrl() + "/ws");
<ide> 
<del>		this.webSocketClient.doHandshake(clientHandler, getWsBaseUrl() + "/ws");
<del>
<del>		verify(serverHandler).afterConnectionEstablished(any(WebSocketSession.class));
<del>		verify(clientHandler).afterConnectionEstablished(any(WebSocketSession.class));
<add>		TestWebSocketHandler serverHandler = cxt.getBean(TestWebSocketHandler.class);
<add>		assertTrue(serverHandler.latch.await(2, TimeUnit.SECONDS));
<ide> 	}
<ide> 
<ide> 	@Test
<ide> public void registerWebSocketHandlerWithSockJS() throws Exception {
<ide> 		this.server.init(cxt);
<ide> 		this.server.start();
<ide> 
<del>		WebSocketHandler clientHandler = Mockito.mock(WebSocketHandler.class);
<del>		WebSocketHandler serverHandler = cxt.getBean(WebSocketHandler.class);
<del>
<del>		this.webSocketClient.doHandshake(clientHandler, getWsBaseUrl() + "/sockjs/websocket");
<add>		this.webSocketClient.doHandshake(new WebSocketHandlerAdapter(), getWsBaseUrl() + "/sockjs/websocket");
<ide> 
<del>		verify(serverHandler).afterConnectionEstablished(any(WebSocketSession.class));
<del>		verify(clientHandler).afterConnectionEstablished(any(WebSocketSession.class));
<add>		TestWebSocketHandler serverHandler = cxt.getBean(TestWebSocketHandler.class);
<add>		assertTrue(serverHandler.latch.await(2, TimeUnit.SECONDS));
<ide> 	}
<ide> 
<ide> 
<ide> public void registerWebSocketHandlers(WebSocketHandlerRegistry registry) {
<ide> 		}
<ide> 
<ide> 		@Bean
<del>		public WebSocketHandler serverHandler() {
<del>			return Mockito.mock(WebSocketHandler.class);
<add>		public TestWebSocketHandler serverHandler() {
<add>			return new TestWebSocketHandler();
<add>		}
<add>	}
<add>
<add>	private static class TestWebSocketHandler extends WebSocketHandlerAdapter {
<add>
<add>		private CountDownLatch latch = new CountDownLatch(1);
<add>
<add>		@Override
<add>		public void afterConnectionEstablished(WebSocketSession session) throws Exception {
<add>			this.latch.countDown();
<ide> 		}
<ide> 	}
<ide> 
<ide><path>spring-websocket/src/test/java/org/springframework/web/socket/support/TestPrincipal.java
<add>/*
<add> * Copyright 2002-2013 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>package org.springframework.web.socket.support;
<add>
<add>import java.security.Principal;
<add>
<add>
<add>/**
<add> * An implementation of Prinicipal for testing.
<add> * @author Rossen Stoyanchev
<add> */
<add>public class TestPrincipal implements Principal {
<add>
<add>	private String name;
<add>
<add>	public TestPrincipal(String name) {
<add>		this.name = name;
<add>	}
<add>
<add>	@Override
<add>	public String getName() {
<add>		return this.name;
<add>	}
<add>
<add>	@Override
<add>	public boolean equals(Object obj) {
<add>		if (obj == this) {
<add>			return true;
<add>		}
<add>		if (!(obj instanceof TestPrincipal)) {
<add>			return false;
<add>		}
<add>		TestPrincipal p = (TestPrincipal) obj;
<add>		return this.name.equals(p.name);
<add>	}
<add>
<add>	@Override
<add>	public int hashCode() {
<add>		return this.name.hashCode();
<add>	}
<add>
<add>}
7
Text
Text
remove unused link [ci skip]
04d36d2471bc48548abbae6c8913b2371b68d3bf
<ide><path>website/docs/usage/v2.md
<ide> process.
<ide> 
<ide> <Infobox>
<ide> 
<del>**Usage:** [Models directory](/models) [Benchmarks](#benchmarks)
<add>**Usage:** [Models directory](/models)
<ide> 
<ide> </Infobox>
<ide> 
1
Ruby
Ruby
add git tests
0b2cc5c20db30f5d0046091f8c2318752f7b0659
<ide><path>Library/Homebrew/test/test_download_strategies.rb
<ide>   def cache_tag
<ide>   end
<ide> end
<ide> 
<add>class GitDownloadStrategyTests < Homebrew::TestCase
<add>  include FileUtils
<add>
<add>  def setup
<add>    resource = ResourceDouble.new("https://github.com/homebrew/foo")
<add>    @commit_id = 1
<add>    @strategy = GitDownloadStrategy.new("baz", resource)
<add>    @cached_location = @strategy.cached_location
<add>    mkpath @cached_location
<add>    touch @cached_location/"README"
<add>  end
<add>
<add>  def teardown
<add>    rmtree @cached_location
<add>  end
<add>
<add>  def git_commit_all
<add>    shutup do
<add>      system "git", "add", "--all"
<add>      system "git", "commit", "-m", "commit number #{@commit_id}"
<add>      @commit_id += 1
<add>    end
<add>  end
<add>
<add>  def inside_repo_using_git_env
<add>    initial_env = ENV.to_hash
<add>    %w[AUTHOR COMMITTER].each do |role|
<add>      ENV["GIT_#{role}_NAME"] = "brew tests"
<add>      ENV["GIT_#{role}_EMAIL"] = "brew-tests@localhost"
<add>      ENV["GIT_#{role}_DATE"] = "Thu May 21 00:04:11 2009 +0100"
<add>    end
<add>    @cached_location.cd do
<add>      yield
<add>    end
<add>  ensure
<add>    ENV.replace(initial_env)
<add>  end
<add>
<add>  def setup_git_repo
<add>    inside_repo_using_git_env do
<add>      shutup do
<add>        system "git", "init"
<add>        system "git", "remote", "add", "origin", "https://github.com/Homebrew/homebrew-foo"
<add>      end
<add>      git_commit_all
<add>    end
<add>  end
<add>
<add>  def test_source_modified_time
<add>    setup_git_repo
<add>    assert_equal 1242860651, @strategy.source_modified_time.to_i
<add>  end
<add>
<add>  def test_last_commit
<add>    setup_git_repo
<add>    inside_repo_using_git_env do
<add>      touch "LICENSE"
<add>      git_commit_all
<add>    end
<add>    assert_equal "c50c79b", @strategy.last_commit
<add>  end
<add>end
<add>
<ide> class DownloadStrategyDetectorTests < Homebrew::TestCase
<ide>   def setup
<ide>     @d = DownloadStrategyDetector.new
1
Text
Text
add weight to daemon page so it renders in order
2a841573521cc9d62d7b3f06143334890662cd54
<ide><path>docs/reference/commandline/daemon.md
<ide> description = "The daemon command description and usage"
<ide> keywords = ["container, daemon, runtime"]
<ide> [menu.main]
<ide> parent = "smn_cli"
<add>weight=1
<ide> +++
<ide> <![end-metadata]-->
<ide> 
1
Javascript
Javascript
use hardcoded value for proptype secret
2c93a41580030ae44105626cdd19c97704128771
<ide><path>src/isomorphic/classic/types/ReactPropTypesSecret.js
<ide> 'use strict';
<ide> 
<ide> 
<del>const ReactPropTypesSecret = '__REACT_PROP_TYPES_SECRET__' + Math.random().toString();
<add>const ReactPropTypesSecret = 'SECRET_DO_NOT_PASS_THIS_OR_YOU_WILL_BE_FIRED';
<ide> 
<ide> module.exports = ReactPropTypesSecret;
1
Go
Go
fix some typos
e1003fb0787d702b2820fdf70ffbf7c958c7dcf5
<ide><path>libnetwork/client/client.go
<ide> type NetworkCli struct {
<ide> 	call CallFunc
<ide> }
<ide> 
<del>// NewNetworkCli is a conveninent function to create a NetworkCli object
<add>// NewNetworkCli is a convenient function to create a NetworkCli object
<ide> func NewNetworkCli(out, err io.Writer, call CallFunc) *NetworkCli {
<ide> 	return &NetworkCli{
<ide> 		out: out,
<ide><path>libnetwork/client/client_test.go
<ide> func TestClientNetworkCreateHelp(t *testing.T) {
<ide> }
<ide> */
<ide> 
<del>// Docker flag processing in flag.go uses os.Exit(1) for incorrect paramater case.
<add>// Docker flag processing in flag.go uses os.Exit(1) for incorrect parameter case.
<ide> // TODO : Handle the missing argument case in the IT when CLI is available
<ide> /*
<ide> func TestClientNetworkCreateMissingArgument(t *testing.T) {
<ide><path>libnetwork/drivers/bridge/bridge.go
<ide> func (d *driver) CreateNetwork(id types.UUID, option map[string]interface{}) err
<ide> 	// Even if a bridge exists try to setup IPv4.
<ide> 	bridgeSetup.queueStep(setupBridgeIPv4)
<ide> 
<del>	// Conditionnally queue setup steps depending on configuration values.
<add>	// Conditionally queue setup steps depending on configuration values.
<ide> 	for _, step := range []struct {
<ide> 		Condition bool
<ide> 		Fn setupStep
<ide><path>libnetwork/drivers/bridge/error.go
<ide> func (uat ErrUnsupportedAddressType) Error() string {
<ide> 	return fmt.Sprintf("unsupported address type: %s", string(uat))
<ide> }
<ide> 
<del>// ErrInvalidAddressBinding is returned when the host address specfied in the port binding is not valid.
<add>// ErrInvalidAddressBinding is returned when the host address specified in the port binding is not valid.
<ide> type ErrInvalidAddressBinding string
<ide> 
<ide> func (iab ErrInvalidAddressBinding) Error() string {
<ide><path>libnetwork/netutils/utils.go
<ide> type PortBinding struct {
<ide> 	HostPort uint16
<ide> }
<ide> 
<del>// HostAddr returns the host side tranport address
<add>// HostAddr returns the host side transport address
<ide> func (p PortBinding) HostAddr() (net.Addr, error) {
<ide> 	switch p.Proto {
<ide> 	case UDP:
<ide> func (p PortBinding) HostAddr() (net.Addr, error) {
<ide> 	}
<ide> }
<ide> 
<del>// ContainerAddr returns the container side tranport address
<add>// ContainerAddr returns the container side transport address
<ide> func (p PortBinding) ContainerAddr() (net.Addr, error) {
<ide> 	switch p.Proto {
<ide> 	case UDP:
<ide><path>libnetwork/sandbox/namespace_linux.go
<ide> func (n *networkNamespace) AddInterface(i *Interface) error {
<ide> 	}
<ide> 	defer f.Close()
<ide> 
<del>	// Find the network inteerface identified by the SrcName attribute.
<add>	// Find the network interface identified by the SrcName attribute.
<ide> 	iface, err := netlink.LinkByName(i.SrcName)
<ide> 	if err != nil {
<ide> 		return err
6
Python
Python
use dag_actions constant.
513261094a46bb13a5e6ccc2fc4cc8344ee09eba
<ide><path>airflow/models/dagbag.py
<ide> def _sync_perm_for_dag(self, dag, session: Optional[Session] = None):
<ide>         """Sync DAG specific permissions, if necessary"""
<ide>         from flask_appbuilder.security.sqla import models as sqla_models
<ide> 
<del>        from airflow.security.permissions import DAG_PERMS, resource_name_for_dag
<add>        from airflow.security.permissions import DAG_ACTIONS, resource_name_for_dag
<ide> 
<ide>         def needs_perm_views(dag_id: str) -> bool:
<ide>             dag_resource_name = resource_name_for_dag(dag_id)
<del>            for permission_name in DAG_PERMS:
<add>            for permission_name in DAG_ACTIONS:
<ide>                 if not (
<ide>                     session.query(sqla_models.PermissionView)
<ide>                     .join(sqla_models.Permission)
<ide><path>airflow/security/permissions.py
<ide> DEPRECATED_ACTION_CAN_DAG_READ = "can_dag_read"
<ide> DEPRECATED_ACTION_CAN_DAG_EDIT = "can_dag_edit"
<ide> 
<del>DAG_PERMS = {ACTION_CAN_READ, ACTION_CAN_EDIT}
<add>DAG_ACTIONS = {ACTION_CAN_READ, ACTION_CAN_EDIT}
<ide> 
<ide> 
<ide> def resource_name_for_dag(dag_id):
<ide><path>airflow/www/security.py
<ide> class AirflowSecurityManager(SecurityManager, LoggingMixin):  # pylint: disable=
<ide>         (permissions.ACTION_CAN_EDIT, permissions.RESOURCE_ROLE),
<ide>     ]
<ide> 
<del>    # global view-menu for dag-level access
<del>    DAG_VMS = {permissions.RESOURCE_DAG}
<del>
<del>    READ_DAG_PERMS = {permissions.ACTION_CAN_READ}
<del>    DAG_PERMS = permissions.DAG_PERMS
<add>    # global resource for dag-level access
<add>    DAG_RESOURCES = {permissions.RESOURCE_DAG}
<add>    DAG_ACTIONS = permissions.DAG_ACTIONS
<ide> 
<ide>     ###########################################################################
<ide>     # DEFAULT ROLE CONFIGURATIONS
<ide>     def __init__(self, appbuilder):
<ide> 
<ide>     def init_role(self, role_name, perms):
<ide>         """
<del>        Initialize the role with the permissions and related view-menus.
<add>        Initialize the role with the actions and related resources.
<ide>         :param role_name:
<ide>         :param perms:
<ide>         :return:
<ide>     def bulk_sync_roles(self, roles):
<ide>             perms = config['perms']
<ide>             role = existing_roles.get(role_name) or self.add_role(role_name)
<ide> 
<del>            for perm_name, view_name in perms:
<del>                perm_view = non_dag_perms.get((perm_name, view_name)) or self.create_permission(
<del>                    perm_name, view_name
<add>            for action_name, resource_name in perms:
<add>                perm = non_dag_perms.get((action_name, resource_name)) or self.create_permission(
<add>                    action_name, resource_name
<ide>                 )
<ide> 
<del>                if perm_view not in role.permissions:
<del>                    self.add_permission_to_role(role, perm_view)
<add>                if perm not in role.permissions:
<add>                    self.add_permission_to_role(role, perm)
<ide> 
<ide>     def add_permissions(self, role, perms):
<ide>         """Adds resource permissions to a given role."""
<ide>     def clean_perms(self):
<ide> 
<ide>     def _merge_perm(self, action_name, resource_name):
<ide>         """
<del>        Add the new (permission, resource) to assoc_permissionview_role if it doesn't exist.
<del>        It will add the related entry to ab_permission
<del>        and ab_view_menu two meta tables as well.
<add>        Add the new (action, resource) to assoc_permissionview_role if it doesn't exist.
<add>        It will add the related entry to ab_permission and ab_view_menu two meta tables as well.
<ide> 
<ide>         :param action_name: Name of the action
<ide>         :type action_name: str
<ide>     def create_dag_specific_permissions(self) -> None:
<ide> 
<ide>         for dag in dags:
<ide>             dag_resource_name = permissions.resource_name_for_dag(dag.dag_id)
<del>            for action_name in self.DAG_PERMS:
<add>            for action_name in self.DAG_ACTIONS:
<ide>                 if (action_name, dag_resource_name) not in perms:
<ide>                     self._merge_perm(action_name, dag_resource_name)
<ide> 
<ide>             if dag.access_control:
<ide>                 self._sync_dag_view_permissions(dag_resource_name, dag.access_control)
<ide> 
<del>    def update_admin_perm_view(self):
<add>    def update_admin_permission(self):
<ide>         """
<ide>         Admin should have all the permissions, except the dag permissions.
<ide>         because Admin already has Dags permission.
<ide>     def sync_roles(self):
<ide> 
<ide>         self.add_homepage_access_to_custom_roles()
<ide>         # init existing roles, the rest role could be created through UI.
<del>        self.update_admin_perm_view()
<add>        self.update_admin_permission()
<ide>         self.clean_perms()
<ide> 
<ide>     def sync_resource_permissions(self, perms=None):
<ide>     def sync_perm_for_dag(self, dag_id, access_control=None):
<ide>         :return:
<ide>         """
<ide>         dag_resource_name = permissions.resource_name_for_dag(dag_id)
<del>        for action_name in self.DAG_PERMS:
<del>            self.create_permission(action_name, dag_resource_name)
<add>        for dag_action_name in self.DAG_ACTIONS:
<add>            self.create_permission(dag_action_name, dag_resource_name)
<ide> 
<ide>         if access_control:
<ide>             self._sync_dag_view_permissions(dag_resource_name, access_control)
<ide>     def _sync_dag_view_permissions(self, dag_id, access_control):
<ide>         """
<ide>         dag_resource_name = permissions.resource_name_for_dag(dag_id)
<ide> 
<del>        def _get_or_create_dag_permission(action_name):
<add>        def _get_or_create_dag_permission(action_name: str) -> PermissionView:
<ide>             perm = self.get_permission(action_name, dag_resource_name)
<ide>             if not perm:
<ide>                 self.log.info("Creating new action '%s' on resource '%s'", action_name, dag_resource_name)
<ide>                 perm = self.create_permission(action_name, dag_resource_name)
<ide> 
<ide>             return perm
<ide> 
<del>        def _revoke_stale_permissions(resource):
<add>        def _revoke_stale_permissions(resource: ViewMenu):
<ide>             existing_dag_perms = self.get_resource_permissions(resource)
<ide>             for perm in existing_dag_perms:
<ide>                 non_admin_roles = [role for role in perm.role if role.name != 'Admin']
<ide>         def _revoke_stale_permissions(resource):
<ide>         if resource:
<ide>             _revoke_stale_permissions(resource)
<ide> 
<del>        for rolename, perms in access_control.items():
<add>        for rolename, action_names in access_control.items():
<ide>             role = self.find_role(rolename)
<ide>             if not role:
<ide>                 raise AirflowException(
<ide>                     "The access_control mapping for DAG '{}' includes a role "
<ide>                     "named '{}', but that role does not exist".format(dag_id, rolename)
<ide>                 )
<ide> 
<del>            perms = set(perms)
<del>            invalid_perms = perms - self.DAG_PERMS
<del>            if invalid_perms:
<add>            action_names = set(action_names)
<add>            invalid_action_names = action_names - self.DAG_ACTIONS
<add>            if invalid_action_names:
<ide>                 raise AirflowException(
<ide>                     "The access_control map for DAG '{}' includes the following "
<ide>                     "invalid permissions: {}; The set of valid permissions "
<del>                    "is: {}".format(dag_resource_name, invalid_perms, self.DAG_PERMS)
<add>                    "is: {}".format(dag_resource_name, invalid_action_names, self.DAG_ACTIONS)
<ide>                 )
<ide> 
<del>            for action_name in perms:
<add>            for action_name in action_names:
<ide>                 dag_perm = _get_or_create_dag_permission(action_name)
<ide>                 self.add_permission_to_role(role, dag_perm)
<ide> 
<ide>     def create_resource(self, name: str) -> ViewMenu:
<ide>     def create_perm_vm_for_all_dag(self):
<ide>         """Create perm-vm if not exist and insert into FAB security model for all-dags."""
<ide>         # create perm for global logical dag
<del>        for resource_name in self.DAG_VMS:
<del>            for action_name in self.DAG_PERMS:
<add>        for resource_name in self.DAG_RESOURCES:
<add>            for action_name in self.DAG_ACTIONS:
<ide>                 self._merge_perm(action_name, resource_name)
<ide> 
<ide>     def check_authorization(
<ide><path>tests/www/test_security.py
<ide>     def test_all_dag_access_doesnt_give_non_dag_access(self):
<ide> 
<ide>     def test_access_control_with_invalid_permission(self):
<ide>         invalid_permissions = [
<del>            'can_varimport',  # a real permission, but not a member of DAG_PERMS
<add>            'can_varimport',  # a real permission, but not a member of DAG_ACTIONS
<ide>             'can_eat_pudding',  # clearly not a real permission
<ide>         ]
<ide>         username = "LaUser"
4
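The Airflow patch above renames permission/view-menu terminology to actions and resources while keeping the role-sync logic intact: for each configured role, missing (action, resource) permissions are created once and attached only if the role lacks them. A minimal, self-contained sketch of that sync idea — with illustrative class and config shapes, not Airflow's real API — could look like:

```python
class SecurityManager:
    """Illustrative stand-in for Airflow's security manager (not the real API)."""

    def __init__(self):
        self.permissions = {}  # (action_name, resource_name) -> permission record
        self.roles = {}        # role_name -> set of (action_name, resource_name)

    def create_permission(self, action_name, resource_name):
        # Create the (action, resource) pair once; later calls reuse it.
        key = (action_name, resource_name)
        return self.permissions.setdefault(
            key, {"action": action_name, "resource": resource_name})

    def bulk_sync_roles(self, configs):
        # Ensure every configured (action, resource) permission exists and is
        # attached to its role, skipping pairs the role already has.
        for config in configs:
            role = self.roles.setdefault(config["role"], set())
            for action_name, resource_name in config["perms"]:
                self.create_permission(action_name, resource_name)
                role.add((action_name, resource_name))
```

Re-running the sync with the same config is a no-op, which is what makes the operation safe to call on every startup.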
Text
Text
simplify the windows docs and fix formatting
744d39a46683909b95ac5a769ae5f839a992d59a
<ide><path>docs/sources/installation/windows.md <ide> Let's try the “hello world” example. Run <ide> <ide> This will download the small busybox image and print hello world. <ide> <add> <ide> # Further Details <ide> <ide> The Boot2Docker management tool provides some commands: <ide> <ide> $ ./boot2docker <ide> Usage: ./boot2docker [<options>] {help|init|up|ssh|save|down|poweroff|reset|restart|config|status|info|delete|download|version} [<args>] <ide> <del>## Container port redirection <ide> <del>The latest version of `boot2docker` sets up two network adaptors: one using NAT <del>to allow the VM to download images and files from the Internet, and one host only <del>network adaptor to which the container's ports will be exposed on. <add>## Container port redirection <add> <add>The latest version of `boot2docker` sets up a host only <add>network adaptor on which the container's ports will be exposed. <ide> <ide> If you run a container with an exposed port: <ide> <ide> If you run a container with an exposed port: <ide> Then you should be able to access that Apache server using the IP address reported <ide> to you using: <ide> <del> boot2docker ssh ip addr show dev eth1 <del> <del>Typically, it is 192.168.59.103, but at this point it can change. <add> boot2docker ip <ide> <del>If you want to share container ports with other computers on your LAN, you will <del>need to set up [NAT adaptor based port forwarding]( <del>https://github.com/boot2docker/boot2docker/blob/master/doc/WORKAROUNDS.md) <add>Typically, it is `192.168.59.103`, but it can change. <ide> <ide> For further information or to report issues, please see the [Boot2Docker site](http://boot2docker.io)
1
Javascript
Javascript
improve tick generation for linear scales
aae05a08da0e216371caafa17228b3bff97d9fa9
<ide><path>src/scales/scale.linear.js <ide> 'use strict'; <ide> <del>var defaults = require('../core/core.defaults'); <ide> var helpers = require('../helpers/index'); <ide> var scaleService = require('../core/core.scaleService'); <ide> var Ticks = require('../core/core.ticks'); <ide> module.exports = function(Chart) { <ide> // Common base implementation to handle ticks.min, ticks.max, ticks.beginAtZero <ide> this.handleTickRangeOptions(); <ide> }, <del> getTickLimit: function() { <del> var maxTicks; <add> // Returns the maximum number of ticks based on the scale dimension <add> _computeTickLimit: function() { <ide> var me = this; <del> var tickOpts = me.options.ticks; <add> var tickFont; <ide> <ide> if (me.isHorizontal()) { <del> maxTicks = Math.min(tickOpts.maxTicksLimit ? tickOpts.maxTicksLimit : 11, Math.ceil(me.width / 50)); <del> } else { <del> // The factor of 2 used to scale the font size has been experimentally determined. <del> var tickFontSize = helpers.valueOrDefault(tickOpts.fontSize, defaults.global.defaultFontSize); <del> maxTicks = Math.min(tickOpts.maxTicksLimit ? tickOpts.maxTicksLimit : 11, Math.ceil(me.height / (2 * tickFontSize))); <add> return Math.ceil(me.width / 40); <ide> } <del> <del> return maxTicks; <add> tickFont = helpers.options._parseFont(me.options.ticks); <add> return Math.ceil(me.height / tickFont.lineHeight); <ide> }, <ide> // Called after the ticks are built. We need <ide> handleDirectionalChanges: function() { <ide><path>src/scales/scale.linearbase.js <ide> function generateTicks(generationOptions, dataRange) { <ide> // for details. 
<ide> <ide> var stepSize = generationOptions.stepSize; <add> var unit = stepSize || 1; <add> var maxNumSpaces = generationOptions.maxTicks - 1; <ide> var min = generationOptions.min; <ide> var max = generationOptions.max; <del> var spacing, precision, factor, niceRange, niceMin, niceMax, numSpaces; <del> <del> if (stepSize && stepSize > 0) { <del> spacing = stepSize; <del> } else { <del> niceRange = helpers.niceNum(dataRange.max - dataRange.min, false); <del> spacing = helpers.niceNum(niceRange / (generationOptions.maxTicks - 1), true); <del> <del> precision = generationOptions.precision; <del> if (!helpers.isNullOrUndef(precision)) { <del> // If the user specified a precision, round to that number of decimal places <del> factor = Math.pow(10, precision); <del> spacing = Math.ceil(spacing * factor) / factor; <del> } <add> var precision = generationOptions.precision; <add> var spacing, factor, niceMin, niceMax, numSpaces; <add> <add> // spacing is set to a nice number of the dataRange divided by maxNumSpaces. <add> // stepSize is used as a minimum unit if it is specified. 
<add> spacing = helpers.niceNum((dataRange.max - dataRange.min) / maxNumSpaces / unit) * unit; <add> numSpaces = Math.ceil(dataRange.max / spacing) - Math.floor(dataRange.min / spacing); <add> if (numSpaces > maxNumSpaces) { <add> // If the calculated num of spaces exceeds maxNumSpaces, recalculate it <add> spacing = helpers.niceNum(numSpaces * spacing / maxNumSpaces / unit) * unit; <ide> } <del> // If a precision is not specified, calculate factor based on spacing <del> if (!factor) { <add> <add> if (stepSize || helpers.isNullOrUndef(precision)) { <add> // If a precision is not specified, calculate factor based on spacing <ide> factor = Math.pow(10, helpers.decimalPlaces(spacing)); <add> } else { <add> // If the user specified a precision, round to that number of decimal places <add> factor = Math.pow(10, precision); <add> spacing = Math.ceil(spacing * factor) / factor; <ide> } <add> <ide> niceMin = Math.floor(dataRange.min / spacing) * spacing; <ide> niceMax = Math.ceil(dataRange.max / spacing) * spacing; <ide> <ide> // If min, max and stepSize is set and they make an evenly spaced scale use it. <del> if (!helpers.isNullOrUndef(min) && !helpers.isNullOrUndef(max) && stepSize) { <add> if (stepSize) { <ide> // If very close to our whole number, use it. 
<del> if (helpers.almostWhole((max - min) / stepSize, spacing / 1000)) { <add> if (!helpers.isNullOrUndef(min) && helpers.almostWhole(min / spacing, spacing / 1000)) { <ide> niceMin = min; <add> } <add> if (!helpers.isNullOrUndef(max) && helpers.almostWhole(max / spacing, spacing / 1000)) { <ide> niceMax = max; <ide> } <ide> } <ide> module.exports = function(Chart) { <ide> } <ide> } <ide> }, <del> getTickLimit: noop, <add> <add> getTickLimit: function() { <add> var me = this; <add> var tickOpts = me.options.ticks; <add> var stepSize = tickOpts.stepSize; <add> var maxTicksLimit = tickOpts.maxTicksLimit; <add> var maxTicks; <add> <add> if (stepSize) { <add> maxTicks = Math.ceil(me.max / stepSize) - Math.floor(me.min / stepSize) + 1; <add> } else { <add> maxTicks = me._computeTickLimit(); <add> maxTicksLimit = maxTicksLimit || 11; <add> } <add> <add> if (maxTicksLimit) { <add> maxTicks = Math.min(maxTicksLimit, maxTicks); <add> } <add> <add> return maxTicks; <add> }, <add> <add> _computeTickLimit: function() { <add> return Number.POSITIVE_INFINITY; <add> }, <add> <ide> handleDirectionalChanges: noop, <ide> <ide> buildTicks: function() { <ide> module.exports = function(Chart) { <ide> var tickOpts = opts.ticks; <ide> <ide> // Figure out what the max number of ticks we can support it is based on the size of <del> // the axis area. For now, we say that the minimum tick spacing in pixels must be 50 <add> // the axis area. For now, we say that the minimum tick spacing in pixels must be 40 <ide> // We also limit the maximum number of ticks to 11 which gives a nice 10 squares on <ide> // the graph. Make sure we always have at least 2 ticks <ide> var maxTicks = me.getTickLimit(); <ide><path>src/scales/scale.logarithmic.js <ide> function generateTicks(generationOptions, dataRange) { <ide> var ticks = []; <ide> var valueOrDefault = helpers.valueOrDefault; <ide> <del> // Figure out what the max number of ticks we can support it is based on the size of <del> // the axis area. 
For now, we say that the minimum tick spacing in pixels must be 50 <del> // We also limit the maximum number of ticks to 11 which gives a nice 10 squares on <del> // the graph <ide> var tickVal = valueOrDefault(generationOptions.min, Math.pow(10, Math.floor(helpers.log10(dataRange.min)))); <ide> <ide> var endExp = Math.floor(helpers.log10(dataRange.max)); <ide><path>src/scales/scale.radialLinear.js <ide> module.exports = function(Chart) { <ide> // Common base implementation to handle ticks.min, ticks.max, ticks.beginAtZero <ide> me.handleTickRangeOptions(); <ide> }, <del> getTickLimit: function() { <del> var opts = this.options; <del> var tickOpts = opts.ticks; <del> var tickBackdropHeight = getTickBackdropHeight(opts); <del> return Math.min(tickOpts.maxTicksLimit ? tickOpts.maxTicksLimit : 11, Math.ceil(this.drawingArea / tickBackdropHeight)); <add> // Returns the maximum number of ticks based on the scale dimension <add> _computeTickLimit: function() { <add> return Math.ceil(this.drawingArea / getTickBackdropHeight(this.options)); <ide> }, <ide> convertTicksToLabels: function() { <ide> var me = this; <ide><path>test/specs/scale.linear.tests.js <ide> describe('Linear Scale', function() { <ide> expect(chart.scales.yScale0).not.toEqual(undefined); // must construct <ide> expect(chart.scales.yScale0.min).toBe(1); <ide> expect(chart.scales.yScale0.max).toBe(11); <del> expect(chart.scales.yScale0.ticks).toEqual(['11', '9', '7', '5', '3', '1']); <add> expect(chart.scales.yScale0.ticks).toEqual(['11', '10', '8', '6', '4', '2', '1']); <ide> }); <ide> <ide> it('Should create decimal steps if stepSize is a decimal number', function() { <ide> describe('Linear Scale', function() { <ide> expect(chart.scales.yScale0.ticks).toEqual(['0.06', '0.05', '0.04', '0.03', '0.02', '0.01', '0']); <ide> }); <ide> <add> it('Should correctly limit the maximum number of ticks', function() { <add> var chart = window.acquireChart({ <add> type: 'bar', <add> data: { <add> labels: ['a', 'b'], 
<add> datasets: [{ <add> data: [0.5, 2.5] <add> }] <add> }, <add> options: { <add> scales: { <add> yAxes: [{ <add> id: 'yScale' <add> }] <add> } <add> } <add> }); <add> <add> expect(chart.scales.yScale.ticks).toEqual(['2.5', '2.0', '1.5', '1.0', '0.5']); <add> <add> chart.options.scales.yAxes[0].ticks.maxTicksLimit = 11; <add> chart.update(); <add> <add> expect(chart.scales.yScale.ticks).toEqual(['2.5', '2.0', '1.5', '1.0', '0.5']); <add> <add> chart.options.scales.yAxes[0].ticks.maxTicksLimit = 21; <add> chart.update(); <add> <add> expect(chart.scales.yScale.ticks).toEqual([ <add> '2.5', '2.4', '2.3', '2.2', '2.1', '2.0', '1.9', '1.8', '1.7', '1.6', <add> '1.5', '1.4', '1.3', '1.2', '1.1', '1.0', '0.9', '0.8', '0.7', '0.6', <add> '0.5' <add> ]); <add> <add> chart.options.scales.yAxes[0].ticks.maxTicksLimit = 11; <add> chart.options.scales.yAxes[0].ticks.stepSize = 0.01; <add> chart.update(); <add> <add> expect(chart.scales.yScale.ticks).toEqual(['2.5', '2.0', '1.5', '1.0', '0.5']); <add> <add> chart.options.scales.yAxes[0].ticks.min = 0.3; <add> chart.options.scales.yAxes[0].ticks.max = 2.8; <add> chart.update(); <add> <add> expect(chart.scales.yScale.ticks).toEqual(['2.8', '2.5', '2.0', '1.5', '1.0', '0.5', '0.3']); <add> }); <add> <ide> it('Should build labels using the user supplied callback', function() { <ide> var chart = window.acquireChart({ <ide> type: 'bar', <ide><path>test/specs/scale.radialLinear.tests.js <ide> describe('Test the radial linear scale', function() { <ide> expect(chart.scale.end).toBe(0); <ide> }); <ide> <add> it('Should correctly limit the maximum number of ticks', function() { <add> var chart = window.acquireChart({ <add> type: 'radar', <add> data: { <add> labels: ['label1', 'label2', 'label3'], <add> datasets: [{ <add> data: [0.5, 1.5, 2.5] <add> }] <add> }, <add> options: { <add> scale: { <add> pointLabels: { <add> display: false <add> } <add> } <add> } <add> }); <add> <add> expect(chart.scale.ticks).toEqual(['0.5', '1.0', '1.5', 
'2.0', '2.5']); <add> <add> chart.options.scale.ticks.maxTicksLimit = 11; <add> chart.update(); <add> <add> expect(chart.scale.ticks).toEqual(['0.5', '1.0', '1.5', '2.0', '2.5']); <add> <add> chart.options.scale.ticks.stepSize = 0.01; <add> chart.update(); <add> <add> expect(chart.scale.ticks).toEqual(['0.5', '1.0', '1.5', '2.0', '2.5']); <add> <add> chart.options.scale.ticks.min = 0.3; <add> chart.options.scale.ticks.max = 2.8; <add> chart.update(); <add> <add> expect(chart.scale.ticks).toEqual(['0.3', '0.5', '1.0', '1.5', '2.0', '2.5', '2.8']); <add> }); <add> <ide> it('Should build labels using the user supplied callback', function() { <ide> var chart = window.acquireChart({ <ide> type: 'radar',
6
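The Chart.js change above computes the tick spacing as a "nice number" of the data range divided by the maximum number of intervals, then recalculates once if rounding the endpoints outward overflows that limit. A runnable sketch of the same algorithm — in Python, and omitting the `stepSize`/`precision` branches — could look like:

```python
import math

def nice_num(value):
    # Round up to a "nice" number: 1, 2, 5, or 10 times a power of ten.
    exponent = math.floor(math.log10(value))
    fraction = value / 10 ** exponent
    if fraction <= 1:
        nice = 1
    elif fraction <= 2:
        nice = 2
    elif fraction <= 5:
        nice = 5
    else:
        nice = 10
    return nice * 10 ** exponent

def generate_ticks(data_min, data_max, max_ticks):
    # Spacing starts as a nice number of the range over the allowed intervals.
    max_spaces = max_ticks - 1
    spacing = nice_num((data_max - data_min) / max_spaces)
    num_spaces = math.ceil(data_max / spacing) - math.floor(data_min / spacing)
    if num_spaces > max_spaces:
        # Rounding the endpoints outward produced too many intervals,
        # so widen the spacing once and try again.
        spacing = nice_num(num_spaces * spacing / max_spaces)
    nice_min = math.floor(data_min / spacing) * spacing
    nice_max = math.ceil(data_max / spacing) * spacing
    count = round((nice_max - nice_min) / spacing)
    return [nice_min + i * spacing for i in range(count + 1)]
```

With data from 0.5 to 2.5 and the default limit of 11 ticks, the first candidate spacing of 0.2 would need 11 intervals, so the recalculation widens it to 0.5 — matching the `['2.5', '2.0', '1.5', '1.0', '0.5']` ticks the new unit test expects.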
Javascript
Javascript
add missing import
e84115af9b4c2117c2626ae9afc07c38e06fa0d4
<ide><path>lib/dependencies/HarmonyImportSideEffectDependency.js <ide> const makeSerializable = require("../util/makeSerializable"); <ide> const HarmonyImportDependency = require("./HarmonyImportDependency"); <ide> <add>/** @typedef {import("webpack-sources").ReplaceSource} ReplaceSource */ <ide> /** @typedef {import("../Dependency")} Dependency */ <ide> /** @typedef {import("../DependencyTemplate").DependencyTemplateContext} DependencyTemplateContext */ <ide> /** @typedef {import("../InitFragment")} InitFragment */
1
PHP
PHP
update the mocking
50b6f556500c5d07416827c49e47077f08e857e3
<ide><path>tests/Database/DatabaseMigrationMigrateCommandTest.php <ide> public function testBasicMigrationsCallMigratorWithProperArguments() <ide> $app->useDatabasePath(__DIR__); <ide> $command->setLaravel($app); <ide> $migrator->shouldReceive('setConnection')->once()->with(null); <del> $migrator->shouldReceive('run')->once()->with(__DIR__.'/migrations', false); <add> $migrator->shouldReceive('run')->once()->with(__DIR__.'/migrations', false, null); <ide> $migrator->shouldReceive('getNotes')->andReturn([]); <ide> $migrator->shouldReceive('repositoryExists')->once()->andReturn(true); <ide> <ide> public function testMigrationRepositoryCreatedWhenNecessary() <ide> $app->useDatabasePath(__DIR__); <ide> $command->setLaravel($app); <ide> $migrator->shouldReceive('setConnection')->once()->with(null); <del> $migrator->shouldReceive('run')->once()->with(__DIR__.'/migrations', false); <add> $migrator->shouldReceive('run')->once()->with(__DIR__.'/migrations', false, null); <ide> $migrator->shouldReceive('getNotes')->andReturn([]); <ide> $migrator->shouldReceive('repositoryExists')->once()->andReturn(false); <ide> $command->expects($this->once())->method('call')->with($this->equalTo('migrate:install'), $this->equalTo(['--database' => null])); <ide> public function testTheCommandMayBePretended() <ide> $app->useDatabasePath(__DIR__); <ide> $command->setLaravel($app); <ide> $migrator->shouldReceive('setConnection')->once()->with(null); <del> $migrator->shouldReceive('run')->once()->with(__DIR__.'/migrations', true); <add> $migrator->shouldReceive('run')->once()->with(__DIR__.'/migrations', true, null); <ide> $migrator->shouldReceive('getNotes')->andReturn([]); <ide> $migrator->shouldReceive('repositoryExists')->once()->andReturn(true); <ide> <ide> public function testTheDatabaseMayBeSet() <ide> $app->useDatabasePath(__DIR__); <ide> $command->setLaravel($app); <ide> $migrator->shouldReceive('setConnection')->once()->with('foo'); <del> 
$migrator->shouldReceive('run')->once()->with(__DIR__.'/migrations', false); <add> $migrator->shouldReceive('run')->once()->with(__DIR__.'/migrations', false, null); <ide> $migrator->shouldReceive('getNotes')->andReturn([]); <ide> $migrator->shouldReceive('repositoryExists')->once()->andReturn(true); <ide>
1
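The Laravel test changes above illustrate a common chore: when a mocked method gains a parameter (here a nullable third argument), every recorded expectation must be updated to the new call shape, even when the new value is just the default. The same pattern with Python's `unittest.mock`, using entirely hypothetical names:

```python
from unittest import mock

migrator = mock.Mock()

def run_migrations(migrator, path, pretend=False, step=None):
    # Hypothetical command body: forwards the new nullable argument through.
    migrator.run(path, pretend, step)

run_migrations(migrator, "/app/database/migrations")
# The expectation now has to include the third argument, even when it is None.
migrator.run.assert_called_once_with("/app/database/migrations", False, None)
```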
Python
Python
disallow verbose=1 with parameterserverstrategy
df97de0a0d2646b718540bb568b7d70b423bf536
<ide><path>keras/distribute/dataset_creator_model_fit_ps_only_test.py <ide> def testModelFitTensorBoardEpochLevel(self, strategy, use_dataset_creator): <ide> files = tf.compat.v1.gfile.ListDirectory(log_dir) <ide> self.assertGreaterEqual(len(files), 1) <ide> <add> def testModelFitVerbose1(self, strategy, use_dataset_creator): <add> with self.assertRaisesRegex(ValueError, <add> "`verbose=1` is not allowed with " <add> "`ParameterServerStrategy` for performance " <add> "reasons. Received: `verbose`=1"): <add> self._model_fit( <add> strategy, use_dataset_creator=use_dataset_creator, <add> verbose=1) <add> <ide> def testModelEvaluateErrorOnBatchLevelCallbacks(self, strategy, <ide> use_dataset_creator): <ide> <ide><path>keras/distribute/dataset_creator_model_fit_test_base.py <ide> def _model_fit(self, <ide> with_normalization_layer=False, <ide> callbacks=None, <ide> use_lookup_layer=False, <del> use_dataset_creator=True): <add> use_dataset_creator=True, <add> verbose="auto"): <ide> if callbacks is None: <ide> callbacks = [] <ide> <ide> def _model_fit(self, <ide> steps_per_epoch=steps_per_epoch, <ide> callbacks=callbacks, <ide> validation_data=validation_data, <del> validation_steps=steps_per_epoch) <add> validation_steps=steps_per_epoch, <add> verbose=verbose) <ide> return model <ide> <ide> def _model_evaluate(self, <ide><path>keras/engine/training.py <ide> def fit(self, <ide> verbose = 2 # Default to epoch-level logging for PSStrategy. <ide> else: <ide> verbose = 1 # Default to batch-level logging otherwise. <add> elif verbose == 1 and self.distribute_strategy._should_use_with_coordinator: # pylint: disable=protected-access <add> raise ValueError( <add> '`verbose=1` is not allowed with `ParameterServerStrategy` for ' <add> f'performance reasons. Received: `verbose`={verbose}') <ide> <ide> if validation_split: <ide> # Create the validation data using the training data. Only supported for
3
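The Keras patch above defaults to epoch-level progress logging under a coordinator-based strategy and rejects batch-level logging outright. The decision logic can be isolated as a small function (the name `resolve_verbose` and the boolean flag are illustrative, not Keras's real API):

```python
def resolve_verbose(verbose, using_coordinator):
    # "auto" picks epoch-level logging (2) under a coordinator-based
    # strategy and batch-level logging (1) otherwise.
    if verbose == "auto":
        return 2 if using_coordinator else 1
    if verbose == 1 and using_coordinator:
        raise ValueError(
            "`verbose=1` is not allowed with `ParameterServerStrategy` "
            f"for performance reasons. Received: `verbose`={verbose}")
    return verbose
```

Raising early here is deliberate: per-batch progress output forces frequent synchronization with remote workers, which is exactly the cost the strategy tries to avoid.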
Javascript
Javascript
add unit test for isplainobject(symbol)
9090d98439f8dc449beafee98f8ff35cfb4f9116
<ide><path>test/unit/core.js <ide> QUnit.asyncTest( "isPlainObject", function( assert ) { <ide> } <ide> } ); <ide> <add>// <add>QUnit[ typeof Symbol === "function" ? "test" : "skip" ]( "isPlainObject(Symbol)", function( assert ) { <add> assert.expect( 2 ); <add> <add> assert.equal( jQuery.isPlainObject( Symbol() ), false, "Symbol" ); <add> assert.equal( jQuery.isPlainObject( Object( Symbol() ) ), false, "Symbol inside an object" ); <add>} ); <add> <add> <ide> QUnit.test( "isFunction", function( assert ) { <ide> assert.expect( 19 ); <ide>
1
Python
Python
reduce line length
4ef538b23754612bd05d1d547dd04408ec396cdc
<ide><path>libcloud/compute/drivers/dimensiondata.py <ide> def _to_image(self, element, locations=None): <ide> if locations is None: <ide> locations = self.list_locations(location_id) <ide> <del> location = [match_location for match_location in locations if match_location.id == location_id][0] <add> location = [loc for loc in locations if loc.id == location_id][0] <ide> cpu_spec = self._to_cpu_spec(element.find(fixxpath('cpu', TYPES_URN))) <ide> <ide> if LooseVersion(self.connection.active_api_version) > LooseVersion(
1
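The libcloud change above shortens a line purely by renaming the comprehension variable. A small runnable illustration of the pattern — all class and variable names here are made up — plus a `next()`-based variant that stops at the first match instead of building the whole list:

```python
class Location:
    def __init__(self, location_id):
        self.id = location_id

locations = [Location("NA9"), Location("EU6"), Location("AP3")]
location_id = "EU6"

# Renaming the loop variable keeps the comprehension within the line limit.
location = [loc for loc in locations if loc.id == location_id][0]

# Equivalent result, but lazy: stops scanning at the first matching item.
location_alt = next(loc for loc in locations if loc.id == location_id)
```

Both raise on a missing id (`IndexError` and `StopIteration` respectively), so the behavior on absent locations is unchanged either way.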
Ruby
Ruby
add doctor check for homebrew_keep_info
fab5e1d905cd243a9e8306dcd4e74b5f1d596f8f
<ide><path>Library/Homebrew/cmd/doctor.rb <ide> def check_for_latest_xquartz <ide> https://xquartz.macosforge.org <ide> EOS <ide> end <add> <add> def check_for_old_env_vars <add> if ENV["HOMEBREW_KEEP_INFO"] <add> <<-EOS.undent <add> `HOMEBREW_KEEP_INFO` is no longer used <add> info files are no longer deleted by default; you may <add> remove this environment variable. <add> EOS <add> end <add> end <ide> end # end class Checks <ide> <ide> module Homebrew extend self
1
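The Homebrew doctor check above warns when a retired environment variable is still set and returns nothing otherwise. The same idea as a testable Python function — taking the environment mapping as a parameter (rather than reading `os.environ` directly) makes the check easy to exercise:

```python
def check_for_old_env_vars(environ):
    # Return a warning message when a retired variable is still set,
    # or None when the environment is clean.
    if environ.get("HOMEBREW_KEEP_INFO"):
        return ("`HOMEBREW_KEEP_INFO` is no longer used\n"
                "info files are no longer deleted by default; you may\n"
                "remove this environment variable.")
    return None
```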
Text
Text
remove setprops and replaceprops from docs
718c07c9150042c8af503b584537458f4e6ad5e1
<ide><path>docs/docs/ref-02-component-api.it-IT.md <ide> boolean isMounted() <ide> > Nota: <ide> > <ide> > Questo metodo non è disponibile il componenti `class` ES6 che estendono `React.Component`. Potrebbe essere eliminato del tutto in una versione futura di React. <del> <del> <del>### setProps <del> <del>```javascript <del>void setProps( <del> object nextProps, <del> [function callback] <del>) <del>``` <del> <del>Quando stai integrando con un'applicazione JavaScript esterna puoi voler segnalare un cambiamento a un componente React il cui rendering è stato effettuato con `ReactDOM.render()`. <del> <del>Chiamare `setProps()` su un componente al livello radice cambierà le sue proprietà e scatenerà un ri-rendering. Inoltre, puoi fornire una funzione callback opzionale che verrà eseguita quando `setProps` ha terminato e il rendering del componente è terminato. <del> <del>> Nota: <del>> <del>> Quando possibile, l'approccio dichiarativo di invocare nuovamente `ReactDOM.render()` sullo stesso nodo è preferibile. Quest'ultimo tende a rendere gli aggiornamenti più comprensibili. (Non vi è una differenza significativa di prestazioni tra i due approcci.) <del>> <del>> Questo metodo può essere chiamato soltanto su un componente al livello radice. Ovvero, è disponibile soltanto sul componente passato direttamente a `ReactDOM.render()` e nessuno dei suoi figli. Se il tuo intento è usare `setProps()` su un componente figlio, approfitta degli aggiornamenti reattivi e passa la nuova proprietà al componente figlio quando viene creato in `render()`. <del>> <del>> Questo metodo non è disponibile il componenti `class` ES6 che estendono `React.Component`. Potrebbe essere eliminato del tutto in una versione futura di React. <del> <del>### replaceProps <del> <del>```javascript <del>void replaceProps( <del> object nextProps, <del> [function callback] <del>) <del>``` <del> <del>Come `setProps()` ma elimina ogni proprietà preesistente anziché riunire i due oggetti. 
<del> <del>> Nota: <del>> <del>> Questo metodo non è disponibile il componenti `class` ES6 che estendono `React.Component`. Potrebbe essere eliminato del tutto in una versione futura di React. <ide><path>docs/docs/ref-02-component-api.ko-KR.md <ide> boolean isMounted() <ide> > 주의: <ide> > <ide> > 이 메소드는 `React.Component`를 확장한 ES6 `class` 컴포넌트에서는 사용할 수 없습니다. React의 미래 버전에서 이는 완전히 사라지게 될 것입니다. <del> <del> <del>### setProps <del> <del>```javascript <del>void setProps( <del> object nextProps, <del> [function callback] <del>) <del>``` <del> <del>외부 JavaScript 애플리케이션과 연동하는 경우 `ReactDOM.render()`로 렌더링된 React 컴포넌트에 변경을 알리고 싶을 때가 있습니다. <del> <del>최상위 컴포넌트에서 `setProps()`를 호출하면 프로퍼티를 변경하고 렌더를 다시 발생합니다. 거기에 콜백 함수를 넘기면 `setProps`가 완료되고 컴포넌트가 다시 렌더링된 다음에 한번 호출됩니다. <del> <del>> 주의: <del>> <del>> 가능하다면 이것 대신 `ReactDOM.render()`를 같은 노드에서 다시 호출하는 선언적인 방법이 더 바람직합니다. 그렇게 하는 편이 업데이트에 대해 생각하는 것을 쉽게 만듭니다. (두가지 방식에 눈에 띄는 성능 차이는 없습니다.) <del>> <del>> 이 메소드는 최상위 컴포넌트에만 호출 가능합니다. 다시 말해, `ReactDOM.render()`에 바로 넘긴 컴포넌트에서만 사용할 수 있고 자식에서는 불가능합니다. 자식 컴포넌트에 `setProps()`를 사용하고 싶다면, 그 대신 반응적인 업데이트의 장점을 활용하여 `render()` 안에서 자식 컴포넌트를 만들 때 새로운 prop을 넘기세요. <del>> <del>> 이 메소드는 `React.Component`를 확장한 ES6 `class` 컴포넌트에서는 사용할 수 없습니다. React의 미래 버전에서 이는 완전히 사라지게 될 것입니다. <del> <del> <del>### replaceProps <del> <del>```javascript <del>void replaceProps( <del> object nextProps, <del> function callback] <del>) <del>``` <del> <del>`setProps()`와 비슷하지만 두 객체를 합치는 대신 이전에 존재하던 props를 삭제합니다. <del> <del>> 주의: <del>> <del>> 이 메소드는 `React.Component`를 확장한 ES6 `class` 컴포넌트에서는 사용할 수 없습니다. React의 미래 버전에서 이는 완전히 사라지게 될 것입니다. <ide><path>docs/docs/ref-02-component-api.md <ide> boolean isMounted() <ide> > Note: <ide> > <ide> > This method is not available on ES6 `class` components that extend `React.Component`. It will likely be removed entirely in a future version of React, so you might as well [start migrating away from isMounted() now](/react/blog/2015/12/16/ismounted-antipattern.html). 
<del> <del> <del>### setProps <del> <del>```javascript <del>void setProps( <del> object nextProps, <del> [function callback] <del>) <del>``` <del> <del>When you're integrating with an external JavaScript application you may want to signal a change to a React component rendered with `ReactDOM.render()`. <del> <del>Calling `setProps()` on a root-level component will change its properties and trigger a re-render. In addition, you can supply an optional callback function that is executed once `setProps` is completed and the component is re-rendered. <del> <del>> Note: <del>> <del>> This method is deprecated and will be removed soon. This method is not available on ES6 `class` components that extend `React.Component`. Instead of calling `setProps`, try invoking ReactDOM.render() again with the new props. For additional notes, see our [blog post about using the Top Level API](/react/blog/2015/10/01/react-render-and-top-level-api.html) <del>> <del>> When possible, the declarative approach of calling `ReactDOM.render()` again on the same node is preferred instead. It tends to make updates easier to reason about. (There's no significant performance difference between the two approaches.) <del>> <del>> This method can only be called on a root-level component. That is, it's only available on the component passed directly to `ReactDOM.render()` and none of its children. If you're inclined to use `setProps()` on a child component, instead take advantage of reactive updates and pass the new prop to the child component when it's created in `render()`. <del> <del>### replaceProps <del> <del>```javascript <del>void replaceProps( <del> object nextProps, <del> [function callback] <del>) <del>``` <del> <del>Like `setProps()` but deletes any pre-existing props instead of merging the two objects. <del> <del>> Note: <del>> <del>> This method is deprecated and will be removed soon. This method is not available on ES6 `class` components that extend `React.Component`. 
Instead of calling `replaceProps`, try invoking ReactDOM.render() again with the new props. For additional notes, see our [blog post about using the Top Level API](/react/blog/2015/10/01/react-render-and-top-level-api.html) <ide><path>docs/docs/ref-02-component-api.zh-CN.md <ide> boolean isMounted() <ide> > 注意: <ide> > <ide> > 这个方法在从 `React.Component` 扩展的 ES6 `class` 组件里不可用。它也许会在未来的 React 版本中被完全移除,所以你也要移除它 [start migrating away from isMounted() now](/react/blog/2015/12/16/ismounted-antipattern.html) <del> <del> <del>### setProps <del> <del>```javascript <del>void setProps( <del> object nextProps, <del> [function callback] <del>) <del>``` <del> <del>当和一个外部的 JavaScript 应用整合的时候,你也许会想用 `ReactDOM.render()` 给 React 组件标示一个改变。 <del> <del>在根组件上调用 `setProps()` 会改变他的属性并触发一次重绘。另外,你可以提供一个可选的回调函数,一旦 `setProps` 完成并且组件被重绘它就执行。 <del> <del>> 注意: <del>> <del>> 这个方法被弃用了并会很快移除.这个方法在从 `React.Component` 扩展的 ES6 `class` 组件里不可用. 取代调用 `setProps`,试着以新的 props 再次调用 `ReactDOM.render()`. 更多的注意事项,见我们的[blog post about using the Top Level API](/react/blog/2015/10/01/react-render-and-top-level-api.html) <del>> <del>> 如果可能,上述的在同一个节点上再次调用 `ReactDOM.render()` 的方法是优先替代的。它往往使更新更容易理解。(两种方法并没有显著的性能区别。) <del>> <del>> 这个方法仅能在根组件上被调用。也就是说,它仅在直接传给 `ReactDOM.render()` 的组件上可用,在它的子级上不可用。如果你倾向于在子组件上使用 `setProps()`,不要利用响应式更新,而是当子组件在 `render()` 中创建的时候传入新的 prop 到子组件中。 <del> <del>### replaceProps <del> <del>```javascript <del>void replaceProps( <del> object nextProps, <del> [function callback] <del>) <del>``` <del> <del>类似于 `setProps()`,但是删除任何先前存在的 props,而不是合并这两个对象。 <del> <del>> 注意: <del>> <del>> 这个方法被弃用了并会很快移除.这个方法在从 `React.Component` 扩展的 ES6 `class` 组件里不可用. 取代调用 `replaceProps`,试着以新的 props 再次调用 `ReactDOM.render()`. 更多的注意事项,见我们的[blog post about using the Top Level API](/react/blog/2015/10/01/react-render-and-top-level-api.html)
4
Text
Text
fix backticks in docs
87562e470d0a38d1919c60941afbda5765f97ef7
<ide><path>website/docs/api/tokenizer.md <ide> it. <ide> <ide> ## Attributes {#attributes} <ide> <del>| Name | Type | Description | <del>| ---------------- | ------- | -------------------------------------------------------------------------------------------------------------------------- | <del>| `vocab` | `Vocab` | The vocab object of the parent `Doc`. | <del>| `prefix_search` | - | A function to find segment boundaries from the start of a string. Returns the length of the segment, or `None`. | <del>| `suffix_search` | - | A function to find segment boundaries from the end of a string. Returns the length of the segment, or `None`. | <del>| `infix_finditer` | - | A function to find internal segment separators, e.g. hyphens. Returns a (possibly empty) list of `re.MatchObject` objects. | <del>| `token_match` | - | A function matching the signature of `re.compile(string).match to find token matches. Returns an`re.MatchObject`or`None. | <del>| `rules` | dict | A dictionary of tokenizer exceptions and special cases. | <add>| Name | Type | Description | <add>| ---------------- | ------- | ----------------------------------------------------------------------------------------------------------------------------- | <add>| `vocab` | `Vocab` | The vocab object of the parent `Doc`. | <add>| `prefix_search` | - | A function to find segment boundaries from the start of a string. Returns the length of the segment, or `None`. | <add>| `suffix_search` | - | A function to find segment boundaries from the end of a string. Returns the length of the segment, or `None`. | <add>| `infix_finditer` | - | A function to find internal segment separators, e.g. hyphens. Returns a (possibly empty) list of `re.MatchObject` objects. | <add>| `token_match` | - | A function matching the signature of `re.compile(string).match` to find token matches. Returns an `re.MatchObject` or `None`. | <add>| `rules` | dict | A dictionary of tokenizer exceptions and special cases. 
| <ide> <ide> ## Serialization fields {#serialization-fields} <ide>
1
Ruby
Ruby
add new --cache command
169ac2d413d1a354e1a273759fac61faa2357c17
<ide><path>Library/Homebrew/cask/cmd.rb <ide> require "cask/cmd/options" <ide> <ide> require "cask/cmd/abstract_command" <add>require "cask/cmd/--cache" <ide> require "cask/cmd/audit" <ide> require "cask/cmd/automerge" <ide> require "cask/cmd/cat" <ide><path>Library/Homebrew/cask/cmd/--cache.rb <add># frozen_string_literal: true <add> <add>require "cask/download" <add> <add>module Cask <add> class Cmd <add> class Cache < AbstractCommand <add> def self.command_name <add> "--cache" <add> end <add> <add> def initialize(*) <add> super <add> raise CaskUnspecifiedError if args.empty? <add> end <add> <add> def run <add> casks.each do |cask| <add> puts Download.new(cask).downloader.cached_location <add> end <add> end <add> <add> def self.help <add> "display the file used to cache the Cask" <add> end <add> end <add> end <add>end
2
Mixed
Javascript
add support for compilestreaming
9c7f53e63cf1698a9cb1b20dd75237326bc7111e
<ide><path>examples/wasm-simple/README.md <ide> export function fibonacciJavascript(i) { <ide> /******/ var installedWasmModuleData = installedWasmModules[wasmModuleId]; <ide> /******/ <ide> /******/ // a Promise means "currently loading" or "already loaded". <del>/******/ if(installedWasmModuleData) { <del>/******/ promises.push(installedWasmModuleData); <del>/******/ } else { <del>/******/ var promise = installedWasmModules[wasmModuleId] = fetch(__webpack_require__.p + "" + {"1":"80925f35a6f1cf550d38","3":"3d28950d91bc7246f5af","4":"1d2268b99656e9575a63"}[wasmModuleId] + ".wasm") <del>/******/ .then(function(response) { return response.arrayBuffer(); }) <del>/******/ .then(function(bytes) { return WebAssembly.compile(bytes); }) <del>/******/ .then(function(module) { __webpack_require__.w[wasmModuleId] = module; }) <del>/******/ promises.push(promise); <del>/******/ } <add>/******/ promises.push(installedWasmModuleData || <add>/******/ promises.push(installedWasmModules[wasmModuleId] = fetch(__webpack_require__.p + "" + {"1":"80925f35a6f1cf550d38","3":"3d28950d91bc7246f5af","4":"1d2268b99656e9575a63"}[wasmModuleId] + ".wasm").then(function(response) { <add>/******/ if(WebAssembly.compileStreaming) { <add>/******/ return WebAssembly.compileStreaming(response); <add>/******/ } else { <add>/******/ return response.arrayBuffer().then(function(bytes) { return WebAssembly.compile(bytes); }); <add>/******/ } <add>/******/ }).then(function(module) { __webpack_require__.w[wasmModuleId] = module; })) <add>/******/ ); <ide> /******/ }); <ide> /******/ return Promise.all(promises); <ide> /******/ }; <ide> Version: webpack 3.8.1 <ide> 3d28950d91bc7246f5af.wasm 62 bytes 0, 1 [emitted] <ide> 1d2268b99656e9575a63.wasm 67 bytes 0, 1 [emitted] <ide> 1.output.js 486 bytes 1 [emitted] <del> output.js 8.9 kB 2 [emitted] main <add> output.js 8.94 kB 2 [emitted] main <ide> Entrypoint main = output.js <ide> chunk {0} 0.output.js, 80925f35a6f1cf550d38.wasm, 3d28950d91bc7246f5af.wasm, 
1d2268b99656e9575a63.wasm 585 bytes {2} [rendered] <ide> > [0] ./example.js 3:1-17 <ide> chunk {1} 1.output.js, 80925f35a6f1cf550d38.wasm 41 bytes {2} [rendered] <ide> chunk {2} output.js (main) 788 bytes [entry] [rendered] <ide> > main [0] ./example.js <ide> [0] ./example.js 788 bytes {2} [built] <add> single entry .\example.js main <ide> ``` <ide> <ide> ## Minimized (uglify-js, no zip) <ide> Version: webpack 3.8.1 <ide> 3d28950d91bc7246f5af.wasm 62 bytes 0, 1 [emitted] <ide> 1d2268b99656e9575a63.wasm 67 bytes 0, 1 [emitted] <ide> 1.output.js 155 bytes 1 [emitted] <del> output.js 8.74 kB 2 [emitted] main <add> output.js 8.78 kB 2 [emitted] main <ide> Entrypoint main = output.js <ide> chunk {0} 0.output.js, 80925f35a6f1cf550d38.wasm, 3d28950d91bc7246f5af.wasm, 1d2268b99656e9575a63.wasm 585 bytes {2} [rendered] <ide> > [0] ./example.js 3:1-17 <ide> chunk {1} 1.output.js, 80925f35a6f1cf550d38.wasm 41 bytes {2} [rendered] <ide> chunk {2} output.js (main) 788 bytes [entry] [rendered] <ide> > main [0] ./example.js <ide> [0] ./example.js 788 bytes {2} [built] <add> single entry .\example.js main <ide> <ide> ERROR in output.js from UglifyJs <ide> Unexpected token: operator (>) [output.js:194,95] <ide><path>lib/FetchCompileWasmMainTemplatePlugin.js <ide> class FetchCompileWasmMainTemplatePlugin { <ide> "var installedWasmModuleData = installedWasmModules[wasmModuleId];", <ide> "", <ide> "// a Promise means \"currently loading\" or \"already loaded\".", <del> "if(installedWasmModuleData) {", <add> "promises.push(installedWasmModuleData ||", <ide> this.indent([ <del> "promises.push(installedWasmModuleData);" <del> ]), <del> "} else {", <del> this.indent([ <del> `var promise = installedWasmModules[wasmModuleId] = fetch(${this.requireFn}.p + ${wasmModuleSrcPath})`, <add> `promises.push(installedWasmModules[wasmModuleId] = fetch(${this.requireFn}.p + ${wasmModuleSrcPath}).then(function(response) {`, <ide> this.indent([ <del> ".then(function(response) { return 
response.arrayBuffer(); })", <del> ".then(function(bytes) { return WebAssembly.compile(bytes); })", <del> `.then(function(module) { ${this.requireFn}.w[wasmModuleId] = module; })` <add> "if(WebAssembly.compileStreaming) {", <add> this.indent([ <add> "return WebAssembly.compileStreaming(response);" <add> ]), <add> "} else {", <add> this.indent([ <add> "return response.arrayBuffer().then(function(bytes) { return WebAssembly.compile(bytes); });", <add> ]), <add> "}" <ide> ]), <del> "promises.push(promise);" <add> `}).then(function(module) { ${this.requireFn}.w[wasmModuleId] = module; }))` <ide> ]), <del> "}", <add> ");", <ide> ]), <ide> "});", <ide> ]);
2
Java
Java
fix smooth scrolling on old devices (sdk >=16)
67be81968ea6044571f55683bb8263ed15d306e9
<ide><path>ReactAndroid/src/main/java/com/facebook/react/views/scroll/ReactScrollView.java <ide> public void fling(int velocityY) { <ide> // <ide> // Hence, we can use the absolute value from whatever the OS gives <ide> // us and use the sign of what mOnScrollDispatchHelper has tracked. <del> final int correctedVelocityY = (int)(Math.abs(velocityY) * Math.signum(mOnScrollDispatchHelper.getYFlingVelocity())); <del> <add> float signum = Math.signum(mOnScrollDispatchHelper.getYFlingVelocity()); <add> if (signum == 0) { <add> signum = Math.signum(velocityY); <add> } <add> final int correctedVelocityY = (int)(Math.abs(velocityY) * signum); <ide> <ide> if (mPagingEnabled) { <ide> flingAndSnap(correctedVelocityY);
1
Python
Python
modernize tokenizer tests for whitespace
667051375d4a0f1e88652c1b34980f6722c11730
<ide><path>spacy/tests/tokenizer/test_whitespace.py <ide> """Test that tokens are created correctly for whitespace.""" <add> <add> <ide> from __future__ import unicode_literals <ide> <ide> import pytest <ide> <ide> <del>def test_single_space(en_tokenizer): <del> tokens = en_tokenizer('hello possums') <add>@pytest.mark.parametrize('text', ["hello possums"]) <add>def test_tokenizer_splits_single_space(en_tokenizer, text): <add> tokens = en_tokenizer(text) <ide> assert len(tokens) == 2 <ide> <ide> <del>def test_double_space(en_tokenizer): <del> tokens = en_tokenizer('hello possums') <add>@pytest.mark.parametrize('text', ["hello possums"]) <add>def test_tokenizer_splits_double_space(en_tokenizer, text): <add> tokens = en_tokenizer(text) <ide> assert len(tokens) == 3 <del> assert tokens[1].orth_ == ' ' <add> assert tokens[1].text == " " <ide> <ide> <del>def test_newline(en_tokenizer): <del> tokens = en_tokenizer('hello\npossums') <add>@pytest.mark.parametrize('text', ["hello\npossums"]) <add>def test_tokenizer_splits_newline(en_tokenizer, text): <add> tokens = en_tokenizer(text) <ide> assert len(tokens) == 3 <add> assert tokens[1].text == "\n" <ide> <ide> <del>def test_newline_space(en_tokenizer): <add>@pytest.mark.parametrize('text', ["hello \npossums"]) <add>def test_tokenizer_splits_newline_space(en_tokenizer, text): <ide> tokens = en_tokenizer('hello \npossums') <ide> assert len(tokens) == 3 <ide> <ide> <del>def test_newline_double_space(en_tokenizer): <del> tokens = en_tokenizer('hello \npossums') <add>@pytest.mark.parametrize('text', ["hello \npossums"]) <add>def test_tokenizer_splits_newline_double_space(en_tokenizer, text): <add> tokens = en_tokenizer(text) <ide> assert len(tokens) == 3 <ide> <ide> <del>def test_newline_space_wrap(en_tokenizer): <del> tokens = en_tokenizer('hello \n possums') <add>@pytest.mark.parametrize('text', ["hello \n possums"]) <add>def test_tokenizer_splits_newline_space_wrap(en_tokenizer, text): <add> tokens = en_tokenizer(text) <ide> 
assert len(tokens) == 3 <ide> <ide>
1
Ruby
Ruby
test both mac and linux
d3507d9899e94c5371e8172162cc5082b242aec8
<ide><path>Library/Homebrew/test/language/java_spec.rb <ide> end <ide> end <ide> <add> let(:expected_home) do <add> if OS.mac? <add> f.opt_libexec/"openjdk.jdk/Contents/Home" <add> else <add> f.opt_libexec <add> end <add> end <add> <ide> before do <ide> allow(Formula).to receive(:[]).and_return(f) <ide> allow(f).to receive(:any_version_installed?).and_return(true) <ide> allow(f).to receive(:any_installed_version).and_return(f.version) <ide> end <ide> <ide> describe "::java_home" do <del> it "returns valid JAVA_HOME if version is specified", :needs_macos do <del> java_home = described_class.java_home("1.8+") <del> expect(java_home).to eql(f.opt_libexec/"openjdk.jdk/Contents/Home") <del> end <del> <del> it "returns valid JAVA_HOME if version is not specified", :needs_macos do <del> java_home = described_class.java_home <del> expect(java_home).to eql(f.opt_libexec/"openjdk.jdk/Contents/Home") <del> end <del> <del> it "returns valid JAVA_HOME if version is specified", :needs_linux do <add> it "returns valid JAVA_HOME if version is specified" do <ide> java_home = described_class.java_home("1.8+") <del> expect(java_home).to eql(f.opt_libexec) <add> expect(java_home).to eql(expected_home) <ide> end <ide> <del> it "returns valid JAVA_HOME if version is not specified", :needs_linux do <add> it "returns valid JAVA_HOME if version is not specified" do <ide> java_home = described_class.java_home <del> expect(java_home).to eql(f.opt_libexec) <add> expect(java_home).to eql(expected_home) <ide> end <ide> end <ide> <ide> describe "::java_home_env" do <ide> it "returns java_home path if version specified" do <ide> java_home_env = described_class.java_home_env("1.8+") <del> expect(java_home_env[:JAVA_HOME]).to include(f.opt_libexec.to_s) <add> expect(java_home_env[:JAVA_HOME]).to eql(expected_home.to_s) <ide> end <ide> <ide> it "returns java_home path if version is not specified" do <ide> java_home_env = described_class.java_home_env <del> expect(java_home_env[:JAVA_HOME]).to 
include(f.opt_libexec.to_s) <add> expect(java_home_env[:JAVA_HOME]).to eql(expected_home.to_s) <ide> end <ide> end <ide> <ide> describe "::overridable_java_home_env" do <ide> it "returns java_home path if version specified" do <ide> overridable_java_home_env = described_class.overridable_java_home_env("1.8+") <del> expect(overridable_java_home_env[:JAVA_HOME]).to include(f.opt_libexec.to_s) <add> expect(overridable_java_home_env[:JAVA_HOME]).to eql("${JAVA_HOME:-#{expected_home}}") <ide> end <ide> <ide> it "returns java_home path if version is not specified" do <ide> overridable_java_home_env = described_class.overridable_java_home_env <del> expect(overridable_java_home_env[:JAVA_HOME]).to include(f.opt_libexec.to_s) <add> expect(overridable_java_home_env[:JAVA_HOME]).to eql("${JAVA_HOME:-#{expected_home}}") <ide> end <ide> end <ide> end
1
Python
Python
fix eval_training_data bug
bc061e9e35eba1766faff634151c970b11dd08ca
<ide><path>research/object_detection/eval.py <ide> def main(unused_argv): <ide> <ide> model_config = configs['model'] <ide> eval_config = configs['eval_config'] <del> input_config = configs['eval_input_config'] <add> if FLAGS.eval_training_data: <add> input_config = configs['train_input_config'] <add> else: <add> input_config = configs['eval_input_config'] <ide> <ide> model_fn = functools.partial( <ide> model_builder.build,
1
Javascript
Javascript
add "reading" case
25515405401fcbb7bda68e55c49b4cc141f13972
<ide><path>website/src/react-native/showcase.js <ide> var apps = [ <ide> linkAppStore: 'https://itunes.apple.com/cn/app/hong-bei-bang-hai-liang-hong/id1007812319?mt=8', <ide> author: 'Hongbeibang' <ide> }, <add> { <add> name: 'Reading', <add> icon: 'http://7xr0xq.com1.z0.glb.clouddn.com/about_logo.png', <add> link: 'http://www.wandoujia.com/apps/com.reading', <add> author: 'RichardCao', <add> blogs: [ <add> 'http://richard-cao.github.io/2016/02/06/Reading-App-Write-In-React-Native/', <add> ], <add> }, <ide> ]; <ide> <ide> var AppList = React.createClass({
1
PHP
PHP
apply fixes from styleci
0bf5f8ec0c519ccb558c3280b5ce09f4f3f7486e
<ide><path>tests/Database/DatabaseEloquentIntegrationTest.php <ide> namespace Illuminate\Tests\Database; <ide> <ide> use Exception; <del>use ReflectionObject; <ide> use PHPUnit\Framework\TestCase; <ide> use Illuminate\Database\Eloquent\Collection; <ide> use Illuminate\Database\Eloquent\SoftDeletes;
1
Python
Python
add regression test for r4798 (ticket #658)
e3fbd7984862c82e5065360145bc26c5cb603b79
<ide><path>numpy/core/tests/test_regression.py <ide> def check_flat_index_byteswap(self, level=rlevel): <ide> x = np.array([-1,0,1],dtype=dt) <ide> assert_equal(x.flat[0].dtype, x[0].dtype) <ide> <add> def check_copy_detection_corner_case(self, level=rlevel): <add> """Ticket #658""" <add> np.indices((0,3,4)).T.reshape(-1,3) <add> <ide> if __name__ == "__main__": <ide> NumpyTest().run()
1
Python
Python
raise keyerror on missing message data
38d4248b6a78918b4194e76023f594148483be0c
<ide><path>celery/worker.py <ide> def __init__(self, task_name, task_id, task_func, args, kwargs): <ide> @classmethod <ide> def from_message(cls, message): <ide> message_data = simplejson.loads(message.body) <del> task_name = message_data.pop("task") <del> task_id = message_data.pop("id") <del> args = message_data.pop("args") <del> kwargs = message_data.pop("kwargs") <add> task_name = message_data["task"] <add> task_id = message_data["id"] <add> args = message_data["args"] <add> kwargs = message_data["kwargs"] <ide> if task_name not in tasks: <ide> message.reject() <ide> raise UnknownTask(task_name)
1
Ruby
Ruby
add test for `--binaries` default value
b91d0254bbd07b7b0ddf3906cb6c08684ebc07ef
<ide><path>Library/Homebrew/test/cask/cli_spec.rb <ide> ]) <ide> end <ide> <add> context "when no option is specified" do <add> it "--binaries is true by default" do <add> command = Hbc::CLI::Install.new("some-cask") <add> expect(command.binaries?).to be true <add> end <add> end <add> <ide> context "::run" do <ide> let(:noop_command) { double("CLI::Noop") } <ide>
1
Text
Text
note autoschema limitations on bare apiview
c7df69ab7703c7ed345a445fe1b16ac52c40f026
<ide><path>docs/api-guide/schemas.md <ide> appropriate Core API `Link` object for the view, request method and path: <ide> (In compiling the schema, `SchemaGenerator` calls `view.schema.get_link()` for <ide> each view, allowed method and path.) <ide> <add>--- <add> <add>**Note**: For basic `APIView` subclasses, default introspection is essentially <add>limited to the URL kwarg path parameters. For `GenericAPIView` <add>subclasses, which includes all the provided class based views, `AutoSchema` will <add>attempt to introspect serialiser, pagination and filter fields, as well as <add>provide richer path field descriptions. (The key hooks here are the relevant <add>`GenericAPIView` attributes and methods: `get_serializer`, `pagination_class`, <add>`filter_backends` and so on.) <add> <add>--- <add> <ide> To customise the `Link` generation you may: <ide> <ide> * Instantiate `AutoSchema` on your view with the `manual_fields` kwarg:
1
PHP
PHP
update requesthandler to psr 15 standard
83eea0efc71d1700b59c9658ba6d0adb4d2832ff
<ide><path>src/Routing/Middleware/RoutingMiddleware.php <ide> use Cake\Cache\Cache; <ide> use Cake\Core\HttpApplicationInterface; <ide> use Cake\Core\PluginApplicationInterface; <add>use Cake\Http\Middleware\CallableDecoratorMiddleware; <ide> use Cake\Http\MiddlewareQueue; <ide> use Cake\Http\Runner; <ide> use Cake\Routing\Exception\RedirectException; <ide> public function process(ServerRequestInterface $request, RequestHandlerInterface <ide> } <ide> $matching = Router::getRouteCollection()->getMiddleware($middleware); <ide> if (!$matching) { <del> return $handler->process($request); <add> return $handler->handle($request); <ide> } <add> <add> $matching[] = new CallableDecoratorMiddleware(function ($req, $handl) use ($handler) { <add> return $handler->handle($req); <add> }); <ide> $middleware = new MiddlewareQueue($matching); <ide> $runner = new Runner(); <ide> <ide><path>tests/TestCase/Routing/Middleware/RoutingMiddlewareTest.php <ide> use TestApp\Http\TestRequestHandler; <ide> use TestApp\Middleware\DumbMiddleware; <ide> use Zend\Diactoros\Response; <del>use Zend\Diactoros\ServerRequest; <ide> use Zend\Diactoros\ServerRequestFactory; <ide> <ide> /** <ide> public function testRedirectResponseWithHeaders() <ide> public function testRouterSetParams() <ide> { <ide> $request = ServerRequestFactory::fromGlobals(['REQUEST_URI' => '/articles']); <del> $response = new Response(); <del> $next = function (ServerRequest $req, $res) { <add> $handler = new TestRequestHandler(function ($req) { <ide> $expected = [ <ide> 'controller' => 'Articles', <ide> 'action' => 'index', <ide> public function testRouterSetParams() <ide> ]; <ide> $this->assertEquals($expected, $req->getAttribute('params')); <ide> <del> return $res; <del> }; <add> return new Response(); <add> }); <ide> $middleware = new RoutingMiddleware($this->app()); <ide> $middleware->process($request, $handler); <ide> } <ide> public function testPreservingExistingParams() <ide> { <ide> $request = 
ServerRequestFactory::fromGlobals(['REQUEST_URI' => '/articles']); <ide> $request = $request->withAttribute('params', ['_csrfToken' => 'i-am-groot']); <del> $response = new Response(); <del> $next = function (ServerRequest $req, $res) { <add> $handler = new TestRequestHandler(function ($req) { <ide> $expected = [ <ide> 'controller' => 'Articles', <ide> 'action' => 'index', <ide> public function testPreservingExistingParams() <ide> ]; <ide> $this->assertEquals($expected, $req->getAttribute('params')); <ide> <del> return $res; <del> }; <add> return new Response(); <add> }); <ide> $middleware = new RoutingMiddleware($this->app()); <ide> $middleware->process($request, $handler); <ide> } <ide> public function testRoutesHookInvokedOnApp() <ide> Router::reload(); <ide> <ide> $request = ServerRequestFactory::fromGlobals(['REQUEST_URI' => '/app/articles']); <del> $response = new Response(); <del> $next = function (ServerRequest $req, $res) { <add> $handler = new TestRequestHandler(function ($req) { <ide> $expected = [ <ide> 'controller' => 'Articles', <ide> 'action' => 'index', <ide> public function testRoutesHookInvokedOnApp() <ide> $this->assertNotEmpty(Router::routes()); <ide> $this->assertEquals('/app/articles', Router::routes()[0]->template); <ide> <del> return $res; <del> }; <add> return new Response(); <add> }); <ide> $app = new Application(CONFIG); <ide> $middleware = new RoutingMiddleware($app); <ide> $middleware->process($request, $handler); <ide> public function testRoutesHookCallsPluginHook() <ide> Router::reload(); <ide> <ide> $request = ServerRequestFactory::fromGlobals(['REQUEST_URI' => '/app/articles']); <del> $response = new Response(); <del> $next = function ($req, $res) { <del> return $res; <del> }; <ide> $app = $this->getMockBuilder(Application::class) <ide> ->setMethods(['pluginRoutes']) <ide> ->setConstructorArgs([CONFIG]) <ide> public function testRoutesHookCallsPluginHook() <ide> ->method('pluginRoutes') <ide> 
->with($this->isInstanceOf(RouteBuilder::class)); <ide> $middleware = new RoutingMiddleware($app); <del> $middleware->process($request, $handler); <add> $middleware->process($request, new TestRequestHandler()); <ide> } <ide> <ide> /** <ide> public function testRouterNoopOnController() <ide> { <ide> $request = ServerRequestFactory::fromGlobals(['REQUEST_URI' => '/articles']); <ide> $request = $request->withAttribute('params', ['controller' => 'Articles']); <del> $response = new Response(); <del> $next = function (ServerRequest $req, $res) { <add> $handler = new TestRequestHandler(function ($req) { <ide> $this->assertEquals(['controller' => 'Articles'], $req->getAttribute('params')); <ide> <del> return $res; <del> }; <add> return new Response(); <add> }); <ide> $middleware = new RoutingMiddleware($this->app()); <ide> $middleware->process($request, $handler); <ide> } <ide> public function testMissingRouteNotCaught() <ide> { <ide> $this->expectException(\Cake\Routing\Exception\MissingRouteException::class); <ide> $request = ServerRequestFactory::fromGlobals(['REQUEST_URI' => '/missing']); <del> $response = new Response(); <del> $next = function ($req, $res) { <del> return $res; <del> }; <ide> $middleware = new RoutingMiddleware($this->app()); <del> $middleware->process($request, $handler); <add> $middleware->process($request, new TestRequestHandler()); <ide> } <ide> <ide> /** <ide> public function testFakedRequestMethodParsed() <ide> null, <ide> ['_method' => 'PATCH'] <ide> ); <del> $response = new Response(); <del> $next = function (ServerRequest $req, $res) { <add> $handler = new TestRequestHandler(function ($req) { <ide> $expected = [ <ide> 'controller' => 'Articles', <ide> 'action' => 'index', <ide> public function testFakedRequestMethodParsed() <ide> $this->assertEquals($expected, $req->getAttribute('params')); <ide> $this->assertEquals('PATCH', $req->getMethod()); <ide> <del> return $res; <del> }; <add> return new Response(); <add> }); <ide> $middleware = new 
RoutingMiddleware($this->app()); <ide> $middleware->process($request, $handler); <ide> } <ide> public function testInvokeScopedMiddleware() <ide> 'REQUEST_METHOD' => 'GET', <ide> 'REQUEST_URI' => '/api/ping', <ide> ]); <del> $response = new Response(); <del> $next = function ($req, $res, $next) { <add> $handler = new TestRequestHandler(function ($req) { <ide> $this->log[] = 'last'; <ide> <del> return $next($req, $res); <del> }; <add> return new Response(); <add> }); <ide> $middleware = new RoutingMiddleware($this->app()); <ide> $result = $middleware->process($request, $handler); <ide> $this->assertSame(['second', 'first', 'last'], $this->log); <ide> public function testInvokeScopedMiddlewareReturnResponse() <ide> 'REQUEST_METHOD' => 'GET', <ide> 'REQUEST_URI' => '/api/articles', <ide> ]); <del> $response = new Response(); <del> $next = function ($req, $res) { <add> $handler = new TestRequestHandler(function ($req) { <ide> $this->fail('Should not be invoked as first should be ignored.'); <del> }; <add> }); <ide> $middleware = new RoutingMiddleware($this->app()); <ide> $result = $middleware->process($request, $handler); <ide> <ide> public function testInvokeScopedMiddlewareReturnResponseMainScope() <ide> 'REQUEST_METHOD' => 'GET', <ide> 'REQUEST_URI' => '/', <ide> ]); <del> $response = new Response(); <del> $next = function ($req, $res) { <del> $this->fail('Should not be invoked as second should be ignored.'); <del> }; <add> $handler = new TestRequestHandler(function ($req) { <add> $this->fail('Should not be invoked as first should be ignored.'); <add> }); <ide> $middleware = new RoutingMiddleware($this->app()); <ide> $result = $middleware->process($request, $handler); <ide> <ide> public function testInvokeScopedMiddlewareIsolatedScopes($url, $expected) <ide> 'REQUEST_METHOD' => 'GET', <ide> 'REQUEST_URI' => $url, <ide> ]); <del> $response = new Response(); <del> $next = function ($req, $res, $next) { <add> $handler = new TestRequestHandler(function ($req) { <ide> 
$this->log[] = 'last'; <ide> <del> return $res; <del> }; <add> return new Response(); <add> }); <ide> $middleware = new RoutingMiddleware($this->app()); <ide> $result = $middleware->process($request, $handler); <ide> $this->assertSame($expected, $this->log);
2
Javascript
Javascript
remove lookup of undefined property
df92b3c64b3e1d74174ac85297fc7ce39ef5728e
<ide><path>lib/child_process.js <ide> function execSync(command, options) { <ide> if (inheritStderr && ret.stderr) <ide> process.stderr.write(ret.stderr); <ide> <del> const err = checkExecSyncError(ret, opts.args, command); <add> const err = checkExecSyncError(ret, undefined, command); <ide> <ide> if (err) <ide> throw err;
1
Javascript
Javascript
remove unused env vars
a555e013a0e879b524359f89267bf8a17d0ce203
<ide><path>api-server/src/server/boot/authentication.js <ide> import { removeCookies } from '../utils/getSetAccessToken'; <ide> import { decodeEmail } from '../../common/utils'; <ide> import { getRedirectParams } from '../utils/redirection'; <ide> <del>const isSignUpDisabled = !!process.env.DISABLE_SIGNUP; <del>if (isSignUpDisabled) { <del> console.log('fcc:boot:auth - Sign up is disabled'); <del>} <del> <ide> const passwordlessGetValidators = [ <ide> check('email') <ide> .isBase64() <ide><path>api-server/src/server/index.js <ide> const path = require('path'); <ide> require('dotenv').config({ path: path.resolve(__dirname, '../../../.env') }); <ide> <ide> const _ = require('lodash'); <del>const Rx = require('rx'); <ide> const loopback = require('loopback'); <ide> const boot = require('loopback-boot'); <ide> const createDebugger = require('debug'); <ide> if (sentry.dns === 'dsn_from_sentry_dashboard') { <ide> log('Sentry initialized'); <ide> } <ide> <del>Rx.config.longStackSupport = process.env.NODE_DEBUG !== 'production'; <ide> const app = loopback(); <ide> <ide> app.set('state namespace', '__fcc__');
2
Python
Python
add a test and cover edge case with parens
9cbe83ef0d03a07a491bd26ecbe508a348ed48a1
<ide><path>scripts/flaskext_migrate.py <ide> def fix_from_imports(red): <ide> if len(node.value) == 3: <ide> package = values[2].value <ide> modules = node.modules() <add> module_string = _get_modules(modules) <ide> if len(modules) > 1: <del> r = "{}," * len(modules) <ide> node.replace("from flask_%s import %s" <del> % (package, r.format(*modules)[:-1])) <add> % (package, module_string)) <ide> else: <ide> name = node.names()[0] <ide> node.replace("from flask_%s import %s as %s" <del> % (package, modules.pop(), name)) <add> % (package, module_string, name)) <ide> # Case 2 <ide> else: <ide> module = node.modules()[0] <ide> def fix_standard_imports(red): <ide> return red <ide> <ide> <add>def _get_modules(module): <add> """ <add> Takes a list of modules and converts into a string <add> <add> The module list can include parens, this function checks each element in <add> the list, if there is a paren then it does not add a comma before the next <add> element. Otherwise a comma and space is added. This is to preserve module <add> imports which are multi-line and/or occur within parens. While also not <add> affecting imports which are not enclosed. 
<add> """ <add> modules_string = [cur + ', ' if cur.isalnum() and next.isalnum() <add> else cur <add> for (cur, next) in zip(module, module[1:]+[''])] <add> <add> return ''.join(modules_string) <add> <add> <add>def check_user_input(): <add> """Exits and gives error message if no argument is passed in the shell.""" <add> if len(sys.argv) < 2: <add> sys.exit("No filename was included, please try again.") <add> <add> <ide> def fix(ast): <del> """Wrapper which allows for testing when not running from shell""" <add> """Wrapper which allows for testing when not running from shell.""" <ide> return fix_imports(ast).dumps() <ide> <add> <ide> if __name__ == "__main__": <del> if len(sys.argv) < 2: <del> sys.exit("No filename was included, please try again.") <add> check_user_input() <ide> input_file = sys.argv[1] <ide> ast = read_source(input_file) <ide> ast = fix_imports(ast) <ide><path>scripts/test_import_migration.py <add># Tester for the flaskext_migrate.py module located in flask/scripts/ <add># <add># Author: Keyan Pishdadian <add>import pytest <add>from redbaron import RedBaron <add>import flaskext_migrate as migrate <add> <add> <add>def test_simple_from_import(): <add> red = RedBaron("from flask.ext import foo") <add> output = migrate.fix(red) <add> assert output == "import flask_foo as foo" <add> <add> <add>def test_from_to_from_import(): <add> red = RedBaron("from flask.ext.foo import bar") <add> output = migrate.fix(red) <add> assert output == "from flask_foo import bar as bar" <add> <add> <add>def test_multiple_import(): <add> red = RedBaron("from flask.ext.foo import bar, foobar, something") <add> output = migrate.fix(red) <add> assert output == "from flask_foo import bar, foobar, something" <add> <add> <add>def test_multiline_import(): <add> red = RedBaron("from flask.ext.foo import \ <add> bar,\ <add> foobar,\ <add> something") <add> output = migrate.fix(red) <add> assert output == "from flask_foo import bar, foobar, something" <add> <add> <add>def 
test_module_import(): <add> red = RedBaron("import flask.ext.foo") <add> output = migrate.fix(red) <add> assert output == "import flask_foo" <add> <add> <add>def test_module_import(): <add> red = RedBaron("from flask.ext.foo import bar as baz") <add> output = migrate.fix(red) <add> assert output == "from flask_foo import bar as baz" <add> <add> <add>def test_parens_import(): <add> red = RedBaron("from flask.ext.foo import (bar, foo, foobar)") <add> output = migrate.fix(red) <add> assert output == "from flask_foo import (bar, foo, foobar)"
2