| content_type | main_lang | message | sha | patch | file_count |
|---|---|---|---|---|---|
Text | Text | add documentation about action cable npm package | 1713075a80ecaaf18b1d8927aaa3b9836cc68214 | <ide><path>actioncable/README.md
<ide> with all the popular application servers -- Unicorn, Puma and Passenger.
<ide> Action Cable does not work with WEBrick, because WEBrick does not support the
<ide> Rack socket hijacking API.
<ide>
<add>## Frontend assets
<add>
<add>Action Cable's frontend assets are distributed through two channels: the
<add>official gem and npm package, both titled `actioncable`.
<add>
<add>### Gem usage
<add>
<add>Through the `actioncable` gem, Action Cable's frontend assets are
<add>available through the Rails Asset Pipeline. Create a `cable.js` or
<add>`cable.coffee` file (this is automatically done for you with Rails
<add>generators), and then simply require the assets:
<add>
<add>In JavaScript...
<add>
<add>```javascript
<add>//= require action_cable
<add>```
<add>
<add>... and in CoffeeScript:
<add>
<add>```coffeescript
<add>#= require action_cable
<add>```
<add>
<add>### npm usage
<add>
<add>In addition to being available through the `actioncable` gem, Action Cable's
<add>frontend JS assets are also bundled in an officially supported npm module,
<add>intended for usage in standalone frontend applications that communicate with a
<add>Rails application. A common use case for this could be if you have a decoupled
<add>frontend application written in React, Ember.js, etc. and want to add real-time
<add>WebSocket functionality.
<add>
<add>### Installation
<add>
<add>```
<add>npm install actioncable --save
<add>```
<add>
<add>### Usage
<add>
<add>The `ActionCable` constant is available as a `require`-able module, so
<add>you only have to require the package to gain access to the API that is
<add>provided.
<add>
<add>In JavaScript...
<add>
<add>```javascript
<add>ActionCable = require('actioncable')
<add>
<add>var cable = ActionCable.createConsumer('wss://RAILS-API-PATH.com/cable')
<add>
<add>cable.subscriptions.create('AppearanceChannel', {
<add> // normal channel code goes here...
<add>});
<add>```
<add>
<add>and in CoffeeScript...
<add>
<add>```coffeescript
<add>ActionCable = require('actioncable')
<add>
<add>cable = ActionCable.createConsumer('wss://RAILS-API-PATH.com/cable')
<add>
<add>cable.subscriptions.create 'AppearanceChannel',
<add> # normal channel code goes here...
<add>```
<add>
<ide> ## License
<ide>
<ide> Action Cable is released under the MIT license: | 1 |
Javascript | Javascript | use err_debugger_startup_error in _inspect.js | 767996047c0e05ffd4a49bdd08c5e031bd874dc2 | <ide><path>lib/internal/inspector/_inspect.js
<ide> const {
<ide> ArrayPrototypePop,
<ide> ArrayPrototypeShift,
<ide> ArrayPrototypeSlice,
<del> Error,
<ide> FunctionPrototypeBind,
<ide> Number,
<ide> Promise,
<ide> const { 0: InspectClient, 1: createRepl } =
<ide>
<ide> const debuglog = util.debuglog('inspect');
<ide>
<del>class StartupError extends Error {
<del> constructor(message) {
<del> super(message);
<del> this.name = 'StartupError';
<del> }
<del>}
<add>const { ERR_DEBUGGER_STARTUP_ERROR } = require('internal/errors').codes;
<ide>
<ide> async function portIsFree(host, port, timeout = 9999) {
<ide> if (port === 0) return; // Binding to a random port.
<ide> async function portIsFree(host, port, timeout = 9999) {
<ide> while (true) {
<ide> await asyncIterator.next();
<ide> if (signal.aborted) {
<del> throw new StartupError( // eslint-disable-line no-restricted-syntax
<add> throw new ERR_DEBUGGER_STARTUP_ERROR(
<ide> `Timeout (${timeout}) waiting for ${host}:${port} to be free`);
<ide> }
<ide> const error = await new Promise((resolve) => {
<ide> function startInspect(argv = ArrayPrototypeSlice(process.argv, 2),
<ide> stdin.resume();
<ide>
<ide> function handleUnexpectedError(e) {
<del> if (!(e instanceof StartupError)) {
<add> if (e.code !== 'ERR_DEBUGGER_STARTUP_ERROR') {
<ide> console.error('There was an internal error in Node.js. ' +
<ide> 'Please report this bug.');
<ide> console.error(e.message); | 1 |
PHP | PHP | remove unused use statement | d5172d47326714a564b786e61800f8083f9d3d20 | <ide><path>src/Illuminate/Database/Console/Migrations/FreshCommand.php
<ide> use Illuminate\Console\ConfirmableTrait;
<ide> use Illuminate\Contracts\Events\Dispatcher;
<ide> use Illuminate\Database\Events\DatabaseRefreshed;
<del>use Illuminate\Database\Migrations\Migrator;
<ide> use Symfony\Component\Console\Input\InputOption;
<ide>
<ide> class FreshCommand extends Command | 1 |
Javascript | Javascript | improve grouping and sorting of api nav items | 3fbc25718eae6b7fc67f822755f68f8f53b1ef8a | <ide><path>docs/config/processors/pages-data.js
<ide> var navGroupMappers = {
<ide> delete docTypes.module;
<ide> })
<ide>
<add> .tap(function(docTypes) {
<add> if ( docTypes.input ) {
<add> docTypes.directive = docTypes.directive || [];
<add> // Combine input docTypes into directive docTypes
<add> docTypes.directive = docTypes.directive.concat(docTypes.input);
<add> delete docTypes.input;
<add> }
<add> })
<add>
<ide> .forEach(function(sectionPages, sectionName) {
<ide>
<add> sectionPages = _.sortBy(sectionPages, 'name');
<add>
<ide> if ( sectionPages.length > 0 ) {
<ide> // Push a navItem for this section
<ide> navItems.push({ | 1 |
Go | Go | fix undead containers | c3c08f76bec023218b632e4c688ff9fcda11fcef | <ide><path>daemon/delete.go
<ide> func (daemon *Daemon) commonRm(container *Container, forceRemove bool) (err erro
<ide> if err != nil && forceRemove {
<ide> daemon.idIndex.Delete(container.ID)
<ide> daemon.containers.Delete(container.ID)
<add> os.RemoveAll(container.root)
<ide> }
<ide> }()
<ide> | 1 |
PHP | PHP | put same methods that were in l4 | d75eab8dbe04d5d01a8539b07a9e89071e52eecd | <ide><path>src/Illuminate/Pagination/AbstractPaginator.php
<ide> public function count()
<ide> return $this->items->count();
<ide> }
<ide>
<add> /**
<add> * Get the paginator's underlying collection.
<add> *
<add> * @return \Illuminate\Support\Collection
<add> */
<add> public function getCollection()
<add> {
<add> return $this->items;
<add> }
<add>
<ide> /**
<ide> * Determine if the given item exists.
<ide> *
<ide> public function offsetUnset($key)
<ide> unset($this->items[$key]);
<ide> }
<ide>
<add> /**
<add> * Make dynamic calls into the collection.
<add> *
<add> * @param string $method
<add> * @param array $parameters
<add> * @return mixed
<add> */
<add> public function __call($method, $parameters)
<add> {
<add> return call_user_func_array([$this->getCollection(), $method], $parameters);
<add> }
<add>
<ide> /**
<ide> * Render the contents of the paginator when casting to string.
<ide> * | 1 |
Python | Python | add type hints to modeling_utils.py closes | e19b978151419fe0756ba852b145fccfc96dbeb4 | <ide><path>src/transformers/modeling_utils.py
<ide> import inspect
<ide> import logging
<ide> import os
<del>from typing import Callable, List, Tuple
<add>from typing import Callable, Dict, Iterable, List, Optional, Tuple
<ide>
<ide> import torch
<ide> from torch import Tensor, device, dtype, nn
<ide> def invert_attention_mask(self, encoder_attention_mask: Tensor) -> Tensor:
<ide>
<ide> return encoder_extended_attention_mask
<ide>
<del> def get_extended_attention_mask(self, attention_mask: Tensor, input_shape: tuple, device: device):
<add> def get_extended_attention_mask(self, attention_mask: Tensor, input_shape: Tuple, device: device) -> Tensor:
<ide> """Makes broadcastable attention mask and causal mask so that future and maked tokens are ignored.
<ide>
<ide> Arguments:
<ide> def get_extended_attention_mask(self, attention_mask: Tensor, input_shape: tuple
<ide> extended_attention_mask = (1.0 - extended_attention_mask) * -10000.0
<ide> return extended_attention_mask
<ide>
<del> def get_head_mask(self, head_mask, num_hidden_layers, is_attention_chunked=False):
<add> def get_head_mask(self, head_mask: Tensor, num_hidden_layers: int, is_attention_chunked: bool = False) -> Tensor:
<ide> """
<ide> # Prepare head mask if needed
<ide> # 1.0 in head_mask indicate we keep the head
<ide> def get_input_embeddings(self):
<ide> else:
<ide> raise NotImplementedError
<ide>
<del> def set_input_embeddings(self, value):
<add> def set_input_embeddings(self, value: nn.Module):
<ide> """
<ide> Set model's input embeddings
<ide>
<ide> def _tie_or_clone_weights(self, output_embeddings, input_embeddings):
<ide> if hasattr(output_embeddings, "out_features") and hasattr(input_embeddings, "num_embeddings"):
<ide> output_embeddings.out_features = input_embeddings.num_embeddings
<ide>
<del> def resize_token_embeddings(self, new_num_tokens=None):
<add> def resize_token_embeddings(self, new_num_tokens: Optional[int] = None):
<ide> """ Resize input token embeddings matrix of the model if new_num_tokens != config.vocab_size.
<ide> Take care of tying weights embeddings afterwards if the model class has a `tie_weights()` method.
<ide>
<ide> def _resize_token_embeddings(self, new_num_tokens):
<ide> self.set_input_embeddings(new_embeddings)
<ide> return self.get_input_embeddings()
<ide>
<del> def _get_resized_embeddings(self, old_embeddings, new_num_tokens=None):
<add> def _get_resized_embeddings(
<add> self, old_embeddings: torch.nn.Embedding, new_num_tokens: Optional[int] = None
<add> ) -> torch.nn.Embedding:
<ide> """ Build a resized Embedding Module from a provided token Embedding Module.
<ide> Increasing the size will add newly initialized vectors at the end
<ide> Reducing the size will remove vectors from the end
<ide>
<ide> Args:
<add> old_embeddings: ``torch.nn.Embedding``
<add> Old embeddings to be resized.
<ide> new_num_tokens: (`optional`) int
<ide> New number of tokens in the embedding matrix.
<ide> Increasing the size will add newly initialized vectors at the end
<ide> Reducing the size will remove vectors from the end
<ide> If not provided or None: return the provided token Embedding Module.
<del> Return: ``torch.nn.Embeddings``
<add> Return: ``torch.nn.Embedding``
<ide> Pointer to the resized Embedding Module or the old Embedding Module if new_num_tokens is None
<ide> """
<ide> if new_num_tokens is None:
<ide> def init_weights(self):
<ide> # Tie weights if needed
<ide> self.tie_weights()
<ide>
<del> def prune_heads(self, heads_to_prune):
<add> def prune_heads(self, heads_to_prune: Dict):
<ide> """ Prunes heads of the base model.
<ide>
<ide> Arguments:
<ide> def enforce_repetition_penalty_(self, lprobs, batch_size, num_beams, prev_output
<ide> @torch.no_grad()
<ide> def generate(
<ide> self,
<del> input_ids=None,
<del> max_length=None,
<del> min_length=None,
<del> do_sample=None,
<del> early_stopping=None,
<del> num_beams=None,
<del> temperature=None,
<del> top_k=None,
<del> top_p=None,
<del> repetition_penalty=None,
<del> bad_words_ids=None,
<del> bos_token_id=None,
<del> pad_token_id=None,
<del> eos_token_id=None,
<del> length_penalty=None,
<del> no_repeat_ngram_size=None,
<del> num_return_sequences=None,
<del> attention_mask=None,
<del> decoder_start_token_id=None,
<del> use_cache=None,
<add> input_ids: Optional[torch.LongTensor] = None,
<add> max_length: Optional[int] = None,
<add> min_length: Optional[int] = None,
<add> do_sample: Optional[bool] = None,
<add> early_stopping: Optional[bool] = None,
<add> num_beams: Optional[int] = None,
<add> temperature: Optional[float] = None,
<add> top_k: Optional[int] = None,
<add> top_p: Optional[float] = None,
<add> repetition_penalty: Optional[float] = None,
<add> bad_words_ids: Optional[Iterable[int]] = None,
<add> bos_token_id: Optional[int] = None,
<add> pad_token_id: Optional[int] = None,
<add> eos_token_id: Optional[int] = None,
<add> length_penalty: Optional[float] = None,
<add> no_repeat_ngram_size: Optional[int] = None,
<add> num_return_sequences: Optional[int] = None,
<add> attention_mask: Optional[torch.LongTensor] = None,
<add> decoder_start_token_id: Optional[int] = None,
<add> use_cache: Optional[bool] = None,
<ide> **model_specific_kwargs
<del> ):
<add> ) -> torch.LongTensor:
<ide> r""" Generates sequences for models with a LM head. The method currently supports greedy decoding, beam-search decoding, sampling with temperature, sampling with top-k or nucleus sampling.
<ide>
<ide> Adapted in part from `Facebook's XLM beam search code`_.
<ide> def _get_generated_ngrams(hypo_idx):
<ide> return banned_tokens
<ide>
<ide>
<del>def calc_banned_bad_words_ids(prev_input_ids, bad_words_ids):
<add>def calc_banned_bad_words_ids(prev_input_ids: Iterable[int], bad_words_ids: Iterable[int]) -> Iterable[int]:
<ide> banned_tokens = []
<ide>
<ide> def _tokens_match(prev_tokens, tokens):
<ide> def _tokens_match(prev_tokens, tokens):
<ide> return banned_tokens
<ide>
<ide>
<del>def top_k_top_p_filtering(logits, top_k=0, top_p=1.0, filter_value=-float("Inf"), min_tokens_to_keep=1):
<add>def top_k_top_p_filtering(
<add> logits: Tensor,
<add> top_k: int = 0,
<add> top_p: float = 1.0,
<add> filter_value: float = -float("Inf"),
<add> min_tokens_to_keep: int = 1,
<add>) -> Tensor:
<ide> """ Filter a distribution of logits using top-k and/or nucleus (top-p) filtering
<ide> Args:
<ide> logits: logits distribution shape (batch size, vocabulary size) | 1 |
Java | Java | append unique number to webflux server log prefix | 2fcee5ae58e61c9c999606c9ff6ad3424644b4b5 | <ide><path>spring-web/src/main/java/org/springframework/http/server/reactive/ReactorServerHttpRequest.java
<ide> /*
<del> * Copyright 2002-2019 the original author or authors.
<add> * Copyright 2002-2020 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide> import java.net.InetSocketAddress;
<ide> import java.net.URI;
<ide> import java.net.URISyntaxException;
<add>import java.util.concurrent.atomic.AtomicLong;
<ide>
<ide> import javax.net.ssl.SSLSession;
<ide>
<ide> */
<ide> class ReactorServerHttpRequest extends AbstractServerHttpRequest {
<ide>
<add> private static final AtomicLong logPrefixIndex = new AtomicLong(0);
<add>
<add>
<ide> private final HttpServerRequest request;
<ide>
<ide> private final NettyDataBufferFactory bufferFactory;
<ide> public <T> T getNativeRequest() {
<ide> @Override
<ide> @Nullable
<ide> protected String initId() {
<del> return this.request instanceof Connection ?
<del> ((Connection) this.request).channel().id().asShortText() : null;
<add> if (this.request instanceof Connection) {
<add> return ((Connection) this.request).channel().id().asShortText() +
<add> "-" + logPrefixIndex.incrementAndGet();
<add> }
<add> return null;
<ide> }
<ide>
<ide> } | 1 |
Text | Text | add docs working group | 9cd4b76d4dae24e041923c7383fd5244d02f8842 | <ide><path>WORKING_GROUPS.md
<ide> back in to the TSC.
<ide> * [Post-mortem](#post-mortem)
<ide> * [Intl](#intl)
<ide> * [HTTP](#http)
<add>* [Documentation](#documentation)
<ide>
<ide> #### Process:
<ide>
<ide> Its responsibilities are:
<ide> + Defining and adding common structures to the dumps generated
<ide> in order to support tools that want to introspect those dumps
<ide>
<add>### [Documentation](https://github.com/nodejs/docs)
<add>
<add>The Documentation working group exists to support the improvement of Node.js
<add>documentation, both in the core API documentation, and elsewhere, such as the
<add>Node.js website. Its intent is to work closely with Evangelism, Website, and
<add>Intl working groups to make excellent documentation available and accessible
<add>to all.
<add>
<add>Its responsibilities are:
<add>
<add>* Defining and maintaining documentation style and content standards.
<add>* Producing documentation in a format acceptable for the Website WG to consume.
<add>* Ensuring that Node's documentation addresses a wide variety of audiences.
<add>* Creating and operating a process for documentation review that produces
<add> quality documentation and avoids impeding the progress of Core work.
<add>
<ide> ## Starting a WG
<ide>
<ide> A Working Group is established by first defining a charter that can be | 1 |
Ruby | Ruby | add test case for `unscope` with unknown column | 15da1fb35b41a94bdd6b75b249f83572400843d7 | <ide><path>activerecord/test/cases/relations_test.rb
<ide> def test_unscope_with_subquery
<ide> assert_equal p2.first.comments, comments
<ide> end
<ide>
<add> def test_unscope_with_unknown_column
<add> comment = comments(:greetings)
<add> comment.update!(comments: 1)
<add>
<add> comments = Comment.where(comments: 1).unscope(where: :unknown_column)
<add> assert_equal [comment], comments
<add>
<add> comments = Comment.where(comments: 1).unscope(where: { comments: :unknown_column })
<add> assert_equal [comment], comments
<add> end
<add>
<ide> def test_unscope_specific_where_value
<ide> posts = Post.where(title: "Welcome to the weblog", body: "Such a lovely day")
<ide> | 1 |
Javascript | Javascript | remove forced optimization from tls | ea61ce518bed2b8d807062d2f8828739ad6ee693 | <ide><path>benchmark/tls/convertprotocols.js
<ide> function main(conf) {
<ide>
<ide> var i = 0;
<ide> var m = {};
<del> common.v8ForceOptimization(
<del> tls.convertNPNProtocols, ['ABC', 'XYZ123', 'FOO'], m);
<add> // First call dominates results
<add> if (n > 1) {
<add> tls.convertNPNProtocols(['ABC', 'XYZ123', 'FOO'], m);
<add> m = {};
<add> }
<ide> bench.start();
<ide> for (; i < n; i++) tls.convertNPNProtocols(['ABC', 'XYZ123', 'FOO'], m);
<ide> bench.end(n); | 1 |
Javascript | Javascript | fix ie whitespace issues in tests | 97a86c83363786bd8191609bd9b61259cd25f0ce | <ide><path>packages/ember-routing/tests/helpers/outlet_test.js
<ide> test("view should support connectOutlet for the main outlet", function() {
<ide> }));
<ide> });
<ide>
<del> equal(view.$().text(), 'HIBYE');
<add> // Replace whitespace for older IE
<add> equal(view.$().text().replace(/\s+/,''), 'HIBYE');
<ide> });
<ide>
<ide> test("outlet should support connectOutlet in slots in prerender state", function() {
<ide> test("outlet should support an optional name", function() {
<ide> }));
<ide> });
<ide>
<del> equal(view.$().text(), 'HIBYE');
<add> // Replace whitespace for older IE
<add> equal(view.$().text().replace(/\s+/,''), 'HIBYE');
<ide> });
<ide>
<ide> test("Outlets bind to the current view, not the current concrete view", function() {
<ide> test("view should support disconnectOutlet for the main outlet", function() {
<ide> }));
<ide> });
<ide>
<del> equal(view.$().text(), 'HIBYE');
<add> // Replace whitespace for older IE
<add> equal(view.$().text().replace(/\s+/,''), 'HIBYE');
<ide>
<ide> Ember.run(function() {
<ide> view.disconnectOutlet('main');
<ide> });
<ide>
<del> equal(view.$().text(), 'HI');
<add> // Replace whitespace for older IE
<add> equal(view.$().text().replace(/\s+/,''), 'HI');
<ide> }); | 1 |
Text | Text | add v3.4.0-beta.2 to changelog | ddec402d601c9791f2f8a7ca1c83265e0e48de6c | <ide><path>CHANGELOG.md
<ide> # Ember Changelog
<ide>
<add>### v3.4.0-beta.2 (August 06, 2018)
<add>
<add>- [#16857](https://github.com/emberjs/ember.js/pull/16857) [BUGFIX] Prevents the recursive redefinition of root chains
<add>- [#16854](https://github.com/emberjs/ember.js/pull/16854) [BUGFIX] Don't thread FactoryManager through createComponent
<add>- [#16853](https://github.com/emberjs/ember.js/pull/16853) [BUGFIX] Allow ArrayProxy#pushObjects to accept ArrayProxy again
<add>
<ide> ### v3.4.0-beta.1 (July 16, 2018)
<ide>
<ide> - [#16773](https://github.com/emberjs/ember.js/pull/16773) [FEATURE] Custom component manager (see [emberjs/rfcs#213](https://github.com/emberjs/rfcs/blob/master/text/0213-custom-components.md) for more details) | 1 |
Python | Python | add persian character and symbols | 2bda582135b5687609c4816182aa4cfae5d865f3 | <ide><path>spacy/lang/char_classes.py
<ide> _latin_lower = r'[\p{Ll}&&\p{Latin}]'
<ide> _latin_upper = r'[\p{Lu}&&\p{Latin}]'
<ide> _latin = r'[[\p{Ll}||\p{Lu}]&&\p{Latin}]'
<add>_persian = r'[\p{L}&&\p{Arabic}]'
<ide> _russian_lower = r'[ёа-я]'
<ide> _russian_upper = r'[ЁА-Я]'
<ide>
<ide> _upper = [_latin_upper, _russian_upper]
<ide> _lower = [_latin_lower, _russian_lower]
<del>_uncased = [_bengali, _hebrew]
<add>_uncased = [_bengali, _hebrew, _persian]
<ide>
<ide> ALPHA = merge_char_classes(_upper + _lower + _uncased)
<ide> ALPHA_LOWER = merge_char_classes(_lower + _uncased)
<ide>
<ide> _units = ('km km² km³ m m² m³ dm dm² dm³ cm cm² cm³ mm mm² mm³ ha µm nm yd in ft '
<ide> 'kg g mg µg t lb oz m/s km/h kmh mph hPa Pa mbar mb MB kb KB gb GB tb '
<del> 'TB T G M K % км км² км³ м м² м³ дм дм² дм³ см см² см³ мм мм² мм³ нм '
<add> 'TB T G M K % ٪ км км² км³ м м² м³ дм дм² дм³ см см² см³ мм мм² мм³ нм '
<ide> 'кг г мг м/с км/ч кПа Па мбар Кб КБ кб Мб МБ мб Гб ГБ гб Тб ТБ тб')
<del>_currency = r'\$ £ € ¥ ฿ US\$ C\$ A\$ ₽'
<add>_currency = r'\$ £ € ¥ ฿ US\$ C\$ A\$ ₽ ﷼'
<ide>
<ide> # These expressions contain various unicode variations, including characters
<ide> # used in Chinese (see #1333, #1340, #1351) – unless there are cross-language
<ide> # conflicts, spaCy's base tokenizer should handle all of those by default
<del>_punct = r'… …… , : ; \! \? ¿ ¡ \( \) \[ \] \{ \} < > _ # \* & 。 ? ! , 、 ; : ~ · ।'
<add>_punct = r'… …… , : ; \! \? ¿ ؟ ¡ \( \) \[ \] \{ \} < > _ # \* & 。 ? ! , 、 ; : ~ · । ، ؛'
<ide> _quotes = r'\' \'\' " ” “ `` ` ‘ ´ ‘‘ ’’ ‚ , „ » « 「 」 『 』 ( ) 〔 〕 【 】 《 》 〈 〉'
<ide> _hyphens = '- – — -- --- —— ~'
<ide> | 1 |
PHP | PHP | expose last smtp response | c1824071c9b0d552493e8373fe0b73c107b79a5b | <ide><path>lib/Cake/Network/Email/SmtpTransport.php
<ide> class SmtpTransport extends AbstractTransport {
<ide> */
<ide> protected $_content;
<ide>
<add>/**
<add> * The response of the last sent SMTP command.
<add> *
<add> * @var array
<add> */
<add> protected $_lastResponse = array();
<add>
<add>/**
<add> * Returns the response of the last sent SMTP command.
<add> *
<add> * A response consists of one or more lines containing a response
<add> * code and an optional response message text:
<add> * {{{
<add> * array(
<add> * array(
<add> * 'code' => '250',
<add> * 'message' => 'mail.example.com'
<add> * ),
<add> * array(
<add> * 'code' => '250',
<add> * 'message' => 'PIPELINING'
<add> * ),
<add> * array(
<add> * 'code' => '250',
<add> * 'message' => '8BITMIME'
<add> * ),
<add> * // etc...
<add> * )
<add> * }}}
<add> *
<add> * @return array
<add> */
<add> public function getLastResponse() {
<add> return $this->_lastResponse;
<add> }
<add>
<ide> /**
<ide> * Send mail
<ide> *
<ide> public function config($config = null) {
<ide> return $this->_config;
<ide> }
<ide>
<add>/**
<add> * Parses and stores the reponse lines in `'code' => 'message'` format.
<add> *
<add> * @param array $responseLines
<add> * @return void
<add> */
<add> protected function _bufferResponseLines(array $responseLines) {
<add> $response = array();
<add> foreach ($responseLines as $responseLine) {
<add> if (preg_match('/^(\d{3})(?:[\s\-]+(.*))?$/', $responseLine, $match)) {
<add> $response[] = array(
<add> 'code' => $match[1],
<add> 'message' => isset($match[2]) ? $match[2] : null
<add> );
<add> }
<add> }
<add> $this->_lastResponse = array_merge($this->_lastResponse, $response);
<add> }
<add>
<ide> /**
<ide> * Connect to SMTP Server
<ide> *
<ide> protected function _generateSocket() {
<ide> * @throws SocketException
<ide> */
<ide> protected function _smtpSend($data, $checkCode = '250') {
<add> $this->_lastResponse = array();
<add>
<ide> if ($data !== null) {
<ide> $this->_socket->write($data . "\r\n");
<ide> }
<ide> protected function _smtpSend($data, $checkCode = '250') {
<ide> $responseLines = explode("\r\n", rtrim($response, "\r\n"));
<ide> $response = end($responseLines);
<ide>
<add> $this->_bufferResponseLines($responseLines);
<add>
<ide> if (preg_match('/^(' . $checkCode . ')(.)/', $response, $code)) {
<ide> if ($code[2] === '-') {
<ide> continue;
<ide><path>lib/Cake/Test/Case/Network/Email/SmtpTransportTest.php
<ide> protected function _generateSocket() {
<ide> */
<ide> public function __call($method, $args) {
<ide> $method = '_' . $method;
<del> return $this->$method();
<add> return call_user_func_array(array($this, $method), $args);
<ide> }
<ide>
<ide> }
<ide> public function testEmptyConfigArray() {
<ide> $this->assertEquals($expected, $result);
<ide> }
<ide>
<add>/**
<add> * testGetLastResponse method
<add> *
<add> * @return void
<add> */
<add> public function testGetLastResponse() {
<add> $this->assertEmpty($this->SmtpTransport->getLastResponse());
<add>
<add> $this->socket->expects($this->any())->method('connect')->will($this->returnValue(true));
<add> $this->socket->expects($this->at(0))->method('read')->will($this->returnValue(false));
<add> $this->socket->expects($this->at(1))->method('read')->will($this->returnValue("220 Welcome message\r\n"));
<add> $this->socket->expects($this->at(2))->method('write')->with("EHLO localhost\r\n");
<add> $this->socket->expects($this->at(3))->method('read')->will($this->returnValue(false));
<add> $this->socket->expects($this->at(4))->method('read')->will($this->returnValue("250-PIPELINING\r\n"));
<add> $this->socket->expects($this->at(5))->method('read')->will($this->returnValue("250-SIZE 102400000\r\n"));
<add> $this->socket->expects($this->at(6))->method('read')->will($this->returnValue("250-VRFY\r\n"));
<add> $this->socket->expects($this->at(7))->method('read')->will($this->returnValue("250-ETRN\r\n"));
<add> $this->socket->expects($this->at(8))->method('read')->will($this->returnValue("250-STARTTLS\r\n"));
<add> $this->socket->expects($this->at(9))->method('read')->will($this->returnValue("250-AUTH PLAIN LOGIN\r\n"));
<add> $this->socket->expects($this->at(10))->method('read')->will($this->returnValue("250-AUTH=PLAIN LOGIN\r\n"));
<add> $this->socket->expects($this->at(11))->method('read')->will($this->returnValue("250-ENHANCEDSTATUSCODES\r\n"));
<add> $this->socket->expects($this->at(12))->method('read')->will($this->returnValue("250-8BITMIME\r\n"));
<add> $this->socket->expects($this->at(13))->method('read')->will($this->returnValue("250 DSN\r\n"));
<add> $this->SmtpTransport->connect();
<add>
<add> $expected = array(
<add> array('code' => '250', 'message' => 'PIPELINING'),
<add> array('code' => '250', 'message' => 'SIZE 102400000'),
<add> array('code' => '250', 'message' => 'VRFY'),
<add> array('code' => '250', 'message' => 'ETRN'),
<add> array('code' => '250', 'message' => 'STARTTLS'),
<add> array('code' => '250', 'message' => 'AUTH PLAIN LOGIN'),
<add> array('code' => '250', 'message' => 'AUTH=PLAIN LOGIN'),
<add> array('code' => '250', 'message' => 'ENHANCEDSTATUSCODES'),
<add> array('code' => '250', 'message' => '8BITMIME'),
<add> array('code' => '250', 'message' => 'DSN')
<add> );
<add> $result = $this->SmtpTransport->getLastResponse();
<add> $this->assertEquals($expected, $result);
<add>
<add> $email = new CakeEmail();
<add> $email->from('noreply@cakephp.org', 'CakePHP Test');
<add> $email->to('cake@cakephp.org', 'CakePHP');
<add>
<add> $this->socket->expects($this->at(0))->method('write')->with("MAIL FROM:<noreply@cakephp.org>\r\n");
<add> $this->socket->expects($this->at(1))->method('read')->will($this->returnValue(false));
<add> $this->socket->expects($this->at(2))->method('read')->will($this->returnValue("250 OK\r\n"));
<add> $this->socket->expects($this->at(3))->method('write')->with("RCPT TO:<cake@cakephp.org>\r\n");
<add> $this->socket->expects($this->at(4))->method('read')->will($this->returnValue(false));
<add> $this->socket->expects($this->at(5))->method('read')->will($this->returnValue("250 OK\r\n"));
<add>
<add> $this->SmtpTransport->setCakeEmail($email);
<add> $this->SmtpTransport->sendRcpt();
<add>
<add> $expected = array(
<add> array('code' => '250', 'message' => 'OK'),
<add> );
<add> $result = $this->SmtpTransport->getLastResponse();
<add> $this->assertEquals($expected, $result);
<add> }
<add>
<add>/**
<add> * testBufferResponseLines method
<add> *
<add> * @return void
<add> */
<add> public function testBufferResponseLines() {
<add> $reponseLines = array(
<add> '123',
<add> 'FOOBAR',
<add> '250-PIPELINING',
<add> '250-ENHANCEDSTATUSCODES',
<add> '250-8BITMIME',
<add> '250 DSN',
<add> );
<add> $this->SmtpTransport->bufferResponseLines($reponseLines);
<add>
<add> $expected = array(
<add> array('code' => '123', 'message' => null),
<add> array('code' => '250', 'message' => 'PIPELINING'),
<add> array('code' => '250', 'message' => 'ENHANCEDSTATUSCODES'),
<add> array('code' => '250', 'message' => '8BITMIME'),
<add> array('code' => '250', 'message' => 'DSN')
<add> );
<add> $result = $this->SmtpTransport->getLastResponse();
<add> $this->assertEquals($expected, $result);
<add> }
<ide> } | 2 |
Ruby | Ruby | add some more tests | 4710be2fb3a6f76c06a118f6d41e3c289c3f8324 | <ide><path>Library/Homebrew/test/cli/parser_spec.rb
<ide> expect { parser.parse([]) }.to raise_error(Homebrew::CLI::MinNamedArgumentsError)
<ide> end
<ide>
<add> it "treats a symbol as a single argument of the specified type" do
<add> formula_parser = described_class.new do
<add> named :formula
<add> end
<add> expect { formula_parser.parse([]) }.to raise_error(UsageError, /this command requires a formula argument/)
<add> end
<add>
<ide> it "doesn't allow more than the specified number of arguments" do
<ide> expect { parser.parse(["foo", "bar"]) }.to raise_error(Homebrew::CLI::MaxNamedArgumentsError)
<ide> end | 1 |
Go | Go | address simple failures in testps* | 75d107451a16161e2ff54753814936fbf6ca0d02 | <ide><path>integration-cli/docker_cli_ps_test.go
<ide> import (
<ide> "github.com/go-check/check"
<ide> )
<ide>
<add>var sleepCmd = "/bin/sleep"
<add>
<add>func init() {
<add> if daemonPlatform == "windows" {
<add> sleepCmd = "sleep"
<add> }
<add>}
<add>
<ide> func (s *DockerSuite) TestPsListContainersBase(c *check.C) {
<add> // Problematic on Windows as busybox doesn't support top
<ide> testRequires(c, DaemonIsLinux)
<ide> out, _ := dockerCmd(c, "run", "-d", "busybox", "top")
<ide> firstID := strings.TrimSpace(out)
<ide> func assertContainerList(out string, expected []string) bool {
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsListContainersSize(c *check.C) {
<add> // Problematic on Windows as it doesn't report the size correctly @swernli
<ide> testRequires(c, DaemonIsLinux)
<ide> dockerCmd(c, "run", "-d", "busybox", "echo", "hello")
<ide>
<ide> func (s *DockerSuite) TestPsListContainersSize(c *check.C) {
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsListContainersFilterStatus(c *check.C) {
<del> testRequires(c, DaemonIsLinux)
<del>
<ide> // start exited container
<ide> out, _ := dockerCmd(c, "run", "-d", "busybox")
<ide> firstID := strings.TrimSpace(out)
<ide> func (s *DockerSuite) TestPsListContainersFilterStatus(c *check.C) {
<ide> out, _, _ = dockerCmdWithTimeout(time.Second*60, "ps", "-a", "-q", "--filter=status=rubbish")
<ide> c.Assert(out, checker.Contains, "Unrecognised filter value for status", check.Commentf("Expected error response due to invalid status filter output: %q", out))
<ide>
<del> // pause running container
<del> out, _ = dockerCmd(c, "run", "-itd", "busybox")
<del> pausedID := strings.TrimSpace(out)
<del> dockerCmd(c, "pause", pausedID)
<del> // make sure the container is unpaused to let the daemon stop it properly
<del> defer func() { dockerCmd(c, "unpause", pausedID) }()
<del>
<del> out, _ = dockerCmd(c, "ps", "--no-trunc", "-q", "--filter=status=paused")
<del> containerOut = strings.TrimSpace(out)
<del> c.Assert(containerOut, checker.Equals, pausedID)
<add> // Windows doesn't support pausing of containers
<add> if daemonPlatform != "windows" {
<add> // pause running container
<add> out, _ = dockerCmd(c, "run", "-itd", "busybox")
<add> pausedID := strings.TrimSpace(out)
<add> dockerCmd(c, "pause", pausedID)
<add> // make sure the container is unpaused to let the daemon stop it properly
<add> defer func() { dockerCmd(c, "unpause", pausedID) }()
<add>
<add> out, _ = dockerCmd(c, "ps", "--no-trunc", "-q", "--filter=status=paused")
<add> containerOut = strings.TrimSpace(out)
<add> c.Assert(containerOut, checker.Equals, pausedID)
<add> }
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsListContainersFilterID(c *check.C) {
<del> testRequires(c, DaemonIsLinux)
<ide> // start container
<ide> out, _ := dockerCmd(c, "run", "-d", "busybox")
<ide> firstID := strings.TrimSpace(out)
<ide>
<ide> // start another container
<del> dockerCmd(c, "run", "-d", "busybox", "top")
<add> dockerCmd(c, "run", "-d", "busybox", sleepCmd, "60")
<ide>
<ide> // filter containers by id
<ide> out, _ = dockerCmd(c, "ps", "-a", "-q", "--filter=id="+firstID)
<ide> func (s *DockerSuite) TestPsListContainersFilterID(c *check.C) {
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsListContainersFilterName(c *check.C) {
<del> testRequires(c, DaemonIsLinux)
<ide> // start container
<ide> out, _ := dockerCmd(c, "run", "-d", "--name=a_name_to_match", "busybox")
<ide> firstID := strings.TrimSpace(out)
<ide>
<ide> // start another container
<del> dockerCmd(c, "run", "-d", "--name=b_name_to_match", "busybox", "top")
<add> dockerCmd(c, "run", "-d", "--name=b_name_to_match", "busybox", sleepCmd, "60")
<ide>
<ide> // filter containers by name
<ide> out, _ = dockerCmd(c, "ps", "-a", "-q", "--filter=name=a_name_to_match")
<ide> func (s *DockerSuite) TestPsListContainersFilterName(c *check.C) {
<ide> // - Run containers for each of those image (busybox, images_ps_filter_test1, images_ps_filter_test2)
<ide> // - Filter them out :P
<ide> func (s *DockerSuite) TestPsListContainersFilterAncestorImage(c *check.C) {
<del> testRequires(c, DaemonIsLinux)
<ide> // Build images
<ide> imageName1 := "images_ps_filter_test1"
<ide> imageID1, err := buildImage(imageName1,
<ide> func checkPsAncestorFilterOutput(c *check.C, out string, filterName string, expe
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsListContainersFilterLabel(c *check.C) {
<del> testRequires(c, DaemonIsLinux)
<ide> // start container
<ide> out, _ := dockerCmd(c, "run", "-d", "-l", "match=me", "-l", "second=tag", "busybox")
<ide> firstID := strings.TrimSpace(out)
<ide> func (s *DockerSuite) TestPsListContainersFilterLabel(c *check.C) {
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsListContainersFilterExited(c *check.C) {
<add> // TODO Windows CI: Enable for TP5. Fails on TP4
<ide> testRequires(c, DaemonIsLinux)
<del> dockerCmd(c, "run", "-d", "--name", "top", "busybox", "top")
<add> dockerCmd(c, "run", "-d", "--name", "sleep", "busybox", sleepCmd, "60")
<ide>
<ide> dockerCmd(c, "run", "--name", "zero1", "busybox", "true")
<ide> firstZero, err := getIDByName("zero1")
<ide> func (s *DockerSuite) TestPsListContainersFilterExited(c *check.C) {
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsRightTagName(c *check.C) {
<add>	// TODO: Investigate further why this fails on Windows-to-Windows CI
<ide> testRequires(c, DaemonIsLinux)
<ide> tag := "asybox:shmatest"
<ide> dockerCmd(c, "tag", "busybox", tag)
<ide>
<ide> var id1 string
<del> out, _ := dockerCmd(c, "run", "-d", "busybox", "top")
<add> out, _ := dockerCmd(c, "run", "-d", "busybox", sleepCmd, "60")
<ide> id1 = strings.TrimSpace(string(out))
<ide>
<ide> var id2 string
<del> out, _ = dockerCmd(c, "run", "-d", tag, "top")
<add> out, _ = dockerCmd(c, "run", "-d", tag, sleepCmd, "60")
<ide> id2 = strings.TrimSpace(string(out))
<ide>
<ide> var imageID string
<ide> out, _ = dockerCmd(c, "inspect", "-f", "{{.Id}}", "busybox")
<ide> imageID = strings.TrimSpace(string(out))
<ide>
<ide> var id3 string
<del> out, _ = dockerCmd(c, "run", "-d", imageID, "top")
<add> out, _ = dockerCmd(c, "run", "-d", imageID, sleepCmd, "60")
<ide> id3 = strings.TrimSpace(string(out))
<ide>
<ide> out, _ = dockerCmd(c, "ps", "--no-trunc")
<ide> func (s *DockerSuite) TestPsRightTagName(c *check.C) {
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsLinkedWithNoTrunc(c *check.C) {
<add> // Problematic on Windows as it doesn't support links as of Jan 2016
<ide> testRequires(c, DaemonIsLinux)
<del> dockerCmd(c, "run", "--name=first", "-d", "busybox", "top")
<del> dockerCmd(c, "run", "--name=second", "--link=first:first", "-d", "busybox", "top")
<add> dockerCmd(c, "run", "--name=first", "-d", "busybox", sleepCmd, "60")
<add> dockerCmd(c, "run", "--name=second", "--link=first:first", "-d", "busybox", sleepCmd, "60")
<ide>
<ide> out, _ := dockerCmd(c, "ps", "--no-trunc")
<ide> lines := strings.Split(strings.TrimSpace(string(out)), "\n")
<ide> func (s *DockerSuite) TestPsLinkedWithNoTrunc(c *check.C) {
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsGroupPortRange(c *check.C) {
<add> // Problematic on Windows as it doesn't support port ranges as of Jan 2016
<ide> testRequires(c, DaemonIsLinux)
<ide> portRange := "3800-3900"
<ide> dockerCmd(c, "run", "-d", "--name", "porttest", "-p", portRange+":"+portRange, "busybox", "top")
<ide> func (s *DockerSuite) TestPsGroupPortRange(c *check.C) {
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsWithSize(c *check.C) {
<add> // Problematic on Windows as it doesn't report the size correctly @swernli
<ide> testRequires(c, DaemonIsLinux)
<ide> dockerCmd(c, "run", "-d", "--name", "sizetest", "busybox", "top")
<ide>
<ide> func (s *DockerSuite) TestPsWithSize(c *check.C) {
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsListContainersFilterCreated(c *check.C) {
<del> testRequires(c, DaemonIsLinux)
<ide> // create a container
<ide> out, _ := dockerCmd(c, "create", "busybox")
<ide> cID := strings.TrimSpace(out)
<ide> func (s *DockerSuite) TestPsListContainersFilterCreated(c *check.C) {
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsFormatMultiNames(c *check.C) {
<add> // Problematic on Windows as it doesn't support link as of Jan 2016
<ide> testRequires(c, DaemonIsLinux)
<ide> //create 2 containers and link them
<ide> dockerCmd(c, "run", "--name=child", "-d", "busybox", "top")
<ide> func (s *DockerSuite) TestPsFormatMultiNames(c *check.C) {
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsFormatHeaders(c *check.C) {
<del> testRequires(c, DaemonIsLinux)
<ide> // make sure no-container "docker ps" still prints the header row
<ide> out, _ := dockerCmd(c, "ps", "--format", "table {{.ID}}")
<ide> c.Assert(out, checker.Equals, "CONTAINER ID\n", check.Commentf(`Expected 'CONTAINER ID\n', got %v`, out))
<ide>
<ide> // verify that "docker ps" with a container still prints the header row also
<del> dockerCmd(c, "run", "--name=test", "-d", "busybox", "top")
<add> dockerCmd(c, "run", "--name=test", "-d", "busybox", sleepCmd, "60")
<ide> out, _ = dockerCmd(c, "ps", "--format", "table {{.Names}}")
<ide> c.Assert(out, checker.Equals, "NAMES\ntest\n", check.Commentf(`Expected 'NAMES\ntest\n', got %v`, out))
<ide> }
<ide>
<ide> func (s *DockerSuite) TestPsDefaultFormatAndQuiet(c *check.C) {
<del> testRequires(c, DaemonIsLinux)
<ide> config := `{
<ide> "psFormat": "default {{ .ID }}"
<ide> }`
<ide> func (s *DockerSuite) TestPsDefaultFormatAndQuiet(c *check.C) {
<ide> err = ioutil.WriteFile(filepath.Join(d, "config.json"), []byte(config), 0644)
<ide> c.Assert(err, checker.IsNil)
<ide>
<del> out, _ := dockerCmd(c, "run", "--name=test", "-d", "busybox", "top")
<add> out, _ := dockerCmd(c, "run", "--name=test", "-d", "busybox", sleepCmd, "60")
<ide> id := strings.TrimSpace(out)
<ide>
<ide> out, _ = dockerCmd(c, "--config", d, "ps", "-q")
<ide> func (s *DockerSuite) TestPsDefaultFormatAndQuiet(c *check.C) {
<ide>
<ide> // Test for GitHub issue #12595
<ide> func (s *DockerSuite) TestPsImageIDAfterUpdate(c *check.C) {
<add>	// TODO: Investigate further why this fails on Windows-to-Windows CI.
<ide> testRequires(c, DaemonIsLinux)
<del>
<ide> originalImageName := "busybox:TestPsImageIDAfterUpdate-original"
<ide> updatedImageName := "busybox:TestPsImageIDAfterUpdate-updated"
<ide>
<ide> func (s *DockerSuite) TestPsImageIDAfterUpdate(c *check.C) {
<ide> originalImageID, err := getIDByName(originalImageName)
<ide> c.Assert(err, checker.IsNil)
<ide>
<del> runCmd = exec.Command(dockerBinary, "run", "-d", originalImageName, "top")
<add> runCmd = exec.Command(dockerBinary, "run", "-d", originalImageName, sleepCmd, "60")
<ide> out, _, err = runCommandWithOutput(runCmd)
<ide> c.Assert(err, checker.IsNil)
<ide> 	containerID := strings.TrimSpace(out)
Ruby | Ruby | fix warning of shadowing variable | 9e53488b1d8d535182a989bd38fbb171aebbeef5 | <ide><path>lib/arel/visitors/visitor.rb
<ide> def accept object
<ide>
<ide> DISPATCH = Hash.new do |hash, visitor_class|
<ide> hash[visitor_class] =
<del> Hash.new do |hash, node_class|
<del> hash[node_class] = "visit_#{(node_class.name || '').gsub('::', '_')}"
<add> Hash.new do |method_hash, node_class|
<add> method_hash[node_class] = "visit_#{(node_class.name || '').gsub('::', '_')}"
<ide> end
<ide> end
<ide> | 1 |
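The fix above renames the inner block parameter so it no longer shadows the outer `hash`. A rough JavaScript analog of this two-level lazily-populated dispatch table (illustrative names only, not part of arel) might look like:

```javascript
// Two-level lazy cache: visitor name -> node class name -> method name.
// The inner map gets its own name, mirroring the shadowing fix above.
function createDispatch() {
  const byVisitor = new Map();
  return function dispatch(visitorName, nodeName) {
    let methodHash = byVisitor.get(visitorName);
    if (methodHash === undefined) {
      methodHash = new Map(); // distinct name avoids shadowing byVisitor
      byVisitor.set(visitorName, methodHash);
    }
    let method = methodHash.get(nodeName);
    if (method === undefined) {
      method = `visit_${(nodeName || "").split("::").join("_")}`;
      methodHash.set(nodeName, method);
    }
    return method;
  };
}
```

Repeated lookups for the same pair hit the cache rather than recomputing the name.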
Ruby | Ruby | rewrite sanitizehelper docs | d4ec25a3176c1e5f7083d0d39203d01c03cbb696 | <ide><path>actionview/lib/action_view/helpers/sanitize_helper.rb
<ide> module Helpers
<ide> # These helper methods extend Action View making them callable within your template files.
<ide> module SanitizeHelper
<ide> extend ActiveSupport::Concern
<del> # This +sanitize+ helper will HTML encode all tags and strip all attributes that
<del> # aren't specifically allowed.
<add> # Sanitizes HTML input, stripping all tags and attributes that aren't whitelisted.
<ide> #
<del> # It also strips href/src tags with invalid protocols, like javascript: especially.
<del> # It does its best to counter any tricks that hackers may use, like throwing in
<del> # unicode/ascii/hex values to get past the javascript: filters. Check out
<del> # the extensive test suite.
<add> # It also strips href/src attributes with unsafe protocols like
<add> # <tt>javascript:</tt>, while also protecting against attempts to use Unicode,
<add> # ASCII, and hex character references to work around these protocol filters.
<ide> #
<del> # <%= sanitize @article.body %>
<add> # The default sanitizer is Rails::Html::WhiteListSanitizer. See {Rails HTML
<add> # Sanitizers}[https://github.com/rails/rails-html-sanitizer] for more information.
<ide> #
<del> # You can add or remove tags/attributes if you want to customize it a bit.
<del> # See ActionView::Base for full docs on the available options. You can add
<del> # tags/attributes for single uses of +sanitize+ by passing either the
<del> # <tt>:attributes</tt> or <tt>:tags</tt> options:
<add> # Custom sanitization rules can also be provided.
<ide> #
<del> # Normal Use
<del> #
<del> # <%= sanitize @article.body %>
<add> # Please note that sanitizing user-provided text does not guarantee that the
<add> # resulting markup is valid or even well-formed. For example, the output may still
<add> # contain unescaped characters like <tt><</tt>, <tt>></tt>, or <tt>&</tt>.
<ide> #
<del> # Custom Use - Custom Scrubber
<del> # (supply a Loofah::Scrubber that does the sanitization)
<add> # ==== Options
<ide> #
<del> # scrubber can either wrap a block:
<add> # * <tt>:tags</tt> - An array of allowed tags.
<add> # * <tt>:attributes</tt> - An array of allowed attributes.
<add> # * <tt>:scrubber</tt> - A {Rails::Html scrubber}[https://github.com/rails/rails-html-sanitizer]
<add> # or {Loofah::Scrubber}[https://github.com/flavorjones/loofah] object that
<add> # defines custom sanitization rules. A custom scrubber takes precedence over
<add> # custom tags and attributes.
<ide> #
<del> # scrubber = Loofah::Scrubber.new do |node|
<del> # node.text = "dawn of cats"
<del> # end
<add> # ==== Examples
<ide> #
<del> # or be a subclass of Loofah::Scrubber which responds to scrub:
<add> # Normal use:
<ide> #
<del> # class KittyApocalypse < Loofah::Scrubber
<del> # def scrub(node)
<del> # node.text = "dawn of cats"
<del> # end
<del> # end
<del> # scrubber = KittyApocalypse.new
<add> # <%= sanitize @comment.body %>
<ide> #
<del> # <%= sanitize @article.body, scrubber: scrubber %>
<add> # Providing custom whitelisted tags and attributes:
<ide> #
<del> # A custom scrubber takes precedence over custom tags and attributes
<del> # Learn more about scrubbers here: https://github.com/flavorjones/loofah
<add> # <%= sanitize @comment.body, tags: %w(strong em a), attributes: %w(href) %>
<ide> #
<del> # Custom Use - tags and attributes
<del> # (only the mentioned tags and attributes are allowed, nothing else)
<add> # Providing a custom Rails::Html scrubber:
<ide> #
<del> # <%= sanitize @article.body, tags: %w(table tr td), attributes: %w(id class style) %>
<add> # class CommentScrubber < Rails::Html::PermitScrubber
<add> # def allowed_node?(node)
<add> # !%w(form script comment blockquote).include?(node.name)
<add> # end
<ide> #
<del> # Add table tags to the default allowed tags
<add> # def skip_node?(node)
<add> # node.text?
<add> # end
<ide> #
<del> # class Application < Rails::Application
<del> # config.action_view.sanitized_allowed_tags = ['table', 'tr', 'td']
<add> # def scrub_attribute?(name)
<add> # name == "style"
<add> # end
<ide> # end
<ide> #
<del> # Remove tags to the default allowed tags
<add> # <%= sanitize @comment.body, scrubber: CommentScrubber.new %>
<ide> #
<del> # class Application < Rails::Application
<del> # config.after_initialize do
<del> # ActionView::Base.sanitized_allowed_tags.delete 'div'
<del> # end
<del> # end
<add> # See {Rails HTML Sanitizer}[https://github.com/rails/rails-html-sanitizer] for
<add> # documentation about Rails::Html scrubbers.
<ide> #
<del> # Change allowed default attributes
<add> # Providing a custom Loofah::Scrubber:
<ide> #
<del> # class Application < Rails::Application
<del> # config.action_view.sanitized_allowed_attributes = ['id', 'class', 'style']
<add> # scrubber = Loofah::Scrubber.new do |node|
<add> # node.remove if node.name == "script"
<ide> # end
<ide> #
<del> # Please note that sanitizing user-provided text does not guarantee that the
<del> # resulting markup is valid (conforming to a document type) or even well-formed.
<del> # The output may still contain e.g. unescaped '<', '>', '&' characters and
<del> # confuse browsers.
<add> # <%= sanitize @comment.body, scrubber: scrubber %>
<add> #
<add> # See {Loofah's documentation}[https://github.com/flavorjones/loofah] for more
<add> # information about defining custom Loofah::Scrubber objects.
<add> #
<add> # To set the default allowed tags or attributes across your application:
<ide> #
<add> # # In config/application.rb
<add> # config.action_view.sanitized_allowed_tags = ['strong', 'em', 'a']
<add> # config.action_view.sanitized_allowed_attributes = ['href', 'title']
<ide> def sanitize(html, options = {})
<ide> self.class.white_list_sanitizer.sanitize(html, options).try(:html_safe)
<ide> end
<ide> def sanitize_css(style)
<ide> self.class.white_list_sanitizer.sanitize_css(style)
<ide> end
<ide>
<del> # Strips all HTML tags from the +html+, including comments. This uses
<del> # Nokogiri for tokenization (via Loofah) and so its HTML parsing ability
<del> # is limited by that of Nokogiri.
<add> # Strips all HTML tags from +html+, including comments.
<ide> #
<ide> # strip_tags("Strip <i>these</i> tags!")
<ide> # # => Strip these tags!
<ide> def strip_tags(html)
<ide> self.class.full_sanitizer.sanitize(html)
<ide> end
<ide>
<del> # Strips all link tags from +text+ leaving just the link text.
<add> # Strips all link tags from +html+ leaving just the link text.
<ide> #
<ide> # strip_links('<a href="http://www.rubyonrails.org">Ruby on Rails</a>')
<ide> # # => Ruby on Rails
<ide> def link_sanitizer
<ide> def white_list_sanitizer
<ide> @white_list_sanitizer ||= sanitizer_vendor.white_list_sanitizer.new
<ide> end
<del>
<del> ##
<del> # :method: sanitized_allowed_tags=
<del> #
<del> # :call-seq: sanitized_allowed_tags=(tags)
<del> #
<del> # Replaces the allowed tags for the +sanitize+ helper.
<del> #
<del> # class Application < Rails::Application
<del> # config.action_view.sanitized_allowed_tags = ['table', 'tr', 'td']
<del> # end
<del> #
<del>
<del> ##
<del> # :method: sanitized_allowed_attributes=
<del> #
<del> # :call-seq: sanitized_allowed_attributes=(attributes)
<del> #
<del> # Replaces the allowed HTML attributes for the +sanitize+ helper.
<del> #
<del> # class Application < Rails::Application
<del> # config.action_view.sanitized_allowed_attributes = ['onclick', 'longdesc']
<del> # end
<del> #
<ide> end
<ide> end
<ide> end | 1 |
Javascript | Javascript | ensure message sanity at source | daa97df343daa79fbd49c3d77d3ddca254ca9303 | <ide><path>lib/internal/child_process.js
<ide> function setupChannel(target, channel) {
<ide> if (message === undefined)
<ide> throw new ERR_MISSING_ARGS('message');
<ide>
<add> // Non-serializable messages should not reach the remote
<add> // end point; as any failure in the stringification there
<add> // will result in error message that is weakly consumable.
<add> // So perform a sanity check on message prior to sending.
<add> if (typeof message !== 'string' &&
<add> typeof message !== 'object' &&
<add> typeof message !== 'number' &&
<add> typeof message !== 'boolean') {
<add> throw new ERR_INVALID_ARG_TYPE(
<add> 'message', ['string', 'object', 'number', 'boolean'], message);
<add> }
<add>
<ide> // Support legacy function signature
<ide> if (typeof options === 'boolean') {
<ide> options = { swallowErrors: options };
<ide><path>test/parallel/test-child-process-fork.js
<ide> assert.throws(() => n.send(), {
<ide> code: 'ERR_MISSING_ARGS'
<ide> });
<ide>
<add>assert.throws(() => n.send(Symbol()), {
<add> name: 'TypeError [ERR_INVALID_ARG_TYPE]',
<add> message: 'The "message" argument must be one of type string,' +
<add> ' object, number, or boolean. Received type symbol',
<add> code: 'ERR_INVALID_ARG_TYPE'
<add>});
<ide> n.send({ hello: 'world' });
<ide>
<ide> n.on('exit', common.mustCall((c) => {
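The patch rejects non-serializable messages before they reach the remote end of the IPC channel, where a stringification failure would surface as a much less useful error. A standalone sketch of the same guard (a simplified stand-in, not the actual internal API):

```javascript
// Reject values that JSON-based IPC cannot represent, mirroring the
// typeof check added to internal/child_process.js.
function assertSendableMessage(message) {
  if (message === undefined) {
    throw new TypeError('The "message" argument must be specified');
  }
  const type = typeof message;
  if (type !== "string" && type !== "object" &&
      type !== "number" && type !== "boolean") {
    throw new TypeError(
      `The "message" argument must be one of type string, object, ` +
      `number, or boolean. Received type ${type}`);
  }
}
```

Running this check at the call site means a `Symbol()` or function argument fails synchronously in the sender, as the new test asserts.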
Javascript | Javascript | move methods from module into modulegraph | 3bb5263bfd603c9d9b7b582cef085f8c492325f8 | <ide><path>lib/Compilation.js
<ide> class Compilation {
<ide> dependentModule.profile = currentProfile;
<ide> }
<ide>
<del> dependentModule.setIssuer(this.moduleGraph, module);
<add> this.moduleGraph.setIssuer(dependentModule, module);
<ide> } else {
<ide> if (this.profile) {
<ide> if (module.profile) {
<ide><path>lib/FlagDependencyUsagePlugin.js
<ide> class FlagDependencyUsagePlugin {
<ide> * @returns {void}
<ide> */
<ide> const processModule = (module, usedExports) => {
<del> let ue = module.getUsedExports(moduleGraph);
<add> let ue = moduleGraph.getUsedExports(module);
<ide> if (ue === true) return;
<ide> if (usedExports === true) {
<del> module.setUsedExports(moduleGraph, (ue = true));
<add> moduleGraph.setUsedExports(module, (ue = true));
<ide> } else if (Array.isArray(usedExports)) {
<ide> if (!ue) {
<del> module.setUsedExports(
<del> moduleGraph,
<add> moduleGraph.setUsedExports(
<add> module,
<ide> (ue = new SortableSet(usedExports))
<ide> );
<ide> } else {
<ide> class FlagDependencyUsagePlugin {
<ide> }
<ide> } else {
<ide> if (ue !== false) return;
<del> module.setUsedExports(moduleGraph, (ue = new SortableSet()));
<add> moduleGraph.setUsedExports(module, (ue = new SortableSet()));
<ide> }
<ide>
<ide> // for a module without side effects we stop tracking usage here when no export is used
<ide> class FlagDependencyUsagePlugin {
<ide> if (!reference) return;
<ide> const referenceModule = reference.module;
<ide> const importedNames = reference.importedNames;
<del> const oldUsedExports = referenceModule.getUsedExports(moduleGraph);
<add> const oldUsedExports = moduleGraph.getUsedExports(referenceModule);
<ide> if (
<ide> !oldUsedExports ||
<ide> !isContained(oldUsedExports, importedNames)
<ide> class FlagDependencyUsagePlugin {
<ide> };
<ide>
<ide> for (const module of modules) {
<del> module.setUsedExports(moduleGraph, false);
<add> moduleGraph.setUsedExports(module, false);
<ide> }
<ide>
<ide> /** @type {[Module, DependenciesBlock, UsedExports][]} */
<ide><path>lib/FlagInitialModulesAsUsedPlugin.js
<ide> class FlagInitialModulesAsUsedPlugin {
<ide> return;
<ide> }
<ide> for (const module of chunkGraph.getChunkModulesIterable(chunk)) {
<del> module.setUsedExports(moduleGraph, true);
<add> moduleGraph.setUsedExports(module, true);
<ide> moduleGraph.addExtraReason(module, this.explanation);
<ide> }
<ide> }
<ide><path>lib/FunctionModuleTemplatePlugin.js
<ide> class FunctionModuleTemplatePlugin {
<ide> } else if (module.buildMeta.providedExports) {
<ide> source.add(Template.toComment("no static exports found") + "\n");
<ide> }
<del> const usedExports = module.getUsedExports(moduleGraph);
<add> const usedExports = moduleGraph.getUsedExports(module);
<ide> if (usedExports === true) {
<ide> source.add(Template.toComment("all exports used") + "\n");
<ide> } else if (usedExports === false) {
<ide> class FunctionModuleTemplatePlugin {
<ide> );
<ide> }
<ide> }
<del> const optimizationBailout = module.getOptimizationBailout(
<del> this.moduleGraph
<add> const optimizationBailout = this.moduleGraph.getOptimizationBailout(
<add> module
<ide> );
<ide> if (optimizationBailout) {
<ide> for (const text of optimizationBailout) {
<ide><path>lib/Module.js
<ide> const Template = require("./Template");
<ide> /** @typedef {KnownBuildMeta & Record<string, any>} BuildMeta */
<ide>
<ide> const EMPTY_RESOLVE_OPTIONS = {};
<del>const optimizationBailoutSymbol = Symbol("optimization bailout");
<del>const usedExportsSymbol = Symbol("used exports");
<del>const issuerSymbol = Symbol("issuer");
<ide>
<ide> let debugId = 1000;
<ide>
<ide> class Module extends DependenciesBlock {
<ide> return (this.buildInfo && this.buildInfo.moduleArgument) || "module";
<ide> }
<ide>
<del> /**
<del> * @param {ModuleGraph} moduleGraph the module graph
<del> * @returns {Module | null} the issuer module
<del> */
<del> getIssuer(moduleGraph) {
<del> const meta = moduleGraph.getMeta(this);
<del> return meta[issuerSymbol];
<del> }
<del>
<del> /**
<del> * @param {ModuleGraph} moduleGraph the module graph
<del> * @param {Module | null} issuer the issuer module
<del> * @returns {void}
<del> */
<del> setIssuer(moduleGraph, issuer) {
<del> const meta = moduleGraph.getMeta(this);
<del> meta[issuerSymbol] = issuer;
<del> }
<del>
<del> /**
<del> * @param {ModuleGraph} moduleGraph the module graph
<del> * @returns {(string | OptimizationBailoutFunction)[]} optimization bailouts
<del> */
<del> getOptimizationBailout(moduleGraph) {
<del> const meta = moduleGraph.getMeta(this);
<del> const list = meta[optimizationBailoutSymbol];
<del> if (list === undefined) {
<del> return (meta[optimizationBailoutSymbol] = []);
<del> }
<del> return list;
<del> }
<del>
<del> /**
<del> * @param {ModuleGraph} moduleGraph the module graph
<del> * @returns {false | true | SortableSet<string> | null} the used exports
<del> * false: module is not used at all.
<del> * true: the module namespace/object export is used.
<del> * SortableSet<string>: these export names are used.
<del> * empty SortableSet<string>: module is used but no export.
<del> * null: unknown, worst case should be assumed.
<del> */
<del> getUsedExports(moduleGraph) {
<del> const value = moduleGraph.getMeta(this)[usedExportsSymbol];
<del> return value === undefined ? null : value;
<del> }
<del>
<del> /**
<del> * @param {ModuleGraph} moduleGraph the module graph
<del> * @param {false | true | SortableSet<string>} usedExports the used exports
<del> * @returns {void}
<del> */
<del> setUsedExports(moduleGraph, usedExports) {
<del> moduleGraph.getMeta(this)[usedExportsSymbol] = usedExports;
<del> }
<del>
<ide> /**
<ide> * disconnect the module from the graph
<ide> * @returns {void}
<ide> class Module extends DependenciesBlock {
<ide> * @returns {boolean} true, if the module is used
<ide> */
<ide> isModuleUsed(moduleGraph) {
<del> return this.getUsedExports(moduleGraph) !== false;
<add> return moduleGraph.getUsedExports(this) !== false;
<ide> }
<ide>
<ide> /**
<ide> class Module extends DependenciesBlock {
<ide> * @returns {boolean} true, if the export is used
<ide> */
<ide> isExportUsed(moduleGraph, exportName) {
<del> const usedExports = this.getUsedExports(moduleGraph);
<add> const usedExports = moduleGraph.getUsedExports(this);
<ide> if (usedExports === null || usedExports === true) return true;
<ide> if (usedExports === false) return false;
<ide> return usedExports.has(exportName);
<ide> class Module extends DependenciesBlock {
<ide> * string, the mangled export name when used.
<ide> */
<ide> getUsedName(moduleGraph, exportName) {
<del> const usedExports = this.getUsedExports(moduleGraph);
<add> const usedExports = moduleGraph.getUsedExports(this);
<ide> if (usedExports === null || usedExports === true) return exportName;
<ide> if (usedExports === false) return false;
<ide> if (!usedExports.has(exportName)) return false;
<ide> class Module extends DependenciesBlock {
<ide> */
<ide> updateHash(hash, compilation) {
<ide> hash.update(`${this.id}`);
<del> const usedExports = this.getUsedExports(compilation.moduleGraph);
<add> const usedExports = compilation.moduleGraph.getUsedExports(this);
<ide> if (typeof usedExports === "boolean") {
<ide> hash.update(JSON.stringify(usedExports));
<ide> } else if (!usedExports) {
<ide> Object.defineProperty(Module.prototype, "isUsed", {
<ide>
<ide> Object.defineProperty(Module.prototype, "used", {
<ide> get() {
<del> throw new Error("Module.used was refactored (use getUsedExports instead)");
<add> throw new Error(
<add> "Module.used was refactored (use ModuleGraph.getUsedExports instead)"
<add> );
<ide> },
<ide> set(value) {
<del> throw new Error("Module.used was refactored (use setUsedExports instead)");
<add> throw new Error(
<add> "Module.used was refactored (use ModuleGraph.setUsedExports instead)"
<add> );
<ide> }
<ide> });
<ide>
<ide> Object.defineProperty(Module.prototype, "usedExports", {
<ide> get() {
<ide> throw new Error(
<del> "Module.usedExports was refactored (use getUsedExports instead)"
<add> "Module.usedExports was refactored (use ModuleGraph.getUsedExports instead)"
<ide> );
<ide> },
<ide> set(value) {
<ide> throw new Error(
<del> "Module.usedExports was refactored (use setUsedExports instead)"
<add> "Module.usedExports was refactored (use ModuleGraph.setUsedExports instead)"
<ide> );
<ide> }
<ide> });
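The refactor keeps the removed `Module` properties around only as tripwires: accessors that throw and point callers at the new `ModuleGraph` home of the state. The pattern, sketched in isolation (class and property names here are illustrative, not webpack's):

```javascript
// Replace a moved property with accessors that fail loudly,
// directing callers to the new location of the state.
function deprecateMovedProperty(proto, name, replacement) {
  Object.defineProperty(proto, name, {
    configurable: true,
    get() {
      throw new Error(`${name} was refactored (use ${replacement} instead)`);
    },
    set() {
      throw new Error(`${name} was refactored (use ${replacement} instead)`);
    }
  });
}

class LegacyNode {}
deprecateMovedProperty(LegacyNode.prototype, "usedExports", "Graph.getUsedExports");
```

Because the accessors live on the prototype, both reads and assignments on any instance are intercepted, so stale call sites are caught at the first touch instead of silently operating on dead state.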
<ide><path>lib/ModuleGraph.js
<ide> const ModuleGraphConnection = require("./ModuleGraphConnection");
<ide> /** @typedef {import("./DependenciesBlock")} DependenciesBlock */
<ide> /** @typedef {import("./Dependency")} Dependency */
<ide> /** @typedef {import("./Module")} Module */
<add>/** @typedef {import("./RequestShortener")} RequestShortener */
<add>/** @template T @typedef {import("./util/SortableSet")<T>} SortableSet<T> */
<add>
<add>/** @typedef {(requestShortener: RequestShortener) => string} OptimizationBailoutFunction */
<add>
<add>class ModuleGraphModule {
<add> constructor() {
<add> /** @type {Set<ModuleGraphConnection>} */
<add> this.incomingConnections = new Set();
<add> /** @type {Set<ModuleGraphConnection>} */
<add> this.outgoingConnections = new Set();
<add> /** @type {Module | null} */
<add> this.issuer = null;
<add> /** @type {(string | OptimizationBailoutFunction)[]} */
<add> this.optimizationBailout = [];
<add> /** @type {false | true | SortableSet<string> | null} */
<add> this.usedExports = null;
<add> }
<add>}
<ide>
<ide> class ModuleGraph {
<ide> constructor() {
<ide> /** @type {Map<Dependency, ModuleGraphConnection>} */
<ide> this._dependencyMap = new Map();
<del> /** @type {Map<Module, Set<ModuleGraphConnection>>} */
<add> /** @type {Map<Module, ModuleGraphModule>} */
<ide> this._moduleMap = new Map();
<ide> /** @type {Map<Module, Set<ModuleGraphConnection>>} */
<ide> this._originMap = new Map();
<ide> class ModuleGraph {
<ide> this._parentsMap = new Map();
<ide> }
<ide>
<del> _getModuleSet(module) {
<del> let connections = this._moduleMap.get(module);
<del> if (connections === undefined) {
<del> connections = new Set();
<del> this._moduleMap.set(module, connections);
<del> }
<del> return connections;
<del> }
<del>
<del> _getOriginSet(module) {
<del> let connections = this._originMap.get(module);
<del> if (connections === undefined) {
<del> connections = new Set();
<del> this._originMap.set(module, connections);
<add> /**
<add> * @param {Module} module the module
<add> * @returns {ModuleGraphModule} the internal module
<add> */
<add> _getModuleGraphModule(module) {
<add> let mgm = this._moduleMap.get(module);
<add> if (mgm === undefined) {
<add> mgm = new ModuleGraphModule();
<add> this._moduleMap.set(module, mgm);
<ide> }
<del> return connections;
<add> return mgm;
<ide> }
<ide>
<ide> /**
<ide> class ModuleGraph {
<ide> module
<ide> );
<ide> this._dependencyMap.set(dependency, connection);
<del> const connections = this._getModuleSet(module);
<add> const connections = this._getModuleGraphModule(module).incomingConnections;
<ide> connections.add(connection);
<del> const originConnections = this._getOriginSet(originModule);
<add> const originConnections = this._getModuleGraphModule(originModule)
<add> .outgoingConnections;
<ide> originConnections.add(connection);
<ide> }
<ide>
<ide> class ModuleGraph {
<ide> updateModule(dependency, module) {
<ide> const connection = this._dependencyMap.get(dependency);
<ide> if (connection.module === module) return;
<del> const oldSet = this._moduleMap.get(connection.module);
<del> oldSet.delete(connection);
<add> const oldMgm = this._getModuleGraphModule(connection.module);
<add> oldMgm.incomingConnections.delete(connection);
<ide> connection.module = module;
<del> const newSet = this._moduleMap.get(module);
<del> newSet.add(connection);
<add> const newMgm = this._getModuleGraphModule(module);
<add> newMgm.incomingConnections.add(connection);
<ide> }
<ide>
<ide> /**
<ide> class ModuleGraph {
<ide> */
<ide> replaceModule(oldModule, newModule, filterConnection) {
<ide> if (oldModule === newModule) return;
<del> const oldConnections = this._getOriginSet(oldModule);
<del> const newConnections = this._getOriginSet(newModule);
<add> const oldMgm = this._getModuleGraphModule(oldModule);
<add> const newMgm = this._getModuleGraphModule(newModule);
<add> const oldConnections = oldMgm.outgoingConnections;
<add> const newConnections = newMgm.outgoingConnections;
<ide> for (const connection of oldConnections) {
<ide> if (filterConnection(connection)) {
<ide> connection.originModule = newModule;
<ide> newConnections.add(connection);
<ide> oldConnections.delete(connection);
<ide> }
<ide> }
<del> const oldConnections2 = this._getModuleSet(oldModule);
<del> const newConnections2 = this._getModuleSet(newModule);
<add> const oldConnections2 = oldMgm.incomingConnections;
<add> const newConnections2 = newMgm.incomingConnections;
<ide> for (const connection of oldConnections2) {
<ide> if (filterConnection(connection)) {
<ide> connection.module = newModule;
<ide> class ModuleGraph {
<ide> * @returns {void}
<ide> */
<ide> addExtraReason(module, explanation) {
<del> const connections = this._getModuleSet(module);
<add> const connections = this._getModuleGraphModule(module).incomingConnections;
<ide> connections.add(new ModuleGraphConnection(null, null, module, explanation));
<ide> }
<ide>
<ide> class ModuleGraph {
<ide> * @returns {ModuleGraphConnection[]} reasons why a module is included
<ide> */
<ide> getIncomingConnections(module) {
<del> const connections = this._getModuleSet(module);
<add> const connections = this._getModuleGraphModule(module).incomingConnections;
<ide> return Array.from(connections);
<ide> }
<ide>
<ide> class ModuleGraph {
<ide> * @returns {ModuleGraphConnection[]} list of outgoing connections
<ide> */
<ide> getOutgoingConnection(module) {
<del> const connections = this._getOriginSet(module);
<add> const connections = this._getModuleGraphModule(module).outgoingConnections;
<ide> return Array.from(connections);
<ide> }
<ide>
<add> /**
<add> * @param {Module} module the module
<add> * @returns {Module | null} the issuer module
<add> */
<add> getIssuer(module) {
<add> const mgm = this._getModuleGraphModule(module);
<add> return mgm.issuer;
<add> }
<add>
<add> /**
<add> * @param {Module} module the module
<add> * @param {Module | null} issuer the issuer module
<add> * @returns {void}
<add> */
<add> setIssuer(module, issuer) {
<add> const mgm = this._getModuleGraphModule(module);
<add> mgm.issuer = issuer;
<add> }
<add>
<add> /**
<add> * @param {Module} module the module
<add> * @returns {(string | OptimizationBailoutFunction)[]} optimization bailouts
<add> */
<add> getOptimizationBailout(module) {
<add> const mgm = this._getModuleGraphModule(module);
<add> return mgm.optimizationBailout;
<add> }
<add>
<add> /**
<add> * @param {Module} module the module
<add> * @returns {false | true | SortableSet<string> | null} the used exports
<add> * false: module is not used at all.
<add> * true: the module namespace/object export is used.
<add> * SortableSet<string>: these export names are used.
<add> * empty SortableSet<string>: module is used but no export.
<add> * null: unknown, worst case should be assumed.
<add> */
<add> getUsedExports(module) {
<add> const mgm = this._getModuleGraphModule(module);
<add> return mgm.usedExports;
<add> }
<add>
<add> /**
<add> * @param {Module} module the module
<add> * @param {false | true | SortableSet<string>} usedExports the used exports
<add> * @returns {void}
<add> */
<add> setUsedExports(module, usedExports) {
<add> const mgm = this._getModuleGraphModule(module);
<add> mgm.usedExports = usedExports;
<add> }
<add>
<ide> /**
<ide> * @param {any} thing any thing
<ide> * @returns {Object} metadata
<ide><path>lib/Stats.js
<ide> class Stats {
<ide> if (showErrorDetails && e.missing) {
<ide> text += e.missing.map(item => `\n[${item}]`).join("");
<ide> }
<del> const origin = e.origin || (e.module && e.module.getIssuer(moduleGraph));
<add> const origin = e.origin || (e.module && moduleGraph.getIssuer(e.module));
<ide> if (showModuleTrace && origin) {
<ide> text += `\n @ ${this.formatFilePath(
<ide> origin.readableIdentifier(requestShortener)
<ide> class Stats {
<ide> text += ` ${locInfo}`;
<ide> }
<ide> }
<del> let current = origin.getIssuer(moduleGraph);
<add> let current = moduleGraph.getIssuer(origin);
<ide> while (current) {
<ide> text += `\n @ ${current.readableIdentifier(requestShortener)}`;
<del> current = current.getIssuer(moduleGraph);
<add> current = moduleGraph.getIssuer(current);
<ide> }
<ide> }
<ide> return text;
<ide> class Stats {
<ide>
<ide> const fnModule = (module, nested) => {
<ide> const path = [];
<del> const issuer = module.getIssuer(moduleGraph);
<add> const issuer = moduleGraph.getIssuer(module);
<ide> let current = issuer;
<ide> while (current) {
<ide> path.push(current);
<del> current = current.getIssuer(moduleGraph);
<add> current = moduleGraph.getIssuer(current);
<ide> }
<ide> path.reverse();
<ide> const obj = {
<ide> class Stats {
<ide> });
<ide> }
<ide> if (showUsedExports) {
<del> const usedExports = module.getUsedExports(moduleGraph);
<add> const usedExports = moduleGraph.getUsedExports(module);
<ide> if (usedExports === null) {
<ide> obj.usedExports = null;
<ide> } else if (typeof usedExports === "boolean") {
<ide> class Stats {
<ide> : null;
<ide> }
<ide> if (showOptimizationBailout) {
<del> obj.optimizationBailout = module
<del> .getOptimizationBailout(moduleGraph)
<add> obj.optimizationBailout = moduleGraph
<add> .getOptimizationBailout(module)
<ide> .map(item => {
<ide> if (typeof item === "function") return item(requestShortener);
<ide> return item;
<ide><path>lib/dependencies/HarmonyCompatibilityDependency.js
<ide> HarmonyCompatibilityDependency.Template = class HarmonyExportDependencyTemplate
<ide> * @returns {InitFragment[]|null} the init fragments
<ide> */
<ide> getInitFragments(dependency, { module, runtimeTemplate, moduleGraph }) {
<del> const usedExports = module.getUsedExports(moduleGraph);
<add> const usedExports = moduleGraph.getUsedExports(module);
<ide> if (usedExports === true || usedExports === null) {
<ide> const content = runtimeTemplate.defineEsModuleFlagStatement({
<ide> exportsArgument: module.exportsArgument
<ide><path>lib/dependencies/HarmonyExportImportedSpecifierDependency.js
<ide> const getHashValue = (moduleGraph, importedModule) => {
<ide> return "";
<ide> }
<ide>
<del> const usedExports = importedModule.getUsedExports(moduleGraph);
<add> const usedExports = moduleGraph.getUsedExports(importedModule);
<ide> const stringifiedUsedExports = JSON.stringify(usedExports);
<ide> const stringifiedProvidedExports = JSON.stringify(
<ide> importedModule.buildMeta.providedExports
<ide> class HarmonyExportImportedSpecifierDependency extends HarmonyImportDependency {
<ide> const id = this.id;
<ide> const parentModule = moduleGraph.getParentModule(this);
<ide> const used = parentModule.getUsedName(moduleGraph, name);
<del> const usedExports = parentModule.getUsedExports(moduleGraph);
<add> const usedExports = moduleGraph.getUsedExports(parentModule);
<ide> const importedModule = moduleGraph.getModule(this);
<ide>
<ide> if (!importedModule) {
<ide> HarmonyExportImportedSpecifierDependency.Template = class HarmonyExportImportedS
<ide> const activeFromOtherStarExports = dep._discoverActiveExportsFromOtherStartExports(
<ide> moduleGraph
<ide> );
<del> const usedExports = module.getUsedExports(moduleGraph);
<add> const usedExports = moduleGraph.getUsedExports(module);
<ide>
<ide> if (usedExports && usedExports !== true) {
<ide> // we know which exports are used
<ide><path>lib/dependencies/HarmonyImportSpecifierDependency.js
<ide> class HarmonyImportSpecifierDependency extends HarmonyImportDependency {
<ide> ""
<ide> );
<ide> if (importedModule) {
<del> const usedExports = importedModule.getUsedExports(moduleGraph);
<add> const usedExports = moduleGraph.getUsedExports(importedModule);
<ide> const stringifyUsedExports = JSON.stringify(usedExports);
<ide> hash.update(stringifyUsedExports);
<ide> }
<ide><path>lib/optimize/ConcatenatedModule.js
<ide> class ConcatenatedModule extends Module {
<ide> this.depth = rootModule.depth;
<ide>
<ide> // Info from Optimization
<del> this.setUsedExports(moduleGraph, rootModule.getUsedExports(moduleGraph));
<add> moduleGraph.setUsedExports(this, moduleGraph.getUsedExports(rootModule));
<ide>
<ide> const modulesArray = Array.from(modules);
<ide>
<ide> class ConcatenatedModule extends Module {
<ide> const result = new ConcatSource();
<ide>
<ide> // add harmony compatibility flag (must be first because of possible circular dependencies)
<del> const usedExports = this.rootModule.getUsedExports(moduleGraph);
<add> const usedExports = moduleGraph.getUsedExports(this.rootModule);
<ide> if (usedExports === true) {
<ide> result.add(
<ide> runtimeTemplate.defineEsModuleFlagStatement({
<ide><path>lib/optimize/ModuleConcatenationPlugin.js
<ide> class ModuleConcatenationPlugin {
<ide>
<ide> const setBailoutReason = (module, reason) => {
<ide> setInnerBailoutReason(module, reason);
<del> module
<del> .getOptimizationBailout(moduleGraph)
<add> moduleGraph
<add> .getOptimizationBailout(module)
<ide> .push(
<ide> typeof reason === "function"
<ide> ? rs => formatBailoutReason(reason(rs))
<ide> class ModuleConcatenationPlugin {
<ide> }
<ide> } else {
<ide> for (const warning of currentConfiguration.getWarningsSorted()) {
<del> currentRoot
<del> .getOptimizationBailout(moduleGraph)
<add> moduleGraph
<add> .getOptimizationBailout(currentRoot)
<ide> .push(formatBailoutWarning(warning[0], warning[1]));
<ide> }
<ide> }
<ide> class ModuleConcatenationPlugin {
<ide> compilation
<ide> );
<ide> for (const warning of concatConfiguration.getWarningsSorted()) {
<del> newModule
<del> .getOptimizationBailout(moduleGraph)
<add> moduleGraph
<add> .getOptimizationBailout(newModule)
<ide> .push(formatBailoutWarning(warning[0], warning[1]));
<ide> }
<ide> for (const m of modules) {
<ide><path>lib/wasm/WebAssemblyGenerator.js
<ide> class WebAssemblyGenerator extends Generator {
<ide> let bin = /** @type {ArrayBuffer} */ (sourceAsAny);
<ide> bin = preprocess(bin);
<ide>
<del> const usedExports = module.getUsedExports(moduleGraph);
<add> const usedExports = moduleGraph.getUsedExports(module);
<ide> const initFuncId = t.identifier(
<ide> usedExports && usedExports !== true
<ide> ? Template.numberToIdentifer(usedExports.size)
<ide><path>lib/wasm/WebAssemblyJavascriptGenerator.js
<ide> class WebAssemblyJavascriptGenerator extends Generator {
<ide> * @returns {Source} generated code
<ide> */
<ide> generate(module, { runtimeTemplate, moduleGraph }) {
<del> const usedExports = module.getUsedExports(moduleGraph);
<add> const usedExports = moduleGraph.getUsedExports(module);
<ide> const initIdentifer =
<ide> usedExports && usedExports !== true
<ide> ? Template.numberToIdentifer(usedExports.size) | 14 |
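The webpack patch above consistently replaces per-module accessors (`module.getUsedExports(moduleGraph)`, `module.getIssuer(moduleGraph)`) with lookups on the graph itself (`moduleGraph.getUsedExports(module)`, `moduleGraph.getIssuer(module)`), each backed by a `_getModuleGraphModule` side table. A minimal standalone sketch of that pattern follows — the `ModuleGraphModule` fields mirror the diff, but the `Map`-based storage and lazy creation are assumptions about the surrounding code not shown in this hunk, and the real webpack classes carry much more state:

```javascript
// Side-table record: per-module state that used to live on the Module itself.
class ModuleGraphModule {
  constructor() {
    this.incomingConnections = new Set();
    this.outgoingConnections = new Set();
    this.issuer = null;
    this.optimizationBailout = [];
    this.usedExports = null; // null = unknown, worst case should be assumed
  }
}

class ModuleGraph {
  constructor() {
    // Keyed by the module object; lets module state be graph-specific.
    this._moduleMap = new Map();
  }

  _getModuleGraphModule(module) {
    let mgm = this._moduleMap.get(module);
    if (mgm === undefined) {
      mgm = new ModuleGraphModule();
      this._moduleMap.set(module, mgm);
    }
    return mgm;
  }

  getIssuer(module) {
    return this._getModuleGraphModule(module).issuer;
  }

  setIssuer(module, issuer) {
    this._getModuleGraphModule(module).issuer = issuer;
  }

  getUsedExports(module) {
    return this._getModuleGraphModule(module).usedExports;
  }

  setUsedExports(module, usedExports) {
    this._getModuleGraphModule(module).usedExports = usedExports;
  }
}

// Tiny demo: modules are plain objects used purely as map keys here.
const entryModule = { identifier: './entry.js' };
const depModule = { identifier: './dep.js' };
const graph = new ModuleGraph();
graph.setIssuer(depModule, entryModule);
graph.setUsedExports(depModule, new Set(['default']));
```

The payoff of the refactor is visible in the Stats.js hunks that follow: callers like `moduleGraph.getIssuer(origin)` no longer need the module class to know about the graph at all.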
PHP | PHP | move exception to validation component | bf902a261296ee3f9344badf871e4f28e547a411 | <ide><path>src/Illuminate/Foundation/Exceptions/Handler.php
<ide> use Exception;
<ide> use Psr\Log\LoggerInterface;
<ide> use Illuminate\Http\Response;
<add>use Illuminate\Validation\ValidationException;
<ide> use Illuminate\Auth\Access\AuthorizationException;
<ide> use Illuminate\Http\Exception\HttpResponseException;
<ide> use Symfony\Component\Debug\Exception\FlattenException;
<ide> use Illuminate\Database\Eloquent\ModelNotFoundException;
<ide> use Symfony\Component\HttpKernel\Exception\HttpException;
<del>use Illuminate\Foundation\Validation\ValidationException;
<ide> use Symfony\Component\Console\Application as ConsoleApplication;
<ide> use Symfony\Component\HttpFoundation\Response as SymfonyResponse;
<ide> use Symfony\Component\HttpKernel\Exception\NotFoundHttpException;
<ide><path>src/Illuminate/Foundation/Validation/ValidationException.php
<ide> namespace Illuminate\Foundation\Validation;
<ide>
<ide> use Exception;
<add>use Illuminate\Validation\ValidationException as BaseException;
<ide>
<del>class ValidationException extends Exception
<add>/**
<add> * @deprecated since 5.3. Use Illuminate\Validation\ValidationException.
<add> */
<add>class ValidationException extends BaseException
<ide> {
<del> /**
<del> * The validator instance.
<del> *
<del> * @var \Illuminate\Validation\Validator
<del> */
<del> public $validator;
<del>
<del> /**
<del> * The recommended response to send to the client.
<del> *
<del> * @var \Illuminate\Http\Response|null
<del> */
<del> public $response;
<del>
<del> /**
<del> * Create a new exception instance.
<del> *
<del> * @param \Illuminate\Validation\Validator $validator
<del> * @param \Illuminate\Http\Response $response
<del> * @return void
<del> */
<del> public function __construct($validator, $response = null)
<del> {
<del> parent::__construct('The given data failed to pass validation.');
<del>
<del> $this->response = $response;
<del> $this->validator = $validator;
<del> }
<del>
<del> /**
<del> * Get the underlying response instance.
<del> *
<del> * @return \Symfony\Component\HttpFoundation\Response
<del> */
<del> public function getResponse()
<del> {
<del> return $this->response;
<del> }
<ide> }
<ide><path>src/Illuminate/Validation/ValidationException.php
<add><?php
<add>
<add>namespace Illuminate\Validation;
<add>
<add>use Exception;
<add>
<add>class ValidationException extends Exception
<add>{
<add> /**
<add> * The validator instance.
<add> *
<add> * @var \Illuminate\Validation\Validator
<add> */
<add> public $validator;
<add>
<add> /**
<add> * The recommended response to send to the client.
<add> *
<add> * @var \Illuminate\Http\Response|null
<add> */
<add> public $response;
<add>
<add> /**
<add> * Create a new exception instance.
<add> *
<add> * @param \Illuminate\Validation\Validator $validator
<add> * @param \Illuminate\Http\Response $response
<add> * @return void
<add> */
<add> public function __construct($validator, $response = null)
<add> {
<add> parent::__construct('The given data failed to pass validation.');
<add>
<add> $this->response = $response;
<add> $this->validator = $validator;
<add> }
<add>
<add> /**
<add> * Get the underlying response instance.
<add> *
<add> * @return \Symfony\Component\HttpFoundation\Response
<add> */
<add> public function getResponse()
<add> {
<add> return $this->response;
<add> }
<add>} | 3 |
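The Laravel commit above uses a common back-compat move: relocate the class to its proper component and leave a deprecated, empty subclass at the old path, so existing imports and `catch`/`instanceof` checks keep working. A rough JavaScript transliteration of the idea — the class, message, and property names mirror the PHP above, but this is an illustration of the pattern, not Laravel's actual code:

```javascript
// The relocated "real" class, mirroring Illuminate\Validation\ValidationException.
class ValidationException extends Error {
  constructor(validator, response = null) {
    super('The given data failed to pass validation.');
    this.validator = validator;
    this.response = response;
  }

  // Get the underlying response instance.
  getResponse() {
    return this.response;
  }
}

// What remains at the old location: an empty, deprecated subclass.
// Code that throws or catches the legacy name still works, because every
// legacy instance is also an instance of the relocated class.
/** @deprecated Use ValidationException from its new location instead. */
class LegacyValidationException extends ValidationException {}

const err = new LegacyValidationException({ failedRules: ['email'] }, { status: 422 });
```

Because the subclass adds nothing, downstream code can migrate its imports gradually while both names resolve to the same behavior.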
Text | Text | remove wildcard options for checkemail | 5aa401050388ecacc2d009f7c504b83bb3dad324 | <ide><path>doc/api/crypto.md
<ide> added: v15.6.0
<ide> <!-- YAML
<ide> added: v15.6.0
<ide> changes:
<add> - version: REPLACEME
<add> pr-url: https://github.com/nodejs/node/pull/41599
<add> description: The `wildcards`, `partialWildcards`, `multiLabelWildcards`, and
<add> `singleLabelSubdomains` options have been removed since they
<add> had no effect.
<ide> - version: REPLACEME
<ide> pr-url: https://github.com/nodejs/node/pull/41569
<ide> description: The subject option can now be set to `'default'`.
<ide> changes:
<ide> * `options` {Object}
<ide> * `subject` {string} `'default'`, `'always'`, or `'never'`.
<ide> **Default:** `'always'`.
<del> * `wildcards` {boolean} **Default:** `true`.
<del> * `partialWildcards` {boolean} **Default:** `true`.
<del> * `multiLabelWildcards` {boolean} **Default:** `false`.
<del> * `singleLabelSubdomains` {boolean} **Default:** `false`.
<ide> * Returns: {string|undefined} Returns `email` if the certificate matches,
<ide> `undefined` if it does not.
<ide> | 1 |
Javascript | Javascript | fix broken default getprojectroots | f847fbe0212619559d347f5c44b4fce78ea6abf8 | <ide><path>local-cli/core/default.config.js
<del>/**
<del> * Copyright (c) 2015-present, Facebook, Inc.
<del> * All rights reserved.
<del> *
<del> * This source code is licensed under the BSD-style license found in the
<del> * LICENSE file in the root directory of this source tree. An additional grant
<del> * of patent rights can be found in the PATENTS file in the same directory.
<del> *
<del> * @flow
<del> */
<del>'use strict';
<del>
<del>const path = require('path');
<del>const flatten = require('lodash').flatten;
<del>const android = require('./android');
<del>const findAssets = require('./findAssets');
<del>const ios = require('./ios');
<del>const windows = require('./windows');
<del>const wrapCommands = require('./wrapCommands');
<del>const findPlugins = require('./findPlugins');
<del>const findSymlinksPaths = require('../util/findSymlinksPaths');
<del>
<del>function getProjectPath() {
<del> if (__dirname.match(/node_modules[\/\\]react-native[\/\\]local-cli[\/\\]core$/)) {
<del> // Packager is running from node_modules.
<del> // This is the default case for all projects created using 'react-native init'.
<del> return path.resolve(__dirname, '../../../..');
<del> } else if (__dirname.match(/Pods[\/\\]React[\/\\]packager$/)) {
<del> // React Native was installed using CocoaPods.
<del> return path.resolve(__dirname, '../../../..');
<del> }
<del> return path.resolve(__dirname, '../..');
<del>}
<del>
<del>const getRNPMConfig = (folder) =>
<del> // $FlowFixMe non-literal require
<del> require(path.join(folder, './package.json')).rnpm || {};
<del>
<del>const attachPackage = (command, pkg) => Array.isArray(command)
<del> ? command.map(cmd => attachPackage(cmd, pkg))
<del> : { ...command, pkg };
<del>
<del>const resolveSymlink = (roots) =>
<del> roots.concat(
<del> findSymlinksPaths(
<del> path.join(getProjectPath(), 'node_modules'),
<del> roots
<del> )
<del> );
<del>
<del>/**
<del> * Default configuration for the CLI.
<del> *
<del> * If you need to override any of this functions do so by defining the file
<del> * `rn-cli.config.js` on the root of your project with the functions you need
<del> * to tweak.
<del> */
<del>const config = {
<del> getProjectCommands() {
<del> const appRoot = process.cwd();
<del> const plugins = findPlugins([appRoot])
<del> .map(pathToCommands => {
<del> const name = pathToCommands.split(path.sep)[0];
<del>
<del> return attachPackage(
<del> // $FlowFixMe non-literal require
<del> require(path.join(appRoot, 'node_modules', pathToCommands)),
<del> // $FlowFixMe non-literal require
<del> require(path.join(appRoot, 'node_modules', name, 'package.json'))
<del> );
<del> });
<del>
<del> return flatten(plugins);
<del> },
<del> getProjectConfig() {
<del> const folder = process.cwd();
<del> const rnpm = getRNPMConfig(folder);
<del>
<del> return Object.assign({}, rnpm, {
<del> ios: ios.projectConfig(folder, rnpm.ios || {}),
<del> android: android.projectConfig(folder, rnpm.android || {}),
<del> windows: windows.projectConfig(folder, rnpm.windows || {}),
<del> assets: findAssets(folder, rnpm.assets),
<del> });
<del> },
<del> getDependencyConfig(packageName: string) {
<del> const folder = path.join(process.cwd(), 'node_modules', packageName);
<del> const rnpm = getRNPMConfig(
<del> path.join(process.cwd(), 'node_modules', packageName)
<del> );
<del>
<del> return Object.assign({}, rnpm, {
<del> ios: ios.dependencyConfig(folder, rnpm.ios || {}),
<del> android: android.dependencyConfig(folder, rnpm.android || {}),
<del> windows: windows.dependencyConfig(folder, rnpm.windows || {}),
<del> assets: findAssets(folder, rnpm.assets),
<del> commands: wrapCommands(rnpm.commands),
<del> params: rnpm.params || [],
<del> });
<del> },
<del> getProjectRoots() {
<del> const root = process.env.REACT_NATIVE_APP_ROOT;
<del> if (root) {
<del> return resolveSymlink([path.resolve(root)]);
<del> }
<del>
<del> return resolveSymlink([getProjectPath()]);
<del> },
<del>};
<del>
<del>module.exports = config;
<ide><path>local-cli/core/index.js
<ide> */
<ide> 'use strict';
<ide>
<add>const android = require('./android');
<ide> const Config = require('../util/Config');
<add>const findPlugins = require('./findPlugins');
<add>const findAssets = require('./findAssets');
<add>const ios = require('./ios');
<add>const windows = require('./windows');
<add>const wrapCommands = require('./wrapCommands');
<ide>
<del>const defaultConfig = require('./default.config');
<add>const flatten = require('lodash').flatten;
<ide> const minimist = require('minimist');
<ide> const path = require('path');
<ide>
<ide> export type RNConfig = {
<ide> getDependencyConfig(pkgName: string): Object,
<ide> };
<ide>
<add>const getRNPMConfig = (folder) =>
<add> // $FlowFixMe non-literal require
<add> require(path.join(folder, './package.json')).rnpm || {};
<add>
<add>const attachPackage = (command, pkg) => Array.isArray(command)
<add> ? command.map(cmd => attachPackage(cmd, pkg))
<add> : { ...command, pkg };
<add>
<add>const defaultRNConfig = {
<add> getProjectCommands(): Array<CommandT> {
<add> const appRoot = process.cwd();
<add> const plugins = findPlugins([appRoot])
<add> .map(pathToCommands => {
<add> const name = pathToCommands.split(path.sep)[0];
<add>
<add> return attachPackage(
<add> // $FlowFixMe non-literal require
<add> require(path.join(appRoot, 'node_modules', pathToCommands)),
<add> // $FlowFixMe non-literal require
<add> require(path.join(appRoot, 'node_modules', name, 'package.json'))
<add> );
<add> });
<add>
<add> return flatten(plugins);
<add> },
<add>
<add> getProjectConfig(): Object {
<add> const folder = process.cwd();
<add> const rnpm = getRNPMConfig(folder);
<add>
<add> return Object.assign({}, rnpm, {
<add> ios: ios.projectConfig(folder, rnpm.ios || {}),
<add> android: android.projectConfig(folder, rnpm.android || {}),
<add> windows: windows.projectConfig(folder, rnpm.windows || {}),
<add> assets: findAssets(folder, rnpm.assets),
<add> });
<add> },
<add>
<add> getDependencyConfig(packageName: string) {
<add> const folder = path.join(process.cwd(), 'node_modules', packageName);
<add> const rnpm = getRNPMConfig(
<add> path.join(process.cwd(), 'node_modules', packageName)
<add> );
<add>
<add> return Object.assign({}, rnpm, {
<add> ios: ios.dependencyConfig(folder, rnpm.ios || {}),
<add> android: android.dependencyConfig(folder, rnpm.android || {}),
<add> windows: windows.dependencyConfig(folder, rnpm.windows || {}),
<add> assets: findAssets(folder, rnpm.assets),
<add> commands: wrapCommands(rnpm.commands),
<add> params: rnpm.params || [],
<add> });
<add> },
<add>};
<add>
<ide> /**
<ide> * Loads the CLI configuration
<ide> */
<ide> function getCliConfig(): RNConfig {
<ide> ? Config.loadFile(path.resolve(__dirname, cliArgs.config))
<ide> : Config.findOptional(__dirname);
<ide>
<del> return {...defaultConfig, ...config};
<add> return {...defaultRNConfig, ...config};
<ide> }
<ide>
<ide> module.exports = getCliConfig();
<ide><path>local-cli/util/Config.js
<ide> */
<ide> 'use strict';
<ide>
<add>const findSymlinksPaths = require('./findSymlinksPaths');
<add>
<ide> const blacklist = require('metro-bundler/build/blacklist');
<ide> const fs = require('fs');
<ide> const invariant = require('fbjs/lib/invariant');
<ide> export type ConfigT = {
<ide> transformVariants: () => TransformVariants,
<ide> };
<ide>
<add>function getProjectPath() {
<add> if (__dirname.match(/node_modules[\/\\]react-native[\/\\]local-cli[\/\\]util$/)) {
<add> // Packager is running from node_modules.
<add> // This is the default case for all projects created using 'react-native init'.
<add> return path.resolve(__dirname, '../../../..');
<add> } else if (__dirname.match(/Pods[\/\\]React[\/\\]packager$/)) {
<add> // React Native was installed using CocoaPods.
<add> return path.resolve(__dirname, '../../../..');
<add> }
<add> return path.resolve(__dirname, '../..');
<add>}
<add>
<add>const resolveSymlink = (roots) =>
<add> roots.concat(
<add> findSymlinksPaths(
<add> path.join(getProjectPath(), 'node_modules'),
<add> roots
<add> )
<add> );
<add>
<ide> /**
<ide> * Module capable of getting the configuration out of a given file.
<ide> *
<ide> const Config = {
<ide> getBlacklistRE: () => blacklist(),
<ide> getPlatforms: () => [],
<ide> getPolyfillModuleNames: () => [],
<del> getProjectRoots: () => [process.cwd()],
<add> getProjectRoots: () => {
<add> const root = process.env.REACT_NATIVE_APP_ROOT;
<add> if (root) {
<add> return resolveSymlink([path.resolve(root)]);
<add> }
<add> return resolveSymlink([getProjectPath()]);
<add> },
<ide> getProvidesModuleNodeModules: () => providesModuleNodeModules.slice(),
<ide> getSourceExts: () => [],
<ide> getTransformModulePath: () => require.resolve('metro-bundler/build/transformer.js'), | 3 |
Javascript | Javascript | fix lint errors | 20a348fc0cd3910366f03cbe29878b2563158e9c | <ide><path>fonts.js
<ide> var FontLoader = {
<ide>
<ide> for (var i = 0; i < fonts.length; i++) {
<ide> var font = fonts[i];
<del>
<add>
<ide> // If there is already a fontObj on the font, then it was loaded/attached
<ide> // to the page already and we don't have to do anything for this font
<ide> // here future.
<ide> function getUnicodeRangeFor(value) {
<ide> return -1;
<ide> }
<ide>
<del>/**
<del> * FontShape is the minimal shape a FontObject can have to be useful during
<del> * executing the IRQueue.
<del> */
<del>var FontShape = (function FontShape() {
<del> var constructor = function FontShape_constructor(obj) {
<del> for (var name in obj) {
<del> this[name] = obj[name];
<del> }
<del>
<del> var name = this.loadedName;
<del> var bold = this.black ? (this.bold ? 'bolder' : 'bold') :
<del> (this.bold ? 'bold' : 'normal');
<del>
<del> var italic = this.italic ? 'italic' : 'normal';
<del> this.fontFallback = this.serif ? 'serif' : 'sans-serif';
<del>
<del> this.namePart1 = italic + ' ' + bold + ' ';
<del> this.namePart2 = 'px "' + name + '", "';
<del>
<del> this.supported = Object.keys(this.encoding).length != 0;
<del>
<del> // Set the loading flag. Gets set to false in FontLoader.bind().
<del> this.loading = true;
<del> };
<del>
<del> function int16(bytes) {
<del> return (bytes[0] << 8) + (bytes[1] & 0xff);
<del> };
<del>
<del> constructor.prototype = {
<del> getRule: function fonts_getRule(size, fallback) {
<del> fallback = fallback || this.fontFallback;
<del> return this.namePart1 + size + this.namePart2 + fallback + '"';
<del> },
<del>
<del> charsToUnicode: function fonts_chars2Unicode(chars) {
<del> var charsCache = this.charsCache;
<del> var str;
<del>
<del> // if we translated this string before, just grab it from the cache
<del> if (charsCache) {
<del> str = charsCache[chars];
<del> if (str)
<del> return str;
<del> }
<del>
<del> // lazily create the translation cache
<del> if (!charsCache)
<del> charsCache = this.charsCache = Object.create(null);
<del>
<del> // translate the string using the font's encoding
<del> var encoding = this.encoding;
<del> if (!encoding)
<del> return chars;
<del> str = '';
<del>
<del> if (this.composite) {
<del> // composite fonts have multi-byte strings convert the string from
<del> // single-byte to multi-byte
<del> // XXX assuming CIDFonts are two-byte - later need to extract the
<del> // correct byte encoding according to the PDF spec
<del> var length = chars.length - 1; // looping over two bytes at a time so
<del> // loop should never end on the last byte
<del> for (var i = 0; i < length; i++) {
<del> var charcode = int16([chars.charCodeAt(i++), chars.charCodeAt(i)]);
<del> var unicode = encoding[charcode];
<del> if ('undefined' == typeof(unicode)) {
<del> warn('Unencoded charcode ' + charcode);
<del> unicode = charcode;
<del> } else {
<del> unicode = unicode.unicode;
<del> }
<del> str += String.fromCharCode(unicode);
<del> }
<del> }
<del> else {
<del> for (var i = 0; i < chars.length; ++i) {
<del> var charcode = chars.charCodeAt(i);
<del> var unicode = encoding[charcode];
<del> if ('undefined' == typeof(unicode)) {
<del> warn('Unencoded charcode ' + charcode);
<del> unicode = charcode;
<del> } else {
<del> unicode = unicode.unicode;
<del> }
<del>
<del> // Handle surrogate pairs
<del> if (unicode > 0xFFFF) {
<del> str += String.fromCharCode(unicode & 0xFFFF);
<del> unicode >>= 16;
<del> }
<del> str += String.fromCharCode(unicode);
<del> }
<del> }
<del>
<del> // Enter the translated string into the cache
<del> return (charsCache[chars] = str);
<del> }
<del> }
<del>
<del> return constructor;
<del>})();
<del>
<del>
<ide> /**
<ide> * 'Font' is the class the outside world should use, it encapsulate all the font
<ide> * decoding logics whatever type it is (assuming the font type is supported).
<ide><path>pdf.js
<ide> var JpegStreamIR = (function() {
<ide> getImage: function() {
<ide> return this.domImage;
<ide> }
<del> }
<add> };
<ide>
<ide> return JpegStreamIR;
<del>})()
<add>})();
<ide>
<ide> // A JpegStream can't be read directly. We use the platform to render
<ide> // the underlying JPEG data for us.
<ide> var JpegStream = (function jpegStream() {
<ide> newBytes.set(embedMarker, 2);
<ide> return newBytes;
<ide> }
<del>
<add>
<ide> function constructor(bytes, dict) {
<del> // TODO: per poppler, some images may have "junk" before that
<add> // TODO: per poppler, some images may have 'junk' before that
<ide> // need to be removed
<ide> this.dict = dict;
<ide>
<ide> if (isAdobeImage(bytes))
<ide> bytes = fixAdobeImage(bytes);
<ide>
<del> this.src = bytesToString(bytes);
<add> this.src = bytesToString(bytes);
<ide> }
<ide>
<ide> constructor.prototype = {
<ide> var Page = (function pagePage() {
<ide> }
<ide> return shadow(this, 'rotate', rotate);
<ide> },
<del>
<del> startRenderingFromIRQueue: function startRenderingFromIRQueue(gfx, IRQueue, fonts, continuation) {
<add>
<add> startRenderingFromIRQueue: function startRenderingFromIRQueue(gfx,
<add> IRQueue, fonts, continuation) {
<ide> var self = this;
<ide> this.IRQueue = IRQueue;
<del>
<del> var displayContinuation = function pageDisplayContinuation() {
<add>
<add> var displayContinuation = function pageDisplayContinuation() {
<ide> console.log('--display--');
<ide>
<ide> // Always defer call to display() to work around bug in
<ide> var Page = (function pagePage() {
<ide> // }
<ide> });
<ide> };
<del>
<add>
<ide> this.ensureFonts(fonts, function() {
<ide> displayContinuation();
<ide> });
<ide> var Page = (function pagePage() {
<ide> content[i] = xref.fetchIfRef(content[i]);
<ide> content = new StreamsSequenceStream(content);
<ide> }
<del>
<add>
<ide> var pe = this.pe = new PartialEvaluator();
<ide> var IRQueue = {};
<del> return this.IRQueue = pe.getIRQueue(content, xref, resources, IRQueue, handler, "p" + this.pageNumber + "_", dependency);
<add> return this.IRQueue = pe.getIRQueue(content, xref, resources, IRQueue,
<add> handler, 'p' + this.pageNumber + '_', dependency);
<ide> },
<del>
<add>
<ide> ensureFonts: function(fonts, callback) {
<ide> console.log('--ensureFonts--', '' + fonts);
<ide> // Convert the font names to the corresponding font obj.
<ide> var Page = (function pagePage() {
<ide> fonts,
<ide> function(fontObjs) {
<ide> this.stats.fonts = Date.now();
<del>
<add>
<ide> callback.call(this);
<ide> }.bind(this),
<ide> this.objs
<ide> );
<ide> },
<del>
<add>
<ide> display: function(gfx, callback) {
<ide> var xref = this.xref;
<ide> var resources = xref.fetchIfRef(this.resources);
<ide> var mediaBox = xref.fetchIfRef(this.mediaBox);
<ide> assertWellFormed(isDict(resources), 'invalid page resources');
<del>
<add>
<ide> gfx.xref = xref;
<ide> gfx.res = resources;
<ide> gfx.beginDrawing({ x: mediaBox[0], y: mediaBox[1],
<ide> width: this.width,
<ide> height: this.height,
<ide> rotate: this.rotate });
<del>
<add>
<ide> var startIdx = 0;
<ide> var length = this.IRQueue.fnArray.length;
<ide> var IRQueue = this.IRQueue;
<del>
<add>
<ide> var self = this;
<ide> var startTime = Date.now();
<ide> function next() {
<ide> startIdx = gfx.executeIRQueue(IRQueue, startIdx, next);
<ide> if (startIdx == length) {
<ide> self.stats.render = Date.now();
<del> console.log("page=%d - executeIRQueue: time=%dms",
<add> console.log('page=%d - executeIRQueue: time=%dms',
<ide> self.pageNumber + 1, self.stats.render - startTime);
<ide> callback();
<ide> }
<ide> var Catalog = (function catalogCatalog() {
<ide> })();
<ide>
<ide> /**
<del> * The `PDFDocModel` holds all the data of the PDF file. Compared to the
<add> * The `PDFDocModel` holds all the data of the PDF file. Compared to the
<ide> * `PDFDoc`, this one doesn't have any job management code.
<ide> * Right now there exists one PDFDocModel on the main thread + one object
<ide> * for each worker. If there is no worker support enabled, there are two
<ide> var PDFDoc = (function() {
<ide> this.data = data;
<ide> this.stream = stream;
<ide> this.pdf = new PDFDocModel(stream);
<del>
<add>
<ide> this.catalog = this.pdf.catalog;
<ide> this.objs = new PDFObjects();
<del>
<add>
<ide> this.pageCache = [];
<del>
<add>
<ide> if (useWorker) {
<del> var worker = this.worker = new Worker("../worker/pdf_worker_loader.js");
<add> var worker = this.worker = new Worker('../worker/pdf_worker_loader.js');
<ide> } else {
<ide> // If we don't use a worker, just post/sendMessage to the main thread.
<ide> var worker = {
<ide> postMessage: function(obj) {
<ide> worker.onmessage({data: obj});
<ide> }
<del> }
<add> };
<ide> }
<del>
<add>
<ide> this.fontsLoading = {};
<ide>
<del> var processorHandler = this.processorHandler = new MessageHandler("main", worker);
<del> processorHandler.on("page", function(data) {
<add> var processorHandler = this.processorHandler =
<add> new MessageHandler('main', worker);
<add>
<add> processorHandler.on('page', function(data) {
<ide> var pageNum = data.pageNum;
<ide> var page = this.pageCache[pageNum];
<del>
<add>
<ide>      // DepFonts are all fonts that are required to render the page. `fontsToLoad`
<ide> // are all the fonts that are required to render the page AND that
<ide> // aren't loaded on the page yet.
<ide> var PDFDoc = (function() {
<ide> var fontsLoading = this.fontsLoading;
<ide> // The `i` for the checkFontData is stored here to keep the state in
<ide> // the closure.
<del> var i = 0;
<del>
<add> var i = 0;
<add>
<ide>
<ide> // function checkFontData() {
<ide> // // Check if all fontObjs have been processed. If not, shedule a
<ide> var PDFDoc = (function() {
<ide> // fontsToLoad.push(fontName);
<ide> // }
<ide> // }
<del> //
<add> //
<ide> // // There can be edge cases where two pages wait for one font and then
<del> // // call startRenderingFromIRQueue twice with the same font. That makes
<del> // // the font getting loaded twice and throw an error later as the font
<del> // // promise gets resolved twice.
<add> // // call startRenderingFromIRQueue twice with the same font. That
<add> // // makes the font getting loaded twice and throw an error later as
<add> // // the font promise gets resolved twice.
<add> // //
<ide> // // This prevents thats fonts are loaded really only once.
<ide> // for (var j = 0; j < fontsToLoad.length; j++) {
<ide> // var fontName = fontsToLoad[j];
<ide> var PDFDoc = (function() {
<ide> // fontsLoading[fontName] = true;
<ide> // }
<ide> // }
<del> //
<add> //
<ide> // for (var i = 0; i < depFonts.lenght; i++) {
<ide> // // Get fonts from the objects.
<ide> // fontsToLoad.push(this.objs.get(depFonts[i]));
<ide> // }
<del> //
<del> // // At this point, all font data ia loaded. Start the actuall rendering.
<add> //
<add> // // At this point, all font data is loaded. Start the actual
<add> // // rendering.
<ide> // page.startRenderingFromIRQueue(data.IRQueue, fontsToLoad);
<ide> // }
<ide> //
<ide> // checkFontData();
<ide>
<del> // Start rendering directly for now, as the fonts are
<add> // Start rendering directly for now, as the fonts are
<ide> page.startRenderingFromIRQueue(data.IRQueue, Object.keys(data.depFonts));
<ide> }, this);
<ide>
<del> processorHandler.on("obj", function(data) {
<del> var objId = data[0];
<add> processorHandler.on('obj', function(data) {
<add> var objId = data[0];
<ide> var objType = data[1];
<ide>
<ide> switch (objType) {
<del> case "JpegStream":
<add> case 'JpegStream':
<ide> var IR = data[2];
<ide> new JpegStreamIR(objId, IR, this.objs);
<ide> console.log('got image');
<ide> break;
<del> case "Font":
<add> case 'Font':
<ide> var name = data[2];
<ide> var file = data[3];
<ide> var properties = data[4];
<del>
<add>
<ide> if (file) {
<ide> var fontFileDict = new Dict();
<ide> fontFileDict.map = file.dict.map;
<del>
<add>
<ide> var fontFile = new Stream(file.bytes, file.start,
<ide> file.end - file.start, fontFileDict);
<del>
<del> // Check if this is a FlateStream. Otherwise just use the created
<add>
<add> // Check if this is a FlateStream. Otherwise just use the created
<ide> // Stream one. This makes complex_ttf_font.pdf work.
<ide> var cmf = file.bytes[0];
<ide> if ((cmf & 0x0f) == 0x08) {
<ide> file = new FlateStream(fontFile);
<ide> } else {
<ide> file = fontFile;
<del> }
<add> }
<ide> }
<ide>
<del>
<del> // For now, resolve the font object here direclty. The real font object
<del> // is then created in FontLoader.bind().
<add>
<add> // For now, resolve the font object here directly. The real font
<add> // object is then created in FontLoader.bind().
<ide> this.objs.resolve(objId, {
<ide> name: name,
<ide> file: file,
<ide> properties: properties
<ide> });
<del>
<del> //
<del> //
<add>
<add> //
<add> //
<ide> // // << CODE TAKEN FROM WORKER >>
<del> // data = [objId, name, file, properties];
<del> // var objId = data[0];
<del> // var name = data[1];
<del> // var file = data[2];
<add> // data = [objId, name, file, properties];
<add> // var objId = data[0];
<add> // var name = data[1];
<add> // var file = data[2];
<ide> // var properties = data[3];
<del> //
<add> //
<ide> // var font = {
<ide> // name: name,
<ide> // file: file,
<ide> // properties: properties
<ide> // };
<del> //
<add> //
<ide> // // Some fonts don't have a file, e.g. the build in ones like Arial.
<ide> // if (file) {
<ide> // var fontFileDict = new Dict();
<ide> // fontFileDict.map = file.dict.map;
<del> //
<add> //
<ide> // var fontFile = new Stream(file.bytes, file.start,
<ide> // file.end - file.start, fontFileDict);
<del> //
<del> // // Check if this is a FlateStream. Otherwise just use the created
<add> //
<add> // // Check if this is a FlateStream. Otherwise just use the created
<ide> // // Stream one. This makes complex_ttf_font.pdf work.
<ide> // var cmf = file.bytes[0];
<ide> // if ((cmf & 0x0f) == 0x08) {
<ide> // font.file = new FlateStream(fontFile);
<ide> // } else {
<ide> // font.file = fontFile;
<del> // }
<add> // }
<ide> // }
<del> //
<add> //
<ide> // var obj = new Font(font.name, font.file, font.properties);
<del> //
<add> //
<ide> // var str = '';
<ide> // var data = obj.data;
<ide> // if (data) {
<ide> // var length = data.length;
<ide> // for (var j = 0; j < length; j++)
<ide> // str += String.fromCharCode(data[j]);
<ide> // }
<del> //
<add> //
<ide> // obj.str = str;
<del> //
<add> //
<ide> // var fontObj = new FontShape(obj);
<ide> // for (var prop in obj) {
<ide> // fontObj[prop] = obj[prop];
<ide> // }
<del> //
<add> //
<ide> // if (!str) {
<ide> // this.objs.resolve(objId, fontObj);
<ide> // } else {
<ide> // this.objs.setData(objId, fontObj);
<ide> // }
<ide>
<del> // processorHandler.send("font", [objId, name, file, properties]);
<add> // processorHandler.send('font', [objId, name, file, properties]);
<ide> break;
<ide> default:
<del> throw "Got unkown object type " + objType;
<add> throw 'Got unknown object type ' + objType;
<ide> }
<ide> }, this);
<ide>
<ide> processorHandler.on('font_ready', function(data) {
<del> var objId = data[0];
<add> var objId = data[0];
<ide> var fontObj = new FontShape(data[1]);
<ide>
<ide> console.log('got fontData', objId);
<ide> var PDFDoc = (function() {
<ide> this.objs.setData(objId, fontObj);
<ide> }
<ide> }.bind(this));
<del>
<add>
<ide> if (!useWorker) {
<ide> // If the main thread is our worker, setup the handling for the messages
<ide> // the main thread sends to it self.
<ide> WorkerProcessorHandler.setup(processorHandler);
<ide> }
<del>
<del> processorHandler.send("doc", this.data);
<add>
<add> processorHandler.send('doc', this.data);
<ide> }
<ide>
<ide> constructor.prototype = {
<ide> get numPages() {
<ide> return this.pdf.numPages;
<ide> },
<del>
<add>
<ide> startRendering: function(page) {
<del> this.processorHandler.send("page_request", page.page.pageNumber + 1);
<add> this.processorHandler.send('page_request', page.page.pageNumber + 1);
<ide> },
<del>
<add>
<ide> getPage: function(n) {
<ide> if (this.pageCache[n]) {
<ide> return this.pageCache[n];
<ide> }
<del>
<add>
<ide> var page = this.pdf.getPage(n);
<ide> // Add a reference to the objects such that Page can forward the reference
<ide> // to the CanvasGraphics and so on.
<ide> page.objs = this.objs;
<ide> return this.pageCache[n] = new WorkerPage(this, page, this.objs);
<ide> },
<del>
<add>
<ide> destroy: function() {
<del> console.log("destroy worker");
<add> console.log('destroy worker');
<ide> if (this.worker) {
<ide> this.worker.terminate();
<ide> }
<ide> if (this.fontWorker) {
<ide> this.fontWorker.terminate();
<ide> }
<del>
<add>
<ide> for (var n in this.pageCache) {
<ide> delete this.pageCache[n];
<ide> }
<ide> var PDFDoc = (function() {
<ide> delete this.catalog;
<ide> }
<ide> };
<del>
<add>
<ide> return constructor;
<ide> })();
<ide>
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> };
<ide>
<ide> constructor.prototype = {
<del> getIRQueue: function partialEvaluatorGetIRQueue(stream, xref, resources, queue, handler,
<del> uniquePrefix, dependency) {
<add> getIRQueue: function partialEvaluatorGetIRQueue(stream, xref, resources,
<add> queue, handler, uniquePrefix, dependency) {
<ide>
<ide> function insertDependency(depList) {
<del> fnArray.push("dependency");
<add> fnArray.push('dependency');
<ide> argsArray.push(depList);
<ide> for (var i = 0; i < depList.length; i++) {
<ide> var dep = depList[i];
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> var font = xref.fetchIfRef(fontRes.get(fontName));
<ide> assertWellFormed(isDict(font));
<ide> if (!font.translated) {
<del> font.translated = self.translateFont(font, xref, resources, handler,
<add> font.translated = self.translateFont(font, xref, resources, handler,
<ide> uniquePrefix, dependency);
<ide> if (font.translated) {
<ide> // keep track of each font we translated so the caller can
<ide> // load them asynchronously before calling display on a page
<del> loadedName = "font_" + uniquePrefix + (FontLoadedCounter++);
<add> loadedName = 'font_' + uniquePrefix + (FontLoadedCounter++);
<ide> font.translated.properties.loadedName = loadedName;
<ide> font.loadedName = loadedName;
<ide> FontsMap[loadedName] = font;
<ide>
<del> handler.send("obj", [
<del> loadedName,
<del> "Font",
<add> handler.send('obj', [
<add> loadedName,
<add> 'Font',
<ide> font.translated.name,
<ide> font.translated.file,
<ide> font.translated.properties
<ide> ]);
<ide> }
<ide> }
<ide> loadedName = loadedName || font.loadedName;
<del>
<add>
<ide> // Ensure the font is ready before the font is set
<ide> // and later on used for drawing.
<ide> // TODO: This should get insert to the IRQueue only once per
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> } else {
<ide> // TODO: TOASK: Is it possible to get here? If so, what does
<ide> // args[0].name should be like???
<add> return null;
<ide> }
<ide> }
<ide>
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide>
<ide> if (image instanceof JpegStream) {
<ide> var objId = 'img_' + ++objIdCounter;
<del> handler.send("obj", [objId, "JpegStream", image.getIR()]);
<add> handler.send('obj', [objId, 'JpegStream', image.getIR()]);
<ide>
<ide> // Add the dependency on the image object.
<ide> insertDependency([objId]);
<ide>
<ide> // The normal fn.
<ide> fn = 'paintJpegXObject';
<del> args = [ objId, w, h ];
<add> args = [objId, w, h];
<ide> } else {
<ide> // Needs to be rendered ourself.
<del>
<add>
<ide> // Figure out if the image has an imageMask.
<ide> var imageMask = dict.get('ImageMask', 'IM') || false;
<ide>
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> var imageObj = new PDFImage(xref, resources, image, inline);
<ide>
<ide> if (imageObj.imageMask) {
<del> throw "Can't handle this in the web worker :/";
<add> throw 'Can\'t handle this in the web worker :/';
<ide> }
<del>
<add>
<ide> var imgData = {
<ide> width: w,
<ide> height: h,
<ide> data: new Uint8Array(w * h * 4)
<ide> };
<ide> var pixels = imgData.data;
<ide> imageObj.fillRgbaBuffer(pixels, imageObj.decode);
<del>
<del> fn = "paintImageXObject";
<del> args = [ imgData ];
<add>
<add> fn = 'paintImageXObject';
<add> args = [imgData];
<ide> } else /* imageMask == true */ {
<ide> // This depends on a tmpCanvas beeing filled with the
<ide> // current fillStyle, such that processing the pixel
<ide> // data can't be done here. Instead of creating a
<ide> // complete PDFImage, only read the information needed
<ide> // for later.
<del> fn = "paintImageMaskXObject";
<del>
<add> fn = 'paintImageMaskXObject';
<add>
<ide> var width = dict.get('Width', 'W');
<ide> var height = dict.get('Height', 'H');
<ide> var bitStrideLength = (width + 7) >> 3;
<ide> var imgArray = image.getBytes(bitStrideLength * height);
<ide> var decode = dict.get('Decode', 'D');
<ide> var inverseDecode = !!decode && decode[0] > 0;
<ide>
<del> args = [ imgArray, inverseDecode, width, height ];
<add> args = [imgArray, inverseDecode, width, height];
<ide> }
<ide> }
<ide> }
<del>
<del> uniquePrefix = uniquePrefix || "";
<add>
<add> uniquePrefix = uniquePrefix || '';
<ide> if (!queue.argsArray) {
<del> queue.argsArray = []
<add> queue.argsArray = [];
<ide> }
<ide> if (!queue.fnArray) {
<ide> queue.fnArray = [];
<ide> }
<ide>
<ide> var fnArray = queue.fnArray, argsArray = queue.argsArray;
<ide> var dependency = dependency || [];
<del>
<add>
<ide> resources = xref.fetchIfRef(resources) || new Dict();
<ide> var xobjs = xref.fetchIfRef(resources.get('XObject')) || new Dict();
<ide> var patterns = xref.fetchIfRef(resources.get('Pattern')) || new Dict();
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> };
<ide> }
<ide> }
<del> assertWellFormed(fn, "Unknown command '" + cmd + "'");
<add> assertWellFormed(fn, 'Unknown command "' + cmd + '"');
<ide> // TODO figure out how to type-check vararg functions
<ide>
<ide> if ((cmd == 'SCN' || cmd == 'scn') && !args[args.length - 1].code) {
<ide> // Use the IR version for setStroke/FillColorN.
<ide> fn += '_IR';
<del>
<add>
<ide> // compile tiling patterns
<ide> var patternName = args[args.length - 1];
<ide> // SCN/scn applies patterns along with normal colors
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> // Create an IR of the pattern code.
<ide> var depIdx = dependency.length;
<ide> var codeIR = this.getIRQueue(pattern, xref,
<del> dict.get('Resources'), {}, handler,
<add> dict.get('Resources'), {}, handler,
<ide> uniquePrefix, dependency);
<del>
<add>
<ide> // Add the dependencies that are required to execute the
<ide> // codeIR.
<ide> insertDependency(dependency.slice(depIdx));
<del>
<add>
<ide> args = TilingPattern.getIR(codeIR, dict, args);
<del> }
<add> }
<ide> // Type2 is ShadingPattern.
<ide> else if (typeNum == 2) {
<ide> var shading = xref.fetchIfRef(dict.get('Shading'));
<ide> var matrix = dict.get('Matrix');
<del> var pattern = Pattern.parseShading(shading, matrix, xref, res, null /*ctx*/);
<add> var pattern = Pattern.parseShading(shading, matrix, xref, res,
<add> null /*ctx*/);
<ide> args = pattern.getIR();
<ide> } else {
<del> error("Unkown PatternType " + typeNum);
<add> error('Unknown PatternType ' + typeNum);
<ide> }
<ide> }
<ide> }
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> if ('Form' == type.name) {
<ide> var matrix = xobj.dict.get('Matrix');
<ide> var bbox = xobj.dict.get('BBox');
<del>
<del> fnArray.push("paintFormXObjectBegin");
<del> argsArray.push([ matrix, bbox ]);
<del>
<add>
<add> fnArray.push('paintFormXObjectBegin');
<add> argsArray.push([matrix, bbox]);
<add>
<ide> // This adds the IRQueue of the xObj to the current queue.
<ide> var depIdx = dependency.length;
<del>
<add>
<ide> this.getIRQueue(xobj, xref, xobj.dict.get('Resources'), queue,
<ide> handler, uniquePrefix, dependency);
<ide>
<ide> // Add the dependencies that are required to execute the
<ide> // codeIR.
<ide> insertDependency(dependency.slice(depIdx));
<ide>
<del> fn = "paintFormXObjectEnd";
<add> fn = 'paintFormXObjectEnd';
<ide> args = [];
<ide> } else if ('Image' == type.name) {
<del> buildPaintImageXObject(xobj, false)
<add> buildPaintImageXObject(xobj, false);
<ide> } else {
<ide> error('Unhandled XObject subtype ' + type.name);
<ide> }
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> // font.translated = this.translateFont(font, xref, resources);
<ide> // if (font.translated) {
<ide> // // keep track of each font we translated so the caller can
<del> // // load them asynchronously before calling display on a page
<add> // // load them asynchronously before calling display on a
<add> // // page
<ide> // // fonts.push(font.translated);
<ide> // dependency.push(font.translated);
<ide> // }
<ide> // }
<ide> // }
<del> //
<add> //
<ide> } else if (cmd == 'EI') {
<ide> buildPaintImageXObject(args[0], true);
<ide> }
<ide>
<ide> // Transform some cmds.
<ide> switch (fn) {
<ide> // Parse the ColorSpace data to a raw format.
<del> case "setFillColorSpace":
<del> case "setStrokeColorSpace":
<del> args = [ ColorSpace.parseToIR(args[0], xref, resources) ];
<add> case 'setFillColorSpace':
<add> case 'setStrokeColorSpace':
<add> args = [ColorSpace.parseToIR(args[0], xref, resources)];
<ide> break;
<del> case "shadingFill":
<add> case 'shadingFill':
<ide> var shadingRes = xref.fetchIfRef(res.get('Shading'));
<ide> if (!shadingRes)
<ide> error('No shading resource found');
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> if (!shading)
<ide> error('No shading object found');
<ide>
<del> var shadingFill = Pattern.parseShading(shading, null, xref, res, /* ctx */ null);
<add> var shadingFill = Pattern.parseShading(shading, null, xref, res,
<add> /* ctx */ null);
<ide> var patternIR = shadingFill.getIR();
<ide>
<del> args = [ patternIR ];
<del> fn = "shadingFill";
<add> args = [patternIR];
<add> fn = 'shadingFill';
<ide>
<ide> break;
<del> case "setGState":
<add> case 'setGState':
<ide> var dictName = args[0];
<ide> var extGState = xref.fetchIfRef(resources.get('ExtGState'));
<ide> if (isDict(extGState) && extGState.has(dictName.name)) {
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> // This array holds the converted/processed state data.
<ide> var gsStateObj = [];
<ide>
<del> gsState.forEach(function canvasGraphicsSetGStateForEach(key, value) {
<add> gsState.forEach(
<add> function canvasGraphicsSetGStateForEach(key, value) {
<ide> switch (key) {
<ide> case 'Type':
<ide> break;
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> gsStateObj.push([key, value]);
<ide> break;
<ide> case 'Font':
<del> gsStateObj.push(['Font', handleSetFont(value[0]), value[1] ]);
<add> gsStateObj.push([
<add> 'Font',
<add> handleSetFont(value[0]), value[1]
<add> ]);
<ide> break;
<ide> case 'OP':
<ide> case 'op':
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> break;
<ide> }
<ide> });
<del> args = [ gsStateObj ];
<add> args = [gsStateObj];
<ide> }
<ide> }
<ide>
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> };
<ide> },
<ide>
<del> translateFont: function partialEvaluatorTranslateFont(dict, xref,
<del> resources, queue, handler, uniquePrefix, dependency) {
<add> translateFont: function partialEvaluatorTranslateFont(dict, xref, resources,
<add> queue, handler, uniquePrefix, dependency) {
<ide> var baseDict = dict;
<ide> var type = dict.get('Subtype');
<ide> assertWellFormed(isName(type), 'invalid font Subtype');
<ide> var PartialEvaluator = (function partialEvaluator() {
<ide> for (var key in charProcs.map) {
<ide> var glyphStream = xref.fetchIfRef(charProcs.map[key]);
<ide> var queue = {};
<del> properties.glyphs[key].IRQueue = this.getIRQueue(glyphStream,
<del> xref, fontResources, queue, handler, uniquePrefix, dependency);
<add> properties.glyphs[key].IRQueue = this.getIRQueue(glyphStream, xref,
<add> fontResources, queue, handler, uniquePrefix, dependency);
<ide> }
<ide> }
<ide>
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide> this.ctx.scale(cw / mediaBox.width, ch / mediaBox.height);
<ide> },
<ide>
<del> executeIRQueue: function canvasGraphicsExecuteIRQueue(codeIR,
<add> executeIRQueue: function canvasGraphicsExecuteIRQueue(codeIR,
<ide> executionStartIdx, continueCallback) {
<ide> var argsArray = codeIR.argsArray;
<del> var fnArray = codeIR.fnArray;
<add> var fnArray = codeIR.fnArray;
<ide> var i = executionStartIdx || 0;
<ide> var argsArrayLen = argsArray.length;
<del>
<add>
<ide> var executionEndIdx;
<ide> var startTime = Date.now();
<ide>
<ide> var objs = this.objs;
<ide>
<ide> do {
<ide> executionEndIdx = Math.min(argsArrayLen, i + kExecutionTimeCheck);
<del>
<add>
<ide> for (i; i < executionEndIdx; i++) {
<del> if (fnArray[i] !== "dependency") {
<add> if (fnArray[i] !== 'dependency') {
<ide> this[fnArray[i]].apply(this, argsArray[i]);
<ide> } else {
<ide> var deps = argsArray[i];
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide> // If the entire IRQueue was executed, stop as were done.
<ide> if (i == argsArrayLen) {
<ide> return i;
<del> }
<add> }
<ide> // If the execution took longer then a certain amount of time, shedule
<ide> // to continue exeution after a short delay.
<ide> // However, this is only possible if a 'continueCallback' is passed in.
<del> else if (continueCallback &&
<add> else if (continueCallback &&
<ide> (Date.now() - startTime) > kExecutionTime) {
<ide> setTimeout(continueCallback, 0);
<ide> return i;
<del> }
<add> }
<ide>
<ide> // If the IRQueue isn't executed completly yet OR the execution time
<ide> // was short enough, do another execution round.
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide> // Lookup the fontObj using fontRef only.
<ide> var fontRefName = fontRef.name;
<ide> var fontObj = this.objs.get(fontRefName).fontObj;
<del>
<add>
<ide> if (!fontObj) {
<del> throw "Can't find font for " + fontRefName;
<add> throw 'Can\'t find font for ' + fontRefName;
<ide> }
<del>
<add>
<ide> var name = fontObj.loadedName || 'sans-serif';
<ide>
<ide> // var font;
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide> // font = this.xref.fetchIfRef(this.res.get('Font'));
<ide> // if (!isDict(font))
<ide> // return;
<del> //
<add> //
<ide> // font = font.get(fontRef.name);
<ide> // } else if (isRef(fontRef)) {
<ide> // font = fontRef;
<ide> // }
<ide> // font = this.xref.fetchIfRef(font);
<ide> // if (!font)
<ide> // error('Referenced font is not found');
<del> //
<add> //
<ide> // var fontObj = font.fontObj;
<ide> this.current.font = fontObj;
<ide> this.current.fontSize = size;
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide> // }
<ide>
<ide> // console.log("showSpacedText", arr);
<del>
<add>
<ide> var ctx = this.ctx;
<ide> var current = this.current;
<ide> var fontSize = current.fontSize;
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide> },
<ide>
<ide> // Color
<del> setStrokeColorSpace: function canvasGraphicsSetStrokeColorSpacefunction(raw) {
<add> setStrokeColorSpace:
<add> function canvasGraphicsSetStrokeColorSpace(raw) {
<ide> this.current.strokeColorSpace =
<ide> ColorSpace.fromIR(raw);
<ide> },
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide> this.setStrokeRGBColor.apply(this, color);
<ide> },
<ide> getColorN_IR_Pattern: function(IR, cs) {
<del> if (IR[0] == "TilingPatternIR") {
<add> if (IR[0] == 'TilingPatternIR') {
<ide> // First, build the `color` var like it's done in the
<ide> // Pattern.prototype.parse function.
<ide> var args = IR[1];
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide>
<ide> // Build the pattern based on the IR data.
<ide> var pattern = new TilingPatternIR(IR, color, this.ctx, this.objs);
<del> } else if (IR[0] == "RadialAxialShading" || IR[0] == "DummyShading") {
<del> var pattern = Pattern.shadingFromIR(this.ctx, IR);
<add> } else if (IR[0] == 'RadialAxialShading' || IR[0] == 'DummyShading') {
<add> var pattern = Pattern.shadingFromIR(this.ctx, IR);
<ide> } else {
<del> throw "Unkown IR type";
<add> throw 'Unknown IR type';
<ide> }
<del> return pattern;
<add> return pattern;
<ide> },
<ide> setStrokeColorN_IR: function canvasGraphicsSetStrokeColorN(/*...*/) {
<ide> var cs = this.current.strokeColorSpace;
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide>
<ide> shadingFill: function canvasGraphicsShadingFill(patternIR) {
<ide> var ctx = this.ctx;
<del>
<add>
<ide> this.save();
<ide> ctx.fillStyle = Pattern.shadingFromIR(ctx, patternIR);
<ide>
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide> beginImageData: function canvasGraphicsBeginImageData() {
<ide> error('Should not call beginImageData');
<ide> },
<del>
<del> paintFormXObjectBegin: function canvasGraphicsPaintFormXObject(matrix, bbox) {
<add>
<add> paintFormXObjectBegin:
<add> function canvasGraphicsPaintFormXObject(matrix, bbox) {
<ide> this.save();
<ide>
<ide> if (matrix && isArray(matrix) && 6 == matrix.length)
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide> paintJpegXObject: function(objId, w, h) {
<ide> var image = this.objs.get(objId);
<ide> if (!image) {
<del> error("Dependent image isn't ready yet");
<add> error('Dependent image isn\'t ready yet');
<ide> }
<ide>
<ide> this.save();
<del>
<add>
<ide> var ctx = this.ctx;
<ide> ctx.scale(1 / w, -1 / h);
<ide>
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide>
<ide> this.restore();
<ide> },
<del>
<add>
<ide> paintImageMaskXObject: function(imgArray, inverseDecode, width, height) {
<ide> function applyStencilMask(buffer, inverseDecode) {
<ide> var imgArrayPos = 0;
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide> var tmpCanvas = new this.ScratchCanvas(w, h);
<ide> var tmpCtx = tmpCanvas.getContext('2d');
<ide> var tmpImgData;
<del>
<add>
<ide> // Deactivating this for now until we have feature detection.
<ide> // if (isGecko) {
<ide> // tmpImgData = imgData;
<ide> var CanvasGraphics = (function canvasGraphics() {
<ide>
<ide> // TODO: There got to be a better way to copy an ImageData array
<ide> // then coping over all the bytes one by one :/
<del> while (len--)
<add> while (len--)
<ide> tmpImgDataPixels[len] = imgData.data[len];
<ide> // }
<ide>
<ide> var ColorSpace = (function colorSpaceColorSpace() {
<ide> if (!(IR instanceof SeparationCS)) {
<ide> return constructor.fromIR(IR);
<ide> } else {
<del> return IR
<add> return IR;
<ide> }
<ide> };
<del>
<add>
<ide> constructor.fromIR = function(IR) {
<ide> var name;
<ide> if (isArray(IR)) {
<ide> name = IR[0];
<ide> } else {
<ide> name = IR;
<ide> }
<del>
<add>
<ide> switch (name) {
<del> case "DeviceGrayCS":
<add> case 'DeviceGrayCS':
<ide> return new DeviceGrayCS();
<del> case "DeviceRgbCS":
<add> case 'DeviceRgbCS':
<ide> return new DeviceRgbCS();
<del> case "DeviceCmykCS":
<add> case 'DeviceCmykCS':
<ide> return new DeviceCmykCS();
<del> case "PatternCS":
<add> case 'PatternCS':
<ide> var baseCS = IR[1];
<ide> if (baseCS == null) {
<ide> return new PatternCS(null);
<ide> } else {
<ide> return new PatternCS(ColorSpace.fromIR(baseCS));
<ide> }
<del> case "IndexedCS":
<add> case 'IndexedCS':
<ide> var baseCS = IR[1];
<del> var hiVal = IR[2];
<add> var hiVal = IR[2];
<ide> var lookup = IR[3];
<del> return new IndexedCS(ColorSpace.fromIR(baseCS), hiVal, lookup)
<del> case "SeparationCS":
<del> var alt = IR[1];
<del> var tintFnIR = IR[2];
<del>
<add> return new IndexedCS(ColorSpace.fromIR(baseCS), hiVal, lookup);
<add> case 'SeparationCS':
<add> var alt = IR[1];
<add> var tintFnIR = IR[2];
<add>
<ide> return new SeparationCS(
<ide> ColorSpace.fromIR(alt),
<ide> PDFFunction.fromIR(tintFnIR)
<ide> );
<ide> default:
<del> error("Unkown name " + name);
<add> error('Unknown name ' + name);
<ide> }
<ide> return null;
<ide> }
<del>
<add>
<ide> constructor.parseToIR = function colorspace_parse(cs, xref, res, parseOnly) {
<ide> if (isName(cs)) {
<ide> var colorSpaces = res.get('ColorSpace');
<ide> var ColorSpace = (function colorSpaceColorSpace() {
<ide> switch (mode) {
<ide> case 'DeviceGray':
<ide> case 'G':
<del> return "DeviceGrayCS";
<add> return 'DeviceGrayCS';
<ide> case 'DeviceRGB':
<ide> case 'RGB':
<del> return "DeviceRgbCS";
<add> return 'DeviceRgbCS';
<ide> case 'DeviceCMYK':
<ide> case 'CMYK':
<del> return "DeviceCmykCS";
<add> return 'DeviceCmykCS';
<ide> case 'Pattern':
<del> return ["PatternCS", null];
<add> return ['PatternCS', null];
<ide> default:
<ide> error('unrecognized colorspace ' + mode);
<ide> }
<ide> var ColorSpace = (function colorSpaceColorSpace() {
<ide> switch (mode) {
<ide> case 'DeviceGray':
<ide> case 'G':
<del> return "DeviceGrayCS";
<add> return 'DeviceGrayCS';
<ide> case 'DeviceRGB':
<ide> case 'RGB':
<del> return "DeviceRgbCS";
<add> return 'DeviceRgbCS';
<ide> case 'DeviceCMYK':
<ide> case 'CMYK':
<del> return "DeviceCmykCS";
<add> return 'DeviceCmykCS';
<ide> case 'CalGray':
<del> return "DeviceGrayCS";
<add> return 'DeviceGrayCS';
<ide> case 'CalRGB':
<del> return "DeviceRgbCS";
<add> return 'DeviceRgbCS';
<ide> case 'ICCBased':
<ide> var stream = xref.fetchIfRef(cs[1]);
<ide> var dict = stream.dict;
<ide> var numComps = dict.get('N');
<ide> if (numComps == 1)
<del> return "DeviceGrayCS";
<add> return 'DeviceGrayCS';
<ide> if (numComps == 3)
<del> return "DeviceRgbCS";
<add> return 'DeviceRgbCS';
<ide> if (numComps == 4)
<del> return "DeviceCmykCS";
<add> return 'DeviceCmykCS';
<ide> break;
<ide> case 'Pattern':
<ide> var baseCS = cs[1];
<ide> if (baseCS)
<ide> baseCS = ColorSpace.parseToIR(baseCS, xref, res);
<del> return ["PatternCS", baseCS];
<add> return ['PatternCS', baseCS];
<ide> case 'Indexed':
<ide> var baseCS = ColorSpace.parseToIR(cs[1], xref, res);
<ide> var hiVal = cs[2] + 1;
<ide> var lookup = xref.fetchIfRef(cs[3]);
<del> return ["IndexedCS", baseCS, hiVal, lookup];
<add> return ['IndexedCS', baseCS, hiVal, lookup];
<ide> case 'Separation':
<ide> var alt = ColorSpace.parseToIR(cs[2], xref, res);
<ide> var tintFnIR = PDFFunction.getIR(xref, xref.fetchIfRef(cs[3]));
<del> return ["SeparationCS", alt, tintFnIR];
<add> return ['SeparationCS', alt, tintFnIR];
<ide> case 'Lab':
<ide> case 'DeviceN':
<ide> default:
<ide> var DummyShading = (function dummyShading() {
<ide>
<ide> constructor.prototype = {
<ide> getIR: function dummpy_getir() {
<del> return [ 'DummyShading' ];
<add> return ['DummyShading'];
<ide> }
<ide> };
<ide> return constructor;
<ide> var RadialAxialShading = (function radialAxialShading() {
<ide> var r0 = raw[5];
<ide> var r1 = raw[6];
<ide>
<del> var curMatrix = ctx.mozCurrentTransform;
<add> var curMatrix = ctx.mozCurrentTransform;
<ide> if (curMatrix) {
<ide> var userMatrix = ctx.mozCurrentTransformInverse;
<ide>
<ide> var RadialAxialShading = (function radialAxialShading() {
<ide> p0 = Util.applyTransform(p0, matrix);
<ide> p1 = Util.applyTransform(p1, matrix);
<ide> }
<del>
<del> return [ "RadialAxialShading", type, this.colorStops, p0, p1, r0, r1 ];
<add>
<add> return ['RadialAxialShading', type, this.colorStops, p0, p1, r0, r1];
<ide> }
<ide> };
<del>
<add>
<ide> return constructor;
<ide> })();
<ide>
<ide> var TilingPatternIR = (function tilingPattern() {
<ide> var PAINT_TYPE_COLORED = 1, PAINT_TYPE_UNCOLORED = 2;
<ide>
<ide> function TilingPatternIR(IR, color, ctx, objs) {
<del> // "Unfolding" the IR.
<del> var IRQueue = IR[2];
<del> this.matrix = IR[3];
<del> var bbox = IR[4];
<del> var xstep = IR[5];
<del> var ystep = IR[6];
<add> // 'Unfolding' the IR.
<add> var IRQueue = IR[2];
<add> this.matrix = IR[3];
<add> var bbox = IR[4];
<add> var xstep = IR[5];
<add> var ystep = IR[6];
<ide> var paintType = IR[7];
<ide>
<del> //
<add> //
<ide> TODO('TilingType');
<ide>
<ide> this.curMatrix = ctx.mozCurrentTransform;
<ide> var TilingPatternIR = (function tilingPattern() {
<ide>
<ide> return ctx.createPattern(this.canvas, 'repeat');
<ide> }
<del> }
<add> };
<ide>
<ide> return TilingPatternIR;
<ide> })();
<ide> var TilingPattern = {
<ide> var xstep = dict.get('XStep');
<ide> var ystep = dict.get('YStep');
<ide> var paintType = dict.get('PaintType');
<del>
<del> return ["TilingPatternIR", args, codeIR, matrix, bbox, xstep, ystep, paintType];
<add>
<add> return [
<add> 'TilingPatternIR', args, codeIR, matrix, bbox, xstep, ystep, paintType
<add> ];
<ide> }
<ide> };
<ide>
<ide> var PDFFunction = (function() {
<ide> var CONSTRUCT_INTERPOLATED = 2;
<ide> var CONSTRUCT_STICHED = 3;
<ide> var CONSTRUCT_POSTSCRIPT = 4;
<del>
<add>
<ide> return {
<ide> getSampleArray: function(size, outputSize, bps, str) {
<ide> var length = 1;
<ide> var PDFFunction = (function() {
<ide> if (!typeFn)
<ide> error('Unknown type of function');
<ide>
<del> return typeFn.call(this, fn, dict, xref);
<add> return typeFn.call(this, fn, dict, xref);
<ide> },
<del>
<add>
<ide> fromIR: function(IR) {
<ide> var type = IR[0];
<ide> switch (type) {
<ide> var PDFFunction = (function() {
<ide>
<ide> var samples = this.getSampleArray(size, outputSize, bps, str);
<ide>
<del> return [ CONSTRUCT_SAMPLED, inputSize, domain, encode, decode, samples, size, outputSize, bps, range ];
<add> return [
<add> CONSTRUCT_SAMPLED, inputSize, domain, encode, decode, samples, size,
<add> outputSize, bps, range
<add> ];
<ide> },
<del>
<add>
<ide> constructSampledFromIR: function(IR) {
<ide> var inputSize = IR[1];
<del> var domain = IR[2];
<del> var encode = IR[3];
<del> var decode = IR[4]
<del> var samples = IR[5]
<del> var size = IR[6]
<del> var outputSize= IR[7];
<del> var bps = IR[8];
<del> var range = IR[9];
<del>
<add> var domain = IR[2];
<add> var encode = IR[3];
<add> var decode = IR[4];
<add> var samples = IR[5];
<add> var size = IR[6];
<add> var outputSize = IR[7];
<add> var bps = IR[8];
<add> var range = IR[9];
<add>
<ide> return function(args) {
<ide> var clip = function(v, min, max) {
<ide> if (v > max)
<ide> var PDFFunction = (function() {
<ide> }
<ide> },
<ide>
<del> constructInterpolated: function pdfFunctionConstructInterpolated(str, dict) {
<add> constructInterpolated:
<add> function pdfFunctionConstructInterpolated(str, dict) {
<ide> var c0 = dict.get('C0') || [0];
<ide> var c1 = dict.get('C1') || [1];
<ide> var n = dict.get('N');
<ide> var PDFFunction = (function() {
<ide> for (var i = 0; i < length; ++i)
<ide> diff.push(c1[i] - c0[i]);
<ide>
<del> return [ CONSTRUCT_INTERPOLATED, c0, diff, n, i ];
<add> return [CONSTRUCT_INTERPOLATED, c0, diff, n, i];
<ide> },
<ide>
<del> constructInterpolatedFromIR: function pdfFunctionconstructInterpolatedFromIR(IR) {
<del> var c0 = IR[1];
<add> constructInterpolatedFromIR:
<add> function pdfFunctionconstructInterpolatedFromIR(IR) {
<add> var c0 = IR[1];
<ide> var diff = IR[2];
<del> var n = IR[3];
<del> var i = IR[4];
<del>
<add> var n = IR[3];
<add> var i = IR[4];
<add>
<ide> var length = diff.length;
<ide>
<ide> return function(args) {
<ide> var PDFFunction = (function() {
<ide> out.push(c0[j] + (x^n * diff[i]));
<ide>
<ide> return out;
<del>
<add>
<ide> }
<ide> },
<del>
<add>
<ide> constructStiched: function pdfFunctionConstructStiched(fn, dict, xref) {
<ide> var domain = dict.get('Domain');
<ide> var range = dict.get('Range');
<ide> var PDFFunction = (function() {
<ide> var bounds = dict.get('Bounds');
<ide> var encode = dict.get('Encode');
<ide>
<del> return [ CONSTRUCT_STICHED, domain, bounds, encoding, fns ];
<add> return [CONSTRUCT_STICHED, domain, bounds, encoding, fns];
<ide> },
<ide>
<ide> constructStichedFromIR: function pdfFunctionConstructStichedFromIR(IR) {
<del> var domain = IR[1];
<del> var bounds = IR[2];
<del> var encoding = IR[3];
<del> var fnsIR = IR[4];
<add> var domain = IR[1];
<add> var bounds = IR[2];
<add> var encoding = IR[3];
<add> var fnsIR = IR[4];
<ide> var fns = [];
<ide>
<ide> for (var i = 0; i < fnsIR.length; i++) {
<ide> var PDFFunction = (function() {
<ide> },
<ide>
<ide> constructPostScript: function pdfFunctionConstructPostScript() {
<del> return [ CONSTRUCT_POSTSCRIPT ];
<add> return [CONSTRUCT_POSTSCRIPT];
<ide> },
<ide>
<ide> constructPostScriptFromIR: function pdfFunctionConstructPostScriptFromIR() {
<ide> var PDFFunction = (function() {
<ide> return [255, 105, 180];
<ide> };
<ide> }
<del> }
<add> };
<ide> })();
<ide>
<ide><path>worker.js
<ide> var WorkerPage = (function() {
<ide> this.workerPDF = workerPDF;
<ide> this.page = page;
<ide> this.objs = objs;
<del>
<add>
<ide> this.ref = page.ref;
<ide> }
<del>
<add>
<ide> constructor.prototype = {
<ide> get width() {
<ide> return this.page.width;
<ide> },
<del>
<add>
<ide> get height() {
<ide> return this.page.height;
<ide> },
<del>
<add>
<ide> get stats() {
<ide> return this.page.stats;
<ide> },
<del>
<add>
<ide> get view() {
<ide> return this.page.view;
<ide> },
<ide> var WorkerPage = (function() {
<ide> this.callback = callback;
<ide> // TODO: Place the worker magic HERE.
<ide> // this.page.startRendering(ctx, callback, errback);
<del>
<add>
<ide> this.startRenderingTime = Date.now();
<del> this.workerPDF.startRendering(this)
<add> this.workerPDF.startRendering(this);
<ide> },
<del>
<add>
<ide> startRenderingFromIRQueue: function(IRQueue, fonts) {
<ide> var gfx = new CanvasGraphics(this.ctx, this.objs);
<del>
<add>
<ide> var startTime = Date.now();
<ide> var callback = function(err) {
<ide> var pageNum = this.page.pageNumber + 1;
<del> console.log("page=%d - rendering time: time=%dms",
<add> console.log('page=%d - rendering time: time=%dms',
<ide> pageNum, Date.now() - startTime);
<del> console.log("page=%d - total time: time=%dms",
<add> console.log('page=%d - total time: time=%dms',
<ide> pageNum, Date.now() - this.startRenderingTime);
<ide>
<ide> this.callback(err);
<ide> }.bind(this);
<ide> this.page.startRenderingFromIRQueue(gfx, IRQueue, fonts, callback);
<ide> },
<del>
<add>
<ide> getLinks: function() {
<ide> return this.page.getLinks();
<ide> }
<ide> };
<del>
<add>
<ide> return constructor;
<ide> })();
<ide>
<ide> /**
<ide> * A PDF document and page is built up of many objects. E.g. there are objects
<ide> * for fonts, images, rendering code and such. These objects might get processed
<del> * inside of a worker. The `PDFObjects` implements some basic functions to manage
<del> * these objects.
<add> * inside of a worker. The `PDFObjects` implements some basic functions to
<add> * manage these objects.
<ide> */
<ide> var PDFObjects = (function() {
<ide> function PDFObjects() {
<ide> var PDFObjects = (function() {
<ide> * object needs to be resolved. If it isn't, this function throws.
<ide> *
<ide> * If called *with* a callback, the callback is called with the data of the
<del> * object once the object is resolved. That means, if you call this
<add> * object once the object is resolved. That means, if you call this
<ide> * function and the object is already resolved, the callback gets called
<ide> * right away.
<ide> */
<ide> get: function(objId, callback) {
<del> // If there is a callback, then the get can be async and the object is
<add> // If there is a callback, then the get can be async and the object is
<ide> // not required to be resolved right now
<ide> if (callback) {
<ide> this.ensureObj(objId).then(callback);
<del> }
<add> }
<ide> // If there isn't a callback, the user expects to get the resolved data
<ide> // directly.
<ide> else {
<ide> var PDFObjects = (function() {
<ide> // If there isn't an object yet or the object isn't resolved, then the
<ide> // data isn't ready yet!
<ide> if (!obj || !obj.isResolved) {
<del> throw "Requesting object that isn't resolved yet " + objId;
<del> }
<add> throw 'Requesting object that isn\'t resolved yet ' + objId;
<add> }
<ide> // Direct access.
<ide> else {
<ide> return obj.data;
<ide> var PDFObjects = (function() {
<ide> */
<ide> resolve: function(objId, data) {
<ide> var objs = this.objs;
<del>
<add>
<ide> // In case there is a promise already on this object, just resolve it.
<ide> if (objs[objId]) {
<ide> objs[objId].resolve(data);
<ide> var PDFObjects = (function() {
<ide> // a *resolved* promise which shouldn't be the case!
<ide> this.ensureObj(objId).data = data;
<ide> }
<del> }
<add> };
<ide> return PDFObjects;
<ide> })();
<ide>
<ide>
<ide> /**
<del> * "Promise" object.
<add> * 'Promise' object.
<ide> * Each object that is stored in PDFObjects is based on a Promise object that
<ide> * contains the status of the object and the data. There might be situations,
<ide> * where a function wants to use the value of an object, but it isn't ready at
<ide> var Promise = (function() {
<ide> this._data = data;
<ide> this.hasData = true;
<ide> } else {
<del> this.isResolved = false;
<add> this.isResolved = false;
<ide> this._data = EMPTY_PROMISE;
<ide> }
<ide> this.callbacks = [];
<ide> };
<del>
<add>
<ide> Promise.prototype = {
<ide> hasData: false,
<ide>
<ide> var Promise = (function() {
<ide> return;
<ide> }
<ide> if (this._data !== EMPTY_PROMISE) {
<del> throw "Promise " + this.name + ": Cannot set the data of a promise twice";
<add> throw 'Promise ' + this.name +
<add> ': Cannot set the data of a promise twice';
<ide> }
<ide> this._data = data;
<ide> this.hasData = true;
<ide> var Promise = (function() {
<ide> this.onDataCallback(data);
<ide> }
<ide> },
<del>
<add>
<ide> get data() {
<ide> if (this._data === EMPTY_PROMISE) {
<del> throw "Promise " + this.name + ": Cannot get data that isn't set";
<add> throw 'Promise ' + this.name + ': Cannot get data that isn\'t set';
<ide> }
<ide> return this._data;
<ide> },
<ide> var Promise = (function() {
<ide> this.onDataCallback = callback;
<ide> }
<ide> },
<del>
<add>
<ide> resolve: function(data) {
<ide> if (this.isResolved) {
<del> throw "A Promise can be resolved only once " + this.name;
<add> throw 'A Promise can be resolved only once ' + this.name;
<ide> }
<ide>
<ide> this.isResolved = true;
<ide> this.data = data;
<ide> var callbacks = this.callbacks;
<del>
<add>
<ide> for (var i = 0; i < callbacks.length; i++) {
<ide> callbacks[i].call(null, data);
<ide> }
<ide> },
<del>
<add>
<ide> then: function(callback) {
<ide> if (!callback) {
<del> throw "Requiring callback" + this.name;
<add> throw 'Requiring callback' + this.name;
<ide> }
<del>
<add>
<ide> // If the promise is already resolved, call the callback directly.
<ide> if (this.isResolved) {
<ide> var data = this.data;
<ide> callback.call(null, data);
<ide> } else {
<del> this.callbacks.push(callback);
<add> this.callbacks.push(callback);
<ide> }
<ide> }
<del> }
<add> };
<ide> return Promise;
<ide> })();
<ide>
<ide><path>worker/console.js
<ide> var console = {
<ide> var args = Array.prototype.slice.call(arguments);
<ide> postMessage({
<ide> action: 'console_log',
<del> data: args
<add> data: args
<ide> });
<ide> },
<del>
<add>
<ide> error: function error() {
<ide> var args = Array.prototype.slice.call(arguments);
<ide> postMessage({
<ide> action: 'console_error',
<del> data: args
<add> data: args
<ide> });
<ide> },
<ide>
<ide><path>worker/message_handler.js
<ide> function MessageHandler(name, comObj) {
<ide> this.name = name;
<ide> this.comObj = comObj;
<ide> var ah = this.actionHandler = {};
<del>
<del> ah["console_log"] = [function(data) {
<add>
<add> ah['console_log'] = [function(data) {
<ide> console.log.apply(console, data);
<del> }]
<del> ah["console_error"] = [function(data) {
<add> }];
<add> ah['console_error'] = [function(data) {
<ide> console.error.apply(console, data);
<del> }]
<del>
<add> }];
<add>
<ide> comObj.onmessage = function(event) {
<ide> var data = event.data;
<ide> if (data.action in ah) {
<ide> MessageHandler.prototype = {
<ide> send: function(actionName, data) {
<ide> this.comObj.postMessage({
<ide> action: actionName,
<del> data: data
<del> });
<add> data: data
<add> });
<ide> }
<del>}
<add>};
<ide>
<ide><path>worker/pdf_worker_loader.js
<ide> importScripts('processor_handler.js');
<ide> // Listen for messages from the main thread.
<ide> var pdfDoc = null;
<ide>
<del>var handler = new MessageHandler("worker_processor", this);
<add>var handler = new MessageHandler('worker_processor', this);
<ide> WorkerProcessorHandler.setup(handler);
<ide><path>worker/processor_handler.js
<ide> var WorkerProcessorHandler = {
<ide> setup: function(handler) {
<ide> var pdfDoc = null;
<del>
<del> handler.on("doc", function(data) {
<add>
<add> handler.on('doc', function(data) {
<ide> // Create only the model of the PDFDoc, which is enough for
<ide> // processing the content of the pdf.
<ide> pdfDoc = new PDFDocModel(new Stream(data));
<ide> });
<del>
<del> handler.on("page_request", function(pageNum) {
<add>
<add> handler.on('page_request', function(pageNum) {
<ide> pageNum = parseInt(pageNum);
<ide>
<ide> var page = pdfDoc.getPage(pageNum);
<ide>
<del> // The following code does quite the same as Page.prototype.startRendering,
<del> // but stops at one point and sends the result back to the main thread.
<add> // The following code does quite the same as
<add> // Page.prototype.startRendering, but stops at one point and sends the
<add> // result back to the main thread.
<ide> var gfx = new CanvasGraphics(null);
<ide>
<ide> var start = Date.now();
<ide> var WorkerProcessorHandler = {
<ide> // Pre compile the pdf page and fetch the fonts/images.
<ide> var IRQueue = page.getIRQueue(handler, dependency);
<ide>
<del> console.log("page=%d - getIRQueue: time=%dms, len=%d", pageNum, Date.now() - start, IRQueue.fnArray.length);
<add> console.log('page=%d - getIRQueue: time=%dms, len=%d', pageNum,
<add> Date.now() - start, IRQueue.fnArray.length);
<ide>
<ide> if (false /* show used commands */) {
<ide> var cmdMap = {};
<del>
<add>
<ide> var fnArray = IRQueue.fnArray;
<ide> for (var i = 0; i < fnArray.length; i++) {
<ide> var entry = fnArray[i];
<del> if (entry == "paintReadyFormXObject") {
<add> if (entry == 'paintReadyFormXObject') {
<ide> //console.log(preCompilation.argsArray[i]);
<ide> }
<ide> if (cmdMap[entry] == null) {
<ide> var WorkerProcessorHandler = {
<ide> cmdMap[entry] += 1;
<ide> }
<ide> }
<del> console.log("cmds", JSON.stringify(cmdMap));
<add> console.log('cmds', JSON.stringify(cmdMap));
<ide> }
<ide>
<ide> // Filter the dependencies for fonts.
<ide> var WorkerProcessorHandler = {
<ide> // }
<ide> // }
<ide>
<del> handler.send("page", {
<del> pageNum: pageNum,
<del> IRQueue: IRQueue,
<add> handler.send('page', {
<add> pageNum: pageNum,
<add> IRQueue: IRQueue,
<ide> depFonts: fonts
<ide> });
<ide> }, this);
<del>
<del> handler.on("font", function(data) {
<del> var objId = data[0];
<del> var name = data[1];
<del> var file = data[2];
<add>
<add> handler.on('font', function(data) {
<add> var objId = data[0];
<add> var name = data[1];
<add> var file = data[2];
<ide> var properties = data[3];
<ide>
<ide> var font = {
<ide> var WorkerProcessorHandler = {
<ide>
<ide> var fontFile = new Stream(file.bytes, file.start,
<ide> file.end - file.start, fontFileDict);
<del>
<del> // Check if this is a FlateStream. Otherwise just use the created
<add>
<add> // Check if this is a FlateStream. Otherwise just use the created
<ide> // Stream one. This makes complex_ttf_font.pdf work.
<ide> var cmf = file.bytes[0];
<ide> if ((cmf & 0x0f) == 0x08) {
<ide> font.file = new FlateStream(fontFile);
<ide> } else {
<ide> font.file = fontFile;
<del> }
<add> }
<ide> }
<ide>
<ide> var obj = new Font(font.name, font.file, font.properties);
<ide> var WorkerProcessorHandler = {
<ide> // anymore as we sent over the ready str.
<ide> delete obj.data;
<ide>
<del> handler.send("font_ready", [objId, obj]);
<add> handler.send('font_ready', [objId, obj]);
<ide> });
<ide> }
<del>}
<add>}; | 7 |
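The patch above polishes a hand-rolled promise implementation (this code predates native `Promise`). As a hedged illustration of the contract that `worker.js` relies on — resolve exactly once, `then` callable both before and after resolution — here is a self-contained sketch. It is renamed `SimplePromise` to avoid shadowing the native class, and the `EMPTY_PROMISE` sentinel and `onData` hook from the real file are deliberately omitted:

```javascript
// Minimal sketch of the resolve-once / then-anytime contract used in the
// patched worker code. Not the library implementation.
function SimplePromise(name) {
  this.name = name;
  this.isResolved = false;
  this.callbacks = [];
}
SimplePromise.prototype.resolve = function (data) {
  if (this.isResolved) {
    throw 'A Promise can be resolved only once ' + this.name;
  }
  this.isResolved = true;
  this.data = data;
  // Flush every callback queued before resolution.
  for (var i = 0; i < this.callbacks.length; i++) {
    this.callbacks[i].call(null, data);
  }
};
SimplePromise.prototype.then = function (callback) {
  // Already resolved: call back synchronously. Otherwise queue for later.
  if (this.isResolved) {
    callback.call(null, this.data);
  } else {
    this.callbacks.push(callback);
  }
};

var p = new SimplePromise('demo');
var seen = [];
p.then(function (d) { seen.push(d); }); // queued until resolve
p.resolve(42);                          // flushes the queue
p.then(function (d) { seen.push(d); }); // already resolved: runs at once
console.log(seen.join(','));            // → 42,42
```

Both `then` calls observe the same value, which is why `PDFObjects.get` can hand a callback to `ensureObj(objId).then(...)` without caring whether the object was resolved yet.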
Javascript | Javascript | add datasetlabel to elements for tooltip templates | 7a231313668f3b3dfe949dabd2c8da69f9088c55 | <ide><path>src/Chart.Bar.js
<ide> datasetObject.bars.push(new this.BarClass({
<ide> value : dataPoint,
<ide> label : data.labels[index],
<add> datasetLabel: dataset.label,
<ide> strokeColor : dataset.strokeColor,
<ide> fillColor : dataset.fillColor,
<ide> highlightFill : dataset.highlightFill || dataset.fillColor,
<ide><path>src/Chart.Line.js
<ide> datasetObject.points.push(new this.PointClass({
<ide> value : dataPoint,
<ide> label : data.labels[index],
<del> // x: this.scale.calculateX(index),
<del> // y: this.scale.endPoint,
<add> datasetLabel: dataset.label,
<ide> strokeColor : dataset.pointStrokeColor,
<ide> fillColor : dataset.pointColor,
<ide> highlightFill : dataset.pointHighlightFill || dataset.pointColor,
<ide><path>src/Chart.Radar.js
<ide> datasetObject.points.push(new this.PointClass({
<ide> value : dataPoint,
<ide> label : data.labels[index],
<add> datasetLabel: dataset.label,
<ide> x: (this.options.animation) ? this.scale.xCenter : pointPosition.x,
<ide> y: (this.options.animation) ? this.scale.yCenter : pointPosition.y,
<ide> strokeColor : dataset.pointStrokeColor, | 3 |
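The three hunks above attach `datasetLabel` to every bar/point element so tooltip templates can read it. As a hedged stand-in (not Chart.js itself — just a minimal interpolator in the same `<%= … %>` template style, with made-up sample data), this shows what the new field makes available:

```javascript
// Tiny stand-in for Chart.js v1 template filling: replace each
// <%= key %> placeholder with the matching property of the element.
const fill = (tpl, el) =>
  tpl.replace(/<%=\s*(\w+)\s*%>/g, (_, key) => String(el[key]));

// With this commit, each element carries datasetLabel alongside value/label.
const element = { value: 42, label: 'March', datasetLabel: 'Revenue' };

console.log(fill('<%= datasetLabel %>: <%= value %>', element)); // → Revenue: 42
```

A `multiTooltipTemplate` of `"<%= datasetLabel %>: <%= value %>"` can now distinguish datasets, which was impossible before the field existed on the elements.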
Javascript | Javascript | use callbacks instead of events in net2 | 979f5889d5366e93e6fc4355e891f34098055059 | <ide><path>lib/http2.js
<ide> function connectionListener (socket) {
<ide> // we need to keep track of the order they were sent.
<ide> var responses = [];
<ide>
<del> socket.addListener('dataLite', function (d, start, end) {
<add> socket.ondata = function (d, start, end) {
<ide> parser.execute(d, start, end - start);
<del> });
<add> };
<ide>
<del> // is this really needed?
<del> socket.addListener('end', function () {
<add> socket.onend = function () {
<ide> parser.finish();
<ide> // unref the parser for easy gc
<ide> freeParser(parser);
<ide> function connectionListener (socket) {
<ide> } else {
<ide> responses[responses.length-1].closeOnFinish = true;
<ide> }
<del> });
<add> };
<ide>
<ide> parser.socket = socket;
<ide> // The following callback is issued after the headers have been read on a
<ide><path>lib/net.js
<add>var sys = require("./sys");
<ide> var debugLevel = 0;
<ide> if ('NODE_DEBUG' in process.ENV) debugLevel = 1;
<ide> function debug (x) {
<ide> function Socket (peerInfo) {
<ide> if (!recvMsg.fd && bytesRead == 0) {
<ide> self.readable = false;
<ide> self._readWatcher.stop();
<del> self.emit('end');
<add>
<add> if (self._events && self._events['end']) self.emit('end');
<add> if (self.onend) self.onend();
<add>
<ide> if (!self.writable) self.forceClose();
<ide> } else if (bytesRead > 0) {
<ide> var start = recvBuffer.used;
<ide> var end = recvBuffer.used + bytesRead;
<del> if (self.listeners('data').length) {
<add>
<add> if (self._events && self._events['data']) {
<ide> // emit a slice
<ide> self.emit('data', recvBuffer.slice(start, end));
<ide> }
<del> if (self.listeners('dataLite').length) {
<del> // emit the original buffer with end points.
<del> self.emit('dataLite', recvBuffer, start, end);
<del> }
<add>
<add> // Optimization: emit the original buffer with end points
<add> if (self.ondata) self.ondata(recvBuffer, start, end);
<add>
<ide> recvBuffer.used += bytesRead;
<ide> }
<ide> };
<ide> function Socket (peerInfo) {
<ide> self.sendQueueSize = 0; // in bytes, not to be confused with sendQueue.length!
<ide> self.sendMessageQueueSize = 0; // number of messages remaining to be sent
<ide> self._doFlush = function () {
<del> /* Socket becomes writeable on connect() but don't flush if there's
<del> * nothing actually to write */
<add> // Socket becomes writeable on connect() but don't flush if there's
<add> // nothing actually to write
<ide> if ((self.sendQueueSize == 0) && (self.sendMessageQueueSize == 0)) {
<ide> return;
<ide> }
<ide> if (self.flush()) {
<ide> assert(self.sendQueueSize == 0);
<ide> assert(self.sendMessageQueueSize == 0);
<del> self.emit("drain");
<add>
<add> if (self._events && self._events['drain']) self.emit("drain");
<add> if (self.ondrain) self.ondrain(); // Optimization
<ide> }
<ide> };
<ide> self._writeWatcher = ioWatchers.alloc();
<ide> Socket.prototype.sendFD = function(socketToPass) {
<ide> };
<ide>
<ide>
<del>// Flushes the write buffer out. Emits "drain" if the buffer is empty.
<add>// Flushes the write buffer out.
<add>// Returns true if the entire buffer was flushed.
<ide> Socket.prototype.flush = function () {
<ide> var self = this;
<ide>
<ide> Socket.prototype.flush = function () {
<ide>
<ide> if (b == END_OF_FILE) {
<ide> self._shutdown();
<del> break;
<add> return false;
<ide> }
<ide>
<ide> if (b.sent == b.used) {
<ide> Socket.prototype.flush = function () {
<ide> }
<ide> } catch (e) {
<ide> self.forceClose(e);
<del> return;
<add> return false;
<ide> }
<ide>
<ide> if (bytesWritten === null) { | 2 |
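The patch above swaps per-chunk `emit('data', …)`/`emit('dataLite', …)` for optional hard-wired `ondata`/`onend`/`ondrain` callbacks. A hedged sketch of why that is a fast path — the original buffer plus offsets goes straight to one callback, and a slice is only allocated when EventEmitter listeners actually exist (simplified; not the real net.js internals):

```javascript
function FakeSocket() {
  this._events = null; // populated lazily, mirroring the patch's `self._events` checks
}
FakeSocket.prototype._onReadable = function (recvBuffer, start, end) {
  // Generic path: allocate a fresh slice only when listeners exist.
  if (this._events && this._events['data']) {
    this._events['data'](recvBuffer.slice(start, end));
  }
  // Optimization from the patch: hand the original buffer plus offsets to a
  // single hard-wired callback -- no per-chunk allocation, no emitter lookup.
  if (this.ondata) this.ondata(recvBuffer, start, end);
};

var s = new FakeSocket();
var got = [];
s.ondata = function (buf, start, end) {
  got.push(buf.slice(start, end).toString());
};
s._onReadable(Buffer.from('..hello..'), 2, 7);
console.log(got[0]); // → hello
```

This is exactly the shape the http2 hunk at the top of this patch consumes: `socket.ondata = function (d, start, end) { parser.execute(d, start, end - start); }`.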
Javascript | Javascript | fix incorrect disposing of modules during hmr | 12997f0a6c333ebc2dc88e0d5ca4becbe6addbe8 | <ide><path>lib/HotModuleReplacementPlugin.js
<ide> const {
<ide> const { find } = require("./util/SetHelpers");
<ide> const TupleSet = require("./util/TupleSet");
<ide> const { compareModulesById } = require("./util/comparators");
<add>const { getRuntimeKey, keyToRuntime } = require("./util/runtime");
<ide>
<ide> /** @typedef {import("./Chunk")} Chunk */
<ide> /** @typedef {import("./Compilation").AssetInfo} AssetInfo */
<ide> class HotModuleReplacementPlugin {
<ide> records.fullHashChunkModuleHashes = fullHashChunkModuleHashes;
<ide> records.chunkModuleHashes = chunkModuleHashes;
<ide> records.chunkHashs = {};
<add> records.chunkRuntime = {};
<ide> for (const chunk of compilation.chunks) {
<ide> records.chunkHashs[chunk.id] = chunk.hash;
<add> records.chunkRuntime[chunk.id] = getRuntimeKey(chunk.runtime);
<ide> }
<ide> records.chunkModuleIds = {};
<ide> for (const chunk of compilation.chunks) {
<ide> class HotModuleReplacementPlugin {
<ide> r: [],
<ide> m: undefined
<ide> };
<add>
<add> // Create a list of all active modules to verify which modules are removed completely
<add> /** @type {Map<number|string, Module>} */
<add> const allModules = new Map();
<add> for (const module of compilation.modules) {
<add> allModules.set(chunkGraph.getModuleId(module), module);
<add> }
<add>
<add> // List of completely removed modules
<ide> const allRemovedModules = new Set();
<add>
<ide> for (const key of Object.keys(records.chunkHashs)) {
<add> // Check which modules are completely removed
<add> for (const id of records.chunkModuleIds[key]) {
<add> if (!allModules.has(id)) {
<add> allRemovedModules.add(id);
<add> }
<add> }
<add>
<add> let chunkId;
<add> let newModules;
<add> let newRuntimeModules;
<add> let newFullHashModules;
<add> let newRuntime;
<ide> const currentChunk = find(
<ide> compilation.chunks,
<ide> chunk => `${chunk.id}` === key
<ide> );
<ide> if (currentChunk) {
<del> const chunkId = currentChunk.id;
<del> const newModules = chunkGraph
<add> chunkId = currentChunk.id;
<add> newRuntime = currentChunk.runtime;
<add> newModules = chunkGraph
<ide> .getChunkModules(currentChunk)
<ide> .filter(module => updatedModules.has(module, currentChunk));
<del> const newRuntimeModules = Array.from(
<add> newRuntimeModules = Array.from(
<ide> chunkGraph.getChunkRuntimeModulesIterable(currentChunk)
<ide> ).filter(module => updatedModules.has(module, currentChunk));
<ide> const fullHashModules = chunkGraph.getChunkFullHashModulesIterable(
<ide> currentChunk
<ide> );
<del> const newFullHashModules =
<add> newFullHashModules =
<ide> fullHashModules &&
<ide> Array.from(fullHashModules).filter(module =>
<ide> updatedModules.has(module, currentChunk)
<ide> );
<del> /** @type {Set<number|string>} */
<del> const allModules = new Set();
<del> for (const module of chunkGraph.getChunkModulesIterable(
<del> currentChunk
<del> )) {
<del> allModules.add(chunkGraph.getModuleId(module));
<add> } else {
<add> chunkId = `${+key}` === key ? +key : key;
<add> hotUpdateMainContent.r.push(chunkId);
<add> const runtime = keyToRuntime(records.chunkRuntime[key]);
<add> for (const id of records.chunkModuleIds[key]) {
<add> const module = allModules.get(id);
<add> if (!module) continue;
<add> const hash = chunkGraph.getModuleHash(module, runtime);
<add> const moduleKey = `${key}|${module.identifier()}`;
<add> if (hash !== records.chunkModuleHashes[moduleKey]) {
<add> newModules = newModules || [];
<add> newModules.push(module);
<add> }
<ide> }
<del> const removedModules = records.chunkModuleIds[chunkId].filter(
<del> id => !allModules.has(id)
<add> }
<add> if (
<add> (newModules && newModules.length > 0) ||
<add> (newRuntimeModules && newRuntimeModules.length > 0)
<add> ) {
<add> const hotUpdateChunk = new HotUpdateChunk();
<add> ChunkGraph.setChunkGraphForChunk(hotUpdateChunk, chunkGraph);
<add> hotUpdateChunk.id = chunkId;
<add> hotUpdateChunk.runtime = newRuntime;
<add> chunkGraph.attachModules(hotUpdateChunk, newModules || []);
<add> chunkGraph.attachRuntimeModules(
<add> hotUpdateChunk,
<add> newRuntimeModules || []
<ide> );
<del> if (
<del> newModules.length > 0 ||
<del> newRuntimeModules.length > 0 ||
<del> removedModules.length > 0
<del> ) {
<del> const hotUpdateChunk = new HotUpdateChunk();
<del> ChunkGraph.setChunkGraphForChunk(hotUpdateChunk, chunkGraph);
<del> hotUpdateChunk.id = chunkId;
<del> hotUpdateChunk.runtime = currentChunk.runtime;
<del> chunkGraph.attachModules(hotUpdateChunk, newModules);
<del> chunkGraph.attachRuntimeModules(
<add> if (newFullHashModules) {
<add> chunkGraph.attachFullHashModules(
<ide> hotUpdateChunk,
<del> newRuntimeModules
<add> newFullHashModules
<ide> );
<del> if (newFullHashModules) {
<del> chunkGraph.attachFullHashModules(
<del> hotUpdateChunk,
<del> newFullHashModules
<del> );
<add> }
<add> const renderManifest = compilation.getRenderManifest({
<add> chunk: hotUpdateChunk,
<add> hash: records.hash,
<add> fullHash: records.hash,
<add> outputOptions: compilation.outputOptions,
<add> moduleTemplates: compilation.moduleTemplates,
<add> dependencyTemplates: compilation.dependencyTemplates,
<add> codeGenerationResults: compilation.codeGenerationResults,
<add> runtimeTemplate: compilation.runtimeTemplate,
<add> moduleGraph: compilation.moduleGraph,
<add> chunkGraph
<add> });
<add> for (const entry of renderManifest) {
<add> /** @type {string} */
<add> let filename;
<add> /** @type {AssetInfo} */
<add> let assetInfo;
<add> if ("filename" in entry) {
<add> filename = entry.filename;
<add> assetInfo = entry.info;
<add> } else {
<add> ({
<add> path: filename,
<add> info: assetInfo
<add> } = compilation.getPathWithInfo(
<add> entry.filenameTemplate,
<add> entry.pathOptions
<add> ));
<ide> }
<del> hotUpdateChunk.removedModules = removedModules;
<del> const renderManifest = compilation.getRenderManifest({
<del> chunk: hotUpdateChunk,
<del> hash: records.hash,
<del> fullHash: records.hash,
<del> outputOptions: compilation.outputOptions,
<del> moduleTemplates: compilation.moduleTemplates,
<del> dependencyTemplates: compilation.dependencyTemplates,
<del> codeGenerationResults: compilation.codeGenerationResults,
<del> runtimeTemplate: compilation.runtimeTemplate,
<del> moduleGraph: compilation.moduleGraph,
<del> chunkGraph
<add> const source = entry.render();
<add> compilation.additionalChunkAssets.push(filename);
<add> compilation.emitAsset(filename, source, {
<add> hotModuleReplacement: true,
<add> ...assetInfo
<ide> });
<del> for (const entry of renderManifest) {
<del> /** @type {string} */
<del> let filename;
<del> /** @type {AssetInfo} */
<del> let assetInfo;
<del> if ("filename" in entry) {
<del> filename = entry.filename;
<del> assetInfo = entry.info;
<del> } else {
<del> ({
<del> path: filename,
<del> info: assetInfo
<del> } = compilation.getPathWithInfo(
<del> entry.filenameTemplate,
<del> entry.pathOptions
<del> ));
<del> }
<del> const source = entry.render();
<del> compilation.additionalChunkAssets.push(filename);
<del> compilation.emitAsset(filename, source, {
<del> hotModuleReplacement: true,
<del> ...assetInfo
<del> });
<add> if (currentChunk) {
<ide> currentChunk.files.add(filename);
<ide> compilation.hooks.chunkAsset.call(currentChunk, filename);
<ide> }
<del> hotUpdateMainContent.c.push(chunkId);
<ide> }
<del> } else {
<del> const chunkId = `${+key}` === key ? +key : key;
<del> hotUpdateMainContent.r.push(chunkId);
<del> for (const id of records.chunkModuleIds[chunkId])
<del> allRemovedModules.add(id);
<add> hotUpdateMainContent.c.push(chunkId);
<ide> }
<ide> }
<ide> hotUpdateMainContent.m = Array.from(allRemovedModules);
<ide><path>lib/HotUpdateChunk.js
<ide> const Chunk = require("./Chunk");
<ide> class HotUpdateChunk extends Chunk {
<ide> constructor() {
<ide> super();
<del> /** @type {(string|number)[]} */
<del> this.removedModules = undefined;
<del> }
<del>
<del> /**
<del> * @param {Hash} hash hash (will be modified)
<del> * @param {ChunkGraph} chunkGraph the chunk graph
<del> * @returns {void}
<del> */
<del> updateHash(hash, chunkGraph) {
<del> super.updateHash(hash, chunkGraph);
<del> hash.update(JSON.stringify(this.removedModules));
<ide> }
<ide> }
<ide>
<ide><path>lib/Template.js
<ide> "use strict";
<ide>
<ide> const { ConcatSource, PrefixSource } = require("webpack-sources");
<del>const HotUpdateChunk = require("./HotUpdateChunk");
<del>const { compareIds } = require("./util/comparators");
<ide>
<ide> /** @typedef {import("webpack-sources").ConcatSource} ConcatSource */
<ide> /** @typedef {import("webpack-sources").Source} Source */
<ide> class Template {
<ide> * @returns {Source} rendered chunk modules in a Source object
<ide> */
<ide> static renderChunkModules(renderContext, modules, renderModule, prefix = "") {
<del> const { chunk, chunkGraph } = renderContext;
<add> const { chunkGraph } = renderContext;
<ide> var source = new ConcatSource();
<del> let removedModules;
<del> if (chunk instanceof HotUpdateChunk) {
<del> removedModules = chunk.removedModules;
<del> }
<del> if (
<del> modules.length === 0 &&
<del> (!removedModules || removedModules.length === 0)
<del> ) {
<add> if (modules.length === 0) {
<ide> return null;
<ide> }
<ide> /** @type {{id: string|number, source: Source|string}[]} */
<ide> class Template {
<ide> source: renderModule(module) || "false"
<ide> };
<ide> });
<del> if (removedModules && removedModules.length > 0) {
<del> removedModules.sort(compareIds);
<del> for (const id of removedModules) {
<del> allModules.push({
<del> id,
<del> source: "false"
<del> });
<del> }
<del> }
<ide> const bounds = Template.getModulesArrayBounds(allModules);
<ide> if (bounds) {
<ide> // Render a spare array
<ide><path>lib/javascript/JavascriptModulesPlugin.js
<ide> class JavascriptModulesPlugin {
<ide> hashFunction
<ide> }
<ide> } = compilation;
<del> const hotUpdateChunk = chunk instanceof HotUpdateChunk ? chunk : null;
<ide> const hash = createHash(hashFunction);
<ide> if (hashSalt) hash.update(hashSalt);
<ide> hash.update(`${chunk.id} `);
<ide> class JavascriptModulesPlugin {
<ide> }
<ide> xor.updateHash(hash);
<ide> }
<del> if (hotUpdateChunk) {
<del> hash.update(JSON.stringify(hotUpdateChunk.removedModules));
<del> }
<ide> const digest = /** @type {string} */ (hash.digest(hashDigest));
<ide> chunk.contentHash.javascript = digest.substr(0, hashDigestLength);
<ide> });
<ide><path>lib/util/runtime.js
<ide> const getRuntimeKey = runtime => {
<ide> };
<ide> exports.getRuntimeKey = getRuntimeKey;
<ide>
<add>/**
<add> * @param {string} key key of runtimes
<add> * @returns {RuntimeSpec} runtime(s)
<add> */
<add>const keyToRuntime = key => {
<add> if (key === "*") return undefined;
<add> const items = key.split("\n");
<add> if (items.length === 1) return items[0];
<add> return new SortableSet(items);
<add>};
<add>exports.keyToRuntime = keyToRuntime;
<add>
<ide> const getRuntimesString = set => {
<ide> set.sort();
<ide> return Array.from(set).join("+");
<ide> class RuntimeSpecMap {
<ide> }
<ide>
<ide> keys() {
<del> return Array.from(this._map.keys(), key => {
<del> if (key === "*") return undefined;
<del> const items = key.split("\n");
<del> if (items.length === 1) return items[0];
<del> return new SortableSet(items);
<del> });
<add> return Array.from(this._map.keys(), keyToRuntime);
<ide> }
<ide>
<ide> values() {
<ide><path>test/hotCases/disposing/remove-chunk-with-shared/chunk1.js
<add>export * from "./shared";
<add>import.meta.webpackHot.accept("./shared");
<ide><path>test/hotCases/disposing/remove-chunk-with-shared/chunk2.js
<add>export * from "./shared";
<add>import.meta.webpackHot.accept("./shared");
<ide><path>test/hotCases/disposing/remove-chunk-with-shared/index.js
<add>import module from "./module";
<add>
<add>it("should not dispose shared modules when a chunk is removed", done => {

<add> import("./chunk1").then(chunk1 => {
<add> import.meta.webpackHot.accept("./module", async () => {
<add> expect(module).toBe(42);
<add> expect(chunk1).toMatchObject({
<add> active: true
<add> });
<add> done();
<add> });
<add> NEXT(require("../../update")(done));
<add> }, done);
<add>});
<ide><path>test/hotCases/disposing/remove-chunk-with-shared/module.js
<add>export default import("./chunk2");
<add>---
<add>export default 42;
<ide><path>test/hotCases/disposing/remove-chunk-with-shared/shared.js
<add>export let active = true;
<add>
<add>import.meta.webpackHot.dispose(() => {
<add> active = false;
<add>}); | 10 |
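The fix above records `getRuntimeKey(chunk.runtime)` per chunk id and restores it with the new `keyToRuntime` helper when a chunk disappears, so the removed chunk's modules can still be hashed against the correct runtime instead of being disposed wholesale. A hedged sketch of that round trip, simplified from `lib/util/runtime.js` (a plain `Set` stands in for webpack's `SortableSet`):

```javascript
// Serialize a runtime spec (undefined | string | set of strings) to a
// stable string key, and parse it back. Mirrors the pair used by the patch.
const getRuntimeKey = runtime => {
  if (runtime === undefined) return '*';
  if (typeof runtime === 'string') return runtime;
  return Array.from(runtime).sort().join('\n');
};
const keyToRuntime = key => {
  if (key === '*') return undefined;
  const items = key.split('\n');
  if (items.length === 1) return items[0];
  return new Set(items);
};

console.log(keyToRuntime(getRuntimeKey('main')));     // → main
console.log(keyToRuntime(getRuntimeKey(undefined)));  // → undefined
console.log([...keyToRuntime(getRuntimeKey(new Set(['b', 'a'])))]); // → [ 'a', 'b' ]
```

Because the key is stable (sorted, `\n`-joined), `records.chunkRuntime[chunk.id]` written in one compilation can be parsed in the next, which is what lets the plugin re-hash modules of a removed chunk and only mark the ones whose hash actually changed.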
Javascript | Javascript | keep track of incomplete | dfb4dde8fdedd6a9df3026afae2acb9fa9288ca3 | <ide><path>test/testImageURL.browser.js
<ide> function testImageURL(url, timeout, callback){
<ide> testImageURL.getImage(function(img, done){
<ide> function callbackWrapper(error, event){
<ide> callbackWrapper = testImageURL.noop;
<add> testImageURL.running = (testImageURL.running || 0) - 1;
<ide> clearTimeout(timer);
<ide> done(img);
<ide> img = url = timeout = null;
<ide> function testImageURL(url, timeout, callback){
<ide> img.onload = function(event){ callbackWrapper(null, event || window.event); };
<ide> img.onerror = function(error){ callbackWrapper(error); };
<ide> img.src = url;
<add> testImageURL.running = (testImageURL.running || 0) + 1;
<ide>
<ide> if (img.complete === true
<ide> || img.readyState == 4 | 1 |
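The two counter lines added above keep `testImageURL.running` in sync with in-flight probes. A hedged, browser-free sketch of the pattern — increment on start, decrement exactly once on settle, with the self-replacing callback absorbing duplicate load/error events (names here are illustrative, not the library's):

```javascript
// Shared bookkeeping object, standing in for the testImageURL function object.
const tracker = { running: 0, noop() {} };

function probe(onSettled) {
  let settle = error => {
    settle = tracker.noop;   // later events become no-ops (the callbackWrapper trick)
    tracker.running -= 1;    // this probe has left the in-flight set
    onSettled(error || null);
  };
  tracker.running += 1;
  // Simulate the image firing both onload and onerror; the guard
  // absorbs the second event so the counter never goes negative.
  settle(null);
  settle(new Error('late error'));
}

let settledCount = 0;
probe(() => { settledCount += 1; });
console.log(tracker.running, settledCount); // → 0 1
```

Swapping the callback for a no-op before decrementing is what makes the counter safe even when a browser fires both `onload` and `onerror`, or fires after the timeout has already settled the probe.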
Python | Python | enforce py3 for official/nlp/modeling | 8a78c15449c471bb603e3289cf7f2b5a1d419dfc | <ide><path>official/nlp/modeling/layers/attention.py
<ide> # ==============================================================================
<ide> """Keras-based attention layer."""
<ide> # pylint: disable=g-classes-have-attributes
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<ide>
<ide> import math
<ide> import string
<ide><path>official/nlp/modeling/layers/cls_head.py
<ide> # limitations under the License.
<ide> # ==============================================================================
<ide> """A Classification head layer which is commonly used with sequence encoders."""
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<ide>
<ide> import tensorflow as tf
<ide>
<ide><path>official/nlp/modeling/layers/dense_einsum.py
<ide> # ==============================================================================
<ide> """Keras-based einsum layer."""
<ide> # pylint: disable=g-classes-have-attributes
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<ide>
<ide> import tensorflow as tf
<ide>
<ide><path>official/nlp/modeling/layers/dense_einsum_test.py
<ide> # ==============================================================================
<ide> """Tests for Keras-based einsum layer."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del>from __future__ import print_function
<del>
<ide> import numpy as np
<ide> import tensorflow as tf
<ide>
<ide><path>official/nlp/modeling/layers/gated_feedforward.py
<ide> # ==============================================================================
<ide> """Keras-based gated feedforward layer."""
<ide> # pylint: disable=g-classes-have-attributes
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<ide>
<ide> import gin
<ide> import tensorflow as tf
<ide><path>official/nlp/modeling/layers/gated_feedforward_test.py
<ide> # ==============================================================================
<ide> """Tests for Keras-based gated feedforward layer."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del>from __future__ import print_function
<del>
<ide> from absl.testing import parameterized
<ide> import numpy as np
<ide> import tensorflow as tf
<ide><path>official/nlp/modeling/layers/masked_lm.py
<ide> # ==============================================================================
<ide> """Masked language model network."""
<ide> # pylint: disable=g-classes-have-attributes
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<del>
<ide> import tensorflow as tf
<ide>
<ide> from official.modeling import tf_utils
<ide><path>official/nlp/modeling/layers/masked_lm_test.py
<ide> # ==============================================================================
<ide> """Tests for masked language model network."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del>from __future__ import print_function
<del>
<ide> import numpy as np
<ide> import tensorflow as tf
<ide>
<ide><path>official/nlp/modeling/layers/masked_softmax.py
<ide> # ==============================================================================
<ide> """Keras-based softmax layer with optional masking."""
<ide> # pylint: disable=g-classes-have-attributes
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<ide>
<ide> import tensorflow as tf
<ide>
<ide><path>official/nlp/modeling/layers/masked_softmax_test.py
<ide> # ==============================================================================
<ide> """Tests for Keras-based masked softmax layer."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del>from __future__ import print_function
<del>
<ide> import numpy as np
<ide> import tensorflow as tf
<ide>
<ide><path>official/nlp/modeling/layers/mat_mul_with_margin.py
<ide> # ==============================================================================
<ide> """Dot product with margin layer."""
<ide> # pylint: disable=g-classes-have-attributes
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<ide>
<ide> from typing import Tuple
<ide> # Import libraries
<ide><path>official/nlp/modeling/layers/mat_mul_with_margin_test.py
<ide> # ==============================================================================
<ide> """Tests for mat_mul_with_margin layer."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del>from __future__ import print_function
<del>
<del># Import libraries
<ide> import tensorflow as tf
<ide>
<ide> from tensorflow.python.keras import keras_parameterized # pylint: disable=g-direct-tensorflow-import
<ide><path>official/nlp/modeling/layers/multi_channel_attention.py
<ide> """Multi-channel Attention."""
<ide> # pylint: disable=g-classes-have-attributes
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<del>
<ide> import math
<ide>
<ide> import tensorflow as tf
<ide><path>official/nlp/modeling/layers/multi_channel_attention_test.py
<ide> # ==============================================================================
<ide> """Tests for nlp.nhnet.multi_channel_attention."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del>from __future__ import print_function
<del>
<ide> import numpy as np
<ide> import tensorflow as tf
<ide>
<ide><path>official/nlp/modeling/layers/on_device_embedding.py
<ide> # ==============================================================================
<ide> """Keras-based one-hot embedding layer."""
<ide> # pylint: disable=g-classes-have-attributes
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<ide>
<ide> import tensorflow as tf
<ide>
<ide><path>official/nlp/modeling/layers/on_device_embedding_test.py
<ide> # ==============================================================================
<ide> """Tests for Keras-based one-hot embedding layer."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del>from __future__ import print_function
<del>
<ide> import numpy as np
<ide> import tensorflow as tf
<ide>
<ide><path>official/nlp/modeling/layers/position_embedding.py
<ide> # ==============================================================================
<ide> """Keras-based positional embedding layer."""
<ide> # pylint: disable=g-classes-have-attributes
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<ide>
<ide> import math
<ide>
<ide><path>official/nlp/modeling/layers/position_embedding_test.py
<ide> # ==============================================================================
<ide> """Tests for Keras-based positional embedding layer."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del>from __future__ import print_function
<del>
<ide> import numpy as np
<ide> import tensorflow as tf
<ide>
<ide><path>official/nlp/modeling/layers/rezero_transformer.py
<ide> # ==============================================================================
<ide> """Keras-based rezero-transformer block layer (Transformer with ReZero)."""
<ide> # pylint: disable=g-classes-have-attributes
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<ide>
<ide> import gin
<ide> import tensorflow as tf
<ide> def __init__(self,
<ide> def build(self, input_shape):
<ide> input_tensor = input_shape[0] if len(input_shape) == 2 else input_shape
<ide> input_tensor_shape = tf.TensorShape(input_tensor)
<del> if len(input_tensor_shape) != 3:
<add> if len(input_tensor_shape.as_list()) != 3:
<ide> raise ValueError("TransformerLayer expects a three-dimensional input of "
<ide> "shape [batch, sequence, width].")
<ide> batch_size, sequence_length, hidden_size = input_tensor_shape
<ide><path>official/nlp/modeling/layers/rezero_transformer_test.py
<ide> # ==============================================================================
<ide> """Tests for Keras-based rezero-transformer block layer."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del>from __future__ import print_function
<del>
<ide> import numpy as np
<ide> import tensorflow as tf
<ide>
<ide><path>official/nlp/modeling/layers/self_attention_mask.py
<ide> # ==============================================================================
<ide> """Keras layer that creates a self-attention mask."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<del>
<ide> import tensorflow as tf
<ide> from official.modeling import tf_utils
<ide>
<ide><path>official/nlp/modeling/layers/talking_heads_attention_test.py
<ide> # ==============================================================================
<ide> """Tests for the attention layer."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del>from __future__ import print_function
<del>
<ide> from absl.testing import parameterized
<ide> import numpy as np
<ide> import tensorflow as tf
<ide><path>official/nlp/modeling/layers/transformer.py
<ide> # ==============================================================================
<ide> """Keras-based transformer block layer."""
<ide> # pylint: disable=g-classes-have-attributes
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<ide>
<ide> import gin
<ide> import tensorflow as tf
<ide> def __init__(self,
<ide> def build(self, input_shape):
<ide> input_tensor = input_shape[0] if len(input_shape) == 2 else input_shape
<ide> input_tensor_shape = tf.TensorShape(input_tensor)
<del> if len(input_tensor_shape) != 3:
<add> if len(input_tensor_shape.as_list()) != 3:
<ide> raise ValueError("TransformerLayer expects a three-dimensional input of "
<ide> "shape [batch, sequence, width].")
<ide> batch_size, sequence_length, hidden_size = input_tensor_shape
<ide> def __init__(self,
<ide>
<ide> def build(self, input_shape):
<ide> target_tensor_shape = tf.TensorShape(input_shape[0])
<del> if len(target_tensor_shape) != 3:
<add> if len(target_tensor_shape.as_list()) != 3:
<ide> raise ValueError("TransformerLayer expects a three-dimensional input of "
<ide> "shape [batch, sequence, width].")
<ide> hidden_size = target_tensor_shape[2]
<ide><path>official/nlp/modeling/layers/transformer_scaffold.py
<ide> # ==============================================================================
<ide> """Keras-based transformer scaffold layer."""
<ide> # pylint: disable=g-classes-have-attributes
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<ide>
<ide> import gin
<ide> import tensorflow as tf
<ide> def __init__(self,
<ide> def build(self, input_shape):
<ide> input_tensor = input_shape[0] if len(input_shape) == 2 else input_shape
<ide> input_tensor_shape = tf.TensorShape(input_tensor)
<del> if len(input_tensor_shape) != 3:
<add> if len(input_tensor_shape.as_list()) != 3:
<ide> raise ValueError(
<ide> "TransformerScaffold expects a three-dimensional input of "
<ide> "shape [batch, sequence, width].")
<ide><path>official/nlp/modeling/layers/transformer_scaffold_test.py
<ide> # ==============================================================================
<ide> """Tests for Keras-based transformer block layer."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del>from __future__ import print_function
<del>
<ide> import json
<ide>
<ide> import numpy as np
<ide><path>official/nlp/modeling/layers/transformer_test.py
<ide> # ==============================================================================
<ide> """Tests for Keras-based transformer block layer."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del>from __future__ import print_function
<del>
<ide> from absl.testing import parameterized
<ide> import numpy as np
<ide> import tensorflow as tf
<ide><path>official/nlp/modeling/layers/util.py
<ide> # ==============================================================================
<ide> """Keras-based transformer block layer."""
<ide>
<del>from __future__ import absolute_import
<del>from __future__ import division
<del># from __future__ import google_type_annotations
<del>from __future__ import print_function
<del>
<ide> import functools
<ide>
<ide> import tensorflow as tf | 27 |
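The bulk of this commit deletes `from __future__ import ...` boilerplate, which is safe because every one of those features became mandatory in Python 3.0 — under Python 3 the imports are redundant no-ops. (The remaining hunks also wrap `TensorShape` in `.as_list()` before calling `len()`, which is a separate TensorFlow-specific guard.) A small self-contained check of the `__future__` claim:

```python
import __future__
import sys

# Under Python 3 these features are always on, so the boilerplate the
# patch deletes is redundant rather than behavior-changing.
assert sys.version_info[0] >= 3
for name in ("division", "print_function", "absolute_import"):
    feature = getattr(__future__, name)
    # Each _Feature records the release where it became mandatory:
    # all three became mandatory in the 3.0 line.
    assert feature.mandatory[0] == 3

assert 7 / 2 == 3.5   # true division is the Python 3 default
```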
Ruby | Ruby | use full path to sudo | d39364b70a451a0a0bbd5c5abd1bca2371173c72 | <ide><path>install_homebrew.rb
<ide> def system *args
<ide>
<ide> def sudo *args
<ide> args = if args.length > 1
<del> args.unshift "sudo"
<add> args.unshift "/usr/bin/sudo"
<ide> else
<del> "sudo #{args}"
<add> "/usr/bin/sudo #{args}"
<ide> end
<ide> ohai *args
<ide> system *args | 1 |
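The change above hardcodes `/usr/bin/sudo` instead of relying on `PATH` lookup, so an installer running in an untrusted environment cannot be tricked into executing a different `sudo` earlier on the search path. A hedged Python sketch of the same idea — the path and helper name are assumptions for illustration, not part of the original script:

```python
def privileged_command(args, sudo_path="/usr/bin/sudo"):
    """Prefix a command with an absolute-path sudo, so the binary that
    runs does not depend on whatever $PATH the caller inherited.
    Mirrors the Ruby patch: list args get unshifted, a plain string
    gets string-prefixed."""
    if isinstance(args, (list, tuple)):
        return [sudo_path, *args]
    return f"{sudo_path} {args}"


assert privileged_command(["mkdir", "/opt/x"]) == ["/usr/bin/sudo", "mkdir", "/opt/x"]
assert privileged_command("chown root /opt/x") == "/usr/bin/sudo chown root /opt/x"
```

A real installer might instead resolve the path at startup (and fail loudly if the binary is missing) rather than assuming one fixed location.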
PHP | PHP | return response from mailgun | 11b0550b344fea355d7802f2dfaea01c40d2ecdb | <ide><path>src/Illuminate/Mail/Transport/MailgunTransport.php
<ide> public function send(Swift_Mime_Message $message, &$failedRecipients = null)
<ide> {
<ide> $client = $this->getHttpClient();
<ide>
<del> $client->post($this->url, ['auth' => ['api', $this->key],
<add> return $client->post($this->url, ['auth' => ['api', $this->key],
<ide> 'body' => [
<ide> 'to' => $this->getTo($message),
<ide> 'message' => new PostFile('message', (string) $message), | 1 |
PHP | PHP | check ioc for custom connector instances | 19f37ca715808aee7628774a50d19b8939f47945 | <ide><path>src/Illuminate/Database/Connectors/ConnectionFactory.php
<ide> public function createConnector(array $config)
<ide> throw new \InvalidArgumentException("A driver must be specified.");
<ide> }
<ide>
<add> if ($this->container->bound($key = "db.connector.{$config['driver']}"))
<add> {
<add> return $this->container->make($key);
<add> }
<add>
<ide> switch ($config['driver'])
<ide> {
<ide> case 'mysql': | 1 |
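This commit makes the connection factory consult the IoC container for a binding named `db.connector.{driver}` before falling back to the built-in `switch`. That lets applications override or add drivers without subclassing the factory. A minimal Python sketch of the resolution order (container API and names are simplified stand-ins, not Laravel's actual classes):

```python
class Container:
    """Tiny stand-in for an IoC container: bind a factory under a key,
    later resolve it with make()."""

    def __init__(self):
        self._bindings = {}

    def bind(self, key, factory):
        self._bindings[key] = factory

    def bound(self, key):
        return key in self._bindings

    def make(self, key):
        return self._bindings[key]()


class ConnectionFactory:
    """Mirrors the patch: a user-supplied 'db.connector.<driver>'
    binding wins over the built-in connectors."""

    def __init__(self, container):
        self.container = container
        self._builtin = {"mysql": lambda: "builtin-mysql-connector"}

    def create_connector(self, config):
        driver = config.get("driver")
        if driver is None:
            raise ValueError("A driver must be specified.")
        key = f"db.connector.{driver}"
        if self.container.bound(key):          # custom binding first
            return self.container.make(key)
        if driver in self._builtin:            # then the built-in switch
            return self._builtin[driver]()
        raise ValueError(f"Unsupported driver [{driver}]")


container = Container()
factory = ConnectionFactory(container)
assert factory.create_connector({"driver": "mysql"}) == "builtin-mysql-connector"
container.bind("db.connector.mysql", lambda: "custom-mysql-connector")
assert factory.create_connector({"driver": "mysql"}) == "custom-mysql-connector"
```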
Text | Text | adjust code sample for stream.finished | f185990738ca6eb781328bfec65c416b5415d1fc | <ide><path>doc/api/stream.md
<ide> If this is unwanted behavior then the returned cleanup function needs to be
<ide> invoked in the callback:
<ide>
<ide> ```js
<del>const cleanup = finished(...streams, (err) => {
<add>const cleanup = finished(rs, (err) => {
<ide> cleanup();
<ide> // ...
<ide> }); | 1 |
PHP | PHP | add option shortcut 'r' for resource. | 03e44e6406dd0c8cf9de94e1ed58db586858bc24 | <ide><path>src/Illuminate/Routing/Console/ControllerMakeCommand.php
<ide> protected function getDefaultNamespace($rootNamespace)
<ide> protected function getOptions()
<ide> {
<ide> return [
<del> ['resource', null, InputOption::VALUE_NONE, 'Generate a resource controller class.'],
<add> ['resource', 'r', InputOption::VALUE_NONE, 'Generate a resource controller class.'],
<ide> ];
<ide> }
<ide> | 1 |
Ruby | Ruby | add assertion messages | a363099ce19291d786a31342b6efc528b0cf296d | <ide><path>activerecord/test/cases/adapters/postgresql/infinity_test.rb
<ide> class PostgresqlInfinity < ActiveRecord::Base
<ide> record = PostgresqlInfinity.new(float: "-Infinity")
<ide> assert_equal(-Float::INFINITY, record.float)
<ide> record = PostgresqlInfinity.new(float: "NaN")
<del> assert record.float.nan?
<add> assert record.float.nan?, "Expected #{record.float} to be NaN"
<ide> end
<ide>
<ide> test "update_all with infinity on a float column" do
<ide><path>activerecord/test/cases/adapters/postgresql/numbers_test.rb
<ide> def test_values
<ide> assert_equal 123456.789, first.double
<ide> assert_equal(-::Float::INFINITY, second.single)
<ide> assert_equal ::Float::INFINITY, second.double
<del> assert third.double.nan?
<add> assert third.double.nan?, "Expected #{third.double} to be NaN"
<ide> end
<ide>
<ide> def test_update | 2 |
Text | Text | fix small issues with changelog-7.xmd | 20ff6a5db22f6cc8f971a715f09ee65e11bf4adc | <ide><path>CHANGELOG-7.x.md
<ide> - RefreshDatabase migration commands parameters moved to methods ([#34007](https://github.com/laravel/framework/pull/34007), [8b35c8e](https://github.com/laravel/framework/commit/8b35c8e6ba5879e71fd81fd03b5687ee2b46c55a), [256f71c](https://github.com/laravel/framework/commit/256f71c1f81da2d4bb3e327b18389ac43fa97a72))
<ide>
<ide> ### Changed
<del>- allow to reset forced scheme and root-url in UrlGenerator ([#34039](https://github.com/laravel/framework/pull/34039))
<add>- Allow to reset forced scheme and root-url in UrlGenerator ([#34039](https://github.com/laravel/framework/pull/34039))
<ide> - Updating the make commands to use a custom views path ([#34060](https://github.com/laravel/framework/pull/34060), [b593c62](https://github.com/laravel/framework/commit/b593c6242942623fcc12638d0390da7c58dbbb11))
<ide> - Using "public static property" in View Component causes an error ([#34058](https://github.com/laravel/framework/pull/34058))
<ide> - Changed postgres processor ([#34055](https://github.com/laravel/framework/pull/34055))
<ide>
<ide> ### Fixed
<ide> - Fixed `Illuminate\Support\Arr::query()` ([c6f9ae2](https://github.com/laravel/framework/commit/c6f9ae2b6fdc3c1716938223de731b97f6a5a255))
<del>- Dont allow mass filling with table names ([9240404](https://github.com/laravel/framework/commit/9240404b22ef6f9e827577b3753e4713ddce7471), [f5fa6e3](https://github.com/laravel/framework/commit/f5fa6e3a0fbf9a93eab45b9ae73265b4dbfc3ad7))
<add>- Don't allow mass filling with table names ([9240404](https://github.com/laravel/framework/commit/9240404b22ef6f9e827577b3753e4713ddce7471), [f5fa6e3](https://github.com/laravel/framework/commit/f5fa6e3a0fbf9a93eab45b9ae73265b4dbfc3ad7))
<ide>
<ide>
<ide> ## [v7.23.1 (2020-08-06)](https://github.com/laravel/framework/compare/v7.23.0...v7.23.1)
<ide>
<ide> ### Fixed
<ide> - Prevent usage of get*AtColumn() when model has no timestamps ([#33634](https://github.com/laravel/framework/pull/33634))
<del>- Dont decrement transaction below 0 in `Illuminate\Database\Concerns\ManagesTransactions::handleCommitTransactionException()` ([7681795](https://github.com/laravel/framework/commit/768179578e5492b5f80c391bd43b233938e16e27))
<add>- Don't decrement transaction below 0 in `Illuminate\Database\Concerns\ManagesTransactions::handleCommitTransactionException()` ([7681795](https://github.com/laravel/framework/commit/768179578e5492b5f80c391bd43b233938e16e27))
<ide> - Fixed transaction problems on closure transaction ([c4cdfc7](https://github.com/laravel/framework/commit/c4cdfc7c54127b772ef10f37cfc9ef8e9d6b3227))
<ide> - Prevent to serialize uninitialized properties ([#33644](https://github.com/laravel/framework/pull/33644))
<ide> - Fixed missing statement preventing deletion in `Illuminate\Database\Eloquent\Relations\MorphPivot::delete()` ([#33648](https://github.com/laravel/framework/pull/33648))
<ide> - Fixed `Illuminate\Cache\ArrayStore::increment()` bug that changes expiration to forever ([#32875](https://github.com/laravel/framework/pull/32875))
<ide>
<ide> ### Changed
<del>- Dont cache non objects in `Illuminate/Database/Eloquent/Concerns/HasAttributes::getClassCastableAttributeValue()` ([894fe22](https://github.com/laravel/framework/commit/894fe22c6c111b224de5bada24dcbba4c93f0305))
<add>- Don't cache non objects in `Illuminate/Database/Eloquent/Concerns/HasAttributes::getClassCastableAttributeValue()` ([894fe22](https://github.com/laravel/framework/commit/894fe22c6c111b224de5bada24dcbba4c93f0305))
<ide> - Added explicit `symfony/polyfill-php73` dependency ([5796b1e](https://github.com/laravel/framework/commit/5796b1e43dfe14914050a7e5dd24ddf803ec99b8))
<ide> - Set `Cache\FileStore` file permissions only once ([#32845](https://github.com/laravel/framework/pull/32845), [11c533b](https://github.com/laravel/framework/commit/11c533b9aa062f4cba1dd0fe3673bf33d275480f))
<ide> - Added alias as key of package's view components ([#32863](https://github.com/laravel/framework/pull/32863))
<ide> - Fixed `trim` of the prefix in the `CompiledRouteCollection::newRoute()` ([ce0355c](https://github.com/laravel/framework/commit/ce0355c72bf4defb93ae80c7bf7812bd6532031a), [b842c65](https://github.com/laravel/framework/commit/b842c65ecfe1ea7839d61a46b177b6b5887fd4d2))
<ide>
<ide> ### Changed
<del>- remove comments before compiling components in the `BladeCompiler` ([2964d2d](https://github.com/laravel/framework/commit/2964d2dfd3cc50f7a709effee0af671c86587915))
<add>- Remove comments before compiling components in the `BladeCompiler` ([2964d2d](https://github.com/laravel/framework/commit/2964d2dfd3cc50f7a709effee0af671c86587915))
<ide>
<ide>
<ide> ## [v7.0.1 (2020-03-03)](https://github.com/laravel/framework/compare/v7.0.0...v7.0.1) | 1 |
Text | Text | avoid _may_ in collaborator guide | 8640cd6066b858f7499d175b18dfc1947653063f | <ide><path>doc/guides/collaborator-guide.md
<ide> as a _Contributor_. Ask if they have configured their git
<ide>
<ide> ### Closing issues and pull requests
<ide>
<del>Collaborators may close any issue or pull request that is not relevant to the
<add>Collaborators can close any issue or pull request that is not relevant to the
<ide> future of the Node.js project. Where this is unclear, leave the issue or pull
<ide> request open for several days to allow for discussion. Where this does not yield
<ide> evidence that the issue or pull request has relevance, close it. Remember that
<ide> for the change.
<ide>
<ide> Approval must be from Collaborators who are not authors of the change.
<ide>
<del>In some cases, it may be necessary to summon a GitHub team to a pull request for
<del>review by @-mention.
<add>In some cases, it might be necessary to summon a GitHub team to a pull request
<add>for review by @-mention.
<ide> See [Who to CC in the issue tracker](#who-to-cc-in-the-issue-tracker).
<ide>
<ide> If you are the first Collaborator to approve a pull request that has no CI yet,
<ide> pull request creator pushed new code since the last CI run.
<ide>
<ide> ### Consensus seeking
<ide>
<del>A pull request may land if it has the needed [approvals](#code-reviews),
<add>A pull request can land if it has the needed [approvals](#code-reviews),
<ide> [CI](#testing-and-ci), [wait time](#waiting-for-approvals) and no
<ide> [outstanding objections](#objections). [Breaking changes](#breaking-changes)
<ide> must receive [TSC review](#involving-the-tsc) in addition to other
<ide> requirements. If a pull request meets all requirements except the
<ide>
<ide> #### Objections
<ide>
<del>**Collaborators may object to a pull request by using the "Request
<add>**Collaborators can object to a pull request by using the "Request
<ide> Changes" GitHub feature**. Dissent comments alone don't constitute an
<ide> objection. **Any PR objection must include a clear reason for that objection,
<ide> and the objector must remain responsive for further discussion towards
<ide> consensus about the direction of the pull request**. Providing a set of
<ide> actionable steps alongside the objection is recommended.
<ide>
<del>If the objection is not clear to others, another collaborator may ask an
<add>If the objection is not clear to others, another collaborator can ask an
<ide> objecting collaborator to explain their objection or to provide actionable
<ide> steps to resolve the objection. **If the objector is unresponsive for seven
<del>days after a collaborator asks for clarification, another collaborator may
<add>days after a collaborator asks for clarification, another collaborator can
<ide> dismiss the objection**.
<ide>
<ide> **Pull requests with outstanding objections must remain open until all
<ide> objections are satisfied**. If reaching consensus is not possible, a
<del>collaborator may escalate the issue to the TSC by pinging `@nodejs/tsc` and
<add>collaborator can escalate the issue to the TSC by pinging `@nodejs/tsc` and
<ide> adding the `tsc-agenda` label to the issue.
<ide>
<ide> #### Helpful resources
<ide> adding the `tsc-agenda` label to the issue.
<ide> ### Waiting for approvals
<ide>
<ide> Before landing pull requests, allow 48 hours for input from other Collaborators.
<del>Certain types of pull requests can be fast-tracked and may land after a shorter
<add>Certain types of pull requests can be fast-tracked and can land after a shorter
<ide> delay. For example:
<ide>
<ide> * Focused changes that affect only documentation and/or the test suite:
<ide> * `code-and-learn` tasks often fall into this category.
<del> * `good-first-issue` pull requests may also be suitable.
<add> * `good-first-issue` pull requests might also be suitable.
<ide> * Changes that fix regressions:
<ide> * Regressions that break the workflow (red CI or broken compilation).
<ide> * Regressions that happen right before a release, or reported soon after.
<ide>
<ide> To propose fast-tracking a pull request, apply the `fast-track` label. Then add
<del>a comment that Collaborators may upvote.
<add>a comment that Collaborators can upvote.
<ide>
<ide> If someone disagrees with the fast-tracking request, remove the label. Do not
<ide> fast-track the pull request in that case.
<ide>
<del>The pull request may be fast-tracked if two Collaborators approve the
<add>The pull request can be fast-tracked if two Collaborators approve the
<ide> fast-tracking request. To land, the pull request itself still needs two
<ide> Collaborator approvals and a passing CI.
<ide>
<del>Collaborators may request fast-tracking of pull requests they did not author.
<add>Collaborators can request fast-tracking of pull requests they did not author.
<ide> In that case only, the request itself is also one fast-track approval. Upvote
<ide> the comment anyway to avoid any doubt.
<ide>
<ide> For more information, see [Deprecations](#deprecations).
<ide>
<ide> #### Breaking changes to internal elements
<ide>
<del>Breaking changes to internal elements may occur in semver-patch or semver-minor
<add>Breaking changes to internal elements can occur in semver-patch or semver-minor
<ide> commits. Take significant care when making and reviewing such changes. Make
<ide> an effort to determine the potential impact of the change in the ecosystem. Use
<ide> [Canary in the Goldmine](https://github.com/nodejs/citgm) to test such changes.
<ide> providing a Public API in such cases.
<ide> #### Unintended breaking changes
<ide>
<ide> Sometimes, a change intended to be non-breaking turns out to be a breaking
<del>change. If such a change lands on the master branch, a Collaborator may revert
<del>it. As an alternative to reverting, the TSC may apply the semver-major label
<add>change. If such a change lands on the master branch, a Collaborator can revert
<add>it. As an alternative to reverting, the TSC can apply the semver-major label
<ide> after-the-fact.
<ide>
<ide> ##### Reverting commits
<ide>
<ide> Revert commits with `git revert <HASH>` or `git revert <FROM>..<TO>`. The
<del>generated commit message will not have a subsystem and may violate line length
<add>generated commit message will not have a subsystem and might violate line length
<ide> rules. That is OK. Append the reason for the revert and any `Refs` or `Fixes`
<ide> metadata. Raise a pull request like any other change.
<ide>
<ide> documentation must state the deprecation status.
<ide> * There are no functional changes.
<ide> * By default, there will be no warnings emitted for such deprecations at
<ide> runtime.
<del> * May cause a runtime warning with the [`--pending-deprecation`][] flag or
<add> * Might cause a runtime warning with the [`--pending-deprecation`][] flag or
<ide> `NODE_PENDING_DEPRECATION` environment variable.
<ide>
<ide> * Runtime Deprecation
<ide> documentation must state the deprecation status.
<ide>
<ide> * End-of-Life
<ide> * The API is no longer subject to the semantic versioning rules.
<del> * Backward-incompatible changes including complete removal of such APIs may
<add> * Backward-incompatible changes including complete removal of such APIs can
<ide> occur at any time.
<ide>
<ide> Apply the `notable change` label to all pull requests that introduce
<ide> Documentation-Only Deprecations. Such deprecations have no impact on code
<ide> execution. Thus, they are not breaking changes (`semver-major`).
<ide>
<ide> Runtime Deprecations and End-of-Life APIs (internal or public) are breaking
<del>changes (`semver-major`). The TSC may make exceptions, deciding that one of
<add>changes (`semver-major`). The TSC can make exceptions, deciding that one of
<ide> these deprecations is not a breaking change.
<ide>
<ide> Avoid Runtime Deprecations when an alias or a stub/no-op will suffice. An alias
<ide> example, due to removal of an End-of-Life deprecated API).
<ide>
<ide> <a id="deprecation-cycle"></a>
<ide> A _deprecation cycle_ is a major release during which an API has been in one of
<del>the three Deprecation levels. Documentation-Only Deprecations may land in a
<del>minor release. They may not change to a Runtime Deprecation until the next major
<add>the three Deprecation levels. Documentation-Only Deprecations can land in a
<add>minor release. They can not change to a Runtime Deprecation until the next major
<ide> release.
<ide>
<ide> No API can change to End-of-Life without going through a Runtime Deprecation
<ide> cycle. There is no rule that deprecated code must progress to End-of-Life.
<del>Documentation-Only and Runtime Deprecations may remain in place for an unlimited
<add>Documentation-Only and Runtime Deprecations can remain in place for an unlimited
<ide> duration.
<ide>
<ide> Communicate pending deprecations and associated mitigations with the ecosystem
<ide> deprecation level of an API.
<ide>
<ide> ### Involving the TSC
<ide>
<del>Collaborators may opt to elevate pull requests or issues to the [TSC][].
<add>Collaborators can opt to elevate pull requests or issues to the [TSC][].
<ide> Do this if a pull request or issue:
<ide>
<ide> * Is labeled `semver-major`, or
<ide> code. If you wish to create the token yourself in advance, see
<ide>
<ide> ### Technical HOWTO
<ide>
<del>Clear any `am`/`rebase` that may already be underway:
<add>Clear any `am`/`rebase` that might already be underway:
<ide>
<ide> ```text
<ide> $ git am --abort
<ide> for that commit. This is an opportunity to fix commit messages.
<ide> request. This makes it easy to trace a commit back to the conversation that
<ide> led up to that change.
<ide> * Optional: A `Fixes: X` line, where _X_ is the full GitHub URL for an
<del> issue. A commit message may include more than one `Fixes:` lines.
<add> issue. A commit message can include more than one `Fixes:` lines.
<ide> * Optional: One or more `Refs:` lines referencing a URL for any relevant
<ide> background.
<ide> * Required: A `Reviewed-By: Name <email>` line for each Collaborator who
<ide> for that commit. This is an opportunity to fix commit messages.
<ide> pull request.
<ide> * Protects against the assumption that GitHub will be around forever.
<ide>
<del>Other changes may have landed on master since the successful CI run. As a
<add>Other changes might have landed on master since the successful CI run. As a
<ide> precaution, run tests (`make -j4 test` or `vcbuild test`).
<ide>
<ide> Confirm that the commit message format is correct using
<ide> more than one commit.
<ide>
<ide> ### Troubleshooting
<ide>
<del>Sometimes, when running `git push upstream master`, you may get an error message
<del>like this:
<add>Sometimes, when running `git push upstream master`, you might get an error
<add>message like this:
<ide>
<ide> ```console
<ide> To https://github.com/nodejs/node
<ide> corresponding staging branch (v10.x-staging, v8.x-staging, etc.).
<ide> Commits that land on master are cherry-picked to each staging branch as
<ide> appropriate. If a change applies only to the LTS branch, open the PR against the
<ide> *staging* branch. Commits from the staging branch land on the LTS branch only
<del>when a release is being prepared. They may land on the LTS branch in a different
<add>when a release is being prepared. They can land on the LTS branch in a different
<ide> order than they were in staging.
<ide>
<ide> Only members of @nodejs/backporters should land commits onto LTS staging
<ide> Likewise, as commits land in an LTS release, the releaser removes the `land-on-`
<ide> label.
<ide>
<ide> Attach the appropriate `lts-watch-` label to any pull request that
<del>may impact an LTS release.
<add>might impact an LTS release.
<ide>
<ide> ## Who to CC in the issue tracker
<ide>
<ide> may impact an LTS release.
<ide> When things need extra attention, are controversial, or `semver-major`:
<ide> @nodejs/tsc
<ide>
<del>If you cannot find who to cc for a file, `git shortlog -n -s <file>` may help.
<add>If you cannot find who to cc for a file, `git shortlog -n -s <file>` can help.
<ide>
<ide> ["Merge Pull Request"]: https://help.github.com/articles/merging-a-pull-request/#merging-a-pull-request-on-github
<ide> [Deprecation]: https://en.wikipedia.org/wiki/Deprecation | 1 |
Go | Go | remove run race condition | 2b6ca3872883dcb487d8a39a1a8530be6a62f947 | <ide><path>commands.go
<ide> func (cli *DockerCli) CmdRun(args ...string) error {
<ide> fmt.Fprintln(os.Stderr, "WARNING: ", warning)
<ide> }
<ide>
<del> splitStderr := !config.Tty
<del>
<del> connections := 0
<del> if config.AttachStdin || config.AttachStdout || (!splitStderr && config.AttachStderr) {
<del> connections += 1
<del> }
<del> if splitStderr && config.AttachStderr {
<del> connections += 1
<del> }
<del>
<ide> //start the container
<ide> _, _, err = cli.call("POST", "/containers/"+out.ID+"/start", nil)
<ide> if err != nil {
<ide> func (cli *DockerCli) CmdRun(args ...string) error {
<ide>
<ide> if !config.AttachStdout && !config.AttachStderr {
<ide> fmt.Println(out.ID)
<del> }
<del> if connections > 0 {
<del> chErrors := make(chan error, connections)
<add> } else {
<add> chErrors := make(chan error)
<ide> if config.Tty {
<ide> cli.monitorTtySize(out.ID)
<ide> }
<ide>
<del> if splitStderr && config.AttachStderr {
<del> go func() {
<del> chErrors <- cli.hijack("POST", "/containers/"+out.ID+"/attach?logs=1&stream=1&stderr=1", config.Tty, nil, os.Stderr)
<del> }()
<del> }
<del>
<ide> v := url.Values{}
<ide> v.Set("logs", "1")
<ide> v.Set("stream", "1")
<ide> func (cli *DockerCli) CmdRun(args ...string) error {
<ide> if config.AttachStdout {
<ide> v.Set("stdout", "1")
<ide> }
<del> if !splitStderr && config.AttachStderr {
<add> if config.AttachStderr {
<ide> v.Set("stderr", "1")
<ide> }
<ide> go func() {
<ide> chErrors <- cli.hijack("POST", "/containers/"+out.ID+"/attach?"+v.Encode(), config.Tty, os.Stdin, os.Stdout)
<ide> }()
<del> for connections > 0 {
<del> err := <-chErrors
<del> if err != nil {
<del> utils.Debugf("Error hijack: %s", err)
<del> return err
<del> }
<del> connections -= 1
<add> if err := <-chErrors; err != nil {
<add> utils.Debugf("Error hijack: %s", err)
<add> return err
<ide> }
<ide> }
<ide> return nil | 1 |
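The fix above collapses a counted pool of attach connections into a single goroutine reporting over one channel, so the caller simply blocks on one receive instead of draining `connections` results. The same one-worker, one-hand-off shape can be sketched outside Go; the names `attach` and `run_attached` below are illustrative, not part of Docker's code:

```python
import queue
import threading

def run_attached(attach, url):
    """Start one attach worker and block until its single result arrives.

    Mirrors the simplification in the patch: no connection counter,
    just one synchronous hand-off. Assumes `attach` returns an error
    value (or None) rather than raising.
    """
    errors = queue.Queue()
    worker = threading.Thread(target=lambda: errors.put(attach(url)))
    worker.start()
    err = errors.get()  # blocks until the lone worker reports
    worker.join()
    return err

# A stand-in attach that "succeeds" by returning None:
result = run_attached(lambda url: None, "/containers/abc/attach")
```

An unbuffered channel in Go and a blocking `Queue.get()` in Python both guarantee the caller waits for exactly one result, which is what removes the race-prone bookkeeping the old code needed.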
Text | Text | update welcome aboard text in guides [ci skip] | bb1f556a9a1e3e097bcfed5096dd921053769564 | <ide><path>guides/source/getting_started.md
<ide> This will fire up Puma, a web server distributed with Rails by default. To see
<ide> your application in action, open a browser window and navigate to
<ide> <http://localhost:3000>. You should see the Rails default information page:
<ide>
<del>
<add>
<ide>
<ide> TIP: To stop the web server, hit Ctrl+C in the terminal window where it's
<ide> running. To verify the server has stopped you should see your command prompt
<ide> dollar sign `$`. In development mode, Rails does not generally require you to
<ide> restart the server; changes you make in files will be automatically picked up by
<ide> the server.
<ide>
<del>The "Welcome aboard" page is the _smoke test_ for a new Rails application: it
<del>makes sure that you have your software configured correctly enough to serve a
<del>page.
<add>The "Yay! You're on Rails!" page is the _smoke test_ for a new Rails
<add>application: it makes sure that you have your software configured correctly
<add>enough to serve a page.
<ide>
<ide> ### Say "Hello", Rails
<ide>
<ide> of code:
<ide> Now that we have made the controller and view, we need to tell Rails when we
<ide> want "Hello, Rails!" to show up. In our case, we want it to show up when we
<ide> navigate to the root URL of our site, <http://localhost:3000>. At the moment,
<del>"Welcome aboard" is occupying that spot.
<add>"Yay! You're on Rails!" is occupying that spot.
<ide>
<ide> Next, you have to tell Rails where your actual home page is located.
<ide> | 1 |
Python | Python | fix error on test | c68f188eb035ed67e2df905dd5e483f0261a8ace | <ide><path>spacy/tests/tokenizer/test_customized_tokenizer.py
<ide> # coding: utf-8
<ide> from __future__ import unicode_literals
<ide>
<del>from ...lang.en import English
<add>from ...en import English
<ide> from ...tokenizer import Tokenizer
<ide> from ... import util
<ide> | 1 |
PHP | PHP | return whatever the mailer returns | 76c3d4cc2697c64d02e0e2239274ebf84656b857 | <ide><path>src/Illuminate/Mail/Mailable.php
<ide> class Mailable implements MailableContract, Renderable
<ide> */
<ide> public function send(MailerContract $mailer)
<ide> {
<del> $this->withLocale($this->locale, function () use ($mailer) {
<add> return $this->withLocale($this->locale, function () use ($mailer) {
<ide> Container::getInstance()->call([$this, 'build']);
<ide>
<del> $mailer->send($this->buildView(), $this->buildViewData(), function ($message) {
<add> return $mailer->send($this->buildView(), $this->buildViewData(), function ($message) {
<ide> $this->buildFrom($message)
<ide> ->buildRecipients($message)
<ide> ->buildSubject($message) | 1 |
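The change above is the classic "forward the callback's return value" fix: both `withLocale` and `send` must `return` what the inner closure produces, otherwise the mailer's result is silently discarded. A minimal sketch of the same shape — the `Mailable`/`with_locale` names here are hypothetical stand-ins, not Laravel's actual API:

```python
class Mailable:
    """Sketch of the fix: every wrapper in the chain forwards the
    callback's result instead of dropping it."""

    def __init__(self, locale=None):
        self.locale = locale

    def with_locale(self, locale, callback):
        # Imagine switching the app locale here and restoring it after.
        return callback()  # the fix: return, don't just invoke

    def send(self, mailer):
        # `send` returns whatever the mailer returns, per the patch.
        return self.with_locale(self.locale, lambda: mailer("view", {}))

sent = Mailable().send(lambda view, data: {"view": view, "sent": True})
```

Without the two `return`s, `sent` would be `None` and callers could never inspect the mailer's outcome.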
Javascript | Javascript | clarify error when passing class to render() | 895fab782bc565bc1b8de434e00e1d50bd291279 | <ide><path>src/renderers/dom/client/ReactMount.js
<ide> var ReactMount = {
<ide> 'ReactDOM.render(): Invalid component element.%s',
<ide> (
<ide> typeof nextElement === 'string' ?
<del> ' Instead of passing an element string, make sure to instantiate ' +
<del> 'it by passing it to React.createElement.' :
<add> ' Instead of passing a string like \'div\', pass ' +
<add> 'React.createElement(\'div\') or <div />.' :
<ide> typeof nextElement === 'function' ?
<del> ' Instead of passing a component class, make sure to instantiate ' +
<del> 'it by passing it to React.createElement.' :
<add> ' Instead of passing a class like Foo, pass ' +
<add> 'React.createElement(Foo) or <Foo />.' :
<ide> // Check if it quacks like an element
<ide> nextElement != null && nextElement.props !== undefined ?
<ide> ' This may be caused by unintentionally loading two independent ' +
<ide><path>src/renderers/dom/client/__tests__/ReactMount-test.js
<ide> describe('ReactMount', function() {
<ide> expect(function() {
<ide> ReactTestUtils.renderIntoDocument('div');
<ide> }).toThrow(
<del> 'ReactDOM.render(): Invalid component element. Instead of passing an ' +
<del> 'element string, make sure to instantiate it by passing it to ' +
<del> 'React.createElement.'
<add> 'ReactDOM.render(): Invalid component element. Instead of passing a ' +
<add> 'string like \'div\', pass React.createElement(\'div\') or <div />.'
<ide> );
<ide> });
<ide>
<ide> describe('ReactMount', function() {
<ide> ReactTestUtils.renderIntoDocument(Component);
<ide> }).toThrow(
<ide> 'ReactDOM.render(): Invalid component element. Instead of passing a ' +
<del> 'component class, make sure to instantiate it by passing it to ' +
<del> 'React.createElement.'
<add> 'class like Foo, pass React.createElement(Foo) or <Foo />.'
<ide> );
<ide> });
<ide> | 2 |
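The patch above tailors the error hint to the kind of invalid argument received: a tag-name string versus an uninstantiated component class. The selection logic can be sketched as follows — a hypothetical helper, not React's internals, with Python's `callable` standing in for JavaScript's `typeof x === 'function'` check:

```python
def describe_invalid_element(next_element):
    """Pick an error hint based on what the caller actually passed."""
    if isinstance(next_element, str):
        hint = (" Instead of passing a string like 'div', "
                "pass React.createElement('div') or <div />.")
    elif callable(next_element):
        # Classes are callable, like component classes/functions in JS.
        hint = (" Instead of passing a class like Foo, "
                "pass React.createElement(Foo) or <Foo />.")
    else:
        hint = ""
    return "ReactDOM.render(): Invalid component element." + hint
```

Concrete, actionable hints like these ("pass `<div />`, not `'div'`") are what the commit replaces the vaguer "make sure to instantiate it" phrasing with.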
Python | Python | add shape inference to existing layers | 35d66d672b2bdb5048b39153c60070c5c1f05fc4 | <ide><path>keras/layers/convolutional.py
<ide>
<ide>
<ide> def conv_output_length(input_length, filter_size, border_mode, stride):
<add> if input_length is None:
<add> return None
<ide> assert border_mode in {'same', 'full', 'valid'}
<ide> if border_mode == 'same':
<ide> output_length = input_length
<ide> def conv_output_length(input_length, filter_size, border_mode, stride):
<ide>
<ide>
<ide> def pool_output_length(input_length, pool_size, ignore_border, stride):
<add> if input_length is None:
<add> return None
<ide> if ignore_border:
<ide> output_length = input_length - pool_size + 1
<ide> output_length = (output_length + stride - 1) // stride
<ide> def __init__(self, input_dim, nb_filter, filter_length,
<ide>
<ide> if border_mode not in {'valid', 'full', 'same'}:
<ide> raise Exception('Invalid border mode for Convolution1D:', border_mode)
<del>
<del> super(Convolution1D, self).__init__(**kwargs)
<ide> self.nb_filter = nb_filter
<del> self.input_dim = input_dim
<ide> self.filter_length = filter_length
<del> self.subsample_length = subsample_length
<ide> self.init = initializations.get(init)
<ide> self.activation = activations.get(activation)
<del> self.subsample = (subsample_length, 1)
<ide> self.border_mode = border_mode
<add> self.subsample_length = subsample_length
<add>
<add> self.subsample = (subsample_length, 1)
<ide>
<add> self.W_regularizer = regularizers.get(W_regularizer)
<add> self.b_regularizer = regularizers.get(b_regularizer)
<add> self.activity_regularizer = regularizers.get(activity_regularizer)
<add>
<add> self.W_constraint = constraints.get(W_constraint)
<add> self.b_constraint = constraints.get(b_constraint)
<add> self.constraints = [self.W_constraint, self.b_constraint]
<add>
<add> self.initial_weights = weights
<add> super(Convolution1D, self).__init__(**kwargs)
<add>
<add> def build(self):
<add> input_dim = self.input_shape[2]
<ide> self.input = T.tensor3()
<ide> self.W_shape = (self.nb_filter, input_dim, self.filter_length, 1)
<ide> self.W = self.init(self.W_shape)
<ide> self.b = shared_zeros((self.nb_filter,))
<del>
<ide> self.params = [self.W, self.b]
<del>
<ide> self.regularizers = []
<ide>
<del> self.W_regularizer = regularizers.get(W_regularizer)
<ide> if self.W_regularizer:
<ide> self.W_regularizer.set_param(self.W)
<ide> self.regularizers.append(self.W_regularizer)
<ide>
<del> self.b_regularizer = regularizers.get(b_regularizer)
<ide> if self.b_regularizer:
<ide> self.b_regularizer.set_param(self.b)
<ide> self.regularizers.append(self.b_regularizer)
<ide>
<del> self.activity_regularizer = regularizers.get(activity_regularizer)
<ide> if self.activity_regularizer:
<ide> self.activity_regularizer.set_layer(self)
<ide> self.regularizers.append(self.activity_regularizer)
<ide>
<del> self.W_constraint = constraints.get(W_constraint)
<del> self.b_constraint = constraints.get(b_constraint)
<del> self.constraints = [self.W_constraint, self.b_constraint]
<del>
<del> if weights is not None:
<del> self.set_weights(weights)
<add> if self.initial_weights is not None:
<add> self.set_weights(self.initial_weights)
<add> del self.initial_weights
<ide>
<ide> @property
<ide> def output_shape(self):
<ide> def get_output(self, train=False):
<ide>
<ide> def get_config(self):
<ide> return {"name": self.__class__.__name__,
<del> "input_dim": self.input_dim,
<ide> "nb_filter": self.nb_filter,
<ide> "filter_length": self.filter_length,
<ide> "init": self.init.__name__,
<ide> def get_config(self):
<ide> class Convolution2D(Layer):
<ide> input_ndim = 4
<ide>
<del> def __init__(self, nb_filter, stack_size, nb_row, nb_col,
<add> def __init__(self, nb_filter, nb_row, nb_col,
<ide> init='glorot_uniform', activation='linear', weights=None,
<ide> border_mode='valid', subsample=(1, 1),
<ide> W_regularizer=None, b_regularizer=None, activity_regularizer=None,
<ide> W_constraint=None, b_constraint=None, **kwargs):
<ide>
<ide> if border_mode not in {'valid', 'full', 'same'}:
<ide> raise Exception('Invalid border mode for Convolution2D:', border_mode)
<del>
<del> super(Convolution2D, self).__init__(**kwargs)
<add> self.nb_filter = nb_filter
<add> self.nb_row = nb_row
<add> self.nb_col = nb_col
<ide> self.init = initializations.get(init)
<ide> self.activation = activations.get(activation)
<del> self.subsample = tuple(subsample)
<ide> self.border_mode = border_mode
<del> self.nb_filter = nb_filter
<del> self.stack_size = stack_size
<add> self.subsample = tuple(subsample)
<ide>
<del> self.nb_row = nb_row
<del> self.nb_col = nb_col
<add> self.W_regularizer = regularizers.get(W_regularizer)
<add> self.b_regularizer = regularizers.get(b_regularizer)
<add> self.activity_regularizer = regularizers.get(activity_regularizer)
<ide>
<add> self.W_constraint = constraints.get(W_constraint)
<add> self.b_constraint = constraints.get(b_constraint)
<add> self.constraints = [self.W_constraint, self.b_constraint]
<add>
<add> self.initial_weights = weights
<add> super(Convolution2D, self).__init__(**kwargs)
<add>
<add> def build(self):
<add> stack_size = self.input_shape[1]
<ide> self.input = T.tensor4()
<ide> self.W_shape = (self.nb_filter, stack_size, self.nb_row, self.nb_col)
<ide> self.W = self.init(self.W_shape)
<ide> self.b = shared_zeros((self.nb_filter,))
<del>
<ide> self.params = [self.W, self.b]
<del>
<ide> self.regularizers = []
<ide>
<del> self.W_regularizer = regularizers.get(W_regularizer)
<ide> if self.W_regularizer:
<ide> self.W_regularizer.set_param(self.W)
<ide> self.regularizers.append(self.W_regularizer)
<ide>
<del> self.b_regularizer = regularizers.get(b_regularizer)
<ide> if self.b_regularizer:
<ide> self.b_regularizer.set_param(self.b)
<ide> self.regularizers.append(self.b_regularizer)
<ide>
<del> self.activity_regularizer = regularizers.get(activity_regularizer)
<ide> if self.activity_regularizer:
<ide> self.activity_regularizer.set_layer(self)
<ide> self.regularizers.append(self.activity_regularizer)
<ide>
<del> self.W_constraint = constraints.get(W_constraint)
<del> self.b_constraint = constraints.get(b_constraint)
<del> self.constraints = [self.W_constraint, self.b_constraint]
<del>
<del> if weights is not None:
<del> self.set_weights(weights)
<add> if self.initial_weights is not None:
<add> self.set_weights(self.initial_weights)
<add> del self.initial_weights
<ide>
<ide> @property
<ide> def output_shape(self):
<ide> def get_output(self, train=False):
<ide> def get_config(self):
<ide> return {"name": self.__class__.__name__,
<ide> "nb_filter": self.nb_filter,
<del> "stack_size": self.stack_size,
<ide> "nb_row": self.nb_row,
<ide> "nb_col": self.nb_col,
<ide> "init": self.init.__name__,
<ide><path>keras/layers/core.py
<ide> def set_previous(self, layer, connection_map={}):
<ide> if not self.supports_masked_input() and layer.get_output_mask() is not None:
<ide> raise Exception("Cannot connect non-masking layer to layer with masked output")
<ide> self.previous = layer
<add> self.build()
<add>
<add> def build(self):
<add> '''Instantiation of layer weights.
<add>
<add> Called after `set_previous`, or after `set_input_shape`,
<add> once the layer has a defined input shape.
<add> Must be implemented on all layers that have weights.
<add> '''
<add> pass
<ide>
<ide> @property
<ide> def nb_input(self):
<ide> def input_shape(self):
<ide> elif hasattr(self, '_input_shape'):
<ide> return self._input_shape
<ide> else:
<del> raise NotImplementedError
<add> raise Exception('Layer is not connected.')
<ide>
<ide> def set_input_shape(self, input_shape):
<ide> if type(input_shape) not in [tuple, list]:
<ide> def set_input_shape(self, input_shape):
<ide> str(self.input_ndim) + ', was provided with input shape ' + str(input_shape))
<ide> self._input_shape = input_shape
<ide> self.input = ndim_tensor(len(self._input_shape))
<add> self.build()
<ide>
<ide> @property
<ide> def output_shape(self):
<ide> def get_output_mask(self, train=None):
<ide> return None
<ide>
<ide> def set_weights(self, weights):
<add> assert len(self.params) == len(weights), 'Provided weight array does not match layer weights.'
<ide> for p, w in zip(self.params, weights):
<ide> if p.eval().shape != w.shape:
<ide> raise Exception("Layer shape %s not compatible with weight shape %s." % (p.eval().shape, w.shape))
<ide> class Dense(Layer):
<ide> '''
<ide> input_ndim = 2
<ide>
<del> def __init__(self, input_dim, output_dim, init='glorot_uniform', activation='linear', weights=None, name=None,
<add> def __init__(self, output_dim, init='glorot_uniform', activation='linear', weights=None,
<ide> W_regularizer=None, b_regularizer=None, activity_regularizer=None,
<ide> W_constraint=None, b_constraint=None, **kwargs):
<del>
<del> super(Dense, self).__init__(**kwargs)
<ide> self.init = initializations.get(init)
<ide> self.activation = activations.get(activation)
<del> self.input_dim = input_dim
<ide> self.output_dim = output_dim
<ide>
<add> self.W_regularizer = regularizers.get(W_regularizer)
<add> self.b_regularizer = regularizers.get(b_regularizer)
<add> self.activity_regularizer = regularizers.get(activity_regularizer)
<add>
<add> self.W_constraint = constraints.get(W_constraint)
<add> self.b_constraint = constraints.get(b_constraint)
<add> self.constraints = [self.W_constraint, self.b_constraint]
<add>
<add> self.initial_weights = weights
<add> super(Dense, self).__init__(**kwargs)
<add>
<add> def build(self):
<add> input_dim = self.input_shape[1]
<add>
<ide> self.input = T.matrix()
<del> self.W = self.init((self.input_dim, self.output_dim))
<add> self.W = self.init((input_dim, self.output_dim))
<ide> self.b = shared_zeros((self.output_dim))
<ide>
<ide> self.params = [self.W, self.b]
<ide>
<ide> self.regularizers = []
<del> self.W_regularizer = regularizers.get(W_regularizer)
<ide> if self.W_regularizer:
<ide> self.W_regularizer.set_param(self.W)
<ide> self.regularizers.append(self.W_regularizer)
<ide>
<del> self.b_regularizer = regularizers.get(b_regularizer)
<ide> if self.b_regularizer:
<ide> self.b_regularizer.set_param(self.b)
<ide> self.regularizers.append(self.b_regularizer)
<ide>
<del> self.activity_regularizer = regularizers.get(activity_regularizer)
<ide> if self.activity_regularizer:
<ide> self.activity_regularizer.set_layer(self)
<ide> self.regularizers.append(self.activity_regularizer)
<ide>
<del> self.W_constraint = constraints.get(W_constraint)
<del> self.b_constraint = constraints.get(b_constraint)
<del> self.constraints = [self.W_constraint, self.b_constraint]
<del>
<del> if weights is not None:
<del> self.set_weights(weights)
<del>
<del> if name is not None:
<del> self.set_name(name)
<del>
<del> def set_name(self, name):
<del> self.W.name = '%s_W' % name
<del> self.b.name = '%s_b' % name
<add> if self.initial_weights is not None:
<add> self.set_weights(self.initial_weights)
<add> del self.initial_weights
<ide>
<ide> @property
<ide> def output_shape(self):
<ide> def get_output(self, train=False):
<ide>
<ide> def get_config(self):
<ide> return {"name": self.__class__.__name__,
<del> "input_dim": self.input_dim,
<ide> "output_dim": self.output_dim,
<ide> "init": self.init.__name__,
<ide> "activation": self.activation.__name__,
<ide> class TimeDistributedDense(MaskedLayer):
<ide> '''
<ide> input_ndim = 3
<ide>
<del> def __init__(self, input_dim, output_dim, init='glorot_uniform', activation='linear', weights=None,
<add> def __init__(self, output_dim, init='glorot_uniform', activation='linear', weights=None,
<ide> W_regularizer=None, b_regularizer=None, activity_regularizer=None,
<ide> W_constraint=None, b_constraint=None, **kwargs):
<del>
<del> super(TimeDistributedDense, self).__init__(**kwargs)
<add> self.output_dim = output_dim
<ide> self.init = initializations.get(init)
<ide> self.activation = activations.get(activation)
<del> self.input_dim = input_dim
<del> self.output_dim = output_dim
<add>
<add> self.W_regularizer = regularizers.get(W_regularizer)
<add> self.b_regularizer = regularizers.get(b_regularizer)
<add> self.activity_regularizer = regularizers.get(activity_regularizer)
<add>
<add> self.W_constraint = constraints.get(W_constraint)
<add> self.b_constraint = constraints.get(b_constraint)
<add> self.constraints = [self.W_constraint, self.b_constraint]
<add>
<add> self.initial_weights = weights
<add> super(TimeDistributedDense, self).__init__(**kwargs)
<add>
<add> def build(self):
<add> input_dim = self.input_shape[2]
<ide>
<ide> self.input = T.tensor3()
<del> self.W = self.init((self.input_dim, self.output_dim))
<add> self.W = self.init((input_dim, self.output_dim))
<ide> self.b = shared_zeros((self.output_dim))
<ide>
<ide> self.params = [self.W, self.b]
<del>
<ide> self.regularizers = []
<ide>
<del> self.W_regularizer = regularizers.get(W_regularizer)
<ide> if self.W_regularizer:
<ide> self.W_regularizer.set_param(self.W)
<ide> self.regularizers.append(self.W_regularizer)
<ide>
<del> self.b_regularizer = regularizers.get(b_regularizer)
<ide> if self.b_regularizer:
<ide> self.b_regularizer.set_param(self.b)
<ide> self.regularizers.append(self.b_regularizer)
<ide>
<del> self.activity_regularizer = regularizers.get(activity_regularizer)
<ide> if self.activity_regularizer:
<ide> self.activity_regularizer.set_layer(self)
<ide> self.regularizers.append(self.activity_regularizer)
<ide>
<del> self.W_constraint = constraints.get(W_constraint)
<del> self.b_constraint = constraints.get(b_constraint)
<del> self.constraints = [self.W_constraint, self.b_constraint]
<del>
<del> if weights is not None:
<del> self.set_weights(weights)
<add> if self.initial_weights is not None:
<add> self.set_weights(self.initial_weights)
<add> del self.initial_weights
<ide>
<ide> @property
<ide> def output_shape(self):
<ide> def get_output(self, train=False):
<ide>
<ide> def get_config(self):
<ide> return {"name": self.__class__.__name__,
<del> "input_dim": self.input_dim,
<ide> "output_dim": self.output_dim,
<ide> "init": self.init.__name__,
<ide> "activation": self.activation.__name__,
<ide> class MaxoutDense(Layer):
<ide> '''
<ide> input_ndim = 2
<ide>
<del> def __init__(self, input_dim, output_dim, nb_feature=4, init='glorot_uniform', weights=None,
<add> def __init__(self, output_dim, nb_feature=4, init='glorot_uniform', weights=None,
<ide> W_regularizer=None, b_regularizer=None, activity_regularizer=None,
<ide> W_constraint=None, b_constraint=None, **kwargs):
<del>
<del> super(MaxoutDense, self).__init__(**kwargs)
<del> self.init = initializations.get(init)
<del> self.input_dim = input_dim
<ide> self.output_dim = output_dim
<ide> self.nb_feature = nb_feature
<add> self.init = initializations.get(init)
<add>
<add> self.W_regularizer = regularizers.get(W_regularizer)
<add> self.b_regularizer = regularizers.get(b_regularizer)
<add> self.activity_regularizer = regularizers.get(activity_regularizer)
<add>
<add> self.W_constraint = constraints.get(W_constraint)
<add> self.b_constraint = constraints.get(b_constraint)
<add> self.constraints = [self.W_constraint, self.b_constraint]
<add>
<add> self.initial_weights = weights
<add> super(MaxoutDense, self).__init__(**kwargs)
<add>
<add> def build(self):
<add> input_dim = self.input_shape[1]
<ide>
<ide> self.input = T.matrix()
<del> self.W = self.init((self.nb_feature, self.input_dim, self.output_dim))
<add> self.W = self.init((self.nb_feature, input_dim, self.output_dim))
<ide> self.b = shared_zeros((self.nb_feature, self.output_dim))
<ide>
<ide> self.params = [self.W, self.b]
<del>
<ide> self.regularizers = []
<ide>
<del> self.W_regularizer = regularizers.get(W_regularizer)
<ide> if self.W_regularizer:
<ide> self.W_regularizer.set_param(self.W)
<ide> self.regularizers.append(self.W_regularizer)
<ide>
<del> self.b_regularizer = regularizers.get(b_regularizer)
<ide> if self.b_regularizer:
<ide> self.b_regularizer.set_param(self.b)
<ide> self.regularizers.append(self.b_regularizer)
<ide>
<del> self.activity_regularizer = regularizers.get(activity_regularizer)
<ide> if self.activity_regularizer:
<ide> self.activity_regularizer.set_layer(self)
<ide> self.regularizers.append(self.activity_regularizer)
<ide>
<del> self.W_constraint = constraints.get(W_constraint)
<del> self.b_constraint = constraints.get(b_constraint)
<del> self.constraints = [self.W_constraint, self.b_constraint]
<del>
<del> if weights is not None:
<del> self.set_weights(weights)
<add> if self.initial_weights is not None:
<add> self.set_weights(self.initial_weights)
<add> del self.initial_weights
<ide>
<ide> @property
<ide> def output_shape(self):
<ide> def get_output(self, train=False):
<ide>
<ide> def get_config(self):
<ide> return {"name": self.__class__.__name__,
<del> "input_dim": self.input_dim,
<ide> "output_dim": self.output_dim,
<ide> "init": self.init.__name__,
<ide> "nb_feature": self.nb_feature,
<ide><path>keras/layers/normalization.py
<ide> class BatchNormalization(Layer):
<ide>
<ide> momentum: momentum term in the computation of a running estimate of the mean and std of the data
<ide> '''
<del> def __init__(self, input_shape, epsilon=1e-6, mode=0, momentum=0.9, weights=None, **kwargs):
<del> super(BatchNormalization, self).__init__(**kwargs)
<add> def __init__(self, epsilon=1e-6, mode=0, momentum=0.9, weights=None, **kwargs):
<ide> self.init = initializations.get("uniform")
<del> self._input_shape = input_shape
<ide> self.epsilon = epsilon
<ide> self.mode = mode
<ide> self.momentum = momentum
<add> super(BatchNormalization, self).__init__(**kwargs)
<add>
<add> def build(self):
<add> input_shape = self.input_shape # starts with samples axis
<add> input_shape = input_shape[1:]
<ide> self.input = ndim_tensor(len(input_shape) + 1)
<ide>
<ide> self.gamma = self.init((input_shape))
<ide> def get_output(self, train):
<ide>
<ide> def get_config(self):
<ide> return {"name": self.__class__.__name__,
<del> "input_shape": self._input_shape,
<ide> "epsilon": self.epsilon,
<del> "mode": self.mode}
<add> "mode": self.mode,
<add> "momentum": self.momentum}
<ide>
<ide>
<ide> class LRN2D(Layer):
<ide><path>keras/layers/recurrent.py
<ide> class SimpleRNN(Recurrent):
<ide> included for demonstration purposes
<ide> (demonstrates how to use theano.scan to build a basic RNN).
<ide> '''
<del> def __init__(self, input_dim, output_dim,
<add> def __init__(self, output_dim,
<ide> init='glorot_uniform', inner_init='orthogonal', activation='sigmoid', weights=None,
<ide> truncate_gradient=-1, return_sequences=False, **kwargs):
<del>
<del> super(SimpleRNN, self).__init__(**kwargs)
<add> self.output_dim = output_dim
<ide> self.init = initializations.get(init)
<ide> self.inner_init = initializations.get(inner_init)
<del> self.input_dim = input_dim
<del> self.output_dim = output_dim
<ide> self.truncate_gradient = truncate_gradient
<ide> self.activation = activations.get(activation)
<ide> self.return_sequences = return_sequences
<add> self.initial_weights = weights
<add> super(SimpleRNN, self).__init__(**kwargs)
<add>
<add> def build(self):
<add> input_dim = self.input_shape[2]
<ide> self.input = T.tensor3()
<ide>
<del> self.W = self.init((self.input_dim, self.output_dim))
<add> self.W = self.init((input_dim, self.output_dim))
<ide> self.U = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b = shared_zeros((self.output_dim))
<ide> self.params = [self.W, self.U, self.b]
<ide>
<del> if weights is not None:
<del> self.set_weights(weights)
<add> if self.initial_weights is not None:
<add> self.set_weights(self.initial_weights)
<add> del self.initial_weights
<ide>
<ide> def _step(self, x_t, mask_tm1, h_tm1, u):
<ide> '''
<ide> def get_output(self, train=False):
<ide>
<ide> def get_config(self):
<ide> return {"name": self.__class__.__name__,
<del> "input_dim": self.input_dim,
<ide> "output_dim": self.output_dim,
<ide> "init": self.init.__name__,
<ide> "inner_init": self.inner_init.__name__,
<ide> class SimpleDeepRNN(Recurrent):
<ide> This demonstrates how to build RNNs with arbitrary lookback.
<ide> Also (probably) not a super useful model.
<ide> '''
<del> def __init__(self, input_dim, output_dim, depth=3,
<add> def __init__(self, output_dim, depth=3,
<ide> init='glorot_uniform', inner_init='orthogonal',
<ide> activation='sigmoid', inner_activation='hard_sigmoid',
<ide> weights=None, truncate_gradient=-1, return_sequences=False, **kwargs):
<del>
<del> super(SimpleDeepRNN, self).__init__()
<add> self.output_dim = output_dim
<ide> self.init = initializations.get(init)
<ide> self.inner_init = initializations.get(inner_init)
<del> self.input_dim = input_dim
<del> self.output_dim = output_dim
<ide> self.truncate_gradient = truncate_gradient
<ide> self.activation = activations.get(activation)
<ide> self.inner_activation = activations.get(inner_activation)
<ide> self.depth = depth
<ide> self.return_sequences = return_sequences
<del> self.input = T.tensor3()
<add> self.initial_weights = weights
<add> super(SimpleDeepRNN, self).__init__(**kwargs)
<ide>
<del> self.W = self.init((self.input_dim, self.output_dim))
<add> def build(self):
<add> input_dim = self.input_shape[2]
<add> self.input = T.tensor3()
<add> self.W = self.init((input_dim, self.output_dim))
<ide> self.Us = [self.inner_init((self.output_dim, self.output_dim)) for _ in range(self.depth)]
<ide> self.b = shared_zeros((self.output_dim))
<ide> self.params = [self.W] + self.Us + [self.b]
<ide>
<del> if weights is not None:
<del> self.set_weights(weights)
<add> if self.initial_weights is not None:
<add> self.set_weights(self.initial_weights)
<add> del self.initial_weights
<ide>
<ide> def _step(self, x_t, *args):
<ide> o = x_t
<ide> def get_output(self, train=False):
<ide>
<ide> def get_config(self):
<ide> return {"name": self.__class__.__name__,
<del> "input_dim": self.input_dim,
<ide> "output_dim": self.output_dim,
<ide> "depth": self.depth,
<ide> "init": self.init.__name__,
<ide> class GRU(Recurrent):
<ide> Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling
<ide> http://arxiv.org/pdf/1412.3555v1.pdf
<ide> '''
<del> def __init__(self, input_dim, output_dim=128,
<add> def __init__(self, output_dim,
<ide> init='glorot_uniform', inner_init='orthogonal',
<ide> activation='sigmoid', inner_activation='hard_sigmoid',
<ide> weights=None, truncate_gradient=-1, return_sequences=False, **kwargs):
<del>
<del> super(GRU, self).__init__()
<del> self.input_dim = input_dim
<ide> self.output_dim = output_dim
<del> self.truncate_gradient = truncate_gradient
<del> self.return_sequences = return_sequences
<del>
<ide> self.init = initializations.get(init)
<ide> self.inner_init = initializations.get(inner_init)
<ide> self.activation = activations.get(activation)
<ide> self.inner_activation = activations.get(inner_activation)
<add> self.truncate_gradient = truncate_gradient
<add> self.return_sequences = return_sequences
<add> self.initial_weights = weights
<add> super(GRU, self).__init__(**kwargs)
<add>
<add> def build(self):
<add> input_dim = self.input_shape[2]
<ide> self.input = T.tensor3()
<ide>
<del> self.W_z = self.init((self.input_dim, self.output_dim))
<add> self.W_z = self.init((input_dim, self.output_dim))
<ide> self.U_z = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_z = shared_zeros((self.output_dim))
<ide>
<del> self.W_r = self.init((self.input_dim, self.output_dim))
<add> self.W_r = self.init((input_dim, self.output_dim))
<ide> self.U_r = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_r = shared_zeros((self.output_dim))
<ide>
<del> self.W_h = self.init((self.input_dim, self.output_dim))
<add> self.W_h = self.init((input_dim, self.output_dim))
<ide> self.U_h = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_h = shared_zeros((self.output_dim))
<ide>
<ide> def __init__(self, input_dim, output_dim=128,
<ide> self.W_h, self.U_h, self.b_h,
<ide> ]
<ide>
<del> if weights is not None:
<del> self.set_weights(weights)
<add> if self.initial_weights is not None:
<add> self.set_weights(self.initial_weights)
<add> del self.initial_weights
<ide>
<ide> def _step(self,
<ide> xz_t, xr_t, xh_t, mask_tm1,
<ide> def get_output(self, train=False):
<ide>
<ide> def get_config(self):
<ide> return {"name": self.__class__.__name__,
<del> "input_dim": self.input_dim,
<ide> "output_dim": self.output_dim,
<ide> "init": self.init.__name__,
<ide> "inner_init": self.inner_init.__name__,
<ide> class LSTM(Recurrent):
<ide> Supervised sequence labelling with recurrent neural networks
<ide> http://www.cs.toronto.edu/~graves/preprint.pdf
<ide> '''
<del> def __init__(self, input_dim, output_dim=128,
<add> def __init__(self, output_dim,
<ide> init='glorot_uniform', inner_init='orthogonal', forget_bias_init='one',
<ide> activation='tanh', inner_activation='hard_sigmoid',
<ide> weights=None, truncate_gradient=-1, return_sequences=False, **kwargs):
<del>
<del> super(LSTM, self).__init__()
<del> self.input_dim = input_dim
<ide> self.output_dim = output_dim
<del> self.truncate_gradient = truncate_gradient
<del> self.return_sequences = return_sequences
<del>
<ide> self.init = initializations.get(init)
<ide> self.inner_init = initializations.get(inner_init)
<ide> self.forget_bias_init = initializations.get(forget_bias_init)
<ide> self.activation = activations.get(activation)
<ide> self.inner_activation = activations.get(inner_activation)
<add> self.truncate_gradient = truncate_gradient
<add> self.return_sequences = return_sequences
<add> self.initial_weights = weights
<add> super(LSTM, self).__init__(**kwargs)
<add>
<add> def build(self):
<add> input_dim = self.input_shape[2]
<ide> self.input = T.tensor3()
<ide>
<del> self.W_i = self.init((self.input_dim, self.output_dim))
<add> self.W_i = self.init((input_dim, self.output_dim))
<ide> self.U_i = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_i = shared_zeros((self.output_dim))
<ide>
<del> self.W_f = self.init((self.input_dim, self.output_dim))
<add> self.W_f = self.init((input_dim, self.output_dim))
<ide> self.U_f = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_f = self.forget_bias_init((self.output_dim))
<ide>
<del> self.W_c = self.init((self.input_dim, self.output_dim))
<add> self.W_c = self.init((input_dim, self.output_dim))
<ide> self.U_c = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_c = shared_zeros((self.output_dim))
<ide>
<del> self.W_o = self.init((self.input_dim, self.output_dim))
<add> self.W_o = self.init((input_dim, self.output_dim))
<ide> self.U_o = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_o = shared_zeros((self.output_dim))
<ide>
<ide> def __init__(self, input_dim, output_dim=128,
<ide> self.W_o, self.U_o, self.b_o,
<ide> ]
<ide>
<del> if weights is not None:
<del> self.set_weights(weights)
<add> if self.initial_weights is not None:
<add> self.set_weights(self.initial_weights)
<add> del self.initial_weights
<ide>
<ide> def _step(self,
<ide> xi_t, xf_t, xo_t, xc_t, mask_tm1,
<ide> def get_output(self, train=False):
<ide>
<ide> def get_config(self):
<ide> return {"name": self.__class__.__name__,
<del> "input_dim": self.input_dim,
<ide> "output_dim": self.output_dim,
<ide> "init": self.init.__name__,
<ide> "inner_init": self.inner_init.__name__,
<ide> class JZS1(Recurrent):
<ide> An Empirical Exploration of Recurrent Network Architectures
<ide> http://www.jmlr.org/proceedings/papers/v37/jozefowicz15.pdf
<ide> '''
<del> def __init__(self, input_dim, output_dim=128,
<add> def __init__(self, output_dim,
<ide> init='glorot_uniform', inner_init='orthogonal',
<ide> activation='tanh', inner_activation='sigmoid',
<ide> weights=None, truncate_gradient=-1, return_sequences=False, **kwargs):
<del>
<del> super(JZS1, self).__init__(**kwargs)
<del> self.input_dim = input_dim
<ide> self.output_dim = output_dim
<del> self.truncate_gradient = truncate_gradient
<del> self.return_sequences = return_sequences
<del>
<ide> self.init = initializations.get(init)
<ide> self.inner_init = initializations.get(inner_init)
<ide> self.activation = activations.get(activation)
<ide> self.inner_activation = activations.get(inner_activation)
<add> self.truncate_gradient = truncate_gradient
<add> self.return_sequences = return_sequences
<add> self.initial_weights = weights
<add> super(JZS1, self).__init__(**kwargs)
<add>
<add> def build(self):
<add> input_dim = self.input_shape[2]
<ide> self.input = T.tensor3()
<ide>
<del> self.W_z = self.init((self.input_dim, self.output_dim))
<add> self.W_z = self.init((input_dim, self.output_dim))
<ide> self.b_z = shared_zeros((self.output_dim))
<ide>
<del> self.W_r = self.init((self.input_dim, self.output_dim))
<add> self.W_r = self.init((input_dim, self.output_dim))
<ide> self.U_r = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_r = shared_zeros((self.output_dim))
<ide>
<ide> self.U_h = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_h = shared_zeros((self.output_dim))
<ide>
<ide> # P_h used to project X onto different dimension, using sparse random projections
<del> if self.input_dim == self.output_dim:
<add> if input_dim == self.output_dim:
<ide> self.Pmat = theano.shared(np.identity(self.output_dim, dtype=theano.config.floatX), name=None)
<ide> else:
<del> P = np.random.binomial(1, 0.5, size=(self.input_dim, self.output_dim)).astype(theano.config.floatX) * 2 - 1
<del> P = 1 / np.sqrt(self.input_dim) * P
<add> P = np.random.binomial(1, 0.5, size=(input_dim, self.output_dim)).astype(theano.config.floatX) * 2 - 1
<add> P = 1 / np.sqrt(input_dim) * P
<ide> self.Pmat = theano.shared(P, name=None)
<ide>
<ide> self.params = [
<ide> def __init__(self, input_dim, output_dim=128,
<ide> self.Pmat
<ide> ]
<ide>
<del> if weights is not None:
<del> self.set_weights(weights)
<add> if self.initial_weights is not None:
<add> self.set_weights(self.initial_weights)
<add> del self.initial_weights
<ide>
<ide> def _step(self,
<ide> xz_t, xr_t, xh_t, mask_tm1,
<ide> def get_output(self, train=False):
<ide>
<ide> def get_config(self):
<ide> return {"name": self.__class__.__name__,
<del> "input_dim": self.input_dim,
<ide> "output_dim": self.output_dim,
<ide> "init": self.init.__name__,
<ide> "inner_init": self.inner_init.__name__,
<ide> class JZS2(Recurrent):
<ide> An Empirical Exploration of Recurrent Network Architectures
<ide> http://www.jmlr.org/proceedings/papers/v37/jozefowicz15.pdf
<ide> '''
<del> def __init__(self, input_dim, output_dim=128,
<add> def __init__(self, output_dim,
<ide> init='glorot_uniform', inner_init='orthogonal',
<ide> activation='tanh', inner_activation='sigmoid',
<ide> weights=None, truncate_gradient=-1, return_sequences=False, **kwargs):
<del>
<del> super(JZS2, self).__init__(**kwargs)
<del> self.input_dim = input_dim
<ide> self.output_dim = output_dim
<del> self.truncate_gradient = truncate_gradient
<del> self.return_sequences = return_sequences
<del>
<ide> self.init = initializations.get(init)
<ide> self.inner_init = initializations.get(inner_init)
<ide> self.activation = activations.get(activation)
<ide> self.inner_activation = activations.get(inner_activation)
<add> self.truncate_gradient = truncate_gradient
<add> self.return_sequences = return_sequences
<add> super(JZS2, self).__init__(**kwargs)
<add>
<add> def build(self):
<add> input_dim = self.input_shape[2]
<ide> self.input = T.tensor3()
<ide>
<del> self.W_z = self.init((self.input_dim, self.output_dim))
<add> self.W_z = self.init((input_dim, self.output_dim))
<ide> self.U_z = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_z = shared_zeros((self.output_dim))
<ide>
<ide> self.U_r = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_r = shared_zeros((self.output_dim))
<ide>
<del> self.W_h = self.init((self.input_dim, self.output_dim))
<add> self.W_h = self.init((input_dim, self.output_dim))
<ide> self.U_h = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_h = shared_zeros((self.output_dim))
<ide>
<ide> # P_h used to project X onto different dimension, using sparse random projections
<del> if self.input_dim == self.output_dim:
<add> if input_dim == self.output_dim:
<ide> self.Pmat = theano.shared(np.identity(self.output_dim, dtype=theano.config.floatX), name=None)
<ide> else:
<del> P = np.random.binomial(1, 0.5, size=(self.input_dim, self.output_dim)).astype(theano.config.floatX) * 2 - 1
<del> P = 1 / np.sqrt(self.input_dim) * P
<add> P = np.random.binomial(1, 0.5, size=(input_dim, self.output_dim)).astype(theano.config.floatX) * 2 - 1
<add> P = 1 / np.sqrt(input_dim) * P
<ide> self.Pmat = theano.shared(P, name=None)
<ide>
<ide> self.params = [
<ide> def __init__(self, input_dim, output_dim=128,
<ide> self.Pmat
<ide> ]
<ide>
<del> if weights is not None:
<del> self.set_weights(weights)
<add> if self.initial_weights is not None:
<add> self.set_weights(self.initial_weights)
<add> del self.initial_weights
<ide>
<ide> def _step(self,
<ide> xz_t, xr_t, xh_t, mask_tm1,
<ide> def get_output(self, train=False):
<ide>
<ide> def get_config(self):
<ide> return {"name": self.__class__.__name__,
<del> "input_dim": self.input_dim,
<ide> "output_dim": self.output_dim,
<ide> "init": self.init.__name__,
<ide> "inner_init": self.inner_init.__name__,
<ide> class JZS3(Recurrent):
<ide> An Empirical Exploration of Recurrent Network Architectures
<ide> http://www.jmlr.org/proceedings/papers/v37/jozefowicz15.pdf
<ide> '''
<del> def __init__(self, input_dim, output_dim=128,
<add> def __init__(self, output_dim,
<ide> init='glorot_uniform', inner_init='orthogonal',
<ide> activation='tanh', inner_activation='sigmoid',
<ide> weights=None, truncate_gradient=-1, return_sequences=False, **kwargs):
<del>
<del> super(JZS3, self).__init__(**kwargs)
<del> self.input_dim = input_dim
<ide> self.output_dim = output_dim
<del> self.truncate_gradient = truncate_gradient
<del> self.return_sequences = return_sequences
<del>
<ide> self.init = initializations.get(init)
<ide> self.inner_init = initializations.get(inner_init)
<ide> self.activation = activations.get(activation)
<ide> self.inner_activation = activations.get(inner_activation)
<add> self.truncate_gradient = truncate_gradient
<add> self.return_sequences = return_sequences
<add> self.initial_weights = weights
<add> super(JZS3, self).__init__(**kwargs)
<add>
<add> def build(self):
<add> input_dim = self.input_shape[2]
<ide> self.input = T.tensor3()
<ide>
<del> self.W_z = self.init((self.input_dim, self.output_dim))
<add> self.W_z = self.init((input_dim, self.output_dim))
<ide> self.U_z = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_z = shared_zeros((self.output_dim))
<ide>
<del> self.W_r = self.init((self.input_dim, self.output_dim))
<add> self.W_r = self.init((input_dim, self.output_dim))
<ide> self.U_r = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_r = shared_zeros((self.output_dim))
<ide>
<del> self.W_h = self.init((self.input_dim, self.output_dim))
<add> self.W_h = self.init((input_dim, self.output_dim))
<ide> self.U_h = self.inner_init((self.output_dim, self.output_dim))
<ide> self.b_h = shared_zeros((self.output_dim))
<ide>
<ide> def __init__(self, input_dim, output_dim=128,
<ide> self.W_h, self.U_h, self.b_h,
<ide> ]
<ide>
<del> if weights is not None:
<del> self.set_weights(weights)
<add> if self.initial_weights is not None:
<add> self.set_weights(self.initial_weights)
<add> del self.initial_weights
<ide>
<ide> def _step(self,
<ide> xz_t, xr_t, xh_t, mask_tm1,
<ide> def get_output(self, train=False):
<ide>
<ide> def get_config(self):
<ide> return {"name": self.__class__.__name__,
<del> "input_dim": self.input_dim,
<ide> "output_dim": self.output_dim,
<ide> "init": self.init.__name__,
<ide> "inner_init": self.inner_init.__name__, | 4 |
Javascript | Javascript | increase code coverage of source_text_module.js | 99879cac865877825a5a1feda4b766a6f780573f | <ide><path>test/parallel/test-vm-module-dynamic-namespace.js
<add>'use strict';
<add>
<add>// Flags: --experimental-vm-modules --expose-internals
<add>//
<add>const common = require('../common');
<add>
<add>const assert = require('assert');
<add>
<add>const { types } = require('util');
<add>const { SourceTextModule, wrapMap } = require('internal/vm/source_text_module');
<add>
<add>const { importModuleDynamicallyCallback } =
<add> require('internal/process/esm_loader');
<add>
<add>async function getNamespace() {
<add> const m = new SourceTextModule('');
<add> await m.link(() => 0);
<add> m.instantiate();
<add> await m.evaluate();
<add> return m.namespace;
<add>}
<add>
<add>(async () => {
<add> const namespace = await getNamespace();
<add> const m = new SourceTextModule('export const A = "A";', {
<add> importModuleDynamically: common.mustCall((specifier, wrap) => {
<add> return namespace;
<add> })
<add> });
<add> await m.link(() => 0);
<add> m.instantiate();
<add> await m.evaluate();
<add> const ns = await importModuleDynamicallyCallback(wrapMap.get(m));
<add> assert.ok(types.isModuleNamespaceObject(ns));
<add>})().then(common.mustCall()); | 1 |
PHP | PHP | use escapeshellarg on windows symlink | 44c3feb604944599ad1c782a9942981c3991fa31 | <ide><path>src/Illuminate/Filesystem/Filesystem.php
<ide> public function link($target, $link)
<ide>
<ide> $mode = $this->isDirectory($target) ? 'J' : 'H';
<ide>
<del> exec("mklink /{$mode} \"{$link}\" \"{$target}\"");
<add> exec("mklink /{$mode} ".escapeshellarg($link)." ".escapeshellarg($target));
<ide> }
<ide>
<ide> /** | 1 |
Python | Python | add separate test for failed custom validation | f0071dbccd592ba3157738ced66809869f68b1cb | <ide><path>rest_framework/tests/serializer.py
<ide> def test_read_only_fields(self):
<ide> """
<ide> Attempting to update fields set as read_only should have no effect.
<ide> """
<del>
<ide> serializer = PersonSerializer(self.person, data={'name': 'dwight', 'age': 99})
<ide> self.assertEquals(serializer.is_valid(), True)
<ide> instance = serializer.save()
<ide> def setUp(self):
<ide> 'content': 'x' * 1001,
<ide> 'created': datetime.datetime(2012, 1, 1)
<ide> }
<del> self.actionitem = ActionItem(title='Some to do item',
<del> )
<add> self.actionitem = ActionItem(title='Some to do item',)
<ide>
<ide> def test_create(self):
<ide> serializer = CommentSerializer(data=self.data)
<ide> def test_missing_bool_with_default(self):
<ide> self.assertEquals(serializer.is_valid(), True)
<ide> self.assertEquals(serializer.errors, {})
<ide>
<del> def test_field_validation(self):
<del>
<del> class CommentSerializerWithFieldValidator(CommentSerializer):
<del>
<del> def validate_content(self, attrs, source):
<del> value = attrs[source]
<del> if "test" not in value:
<del> raise serializers.ValidationError("Test not in value")
<del> return attrs
<del>
<del> data = {
<del> 'email': 'tom@example.com',
<del> 'content': 'A test comment',
<del> 'created': datetime.datetime(2012, 1, 1)
<del> }
<del>
<del> serializer = CommentSerializerWithFieldValidator(data=data)
<del> self.assertTrue(serializer.is_valid())
<del>
<del> data['content'] = 'This should not validate'
<del>
<del> serializer = CommentSerializerWithFieldValidator(data=data)
<del> self.assertFalse(serializer.is_valid())
<del> self.assertEquals(serializer.errors, {'content': [u'Test not in value']})
<del>
<del> incomplete_data = {
<del> 'email': 'tom@example.com',
<del> 'created': datetime.datetime(2012, 1, 1)
<del> }
<del> serializer = CommentSerializerWithFieldValidator(data=incomplete_data)
<del> self.assertFalse(serializer.is_valid())
<del> self.assertEquals(serializer.errors, {'content': [u'This field is required.']})
<del>
<ide> def test_bad_type_data_is_false(self):
<ide> """
<ide> Data of the wrong type is not valid.
<ide> def test_default_modelfield_max_length_exceeded(self):
<ide> self.assertEquals(serializer.errors, {'info': [u'Ensure this value has at most 12 characters (it has 13).']})
<ide>
<ide>
<add>class CustomValidationTests(TestCase):
<add> class CommentSerializerWithFieldValidator(CommentSerializer):
<add>
<add> def validate_email(self, attrs, source):
<add> value = attrs[source]
<add>
<add> return attrs
<add>
<add> def validate_content(self, attrs, source):
<add> value = attrs[source]
<add> if "test" not in value:
<add> raise serializers.ValidationError("Test not in value")
<add> return attrs
<add>
<add> def test_field_validation(self):
<add> data = {
<add> 'email': 'tom@example.com',
<add> 'content': 'A test comment',
<add> 'created': datetime.datetime(2012, 1, 1)
<add> }
<add>
<add> serializer = self.CommentSerializerWithFieldValidator(data=data)
<add> self.assertTrue(serializer.is_valid())
<add>
<add> data['content'] = 'This should not validate'
<add>
<add> serializer = self.CommentSerializerWithFieldValidator(data=data)
<add> self.assertFalse(serializer.is_valid())
<add> self.assertEquals(serializer.errors, {'content': [u'Test not in value']})
<add>
<add> def test_missing_data(self):
<add> """
<add> Make sure that validate_content isn't called if the field is missing
<add> """
<add> incomplete_data = {
<add> 'email': 'tom@example.com',
<add> 'created': datetime.datetime(2012, 1, 1)
<add> }
<add> serializer = self.CommentSerializerWithFieldValidator(data=incomplete_data)
<add> self.assertFalse(serializer.is_valid())
<add> self.assertEquals(serializer.errors, {'content': [u'This field is required.']})
<add>
<add> def test_wrong_data(self):
<add> """
<add> Make sure that validate_content isn't called if the field input is wrong
<add> """
<add> wrong_data = {
<add> 'email': 'not an email',
<add> 'content': 'A test comment',
<add> 'created': datetime.datetime(2012, 1, 1)
<add> }
<add> serializer = self.CommentSerializerWithFieldValidator(data=wrong_data)
<add> self.assertFalse(serializer.is_valid())
<add> self.assertEquals(serializer.errors, {'email': [u'Enter a valid e-mail address.']})
<add>
<add>
<ide> class PositiveIntegerAsChoiceTests(TestCase):
<ide> def test_positive_integer_in_json_is_correctly_parsed(self):
<del> data = {'some_integer':1}
<add> data = {'some_integer': 1}
<ide> serializer = PositiveIntegerAsChoiceSerializer(data=data)
<ide> self.assertEquals(serializer.is_valid(), True)
<ide>
<add>
<ide> class ModelValidationTests(TestCase):
<ide> def test_validate_unique(self):
<ide> """ | 1 |
Text | Text | add solution to redux remove item from array | 1d77711431ec3cb57e6641b90cee54450d031f8e | <ide><path>guide/english/certifications/front-end-libraries/redux/remove-an-item-from-an-array/index.md
<ide> title: Remove an Item from an Array
<ide> ---
<ide> ## Remove an Item from an Array
<ide>
<del>This is a stub. <a href='https://github.com/freecodecamp/guides/tree/master/src/pages/certifications/front-end-libraries/redux/remove-an-item-from-an-array/index.md' target='_blank' rel='nofollow'>Help our community expand it</a>.
<add>### Solution
<add>```javascript
<add>const immutableReducer = (state = [0,1,2,3,4,5], action) => {
<add> switch(action.type) {
<add> case 'REMOVE_ITEM':
<add> // don't mutate state here or the tests will fail
<add> return [...state.slice(0, action.index), ...state.slice(action.index + 1, state.length)];
<add> // or return state.slice(0, action.index).concat(state.slice(action.index + 1, state.length));
<add> default:
<add> return state;
<add> }
<add>};
<ide>
<del><a href='https://github.com/freecodecamp/guides/blob/master/README.md' target='_blank' rel='nofollow'>This quick style guide will help ensure your pull request gets accepted</a>.
<add>const removeItem = (index) => {
<add> return {
<add> type: 'REMOVE_ITEM',
<add> index
<add> }
<add>}
<ide>
<del><!-- The article goes here, in GitHub-flavored Markdown. Feel free to add YouTube videos, images, and CodePen/JSBin embeds -->
<add>const store = Redux.createStore(immutableReducer);
<add>```
<add>
<add>### Notes
<add>* array.slice(fromIndex, untilIndex) returns a new array
<add>* 1st slice from the first item's index (0 inclusive) until indexToRemove (action.index exclusive)
<add>* 2nd slice from item right after indexToRemove (action.index + 1 inclusive) until length (last item's index + 1 exclusive)
<add>* since slice returns a new array, combine both parts with [...array1, ...array2] spread operator
<add>* or combine them with .concat() | 1 |
Mixed | Ruby | provide pattern matching for activemodel | 7e499b25acd5e1bfdd54ca2af66678b0ed05def1 | <ide><path>activemodel/CHANGELOG.md
<add>* Define `deconstruct_keys` in `ActiveModel::AttributeMethods`
<add>
<add> This provides the Ruby 2.7+ pattern matching interface for hash patterns,
<add> which allows the user to pattern match against anything that includes the
<add> `ActiveModel::AttributeMethods` module (e.g., `ActiveRecord::Base`). As an
<add> example, you can now:
<add>
<add> ```ruby
<add> class Person < ActiveRecord::Base
<add> end
<add>
<add> person = Person.new(name: "Mary")
<add> person => { name: }
<add> name # => "Mary"
<add> ```
<add>
<add> *Kevin Newton*
<add>
<ide> * Fix casting long strings to `Date`, `Time` or `DateTime`
<ide>
<ide> *fatkodima*
<ide><path>activemodel/lib/active_model/attribute_methods.rb
<ide> def respond_to?(method, include_private_methods = false)
<ide> end
<ide> end
<ide>
<add> # Returns a hash of attributes for the given keys. Provides the pattern
<add> # matching interface for matching against hash patterns. For example:
<add> #
<add> # class Person
<add> # include ActiveModel::AttributeMethods
<add> #
<add> # attr_accessor :name
<add> # define_attribute_method :name
<add> # end
<add> #
<add> # def greeting_for(person)
<add> # case person
<add> # in { name: "Mary" }
<add> # "Welcome back, Mary!"
<add> # in { name: }
<add> # "Welcome, stranger!"
<add> # end
<add> # end
<add> #
<add> # person = Person.new
<add> # person.name = "Mary"
<add> # greeting_for(person) # => "Welcome back, Mary!"
<add> #
<add> # person = Person.new
<add> # person.name = "Bob"
<add> # greeting_for(person) # => "Welcome, stranger!"
<add> def deconstruct_keys(keys)
<add> deconstructed = {}
<add>
<add> keys.each do |key|
<add> string_key = key.to_s
<add> deconstructed[key] = _read_attribute(string_key) if attribute_method?(string_key)
<add> end
<add>
<add> deconstructed
<add> end
<add>
<ide> private
<ide> def attribute_method?(attr_name)
<ide> respond_to_without_attributes?(:attributes) && attributes.include?(attr_name)
<ide><path>activemodel/test/cases/attributes_test.rb
<ide> def attribute=(_, _)
<ide> ModelForAttributesTest.attribute :foo, :unknown
<ide> end
<ide> end
<add>
<add> test "pattern matching against keys" do
<add> case ModelForAttributesTest.new(integer_field: 1)
<add> in { integer_field: 1 }
<add> assert(true)
<add> else
<add> assert(false, "Failed to pattern match")
<add> end
<add> end
<ide> end
<ide> end | 3 |
Javascript | Javascript | add hello dating to showcase | 804791e0861a7bb234882264dfa6069fd4962f7e | <ide><path>website/src/react-native/showcase.js
<ide> var apps = [
<ide> link: 'https://itunes.apple.com/us/app/hashtag-by-hashley-ironic/id1022724462?mt=8',
<ide> author: 'Elephant, LLC',
<ide> },
<add> {
<add> name: 'hello dating',
<add> icon: 'http://a3.mzstatic.com/us/r30/Purple49/v4/54/29/59/54295932-f821-35db-8556-ba4006098ee9/icon175x175.png',
<add> linkAppStore: 'https://itunes.apple.com/il/app/hello-dating/id1072062348?mt=8',
<add> author: 'Gertler Davidov communication'
<add> },
<ide> {
<ide> name: 'Hey, Neighbor!',
<ide> icon: 'https://raw.githubusercontent.com/scrollback/io.scrollback.neighborhoods/master/android/app/src/main/res/mipmap-xxxhdpi/ic_launcher.png', | 1 |
Ruby | Ruby | escape `.`s in hostnames in regexps | ffe0c18b2a10b8bc6369b2095a556fcb1ad4135c | <ide><path>Library/Homebrew/rubocops/homepage.rb
<ide> def audit_formula(_node, _class_node, _parent_class_node, body_node)
<ide> when
<ide> # Check for http:// GitHub homepage URLs, https:// is preferred.
<ide> # Note: only check homepages that are repo pages, not *.github.com hosts
<del> %r{^http://github.com/},
<add> %r{^http://github\.com/},
<ide> %r{^http://[^/]*\.github\.io/},
<ide>
<ide> # Savannah has full SSL/TLS support but no auto-redirect.
<ide> # Doesn't apply to the download URLs, only the homepage.
<del> %r{^http://savannah.nongnu.org/},
<add> %r{^http://savannah\.nongnu\.org/},
<ide>
<ide> %r{^http://[^/]*\.sourceforge\.io/},
<ide> # There's an auto-redirect here, but this mistake is incredibly common too. | 1 |
PHP | PHP | apply fixes from styleci | d464ff5f594a4f8183520d30c807680384e84571 | <ide><path>src/Illuminate/Database/Query/Builder.php
<ide> protected function addDynamic($segment, $connector, $parameters, $index)
<ide> $this->where(Str::snake($segment), '=', $parameters[$index], $bool);
<ide> }
<ide>
<del> /**
<del> * Add a "where fulltext" clause to the query.
<del> *
<del> * @param string|string[] $columns
<del> * @param string $value
<del> * @param string $boolean
<del> * @return $this
<del> */
<add> /**
<add> * Add a "where fulltext" clause to the query.
<add> *
<add> * @param string|string[] $columns
<add> * @param string $value
<add> * @param string $boolean
<add> * @return $this
<add> */
<ide> public function whereFullText($columns, $value, array $options = [], $boolean = 'and')
<ide> {
<ide> $type = 'Fulltext';
<ide> public function whereFullText($columns, $value, array $options = [], $boolean =
<ide> return $this;
<ide> }
<ide>
<del> /**
<del> * Add a "or where fulltext" clause to the query.
<del> *
<del> * @param string|string[] $columns
<del> * @param string $value
<del> * @return $this
<del> */
<add> /**
<add> * Add an "or where fulltext" clause to the query.
<add> *
<add> * @param string|string[] $columns
<add> * @param string $value
<add> * @return $this
<add> */
<ide> public function orWhereFullText($columns, $value, array $options = [])
<ide> {
<ide> return $this->whereFulltext($columns, $value, $options, 'or'); | 1 |
Javascript | Javascript | add new credits to quaternion.js | e2e5cb13bd3bcda806966ba1c775428a2673a6b3 | <ide><path>src/math/Quaternion.js
<ide> * @author mikael emtinger / http://gomo.se/
<ide> * @author alteredq / http://alteredqualia.com/
<ide> * @author WestLangley / http://github.com/WestLangley
<add> * @author bhouston / http://exocortex.com
<add> * @author Hasan Kamal-Al-Deen / hasank1987@gmail.com
<ide> */
<ide>
<ide> THREE.Quaternion = function( x, y, z, w ) { | 1 |
Ruby | Ruby | remove a dead branch from argv.kegs | 301f1b20e698107e2f3b13bf4a2e8035affbaf03 | <ide><path>Library/Homebrew/extend/ARGV.rb
<ide> def kegs
<ide> linked_keg_ref = HOMEBREW_REPOSITORY/"Library/LinkedKegs"/name
<ide> opt_prefix = HOMEBREW_PREFIX/"opt"/name
<ide>
<del> if opt_prefix.symlink? && opt_prefix.directory?
<del> Keg.new(opt_prefix.resolved_path)
<del> elsif linked_keg_ref.symlink? && linked_keg_ref.directory?
<del> Keg.new(linked_keg_ref.resolved_path)
<del> elsif dirs.length == 1
<del> Keg.new(dirs.first)
<del> elsif (prefix = Formulary.factory(canonical_name).prefix).directory?
<del> Keg.new(prefix)
<del> else
<del> raise MultipleVersionsInstalledError.new(name)
<add> begin
<add> if opt_prefix.symlink? && opt_prefix.directory?
<add> Keg.new(opt_prefix.resolved_path)
<add> elsif linked_keg_ref.symlink? && linked_keg_ref.directory?
<add> Keg.new(linked_keg_ref.resolved_path)
<add> elsif dirs.length == 1
<add> Keg.new(dirs.first)
<add> elsif (prefix = Formulary.factory(canonical_name).prefix).directory?
<add> Keg.new(prefix)
<add> else
<add> raise MultipleVersionsInstalledError.new(name)
<add> end
<add> rescue FormulaUnavailableError
<add> raise <<-EOS.undent
<add> Multiple kegs installed to #{rack}
<add> However we don't know which one you refer to.
<add> Please delete (with rm -rf!) all but one and then try again.
<add> Sorry, we know this is lame.
<add> EOS
<ide> end
<ide> end
<del> rescue FormulaUnavailableError
<del> if rack
<del> raise <<-EOS.undent
<del> Multiple kegs installed to #{rack}
<del> However we don't know which one you refer to.
<del> Please delete (with rm -rf!) all but one and then try again.
<del> Sorry, we know this is lame.
<del> EOS
<del> else
<del> raise
<del> end
<ide> end
<ide>
<ide> # self documenting perhaps? | 1 |
Javascript | Javascript | improve grammar and clarity | 78eead6775427bf02d9ad92e5c0d54a674794b9e | <ide><path>src/ng/directive/input.js
<ide> var requiredDirective = function() {
<ide> * @name ng.directive:ngList
<ide> *
<ide> * @description
<del> * Text input that converts between comma-separated string into an array of strings.
<add> * Text input that converts between a delimited string and an array of strings. The delimiter
<add> * can be a fixed string (by default a comma) or a regular expression.
<ide> *
<ide> * @element input
<ide> * @param {string=} ngList optional delimiter that should be used to split the value. If | 1 |
Python | Python | update documentation for `where` | 08488627b64df0275ba000cd8f24484639263a63 | <ide><path>numpy/add_newdocs.py
<ide> """)
<ide>
<ide> add_newdoc('numpy.core.multiarray','where',
<del> """where(condition, | x, y)
<add> """where(condition, x, y) or where(condition)
<ide>
<del> The result is shaped like condition and has elements of x and y where
<del> condition is respectively true or false. If x or y are not given,
<del> condition.nonzero() is returned.
<add> Return elements from `x` or `y`, depending on `condition`.
<ide>
<del> To group the indices by element, rather than dimension, use
<add> *Parameters*:
<add> condition : array of bool
<add> When True, yield x, otherwise yield y.
<add> x,y : 1-dimensional arrays
<add> Values from which to choose.
<add>
<add> *Notes*
<add> This is equivalent to
<add>
<add> [xv if c else yv for (c,xv,yv) in zip(condition,x,y)]
<add>
<add> The result is shaped like `condition` and has elements of `x`
<add> or `y` where `condition` is respectively True or False.
<ide>
<del> transpose(where(condition))
<add> In the special case where only `condition` is given, the
<add> tuple `condition.nonzero()` is returned instead.
<ide>
<del> instead. This always results in a 2d array, with a row of indices for
<del> each element that satisfies the condition.
<add> *Examples*
<add> >>> where([True,False,True],[1,2,3],[4,5,6])
<add> array([1, 5, 3])
<ide>
<ide> """)
<ide> | 1 |
Ruby | Ruby | simplify subclasses method using reject | 0399bfbc3aa8cd3ad2afc2bf7eea306d622bc220 | <ide><path>activesupport/lib/active_support/core_ext/class/subclasses.rb
<ide> def descendants
<ide> #
<ide> # Foo.subclasses # => [Bar]
<ide> def subclasses
<del> subclasses, chain = [], descendants
<del> chain.each do |k|
<del> subclasses << k unless chain.any? { |c| c > k }
<add> chain = descendants
<add> chain.reject do |k|
<add> chain.any? { |c| c > k }
<ide> end
<del> subclasses
<ide> end
<ide> end | 1 |
Python | Python | add tests for `genericalias` | 610edf23678a81a9a81d71d238a8f8c1eedcc78a | <ide><path>numpy/typing/tests/test_generic_alias.py
<add>from __future__ import annotations
<add>
<add>import sys
<add>import types
<add>import pickle
<add>import weakref
<add>from typing import TypeVar, Any, Callable, Tuple, Type, Union
<add>
<add>import pytest
<add>import numpy as np
<add>from numpy.typing._generic_alias import _GenericAlias
<add>
<add>ScalarType = TypeVar("ScalarType", bound=np.generic)
<add>DType = _GenericAlias(np.dtype, (ScalarType,))
<add>NDArray = _GenericAlias(np.ndarray, (Any, DType))
<add>
<add>if sys.version_info >= (3, 9):
<add> DType_ref = types.GenericAlias(np.dtype, (ScalarType,))
<add> NDArray_ref = types.GenericAlias(np.ndarray, (Any, DType_ref))
<add> FuncType = Callable[[Union[_GenericAlias, types.GenericAlias]], Any]
<add>else:
<add> DType_ref = NotImplemented
<add> NDArray_ref = NotImplemented
<add> FuncType = Callable[[_GenericAlias], Any]
<add>
<add>GETATTR_NAMES = sorted(set(dir(np.ndarray)) - _GenericAlias._ATTR_EXCEPTIONS)
<add>
<add>
<add>def _get_subclass_mro(base: type) -> Tuple[type, ...]:
<add> class Subclass(base): # type: ignore[misc,valid-type]
<add> pass
<add> return Subclass.__mro__[1:]
<add>
<add>
<add>class TestGenericAlias:
<add> """Tests for `numpy.typing._generic_alias._GenericAlias`."""
<add>
<add> @pytest.mark.parametrize(
<add> "name,func",
<add> [
<add> ("__init__", lambda n: n),
<add> ("__origin__", lambda n: n.__origin__),
<add> ("__args__", lambda n: n.__args__),
<add> ("__parameters__", lambda n: n.__parameters__),
<add> ("__reduce__", lambda n: n.__reduce__()[1:]),
<add> ("__reduce_ex__", lambda n: n.__reduce_ex__(1)[1:]),
<add> ("__mro_entries__", lambda n: n.__mro_entries__([object])),
<add> ("__hash__", lambda n: hash(n)),
<add> ("__repr__", lambda n: repr(n)),
<add> ("__getitem__", lambda n: n[np.float64]),
<add> ("__getitem__", lambda n: n[ScalarType][np.float64]),
<add> ("__getitem__", lambda n: n[Union[np.int64, ScalarType]][np.float64]),
<add> ("__eq__", lambda n: n == n),
<add> ("__ne__", lambda n: n != np.ndarray),
<add> ("__dir__", lambda n: dir(n)),
<add> ("__call__", lambda n: n((5,), np.int64)),
<add> ("__call__", lambda n: n(shape=(5,), dtype=np.int64)),
<add> ("subclassing", lambda n: _get_subclass_mro(n)),
<add> ("pickle", lambda n: n == pickle.loads(pickle.dumps(n))),
<add> ("__weakref__", lambda n: n == weakref.ref(n)()),
<add> ]
<add> )
<add> def test_pass(self, name: str, func: FuncType) -> None:
<add> """Compare `types.GenericAlias` with its numpy-based backport.
<add>
<add> Check whether ``func`` runs as intended and that both `GenericAlias`
<add> and `_GenericAlias` return the same result.
<add>
<add> """
<add> value = func(NDArray)
<add>
<add> if sys.version_info >= (3, 9):
<add> value_ref = func(NDArray_ref)
<add> assert value == value_ref
<add>
<add> @pytest.mark.parametrize("name", GETATTR_NAMES)
<add> def test_getattr(self, name: str) -> None:
<add> """Test that `getattr` wraps around the underlying type (``__origin__``)."""
<add> value = getattr(NDArray, name)
<add> value_ref1 = getattr(np.ndarray, name)
<add>
<add> if sys.version_info >= (3, 9):
<add> value_ref2 = getattr(NDArray_ref, name)
<add> assert value == value_ref1 == value_ref2
<add> else:
<add> assert value == value_ref1
<add>
<add> @pytest.mark.parametrize(
<add> "name,exc_type,func",
<add> [
<add> ("__getitem__", TypeError, lambda n: n[()]),
<add> ("__getitem__", TypeError, lambda n: n[Any, Any]),
<add> ("__getitem__", TypeError, lambda n: n[Any][Any]),
<add> ("__instancecheck__", TypeError, lambda n: isinstance(np.array(1), n)),
<add> ("__subclasscheck__", TypeError, lambda n: issubclass(np.ndarray, n)),
<add> ("__setattr__", AttributeError, lambda n: setattr(n, "__origin__", int)),
<add> ("__setattr__", AttributeError, lambda n: setattr(n, "test", int)),
<add> ("__getattribute__", AttributeError, lambda n: getattr(n, "test")),
<add> ]
<add> )
<add> def test_raise(
<add> self,
<add> name: str,
<add> exc_type: Type[BaseException],
<add> func: FuncType,
<add> ) -> None:
<add> """Test operations that are supposed to raise."""
<add> with pytest.raises(exc_type):
<add> func(NDArray)
<add>
<add> if sys.version_info >= (3, 9):
<add> with pytest.raises(exc_type):
<add> func(NDArray_ref) | 1 |
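The tests above compare numpy's `_GenericAlias` backport against the stdlib `types.GenericAlias` on Python 3.9+. A minimal sketch of the stdlib surface being mirrored (requires Python 3.9 or newer; standard-library calls only):

```python
import types

# types.GenericAlias is what subscription like list[int] produces on
# Python 3.9+; the numpy backport reimplements this surface for older
# interpreters.
alias = types.GenericAlias(dict, (str, int))

origin = alias.__origin__   # the unparameterized type: dict
args = alias.__args__       # the type parameters: (str, int)
instance = alias()          # calling the alias constructs the origin type
```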
PHP | PHP | add source file to collection's dd function output | 78104e0530c930a5585bbc5250d63502b115e207 | <ide><path>src/Illuminate/Foundation/Concerns/ResolvesDumpSource.php
<ide> public function resolveDumpSource()
<ide> $sourceKey = null;
<ide>
<ide> foreach ($trace as $traceKey => $traceFile) {
<del> if (isset($traceFile['file']) && str_ends_with(
<del> $traceFile['file'],
<del> 'dump.php'
<del> )) {
<add> if (! isset($traceFile['file'])) {
<add> continue;
<add> }
<add>
<add> if (str_ends_with($traceFile['file'], 'dump.php')) {
<ide> $sourceKey = $traceKey + 1;
<add> break;
<add> }
<ide>
<add> if (str_ends_with($traceFile['file'], 'EnumeratesValues.php')) {
<add> $sourceKey = $traceKey + 4;
<ide> break;
<ide> }
<ide> } | 1 |
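The PHP change above scans a backtrace for the frame that invoked a dump helper and offsets from it by a different amount per helper file. A rough, language-agnostic sketch of that frame-scanning idea in Python — the frame dicts and offsets here mirror the diff but are illustrative, not Laravel's implementation:

```python
def resolve_dump_source(trace):
    """Return the index of the frame to report as the dump call site.

    Frames without a 'file' entry are skipped. A frame ending in
    'dump.php' points one frame up the stack; a frame ending in
    'EnumeratesValues.php' (the collection proxy) points four frames
    up. Returns None when no dump frame is found.
    """
    for key, frame in enumerate(trace):
        file = frame.get("file")
        if file is None:
            continue
        if file.endswith("dump.php"):
            return key + 1
        if file.endswith("EnumeratesValues.php"):
            return key + 4
    return None
```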
Text | Text | adopt contributor covenant | a939f8b31786055aba4a8284f2fc23e7f790cba3 | <ide><path>CODE_OF_CONDUCT.md
<ide> # Code of Conduct
<ide>
<del>Facebook has adopted a Code of Conduct that we expect project participants to adhere to.
<del>Please read the [full text](https://code.fb.com/codeofconduct/)
<del>so that you can understand what actions will and will not be tolerated.
<add>## Our Pledge
<add>
<add>In the interest of fostering an open and welcoming environment, we as
<add>contributors and maintainers pledge to make participation in our project and
<add>our community a harassment-free experience for everyone, regardless of age, body
<add>size, disability, ethnicity, sex characteristics, gender identity and expression,
<add>level of experience, education, socio-economic status, nationality, personal
<add>appearance, race, religion, or sexual identity and orientation.
<add>
<add>## Our Standards
<add>
<add>Examples of behavior that contributes to creating a positive environment
<add>include:
<add>
<add>* Using welcoming and inclusive language
<add>* Being respectful of differing viewpoints and experiences
<add>* Gracefully accepting constructive criticism
<add>* Focusing on what is best for the community
<add>* Showing empathy towards other community members
<add>
<add>Examples of unacceptable behavior by participants include:
<add>
<add>* The use of sexualized language or imagery and unwelcome sexual attention or
<add> advances
<add>* Trolling, insulting/derogatory comments, and personal or political attacks
<add>* Public or private harassment
<add>* Publishing others' private information, such as a physical or electronic
<add> address, without explicit permission
<add>* Other conduct which could reasonably be considered inappropriate in a
<add> professional setting
<add>
<add>## Our Responsibilities
<add>
<add>Project maintainers are responsible for clarifying the standards of acceptable
<add>behavior and are expected to take appropriate and fair corrective action in
<add>response to any instances of unacceptable behavior.
<add>
<add>Project maintainers have the right and responsibility to remove, edit, or
<add>reject comments, commits, code, wiki edits, issues, and other contributions
<add>that are not aligned to this Code of Conduct, or to ban temporarily or
<add>permanently any contributor for other behaviors that they deem inappropriate,
<add>threatening, offensive, or harmful.
<add>
<add>## Scope
<add>
<add>This Code of Conduct applies within all project spaces, and it also applies when
<add>an individual is representing the project or its community in public spaces.
<add>Examples of representing a project or community include using an official
<add>project e-mail address, posting via an official social media account, or acting
<add>as an appointed representative at an online or offline event. Representation of
<add>a project may be further defined and clarified by project maintainers.
<add>
<add>## Enforcement
<add>
<add>Instances of abusive, harassing, or otherwise unacceptable behavior may be
<add>reported by contacting the project team at <opensource-conduct@fb.com>. All
<add>complaints will be reviewed and investigated and will result in a response that
<add>is deemed necessary and appropriate to the circumstances. The project team is
<add>obligated to maintain confidentiality with regard to the reporter of an incident.
<add>Further details of specific enforcement policies may be posted separately.
<add>
<add>Project maintainers who do not follow or enforce the Code of Conduct in good
<add>faith may face temporary or permanent repercussions as determined by other
<add>members of the project's leadership.
<add>
<add>## Attribution
<add>
<add>This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
<add>available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html
<add>
<add>[homepage]: https://www.contributor-covenant.org
<add>
<add>For answers to common questions about this code of conduct, see
<add>https://www.contributor-covenant.org/faq
<add> | 1 |
Ruby | Ruby | test explicit skip | 3b0f917b1dfabce6a6b338e4a7cb02995c055596 | <ide><path>actionpack/test/controller/new_base/render_streaming_test.rb
<ide> class BasicController < ActionController::Base
<ide> )]
<ide>
<ide> layout "application"
<del> stream :only => :hello_world
<add> stream :only => [:hello_world, :skip]
<ide>
<ide> def hello_world
<ide> end
<ide>
<add> def skip
<add> render :action => "hello_world", :stream => false
<add> end
<add>
<ide> def explicit
<ide> render :action => "hello_world", :stream => true
<ide> end
<ide> class StreamingTest < Rack::TestCase
<ide> assert_streaming!
<ide> end
<ide>
<add> test "skip rendering with streaming at render level" do
<add> get "/render_streaming/basic/skip"
<add> assert_body "Hello world, I'm here!"
<add> end
<add>
<ide> def assert_streaming!(cache="no-cache")
<ide> assert_status 200
<ide> assert_equal nil, headers["Content-Length"] | 1 |
Go | Go | lintify code with confidence=1 | 5e941f1ca035bb1ec014c18d277aecaa41deba85 | <ide><path>api.go
<ide> func postContainersKill(srv *Server, version float64, w http.ResponseWriter, r *
<ide>
<ide> signal := 0
<ide> if r != nil {
<del> s := r.Form.Get("signal")
<del> if s != "" {
<del> if s, err := strconv.Atoi(s); err != nil {
<add> if s := r.Form.Get("signal"); s != "" {
<add> s, err := strconv.Atoi(s)
<add> if err != nil {
<ide> return err
<del> } else {
<del> signal = s
<ide> }
<add> signal = s
<ide> }
<ide> }
<ide> if err := srv.ContainerKill(name, signal); err != nil {
<ide> func getImagesJSON(srv *Server, version float64, w http.ResponseWriter, r *http.
<ide> }
<ide>
<ide> return writeJSON(w, http.StatusOK, outs2)
<del> } else {
<del> return writeJSON(w, http.StatusOK, outs)
<ide> }
<add> return writeJSON(w, http.StatusOK, outs)
<ide> }
<ide>
<ide> func getImagesViz(srv *Server, version float64, w http.ResponseWriter, r *http.Request, vars map[string]string) error {
<ide> func getContainersTop(srv *Server, version float64, w http.ResponseWriter, r *ht
<ide> if err := parseForm(r); err != nil {
<ide> return err
<ide> }
<del> name := vars["name"]
<del> ps_args := r.Form.Get("ps_args")
<del> procsStr, err := srv.ContainerTop(name, ps_args)
<add> procsStr, err := srv.ContainerTop(vars["name"], r.Form.Get("ps_args"))
<ide> if err != nil {
<ide> return err
<ide> }
<del>
<ide> return writeJSON(w, http.StatusOK, procsStr)
<ide> }
<ide>
<ide> func getContainersJSON(srv *Server, version float64, w http.ResponseWriter, r *h
<ide> if version < 1.5 {
<ide> outs2 := []APIContainersOld{}
<ide> for _, ctnr := range outs {
<del> outs2 = append(outs2, ctnr.ToLegacy())
<add> outs2 = append(outs2, *ctnr.ToLegacy())
<ide> }
<ide>
<ide> return writeJSON(w, http.StatusOK, outs2)
<del> } else {
<del> return writeJSON(w, http.StatusOK, outs)
<ide> }
<add> return writeJSON(w, http.StatusOK, outs)
<ide> }
<ide>
<ide> func postImagesTag(srv *Server, version float64, w http.ResponseWriter, r *http.Request, vars map[string]string) error {
<ide> func deleteImages(srv *Server, version float64, w http.ResponseWriter, r *http.R
<ide> if imgs != nil {
<ide> if len(imgs) != 0 {
<ide> return writeJSON(w, http.StatusOK, imgs)
<del> } else {
<del> return fmt.Errorf("Conflict, %s wasn't deleted", name)
<ide> }
<del> } else {
<del> w.WriteHeader(http.StatusNoContent)
<add> return fmt.Errorf("Conflict, %s wasn't deleted", name)
<ide> }
<add> w.WriteHeader(http.StatusNoContent)
<ide> return nil
<ide> }
<ide>
<ide><path>api_params.go
<ide> package docker
<ide>
<ide> import "strings"
<ide>
<del>type APIHistory struct {
<del> ID string `json:"Id"`
<del> Tags []string `json:",omitempty"`
<del> Created int64
<del> CreatedBy string `json:",omitempty"`
<del> Size int64
<del>}
<del>
<del>type APIImages struct {
<del> ID string `json:"Id"`
<del> RepoTags []string `json:",omitempty"`
<del> Created int64
<del> Size int64
<del> VirtualSize int64
<del> ParentId string `json:",omitempty"`
<del>}
<del>
<del>type APIImagesOld struct {
<del> Repository string `json:",omitempty"`
<del> Tag string `json:",omitempty"`
<del> ID string `json:"Id"`
<del> Created int64
<del> Size int64
<del> VirtualSize int64
<del>}
<del>
<del>func (self *APIImages) ToLegacy() []APIImagesOld {
<del>
<del> outs := []APIImagesOld{}
<del> for _, repotag := range self.RepoTags {
<add>type (
<add> APIHistory struct {
<add> ID string `json:"Id"`
<add> Tags []string `json:",omitempty"`
<add> Created int64
<add> CreatedBy string `json:",omitempty"`
<add> Size int64
<add> }
<ide>
<del> components := strings.SplitN(repotag, ":", 2)
<add> APIImages struct {
<add> ID string `json:"Id"`
<add> RepoTags []string `json:",omitempty"`
<add> Created int64
<add> Size int64
<add> VirtualSize int64
<add> ParentId string `json:",omitempty"`
<add> }
<ide>
<del> outs = append(outs, APIImagesOld{
<del> ID: self.ID,
<del> Repository: components[0],
<del> Tag: components[1],
<del> Created: self.Created,
<del> Size: self.Size,
<del> VirtualSize: self.VirtualSize,
<del> })
<add> APIImagesOld struct {
<add> Repository string `json:",omitempty"`
<add> Tag string `json:",omitempty"`
<add> ID string `json:"Id"`
<add> Created int64
<add> Size int64
<add> VirtualSize int64
<ide> }
<ide>
<del> return outs
<del>}
<add> APIInfo struct {
<add> Debug bool
<add> Containers int
<add> Images int
<add> NFd int `json:",omitempty"`
<add> NGoroutines int `json:",omitempty"`
<add> MemoryLimit bool `json:",omitempty"`
<add> SwapLimit bool `json:",omitempty"`
<add> IPv4Forwarding bool `json:",omitempty"`
<add> LXCVersion string `json:",omitempty"`
<add> NEventsListener int `json:",omitempty"`
<add> KernelVersion string `json:",omitempty"`
<add> IndexServerAddress string `json:",omitempty"`
<add> }
<ide>
<del>type APIInfo struct {
<del> Debug bool
<del> Containers int
<del> Images int
<del> NFd int `json:",omitempty"`
<del> NGoroutines int `json:",omitempty"`
<del> MemoryLimit bool `json:",omitempty"`
<del> SwapLimit bool `json:",omitempty"`
<del> IPv4Forwarding bool `json:",omitempty"`
<del> LXCVersion string `json:",omitempty"`
<del> NEventsListener int `json:",omitempty"`
<del> KernelVersion string `json:",omitempty"`
<del> IndexServerAddress string `json:",omitempty"`
<del>}
<add> APITop struct {
<add> Titles []string
<add> Processes [][]string
<add> }
<ide>
<del>type APITop struct {
<del> Titles []string
<del> Processes [][]string
<del>}
<add> APIRmi struct {
<add> Deleted string `json:",omitempty"`
<add> Untagged string `json:",omitempty"`
<add> }
<ide>
<del>type APIRmi struct {
<del> Deleted string `json:",omitempty"`
<del> Untagged string `json:",omitempty"`
<del>}
<add> APIContainers struct {
<add> ID string `json:"Id"`
<add> Image string
<add> Command string
<add> Created int64
<add> Status string
<add> Ports []APIPort
<add> SizeRw int64
<add> SizeRootFs int64
<add> Names []string
<add> }
<ide>
<del>type APIContainers struct {
<del> ID string `json:"Id"`
<del> Image string
<del> Command string
<del> Created int64
<del> Status string
<del> Ports []APIPort
<del> SizeRw int64
<del> SizeRootFs int64
<del> Names []string
<del>}
<add> APIContainersOld struct {
<add> ID string `json:"Id"`
<add> Image string
<add> Command string
<add> Created int64
<add> Status string
<add> Ports string
<add> SizeRw int64
<add> SizeRootFs int64
<add> }
<ide>
<del>func (self *APIContainers) ToLegacy() APIContainersOld {
<del> return APIContainersOld{
<del> ID: self.ID,
<del> Image: self.Image,
<del> Command: self.Command,
<del> Created: self.Created,
<del> Status: self.Status,
<del> Ports: displayablePorts(self.Ports),
<del> SizeRw: self.SizeRw,
<del> SizeRootFs: self.SizeRootFs,
<add> APIID struct {
<add> ID string `json:"Id"`
<ide> }
<del>}
<ide>
<del>type APIContainersOld struct {
<del> ID string `json:"Id"`
<del> Image string
<del> Command string
<del> Created int64
<del> Status string
<del> Ports string
<del> SizeRw int64
<del> SizeRootFs int64
<del>}
<add> APIRun struct {
<add> ID string `json:"Id"`
<add> Warnings []string `json:",omitempty"`
<add> }
<ide>
<del>type APIID struct {
<del> ID string `json:"Id"`
<del>}
<add> APIPort struct {
<add> PrivatePort int64
<add> PublicPort int64
<add> Type string
<add> IP string
<add> }
<ide>
<del>type APIRun struct {
<del> ID string `json:"Id"`
<del> Warnings []string `json:",omitempty"`
<del>}
<add> APIVersion struct {
<add> Version string
<add> GitCommit string `json:",omitempty"`
<add> GoVersion string `json:",omitempty"`
<add> }
<ide>
<del>type APIPort struct {
<del> PrivatePort int64
<del> PublicPort int64
<del> Type string
<del> IP string
<del>}
<add> APIWait struct {
<add> StatusCode int
<add> }
<ide>
<del>type APIVersion struct {
<del> Version string
<del> GitCommit string `json:",omitempty"`
<del> GoVersion string `json:",omitempty"`
<del>}
<add> APIAuth struct {
<add> Status string
<add> }
<ide>
<del>type APIWait struct {
<del> StatusCode int
<del>}
<add> APIImageConfig struct {
<add> ID string `json:"Id"`
<add> *Config
<add> }
<ide>
<del>type APIAuth struct {
<del> Status string
<del>}
<add> APICopy struct {
<add> Resource string
<add> HostPath string
<add> }
<add>)
<ide>
<del>type APIImageConfig struct {
<del> ID string `json:"Id"`
<del> *Config
<add>func (api APIImages) ToLegacy() []APIImagesOld {
<add> outs := []APIImagesOld{}
<add> for _, repotag := range api.RepoTags {
<add> components := strings.SplitN(repotag, ":", 2)
<add> outs = append(outs, APIImagesOld{
<add> ID: api.ID,
<add> Repository: components[0],
<add> Tag: components[1],
<add> Created: api.Created,
<add> Size: api.Size,
<add> VirtualSize: api.VirtualSize,
<add> })
<add> }
<add> return outs
<ide> }
<ide>
<del>type APICopy struct {
<del> Resource string
<del> HostPath string
<add>func (api APIContainers) ToLegacy() *APIContainersOld {
<add> return &APIContainersOld{
<add> ID: api.ID,
<add> Image: api.Image,
<add> Command: api.Command,
<add> Created: api.Created,
<add> Status: api.Status,
<add> Ports: displayablePorts(api.Ports),
<add> SizeRw: api.SizeRw,
<add> SizeRootFs: api.SizeRootFs,
<add> }
<ide> }
<ide><path>archive/archive.go
<ide> func Untar(archive io.Reader, path string) error {
<ide> buf := make([]byte, 10)
<ide> totalN := 0
<ide> for totalN < 10 {
<del> if n, err := archive.Read(buf[totalN:]); err != nil {
<add> n, err := archive.Read(buf[totalN:])
<add> if err != nil {
<ide> if err == io.EOF {
<ide> return fmt.Errorf("Tarball too short")
<ide> }
<ide> return err
<del> } else {
<del> totalN += n
<del> utils.Debugf("[tar autodetect] n: %d", n)
<ide> }
<add> totalN += n
<add> utils.Debugf("[tar autodetect] n: %d", n)
<ide> }
<add>
<ide> compression := DetectCompression(buf)
<ide>
<ide> utils.Debugf("Archive compression detected: %s", compression.Extension())
<ide><path>auth/auth.go
<ide> func Login(authConfig *AuthConfig, factory *utils.HTTPRequestFactory) (string, e
<ide> if loginAgainstOfficialIndex {
<ide> return "", fmt.Errorf("Login: Your account hasn't been activated. " +
<ide> "Please check your e-mail for a confirmation link.")
<del> } else {
<del> return "", fmt.Errorf("Login: Your account hasn't been activated. " +
<del> "Please see the documentation of the registry " + serverAddress + " for instructions how to activate it.")
<ide> }
<add> return "", fmt.Errorf("Login: Your account hasn't been activated. " +
<add> "Please see the documentation of the registry " + serverAddress + " for instructions how to activate it.")
<ide> } else if reqStatusCode == 400 {
<ide> if string(reqBody) == "\"Username or email already exists\"" {
<ide> req, err := factory.NewRequest("GET", serverAddress+"users/", nil)
<ide><path>commands.go
<ide> func (cli *DockerCli) stream(method, path string, in io.Reader, out io.Writer, h
<ide>
<ide> if matchesContentType(resp.Header.Get("Content-Type"), "application/json") {
<ide> return utils.DisplayJSONMessagesStream(resp.Body, out)
<del> } else {
<del> if _, err := io.Copy(out, resp.Body); err != nil {
<del> return err
<del> }
<add> }
<add> if _, err := io.Copy(out, resp.Body); err != nil {
<add> return err
<ide> }
<ide> return nil
<ide> }
<ide><path>graph.go
<ide> func (graph *Graph) getDockerInitLayer() (string, error) {
<ide> if err := os.MkdirAll(path.Join(initLayer, path.Dir(pth)), 0755); err != nil {
<ide> return "", err
<ide> }
<del>
<del> if f, err := os.OpenFile(path.Join(initLayer, pth), os.O_CREATE, 0755); err != nil {
<add> f, err := os.OpenFile(path.Join(initLayer, pth), os.O_CREATE, 0755)
<add> if err != nil {
<ide> return "", err
<del> } else {
<del> f.Close()
<ide> }
<add> f.Close()
<ide> }
<ide> } else {
<ide> return "", err
<ide><path>image.go
<ide> func LoadImage(root string) (*Image, error) {
<ide> return nil, err
<ide> }
<ide> } else {
<del> if size, err := strconv.Atoi(string(buf)); err != nil {
<add> size, err := strconv.Atoi(string(buf))
<add> if err != nil {
<ide> return nil, err
<del> } else {
<del> img.Size = int64(size)
<ide> }
<add> img.Size = int64(size)
<ide> }
<ide>
<ide> // Check that the filesystem layer exists
<ide> func StoreImage(img *Image, jsonData []byte, layerData archive.Archive, root str
<ide> // If raw json is provided, then use it
<ide> if jsonData != nil {
<ide> return ioutil.WriteFile(jsonPath(root), jsonData, 0600)
<del> } else { // Otherwise, unmarshal the image
<del> jsonData, err := json.Marshal(img)
<del> if err != nil {
<del> return err
<del> }
<del> if err := ioutil.WriteFile(jsonPath(root), jsonData, 0600); err != nil {
<del> return err
<del> }
<add> }
<add> // Otherwise, unmarshal the image
<add> jsonData, err := json.Marshal(img)
<add> if err != nil {
<add> return err
<add> }
<add> if err := ioutil.WriteFile(jsonPath(root), jsonData, 0600); err != nil {
<add> return err
<ide> }
<ide>
<ide> return StoreSize(img, root)
<ide> func StoreImage(img *Image, jsonData []byte, layerData archive.Archive, root str
<ide> func StoreSize(img *Image, root string) error {
<ide> layer := layerPath(root)
<ide>
<del> var totalSize int64 = 0
<add> var totalSize int64
<ide> filepath.Walk(layer, func(path string, fileInfo os.FileInfo, err error) error {
<ide> totalSize += fileInfo.Size()
<ide> return nil
<ide> func MountAUFS(ro []string, rw string, target string) error {
<ide> }
<ide>
<ide> // TarLayer returns a tar archive of the image's filesystem layer.
<del>func (image *Image) TarLayer(compression archive.Compression) (archive.Archive, error) {
<del> layerPath, err := image.layer()
<add>func (img *Image) TarLayer(compression archive.Compression) (archive.Archive, error) {
<add> layerPath, err := img.layer()
<ide> if err != nil {
<ide> return nil, err
<ide> }
<ide> return archive.Tar(layerPath, compression)
<ide> }
<ide>
<del>func (image *Image) Mount(root, rw string) error {
<add>func (img *Image) Mount(root, rw string) error {
<ide> if mounted, err := Mounted(root); err != nil {
<ide> return err
<ide> } else if mounted {
<ide> return fmt.Errorf("%s is already mounted", root)
<ide> }
<del> layers, err := image.layers()
<add> layers, err := img.layers()
<ide> if err != nil {
<ide> return err
<ide> }
<ide> func (image *Image) Mount(root, rw string) error {
<ide> return nil
<ide> }
<ide>
<del>func (image *Image) Changes(rw string) ([]Change, error) {
<del> layers, err := image.layers()
<add>func (img *Image) Changes(rw string) ([]Change, error) {
<add> layers, err := img.layers()
<ide> if err != nil {
<ide> return nil, err
<ide> }
<ide> func (img *Image) History() ([]*Image, error) {
<ide> // FIXME: @shykes refactor this function with the new error handling
<ide> // (I'll do it if I have time tonight, I focus on the rest)
<ide> func (img *Image) layers() ([]string, error) {
<del> var list []string
<del> var e error
<add> var (
<add> list []string
<add> e error
<add> )
<ide> if err := img.WalkHistory(
<ide> func(img *Image) (err error) {
<ide> if layer, err := img.layer(); err != nil {
<ide> func (img *Image) layers() ([]string, error) {
<ide> }
<ide>
<ide> // Inject the dockerinit layer (empty place-holder for mount-binding dockerinit)
<del> if dockerinitLayer, err := img.getDockerInitLayer(); err != nil {
<add> dockerinitLayer, err := img.getDockerInitLayer()
<add> if err != nil {
<ide> return nil, err
<del> } else {
<del> list = append([]string{dockerinitLayer}, list...)
<ide> }
<del> return list, nil
<add> return append([]string{dockerinitLayer}, list...), nil
<ide> }
<ide>
<ide> func (img *Image) WalkHistory(handler func(*Image) error) (err error) {
<ide><path>runtime.go
<ide> package docker
<ide>
<ide> import (
<del> _ "code.google.com/p/gosqlite/sqlite3"
<add> _ "code.google.com/p/gosqlite/sqlite3" // registers sqlite
<ide> "container/list"
<ide> "database/sql"
<ide> "fmt"
<ide><path>server.go
<ide> func (srv *Server) ImageHistory(name string) ([]APIHistory, error) {
<ide>
<ide> }
<ide>
<del>func (srv *Server) ContainerTop(name, ps_args string) (*APITop, error) {
<add>func (srv *Server) ContainerTop(name, psArgs string) (*APITop, error) {
<ide> if container := srv.runtime.Get(name); container != nil {
<del> output, err := exec.Command("lxc-ps", "--name", container.ID, "--", ps_args).CombinedOutput()
<add> output, err := exec.Command("lxc-ps", "--name", container.ID, "--", psArgs).CombinedOutput()
<ide> if err != nil {
<ide> return nil, fmt.Errorf("lxc-ps: %s (%s)", err, output)
<ide> }
<ide> func (srv *Server) pushRepository(r *registry.Registry, out io.Writer, localName
<ide> out.Write(sf.FormatStatus("", "Image %s already pushed, skipping", elem.ID))
<ide> continue
<ide> }
<del> if checksum, err := srv.pushImage(r, out, remoteName, elem.ID, ep, repoData.Tokens, sf); err != nil {
<add> checksum, err := srv.pushImage(r, out, remoteName, elem.ID, ep, repoData.Tokens, sf)
<add> if err != nil {
<ide> // FIXME: Continue on error?
<ide> return err
<del> } else {
<del> elem.Checksum = checksum
<ide> }
<add> elem.Checksum = checksum
<add>
<ide> if err := pushTags(); err != nil {
<ide> return err
<ide> }
<ide> func (srv *Server) pushImage(r *registry.Registry, out io.Writer, remote, imgID,
<ide> defer os.RemoveAll(layerData.Name())
<ide>
<ide> // Send the layer
<del> if checksum, err := r.PushImageLayerRegistry(imgData.ID, utils.ProgressReader(layerData, int(layerData.Size), out, sf.FormatProgress("", "Pushing", "%8v/%v (%v)"), sf, false), ep, token, jsonRaw); err != nil {
<add> checksum, err = r.PushImageLayerRegistry(imgData.ID, utils.ProgressReader(layerData, int(layerData.Size), out, sf.FormatProgress("", "Pushing", "%8v/%v (%v)"), sf, false), ep, token, jsonRaw)
<add> if err != nil {
<ide> return "", err
<del> } else {
<del> imgData.Checksum = checksum
<ide> }
<add> imgData.Checksum = checksum
<add>
<ide> out.Write(sf.FormatStatus("", ""))
<ide>
<ide> // Send the checksum
<ide><path>utils/http.go
<ide> func NewHTTPUserAgentDecorator(versions ...VersionInfo) HTTPRequestDecorator {
<ide> return ret
<ide> }
<ide>
<del>func (self *HTTPUserAgentDecorator) ChangeRequest(req *http.Request) (newReq *http.Request, err error) {
<add>func (h *HTTPUserAgentDecorator) ChangeRequest(req *http.Request) (newReq *http.Request, err error) {
<ide> if req == nil {
<ide> return req, nil
<ide> }
<ide>
<del> userAgent := appendVersions(req.UserAgent(), self.versions...)
<add> userAgent := appendVersions(req.UserAgent(), h.versions...)
<ide> if len(userAgent) > 0 {
<ide> req.Header.Set("User-Agent", userAgent)
<ide> }
<ide> type HTTPMetaHeadersDecorator struct {
<ide> Headers map[string][]string
<ide> }
<ide>
<del>func (self *HTTPMetaHeadersDecorator) ChangeRequest(req *http.Request) (newReq *http.Request, err error) {
<del> if self.Headers == nil {
<add>func (h *HTTPMetaHeadersDecorator) ChangeRequest(req *http.Request) (newReq *http.Request, err error) {
<add> if h.Headers == nil {
<ide> return req, nil
<ide> }
<del> for k, v := range self.Headers {
<add> for k, v := range h.Headers {
<ide> req.Header[k] = v
<ide> }
<ide> return req, nil
<ide> type HTTPRequestFactory struct {
<ide> }
<ide>
<ide> func NewHTTPRequestFactory(d ...HTTPRequestDecorator) *HTTPRequestFactory {
<del> ret := new(HTTPRequestFactory)
<del> ret.decorators = d
<del> return ret
<add> return &HTTPRequestFactory{
<add> decorators: d,
<add> }
<ide> }
<ide>
<ide> // NewRequest() creates a new *http.Request,
<ide> // applies all decorators in the HTTPRequestFactory on the request,
<ide> // then applies decorators provided by d on the request.
<del>func (self *HTTPRequestFactory) NewRequest(method, urlStr string, body io.Reader, d ...HTTPRequestDecorator) (*http.Request, error) {
<add>func (h *HTTPRequestFactory) NewRequest(method, urlStr string, body io.Reader, d ...HTTPRequestDecorator) (*http.Request, error) {
<ide> req, err := http.NewRequest(method, urlStr, body)
<ide> if err != nil {
<ide> return nil, err
<ide> }
<ide>
<ide> // By default, a nil factory should work.
<del> if self == nil {
<add> if h == nil {
<ide> return req, nil
<ide> }
<del> for _, dec := range self.decorators {
<add> for _, dec := range h.decorators {
<ide> req, err = dec.ChangeRequest(req)
<ide> if err != nil {
<ide> return nil, err
<ide><path>utils/utils.go
<ide> func (graph *DependencyGraph) GenerateTraversalMap() ([][]string, error) {
<ide> for len(processed) < len(graph.nodes) {
<ide> // Use a temporary buffer for processed nodes, otherwise
<ide> // nodes that depend on each other could end up in the same round.
<del> tmp_processed := []*DependencyNode{}
<add> tmpProcessed := []*DependencyNode{}
<ide> for _, node := range graph.nodes {
<ide> // If the node has more dependencies than what we have cleared,
<ide> // it won't be valid for this round.
<ide> func (graph *DependencyGraph) GenerateTraversalMap() ([][]string, error) {
<ide> // It's not been processed yet and has 0 deps. Add it!
<ide> // (this is a shortcut for what we're doing below)
<ide> if node.Degree() == 0 {
<del> tmp_processed = append(tmp_processed, node)
<add> tmpProcessed = append(tmpProcessed, node)
<ide> continue
<ide> }
<ide> // If at least one dep hasn't been processed yet, we can't
<ide> func (graph *DependencyGraph) GenerateTraversalMap() ([][]string, error) {
<ide> }
<ide> // All deps have already been processed. Add it!
<ide> if ok {
<del> tmp_processed = append(tmp_processed, node)
<add> tmpProcessed = append(tmpProcessed, node)
<ide> }
<ide> }
<del> Debugf("Round %d: found %d available nodes", len(result), len(tmp_processed))
<add> Debugf("Round %d: found %d available nodes", len(result), len(tmpProcessed))
<ide> // If no progress has been made this round,
<ide> // that means we have circular dependencies.
<del> if len(tmp_processed) == 0 {
<add> if len(tmpProcessed) == 0 {
<ide> return nil, fmt.Errorf("Could not find a solution to this dependency graph")
<ide> }
<ide> round := []string{}
<del> for _, nd := range tmp_processed {
<add> for _, nd := range tmpProcessed {
<ide> round = append(round, nd.id)
<ide> processed[nd] = true
<ide> } | 11 |
Javascript | Javascript | fix useid in strict mode | 00ced1e2b7610543a519329a76ad0bfd12cd1c32 | <ide><path>packages/react-dom/src/__tests__/ReactDOMUseId-test.js
<ide> let ReactDOMFizzServer;
<ide> let Stream;
<ide> let Suspense;
<ide> let useId;
<add>let useState;
<ide> let document;
<ide> let writable;
<ide> let container;
<ide> describe('useId', () => {
<ide> Stream = require('stream');
<ide> Suspense = React.Suspense;
<ide> useId = React.useId;
<add> useState = React.useState;
<ide>
<ide> // Test Environment
<ide> const jsdom = new JSDOM(
<ide> describe('useId', () => {
<ide> `);
<ide> });
<ide>
<add> test('StrictMode double rendering', async () => {
<add> const {StrictMode} = React;
<add>
<add> function App() {
<add> return (
<add> <StrictMode>
<add> <DivWithId />
<add> </StrictMode>
<add> );
<add> }
<add>
<add> await serverAct(async () => {
<add> const {pipe} = ReactDOMFizzServer.renderToPipeableStream(<App />);
<add> pipe(writable);
<add> });
<add> await clientAct(async () => {
<add> ReactDOM.hydrateRoot(container, <App />);
<add> });
<add> expect(container).toMatchInlineSnapshot(`
<add> <div
<add> id="container"
<add> >
<add> <div
<add> id="0"
<add> />
<add> </div>
<add> `);
<add> });
<add>
<ide> test('empty (null) children', async () => {
<ide> // We don't treat empty children different from non-empty ones, which means
<ide> // they get allocated a slot when generating ids. There's no inherent reason
<ide> describe('useId', () => {
<ide> `);
<ide> });
<ide>
<add> test('local render phase updates', async () => {
<add> function App({swap}) {
<add> const [count, setCount] = useState(0);
<add> if (count < 3) {
<add> setCount(count + 1);
<add> }
<add> return useId();
<add> }
<add>
<add> await serverAct(async () => {
<add> const {pipe} = ReactDOMFizzServer.renderToPipeableStream(<App />);
<add> pipe(writable);
<add> });
<add> await clientAct(async () => {
<add> ReactDOM.hydrateRoot(container, <App />);
<add> });
<add> expect(container).toMatchInlineSnapshot(`
<add> <div
<add> id="container"
<add> >
<add> R:0
<add> <!-- -->
<add> </div>
<add> `);
<add> });
<add>
<ide> test('basic incremental hydration', async () => {
<ide> function App() {
<ide> return (
<ide><path>packages/react-reconciler/src/ReactFiberBeginWork.new.js
<ide> import {
<ide> prepareToReadContext,
<ide> scheduleWorkOnParentPath,
<ide> } from './ReactFiberNewContext.new';
<del>import {renderWithHooks, bailoutHooks} from './ReactFiberHooks.new';
<add>import {
<add> renderWithHooks,
<add> checkDidRenderIdHook,
<add> bailoutHooks,
<add>} from './ReactFiberHooks.new';
<ide> import {stopProfilerTimerIfRunning} from './ReactProfilerTimer.new';
<ide> import {
<ide> getMaskedContext,
<ide> import {
<ide> getForksAtLevel,
<ide> isForkedChild,
<ide> pushTreeId,
<add> pushMaterializedTreeId,
<ide> } from './ReactFiberTreeContext.new';
<ide>
<ide> const ReactCurrentOwner = ReactSharedInternals.ReactCurrentOwner;
<ide> function updateForwardRef(
<ide>
<ide> // The rest is a fork of updateFunctionComponent
<ide> let nextChildren;
<add> let hasId;
<ide> prepareToReadContext(workInProgress, renderLanes);
<ide> if (enableSchedulingProfiler) {
<ide> markComponentRenderStarted(workInProgress);
<ide> function updateForwardRef(
<ide> ref,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> if (
<ide> debugRenderPhaseSideEffectsForStrictMode &&
<ide> workInProgress.mode & StrictLegacyMode
<ide> function updateForwardRef(
<ide> ref,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> } finally {
<ide> setIsStrictModeForDevtools(false);
<ide> }
<ide> function updateForwardRef(
<ide> ref,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> }
<ide> if (enableSchedulingProfiler) {
<ide> markComponentRenderStopped();
<ide> function updateForwardRef(
<ide> return bailoutOnAlreadyFinishedWork(current, workInProgress, renderLanes);
<ide> }
<ide>
<add> if (getIsHydrating() && hasId) {
<add> pushMaterializedTreeId(workInProgress);
<add> }
<add>
<ide> // React DevTools reads this flag.
<ide> workInProgress.flags |= PerformedWork;
<ide> reconcileChildren(current, workInProgress, nextChildren, renderLanes);
<ide> function updateFunctionComponent(
<ide> }
<ide>
<ide> let nextChildren;
<add> let hasId;
<ide> prepareToReadContext(workInProgress, renderLanes);
<ide> if (enableSchedulingProfiler) {
<ide> markComponentRenderStarted(workInProgress);
<ide> function updateFunctionComponent(
<ide> context,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> if (
<ide> debugRenderPhaseSideEffectsForStrictMode &&
<ide> workInProgress.mode & StrictLegacyMode
<ide> function updateFunctionComponent(
<ide> context,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> } finally {
<ide> setIsStrictModeForDevtools(false);
<ide> }
<ide> function updateFunctionComponent(
<ide> context,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> }
<ide> if (enableSchedulingProfiler) {
<ide> markComponentRenderStopped();
<ide> function updateFunctionComponent(
<ide> return bailoutOnAlreadyFinishedWork(current, workInProgress, renderLanes);
<ide> }
<ide>
<add> if (getIsHydrating() && hasId) {
<add> pushMaterializedTreeId(workInProgress);
<add> }
<add>
<ide> // React DevTools reads this flag.
<ide> workInProgress.flags |= PerformedWork;
<ide> reconcileChildren(current, workInProgress, nextChildren, renderLanes);
<ide> function mountIndeterminateComponent(
<ide>
<ide> prepareToReadContext(workInProgress, renderLanes);
<ide> let value;
<add> let hasId;
<ide>
<ide> if (enableSchedulingProfiler) {
<ide> markComponentRenderStarted(workInProgress);
<ide> function mountIndeterminateComponent(
<ide> context,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> setIsRendering(false);
<ide> } else {
<ide> value = renderWithHooks(
<ide> function mountIndeterminateComponent(
<ide> context,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> }
<ide> if (enableSchedulingProfiler) {
<ide> markComponentRenderStopped();
<ide> function mountIndeterminateComponent(
<ide> context,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> } finally {
<ide> setIsStrictModeForDevtools(false);
<ide> }
<ide> }
<ide> }
<ide>
<add> if (getIsHydrating() && hasId) {
<add> pushMaterializedTreeId(workInProgress);
<add> }
<add>
<ide> reconcileChildren(null, workInProgress, value, renderLanes);
<ide> if (__DEV__) {
<ide> validateFunctionComponentInDev(workInProgress, Component);
<ide><path>packages/react-reconciler/src/ReactFiberBeginWork.old.js
<ide> import {
<ide> prepareToReadContext,
<ide> scheduleWorkOnParentPath,
<ide> } from './ReactFiberNewContext.old';
<del>import {renderWithHooks, bailoutHooks} from './ReactFiberHooks.old';
<add>import {
<add> renderWithHooks,
<add> checkDidRenderIdHook,
<add> bailoutHooks,
<add>} from './ReactFiberHooks.old';
<ide> import {stopProfilerTimerIfRunning} from './ReactProfilerTimer.old';
<ide> import {
<ide> getMaskedContext,
<ide> import {
<ide> getForksAtLevel,
<ide> isForkedChild,
<ide> pushTreeId,
<add> pushMaterializedTreeId,
<ide> } from './ReactFiberTreeContext.old';
<ide>
<ide> const ReactCurrentOwner = ReactSharedInternals.ReactCurrentOwner;
<ide> function updateForwardRef(
<ide>
<ide> // The rest is a fork of updateFunctionComponent
<ide> let nextChildren;
<add> let hasId;
<ide> prepareToReadContext(workInProgress, renderLanes);
<ide> if (enableSchedulingProfiler) {
<ide> markComponentRenderStarted(workInProgress);
<ide> function updateForwardRef(
<ide> ref,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> if (
<ide> debugRenderPhaseSideEffectsForStrictMode &&
<ide> workInProgress.mode & StrictLegacyMode
<ide> function updateForwardRef(
<ide> ref,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> } finally {
<ide> setIsStrictModeForDevtools(false);
<ide> }
<ide> function updateForwardRef(
<ide> ref,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> }
<ide> if (enableSchedulingProfiler) {
<ide> markComponentRenderStopped();
<ide> function updateForwardRef(
<ide> return bailoutOnAlreadyFinishedWork(current, workInProgress, renderLanes);
<ide> }
<ide>
<add> if (getIsHydrating() && hasId) {
<add> pushMaterializedTreeId(workInProgress);
<add> }
<add>
<ide> // React DevTools reads this flag.
<ide> workInProgress.flags |= PerformedWork;
<ide> reconcileChildren(current, workInProgress, nextChildren, renderLanes);
<ide> function updateFunctionComponent(
<ide> }
<ide>
<ide> let nextChildren;
<add> let hasId;
<ide> prepareToReadContext(workInProgress, renderLanes);
<ide> if (enableSchedulingProfiler) {
<ide> markComponentRenderStarted(workInProgress);
<ide> function updateFunctionComponent(
<ide> context,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> if (
<ide> debugRenderPhaseSideEffectsForStrictMode &&
<ide> workInProgress.mode & StrictLegacyMode
<ide> function updateFunctionComponent(
<ide> context,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> } finally {
<ide> setIsStrictModeForDevtools(false);
<ide> }
<ide> function updateFunctionComponent(
<ide> context,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> }
<ide> if (enableSchedulingProfiler) {
<ide> markComponentRenderStopped();
<ide> function updateFunctionComponent(
<ide> return bailoutOnAlreadyFinishedWork(current, workInProgress, renderLanes);
<ide> }
<ide>
<add> if (getIsHydrating() && hasId) {
<add> pushMaterializedTreeId(workInProgress);
<add> }
<add>
<ide> // React DevTools reads this flag.
<ide> workInProgress.flags |= PerformedWork;
<ide> reconcileChildren(current, workInProgress, nextChildren, renderLanes);
<ide> function mountIndeterminateComponent(
<ide>
<ide> prepareToReadContext(workInProgress, renderLanes);
<ide> let value;
<add> let hasId;
<ide>
<ide> if (enableSchedulingProfiler) {
<ide> markComponentRenderStarted(workInProgress);
<ide> function mountIndeterminateComponent(
<ide> context,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> setIsRendering(false);
<ide> } else {
<ide> value = renderWithHooks(
<ide> function mountIndeterminateComponent(
<ide> context,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> }
<ide> if (enableSchedulingProfiler) {
<ide> markComponentRenderStopped();
<ide> function mountIndeterminateComponent(
<ide> context,
<ide> renderLanes,
<ide> );
<add> hasId = checkDidRenderIdHook();
<ide> } finally {
<ide> setIsStrictModeForDevtools(false);
<ide> }
<ide> }
<ide> }
<ide>
<add> if (getIsHydrating() && hasId) {
<add> pushMaterializedTreeId(workInProgress);
<add> }
<add>
<ide> reconcileChildren(null, workInProgress, value, renderLanes);
<ide> if (__DEV__) {
<ide> validateFunctionComponentInDev(workInProgress, Component);
<ide><path>packages/react-reconciler/src/ReactFiberHooks.new.js
<ide> import {
<ide> } from './ReactUpdateQueue.new';
<ide> import {pushInterleavedQueue} from './ReactFiberInterleavedUpdates.new';
<ide> import {warnOnSubscriptionInsideStartTransition} from 'shared/ReactFeatureFlags';
<del>import {getTreeId, pushTreeFork, pushTreeId} from './ReactFiberTreeContext.new';
<add>import {getTreeId} from './ReactFiberTreeContext.new';
<ide>
<ide> const {ReactCurrentDispatcher, ReactCurrentBatchConfig} = ReactSharedInternals;
<ide>
<ide> export function renderWithHooks<Props, SecondArg>(
<ide> let numberOfReRenders: number = 0;
<ide> do {
<ide> didScheduleRenderPhaseUpdateDuringThisPass = false;
<add> localIdCounter = 0;
<ide>
<ide> if (numberOfReRenders >= RE_RENDER_LIMIT) {
<ide> throw new Error(
<ide> export function renderWithHooks<Props, SecondArg>(
<ide> }
<ide>
<ide> didScheduleRenderPhaseUpdate = false;
<add> // This is reset by checkDidRenderIdHook
<add> // localIdCounter = 0;
<ide>
<ide> if (didRenderTooFewHooks) {
<ide> throw new Error(
<ide> export function renderWithHooks<Props, SecondArg>(
<ide> }
<ide> }
<ide> }
<del>
<del> if (localIdCounter !== 0) {
<del> localIdCounter = 0;
<del> if (getIsHydrating()) {
<del> // This component materialized an id. This will affect any ids that appear
<del> // in its children.
<del> const returnFiber = workInProgress.return;
<del> if (returnFiber !== null) {
<del> const numberOfForks = 1;
<del> const slotIndex = 0;
<del> pushTreeFork(workInProgress, numberOfForks);
<del> pushTreeId(workInProgress, numberOfForks, slotIndex);
<del> }
<del> }
<del> }
<del>
<ide> return children;
<ide> }
<ide>
<add>export function checkDidRenderIdHook() {
<add> // This should be called immediately after every renderWithHooks call.
<add> // Conceptually, it's part of the return value of renderWithHooks; it's only a
<add> // separate function to avoid using an array tuple.
<add> const didRenderIdHook = localIdCounter !== 0;
<add> localIdCounter = 0;
<add> return didRenderIdHook;
<add>}
<add>
<ide> export function bailoutHooks(
<ide> current: Fiber,
<ide> workInProgress: Fiber,
<ide><path>packages/react-reconciler/src/ReactFiberHooks.old.js
<ide> import {
<ide> } from './ReactUpdateQueue.old';
<ide> import {pushInterleavedQueue} from './ReactFiberInterleavedUpdates.old';
<ide> import {warnOnSubscriptionInsideStartTransition} from 'shared/ReactFeatureFlags';
<del>import {getTreeId, pushTreeFork, pushTreeId} from './ReactFiberTreeContext.old';
<add>import {getTreeId} from './ReactFiberTreeContext.old';
<ide>
<ide> const {ReactCurrentDispatcher, ReactCurrentBatchConfig} = ReactSharedInternals;
<ide>
<ide> export function renderWithHooks<Props, SecondArg>(
<ide> let numberOfReRenders: number = 0;
<ide> do {
<ide> didScheduleRenderPhaseUpdateDuringThisPass = false;
<add> localIdCounter = 0;
<ide>
<ide> if (numberOfReRenders >= RE_RENDER_LIMIT) {
<ide> throw new Error(
<ide> export function renderWithHooks<Props, SecondArg>(
<ide> }
<ide>
<ide> didScheduleRenderPhaseUpdate = false;
<add> // This is reset by checkDidRenderIdHook
<add> // localIdCounter = 0;
<ide>
<ide> if (didRenderTooFewHooks) {
<ide> throw new Error(
<ide> export function renderWithHooks<Props, SecondArg>(
<ide> }
<ide> }
<ide> }
<del>
<del> if (localIdCounter !== 0) {
<del> localIdCounter = 0;
<del> if (getIsHydrating()) {
<del> // This component materialized an id. This will affect any ids that appear
<del> // in its children.
<del> const returnFiber = workInProgress.return;
<del> if (returnFiber !== null) {
<del> const numberOfForks = 1;
<del> const slotIndex = 0;
<del> pushTreeFork(workInProgress, numberOfForks);
<del> pushTreeId(workInProgress, numberOfForks, slotIndex);
<del> }
<del> }
<del> }
<del>
<ide> return children;
<ide> }
<ide>
<add>export function checkDidRenderIdHook() {
<add> // This should be called immediately after every renderWithHooks call.
<add> // Conceptually, it's part of the return value of renderWithHooks; it's only a
<add> // separate function to avoid using an array tuple.
<add> const didRenderIdHook = localIdCounter !== 0;
<add> localIdCounter = 0;
<add> return didRenderIdHook;
<add>}
<add>
<ide> export function bailoutHooks(
<ide> current: Fiber,
<ide> workInProgress: Fiber,
<ide><path>packages/react-reconciler/src/ReactFiberTreeContext.new.js
<ide> export function pushTreeId(
<ide> }
<ide> }
<ide>
<add>export function pushMaterializedTreeId(workInProgress: Fiber) {
<add> warnIfNotHydrating();
<add>
<add> // This component materialized an id. This will affect any ids that appear
<add> // in its children.
<add> const returnFiber = workInProgress.return;
<add> if (returnFiber !== null) {
<add> const numberOfForks = 1;
<add> const slotIndex = 0;
<add> pushTreeFork(workInProgress, numberOfForks);
<add> pushTreeId(workInProgress, numberOfForks, slotIndex);
<add> }
<add>}
<add>
<ide> function getBitLength(number: number): number {
<ide> return 32 - clz32(number);
<ide> }
<ide><path>packages/react-reconciler/src/ReactFiberTreeContext.old.js
<ide> export function pushTreeId(
<ide> }
<ide> }
<ide>
<add>export function pushMaterializedTreeId(workInProgress: Fiber) {
<add> warnIfNotHydrating();
<add>
<add> // This component materialized an id. This will affect any ids that appear
<add> // in its children.
<add> const returnFiber = workInProgress.return;
<add> if (returnFiber !== null) {
<add> const numberOfForks = 1;
<add> const slotIndex = 0;
<add> pushTreeFork(workInProgress, numberOfForks);
<add> pushTreeId(workInProgress, numberOfForks, slotIndex);
<add> }
<add>}
<add>
<ide> function getBitLength(number: number): number {
<ide> return 32 - clz32(number);
<ide> }
<ide><path>packages/react-server/src/ReactFizzHooks.js
<ide> export function finishHooks(
<ide> // work-in-progress hooks and applying the additional updates on top. Keep
<ide> // restarting until no more updates are scheduled.
<ide> didScheduleRenderPhaseUpdate = false;
<add> localIdCounter = 0;
<ide> numberOfReRenders += 1;
<ide>
<ide> // Start over from the beginning of the list | 8 |
Ruby | Ruby | use latest versions for dependencies | 0b10efcc4cd08732eba5c78d9267d90451d18c7d | <ide><path>railties/lib/rails/generators/app_base.rb
<ide> def asset_pipeline_gemfile_entry
<ide> return [] if options[:skip_asset_pipeline]
<ide>
<ide> if options[:asset_pipeline] == "sprockets"
<del> GemfileEntry.version "sprockets-rails", ">= 2.0.0",
<add> GemfileEntry.version "sprockets-rails", ">= 3.4.1",
<ide> "The original asset pipeline for Rails [https://github.com/rails/sprockets-rails]"
<ide> elsif options[:asset_pipeline] == "propshaft"
<del> GemfileEntry.version "propshaft", ">= 0.1.7", "The modern asset pipeline for Rails [https://github.com/rails/propshaft/]"
<add> GemfileEntry.version "propshaft", ">= 0.4.1", "The modern asset pipeline for Rails [https://github.com/rails/propshaft/]"
<ide> else
<ide> []
<ide> end
<ide> def rails_version_specifier(gem_version = Rails.gem_version)
<ide> def jbuilder_gemfile_entry
<ide> return [] if options[:skip_jbuilder]
<ide> comment = "Build JSON APIs with ease [https://github.com/rails/jbuilder]"
<del> GemfileEntry.new "jbuilder", "~> 2.7", comment, {}, options[:api]
<add> GemfileEntry.new "jbuilder", "~> 2.11", comment, {}, options[:api]
<ide> end
<ide>
<ide> def javascript_gemfile_entry
<ide> return [] if options[:skip_javascript]
<ide>
<ide> if adjusted_javascript_option == "importmap"
<del> GemfileEntry.version("importmap-rails", ">= 0.3.4", "Use JavaScript with ESM import maps [https://github.com/rails/importmap-rails]")
<add> GemfileEntry.version("importmap-rails", ">= 0.9.2", "Use JavaScript with ESM import maps [https://github.com/rails/importmap-rails]")
<ide> else
<del> GemfileEntry.version "jsbundling-rails", ">= 0.2.0", "Bundle and transpile JavaScript [https://github.com/rails/jsbundling-rails]"
<add> GemfileEntry.version "jsbundling-rails", ">= 0.2.2", "Bundle and transpile JavaScript [https://github.com/rails/jsbundling-rails]"
<ide> end
<ide> end
<ide>
<ide> def hotwire_gemfile_entry
<ide> return [] if options[:skip_javascript] || options[:skip_hotwire]
<ide>
<ide> turbo_rails_entry =
<del> GemfileEntry.version("turbo-rails", ">= 0.7.11", "Hotwire's SPA-like page accelerator [https://turbo.hotwired.dev]")
<add> GemfileEntry.version("turbo-rails", ">= 0.9.0", "Hotwire's SPA-like page accelerator [https://turbo.hotwired.dev]")
<ide>
<ide> stimulus_rails_entry =
<del> GemfileEntry.version("stimulus-rails", ">= 0.4.0", "Hotwire's modest JavaScript framework [https://stimulus.hotwired.dev]")
<add> GemfileEntry.version("stimulus-rails", ">= 0.7.3", "Hotwire's modest JavaScript framework [https://stimulus.hotwired.dev]")
<ide>
<ide> [ turbo_rails_entry, stimulus_rails_entry ]
<ide> end
<ide> def css_gemfile_entry
<ide> return [] unless options[:css]
<ide>
<ide> if !using_node? && options[:css] == "tailwind"
<del> GemfileEntry.version("tailwindcss-rails", ">= 0.4.3", "Use Tailwind CSS [https://github.com/rails/tailwindcss-rails]")
<add> GemfileEntry.version("tailwindcss-rails", ">= 0.5.3", "Use Tailwind CSS [https://github.com/rails/tailwindcss-rails]")
<ide> else
<del> GemfileEntry.version("cssbundling-rails", ">= 0.1.0", "Bundle and process CSS [https://github.com/rails/cssbundling-rails]")
<add> GemfileEntry.version("cssbundling-rails", ">= 0.2.7", "Bundle and process CSS [https://github.com/rails/cssbundling-rails]")
<ide> end
<ide> end
<ide> | 1 |
Python | Python | add import math | 912f6881d2b69f180522172a5283702bd8c41d9c | <ide><path>templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py
<ide> def call(
<ide> )
<ide>
<ide> {% else %}
<add>import math
<ide> import random
<ide> from typing import Dict, Optional, Tuple, Union
<ide>
<ide><path>templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py
<ide> def forward(
<ide> attentions=outputs.attentions,
<ide> )
<ide> {% else %}
<add>import math
<ide> import random
<ide> from typing import Optional, Tuple
<ide> | 2 |
PHP | PHP | use getter method for access primary key | 850b7531d42b6fcd8941b12339fb3135128ba518 | <ide><path>src/Illuminate/Database/Eloquent/Model.php
<ide> public function resolveRouteBinding($value)
<ide> */
<ide> public function getForeignKey()
<ide> {
<del> return Str::snake(class_basename($this)).'_'.$this->primaryKey;
<add> return Str::snake(class_basename($this)).'_'.$this->getKeyName();
<ide> }
<ide>
<ide> /** | 1 |
Text | Text | make minor fixes to maintaining-openssl.md | f461a66822a807d7f6c5eb7300fd69638beee1f8 | <ide><path>doc/guides/maintaining-openssl.md
<ide> them.
<ide> % mv openssl-1.1.0h openssl
<ide> % git add --all openssl
<ide> % git commit openssl
<del>````
<add>```
<ide>
<ide> The commit message can be written as (with the openssl version set
<ide> to the relevant value):
<ide> separately after updating the openssl source as described above. The
<ide> current patch implementation can be found in the `deps/openssl/patches`
<ide> directory in the file `0001-deps-add-support-for-BoringSSL-QUIC-APIs.patch`.
<ide>
<del>```text
<del> $ git am deps/openssl/patches 0001-deps-add-support-for-BoringSSL-QUIC-APIs.patch
<add>```console
<add>% git am deps/openssl/patches 0001-deps-add-support-for-BoringSSL-QUIC-APIs.patch
<ide> ```
<ide>
<ide> The patch file itself is generated by squashing commits from the
<ide> The patch is currently supported only for openssl-1.1.1e.
<ide> Use `make` to regenerate all platform dependent files in
<ide> `deps/openssl/config/archs/`:
<ide> ```console
<del># On non-linux machines
<add># On non-Linux machines
<ide> % make gen-openssl
<ide>
<del># On Linux machine
<add># On Linux machines
<ide> % make -C deps/openssl/config
<ide> ```
<ide> | 1 |
Ruby | Ruby | fix message and undent for external deps | 1f1da9266ccfac3831767e0a1d1fe3c1b797ac68 | <ide><path>Library/Homebrew/formula_installer.rb
<ide> def self.expand_deps f
<ide> end
<ide> deps
<ide> end
<del>
<add>
<ide> def pyerr dep
<ide> brew_pip = ' brew install pip &&' unless Formula.factory('pip').installed?
<del> <<-EOS.dedent
<add> <<-EOS.undent
<ide> Unsatisfied dependency, #{dep}
<del> Homebrew does not provide formula for Python dependencies, pip does:
<add> Homebrew does not provide Python dependencies, pip does:
<ide>
<ide> #{brew_pip} pip install #{dep}
<ide> EOS
<ide> end
<del> def plerr dep; <<-EOS.dedent
<add> def plerr dep; <<-EOS.undent
<ide> Unsatisfied dependency, #{dep}
<del> Homebrew does not provide formula for Perl dependencies, cpan does:
<add> Homebrew does not provide Perl dependencies, cpan does:
<ide>
<ide> cpan -i #{dep}
<ide> EOS
<ide> end
<del> def rberr dep; <<-EOS.dedent
<add> def rberr dep; <<-EOS.undent
<ide> Unsatisfied dependency "#{dep}"
<del> Homebrew does not provide formulae for Ruby dependencies, rubygems does:
<add> Homebrew does not provide Ruby dependencies, rubygems does:
<ide>
<ide> gem install #{dep}
<ide> EOS
<ide> end
<del> def jrberr dep; <<-EOS
<del>Unsatisfied dependency "#{dep}"
<del>Homebrew does not provide formulae for JRuby dependencies, rubygems does:
<add> def jrberr dep; <<-EOS.undent
<add> Unsatisfied dependency "#{dep}"
<add> Homebrew does not provide JRuby dependencies, rubygems does:
<ide>
<del> jruby -S gem install #{dep}
<add> jruby -S gem install #{dep}
<ide> EOS
<ide> end
<ide>
<ide> def install f
<ide> end
<ide> install_private f
<ide> end
<del>
<add>
<ide> private
<ide>
<ide> def install_private f | 1 |
Go | Go | remove unused options | d4d5e0ae0c9d79ada765da484950811d41c286bc | <ide><path>libcontainerd/supervisor/remote_daemon_options.go
<ide> package supervisor // import "github.com/docker/docker/libcontainerd/supervisor"
<ide>
<del>// WithRemoteAddr sets the external containerd socket to connect to.
<del>func WithRemoteAddr(addr string) DaemonOpt {
<del> return func(r *remote) error {
<del> r.GRPC.Address = addr
<del> return nil
<del> }
<del>}
<del>
<del>// WithRemoteAddrUser sets the uid and gid to create the RPC address with
<del>func WithRemoteAddrUser(uid, gid int) DaemonOpt {
<del> return func(r *remote) error {
<del> r.GRPC.UID = uid
<del> r.GRPC.GID = gid
<del> return nil
<del> }
<del>}
<del>
<ide> // WithLogLevel defines which log level to starts containerd with.
<ide> // This only makes sense if WithStartDaemon() was set to true.
<ide> func WithLogLevel(lvl string) DaemonOpt {
<ide> func WithLogLevel(lvl string) DaemonOpt {
<ide> }
<ide> }
<ide>
<del>// WithDebugAddress defines at which location the debug GRPC connection
<del>// should be made
<del>func WithDebugAddress(addr string) DaemonOpt {
<del> return func(r *remote) error {
<del> r.Debug.Address = addr
<del> return nil
<del> }
<del>}
<del>
<del>// WithMetricsAddress defines at which location the debug GRPC connection
<del>// should be made
<del>func WithMetricsAddress(addr string) DaemonOpt {
<del> return func(r *remote) error {
<del> r.Metrics.Address = addr
<del> return nil
<del> }
<del>}
<del>
<ide> // WithPlugin allow configuring a containerd plugin
<ide> // configuration values passed needs to be quoted if quotes are needed in
<ide> // the toml format. | 1 |
Text | Text | update electron link in readme | ae51570201c0d915aae1d57810f1e37610e40d44 | <ide><path>README.md
<ide> [](https://david-dm.org/atom/atom)
<ide> [](https://atom-slack.herokuapp.com)
<ide>
<del>Atom is a hackable text editor for the 21st century, built on [Electron](https://github.com/atom/electron), and based on everything we love about our favorite editors. We designed it to be deeply customizable, but still approachable using the default configuration.
<add>Atom is a hackable text editor for the 21st century, built on [Electron](https://github.com/electron/electron), and based on everything we love about our favorite editors. We designed it to be deeply customizable, but still approachable using the default configuration.
<ide>
<ide> 
<ide> | 1 |
Text | Text | fix small typo | 58f3403587741c7ed2e25fc8bfa4faee825894c4 | <ide><path>readme.md
<ide> Note: we recommend putting `.next`, or your custom dist folder (Please have a lo
<ide>
<ide> ## Static HTML export
<ide>
<del>This is a way to run your Next.js app as a standalone static app without any Node.js server. The export app supports almost every feature of Next.js including dyanmic urls, prefetching, preloading and dynamic imports.
<add>This is a way to run your Next.js app as a standalone static app without any Node.js server. The export app supports almost every feature of Next.js including dynamic urls, prefetching, preloading and dynamic imports.
<ide>
<ide> ### Usage
<ide> | 1 |
Text | Text | add changelog entry | 7d9c5c16539696ce2f15bac47d64edd401eeacc8 | <ide><path>actionpack/CHANGELOG.md
<add>* Deprecate *_via_redirect integration test methods.
<add>
<add> Use `follow_redirect!` manually after the request call for the same behavior.
<add>
<add> *Aditya Kapoor*
<add>
<ide> * Add `ActionController::Renderer` to render arbitrary templates
<ide> outside controller actions.
<ide> | 1 |
Java | Java | handle illegal errors thrown from plugin | 26f8e8350e32256ed64f8323f697ba388690dc82 | <ide><path>rxjava-core/src/main/java/rx/observers/SafeSubscriber.java
<ide> public void onNext(T args) {
<ide> protected void _onError(Throwable e) {
<ide> try {
<ide> RxJavaPlugins.getInstance().getErrorHandler().handleError(e);
<add> } catch (Throwable pluginException) {
<add> handlePluginException(pluginException);
<add> }
<add> try {
<ide> actual.onError(e);
<ide> } catch (Throwable e2) {
<ide> if (e2 instanceof OnErrorNotImplementedException) {
<ide> protected void _onError(Throwable e) {
<ide> try {
<ide> unsubscribe();
<ide> } catch (Throwable unsubscribeException) {
<del> RxJavaPlugins.getInstance().getErrorHandler().handleError(unsubscribeException);
<add> try {
<add> RxJavaPlugins.getInstance().getErrorHandler().handleError(unsubscribeException);
<add> } catch (Throwable pluginException) {
<add> handlePluginException(pluginException);
<add> }
<ide> throw new RuntimeException("Observer.onError not implemented and error while unsubscribing.", new CompositeException(Arrays.asList(e, unsubscribeException)));
<ide> }
<ide> throw (OnErrorNotImplementedException) e2;
<ide> protected void _onError(Throwable e) {
<ide> *
<ide> * https://github.com/Netflix/RxJava/issues/198
<ide> */
<del> RxJavaPlugins.getInstance().getErrorHandler().handleError(e2);
<add> try {
<add> RxJavaPlugins.getInstance().getErrorHandler().handleError(e2);
<add> } catch (Throwable pluginException) {
<add> handlePluginException(pluginException);
<add> }
<ide> try {
<ide> unsubscribe();
<ide> } catch (Throwable unsubscribeException) {
<del> RxJavaPlugins.getInstance().getErrorHandler().handleError(unsubscribeException);
<add> try {
<add> RxJavaPlugins.getInstance().getErrorHandler().handleError(unsubscribeException);
<add> } catch (Throwable pluginException) {
<add> handlePluginException(pluginException);
<add> }
<ide> throw new RuntimeException("Error occurred when trying to propagate error to Observer.onError and during unsubscription.", new CompositeException(Arrays.asList(e, e2, unsubscribeException)));
<ide> }
<ide>
<ide> protected void _onError(Throwable e) {
<ide> try {
<ide> unsubscribe();
<ide> } catch (RuntimeException unsubscribeException) {
<del> RxJavaPlugins.getInstance().getErrorHandler().handleError(unsubscribeException);
<add> try {
<add> RxJavaPlugins.getInstance().getErrorHandler().handleError(unsubscribeException);
<add> } catch (Throwable pluginException) {
<add> handlePluginException(pluginException);
<add> }
<ide> throw unsubscribeException;
<ide> }
<ide> }
<ide>
<add> private void handlePluginException(Throwable pluginException) {
<add> /*
<add> * We don't want errors from the plugin to affect normal flow.
<add> * Since the plugin should never throw this is a safety net
<add> * and will complain loudly to System.err so it gets fixed.
<add> */
<add> System.err.println("RxJavaErrorHandler threw an Exception. It shouldn't. => " + pluginException.getMessage());
<add> pluginException.printStackTrace();
<add> }
<add>
<ide> public Subscriber<? super T> getActual() {
<ide> return actual;
<ide> } | 1 |
Javascript | Javascript | move auth api into own boot script | b35501c78eefc24ad4b4b2ab009ef8c51db1068b | <ide><path>server/boot/authentication.js
<add>import dedent from 'dedent';
<add>import debugFactory from 'debug';
<add>
<add>const isSignUpDisabled = !!process.env.DISABLE_SIGNUP;
<add>const debug = debugFactory('fcc:boot:auth');
<add>
<ide> module.exports = function enableAuthentication(app) {
<del> // enable authentication
<add> // enable loopback access control authentication. see:
<add> // loopback.io/doc/en/lb2/Authentication-authorization-and-permissions.html
<ide> app.enableAuth();
<add> const router = app.loopback.Router();
<add> const api = app.loopback.Router();
<add> const { AccessToken, User } = app.models;
<add>
<add> router.get('/login', function(req, res) {
<add> res.redirect(301, '/signin');
<add> });
<add>
<add> router.get('/logout', function(req, res) {
<add> res.redirect(301, '/signout');
<add> });
<add>
<add> function getEmailSignin(req, res) {
<add> if (req.user) {
<add> return res.redirect('/');
<add> }
<add> if (isSignUpDisabled) {
<add> return res.render('account/beta', {
<add> title: 'New sign ups are disabled'
<add> });
<add> }
<add> return res.render('account/email-signin', {
<add> title: 'Sign in to freeCodeCamp using your Email Address'
<add> });
<add> }
<add>
<add> router.get('/signup', getEmailSignin);
<add> router.get('/signin', getEmailSignin);
<add> router.get('/email-signin', getEmailSignin);
<add>
<add> function signout(req, res) {
<add> req.logout();
<add> res.redirect('/');
<add> }
<add>
<add> router.get('/signout', signout);
<add>
<add> function getDepSignin(req, res) {
<add> if (req.user) {
<add> return res.redirect('/');
<add> }
<add> return res.render('account/deprecated-signin', {
<add> title: 'Sign in to freeCodeCamp using a Deprecated Login'
<add> });
<add> }
<add>
<add> router.get('/deprecated-signin', getDepSignin);
<add>
<add> function invalidateAuthToken(req, res, next) {
<add> if (req.user) {
<add> return res.redirect('/');
<add> }
<add>
<add> if (!req.query || !req.query.email || !req.query.token) {
<add> req.flash('info', { msg: defaultErrorMsg });
<add> return res.redirect('/email-signin');
<add> }
<add>
<add> const authTokenId = req.query.token;
<add> const authEmailId = new Buffer(req.query.email, 'base64').toString();
<add>
<add> return AccessToken.findOne$({ where: {id: authTokenId} })
<add> .map(authToken => {
<add> if (!authToken) {
<add> req.flash('info', { msg: defaultErrorMsg });
<add> return res.redirect('/email-signin');
<add> }
<add>
<add> const userId = authToken.userId;
<add> return User.findById(userId, (err, user) => {
<add> if (err || !user || user.email !== authEmailId) {
<add> debug(err);
<add> req.flash('info', { msg: defaultErrorMsg });
<add> return res.redirect('/email-signin');
<add> }
<add> return authToken.validate((err, isValid) => {
<add> if (err) { throw err; }
<add> if (!isValid) {
<add> req.flash('info', { msg: [ 'Looks like the link you clicked has',
<add> 'expired, please request a fresh link, to sign in.'].join('')
<add> });
<add> return res.redirect('/email-signin');
<add> }
<add> return authToken.destroy((err) => {
<add> if (err) { debug(err); }
<add> next();
<add> });
<add> });
<add> });
<add> })
<add> .subscribe(
<add> () => {},
<add> next
<add> );
<add> }
<add>
<add> const defaultErrorMsg = dedent`
<add> Oops, something is not right,
<add> please request a fresh link to sign in / sign up.
<add> `;
<add>
<add> function getPasswordlessAuth(req, res, next) {
<add> if (req.user) {
<add> req.flash('info', {
<add> msg: 'Hey, looks like you’re already signed in.'
<add> });
<add> return res.redirect('/');
<add> }
<add>
<add> if (!req.query || !req.query.email || !req.query.token) {
<add> req.flash('info', { msg: defaultErrorMsg });
<add> return res.redirect('/email-signin');
<add> }
<add>
<add> const email = new Buffer(req.query.email, 'base64').toString();
<add>
<add> return User.findOne$({ where: { email }})
<add> .map(user => {
<add>
<add> if (!user) {
<add> debug(`did not find a valid user with email: ${email}`);
<add> req.flash('info', { msg: defaultErrorMsg });
<add> return res.redirect('/email-signin');
<add> }
<add>
<add> const emailVerified = true;
<add> const emailAuthLinkTTL = null;
<add> const emailVerifyTTL = null;
<add> user.update$({
<add> emailVerified, emailAuthLinkTTL, emailVerifyTTL
<add> })
<add> .do((user) => {
<add> user.emailVerified = emailVerified;
<add> user.emailAuthLinkTTL = emailAuthLinkTTL;
<add> user.emailVerifyTTL = emailVerifyTTL;
<add> });
<add>
<add> return user.createAccessToken(
<add> { ttl: User.settings.ttl }, (err, accessToken) => {
<add> if (err) { throw err; }
<add>
<add> var config = {
<add> signed: !!req.signedCookies,
<add> maxAge: accessToken.ttl
<add> };
<add>
<add> if (accessToken && accessToken.id) {
<add> debug('setting cookies');
<add> res.cookie('access_token', accessToken.id, config);
<add> res.cookie('userId', accessToken.userId, config);
<add> }
<add>
<add> return req.logIn({
<add> id: accessToken.userId.toString() }, err => {
<add> if (err) { return next(err); }
<add>
<add> debug('user logged in');
<add>
<add> if (req.session && req.session.returnTo) {
<add> var redirectTo = req.session.returnTo;
<add> if (redirectTo === '/map-aside') {
<add> redirectTo = '/map';
<add> }
<add> return res.redirect(redirectTo);
<add> }
<add>
<add> req.flash('success', { msg:
<add> 'Success! You have signed in to your account. Happy Coding!'
<add> });
<add> return res.redirect('/');
<add> });
<add> });
<add> })
<add> .subscribe(
<add> () => {},
<add> next
<add> );
<add> }
<add>
<add> router.get('/passwordless-auth', invalidateAuthToken, getPasswordlessAuth);
<add>
<add> function postPasswordlessAuth(req, res) {
<add> if (req.user || !(req.body && req.body.email)) {
<add> return res.redirect('/');
<add> }
<add>
<add> return User.requestAuthEmail(req.body.email)
<add> .then(msg => {
<add> return res.status(200).send({ message: msg });
<add> })
<add> .catch(err => {
<add> debug(err);
<add> return res.status(200).send({ message: defaultErrorMsg });
<add> });
<add> }
<add>
<add> api.post('/passwordless-auth', postPasswordlessAuth);
<add>
<add> app.use('/:lang', router);
<add> app.use(api);
<ide> };
<ide><path>server/boot/user.js
<ide> import {
<ide> import supportedLanguages from '../../common/utils/supported-languages';
<ide> import { getChallengeInfo, cachedMap } from '../utils/map';
<ide>
<del>const isSignUpDisabled = !!process.env.DISABLE_SIGNUP;
<ide> const debug = debugFactory('fcc:boot:user');
<ide> const sendNonUserToMap = ifNoUserRedirectTo('/map');
<ide> const certIds = {
<ide> function buildDisplayChallenges(
<ide> module.exports = function(app) {
<ide> const router = app.loopback.Router();
<ide> const api = app.loopback.Router();
<del> const { AccessToken, Email, User } = app.models;
<add> const { Email, User } = app.models;
<ide> const map$ = cachedMap(app.models);
<ide>
<ide> function findUserByUsername$(username, fields) {
<ide> module.exports = function(app) {
<ide> );
<ide> }
<ide>
<del> router.get('/login', function(req, res) {
<del> res.redirect(301, '/signin');
<del> });
<del> router.get('/logout', function(req, res) {
<del> res.redirect(301, '/signout');
<del> });
<del> router.get('/signup', getEmailSignin);
<del> router.get('/signin', getEmailSignin);
<del> router.get('/signout', signout);
<del> router.get('/email-signin', getEmailSignin);
<del> router.get('/deprecated-signin', getDepSignin);
<del> router.get('/passwordless-auth', invalidateAuthToken, getPasswordlessAuth);
<del> api.post('/passwordless-auth', postPasswordlessAuth);
<ide> router.get(
<ide> '/delete-my-account',
<ide> sendNonUserToMap,
<ide> module.exports = function(app) {
<ide> app.use('/:lang', router);
<ide> app.use(api);
<ide>
<del> const defaultErrorMsg = [ 'Oops, something is not right, please request a ',
<del> 'fresh link to sign in / sign up.' ].join('');
<del>
<del> function postPasswordlessAuth(req, res) {
<del> if (req.user || !(req.body && req.body.email)) {
<del> return res.redirect('/');
<del> }
<del>
<del> return User.requestAuthEmail(req.body.email)
<del> .then(msg => {
<del> return res.status(200).send({ message: msg });
<del> })
<del> .catch(err => {
<del> debug(err);
<del> return res.status(200).send({ message: defaultErrorMsg });
<del> });
<del> }
<del>
<del> function invalidateAuthToken(req, res, next) {
<del> if (req.user) {
<del> return res.redirect('/');
<del> }
<del>
<del> if (!req.query || !req.query.email || !req.query.token) {
<del> req.flash('info', { msg: defaultErrorMsg });
<del> return res.redirect('/email-signin');
<del> }
<del>
<del> const authTokenId = req.query.token;
<del> const authEmailId = new Buffer(req.query.email, 'base64').toString();
<del>
<del> return AccessToken.findOne$({ where: {id: authTokenId} })
<del> .map(authToken => {
<del> if (!authToken) {
<del> req.flash('info', { msg: defaultErrorMsg });
<del> return res.redirect('/email-signin');
<del> }
<del>
<del> const userId = authToken.userId;
<del> return User.findById(userId, (err, user) => {
<del> if (err || !user || user.email !== authEmailId) {
<del> debug(err);
<del> req.flash('info', { msg: defaultErrorMsg });
<del> return res.redirect('/email-signin');
<del> }
<del> return authToken.validate((err, isValid) => {
<del> if (err) { throw err; }
<del> if (!isValid) {
<del> req.flash('info', { msg: [ 'Looks like the link you clicked has',
<del> 'expired, please request a fresh link, to sign in.'].join('')
<del> });
<del> return res.redirect('/email-signin');
<del> }
<del> return authToken.destroy((err) => {
<del> if (err) { debug(err); }
<del> next();
<del> });
<del> });
<del> });
<del> })
<del> .subscribe(
<del> () => {},
<del> next
<del> );
<del> }
<del>
<del> function getPasswordlessAuth(req, res, next) {
<del> if (req.user) {
<del> req.flash('info', {
<del> msg: 'Hey, looks like you’re already signed in.'
<del> });
<del> return res.redirect('/');
<del> }
<del>
<del> if (!req.query || !req.query.email || !req.query.token) {
<del> req.flash('info', { msg: defaultErrorMsg });
<del> return res.redirect('/email-signin');
<del> }
<del>
<del> const email = new Buffer(req.query.email, 'base64').toString();
<del>
<del> return User.findOne$({ where: { email }})
<del> .map(user => {
<del>
<del> if (!user) {
<del> debug(`did not find a valid user with email: ${email}`);
<del> req.flash('info', { msg: defaultErrorMsg });
<del> return res.redirect('/email-signin');
<del> }
<del>
<del> const emailVerified = true;
<del> const emailAuthLinkTTL = null;
<del> const emailVerifyTTL = null;
<del> user.update$({
<del> emailVerified, emailAuthLinkTTL, emailVerifyTTL
<del> })
<del> .do((user) => {
<del> user.emailVerified = emailVerified;
<del> user.emailAuthLinkTTL = emailAuthLinkTTL;
<del> user.emailVerifyTTL = emailVerifyTTL;
<del> });
<del>
<del> return user.createAccessToken(
<del> { ttl: User.settings.ttl }, (err, accessToken) => {
<del> if (err) { throw err; }
<del>
<del> var config = {
<del> signed: !!req.signedCookies,
<del> maxAge: accessToken.ttl
<del> };
<del>
<del> if (accessToken && accessToken.id) {
<del> debug('setting cookies');
<del> res.cookie('access_token', accessToken.id, config);
<del> res.cookie('userId', accessToken.userId, config);
<del> }
<del>
<del> return req.logIn({
<del> id: accessToken.userId.toString() }, err => {
<del> if (err) { return next(err); }
<del>
<del> debug('user logged in');
<del>
<del> if (req.session && req.session.returnTo) {
<del> var redirectTo = req.session.returnTo;
<del> if (redirectTo === '/map-aside') {
<del> redirectTo = '/map';
<del> }
<del> return res.redirect(redirectTo);
<del> }
<del>
<del> req.flash('success', { msg:
<del> 'Success! You have signed in to your account. Happy Coding!'
<del> });
<del> return res.redirect('/');
<del> });
<del> });
<del> })
<del> .subscribe(
<del> () => {},
<del> next
<del> );
<del> }
<del>
<del> function signout(req, res) {
<del> req.logout();
<del> res.redirect('/');
<del> }
<del>
<del>
<del> function getDepSignin(req, res) {
<del> if (req.user) {
<del> return res.redirect('/');
<del> }
<del> return res.render('account/deprecated-signin', {
<del> title: 'Sign in to freeCodeCamp using a Deprecated Login'
<del> });
<del> }
<del>
<del> function getEmailSignin(req, res) {
<del> if (req.user) {
<del> return res.redirect('/');
<del> }
<del> if (isSignUpDisabled) {
<del> return res.render('account/beta', {
<del> title: 'New sign ups are disabled'
<del> });
<del> }
<del> return res.render('account/email-signin', {
<del> title: 'Sign in to freeCodeCamp using your Email Address'
<del> });
<del> }
<del>
<ide> function getAccount(req, res) {
<ide> const { username } = req.user;
<ide> return res.redirect('/' + username); | 2 |
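The passwordless-auth routes in the patch above pass the user's email through base64 (`new Buffer(req.query.email, 'base64').toString()`). The sketch below illustrates that round-trip in isolation; it is not freeCodeCamp's code, and it uses `Buffer.from`, the modern replacement for the deprecated `new Buffer(...)` constructor that appears in the historical patch.

```javascript
// Encode an email for inclusion in a signin link, and decode it back
// server-side, mirroring what the routes above rely on.
function encodeEmail(email) {
  return Buffer.from(email, 'utf8').toString('base64');
}

function decodeEmail(encoded) {
  return Buffer.from(encoded, 'base64').toString('utf8');
}

const encoded = encodeEmail('camper@example.com');
console.log(encoded);              // base64 form placed in the emailed link
console.log(decodeEmail(encoded)); // original address recovered on the server
```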
Java | Java | remove extraneous asterisk | 33d7816de38aa7b3040667b1717722206609fd1f | <ide><path>spring-aop/src/main/java/org/springframework/aop/support/NameMatchMethodPointcut.java
<ide>
<ide> /**
<ide> * Pointcut bean for simple method name matches, as alternative to regexp patterns.
<del> * Does not handle overloaded methods: all methods *with a given name will be eligible.
<add> * Does not handle overloaded methods: all methods with a given name will be eligible.
<ide> *
<ide> * @author Juergen Hoeller
<ide> * @author Rod Johnson | 1 |
Java | Java | fix failing test | 598fafd95743840bce6c7f929eaaf596fefe088b | <ide><path>spring-webmvc/src/test/java/org/springframework/web/servlet/config/MvcNamespaceTests.java
<ide> public void testViewControllersDefaultConfig() {
<ide>
<ide> @Test
<ide> public void testContentNegotiationManager() throws Exception {
<del> loadBeanDefinitions("mvc-config-content-negotiation-manager.xml", 14);
<add> loadBeanDefinitions("mvc-config-content-negotiation-manager.xml", 15);
<ide>
<ide> RequestMappingHandlerMapping mapping = appContext.getBean(RequestMappingHandlerMapping.class);
<ide> ContentNegotiationManager manager = mapping.getContentNegotiationManager(); | 1 |
Java | Java | implement 'tostring' method for some emitters | a957c789495304c0de79689a07575977f59afc7d | <ide><path>src/main/java/io/reactivex/internal/operators/completable/CompletableCreate.java
<ide> public void dispose() {
<ide> public boolean isDisposed() {
<ide> return DisposableHelper.isDisposed(get());
<ide> }
<add>
<add> @Override
<add> public String toString() {
<add> return String.format("%s{%s}", getClass().getSimpleName(), super.toString());
<add> }
<ide> }
<ide> }
<ide><path>src/main/java/io/reactivex/internal/operators/flowable/FlowableCreate.java
<ide> public boolean isCancelled() {
<ide> public FlowableEmitter<T> serialize() {
<ide> return this;
<ide> }
<add>
<add> @Override
<add> public String toString() {
<add> return emitter.toString();
<add> }
<ide> }
<ide>
<ide> abstract static class BaseEmitter<T>
<ide> public final long requested() {
<ide> public final FlowableEmitter<T> serialize() {
<ide> return new SerializedEmitter<T>(this);
<ide> }
<add>
<add> @Override
<add> public String toString() {
<add> return String.format("%s{%s}", getClass().getSimpleName(), super.toString());
<add> }
<ide> }
<ide>
<ide> static final class MissingEmitter<T> extends BaseEmitter<T> {
<ide><path>src/main/java/io/reactivex/internal/operators/maybe/MaybeCreate.java
<ide> public void dispose() {
<ide> public boolean isDisposed() {
<ide> return DisposableHelper.isDisposed(get());
<ide> }
<add>
<add> @Override
<add> public String toString() {
<add> return String.format("%s{%s}", getClass().getSimpleName(), super.toString());
<add> }
<ide> }
<ide> }
<ide><path>src/main/java/io/reactivex/internal/operators/observable/ObservableCreate.java
<ide> public void dispose() {
<ide> public boolean isDisposed() {
<ide> return DisposableHelper.isDisposed(get());
<ide> }
<add>
<add> @Override
<add> public String toString() {
<add> return String.format("%s{%s}", getClass().getSimpleName(), super.toString());
<add> }
<ide> }
<ide>
<ide> /**
<ide> public boolean isDisposed() {
<ide> public ObservableEmitter<T> serialize() {
<ide> return this;
<ide> }
<add>
<add> @Override
<add> public String toString() {
<add> return emitter.toString();
<add> }
<ide> }
<ide>
<ide> }
<ide><path>src/main/java/io/reactivex/internal/operators/single/SingleCreate.java
<ide> public void dispose() {
<ide> public boolean isDisposed() {
<ide> return DisposableHelper.isDisposed(get());
<ide> }
<add>
<add> @Override
<add> public String toString() {
<add> return String.format("%s{%s}", getClass().getSimpleName(), super.toString());
<add> }
<ide> }
<ide> }
<ide><path>src/test/java/io/reactivex/internal/operators/completable/CompletableCreateTest.java
<ide> public void subscribe(CompletableEmitter e) throws Exception {
<ide> RxJavaPlugins.reset();
<ide> }
<ide> }
<add>
<add> @Test
<add> public void emitterHasToString() {
<add> Completable.create(new CompletableOnSubscribe() {
<add> @Override
<add> public void subscribe(CompletableEmitter emitter) throws Exception {
<add> assertTrue(emitter.toString().contains(CompletableCreate.Emitter.class.getSimpleName()));
<add> }
<add> }).test().assertEmpty();
<add> }
<ide> }
<ide><path>src/test/java/io/reactivex/internal/operators/flowable/FlowableCreateTest.java
<ide> import static org.junit.Assert.*;
<ide>
<ide> import java.io.IOException;
<add>import java.util.HashMap;
<ide> import java.util.List;
<add>import java.util.Map;
<ide>
<ide> import org.junit.Test;
<ide> import org.reactivestreams.*;
<ide> public void subscribe(FlowableEmitter<Object> e) throws Exception {
<ide> }
<ide> }
<ide> }
<add>
<add> @Test
<add> public void emittersHasToString() {
<add> Map<BackpressureStrategy, Class<? extends FlowableEmitter>> emitterMap =
<add> new HashMap<BackpressureStrategy, Class<? extends FlowableEmitter>>();
<add>
<add> emitterMap.put(BackpressureStrategy.MISSING, FlowableCreate.MissingEmitter.class);
<add> emitterMap.put(BackpressureStrategy.ERROR, FlowableCreate.ErrorAsyncEmitter.class);
<add> emitterMap.put(BackpressureStrategy.DROP, FlowableCreate.DropAsyncEmitter.class);
<add> emitterMap.put(BackpressureStrategy.LATEST, FlowableCreate.LatestAsyncEmitter.class);
<add> emitterMap.put(BackpressureStrategy.BUFFER, FlowableCreate.BufferAsyncEmitter.class);
<add>
<add> for (final Map.Entry<BackpressureStrategy, Class<? extends FlowableEmitter>> entry : emitterMap.entrySet()) {
<add> Flowable.create(new FlowableOnSubscribe<Object>() {
<add> @Override
<add> public void subscribe(FlowableEmitter<Object> emitter) throws Exception {
<add> assertTrue(emitter.toString().contains(entry.getValue().getSimpleName()));
<add> assertTrue(emitter.serialize().toString().contains(entry.getValue().getSimpleName()));
<add> }
<add> }, entry.getKey()).test().assertEmpty();
<add> }
<add> }
<ide> }
<ide><path>src/test/java/io/reactivex/internal/operators/maybe/MaybeCreateTest.java
<ide> public void subscribe(MaybeEmitter<Object> e) throws Exception {
<ide> RxJavaPlugins.reset();
<ide> }
<ide> }
<add>
<add> @Test
<add> public void emitterHasToString() {
<add> Maybe.create(new MaybeOnSubscribe<Object>() {
<add> @Override
<add> public void subscribe(MaybeEmitter<Object> emitter) throws Exception {
<add> assertTrue(emitter.toString().contains(MaybeCreate.Emitter.class.getSimpleName()));
<add> }
<add> }).test().assertEmpty();
<add> }
<ide> }
<ide><path>src/test/java/io/reactivex/internal/operators/observable/ObservableCreateTest.java
<ide> public void subscribe(ObservableEmitter<Object> e) throws Exception {
<ide> RxJavaPlugins.reset();
<ide> }
<ide> }
<add>
<add> @Test
<add> public void emitterHasToString() {
<add> Observable.create(new ObservableOnSubscribe<Object>() {
<add> @Override
<add> public void subscribe(ObservableEmitter<Object> emitter) throws Exception {
<add> assertTrue(emitter.toString().contains(ObservableCreate.CreateEmitter.class.getSimpleName()));
<add> assertTrue(emitter.serialize().toString().contains(ObservableCreate.CreateEmitter.class.getSimpleName()));
<add> }
<add> }).test().assertEmpty();
<add> }
<ide> }
<ide><path>src/test/java/io/reactivex/internal/operators/single/SingleCreateTest.java
<ide> public void subscribe(SingleEmitter<Object> e) throws Exception {
<ide> RxJavaPlugins.reset();
<ide> }
<ide> }
<add>
<add> @Test
<add> public void emitterHasToString() {
<add> Single.create(new SingleOnSubscribe<Object>() {
<add> @Override
<add> public void subscribe(SingleEmitter<Object> emitter) throws Exception {
<add> assertTrue(emitter.toString().contains(SingleCreate.Emitter.class.getSimpleName()));
<add> }
<add> }).test().assertEmpty();
<add> }
<ide> } | 10 |
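The RxJava patch above adds two `toString` behaviors: concrete emitters format their own class name (`String.format("%s{%s}", getClass().getSimpleName(), super.toString())`), while serialized wrappers delegate to the emitter they wrap. A minimal JavaScript illustration of the same two patterns (not RxJava itself, and the class names are just placeholders):

```javascript
// Concrete emitter: reports its own class name in toString()
class BufferAsyncEmitter {
  toString() {
    // Analogue of String.format("%s{%s}", getClass().getSimpleName(), super.toString())
    return `${this.constructor.name}{${Object.prototype.toString.call(this)}}`;
  }
}

// Serialized wrapper: delegates toString() to the wrapped emitter
class SerializedEmitter {
  constructor(emitter) {
    this.emitter = emitter;
  }
  toString() {
    return this.emitter.toString();
  }
}

const emitter = new BufferAsyncEmitter();
const serialized = new SerializedEmitter(emitter);
console.log(serialized.toString()); // contains "BufferAsyncEmitter"
```

This is why the tests in the patch assert that both `emitter.toString()` and `emitter.serialize().toString()` contain the concrete emitter's simple class name.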
Ruby | Ruby | require bundle in the app generator | 7f800b4d69c0750bb47989027580299751a22616 | <ide><path>railties/lib/rails/generators/app_base.rb
<ide> def bundle_command(command)
<ide> # Thanks to James Tucker for the Gem tricks involved in this call.
<ide> _bundle_command = Gem.bin_path('bundler', 'bundle')
<ide>
<add> require 'bundler'
<ide> Bundler.with_clean_env do
<ide> print `"#{Gem.ruby}" "#{_bundle_command}" #{command}`
<ide> end | 1 |
Go | Go | remove uneeded sleep in test bash loop | ee594dcb7d42f95048c9047d86c61447243db3cd | <ide><path>integration-cli/docker_api_logs_test.go
<ide> func (s *DockerSuite) TestLogsAPIUntilFutureFollow(c *check.C) {
<ide>
<ide> func (s *DockerSuite) TestLogsAPIUntil(c *check.C) {
<ide> name := "logsuntil"
<del> dockerCmd(c, "run", "--name", name, "busybox", "/bin/sh", "-c", "for i in $(seq 1 3); do echo log$i; sleep 0.5; done")
<add> dockerCmd(c, "run", "--name", name, "busybox", "/bin/sh", "-c", "for i in $(seq 1 3); do echo log$i; done")
<ide>
<ide> client, err := request.NewClient()
<ide> if err != nil { | 1 |
Javascript | Javascript | show error messages instead of timeout | 0009100f712d0486befc718d606260c38053120d | <ide><path>client/src/templates/Challenges/redux/execute-challenge-saga.js
<ide> function* previewChallengeSaga({ flushLogs = true } = {}) {
<ide> }
<ide> }
<ide> } catch (err) {
<del> if (err === 'timeout') {
<add> if (err[0] === 'timeout') {
<ide> // TODO: translate the error
<ide> // eslint-disable-next-line no-ex-assign
<del> err = `The code you have written is taking longer than the ${previewTimeout}ms our challenges allow. You may have created an infinite loop or need to write a more efficient algorithm`;
<add> err[0] = `The code you have written is taking longer than the ${previewTimeout}ms our challenges allow. You may have created an infinite loop or need to write a more efficient algorithm`;
<ide> }
<ide> console.log(err);
<ide> yield put(updateConsole(escape(err))); | 1 |
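The one-line change above works because the caught error arrives as an array, so comparing the whole value to the string `'timeout'` can never match; the fix indexes element `0` instead. A standalone sketch of that distinction (the replacement message is shortened here for illustration):

```javascript
// The error value is an array, not a string
const err = ['timeout'];

console.log(err === 'timeout');    // false: an array is never strictly equal to a string
console.log(err[0] === 'timeout'); // true: compare the first element instead

// Rewriting the element swaps in a readable message while keeping the array shape
if (err[0] === 'timeout') {
  err[0] = 'The code you have written is taking longer than our challenges allow.';
}
console.log(err[0]);
```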
Ruby | Ruby | add tests for --master app_generator | 3b882deb0c125f6ee4781b261ab04e17afc7b07e | <ide><path>railties/test/generators/app_generator_test.rb
<ide> def test_web_console_with_edge_option
<ide> end
<ide> end
<ide>
<add> def test_web_console_with_master_option
<add> run_generator [destination_root, "--master"]
<add>
<add> assert_file "Gemfile" do |content|
<add> assert_match(/gem 'web-console',\s+github: 'rails\/web-console'/, content)
<add> assert_no_match(/\Agem 'web-console', '>= 3\.3\.0'\z/, content)
<add> end
<add> end
<add>
<ide> def test_generation_runs_bundle_install
<ide> generator([destination_root], skip_webpack_install: true)
<ide>
<ide> def test_edge_option
<ide> assert_file "Gemfile", %r{^gem\s+["']rails["'],\s+github:\s+["']#{Regexp.escape("rails/rails")}["']$}
<ide> end
<ide>
<add>
<add> def test_master_option
<add> generator([destination_root], master: true, skip_webpack_install: true)
<add>
<add> assert_bundler_command_called("install")
<add> assert_file "Gemfile", %r{^gem\s+["']rails["'],\s+github:\s+["']#{Regexp.escape("rails/rails")}["'],\s+branch:\s+["']master["']$}
<add> end
<add>
<ide> def test_spring
<ide> run_generator
<ide> assert_gem "spring" | 1 |
Javascript | Javascript | update foreach example to include side effects | f2412282271df9b94cff63be9853ebdec047c8f9 | <ide><path>packages/@ember/-internals/runtime/lib/mixins/array.js
<ide> const ArrayMixin = Mixin.create(Enumerable, {
<ide> Example Usage:
<ide>
<ide> ```javascript
<del> let foods = ['apple', 'banana', 'carrot'];
<add> let foods = [
<add> { name: 'apple', eaten: false },
<add> { name: 'banana', eaten: false },
<add> { name: 'carrot', eaten: false }
<add> ];
<add>
<add> foods.forEach((food) => food.eaten = true);
<ide>
<del> foods.forEach(food => console.log(`You should eat a ${food}`));
<del> foods.forEach((item, index, array) => console.log(`${index + 1}/${array.length}) ${item}`));
<add> let output = '';
<add> foods.forEach((item, index, array) =>
<add> output += `${index + 1}/${array.length}) ${item}\n`
<add> );
<ide> ```
<ide>
<ide> @method forEach | 1 |
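The doc change above rewrites the `forEach` example around side effects (mutating each item, accumulating output from the index and array arguments). A runnable version using plain JS arrays — Ember's `ArrayMixin#forEach` follows the same callback contract as `Array.prototype.forEach`:

```javascript
const foods = [
  { name: 'apple', eaten: false },
  { name: 'banana', eaten: false },
  { name: 'carrot', eaten: false }
];

// forEach is used for side effects: mutate each item in place
foods.forEach((food) => { food.eaten = true; });

// The callback also receives the index and the whole array
let output = '';
foods.forEach((item, index, array) => {
  output += `${index + 1}/${array.length}) ${item.name}\n`;
});
console.log(output);
```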
PHP | PHP | add log driver to broadcasting connections | 1c4dcea2a23719b7315b6a4e732ff7d0890b5bfe | <ide><path>config/broadcasting.php
<ide> 'driver' => 'redis',
<ide> 'connection' => 'default',
<ide> ],
<add>
<add> 'log' => [
<add> 'driver' => 'log',
<add> ],
<ide>
<ide> ],
<ide> | 1 |
Ruby | Ruby | revert the merge because tests did not pass | 886818d2bab40585c0cea763002ffc16917dd0b3 | <ide><path>activerecord/lib/active_record/base.rb
<ide> def symbolized_base_class
<ide> @symbolized_base_class ||= base_class.to_s.to_sym
<ide> end
<ide>
<del> def symbolized_sti_name
<del> @symbolized_sti_name ||= sti_name ? sti_name.to_sym : symbolized_base_class
<del> end
<del>
<ide> # Returns the base AR subclass that this class descends from. If A
<ide> # extends AR::Base, A.base_class will return A. If B descends from A
<ide> # through some arbitrarily deep hierarchy, B.base_class will return A.
<ide><path>activerecord/lib/active_record/identity_map.rb
<ide> def without
<ide> end
<ide>
<ide> def get(klass, primary_key)
<del> record = repository[klass.symbolized_sti_name][primary_key]
<add> record = repository[klass.symbolized_base_class][primary_key]
<ide>
<ide> if record.is_a?(klass)
<ide> ActiveSupport::Notifications.instrument("identity.active_record",
<ide> def get(klass, primary_key)
<ide> end
<ide>
<ide> def add(record)
<del> repository[record.class.symbolized_sti_name][record.id] = record
<add> repository[record.class.symbolized_base_class][record.id] = record
<ide> end
<ide>
<ide> def remove(record)
<del> repository[record.class.symbolized_sti_name].delete(record.id)
<add> repository[record.class.symbolized_base_class].delete(record.id)
<ide> end
<ide>
<del> def remove_by_id(symbolized_sti_name, id)
<del> repository[symbolized_sti_name].delete(id)
<add> def remove_by_id(symbolized_base_class, id)
<add> repository[symbolized_base_class].delete(id)
<ide> end
<ide>
<ide> def clear
<ide><path>activerecord/test/cases/identity_map_test.rb
<ide> def test_creation
<ide> assert_same(t1, t2)
<ide> end
<ide>
<del> ##############################################################################
<del> # Tests checking if IM is functioning properly on classes with multiple #
<del> # types of inheritance #
<del> ##############################################################################
<del>
<del> def test_inherited_without_type_attribute_without_identity_map
<del> ActiveRecord::IdentityMap.without do
<del> p1 = DestructivePirate.create!(:catchphrase => "I'm not a regular Pirate")
<del> p2 = Pirate.find(p1.id)
<del> assert_not_same(p1, p2)
<del> end
<del> end
<del>
<del> def test_inherited_with_type_attribute_without_identity_map
<del> ActiveRecord::IdentityMap.without do
<del> c = comments(:sub_special_comment)
<del> c1 = SubSpecialComment.find(c.id)
<del> c2 = Comment.find(c.id)
<del> assert_same(c1.class, c2.class)
<del> end
<del> end
<del>
<del> def test_inherited_without_type_attribute
<del> p1 = DestructivePirate.create!(:catchphrase => "I'm not a regular Pirate")
<del> p2 = Pirate.find(p1.id)
<del> assert_not_same(p1, p2)
<del> end
<del>
<del> def test_inherited_with_type_attribute
<del> c = comments(:sub_special_comment)
<del> c1 = SubSpecialComment.find(c.id)
<del> c2 = Comment.find(c.id)
<del> assert_same(c1, c2)
<del> end
<del>
<ide> ##############################################################################
<ide> # Tests checking dirty attribute behaviour with IM #
<ide> ############################################################################## | 3 |
Java | Java | relocate web artifacts in the tcf to web package | d0503ab733bb9d4987e51cb76a8b2ec58c5f8468 | <ide><path>spring-test/src/main/java/org/springframework/test/context/ContextLoaderUtils.java
<ide> abstract class ContextLoaderUtils {
<ide> private static final Log logger = LogFactory.getLog(ContextLoaderUtils.class);
<ide>
<ide> private static final String DEFAULT_CONTEXT_LOADER_CLASS_NAME = "org.springframework.test.context.support.DelegatingSmartContextLoader";
<del> private static final String DEFAULT_WEB_CONTEXT_LOADER_CLASS_NAME = "org.springframework.test.context.support.WebDelegatingSmartContextLoader";
<add> private static final String DEFAULT_WEB_CONTEXT_LOADER_CLASS_NAME = "org.springframework.test.context.web.WebDelegatingSmartContextLoader";
<ide>
<ide> private static final String WEB_APP_CONFIGURATION_CLASS_NAME = "org.springframework.test.context.web.WebAppConfiguration";
<ide> private static final String WEB_MERGED_CONTEXT_CONFIGURATION_CLASS_NAME = "org.springframework.test.context.web.WebMergedContextConfiguration";
<ide><path>spring-test/src/main/java/org/springframework/test/context/support/AbstractDelegatingSmartContextLoader.java
<ide> * @since 3.2
<ide> * @see SmartContextLoader
<ide> */
<del>abstract class AbstractDelegatingSmartContextLoader implements SmartContextLoader {
<add>public abstract class AbstractDelegatingSmartContextLoader implements SmartContextLoader {
<ide>
<ide> private static final Log logger = LogFactory.getLog(AbstractDelegatingSmartContextLoader.class);
<ide>
<ide><path>spring-test/src/main/java/org/springframework/test/context/support/AnnotationConfigContextLoaderUtils.java
<ide> * @author Sam Brannen
<ide> * @since 3.2
<ide> */
<del>abstract class AnnotationConfigContextLoaderUtils {
<add>public abstract class AnnotationConfigContextLoaderUtils {
<ide>
<ide> private static final Log logger = LogFactory.getLog(AnnotationConfigContextLoaderUtils.class);
<ide>
<ide> private static boolean isDefaultConfigurationClassCandidate(Class<?> clazz) {
<ide> * @return an array of default configuration classes, potentially empty but
<ide> * never <code>null</code>
<ide> */
<del> static Class<?>[] detectDefaultConfigurationClasses(Class<?> declaringClass) {
<add> public static Class<?>[] detectDefaultConfigurationClasses(Class<?> declaringClass) {
<ide> Assert.notNull(declaringClass, "Declaring class must not be null");
<ide>
<ide> List<Class<?>> configClasses = new ArrayList<Class<?>>();
<add><path>spring-test/src/main/java/org/springframework/test/context/web/AbstractGenericWebContextLoader.java
<del><path>spring-test/src/main/java/org/springframework/test/context/support/AbstractGenericWebContextLoader.java
<ide> * limitations under the License.
<ide> */
<ide>
<del>package org.springframework.test.context.support;
<add>package org.springframework.test.context.web;
<ide>
<ide> import javax.servlet.ServletContext;
<ide>
<ide> import org.springframework.core.io.ResourceLoader;
<ide> import org.springframework.mock.web.MockServletContext;
<ide> import org.springframework.test.context.MergedContextConfiguration;
<del>import org.springframework.test.context.web.WebMergedContextConfiguration;
<add>import org.springframework.test.context.support.AbstractContextLoader;
<ide> import org.springframework.web.context.WebApplicationContext;
<ide> import org.springframework.web.context.support.GenericWebApplicationContext;
<ide>
<add><path>spring-test/src/main/java/org/springframework/test/context/web/AnnotationConfigWebContextLoader.java
<del><path>spring-test/src/main/java/org/springframework/test/context/support/AnnotationConfigWebContextLoader.java
<ide> * limitations under the License.
<ide> */
<ide>
<del>package org.springframework.test.context.support;
<add>package org.springframework.test.context.web;
<ide>
<ide> import org.apache.commons.logging.Log;
<ide> import org.apache.commons.logging.LogFactory;
<ide> import org.springframework.context.annotation.AnnotatedBeanDefinitionReader;
<ide> import org.springframework.test.context.ContextConfigurationAttributes;
<del>import org.springframework.test.context.web.WebMergedContextConfiguration;
<add>import org.springframework.test.context.support.AbstractContextLoader;
<add>import org.springframework.test.context.support.AnnotationConfigContextLoaderUtils;
<ide> import org.springframework.util.ObjectUtils;
<ide> import org.springframework.web.context.support.GenericWebApplicationContext;
<ide>
<add><path>spring-test/src/main/java/org/springframework/test/context/web/GenericXmlWebContextLoader.java
<del><path>spring-test/src/main/java/org/springframework/test/context/support/XmlWebContextLoader.java
<ide> * limitations under the License.
<ide> */
<ide>
<del>package org.springframework.test.context.support;
<add>package org.springframework.test.context.web;
<ide>
<ide> import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
<del>import org.springframework.test.context.web.WebMergedContextConfiguration;
<ide> import org.springframework.web.context.support.GenericWebApplicationContext;
<ide>
<ide> /**
<del> * TODO [SPR-9864] Document XmlWebContextLoader.
<add> * TODO [SPR-9864] Document GenericXmlWebContextLoader.
<ide> *
<ide> * @author Sam Brannen
<ide> * @since 3.2
<ide> */
<del>public class XmlWebContextLoader extends AbstractGenericWebContextLoader {
<del>
<del> /**
<del> * Returns "<code>-context.xml</code>".
<del> */
<del> protected String getResourceSuffix() {
<del> return "-context.xml";
<del> }
<add>public class GenericXmlWebContextLoader extends AbstractGenericWebContextLoader {
<ide>
<ide> /**
<ide> * TODO [SPR-9864] Document overridden loadBeanDefinitions().
<ide> *
<del> * @see org.springframework.test.context.support.AbstractGenericWebContextLoader#loadBeanDefinitions(org.springframework.web.context.support.GenericWebApplicationContext, org.springframework.test.context.web.WebMergedContextConfiguration)
<add> * @see org.springframework.test.context.web.AbstractGenericWebContextLoader#loadBeanDefinitions(org.springframework.web.context.support.GenericWebApplicationContext, org.springframework.test.context.web.WebMergedContextConfiguration)
<ide> */
<ide> @Override
<ide> protected void loadBeanDefinitions(GenericWebApplicationContext context,
<ide> WebMergedContextConfiguration webMergedConfig) {
<ide> new XmlBeanDefinitionReader(context).loadBeanDefinitions(webMergedConfig.getLocations());
<ide> }
<ide>
<add> /**
<add> * Returns "<code>-context.xml</code>".
<add> */
<add> protected String getResourceSuffix() {
<add> return "-context.xml";
<add> }
<add>
<ide> }
<ide><path>spring-test/src/main/java/org/springframework/test/context/web/ServletTestExecutionListener.java
<ide>
<ide> import org.apache.commons.logging.Log;
<ide> import org.apache.commons.logging.LogFactory;
<del>
<ide> import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
<ide> import org.springframework.context.ApplicationContext;
<ide> import org.springframework.context.ConfigurableApplicationContext;
<ide> import org.springframework.mock.web.MockServletContext;
<ide> import org.springframework.test.context.TestContext;
<ide> import org.springframework.test.context.TestExecutionListener;
<add>import org.springframework.test.context.support.AbstractTestExecutionListener;
<ide> import org.springframework.web.context.WebApplicationContext;
<ide> import org.springframework.web.context.request.RequestContextHolder;
<ide> import org.springframework.web.context.request.ServletWebRequest;
<ide> * @author Sam Brannen
<ide> * @since 3.2
<ide> */
<del>public class ServletTestExecutionListener implements TestExecutionListener {
<add>public class ServletTestExecutionListener extends AbstractTestExecutionListener {
<ide>
<ide> private static final Log logger = LogFactory.getLog(ServletTestExecutionListener.class);
<ide>
<ide>
<del> /**
<del> * The default implementation is <em>empty</em>. Can be overridden by
<del> * subclasses as necessary.
<del> *
<del> * @see TestExecutionListener#beforeTestClass(TestContext)
<del> */
<del> public void beforeTestClass(TestContext testContext) throws Exception {
<del> /* no-op */
<del> }
<del>
<ide> /**
<ide> * TODO [SPR-9864] Document overridden prepareTestInstance().
<ide> *
<ide> public void afterTestMethod(TestContext testContext) throws Exception {
<ide> RequestContextHolder.resetRequestAttributes();
<ide> }
<ide>
<del> /**
<del> * The default implementation is <em>empty</em>. Can be overridden by
<del> * subclasses as necessary.
<del> *
<del> * @see TestExecutionListener#afterTestClass(TestContext)
<del> */
<del> public void afterTestClass(TestContext testContext) throws Exception {
<del> /* no-op */
<del> }
<del>
<ide> /**
<ide> * TODO [SPR-9864] Document setUpRequestContext().
<ide> *
<ide> * @param testContext
<del> * @param servletContext
<ide> */
<ide> private void setUpRequestContextIfNecessary(TestContext testContext) {
<ide>
<add><path>spring-test/src/main/java/org/springframework/test/context/web/WebDelegatingSmartContextLoader.java
<del><path>spring-test/src/main/java/org/springframework/test/context/support/WebDelegatingSmartContextLoader.java
<ide> * limitations under the License.
<ide> */
<ide>
<del>package org.springframework.test.context.support;
<add>package org.springframework.test.context.web;
<ide>
<ide> import org.springframework.test.context.SmartContextLoader;
<add>import org.springframework.test.context.support.AbstractDelegatingSmartContextLoader;
<ide>
<ide> /**
<ide> * {@code WebDelegatingSmartContextLoader} is a concrete implementation of
<del> * {@link AbstractDelegatingSmartContextLoader} that delegates to an
<del> * {@link XmlWebContextLoader} and an {@link AnnotationConfigWebContextLoader}.
<add> * {@link AbstractDelegatingSmartContextLoader} that delegates to a
<add> * {@link GenericXmlWebContextLoader} and an {@link AnnotationConfigWebContextLoader}.
<ide> *
<ide> * @author Sam Brannen
<ide> * @since 3.2
<ide> * @see SmartContextLoader
<ide> * @see AbstractDelegatingSmartContextLoader
<del> * @see XmlWebContextLoader
<add> * @see GenericXmlWebContextLoader
<ide> * @see AnnotationConfigWebContextLoader
<ide> */
<ide> public class WebDelegatingSmartContextLoader extends AbstractDelegatingSmartContextLoader {
<ide>
<del> private final SmartContextLoader xmlLoader = new XmlWebContextLoader();
<add> private final SmartContextLoader xmlLoader = new GenericXmlWebContextLoader();
<ide> private final SmartContextLoader annotationConfigLoader = new AnnotationConfigWebContextLoader();
<ide>
<ide>
<ide><path>spring-test/src/main/java/org/springframework/test/context/web/package-info.java
<add>/**
<add> * Web support classes for the <em>Spring TestContext Framework</em>.
<add> */
<add>
<add>package org.springframework.test.context.web;
<add> | 9 |
Python | Python | add setter for some properties of request class | c09268551110c6d423d9e1f51463f840abb8f147 | <ide><path>celery/worker/request.py
<ide> def __init__(self, message, on_ack=noop,
<ide> headers=None, decoded=False, utc=True,
<ide> maybe_make_aware=maybe_make_aware,
<ide> maybe_iso8601=maybe_iso8601, **opts):
<add> self._message = message
<ide> self._request_dict = message.headers if headers is None else headers
<ide> self._body = message.body if body is None else body
<ide> self._app = app
<ide> def __init__(self, message, on_ack=noop,
<ide> def delivery_info(self):
<ide> return self._delivery_info
<ide>
<add> @property
<add> def message(self):
<add> return self._message
<add>
<ide> @property
<ide> def request_dict(self):
<ide> return self._request_dict
<ide> def on_ack(self):
<ide> def on_reject(self):
<ide> return self._on_reject
<ide>
<add> @on_reject.setter
<add> def on_reject(self, value):
<add> self._on_reject = value
<add>
<ide> @property
<ide> def hostname(self):
<ide> return self._hostname
<ide> def hostname(self):
<ide> def eventer(self):
<ide> return self._eventer
<ide>
<add> @eventer.setter
<add> def eventer(self, eventer):
<add> self._eventer = eventer
<add>
<add> @property
<add> def connection_errors(self):
<add> return self._connection_errors
<add>
<ide> @property
<ide> def task(self):
<ide> return self._task
<ide> def eta(self):
<ide> def expires(self):
<ide> return self._expires
<ide>
<add> @expires.setter
<add> def expires(self, value):
<add> self._expires = value
<add>
<ide> @property
<ide> def tzlocal(self):
<ide> if self._tzlocal is None: | 1 |
PHP | PHP | allow simpler singleton binding syntax | 83c425b17758315964be7bc605c740a4c008b56b | <ide><path>src/Illuminate/Foundation/Application.php
<ide> public function register($provider, $force = false)
<ide>
<ide> if (property_exists($provider, 'singletons')) {
<ide> foreach ($provider->singletons as $key => $value) {
<add> $key = is_int($key) ? $value : $key;
<add>
<ide> $this->singleton($key, $value);
<ide> }
<ide> }
<ide><path>tests/Foundation/FoundationApplicationTest.php
<ide> public function testSingletonsAreCreatedWhenServiceProviderIsRegistered()
<ide> $app->register($provider = new class($app) extends ServiceProvider
<ide> {
<ide> public $singletons = [
<add> NonContractBackedClass::class,
<ide> AbstractClass::class => ConcreteClass::class,
<ide> ];
<ide> });
<ide> public function testSingletonsAreCreatedWhenServiceProviderIsRegistered()
<ide>
<ide> $this->assertInstanceOf(ConcreteClass::class, $instance);
<ide> $this->assertSame($instance, $app->make(AbstractClass::class));
<add>
<add> $instance = $app->make(NonContractBackedClass::class);
<add>
<add> $this->assertInstanceOf(NonContractBackedClass::class, $instance);
<add> $this->assertSame($instance, $app->make(NonContractBackedClass::class));
<ide> }
<ide>
<ide> public function testServiceProvidersAreCorrectlyRegisteredWhenRegisterMethodIsNotFilled()
<ide> class ConcreteClass extends AbstractClass
<ide> //
<ide> }
<ide>
<add>class NonContractBackedClass
<add>{
<add> //
<add>}
<add>
<ide> class ConcreteTerminator
<ide> {
<ide> public static $counter = 0; | 2 |
Java | Java | fix handling of line height with inline images | da8759c6103eef4b53b46640d1f03d29cf5614bb | <ide><path>ReactAndroid/src/main/java/com/facebook/react/flat/RCTTextInput.java
<ide> public void onCollectExtraUpdates(UIViewOperationQueue uiViewOperationQueue) {
<ide> super.onCollectExtraUpdates(uiViewOperationQueue);
<ide> if (mJsEventCount != UNSET) {
<ide> ReactTextUpdate reactTextUpdate =
<del> new ReactTextUpdate(getText(), mJsEventCount, false, getPadding());
<add> new ReactTextUpdate(getText(), mJsEventCount, false, getPadding(), Float.NaN);
<add> // TODO: the Float.NaN should be replaced with the real line height see D3592781
<ide> uiViewOperationQueue.enqueueUpdateExtraData(getReactTag(), reactTextUpdate);
<ide> }
<ide> } | 1 |
PHP | PHP | remove unnecessary lines | e821c27f54218a9bdbbf3e5bc4942fe0165cddcb | <ide><path>lib/Cake/Test/Case/Error/ExceptionRendererTest.php
<ide> public function testCakeErrorHelpersNotLost() {
<ide> $testApp . 'Error' . DS
<ide> ),
<ide> ), App::RESET);
<del> Configure::write('Error', array(
<del> 'handler' => 'TestAppsErrorHandler::handleError',
<del> 'level' => E_ALL & ~E_DEPRECATED,
<del> 'trace' => true
<del> ));
<del>
<del> Configure::write('Exception', array(
<del> 'handler' => 'TestAppsErrorHandler::handleException',
<del> 'renderer' => 'TestAppsExceptionRenderer',
<del> 'log' => true
<del> ));
<del>
<del> App::uses('TestAppsErrorController', 'Controller');
<del> App::uses('TestAppsExceptionRenderer', 'Error');
<ide>
<add> App::uses('TestAppsExceptionRenderer', 'Error');
<ide> $exception = new SocketException('socket exception');
<ide> $renderer = new TestAppsExceptionRenderer($exception);
<ide> | 1 |
PHP | PHP | add logging and error when fixture creation fails | 3b46cd43f1279b7d24c58861466c414e0eb8d72c | <ide><path>lib/Cake/TestSuite/Fixture/CakeTestFixture.php
<ide> public function create($db) {
<ide> $db->execute($db->createSchema($this->Schema), array('log' => false));
<ide> $this->created[] = $db->configKeyName;
<ide> } catch (Exception $e) {
<add> $msg = __d(
<add> 'cake_dev',
<add> 'Fixture creation for "%s" failed "%s"',
<add> $this->table,
<add> $e->getMessage()
<add> );
<add> CakeLog::error($msg);
<add> trigger_error($msg, E_USER_WARNING);
<ide> return false;
<ide> }
<ide> return true; | 1 |
Python | Python | change defaults of gru and lstm layers | 5949aeede9efd19447770cb1acb33192c08b54ff | <ide><path>keras/layers/recurrent.py
<ide> class GRUCell(Layer):
<ide>
<ide> def __init__(self, units,
<ide> activation='tanh',
<del> recurrent_activation='hard_sigmoid',
<add> recurrent_activation='sigmoid',
<ide> use_bias=True,
<ide> kernel_initializer='glorot_uniform',
<ide> recurrent_initializer='orthogonal',
<ide> def __init__(self, units,
<ide> bias_constraint=None,
<ide> dropout=0.,
<ide> recurrent_dropout=0.,
<del> implementation=1,
<add> implementation=2,
<ide> reset_after=False,
<ide> **kwargs):
<ide> super(GRUCell, self).__init__(**kwargs)
<ide> class GRU(RNN):
<ide> @interfaces.legacy_recurrent_support
<ide> def __init__(self, units,
<ide> activation='tanh',
<del> recurrent_activation='hard_sigmoid',
<add> recurrent_activation='sigmoid',
<ide> use_bias=True,
<ide> kernel_initializer='glorot_uniform',
<ide> recurrent_initializer='orthogonal',
<ide> def __init__(self, units,
<ide> bias_constraint=None,
<ide> dropout=0.,
<ide> recurrent_dropout=0.,
<del> implementation=1,
<add> implementation=2,
<ide> return_sequences=False,
<ide> return_state=False,
<ide> go_backwards=False,
<ide> class LSTMCell(Layer):
<ide>
<ide> def __init__(self, units,
<ide> activation='tanh',
<del> recurrent_activation='hard_sigmoid',
<add> recurrent_activation='sigmoid',
<ide> use_bias=True,
<ide> kernel_initializer='glorot_uniform',
<ide> recurrent_initializer='orthogonal',
<ide> def __init__(self, units,
<ide> bias_constraint=None,
<ide> dropout=0.,
<ide> recurrent_dropout=0.,
<del> implementation=1,
<add> implementation=2,
<ide> **kwargs):
<ide> super(LSTMCell, self).__init__(**kwargs)
<ide> self.units = units
<ide> class LSTM(RNN):
<ide> @interfaces.legacy_recurrent_support
<ide> def __init__(self, units,
<ide> activation='tanh',
<del> recurrent_activation='hard_sigmoid',
<add> recurrent_activation='sigmoid',
<ide> use_bias=True,
<ide> kernel_initializer='glorot_uniform',
<ide> recurrent_initializer='orthogonal',
<ide> def __init__(self, units,
<ide> bias_constraint=None,
<ide> dropout=0.,
<ide> recurrent_dropout=0.,
<del> implementation=1,
<add> implementation=2,
<ide> return_sequences=False,
<ide> return_state=False,
<ide> go_backwards=False, | 1 |
Python | Python | add more complete docstring to ones_like | 4b27d2291c67ef476254dba1277ca2e43152165e | <ide><path>numpy/core/code_generators/generate_umath.py
<ide> def __init__(self, nin, nout, identity, docstring,
<ide> ),
<ide> 'ones_like' :
<ide> Ufunc(1, 1, None,
<del> 'return 1',
<add> 'returns an array of ones of the shape and typecode of x.',
<ide> TD(nobool_or_obj),
<ide> TD(O, f='Py_get_one'),
<ide> ), | 1 |
Python | Python | cherrypick more python3 changes from node-gyp | 54fcb14467b59e82d6e24bf44803462226a5de4d | <ide><path>tools/gyp/pylib/gyp/MSVSSettings.py
<ide> MSBuild install directory, e.g. c:\Program Files (x86)\MSBuild
<ide> """
<ide>
<add>from __future__ import print_function
<add>
<add>from gyp import string_types
<add>
<ide> import sys
<ide> import re
<ide>
<ide> class _String(_Type):
<ide> """A setting that's just a string."""
<ide>
<ide> def ValidateMSVS(self, value):
<del> if not isinstance(value, basestring):
<add> if not isinstance(value, string_types):
<ide> raise ValueError('expected string; got %r' % value)
<ide>
<ide> def ValidateMSBuild(self, value):
<del> if not isinstance(value, basestring):
<add> if not isinstance(value, string_types):
<ide> raise ValueError('expected string; got %r' % value)
<ide>
<ide> def ConvertToMSBuild(self, value):
<ide> class _StringList(_Type):
<ide> """A settings that's a list of strings."""
<ide>
<ide> def ValidateMSVS(self, value):
<del> if not isinstance(value, basestring) and not isinstance(value, list):
<add> if not isinstance(value, string_types) and not isinstance(value, list):
<ide> raise ValueError('expected string list; got %r' % value)
<ide>
<ide> def ValidateMSBuild(self, value):
<del> if not isinstance(value, basestring) and not isinstance(value, list):
<add> if not isinstance(value, string_types) and not isinstance(value, list):
<ide> raise ValueError('expected string list; got %r' % value)
<ide>
<ide> def ConvertToMSBuild(self, value):
<ide> def _ValidateExclusionSetting(setting, settings, error_msg, stderr=sys.stderr):
<ide>
<ide> if unrecognized:
<ide> # We don't know this setting. Give a warning.
<del> print >> stderr, error_msg
<add> print(error_msg, file=stderr)
<ide>
<ide>
<ide> def FixVCMacroSlashes(s):
<ide> def ConvertVCMacrosToMSBuild(s):
<ide> '$(PlatformName)': '$(Platform)',
<ide> '$(SafeInputName)': '%(Filename)',
<ide> }
<del> for old, new in replace_map.iteritems():
<add> for old, new in replace_map.items():
<ide> s = s.replace(old, new)
<ide> s = FixVCMacroSlashes(s)
<ide> return s
<ide> def ConvertToMSBuildSettings(msvs_settings, stderr=sys.stderr):
<ide> dictionaries of settings and their values.
<ide> """
<ide> msbuild_settings = {}
<del> for msvs_tool_name, msvs_tool_settings in msvs_settings.iteritems():
<add> for msvs_tool_name, msvs_tool_settings in msvs_settings.items():
<ide> if msvs_tool_name in _msvs_to_msbuild_converters:
<ide> msvs_tool = _msvs_to_msbuild_converters[msvs_tool_name]
<del> for msvs_setting, msvs_value in msvs_tool_settings.iteritems():
<add> for msvs_setting, msvs_value in msvs_tool_settings.items():
<ide> if msvs_setting in msvs_tool:
<ide> # Invoke the translation function.
<ide> try:
<ide> msvs_tool[msvs_setting](msvs_value, msbuild_settings)
<del> except ValueError, e:
<del> print >> stderr, ('Warning: while converting %s/%s to MSBuild, '
<del> '%s' % (msvs_tool_name, msvs_setting, e))
<add> except ValueError as e:
<add> print('Warning: while converting %s/%s to MSBuild, '
<add> '%s' % (msvs_tool_name, msvs_setting, e), file=stderr)
<ide> else:
<ide> _ValidateExclusionSetting(msvs_setting,
<ide> msvs_tool,
<ide> def ConvertToMSBuildSettings(msvs_settings, stderr=sys.stderr):
<ide> (msvs_tool_name, msvs_setting)),
<ide> stderr)
<ide> else:
<del> print >> stderr, ('Warning: unrecognized tool %s while converting to '
<del> 'MSBuild.' % msvs_tool_name)
<add> print('Warning: unrecognized tool %s while converting to '
<add> 'MSBuild.' % msvs_tool_name, file=stderr)
<ide> return msbuild_settings
<ide>
<ide>
<ide> def _ValidateSettings(validators, settings, stderr):
<ide> for tool_name in settings:
<ide> if tool_name in validators:
<ide> tool_validators = validators[tool_name]
<del> for setting, value in settings[tool_name].iteritems():
<add> for setting, value in settings[tool_name].items():
<ide> if setting in tool_validators:
<ide> try:
<ide> tool_validators[setting](value)
<del> except ValueError, e:
<del> print >> stderr, ('Warning: for %s/%s, %s' %
<del> (tool_name, setting, e))
<add> except ValueError as e:
<add> print('Warning: for %s/%s, %s' % (tool_name, setting, e),
<add> file=stderr)
<ide> else:
<ide> _ValidateExclusionSetting(setting,
<ide> tool_validators,
<ide> def _ValidateSettings(validators, settings, stderr):
<ide> stderr)
<ide>
<ide> else:
<del> print >> stderr, ('Warning: unrecognized tool %s' % tool_name)
<add> print('Warning: unrecognized tool %s' % (tool_name), file=stderr)
<ide>
<ide>
<ide> # MSVS and MBuild names of the tools.
<ide><path>tools/gyp/pylib/gyp/MSVSUtil.py
<ide> def InsertLargePdbShims(target_list, target_dicts, vars):
<ide>
<ide> # Set up the shim to output its PDB to the same location as the final linker
<ide> # target.
<del> for config_name, config in shim_dict.get('configurations').iteritems():
<add> for config_name, config in shim_dict.get('configurations').items():
<ide> pdb_path = _GetPdbPath(target_dict, config_name, vars)
<ide>
<ide> # A few keys that we don't want to propagate.
<ide><path>tools/gyp/pylib/gyp/MSVSVersion.py
<ide> def _RegistryQuery(key, value=None):
<ide> text = None
<ide> try:
<ide> text = _RegistryQueryBase('Sysnative', key, value)
<del> except OSError, e:
<add> except OSError as e:
<ide> if e.errno == errno.ENOENT:
<ide> text = _RegistryQueryBase('System32', key, value)
<ide> else:
<ide> def _RegistryGetValueUsingWinReg(key, value):
<ide> contents of the registry key's value, or None on failure. Throws
<ide> ImportError if _winreg is unavailable.
<ide> """
<del> import _winreg
<add> try:
<add> # Python 2
<add> from _winreg import HKEY_LOCAL_MACHINE, OpenKey, QueryValueEx
<add> except ImportError:
<add> # Python 3
<add> from winreg import HKEY_LOCAL_MACHINE, OpenKey, QueryValueEx
<add>
<ide> try:
<ide> root, subkey = key.split('\\', 1)
<ide> assert root == 'HKLM' # Only need HKLM for now.
<del> with _winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE, subkey) as hkey:
<del> return _winreg.QueryValueEx(hkey, value)[0]
<add> with OpenKey(HKEY_LOCAL_MACHINE, subkey) as hkey:
<add> return QueryValueEx(hkey, value)[0]
<ide> except WindowsError:
<ide> return None
<ide>
<ide><path>tools/gyp/pylib/gyp/common.py
<ide> # Use of this source code is governed by a BSD-style license that can be
<ide> # found in the LICENSE file.
<ide>
<del>from __future__ import with_statement
<del>
<ide> import collections
<ide> import errno
<ide> import filecmp
<ide> def close(self):
<ide> same = False
<ide> try:
<ide> same = filecmp.cmp(self.tmp_path, filename, False)
<del> except OSError, e:
<add> except OSError as e:
<ide> if e.errno != errno.ENOENT:
<ide> raise
<ide>
<ide> def close(self):
<ide> #
<ide> # No way to get the umask without setting a new one? Set a safe one
<ide> # and then set it back to the old value.
<del> umask = os.umask(077)
<add> umask = os.umask(0o77)
<ide> os.umask(umask)
<del> os.chmod(self.tmp_path, 0666 & ~umask)
<add> os.chmod(self.tmp_path, 0o666 & ~umask)
<ide> if sys.platform == 'win32' and os.path.exists(filename):
<ide> # NOTE: on windows (but not cygwin) rename will not replace an
<ide> # existing file, so it must be preceded with a remove. Sadly there
<ide> def CopyTool(flavor, out_path, generator_flags={}):
<ide> ''.join([source[0], header] + source[1:]))
<ide>
<ide> # Make file executable.
<del> os.chmod(tool_path, 0755)
<add> os.chmod(tool_path, 0o755)
<ide>
<ide>
<ide> # From Alex Martelli,
<ide><path>tools/gyp/pylib/gyp/easy_xml.py
<ide> import re
<ide> import os
<ide> import locale
<add>from functools import reduce
<ide>
<ide>
<ide> def XmlToString(content, encoding='utf-8', pretty=False):
<ide> def _ConstructContentList(xml_parts, specification, pretty, level=0):
<ide> # Optionally in second position is a dictionary of the attributes.
<ide> rest = specification[1:]
<ide> if rest and isinstance(rest[0], dict):
<del> for at, val in sorted(rest[0].iteritems()):
<add> for at, val in sorted(rest[0].items()):
<ide> xml_parts.append(' %s="%s"' % (at, _XmlEscape(val, attr=True)))
<ide> rest = rest[1:]
<ide> if rest:
<ide><path>tools/gyp/pylib/gyp/generator/cmake.py
<ide> CMakeLists.txt file.
<ide> """
<ide>
<add>from __future__ import print_function
<add>
<ide> import multiprocessing
<ide> import os
<ide> import signal
<ide> def WriteTarget(namer, qualified_target, target_dicts, build_dir, config_to_use,
<ide>
<ide> cmake_target_type = cmake_target_type_from_gyp_target_type.get(target_type)
<ide> if cmake_target_type is None:
<del> print ('Target %s has unknown target type %s, skipping.' %
<del> ( target_name, target_type ) )
<add> print('Target %s has unknown target type %s, skipping.' %
<add> ( target_name, target_type))
<ide> return
<ide>
<ide> SetVariable(output, 'TARGET', target_name)
<ide> def WriteTarget(namer, qualified_target, target_dicts, build_dir, config_to_use,
<ide> default_product_ext = generator_default_variables['SHARED_LIB_SUFFIX']
<ide>
<ide> elif target_type != 'executable':
<del> print ('ERROR: What output file should be generated?',
<del> 'type', target_type, 'target', target_name)
<add> print('ERROR: What output file should be generated?',
<add> 'type', target_type, 'target', target_name)
<ide>
<ide> product_prefix = spec.get('product_prefix', default_product_prefix)
<ide> product_name = spec.get('product_name', default_product_name)
<ide> def PerformBuild(data, configurations, params):
<ide> output_dir,
<ide> config_name))
<ide> arguments = ['cmake', '-G', 'Ninja']
<del> print 'Generating [%s]: %s' % (config_name, arguments)
<add> print('Generating [%s]: %s' % (config_name, arguments))
<ide> subprocess.check_call(arguments, cwd=build_dir)
<ide>
<ide> arguments = ['ninja', '-C', build_dir]
<del> print 'Building [%s]: %s' % (config_name, arguments)
<add> print('Building [%s]: %s' % (config_name, arguments))
<ide> subprocess.check_call(arguments)
<ide>
<ide>
<ide> def GenerateOutput(target_list, target_dicts, data, params):
<ide> arglists.append((target_list, target_dicts, data,
<ide> params, config_name))
<ide> pool.map(CallGenerateOutputForConfig, arglists)
<del> except KeyboardInterrupt, e:
<add> except KeyboardInterrupt as e:
<ide> pool.terminate()
<ide> raise e
<ide> else:
<ide><path>tools/gyp/pylib/gyp/generator/make.py
<ide> # toplevel Makefile. It may make sense to generate some .mk files on
<ide> # the side to keep the files readable.
<ide>
<add>from __future__ import print_function
<add>
<ide> import os
<ide> import re
<ide> import sys
<ide> def _ValidateSourcesForOSX(spec, all_sources):
<ide> basenames.setdefault(basename, []).append(source)
<ide>
<ide> error = ''
<del> for basename, files in basenames.iteritems():
<add> for basename, files in basenames.items():
<ide> if len(files) > 1:
<ide> error += ' %s: %s\n' % (basename, ' '.join(files))
<ide>
<ide> def WriteSources(self, configs, deps, sources,
<ide> cflags_c = config.get('cflags_c')
<ide> cflags_cc = config.get('cflags_cc')
<ide>
<del> self.WriteLn("# Flags passed to all source files.");
<add> self.WriteLn("# Flags passed to all source files.")
<ide> self.WriteList(cflags, 'CFLAGS_%s' % configname)
<del> self.WriteLn("# Flags passed to only C files.");
<add> self.WriteLn("# Flags passed to only C files.")
<ide> self.WriteList(cflags_c, 'CFLAGS_C_%s' % configname)
<del> self.WriteLn("# Flags passed to only C++ files.");
<add> self.WriteLn("# Flags passed to only C++ files.")
<ide> self.WriteList(cflags_cc, 'CFLAGS_CC_%s' % configname)
<ide> if self.flavor == 'mac':
<del> self.WriteLn("# Flags passed to only ObjC files.");
<add> self.WriteLn("# Flags passed to only ObjC files.")
<ide> self.WriteList(cflags_objc, 'CFLAGS_OBJC_%s' % configname)
<del> self.WriteLn("# Flags passed to only ObjC++ files.");
<add> self.WriteLn("# Flags passed to only ObjC++ files.")
<ide> self.WriteList(cflags_objcc, 'CFLAGS_OBJCC_%s' % configname)
<ide> includes = config.get('include_dirs')
<ide> if includes:
<ide> def ComputeOutputBasename(self, spec):
<ide> elif self.type == 'none':
<ide> target = '%s.stamp' % target
<ide> elif self.type != 'executable':
<del> print ("ERROR: What output file should be generated?",
<del> "type", self.type, "target", target)
<add> print("ERROR: What output file should be generated?",
<add> "type", self.type, "target", target)
<ide>
<ide> target_prefix = spec.get('product_prefix', target_prefix)
<ide> target = spec.get('product_name', target)
<ide> def WriteTarget(self, spec, configs, deps, link_deps, bundle_deps,
<ide> # Postbuilds expect to be run in the gyp file's directory, so insert an
<ide> # implicit postbuild to cd to there.
<ide> postbuilds.insert(0, gyp.common.EncodePOSIXShellList(['cd', self.path]))
<del> for i in xrange(len(postbuilds)):
<add> for i in range(len(postbuilds)):
<ide> if not postbuilds[i].startswith('$'):
<ide> postbuilds[i] = EscapeShellArgument(postbuilds[i])
<ide> self.WriteLn('%s: builddir := $(abs_builddir)' % QuoteSpaces(self.output))
<ide> def WriteTarget(self, spec, configs, deps, link_deps, bundle_deps,
<ide> self.WriteDoCmd([self.output_binary], deps, 'touch', part_of_all,
<ide> postbuilds=postbuilds)
<ide> else:
<del> print "WARNING: no output for", self.type, target
<add> print("WARNING: no output for", self.type, self.target)
<ide>
<ide> # Add an alias for each target (if there are any outputs).
<ide> # Installable target aliases are created below.
<ide> def PerformBuild(data, configurations, params):
<ide> if options.toplevel_dir and options.toplevel_dir != '.':
<ide> arguments += '-C', options.toplevel_dir
<ide> arguments.append('BUILDTYPE=' + config)
<del> print 'Building [%s]: %s' % (config, arguments)
<add> print('Building [%s]: %s' % (config, arguments))
<ide> subprocess.check_call(arguments)
<ide>
<ide>
<ide><path>tools/gyp/pylib/gyp/generator/msvs.py
<ide> # Use of this source code is governed by a BSD-style license that can be
<ide> # found in the LICENSE file.
<ide>
<add>from __future__ import print_function
<add>
<ide> import copy
<ide> import ntpath
<ide> import os
<ide> def _ConfigWindowsTargetPlatformVersion(config_data, version):
<ide> if names:
<ide> return names[0]
<ide> else:
<del> print >> sys.stdout, (
<del> 'Warning: No include files found for '
<del> 'detected Windows SDK version %s' % (version)
<del> )
<add> print('Warning: No include files found for detected '
<add> 'Windows SDK version %s' % (version), file=sys.stdout)
<ide>
<ide>
<ide> def _BuildCommandLineForRuleRaw(spec, cmd, cygwin_shell, has_input_path,
<ide> def _AddCustomBuildToolForMSVS(p, spec, primary_input,
<ide> 'CommandLine': cmd,
<ide> })
<ide> # Add to the properties of primary input for each config.
<del> for config_name, c_data in spec['configurations'].iteritems():
<add> for config_name, c_data in spec['configurations'].items():
<ide> p.AddFileConfig(_FixPath(primary_input),
<ide> _ConfigFullName(config_name, c_data), tools=[tool])
<ide>
<ide> def _Replace(match):
<ide> # the VCProj but cause the same problem on the final command-line. Moving
<ide> # the item to the end of the list does works, but that's only possible if
<ide> # there's only one such item. Let's just warn the user.
<del> print >> sys.stderr, ('Warning: MSVS may misinterpret the odd number of ' +
<del> 'quotes in ' + s)
<add> print('Warning: MSVS may misinterpret the odd number of '
<add> 'quotes in ' + s, file=sys.stderr)
<ide> return s
<ide>
<ide>
<ide> def _ValidateSourcesForMSVSProject(spec, version):
<ide> basenames.setdefault(basename, []).append(source)
<ide>
<ide> error = ''
<del> for basename, files in basenames.iteritems():
<add> for basename, files in basenames.items():
<ide> if len(files) > 1:
<ide> error += ' %s: %s\n' % (basename, ' '.join(files))
<ide>
<ide> def _GenerateMSVSProject(project, options, version, generator_flags):
<ide> relative_path_of_gyp_file = gyp.common.RelativePath(gyp_path, project_dir)
<ide>
<ide> config_type = _GetMSVSConfigurationType(spec, project.build_file)
<del> for config_name, config in spec['configurations'].iteritems():
<add> for config_name, config in spec['configurations'].items():
<ide> _AddConfigurationToMSVSProject(p, spec, config_type, config_name, config)
<ide>
<ide> # MSVC08 and prior version cannot handle duplicate basenames in the same
<ide> def _ConvertToolsToExpectedForm(tools):
<ide> A list of Tool objects.
<ide> """
<ide> tool_list = []
<del> for tool, settings in tools.iteritems():
<add> for tool, settings in tools.items():
<ide> # Collapse settings with lists.
<ide> settings_fixed = {}
<del> for setting, value in settings.iteritems():
<add> for setting, value in settings.items():
<ide> if type(value) == list:
<ide> if ((tool == 'VCLinkerTool' and
<ide> setting == 'AdditionalDependencies') or
<ide> def _IdlFilesHandledNonNatively(spec, sources):
<ide> def _GetPrecompileRelatedFiles(spec):
<ide> # Gather a list of precompiled header related sources.
<ide> precompiled_related = []
<del> for _, config in spec['configurations'].iteritems():
<add> for _, config in spec['configurations'].items():
<ide> for k in precomp_keys:
<ide> f = config.get(k)
<ide> if f:
<ide> def _GetPrecompileRelatedFiles(spec):
<ide> def _ExcludeFilesFromBeingBuilt(p, spec, excluded_sources, excluded_idl,
<ide> list_excluded):
<ide> exclusions = _GetExcludedFilesFromBuild(spec, excluded_sources, excluded_idl)
<del> for file_name, excluded_configs in exclusions.iteritems():
<add> for file_name, excluded_configs in exclusions.items():
<ide> if (not list_excluded and
<ide> len(excluded_configs) == len(spec['configurations'])):
<ide> # If we're not listing excluded files, then they won't appear in the
<ide> def _GetExcludedFilesFromBuild(spec, excluded_sources, excluded_idl):
<ide> # Exclude excluded sources from being built.
<ide> for f in excluded_sources:
<ide> excluded_configs = []
<del> for config_name, config in spec['configurations'].iteritems():
<add> for config_name, config in spec['configurations'].items():
<ide> precomped = [_FixPath(config.get(i, '')) for i in precomp_keys]
<ide> # Don't do this for ones that are precompiled header related.
<ide> if f not in precomped:
<ide> def _GetExcludedFilesFromBuild(spec, excluded_sources, excluded_idl):
<ide> # Exclude them now.
<ide> for f in excluded_idl:
<ide> excluded_configs = []
<del> for config_name, config in spec['configurations'].iteritems():
<add> for config_name, config in spec['configurations'].items():
<ide> excluded_configs.append((config_name, config))
<ide> exclusions[f] = excluded_configs
<ide> return exclusions
<ide> def _GetExcludedFilesFromBuild(spec, excluded_sources, excluded_idl):
<ide> def _AddToolFilesToMSVS(p, spec):
<ide> # Add in tool files (rules).
<ide> tool_files = OrderedSet()
<del> for _, config in spec['configurations'].iteritems():
<add> for _, config in spec['configurations'].items():
<ide> for f in config.get('msvs_tool_files', []):
<ide> tool_files.add(f)
<ide> for f in tool_files:
<ide> def _HandlePreCompiledHeaders(p, sources, spec):
<ide> # kind (i.e. C vs. C++) as the precompiled header source stub needs
<ide> # to have use of precompiled headers disabled.
<ide> extensions_excluded_from_precompile = []
<del> for config_name, config in spec['configurations'].iteritems():
<add> for config_name, config in spec['configurations'].items():
<ide> source = config.get('msvs_precompiled_source')
<ide> if source:
<ide> source = _FixPath(source)
<ide> def DisableForSourceTree(source_tree):
<ide> else:
<ide> basename, extension = os.path.splitext(source)
<ide> if extension in extensions_excluded_from_precompile:
<del> for config_name, config in spec['configurations'].iteritems():
<add> for config_name, config in spec['configurations'].items():
<ide> tool = MSVSProject.Tool('VCCLCompilerTool',
<ide> {'UsePrecompiledHeader': '0',
<ide> 'ForcedIncludeFiles': '$(NOINHERIT)'})
<ide> def _WriteMSVSUserFile(project_path, version, spec):
<ide> return # Nothing to add
<ide> # Write out the user file.
<ide> user_file = _CreateMSVSUserFile(project_path, version, spec)
<del> for config_name, c_data in spec['configurations'].iteritems():
<add> for config_name, c_data in spec['configurations'].items():
<ide> user_file.AddDebugSettings(_ConfigFullName(config_name, c_data),
<ide> action, environment, working_directory)
<ide> user_file.WriteIfChanged()
<ide> def _GetPathDict(root, path):
<ide> def _DictsToFolders(base_path, bucket, flat):
<ide> # Convert to folders recursively.
<ide> children = []
<del> for folder, contents in bucket.iteritems():
<add> for folder, contents in bucket.items():
<ide> if type(contents) == dict:
<ide> folder_children = _DictsToFolders(os.path.join(base_path, folder),
<ide> contents, flat)
<ide> def _GetPlatformOverridesOfProject(spec):
<ide> # Prepare a dict indicating which project configurations are used for which
<ide> # solution configurations for this target.
<ide> config_platform_overrides = {}
<del> for config_name, c in spec['configurations'].iteritems():
<add> for config_name, c in spec['configurations'].items():
<ide> config_fullname = _ConfigFullName(config_name, c)
<ide> platform = c.get('msvs_target_platform', _ConfigPlatform(c))
<ide> fixed_config_fullname = '%s|%s' % (
<ide> def PerformBuild(data, configurations, params):
<ide> msvs_version = params['msvs_version']
<ide> devenv = os.path.join(msvs_version.path, 'Common7', 'IDE', 'devenv.com')
<ide>
<del> for build_file, build_file_dict in data.iteritems():
<add> for build_file, build_file_dict in data.items():
<ide> (build_file_root, build_file_ext) = os.path.splitext(build_file)
<ide> if build_file_ext != '.gyp':
<ide> continue
<ide> def PerformBuild(data, configurations, params):
<ide>
<ide> for config in configurations:
<ide> arguments = [devenv, sln_path, '/Build', config]
<del> print 'Building [%s]: %s' % (config, arguments)
<add> print('Building [%s]: %s' % (config, arguments))
<ide> rtn = subprocess.check_call(arguments)
<ide>
<ide>
<ide> def GenerateOutput(target_list, target_dicts, data, params):
<ide> configs = set()
<ide> for qualified_target in target_list:
<ide> spec = target_dicts[qualified_target]
<del> for config_name, config in spec['configurations'].iteritems():
<add> for config_name, config in spec['configurations'].items():
<ide> configs.add(_ConfigFullName(config_name, config))
<ide> configs = list(configs)
<ide>
<ide> def GenerateOutput(target_list, target_dicts, data, params):
<ide> if generator_flags.get('msvs_error_on_missing_sources', False):
<ide> raise GypError(error_message)
<ide> else:
<del> print >> sys.stdout, "Warning: " + error_message
<add> print("Warning: " + error_message, file=sys.stdout)
<ide>
<ide>
<ide> def _GenerateMSBuildFiltersFile(filters_path, source_files,
<ide> def _GetConfigurationCondition(name, settings):
<ide>
<ide> def _GetMSBuildProjectConfigurations(configurations):
<ide> group = ['ItemGroup', {'Label': 'ProjectConfigurations'}]
<del> for (name, settings) in sorted(configurations.iteritems()):
<add> for (name, settings) in sorted(configurations.items()):
<ide> configuration, platform = _GetConfigurationAndPlatform(name, settings)
<ide> designation = '%s|%s' % (configuration, platform)
<ide> group.append(
<ide> def _GetMSBuildGlobalProperties(spec, version, guid, gyp_file_name):
<ide>
<ide> def _GetMSBuildConfigurationDetails(spec, build_file):
<ide> properties = {}
<del> for name, settings in spec['configurations'].iteritems():
<add> for name, settings in spec['configurations'].items():
<ide> msbuild_attributes = _GetMSBuildAttributes(spec, settings, build_file)
<ide> condition = _GetConfigurationCondition(name, settings)
<ide> character_set = msbuild_attributes.get('CharacterSet')
<ide> def _GetMSBuildPropertySheets(configurations):
<ide> user_props = r'$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props'
<ide> additional_props = {}
<ide> props_specified = False
<del> for name, settings in sorted(configurations.iteritems()):
<add> for name, settings in sorted(configurations.items()):
<ide> configuration = _GetConfigurationCondition(name, settings)
<del> if settings.has_key('msbuild_props'):
<add> if 'msbuild_props' in settings:
<ide> additional_props[configuration] = _FixPaths(settings['msbuild_props'])
<ide> props_specified = True
<ide> else:
<ide> def _GetMSBuildPropertySheets(configurations):
<ide> ]
<ide> else:
<ide> sheets = []
<del> for condition, props in additional_props.iteritems():
<add> for condition, props in additional_props.items():
<ide> import_group = [
<ide> 'ImportGroup',
<ide> {'Label': 'PropertySheets',
<ide> def _ConvertMSVSBuildAttributes(spec, config, build_file):
<ide> elif a == 'ConfigurationType':
<ide> msbuild_attributes[a] = _ConvertMSVSConfigurationType(msvs_attributes[a])
<ide> else:
<del> print 'Warning: Do not know how to convert MSVS attribute ' + a
<add> print('Warning: Do not know how to convert MSVS attribute ' + a)
<ide> return msbuild_attributes
<ide>
<ide>
<ide> def _GetMSBuildConfigurationGlobalProperties(spec, configurations, build_file):
<ide> new_paths = '$(ExecutablePath);' + ';'.join(new_paths)
<ide>
<ide> properties = {}
<del> for (name, configuration) in sorted(configurations.iteritems()):
<add> for (name, configuration) in sorted(configurations.items()):
<ide> condition = _GetConfigurationCondition(name, configuration)
<ide> attributes = _GetMSBuildAttributes(spec, configuration, build_file)
<ide> msbuild_settings = configuration['finalized_msbuild_settings']
<ide> def _GetMSBuildConfigurationGlobalProperties(spec, configurations, build_file):
<ide> _AddConditionalProperty(properties, condition, 'ExecutablePath',
<ide> new_paths)
<ide> tool_settings = msbuild_settings.get('', {})
<del> for name, value in sorted(tool_settings.iteritems()):
<add> for name, value in sorted(tool_settings.items()):
<ide> formatted_value = _GetValueFormattedForMSBuild('', name, value)
<ide> _AddConditionalProperty(properties, condition, name, formatted_value)
<ide> return _GetMSBuildPropertyGroup(spec, None, properties)
<ide> def GetEdges(node):
<ide> # NOTE: reverse(topsort(DAG)) = topsort(reverse_edges(DAG))
<ide> for name in reversed(properties_ordered):
<ide> values = properties[name]
<del> for value, conditions in sorted(values.iteritems()):
<add> for value, conditions in sorted(values.items()):
<ide> if len(conditions) == num_configurations:
<del> # If the value is the same all configurations,
<add> # If the value is the same for all configurations,
<ide> # just add one unconditional entry.
<ide> def GetEdges(node):
<ide>
<ide> def _GetMSBuildToolSettingsSections(spec, configurations):
<ide> groups = []
<del> for (name, configuration) in sorted(configurations.iteritems()):
<add> for (name, configuration) in sorted(configurations.items()):
<ide> msbuild_settings = configuration['finalized_msbuild_settings']
<ide> group = ['ItemDefinitionGroup',
<ide> {'Condition': _GetConfigurationCondition(name, configuration)}
<ide> ]
<del> for tool_name, tool_settings in sorted(msbuild_settings.iteritems()):
<add> for tool_name, tool_settings in sorted(msbuild_settings.items()):
<ide> # Skip the tool named '' which is a holder of global settings handled
<ide> # by _GetMSBuildConfigurationGlobalProperties.
<ide> if tool_name:
<ide> if tool_settings:
<ide> tool = [tool_name]
<del> for name, value in sorted(tool_settings.iteritems()):
<add> for name, value in sorted(tool_settings.items()):
<ide> formatted_value = _GetValueFormattedForMSBuild(tool_name, name,
<ide> value)
<ide> tool.append([name, formatted_value])
<ide> def _FinalizeMSBuildSettings(spec, configuration):
<ide> for ignored_setting in ignored_settings:
<ide> value = configuration.get(ignored_setting)
<ide> if value:
<del> print ('Warning: The automatic conversion to MSBuild does not handle '
<del> '%s. Ignoring setting of %s' % (ignored_setting, str(value)))
<add> print('Warning: The automatic conversion to MSBuild does not handle '
<add> '%s. Ignoring setting of %s' % (ignored_setting, str(value)))
<ide>
<ide> defines = [_EscapeCppDefineForMSBuild(d) for d in defines]
<ide> disabled_warnings = _GetDisabledWarnings(configuration)
<ide> def _AddSources2(spec, sources, exclusions, grouped_sources,
<ide> {'Condition': condition},
<ide> 'true'])
<ide> # Add precompile if needed
<del> for config_name, configuration in spec['configurations'].iteritems():
<add> for config_name, configuration in spec['configurations'].items():
<ide> precompiled_source = configuration.get('msvs_precompiled_source', '')
<ide> if precompiled_source != '':
<ide> precompiled_source = _FixPath(precompiled_source)
<ide> def _GenerateActionsForMSBuild(spec, actions_to_add):
<ide> """
<ide> sources_handled_by_action = OrderedSet()
<ide> actions_spec = []
<del> for primary_input, actions in actions_to_add.iteritems():
<add> for primary_input, actions in actions_to_add.items():
<ide> inputs = OrderedSet()
<ide> outputs = OrderedSet()
<ide> descriptions = []
<ide><path>tools/gyp/pylib/gyp/generator/ninja.py
<ide> # Use of this source code is governed by a BSD-style license that can be
<ide> # found in the LICENSE file.
<ide>
<add>from __future__ import print_function
<add>
<ide> import collections
<ide> import copy
<ide> import hashlib
<ide> def WriteSpec(self, spec, config_name, generator_flags):
<ide> try:
<ide> sources = extra_sources + spec.get('sources', [])
<ide> except TypeError:
<del> print 'extra_sources: ', str(extra_sources)
<del> print 'spec.get("sources"): ', str(spec.get('sources'))
<add> print('extra_sources: ', str(extra_sources))
<add> print('spec.get("sources"): ', str(spec.get('sources')))
<ide> raise
<ide> if sources:
<ide> if self.flavor == 'mac' and len(self.archs) > 1:
<ide> def WriteSpec(self, spec, config_name, generator_flags):
<ide> if self.flavor != 'mac' or len(self.archs) == 1:
<ide> link_deps += [self.GypPathToNinja(o) for o in obj_outputs]
<ide> else:
<del> print "Warning: Actions/rules writing object files don't work with " \
<del> "multiarch targets, dropping. (target %s)" % spec['target_name']
<add> print("Warning: Actions/rules writing object files don't work with "
<add> "multiarch targets, dropping. (target %s)" % spec['target_name'])
<ide> elif self.flavor == 'mac' and len(self.archs) > 1:
<ide> link_deps = collections.defaultdict(list)
<ide>
<ide> def WriteMacXCassets(self, xcassets, bundle_depends):
<ide> 'XCASSETS_LAUNCH_IMAGE': 'launch-image',
<ide> }
<ide> settings = self.xcode_settings.xcode_settings[self.config_name]
<del> for settings_key, arg_name in settings_to_arg.iteritems():
<add> for settings_key, arg_name in settings_to_arg.items():
<ide> value = settings.get(settings_key)
<ide> if value:
<ide> extra_arguments[arg_name] = value
<ide> def GenerateOutputForConfig(target_list, target_dicts, data, params,
<ide> wrappers[key[:-len('_wrapper')]] = os.path.join(build_to_root, value)
<ide>
<ide> # Support wrappers from environment variables too.
<del> for key, value in os.environ.iteritems():
<add> for key, value in os.environ.items():
<ide> if key.lower().endswith('_wrapper'):
<ide> key_prefix = key[:-len('_wrapper')]
<ide> key_prefix = re.sub(r'\.HOST$', '.host', key_prefix)
<ide> def GenerateOutputForConfig(target_list, target_dicts, data, params,
<ide> configs, generator_flags)
<ide> cl_paths = gyp.msvs_emulation.GenerateEnvironmentFiles(
<ide> toplevel_build, generator_flags, shared_system_includes, OpenOutput)
<del> for arch, path in sorted(cl_paths.iteritems()):
<add> for arch, path in sorted(cl_paths.items()):
<ide> if clang_cl:
<ide> # If we have selected clang-cl, use that instead.
<ide> path = clang_cl
<ide> def PerformBuild(data, configurations, params):
<ide> for config in configurations:
<ide> builddir = os.path.join(options.toplevel_dir, 'out', config)
<ide> arguments = ['ninja', '-C', builddir]
<del> print 'Building [%s]: %s' % (config, arguments)
<add> print('Building [%s]: %s' % (config, arguments))
<ide> subprocess.check_call(arguments)
<ide>
<ide>
<ide> def GenerateOutput(target_list, target_dicts, data, params):
<ide> arglists.append(
<ide> (target_list, target_dicts, data, params, config_name))
<ide> pool.map(CallGenerateOutputForConfig, arglists)
<del> except KeyboardInterrupt, e:
<add> except KeyboardInterrupt as e:
<ide> pool.terminate()
<ide> raise e
<ide> else:
<ide><path>tools/gyp/pylib/gyp/generator/xcode.py
<ide> # Use of this source code is governed by a BSD-style license that can be
<ide> # found in the LICENSE file.
<ide>
<add>from __future__ import print_function
<add>
<ide> import filecmp
<ide> import gyp.common
<ide> import gyp.xcodeproj_file
<ide> def __init__(self, gyp_path, path, build_file_dict):
<ide> try:
<ide> os.makedirs(self.path)
<ide> self.created_dir = True
<del> except OSError, e:
<add> except OSError as e:
<ide> if e.errno != errno.EEXIST:
<ide> raise
<ide>
<ide> def Finalize1(self, xcode_targets, serialize_all_tests):
<del> # the tree tree view for UI display.
<add> # the tree view for UI display.
<ide> # Any values set globally are applied to all configurations, then any
<ide> # per-configuration values are applied.
<del> for xck, xcv in self.build_file_dict.get('xcode_settings', {}).iteritems():
<add> for xck, xcv in self.build_file_dict.get('xcode_settings', {}).items():
<ide> xccl.SetBuildSetting(xck, xcv)
<ide> if 'xcode_config_file' in self.build_file_dict:
<ide> config_ref = self.project.AddOrGetFileInRootGroup(
<ide> def Finalize1(self, xcode_targets, serialize_all_tests):
<ide> if build_file_configuration_named:
<ide> xcc = xccl.ConfigurationNamed(config_name)
<ide> for xck, xcv in build_file_configuration_named.get('xcode_settings',
<del> {}).iteritems():
<add> {}).items():
<ide> xcc.SetBuildSetting(xck, xcv)
<ide> if 'xcode_config_file' in build_file_configuration_named:
<ide> config_ref = self.project.AddOrGetFileInRootGroup(
<ide> def Finalize1(self, xcode_targets, serialize_all_tests):
<ide> script = script + "\n".join(
<ide> ['export %s="%s"' %
<ide> (key, gyp.xcodeproj_file.ConvertVariablesToShellSyntax(val))
<del> for (key, val) in command.get('environment').iteritems()]) + "\n"
<add> for (key, val) in command.get('environment').items()]) + "\n"
<ide>
<del> # Some test end up using sockets, files on disk, etc. and can get
<del> # confused if more then one test runs at a time. The generator
<add> # Some tests end up using sockets, files on disk, etc. and can get
<add> # confused if more than one test runs at a time. The generator
<ide> def Write(self):
<ide> same = False
<ide> try:
<ide> same = filecmp.cmp(pbxproj_path, new_pbxproj_path, False)
<del> except OSError, e:
<add> except OSError as e:
<ide> if e.errno != errno.ENOENT:
<ide> raise
<ide>
<ide> def Write(self):
<ide> #
<ide> # No way to get the umask without setting a new one? Set a safe one
<ide> # and then set it back to the old value.
<del> umask = os.umask(077)
<add> umask = os.umask(0o77)
<ide> os.umask(umask)
<ide>
<del> os.chmod(new_pbxproj_path, 0666 & ~umask)
<add> os.chmod(new_pbxproj_path, 0o666 & ~umask)
<ide> os.rename(new_pbxproj_path, pbxproj_path)
<ide>
<ide> except Exception:
<ide> def EscapeXcodeDefine(s):
<ide> def PerformBuild(data, configurations, params):
<ide> options = params['options']
<ide>
<del> for build_file, build_file_dict in data.iteritems():
<add> for build_file, build_file_dict in data.items():
<ide> (build_file_root, build_file_ext) = os.path.splitext(build_file)
<ide> if build_file_ext != '.gyp':
<ide> continue
<ide> def PerformBuild(data, configurations, params):
<ide> for config in configurations:
<ide> arguments = ['xcodebuild', '-project', xcodeproj_path]
<ide> arguments += ['-configuration', config]
<del> print "Building [%s]: %s" % (config, arguments)
<add> print("Building [%s]: %s" % (config, arguments))
<ide> subprocess.check_call(arguments)
<ide>
<ide>
<ide> def GenerateOutput(target_list, target_dicts, data, params):
<ide> skip_excluded_files = \
<ide> not generator_flags.get('xcode_list_excluded_files', True)
<ide> xcode_projects = {}
<del> for build_file, build_file_dict in data.iteritems():
<add> for build_file, build_file_dict in data.items():
<ide> (build_file_root, build_file_ext) = os.path.splitext(build_file)
<ide> if build_file_ext != '.gyp':
<ide> continue
<ide> def GenerateOutput(target_list, target_dicts, data, params):
<ide> xctarget_type = gyp.xcodeproj_file.PBXNativeTarget
<ide> try:
<ide> target_properties['productType'] = _types[type_bundle_key]
<del> except KeyError, e:
<add> except KeyError as e:
<ide> gyp.common.ExceptionAppend(e, "-- unknown product type while "
<ide> "writing target %s" % target_name)
<ide> raise
<ide> def GenerateOutput(target_list, target_dicts, data, params):
<ide> # target.
<ide> makefile.write('all: \\\n')
<ide> for concrete_output_index in \
<del> xrange(0, len(concrete_outputs_by_rule_source)):
<add> range(0, len(concrete_outputs_by_rule_source)):
<ide> # Only list the first (index [0]) concrete output of each input
<ide> # in the "all" target. Otherwise, a parallel make (-j > 1) would
<ide> # attempt to process each input multiple times simultaneously.
<ide> def GenerateOutput(target_list, target_dicts, data, params):
<ide> # rule source. Collect the names of the directories that are
<ide> # required.
<ide> concrete_output_dirs = []
<del> for concrete_output_index in xrange(0, len(concrete_outputs)):
<add> for concrete_output_index in range(0, len(concrete_outputs)):
<ide> concrete_output = concrete_outputs[concrete_output_index]
<ide> if concrete_output_index == 0:
<ide> bol = ''
<ide> def GenerateOutput(target_list, target_dicts, data, params):
<ide> # the set of additional rule inputs, if any.
<ide> prerequisites = [rule_source]
<ide> prerequisites.extend(rule.get('inputs', []))
<del> for prerequisite_index in xrange(0, len(prerequisites)):
<add> for prerequisite_index in range(0, len(prerequisites)):
<ide> prerequisite = prerequisites[prerequisite_index]
<ide> if prerequisite_index == len(prerequisites) - 1:
<ide> eol = ''
<ide> def GenerateOutput(target_list, target_dicts, data, params):
<ide> set_define = EscapeXcodeDefine(define)
<ide> xcbc.AppendBuildSetting('GCC_PREPROCESSOR_DEFINITIONS', set_define)
<ide> if 'xcode_settings' in configuration:
<del> for xck, xcv in configuration['xcode_settings'].iteritems():
<add> for xck, xcv in configuration['xcode_settings'].items():
<ide> xcbc.SetBuildSetting(xck, xcv)
<ide> if 'xcode_config_file' in configuration:
<ide> config_ref = pbxp.AddOrGetFileInRootGroup(
<ide> configuration['xcode_config_file'])
<ide> xcbc.SetBaseConfiguration(config_ref)
<ide>
<ide> build_files = []
<del> for build_file, build_file_dict in data.iteritems():
<add> for build_file, build_file_dict in data.items():
<ide> if build_file.endswith('.gyp'):
<ide> build_files.append(build_file)
<ide>
<ide><path>tools/gyp/pylib/gyp/mac_tool.py
<ide> These functions are executed via gyp-mac-tool when using the Makefile generator.
<ide> """
<ide>
<add>from __future__ import print_function
<add>
<ide> import fcntl
<ide> import fnmatch
<ide> import glob
<ide> def _DetectInputEncoding(self, file_name):
<ide> fp = open(file_name, 'rb')
<ide> try:
<ide> header = fp.read(3)
<del> except:
<add> except Exception:
<ide> fp.close()
<ide> return None
<ide> fp.close()
<ide> def ExecFilterLibtool(self, *cmd_list):
<ide> _, err = libtoolout.communicate()
<ide> for line in err.splitlines():
<ide> if not libtool_re.match(line) and not libtool_re5.match(line):
<del> print >>sys.stderr, line
<add> print(line, file=sys.stderr)
<ide> # Unconditionally touch the output .a file on the command line if present
<ide> # and the command succeeded. A bit hacky.
<ide> if not libtoolout.returncode:
<ide> def ExecCompileXcassets(self, keys, *inputs):
<ide> ])
<ide> if keys:
<ide> keys = json.loads(keys)
<del> for key, value in keys.iteritems():
<add> for key, value in keys.items():
<ide> arg_name = '--' + key
<ide> if isinstance(value, bool):
<ide> if value:
<ide> def _FindProvisioningProfile(self, profile, bundle_identifier):
<ide> profiles_dir = os.path.join(
<ide> os.environ['HOME'], 'Library', 'MobileDevice', 'Provisioning Profiles')
<ide> if not os.path.isdir(profiles_dir):
<del> print >>sys.stderr, (
<del> 'cannot find mobile provisioning for %s' % bundle_identifier)
<add> print('cannot find mobile provisioning for %s' % bundle_identifier,
<add> file=sys.stderr)
<ide> sys.exit(1)
<ide> provisioning_profiles = None
<ide> if profile:
<ide> def _FindProvisioningProfile(self, profile, bundle_identifier):
<ide> valid_provisioning_profiles[app_id_pattern] = (
<ide> profile_path, profile_data, team_identifier)
<ide> if not valid_provisioning_profiles:
<del> print >>sys.stderr, (
<del> 'cannot find mobile provisioning for %s' % bundle_identifier)
<add> print('cannot find mobile provisioning for %s' % bundle_identifier,
<add> file=sys.stderr)
<ide> sys.exit(1)
<ide> # If the user has multiple provisioning profiles installed that can be
<ide> # used for ${bundle_identifier}, pick the most specific one (ie. the
<ide> def _LoadProvisioningProfile(self, profile_path):
<ide>
<ide> def _MergePlist(self, merged_plist, plist):
<ide> """Merge |plist| into |merged_plist|."""
<del> for key, value in plist.iteritems():
<add> for key, value in plist.items():
<ide> if isinstance(value, dict):
<ide> merged_value = merged_plist.get(key, {})
<ide> if isinstance(merged_value, dict):
<ide> def _ExpandVariables(self, data, substitutions):
<ide> the key was not found.
<ide> """
<ide> if isinstance(data, str):
<del> for key, value in substitutions.iteritems():
<add> for key, value in substitutions.items():
<ide> data = data.replace('$(%s)' % key, value)
<ide> return data
<ide> if isinstance(data, list):
<ide><path>tools/gyp/pylib/gyp/msvs_emulation.py
<ide> def __init__(self, spec, generator_flags):
<ide> configs = spec['configurations']
<ide> for field, default in supported_fields:
<ide> setattr(self, field, {})
<del> for configname, config in configs.iteritems():
<add> for configname, config in configs.items():
<ide> getattr(self, field)[configname] = config.get(field, default())
<ide>
<ide> self.msvs_cygwin_dirs = spec.get('msvs_cygwin_dirs', ['.'])
<ide> def ExpandMacros(string, expansions):
<ide> """Expand $(Variable) per expansions dict. See MsvsSettings.GetVSMacroEnv
<ide> for the canonical way to retrieve a suitable dict."""
<ide> if '$' in string:
<del> for old, new in expansions.iteritems():
<add> for old, new in expansions.items():
<ide> assert '$(' not in new, new
<ide> string = string.replace(old, new)
<ide> return string
<ide> def _FormatAsEnvironmentBlock(envvar_dict):
<ide> CreateProcess documentation for more details."""
<ide> block = ''
<ide> nul = '\0'
<del> for key, value in envvar_dict.iteritems():
<add> for key, value in envvar_dict.items():
<ide> block += key + '=' + value + nul
<ide> block += nul
<ide> return block
<ide><path>tools/gyp/pylib/gyp/win_tool.py
<ide> These functions are executed via gyp-win-tool when using the ninja generator.
<ide> """
<ide>
<add>from __future__ import print_function
<add>
<ide> import os
<ide> import re
<ide> import shutil
<ide> def ExecLinkWrapper(self, arch, use_separate_mspdbsrv, *args):
<ide> if (not line.startswith(' Creating library ') and
<ide> not line.startswith('Generating code') and
<ide> not line.startswith('Finished generating code')):
<del> print line
<add> print(line)
<ide> return link.returncode
<ide>
<ide> def ExecLinkWithManifests(self, arch, embed_manifest, out, ldcmd, resname,
<ide> def ExecManifestWrapper(self, arch, *args):
<ide> out, _ = popen.communicate()
<ide> for line in out.splitlines():
<ide> if line and 'manifest authoring warning 81010002' not in line:
<del> print line
<add> print(line)
<ide> return popen.returncode
<ide>
<ide> def ExecManifestToRc(self, arch, *args):
<ide> def ExecMidlWrapper(self, arch, outdir, tlb, h, dlldata, iid, proxy, idl,
<ide> for x in lines if x.startswith(prefixes))
<ide> for line in lines:
<ide> if not line.startswith(prefixes) and line not in processing:
<del> print line
<add> print(line)
<ide> return popen.returncode
<ide>
<ide> def ExecAsmWrapper(self, arch, *args):
<ide> def ExecAsmWrapper(self, arch, *args):
<ide> not line.startswith('Microsoft (R) Macro Assembler') and
<ide> not line.startswith(' Assembling: ') and
<ide> line):
<del> print line
<add> print(line)
<ide> return popen.returncode
<ide>
<ide> def ExecRcWrapper(self, arch, *args):
<ide> def ExecRcWrapper(self, arch, *args):
<ide> if (not line.startswith('Microsoft (R) Windows (R) Resource Compiler') and
<ide> not line.startswith('Copyright (C) Microsoft Corporation') and
<ide> line):
<del> print line
<add> print(line)
<ide> return popen.returncode
<ide>
<ide> def ExecActionWrapper(self, arch, rspfile, *dir):
<ide> def ExecActionWrapper(self, arch, rspfile, *dir):
<ide> env = self._GetEnv(arch)
<ide> # TODO(scottmg): This is a temporary hack to get some specific variables
<ide> # through to actions that are set after gyp-time. http://crbug.com/333738.
<del> for k, v in os.environ.iteritems():
<add> for k, v in os.environ.items():
<ide> if k not in env:
<ide> env[k] = v
<ide> args = open(rspfile).read()
<ide><path>tools/gyp/pylib/gyp/xcode_emulation.py
<ide> other build systems, such as make and ninja.
<ide> """
<ide>
<add>from __future__ import print_function
<add>
<ide> import copy
<ide> import gyp.common
<ide> import os
<ide> def _ExpandArchs(self, archs, sdkroot):
<ide> if arch not in expanded_archs:
<ide> expanded_archs.append(arch)
<ide> except KeyError as e:
<del> print 'Warning: Ignoring unsupported variable "%s".' % variable
<add> print('Warning: Ignoring unsupported variable "%s".' % variable)
<ide> elif arch not in expanded_archs:
<ide> expanded_archs.append(arch)
<ide> return expanded_archs
<ide> def __init__(self, spec):
<ide> # the same for all configs are implicitly per-target settings.
<ide> self.xcode_settings = {}
<ide> configs = spec['configurations']
<del> for configname, config in configs.iteritems():
<add> for configname, config in configs.items():
<ide> self.xcode_settings[configname] = config.get('xcode_settings', {})
<ide> self._ConvertConditionalKeys(configname)
<ide> if self.xcode_settings[configname].get('IPHONEOS_DEPLOYMENT_TARGET',
<ide> def _ConvertConditionalKeys(self, configname):
<ide> new_key = key.split("[")[0]
<ide> settings[new_key] = settings[key]
<ide> else:
<del> print 'Warning: Conditional keys not implemented, ignoring:', \
<del> ' '.join(conditional_keys)
<add> print('Warning: Conditional keys not implemented, ignoring:',
<add> ' '.join(conditional_keys))
<ide> del settings[key]
<ide>
<ide> def _Settings(self):
<ide> def _Appendf(self, lst, test_key, format_str, default=None):
<ide>
<ide> def _WarnUnimplemented(self, test_key):
<ide> if test_key in self._Settings():
<del> print 'Warning: Ignoring not yet implemented key "%s".' % test_key
<add> print('Warning: Ignoring not yet implemented key "%s".' % test_key)
<ide>
<ide> def IsBinaryOutputFormat(self, configname):
<ide> default = "binary" if self.isIOS else "xml"
<ide> def GetCflags(self, configname, arch=None):
<ide> cflags += self._Settings().get('WARNING_CFLAGS', [])
<ide>
<ide> if self._IsXCTest():
<del> platform_root = self._XcodePlatformPath(configname)
<del> if platform_root:
<del> cflags.append('-F' + platform_root + '/Developer/Library/Frameworks/')
<add> platform_root = self._XcodePlatformPath(configname)
<add> if platform_root:
<add> cflags.append('-F' + platform_root + '/Developer/Library/Frameworks/')
<ide>
<ide> if sdk_root:
<ide> framework_root = sdk_root
<ide> def GetLdflags(self, configname, product_dir, gyp_to_build_path, arch=None):
<ide> ldflags.append('-F' + directory.replace('$(SDKROOT)', sdk_root))
<ide>
<ide> if self._IsXCTest():
<del> platform_root = self._XcodePlatformPath(configname)
<del> if sdk_root and platform_root:
<del> ldflags.append('-F' + platform_root + '/Developer/Library/Frameworks/')
<del> ldflags.append('-framework XCTest')
<add> platform_root = self._XcodePlatformPath(configname)
<add> if sdk_root and platform_root:
<add> ldflags.append('-F' + platform_root + '/Developer/Library/Frameworks/')
<add> ldflags.append('-framework XCTest')
<ide>
<ide> is_extension = self._IsIosAppExtension() or self._IsIosWatchKitExtension()
<ide> if sdk_root and is_extension:
<ide> def GetPerTargetSettings(self):
<ide> result = dict(self.xcode_settings[configname])
<ide> first_pass = False
<ide> else:
<del> for key, value in self.xcode_settings[configname].iteritems():
<add> for key, value in self.xcode_settings[configname].items():
<ide> if key not in result:
<ide> continue
<ide> elif result[key] != value:
<ide> def _GetIOSPostbuilds(self, configname, output_binary):
<ide> unimpl = ['OTHER_CODE_SIGN_FLAGS']
<ide> unimpl = set(unimpl) & set(self.xcode_settings[configname].keys())
<ide> if unimpl:
<del> print 'Warning: Some codesign keys not implemented, ignoring: %s' % (
<del> ', '.join(sorted(unimpl)))
<add> print('Warning: Some codesign keys not implemented, ignoring: %s' %
<add> ', '.join(sorted(unimpl)))
<ide>
<ide> if self._IsXCTest():
<ide> # For device xctests, Xcode copies two extra frameworks into $TEST_HOST.
<ide> def GetEdges(node):
<ide> order = gyp.common.TopologicallySorted(env.keys(), GetEdges)
<ide> order.reverse()
<ide> return order
<del> except gyp.common.CycleError, e:
<add> except gyp.common.CycleError as e:
<ide> raise GypError(
<ide> 'Xcode environment variables are cyclically dependent: ' + str(e.nodes))
<ide>
<ide> def _AddIOSDeviceConfigurations(targets):
<del> for target_dict in targets.itervalues():
<add> for target_dict in targets.values():
<ide> toolset = target_dict['toolset']
<ide> configs = target_dict['configurations']
<del> for config_name, simulator_config_dict in dict(configs).iteritems():
<add> for config_name, simulator_config_dict in dict(configs).items():
<ide> iphoneos_config_dict = copy.deepcopy(simulator_config_dict)
<ide> configs[config_name + '-iphoneos'] = iphoneos_config_dict
<ide> configs[config_name + '-iphonesimulator'] = simulator_config_dict
<ide><path>tools/gyp/pylib/gyp/xcode_ninja.py
<ide> def _WriteWorkspace(main_gyp, sources_gyp, params):
<ide> workspace_path = os.path.join(options.generator_output, workspace_path)
<ide> try:
<ide> os.makedirs(workspace_path)
<del> except OSError, e:
<add> except OSError as e:
<ide> if e.errno != errno.EEXIST:
<ide> raise
<ide> output_string = '<?xml version="1.0" encoding="UTF-8"?>\n' + \
<ide> def CreateWrapper(target_list, target_dicts, data, params):
<ide> params: Dict of global options for gyp.
<ide> """
<ide> orig_gyp = params['build_files'][0]
<del> for gyp_name, gyp_dict in data.iteritems():
<add> for gyp_name, gyp_dict in data.items():
<ide> if gyp_name == orig_gyp:
<ide> depth = gyp_dict['_DEPTH']
<ide>
<ide> def CreateWrapper(target_list, target_dicts, data, params):
<ide> not generator_flags.get('xcode_ninja_list_excluded_files', True)
<ide>
<ide> sources = []
<del> for target, target_dict in target_dicts.iteritems():
<add> for target, target_dict in target_dicts.items():
<ide> base = os.path.dirname(target)
<ide> files = target_dict.get('sources', []) + \
<ide> target_dict.get('mac_bundle_resources', [])
<ide><path>tools/gyp/pylib/gyp/xcodeproj_file.py
<ide> """
<ide>
<ide> import gyp.common
<add>import hashlib
<ide> import posixpath
<ide> import re
<ide> import struct
<ide> import sys
<ide>
<del># hashlib is supplied as of Python 2.5 as the replacement interface for sha
<del># and other secure hashes. In 2.6, sha is deprecated. Import hashlib if
<del># available, avoiding a deprecation warning under 2.6. Import sha otherwise,
<del># preserving 2.4 compatibility.
<ide> try:
<del> import hashlib
<del> _new_sha1 = hashlib.sha1
<del>except ImportError:
<del> import sha
<del> _new_sha1 = sha.new
<add> basestring, cmp, unicode
<add>except NameError: # Python 3
<add> basestring = unicode = str
<add> def cmp(x, y):
<add> return (x > y) - (x < y)
<ide>
<ide>
<ide> # See XCObject._EncodeString. This pattern is used to determine when a string
<ide> def Copy(self):
<ide> """
<ide>
<ide> that = self.__class__(id=self.id, parent=self.parent)
<del> for key, value in self._properties.iteritems():
<add> for key, value in self._properties.items():
<ide> is_strong = self._schema[key][2]
<ide>
<ide> if isinstance(value, XCObject):
<ide> def Copy(self):
<ide> that._properties[key] = new_value
<ide> else:
<ide> that._properties[key] = value
<del> elif isinstance(value, str) or isinstance(value, unicode) or \
<del> isinstance(value, int):
<add> elif isinstance(value, (basestring, int)):
<ide> that._properties[key] = value
<ide> elif isinstance(value, list):
<ide> if is_strong:
<ide> def _HashUpdate(hash, data):
<ide> hash.update(data)
<ide>
<ide> if seed_hash is None:
<del> seed_hash = _new_sha1()
<add> seed_hash = hashlib.sha1()
<ide>
<ide> hash = seed_hash.copy()
<ide>
<ide> def _HashUpdate(hash, data):
<ide> digest_int_count = hash.digest_size / 4
<ide> digest_ints = struct.unpack('>' + 'I' * digest_int_count, hash.digest())
<ide> id_ints = [0, 0, 0]
<del> for index in xrange(0, digest_int_count):
<add> for index in range(0, digest_int_count):
<ide> id_ints[index % 3] ^= digest_ints[index]
<ide> self.id = '%08X%08X%08X' % tuple(id_ints)
<ide>
<ide> def Children(self):
<ide> """Returns a list of all of this object's owned (strong) children."""
<ide>
<ide> children = []
<del> for property, attributes in self._schema.iteritems():
<add> for property, attributes in self._schema.items():
<ide> (is_list, property_type, is_strong) = attributes[0:3]
<ide> if is_strong and property in self._properties:
<ide> if not is_list:
<ide> def _XCPrintableValue(self, tabs, value, flatten_list=False):
<ide> printable += end_tabs + ')'
<ide> elif isinstance(value, dict):
<ide> printable = '{' + sep
<del> for item_key, item_value in sorted(value.iteritems()):
<add> for item_key, item_value in sorted(value.items()):
<ide> printable += element_tabs + \
<ide> self._XCPrintableValue(tabs + 1, item_key, flatten_list) + ' = ' + \
<ide> self._XCPrintableValue(tabs + 1, item_value, flatten_list) + ';' + \
<ide> def _XCKVPrint(self, file, tabs, key, value):
<ide> printable_value[0] == '"' and printable_value[-1] == '"':
<ide> printable_value = printable_value[1:-1]
<ide> printable += printable_key + ' = ' + printable_value + ';' + after_kv
<del> except TypeError, e:
<add> except TypeError as e:
<ide> gyp.common.ExceptionAppend(e,
<ide> 'while printing key "%s"' % key)
<ide> raise
<ide> def Print(self, file=sys.stdout):
<ide> self._XCKVPrint(file, 3, 'isa', self.__class__.__name__)
<ide>
<ide> # The remaining elements of an object dictionary are sorted alphabetically.
<del> for property, value in sorted(self._properties.iteritems()):
<add> for property, value in sorted(self._properties.items()):
<ide> self._XCKVPrint(file, 3, property, value)
<ide>
<ide> # End the object.
<ide> def UpdateProperties(self, properties, do_copy=False):
<ide> if properties is None:
<ide> return
<ide>
<del> for property, value in properties.iteritems():
<add> for property, value in properties.items():
<ide> # Make sure the property is in the schema.
<ide> if not property in self._schema:
<ide> raise KeyError(property + ' not in ' + self.__class__.__name__)
<ide> def UpdateProperties(self, properties, do_copy=False):
<ide> self._properties[property] = value.Copy()
<ide> else:
<ide> self._properties[property] = value
<del> elif isinstance(value, str) or isinstance(value, unicode) or \
<del> isinstance(value, int):
<add> elif isinstance(value, (basestring, int)):
<ide> self._properties[property] = value
<ide> elif isinstance(value, list):
<ide> if is_strong:
<ide> def VerifyHasRequiredProperties(self):
<ide>
<ide> # TODO(mark): A stronger verification mechanism is needed. Some
<ide> # subclasses need to perform validation beyond what the schema can enforce.
<del> for property, attributes in self._schema.iteritems():
<add> for property, attributes in self._schema.items():
<ide> (is_list, property_type, is_strong, is_required) = attributes[0:4]
<ide> if is_required and not property in self._properties:
<ide> raise KeyError(self.__class__.__name__ + ' requires ' + property)
<ide> def _SetDefaultsFromSchema(self):
<ide> overwrite properties that have already been set."""
<ide>
<ide> defaults = {}
<del> for property, attributes in self._schema.iteritems():
<add> for property, attributes in self._schema.items():
<ide> (is_list, property_type, is_strong, is_required) = attributes[0:4]
<ide> if is_required and len(attributes) >= 5 and \
<ide> not property in self._properties:
<ide> def PathHashables(self):
<ide> xche = self
<ide> while xche != None and isinstance(xche, XCHierarchicalElement):
<ide> xche_hashables = xche.Hashables()
<del> for index in xrange(0, len(xche_hashables)):
<add> for index in range(0, len(xche_hashables)):
<ide> hashables.insert(index, xche_hashables[index])
<ide> xche = xche.parent
<ide> return hashables
<ide> def HeadersPhase(self):
<ide> # The headers phase should come before the resources, sources, and
<ide> # frameworks phases, if any.
<ide> insert_at = len(self._properties['buildPhases'])
<del> for index in xrange(0, len(self._properties['buildPhases'])):
<add> for index in range(0, len(self._properties['buildPhases'])):
<ide> phase = self._properties['buildPhases'][index]
<ide> if isinstance(phase, PBXResourcesBuildPhase) or \
<ide> isinstance(phase, PBXSourcesBuildPhase) or \
<ide> def ResourcesPhase(self):
<ide> # The resources phase should come before the sources and frameworks
<ide> # phases, if any.
<ide> insert_at = len(self._properties['buildPhases'])
<del> for index in xrange(0, len(self._properties['buildPhases'])):
<add> for index in range(0, len(self._properties['buildPhases'])):
<ide> phase = self._properties['buildPhases'][index]
<ide> if isinstance(phase, PBXSourcesBuildPhase) or \
<ide> isinstance(phase, PBXFrameworksBuildPhase):
<ide> def CompareProducts(x, y, remote_products):
<ide> # determine the sort order.
<ide> return cmp(x_index, y_index)
<ide>
<del> for other_pbxproject, ref_dict in self._other_pbxprojects.iteritems():
<add> for other_pbxproject, ref_dict in self._other_pbxprojects.items():
<ide> # Build up a list of products in the remote project file, ordered the
<ide> # same as the targets that produce them.
<ide> remote_products = []
<ide> def Print(self, file=sys.stdout):
<ide> self._XCPrint(file, 0, '{ ')
<ide> else:
<ide> self._XCPrint(file, 0, '{\n')
<del> for property, value in sorted(self._properties.iteritems(),
<add> for property, value in sorted(self._properties.items(),
<ide> cmp=lambda x, y: cmp(x, y)):
<ide> if property == 'objects':
<ide> self._PrintObjects(file) | 16 |
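The gyp patch above follows the standard Python 2→3 compatibility recipe: `iteritems()` becomes `items()`, `xrange` becomes `range`, `except E, e` becomes `except E as e`, and builtins removed in Python 3 are shimmed. A minimal sketch of the shim the diff introduces (same names as the patch):

```python
# Sketch of the compatibility shim from the diff: on Python 3, `basestring`,
# `unicode`, and `cmp` no longer exist, so the patch recreates just enough of them.
try:
    basestring, cmp, unicode  # noqa: B018 - probe for the Python 2 builtins
except NameError:  # Python 3
    basestring = unicode = str

    def cmp(x, y):
        # Reimplements Python 2's three-way comparison: -1, 0, or 1.
        return (x > y) - (x < y)

print(cmp(1, 2), cmp(2, 2), cmp(3, 2))  # prints: -1 0 1
```

Note the shim keeps call sites unchanged: code written against Python 2's `cmp` and string types keeps working on both interpreters.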
Javascript | Javascript | allow object.prototype fields as labels | 3cbb5cdfdb621baec5dc3a2ac505be37f1718086 | <ide><path>lib/console.js
<ide> function Console(stdout, stderr) {
<ide> Object.defineProperty(this, '_stdout', prop);
<ide> prop.value = stderr;
<ide> Object.defineProperty(this, '_stderr', prop);
<del> prop.value = {};
<add> prop.value = new Map();
<ide> Object.defineProperty(this, '_times', prop);
<ide>
<ide> // bind the prototype functions to this Console instance
<ide> Console.prototype.dir = function(object, options) {
<ide>
<ide>
<ide> Console.prototype.time = function(label) {
<del> this._times[label] = Date.now();
<add> this._times.set(label, Date.now());
<ide> };
<ide>
<ide>
<ide> Console.prototype.timeEnd = function(label) {
<del> var time = this._times[label];
<add> var time = this._times.get(label);
<ide> if (!time) {
<ide> throw new Error('No such label: ' + label);
<ide> }
<ide><path>test/parallel/test-console.js
<ide> assert.ok(process.stderr.writable);
<ide> assert.equal('number', typeof process.stdout.fd);
<ide> assert.equal('number', typeof process.stderr.fd);
<ide>
<add>assert.throws(function () {
<add> console.timeEnd('no such label');
<add>});
<add>
<add>assert.doesNotThrow(function () {
<add> console.time('label');
<add> console.timeEnd('label');
<add>});
<add>
<ide> // an Object with a custom .inspect() function
<ide> var custom_inspect = { foo: 'bar', inspect: function () { return 'inspect'; } };
<ide>
<ide> console.dir({ foo : { bar : { baz : true } } }, { depth: 1 });
<ide> // test console.trace()
<ide> console.trace('This is a %j %d', { formatted: 'trace' }, 10, 'foo');
<ide>
<add>// test console.time() and console.timeEnd() output
<add>console.time('label');
<add>console.timeEnd('label');
<add>
<add>// verify that Object.prototype properties can be used as labels
<add>console.time('__proto__');
<add>console.timeEnd('__proto__');
<add>console.time('constructor');
<add>console.timeEnd('constructor');
<add>console.time('hasOwnProperty');
<add>console.timeEnd('hasOwnProperty');
<ide>
<ide> global.process.stdout.write = stdout_write;
<ide>
<ide> assert.notEqual(-1, strings.shift().indexOf('foo: [Object]'));
<ide> assert.equal(-1, strings.shift().indexOf('baz'));
<ide> assert.equal('Trace: This is a {"formatted":"trace"} 10 foo',
<ide> strings.shift().split('\n').shift());
<del>
<del>assert.throws(function () {
<del> console.timeEnd('no such label');
<del>});
<del>
<del>assert.doesNotThrow(function () {
<del> console.time('label');
<del> console.timeEnd('label');
<del>});
<add>assert.ok(/^label: \d+ms$/.test(strings.shift().trim()));
<add>assert.ok(/^__proto__: \d+ms$/.test(strings.shift().trim()));
<add>assert.ok(/^constructor: \d+ms$/.test(strings.shift().trim()));
<add>assert.ok(/^hasOwnProperty: \d+ms$/.test(strings.shift().trim()));
<add>assert.equal(strings.length, 0); | 2 |
Go | Go | fix empty-lines (revive) | f71fe8476a2f8493adf50b6d520a739568383a51 | <ide><path>api/server/middleware/version.go
<ide> func (v VersionMiddleware) WrapHandler(handler func(ctx context.Context, w http.
<ide> ctx = context.WithValue(ctx, httputils.APIVersionKey{}, apiVersion)
<ide> return handler(ctx, w, r, vars)
<ide> }
<del>
<ide> }
<ide><path>api/server/router/build/build_routes.go
<ide> func (br *buildRouter) postBuild(ctx context.Context, w http.ResponseWriter, r *
<ide> defer func() { _ = output.Close() }()
<ide>
<ide> errf := func(err error) error {
<del>
<ide> if httputils.BoolValue(r, "q") && notVerboseBuffer.Len() > 0 {
<ide> _, _ = output.Write(notVerboseBuffer.Bytes())
<ide> }
<ide><path>api/server/router/swarm/helpers_test.go
<ide> func TestAdjustForAPIVersion(t *testing.T) {
<ide> if len(spec.TaskTemplate.ContainerSpec.Ulimits) != 0 {
<ide> t.Error("Ulimits were not stripped from spec")
<ide> }
<del>
<ide> } | 3 |
Text | Text | add sam roberts to tsc | e582d1191376b8bf129a04730b04a2cd52ae5202 | <ide><path>README.md
<ide> For information about the governance of the Node.js project, see
<ide> **Myles Borins** <myles.borins@gmail.com> (he/him)
<ide> * [rvagg](https://github.com/rvagg) -
<ide> **Rod Vagg** <rod@vagg.org>
<add>* [sam-github](https://github.com/sam-github) -
<add>**Sam Roberts** <vieuxtech@gmail.com>
<ide> * [targos](https://github.com/targos) -
<ide> **Michaël Zasso** <targos@protonmail.com> (he/him)
<ide> * [thefourtheye](https://github.com/thefourtheye) - | 1 |
Java | Java | parameterize asynctask type | dbcbdace9e83a020e35ed95f203fbbdb632f2a85 | <ide><path>spring-web/src/main/java/org/springframework/web/context/request/async/AsyncTask.java
<ide> * @author Rossen Stoyanchev
<ide> * @since 3.2
<ide> */
<del>public class AsyncTask {
<add>public class AsyncTask<V> {
<ide>
<del> private final Callable<?> callable;
<add> private final Callable<V> callable;
<ide>
<ide> private final Long timeout;
<ide>
<ide> public class AsyncTask {
<ide> * @param timeout timeout value in milliseconds
<ide> * @param callable the callable for concurrent handling
<ide> */
<del> public AsyncTask(long timeout, Callable<?> callable) {
<add> public AsyncTask(long timeout, Callable<V> callable) {
<ide> this(timeout, null, null, callable);
<ide> Assert.notNull(timeout, "Timeout must not be null");
<ide> }
<ide> public AsyncTask(long timeout, Callable<?> callable) {
<ide> * @param timeout timeout value in milliseconds; ignored if {@code null}
<ide> * @param callable the callable for concurrent handling
<ide> */
<del> public AsyncTask(Long timeout, String executorName, Callable<?> callable) {
<add> public AsyncTask(Long timeout, String executorName, Callable<V> callable) {
<ide> this(timeout, null, executorName, callable);
<ide> Assert.notNull(executor, "Executor name must not be null");
<ide> }
<ide> public AsyncTask(Long timeout, String executorName, Callable<?> callable) {
<ide> * @param timeout timeout value in milliseconds; ignored if {@code null}
<ide> * @param callable the callable for concurrent handling
<ide> */
<del> public AsyncTask(Long timeout, AsyncTaskExecutor executor, Callable<?> callable) {
<add> public AsyncTask(Long timeout, AsyncTaskExecutor executor, Callable<V> callable) {
<ide> this(timeout, executor, null, callable);
<ide> Assert.notNull(executor, "Executor must not be null");
<ide> }
<ide>
<del> private AsyncTask(Long timeout, AsyncTaskExecutor executor, String executorName, Callable<?> callable) {
<add> private AsyncTask(Long timeout, AsyncTaskExecutor executor, String executorName, Callable<V> callable) {
<ide> Assert.notNull(callable, "Callable must not be null");
<ide> this.callable = callable;
<ide> this.timeout = timeout;
<ide><path>spring-web/src/main/java/org/springframework/web/context/request/async/WebAsyncManager.java
<ide> public void run() {
<ide> * @param processingContext additional context to save that can be accessed
<ide> * via {@link #getConcurrentResultContext()}
<ide> */
<del> public void startCallableProcessing(AsyncTask asyncTask, Object... processingContext) {
<add> public void startCallableProcessing(AsyncTask<?> asyncTask, Object... processingContext) {
<ide> Assert.notNull(asyncTask, "AsyncTask must not be null");
<ide>
<ide> Long timeout = asyncTask.getTimeout();
<ide><path>spring-web/src/test/java/org/springframework/web/context/request/async/WebAsyncManagerTests.java
<ide> public void startCallableProcessingAsyncTask() {
<ide> this.asyncWebRequest.startAsync();
<ide> replay(this.asyncWebRequest);
<ide>
<del> AsyncTask asyncTask = new AsyncTask(1000L, executor, createMock(Callable.class));
<add> @SuppressWarnings("unchecked")
<add> AsyncTask<Object> asyncTask = new AsyncTask<Object>(1000L, executor, createMock(Callable.class));
<ide> this.asyncManager.startCallableProcessing(asyncTask);
<ide>
<ide> verify(executor, this.asyncWebRequest);
<ide><path>spring-webmvc/src/main/java/org/springframework/web/servlet/mvc/method/annotation/AsyncTaskMethodReturnValueHandler.java
<ide> public void handleReturnValue(Object returnValue,
<ide> return;
<ide> }
<ide>
<del> AsyncTask asyncTask = (AsyncTask) returnValue;
<add> AsyncTask<?> asyncTask = (AsyncTask<?>) returnValue;
<ide> asyncTask.setBeanFactory(this.beanFactory);
<del> WebAsyncUtils.getAsyncManager(webRequest).startCallableProcessing(asyncTask.getCallable(), mavContainer);
<add> WebAsyncUtils.getAsyncManager(webRequest).startCallableProcessing(asyncTask, mavContainer);
<ide> }
<ide>
<ide> } | 4 |
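The change above threads a type parameter `V` from the wrapped `Callable<V>` through `AsyncTask`, so callers no longer receive a raw `Callable<?>`. The same idea sketched with Python generics (a hypothetical illustration, not the Spring API):

```python
from typing import Callable, Generic, Optional, TypeVar

V = TypeVar("V")

class AsyncTask(Generic[V]):
    """Hypothetical sketch: carry the result type V of the wrapped callable,
    mirroring the Java change from Callable<?> to Callable<V>."""

    def __init__(self, timeout: Optional[int], callable_: Callable[[], V]) -> None:
        if callable_ is None:
            raise ValueError("Callable must not be None")
        self.timeout = timeout
        self.callable = callable_

    def call(self) -> V:
        # The result is typed as V, so an AsyncTask[int] yields an int.
        return self.callable()

task = AsyncTask(1000, lambda: 21 * 2)
print(task.call())  # prints: 42
```

As in the Java commit, the payoff is at the call site: a type checker can now relate the task's result to the callable it was constructed with.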
Ruby | Ruby | restrict variants to variable image blobs | 311af752cfd98b534a3d5dbf30e4a202693f32dc | <ide><path>activestorage/app/models/active_storage/blob.rb
<ide> # update a blob's metadata on a subsequent pass, but you should not update the key or change the uploaded file.
<ide> # If you need to create a derivative or otherwise change the blob, simply create a new blob and purge the old one.
<ide> class ActiveStorage::Blob < ActiveRecord::Base
<add> class InvariableError < StandardError; end
<ide> class UnpreviewableError < StandardError; end
<ide> class UnrepresentableError < StandardError; end
<ide>
<ide> def text?
<ide> content_type.start_with?("text")
<ide> end
<ide>
<add>
<ide> # Returns an ActiveStorage::Variant instance with the set of +transformations+ provided. This is only relevant for image
<ide> # files, and it allows any image to be transformed for size, colors, and the like. Example:
<ide> #
<ide> def text?
<ide> #
<ide> # This will create a URL for that specific blob with that specific variant, which the ActiveStorage::VariantsController
<ide> # can then produce on-demand.
<add> #
<add> # Raises ActiveStorage::Blob::InvariableError if ImageMagick cannot transform the blob. To determine whether a blob is
<add> # variable, call ActiveStorage::Blob#previewable?.
<ide> def variant(transformations)
<del> ActiveStorage::Variant.new(self, ActiveStorage::Variation.wrap(transformations))
<add> if variable?
<add> ActiveStorage::Variant.new(self, ActiveStorage::Variation.wrap(transformations))
<add> else
<add> raise InvariableError
<add> end
<add> end
<add>
<add> # Returns true if ImageMagick can transform the blob (its content type is in +ActiveStorage.variable_content_types+).
<add> def variable?
<add> ActiveStorage.variable_content_types.include?(content_type)
<ide> end
<ide>
<ide>
<ide> def representation(transformations)
<ide> case
<ide> when previewable?
<ide> preview transformations
<del> when image?
<add> when variable?
<ide> variant transformations
<ide> else
<ide> raise UnrepresentableError
<ide> end
<ide> end
<ide>
<del> # Returns true if the blob is an image or is previewable.
<add> # Returns true if the blob is variable or is previewable.
<ide> def representable?
<del> image? || previewable?
<add> variable? || previewable?
<ide> end
<ide>
<ide>
<ide><path>activestorage/lib/active_storage.rb
<ide> module ActiveStorage
<ide> mattr_accessor :queue
<ide> mattr_accessor :previewers, default: []
<ide> mattr_accessor :analyzers, default: []
<add> mattr_accessor :variable_content_types, default: []
<ide> end
<ide><path>activestorage/lib/active_storage/engine.rb
<ide> class Engine < Rails::Engine # :nodoc:
<ide> config.active_storage = ActiveSupport::OrderedOptions.new
<ide> config.active_storage.previewers = [ ActiveStorage::Previewer::PDFPreviewer, ActiveStorage::Previewer::VideoPreviewer ]
<ide> config.active_storage.analyzers = [ ActiveStorage::Analyzer::ImageAnalyzer, ActiveStorage::Analyzer::VideoAnalyzer ]
<add> config.active_storage.variable_content_types = [ "image/png", "image/gif", "image/jpg", "image/jpeg", "image/vnd.adobe.photoshop" ]
<ide> config.active_storage.paths = ActiveSupport::OrderedOptions.new
<ide>
<ide> config.eager_load_namespaces << ActiveStorage
<ide> class Engine < Rails::Engine # :nodoc:
<ide> ActiveStorage.queue = app.config.active_storage.queue
<ide> ActiveStorage.previewers = app.config.active_storage.previewers || []
<ide> ActiveStorage.analyzers = app.config.active_storage.analyzers || []
<add> ActiveStorage.variable_content_types = app.config.active_storage.variable_content_types || []
<ide> end
<ide> end
<ide>
<ide><path>activestorage/test/models/variant_test.rb
<ide> class ActiveStorage::VariantTest < ActiveSupport::TestCase
<ide> assert_match(/Gray/, image.colorspace)
<ide> end
<ide>
<add> test "variation of invariable blob" do
<add> assert_raises ActiveStorage::Blob::InvariableError do
<add> create_file_blob(filename: "report.pdf", content_type: "application/pdf").variant(resize: "100x100")
<add> end
<add> end
<add>
<ide> test "service_url doesn't grow in length despite long variant options" do
<ide> variant = @blob.variant(font: "a" * 10_000).processed
<ide> assert_operator variant.service_url.length, :<, 500 | 4 |
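The Ruby change gates `variant` behind a content-type allowlist (`ActiveStorage.variable_content_types`) and raises `InvariableError` for anything else. A language-agnostic sketch of that guard (names mirror the diff but this is not the Active Storage API):

```python
# Illustrative sketch of the allowlist guard added in the diff; the constant
# values are copied from the engine defaults in the patch.
VARIABLE_CONTENT_TYPES = {
    "image/png", "image/gif", "image/jpg", "image/jpeg",
    "image/vnd.adobe.photoshop",
}

class InvariableError(Exception):
    """Raised when a blob's content type cannot be transformed."""

def variant(content_type, transformations):
    if content_type not in VARIABLE_CONTENT_TYPES:
        raise InvariableError(content_type)
    # A real implementation would hand off to an image processor here.
    return {"content_type": content_type, "transformations": transformations}

print(variant("image/png", {"resize": "100x100"})["transformations"])
```

Failing fast with a named error, as the test `variation of invariable blob` asserts, beats handing ImageMagick a PDF and surfacing an opaque processing failure later.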
Python | Python | fix trainer tests in a multigpu env | 8546dc55c265f96b984b378504241ea0225ad8ba | <ide><path>tests/test_trainer.py
<ide> def forward(self, input_x=None, labels=None, **kwargs):
<ide> loss = torch.nn.functional.mse_loss(y, labels)
<ide> return (loss, y, y) if self.double_output else (loss, y)
<ide>
<del> def get_regression_trainer(a=0, b=0, double_output=False, train_len=64, eval_len=64, **kwargs):
<add> def get_regression_trainer(a=0, b=0, double_output=False, train_len=64, eval_len=64, pretrained=True, **kwargs):
<ide> label_names = kwargs.get("label_names", None)
<ide> train_dataset = RegressionDataset(length=train_len, label_names=label_names)
<ide> eval_dataset = RegressionDataset(length=eval_len, label_names=label_names)
<del> config = RegressionModelConfig(a=a, b=b, double_output=double_output)
<del> model = RegressionPreTrainedModel(config)
<add> if pretrained:
<add> config = RegressionModelConfig(a=a, b=b, double_output=double_output)
<add> model = RegressionPreTrainedModel(config)
<add> else:
<add> model = RegressionModel(a=a, b=b, double_output=double_output)
<ide> compute_metrics = kwargs.pop("compute_metrics", None)
<ide> data_collator = kwargs.pop("data_collator", None)
<ide> optimizers = kwargs.pop("optimizers", (None, None))
<ide> def check_best_model_has_been_loaded(
<ide> best_model = RegressionModel()
<ide> state_dict = torch.load(os.path.join(checkpoint, WEIGHTS_NAME))
<ide> best_model.load_state_dict(state_dict)
<add> best_model.to(trainer.args.device)
<ide> self.assertTrue(torch.allclose(best_model.a, trainer.model.a))
<ide> self.assertTrue(torch.allclose(best_model.b, trainer.model.b))
<ide>
<ide> def test_save_checkpoints(self):
<ide>
<ide> # With a regular model that is not a PreTrainedModel
<ide> with tempfile.TemporaryDirectory() as tmpdir:
<del> trainer = get_regression_trainer(output_dir=tmpdir, save_steps=5)
<del> trainer.model = RegressionModel()
<add> trainer = get_regression_trainer(output_dir=tmpdir, save_steps=5, pretrained=False)
<ide> trainer.train()
<ide> self.check_saved_checkpoints(tmpdir, 5, int(self.n_epochs * 64 / self.batch_size), False)
<ide>
<ide> def test_load_best_model_at_end(self):
<ide> eval_steps=5,
<ide> evaluation_strategy="steps",
<ide> load_best_model_at_end=True,
<add> pretrained=False,
<ide> )
<del> trainer.model = RegressionModel(a=1.5, b=2.5)
<ide> self.assertFalse(trainer.args.greater_is_better)
<ide> trainer.train()
<ide> self.check_saved_checkpoints(tmpdir, 5, total, is_pretrained=False) | 1 |
Javascript | Javascript | reremove extra middlewares from server file | c8488c84193f0f1592fc242cdfe7b1c774242184 | <ide><path>server/server.js
<ide> var pmx = require('pmx');
<ide> pmx.init();
<ide>
<ide> var assign = require('lodash').assign,
<del> loopback = require('loopback'),
<del> boot = require('loopback-boot'),
<del> accepts = require('accepts'),
<del> cookieParser = require('cookie-parser'),
<del> compress = require('compression'),
<del> session = require('express-session'),
<del> expressState = require('express-state'),
<del> logger = require('morgan'),
<del> errorHandler = require('errorhandler'),
<del> methodOverride = require('method-override'),
<del> bodyParser = require('body-parser'),
<del> helmet = require('helmet'),
<del> MongoStore = require('connect-mongo')(session),
<del> flash = require('express-flash'),
<del> path = require('path'),
<del> expressValidator = require('express-validator'),
<del> lessMiddleware = require('less-middleware'),
<del>
<del> passportProviders = require('./passport-providers'),
<del> rxMiddleware = require('./utils/rx').rxMiddleware,
<del> /**
<del> * API keys and Passport configuration.
<del> */
<del> secrets = require('./../config/secrets');
<add> loopback = require('loopback'),
<add> boot = require('loopback-boot'),
<add> expressState = require('express-state'),
<add> path = require('path'),
<add> passportProviders = require('./passport-providers');
<ide>
<ide> var generateKey =
<ide> require('loopback-component-passport/lib/models/utils').generateKey;
<ide> var passportConfigurator = new PassportConfigurator(app);
<ide> app.set('port', process.env.PORT || 3000);
<ide> app.set('views', path.join(__dirname, 'views'));
<ide> app.set('view engine', 'jade');
<del>
<del>app.use(compress());
<del>app.use(lessMiddleware(path.join(__dirname, '/public')));
<del>app.use(logger('dev'));
<del>app.use(bodyParser.json());
<del>app.use(bodyParser.urlencoded({
<del> extended: true
<del>}));
<del>app.use(expressValidator({
<del> customValidators: {
<del> matchRegex: function(param, regex) {
<del> return regex.test(param);
<del> }
<del> }
<del>}));
<del>app.use(methodOverride());
<del>app.use(cookieParser(secrets.cookieSecret));
<del>app.use(session({
<del> resave: true,
<del> saveUninitialized: true,
<del> secret: secrets.sessionSecret,
<del> store: new MongoStore({
<del> url: secrets.db,
<del> 'autoReconnect': true
<del> })
<del>}));
<del>
<del>app.use(flash());
<ide> app.disable('x-powered-by');
<del>// adds passport initialization after session middleware phase is complete
<del>
<del>app.use(helmet.xssFilter());
<del>app.use(helmet.noSniff());
<del>app.use(helmet.frameguard());
<del>app.use(function(req, res, next) {
<del> res.header('Access-Control-Allow-Origin', '*');
<del> res.header('Access-Control-Allow-Headers',
<del> 'Origin, X-Requested-With, Content-Type, Accept'
<del> );
<del> next();
<del>});
<del>
<del>var trusted = [
<del> "'self'",
<del> 'blob:',
<del> '104.236.218.15',
<del> '*.freecodecamp.com',
<del> 'http://www.freecodecamp.com',
<del> 'https://www.freecodecamp.com',
<del> 'https://freecodecamp.com',
<del> 'https://freecodecamp.org',
<del> '*.freecodecamp.org',
<del> // NOTE(berks): add the following as the blob above was not covering www
<del> 'http://www.freecodecamp.org',
<del> 'ws://freecodecamp.com/',
<del> 'ws://www.freecodecamp.com/',
<del> '*.gstatic.com',
<del> '*.google-analytics.com',
<del> '*.googleapis.com',
<del> '*.google.com',
<del> '*.gstatic.com',
<del> '*.doubleclick.net',
<del> '*.twitter.com',
<del> '*.twitch.tv',
<del> '*.twimg.com',
<del> "'unsafe-eval'",
<del> "'unsafe-inline'",
<del> '*.bootstrapcdn.com',
<del> '*.cloudflare.com',
<del> 'https://*.cloudflare.com',
<del> 'localhost:3001',
<del> 'ws://localhost:3001/',
<del> 'http://localhost:3001',
<del> 'localhost:3000',
<del> 'ws://localhost:3000/',
<del> 'http://localhost:3000',
<del> '127.0.0.1',
<del> '127.0.0.1:3000',
<del> 'ws://127.0.0.1:3000/',
<del> 'http://127.0.0.1:3000',
<del> '*.ionicframework.com',
<del> 'https://syndication.twitter.com',
<del> '*.youtube.com',
<del> '*.jsdelivr.net',
<del> 'https://*.jsdelivr.net',
<del> '*.ytimg.com',
<del> '*.bitly.com',
<del> 'http://cdn.inspectlet.com/',
<del> 'https://cdn.inspeclet.com/',
<del> 'wss://inspectletws.herokuapp.com/',
<del> 'http://hn.inspectlet.com/',
<del> '*.googleapis.com',
<del> '*.gstatic.com',
<del> 'https://hn.inspectlet.com/',
<del> 'http://*.github.com'
<del>];
<del>
<del>app.use(helmet.csp({
<del> defaultSrc: trusted,
<del> scriptSrc: [
<del> '*.optimizely.com',
<del> '*.aspnetcdn.com',
<del> '*.d3js.org',
<del> 'https://cdn.inspectlet.com/inspectlet.js',
<del> 'http://cdn.inspectlet.com/inspectlet.js',
<del> 'http://beta.freecodecamp.com'
<del> ].concat(trusted),
<del> 'connect-src': [
<del> 'vimeo.com'
<del> ].concat(trusted),
<del> styleSrc: [
<del> '*.googleapis.com',
<del> '*.gstatic.com'
<del> ].concat(trusted),
<del> imgSrc: [
<del> /* allow all input since we have user submitted images for public profile*/
<del> '*'
<del> ].concat(trusted),
<del> fontSrc: [
<del> '*.googleapis.com',
<del> '*.gstatic.com'
<del> ].concat(trusted),
<del> mediaSrc: [
<del> '*.amazonaws.com',
<del> '*.twitter.com'
<del> ].concat(trusted),
<del> frameSrc: [
<del> '*.gitter.im',
<del> '*.gitter.im https:',
<del> '*.vimeo.com',
<del> '*.twitter.com',
<del> '*.ghbtns.com'
<del> ].concat(trusted),
<del> // set to true if you only want to report errors
<del> reportOnly: false,
<del> // set to true if you want to set all headers
<del> setAllHeaders: false,
<del> // set to true if you want to force buggy CSP in Safari 5
<del> safari5: false
<del>}));
<ide>
<add>// adds passport initialization after session middleware phase is complete
<ide> passportConfigurator.init();
<ide>
<ide> boot(app, { | 1 |