| content_type | main_lang | message | sha | patch | file_count |
|---|---|---|---|---|---|
PHP | PHP | fix truncate tests for postgres | 47464352c579c6867313ca2c85d9028fc54634c0 | <ide><path>tests/TestCase/Database/Schema/PostgresSchemaTest.php
<ide> public function testTruncateSql() {
<ide> ]);
<ide> $result = $table->truncateSql($connection);
<ide> $this->assertCount(1, $result);
<del> $this->assertEquals('TRUNCATE "schema_articles" RESTART IDENTITY', $result[0]);
<add> $this->assertEquals('TRUNCATE "schema_articles" RESTART IDENTITY CASCADE', $result[0]);
<ide> }
<ide>
<ide> /** | 1 |
Ruby | Ruby | create directory when dumping schema cache | ca4f10ece430a41b95362e5a8d2f9d2e83535239 | <ide><path>activerecord/lib/active_record/connection_adapters/schema_cache.rb
<ide> def prepare_data_sources
<ide> end
<ide>
<ide> def open(filename)
<add> FileUtils.mkdir_p(File.dirname(filename))
<add>
<ide> File.atomic_write(filename) do |file|
<ide> if File.extname(filename) == ".gz"
<ide> zipper = Zlib::GzipWriter.new file
<ide><path>activerecord/test/cases/connection_adapters/schema_cache_test.rb
<ide> def test_yaml_dump_and_load
<ide> tempfile.unlink
<ide> end
<ide>
<add> def test_cache_path_can_be_in_directory
<add> cache = SchemaCache.new @connection
<add> filename = "some_dir/schema.json"
<add> assert cache.dump_to(filename)
<add> ensure
<add> File.delete(filename)
<add> end
<add>
<ide> def test_yaml_dump_and_load_with_gzip
<ide> # Create an empty cache.
<ide> cache = SchemaCache.new @connection | 2 |
Ruby | Ruby | fix small typo in link_to_function doc | 979f3f894bf1034d4e9ee5ab9e285713f63c2824 | <ide><path>actionpack/lib/action_view/helpers/javascript_helper.rb
<ide> def button_to_function(name, function=nil, html_options={})
<ide> # If +html_options+ has an <tt>:onclick</tt>, that one is put before +function+. Once all
<ide> # the JavaScript is set, the helper appends "; return false;".
<ide> #
<del> # The +href+ attribute of the tag is set to "#" unles +html_options+ has one.
<add> # The +href+ attribute of the tag is set to "#" unless +html_options+ has one.
<ide> #
<ide> # link_to_function "Greeting", "alert('Hello world!')", :class => "nav_link"
<ide> # # => <a class="nav_link" href="#" onclick="alert('Hello world!'); return false;">Greeting</a> | 1 |
Python | Python | add flags info when reporting benchmarks | 1e527fb5ce7b8bcfcb5d07891f60824d042a973c | <ide><path>official/resnet/ctl/ctl_imagenet_benchmark.py
<ide> from official.resnet.ctl import ctl_imagenet_main
<ide> from official.resnet.ctl import ctl_common
<ide> from official.utils.testing.perfzero_benchmark import PerfZeroBenchmark
<add>from official.utils.flags import core as flags_core
<ide>
<ide>
<ide> MIN_TOP_1_ACCURACY = 0.76
<ide> def _report_benchmark(self,
<ide> metrics.append({'name': 'avg_exp_per_second',
<ide> 'value': stats['avg_exp_per_second']})
<ide>
<del> self.report_benchmark(iters=-1, wall_time=wall_time_sec, metrics=metrics)
<add> flags_str = flags_core.get_nondefault_flags_as_str()
<add> self.report_benchmark(iters=-1, wall_time=wall_time_sec, metrics=metrics,
<add> extras={'flags': flags_str})
<ide>
<ide>
<ide> class Resnet50CtlAccuracy(CtlBenchmark): | 1 |
Ruby | Ruby | show real name for aliases | 124ddce2627d7205191e095d28d695e69da6c378 | <ide><path>Library/Homebrew/cmd/audit.rb
<ide> def audit_deps
<ide> # Don't depend_on aliases; use full name
<ide> @@aliases ||= Formula.aliases
<ide> f.deps.select { |d| @@aliases.include? d.name }.each do |d|
<del> problem "Dependency #{d} is an alias; use the canonical name."
<add> real_name = d.to_formula.name
<add> problem "Dependency '#{d}' is an alias; use the canonical name '#{real_name}'."
<ide> end
<ide>
<ide> # Check for things we don't like to depend on. | 1 |
Javascript | Javascript | remove unused parameters in function definition | 49b0f7fa19ce4b8fc3930124e58496894d4ea932 | <ide><path>test/async-hooks/test-async-await.js
<ide> const hooks = initHooks({
<ide> });
<ide> hooks.enable();
<ide>
<del>function oninit(asyncId, type, triggerAsyncId, resource) {
<add>function oninit(asyncId, type) {
<ide> if (type === 'PROMISE') {
<ide> promisesInitState.set(asyncId, 'inited');
<ide> } | 1 |
Javascript | Javascript | update code for `@enum` breaking change | a736c3ea496daf50d64b975a51b611a9171d2de2 | <ide><path>lib/logging/Logger.js
<ide> const LogType = Object.freeze({
<ide>
<ide> exports.LogType = LogType;
<ide>
<del>/** @typedef {LogType} LogTypeEnum */
<add>/** @typedef {keyof LogType} LogTypeEnum */
<ide>
<ide> const LOG_SYMBOL = Symbol("webpack logger raw log method");
<ide> const TIMERS_SYMBOL = Symbol("webpack logger times");
<ide>
<ide> class WebpackLogger {
<ide> /**
<del> * @param {function(LogType, any[]=): void} log log function
<add> * @param {function(LogTypeEnum, any[]=): void} log log function
<ide> */
<ide> constructor(log) {
<ide> this[LOG_SYMBOL] = log;
<ide><path>lib/logging/createConsoleLogger.js
<ide> const filterToFunction = item => {
<ide> };
<ide>
<ide> /**
<del> * @enum {number} */
<add> * @enum {number}
<add> */
<ide> const LogLevel = {
<ide> none: 6,
<ide> false: 6, | 2 |
Python | Python | remove unused imports | 5f0f940a1f42fe899604b39e4422d30791831e6f | <ide><path>bin/parser/train.py
<ide> import shutil
<ide> import codecs
<ide> import random
<del>import time
<del>import gzip
<ide>
<ide> import plac
<ide> import cProfile
<ide><path>bin/prepare_vecs.py
<ide> """Read a vector file, and prepare it as binary data, for easy consumption"""
<ide>
<del>import bz2
<ide> import plac
<del>import struct
<ide>
<ide> from spacy.vocab import write_binary_vectors
<ide>
<ide><path>fabfile.py
<del>from fabric.api import local, run, lcd, cd, env
<add>from fabric.api import local, lcd, env
<ide> from os.path import exists as file_exists
<ide> from fabtools.python import virtualenv
<ide> from os import path
<ide><path>setup.py
<ide> #!/usr/bin/env python
<del>import subprocess
<ide> from setuptools import setup
<del>from glob import glob
<ide> import shutil
<ide>
<ide> import sys
<ide> import os
<ide> from os import path
<del>from os.path import splitext
<ide>
<del>
<del>import shutil
<ide> from setuptools import Extension
<ide> from distutils import sysconfig
<ide> import platform
<ide><path>spacy/util.py
<del>import os
<ide> from os import path
<ide> import codecs
<ide> import json
<ide><path>tests/test_span.py
<ide> from spacy.en import English
<ide>
<ide> import pytest
<del>import re
<ide>
<ide>
<ide> EN = English()
<ide><path>tests/test_wiki_sun.py
<ide> from spacy.util import utf8open
<ide>
<ide> import pytest
<del>import os
<ide> from os import path
<ide>
<ide> | 7 |
Javascript | Javascript | improve ipc performance | 59be97532269a208f0121060772528690a63677b | <ide><path>lib/child_process.js
<ide> function setupChannel(target, channel) {
<ide> if (pool) {
<ide> jsonBuffer += pool.toString('ascii', offset, offset + length);
<ide>
<del> var i;
<del> while ((i = jsonBuffer.indexOf('\n')) >= 0) {
<del> var json = jsonBuffer.slice(0, i);
<add> var i, start = 0;
<add> while ((i = jsonBuffer.indexOf('\n', start)) >= 0) {
<add> var json = jsonBuffer.slice(start, i);
<ide> var message = JSON.parse(json);
<del> jsonBuffer = jsonBuffer.slice(i + 1);
<ide>
<ide> target.emit('message', message, recvHandle);
<add> start = i+1;
<ide> }
<add> jsonBuffer = jsonBuffer.slice(start);
<ide>
<ide> } else {
<ide> channel.close(); | 1 |
Python | Python | add test for issue #758 | e854f28304bfea7d7b80e2f52d8a514418a53553 | <ide><path>spacy/tests/regression/test_issue758.py
<add>from ... import load as load_spacy
<add>from ...attrs import LEMMA
<add>from ...matcher import merge_phrase
<add>
<add>import pytest
<add>
<add>
<add>
<add>
<add>@pytest.mark.models
<add>def test_issue758():
<add> '''Test parser transition bug after label added.'''
<add> nlp = load_spacy('en')
<add> nlp.matcher.add('splash', 'my_entity', {},
<add> [[{LEMMA: 'splash'}, {LEMMA: 'on'}]],
<add> on_match=merge_phrase)
<add> doc = nlp('splash On') | 1 |
Text | Text | use udpv4/udpv6 consistently with tcpv4/tcpv6 | 68ef009f828fb79d8aba85ff63be87152a8dbc49 | <ide><path>doc/api/cluster.md
<ide> The `addressType` is one of:
<ide> * `4` (TCPv4)
<ide> * `6` (TCPv6)
<ide> * `-1` (Unix domain socket)
<del>* `'udp4'` or `'udp6'` (UDP v4 or v6)
<add>* `'udp4'` or `'udp6'` (UDPv4 or UDPv6)
<ide>
<ide> ## Event: `'message'`
<ide> | 1 |
Javascript | Javascript | fix minor typo in panresponder | 857dd59340ec3436a098627e948cb5f909c6fafb | <ide><path>Libraries/vendor/react/browser/eventPlugins/PanResponder.js
<ide> var currentCentroidY = TouchHistoryMath.currentCentroidY;
<ide> * - `dy` - accumulated distance of the gesture since the touch started
<ide> * - `vx` - current velocity of the gesture
<ide> * - `vy` - current velocity of the gesture
<del> * - `numberActiveTouches` - Number of touches currently on screeen
<add> * - `numberActiveTouches` - Number of touches currently on screen
<ide> *
<ide> * ### Basic Usage
<ide> * | 1 |
PHP | PHP | correct docblock for connection manager parsedsn | 74ec45d7113694b7d32e6c2c1b6861b970f12d5d | <ide><path>src/Datasource/ConnectionManager.php
<ide> public static function config($key, $config = null)
<ide> *
<ide> * Note that querystring arguments are also parsed and set as values in the returned configuration.
<ide> *
<del> * @param array $config An array with a `url` key mapping to a string DSN
<add> * @param string $config The DSN string to convert to a configuration array
<ide> * @return array The configuration array to be stored after parsing the DSN
<ide> */
<ide> public static function parseDsn($config = null) | 1 |
Javascript | Javascript | fix code style(2) | e432081bc1adbff01c8ad7ba32eeb37c89a74eb7 | <ide><path>examples/jsm/renderers/nodes/accessors/CameraNode.js
<ide> class CameraNode extends Node {
<ide>
<ide> if ( scope === CameraNode.PROJECTION || scope === CameraNode.VIEW ) {
<ide>
<del> if ( !inputNode || !inputNode.isMatrix4Node !== true ) {
<add> if ( inputNode === undefined || !inputNode.isMatrix4Node !== true ) {
<ide>
<ide> inputNode = new Matrix4Node( null );
<ide>
<ide> }
<ide>
<del> } else if ( !inputNode || !inputNode.isVector3Node !== true ) {
<add> } else if ( inputNode === undefined || !inputNode.isVector3Node !== true ) {
<ide>
<ide> inputNode = new Vector3Node();
<ide> | 1 |
PHP | PHP | improve error message | 38967f3aa811c7f371e81db792102fd561e0f291 | <ide><path>src/View/Helper/FormHelper.php
<ide> public function getFormProtector(): FormProtector
<ide> {
<ide> if ($this->formProtector === null) {
<ide> throw new CakeException(
<del> 'FormHelper::create() must be called first for FormProtector instance to be created.'
<add> '`FormProtector` instance has not been created. Ensure you have loaded the `FormProtectorComponent`'
<add> . ' in your controller and called `FormHelper::create()` before calling `FormHelper::unlockField()`.'
<ide> );
<ide> }
<ide> | 1 |
Ruby | Ruby | use a set for skip_clean_paths | d8756075f4e44fd4e431e39fbbe6ce5c85813f5f | <ide><path>Library/Homebrew/formula.rb
<ide> def skip_clean *paths
<ide> return
<ide> end
<ide>
<del> @skip_clean_paths ||= []
<ide> paths.each do |p|
<ide> p = p.to_s unless p == :la # Keep :la in paths as a symbol
<del> @skip_clean_paths << p unless @skip_clean_paths.include? p
<add> skip_clean_paths << p
<ide> end
<ide> end
<ide>
<ide> def skip_clean_all?
<ide> end
<ide>
<ide> def skip_clean_paths
<del> @skip_clean_paths or []
<add> @skip_clean_paths ||= Set.new
<ide> end
<ide>
<ide> def keg_only reason, explanation=nil | 1 |
Javascript | Javascript | fix redeclared vars in test-vm-* | 5f44475b5ad9a7ddf6aea4350c0dc9cced4954e8 | <ide><path>test/parallel/test-vm-basic.js
<ide> assert.strictEqual(result, 'function');
<ide> // Test 2: vm.runInContext
<ide> var sandbox2 = { foo: 'bar' };
<ide> var context = vm.createContext(sandbox2);
<del>var result = vm.runInContext(
<add>result = vm.runInContext(
<ide> 'baz = foo; this.typeofProcess = typeof process; typeof Object;',
<ide> context
<ide> );
<ide> assert.deepEqual(sandbox2, {
<ide> assert.strictEqual(result, 'function');
<ide>
<ide> // Test 3: vm.runInThisContext
<del>var result = vm.runInThisContext(
<add>result = vm.runInThisContext(
<ide> 'vmResult = "foo"; Object.prototype.toString.call(process);'
<ide> );
<ide> assert.strictEqual(global.vmResult, 'foo');
<ide> assert.strictEqual(result, '[object process]');
<ide> delete global.vmResult;
<ide>
<ide> // Test 4: vm.runInNewContext
<del>var result = vm.runInNewContext(
<add>result = vm.runInNewContext(
<ide> 'vmResult = "foo"; typeof process;'
<ide> );
<ide> assert.strictEqual(global.vmResult, undefined);
<ide><path>test/parallel/test-vm-debug-context.js
<ide> proc.once('exit', common.mustCall(function(exitCode, signalCode) {
<ide> assert.equal(signalCode, null);
<ide> }));
<ide>
<del>var proc = spawn(process.execPath, [script, 'handle-fatal-exception']);
<add>proc = spawn(process.execPath, [script, 'handle-fatal-exception']);
<ide> proc.stdout.on('data', common.fail);
<ide> proc.stderr.on('data', common.fail);
<ide> proc.once('exit', common.mustCall(function(exitCode, signalCode) {
<ide><path>test/parallel/test-vm-harmony-proxies.js
<ide> assert(typeof sandbox.Proxy === 'object');
<ide> assert(sandbox.Proxy !== Proxy);
<ide>
<ide> // Unless we copy the Proxy object explicitly, of course.
<del>var sandbox = { Proxy: Proxy };
<add>sandbox = { Proxy: Proxy };
<ide> vm.runInNewContext('this.Proxy = Proxy', sandbox);
<ide> assert(typeof sandbox.Proxy === 'object');
<ide> assert(sandbox.Proxy === Proxy);
<ide><path>test/parallel/test-vm-harmony-symbols.js
<ide> assert(typeof sandbox.Symbol === 'function');
<ide> assert(sandbox.Symbol !== Symbol);
<ide>
<ide> // Unless we copy the Symbol constructor explicitly, of course.
<del>var sandbox = { Symbol: Symbol };
<add>sandbox = { Symbol: Symbol };
<ide> vm.runInNewContext('this.Symbol = Symbol', sandbox);
<ide> assert(typeof sandbox.Symbol === 'function');
<ide> assert(sandbox.Symbol === Symbol); | 4 |
Go | Go | avoid parallel layer downloads in load command | de35ef2ebed3a7e8173abae39933d04e9874790c | <ide><path>graph/load.go
<ide> func (s *TagStore) recursiveLoad(eng *engine.Engine, address, tmpImageDir string
<ide> log.Debugf("Error validating ID: %s", err)
<ide> return err
<ide> }
<add>
<add> // ensure no two downloads of the same layer happen at the same time
<add> if c, err := s.poolAdd("pull", "layer:"+img.ID); err != nil {
<add> if c != nil {
<add> log.Debugf("Image (id: %s) load is already running, waiting: %v", img.ID, err)
<add> <-c
<add> return nil
<add> }
<add>
<add> return err
<add> }
<add>
<add> defer s.poolRemove("pull", "layer:"+img.ID)
<add>
<ide> if img.Parent != "" {
<ide> if !s.graph.Exists(img.Parent) {
<ide> if err := s.recursiveLoad(eng, img.Parent, tmpImageDir); err != nil { | 1 |
PHP | PHP | fix user and request info not persisting | 8aacf5ccbd590573ab8c80a656c2a7ae81433dc4 | <ide><path>src/Illuminate/Session/DatabaseSessionHandler.php
<ide> protected function getDefaultPayload($data)
<ide> return $payload;
<ide> }
<ide>
<del> return tap($payload, function ($payload) {
<add> return tap($payload, function (&$payload) {
<ide> $this->addUserInformation($payload)
<ide> ->addRequestInformation($payload);
<ide> }); | 1 |
Javascript | Javascript | add ipv6 brackets but no port to test-dns | 8afab1a5a7ad1106d3199a36eb221a130d6f832b | <ide><path>test/parallel/test-dns.js
<ide> const ports = [
<ide> '4.4.4.4:53',
<ide> '[2001:4860:4860::8888]:53',
<ide> '103.238.225.181:666',
<del> '[fe80::483a:5aff:fee6:1f04]:666'
<add> '[fe80::483a:5aff:fee6:1f04]:666',
<add> '[fe80::483a:5aff:fee6:1f04]',
<ide> ];
<ide> const portsExpected = [
<ide> '4.4.4.4',
<ide> '2001:4860:4860::8888',
<ide> '103.238.225.181:666',
<del> '[fe80::483a:5aff:fee6:1f04]:666'
<add> '[fe80::483a:5aff:fee6:1f04]:666',
<add> 'fe80::483a:5aff:fee6:1f04',
<ide> ];
<ide> dns.setServers(ports);
<ide> assert.deepStrictEqual(dns.getServers(), portsExpected); | 1 |
Ruby | Ruby | fix tests for sqlite3 3.6.xx | 6513dde490ac35aa6974d85377ecc9af4d2e20fd | <ide><path>activerecord/test/models/project.rb
<ide> class Project < ActiveRecord::Base
<ide> :after_add => Proc.new {|o, r| o.developers_log << "after_adding#{r.id || '<new>'}"},
<ide> :before_remove => Proc.new {|o, r| o.developers_log << "before_removing#{r.id}"},
<ide> :after_remove => Proc.new {|o, r| o.developers_log << "after_removing#{r.id}"}
<del> has_and_belongs_to_many :well_payed_salary_groups, :class_name => "Developer", :group => "salary", :having => "SUM(salary) > 10000", :select => "SUM(salary) as salary"
<add> has_and_belongs_to_many :well_payed_salary_groups, :class_name => "Developer", :group => "developers.salary", :having => "SUM(salary) > 10000", :select => "SUM(salary) as salary"
<ide>
<ide> attr_accessor :developers_log
<ide> | 1 |
Python | Python | follow symlinks in dag_folder and plugins_folder | 80caf6a38c011673d89883faed931255df416064 | <ide><path>airflow/models.py
<ide> def collect_dags(
<ide> self.process_file(dag_folder, only_if_updated=only_if_updated)
<ide> elif os.path.isdir(dag_folder):
<ide> patterns = []
<del> for root, dirs, files in os.walk(dag_folder):
<add> for root, dirs, files in os.walk(dag_folder, followlinks=True):
<ide> ignore_file = [f for f in files if f == '.airflowignore']
<ide> if ignore_file:
<ide> f = open(os.path.join(root, ignore_file[0]), 'r')
<ide><path>airflow/plugins_manager.py
<ide> def validate(cls):
<ide> plugins = []
<ide>
<ide> # Crawl through the plugins folder to find AirflowPlugin derivatives
<del>for root, dirs, files in os.walk(plugins_folder):
<add>for root, dirs, files in os.walk(plugins_folder, followlinks=True):
<ide> for f in files:
<ide> try:
<ide> filepath = os.path.join(root, f) | 2 |
Java | Java | implement touch intercepting in rctview | d23f86e47b2e4364b4b56649ea64bd48faf4c5e5 | <ide><path>ReactAndroid/src/main/java/com/facebook/react/flat/FlatViewGroup.java
<ide> import android.content.Context;
<ide> import android.graphics.Canvas;
<ide> import android.graphics.drawable.Drawable;
<add>import android.view.MotionEvent;
<ide> import android.view.View;
<ide> import android.view.ViewGroup;
<ide> import android.view.ViewParent;
<ide>
<add>import com.facebook.react.bridge.SoftAssertions;
<add>import com.facebook.react.touch.CatalystInterceptingViewGroup;
<add>import com.facebook.react.touch.OnInterceptTouchEventListener;
<add>import com.facebook.react.uimanager.PointerEvents;
<ide> import com.facebook.react.uimanager.ReactCompoundView;
<add>import com.facebook.react.uimanager.ReactPointerEventsView;
<ide>
<ide> /**
<ide> * A view that FlatShadowNode hierarchy maps to. Performs drawing by iterating over
<ide> * array of DrawCommands, executing them one by one.
<ide> */
<del>/* package */ final class FlatViewGroup extends ViewGroup implements ReactCompoundView {
<add>/* package */ final class FlatViewGroup extends ViewGroup
<add> implements CatalystInterceptingViewGroup, ReactCompoundView, ReactPointerEventsView {
<ide> /**
<ide> * Helper class that allows AttachDetachListener to invalidate the hosting View.
<ide> */
<ide> public void invalidate() {
<ide> private boolean mIsLayoutRequested = false;
<ide> private boolean mNeedsOffscreenAlphaCompositing = false;
<ide> private Drawable mHotspot;
<add> private PointerEvents mPointerEvents = PointerEvents.AUTO;
<add> private @Nullable OnInterceptTouchEventListener mOnInterceptTouchEventListener;
<ide>
<ide> /* package */ FlatViewGroup(Context context) {
<ide> super(context);
<ide> public void requestLayout() {
<ide>
<ide> @Override
<ide> public int reactTagForTouch(float touchX, float touchY) {
<del> for (NodeRegion nodeRegion : mNodeRegions) {
<del> if (nodeRegion.withinBounds(touchX, touchY)) {
<add> /**
<add> * Make sure we don't find any children if the pointer events are set to BOX_ONLY.
<add> * There is no need to special-case any other modes, because if PointerEvents are set to:
<add> * a) PointerEvents.AUTO - all children are included, nothing to exclude
<add> * b) PointerEvents.NONE - this method will NOT be executed, because the View will be filtered
<add> * out by TouchTargetHelper.
<add> * c) PointerEvents.BOX_NONE - TouchTargetHelper will make sure that {@link #reactTagForTouch()}
<add> * doesn't return getId().
<add> */
<add> SoftAssertions.assertCondition(
<add> mPointerEvents != PointerEvents.NONE,
<add> "TouchTargetHelper should not allow calling this method when pointer events are NONE");
<add>
<add> if (mPointerEvents != PointerEvents.BOX_ONLY) {
<add> NodeRegion nodeRegion = nodeRegionWithinBounds(touchX, touchY);
<add> if (nodeRegion != null) {
<ide> return nodeRegion.getReactTag(touchX, touchY);
<ide> }
<ide> }
<ide>
<add> SoftAssertions.assertCondition(
<add> mPointerEvents != PointerEvents.BOX_NONE,
<add> "TouchTargetHelper should not allow returning getId() when pointer events are BOX_NONE");
<add>
<ide> // no children found
<ide> return getId();
<ide> }
<ide> public boolean hasOverlappingRendering() {
<ide> return mNeedsOffscreenAlphaCompositing;
<ide> }
<ide>
<add> @Override
<add> public void setOnInterceptTouchEventListener(OnInterceptTouchEventListener listener) {
<add> mOnInterceptTouchEventListener = listener;
<add> }
<add>
<add> @Override
<add> public boolean onInterceptTouchEvent(MotionEvent ev) {
<add> if (mOnInterceptTouchEventListener != null &&
<add> mOnInterceptTouchEventListener.onInterceptTouchEvent(this, ev)) {
<add> return true;
<add> }
<add> // We intercept the touch event if the children are not supposed to receive it.
<add> if (mPointerEvents == PointerEvents.NONE || mPointerEvents == PointerEvents.BOX_ONLY) {
<add> return true;
<add> }
<add> return super.onInterceptTouchEvent(ev);
<add> }
<add>
<add> @Override
<add> public boolean onTouchEvent(MotionEvent ev) {
<add> // We do not accept the touch event if this view is not supposed to receive it.
<add> if (mPointerEvents == PointerEvents.NONE) {
<add> return false;
<add> }
<add>
<add> if (mPointerEvents == PointerEvents.BOX_NONE) {
<add> // We cannot always return false here because some child nodes could be flatten into this View
<add> NodeRegion nodeRegion = nodeRegionWithinBounds(ev.getX(), ev.getY());
<add> if (nodeRegion == null) {
<add> // no child to handle this touch event, bailing out.
<add> return false;
<add> }
<add> }
<add>
<add> // The root view always assumes any view that was tapped wants the touch
<add> // and sends the event to JS as such.
<add> // We don't need to do bubbling in native (it's already happening in JS).
<add> // For an explanation of bubbling and capturing, see
<add> // http://javascript.info/tutorial/bubbling-and-capturing#capturing
<add> return true;
<add> }
<add>
<add> @Override
<add> public PointerEvents getPointerEvents() {
<add> return mPointerEvents;
<add> }
<add>
<add> /*package*/ void setPointerEvents(PointerEvents pointerEvents) {
<add> mPointerEvents = pointerEvents;
<add> }
<add>
<ide> /**
<ide> * See the documentation of needsOffscreenAlphaCompositing in View.js.
<ide> */
<ide> public boolean hasOverlappingRendering() {
<ide> LAYOUT_REQUESTS.clear();
<ide> }
<ide>
<add> private NodeRegion nodeRegionWithinBounds(float touchX, float touchY) {
<add> for (NodeRegion nodeRegion : mNodeRegions) {
<add> if (nodeRegion.withinBounds(touchX, touchY)) {
<add> return nodeRegion;
<add> }
<add> }
<add>
<add> return null;
<add> }
<add>
<ide> private View ensureViewHasNoParent(View view) {
<ide> ViewParent oldParent = view.getParent();
<ide> if (oldParent != null) {
<ide><path>ReactAndroid/src/main/java/com/facebook/react/flat/RCTView.java
<ide> public void setBorderStyle(@Nullable String borderStyle) {
<ide> getMutableBorder().setBorderStyle(borderStyle);
<ide> }
<ide>
<add> @ReactProp(name = "pointerEvents")
<add> public void setPointerEvents(@Nullable String pointerEventsStr) {
<add> forceMountToView();
<add> }
<add>
<ide> private DrawBorder getMutableBorder() {
<ide> if (mDrawBorder == null) {
<ide> mDrawBorder = new DrawBorder();
<ide><path>ReactAndroid/src/main/java/com/facebook/react/flat/RCTViewManager.java
<ide> import com.facebook.react.bridge.ReadableMap;
<ide> import com.facebook.react.common.MapBuilder;
<ide> import com.facebook.react.uimanager.PixelUtil;
<add>import com.facebook.react.uimanager.PointerEvents;
<ide> import com.facebook.react.uimanager.ReactProp;
<ide> import com.facebook.react.uimanager.ViewProps;
<ide> import com.facebook.react.views.view.ReactDrawableHelper;
<ide> public void setNeedsOffscreenAlphaCompositing(
<ide> boolean needsOffscreenAlphaCompositing) {
<ide> view.setNeedsOffscreenAlphaCompositing(needsOffscreenAlphaCompositing);
<ide> }
<add>
<add> @ReactProp(name = "pointerEvents")
<add> public void setPointerEvents(FlatViewGroup view, @Nullable String pointerEventsStr) {
<add> view.setPointerEvents(parsePointerEvents(pointerEventsStr));
<add> }
<add>
<add> private static PointerEvents parsePointerEvents(@Nullable String pointerEventsStr) {
<add> if (pointerEventsStr != null) {
<add> switch (pointerEventsStr) {
<add> case "none":
<add> return PointerEvents.NONE;
<add> case "auto":
<add> return PointerEvents.AUTO;
<add> case "box-none":
<add> return PointerEvents.BOX_NONE;
<add> case "box-only":
<add> return PointerEvents.BOX_ONLY;
<add> }
<add> }
<add> // default or invalid
<add> return PointerEvents.AUTO;
<add> }
<ide> } | 3 |
Go | Go | add more tests to unix_test.go | e802b69146ac7a008d943a3a289fba56150b4f81 | <ide><path>pkg/beam/unix_test.go
<ide> func TestSocketPair(t *testing.T) {
<ide> fmt.Printf("still open: %v\n", a.Fd())
<ide> }
<ide>
<add>func TestUSocketPair(t *testing.T) {
<add> a, b, err := USocketPair()
<add> if err != nil {
<add> t.Fatal(err)
<add> }
<add>
<add> data := "hello world!"
<add> go func() {
<add> a.Write([]byte(data))
<add> a.Close()
<add> }()
<add> res := make([]byte, 1024)
<add> size, err := b.Read(res)
<add> if err != nil {
<add> t.Fatal(err)
<add> }
<add> if size != len(data) {
<add> t.Fatal("Unexpected size")
<add> }
<add> if string(res[0:size]) != data {
<add> t.Fatal("Unexpected data")
<add> }
<add>}
<add>
<ide> func TestSendUnixSocket(t *testing.T) {
<ide> a1, a2, err := USocketPair()
<ide> if err != nil {
<ide> func TestSendUnixSocket(t *testing.T) {
<ide> t.Fatal(err)
<ide> }
<ide> fmt.Printf("---> %s\n", data)
<add>
<add>}
<add>
<add>// Ensure we get proper segmenting of messages
<add>func TestSendSegmenting(t *testing.T) {
<add> a, b, err := USocketPair()
<add> if err != nil {
<add> t.Fatal(err)
<add> }
<add> defer a.Close()
<add> defer b.Close()
<add>
<add> extrafd1, extrafd2, err := SocketPair()
<add> if err != nil {
<add> t.Fatal(err)
<add> }
<add> extrafd2.Close()
<add>
<add> go func() {
<add> a.Send([]byte("message 1"), nil)
<add> a.Send([]byte("message 2"), extrafd1)
<add> a.Send([]byte("message 3"), nil)
<add> }()
<add>
<add> msg1, file1, err := b.Receive()
<add> if err != nil {
<add> t.Fatal(err)
<add> }
<add> if string(msg1) != "message 1" {
<add> t.Fatal("unexpected msg1:", string(msg1))
<add> }
<add> if file1 != nil {
<add> t.Fatal("unexpectedly got file1")
<add> }
<add>
<add> msg2, file2, err := b.Receive()
<add> if err != nil {
<add> t.Fatal(err)
<add> }
<add> if string(msg2) != "message 2" {
<add> t.Fatal("unexpected msg2:", string(msg2))
<add> }
<add> if file2 == nil {
<add> t.Fatal("didn't get file2")
<add> }
<add> file2.Close()
<add>
<add> msg3, file3, err := b.Receive()
<add> if err != nil {
<add> t.Fatal(err)
<add> }
<add> if string(msg3) != "message 3" {
<add> t.Fatal("unexpected msg3:", string(msg3))
<add> }
<add> if file3 != nil {
<add> t.Fatal("unexpectedly got file3")
<add> }
<add>
<add>}
<add>
<add>// Test sending a zero byte message
<add>func TestSendEmpty(t *testing.T) {
<add> a, b, err := USocketPair()
<add> if err != nil {
<add> t.Fatal(err)
<add> }
<add> defer a.Close()
<add> defer b.Close()
<add> go func() {
<add> a.Send([]byte{}, nil)
<add> }()
<add>
<add> msg, file, err := b.Receive()
<add> if err != nil {
<add> t.Fatal(err)
<add> }
<add> if len(msg) != 0 {
<add> t.Fatalf("unexpected non-empty message: %v", msg)
<add> }
<add> if file != nil {
<add> t.Fatal("unexpectedly got file")
<add> }
<add>
<add>}
<add>
<add>func makeLarge(size int) []byte {
<add> res := make([]byte, size)
<add> for i := range res {
<add> res[i] = byte(i % 255)
<add> }
<add> return res
<add>}
<add>
<add>func verifyLarge(data []byte, size int) bool {
<add> if len(data) != size {
<add> return false
<add> }
<add> for i := range data {
<add> if data[i] != byte(i%255) {
<add> return false
<add> }
<add> }
<add> return true
<add>}
<add>
<add>// Test sending a large message
<add>func TestSendLarge(t *testing.T) {
<add> a, b, err := USocketPair()
<add> if err != nil {
<add> t.Fatal(err)
<add> }
<add> defer a.Close()
<add> defer b.Close()
<add> go func() {
<add> a.Send(makeLarge(100000), nil)
<add> }()
<add>
<add> msg, file, err := b.Receive()
<add> if err != nil {
<add> t.Fatal(err)
<add> }
<add> if !verifyLarge(msg, 100000) {
<add> t.Fatalf("unexpected message (size %d)", len(msg))
<add> }
<add> if file != nil {
<add> t.Fatal("unexpectedly got file")
<add> }
<ide> } | 1 |
Ruby | Ruby | fix rubocop offenses | 7d9758a05ecf8cef93ba919568579cd0bee58c82 | <ide><path>Library/Homebrew/extend/os/mac/hardware/cpu.rb
<ide> def features
<ide> "machdep.cpu.features",
<ide> "machdep.cpu.extfeatures",
<ide> "machdep.cpu.leaf7_features",
<del> ).split(" ").map { |s| s.downcase.to_sym }
<add> ).split.map { |s| s.downcase.to_sym }
<ide> end
<ide>
<ide> def sse4?
<ide><path>Library/Homebrew/formula.rb
<ide> def system(cmd, *args)
<ide> @exec_count += 1
<ide> logfn = format("#{logs}/#{active_log_prefix}%02<exec_count>d.%<cmd_base>s",
<ide> exec_count: @exec_count,
<del> cmd_base: File.basename(cmd).split(" ").first)
<add> cmd_base: File.basename(cmd).split.first)
<ide> logs.mkpath
<ide>
<ide> File.open(logfn, "w") do |log|
<ide><path>Library/Homebrew/formula_installer.rb
<ide> def forbidden_license_check
<ide> pattern = /#{s.to_s.tr("_", " ")}/i
<ide> forbidden_licenses.sub!(pattern, s.to_s)
<ide> end
<del> forbidden_licenses = forbidden_licenses.split(" ").to_h do |license|
<add> forbidden_licenses = forbidden_licenses.split.to_h do |license|
<ide> [license, SPDX.license_version_info(license)]
<ide> end
<ide>
<ide><path>Library/Homebrew/rubocops/livecheck.rb
<ide> def audit_formula(_node, _class_node, _parent_class_node, body_node)
<ide>
<ide> def autocorrect(node)
<ide> lambda do |corrector|
<del> pattern = node.source.split(" ")[1..].join
<add> pattern = node.source.split[1..].join
<ide> corrector.replace(node.source_range, "regex(#{pattern})")
<ide> end
<ide> end
<ide><path>Library/Homebrew/test/exceptions_spec.rb
<ide> let(:mod) do
<ide> Module.new do
<ide> class Bar < Requirement; end
<add>
<ide> class Baz < Formula; end
<ide> end
<ide> end | 5 |
Go | Go | stop invalid calls to registry | d47507791e14908e78cf38d415a9863c9ef75c5e | <ide><path>server.go
<ide> func (srv *Server) ImagePull(localName string, tag string, out io.Writer, sf *ut
<ide> localName = remoteName
<ide> }
<ide>
<del> err = srv.pullRepository(r, out, localName, remoteName, tag, endpoint, sf, parallel)
<del> if err == registry.ErrLoginRequired {
<add> if err = srv.pullRepository(r, out, localName, remoteName, tag, endpoint, sf, parallel); err != nil {
<ide> return err
<ide> }
<del> if err != nil {
<del> if err := srv.pullImage(r, out, remoteName, endpoint, nil, sf); err != nil {
<del> return err
<del> }
<del> return nil
<del> }
<ide>
<ide> return nil
<ide> } | 1 |
Javascript | Javascript | increase default idletimeout to 1 minute | a7128eebe7e72859af878c61a1600b26239d02df | <ide><path>lib/WebpackOptionsDefaulter.js
<ide> class WebpackOptionsDefaulter extends OptionsDefaulter {
<ide> value.store = "pack";
<ide> }
<ide> if (value.idleTimeout === undefined) {
<del> value.idleTimeout = 10000;
<add> value.idleTimeout = 60000;
<ide> }
<ide> if (value.idleTimeoutForInitialStore === undefined) {
<ide> value.idleTimeoutForInitialStore = 0; | 1 |
Python | Python | move lookuplemmatizer to spacy.lemmatizer | 820bf850752962714a378b20de12ddbefe69f3e8 | <ide><path>spacy/lemmatizer.py
<ide> def lemmatize(string, index, exceptions, rules):
<ide> if not forms:
<ide> forms.append(string)
<ide> return set(forms)
<add>
<add>
<add>class LookupLemmatizer(Lemmatizer):
<add> @classmethod
<add> def load(cls, path, lookup):
<add> return cls(lookup or {})
<add>
<add> def __init__(self, lookup):
<add> self.lookup = lookup
<add>
<add> def __call__(self, string, univ_pos, morphology=None):
<add> try:
<add> return set([self.lookup[string]])
<add> except:
<add> return set([string])
<ide><path>spacy/lemmatizerlookup.py
<del># coding: utf8
<del>from __future__ import unicode_literals
<del>
<del>from .lemmatizer import Lemmatizer
<del>
<del>
<del>class Lemmatizer(Lemmatizer):
<del> @classmethod
<del> def load(cls, path, lookup):
<del> return cls(lookup or {})
<del>
<del> def __init__(self, lookup):
<del> self.lookup = lookup
<del>
<del> def __call__(self, string, univ_pos, morphology=None):
<del> try:
<del> return set([self.lookup[string]])
<del> except:
<del> return set([string])
<ide>\ No newline at end of file | 2 |
Javascript | Javascript | remove log in jsdevsupportmodule | 87690570c8147f57db2af2f4c9984ac5db9ec5c9 | <ide><path>Libraries/Utilities/JSDevSupportModule.js
<ide> var JSDevSupportModule = {
<ide>
<ide> var result = renderer.getInspectorDataForViewTag(tag);
<ide> var path = result.hierarchy.map( (item) => item.name).join(' -> ');
<del> console.error('StackOverflowException rendering JSComponent: ' + path);
<ide> require('NativeModules').JSDevSupport.setResult(path, null);
<ide> },
<ide> }; | 1 |
Java | Java | improve behaviorprocessor javadoc | a1c3ba9c885ac6b069b304f6b18c2bfe37278fdc | <ide><path>src/main/java/io/reactivex/processors/BehaviorProcessor.java
<ide> * <p>
<ide> * <img width="640" height="460" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/S.BehaviorProcessor.png" alt="">
<ide> * <p>
<add> * This processor does not have a public constructor by design; a new empty instance of this
<add> * {@code BehaviorSubject} can be created via the {@link #create()} method and
<add> * a new non-empty instance can be created via {@link #createDefault(Object)} (named as such to avoid
<add> * overload resolution conflict with {@code Flowable.create} that creates a Flowable, not a {@code BehaviorProcessor}).
<add> * <p>
<add> * In accordance with the Reactive Streams specification (<a href="https://github.com/reactive-streams/reactive-streams-jvm#2.13">Rule 2.13</a>)
<add> * {@code null}s are not allowed as default initial values in {@link #createDefault(Object)} or as parameters to {@link #onNext(Object)} and
<add> * {@link #onError(Throwable)}.
<add> * <p>
<add> * When this {@code BehaviorProcessor} is terminated via {@link #onError(Throwable)} or {@link #onComplete()}, the
<add> * last observed item (if any) is cleared and late {@link org.reactivestreams.Subscriber}s only receive
<add> * the respective terminal event.
<add> * <p>
<add> * The {@code BehaviorProcessor} does not support clearing its cached value (to appear empty again), however, the
<add> * effect can be achieved by using a special item and making sure {@code Subscriber}s subscribe through a
<add> * filter whose predicate filters out this special item:
<add> * <pre><code>
<add> * BehaviorProcessor<Integer> processor = BehaviorProcessor.create();
<add> *
<add> * final Integer EMPTY = Integer.MIN_VALUE;
<add> *
<add> * Flowable<Integer> flowable = processor.filter(v -> v != EMPTY);
<add> *
<add> * TestSubscriber<Integer> ts1 = flowable.test();
<add> *
<add> * processor.onNext(1);
<add> * // this will "clear" the cache
<add> * processor.onNext(EMPTY);
<add> *
<add> * TestSubscriber<Integer> ts2 = flowable.test();
<add> *
<add> * processor.onNext(2);
<add> * processor.onComplete();
<add> *
<add> * // ts1 received both non-empty items
<add> * ts1.assertResult(1, 2);
<add> *
<add> * // ts2 received only 2 even though the current item was EMPTY
<add> * // when it got subscribed
<add> * ts2.assertResult(2);
<add> *
<add> * // Subscribers coming after the processor was terminated receive
<add> * // no items and only the onComplete event in this case.
<add> * flowable.test().assertResult();
<add> * </code></pre>
<add> * <p>
<add> * Even though {@code BehaviorProcessor} implements the {@code Subscriber} interface, calling
<add> * {@code onSubscribe} is not required (<a href="https://github.com/reactive-streams/reactive-streams-jvm#2.12">Rule 2.12</a>)
<add> * if the processor is used as a standalone source. However, calling {@code onSubscribe} is
<add> * called after the {@code BehaviorProcessor} reached its terminal state will result in the
<add> * given {@code Subscription} being cancelled immediately.
<add> * <p>
<add> * Calling {@link #onNext(Object)}, {@link #onError(Throwable)} and {@link #onComplete()}
<add> * is still required to be serialized (called from the same thread or called non-overlappingly from different threads
<add> * through external means of serialization). The {@link #toSerialized()} method available to all {@code FlowableProcessor}s
<add> * provides such serialization and also protects against reentrance (i.e., when a downstream {@code Subscriber}
<add> * consuming this processor also wants to call {@link #onNext(Object)} on this processor recursively.
<add> * <p>
<add> * This {@code BehaviorProcessor} supports the standard state-peeking methods {@link #hasComplete()}, {@link #hasThrowable()},
<add> * {@link #getThrowable()} and {@link #hasSubscribers()} as well as means to read the latest observed value
<add> * in a non-blocking and thread-safe manner via {@link #hasValue()}, {@link #getValue()},
<add> * {@link #getValues()} or {@link #getValues(Object[])}.
<add> * <p>
<add> * Note that this processor signals {@code MissingBackpressureException} if a particular {@code Subscriber} is not
<add> * ready to receive {@code onNext} events. To avoid this exception being signaled, use {@link #offer(Object)} to only
<add> * try to emit an item when all {@code Subscriber}s have requested item(s).
<add> * <dl>
<add> * <dt><b>Backpressure:</b></dt>
<add> * <dd>The {@code BehaviorProcessor} does not coordinate requests of its downstream {@code Subscriber}s and
<add> * expects each individual {@code Subscriber} is ready to receive {@code onNext} items when {@link #onNext(Object)}
<add> * is called. If a {@code Subscriber} is not ready, a {@code MissingBackpressureException} is signalled to it.
<add> * To avoid overflowing the current {@code Subscriber}s, the conditional {@link #offer(Object)} method is available
<add> * that returns true if any of the {@code Subscriber}s is not ready to receive {@code onNext} events. If
<add> * there are no {@code Subscriber}s to the processor, {@code offer()} always succeeds.
<add> * If the {@code BehaviorProcessor} is (optionally) subscribed to another {@code Publisher}, this upstream
<add> * {@code Publisher} is consumed in an unbounded fashion (requesting {@code Long.MAX_VALUE}).</dd>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>{@code BehaviorProcessor} does not operate by default on a particular {@link io.reactivex.Scheduler} and
<add> * the {@code Subscriber}s get notified on the thread the respective {@code onXXX} methods were invoked.</dd>
<add> * </dl>
<add> * <p>
<ide> * Example usage:
<ide> * <pre> {@code
<ide>
<ide> * Creates a {@link BehaviorProcessor} without a default item.
<ide> *
<ide> * @param <T>
<del> * the type of item the Subject will emit
<add> * the type of item the BehaviorProcessor will emit
<ide> * @return the constructed {@link BehaviorProcessor}
<ide> */
<ide> @CheckReturnValue
<ide> public static <T> BehaviorProcessor<T> create() {
<ide> * {@link Subscriber} that subscribes to it.
<ide> *
<ide> * @param <T>
<del> * the type of item the Subject will emit
<add> * the type of item the BehaviorProcessor will emit
<ide> * @param defaultValue
<ide> * the item that will be emitted first to any {@link Subscriber} as long as the
<ide> * {@link BehaviorProcessor} has not yet observed any items from its source {@code Observable}
<ide> public Throwable getThrowable() {
<ide> }
<ide>
<ide> /**
<del> * Returns a single value the Subject currently has or null if no such value exists.
<add> * Returns a single value the BehaviorProcessor currently has or null if no such value exists.
<ide> * <p>The method is thread-safe.
<del> * @return a single value the Subject currently has or null if no such value exists
<add> * @return a single value the BehaviorProcessor currently has or null if no such value exists
<ide> */
<ide> public T getValue() {
<ide> Object o = value.get();
<ide> public T getValue() {
<ide> }
<ide>
<ide> /**
<del> * Returns an Object array containing snapshot all values of the Subject.
<add> * Returns an Object array containing snapshot all values of the BehaviorProcessor.
<ide> * <p>The method is thread-safe.
<del> * @return the array containing the snapshot of all values of the Subject
<add> * @return the array containing the snapshot of all values of the BehaviorProcessor
<ide> */
<ide> public Object[] getValues() {
<ide> @SuppressWarnings("unchecked")
<ide> public Object[] getValues() {
<ide> }
<ide>
<ide> /**
<del> * Returns a typed array containing a snapshot of all values of the Subject.
<add> * Returns a typed array containing a snapshot of all values of the BehaviorProcessor.
<ide> * <p>The method follows the conventions of Collection.toArray by setting the array element
<ide> * after the last value to null (if the capacity permits).
<ide> * <p>The method is thread-safe.
<ide> public boolean hasThrowable() {
<ide> }
<ide>
<ide> /**
<del> * Returns true if the subject has any value.
<add> * Returns true if the BehaviorProcessor has any value.
<ide> * <p>The method is thread-safe.
<del> * @return true if the subject has any value
<add> * @return true if the BehaviorProcessor has any value
<ide> */
<ide> public boolean hasValue() {
<ide> Object o = value.get(); | 1 |
Javascript | Javascript | replace 2 spaces with 1 tab | 859e0ddab6045e5d697daacf72e26f95f6e4f797 | <ide><path>lib/Compiler.js
<ide> Watching.prototype._done = function(err, compilation) {
<ide> this.handler(null, stats);
<ide> if(!this.closed) {
<ide> this.watch(compilation.fileDependencies, compilation.contextDependencies, compilation.missingDependencies);
<del> }
<add> }
<ide> this.callbacks.forEach(function(cb) {
<ide> cb();
<ide> }); | 1 |
PHP | PHP | update the listsources | 3f984b68facd9ba3c1eaa49d87d2bd115a87ce61 | <ide><path>lib/Cake/Model/Datasource/Database/Mssql.php
<ide> public function enabled() {
<ide> *
<ide> * @return array Array of tablenames in the database
<ide> */
<del> function listSources() {
<add> public function listSources() {
<ide> $cache = parent::listSources();
<del>
<del> if ($cache != null) {
<add> if ($cache !== null) {
<ide> return $cache;
<ide> }
<del> $result = $this->fetchAll('SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES', false);
<add>	$result = $this->_execute("SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE TABLE'");
<ide>
<del> if (!$result || empty($result)) {
<add> if (!$result) {
<add> $result->closeCursor();
<ide> return array();
<del> } else {
<del> $tables = array();
<del>
<del> foreach ($result as $table) {
<del> $tables[] = $table[0]['TABLE_NAME'];
<del> }
<add> }
<ide>
<del> parent::listSources($tables);
<del> return $tables;
<add> $tables = array();
<add> while ($line = $result->fetch(PDO::FETCH_ASSOC)) {
<add> $tables[] = $line['TABLE_NAME'];
<ide> }
<add>
<add> $result->closeCursor();
<add> parent::listSources($tables);
<add> return $tables;
<ide> }
<ide>
<ide> /** | 1 |
Text | Text | add a period at the end of an ul tag | dba157e7c058469f66f62b898b12990ae0388364 | <ide><path>curriculum/challenges/english/01-responsive-web-design/basic-html-and-html5/create-a-bulleted-unordered-list.md
<ide> dashedName: create-a-bulleted-unordered-list
<ide>
<ide> HTML has a special element for creating <dfn>unordered lists</dfn>, or bullet point style lists.
<ide>
<del>Unordered lists start with an opening `<ul>` element, followed by any number of `<li>` elements. Finally, unordered lists close with a `</ul>`
<add>Unordered lists start with an opening `<ul>` element, followed by any number of `<li>` elements. Finally, unordered lists close with a `</ul>`.
<ide>
<ide> For example:
<ide> | 1 |
Javascript | Javascript | make hostcomponent inexact | 9354dd2752239b72b3c183c75256a7830d2938c0 | <ide><path>packages/react-native-renderer/src/ReactNativeTypes.js
<ide> export type NativeMethods = {
<ide> };
<ide>
<ide> export type NativeMethodsMixinType = NativeMethods;
<del>export type HostComponent<T> = AbstractComponent<
<del> T,
<del> $ReadOnly<$Exact<NativeMethods>>,
<del>>;
<add>export type HostComponent<T> = AbstractComponent<T, $ReadOnly<NativeMethods>>;
<ide>
<ide> type SecretInternalsType = {
<ide> NativeMethodsMixin: NativeMethodsMixinType, | 1 |
Ruby | Ruby | remove duplicated tests from shared generator | 2b355757d5485beb42a4184e28621d86daced862 | <ide><path>railties/test/generators/app_generator_test.rb
<ide> def test_http_only_generates_config_middleware_and_generator_http_only_setup
<ide>
<ide> def test_http_only_generates_application_controller_with_action_controller_http
<ide> run_generator [destination_root, "--http-only"]
<del> assert_file "app/controllers/application_controller.rb", /class ApplicationController < ActionController::HTTP/
<add> assert_file "app/controllers/application_controller.rb",
<add> /class ApplicationController < ActionController::HTTP/
<ide> end
<ide>
<ide> def test_http_only_generates_application_controller_with_protect_from_forgery_commented_out_setup
<ide><path>railties/test/generators/shared_generator_tests.rb
<ide> def test_template_is_executed_when_supplied
<ide> assert_match(/It works!/, capture(:stdout) { generator.invoke_all })
<ide> end
<ide>
<del> def test_template_raises_an_error_with_invalid_path
<del> content = capture(:stderr){ run_generator([destination_root, "-m", "non/existant/path"]) }
<del> assert_match(/The template \[.*\] could not be loaded/, content)
<del> assert_match(/non\/existant\/path/, content)
<del> end
<del>
<del> def test_template_is_executed_when_supplied
<del> path = "http://gist.github.com/103208.txt"
<del> template = %{ say "It works!" }
<del> template.instance_eval "def read; self; end" # Make the string respond to read
<del>
<del> generator([destination_root], :template => path).expects(:open).with(path, 'Accept' => 'application/x-thor-template').returns(template)
<del> assert_match(/It works!/, capture(:stdout) { generator.invoke_all })
<del> end
<del>
<ide> def test_template_is_executed_when_supplied_an_https_path
<ide> path = "https://gist.github.com/103208.txt"
<ide> template = %{ say "It works!" } | 2 |
Python | Python | correct openapi test for common prefixes | 178a2dc786461fdd03da0496dccdce7fb4676072 | <ide><path>tests/schemas/test_openapi.py
<ide> def test_paths_construction(self):
<ide> assert 'post' in example_operations
<ide>
<ide> def test_prefixed_paths_construction(self):
<del> """Construction of the `paths` key with a common prefix."""
<add> """Construction of the `paths` key maintains a common prefix."""
<ide> patterns = [
<del> url(r'^api/v1/example/?$', views.ExampleListView.as_view()),
<del> url(r'^api/v1/example/{pk}/?$', views.ExampleDetailView.as_view()),
<add> url(r'^v1/example/?$', views.ExampleListView.as_view()),
<add> url(r'^v1/example/{pk}/?$', views.ExampleDetailView.as_view()),
<ide> ]
<ide> generator = SchemaGenerator(patterns=patterns)
<ide> generator._initialise_endpoints()
<ide>
<ide> paths = generator.get_paths()
<ide>
<del> assert '/example/' in paths
<del> assert '/example/{id}/' in paths
<add> assert '/v1/example/' in paths
<add> assert '/v1/example/{id}/' in paths
<ide>
<ide> def test_schema_construction(self):
<ide> """Construction of the top level dictionary.""" | 1 |
Text | Text | update troubleshooting guide from slack to discord | 2f4a8f33a3fffa4085a9e66fabfa1e715296e257 | <ide><path>docs/Troubleshooting.md
<ide> You can then pass `dispatch` down to other components manually, if you want to.
<ide>
<ide> ## Something else doesn’t work
<ide>
<del>Ask around on the **#redux** [Reactiflux](http://reactiflux.com/) Slack channel, or [create an issue](https://github.com/rackt/redux/issues).
<add>Ask around on the **#redux** [Reactiflux](http://reactiflux.com/) Discord channel, or [create an issue](https://github.com/rackt/redux/issues).
<ide> If you figure it out, [edit this document](https://github.com/rackt/redux/edit/master/docs/Troubleshooting.md) as a courtesy to the next person having the same problem. | 1 |
Text | Text | add note about next_data hydration | f7ac942e6ae990d6fedaae487f00ed777ef65f02 | <ide><path>docs/api-reference/data-fetching/get-initial-props.md
<ide> For the initial page load, `getInitialProps` will run on the server only. `getIn
<ide> - `getInitialProps` can **not** be used in children components, only in the default export of every page
<ide> - If you are using server-side only modules inside `getInitialProps`, make sure to [import them properly](https://arunoda.me/blog/ssr-and-server-only-modules), otherwise it'll slow down your app
<ide>
<add>> Note that irrespective of rendering type, any `props` will be passed to the page component and can be viewed on the client-side in the initial HTML. This is to allow the page to be [hydrated](https://reactjs.org/docs/react-dom.html#hydrate) correctly. Make sure that you don't pass any sensitive information that shouldn't be available on the client in `props`.
<add>
<ide> ## TypeScript
<ide>
<ide> If you're using TypeScript, you can use the `NextPage` type for function components:
<ide><path>docs/basic-features/data-fetching/get-server-side-props.md
<ide> export async function getServerSideProps(context) {
<ide> }
<ide> ```
<ide>
<add>> Note that irrespective of rendering type, any `props` will be passed to the page component and can be viewed on the client-side in the initial HTML. This is to allow the page to be [hydrated](https://reactjs.org/docs/react-dom.html#hydrate) correctly. Make sure that you don't pass any sensitive information that shouldn't be available on the client in `props`.
<add>
<ide> ## When does getServerSideProps run
<ide>
<ide> `getServerSideProps` only runs on server-side and never runs on the browser. If a page uses `getServerSideProps`, then:
<ide><path>docs/basic-features/data-fetching/get-static-props.md
<ide> export async function getStaticProps(context) {
<ide> }
<ide> ```
<ide>
<add>> Note that irrespective of rendering type, any `props` will be passed to the page component and can be viewed on the client-side in the initial HTML. This is to allow the page to be [hydrated](https://reactjs.org/docs/react-dom.html#hydrate) correctly. Make sure that you don't pass any sensitive information that shouldn't be available on the client in `props`.
<add>
<ide> ## When should I use getStaticProps?
<ide>
<ide> You should use `getStaticProps` if: | 3 |
Javascript | Javascript | fix bug in `classed` operator | be857135840e4b276c57920f980a50aedc18cdab | <ide><path>d3.js
<del>d3 = {version: "0.28.0"}; // semver
<add>d3 = {version: "0.28.1"}; // semver
<ide> if (!Date.now) Date.now = function() {
<ide> return +new Date();
<ide> };
<ide> function d3_selection(groups) {
<ide> // If no value is specified, return the first value.
<ide> if (arguments.length < 2) {
<ide> return first(function() {
<add> re.lastIndex = 0;
<ide> return re.test(this.className);
<ide> });
<ide> }
<ide>
<ide> /** @this {Element} */
<ide> function classedAdd() {
<ide> var classes = this.className;
<add> re.lastIndex = 0;
<ide> if (!re.test(classes)) {
<ide> this.className = d3_collapse(classes + " " + name);
<ide> }
<ide><path>d3.min.js
<del>(function(){var n=null;d3={version:"0.28.0"};if(!Date.now)Date.now=function(){return+new Date};if(!Object.create)Object.create=function(a){function b(){}b.prototype=a;return new b};function x(a){return Array.prototype.slice.call(a)}function y(a){return typeof a=="function"?a:function(){return a}}d3.ascending=function(a,b){return a<b?-1:a>b?1:0};d3.descending=function(a,b){return b<a?-1:b>a?1:0};d3.merge=function(a){return Array.prototype.concat.apply([],a)};
<add>(function(){var n=null;d3={version:"0.28.1"};if(!Date.now)Date.now=function(){return+new Date};if(!Object.create)Object.create=function(a){function b(){}b.prototype=a;return new b};function x(a){return Array.prototype.slice.call(a)}function y(a){return typeof a=="function"?a:function(){return a}}d3.ascending=function(a,b){return a<b?-1:a>b?1:0};d3.descending=function(a,b){return b<a?-1:b>a?1:0};d3.merge=function(a){return Array.prototype.concat.apply([],a)};
<ide> d3.split=function(a,b){var e=[],f=[],c,d=-1,h=a.length;if(arguments.length<2)b=aa;for(;++d<h;)if(b.call(f,c=a[d],d)){e.push(f);f=[]}else f.push(c);e.push(f);return e};function aa(a){return a==n}function E(a){return a.replace(/(^\s+)|(\s+$)/g,"").replace(/\s+/g," ")}function ba(a,b){b=x(arguments);b[0]=this;a.apply(this,b);return this}
<ide> d3.range=function(a,b,e){if(arguments.length==1){b=a;a=0}if(e==n)e=1;if((b-a)/e==Infinity)throw Error("infinite range");var f=[],c=-1,d;if(e<0)for(;(d=a+e*++c)>b;)f.push(d);else for(;(d=a+e*++c)<b;)f.push(d);return f};d3.requote=function(a){return a.replace(ca,"\\$&")};var ca=/[\\\^\$\*\+\?\[\]\(\)\.\{\}]/g;
<ide> d3.xhr=function(a,b,e){var f=new XMLHttpRequest;if(arguments.length<3)e=b;else b&&f.overrideMimeType(b);f.open("GET",a,true);f.onreadystatechange=function(){if(f.readyState==4)e(f.status<300?f:n)};f.send(n)};d3.text=function(a,b,e){if(arguments.length<3){e=b;b=n}d3.xhr(a,b,function(f){e(f&&f.responseText)})};d3.json=function(a,b){d3.text(a,"application/json",function(e){b(e?JSON.parse(e):n)})};
<ide> i=0,j=g.length;i<j;i++){var l=g[i];if(l)return c.call(l,l.__data__,i)}return n}a
<ide> var m=0,t=k.length,u=o.length,s=Math.min(t,u),v=Math.max(t,u),z=[],A=[],w=[],B,C;if(d){s={};v=[];var D;C=o.length;for(m=0;m<t;m++){D=d.nodeKey(B=k[m]);if(D in s)w[C++]=k[m];else{s[D]=B;v.push(D)}}for(m=0;m<u;m++){if(B=s[D=d.dataKey(C=o[m])]){B.__data__=C;z[m]=B;A[m]=w[m]=n}else{A[m]={appendChild:q,__data__:C};z[m]=w[m]=n}delete s[D]}for(m=0;m<t;m++)if(v[m]in s)w[m]=k[m]}else{for(;m<s;m++){B=k[m];C=o[m];if(B){B.__data__=C;z[m]=B;A[m]=w[m]=n}else{A[m]={appendChild:q,__data__:C};z[m]=w[m]=n}}for(;m<
<ide> u;m++){A[m]={appendChild:q,__data__:o[m]};z[m]=w[m]=n}for(;m<v;m++){w[m]=k[m];A[m]=z[m]=n}}A.parentNode=z.parentNode=w.parentNode=k.parentNode;A.parentData=z.parentData=w.parentData=k.parentData;l.push(A);p.push(z);r.push(w)}var g=-1,i=a.length,j,l=[],p=[],r=[];if(typeof d=="string")d=xa(d);if(typeof c=="function")for(;++g<i;)h(j=a[g],c.call(j,j.parentData,g));else for(;++g<i;)h(j=a[g],c);g=O(p);g.enter=function(k){return O(l).append(k)};g.exit=function(){return O(r)};return g};a.each=function(c){for(var d=
<ide> 0,h=a.length;d<h;d++)for(var g=a[d],i=0,j=g.length;i<j;i++){var l=g[i];l&&c.call(l,l.__data__,i)}return a};a.node=function(){return f(function(){return this})};a.attr=function(c,d){function h(){this.removeAttribute(c)}function g(){this.removeAttributeNS(c.space,c.local)}function i(){this.setAttribute(c,d)}function j(){this.setAttributeNS(c.space,c.local,d)}function l(){var r=d.apply(this,arguments);r==n?this.removeAttribute(c):this.setAttribute(c,r)}function p(){var r=d.apply(this,arguments);r==n?
<del>this.removeAttributeNS(c.space,c.local):this.setAttributeNS(c.space,c.local,r)}c=d3.ns.qualify(c);if(arguments.length<2)return f(c.local?function(){return this.getAttributeNS(c.space,c.local)}:function(){return this.getAttribute(c)});return a.each(d==n?c.local?g:h:typeof d=="function"?c.local?p:l:c.local?j:i)};a.classed=function(c,d){function h(){var l=this.className;if(!j.test(l))this.className=E(l+" "+c)}function g(){var l=E(this.className.replace(j," "));this.className=l.length?l:n}function i(){(d.apply(this,
<del>arguments)?h:g).call(this)}var j=RegExp("(^|\\s+)"+d3.requote(c)+"(\\s+|$)","g");if(arguments.length<2)return f(function(){return j.test(this.className)});return a.each(typeof d=="function"?i:d?h:g)};a.style=function(c,d,h){function g(){this.style.removeProperty(c)}function i(){this.style.setProperty(c,d,h)}function j(){var l=d.apply(this,arguments);l==n?this.style.removeProperty(c):this.style.setProperty(c,l,h)}if(arguments.length<3)h=n;if(arguments.length<2)return f(function(){return window.getComputedStyle(this,
<del>n).getPropertyValue(c)});return a.each(d==n?g:typeof d=="function"?j:i)};a.property=function(c,d){function h(){delete this[c]}function g(){this[c]=d}function i(){var j=d.apply(this,arguments);if(j==n)delete this[c];else this[c]=j}c=d3.ns.qualify(c);if(arguments.length<2)return f(function(){return this[c]});return a.each(d==n?h:typeof d=="function"?i:g)};a.text=function(c){function d(){this.appendChild(document.createTextNode(c))}function h(){var g=c.apply(this,arguments);g!=n&&this.appendChild(document.createTextNode(g))}
<del>if(arguments.length<1)return f(function(){return this.textContent});a.each(function(){for(;this.lastChild;)this.removeChild(this.lastChild)});return c==n?a:a.each(typeof c=="function"?h:d)};a.html=function(c){function d(){this.innerHTML=c}function h(){this.innerHTML=c.apply(this,arguments)}if(arguments.length<1)return f(function(){return this.innerHTML});return a.each(typeof c=="function"?h:d)};a.append=function(c){function d(g){return g.appendChild(document.createElement(c))}function h(g){return g.appendChild(document.createElementNS(c.space,
<del>c.local))}c=d3.ns.qualify(c);return b(c.local?h:d)};a.remove=function(){return b(function(c){var d=c.parentNode;d.removeChild(c);return d})};a.sort=function(c){c=ya.apply(this,arguments);for(var d=0,h=a.length;d<h;d++){var g=a[d];g.sort(c);for(var i=1,j=g.length,l=g[0];i<j;i++){var p=g[i];if(p){l&&l.parentNode.insertBefore(p,l.nextSibling);l=p}}}return a};a.on=function(c,d){c="on"+c;return a.each(function(h,g){this[c]=function(i){var j=d3.event;d3.event=i;try{d.call(this,h,g)}finally{d3.event=j}}})};
<del>a.transition=function(){return Q(a)};a.call=ba;return a}function xa(a){return{nodeKey:function(b){return b.getAttribute(a)},dataKey:function(b){return b[a]}}}function ya(a){if(!arguments.length)a=d3.u;return function(b,e){return a(b&&b.__data__,e&&e.__data__)}}d3.transition=P.transition;var za=0,R=0;
<add>this.removeAttributeNS(c.space,c.local):this.setAttributeNS(c.space,c.local,r)}c=d3.ns.qualify(c);if(arguments.length<2)return f(c.local?function(){return this.getAttributeNS(c.space,c.local)}:function(){return this.getAttribute(c)});return a.each(d==n?c.local?g:h:typeof d=="function"?c.local?p:l:c.local?j:i)};a.classed=function(c,d){function h(){var l=this.className;j.lastIndex=0;if(!j.test(l))this.className=E(l+" "+c)}function g(){var l=E(this.className.replace(j," "));this.className=l.length?l:
<add>n}function i(){(d.apply(this,arguments)?h:g).call(this)}var j=RegExp("(^|\\s+)"+d3.requote(c)+"(\\s+|$)","g");if(arguments.length<2)return f(function(){j.lastIndex=0;return j.test(this.className)});return a.each(typeof d=="function"?i:d?h:g)};a.style=function(c,d,h){function g(){this.style.removeProperty(c)}function i(){this.style.setProperty(c,d,h)}function j(){var l=d.apply(this,arguments);l==n?this.style.removeProperty(c):this.style.setProperty(c,l,h)}if(arguments.length<3)h=n;if(arguments.length<
<add>2)return f(function(){return window.getComputedStyle(this,n).getPropertyValue(c)});return a.each(d==n?g:typeof d=="function"?j:i)};a.property=function(c,d){function h(){delete this[c]}function g(){this[c]=d}function i(){var j=d.apply(this,arguments);if(j==n)delete this[c];else this[c]=j}c=d3.ns.qualify(c);if(arguments.length<2)return f(function(){return this[c]});return a.each(d==n?h:typeof d=="function"?i:g)};a.text=function(c){function d(){this.appendChild(document.createTextNode(c))}function h(){var g=
<add>c.apply(this,arguments);g!=n&&this.appendChild(document.createTextNode(g))}if(arguments.length<1)return f(function(){return this.textContent});a.each(function(){for(;this.lastChild;)this.removeChild(this.lastChild)});return c==n?a:a.each(typeof c=="function"?h:d)};a.html=function(c){function d(){this.innerHTML=c}function h(){this.innerHTML=c.apply(this,arguments)}if(arguments.length<1)return f(function(){return this.innerHTML});return a.each(typeof c=="function"?h:d)};a.append=function(c){function d(g){return g.appendChild(document.createElement(c))}
<add>function h(g){return g.appendChild(document.createElementNS(c.space,c.local))}c=d3.ns.qualify(c);return b(c.local?h:d)};a.remove=function(){return b(function(c){var d=c.parentNode;d.removeChild(c);return d})};a.sort=function(c){c=ya.apply(this,arguments);for(var d=0,h=a.length;d<h;d++){var g=a[d];g.sort(c);for(var i=1,j=g.length,l=g[0];i<j;i++){var p=g[i];if(p){l&&l.parentNode.insertBefore(p,l.nextSibling);l=p}}}return a};a.on=function(c,d){c="on"+c;return a.each(function(h,g){this[c]=function(i){var j=
<add>d3.event;d3.event=i;try{d.call(this,h,g)}finally{d3.event=j}}})};a.transition=function(){return Q(a)};a.call=ba;return a}function xa(a){return{nodeKey:function(b){return b.getAttribute(a)},dataKey:function(b){return b[a]}}}function ya(a){if(!arguments.length)a=d3.u;return function(b,e){return a(b&&b.__data__,e&&e.__data__)}}d3.transition=P.transition;var za=0,R=0;
<ide> function Q(a){function b(k){var o=true,q=-1;a.each(function(){if(i[++q]!=2){var m=(k-j[q])/l[q],t=this.__transition__,u,s,v=d[q];if(m<1){o=false;if(m<0)return}else m=1;if(i[q]){if(t.d!=f){i[q]=2;return}}else if(!t||t.d>f){i[q]=2;return}else{i[q]=1;g.start.dispatch.apply(this,arguments);v=d[q]={};t.d=f;for(s in c)v[s]=c[s].apply(this,arguments)}u=r(m);for(s in c)v[s].call(this,u);if(m==1){i[q]=2;if(t.d==f){m=t.o;if(m==f){delete this.__transition__;h&&this.parentNode.removeChild(this)}R=f;g.end.dispatch.apply(this,
<ide> arguments);R=0;t.o=m}}}});return o}var e={},f=R||++za,c={},d=[],h=false,g=d3.dispatch("start","end"),i=[],j=[],l=[],p,r=d3.ease("cubic-in-out");a.each(function(){(this.__transition__||(this.__transition__={})).o=f});e.delay=function(k){var o=Infinity,q=-1;if(typeof k=="function")a.each(function(){var m=j[++q]=+k.apply(this,arguments);if(m<o)o=m});else{o=+k;a.each(function(){j[++q]=o})}Aa(b,o);return e};e.duration=function(k){var o=-1;if(typeof k=="function"){p=0;a.each(function(){var q=l[++o]=+k.apply(this,
<ide> arguments);if(q>p)p=q})}else{p=+k;a.each(function(){l[++o]=p})}return e};e.ease=function(k){r=typeof k=="string"?d3.ease(k):k;return e};e.attrTween=function(k,o){function q(t,u){var s=o.call(this,t,u,this.getAttribute(k));return function(v){this.setAttribute(k,s(v))}}function m(t,u){var s=o.call(this,t,u,this.getAttributeNS(k.space,k.local));return function(v){this.setAttributeNS(k.space,k.local,s(v))}}c["attr."+k]=k.local?m:q;return e};e.attr=function(k,o){return e.attrTween(k,Ba(o))};e.styleTween=
<ide><path>src/core/core.js
<del>d3 = {version: "0.28.0"}; // semver
<add>d3 = {version: "0.28.1"}; // semver
<ide><path>src/core/selection.js
<ide> function d3_selection(groups) {
<ide> // If no value is specified, return the first value.
<ide> if (arguments.length < 2) {
<ide> return first(function() {
<add> re.lastIndex = 0;
<ide> return re.test(this.className);
<ide> });
<ide> }
<ide>
<ide> /** @this {Element} */
<ide> function classedAdd() {
<ide> var classes = this.className;
<add> re.lastIndex = 0;
<ide> if (!re.test(classes)) {
<ide> this.className = d3_collapse(classes + " " + name);
<ide> } | 4 |
Mixed | Ruby | add days_in_year method | 55b463f599dafc719da4a395f77482049b00f2d7 | <ide><path>activesupport/CHANGELOG.md
<add>* Added `Time#days_in_year` to return the number of days in the given year, or the
<add> current year if no argument is provided.
<add>
<add> *Jon Pascoe*
<add>
<ide> * Updated `parameterize` to preserve the case of a string, optionally.
<ide>
<ide> Example:
<ide><path>activesupport/lib/active_support/core_ext/time/calculations.rb
<ide> def days_in_month(month, year = current.year)
<ide> end
<ide> end
<ide>
<add> # Returns the number of days in the given year.
<add> # If no year is specified, it will use the current year.
<add> def days_in_year(year = current.year)
<add> days_in_month(2, year) + 337
<add> end
<add>
<ide> # Returns <tt>Time.zone.now</tt> when <tt>Time.zone</tt> or <tt>config.time_zone</tt> are set, otherwise just returns <tt>Time.now</tt>.
<ide> def current
<ide> ::Time.zone ? ::Time.zone.now : ::Time.now
<ide><path>activesupport/test/core_ext/time_ext_test.rb
<ide> def test_days_in_month_feb_in_leap_year_without_year_arg
<ide> end
<ide> end
<ide>
<add> def test_days_in_year_with_year
<add> assert_equal 365, Time.days_in_year(2005)
<add> assert_equal 366, Time.days_in_year(2004)
<add> assert_equal 366, Time.days_in_year(2000)
<add> assert_equal 365, Time.days_in_year(1900)
<add> end
<add>
<add> def test_days_in_year_in_common_year_without_year_arg
<add> Time.stub(:now, Time.utc(2007)) do
<add> assert_equal 365, Time.days_in_year
<add> end
<add> end
<add>
<add> def test_days_in_year_in_leap_year_without_year_arg
<add> Time.stub(:now, Time.utc(2008)) do
<add> assert_equal 366, Time.days_in_year
<add> end
<add> end
<add>
<ide> def test_last_month_on_31st
<ide> assert_equal Time.local(2004, 2, 29), Time.local(2004, 3, 31).last_month
<ide> end | 3 |
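The `days_in_month(2, year) + 337` trick in the patch above works because every month except February has a fixed length, and those fixed lengths sum to 337 days. A rough Python sketch of the same arithmetic (helper names are illustrative, not part of the Rails patch):

```python
import calendar

def days_in_month(month, year):
    # calendar.monthrange returns (weekday_of_first_day, number_of_days)
    return calendar.monthrange(year, month)[1]

def days_in_year(year):
    # All months except February contribute a fixed 337 days in total,
    # so only February's leap-dependent length needs to be looked up.
    return days_in_month(2, year) + 337
```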
Text | Text | clarify section introducing callbacks | 90888debd00b752a0cdc446dae4067ee2ce7037f | <ide><path>docs/docs/tutorial.md
<ide> We use the `ref` attribute to assign a name to a child component and `this.refs`
<ide>
<ide> When a user submits a comment, we will need to refresh the list of comments to include the new one. It makes sense to do all of this logic in `CommentBox` since `CommentBox` owns the state that represents the list of comments.
<ide>
<del>We need to pass data from the child component to its parent. We do this by passing a `callback` in props from parent to child:
<add>We need to pass data from the child component back up to its parent. We do this in our parent's `render` method by passing a new callback (`handleCommentSubmit`) into the child, binding it to the child's `onCommentSubmit` event. Whenever the event is triggered, the callback will be invoked:
<ide>
<ide> ```javascript{15-17,30}
<ide> // tutorial17.js | 1 |
Text | Text | fix align documentation with the code | b22769194262a700728be9b912cce2f63383bca0 | <ide><path>docs/basic-features/data-fetching.md
<ide> export async function getStaticProps(context) {
<ide> The `context` parameter is an object containing the following keys:
<ide>
<ide> - `params` contains the route parameters for pages using dynamic routes. For example, if the page name is `[id].js` , then `params` will look like `{ id: ... }`. To learn more, take a look at the [Dynamic Routing documentation](/docs/routing/dynamic-routes.md). You should use this together with `getStaticPaths`, which we’ll explain later.
<del>- `preview` is `true` if the page is in the preview mode and `false` otherwise. See the [Preview Mode documentation](/docs/advanced-features/preview-mode.md).
<add>- `preview` is `true` if the page is in the preview mode and `undefined` otherwise. See the [Preview Mode documentation](/docs/advanced-features/preview-mode.md).
<ide> - `previewData` contains the preview data set by `setPreviewData`. See the [Preview Mode documentation](/docs/advanced-features/preview-mode.md).
<ide>
<ide> `getStaticProps` should return an object with: | 1 |
Ruby | Ruby | remove html_types set | 1fe0a1b5ebebb1372968606b85ce08b93bc145c8 | <ide><path>actionpack/lib/action_dispatch/http/mime_type.rb
<del>require 'set'
<ide> require 'singleton'
<ide> require 'active_support/core_ext/module/attribute_accessors'
<ide> require 'active_support/core_ext/string/starts_ends_with'
<ide> def const_defined?(sym, inherit = true)
<ide> # end
<ide> # end
<ide> class Type
<del> @@html_types = Set.new [:html, :all]
<del> cattr_reader :html_types
<del>
<ide> attr_reader :symbol
<ide>
<ide> @register_callbacks = []
<ide> def to_sym
<ide> end
<ide>
<ide> def ref
<del> to_sym || to_s
<add> symbol || to_s
<ide> end
<ide>
<ide> def ===(list)
<ide> def =~(mime_type)
<ide> end
<ide>
<ide> def html?
<del> @@html_types.include?(to_sym) || @string =~ /html/
<add> symbol == :html || @string =~ /html/
<ide> end
<ide>
<ide> def all?; false; end
<ide><path>actionpack/test/dispatch/mime_type_test.rb
<ide> class MimeTypeTest < ActiveSupport::TestCase
<ide> assert mime.respond_to?("#{type}?"), "#{mime.inspect} does not respond to #{type}?"
<ide> assert_equal type, mime.symbol, "#{mime.inspect} is not #{type}?"
<ide> invalid_types = types - [type]
<del> invalid_types.delete(:html) if Mime::Type.html_types.include?(type)
<add> invalid_types.delete(:html)
<ide> invalid_types.each { |other_type|
<ide> assert_not_equal mime.symbol, other_type, "#{mime.inspect} is #{other_type}?"
<ide> } | 2 |
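The refactor above replaces a shared `Set` lookup with a direct symbol comparison plus a substring match on the MIME string. A minimal Python sketch of the simplified predicate (assumed names, for illustration only):

```python
def is_html(symbol, mime_string):
    # Mirrors the simplified Ruby predicate: either the registered symbol
    # is html, or the raw MIME string mentions "html" (e.g. "text/html",
    # "application/xhtml+xml").
    return symbol == "html" or "html" in mime_string
```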
Java | Java | add missing completable marbles (+19, 07/19a) | 382ba69afe6fdaab90682f9d37c32b4f538bb5c0 | <ide><path>src/main/java/io/reactivex/Completable.java
<ide> public final Completable compose(CompletableTransformer transformer) {
<ide>
<ide> /**
<ide> * Concatenates this Completable with another Completable.
<add> * <p>
<add> * <img width="640" height="317" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.concatWith.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code concatWith} does not operate by default on a particular {@link Scheduler}.</dd>
<ide> public final Completable concatWith(CompletableSource other) {
<ide>
<ide> /**
<ide> * Returns a Completable which delays the emission of the completion event by the given time.
<add> * <p>
<add> * <img width="640" height="343" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.delay.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code delay} does operate by default on the {@code computation} {@link Scheduler}.</dd>
<ide> public final Completable delay(long delay, TimeUnit unit) {
<ide> /**
<ide> * Returns a Completable which delays the emission of the completion event by the given time while
<ide> * running on the specified scheduler.
<add> * <p>
<add> * <img width="640" height="313" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.delay.s.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code delay} operates on the {@link Scheduler} you specify.</dd>
<ide> public final Completable delay(long delay, TimeUnit unit, Scheduler scheduler) {
<ide> /**
<ide> * Returns a Completable which delays the emission of the completion event, and optionally the error as well, by the given time while
<ide> * running on the specified scheduler.
<add> * <p>
<add> * <img width="640" height="253" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.delay.sb.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code delay} operates on the {@link Scheduler} you specify.</dd>
<ide> public final Completable delay(final long delay, final TimeUnit unit, final Sche
<ide>
<ide> /**
<ide> * Returns a Completable which calls the given onComplete callback if this Completable completes.
<add> * <p>
<add> * <img width="640" height="304" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.doOnComplete.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code doOnComplete} does not operate by default on a particular {@link Scheduler}.</dd>
<ide> public final Completable doOnComplete(Action onComplete) {
<ide> /**
<ide> * Calls the shared {@code Action} if a CompletableObserver subscribed to the current
<ide> * Completable disposes the common Disposable it received via onSubscribe.
<add> * <p>
<add> * <img width="640" height="589" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.doOnDispose.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code doOnDispose} does not operate by default on a particular {@link Scheduler}.</dd>
<ide> public final Completable doOnDispose(Action onDispose) {
<ide>
<ide> /**
<ide> * Returns a Completable which calls the given onError callback if this Completable emits an error.
<add> * <p>
<add> * <img width="640" height="304" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.doOnError.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code doOnError} does not operate by default on a particular {@link Scheduler}.</dd>
<ide> public final Completable doOnError(Consumer<? super Throwable> onError) {
<ide> /**
<ide> * Returns a Completable which calls the given onEvent callback with the (throwable) for an onError
<ide> * or (null) for an onComplete signal from this Completable before delivering said signal to the downstream.
<add> * <p>
<add> * <img width="640" height="305" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.doOnEvent.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code doOnEvent} does not operate by default on a particular {@link Scheduler}.</dd>
<ide> private Completable doOnLifecycle(
<ide> /**
<ide> * Returns a Completable instance that calls the given onSubscribe callback with the disposable
<ide> * that child subscribers receive on subscription.
<add> * <p>
<add> * <img width="640" height="304" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.doOnSubscribe.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code doOnSubscribe} does not operate by default on a particular {@link Scheduler}.</dd>
<ide> public final Completable doOnSubscribe(Consumer<? super Disposable> onSubscribe)
<ide> /**
<ide> * Returns a Completable instance that calls the given onTerminate callback just before this Completable
<ide> * completes normally or with an exception.
<add> * <p>
<add> * <img width="640" height="304" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.doOnTerminate.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code doOnTerminate} does not operate by default on a particular {@link Scheduler}.</dd>
<ide> public final Completable doOnTerminate(final Action onTerminate) {
<ide> /**
<ide> * Returns a Completable instance that calls the given onTerminate callback after this Completable
<ide> * completes normally or with an exception.
<add> * <p>
<add> * <img width="640" height="304" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.doAfterTerminate.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code doAfterTerminate} does not operate by default on a particular {@link Scheduler}.</dd>
<ide> public final Completable doAfterTerminate(final Action onAfterTerminate) {
<ide> /**
<ide> * Calls the specified action after this Completable signals onError or onComplete or gets disposed by
<ide> * the downstream.
<del> * <p>In case of a race between a terminal event and a dispose call, the provided {@code onFinally} action
<add> * <p>
<add> * <img width="640" height="331" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.doFinally.png" alt="">
<add> * <p>
<add> * In case of a race between a terminal event and a dispose call, the provided {@code onFinally} action
<ide> * is executed once per subscription.
<del> * <p>Note that the {@code onFinally} action is shared between subscriptions and as such
<add> * <p>
<add> * Note that the {@code onFinally} action is shared between subscriptions and as such
<ide> * should be thread-safe.
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> public final Completable lift(final CompletableOperator onLift) {
<ide> /**
<ide> * Returns a Completable which subscribes to this and the other Completable and completes
<ide> * when both of them complete or one emits an error.
<add> * <p>
<add> * <img width="640" height="442" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.mergeWith.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code mergeWith} does not operate by default on a particular {@link Scheduler}.</dd>
<ide> public final Completable mergeWith(CompletableSource other) {
<ide>
<ide> /**
<ide> * Returns a Completable which emits the terminal events from the thread of the specified scheduler.
<add> * <p>
<add> * <img width="640" height="523" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.observeOn.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code observeOn} operates on a {@link Scheduler} you specify.</dd>
<ide> public final Completable observeOn(final Scheduler scheduler) {
<ide> /**
<ide> * Returns a Completable instance that if this Completable emits an error, it will emit an onComplete
<ide> * and swallow the throwable.
<add> * <p>
<add> * <img width="640" height="585" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.onErrorComplete.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code onErrorComplete} does not operate by default on a particular {@link Scheduler}.</dd>
<ide> public final Completable onErrorComplete() {
<ide> /**
<ide> * Returns a Completable instance that if this Completable emits an error and the predicate returns
<ide> * true, it will emit an onComplete and swallow the throwable.
<add> * <p>
<add> * <img width="640" height="283" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.onErrorComplete.f.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code onErrorComplete} does not operate by default on a particular {@link Scheduler}.</dd>
<ide> public final Completable onErrorComplete(final Predicate<? super Throwable> pred
<ide> * Returns a Completable instance that when encounters an error from this Completable, calls the
<ide> * specified mapper function that returns another Completable instance for it and resumes the
<ide> * execution with it.
<add> * <p>
<add> * <img width="640" height="426" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.onErrorResumeNext.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code onErrorResumeNext} does not operate by default on a particular {@link Scheduler}.</dd>
<ide> public final Completable onErrorResumeNext(final Function<? super Throwable, ? e
<ide> /**
<ide> * Nulls out references to the upstream producer and downstream CompletableObserver if
<ide> * the sequence is terminated or downstream calls dispose().
<add> * <p>
<add> * <img width="640" height="326" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.onTerminateDetach.png" alt="">
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code onTerminateDetach} does not operate by default on a particular {@link Scheduler}.</dd>
<ide> public final <T> Flowable<T> startWith(Publisher<T> other) {
<ide>
<ide> /**
<ide> * Hides the identity of this Completable and its Disposable.
<del> * <p>Allows preventing certain identity-based
<del> * optimizations (fusion).
<add> * <p>
<add> * <img width="640" height="432" src="https://raw.github.com/wiki/ReactiveX/RxJava/images/rx-operators/Completable.hide.png" alt="">
<add> * <p>
<add> * Allows preventing certain identity-based optimizations (fusion).
<ide> * <dl>
<ide> * <dt><b>Scheduler:</b></dt>
<ide> * <dd>{@code hide} does not operate by default on a particular {@link Scheduler}.</dd> | 1 |
Text | Text | add command for e2e tests to docs | 68ba32d356326f1ce1dc6fbf99af4ad5fe92e0ea | <ide><path>docs/how-to-setup-freecodecamp-locally.md
<ide> A quick reference to the commands that you will need when working locally.
<ide> | `npm run test:curriculum --block='Basic HTML and HTML5'` | Test a specific Block. |
<ide> | `npm run test:curriculum --superblock='responsive-web-design'` | Test a specific SuperBlock. |
<ide> | `npm run test:server` | Run the server test suite. |
<add>| `npm run e2e` | Run the Cypress end to end tests. |
<ide> | `npm run clean` | Uninstalls all dependencies and cleans up caches. |
<ide>
<ide> ## Making changes locally | 1 |
Javascript | Javascript | ensure dom ready before testing | daed7ad5dd4c86ea88932d2e9ed8f184bb5a603b | <ide><path>client/src/client/frame-runner.js
<ide> async function initTestFrame(e = {}) {
<ide> // eval test string to actual JavaScript
<ide> // This return can be a function
<ide> // i.e. function() { assert(true, 'happy coding'); }
<del> // eslint-disable-next-line no-eval
<del> const test = eval(testString);
<add> const testPromise = new Promise((resolve, reject) =>
<add> // To avoid race conditions, we have to run the test in a final
<add> // document ready:
<add> $(() => {
<add> try {
<add> // eslint-disable-next-line no-eval
<add> const test = eval(testString);
<add> resolve({ test });
<add> } catch (err) {
<add> reject({ err });
<add> }
<add> })
<add> );
<add> const { test, err } = await testPromise;
<add> if (err) throw err;
<add>
<ide> if (typeof test === 'function') {
<ide> await test(e.getUserInput);
<ide> } | 1 |
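The fix above wraps the `eval` of the test string in a promise that only resolves inside a final jQuery document-ready callback, so the evaluated test never races the DOM. The same "turn a ready callback into something awaitable, with errors surfaced to the awaiter" pattern can be sketched in Python with asyncio (names are illustrative):

```python
import asyncio

async def run_when_ready(register_ready, evaluate):
    # Wrap a callback-style "ready" hook in a future so the caller can
    # await it, mirroring the Promise wrapper around jQuery's $(fn).
    loop = asyncio.get_running_loop()
    fut = loop.create_future()

    def on_ready():
        try:
            fut.set_result(evaluate())
        except Exception as err:  # surface evaluation errors to the awaiter
            fut.set_exception(err)

    register_ready(on_ready)
    return await fut
```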
Text | Text | use code markup/markdown in headers | d592efe10acb04ae64fbc5b17cac518abc418b2b | <ide><path>doc/api/querystring.md
<ide> query strings. It can be accessed using:
<ide> const querystring = require('querystring');
<ide> ```
<ide>
<del>## querystring.decode()
<add>## `querystring.decode()`
<ide> <!-- YAML
<ide> added: v0.1.99
<ide> -->
<ide>
<ide> The `querystring.decode()` function is an alias for `querystring.parse()`.
<ide>
<del>## querystring.encode()
<add>## `querystring.encode()`
<ide> <!-- YAML
<ide> added: v0.1.99
<ide> -->
<ide>
<ide> The `querystring.encode()` function is an alias for `querystring.stringify()`.
<ide>
<del>## querystring.escape(str)
<add>## `querystring.escape(str)`
<ide> <!-- YAML
<ide> added: v0.1.25
<ide> -->
<ide> generally not expected to be used directly. It is exported primarily to allow
<ide> application code to provide a replacement percent-encoding implementation if
<ide> necessary by assigning `querystring.escape` to an alternative function.
<ide>
<del>## querystring.parse(str\[, sep\[, eq\[, options\]\]\])
<add>## `querystring.parse(str[, sep[, eq[, options]]])`
<ide> <!-- YAML
<ide> added: v0.1.25
<ide> changes:
<ide> querystring.parse('w=%D6%D0%CE%C4&foo=bar', null, null,
<ide> { decodeURIComponent: gbkDecodeURIComponent });
<ide> ```
<ide>
<del>## querystring.stringify(obj\[, sep\[, eq\[, options\]\]\])
<add>## `querystring.stringify(obj[, sep[, eq[, options]]])`
<ide> <!-- YAML
<ide> added: v0.1.25
<ide> -->
<ide> querystring.stringify({ w: '中文', foo: 'bar' }, null, null,
<ide> { encodeURIComponent: gbkEncodeURIComponent });
<ide> ```
<ide>
<del>## querystring.unescape(str)
<add>## `querystring.unescape(str)`
<ide> <!-- YAML
<ide> added: v0.1.25
<ide> --> | 1 |
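The `querystring.parse`/`querystring.stringify` pair whose headers are reformatted above has a close Python analogue in `urllib.parse`, sketched here for comparison (not part of the Node.js docs):

```python
from urllib.parse import parse_qs, urlencode

# parse_qs is roughly querystring.parse: it maps a query string to a dict,
# collecting repeated keys into lists.
parsed = parse_qs("foo=bar&abc=xyz&abc=123")

# urlencode with doseq=True is roughly querystring.stringify: list values
# are expanded back into repeated keys.
encoded = urlencode({"foo": "bar", "abc": ["xyz", "123"]}, doseq=True)
```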
Javascript | Javascript | check challenges for all langs | 515999070d208306a29ef82a226efa2955452ca6 | <ide><path>curriculum/test/test-challenges.js
<ide> const ChallengeTitles = require('./utils/challengeTitles');
<ide> const { challengeSchemaValidator } = require('../schema/challengeSchema');
<ide> const { challengeTypes } = require('../../client/utils/challengeTypes');
<ide>
<add>const { supportedLangs } = require('../utils');
<add>
<ide> const { LOCALE: lang = 'english' } = process.env;
<ide>
<ide> const oldRunnerFail = Mocha.Runner.prototype.fail;
<ide> Mocha.Runner.prototype.fail = function(test, err) {
<del> if (err.stack && err instanceof AssertionError) {
<del> const assertIndex = err.message.indexOf(': expected');
<add> if (err instanceof AssertionError) {
<add> const errMessage = String(err.message || '');
<add> const assertIndex = errMessage.indexOf(': expected');
<ide> if (assertIndex !== -1) {
<del> err.message = err.message.slice(0, assertIndex);
<add> err.message = errMessage.slice(0, assertIndex);
<ide> }
<ide> // Don't show stacktrace for assertion errors.
<del> delete err.stack;
<add> if (err.stack) {
<add> delete err.stack;
<add> }
<ide> }
<ide> return oldRunnerFail.call(this, test, err);
<ide> };
<ide>
<del>const mongoIds = new MongoIds();
<del>const challengeTitles = new ChallengeTitles();
<del>const validateChallenge = challengeSchemaValidator(lang);
<del>
<ide> const { JSDOM } = jsdom;
<ide>
<ide> const babelOptions = {
<ide> const jQueryScript = fs.readFileSync(
<ide> 'utf8'
<ide> );
<ide>
<del>(async function() {
<add>runTests();
<add>
<add>async function runTests() {
<add> await Promise.all(supportedLangs.map(lang => populateTestsForLang(lang)));
<add>
<add> run();
<add>}
<add>
<add>async function populateTestsForLang(lang) {
<ide> const allChallenges = await getChallengesForLang(lang).then(curriculum =>
<ide> Object.keys(curriculum)
<ide> .map(key => curriculum[key].blocks)
<ide> const jQueryScript = fs.readFileSync(
<ide> }, [])
<ide> );
<ide>
<del> describe('Check challenges tests', async function() {
<add> const mongoIds = new MongoIds();
<add> const challengeTitles = new ChallengeTitles();
<add> const validateChallenge = challengeSchemaValidator(lang);
<add>
<add> describe(`Check challenges (${lang})`, async function() {
<ide> before(async function() {
<ide> this.timeout(30000);
<ide> global.browser = await puppeteer.launch({ args: ['--no-sandbox'] });
<ide> const jQueryScript = fs.readFileSync(
<ide> it('Common checks', function() {
<ide> const result = validateChallenge(challenge);
<ide> if (result.error) {
<del> console.log(result.value);
<del> throw new Error(result.error);
<add> throw new AssertionError(result.error);
<ide> }
<ide> const { id, title } = challenge;
<ide> mongoIds.check(id, title);
<ide> const jQueryScript = fs.readFileSync(
<ide> });
<ide> });
<ide> });
<del>
<del> run();
<del>})();
<add>}
<ide>
<ide> // Fake Deep Equal dependency
<ide> const DeepEqual = (a, b) => JSON.stringify(a) === JSON.stringify(b);
<ide><path>curriculum/utils.js
<ide> exports.dasherize = function dasherize(name) {
<ide> .replace(/\:/g, '');
<ide> };
<ide>
<del>const supportedLangs = ['english', 'spanish'];
<add>const supportedLangs = [
<add> 'arabic',
<add> 'chinese',
<add> 'english',
<add> 'portuguese',
<add> 'russian',
<add> 'spanish'
<add>];
<add>
<ide> exports.supportedLangs = supportedLangs; | 2 |
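The refactor above fans test population out across every supported language with `Promise.all(supportedLangs.map(...))` before calling `run()`. The equivalent fan-out in Python's asyncio, with a hypothetical stand-in for `populateTestsForLang`:

```python
import asyncio

SUPPORTED_LANGS = ["arabic", "chinese", "english", "portuguese", "russian", "spanish"]

async def populate_tests_for_lang(lang):
    # Stand-in for the real per-language loader, which fetches and
    # registers that language's challenge suite.
    await asyncio.sleep(0)
    return f"{lang}: ok"

async def run_tests():
    # Like Promise.all: start every language's loader, then wait for all
    # of them to finish before the suite runs.
    return await asyncio.gather(
        *(populate_tests_for_lang(lang) for lang in SUPPORTED_LANGS)
    )
```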
Java | Java | fix wrong upstream type | 85da0a8f68cae914e85b9e674431fa9531f94f20 | <ide><path>src/main/java/io/reactivex/internal/operators/observable/ObservableSubscribeOn.java
<ide> public final class ObservableSubscribeOn<T> extends AbstractObservableWithUpstream<T, T> {
<ide> final Scheduler scheduler;
<ide>
<del> public ObservableSubscribeOn(Observable<T> source, Scheduler scheduler) {
<add> public ObservableSubscribeOn(ObservableSource<T> source, Scheduler scheduler) {
<ide> super(source);
<ide> this.scheduler = scheduler;
<ide> } | 1 |
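The one-line fix above widens the constructor parameter from the concrete `Observable` to the `ObservableSource` interface the operator actually needs. The same "accept the abstraction, not the implementation" move, sketched in Python (class and names are illustrative):

```python
from typing import Iterable, Iterator

class Doubler:
    # Accepting the abstract Iterable (like ObservableSource) instead of a
    # concrete list (like a specific Observable subclass) lets any
    # conforming source plug in: lists, ranges, generators, ...
    def __init__(self, source: Iterable[int]) -> None:
        self.source = source

    def __iter__(self) -> Iterator[int]:
        for value in self.source:
            yield value * 2
```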
PHP | PHP | update doc block | 631da2d04adbe2838a757a85ded421c0ae88393f | <ide><path>lib/Cake/Controller/Component/CookieComponent.php
<ide> public function startup(Controller $controller) {
<ide> * @param string|array $key Key for the value
<ide> * @param mixed $value Value
<ide> * @param boolean $encrypt Set to true to encrypt value, false otherwise
<del> * @param integer|string $expires Can be either Unix timestamp, or date string
<add> * @param integer|string $expires Can be either the number of seconds until a cookie
<add> * expires, or a strtotime compatible time offset.
<ide> * @return void
<ide> * @link http://book.cakephp.org/2.0/en/core-libraries/components/cookie.html#CookieComponent::write
<ide> */ | 1 |
Javascript | Javascript | fix lint error | acb36abf753e524b03c15558537ef52f53e8f170 | <ide><path>test/parallel/test-tls-net-connect-prefer-path.js
<ide> common.refreshTmpDir();
<ide>
<ide> const tls = require('tls');
<ide> const net = require('net');
<del>const fs = require('fs');
<ide> const assert = require('assert');
<ide>
<ide> function libName(lib) { | 1 |
Javascript | Javascript | distribute crypto tests into separate files | becb4e980e9fde8f2e49f5d0326fed787bcc0c76 | <ide><path>test/parallel/test-crypto-cipher-decipher.js
<add>var common = require('../common');
<add>var assert = require('assert');
<add>
<add>try {
<add> var crypto = require('crypto');
<add>} catch (e) {
<add> console.log('Not compiled with OPENSSL support.');
<add> process.exit();
<add>}
<add>
<add>function testCipher1(key) {
<add> // Test encryption and decryption
<add> var plaintext = 'Keep this a secret? No! Tell everyone about node.js!';
<add> var cipher = crypto.createCipher('aes192', key);
<add>
<add> // encrypt plaintext which is in utf8 format
<add> // to a ciphertext which will be in hex
<add> var ciph = cipher.update(plaintext, 'utf8', 'hex');
<add> // Only use binary or hex, not base64.
<add> ciph += cipher.final('hex');
<add>
<add> var decipher = crypto.createDecipher('aes192', key);
<add> var txt = decipher.update(ciph, 'hex', 'utf8');
<add> txt += decipher.final('utf8');
<add>
<add> assert.equal(txt, plaintext, 'encryption and decryption');
<add>
<add> // streaming cipher interface
<add> // NB: In real life, it's not guaranteed that you can get all of it
<add> // in a single read() like this. But in this case, we know it's
<add> // quite small, so there's no harm.
<add> var cStream = crypto.createCipher('aes192', key);
<add> cStream.end(plaintext);
<add> ciph = cStream.read();
<add>
<add> var dStream = crypto.createDecipher('aes192', key);
<add> dStream.end(ciph);
<add> txt = dStream.read().toString('utf8');
<add>
<add> assert.equal(txt, plaintext, 'encryption and decryption with streams');
<add>}
<add>
<add>
<add>function testCipher2(key) {
<add> // encryption and decryption with Base64
<add> // reported in https://github.com/joyent/node/issues/738
<add> var plaintext =
<add> '32|RmVZZkFUVmpRRkp0TmJaUm56ZU9qcnJkaXNNWVNpTTU*|iXmckfRWZBGWWELw' +
<add> 'eCBsThSsfUHLeRe0KCsK8ooHgxie0zOINpXxfZi/oNG7uq9JWFVCk70gfzQH8ZUJ' +
<add> 'jAfaFg**';
<add> var cipher = crypto.createCipher('aes256', key);
<add>
<add> // encrypt plaintext which is in utf8 format
<add> // to a ciphertext which will be in Base64
<add> var ciph = cipher.update(plaintext, 'utf8', 'base64');
<add> ciph += cipher.final('base64');
<add>
<add> var decipher = crypto.createDecipher('aes256', key);
<add> var txt = decipher.update(ciph, 'base64', 'utf8');
<add> txt += decipher.final('utf8');
<add>
<add> assert.equal(txt, plaintext, 'encryption and decryption with Base64');
<add>}
<add>
<add>
<add>function testCipher3(key, iv) {
<add> // Test encryption and decryption with explicit key and iv
<add> var plaintext =
<add> '32|RmVZZkFUVmpRRkp0TmJaUm56ZU9qcnJkaXNNWVNpTTU*|iXmckfRWZBGWWELw' +
<add> 'eCBsThSsfUHLeRe0KCsK8ooHgxie0zOINpXxfZi/oNG7uq9JWFVCk70gfzQH8ZUJ' +
<add> 'jAfaFg**';
<add> var cipher = crypto.createCipheriv('des-ede3-cbc', key, iv);
<add> var ciph = cipher.update(plaintext, 'utf8', 'hex');
<add> ciph += cipher.final('hex');
<add>
<add> var decipher = crypto.createDecipheriv('des-ede3-cbc', key, iv);
<add> var txt = decipher.update(ciph, 'hex', 'utf8');
<add> txt += decipher.final('utf8');
<add>
<add> assert.equal(txt, plaintext, 'encryption and decryption with key and iv');
<add>
<add> // streaming cipher interface
<add> // NB: In real life, it's not guaranteed that you can get all of it
<add> // in a single read() like this. But in this case, we know it's
<add> // quite small, so there's no harm.
<add> var cStream = crypto.createCipheriv('des-ede3-cbc', key, iv);
<add> cStream.end(plaintext);
<add> ciph = cStream.read();
<add>
<add> var dStream = crypto.createDecipheriv('des-ede3-cbc', key, iv);
<add> dStream.end(ciph);
<add> txt = dStream.read().toString('utf8');
<add>
<add> assert.equal(txt, plaintext, 'streaming cipher iv');
<add>}
<add>
<add>
<add>function testCipher4(key, iv) {
<add> // Test encryption and decryption with explicit key and iv
<add> var plaintext =
<add> '32|RmVZZkFUVmpRRkp0TmJaUm56ZU9qcnJkaXNNWVNpTTU*|iXmckfRWZBGWWELw' +
<add> 'eCBsThSsfUHLeRe0KCsK8ooHgxie0zOINpXxfZi/oNG7uq9JWFVCk70gfzQH8ZUJ' +
<add> 'jAfaFg**';
<add> var cipher = crypto.createCipheriv('des-ede3-cbc', key, iv);
<add> var ciph = cipher.update(plaintext, 'utf8', 'buffer');
<add> ciph = Buffer.concat([ciph, cipher.final('buffer')]);
<add>
<add> var decipher = crypto.createDecipheriv('des-ede3-cbc', key, iv);
<add> var txt = decipher.update(ciph, 'buffer', 'utf8');
<add> txt += decipher.final('utf8');
<add>
<add> assert.equal(txt, plaintext, 'encryption and decryption with key and iv');
<add>}
<add>
<add>
<add>testCipher1('MySecretKey123');
<add>testCipher1(new Buffer('MySecretKey123'));
<add>
<add>testCipher2('0123456789abcdef');
<add>testCipher2(new Buffer('0123456789abcdef'));
<add>
<add>testCipher3('0123456789abcd0123456789', '12345678');
<add>testCipher3('0123456789abcd0123456789', new Buffer('12345678'));
<add>testCipher3(new Buffer('0123456789abcd0123456789'), '12345678');
<add>testCipher3(new Buffer('0123456789abcd0123456789'), new Buffer('12345678'));
<add>
<add>testCipher4(new Buffer('0123456789abcd0123456789'), new Buffer('12345678'));
<add>
<add>
<add>// Base64 padding regression test, see #4837.
<add>(function() {
<add> var c = crypto.createCipher('aes-256-cbc', 'secret');
<add> var s = c.update('test', 'utf8', 'base64') + c.final('base64');
<add> assert.equal(s, '375oxUQCIocvxmC5At+rvA==');
<add>})();
<add>
<add>// Calling Cipher.final() or Decipher.final() twice should error but
<add>// not assert. See #4886.
<add>(function() {
<add> var c = crypto.createCipher('aes-256-cbc', 'secret');
<add> try { c.final('xxx') } catch (e) { /* Ignore. */ }
<add> try { c.final('xxx') } catch (e) { /* Ignore. */ }
<add> try { c.final('xxx') } catch (e) { /* Ignore. */ }
<add> var d = crypto.createDecipher('aes-256-cbc', 'secret');
<add> try { d.final('xxx') } catch (e) { /* Ignore. */ }
<add> try { d.final('xxx') } catch (e) { /* Ignore. */ }
<add> try { d.final('xxx') } catch (e) { /* Ignore. */ }
<add>})();
<add>
<add>// Regression test for #5482: string to Cipher#update() should not assert.
<add>(function() {
<add> var c = crypto.createCipher('aes192', '0123456789abcdef');
<add> c.update('update');
<add> c.final();
<add>})();
<add>
<add>// #5655 regression tests, 'utf-8' and 'utf8' are identical.
<add>(function() {
<add> var c = crypto.createCipher('aes192', '0123456789abcdef');
<add> c.update('update', ''); // Defaults to "utf8".
<add> c.final('utf-8'); // Should not throw.
<add>
<add> c = crypto.createCipher('aes192', '0123456789abcdef');
<add> c.update('update', 'utf8');
<add> c.final('utf-8'); // Should not throw.
<add>
<add> c = crypto.createCipher('aes192', '0123456789abcdef');
<add> c.update('update', 'utf-8');
<add> c.final('utf8'); // Should not throw.
<add>})();
<ide><path>test/parallel/test-crypto-dh.js
<add>var common = require('../common');
<add>var assert = require('assert');
<add>var constants = require('constants');
<add>
<add>try {
<add> var crypto = require('crypto');
<add>} catch (e) {
<add> console.log('Not compiled with OPENSSL support.');
<add> process.exit();
<add>}
<add>
<add>// Test Diffie-Hellman with two parties sharing a secret,
<add>// using various encodings as we go along
<add>var dh1 = crypto.createDiffieHellman(256);
<add>var p1 = dh1.getPrime('buffer');
<add>var dh2 = crypto.createDiffieHellman(p1, 'buffer');
<add>var key1 = dh1.generateKeys();
<add>var key2 = dh2.generateKeys('hex');
<add>var secret1 = dh1.computeSecret(key2, 'hex', 'base64');
<add>var secret2 = dh2.computeSecret(key1, 'binary', 'buffer');
<add>
<add>assert.equal(secret1, secret2.toString('base64'));
<add>assert.equal(dh1.verifyError, 0);
<add>assert.equal(dh2.verifyError, 0);
<add>
<add>assert.throws(function() {
<add> crypto.createDiffieHellman([0x1, 0x2]);
<add>});
<add>
<add>assert.throws(function() {
<add> crypto.createDiffieHellman(function() { });
<add>});
<add>
<add>assert.throws(function() {
<add> crypto.createDiffieHellman(/abc/);
<add>});
<add>
<add>assert.throws(function() {
<add> crypto.createDiffieHellman({});
<add>});
<add>
<add>// Create "another dh1" using generated keys from dh1,
<add>// and compute secret again
<add>var dh3 = crypto.createDiffieHellman(p1, 'buffer');
<add>var privkey1 = dh1.getPrivateKey();
<add>dh3.setPublicKey(key1);
<add>dh3.setPrivateKey(privkey1);
<add>
<add>assert.deepEqual(dh1.getPrime(), dh3.getPrime());
<add>assert.deepEqual(dh1.getGenerator(), dh3.getGenerator());
<add>assert.deepEqual(dh1.getPublicKey(), dh3.getPublicKey());
<add>assert.deepEqual(dh1.getPrivateKey(), dh3.getPrivateKey());
<add>assert.equal(dh3.verifyError, 0);
<add>
<add>var secret3 = dh3.computeSecret(key2, 'hex', 'base64');
<add>
<add>assert.equal(secret1, secret3);
<add>
<add>// Run this one twice to make sure that the dh3 clears its error properly
<add>(function() {
<add> var c = crypto.createDecipher('aes-128-ecb', '');
<add> assert.throws(function() { c.final('utf8') }, /wrong final block length/);
<add>})();
<add>
<add>assert.throws(function() {
<add> dh3.computeSecret('');
<add>}, /key is too small/i);
<add>
<add>(function() {
<add> var c = crypto.createDecipher('aes-128-ecb', '');
<add> assert.throws(function() { c.final('utf8') }, /wrong final block length/);
<add>})();
<add>
<add>// Create a shared secret using a DH group.
<add>var alice = crypto.createDiffieHellmanGroup('modp5');
<add>var bob = crypto.createDiffieHellmanGroup('modp5');
<add>alice.generateKeys();
<add>bob.generateKeys();
<add>var aSecret = alice.computeSecret(bob.getPublicKey()).toString('hex');
<add>var bSecret = bob.computeSecret(alice.getPublicKey()).toString('hex');
<add>assert.equal(aSecret, bSecret);
<add>assert.equal(alice.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<add>assert.equal(bob.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<add>
<add>// Ensure specific generator (buffer) works as expected.
<add>var modp1 = crypto.createDiffieHellmanGroup('modp1');
<add>var modp1buf = new Buffer([
<add> 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xc9, 0x0f,
<add> 0xda, 0xa2, 0x21, 0x68, 0xc2, 0x34, 0xc4, 0xc6, 0x62, 0x8b,
<add> 0x80, 0xdc, 0x1c, 0xd1, 0x29, 0x02, 0x4e, 0x08, 0x8a, 0x67,
<add> 0xcc, 0x74, 0x02, 0x0b, 0xbe, 0xa6, 0x3b, 0x13, 0x9b, 0x22,
<add> 0x51, 0x4a, 0x08, 0x79, 0x8e, 0x34, 0x04, 0xdd, 0xef, 0x95,
<add> 0x19, 0xb3, 0xcd, 0x3a, 0x43, 0x1b, 0x30, 0x2b, 0x0a, 0x6d,
<add> 0xf2, 0x5f, 0x14, 0x37, 0x4f, 0xe1, 0x35, 0x6d, 0x6d, 0x51,
<add> 0xc2, 0x45, 0xe4, 0x85, 0xb5, 0x76, 0x62, 0x5e, 0x7e, 0xc6,
<add> 0xf4, 0x4c, 0x42, 0xe9, 0xa6, 0x3a, 0x36, 0x20, 0xff, 0xff,
<add> 0xff, 0xff, 0xff, 0xff, 0xff, 0xff
<add>]);
<add>var exmodp1 = crypto.createDiffieHellman(modp1buf, new Buffer([2]));
<add>modp1.generateKeys();
<add>exmodp1.generateKeys();
<add>var modp1Secret = modp1.computeSecret(exmodp1.getPublicKey()).toString('hex');
<add>var exmodp1Secret = exmodp1.computeSecret(modp1.getPublicKey()).toString('hex');
<add>assert.equal(modp1Secret, exmodp1Secret);
<add>assert.equal(modp1.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<add>assert.equal(exmodp1.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<add>
<add>
<add>// Ensure specific generator (string with encoding) works as expected.
<add>var exmodp1_2 = crypto.createDiffieHellman(modp1buf, '02', 'hex');
<add>exmodp1_2.generateKeys();
<add>modp1Secret = modp1.computeSecret(exmodp1_2.getPublicKey()).toString('hex');
<add>var exmodp1_2Secret = exmodp1_2.computeSecret(modp1.getPublicKey())
<add> .toString('hex');
<add>assert.equal(modp1Secret, exmodp1_2Secret);
<add>assert.equal(exmodp1_2.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<add>
<add>
<add>// Ensure specific generator (string without encoding) works as expected.
<add>var exmodp1_3 = crypto.createDiffieHellman(modp1buf, '\x02');
<add>exmodp1_3.generateKeys();
<add>modp1Secret = modp1.computeSecret(exmodp1_3.getPublicKey()).toString('hex');
<add>var exmodp1_3Secret = exmodp1_3.computeSecret(modp1.getPublicKey())
<add> .toString('hex');
<add>assert.equal(modp1Secret, exmodp1_3Secret);
<add>assert.equal(exmodp1_3.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<add>
<add>
<add>// Ensure specific generator (numeric) works as expected.
<add>var exmodp1_4 = crypto.createDiffieHellman(modp1buf, 2);
<add>exmodp1_4.generateKeys();
<add>modp1Secret = modp1.computeSecret(exmodp1_4.getPublicKey()).toString('hex');
<add>var exmodp1_4Secret = exmodp1_4.computeSecret(modp1.getPublicKey())
<add> .toString('hex');
<add>assert.equal(modp1Secret, exmodp1_4Secret);
<add>assert.equal(exmodp1_4.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<add>
<add>
<add>var p = 'FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74' +
<add> '020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F1437' +
<add> '4FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7ED' +
<add> 'EE386BFB5A899FA5AE9F24117C4B1FE649286651ECE65381FFFFFFFFFFFFFFFF';
<add>var bad_dh = crypto.createDiffieHellman(p, 'hex');
<add>assert.equal(bad_dh.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<add>
<add>
<add>// Test ECDH
<add>var ecdh1 = crypto.createECDH('prime256v1');
<add>var ecdh2 = crypto.createECDH('prime256v1');
<add>var key1 = ecdh1.generateKeys();
<add>var key2 = ecdh2.generateKeys('hex');
<add>var secret1 = ecdh1.computeSecret(key2, 'hex', 'base64');
<add>var secret2 = ecdh2.computeSecret(key1, 'binary', 'buffer');
<add>
<add>assert.equal(secret1, secret2.toString('base64'));
<add>
<add>// Point formats
<add>assert.equal(ecdh1.getPublicKey('buffer', 'uncompressed')[0], 4);
<add>var firstByte = ecdh1.getPublicKey('buffer', 'compressed')[0];
<add>assert(firstByte === 2 || firstByte === 3);
<add>var firstByte = ecdh1.getPublicKey('buffer', 'hybrid')[0];
<add>assert(firstByte === 6 || firstByte === 7);
<add>
<add>// ECDH should check that point is on curve
<add>var ecdh3 = crypto.createECDH('secp256k1');
<add>var key3 = ecdh3.generateKeys();
<add>
<add>assert.throws(function() {
<add> var secret3 = ecdh2.computeSecret(key3, 'binary', 'buffer');
<add>});
<add>
<add>// ECDH should allow .setPrivateKey()/.setPublicKey()
<add>var ecdh4 = crypto.createECDH('prime256v1');
<add>
<add>ecdh4.setPrivateKey(ecdh1.getPrivateKey());
<add>ecdh4.setPublicKey(ecdh1.getPublicKey());
<add>
<add>assert.throws(function() {
<add> ecdh4.setPublicKey(ecdh3.getPublicKey());
<add>});
<ide><path>test/parallel/test-crypto-hash.js
<add>var common = require('../common');
<add>var assert = require('assert');
<add>var fs = require('fs');
<add>var path = require('path');
<add>
<add>try {
<add> var crypto = require('crypto');
<add>} catch (e) {
<add> console.log('Not compiled with OPENSSL support.');
<add> process.exit();
<add>}
<add>
<add>// Test hashing
<add>var a0 = crypto.createHash('sha1').update('Test123').digest('hex');
<add>var a1 = crypto.createHash('md5').update('Test123').digest('binary');
<add>var a2 = crypto.createHash('sha256').update('Test123').digest('base64');
<add>var a3 = crypto.createHash('sha512').update('Test123').digest(); // binary
<add>var a4 = crypto.createHash('sha1').update('Test123').digest('buffer');
<add>
<add>// stream interface
<add>var a5 = crypto.createHash('sha512');
<add>a5.end('Test123');
<add>a5 = a5.read();
<add>
<add>var a6 = crypto.createHash('sha512');
<add>a6.write('Te');
<add>a6.write('st');
<add>a6.write('123');
<add>a6.end();
<add>a6 = a6.read();
<add>
<add>var a7 = crypto.createHash('sha512');
<add>a7.end();
<add>a7 = a7.read();
<add>
<add>var a8 = crypto.createHash('sha512');
<add>a8.write('');
<add>a8.end();
<add>a8 = a8.read();
<add>
<add>assert.equal(a0, '8308651804facb7b9af8ffc53a33a22d6a1c8ac2', 'Test SHA1');
<add>assert.equal(a1, 'h\u00ea\u00cb\u0097\u00d8o\fF!\u00fa+\u000e\u0017\u00ca' +
<add> '\u00bd\u008c', 'Test MD5 as binary');
<add>assert.equal(a2, '2bX1jws4GYKTlxhloUB09Z66PoJZW+y+hq5R8dnx9l4=',
<add> 'Test SHA256 as base64');
<add>assert.deepEqual(
<add> a3,
<add> new Buffer(
<add> '\u00c1(4\u00f1\u0003\u001fd\u0097!O\'\u00d4C/&Qz\u00d4' +
<add> '\u0094\u0015l\u00b8\u008dQ+\u00db\u001d\u00c4\u00b5}\u00b2' +
<add> '\u00d6\u0092\u00a3\u00df\u00a2i\u00a1\u009b\n\n*\u000f' +
<add> '\u00d7\u00d6\u00a2\u00a8\u0085\u00e3<\u0083\u009c\u0093' +
<add> '\u00c2\u0006\u00da0\u00a1\u00879(G\u00ed\'',
<add> 'binary'),
<add> 'Test SHA512 as assumed buffer');
<add>assert.deepEqual(a4,
<add> new Buffer('8308651804facb7b9af8ffc53a33a22d6a1c8ac2', 'hex'),
<add> 'Test SHA1');
<add>
<add>// stream interface should produce the same result.
<add>assert.deepEqual(a5, a3, 'stream interface is consistent');
<add>assert.deepEqual(a6, a3, 'stream interface is consistent');
<add>assert.notEqual(a7, undefined, 'no data should return data');
<add>assert.notEqual(a8, undefined, 'empty string should generate data');
<add>
<add>// Test multiple updates to same hash
<add>var h1 = crypto.createHash('sha1').update('Test123').digest('hex');
<add>var h2 = crypto.createHash('sha1').update('Test').update('123').digest('hex');
<add>assert.equal(h1, h2, 'multiple updates');
<add>
<add>// Test hashing for binary files
<add>var fn = path.join(common.fixturesDir, 'sample.png');
<add>var sha1Hash = crypto.createHash('sha1');
<add>var fileStream = fs.createReadStream(fn);
<add>fileStream.on('data', function(data) {
<add> sha1Hash.update(data);
<add>});
<add>fileStream.on('close', function() {
<add> assert.equal(sha1Hash.digest('hex'),
<add> '22723e553129a336ad96e10f6aecdf0f45e4149e',
<add> 'Test SHA1 of sample.png');
<add>});
<add>
<add>// Issue #2227: unknown digest method should throw an error.
<add>assert.throws(function() {
<add> crypto.createHash('xyzzy');
<add>});
<ide><path>test/parallel/test-crypto-hmac.js
<add>var common = require('../common');
<add>var assert = require('assert');
<add>
<add>try {
<add> var crypto = require('crypto');
<add>} catch (e) {
<add> console.log('Not compiled with OPENSSL support.');
<add> process.exit();
<add>}
<add>
<add>// Test HMAC
<add>var h1 = crypto.createHmac('sha1', 'Node')
<add> .update('some data')
<add> .update('to hmac')
<add> .digest('hex');
<add>assert.equal(h1, '19fd6e1ba73d9ed2224dd5094a71babe85d9a892', 'test HMAC');
<add>
<add>// Test HMAC (Wikipedia Test Cases)
<add>var wikipedia = [
<add> {
<add> key: 'key', data: 'The quick brown fox jumps over the lazy dog',
<add> hmac: { // HMACs lifted from Wikipedia.
<add> md5: '80070713463e7749b90c2dc24911e275',
<add> sha1: 'de7c9b85b8b78aa6bc8a7a36f70a90701c9db4d9',
<add> sha256:
<add> 'f7bc83f430538424b13298e6aa6fb143ef4d59a14946175997479dbc' +
<add> '2d1a3cd8'
<add> }
<add> },
<add> {
<add> key: 'key', data: '',
<add> hmac: { // Intermediate test to help debugging.
<add> md5: '63530468a04e386459855da0063b6596',
<add> sha1: 'f42bb0eeb018ebbd4597ae7213711ec60760843f',
<add> sha256:
<add> '5d5d139563c95b5967b9bd9a8c9b233a9dedb45072794cd232dc1b74' +
<add> '832607d0'
<add> }
<add> },
<add> {
<add> key: '', data: 'The quick brown fox jumps over the lazy dog',
<add> hmac: { // Intermediate test to help debugging.
<add> md5: 'ad262969c53bc16032f160081c4a07a0',
<add> sha1: '2ba7f707ad5f187c412de3106583c3111d668de8',
<add> sha256:
<add> 'fb011e6154a19b9a4c767373c305275a5a69e8b68b0b4c9200c383dc' +
<add> 'ed19a416'
<add> }
<add> },
<add> {
<add> key: '', data: '',
<add> hmac: { // HMACs lifted from Wikipedia.
<add> md5: '74e6f7298a9c2d168935f58c001bad88',
<add> sha1: 'fbdb1d1b18aa6c08324b7d64b71fb76370690e1d',
<add> sha256:
<add> 'b613679a0814d9ec772f95d778c35fc5ff1697c493715653c6c71214' +
<add> '4292c5ad'
<add> }
<add> },
<add>];
<add>
<add>for (var i = 0, l = wikipedia.length; i < l; i++) {
<add> for (var hash in wikipedia[i]['hmac']) {
<add> var result = crypto.createHmac(hash, wikipedia[i]['key'])
<add> .update(wikipedia[i]['data'])
<add> .digest('hex');
<add> assert.equal(wikipedia[i]['hmac'][hash],
<add> result,
<add> 'Test HMAC-' + hash + ': Test case ' + (i + 1) + ' wikipedia');
<add> }
<add>}
<add>
<add>
<add>// Test HMAC-SHA-* (rfc 4231 Test Cases)
<add>var rfc4231 = [
<add> {
<add> key: new Buffer('0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b', 'hex'),
<add> data: new Buffer('4869205468657265', 'hex'), // 'Hi There'
<add> hmac: {
<add> sha224: '896fb1128abbdf196832107cd49df33f47b4b1169912ba4f53684b22',
<add> sha256:
<add> 'b0344c61d8db38535ca8afceaf0bf12b881dc200c9833da726e9376c' +
<add> '2e32cff7',
<add> sha384:
<add> 'afd03944d84895626b0825f4ab46907f15f9dadbe4101ec682aa034c' +
<add> '7cebc59cfaea9ea9076ede7f4af152e8b2fa9cb6',
<add> sha512:
<add> '87aa7cdea5ef619d4ff0b4241a1d6cb02379f4e2ce4ec2787ad0b305' +
<add> '45e17cdedaa833b7d6b8a702038b274eaea3f4e4be9d914eeb61f170' +
<add> '2e696c203a126854'
<add> }
<add> },
<add> {
<add> key: new Buffer('4a656665', 'hex'), // 'Jefe'
<add> data: new Buffer('7768617420646f2079612077616e7420666f72206e6f74686' +
<add> '96e673f', 'hex'), // 'what do ya want for nothing?'
<add> hmac: {
<add> sha224: 'a30e01098bc6dbbf45690f3a7e9e6d0f8bbea2a39e6148008fd05e44',
<add> sha256:
<add> '5bdcc146bf60754e6a042426089575c75a003f089d2739839dec58b9' +
<add> '64ec3843',
<add> sha384:
<add> 'af45d2e376484031617f78d2b58a6b1b9c7ef464f5a01b47e42ec373' +
<add> '6322445e8e2240ca5e69e2c78b3239ecfab21649',
<add> sha512:
<add> '164b7a7bfcf819e2e395fbe73b56e0a387bd64222e831fd610270cd7' +
<add> 'ea2505549758bf75c05a994a6d034f65f8f0e6fdcaeab1a34d4a6b4b' +
<add> '636e070a38bce737'
<add> }
<add> },
<add> {
<add> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa', 'hex'),
<add> data: new Buffer('ddddddddddddddddddddddddddddddddddddddddddddddddd' +
<add> 'ddddddddddddddddddddddddddddddddddddddddddddddddddd',
<add> 'hex'),
<add> hmac: {
<add> sha224: '7fb3cb3588c6c1f6ffa9694d7d6ad2649365b0c1f65d69d1ec8333ea',
<add> sha256:
<add> '773ea91e36800e46854db8ebd09181a72959098b3ef8c122d9635514' +
<add> 'ced565fe',
<add> sha384:
<add> '88062608d3e6ad8a0aa2ace014c8a86f0aa635d947ac9febe83ef4e5' +
<add> '5966144b2a5ab39dc13814b94e3ab6e101a34f27',
<add> sha512:
<add> 'fa73b0089d56a284efb0f0756c890be9b1b5dbdd8ee81a3655f83e33' +
<add> 'b2279d39bf3e848279a722c806b485a47e67c807b946a337bee89426' +
<add> '74278859e13292fb'
<add> }
<add> },
<add> {
<add> key: new Buffer('0102030405060708090a0b0c0d0e0f10111213141516171819',
<add> 'hex'),
<add> data: new Buffer('cdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdc' +
<add> 'dcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcd',
<add> 'hex'),
<add> hmac: {
<add> sha224: '6c11506874013cac6a2abc1bb382627cec6a90d86efc012de7afec5a',
<add> sha256:
<add> '82558a389a443c0ea4cc819899f2083a85f0faa3e578f8077a2e3ff4' +
<add> '6729665b',
<add> sha384:
<add> '3e8a69b7783c25851933ab6290af6ca77a9981480850009cc5577c6e' +
<add> '1f573b4e6801dd23c4a7d679ccf8a386c674cffb',
<add> sha512:
<add> 'b0ba465637458c6990e5a8c5f61d4af7e576d97ff94b872de76f8050' +
<add> '361ee3dba91ca5c11aa25eb4d679275cc5788063a5f19741120c4f2d' +
<add> 'e2adebeb10a298dd'
<add> }
<add> },
<add>
<add> {
<add> key: new Buffer('0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c', 'hex'),
<add> // 'Test With Truncation'
<add> data: new Buffer('546573742057697468205472756e636174696f6e', 'hex'),
<add> hmac: {
<add> sha224: '0e2aea68a90c8d37c988bcdb9fca6fa8',
<add> sha256: 'a3b6167473100ee06e0c796c2955552b',
<add> sha384: '3abf34c3503b2a23a46efc619baef897',
<add> sha512: '415fad6271580a531d4179bc891d87a6'
<add> },
<add> truncate: true
<add> },
<add> {
<add> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaa', 'hex'),
<add> // 'Test Using Larger Than Block-Size Key - Hash Key First'
<add> data: new Buffer('54657374205573696e67204c6172676572205468616e20426' +
<add> 'c6f636b2d53697a65204b6579202d2048617368204b657920' +
<add> '4669727374', 'hex'),
<add> hmac: {
<add> sha224: '95e9a0db962095adaebe9b2d6f0dbce2d499f112f2d2b7273fa6870e',
<add> sha256:
<add> '60e431591ee0b67f0d8a26aacbf5b77f8e0bc6213728c5140546040f' +
<add> '0ee37f54',
<add> sha384:
<add> '4ece084485813e9088d2c63a041bc5b44f9ef1012a2b588f3cd11f05' +
<add> '033ac4c60c2ef6ab4030fe8296248df163f44952',
<add> sha512:
<add> '80b24263c7c1a3ebb71493c1dd7be8b49b46d1f41b4aeec1121b0137' +
<add> '83f8f3526b56d037e05f2598bd0fd2215d6a1e5295e64f73f63f0aec' +
<add> '8b915a985d786598'
<add> }
<add> },
<add> {
<add> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaa', 'hex'),
<add> // 'This is a test using a larger than block-size key and a larger ' +
<add> // 'than block-size data. The key needs to be hashed before being ' +
<add> // 'used by the HMAC algorithm.'
<add> data: new Buffer('5468697320697320612074657374207573696e672061206c6' +
<add> '172676572207468616e20626c6f636b2d73697a65206b6579' +
<add> '20616e642061206c6172676572207468616e20626c6f636b2' +
<add> 'd73697a6520646174612e20546865206b6579206e65656473' +
<add> '20746f20626520686173686564206265666f7265206265696' +
<add> 'e6720757365642062792074686520484d414320616c676f72' +
<add> '6974686d2e', 'hex'),
<add> hmac: {
<add> sha224: '3a854166ac5d9f023f54d517d0b39dbd946770db9c2b95c9f6f565d1',
<add> sha256:
<add> '9b09ffa71b942fcb27635fbcd5b0e944bfdc63644f0713938a7f5153' +
<add> '5c3a35e2',
<add> sha384:
<add> '6617178e941f020d351e2f254e8fd32c602420feb0b8fb9adccebb82' +
<add> '461e99c5a678cc31e799176d3860e6110c46523e',
<add> sha512:
<add> 'e37b6a775dc87dbaa4dfa9f96e5e3ffddebd71f8867289865df5a32d' +
<add> '20cdc944b6022cac3c4982b10d5eeb55c3e4de15134676fb6de04460' +
<add> '65c97440fa8c6a58'
<add> }
<add> }
<add>];
<add>
<add>for (var i = 0, l = rfc4231.length; i < l; i++) {
<add> for (var hash in rfc4231[i]['hmac']) {
<add> var str = crypto.createHmac(hash, rfc4231[i].key);
<add> str.end(rfc4231[i].data);
<add> var strRes = str.read().toString('hex');
<add> var result = crypto.createHmac(hash, rfc4231[i]['key'])
<add> .update(rfc4231[i]['data'])
<add> .digest('hex');
<add> if (rfc4231[i]['truncate']) {
<add> result = result.substr(0, 32); // first 128 bits == 32 hex chars
<add> strRes = strRes.substr(0, 32);
<add> }
<add> assert.equal(rfc4231[i]['hmac'][hash],
<add> result,
<add> 'Test HMAC-' + hash + ': Test case ' + (i + 1) + ' rfc 4231');
<add> assert.equal(strRes, result, 'Should get same result from stream');
<add> }
<add>}
<add>
<add>// Test HMAC-MD5/SHA1 (rfc 2202 Test Cases)
<add>var rfc2202_md5 = [
<add> {
<add> key: new Buffer('0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b', 'hex'),
<add> data: 'Hi There',
<add> hmac: '9294727a3638bb1c13f48ef8158bfc9d'
<add> },
<add> {
<add> key: 'Jefe',
<add> data: 'what do ya want for nothing?',
<add> hmac: '750c783e6ab0b503eaa86e310a5db738'
<add> },
<add> {
<add> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa', 'hex'),
<add> data: new Buffer('ddddddddddddddddddddddddddddddddddddddddddddddddd' +
<add> 'ddddddddddddddddddddddddddddddddddddddddddddddddddd',
<add> 'hex'),
<add> hmac: '56be34521d144c88dbb8c733f0e8b3f6'
<add> },
<add> {
<add> key: new Buffer('0102030405060708090a0b0c0d0e0f10111213141516171819',
<add> 'hex'),
<add> data: new Buffer('cdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdc' +
<add> 'dcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcd' +
<add> 'cdcdcdcdcd',
<add> 'hex'),
<add> hmac: '697eaf0aca3a3aea3a75164746ffaa79'
<add> },
<add> {
<add> key: new Buffer('0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c', 'hex'),
<add> data: 'Test With Truncation',
<add> hmac: '56461ef2342edc00f9bab995690efd4c'
<add> },
<add> {
<add> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaa',
<add> 'hex'),
<add> data: 'Test Using Larger Than Block-Size Key - Hash Key First',
<add> hmac: '6b1ab7fe4bd7bf8f0b62e6ce61b9d0cd'
<add> },
<add> {
<add> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaa',
<add> 'hex'),
<add> data:
<add> 'Test Using Larger Than Block-Size Key and Larger Than One ' +
<add> 'Block-Size Data',
<add> hmac: '6f630fad67cda0ee1fb1f562db3aa53e'
<add> }
<add>];
<add>var rfc2202_sha1 = [
<add> {
<add> key: new Buffer('0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b', 'hex'),
<add> data: 'Hi There',
<add> hmac: 'b617318655057264e28bc0b6fb378c8ef146be00'
<add> },
<add> {
<add> key: 'Jefe',
<add> data: 'what do ya want for nothing?',
<add> hmac: 'effcdf6ae5eb2fa2d27416d5f184df9c259a7c79'
<add> },
<add> {
<add> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa', 'hex'),
<add> data: new Buffer('ddddddddddddddddddddddddddddddddddddddddddddd' +
<add> 'ddddddddddddddddddddddddddddddddddddddddddddd' +
<add> 'dddddddddd',
<add> 'hex'),
<add> hmac: '125d7342b9ac11cd91a39af48aa17b4f63f175d3'
<add> },
<add> {
<add> key: new Buffer('0102030405060708090a0b0c0d0e0f10111213141516171819',
<add> 'hex'),
<add> data: new Buffer('cdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdc' +
<add> 'dcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcd' +
<add> 'cdcdcdcdcd',
<add> 'hex'),
<add> hmac: '4c9007f4026250c6bc8414f9bf50c86c2d7235da'
<add> },
<add> {
<add> key: new Buffer('0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c', 'hex'),
<add> data: 'Test With Truncation',
<add> hmac: '4c1a03424b55e07fe7f27be1d58bb9324a9a5a04'
<add> },
<add> {
<add> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaa',
<add> 'hex'),
<add> data: 'Test Using Larger Than Block-Size Key - Hash Key First',
<add> hmac: 'aa4ae5e15272d00e95705637ce8a3b55ed402112'
<add> },
<add> {
<add> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<add> 'aaaaaaaaaaaaaaaaaaaaaa',
<add> 'hex'),
<add> data:
<add> 'Test Using Larger Than Block-Size Key and Larger Than One ' +
<add> 'Block-Size Data',
<add> hmac: 'e8e99d0f45237d786d6bbaa7965c7808bbff1a91'
<add> }
<add>];
<add>
<add>for (var i = 0, l = rfc2202_md5.length; i < l; i++) {
<add> assert.equal(rfc2202_md5[i]['hmac'],
<add> crypto.createHmac('md5', rfc2202_md5[i]['key'])
<add> .update(rfc2202_md5[i]['data'])
<add> .digest('hex'),
<add> 'Test HMAC-MD5 : Test case ' + (i + 1) + ' rfc 2202');
<add>}
<add>for (var i = 0, l = rfc2202_sha1.length; i < l; i++) {
<add> assert.equal(rfc2202_sha1[i]['hmac'],
<add> crypto.createHmac('sha1', rfc2202_sha1[i]['key'])
<add> .update(rfc2202_sha1[i]['data'])
<add> .digest('hex'),
<add> 'Test HMAC-SHA1 : Test case ' + (i + 1) + ' rfc 2202');
<add>}
<ide><path>test/parallel/test-crypto-pbkdf2.js
<add>var common = require('../common');
<add>var assert = require('assert');
<add>
<add>try {
<add> var crypto = require('crypto');
<add>} catch (e) {
<add> console.log('Not compiled with OPENSSL support.');
<add> process.exit();
<add>}
<add>
<add>//
<add>// Test PBKDF2 with RFC 6070 test vectors (except #4)
<add>//
<add>function testPBKDF2(password, salt, iterations, keylen, expected) {
<add> var actual = crypto.pbkdf2Sync(password, salt, iterations, keylen);
<add> assert.equal(actual.toString('binary'), expected);
<add>
<add> crypto.pbkdf2(password, salt, iterations, keylen, function(err, actual) {
<add> assert.equal(actual.toString('binary'), expected);
<add> });
<add>}
<add>
<add>
<add>testPBKDF2('password', 'salt', 1, 20,
<add> '\x0c\x60\xc8\x0f\x96\x1f\x0e\x71\xf3\xa9\xb5\x24' +
<add> '\xaf\x60\x12\x06\x2f\xe0\x37\xa6');
<add>
<add>testPBKDF2('password', 'salt', 2, 20,
<add> '\xea\x6c\x01\x4d\xc7\x2d\x6f\x8c\xcd\x1e\xd9\x2a' +
<add> '\xce\x1d\x41\xf0\xd8\xde\x89\x57');
<add>
<add>testPBKDF2('password', 'salt', 4096, 20,
<add> '\x4b\x00\x79\x01\xb7\x65\x48\x9a\xbe\xad\x49\xd9\x26' +
<add> '\xf7\x21\xd0\x65\xa4\x29\xc1');
<add>
<add>testPBKDF2('passwordPASSWORDpassword',
<add> 'saltSALTsaltSALTsaltSALTsaltSALTsalt',
<add> 4096,
<add> 25,
<add> '\x3d\x2e\xec\x4f\xe4\x1c\x84\x9b\x80\xc8\xd8\x36\x62' +
<add> '\xc0\xe4\x4a\x8b\x29\x1a\x96\x4c\xf2\xf0\x70\x38');
<add>
<add>testPBKDF2('pass\0word', 'sa\0lt', 4096, 16,
<add> '\x56\xfa\x6a\xa7\x55\x48\x09\x9d\xcc\x37\xd7\xf0\x34' +
<add> '\x25\xe0\xc3');
<add>
<add>var expected =
<add> '64c486c55d30d4c5a079b8823b7d7cb37ff0556f537da8410233bcec330ed956';
<add>var key = crypto.pbkdf2Sync('password', 'salt', 32, 32, 'sha256');
<add>assert.equal(key.toString('hex'), expected);
<add>
<add>crypto.pbkdf2('password', 'salt', 32, 32, 'sha256', common.mustCall(ondone));
<add>function ondone(err, key) {
<add> if (err) throw err;
<add> assert.equal(key.toString('hex'), expected);
<add>}
<add>
<add>// Error path should not leak memory (check with valgrind).
<add>assert.throws(function() {
<add> crypto.pbkdf2('password', 'salt', 1, 20, null);
<add>});
<ide><path>test/parallel/test-crypto-rsa-dsa.js
<add>var common = require('../common');
<add>var assert = require('assert');
<add>var fs = require('fs');
<add>var constants = require('constants');
<add>
<add>try {
<add> var crypto = require('crypto');
<add>} catch (e) {
<add> console.log('Not compiled with OPENSSL support.');
<add> process.exit();
<add>}
<add>
<add>// Test certificates
<add>var certPem = fs.readFileSync(common.fixturesDir + '/test_cert.pem', 'ascii');
<add>var keyPem = fs.readFileSync(common.fixturesDir + '/test_key.pem', 'ascii');
<add>var rsaPubPem = fs.readFileSync(common.fixturesDir + '/test_rsa_pubkey.pem',
<add> 'ascii');
<add>var rsaKeyPem = fs.readFileSync(common.fixturesDir + '/test_rsa_privkey.pem',
<add> 'ascii');
<add>var rsaKeyPemEncrypted = fs.readFileSync(
<add> common.fixturesDir + '/test_rsa_privkey_encrypted.pem', 'ascii');
<add>var dsaPubPem = fs.readFileSync(common.fixturesDir + '/test_dsa_pubkey.pem',
<add> 'ascii');
<add>var dsaKeyPem = fs.readFileSync(common.fixturesDir + '/test_dsa_privkey.pem',
<add> 'ascii');
<add>var dsaKeyPemEncrypted = fs.readFileSync(
<add> common.fixturesDir + '/test_dsa_privkey_encrypted.pem', 'ascii');
<add>
<add>// Test RSA encryption/decryption
<add>(function() {
<add> var input = 'I AM THE WALRUS';
<add> var bufferToEncrypt = new Buffer(input);
<add>
<add> var encryptedBuffer = crypto.publicEncrypt(rsaPubPem, bufferToEncrypt);
<add>
<add> var decryptedBuffer = crypto.privateDecrypt(rsaKeyPem, encryptedBuffer);
<add> assert.equal(input, decryptedBuffer.toString());
<add>
<add> var decryptedBufferWithPassword = crypto.privateDecrypt({
<add> key: rsaKeyPemEncrypted,
<add> passphrase: 'password'
<add> }, encryptedBuffer);
<add> assert.equal(input, decryptedBufferWithPassword.toString());
<add>
<add> encryptedBuffer = crypto.publicEncrypt({
<add> key: rsaKeyPemEncrypted,
<add> passphrase: 'password'
<add> }, bufferToEncrypt);
<add>
<add> decryptedBufferWithPassword = crypto.privateDecrypt({
<add> key: rsaKeyPemEncrypted,
<add> passphrase: 'password'
<add> }, encryptedBuffer);
<add> assert.equal(input, decryptedBufferWithPassword.toString());
<add>
<add> encryptedBuffer = crypto.privateEncrypt({
<add> key: rsaKeyPemEncrypted,
<add> passphrase: new Buffer('password')
<add> }, bufferToEncrypt);
<add>
<add> decryptedBufferWithPassword = crypto.publicDecrypt({
<add> key: rsaKeyPemEncrypted,
<add> passphrase: new Buffer('password')
<add> }, encryptedBuffer);
<add> assert.equal(input, decryptedBufferWithPassword.toString());
<add>
<add> encryptedBuffer = crypto.publicEncrypt(certPem, bufferToEncrypt);
<add>
<add> decryptedBuffer = crypto.privateDecrypt(keyPem, encryptedBuffer);
<add> assert.equal(input, decryptedBuffer.toString());
<add>
<add> encryptedBuffer = crypto.publicEncrypt(keyPem, bufferToEncrypt);
<add>
<add> decryptedBuffer = crypto.privateDecrypt(keyPem, encryptedBuffer);
<add> assert.equal(input, decryptedBuffer.toString());
<add>
<add> encryptedBuffer = crypto.privateEncrypt(keyPem, bufferToEncrypt);
<add>
<add> decryptedBuffer = crypto.publicDecrypt(keyPem, encryptedBuffer);
<add> assert.equal(input, decryptedBuffer.toString());
<add>
<add> assert.throws(function() {
<add> crypto.privateDecrypt({
<add> key: rsaKeyPemEncrypted,
<add> passphrase: 'wrong'
<add> }, bufferToEncrypt);
<add> });
<add>
<add> assert.throws(function() {
<add> crypto.publicEncrypt({
<add> key: rsaKeyPemEncrypted,
<add> passphrase: 'wrong'
<add> }, encryptedBuffer);
<add> });
<add>
<add> encryptedBuffer = crypto.privateEncrypt({
<add> key: rsaKeyPemEncrypted,
<add> passphrase: new Buffer('password')
<add> }, bufferToEncrypt);
<add>
<add> assert.throws(function() {
<add> crypto.publicDecrypt({
<add> key: rsaKeyPemEncrypted,
<add> passphrase: [].concat.apply([], new Buffer('password'))
<add> }, encryptedBuffer);
<add> });
<add>})();
<add>
<add>function test_rsa(padding) {
<add> var input = new Buffer(padding === 'RSA_NO_PADDING' ? 1024 / 8 : 32);
<add> for (var i = 0; i < input.length; i++)
<add> input[i] = (i * 7 + 11) & 0xff;
<add> var bufferToEncrypt = new Buffer(input);
<add>
<add> padding = constants[padding];
<add>
<add> var encryptedBuffer = crypto.publicEncrypt({
<add> key: rsaPubPem,
<add> padding: padding
<add> }, bufferToEncrypt);
<add>
<add> var decryptedBuffer = crypto.privateDecrypt({
<add> key: rsaKeyPem,
<add> padding: padding
<add> }, encryptedBuffer);
<add> assert.equal(input, decryptedBuffer.toString());
<add>}
<add>
<add>test_rsa('RSA_NO_PADDING');
<add>test_rsa('RSA_PKCS1_PADDING');
<add>test_rsa('RSA_PKCS1_OAEP_PADDING');
<add>
<add>// Test RSA key signing/verification
<add>var rsaSign = crypto.createSign('RSA-SHA1');
<add>var rsaVerify = crypto.createVerify('RSA-SHA1');
<add>assert.ok(rsaSign);
<add>assert.ok(rsaVerify);
<add>
<add>rsaSign.update(rsaPubPem);
<add>var rsaSignature = rsaSign.sign(rsaKeyPem, 'hex');
<add>assert.equal(rsaSignature,
<add> '5c50e3145c4e2497aadb0eabc83b342d0b0021ece0d4c4a064b7c' +
<add> '8f020d7e2688b122bfb54c724ac9ee169f83f66d2fe90abeb95e8' +
<add> 'e1290e7e177152a4de3d944cf7d4883114a20ed0f78e70e25ef0f' +
<add> '60f06b858e6af42a2f276ede95bbc6bc9a9bbdda15bd663186a6f' +
<add> '40819a7af19e577bb2efa5e579a1f5ce8a0d4ca8b8f6');
<add>
<add>rsaVerify.update(rsaPubPem);
<add>assert.strictEqual(rsaVerify.verify(rsaPubPem, rsaSignature, 'hex'), true);
<add>
<add>// Test RSA key signing/verification with encrypted key
<add>rsaSign = crypto.createSign('RSA-SHA1');
<add>rsaSign.update(rsaPubPem);
<add>assert.doesNotThrow(function() {
<add> var signOptions = { key: rsaKeyPemEncrypted, passphrase: 'password' };
<add> rsaSignature = rsaSign.sign(signOptions, 'hex');
<add>});
<add>assert.equal(rsaSignature,
<add> '5c50e3145c4e2497aadb0eabc83b342d0b0021ece0d4c4a064b7c' +
<add> '8f020d7e2688b122bfb54c724ac9ee169f83f66d2fe90abeb95e8' +
<add> 'e1290e7e177152a4de3d944cf7d4883114a20ed0f78e70e25ef0f' +
<add> '60f06b858e6af42a2f276ede95bbc6bc9a9bbdda15bd663186a6f' +
<add> '40819a7af19e577bb2efa5e579a1f5ce8a0d4ca8b8f6');
<add>
<add>rsaVerify = crypto.createVerify('RSA-SHA1');
<add>rsaVerify.update(rsaPubPem);
<add>assert.strictEqual(rsaVerify.verify(rsaPubPem, rsaSignature, 'hex'), true);
<add>
<add>rsaSign = crypto.createSign('RSA-SHA1');
<add>rsaSign.update(rsaPubPem);
<add>assert.throws(function() {
<add> var signOptions = { key: rsaKeyPemEncrypted, passphrase: 'wrong' };
<add> rsaSign.sign(signOptions, 'hex');
<add>});
<add>
<add>//
<add>// Test RSA signing and verification
<add>//
<add>(function() {
<add> var privateKey = fs.readFileSync(
<add> common.fixturesDir + '/test_rsa_privkey_2.pem');
<add>
<add> var publicKey = fs.readFileSync(
<add> common.fixturesDir + '/test_rsa_pubkey_2.pem');
<add>
<add> var input = 'I AM THE WALRUS';
<add>
<add> var signature =
<add> '79d59d34f56d0e94aa6a3e306882b52ed4191f07521f25f505a078dc2f89' +
<add> '396e0c8ac89e996fde5717f4cb89199d8fec249961fcb07b74cd3d2a4ffa' +
<add> '235417b69618e4bcd76b97e29975b7ce862299410e1b522a328e44ac9bb2' +
<add> '8195e0268da7eda23d9825ac43c724e86ceeee0d0d4465678652ccaf6501' +
<add> '0ddfb299bedeb1ad';
<add>
<add> var sign = crypto.createSign('RSA-SHA256');
<add> sign.update(input);
<add>
<add> var output = sign.sign(privateKey, 'hex');
<add> assert.equal(output, signature);
<add>
<add> var verify = crypto.createVerify('RSA-SHA256');
<add> verify.update(input);
<add>
<add> assert.strictEqual(verify.verify(publicKey, signature, 'hex'), true);
<add>})();
<add>
<add>
<add>//
<add>// Test DSA signing and verification
<add>//
<add>(function() {
<add> var input = 'I AM THE WALRUS';
<add>
<add> // DSA signatures vary across runs so there is no static string to verify
<add> // against
<add> var sign = crypto.createSign('DSS1');
<add> sign.update(input);
<add> var signature = sign.sign(dsaKeyPem, 'hex');
<add>
<add> var verify = crypto.createVerify('DSS1');
<add> verify.update(input);
<add>
<add> assert.strictEqual(verify.verify(dsaPubPem, signature, 'hex'), true);
<add>})();
<add>
<add>
<add>//
<add>// Test DSA signing and verification with encrypted key
<add>//
<add>(function() {
<add> var input = 'I AM THE WALRUS';
<add>
<add> var sign = crypto.createSign('DSS1');
<add> sign.update(input);
<add> assert.throws(function() {
<add> sign.sign({ key: dsaKeyPemEncrypted, passphrase: 'wrong' }, 'hex');
<add> });
<add>
<add> // DSA signatures vary across runs so there is no static string to verify
<add> // against
<add> sign = crypto.createSign('DSS1');
<add> sign.update(input);
<add>
<add> var signature;
<add> assert.doesNotThrow(function() {
<add> var signOptions = { key: dsaKeyPemEncrypted, passphrase: 'password' };
<add> signature = sign.sign(signOptions, 'hex');
<add> });
<add>
<add> var verify = crypto.createVerify('DSS1');
<add> verify.update(input);
<add>
<add> assert.strictEqual(verify.verify(dsaPubPem, signature, 'hex'), true);
<add>})();
<ide><path>test/parallel/test-crypto-sign-verify.js
<add>var common = require('../common');
<add>var assert = require('assert');
<add>var fs = require('fs');
<add>
<add>try {
<add> var crypto = require('crypto');
<add>} catch (e) {
<add> console.log('Not compiled with OPENSSL support.');
<add> process.exit();
<add>}
<add>
<add>// Test certificates
<add>var certPem = fs.readFileSync(common.fixturesDir + '/test_cert.pem', 'ascii');
<add>var keyPem = fs.readFileSync(common.fixturesDir + '/test_key.pem', 'ascii');
<add>
<add>// Test signing and verifying
<add>var s1 = crypto.createSign('RSA-SHA1')
<add> .update('Test123')
<add> .sign(keyPem, 'base64');
<add>var s1stream = crypto.createSign('RSA-SHA1');
<add>s1stream.end('Test123');
<add>s1stream = s1stream.sign(keyPem, 'base64');
<add>assert.equal(s1, s1stream, 'Stream produces same output');
<add>
<add>var verified = crypto.createVerify('RSA-SHA1')
<add> .update('Test')
<add> .update('123')
<add> .verify(certPem, s1, 'base64');
<add>assert.strictEqual(verified, true, 'sign and verify (base 64)');
<add>
<add>var s2 = crypto.createSign('RSA-SHA256')
<add> .update('Test123')
<add> .sign(keyPem, 'binary');
<add>var s2stream = crypto.createSign('RSA-SHA256');
<add>s2stream.end('Test123');
<add>s2stream = s2stream.sign(keyPem, 'binary');
<add>assert.equal(s2, s2stream, 'Stream produces same output');
<add>
<add>var verified = crypto.createVerify('RSA-SHA256')
<add> .update('Test')
<add> .update('123')
<add> .verify(certPem, s2, 'binary');
<add>assert.strictEqual(verified, true, 'sign and verify (binary)');
<add>
<add>var verStream = crypto.createVerify('RSA-SHA256');
<add>verStream.write('Tes');
<add>verStream.write('t12');
<add>verStream.end('3');
<add>verified = verStream.verify(certPem, s2, 'binary');
<add>assert.strictEqual(verified, true, 'sign and verify (stream)');
<add>
<add>var s3 = crypto.createSign('RSA-SHA1')
<add> .update('Test123')
<add> .sign(keyPem, 'buffer');
<add>var verified = crypto.createVerify('RSA-SHA1')
<add> .update('Test')
<add> .update('123')
<add> .verify(certPem, s3);
<add>assert.strictEqual(verified, true, 'sign and verify (buffer)');
<add>
<add>var verStream = crypto.createVerify('RSA-SHA1');
<add>verStream.write('Tes');
<add>verStream.write('t12');
<add>verStream.end('3');
<add>verified = verStream.verify(certPem, s3);
<add>assert.strictEqual(verified, true, 'sign and verify (stream)');
<ide><path>test/parallel/test-crypto.js
<ide> try {
<ide> crypto.DEFAULT_ENCODING = 'buffer';
<ide>
<ide> var fs = require('fs');
<del>var path = require('path');
<del>var constants = require('constants');
<ide>
<ide> // Test Certificates
<ide> var caPem = fs.readFileSync(common.fixturesDir + '/test_ca.pem', 'ascii');
<ide> var certPem = fs.readFileSync(common.fixturesDir + '/test_cert.pem', 'ascii');
<ide> var certPfx = fs.readFileSync(common.fixturesDir + '/test_cert.pfx');
<ide> var keyPem = fs.readFileSync(common.fixturesDir + '/test_key.pem', 'ascii');
<del>var rsaPubPem = fs.readFileSync(common.fixturesDir + '/test_rsa_pubkey.pem',
<del> 'ascii');
<del>var rsaKeyPem = fs.readFileSync(common.fixturesDir + '/test_rsa_privkey.pem',
<del> 'ascii');
<del>var rsaKeyPemEncrypted = fs.readFileSync(
<del> common.fixturesDir + '/test_rsa_privkey_encrypted.pem', 'ascii');
<del>var dsaPubPem = fs.readFileSync(common.fixturesDir + '/test_dsa_pubkey.pem',
<del> 'ascii');
<del>var dsaKeyPem = fs.readFileSync(common.fixturesDir + '/test_dsa_privkey.pem',
<del> 'ascii');
<del>var dsaKeyPemEncrypted = fs.readFileSync(
<del> common.fixturesDir + '/test_dsa_privkey_encrypted.pem', 'ascii');
<ide>
<ide>
<ide> // TODO(indutny): move to a separate test eventually
<ide> assert.throws(function() {
<ide> tls.createSecureContext({pfx:'sample', passphrase:'test'});
<ide> }, 'not enough data');
<ide>
<del>// Test HMAC
<del>var h1 = crypto.createHmac('sha1', 'Node')
<del> .update('some data')
<del> .update('to hmac')
<del> .digest('hex');
<del>assert.equal(h1, '19fd6e1ba73d9ed2224dd5094a71babe85d9a892', 'test HMAC');
<del>
<del>// Test HMAC (Wikipedia Test Cases)
<del>var wikipedia = [
<del> {
<del> key: 'key', data: 'The quick brown fox jumps over the lazy dog',
<del> hmac: { // HMACs lifted from Wikipedia.
<del> md5: '80070713463e7749b90c2dc24911e275',
<del> sha1: 'de7c9b85b8b78aa6bc8a7a36f70a90701c9db4d9',
<del> sha256:
<del> 'f7bc83f430538424b13298e6aa6fb143ef4d59a14946175997479dbc' +
<del> '2d1a3cd8'
<del> }
<del> },
<del> {
<del> key: 'key', data: '',
<del> hmac: { // Intermediate test to help debugging.
<del> md5: '63530468a04e386459855da0063b6596',
<del> sha1: 'f42bb0eeb018ebbd4597ae7213711ec60760843f',
<del> sha256:
<del> '5d5d139563c95b5967b9bd9a8c9b233a9dedb45072794cd232dc1b74' +
<del> '832607d0'
<del> }
<del> },
<del> {
<del> key: '', data: 'The quick brown fox jumps over the lazy dog',
<del> hmac: { // Intermediate test to help debugging.
<del> md5: 'ad262969c53bc16032f160081c4a07a0',
<del> sha1: '2ba7f707ad5f187c412de3106583c3111d668de8',
<del> sha256:
<del> 'fb011e6154a19b9a4c767373c305275a5a69e8b68b0b4c9200c383dc' +
<del> 'ed19a416'
<del> }
<del> },
<del> {
<del> key: '', data: '',
<del> hmac: { // HMACs lifted from Wikipedia.
<del> md5: '74e6f7298a9c2d168935f58c001bad88',
<del> sha1: 'fbdb1d1b18aa6c08324b7d64b71fb76370690e1d',
<del> sha256:
<del> 'b613679a0814d9ec772f95d778c35fc5ff1697c493715653c6c71214' +
<del> '4292c5ad'
<del> }
<del> },
<del>]
<del>
<del>for (var i = 0, l = wikipedia.length; i < l; i++) {
<del> for (var hash in wikipedia[i]['hmac']) {
<del> var result = crypto.createHmac(hash, wikipedia[i]['key'])
<del> .update(wikipedia[i]['data'])
<del> .digest('hex');
<del> assert.equal(wikipedia[i]['hmac'][hash],
<del> result,
<del> 'Test HMAC-' + hash + ': Test case ' + (i + 1) + ' wikipedia');
<del> }
<del>}
<del>
<del>
<del>// Test HMAC-SHA-* (rfc 4231 Test Cases)
<del>var rfc4231 = [
<del> {
<del> key: new Buffer('0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b', 'hex'),
<del> data: new Buffer('4869205468657265', 'hex'), // 'Hi There'
<del> hmac: {
<del> sha224: '896fb1128abbdf196832107cd49df33f47b4b1169912ba4f53684b22',
<del> sha256:
<del> 'b0344c61d8db38535ca8afceaf0bf12b881dc200c9833da726e9376c' +
<del> '2e32cff7',
<del> sha384:
<del> 'afd03944d84895626b0825f4ab46907f15f9dadbe4101ec682aa034c' +
<del> '7cebc59cfaea9ea9076ede7f4af152e8b2fa9cb6',
<del> sha512:
<del> '87aa7cdea5ef619d4ff0b4241a1d6cb02379f4e2ce4ec2787ad0b305' +
<del> '45e17cdedaa833b7d6b8a702038b274eaea3f4e4be9d914eeb61f170' +
<del> '2e696c203a126854'
<del> }
<del> },
<del> {
<del> key: new Buffer('4a656665', 'hex'), // 'Jefe'
<del> data: new Buffer('7768617420646f2079612077616e7420666f72206e6f74686' +
<del> '96e673f', 'hex'), // 'what do ya want for nothing?'
<del> hmac: {
<del> sha224: 'a30e01098bc6dbbf45690f3a7e9e6d0f8bbea2a39e6148008fd05e44',
<del> sha256:
<del> '5bdcc146bf60754e6a042426089575c75a003f089d2739839dec58b9' +
<del> '64ec3843',
<del> sha384:
<del> 'af45d2e376484031617f78d2b58a6b1b9c7ef464f5a01b47e42ec373' +
<del> '6322445e8e2240ca5e69e2c78b3239ecfab21649',
<del> sha512:
<del> '164b7a7bfcf819e2e395fbe73b56e0a387bd64222e831fd610270cd7' +
<del> 'ea2505549758bf75c05a994a6d034f65f8f0e6fdcaeab1a34d4a6b4b' +
<del> '636e070a38bce737'
<del> }
<del> },
<del> {
<del> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa', 'hex'),
<del> data: new Buffer('ddddddddddddddddddddddddddddddddddddddddddddddddd' +
<del> 'ddddddddddddddddddddddddddddddddddddddddddddddddddd',
<del> 'hex'),
<del> hmac: {
<del> sha224: '7fb3cb3588c6c1f6ffa9694d7d6ad2649365b0c1f65d69d1ec8333ea',
<del> sha256:
<del> '773ea91e36800e46854db8ebd09181a72959098b3ef8c122d9635514' +
<del> 'ced565fe',
<del> sha384:
<del> '88062608d3e6ad8a0aa2ace014c8a86f0aa635d947ac9febe83ef4e5' +
<del> '5966144b2a5ab39dc13814b94e3ab6e101a34f27',
<del> sha512:
<del> 'fa73b0089d56a284efb0f0756c890be9b1b5dbdd8ee81a3655f83e33' +
<del> 'b2279d39bf3e848279a722c806b485a47e67c807b946a337bee89426' +
<del> '74278859e13292fb'
<del> }
<del> },
<del> {
<del> key: new Buffer('0102030405060708090a0b0c0d0e0f10111213141516171819',
<del> 'hex'),
<del> data: new Buffer('cdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdc' +
<del> 'dcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcd',
<del> 'hex'),
<del> hmac: {
<del> sha224: '6c11506874013cac6a2abc1bb382627cec6a90d86efc012de7afec5a',
<del> sha256:
<del> '82558a389a443c0ea4cc819899f2083a85f0faa3e578f8077a2e3ff4' +
<del> '6729665b',
<del> sha384:
<del> '3e8a69b7783c25851933ab6290af6ca77a9981480850009cc5577c6e' +
<del> '1f573b4e6801dd23c4a7d679ccf8a386c674cffb',
<del> sha512:
<del> 'b0ba465637458c6990e5a8c5f61d4af7e576d97ff94b872de76f8050' +
<del> '361ee3dba91ca5c11aa25eb4d679275cc5788063a5f19741120c4f2d' +
<del> 'e2adebeb10a298dd'
<del> }
<del> },
<del>
<del> {
<del> key: new Buffer('0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c', 'hex'),
<del> // 'Test With Truncation'
<del> data: new Buffer('546573742057697468205472756e636174696f6e', 'hex'),
<del> hmac: {
<del> sha224: '0e2aea68a90c8d37c988bcdb9fca6fa8',
<del> sha256: 'a3b6167473100ee06e0c796c2955552b',
<del> sha384: '3abf34c3503b2a23a46efc619baef897',
<del> sha512: '415fad6271580a531d4179bc891d87a6'
<del> },
<del> truncate: true
<del> },
<del> {
<del> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaa', 'hex'),
<del> // 'Test Using Larger Than Block-Size Key - Hash Key First'
<del> data: new Buffer('54657374205573696e67204c6172676572205468616e20426' +
<del> 'c6f636b2d53697a65204b6579202d2048617368204b657920' +
<del> '4669727374', 'hex'),
<del> hmac: {
<del> sha224: '95e9a0db962095adaebe9b2d6f0dbce2d499f112f2d2b7273fa6870e',
<del> sha256:
<del> '60e431591ee0b67f0d8a26aacbf5b77f8e0bc6213728c5140546040f' +
<del> '0ee37f54',
<del> sha384:
<del> '4ece084485813e9088d2c63a041bc5b44f9ef1012a2b588f3cd11f05' +
<del> '033ac4c60c2ef6ab4030fe8296248df163f44952',
<del> sha512:
<del> '80b24263c7c1a3ebb71493c1dd7be8b49b46d1f41b4aeec1121b0137' +
<del> '83f8f3526b56d037e05f2598bd0fd2215d6a1e5295e64f73f63f0aec' +
<del> '8b915a985d786598'
<del> }
<del> },
<del> {
<del> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaa', 'hex'),
<del> // 'This is a test using a larger than block-size key and a larger ' +
<del> // 'than block-size data. The key needs to be hashed before being ' +
<del> // 'used by the HMAC algorithm.'
<del> data: new Buffer('5468697320697320612074657374207573696e672061206c6' +
<del> '172676572207468616e20626c6f636b2d73697a65206b6579' +
<del> '20616e642061206c6172676572207468616e20626c6f636b2' +
<del> 'd73697a6520646174612e20546865206b6579206e65656473' +
<del> '20746f20626520686173686564206265666f7265206265696' +
<del> 'e6720757365642062792074686520484d414320616c676f72' +
<del> '6974686d2e', 'hex'),
<del> hmac: {
<del> sha224: '3a854166ac5d9f023f54d517d0b39dbd946770db9c2b95c9f6f565d1',
<del> sha256:
<del> '9b09ffa71b942fcb27635fbcd5b0e944bfdc63644f0713938a7f5153' +
<del> '5c3a35e2',
<del> sha384:
<del> '6617178e941f020d351e2f254e8fd32c602420feb0b8fb9adccebb82' +
<del> '461e99c5a678cc31e799176d3860e6110c46523e',
<del> sha512:
<del> 'e37b6a775dc87dbaa4dfa9f96e5e3ffddebd71f8867289865df5a32d' +
<del> '20cdc944b6022cac3c4982b10d5eeb55c3e4de15134676fb6de04460' +
<del> '65c97440fa8c6a58'
<del> }
<del> }
<del>];
<del>
<del>for (var i = 0, l = rfc4231.length; i < l; i++) {
<del> for (var hash in rfc4231[i]['hmac']) {
<del> var str = crypto.createHmac(hash, rfc4231[i].key);
<del> str.end(rfc4231[i].data);
<del> var strRes = str.read().toString('hex');
<del> var result = crypto.createHmac(hash, rfc4231[i]['key'])
<del> .update(rfc4231[i]['data'])
<del> .digest('hex');
<del> if (rfc4231[i]['truncate']) {
<del> result = result.substr(0, 32); // first 128 bits == 32 hex chars
<del> strRes = strRes.substr(0, 32);
<del> }
<del> assert.equal(rfc4231[i]['hmac'][hash],
<del> result,
<del> 'Test HMAC-' + hash + ': Test case ' + (i + 1) + ' rfc 4231');
<del> assert.equal(strRes, result, 'Should get same result from stream');
<del> }
<del>}
<del>
<del>// Test HMAC-MD5/SHA1 (rfc 2202 Test Cases)
<del>var rfc2202_md5 = [
<del> {
<del> key: new Buffer('0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b', 'hex'),
<del> data: 'Hi There',
<del> hmac: '9294727a3638bb1c13f48ef8158bfc9d'
<del> },
<del> {
<del> key: 'Jefe',
<del> data: 'what do ya want for nothing?',
<del> hmac: '750c783e6ab0b503eaa86e310a5db738'
<del> },
<del> {
<del> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa', 'hex'),
<del> data: new Buffer('ddddddddddddddddddddddddddddddddddddddddddddddddd' +
<del> 'ddddddddddddddddddddddddddddddddddddddddddddddddddd',
<del> 'hex'),
<del> hmac: '56be34521d144c88dbb8c733f0e8b3f6'
<del> },
<del> {
<del> key: new Buffer('0102030405060708090a0b0c0d0e0f10111213141516171819',
<del> 'hex'),
<del> data: new Buffer('cdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdc' +
<del> 'dcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcd' +
<del> 'cdcdcdcdcd',
<del> 'hex'),
<del> hmac: '697eaf0aca3a3aea3a75164746ffaa79'
<del> },
<del> {
<del> key: new Buffer('0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c', 'hex'),
<del> data: 'Test With Truncation',
<del> hmac: '56461ef2342edc00f9bab995690efd4c'
<del> },
<del> {
<del> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaa',
<del> 'hex'),
<del> data: 'Test Using Larger Than Block-Size Key - Hash Key First',
<del> hmac: '6b1ab7fe4bd7bf8f0b62e6ce61b9d0cd'
<del> },
<del> {
<del> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaa',
<del> 'hex'),
<del> data:
<del> 'Test Using Larger Than Block-Size Key and Larger Than One ' +
<del> 'Block-Size Data',
<del> hmac: '6f630fad67cda0ee1fb1f562db3aa53e'
<del> }
<del>];
<del>var rfc2202_sha1 = [
<del> {
<del> key: new Buffer('0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b0b', 'hex'),
<del> data: 'Hi There',
<del> hmac: 'b617318655057264e28bc0b6fb378c8ef146be00'
<del> },
<del> {
<del> key: 'Jefe',
<del> data: 'what do ya want for nothing?',
<del> hmac: 'effcdf6ae5eb2fa2d27416d5f184df9c259a7c79'
<del> },
<del> {
<del> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa', 'hex'),
<del> data: new Buffer('ddddddddddddddddddddddddddddddddddddddddddddd' +
<del> 'ddddddddddddddddddddddddddddddddddddddddddddd' +
<del> 'dddddddddd',
<del> 'hex'),
<del> hmac: '125d7342b9ac11cd91a39af48aa17b4f63f175d3'
<del> },
<del> {
<del> key: new Buffer('0102030405060708090a0b0c0d0e0f10111213141516171819',
<del> 'hex'),
<del> data: new Buffer('cdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdc' +
<del> 'dcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcd' +
<del> 'cdcdcdcdcd',
<del> 'hex'),
<del> hmac: '4c9007f4026250c6bc8414f9bf50c86c2d7235da'
<del> },
<del> {
<del> key: new Buffer('0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c0c', 'hex'),
<del> data: 'Test With Truncation',
<del> hmac: '4c1a03424b55e07fe7f27be1d58bb9324a9a5a04'
<del> },
<del> {
<del> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaa',
<del> 'hex'),
<del> data: 'Test Using Larger Than Block-Size Key - Hash Key First',
<del> hmac: 'aa4ae5e15272d00e95705637ce8a3b55ed402112'
<del> },
<del> {
<del> key: new Buffer('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa' +
<del> 'aaaaaaaaaaaaaaaaaaaaaa',
<del> 'hex'),
<del> data:
<del> 'Test Using Larger Than Block-Size Key and Larger Than One ' +
<del> 'Block-Size Data',
<del> hmac: 'e8e99d0f45237d786d6bbaa7965c7808bbff1a91'
<del> }
<del>];
<del>
<del>for (var i = 0, l = rfc2202_md5.length; i < l; i++) {
<del> assert.equal(rfc2202_md5[i]['hmac'],
<del> crypto.createHmac('md5', rfc2202_md5[i]['key'])
<del> .update(rfc2202_md5[i]['data'])
<del> .digest('hex'),
<del> 'Test HMAC-MD5 : Test case ' + (i + 1) + ' rfc 2202');
<del>}
<del>for (var i = 0, l = rfc2202_sha1.length; i < l; i++) {
<del> assert.equal(rfc2202_sha1[i]['hmac'],
<del> crypto.createHmac('sha1', rfc2202_sha1[i]['key'])
<del> .update(rfc2202_sha1[i]['data'])
<del> .digest('hex'),
<del> 'Test HMAC-SHA1 : Test case ' + (i + 1) + ' rfc 2202');
<del>}
<del>
<del>// Test hashing
<del>var a0 = crypto.createHash('sha1').update('Test123').digest('hex');
<del>var a1 = crypto.createHash('md5').update('Test123').digest('binary');
<del>var a2 = crypto.createHash('sha256').update('Test123').digest('base64');
<del>var a3 = crypto.createHash('sha512').update('Test123').digest(); // binary
<del>var a4 = crypto.createHash('sha1').update('Test123').digest('buffer');
<del>
<del>// stream interface
<del>var a5 = crypto.createHash('sha512');
<del>a5.end('Test123');
<del>a5 = a5.read();
<del>
<del>var a6 = crypto.createHash('sha512');
<del>a6.write('Te');
<del>a6.write('st');
<del>a6.write('123');
<del>a6.end();
<del>a6 = a6.read();
<del>
<del>var a7 = crypto.createHash('sha512');
<del>a7.end();
<del>a7 = a7.read();
<del>
<del>var a8 = crypto.createHash('sha512');
<del>a8.write('');
<del>a8.end();
<del>a8 = a8.read();
<del>
<del>assert.equal(a0, '8308651804facb7b9af8ffc53a33a22d6a1c8ac2', 'Test SHA1');
<del>assert.equal(a1, 'h\u00ea\u00cb\u0097\u00d8o\fF!\u00fa+\u000e\u0017\u00ca' +
<del> '\u00bd\u008c', 'Test MD5 as binary');
<del>assert.equal(a2, '2bX1jws4GYKTlxhloUB09Z66PoJZW+y+hq5R8dnx9l4=',
<del> 'Test SHA256 as base64');
<del>assert.deepEqual(
<del> a3,
<del> new Buffer(
<del> '\u00c1(4\u00f1\u0003\u001fd\u0097!O\'\u00d4C/&Qz\u00d4' +
<del> '\u0094\u0015l\u00b8\u008dQ+\u00db\u001d\u00c4\u00b5}\u00b2' +
<del> '\u00d6\u0092\u00a3\u00df\u00a2i\u00a1\u009b\n\n*\u000f' +
<del> '\u00d7\u00d6\u00a2\u00a8\u0085\u00e3<\u0083\u009c\u0093' +
<del> '\u00c2\u0006\u00da0\u00a1\u00879(G\u00ed\'',
<del> 'binary'),
<del> 'Test SHA512 as assumed buffer');
<del>assert.deepEqual(a4,
<del> new Buffer('8308651804facb7b9af8ffc53a33a22d6a1c8ac2', 'hex'),
<del> 'Test SHA1');
<del>
<del>// stream interface should produce the same result.
<del>assert.deepEqual(a5, a3, 'stream interface is consistent');
<del>assert.deepEqual(a6, a3, 'stream interface is consistent');
<del>assert.notEqual(a7, undefined, 'no data should return data');
<del>assert.notEqual(a8, undefined, 'empty string should generate data');
<del>
<del>// Test multiple updates to same hash
<del>var h1 = crypto.createHash('sha1').update('Test123').digest('hex');
<del>var h2 = crypto.createHash('sha1').update('Test').update('123').digest('hex');
<del>assert.equal(h1, h2, 'multipled updates');
<del>
<del>// Test hashing for binary files
<del>var fn = path.join(common.fixturesDir, 'sample.png');
<del>var sha1Hash = crypto.createHash('sha1');
<del>var fileStream = fs.createReadStream(fn);
<del>fileStream.on('data', function(data) {
<del> sha1Hash.update(data);
<del>});
<del>fileStream.on('close', function() {
<del> assert.equal(sha1Hash.digest('hex'),
<del> '22723e553129a336ad96e10f6aecdf0f45e4149e',
<del> 'Test SHA1 of sample.png');
<del>});
<del>
<del>// Issue #2227: unknown digest method should throw an error.
<del>assert.throws(function() {
<del> crypto.createHash('xyzzy');
<del>});
<del>
<del>// Test signing and verifying
<del>var s1 = crypto.createSign('RSA-SHA1')
<del> .update('Test123')
<del> .sign(keyPem, 'base64');
<del>var s1stream = crypto.createSign('RSA-SHA1');
<del>s1stream.end('Test123');
<del>s1stream = s1stream.sign(keyPem, 'base64');
<del>assert.equal(s1, s1stream, 'Stream produces same output');
<del>
<del>var verified = crypto.createVerify('RSA-SHA1')
<del> .update('Test')
<del> .update('123')
<del> .verify(certPem, s1, 'base64');
<del>assert.strictEqual(verified, true, 'sign and verify (base 64)');
<del>
<del>var s2 = crypto.createSign('RSA-SHA256')
<del> .update('Test123')
<del> .sign(keyPem, 'binary');
<del>var s2stream = crypto.createSign('RSA-SHA256');
<del>s2stream.end('Test123');
<del>s2stream = s2stream.sign(keyPem, 'binary');
<del>assert.equal(s2, s2stream, 'Stream produces same output');
<del>
<del>var verified = crypto.createVerify('RSA-SHA256')
<del> .update('Test')
<del> .update('123')
<del> .verify(certPem, s2, 'binary');
<del>assert.strictEqual(verified, true, 'sign and verify (binary)');
<del>
<del>var verStream = crypto.createVerify('RSA-SHA256');
<del>verStream.write('Tes');
<del>verStream.write('t12');
<del>verStream.end('3');
<del>verified = verStream.verify(certPem, s2, 'binary');
<del>assert.strictEqual(verified, true, 'sign and verify (stream)');
<del>
<del>var s3 = crypto.createSign('RSA-SHA1')
<del> .update('Test123')
<del> .sign(keyPem, 'buffer');
<del>var verified = crypto.createVerify('RSA-SHA1')
<del> .update('Test')
<del> .update('123')
<del> .verify(certPem, s3);
<del>assert.strictEqual(verified, true, 'sign and verify (buffer)');
<del>
<del>var verStream = crypto.createVerify('RSA-SHA1');
<del>verStream.write('Tes');
<del>verStream.write('t12');
<del>verStream.end('3');
<del>verified = verStream.verify(certPem, s3);
<del>assert.strictEqual(verified, true, 'sign and verify (stream)');
<del>
<del>
<del>function testCipher1(key) {
<del> // Test encryption and decryption
<del> var plaintext = 'Keep this a secret? No! Tell everyone about node.js!';
<del> var cipher = crypto.createCipher('aes192', key);
<del>
<del> // encrypt plaintext which is in utf8 format
<del> // to a ciphertext which will be in hex
<del> var ciph = cipher.update(plaintext, 'utf8', 'hex');
<del> // Only use binary or hex, not base64.
<del> ciph += cipher.final('hex');
<del>
<del> var decipher = crypto.createDecipher('aes192', key);
<del> var txt = decipher.update(ciph, 'hex', 'utf8');
<del> txt += decipher.final('utf8');
<del>
<del> assert.equal(txt, plaintext, 'encryption and decryption');
<del>
<del> // streaming cipher interface
<del> // NB: In real life, it's not guaranteed that you can get all of it
<del> // in a single read() like this. But in this case, we know it's
<del> // quite small, so there's no harm.
<del> var cStream = crypto.createCipher('aes192', key);
<del> cStream.end(plaintext);
<del> ciph = cStream.read();
<del>
<del> var dStream = crypto.createDecipher('aes192', key);
<del> dStream.end(ciph);
<del> txt = dStream.read().toString('utf8');
<del>
<del> assert.equal(txt, plaintext, 'encryption and decryption with streams');
<del>}
<del>
<del>
<del>function testCipher2(key) {
<del> // encryption and decryption with Base64
<del> // reported in https://github.com/joyent/node/issues/738
<del> var plaintext =
<del> '32|RmVZZkFUVmpRRkp0TmJaUm56ZU9qcnJkaXNNWVNpTTU*|iXmckfRWZBGWWELw' +
<del> 'eCBsThSsfUHLeRe0KCsK8ooHgxie0zOINpXxfZi/oNG7uq9JWFVCk70gfzQH8ZUJ' +
<del> 'jAfaFg**';
<del> var cipher = crypto.createCipher('aes256', key);
<del>
<del> // encrypt plaintext which is in utf8 format
<del> // to a ciphertext which will be in Base64
<del> var ciph = cipher.update(plaintext, 'utf8', 'base64');
<del> ciph += cipher.final('base64');
<del>
<del> var decipher = crypto.createDecipher('aes256', key);
<del> var txt = decipher.update(ciph, 'base64', 'utf8');
<del> txt += decipher.final('utf8');
<del>
<del> assert.equal(txt, plaintext, 'encryption and decryption with Base64');
<del>}
<del>
<del>
<del>function testCipher3(key, iv) {
<del> // Test encyrption and decryption with explicit key and iv
<del> var plaintext =
<del> '32|RmVZZkFUVmpRRkp0TmJaUm56ZU9qcnJkaXNNWVNpTTU*|iXmckfRWZBGWWELw' +
<del> 'eCBsThSsfUHLeRe0KCsK8ooHgxie0zOINpXxfZi/oNG7uq9JWFVCk70gfzQH8ZUJ' +
<del> 'jAfaFg**';
<del> var cipher = crypto.createCipheriv('des-ede3-cbc', key, iv);
<del> var ciph = cipher.update(plaintext, 'utf8', 'hex');
<del> ciph += cipher.final('hex');
<del>
<del> var decipher = crypto.createDecipheriv('des-ede3-cbc', key, iv);
<del> var txt = decipher.update(ciph, 'hex', 'utf8');
<del> txt += decipher.final('utf8');
<del>
<del> assert.equal(txt, plaintext, 'encryption and decryption with key and iv');
<del>
<del> // streaming cipher interface
<del> // NB: In real life, it's not guaranteed that you can get all of it
<del> // in a single read() like this. But in this case, we know it's
<del> // quite small, so there's no harm.
<del> var cStream = crypto.createCipheriv('des-ede3-cbc', key, iv);
<del> cStream.end(plaintext);
<del> ciph = cStream.read();
<del>
<del> var dStream = crypto.createDecipheriv('des-ede3-cbc', key, iv);
<del> dStream.end(ciph);
<del> txt = dStream.read().toString('utf8');
<del>
<del> assert.equal(txt, plaintext, 'streaming cipher iv');
<del>}
<del>
<del>
<del>function testCipher4(key, iv) {
<del> // Test encyrption and decryption with explicit key and iv
<del> var plaintext =
<del> '32|RmVZZkFUVmpRRkp0TmJaUm56ZU9qcnJkaXNNWVNpTTU*|iXmckfRWZBGWWELw' +
<del> 'eCBsThSsfUHLeRe0KCsK8ooHgxie0zOINpXxfZi/oNG7uq9JWFVCk70gfzQH8ZUJ' +
<del> 'jAfaFg**';
<del> var cipher = crypto.createCipheriv('des-ede3-cbc', key, iv);
<del> var ciph = cipher.update(plaintext, 'utf8', 'buffer');
<del> ciph = Buffer.concat([ciph, cipher.final('buffer')]);
<del>
<del> var decipher = crypto.createDecipheriv('des-ede3-cbc', key, iv);
<del> var txt = decipher.update(ciph, 'buffer', 'utf8');
<del> txt += decipher.final('utf8');
<del>
<del> assert.equal(txt, plaintext, 'encryption and decryption with key and iv');
<del>}
<del>
<del>
<del>testCipher1('MySecretKey123');
<del>testCipher1(new Buffer('MySecretKey123'));
<del>
<del>testCipher2('0123456789abcdef');
<del>testCipher2(new Buffer('0123456789abcdef'));
<del>
<del>testCipher3('0123456789abcd0123456789', '12345678');
<del>testCipher3('0123456789abcd0123456789', new Buffer('12345678'));
<del>testCipher3(new Buffer('0123456789abcd0123456789'), '12345678');
<del>testCipher3(new Buffer('0123456789abcd0123456789'), new Buffer('12345678'));
<del>
<del>testCipher4(new Buffer('0123456789abcd0123456789'), new Buffer('12345678'));
<del>
<ide>
<ide> // update() should only take buffers / strings
<ide> assert.throws(function() {
<ide> crypto.createHash('sha1').update({foo: 'bar'});
<ide> }, /buffer/);
<ide>
<ide>
<del>// Test Diffie-Hellman with two parties sharing a secret,
<del>// using various encodings as we go along
<del>var dh1 = crypto.createDiffieHellman(256);
<del>var p1 = dh1.getPrime('buffer');
<del>var dh2 = crypto.createDiffieHellman(p1, 'buffer');
<del>var key1 = dh1.generateKeys();
<del>var key2 = dh2.generateKeys('hex');
<del>var secret1 = dh1.computeSecret(key2, 'hex', 'base64');
<del>var secret2 = dh2.computeSecret(key1, 'binary', 'buffer');
<del>
<del>assert.equal(secret1, secret2.toString('base64'));
<del>assert.equal(dh1.verifyError, 0);
<del>assert.equal(dh2.verifyError, 0);
<del>
<del>assert.throws(function() {
<del> crypto.createDiffieHellman([0x1, 0x2]);
<del>});
<del>
<del>assert.throws(function() {
<del> crypto.createDiffieHellman(function() { });
<del>});
<del>
<del>assert.throws(function() {
<del> crypto.createDiffieHellman(/abc/);
<del>});
<del>
<del>assert.throws(function() {
<del> crypto.createDiffieHellman({});
<del>});
<del>
<del>// Create "another dh1" using generated keys from dh1,
<del>// and compute secret again
<del>var dh3 = crypto.createDiffieHellman(p1, 'buffer');
<del>var privkey1 = dh1.getPrivateKey();
<del>dh3.setPublicKey(key1);
<del>dh3.setPrivateKey(privkey1);
<del>
<del>assert.deepEqual(dh1.getPrime(), dh3.getPrime());
<del>assert.deepEqual(dh1.getGenerator(), dh3.getGenerator());
<del>assert.deepEqual(dh1.getPublicKey(), dh3.getPublicKey());
<del>assert.deepEqual(dh1.getPrivateKey(), dh3.getPrivateKey());
<del>assert.equal(dh3.verifyError, 0);
<del>
<del>var secret3 = dh3.computeSecret(key2, 'hex', 'base64');
<del>
<del>assert.equal(secret1, secret3);
<del>
<del>// Run this one twice to make sure that the dh3 clears its error properly
<del>(function() {
<del> var c = crypto.createDecipher('aes-128-ecb', '');
<del> assert.throws(function() { c.final('utf8') }, /wrong final block length/);
<del>})();
<del>
<del>assert.throws(function() {
<del> dh3.computeSecret('');
<del>}, /key is too small/i);
<del>
<del>(function() {
<del> var c = crypto.createDecipher('aes-128-ecb', '');
<del> assert.throws(function() { c.final('utf8') }, /wrong final block length/);
<del>})();
<del>
<del>// Create a shared using a DH group.
<del>var alice = crypto.createDiffieHellmanGroup('modp5');
<del>var bob = crypto.createDiffieHellmanGroup('modp5');
<del>alice.generateKeys();
<del>bob.generateKeys();
<del>var aSecret = alice.computeSecret(bob.getPublicKey()).toString('hex');
<del>var bSecret = bob.computeSecret(alice.getPublicKey()).toString('hex');
<del>assert.equal(aSecret, bSecret);
<del>assert.equal(alice.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<del>assert.equal(bob.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<del>
<del>// Ensure specific generator (buffer) works as expected.
<del>var modp1 = crypto.createDiffieHellmanGroup('modp1');
<del>var modp1buf = new Buffer([
<del> 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xc9, 0x0f,
<del> 0xda, 0xa2, 0x21, 0x68, 0xc2, 0x34, 0xc4, 0xc6, 0x62, 0x8b,
<del> 0x80, 0xdc, 0x1c, 0xd1, 0x29, 0x02, 0x4e, 0x08, 0x8a, 0x67,
<del> 0xcc, 0x74, 0x02, 0x0b, 0xbe, 0xa6, 0x3b, 0x13, 0x9b, 0x22,
<del> 0x51, 0x4a, 0x08, 0x79, 0x8e, 0x34, 0x04, 0xdd, 0xef, 0x95,
<del> 0x19, 0xb3, 0xcd, 0x3a, 0x43, 0x1b, 0x30, 0x2b, 0x0a, 0x6d,
<del> 0xf2, 0x5f, 0x14, 0x37, 0x4f, 0xe1, 0x35, 0x6d, 0x6d, 0x51,
<del> 0xc2, 0x45, 0xe4, 0x85, 0xb5, 0x76, 0x62, 0x5e, 0x7e, 0xc6,
<del> 0xf4, 0x4c, 0x42, 0xe9, 0xa6, 0x3a, 0x36, 0x20, 0xff, 0xff,
<del> 0xff, 0xff, 0xff, 0xff, 0xff, 0xff
<del>]);
<del>var exmodp1 = crypto.createDiffieHellman(modp1buf, new Buffer([2]));
<del>modp1.generateKeys();
<del>exmodp1.generateKeys();
<del>var modp1Secret = modp1.computeSecret(exmodp1.getPublicKey()).toString('hex');
<del>var exmodp1Secret = exmodp1.computeSecret(modp1.getPublicKey()).toString('hex');
<del>assert.equal(modp1Secret, exmodp1Secret);
<del>assert.equal(modp1.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<del>assert.equal(exmodp1.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<del>
<del>
<del>// Ensure specific generator (string with encoding) works as expected.
<del>var exmodp1_2 = crypto.createDiffieHellman(modp1buf, '02', 'hex');
<del>exmodp1_2.generateKeys();
<del>modp1Secret = modp1.computeSecret(exmodp1_2.getPublicKey()).toString('hex');
<del>var exmodp1_2Secret = exmodp1_2.computeSecret(modp1.getPublicKey())
<del> .toString('hex');
<del>assert.equal(modp1Secret, exmodp1_2Secret);
<del>assert.equal(exmodp1_2.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<del>
<del>
<del>// Ensure specific generator (string without encoding) works as expected.
<del>var exmodp1_3 = crypto.createDiffieHellman(modp1buf, '\x02');
<del>exmodp1_3.generateKeys();
<del>modp1Secret = modp1.computeSecret(exmodp1_3.getPublicKey()).toString('hex');
<del>var exmodp1_3Secret = exmodp1_3.computeSecret(modp1.getPublicKey())
<del> .toString('hex');
<del>assert.equal(modp1Secret, exmodp1_3Secret);
<del>assert.equal(exmodp1_3.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<del>
<del>
<del>// Ensure specific generator (numeric) works as expected.
<del>var exmodp1_4 = crypto.createDiffieHellman(modp1buf, 2);
<del>exmodp1_4.generateKeys();
<del>modp1Secret = modp1.computeSecret(exmodp1_4.getPublicKey()).toString('hex');
<del>var exmodp1_4Secret = exmodp1_4.computeSecret(modp1.getPublicKey())
<del> .toString('hex');
<del>assert.equal(modp1Secret, exmodp1_4Secret);
<del>assert.equal(exmodp1_4.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<del>
<del>
<del>var p = 'FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74' +
<del> '020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F1437' +
<del> '4FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7ED' +
<del> 'EE386BFB5A899FA5AE9F24117C4B1FE649286651ECE65381FFFFFFFFFFFFFFFF';
<del>var bad_dh = crypto.createDiffieHellman(p, 'hex');
<del>assert.equal(bad_dh.verifyError, constants.DH_NOT_SUITABLE_GENERATOR);
<del>
<del>// Test RSA encryption/decryption
<del>(function() {
<del> var input = 'I AM THE WALRUS';
<del> var bufferToEncrypt = new Buffer(input);
<del>
<del> var encryptedBuffer = crypto.publicEncrypt(rsaPubPem, bufferToEncrypt);
<del>
<del> var decryptedBuffer = crypto.privateDecrypt(rsaKeyPem, encryptedBuffer);
<del> assert.equal(input, decryptedBuffer.toString());
<del>
<del> var decryptedBufferWithPassword = crypto.privateDecrypt({
<del> key: rsaKeyPemEncrypted,
<del> passphrase: 'password'
<del> }, encryptedBuffer);
<del> assert.equal(input, decryptedBufferWithPassword.toString());
<del>
<del> encryptedBuffer = crypto.publicEncrypt({
<del> key: rsaKeyPemEncrypted,
<del> passphrase: 'password'
<del> }, bufferToEncrypt);
<del>
<del> decryptedBufferWithPassword = crypto.privateDecrypt({
<del> key: rsaKeyPemEncrypted,
<del> passphrase: 'password'
<del> }, encryptedBuffer);
<del> assert.equal(input, decryptedBufferWithPassword.toString());
<del>
<del> encryptedBuffer = crypto.privateEncrypt({
<del> key: rsaKeyPemEncrypted,
<del> passphrase: new Buffer('password')
<del> }, bufferToEncrypt);
<del>
<del> decryptedBufferWithPassword = crypto.publicDecrypt({
<del> key: rsaKeyPemEncrypted,
<del> passphrase: new Buffer('password')
<del> }, encryptedBuffer);
<del> assert.equal(input, decryptedBufferWithPassword.toString());
<del>
<del> encryptedBuffer = crypto.publicEncrypt(certPem, bufferToEncrypt);
<del>
<del> decryptedBuffer = crypto.privateDecrypt(keyPem, encryptedBuffer);
<del> assert.equal(input, decryptedBuffer.toString());
<del>
<del> encryptedBuffer = crypto.publicEncrypt(keyPem, bufferToEncrypt);
<del>
<del> decryptedBuffer = crypto.privateDecrypt(keyPem, encryptedBuffer);
<del> assert.equal(input, decryptedBuffer.toString());
<del>
<del> encryptedBuffer = crypto.privateEncrypt(keyPem, bufferToEncrypt);
<del>
<del> decryptedBuffer = crypto.publicDecrypt(keyPem, encryptedBuffer);
<del> assert.equal(input, decryptedBuffer.toString());
<del>
<del> assert.throws(function() {
<del> crypto.privateDecrypt({
<del> key: rsaKeyPemEncrypted,
<del> passphrase: 'wrong'
<del> }, bufferToEncrypt);
<del> });
<del>
<del> assert.throws(function() {
<del> crypto.publicEncrypt({
<del> key: rsaKeyPemEncrypted,
<del> passphrase: 'wrong'
<del> }, encryptedBuffer);
<del> });
<del>
<del> encryptedBuffer = crypto.privateEncrypt({
<del> key: rsaKeyPemEncrypted,
<del> passphrase: new Buffer('password')
<del> }, bufferToEncrypt);
<del>
<del> assert.throws(function() {
<del> crypto.publicDecrypt({
<del> key: rsaKeyPemEncrypted,
<del> passphrase: [].concat.apply([], new Buffer('password'))
<del> }, encryptedBuffer);
<del> });
<del>})();
<del>
<del>function test_rsa(padding) {
<del> var input = new Buffer(padding === 'RSA_NO_PADDING' ? 1024 / 8 : 32);
<del> for (var i = 0; i < input.length; i++)
<del> input[i] = (i * 7 + 11) & 0xff;
<del> var bufferToEncrypt = new Buffer(input);
<del>
<del> padding = constants[padding];
<del>
<del> var encryptedBuffer = crypto.publicEncrypt({
<del> key: rsaPubPem,
<del> padding: padding
<del> }, bufferToEncrypt);
<del>
<del> var decryptedBuffer = crypto.privateDecrypt({
<del> key: rsaKeyPem,
<del> padding: padding
<del> }, encryptedBuffer);
<del> assert.equal(input, decryptedBuffer.toString());
<del>}
<del>
<del>test_rsa('RSA_NO_PADDING');
<del>test_rsa('RSA_PKCS1_PADDING');
<del>test_rsa('RSA_PKCS1_OAEP_PADDING');
<del>
<del>// Test RSA key signing/verification
<del>var rsaSign = crypto.createSign('RSA-SHA1');
<del>var rsaVerify = crypto.createVerify('RSA-SHA1');
<del>assert.ok(rsaSign);
<del>assert.ok(rsaVerify);
<del>
<del>rsaSign.update(rsaPubPem);
<del>var rsaSignature = rsaSign.sign(rsaKeyPem, 'hex');
<del>assert.equal(rsaSignature,
<del> '5c50e3145c4e2497aadb0eabc83b342d0b0021ece0d4c4a064b7c' +
<del> '8f020d7e2688b122bfb54c724ac9ee169f83f66d2fe90abeb95e8' +
<del> 'e1290e7e177152a4de3d944cf7d4883114a20ed0f78e70e25ef0f' +
<del> '60f06b858e6af42a2f276ede95bbc6bc9a9bbdda15bd663186a6f' +
<del> '40819a7af19e577bb2efa5e579a1f5ce8a0d4ca8b8f6');
<del>
<del>rsaVerify.update(rsaPubPem);
<del>assert.strictEqual(rsaVerify.verify(rsaPubPem, rsaSignature, 'hex'), true);
<del>
<del>// Test RSA key signing/verification with encrypted key
<del>rsaSign = crypto.createSign('RSA-SHA1');
<del>rsaSign.update(rsaPubPem);
<del>assert.doesNotThrow(function() {
<del> var signOptions = { key: rsaKeyPemEncrypted, passphrase: 'password' };
<del> rsaSignature = rsaSign.sign(signOptions, 'hex');
<del>});
<del>assert.equal(rsaSignature,
<del> '5c50e3145c4e2497aadb0eabc83b342d0b0021ece0d4c4a064b7c' +
<del> '8f020d7e2688b122bfb54c724ac9ee169f83f66d2fe90abeb95e8' +
<del> 'e1290e7e177152a4de3d944cf7d4883114a20ed0f78e70e25ef0f' +
<del> '60f06b858e6af42a2f276ede95bbc6bc9a9bbdda15bd663186a6f' +
<del> '40819a7af19e577bb2efa5e579a1f5ce8a0d4ca8b8f6');
<del>
<del>rsaVerify = crypto.createVerify('RSA-SHA1');
<del>rsaVerify.update(rsaPubPem);
<del>assert.strictEqual(rsaVerify.verify(rsaPubPem, rsaSignature, 'hex'), true);
<del>
<del>rsaSign = crypto.createSign('RSA-SHA1');
<del>rsaSign.update(rsaPubPem);
<del>assert.throws(function() {
<del> var signOptions = { key: rsaKeyPemEncrypted, passphrase: 'wrong' };
<del> rsaSign.sign(signOptions, 'hex');
<del>});
<del>
<del>//
<del>// Test RSA signing and verification
<del>//
<del>(function() {
<del> var privateKey = fs.readFileSync(
<del> common.fixturesDir + '/test_rsa_privkey_2.pem');
<del>
<del> var publicKey = fs.readFileSync(
<del> common.fixturesDir + '/test_rsa_pubkey_2.pem');
<del>
<del> var input = 'I AM THE WALRUS';
<del>
<del> var signature =
<del> '79d59d34f56d0e94aa6a3e306882b52ed4191f07521f25f505a078dc2f89' +
<del> '396e0c8ac89e996fde5717f4cb89199d8fec249961fcb07b74cd3d2a4ffa' +
<del> '235417b69618e4bcd76b97e29975b7ce862299410e1b522a328e44ac9bb2' +
<del> '8195e0268da7eda23d9825ac43c724e86ceeee0d0d4465678652ccaf6501' +
<del> '0ddfb299bedeb1ad';
<del>
<del> var sign = crypto.createSign('RSA-SHA256');
<del> sign.update(input);
<del>
<del> var output = sign.sign(privateKey, 'hex');
<del> assert.equal(output, signature);
<del>
<del> var verify = crypto.createVerify('RSA-SHA256');
<del> verify.update(input);
<del>
<del> assert.strictEqual(verify.verify(publicKey, signature, 'hex'), true);
<del>})();
<del>
<del>
<del>//
<del>// Test DSA signing and verification
<del>//
<del>(function() {
<del> var input = 'I AM THE WALRUS';
<del>
<del> // DSA signatures vary across runs so there is no static string to verify
<del> // against
<del> var sign = crypto.createSign('DSS1');
<del> sign.update(input);
<del> var signature = sign.sign(dsaKeyPem, 'hex');
<del>
<del> var verify = crypto.createVerify('DSS1');
<del> verify.update(input);
<del>
<del> assert.strictEqual(verify.verify(dsaPubPem, signature, 'hex'), true);
<del>})();
<del>
<del>
<del>//
<del>// Test DSA signing and verification with encrypted key
<del>//
<del>(function() {
<del> var input = 'I AM THE WALRUS';
<del>
<del> var sign = crypto.createSign('DSS1');
<del> sign.update(input);
<del> assert.throws(function() {
<del> sign.sign({ key: dsaKeyPemEncrypted, passphrase: 'wrong' }, 'hex');
<del> });
<del>
<del> // DSA signatures vary across runs so there is no static string to verify
<del> // against
<del> var sign = crypto.createSign('DSS1');
<del> sign.update(input);
<del>
<del> var signature;
<del> assert.doesNotThrow(function() {
<del> var signOptions = { key: dsaKeyPemEncrypted, passphrase: 'password' };
<del> signature = sign.sign(signOptions, 'hex');
<del> });
<del>
<del> var verify = crypto.createVerify('DSS1');
<del> verify.update(input);
<del>
<del> assert.strictEqual(verify.verify(dsaPubPem, signature, 'hex'), true);
<del>})();
<del>
<del>
<del>//
<del>// Test PBKDF2 with RFC 6070 test vectors (except #4)
<del>//
<del>function testPBKDF2(password, salt, iterations, keylen, expected) {
<del> var actual = crypto.pbkdf2Sync(password, salt, iterations, keylen);
<del> assert.equal(actual.toString('binary'), expected);
<del>
<del> crypto.pbkdf2(password, salt, iterations, keylen, function(err, actual) {
<del> assert.equal(actual.toString('binary'), expected);
<del> });
<del>}
<del>
<del>
<del>testPBKDF2('password', 'salt', 1, 20,
<del> '\x0c\x60\xc8\x0f\x96\x1f\x0e\x71\xf3\xa9\xb5\x24' +
<del> '\xaf\x60\x12\x06\x2f\xe0\x37\xa6');
<del>
<del>testPBKDF2('password', 'salt', 2, 20,
<del> '\xea\x6c\x01\x4d\xc7\x2d\x6f\x8c\xcd\x1e\xd9\x2a' +
<del> '\xce\x1d\x41\xf0\xd8\xde\x89\x57');
<del>
<del>testPBKDF2('password', 'salt', 4096, 20,
<del> '\x4b\x00\x79\x01\xb7\x65\x48\x9a\xbe\xad\x49\xd9\x26' +
<del> '\xf7\x21\xd0\x65\xa4\x29\xc1');
<del>
<del>testPBKDF2('passwordPASSWORDpassword',
<del> 'saltSALTsaltSALTsaltSALTsaltSALTsalt',
<del> 4096,
<del> 25,
<del> '\x3d\x2e\xec\x4f\xe4\x1c\x84\x9b\x80\xc8\xd8\x36\x62' +
<del> '\xc0\xe4\x4a\x8b\x29\x1a\x96\x4c\xf2\xf0\x70\x38');
<del>
<del>testPBKDF2('pass\0word', 'sa\0lt', 4096, 16,
<del> '\x56\xfa\x6a\xa7\x55\x48\x09\x9d\xcc\x37\xd7\xf0\x34' +
<del> '\x25\xe0\xc3');
<del>
<del>(function() {
<del> var expected =
<del> '64c486c55d30d4c5a079b8823b7d7cb37ff0556f537da8410233bcec330ed956';
<del> var key = crypto.pbkdf2Sync('password', 'salt', 32, 32, 'sha256');
<del> assert.equal(key.toString('hex'), expected);
<del>
<del> crypto.pbkdf2('password', 'salt', 32, 32, 'sha256', common.mustCall(ondone));
<del> function ondone(err, key) {
<del> if (err) throw err;
<del> assert.equal(key.toString('hex'), expected);
<del> }
<del>})();
<del>
<ide> function assertSorted(list) {
<ide> // Array#sort() modifies the list in place so make a copy.
<ide> var sorted = util._extend([], list).sort();
<ide> assert.notEqual(-1, crypto.getHashes().indexOf('RSA-SHA1'));
<ide> assert.equal(-1, crypto.getHashes().indexOf('rsa-sha1'));
<ide> assertSorted(crypto.getHashes());
<ide>
<del>// Base64 padding regression test, see #4837.
<del>(function() {
<del> var c = crypto.createCipher('aes-256-cbc', 'secret');
<del> var s = c.update('test', 'utf8', 'base64') + c.final('base64');
<del> assert.equal(s, '375oxUQCIocvxmC5At+rvA==');
<del>})();
<del>
<del>// Error path should not leak memory (check with valgrind).
<del>assert.throws(function() {
<del> crypto.pbkdf2('password', 'salt', 1, 20, null);
<del>});
<del>
<del>// Calling Cipher.final() or Decipher.final() twice should error but
<del>// not assert. See #4886.
<del>(function() {
<del> var c = crypto.createCipher('aes-256-cbc', 'secret');
<del> try { c.final('xxx') } catch (e) { /* Ignore. */ }
<del> try { c.final('xxx') } catch (e) { /* Ignore. */ }
<del> try { c.final('xxx') } catch (e) { /* Ignore. */ }
<del> var d = crypto.createDecipher('aes-256-cbc', 'secret');
<del> try { d.final('xxx') } catch (e) { /* Ignore. */ }
<del> try { d.final('xxx') } catch (e) { /* Ignore. */ }
<del> try { d.final('xxx') } catch (e) { /* Ignore. */ }
<del>})();
<del>
<del>// Regression test for #5482: string to Cipher#update() should not assert.
<del>(function() {
<del> var c = crypto.createCipher('aes192', '0123456789abcdef');
<del> c.update('update');
<del> c.final();
<del>})();
<del>
<del>// #5655 regression tests, 'utf-8' and 'utf8' are identical.
<del>(function() {
<del> var c = crypto.createCipher('aes192', '0123456789abcdef');
<del> c.update('update', ''); // Defaults to "utf8".
<del> c.final('utf-8'); // Should not throw.
<del>
<del> c = crypto.createCipher('aes192', '0123456789abcdef');
<del> c.update('update', 'utf8');
<del> c.final('utf-8'); // Should not throw.
<del>
<del> c = crypto.createCipher('aes192', '0123456789abcdef');
<del> c.update('update', 'utf-8');
<del> c.final('utf8'); // Should not throw.
<del>})();
<del>
<ide> // Regression tests for #5725: hex input that's not a power of two should
<ide> // throw, not assert in C++ land.
<ide> assert.throws(function() {
<ide> assert.throws(function() {
<ide>
<ide> // Make sure memory isn't released before being returned
<ide> console.log(crypto.randomBytes(16));
<del>
<del>// Test ECDH
<del>var ecdh1 = crypto.createECDH('prime256v1');
<del>var ecdh2 = crypto.createECDH('prime256v1');
<del>var key1 = ecdh1.generateKeys();
<del>var key2 = ecdh2.generateKeys('hex');
<del>var secret1 = ecdh1.computeSecret(key2, 'hex', 'base64');
<del>var secret2 = ecdh2.computeSecret(key1, 'binary', 'buffer');
<del>
<del>assert.equal(secret1, secret2.toString('base64'));
<del>
<del>// Point formats
<del>assert.equal(ecdh1.getPublicKey('buffer', 'uncompressed')[0], 4);
<del>var firstByte = ecdh1.getPublicKey('buffer', 'compressed')[0];
<del>assert(firstByte === 2 || firstByte === 3);
<del>var firstByte = ecdh1.getPublicKey('buffer', 'hybrid')[0];
<del>assert(firstByte === 6 || firstByte === 7);
<del>
<del>// ECDH should check that point is on curve
<del>var ecdh3 = crypto.createECDH('secp256k1');
<del>var key3 = ecdh3.generateKeys();
<del>
<del>assert.throws(function() {
<del> var secret3 = ecdh2.computeSecret(key3, 'binary', 'buffer');
<del>});
<del>
<del>// ECDH should allow .setPrivateKey()/.setPublicKey()
<del>var ecdh4 = crypto.createECDH('prime256v1');
<del>
<del>ecdh4.setPrivateKey(ecdh1.getPrivateKey());
<del>ecdh4.setPublicKey(ecdh1.getPublicKey());
<del>
<del>assert.throws(function() {
<del> ecdh4.setPublicKey(ecdh3.getPublicKey());
<del>}); | 8 |
Python | Python | serialize sequential models as yaml | f142d34ffc43b6eb7f57b21d2fa742585e1dbe60 | <ide><path>keras/models.py
<ide> import theano
<ide> import theano.tensor as T
<ide> import numpy as np
<del>import warnings, time, copy
<add>import warnings, time, copy, yaml
<add>
<add>from .layers.convolutional import *
<add>from .layers.core import *
<add>from .layers.embeddings import *
<add>from .layers.noise import *
<add>from .layers.normalization import *
<add>from .layers.recurrent import *
<add>
<add>from .optimizers import *
<add>from .objectives import *
<add>from .regularizers import *
<add>from .constraints import *
<ide>
<ide> from . import optimizers
<ide> from . import objectives
<ide> def standardize_weights(y, sample_weight=None, class_weight=None):
<ide> else:
<ide> return np.ones(y.shape[:-1] + (1,))
<ide>
<add>def sequential_from_yaml(pathToYaml):
<add> '''
<add> Returns a compiled Sequential model generated from a local yaml file,
<add> which is either created by hand or from Sequential.to_yaml
<add> '''
<add> stream = open(pathToYaml, 'r')
<add> modelYaml = yaml.load(stream)
<add> model = Sequential()
<add>
<add> class_mode = modelYaml.get('class_mode')
<add> theano_mode = modelYaml.get('theano_mode')
<add> loss = globals()[modelYaml.get('loss')]
<add>
<add> optim = modelYaml.get('optimizer')
<add> optimName = optim.get('name')
<add> optim.pop('name')
<add> optimizer = globals()[optimName](**optim)
<add>
<add> layers = modelYaml.get('layers')
<add> for layer in layers:
<add> name = layer.get('name')
<add> layer.pop('name')
<add> hasParams = False
<add> if layer.has_key('parameters'):
<add> params = layer.get('parameters')
<add> layer.pop('parameters')
<add> hasParams = True
<add> for k, v in layer.iteritems():
<add> if isinstance(v, dict):
<add> vname = v.get('name')
<add> v.pop('name')
<add> layer[k] = globals()[vname](**v)
<add> initLayer = globals()[name](**layer)
<add> if hasParams:
<add> shapedParams = []
<add> for param in params:
<add> data = np.asarray(param.get('data'))
<add> shape = tuple(param.get('shape'))
<add> shapedParams.append(data.reshape(shape))
<add> initLayer.set_weights(shapedParams)
<add> model.add(initLayer)
<add>
<add> model.compile(loss=loss, optimizer=optimizer, class_mode=class_mode, theano_mode=theano_mode)
<add> return model
<ide>
<ide> class Model(object):
<ide> def _fit(self, f, ins, out_labels=[], batch_size=128, nb_epoch=100, verbose=1, callbacks=[], \
<ide> def load_weights(self, filepath):
<ide> f.close()
<ide>
<ide>
<add> def to_yaml(self, fileName, storeParams=True):
<add> '''
<add> Stores compiled Sequential model to local file, optionally storing all learnable parameters
<add> '''
<add> modelDict = {}
<add> modelDict['class_mode'] = self.class_mode
<add> modelDict['theano_mode'] = self.theano_mode
<add> modelDict['loss'] = self.unweighted_loss.__name__
<add> modelDict['optimizer'] = self.optimizer.get_config()
<add>
<add> layers = []
<add> for layer in self.layers:
<add> layerConf = layer.get_config()
<add> if storeParams:
<add> layerConf['parameters'] = [{'shape':list(param.get_value().shape), 'data':param.get_value().tolist()} for param in layer.params]
<add> layers.append(layerConf)
<add> modelDict['layers'] = layers
<add>
<add> with open(fileName, 'w') as outfile:
<add> outfile.write(yaml.dump(modelDict, default_flow_style=True))
<add>
<add>
<ide> class Graph(Model, containers.Graph):
<ide> def compile(self, optimizer, loss, theano_mode=None):
<ide> # loss is a dictionary mapping output name to loss functions | 1 |
Python | Python | support python 2 | 6a43dc9d7d592362d144209097e1d93876f8e88a | <ide><path>transformers/tokenization_bert_japanese.py
<ide> import collections
<ide> import logging
<ide> import os
<add>import six
<ide> import unicodedata
<ide> from io import open
<ide>
<ide> def tokenize(self, text, never_split=None, **kwargs):
<ide> never_split = self.never_split + (never_split if never_split is not None else [])
<ide> tokens = []
<ide>
<add> if six.PY2:
<add> mecab_output = self.mecab.parse(text.encode('utf-8')).decode('utf-8')
<add> else:
<add> mecab_output = self.mecab.parse(text)
<add>
<ide> cursor = 0
<del> for line in self.mecab.parse(text).split('\n'):
<add> for line in mecab_output.split('\n'):
<ide> if line == 'EOS':
<ide> break
<ide> | 1 |
Ruby | Ruby | ignore memcached shadowing | 36576333274a609b6546d7f4c05f05909d87cfda | <ide><path>Library/Homebrew/formula_cellar_checks.rb
<ide> def check_shadowed_headers
<ide> return if formula.name == formula_name
<ide> end
<ide>
<del> return if MacOS.version < :mavericks && formula.name.start_with?("postgresql")
<add> if MacOS.version < :mavericks &&
<add> (formula.name.start_with?("postgresql") ||
<add> formula.name.start_with?("memcached"))
<add> return
<add> end
<add>
<ide> return if formula.keg_only? || !formula.include.directory?
<ide>
<ide> files = relative_glob(formula.include, "**/*.h") | 1 |
Java | Java | find exact matches in webjarsresourceresolver | 3d290165fb6b37544d23755f5182fac3478210f4 | <ide><path>spring-webmvc/src/main/java/org/springframework/web/servlet/resource/WebJarsResourceResolver.java
<ide> /*
<del> * Copyright 2002-2016 the original author or authors.
<add> * Copyright 2002-2017 the original author or authors.
<ide> *
<ide> * Licensed under the Apache License, Version 2.0 (the "License");
<ide> * you may not use this file except in compliance with the License.
<ide> package org.springframework.web.servlet.resource;
<ide>
<ide> import java.util.List;
<add>
<ide> import javax.servlet.http.HttpServletRequest;
<ide>
<del>import org.webjars.MultipleMatchesException;
<ide> import org.webjars.WebJarAssetLocator;
<ide>
<ide> import org.springframework.core.io.Resource;
<ide> protected String resolveUrlPathInternal(String resourceUrlPath,
<ide> }
<ide>
<ide> protected String findWebJarResourcePath(String path) {
<del> try {
<del> int startOffset = (path.startsWith("/") ? 1 : 0);
<del> int endOffset = path.indexOf("/", 1);
<del> if (endOffset != -1) {
<del> String webjar = path.substring(startOffset, endOffset);
<del> String partialPath = path.substring(endOffset);
<del> String webJarPath = webJarAssetLocator.getFullPath(webjar, partialPath);
<add> int startOffset = (path.startsWith("/") ? 1 : 0);
<add> int endOffset = path.indexOf("/", 1);
<add> if (endOffset != -1) {
<add> String webjar = path.substring(startOffset, endOffset);
<add> String partialPath = path.substring(endOffset + 1);
<add> String webJarPath = webJarAssetLocator.getFullPathExact(webjar, partialPath);
<add> if (webJarPath != null) {
<ide> return webJarPath.substring(WEBJARS_LOCATION_LENGTH);
<ide> }
<ide> }
<del> catch (MultipleMatchesException ex) {
<del> if (logger.isWarnEnabled()) {
<del> logger.warn("WebJar version conflict for \"" + path + "\"", ex);
<del> }
<del> }
<del> catch (IllegalArgumentException ex) {
<del> if (logger.isTraceEnabled()) {
<del> logger.trace("No WebJar resource found for \"" + path + "\"");
<del> }
<del> }
<ide> return null;
<ide> }
<ide> | 1 |
Go | Go | fix docker build | 5248f5c3d1d91ea4235ffe57962e38293af18f34 | <ide><path>pkg/chrootarchive/chroot_linux.go
<ide> func chroot(path string) (err error) {
<ide> }
<ide>
<ide> errCleanup := os.Remove(pivotDir)
<del> if errCleanup != nil {
<add> // pivotDir doesn't exist if pivot_root failed and chroot+chdir was successful
<add> // but we already cleaned it up on failed pivot_root
<add> if errCleanup != nil && !os.IsNotExist(errCleanup) {
<ide> errCleanup = fmt.Errorf("Error cleaning up after pivot: %v", errCleanup)
<ide> if err == nil {
<ide> err = errCleanup
<ide> func chroot(path string) (err error) {
<ide> }()
<ide>
<ide> if err := syscall.PivotRoot(path, pivotDir); err != nil {
<del> // If pivot fails, fall back to the normal chroot
<add> // If pivot fails, fall back to the normal chroot after cleaning up temp dir for pivot_root
<add> if err := os.Remove(pivotDir); err != nil {
<add> return fmt.Errorf("Error cleaning up after failed pivot: %v", err)
<add> }
<ide> return realChroot(path)
<ide> }
<ide> mounted = true
<ide> func realChroot(path string) error {
<ide> return fmt.Errorf("Error after fallback to chroot: %v", err)
<ide> }
<ide> if err := syscall.Chdir("/"); err != nil {
<del> return fmt.Errorf("Error chaning to new root after chroot: %v", err)
<add> return fmt.Errorf("Error changing to new root after chroot: %v", err)
<ide> }
<ide> return nil
<ide> } | 1 |
Text | Text | delete unused definition in readme.md | ca3f9b75851aea0f82860ee17ddd907b57129121 | <ide><path>README.md
<ide> Information on the current Node.js Working Groups can be found in the
<ide> [Code of Conduct]: https://github.com/nodejs/TSC/blob/master/CODE_OF_CONDUCT.md
<ide> [Contributing to the project]: CONTRIBUTING.md
<ide> [Node.js Help]: https://github.com/nodejs/help
<del>[Node.js Moderation Policy]: https://github.com/nodejs/TSC/blob/master/Moderation-Policy.md
<ide> [Node.js Website]: https://nodejs.org/en/
<ide> [Questions tagged 'node.js' on StackOverflow]: https://stackoverflow.com/questions/tagged/node.js
<ide> [#node.js channel on chat.freenode.net]: https://webchat.freenode.net?channels=node.js&uio=d4 | 1 |
Text | Text | make enabling/disabling jupyter mode more explicit | cc66f47893a60596145cd62cba501d0d1410779e | <ide><path>website/docs/usage/visualizers.md
<ide> doc2 = nlp(LONG_NEWS_ARTICLE)
<ide> displacy.render(doc2, style="ent")
<ide> ```
<ide>
<del>> #### Enabling or disabling Jupyter mode
<del>>
<del>> To explicitly enable or disable "Jupyter mode", you can use the `jupyter`
<del>> keyword argument – e.g. to return raw HTML in a notebook, or to force Jupyter
<del>> rendering if auto-detection fails.
<add><Infobox variant="warning" title="Important note">
<add>
<add>To explicitly enable or disable "Jupyter mode", you can use the `jupyter`
<add>keyword argument – e.g. to return raw HTML in a notebook, or to force Jupyter
<add>rendering if auto-detection fails.
<add>
<add></Infobox>
<add>
<ide>
<ide> 
<ide>
<ide> nlp = spacy.load("en_core_web_sm")
<ide> sentences = [u"This is an example.", u"This is another one."]
<ide> for sent in sentences:
<ide> doc = nlp(sent)
<del> svg = displacy.render(doc, style="dep")
<add> svg = displacy.render(doc, style="dep", jupyter=False)
<ide> file_name = '-'.join([w.text for w in doc if not w.is_punct]) + ".svg"
<ide> output_path = Path("/images/" + file_name)
<ide> output_path.open("w", encoding="utf-8").write(svg) | 1 |
Javascript | Javascript | exclude dotfiles when copying assets | 764bd8fa142c98979744027315f110718fba5953 | <ide><path>build/lib/include-path-in-packaged-app.js
<ide> module.exports = function (path) {
<ide> }
<ide>
<ide> const EXCLUDE_REGEXPS_SOURCES = [
<add> escapeRegExp('.DS_Store'),
<add> escapeRegExp('.jshintrc'),
<add> escapeRegExp('.npmignore'),
<add> escapeRegExp('.pairs'),
<add> escapeRegExp('.travis.yml'),
<add> escapeRegExp('appveyor.yml'),
<add> escapeRegExp('circle.yml'),
<add> escapeRegExp('.idea'),
<add> escapeRegExp('.editorconfig'),
<add> escapeRegExp('.lint'),
<add> escapeRegExp('.lintignore'),
<add> escapeRegExp('.eslintrc'),
<add> escapeRegExp('.jshintignore'),
<add> escapeRegExp('coffeelint.json'),
<add> escapeRegExp('.coffeelintignore'),
<add> escapeRegExp('.gitattributes'),
<add> escapeRegExp('.gitkeep'),
<ide> escapeRegExp(path.join('git-utils', 'deps')),
<ide> escapeRegExp(path.join('oniguruma', 'deps')),
<ide> escapeRegExp(path.join('less', 'dist')), | 1 |
PHP | PHP | add whenempty + variants to collection | dfe84e68eee50fa53a8ca886628b6b8a3275a1ad | <ide><path>src/Illuminate/Support/Collection.php
<ide> public function when($value, callable $callback, callable $default = null)
<ide> return $this;
<ide> }
<ide>
<add> /**
<add> * Apply the callback if the collection is empty.
<add> *
<add> * @param bool $value
<add> * @param callable $callback
<add> * @param callable $default
<add> * @return static|mixed
<add> */
<add> public function whenEmpty(callable $callback, callable $default = null)
<add> {
<add> return $this->when($this->isEmpty(), $callback, $default);
<add> }
<add>
<add> /**
<add> * Apply the callback if the collection is not empty.
<add> *
<add> * @param bool $value
<add> * @param callable $callback
<add> * @param callable $default
<add> * @return static|mixed
<add> */
<add> public function whenNotEmpty(callable $callback, callable $default = null)
<add> {
<add> return $this->when($this->isNotEmpty(), $callback, $default);
<add> }
<add>
<ide> /**
<ide> * Apply the callback if the value is falsy.
<ide> *
<ide> public function unless($value, callable $callback, callable $default = null)
<ide> return $this->when(! $value, $callback, $default);
<ide> }
<ide>
<add> /**
<add> * Apply the callback unless the collection is empty.
<add> *
<add> * @param bool $value
<add> * @param callable $callback
<add> * @param callable $default
<add> * @return static|mixed
<add> */
<add> public function unlessEmpty(callable $callback, callable $default = null)
<add> {
<add> return $this->unless($this->isEmpty(), $callback, $default);
<add> }
<add>
<add> /**
<add> * Apply the callback unless the collection is not empty.
<add> *
<add> * @param bool $value
<add> * @param callable $callback
<add> * @param callable $default
<add> * @return static|mixed
<add> */
<add> public function unlessNotEmpty(callable $callback, callable $default = null)
<add> {
<add> return $this->unless($this->isNotEmpty(), $callback, $default);
<add> }
<add>
<ide> /**
<ide> * Filter items by the given key value pair.
<ide> *
<ide><path>tests/Support/SupportCollectionTest.php
<ide> public function testWhenDefault()
<ide> $this->assertSame(['michael', 'tom', 'taylor'], $collection->toArray());
<ide> }
<ide>
<add> public function testWhenEmpty()
<add> {
<add> $collection = new Collection(['michael', 'tom']);
<add>
<add> $collection->whenEmpty(function ($collection) {
<add> return $collection->push('adam');
<add> });
<add>
<add> $this->assertSame(['michael', 'tom'], $collection->toArray());
<add>
<add> $collection = new Collection;
<add>
<add> $collection->whenEmpty(function ($collection) {
<add> return $collection->push('adam');
<add> });
<add>
<add> $this->assertSame(['adam'], $collection->toArray());
<add> }
<add>
<add> public function testWhenEmptyDefault()
<add> {
<add> $collection = new Collection(['michael', 'tom']);
<add>
<add> $collection->whenEmpty(function ($collection) {
<add> return $collection->push('adam');
<add> }, function ($collection) {
<add> return $collection->push('taylor');
<add> });
<add>
<add> $this->assertSame(['michael', 'tom', 'taylor'], $collection->toArray());
<add> }
<add>
<add> public function testWhenNotEmpty()
<add> {
<add> $collection = new Collection(['michael', 'tom']);
<add>
<add> $collection->whenNotEmpty(function ($collection) {
<add> return $collection->push('adam');
<add> });
<add>
<add> $this->assertSame(['michael', 'tom', 'adam'], $collection->toArray());
<add>
<add> $collection = new Collection;
<add>
<add> $collection->whenNotEmpty(function ($collection) {
<add> return $collection->push('adam');
<add> });
<add>
<add> $this->assertSame([], $collection->toArray());
<add> }
<add>
<add> public function testWhenNotEmptyDefault()
<add> {
<add> $collection = new Collection(['michael', 'tom']);
<add>
<add> $collection->whenNotEmpty(function ($collection) {
<add> return $collection->push('adam');
<add> }, function ($collection) {
<add> return $collection->push('taylor');
<add> });
<add>
<add> $this->assertSame(['michael', 'tom', 'adam'], $collection->toArray());
<add> }
<add>
<ide> public function testUnless()
<ide> {
<ide> $collection = new Collection(['michael', 'tom']);
<ide> public function testUnlessDefault()
<ide> $this->assertSame(['michael', 'tom', 'taylor'], $collection->toArray());
<ide> }
<ide>
<add> public function testUnlessEmpty()
<add> {
<add> $collection = new Collection(['michael', 'tom']);
<add>
<add> $collection->unlessEmpty(function ($collection) {
<add> return $collection->push('adam');
<add> });
<add>
<add> $this->assertSame(['michael', 'tom', 'adam'], $collection->toArray());
<add>
<add> $collection = new Collection;
<add>
<add> $collection->unlessEmpty(function ($collection) {
<add> return $collection->push('adam');
<add> });
<add>
<add> $this->assertSame([], $collection->toArray());
<add> }
<add>
<add> public function testUnlessEmptyDefault()
<add> {
<add> $collection = new Collection(['michael', 'tom']);
<add>
<add> $collection->unlessEmpty(function ($collection) {
<add> return $collection->push('adam');
<add> }, function ($collection) {
<add> return $collection->push('taylor');
<add> });
<add>
<add> $this->assertSame(['michael', 'tom', 'adam'], $collection->toArray());
<add> }
<add>
<add> public function testUnlessNotEmpty()
<add> {
<add> $collection = new Collection(['michael', 'tom']);
<add>
<add> $collection->unlessNotEmpty(function ($collection) {
<add> return $collection->push('adam');
<add> });
<add>
<add> $this->assertSame(['michael', 'tom'], $collection->toArray());
<add>
<add> $collection = new Collection;
<add>
<add> $collection->unlessNotEmpty(function ($collection) {
<add> return $collection->push('adam');
<add> });
<add>
<add> $this->assertSame(['adam'], $collection->toArray());
<add> }
<add>
<add> public function testUnlessNotEmptyDefault()
<add> {
<add> $collection = new Collection(['michael', 'tom']);
<add>
<add> $collection->unlessNotEmpty(function ($collection) {
<add> return $collection->push('adam');
<add> }, function ($collection) {
<add> return $collection->push('taylor');
<add> });
<add>
<add> $this->assertSame(['michael', 'tom', 'taylor'], $collection->toArray());
<add> }
<add>
<ide> public function testHasReturnsValidResults()
<ide> {
<ide> $collection = new Collection(['foo' => 'one', 'bar' => 'two', 1 => 'three']); | 2 |
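The tests in this patch exercise Laravel's conditional higher-order helpers (`whenEmpty`, `whenNotEmpty`, `unlessEmpty`, `unlessNotEmpty`): run one callback when the emptiness condition holds, or an optional default callback otherwise. A minimal Python sketch of that pattern — the class and method names below are illustrative, not Laravel's API:

```python
# Minimal sketch of Laravel-style conditional collection helpers.
# `Collection`, `when_empty`, and `unless_not_empty` are hypothetical names.

class Collection:
    def __init__(self, items=None):
        self.items = list(items or [])

    def push(self, value):
        self.items.append(value)
        return self

    def when_empty(self, callback, default=None):
        # Run `callback` if the collection is empty; otherwise run the
        # optional `default` callback, mirroring the PHP tests above.
        if not self.items:
            return callback(self)
        if default is not None:
            return default(self)
        return self

    def unless_not_empty(self, callback, default=None):
        # `unlessNotEmpty` is `whenEmpty` under another name.
        return self.when_empty(callback, default)


c = Collection(["michael", "tom"])
c.when_empty(lambda col: col.push("adam"),
             lambda col: col.push("taylor"))
print(c.items)  # ['michael', 'tom', 'taylor']
```

The non-empty collection skips the first callback and falls through to the default, just as `testWhenEmptyDefault` asserts.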
Javascript | Javascript | fix components inside group helper | 6d9a2e5d4a042424f0f2f90c885c3d80224a6bc1 | <ide><path>packages/ember-handlebars/lib/helpers/binding.js
<ide> function bind(property, options, preserveContext, shouldDisplay, valueNormalizer
<ide> if ('object' === typeof this) {
<ide> if (data.insideGroup) {
<ide> observer = function() {
<add> while (view._contextView) {
<add> view = view._contextView;
<add> }
<ide> run.once(view, 'rerender');
<ide> };
<ide>
<ide> function simpleBind(currentContext, property, options) {
<ide> if (pathRoot && ('object' === typeof pathRoot)) {
<ide> if (data.insideGroup) {
<ide> observer = function() {
<add> while (view._contextView) {
<add> view = view._contextView;
<add> }
<ide> run.once(view, 'rerender');
<ide> };
<ide>
<ide><path>packages/ember-handlebars/tests/helpers/group_test.js
<ide> import { A } from "ember-runtime/system/native_array";
<ide> import Container from "ember-runtime/system/container";
<ide> import { get } from "ember-metal/property_get";
<ide> import { set } from "ember-metal/property_set";
<add>import Component from "ember-views/views/component";
<ide>
<ide> var trim = jQuery.trim;
<ide> var container, view;
<ide> test("an #each can be nested with a view inside", function() {
<ide> equal(view.$().text(), 'ErikTom', "The updated object's view was rerendered");
<ide> });
<ide>
<add>test("an #each can be nested with a component inside", function() {
<add> var yehuda = {name: 'Yehuda'};
<add> container.register('view:test', Component.extend());
<add> createGroupedView(
<add> '{{#each people}}{{#view "test"}}{{name}}{{/view}}{{/each}}',
<add> {people: A([yehuda, {name: 'Tom'}])}
<add> );
<add>
<add> appendView();
<add> equal(view.$('script').length, 0, "No Metamorph markers are output");
<add> equal(view.$().text(), 'YehudaTom', "The content was rendered");
<add>
<add> run(function() {
<add> set(yehuda, 'name', 'Erik');
<add> });
<add>
<add> equal(view.$().text(), 'ErikTom', "The updated object's view was rerendered");
<add>});
<add>
<ide> test("#each with groupedRows=true behaves like a normal bound #each", function() {
<ide> createGroupedView(
<ide> '{{#each numbers groupedRows=true}}{{this}}{{/each}}',
<ide><path>packages/ember-views/lib/views/component.js
<ide> var Component = View.extend(TargetActionSupport, ComponentTemplateDeprecation, {
<ide>
<ide> init: function() {
<ide> this._super();
<add> set(this, 'origContext', get(this, 'context'));
<ide> set(this, 'context', this);
<ide> set(this, 'controller', this);
<ide> },
<ide> var Component = View.extend(TargetActionSupport, ComponentTemplateDeprecation, {
<ide> tagName: '',
<ide> _contextView: parentView,
<ide> template: template,
<del> context: get(parentView, 'context'),
<add> context: options.data.insideGroup ? get(this, 'origContext') : get(parentView, 'context'),
<ide> controller: get(parentView, 'controller'),
<del> templateData: { keywords: parentView.cloneKeywords() }
<add> templateData: { keywords: parentView.cloneKeywords(), insideGroup: options.data.insideGroup }
<ide> });
<ide> }
<ide> }, | 3 |
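The Ember fix above walks up the `_contextView` chain so the rerender is scheduled on the outermost context view rather than on the inner component. A generic Python sketch of that "walk to the root of a parent chain" idiom — `context_view` stands in for Ember's `_contextView`, and all names are illustrative:

```python
# Sketch of the "walk up to the outermost parent" loop from the Ember fix.

class View:
    def __init__(self, name, context_view=None):
        self.name = name
        self.context_view = context_view

def outermost(view):
    # Follow the chain until a view with no context view is reached;
    # that is the view whose rerender actually covers the whole group.
    while view.context_view is not None:
        view = view.context_view
    return view

root = View("group")
middle = View("component", context_view=root)
leaf = View("bound-span", context_view=middle)
print(outermost(leaf).name)  # group
```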
PHP | PHP | create enumerable contract | 7657da10f2a3b9e49270d371685fb836947ca724 | <ide><path>src/Illuminate/Support/Collection.php
<ide> * @property-read HigherOrderCollectionProxy $sum
<ide> * @property-read HigherOrderCollectionProxy $unique
<ide> */
<del>class Collection implements Arrayable, ArrayAccess, Countable, IteratorAggregate, Jsonable, JsonSerializable
<add>class Collection implements ArrayAccess, Enumerable
<ide> {
<ide> use Macroable;
<ide>
<ide><path>src/Illuminate/Support/Enumerable.php
<add><?php
<add>
<add>namespace Illuminate\Support;
<add>
<add>use Countable;
<add>use JsonSerializable;
<add>use IteratorAggregate;
<add>use Illuminate\Contracts\Support\Jsonable;
<add>use Illuminate\Contracts\Support\Arrayable;
<add>
<add>interface Enumerable extends Arrayable, Countable, IteratorAggregate, Jsonable, JsonSerializable
<add>{
<add> /**
<add> * Create a new collection instance if the value isn't one already.
<add> *
<add> * @param mixed $items
<add> * @return static
<add> */
<add> public static function make($items = []);
<add>
<add> /**
<add> * Create a new instance by invoking the callback a given amount of times.
<add> *
<add> * @param int $number
<add> * @param callable $callback
<add> * @return static
<add> */
<add> public static function times($number, callable $callback = null);
<add>
<add> /**
<add> * Wrap the given value in a collection if applicable.
<add> *
<add> * @param mixed $value
<add> * @return static
<add> */
<add> public static function wrap($value);
<add>
<add> /**
<add> * Get the underlying items from the given collection if applicable.
<add> *
<add> * @param array|static $value
<add> * @return array
<add> */
<add> public static function unwrap($value);
<add>
<add> /**
<add> * Get all items in the enumerable.
<add> *
<add> * @return array
<add> */
<add> public function all();
<add>
<add> /**
<add> * Alias for the "avg" method.
<add> *
<add> * @param callable|string|null $callback
<add> * @return mixed
<add> */
<add> public function average($callback = null);
<add>
<add> /**
<add> * Get the median of a given key.
<add> *
<add> * @param string|array|null $key
<add> * @return mixed
<add> */
<add> public function median($key = null);
<add>
<add> /**
<add> * Get the mode of a given key.
<add> *
<add> * @param string|array|null $key
<add> * @return array|null
<add> */
<add> public function mode($key = null);
<add>
<add> /**
<add> * Collapse the items into a single enumerable.
<add> *
<add> * @return static
<add> */
<add> public function collapse();
<add>
<add> /**
<add> * Alias for the "contains" method.
<add> *
<add> * @param mixed $key
<add> * @param mixed $operator
<add> * @param mixed $value
<add> * @return bool
<add> */
<add> public function some($key, $operator = null, $value = null);
<add>
<add> /**
<add> * Determine if an item exists, using strict comparison.
<add> *
<add> * @param mixed $key
<add> * @param mixed $value
<add> * @return bool
<add> */
<add> public function containsStrict($key, $value = null);
<add>
<add> /**
<add> * Get the average value of a given key.
<add> *
<add> * @param callable|string|null $callback
<add> * @return mixed
<add> */
<add> public function avg($callback = null);
<add>
<add> /**
<add> * Determine if an item exists in the enumerable.
<add> *
<add> * @param mixed $key
<add> * @param mixed $operator
<add> * @param mixed $value
<add> * @return bool
<add> */
<add> public function contains($key, $operator = null, $value = null);
<add>
<add> /**
<add> * Dump the collection and end the script.
<add> *
<add> * @param mixed ...$args
<add> * @return void
<add> */
<add> public function dd(...$args);
<add>
<add> /**
<add> * Dump the collection.
<add> *
<add> * @return $this
<add> */
<add> public function dump();
<add>
<add> /**
<add> * Get the items that are not present in the given items.
<add> *
<add> * @param mixed $items
<add> * @return static
<add> */
<add> public function diff($items);
<add>
<add> /**
<add> * Get the items that are not present in the given items, using the callback.
<add> *
<add> * @param mixed $items
<add> * @param callable $callback
<add> * @return static
<add> */
<add> public function diffUsing($items, callable $callback);
<add>
<add> /**
<add> * Get the items whose keys and values are not present in the given items.
<add> *
<add> * @param mixed $items
<add> * @return static
<add> */
<add> public function diffAssoc($items);
<add>
<add> /**
<add> * Get the items whose keys and values are not present in the given items.
<add> *
<add> * @param mixed $items
<add> * @param callable $callback
<add> * @return static
<add> */
<add> public function diffAssocUsing($items, callable $callback);
<add>
<add> /**
<add> * Get the items whose keys are not present in the given items.
<add> *
<add> * @param mixed $items
<add> * @return static
<add> */
<add> public function diffKeys($items);
<add>
<add> /**
<add> * Get the items whose keys are not present in the given items.
<add> *
<add> * @param mixed $items
<add> * @param callable $callback
<add> * @return static
<add> */
<add> public function diffKeysUsing($items, callable $callback);
<add>
<add> /**
<add> * Retrieve duplicate items.
<add> *
<add> * @param callable|null $callback
<add> * @param bool $strict
<add> * @return static
<add> */
<add> public function duplicates($callback = null, $strict = false);
<add>
<add> /**
<add> * Retrieve duplicate items using strict comparison.
<add> *
<add> * @param callable|null $callback
<add> * @return static
<add> */
<add> public function duplicatesStrict($callback = null);
<add>
<add> /**
<add> * Execute a callback over each item.
<add> *
<add> * @param callable $callback
<add> * @return $this
<add> */
<add> public function each(callable $callback);
<add>
<add> /**
<add> * Execute a callback over each nested chunk of items.
<add> *
<add> * @param callable $callback
<add> * @return static
<add> */
<add> public function eachSpread(callable $callback);
<add>
<add> /**
<add> * Determine if all items pass the given test.
<add> *
<add> * @param string|callable $key
<add> * @param mixed $operator
<add> * @param mixed $value
<add> * @return bool
<add> */
<add> public function every($key, $operator = null, $value = null);
<add>
<add> /**
<add> * Get all items except for those with the specified keys.
<add> *
<add> * @param mixed $keys
<add> * @return static
<add> */
<add> public function except($keys);
<add>
<add> /**
<add> * Run a filter over each of the items.
<add> *
<add> * @param callable|null $callback
<add> * @return static
<add> */
<add> public function filter(callable $callback = null);
<add>
<add> /**
<add> * Apply the callback if the value is truthy.
<add> *
<add> * @param bool $value
<add> * @param callable $callback
<add> * @param callable $default
<add> * @return static|mixed
<add> */
<add> public function when($value, callable $callback, callable $default = null);
<add>
<add> /**
<add> * Apply the callback if the collection is empty.
<add> *
<add> * @param callable $callback
<add> * @param callable $default
<add> * @return static|mixed
<add> */
<add> public function whenEmpty(callable $callback, callable $default = null);
<add>
<add> /**
<add> * Apply the callback if the collection is not empty.
<add> *
<add> * @param callable $callback
<add> * @param callable $default
<add> * @return static|mixed
<add> */
<add> public function whenNotEmpty(callable $callback, callable $default = null);
<add>
<add> /**
<add> * Apply the callback if the value is falsy.
<add> *
<add> * @param bool $value
<add> * @param callable $callback
<add> * @param callable $default
<add> * @return static|mixed
<add> */
<add> public function unless($value, callable $callback, callable $default = null);
<add>
<add> /**
<add> * Apply the callback unless the collection is empty.
<add> *
<add> * @param callable $callback
<add> * @param callable $default
<add> * @return static|mixed
<add> */
<add> public function unlessEmpty(callable $callback, callable $default = null);
<add>
<add> /**
<add> * Apply the callback unless the collection is not empty.
<add> *
<add> * @param callable $callback
<add> * @param callable $default
<add> * @return static|mixed
<add> */
<add> public function unlessNotEmpty(callable $callback, callable $default = null);
<add>
<add> /**
<add> * Filter items by the given key value pair.
<add> *
<add> * @param string $key
<add> * @param mixed $operator
<add> * @param mixed $value
<add> * @return static
<add> */
<add> public function where($key, $operator = null, $value = null);
<add>
<add> /**
<add> * Filter items by the given key value pair using strict comparison.
<add> *
<add> * @param string $key
<add> * @param mixed $value
<add> * @return static
<add> */
<add> public function whereStrict($key, $value);
<add>
<add> /**
<add> * Filter items by the given key value pair.
<add> *
<add> * @param string $key
<add> * @param mixed $values
<add> * @param bool $strict
<add> * @return static
<add> */
<add> public function whereIn($key, $values, $strict = false);
<add>
<add> /**
<add> * Filter items by the given key value pair using strict comparison.
<add> *
<add> * @param string $key
<add> * @param mixed $values
<add> * @return static
<add> */
<add> public function whereInStrict($key, $values);
<add>
<add> /**
<add> * Filter items such that the value of the given key is between the given values.
<add> *
<add> * @param string $key
<add> * @param array $values
<add> * @return static
<add> */
<add> public function whereBetween($key, $values);
<add>
<add> /**
<add> * Filter items such that the value of the given key is not between the given values.
<add> *
<add> * @param string $key
<add> * @param array $values
<add> * @return static
<add> */
<add> public function whereNotBetween($key, $values);
<add>
<add> /**
<add> * Filter items by the given key value pair.
<add> *
<add> * @param string $key
<add> * @param mixed $values
<add> * @param bool $strict
<add> * @return static
<add> */
<add> public function whereNotIn($key, $values, $strict = false);
<add>
<add> /**
<add> * Filter items by the given key value pair using strict comparison.
<add> *
<add> * @param string $key
<add> * @param mixed $values
<add> * @return static
<add> */
<add> public function whereNotInStrict($key, $values);
<add>
<add> /**
<add> * Filter the items, removing any items that don't match the given type.
<add> *
<add> * @param string $type
<add> * @return static
<add> */
<add> public function whereInstanceOf($type);
<add>
<add> /**
<add> * Get the first item from the enumerable passing the given truth test.
<add> *
<add> * @param callable|null $callback
<add> * @param mixed $default
<add> * @return mixed
<add> */
<add> public function first(callable $callback = null, $default = null);
<add>
<add> /**
<add> * Get the first item by the given key value pair.
<add> *
<add> * @param string $key
<add> * @param mixed $operator
<add> * @param mixed $value
<add> * @return mixed
<add> */
<add> public function firstWhere($key, $operator = null, $value = null);
<add>
<add> /**
<add> * Flip the values with their keys.
<add> *
<add> * @return static
<add> */
<add> public function flip();
<add>
<add> /**
<add> * Remove an item by key.
<add> *
<add> * @param string|array $keys
<add> * @return $this
<add> */
<add> public function forget($keys);
<add>
<add> /**
<add> * Get an item from the collection by key.
<add> *
<add> * @param mixed $key
<add> * @param mixed $default
<add> * @return mixed
<add> */
<add> public function get($key, $default = null);
<add>
<add> /**
<add> * Group an associative array by a field or using a callback.
<add> *
<add> * @param array|callable|string $groupBy
<add> * @param bool $preserveKeys
<add> * @return static
<add> */
<add> public function groupBy($groupBy, $preserveKeys = false);
<add>
<add> /**
<add> * Key an associative array by a field or using a callback.
<add> *
<add> * @param callable|string $keyBy
<add> * @return static
<add> */
<add> public function keyBy($keyBy);
<add>
<add> /**
<add> * Determine if an item exists in the collection by key.
<add> *
<add> * @param mixed $key
<add> * @return bool
<add> */
<add> public function has($key);
<add>
<add> /**
<add> * Concatenate values of a given key as a string.
<add> *
<add> * @param string $value
<add> * @param string $glue
<add> * @return string
<add> */
<add> public function implode($value, $glue = null);
<add>
<add> /**
<add> * Intersect the collection with the given items.
<add> *
<add> * @param mixed $items
<add> * @return static
<add> */
<add> public function intersect($items);
<add>
<add> /**
<add> * Intersect the collection with the given items by key.
<add> *
<add> * @param mixed $items
<add> * @return static
<add> */
<add> public function intersectByKeys($items);
<add>
<add> /**
<add> * Determine if the collection is empty or not.
<add> *
<add> * @return bool
<add> */
<add> public function isEmpty();
<add>
<add> /**
<add> * Determine if the collection is not empty.
<add> *
<add> * @return bool
<add> */
<add> public function isNotEmpty();
<add>
<add> /**
<add> * Join all items from the collection using a string. The final items can use a separate glue string.
<add> *
<add> * @param string $glue
<add> * @param string $finalGlue
<add> * @return string
<add> */
<add> public function join($glue, $finalGlue = '');
<add>
<add> /**
<add> * Get the keys of the collection items.
<add> *
<add> * @return static
<add> */
<add> public function keys();
<add>
<add> /**
<add> * Get the last item from the collection.
<add> *
<add> * @param callable|null $callback
<add> * @param mixed $default
<add> * @return mixed
<add> */
<add> public function last(callable $callback = null, $default = null);
<add>
<add> /**
<add> * Run a map over each of the items.
<add> *
<add> * @param callable $callback
<add> * @return static
<add> */
<add> public function map(callable $callback);
<add>
<add> /**
<add> * Run a map over each nested chunk of items.
<add> *
<add> * @param callable $callback
<add> * @return static
<add> */
<add> public function mapSpread(callable $callback);
<add>
<add> /**
<add> * Run a dictionary map over the items.
<add> *
<add> * The callback should return an associative array with a single key/value pair.
<add> *
<add> * @param callable $callback
<add> * @return static
<add> */
<add> public function mapToDictionary(callable $callback);
<add>
<add> /**
<add> * Run a grouping map over the items.
<add> *
<add> * The callback should return an associative array with a single key/value pair.
<add> *
<add> * @param callable $callback
<add> * @return static
<add> */
<add> public function mapToGroups(callable $callback);
<add>
<add> /**
<add> * Run an associative map over each of the items.
<add> *
<add> * The callback should return an associative array with a single key/value pair.
<add> *
<add> * @param callable $callback
<add> * @return static
<add> */
<add> public function mapWithKeys(callable $callback);
<add>
<add> /**
<add> * Map a collection and flatten the result by a single level.
<add> *
<add> * @param callable $callback
<add> * @return static
<add> */
<add> public function flatMap(callable $callback);
<add>
<add> /**
<add> * Map the values into a new class.
<add> *
<add> * @param string $class
<add> * @return static
<add> */
<add> public function mapInto($class);
<add>
<add> /**
<add> * Merge the collection with the given items.
<add> *
<add> * @param mixed $items
<add> * @return static
<add> */
<add> public function merge($items);
<add>
<add> /**
<add> * Recursively merge the collection with the given items.
<add> *
<add> * @param mixed $items
<add> * @return static
<add> */
<add> public function mergeRecursive($items);
<add>
<add> /**
<add> * Create a collection by using this collection for keys and another for its values.
<add> *
<add> * @param mixed $values
<add> * @return static
<add> */
<add> public function combine($values);
<add>
<add> /**
<add> * Union the collection with the given items.
<add> *
<add> * @param mixed $items
<add> * @return static
<add> */
<add> public function union($items);
<add>
<add> /**
<add> * Get the min value of a given key.
<add> *
<add> * @param callable|string|null $callback
<add> * @return mixed
<add> */
<add> public function min($callback = null);
<add>
<add> /**
<add> * Get the max value of a given key.
<add> *
<add> * @param callable|string|null $callback
<add> * @return mixed
<add> */
<add> public function max($callback = null);
<add>
<add> /**
<add> * Create a new collection consisting of every n-th element.
<add> *
<add> * @param int $step
<add> * @param int $offset
<add> * @return static
<add> */
<add> public function nth($step, $offset = 0);
<add>
<add> /**
<add> * Get the items with the specified keys.
<add> *
<add> * @param mixed $keys
<add> * @return static
<add> */
<add> public function only($keys);
<add>
<add> /**
<add> * "Paginate" the collection by slicing it into a smaller collection.
<add> *
<add> * @param int $page
<add> * @param int $perPage
<add> * @return static
<add> */
<add> public function forPage($page, $perPage);
<add>
<add> /**
<add> * Partition the collection into two arrays using the given callback or key.
<add> *
<add> * @param callable|string $key
<add> * @param mixed $operator
<add> * @param mixed $value
<add> * @return static
<add> */
<add> public function partition($key, $operator = null, $value = null);
<add>
<add> /**
<add> * Get and remove the last item from the collection.
<add> *
<add> * @return mixed
<add> */
<add> public function pop();
<add>
<add> /**
<add> * Push an item onto the beginning of the collection.
<add> *
<add> * @param mixed $value
<add> * @param mixed $key
<add> * @return $this
<add> */
<add> public function prepend($value, $key = null);
<add>
<add> /**
<add> * Push an item onto the end of the collection.
<add> *
<add> * @param mixed $value
<add> * @return $this
<add> */
<add> public function push($value);
<add>
<add> /**
<add> * Push all of the given items onto the collection.
<add> *
<add> * @param iterable $source
<add> * @return static
<add> */
<add> public function concat($source);
<add>
<add> /**
<add> * Put an item in the collection by key.
<add> *
<add> * @param mixed $key
<add> * @param mixed $value
<add> * @return $this
<add> */
<add> public function put($key, $value);
<add>
<add> /**
<add> * Get one or a specified number of items randomly from the collection.
<add> *
<add> * @param int|null $number
<add> * @return static|mixed
<add> *
<add> * @throws \InvalidArgumentException
<add> */
<add> public function random($number = null);
<add>
<add> /**
<add> * Reduce the collection to a single value.
<add> *
<add> * @param callable $callback
<add> * @param mixed $initial
<add> * @return mixed
<add> */
<add> public function reduce(callable $callback, $initial = null);
<add>
<add> /**
<add> * Replace the collection items with the given items.
<add> *
<add> * @param mixed $items
<add> * @return static
<add> */
<add> public function replace($items);
<add>
<add> /**
<add> * Recursively replace the collection items with the given items.
<add> *
<add> * @param mixed $items
<add> * @return static
<add> */
<add> public function replaceRecursive($items);
<add>
<add> /**
<add> * Reverse items order.
<add> *
<add> * @return static
<add> */
<add> public function reverse();
<add>
<add> /**
<add> * Search the collection for a given value and return the corresponding key if successful.
<add> *
<add> * @param mixed $value
<add> * @param bool $strict
<add> * @return mixed
<add> */
<add> public function search($value, $strict = false);
<add>
<add> /**
<add> * Get and remove the first item from the collection.
<add> *
<add> * @return mixed
<add> */
<add> public function shift();
<add>
<add> /**
<add> * Shuffle the items in the collection.
<add> *
<add> * @param int $seed
<add> * @return static
<add> */
<add> public function shuffle($seed = null);
<add>
<add> /**
<add> * Get a slice of items from the enumerable.
<add> *
<add> * @param int $offset
<add> * @param int $length
<add> * @return static
<add> */
<add> public function slice($offset, $length = null);
<add>
<add> /**
<add> * Split a collection into a certain number of groups.
<add> *
<add> * @param int $numberOfGroups
<add> * @return static
<add> */
<add> public function split($numberOfGroups);
<add>
<add> /**
<add> * Chunk the collection into chunks of the given size.
<add> *
<add> * @param int $size
<add> * @return static
<add> */
<add> public function chunk($size);
<add>
<add> /**
<add> * Sort through each item with a callback.
<add> *
<add> * @param callable|null $callback
<add> * @return static
<add> */
<add> public function sort(callable $callback = null);
<add>
<add> /**
<add> * Sort the collection using the given callback.
<add> *
<add> * @param callable|string $callback
<add> * @param int $options
<add> * @param bool $descending
<add> * @return static
<add> */
<add> public function sortBy($callback, $options = SORT_REGULAR, $descending = false);
<add>
<add> /**
<add> * Sort the collection in descending order using the given callback.
<add> *
<add> * @param callable|string $callback
<add> * @param int $options
<add> * @return static
<add> */
<add> public function sortByDesc($callback, $options = SORT_REGULAR);
<add>
<add> /**
<add> * Sort the collection keys.
<add> *
<add> * @param int $options
<add> * @param bool $descending
<add> * @return static
<add> */
<add> public function sortKeys($options = SORT_REGULAR, $descending = false);
<add>
<add> /**
<add> * Sort the collection keys in descending order.
<add> *
<add> * @param int $options
<add> * @return static
<add> */
<add> public function sortKeysDesc($options = SORT_REGULAR);
<add>
<add> /**
<add> * Splice a portion of the underlying collection array.
<add> *
<add> * @param int $offset
<add> * @param int|null $length
<add> * @param mixed $replacement
<add> * @return static
<add> */
<add> public function splice($offset, $length = null, $replacement = []);
<add>
<add> /**
<add> * Get the sum of the given values.
<add> *
<add> * @param callable|string|null $callback
<add> * @return mixed
<add> */
<add> public function sum($callback = null);
<add>
<add> /**
<add> * Take the first or last {$limit} items.
<add> *
<add> * @param int $limit
<add> * @return static
<add> */
<add> public function take($limit);
<add>
<add> /**
<add> * Pass the collection to the given callback and then return it.
<add> *
<add> * @param callable $callback
<add> * @return $this
<add> */
<add> public function tap(callable $callback);
<add>
<add> /**
<add> * Transform each item in the collection using a callback.
<add> *
<add> * @param callable $callback
<add> * @return $this
<add> */
<add> public function transform(callable $callback);
<add>
<add> /**
<add> * Pass the enumerable to the given callback and return the result.
<add> *
<add> * @param callable $callback
<add> * @return mixed
<add> */
<add> public function pipe(callable $callback);
<add>
<add> /**
<add> * Get the values of a given key.
<add> *
<add> * @param string|array $value
<add> * @param string|null $key
<add> * @return static
<add> */
<add> public function pluck($value, $key = null);
<add>
<add> /**
<add> * Create a collection of all elements that do not pass a given truth test.
<add> *
<add> * @param callable|mixed $callback
<add> * @return static
<add> */
<add> public function reject($callback = true);
<add>
<add> /**
<add> * Return only unique items from the collection array.
<add> *
<add> * @param string|callable|null $key
<add> * @param bool $strict
<add> * @return static
<add> */
<add> public function unique($key = null, $strict = false);
<add>
<add> /**
<add> * Return only unique items from the collection array using strict comparison.
<add> *
<add> * @param string|callable|null $key
<add> * @return static
<add> */
<add> public function uniqueStrict($key = null);
<add>
<add> /**
<add> * Reset the keys on the underlying array.
<add> *
<add> * @return static
<add> */
<add> public function values();
<add>
<add> /**
<add> * Pad collection to the specified length with a value.
<add> *
<add> * @param int $size
<add> * @param mixed $value
<add> * @return static
<add> */
<add> public function pad($size, $value);
<add>
<add> /**
<add> * Count the number of items in the collection using a given truth test.
<add> *
<add> * @param callable|null $callback
<add> * @return static
<add> */
<add> public function countBy($callback = null);
<add>
<add> /**
<add> * Add an item to the collection.
<add> *
<add> * @param mixed $item
<add> * @return $this
<add> */
<add> public function add($item);
<add>
<add> /**
<add> * Convert the collection to its string representation.
<add> *
<add> * @return string
<add> */
<add> public function __toString();
<add>
<add> /**
<add> * Add a method to the list of proxied methods.
<add> *
<add> * @param string $method
<add> * @return void
<add> */
<add> public static function proxy($method);
<add>
<add> /**
<add> * Dynamically access collection proxies.
<add> *
<add> * @param string $key
<add> * @return mixed
<add> *
<add> * @throws \Exception
<add> */
<add> public function __get($key);
<add>} | 2 |
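This commit extracts `Collection`'s public surface into an `Enumerable` interface so other enumerable types can share the contract. Python has no interfaces, but `abc.ABC` gives a comparable shape; the sketch below mirrors just two of the interface's many methods, and all names are illustrative:

```python
from abc import ABC, abstractmethod

# Rough Python analogue of extracting a contract: abstract methods name
# the surface, and concrete classes must implement every one of them.

class Enumerable(ABC):
    @abstractmethod
    def all(self): ...

    @abstractmethod
    def map(self, callback): ...

class Collection(Enumerable):
    def __init__(self, items=None):
        self.items = list(items or [])

    def all(self):
        return self.items

    def map(self, callback):
        return Collection([callback(i) for i in self.items])

nums = Collection([1, 2, 3])
print(nums.map(lambda n: n * 2).all())  # [2, 4, 6]
```

Instantiating `Enumerable` directly fails with a `TypeError`, which is the abc-module analogue of PHP refusing to instantiate an interface.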
Ruby | Ruby | remove useless call to mb_chars | 803e9bab84df9a93ca5f4613f710b6e088fbe64b | <ide><path>actionpack/lib/action_view/helpers/text_helper.rb
<ide> def excerpt(text, phrase, *args)
<ide> options.reverse_merge!(:radius => 100, :omission => "...")
<ide>
<ide> phrase = Regexp.escape(phrase)
<del> return unless found_pos = text.mb_chars =~ /(#{phrase})/i
<add> return unless found_pos = text =~ /(#{phrase})/i
<ide>
<ide> start_pos = [ found_pos - options[:radius], 0 ].max
<del> end_pos = [ [ found_pos + phrase.mb_chars.length + options[:radius] - 1, 0].max, text.mb_chars.length ].min
<add> end_pos = [ [ found_pos + phrase.length + options[:radius] - 1, 0].max, text.length ].min
<ide>
<ide> prefix = start_pos > 0 ? options[:omission] : ""
<del> postfix = end_pos < text.mb_chars.length - 1 ? options[:omission] : ""
<add> postfix = end_pos < text.length - 1 ? options[:omission] : ""
<ide>
<del> prefix + text.mb_chars[start_pos..end_pos].strip + postfix
<add> prefix + text[start_pos..end_pos].strip + postfix
<ide> end
<ide>
<ide> # Attempts to pluralize the +singular+ word unless +count+ is 1. If
<ide><path>activerecord/lib/active_record/validations/uniqueness.rb
<ide> def find_finder_class_for(record) #:nodoc:
<ide>
<ide> def build_relation(klass, table, attribute, value) #:nodoc:
<ide> column = klass.columns_hash[attribute.to_s]
<del> value = column.limit ? value.to_s.mb_chars[0, column.limit] : value.to_s if column.text?
<add> value = column.limit ? value.to_s[0, column.limit] : value.to_s if column.text?
<ide>
<ide> if !options[:case_sensitive] && value && column.text?
<ide> # will use SQL LOWER function before comparison, unless it detects a case insensitive collation | 2 |
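The Rails change drops `mb_chars` because plain `String` handles multibyte indexing in modern Ruby; the underlying `excerpt` helper finds the phrase and keeps `radius` characters on each side, marking truncation with an omission string. A Python sketch of that windowing logic — this is an illustration of the algorithm, not Rails' implementation:

```python
import re

def excerpt(text, phrase, radius=100, omission="..."):
    # Locate the phrase (case-insensitively) and keep `radius` characters
    # on each side of it, marking any truncation with `omission`.
    match = re.search(re.escape(phrase), text, re.IGNORECASE)
    if match is None:
        return None
    found = match.start()
    start = max(found - radius, 0)
    end = min(found + len(phrase) + radius - 1, len(text))
    prefix = omission if start > 0 else ""
    postfix = omission if end < len(text) - 1 else ""
    return prefix + text[start:end + 1].strip() + postfix

print(excerpt("This is a beautiful morning", "beautiful", radius=5))
# ...is a beautiful morn...
```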
Javascript | Javascript | simplify diffiehellman getformat function | 3b8ab2ac7f875e39de446d8a7a98f824a5dae5b9 | <ide><path>lib/internal/crypto/diffiehellman.js
<ide> function encode(buffer, encoding) {
<ide> }
<ide>
<ide> function getFormat(format) {
<del> let f;
<ide> if (format) {
<ide> if (format === 'compressed')
<del> f = POINT_CONVERSION_COMPRESSED;
<del> else if (format === 'hybrid')
<del> f = POINT_CONVERSION_HYBRID;
<del> // Default
<del> else if (format === 'uncompressed')
<del> f = POINT_CONVERSION_UNCOMPRESSED;
<del> else
<add> return POINT_CONVERSION_COMPRESSED;
<add> if (format === 'hybrid')
<add> return POINT_CONVERSION_HYBRID;
<add> if (format !== 'uncompressed')
<ide> throw new ERR_CRYPTO_ECDH_INVALID_FORMAT(format);
<del> } else {
<del> f = POINT_CONVERSION_UNCOMPRESSED;
<ide> }
<del> return f;
<add> return POINT_CONVERSION_UNCOMPRESSED;
<ide> }
<ide>
<ide> module.exports = { | 1 |
Ruby | Ruby | fix fish caveats under env filtering | 55d97500565c02a70feb68dc747d9e4ef673068e | <ide><path>Library/Homebrew/caveats.rb
<ide> def keg_only_text
<ide>
<ide> def function_completion_caveats(shell)
<ide> return unless keg
<del> return unless which(shell.to_s)
<add> return unless which(shell.to_s, ENV["HOMEBREW_PATH"])
<ide>
<ide> completion_installed = keg.completion_installed?(shell)
<ide> functions_installed = keg.functions_installed?(shell) | 1 |
PHP | PHP | fix style issue | 84cc466c0e78140b15aade306dee6cea3dac5b59 | <ide><path>tests/Database/DatabaseEloquentRelationTest.php
<ide> public function testIgnoredModelsStateIsResetWhenThereAreExceptions()
<ide>
<ide> $this->fail('Exception was not thrown');
<ide> } catch (\Exception $exception) {
<del>
<add> // Does nothing.
<ide> }
<ide>
<ide> $this->assertTrue($related->shouldTouch()); | 1 |
Python | Python | remove self.using_mysql attribute | 2ac45b011d04ac141a15abfc24cc054687cadbc2 | <ide><path>airflow/jobs/scheduler_job.py
<ide> def __init__(
<ide> # Check what SQL backend we use
<ide> sql_conn: str = conf.get_mandatory_value('database', 'sql_alchemy_conn').lower()
<ide> self.using_sqlite = sql_conn.startswith('sqlite')
<del> self.using_mysql = sql_conn.startswith('mysql')
<ide> # Dag Processor agent - not used in Dag Processor standalone mode.
<ide> self.processor_agent: DagFileProcessorAgent | None = None
<ide> | 1 |
Python | Python | solve pr issues | 9c0da3dd4655ee7c58fee2c709670bec518c6354 | <ide><path>libcloud/compute/drivers/openstack.py
<ide> def _to_volume(self, api_node):
<ide>
<ide> return StorageVolume(
<ide> id=api_node['id'],
<del> name=api_node.get('name', api_node.get('displayName', None)),
<add> name=api_node.get('name', api_node.get('displayName')),
<ide> size=api_node['size'],
<ide> state=state,
<ide> driver=self,
<ide> extra={
<ide> 'description': api_node.get('description',
<del> api_node.get('displayDescription',
<del> None)),
<add> api_node.get('displayDescription')
<add> ),
<ide> 'attachments': [att for att in api_node['attachments'] if att],
<ide> # TODO: remove in 1.18.0
<ide> 'state': api_node.get('status', None),
<ide> 'snapshot_id': api_node.get('snapshot_id',
<del> api_node.get('snapshotId', None)),
<add> api_node.get('snapshotId')),
<ide> 'location': api_node.get('availability_zone',
<del> api_node.get('availabilityZone',
<del> None)),
<add> api_node.get('availabilityZone')),
<ide> 'volume_type': api_node.get('volume_type',
<del> api_node.get('volumeType', None)),
<add> api_node.get('volumeType')),
<ide> 'metadata': api_node.get('metadata', None),
<ide> 'created_at': api_node.get('created_at',
<del> api_node.get('createdAt', None))
<add> api_node.get('createdAt'))
<ide> }
<ide> )
<ide>
<ide> def create_volume_snapshot(self, volume, name=None, ex_description=None,
<ide> data=data).object)
<ide>
<ide> def destroy_volume_snapshot(self, snapshot):
<add> """
<add> Delete a Volume Snapshot.
<add>
<add> :param snapshot: Snapshot to be deleted
<add> :type snapshot: :class:`VolumeSnapshot`
<add>
<add> :rtype: ``bool``
<add> """
<ide> resp = self.volumev2_connection.request('/snapshots/%s' % snapshot.id,
<ide> method='DELETE')
<ide> return resp.status in (httplib.NO_CONTENT, httplib.ACCEPTED)
<ide> def __repr__(self):
<ide> % (self.id, self.ip_address, self.pool, self.driver))
<ide>
<ide>
<del>class OpenStack_2_FloatingIpPool(OpenStack_1_1_FloatingIpPool):
<add>class OpenStack_2_FloatingIpPool(object):
<ide> """
<ide> Floating IP Pool info.
<ide> """ | 1 |
PHP | PHP | fix formattting and make getter | bba04a1598c44a892e918c4f308407b0d297f217 | <ide><path>src/Illuminate/Routing/Route.php
<ide> public function secure()
<ide> }
<ide>
<ide> /**
<del> * Get the domain defined for the route.
<add> * Get or set the domain for the route.
<ide> *
<del> * @return string|null
<add> * @param string|null $domain
<add> * @return $this
<ide> */
<del> public function getDomain()
<add> public function domain($domain = null)
<ide> {
<del> return isset($this->action['domain'])
<del> ? str_replace(['http://', 'https://'], '', $this->action['domain']) : null;
<add> if (is_null($domain)) {
<add> return $this->getDomain();
<add> }
<add>
<add> $this->action['domain'] = $domain;
<add>
<add> return $this;
<ide> }
<ide>
<ide> /**
<del> * Add a domain to the route URI.
<add> * Get the domain defined for the route.
<ide> *
<del> * @param string $domain
<del> * @return $this
<add> * @return string|null
<ide> */
<del> public function domain($domain)
<add> public function getDomain()
<ide> {
<del> $this->action['domain'] = $domain;
<del>
<del> return $this;
<add> return isset($this->action['domain'])
<add> ? str_replace(['http://', 'https://'], '', $this->action['domain']) : null;
<ide> }
<ide>
<ide> /** | 1 |
PHP | PHP | add grammar typehint | a2e186009d10523ecdcb8037c8522f772342818a | <ide><path>src/Illuminate/Database/Schema/Grammars/RenameColumn.php
<ide> class RenameColumn
<ide> * @param \Illuminate\Database\Connection $connection
<ide> * @return array
<ide> */
<del> public static function compile($grammar, Blueprint $blueprint, Fluent $command, Connection $connection)
<add> public static function compile(Grammar $grammar, Blueprint $blueprint, Fluent $command, Connection $connection)
<ide> {
<ide> $column = $connection->getDoctrineColumn(
<ide> $grammar->getTablePrefix().$blueprint->getTable(), $command->from
<ide> public static function compile($grammar, Blueprint $blueprint, Fluent $command,
<ide> * @param \Doctrine\DBAL\Schema\AbstractSchemaManager $schema
<ide> * @return \Doctrine\DBAL\Schema\TableDiff
<ide> */
<del> protected static function getRenamedDiff($grammar, Blueprint $blueprint, Fluent $command, Column $column, SchemaManager $schema)
<add> protected static function getRenamedDiff(Grammar $grammar, Blueprint $blueprint, Fluent $command, Column $column, SchemaManager $schema)
<ide> {
<ide> return static::setRenamedColumns(
<ide> $grammar->getDoctrineTableDiff($blueprint, $schema), $command, $column | 1 |
Ruby | Ruby | restore x11 description to --config output | 2761d3ee49cd3e2f9006733b8b1084cccb063e28 | <ide><path>Library/Homebrew/cmd/--config.rb
<ide> def describe_path path
<ide>
<ide> def describe_x11
<ide> return "N/A" unless MacOS::XQuartz.installed?
<del> return "#{MacOS::XQuartz.version} in " + describe_path(MacOS::XQuartz.prefix)
<add> return "#{MacOS::XQuartz.version} => " + describe_path(MacOS::XQuartz.prefix)
<ide> end
<ide>
<ide> def describe_perl
<ide> def dump_verbose_config
<ide> puts "Clang: #{clang ? "#{clang} build #{clang_build}" : "N/A"}"
<ide> ponk = macports_or_fink_installed?
<ide> puts "MacPorts or Fink? #{ponk}" if ponk
<add> puts "X11: #{describe_x11}"
<ide> puts "System Ruby: #{RUBY_VERSION}-#{RUBY_PATCHLEVEL}"
<ide> puts "Perl: #{describe_perl}"
<ide> puts "Python: #{describe_python}" | 1 |
Ruby | Ruby | add cask#outdated_info to format output | 65ff9155f8bb5b4e598fa015431b856b6115184b | <ide><path>Library/Homebrew/cask/cask.rb
<ide> def outdated_versions(greedy = false)
<ide> installed.reject { |v| v == version }
<ide> end
<ide>
<add> def outdated_info(greedy, verbose, json)
<add> if json
<add> {
<add> name: token,
<add> installed_versions: outdated_versions(greedy).join(", "),
<add> current_version: version
<add> }
<add> elsif verbose
<add> outdated_info = token << " (#{outdated_versions(greedy).join(", ")})"
<add> "#{outdated_info} != #{version}"
<add> else
<add> token
<add> end
<add> end
<add>
<ide> def to_s
<ide> @token
<ide> end | 1 |
Javascript | Javascript | replace react transform hmr with react refresh | d7c8ace00174d1b298e6520fb06035b664bf28ab | <ide><path>Libraries/Core/setUpDeveloperTools.js
<ide> if (__DEV__) {
<ide> });
<ide> }
<ide>
<del> // This is used by the require.js polyfill for hot reloading.
<del> // TODO(t9759686) Scan polyfills for dependencies, too
<del> const reload = require('../NativeModules/specs/NativeDevSettings').default
<del> .reload;
<del> if (typeof reload !== 'function') {
<del> throw new Error('Could not find the reload() implementation.');
<del> }
<del> // flowlint-next-line unclear-type: off
<del> (require: any).reload = reload;
<del> // flowlint-next-line unclear-type: off
<del> (require: any).hot = {
<del> Runtime: require('react-refresh/runtime'),
<del> };
<add> require('./setUpReactRefresh');
<ide> }
<ide> }
<ide><path>Libraries/Core/setUpReactRefresh.js
<add>/**
<add> * Copyright (c) Facebook, Inc. and its affiliates.
<add> *
<add> * This source code is licensed under the MIT license found in the
<add> * LICENSE file in the root directory of this source tree.
<add> *
<add> * @flow
<add> * @format
<add> */
<add>'use strict';
<add>
<add>if (__DEV__) {
<add> const NativeDevSettings = require('../NativeModules/specs/NativeDevSettings')
<add> .default;
<add>
<add> if (typeof NativeDevSettings.reload !== 'function') {
<add> throw new Error('Could not find the reload() implementation.');
<add> }
<add>
<add> if ((module: any).hot) {
<add> // This needs to run before the renderer initializes.
<add> const ReactRefreshRuntime = require('react-refresh/runtime');
<add> ReactRefreshRuntime.injectIntoGlobalHook(global);
<add>
<add> (require: any).Refresh = {
<add> // Full Refresh
<add> performFullRefresh() {
<add> NativeDevSettings.reload();
<add> },
<add> // React Refresh
<add> createSignatureFunctionForTransform:
<add> ReactRefreshRuntime.createSignatureFunctionForTransform,
<add> isLikelyComponentType: ReactRefreshRuntime.isLikelyComponentType,
<add> register: ReactRefreshRuntime.register,
<add> performReactRefresh: ReactRefreshRuntime.performReactRefresh,
<add> };
<add> }
<add>} | 2 |
Text | Text | fix minor typo | b2dc442a30303b9c27f3c8a44eed778223bd28ce | <ide><path>README.md
<ide> These inbound emails are routed asynchronously using Active Job to one or severa
<ide>
<ide> ## How does this compare to Action Mailer's inbound processing?
<ide>
<del>Rails has long had an anemic way of [receiving emails using Action Mailer](https://guides.rubyonrails.org/action_mailer_basics.html#receiving-emails), but it was poorly flushed out, lacked cohesion with the task of sending emails, and offered no help on integrating with popular inbound email processing platforms. Action Mailbox supersedes the receiving part of Action Mailer, which will be deprecated in due course.
<add>Rails has long had an anemic way of [receiving emails using Action Mailer](https://guides.rubyonrails.org/action_mailer_basics.html#receiving-emails), but it was poorly fleshed out, lacked cohesion with the task of sending emails, and offered no help on integrating with popular inbound email processing platforms. Action Mailbox supersedes the receiving part of Action Mailer, which will be deprecated in due course.
<ide>
<ide>
<ide> ## Installing | 1 |
Java | Java | protect stomp passcode from showing up in logs | 80812d30d4283c11ad74befa5879e4412e4e34be | <ide><path>spring-messaging/src/main/java/org/springframework/messaging/simp/stomp/StompHeaderAccessor.java
<ide> */
<ide> public class StompHeaderAccessor extends SimpMessageHeaderAccessor {
<ide>
<add> private static final AtomicLong messageIdCounter = new AtomicLong();
<add>
<ide> // STOMP header names
<ide>
<ide> public static final String STOMP_ID_HEADER = "id";
<ide> public class StompHeaderAccessor extends SimpMessageHeaderAccessor {
<ide>
<ide> // Other header names
<ide>
<del> public static final String COMMAND_HEADER = "stompCommand";
<add> private static final String COMMAND_HEADER = "stompCommand";
<ide>
<del>
<del> private static final AtomicLong messageIdCounter = new AtomicLong();
<add> private static final String CREDENTIALS_HEADER = "stompCredentials";
<ide>
<ide>
<ide> /**
<ide> else if (StompCommand.MESSAGE.equals(command)) {
<ide> super.setSubscriptionId(values.get(0));
<ide> }
<ide> }
<add> else if (StompCommand.CONNECT.equals(command)) {
<add> if (!StringUtils.isEmpty(getPasscode())) {
<add> setHeader(CREDENTIALS_HEADER, new StompPasscode(getPasscode()));
<add> setPasscode("PROTECTED");
<add> }
<add> }
<ide> }
<ide>
<ide> /**
<ide> public Map<String, List<String>> toNativeHeaderMap() {
<ide> return result;
<ide> }
<ide>
<add> public Map<String, List<String>> toStompHeaderMap() {
<add> if (StompCommand.CONNECT.equals(getCommand())) {
<add> StompPasscode credentials = (StompPasscode) getHeader(CREDENTIALS_HEADER);
<add> if (credentials != null) {
<add> Map<String, List<String>> headers = toNativeHeaderMap();
<add> headers.put(STOMP_PASSCODE_HEADER, Arrays.asList(credentials.passcode));
<add> return headers;
<add> }
<add> }
<add> return toNativeHeaderMap();
<add> }
<add>
<ide> public void setCommandIfNotSet(StompCommand command) {
<ide> if (getCommand() == null) {
<ide> setHeader(COMMAND_HEADER, command);
<ide> public void setVersion(String version) {
<ide> setNativeHeader(STOMP_VERSION_HEADER, version);
<ide> }
<ide>
<add>
<add> private static class StompPasscode {
<add>
<add> private final String passcode;
<add>
<add> public StompPasscode(String passcode) {
<add> this.passcode = passcode;
<add> }
<add>
<add> @Override
<add> public String toString() {
<add> return "[PROTECTED]";
<add> }
<add> }
<ide> }
<ide><path>spring-messaging/src/main/java/org/springframework/messaging/simp/stomp/StompMessageConverter.java
<ide> public byte[] fromMessage(Message<?> message) {
<ide> try {
<ide> out.write(stompHeaders.getCommand().toString().getBytes("UTF-8"));
<ide> out.write(LF);
<del> for (Entry<String, List<String>> entry : stompHeaders.toNativeHeaderMap().entrySet()) {
<add> for (Entry<String, List<String>> entry : stompHeaders.toStompHeaderMap().entrySet()) {
<ide> String key = entry.getKey();
<ide> key = replaceAllOutbound(key);
<ide> for (String value : entry.getValue()) {
<ide><path>spring-messaging/src/test/java/org/springframework/messaging/simp/stomp/StompHeaderAccessorTests.java
<ide> public void createWithMessageFrameNativeHeaders() {
<ide> assertEquals("s1", headers.getSubscriptionId());
<ide> }
<ide>
<add> @Test
<add> public void createWithConnectNativeHeaders() {
<add>
<add> MultiValueMap<String, String> extHeaders = new LinkedMultiValueMap<>();
<add> extHeaders.add(StompHeaderAccessor.STOMP_LOGIN_HEADER, "joe");
<add> extHeaders.add(StompHeaderAccessor.STOMP_PASSCODE_HEADER, "joe123");
<add>
<add> StompHeaderAccessor headers = StompHeaderAccessor.create(StompCommand.CONNECT, extHeaders);
<add>
<add> assertEquals(StompCommand.CONNECT, headers.getCommand());
<add> assertEquals(SimpMessageType.CONNECT, headers.getMessageType());
<add> assertNotNull(headers.getHeader("stompCredentials"));
<add> assertEquals("joe", headers.getLogin());
<add> assertEquals("PROTECTED", headers.getPasscode());
<add>
<add> Map<String, List<String>> output = headers.toStompHeaderMap();
<add> assertEquals("joe", output.get(StompHeaderAccessor.STOMP_LOGIN_HEADER).get(0));
<add> assertEquals("joe123", output.get(StompHeaderAccessor.STOMP_PASSCODE_HEADER).get(0));
<add> }
<add>
<ide> @Test
<ide> public void toNativeHeadersSubscribe() {
<ide>
<ide><path>spring-websocket/src/main/java/org/springframework/web/socket/TextMessage.java
<ide> protected int getPayloadSize() {
<ide>
<ide> @Override
<ide> protected String toStringPayload() {
<del> return (getPayloadSize() > 80) ? getPayload().substring(0, 80) + "..." : getPayload();
<add> return (getPayloadSize() > 10) ? getPayload().substring(0, 10) + ".." : getPayload();
<ide> }
<ide>
<ide> } | 4 |
Javascript | Javascript | add missing semicolons | eecd123ecc7dc1365a3342077558694711101fbf | <ide><path>test/core/selection-test.js
<ide> suite.addBatch({
<ide> },
<ide> "selection prototype can be extended": function(selection) {
<ide> d3.selection.prototype.foo = function(v) { return this.attr("foo", v); };
<del> selection.select("body").foo(42)
<add> selection.select("body").foo(42);
<ide> assert.equal(document.body.getAttribute("foo"), "42");
<ide> delete d3.selection.prototype.foo;
<ide> }
<ide><path>test/core/transition-test.js
<ide> suite.addBatch({
<ide> "transition prototype can be extended": function(transition) {
<ide> var vv = [];
<ide> d3.transition.prototype.foo = function(v) { vv.push(v); return this; };
<del> transition.select("body").foo(42)
<add> transition.select("body").foo(42);
<ide> assert.deepEqual(vv, [42]);
<ide> delete d3.transition.prototype.foo;
<ide> } | 2 |
Javascript | Javascript | handle ime input | 00933c7c637f374c7c20d4e23d59781d95a15810 | <ide><path>src/text-editor-component.js
<ide> class TextEditorComponent {
<ide> this.previousScrollHeight = 0
<ide> this.lastKeydown = null
<ide> this.lastKeydownBeforeKeypress = null
<del> this.openedAccentedCharacterMenu = false
<add> this.accentedCharacterMenuIsOpen = false
<ide> this.decorationsToRender = {
<ide> lineNumbers: new Map(),
<ide> lines: new Map(),
<ide> class TextEditorComponent {
<ide> textInput: this.didTextInput,
<ide> keydown: this.didKeydown,
<ide> keyup: this.didKeyup,
<del> keypress: this.didKeypress
<add> keypress: this.didKeypress,
<add> compositionstart: this.didCompositionStart,
<add> compositionupdate: this.didCompositionUpdate,
<add> compositionend: this.didCompositionEnd
<ide> },
<ide> tabIndex: -1,
<ide> style: {
<ide> class TextEditorComponent {
<ide> // to test.
<ide> if (event.data !== ' ') event.preventDefault()
<ide>
<add> // TODO: Deal with disabled input
<ide> // if (!this.isInputEnabled()) return
<ide>
<del> // Workaround of the accented character suggestion feature in macOS. This
<del> // will only occur when the user is not composing in IME mode. When the user
<del> // selects a modified character from the macOS menu, `textInput` will occur
<del> // twice, once for the initial character, and once for the modified
<del> // character. However, only a single keypress will have fired. If this is
<del> // the case, select backward to replace the original character.
<del> if (this.openedAccentedCharacterMenu) {
<del> this.getModel().selectLeft()
<del> this.openedAccentedCharacterMenu = false
<add> if (this.compositionCheckpoint) {
<add> this.getModel().revertToCheckpoint(this.compositionCheckpoint)
<add> this.compositionCheckpoint = null
<ide> }
<ide>
<ide> this.getModel().insertText(event.data, {groupUndo: true})
<ide> class TextEditorComponent {
<ide> didKeydown (event) {
<ide> if (this.lastKeydownBeforeKeypress != null) {
<ide> if (this.lastKeydownBeforeKeypress.keyCode === event.keyCode) {
<del> this.openedAccentedCharacterMenu = true
<add> this.accentedCharacterMenuIsOpen = true
<add> this.getModel().selectLeft()
<ide> }
<ide> this.lastKeydownBeforeKeypress = null
<ide> } else {
<ide> this.lastKeydown = event
<ide> }
<ide> }
<ide>
<del> didKeypress () {
<add> didKeypress (event) {
<ide> this.lastKeydownBeforeKeypress = this.lastKeydown
<ide> this.lastKeydown = null
<ide>
<ide> // This cancels the accented character behavior if we type a key normally
<ide> // with the menu open.
<del> this.openedAccentedCharacterMenu = false
<add> this.accentedCharacterMenuIsOpen = false
<ide> }
<ide>
<del> didKeyup () {
<add> didKeyup (event) {
<ide> this.lastKeydownBeforeKeypress = null
<ide> this.lastKeydown = null
<ide> }
<ide>
<add> // The IME composition events work like this:
<add> //
<add> // User types 's', chromium pops up the completion helper
<add> // 1. compositionstart fired
<add> // 2. compositionupdate fired; event.data == 's'
<add> // User hits arrow keys to move around in completion helper
<add> // 3. compositionupdate fired; event.data == 's' for each arrow key press
<add> // User hits escape to cancel
<add> // 4. compositionend fired
<add> // OR User chooses a completion
<add> // 4. compositionend fired
<add> // 5. textInput fired; event.data == the completion string
<add> didCompositionStart (event) {
<add> this.compositionCheckpoint = this.getModel().createCheckpoint()
<add> }
<add>
<add> didCompositionUpdate (event) {
<add> this.getModel().insertText(event.data, {select: true})
<add> }
<add>
<add> didCompositionEnd (event) {
<add> event.target.value = ''
<add> }
<add>
<ide> didRequestAutoscroll (autoscroll) {
<ide> this.pendingAutoscroll = autoscroll
<ide> this.scheduleUpdate() | 1 |
Javascript | Javascript | fix nits in update view api documentation | 678ea5b2331fe48bb1f36a8a09fde713f2b73eeb | <ide><path>Libraries/Components/View/View.js
<ide> *
<ide> * @providesModule View
<ide> * @flow
<del> * @jsdoc
<ide> */
<ide> 'use strict';
<ide>
<ide> const View = React.createClass({
<ide>
<ide> propTypes: {
<ide> /**
<del> * When true, indicates that the view is an accessibility element. By default,
<add> * When `true`, indicates that the view is an accessibility element. By default,
<ide> * all the touchable elements are accessible.
<ide> */
<ide> accessible: PropTypes.bool,
<ide>
<ide> /**
<ide> * Overrides the text that's read by the screen reader when the user interacts
<ide> * with the element. By default, the label is constructed by traversing all the
<del> * children and accumulating all the Text nodes separated by space.
<add> * children and accumulating all the `Text` nodes separated by space.
<ide> */
<ide> accessibilityLabel: PropTypes.string,
<ide>
<ide> const View = React.createClass({
<ide> onAccessibilityTap: PropTypes.func,
<ide>
<ide> /**
<del> * When `accessible` is true, the system will invoke this function when the
<add> * When `accessible` is `true`, the system will invoke this function when the
<ide> * user performs the magic tap gesture.
<ide> */
<ide> onMagicTap: PropTypes.func,
<ide>
<ide> /**
<del> * Used to locate this view in end-to-end tests. NB: disables the 'layout-only
<del> * view removal' optimization for this view!
<add> * Used to locate this view in end-to-end tests.
<add> *
<add> * > This disables the 'layout-only view removal' optimization for this view!
<ide> */
<ide> testID: PropTypes.string,
<ide>
<ide> const View = React.createClass({
<ide> onResponderMove: PropTypes.func,
<ide>
<ide> /**
<del> * Another responser is already active and will not release it to that `View` asking to be
<add> * Another responder is already active and will not release it to that `View` asking to be
<ide> * the responder.
<ide> *
<ide> * `View.props.onResponderReject: (event) => {}`, where `event` is a synthetic touch event as
<ide> const View = React.createClass({
<ide> /**
<ide> * Fired at the end of the touch.
<ide> *
<del> * `View.props.onResponderRelease`: (event) => {}`, where `event` is a synthetic touch event as
<add> * `View.props.onResponderRelease: (event) => {}`, where `event` is a synthetic touch event as
<ide> * described above.
<ide> */
<ide> onResponderRelease: PropTypes.func,
<ide> const View = React.createClass({
<ide> onLayout: PropTypes.func,
<ide>
<ide> /**
<del> * Controls whether the View can be the target of touch events.
<add> * Controls whether the `View` can be the target of touch events.
<ide> *
<ide> * - `'auto'`: The View can be the target of touch events.
<ide> * - `'none'`: The View is never the target of touch events.
<ide> const View = React.createClass({
<ide> * for scrolling content when there are many subviews, most of which are
<ide> * offscreen. For this property to be effective, it must be applied to a
<ide> * view that contains many subviews that extend outside its bound. The
<del> * subviews must also have overflow: hidden, as should the containing view
<add> * subviews must also have `overflow: hidden`, as should the containing view
<ide> * (or one of its superviews).
<ide> */
<ide> removeClippedSubviews: PropTypes.bool, | 1 |
Java | Java | use concurrent pattern with `mtagtoviewstate` | 99f2f5ffdd4b35e4d9b15ccc6deb55714fd686ef | <ide><path>ReactAndroid/src/main/java/com/facebook/react/fabric/mounting/MountingManager.java
<ide> public void run() {
<ide> });
<ide> }
<ide>
<del> /** Delete rootView and all children/ */
<add> /** Delete rootView and all children recursively. */
<ide> @UiThread
<ide> public void deleteRootView(int reactRootTag) {
<del> if (mTagToViewState.containsKey(reactRootTag)) {
<del> dropView(mTagToViewState.get(reactRootTag).mView, true);
<add> ViewState rootViewState = mTagToViewState.get(reactRootTag);
<add> if (rootViewState != null && rootViewState.mView != null) {
<add> dropView(rootViewState.mView, true);
<ide> }
<ide> }
<ide> | 1 |
Ruby | Ruby | add json support to brew cask outdated | f2fa2c5d3095438b4bb387d58c426ad91f2b1378 | <ide><path>Library/Homebrew/cask/cmd/outdated.rb
<ide> class Cmd
<ide> class Outdated < AbstractCommand
<ide> option "--greedy", :greedy, false
<ide> option "--quiet", :quiet, false
<add> option "--json", :json, false
<ide>
<ide> def initialize(*)
<ide> super
<ide> self.verbose = ($stdout.tty? || verbose?) && !quiet?
<del> end
<del>
<del> def run
<del> casks(alternative: -> { Caskroom.casks }).each do |cask|
<add> @outdated_casks = casks(alternative: -> { Caskroom.casks }).select do |cask|
<ide> odebug "Checking update info of Cask #{cask}"
<del> self.class.list_if_outdated(cask, greedy?, verbose?)
<add> cask.outdated?(greedy?)
<ide> end
<ide> end
<ide>
<del> def self.list_if_outdated(cask, greedy, verbose)
<del> return unless cask.outdated?(greedy)
<add> def run
<add> output = @outdated_casks.map { |cask| cask.outdated_info(greedy?, verbose?, json?) }
<ide>
<del> if verbose
<del> outdated_versions = cask.outdated_versions(greedy)
<del> outdated_info = "#{cask.token} (#{outdated_versions.join(", ")})"
<del> current_version = cask.version.to_s
<del> puts "#{outdated_info} != #{current_version}"
<del> else
<del> puts cask.token
<del> end
<add> puts json? ? JSON.generate(output) : output
<ide> end
<ide>
<ide> def self.help
<ide><path>Library/Homebrew/test/cask/cmd/outdated_spec.rb
<ide> EOS
<ide> end
<ide> end
<add>
<add> describe "--json" do
<add> it "lists outdated Casks in JSON format" do
<add> result = [
<add> {
<add> name: "local-caffeine",
<add> installed_versions: "1.2.2",
<add> current_version: "1.2.3"
<add> },
<add> {
<add> name: "local-transmission",
<add> installed_versions: "2.60",
<add> current_version: "2.61"
<add> }
<add> ].to_json
<add>
<add> expect {
<add> described_class.run("--json")
<add> }.to output(result + "\n").to_stdout
<add> end
<add> end
<add>
<add> describe "--json overrides --quiet" do
<add> it "ignores --quiet and lists outdated Casks in JSON format" do
<add> result = [
<add> {
<add> name: "local-caffeine",
<add> installed_versions: "1.2.2",
<add> current_version: "1.2.3"
<add> },
<add> {
<add> name: "local-transmission",
<add> installed_versions: "2.60",
<add> current_version: "2.61"
<add> }
<add> ].to_json
<add>
<add> expect {
<add> described_class.run("--json", "--quiet")
<add> }.to output(result + "\n").to_stdout
<add> end
<add> end
<add>
<add> describe "--json and --greedy" do
<add> it 'includes the Casks with "auto_updates true" or "version latest" in JSON format' do
<add> result = [
<add> {
<add> name: "auto-updates",
<add> installed_versions: "2.57",
<add> current_version: "2.61"
<add> },
<add> {
<add> name: "local-caffeine",
<add> installed_versions: "1.2.2",
<add> current_version: "1.2.3"
<add> },
<add> {
<add> name: "local-transmission",
<add> installed_versions: "2.60",
<add> current_version: "2.61"
<add> },
<add> {
<add> name: "version-latest-string",
<add> installed_versions: "latest",
<add> current_version: "latest"
<add> }
<add> ].to_json
<add>
<add> expect {
<add> described_class.run("--json", "--greedy")
<add> }.to output(result + "\n").to_stdout
<add> end
<add>
<add> it 'does not include the Casks with "auto_updates true" with no version change in JSON format' do
<add> cask = Cask::CaskLoader.load(cask_path("auto-updates"))
<add> InstallHelper.install_with_caskfile(cask)
<add>
<add> result = [
<add> {
<add> name: "local-caffeine",
<add> installed_versions: "1.2.2",
<add> current_version: "1.2.3"
<add> },
<add> {
<add> name: "local-transmission",
<add> installed_versions: "2.60",
<add> current_version: "2.61"
<add> },
<add> {
<add> name: "version-latest-string",
<add> installed_versions: "latest",
<add> current_version: "latest"
<add> }
<add> ].to_json
<add>
<add> expect {
<add> described_class.run("--json", "--greedy")
<add> }.to output(result + "\n").to_stdout
<add> end
<add> end
<ide> end | 2 |
Java | Java | fix crash in fabricuimanager.onmeasure | 0f0c9866cacf29aac408be88894e262e8991890e | <ide><path>ReactAndroid/src/main/java/com/facebook/react/fabric/FabricUIManager.java
<ide> public void updateRootLayoutSpecs(
<ide> if (ENABLE_FABRIC_LOGS) {
<ide> FLog.d(TAG, "Updating Root Layout Specs");
<ide> }
<add>
<add> ThemedReactContext reactContext = mReactContextForRootTag.get(rootTag);
<add> boolean isRTL = false;
<add> boolean doLeftAndRightSwapInRTL = false;
<add> if (reactContext != null) {
<add> isRTL = I18nUtil.getInstance().isRTL(reactContext);
<add> doLeftAndRightSwapInRTL = I18nUtil.getInstance().doLeftAndRightSwapInRTL(reactContext);
<add> } else {
<add> // TODO T65116569: analyze why this happens
<add> ReactSoftException.logSoftException(
<add> TAG,
<add> new IllegalStateException(
<add> "updateRootLayoutSpecs called before ReactContext set for tag: " + rootTag));
<add> }
<add>
<ide> mBinding.setConstraints(
<ide> rootTag,
<ide> getMinSize(widthMeasureSpec),
<ide> getMaxSize(widthMeasureSpec),
<ide> getMinSize(heightMeasureSpec),
<ide> getMaxSize(heightMeasureSpec),
<del> I18nUtil.getInstance().isRTL(mReactContextForRootTag.get(rootTag)),
<del> I18nUtil.getInstance().doLeftAndRightSwapInRTL(mReactContextForRootTag.get(rootTag)));
<add> isRTL,
<add> doLeftAndRightSwapInRTL);
<ide> }
<ide>
<ide> public void receiveEvent(int reactTag, String eventName, @Nullable WritableMap params) { | 1 |
Java | Java | remove functionlanguageadaptor from rxjava-core | 9fd3f3ed6ab9f70b0d61fc930b52c6b16d28faad | <ide><path>rxjava-core/src/main/java/rx/Observable.java
<ide> import rx.util.functions.Func4;
<ide> import rx.util.functions.FuncN;
<ide> import rx.util.functions.Function;
<del>import rx.util.functions.FunctionLanguageAdaptor;
<ide> import rx.util.functions.Functions;
<ide>
<ide> /**
<ide><path>rxjava-core/src/main/java/rx/util/functions/FunctionLanguageAdaptor.java
<del>/**
<del> * Copyright 2013 Netflix, Inc.
<del> *
<del> * Licensed under the Apache License, Version 2.0 (the "License");
<del> * you may not use this file except in compliance with the License.
<del> * You may obtain a copy of the License at
<del> *
<del> * http://www.apache.org/licenses/LICENSE-2.0
<del> *
<del> * Unless required by applicable law or agreed to in writing, software
<del> * distributed under the License is distributed on an "AS IS" BASIS,
<del> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<del> * See the License for the specific language governing permissions and
<del> * limitations under the License.
<del> */
<del>package rx.util.functions;
<del>
<del>public interface FunctionLanguageAdaptor {
<del>
<del> /**
<del> * Invoke the function and return the results.
<del> *
<del> * @param function
<del> * @param args
<del> * @return Object results from function execution
<del> */
<del> Object call(Object function, Object[] args);
<del>
<del> /**
<del> * The Class of the Function that this adaptor serves.
<del> * <p>
<del> * Example: groovy.lang.Closure
<del> * <p>
<del> * This should not return classes of java.* packages.
<del> *
<del> * @return Class[] of classes that this adaptor should be invoked for.
<del> */
<del> public Class<?>[] getFunctionClass();
<del>}
<ide><path>rxjava-core/src/main/java/rx/util/functions/Functions.java
<ide> import java.util.Collection;
<ide> import java.util.concurrent.ConcurrentHashMap;
<ide>
<del>/**
<del> * Allows execution of functions from multiple different languages.
<del> * <p>
<del> * Language support is provided via implementations of {@link FunctionLanguageAdaptor}.
<del> * <p>
<del> * This class will dynamically look for known language adaptors on the classpath at startup or new ones can be registered using {@link #registerLanguageAdaptor(Class[], FunctionLanguageAdaptor)}.
<del> */
<ide> public class Functions {
<ide>
<del> private final static ConcurrentHashMap<Class<?>, FunctionLanguageAdaptor> languageAdaptors = new ConcurrentHashMap<Class<?>, FunctionLanguageAdaptor>();
<del>
<del> static {
<del> /* optimistically look for supported languages if they are in the classpath */
<del> loadLanguageAdaptor("Groovy");
<del> loadLanguageAdaptor("JRuby");
<del> loadLanguageAdaptor("Clojure");
<del> loadLanguageAdaptor("Scala");
<del> // as new languages arise we can add them here but this does not prevent someone from using 'registerLanguageAdaptor' directly
<del> }
<del>
<del> private static boolean loadLanguageAdaptor(String name) {
<del> String className = "rx.lang." + name.toLowerCase() + "." + name + "Adaptor";
<del> try {
<del> Class<?> c = Class.forName(className);
<del> FunctionLanguageAdaptor a = (FunctionLanguageAdaptor) c.newInstance();
<del> registerLanguageAdaptor(a.getFunctionClass(), a);
<del> /*
<del> * Using System.err/System.out as this is the only place in the library where we do logging and it's only at startup.
<del> * I don't want to include SL4J/Log4j just for this and no one uses Java Logging.
<del> */
<del> System.out.println("RxJava => Successfully loaded function language adaptor: " + name + " with path: " + className);
<del> } catch (ClassNotFoundException e) {
<del> System.err.println("RxJava => Could not find function language adaptor: " + name + " with path: " + className);
<del> return false;
<del> } catch (Throwable e) {
<del> System.err.println("RxJava => Failed trying to initialize function language adaptor: " + className);
<del> e.printStackTrace();
<del> return false;
<del> }
<del> return true;
<del> }
<del>
<del> public static void registerLanguageAdaptor(Class<?>[] functionClasses, FunctionLanguageAdaptor adaptor) {
<del> for (Class<?> functionClass : functionClasses) {
<del> if (functionClass.getPackage().getName().startsWith("java.")) {
<del> throw new IllegalArgumentException("FunctionLanguageAdaptor implementations can not specify java.lang.* classes.");
<del> }
<del> languageAdaptors.put(functionClass, adaptor);
<del> }
<del> }
<del>
<del> public static void removeLanguageAdaptor(Class<?> functionClass) {
<del> languageAdaptors.remove(functionClass);
<del> }
<del>
<del> public static Collection<FunctionLanguageAdaptor> getRegisteredLanguageAdaptors() {
<del> return languageAdaptors.values();
<del> }
<del>
<ide> /**
<ide> * Utility method for determining the type of closure/function and executing it.
<ide> *
<ide> public static FuncN from(final Object function) {
<ide> /* check for typed Rx Function implementation first */
<ide> if (function instanceof Function) {
<ide> return fromFunction((Function) function);
<del> } else {
<del> /* not an Rx Function so try language adaptors */
<del>
<del> // check for language adaptor
<del> for (final Class c : languageAdaptors.keySet()) {
<del> if (c.isInstance(function)) {
<del> final FunctionLanguageAdaptor la = languageAdaptors.get(c);
<del> // found the language adaptor so wrap in FuncN and return
<del> return new FuncN() {
<del>
<del> @Override
<del> public Object call(Object... args) {
<del> return la.call(function, args);
<del> }
<del>
<del> };
<del> }
<del> }
<del> // no language adaptor found
<ide> }
<del>
<ide> // no support found
<ide> throw new RuntimeException("Unsupported closure type: " + function.getClass().getSimpleName());
<ide> }
<ide>
<del> //
<del> // @SuppressWarnings("unchecked")
<del> // private static <R> R executionRxFunction(Function function, Object... args) {
<del> // // check Func* classes
<del> // if (function instanceof Func0) {
<del> // Func0<R> f = (Func0<R>) function;
<del> // if (args.length != 0) {
<del> // throw new RuntimeException("The closure was Func0 and expected no arguments, but we received: " + args.length);
<del> // }
<del> // return (R) f.call();
<del> // } else if (function instanceof Func1) {
<del> // Func1<Object, R> f = (Func1<Object, R>) function;
<del> // if (args.length != 1) {
<del> // throw new RuntimeException("The closure was Func1 and expected 1 argument, but we received: " + args.length);
<del> // }
<del> // return f.call(args[0]);
<del> // } else if (function instanceof Func2) {
<del> // Func2<Object, Object, R> f = (Func2<Object, Object, R>) function;
<del> // if (args.length != 2) {
<del> // throw new RuntimeException("The closure was Func2 and expected 2 arguments, but we received: " + args.length);
<del> // }
<del> // return f.call(args[0], args[1]);
<del> // } else if (function instanceof Func3) {
<del> // Func3<Object, Object, Object, R> f = (Func3<Object, Object, Object, R>) function;
<del> // if (args.length != 3) {
<del> // throw new RuntimeException("The closure was Func3 and expected 3 arguments, but we received: " + args.length);
<del> // }
<del> // return (R) f.call(args[0], args[1], args[2]);
<del> // } else if (function instanceof Func4) {
<del> // Func4<Object, Object, Object, Object, R> f = (Func4<Object, Object, Object, Object, R>) function;
<del> // if (args.length != 1) {
<del> // throw new RuntimeException("The closure was Func4 and expected 4 arguments, but we received: " + args.length);
<del> // }
<del> // return f.call(args[0], args[1], args[2], args[3]);
<del> // } else if (function instanceof Func5) {
<del> // Func5<Object, Object, Object, Object, Object, R> f = (Func5<Object, Object, Object, Object, Object, R>) function;
<del> // if (args.length != 1) {
<del> // throw new RuntimeException("The closure was Func5 and expected 5 arguments, but we received: " + args.length);
<del> // }
<del> // return f.call(args[0], args[1], args[2], args[3], args[4]);
<del> // } else if (function instanceof Func6) {
<del> // Func6<Object, Object, Object, Object, Object, Object, R> f = (Func6<Object, Object, Object, Object, Object, Object, R>) function;
<del> // if (args.length != 1) {
<del> // throw new RuntimeException("The closure was Func6 and expected 6 arguments, but we received: " + args.length);
<del> // }
<del> // return f.call(args[0], args[1], args[2], args[3], args[4], args[5]);
<del> // } else if (function instanceof Func7) {
<del> // Func7<Object, Object, Object, Object, Object, Object, Object, R> f = (Func7<Object, Object, Object, Object, Object, Object, Object, R>) function;
<del> // if (args.length != 1) {
<del> // throw new RuntimeException("The closure was Func7 and expected 7 arguments, but we received: " + args.length);
<del> // }
<del> // return f.call(args[0], args[1], args[2], args[3], args[4], args[5], args[6]);
<del> // } else if (function instanceof Func8) {
<del> // Func8<Object, Object, Object, Object, Object, Object, Object, Object, R> f = (Func8<Object, Object, Object, Object, Object, Object, Object, Object, R>) function;
<del> // if (args.length != 1) {
<del> // throw new RuntimeException("The closure was Func8 and expected 8 arguments, but we received: " + args.length);
<del> // }
<del> // return f.call(args[0], args[1], args[2], args[3], args[4], args[5], args[6], args[7]);
<del> // } else if (function instanceof Func9) {
<del> // Func9<Object, Object, Object, Object, Object, Object, Object, Object, Object, R> f = (Func9<Object, Object, Object, Object, Object, Object, Object, Object, Object, R>) function;
<del> // if (args.length != 1) {
<del> // throw new RuntimeException("The closure was Func9 and expected 9 arguments, but we received: " + args.length);
<del> // }
<del> // return f.call(args[0], args[1], args[2], args[3], args[4], args[5], args[6], args[7], args[8]);
<del> // } else if (function instanceof FuncN) {
<del> // FuncN<R> f = (FuncN<R>) function;
<del> // return f.call(args);
<del> // } else if (function instanceof Action0) {
<del> // Action0 f = (Action0) function;
<del> // if (args.length != 1) {
<del> // throw new RuntimeException("The closure was Action0 and expected 0 arguments, but we received: " + args.length);
<del> // }
<del> // f.call();
<del> // return null;
<del> // } else if (function instanceof Action1) {
<del> // Action1<Object> f = (Action1<Object>) function;
<del> // if (args.length != 1) {
<del> // throw new RuntimeException("The closure was Action1 and expected 1 argument, but we received: " + args.length);
<del> // }
<del> // f.call(args[0]);
<del> // return null;
<del> // } else if (function instanceof Action2) {
<del> // Action2<Object, Object> f = (Action2<Object, Object>) function;
<del> // if (args.length != 1) {
<del> // throw new RuntimeException("The closure was Action2 and expected 2 argument, but we received: " + args.length);
<del> // }
<del> // f.call(args[0], args[1]);
<del> // return null;
<del> // } else if (function instanceof Action3) {
<del> // Action3<Object, Object, Object> f = (Action3<Object, Object, Object>) function;
<del> // if (args.length != 1) {
<del> // throw new RuntimeException("The closure was Action1 and expected 1 argument, but we received: " + args.length);
<del> // }
<del> // f.call(args[0], args[1], args[2]);
<del> // return null;
<del> // }
<del> //
<del> // throw new RuntimeException("Unknown implementation of Function: " + function.getClass().getSimpleName());
<del> // }
<del>
<ide> @SuppressWarnings({ "unchecked", "rawtypes" })
<ide> private static FuncN fromFunction(Function function) {
<ide> // check Func* classes | 3 |
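The `Functions` class being trimmed above discovers language adaptors optimistically via `Class.forName`, treating a missing class as "language not on the classpath" rather than as an error. A minimal, self-contained sketch of that reflection pattern (class and method names here are illustrative, not RxJava's API; the real code additionally rejects `java.*` function classes):

```java
import java.util.concurrent.ConcurrentHashMap;

// Sketch of optimistic, reflection-based adaptor loading: a known class
// is instantiated and cached; an unknown class is skipped quietly.
public class AdaptorLoading {
    static final ConcurrentHashMap<Class<?>, Object> ADAPTORS = new ConcurrentHashMap<>();

    static boolean loadAdaptor(String className) {
        try {
            Class<?> c = Class.forName(className);
            ADAPTORS.put(c, c.getDeclaredConstructor().newInstance());
            return true;
        } catch (ClassNotFoundException e) {
            return false; // absence is expected, not fatal
        } catch (ReflectiveOperationException e) {
            return false; // class exists but could not be instantiated
        }
    }

    public static void main(String[] args) {
        System.out.println(loadAdaptor("java.lang.StringBuilder"));      // true
        System.out.println(loadAdaptor("rx.lang.nosuch.NoSuchAdaptor")); // false
    }
}
```

The quiet `ClassNotFoundException` branch is the point of the design: it lets one jar support several optional languages without hard dependencies on any of them.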
Python | Python | fix typo in 'self' | 20f5774acba6195fbf0611f37770a3490bb27f24 | <ide><path>numpy/distutils/fcompiler/__init__.py
<ide> def set_exe(exe_key, f77=None, f90=None):
<ide> set_exe('archiver')
<ide> set_exe('ranlib')
<ide>
<del> def update_executables(elf):
<add> def update_executables(self):
<ide> """Called at the beginning of customisation. Subclasses should
<ide> override this if they need to set up the executables dictionary.
<ide> | 1 |
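The one-character fix above matters because Python binds the receiver to whatever the first parameter is called: `def update_executables(elf)` still accepts the call, but any reference to `self` in the body raises `NameError`. A minimal illustration (the class names and the executables dict are made up for the example):

```python
class Broken:
    def update_executables(elf):   # the instance lands in `elf`
        return self.executables    # NameError: `self` is undefined here


class Fixed:
    executables = {"compiler_f77": ["gfortran"]}  # illustrative value

    def update_executables(self):
        return self.executables


print(Fixed().update_executables())
try:
    Broken().update_executables()
except NameError:
    print("NameError raised")
```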
Ruby | Ruby | reduce surface area of ConnectionSpecification | b8fc0150d66866ce7e86c2608dc779fdd7688a61
<ide> module ConnectionAdapters
<ide>
<ide> autoload :Column
<ide> autoload :ConnectionSpecification
<add> autoload :Resolver
<ide>
<ide> autoload_at "active_record/connection_adapters/abstract/schema_definitions" do
<ide> autoload :IndexDefinition
<ide><path>activerecord/lib/active_record/connection_adapters/abstract/connection_pool.rb
<ide> def initialize(spec)
<ide>
<ide> @spec = spec
<ide>
<del> @checkout_timeout = (spec.underlying_configuration_hash[:checkout_timeout] && spec.underlying_configuration_hash[:checkout_timeout].to_f) || 5
<del> if @idle_timeout = spec.underlying_configuration_hash.fetch(:idle_timeout, 300)
<add> @checkout_timeout = (spec.db_config.configuration_hash[:checkout_timeout] && spec.db_config.configuration_hash[:checkout_timeout].to_f) || 5
<add> if @idle_timeout = spec.db_config.configuration_hash.fetch(:idle_timeout, 300)
<ide> @idle_timeout = @idle_timeout.to_f
<ide> @idle_timeout = nil if @idle_timeout <= 0
<ide> end
<ide>
<ide> # default max pool size to 5
<del> @size = (spec.underlying_configuration_hash[:pool] && spec.underlying_configuration_hash[:pool].to_i) || 5
<add> @size = (spec.db_config.configuration_hash[:pool] && spec.db_config.configuration_hash[:pool].to_i) || 5
<ide>
<ide> # This variable tracks the cache of threads mapped to reserved connections, with the
<ide> # sole purpose of speeding up the +connection+ method. It is not the authoritative
<ide> def initialize(spec)
<ide>
<ide> # +reaping_frequency+ is configurable mostly for historical reasons, but it could
<ide> # also be useful if someone wants a very low +idle_timeout+.
<del> reaping_frequency = spec.underlying_configuration_hash.fetch(:reaping_frequency, 60)
<add> reaping_frequency = spec.db_config.configuration_hash.fetch(:reaping_frequency, 60)
<ide> @reaper = Reaper.new(self, reaping_frequency && reaping_frequency.to_f)
<ide> @reaper.run
<ide> end
<ide> def connections
<ide> # Raises:
<ide> # - ActiveRecord::ExclusiveConnectionTimeoutError if unable to gain ownership of all
<ide> # connections in the pool within a timeout interval (default duration is
<del> # <tt>spec.underlying_configuration_hash[:checkout_timeout] * 2</tt> seconds).
<add> # <tt>spec.db_config.configuration_hash[:checkout_timeout] * 2</tt> seconds).
<ide> def disconnect(raise_on_acquisition_timeout = true)
<ide> with_exclusively_acquired_all_connections(raise_on_acquisition_timeout) do
<ide> synchronize do
<ide> def disconnect(raise_on_acquisition_timeout = true)
<ide> #
<ide> # The pool first tries to gain ownership of all connections. If unable to
<ide> # do so within a timeout interval (default duration is
<del> # <tt>spec.underlying_configuration_hash[:checkout_timeout] * 2</tt> seconds), then the pool is forcefully
<add> # <tt>spec.db_config.configuration_hash[:checkout_timeout] * 2</tt> seconds), then the pool is forcefully
<ide> # disconnected without any regard for other connection owning threads.
<ide> def disconnect!
<ide> disconnect(false)
<ide> def discarded? # :nodoc:
<ide> # Raises:
<ide> # - ActiveRecord::ExclusiveConnectionTimeoutError if unable to gain ownership of all
<ide> # connections in the pool within a timeout interval (default duration is
<del> # <tt>spec.underlying_configuration_hash[:checkout_timeout] * 2</tt> seconds).
<add> # <tt>spec.db_config.configuration_hash[:checkout_timeout] * 2</tt> seconds).
<ide> def clear_reloadable_connections(raise_on_acquisition_timeout = true)
<ide> with_exclusively_acquired_all_connections(raise_on_acquisition_timeout) do
<ide> synchronize do
<ide> def clear_reloadable_connections(raise_on_acquisition_timeout = true)
<ide> #
<ide> # The pool first tries to gain ownership of all connections. If unable to
<ide> # do so within a timeout interval (default duration is
<del> # <tt>spec.underlying_configuration_hash[:checkout_timeout] * 2</tt> seconds), then the pool forcefully
<add> # <tt>spec.db_config.configuration_hash[:checkout_timeout] * 2</tt> seconds), then the pool forcefully
<ide> # clears the cache and reloads connections without any regard for other
<ide> # connection owning threads.
<ide> def clear_reloadable_connections!
<ide> def remove_connection_from_thread_cache(conn, owner_thread = conn.owner)
<ide> alias_method :release, :remove_connection_from_thread_cache
<ide>
<ide> def new_connection
<del> Base.send(spec.adapter_method, spec.underlying_configuration_hash).tap do |conn|
<add> Base.send(spec.db_config.adapter_method, spec.db_config.configuration_hash).tap do |conn|
<ide> conn.check_version
<ide> end
<ide> end
<ide> def connection_pool_list
<ide> alias :connection_pools :connection_pool_list
<ide>
<ide> def establish_connection(config)
<del> resolver = ConnectionSpecification::Resolver.new(Base.configurations)
<add> resolver = Resolver.new(Base.configurations)
<ide> spec = resolver.spec(config)
<ide>
<ide> remove_connection(spec.name)
<ide> def establish_connection(config)
<ide> }
<ide> if spec
<ide> payload[:spec_name] = spec.name
<del> payload[:config] = spec.underlying_configuration_hash
<add> payload[:config] = spec.db_config.configuration_hash
<ide> end
<ide>
<ide> message_bus.instrument("!connection.active_record", payload) do
<ide> def remove_connection(spec_name)
<ide> if pool = owner_to_pool.delete(spec_name)
<ide> pool.automatic_reconnect = false
<ide> pool.disconnect!
<del> pool.spec.underlying_configuration_hash
<add> pool.spec.db_config.configuration_hash
<ide> end
<ide> end
<ide>
<ide> def retrieve_connection_pool(spec_name)
<ide> # A connection was established in an ancestor process that must have
<ide> # subsequently forked. We can't reuse the connection, but we can copy
<ide> # the specification and establish a new connection with it.
<del> establish_connection(ancestor_pool.spec.to_hash).tap do |pool|
<add> establish_connection(ancestor_pool.spec.db_config.configuration_hash).tap do |pool|
<ide> pool.schema_cache = ancestor_pool.schema_cache if ancestor_pool.schema_cache
<ide> end
<ide> else
<ide><path>activerecord/lib/active_record/connection_adapters/connection_specification.rb
<ide>
<ide> module ActiveRecord
<ide> module ConnectionAdapters
<del> class ConnectionSpecification #:nodoc:
<del> attr_reader :name, :adapter_method, :db_config
<add> class ConnectionSpecification # :nodoc:
<add> attr_reader :name, :db_config
<ide>
<del> def initialize(name, db_config, adapter_method)
<del> @name, @db_config, @adapter_method = name, db_config, adapter_method
<del> end
<del>
<del> def underlying_configuration_hash
<del> @db_config.configuration_hash
<del> end
<del>
<del> def initialize_dup(original)
<del> @db_config = original.db_config.dup
<del> end
<del>
<del> def to_hash
<del> underlying_configuration_hash.dup.merge(name: @name)
<del> end
<del>
<del> # Expands a connection string into a hash.
<del> class ConnectionUrlResolver # :nodoc:
<del> # == Example
<del> #
<del> # url = "postgresql://foo:bar@localhost:9000/foo_test?pool=5&timeout=3000"
<del> # ConnectionUrlResolver.new(url).to_hash
<del> # # => {
<del> # adapter: "postgresql",
<del> # host: "localhost",
<del> # port: 9000,
<del> # database: "foo_test",
<del> # username: "foo",
<del> # password: "bar",
<del> # pool: "5",
<del> # timeout: "3000"
<del> # }
<del> def initialize(url)
<del> raise "Database URL cannot be empty" if url.blank?
<del> @uri = uri_parser.parse(url)
<del> @adapter = @uri.scheme && @uri.scheme.tr("-", "_")
<del> @adapter = "postgresql" if @adapter == "postgres"
<del>
<del> if @uri.opaque
<del> @uri.opaque, @query = @uri.opaque.split("?", 2)
<del> else
<del> @query = @uri.query
<del> end
<del> end
<del>
<del> # Converts the given URL to a full connection hash.
<del> def to_hash
<del> config = raw_config.compact_blank
<del> config.map { |key, value| config[key] = uri_parser.unescape(value) if value.is_a? String }
<del> config
<del> end
<del>
<del> private
<del> attr_reader :uri
<del>
<del> def uri_parser
<del> @uri_parser ||= URI::Parser.new
<del> end
<del>
<del> # Converts the query parameters of the URI into a hash.
<del> #
<del> # "localhost?pool=5&reaping_frequency=2"
<del> # # => { pool: "5", reaping_frequency: "2" }
<del> #
<del> # returns empty hash if no query present.
<del> #
<del> # "localhost"
<del> # # => {}
<del> def query_hash
<del> Hash[(@query || "").split("&").map { |pair| pair.split("=") }].symbolize_keys
<del> end
<del>
<del> def raw_config
<del> if uri.opaque
<del> query_hash.merge(
<del> adapter: @adapter,
<del> database: uri.opaque
<del> )
<del> else
<del> query_hash.merge(
<del> adapter: @adapter,
<del> username: uri.user,
<del> password: uri.password,
<del> port: uri.port,
<del> database: database_from_path,
<del> host: uri.hostname
<del> )
<del> end
<del> end
<del>
<del> # Returns name of the database.
<del> def database_from_path
<del> if @adapter == "sqlite3"
<del> # 'sqlite3:/foo' is absolute, because that makes sense. The
<del> # corresponding relative version, 'sqlite3:foo', is handled
<del> # elsewhere, as an "opaque".
<del>
<del> uri.path
<del> else
<del> # Only SQLite uses a filename as the "database" name; for
<del> # anything else, a leading slash would be silly.
<del>
<del> uri.path.sub(%r{^/}, "")
<del> end
<del> end
<del> end
<del>
<del> ##
<del> # Builds a ConnectionSpecification from user input.
<del> class Resolver # :nodoc:
<del> attr_reader :configurations
<del>
<del> # Accepts a list of db config objects.
<del> def initialize(configurations)
<del> @configurations = configurations
<del> end
<del>
<del> # Returns an instance of ConnectionSpecification for a given adapter.
<del> # Accepts a hash one layer deep that contains all connection information.
<del> #
<del> # == Example
<del> #
<del> # config = { "production" => { "host" => "localhost", "database" => "foo", "adapter" => "sqlite3" } }
<del> # spec = Resolver.new(config).spec(:production)
<del> # spec.adapter_method
<del> # # => "sqlite3_connection"
<del> # spec.underlying_configuration_hash
<del> # # => { host: "localhost", database: "foo", adapter: "sqlite3" }
<del> #
<del> def spec(config)
<del> pool_name = config if config.is_a?(Symbol)
<del>
<del> db_config = resolve(config, pool_name)
<del> spec = db_config.configuration_hash
<del>
<del> raise(AdapterNotSpecified, "database configuration does not specify adapter") unless spec.key?(:adapter)
<del>
<del> # Require the adapter itself and give useful feedback about
<del> # 1. Missing adapter gems and
<del> # 2. Adapter gems' missing dependencies.
<del> path_to_adapter = "active_record/connection_adapters/#{spec[:adapter]}_adapter"
<del> begin
<del> require path_to_adapter
<del> rescue LoadError => e
<del> # We couldn't require the adapter itself. Raise an exception that
<del> # points out config typos and missing gems.
<del> if e.path == path_to_adapter
<del> # We can assume that a non-builtin adapter was specified, so it's
<del> # either misspelled or missing from Gemfile.
<del> raise LoadError, "Could not load the '#{spec[:adapter]}' Active Record adapter. Ensure that the adapter is spelled correctly in config/database.yml and that you've added the necessary adapter gem to your Gemfile.", e.backtrace
<del>
<del> # Bubbled up from the adapter require. Prefix the exception message
<del> # with some guidance about how to address it and reraise.
<del> else
<del> raise LoadError, "Error loading the '#{spec[:adapter]}' Active Record adapter. Missing a gem it depends on? #{e.message}", e.backtrace
<del> end
<del> end
<del>
<del> adapter_method = "#{spec[:adapter]}_connection"
<del>
<del> unless ActiveRecord::Base.respond_to?(adapter_method)
<del> raise AdapterNotFound, "database configuration specifies nonexistent #{spec[:adapter]} adapter"
<del> end
<del>
<del> ConnectionSpecification.new(spec.delete(:name) || "primary", db_config, adapter_method)
<del> end
<del>
<del> # Returns fully resolved connection, accepts hash, string or symbol.
<del> # Always returns a DatabaseConfiguration::DatabaseConfig
<del> #
<del> # == Examples
<del> #
<del> # Symbol representing current environment.
<del> #
<del> # Resolver.new("production" => {}).resolve(:production)
<del> # # => DatabaseConfigurations::HashConfig.new(env_name: "production", config: {})
<del> #
<del> # One layer deep hash of connection values.
<del> #
<del> # Resolver.new({}).resolve("adapter" => "sqlite3")
<del> # # => DatabaseConfigurations::HashConfig.new(config: {"adapter" => "sqlite3"})
<del> #
<del> # Connection URL.
<del> #
<del> # Resolver.new({}).resolve("postgresql://localhost/foo")
<del> # # => DatabaseConfigurations::UrlConfig.new(config: {"adapter" => "postgresql", "host" => "localhost", "database" => "foo"})
<del> #
<del> def resolve(config_or_env, pool_name = nil)
<del> env = ActiveRecord::ConnectionHandling::DEFAULT_ENV.call.to_s
<del>
<del> case config_or_env
<del> when Symbol
<del> resolve_symbol_connection(config_or_env, pool_name)
<del> when String
<del> DatabaseConfigurations::UrlConfig.new(env, "primary", config_or_env)
<del> when Hash
<del> DatabaseConfigurations::HashConfig.new(env, "primary", config_or_env)
<del> when DatabaseConfigurations::DatabaseConfig
<del> config_or_env
<del> else
<del> raise TypeError, "Invalid type for configuration. Expected Symbol, String, or Hash. Got #{config_or_env.inspect}"
<del> end
<del> end
<del>
<del> private
<del> # Takes the environment such as +:production+ or +:development+ and a
<del> # pool name the corresponds to the name given by the connection pool
<del> # to the connection. That pool name is merged into the hash with the
<del> # name key.
<del> #
<del> # This requires that the @configurations was initialized with a key that
<del> # matches.
<del> #
<del> # configurations = #<ActiveRecord::DatabaseConfigurations:0x00007fd9fdace3e0
<del> # @configurations=[
<del> # #<ActiveRecord::DatabaseConfigurations::HashConfig:0x00007fd9fdace250
<del> # @env_name="production", @spec_name="primary", @config={database: "my_db"}>
<del> # ]>
<del> #
<del> # Resolver.new(configurations).resolve_symbol_connection(:production, "primary")
<del> # # => DatabaseConfigurations::HashConfig(config: database: "my_db", env_name: "production", spec_name: "primary")
<del> def resolve_symbol_connection(env_name, pool_name)
<del> db_config = configurations.find_db_config(env_name)
<del>
<del> if db_config
<del> config = db_config.configuration_hash.merge(name: pool_name.to_s)
<del> DatabaseConfigurations::HashConfig.new(db_config.env_name, db_config.spec_name, config)
<del> else
<del> raise AdapterNotSpecified, <<~MSG
<del> The `#{env_name}` database is not configured for the `#{ActiveRecord::ConnectionHandling::DEFAULT_ENV.call}` environment.
<del>
<del> Available databases configurations are:
<del>
<del> #{build_configuration_sentence}
<del> MSG
<del> end
<del> end
<del>
<del> def build_configuration_sentence # :nodoc:
<del> configs = configurations.configs_for(include_replicas: true)
<del>
<del> configs.group_by(&:env_name).map do |env, config|
<del> namespaces = config.map(&:spec_name)
<del> if namespaces.size > 1
<del> "#{env}: #{namespaces.join(", ")}"
<del> else
<del> env
<del> end
<del> end.join("\n")
<del> end
<add> def initialize(name, db_config)
<add> @name, @db_config = name, db_config
<ide> end
<ide> end
<ide> end
<ide><path>activerecord/lib/active_record/connection_adapters/resolver.rb
<add># frozen_string_literal: true
<add>
<add>module ActiveRecord
<add> module ConnectionAdapters
<add> # Builds a ConnectionSpecification from user input.
<add> class Resolver # :nodoc:
<add> attr_reader :configurations
<add>
<add> # Accepts a list of db config objects.
<add> def initialize(configurations)
<add> @configurations = configurations
<add> end
<add>
<add> # Returns an instance of ConnectionSpecification for a given adapter.
<add> # Accepts a hash one layer deep that contains all connection information.
<add> #
<add> # == Example
<add> #
<add> # config = { "production" => { "host" => "localhost", "database" => "foo", "adapter" => "sqlite3" } }
<add> # spec = Resolver.new(config).spec(:production)
<add> # spec.db_config.configuration_hash
<add> # # => { host: "localhost", database: "foo", adapter: "sqlite3" }
<add> #
<add> def spec(config)
<add> pool_name = config if config.is_a?(Symbol)
<add>
<add> db_config = resolve(config, pool_name)
<add> spec = db_config.configuration_hash
<add>
<add> raise(AdapterNotSpecified, "database configuration does not specify adapter") unless spec.key?(:adapter)
<add>
<add> # Require the adapter itself and give useful feedback about
<add> # 1. Missing adapter gems and
<add> # 2. Adapter gems' missing dependencies.
<add> path_to_adapter = "active_record/connection_adapters/#{spec[:adapter]}_adapter"
<add> begin
<add> require path_to_adapter
<add> rescue LoadError => e
<add> # We couldn't require the adapter itself. Raise an exception that
<add> # points out config typos and missing gems.
<add> if e.path == path_to_adapter
<add> # We can assume that a non-builtin adapter was specified, so it's
<add> # either misspelled or missing from Gemfile.
<add> raise LoadError, "Could not load the '#{spec[:adapter]}' Active Record adapter. Ensure that the adapter is spelled correctly in config/database.yml and that you've added the necessary adapter gem to your Gemfile.", e.backtrace
<add>
<add> # Bubbled up from the adapter require. Prefix the exception message
<add> # with some guidance about how to address it and reraise.
<add> else
<add> raise LoadError, "Error loading the '#{spec[:adapter]}' Active Record adapter. Missing a gem it depends on? #{e.message}", e.backtrace
<add> end
<add> end
<add>
<add> unless ActiveRecord::Base.respond_to?(db_config.adapter_method)
<add> raise AdapterNotFound, "database configuration specifies nonexistent #{spec[:adapter]} adapter"
<add> end
<add>
<add> ConnectionSpecification.new(spec.delete(:name) || "primary", db_config)
<add> end
<add>
<add> # Returns fully resolved connection, accepts hash, string or symbol.
<add> # Always returns a DatabaseConfiguration::DatabaseConfig
<add> #
<add> # == Examples
<add> #
<add> # Symbol representing current environment.
<add> #
<add> # Resolver.new("production" => {}).resolve(:production)
<add> # # => DatabaseConfigurations::HashConfig.new(env_name: "production", config: {})
<add> #
<add> # One layer deep hash of connection values.
<add> #
<add> # Resolver.new({}).resolve("adapter" => "sqlite3")
<add> # # => DatabaseConfigurations::HashConfig.new(config: {"adapter" => "sqlite3"})
<add> #
<add> # Connection URL.
<add> #
<add> # Resolver.new({}).resolve("postgresql://localhost/foo")
<add> # # => DatabaseConfigurations::UrlConfig.new(config: {"adapter" => "postgresql", "host" => "localhost", "database" => "foo"})
<add> #
<add> def resolve(config_or_env, pool_name = nil)
<add> env = ActiveRecord::ConnectionHandling::DEFAULT_ENV.call.to_s
<add>
<add> case config_or_env
<add> when Symbol
<add> resolve_symbol_connection(config_or_env, pool_name)
<add> when String
<add> DatabaseConfigurations::UrlConfig.new(env, "primary", config_or_env)
<add> when Hash
<add> DatabaseConfigurations::HashConfig.new(env, "primary", config_or_env)
<add> when DatabaseConfigurations::DatabaseConfig
<add> config_or_env
<add> else
<add> raise TypeError, "Invalid type for configuration. Expected Symbol, String, or Hash. Got #{config_or_env.inspect}"
<add> end
<add> end
<add>
<add> private
<add> # Takes the environment such as +:production+ or +:development+ and a
<add> # pool name that corresponds to the name given by the connection pool
<add> # to the connection. That pool name is merged into the hash with the
<add> # name key.
<add> #
<add> # This requires that the @configurations was initialized with a key that
<add> # matches.
<add> #
<add> # configurations = #<ActiveRecord::DatabaseConfigurations:0x00007fd9fdace3e0
<add> # @configurations=[
<add> # #<ActiveRecord::DatabaseConfigurations::HashConfig:0x00007fd9fdace250
<add> # @env_name="production", @spec_name="primary", @config={database: "my_db"}>
<add> # ]>
<add> #
<add> # Resolver.new(configurations).resolve_symbol_connection(:production, "primary")
<add> # # => DatabaseConfigurations::HashConfig(config: database: "my_db", env_name: "production", spec_name: "primary")
<add> def resolve_symbol_connection(env_name, pool_name)
<add> db_config = configurations.find_db_config(env_name)
<add>
<add> if db_config
<add> config = db_config.configuration_hash.merge(name: pool_name.to_s)
<add> DatabaseConfigurations::HashConfig.new(db_config.env_name, db_config.spec_name, config)
<add> else
<add> raise AdapterNotSpecified, <<~MSG
<add> The `#{env_name}` database is not configured for the `#{ActiveRecord::ConnectionHandling::DEFAULT_ENV.call}` environment.
<add>
<add> Available databases configurations are:
<add>
<add> #{build_configuration_sentence}
<add> MSG
<add> end
<add> end
<add>
<add> def build_configuration_sentence # :nodoc:
<add> configs = configurations.configs_for(include_replicas: true)
<add>
<add> configs.group_by(&:env_name).map do |env, config|
<add> namespaces = config.map(&:spec_name)
<add> if namespaces.size > 1
<add> "#{env}: #{namespaces.join(", ")}"
<add> else
<add> env
<add> end
<add> end.join("\n")
<add> end
<add> end
<add> end
<add>end
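`Resolver#resolve` dispatches on the type of its argument, as the docstring examples show. A toy version of that dispatch (the returned hashes are stand-ins for the `DatabaseConfig` objects Rails actually builds):

```ruby
# Toy sketch of type-based config resolution: a Symbol names an
# environment, a String is a connection URL, a Hash is taken as-is.
def resolve(config_or_env)
  case config_or_env
  when Symbol then { source: :named_environment, name: config_or_env }
  when String then { source: :url, url: config_or_env }
  when Hash   then { source: :hash }.merge(config_or_env)
  else
    raise TypeError, "Invalid type for configuration. Got #{config_or_env.inspect}"
  end
end

p resolve(:production)
p resolve("postgresql://localhost/foo")
p resolve("adapter" => "sqlite3")
```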
<ide><path>activerecord/lib/active_record/connection_handling.rb
<ide> def resolve_config_for_connection(config_or_env) # :nodoc:
<ide> pool_name = primary_class? ? "primary" : name
<ide> self.connection_specification_name = pool_name
<ide>
<del> resolver = ConnectionAdapters::ConnectionSpecification::Resolver.new(Base.configurations)
<add> resolver = ConnectionAdapters::Resolver.new(Base.configurations)
<ide> config_hash = resolver.resolve(config_or_env, pool_name).configuration_hash
<ide> config_hash[:name] = pool_name
<ide>
<ide> def primary_class? # :nodoc:
<ide> #
<ide> # Please use only for reading.
<ide> def connection_config
<del> connection_pool.spec.underlying_configuration_hash
<add> connection_pool.spec.db_config.configuration_hash
<ide> end
<ide>
<ide> def connection_pool
<ide><path>activerecord/lib/active_record/database_configurations.rb
<ide> require "active_record/database_configurations/database_config"
<ide> require "active_record/database_configurations/hash_config"
<ide> require "active_record/database_configurations/url_config"
<add>require "active_record/database_configurations/connection_url_resolver"
<ide>
<ide> module ActiveRecord
<ide> # ActiveRecord::DatabaseConfigurations returns an array of DatabaseConfig
<ide><path>activerecord/lib/active_record/database_configurations/connection_url_resolver.rb
<add># frozen_string_literal: true
<add>
<add>module ActiveRecord
<add> class DatabaseConfigurations
<add> # Expands a connection string into a hash.
<add> class ConnectionUrlResolver # :nodoc:
<add> # == Example
<add> #
<add> # url = "postgresql://foo:bar@localhost:9000/foo_test?pool=5&timeout=3000"
<add> # ConnectionUrlResolver.new(url).to_hash
<add> # # => {
<add> # adapter: "postgresql",
<add> # host: "localhost",
<add> # port: 9000,
<add> # database: "foo_test",
<add> # username: "foo",
<add> # password: "bar",
<add> # pool: "5",
<add> # timeout: "3000"
<add> # }
<add> def initialize(url)
<add> raise "Database URL cannot be empty" if url.blank?
<add> @uri = uri_parser.parse(url)
<add> @adapter = @uri.scheme && @uri.scheme.tr("-", "_")
<add> @adapter = "postgresql" if @adapter == "postgres"
<add>
<add> if @uri.opaque
<add> @uri.opaque, @query = @uri.opaque.split("?", 2)
<add> else
<add> @query = @uri.query
<add> end
<add> end
<add>
<add> # Converts the given URL to a full connection hash.
<add> def to_hash
<add> config = raw_config.compact_blank
<add> config.map { |key, value| config[key] = uri_parser.unescape(value) if value.is_a? String }
<add> config
<add> end
<add>
<add> private
<add> attr_reader :uri
<add>
<add> def uri_parser
<add> @uri_parser ||= URI::Parser.new
<add> end
<add>
<add> # Converts the query parameters of the URI into a hash.
<add> #
<add> # "localhost?pool=5&reaping_frequency=2"
<add> # # => { pool: "5", reaping_frequency: "2" }
<add> #
<add> # Returns an empty hash if no query is present.
<add> #
<add> # "localhost"
<add> # # => {}
<add> def query_hash
<add> Hash[(@query || "").split("&").map { |pair| pair.split("=") }].symbolize_keys
<add> end
<add>
<add> def raw_config
<add> if uri.opaque
<add> query_hash.merge(
<add> adapter: @adapter,
<add> database: uri.opaque
<add> )
<add> else
<add> query_hash.merge(
<add> adapter: @adapter,
<add> username: uri.user,
<add> password: uri.password,
<add> port: uri.port,
<add> database: database_from_path,
<add> host: uri.hostname
<add> )
<add> end
<add> end
<add>
<add> # Returns name of the database.
<add> def database_from_path
<add> if @adapter == "sqlite3"
<add> # 'sqlite3:/foo' is absolute, because that makes sense. The
<add> # corresponding relative version, 'sqlite3:foo', is handled
<add> # elsewhere, as an "opaque".
<add>
<add> uri.path
<add> else
<add> # Only SQLite uses a filename as the "database" name; for
<add> # anything else, a leading slash would be silly.
<add>
<add> uri.path.sub(%r{^/}, "")
<add> end
<add> end
<add> end
<add> end
<add>end
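The URL-to-hash expansion documented in the class comment above can be exercised standalone with Ruby's stdlib `URI`. This sketch (`connection_url_to_hash` is a made-up helper, not the ActiveRecord class, and it skips the `opaque`/`sqlite3` and unescaping details) mirrors the documented example:

```ruby
require "uri"

# Illustrative sketch of expanding a database URL into a configuration hash.
def connection_url_to_hash(url)
  uri = URI.parse(url)
  # "postgres" is a common alias for the postgresql adapter.
  adapter = uri.scheme == "postgres" ? "postgresql" : uri.scheme
  # Query parameters such as pool/timeout become extra hash entries.
  query = Hash[(uri.query || "").split("&").map { |pair| pair.split("=") }]
  base = {
    adapter: adapter,
    username: uri.user,
    password: uri.password,
    host: uri.hostname,
    port: uri.port,
    database: uri.path.sub(%r{^/}, "")
  }
  base.merge(query.transform_keys(&:to_sym)).reject { |_, v| v.nil? || v == "" }
end

config = connection_url_to_hash(
  "postgresql://foo:bar@localhost:9000/foo_test?pool=5&timeout=3000"
)
```

The result matches the hash shown in the `ConnectionUrlResolver` doc comment: adapter, credentials, host, port, database, plus the string-valued query options.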
<ide><path>activerecord/lib/active_record/database_configurations/database_config.rb
<ide> def config
<ide> configuration_hash.stringify_keys
<ide> end
<ide>
<del> def initialize_dup(original)
<del> @config = original.configuration_hash.dup
<add> def adapter_method
<add> "#{adapter}_connection"
<add> end
<add>
<add> def adapter
<add> configuration_hash[:adapter]
<ide> end
<ide>
<ide> def replica?
<ide><path>activerecord/lib/active_record/database_configurations/hash_config.rb
<ide> def migrations_paths
<ide> private
<ide> def resolve_url_key
<ide> if configuration_hash[:url] && !configuration_hash[:url].match?(/^jdbc:/)
<del> connection_hash = ActiveRecord::ConnectionAdapters::ConnectionSpecification::ConnectionUrlResolver.new(configuration_hash[:url]).to_hash
<add> connection_hash = ConnectionUrlResolver.new(configuration_hash[:url]).to_hash
<ide> configuration_hash.merge!(connection_hash)
<ide> end
<ide> end
<ide><path>activerecord/lib/active_record/database_configurations/url_config.rb
<ide> def build_url_hash(url)
<ide> if url.nil? || /^jdbc:/.match?(url)
<ide> { url: url }
<ide> else
<del> ActiveRecord::ConnectionAdapters::ConnectionSpecification::ConnectionUrlResolver.new(url).to_hash
<add> ConnectionUrlResolver.new(url).to_hash
<ide> end
<ide> end
<ide>
<ide><path>activerecord/lib/active_record/tasks/database_tasks.rb
<ide> def current_config(options = {})
<ide> if options.has_key?(:config)
<ide> @current_config = options[:config]
<ide> else
<del> @current_config ||= ActiveRecord::Base.configurations.configs_for(env_name: options[:env], spec_name: options[:spec]).underlying_configuration_hash
<add> @current_config ||= ActiveRecord::Base.configurations.configs_for(env_name: options[:env], spec_name: options[:spec]).db_config.configuration_hash
<ide> end
<ide> end
<ide>
<ide> def create_all
<ide> old_pool = ActiveRecord::Base.connection_handler.retrieve_connection_pool(ActiveRecord::Base.connection_specification_name)
<ide> each_local_configuration { |configuration| create configuration }
<ide> if old_pool
<del> ActiveRecord::Base.connection_handler.establish_connection(old_pool.spec.to_hash)
<add> ActiveRecord::Base.connection_handler.establish_connection(old_pool.spec.db_config.configuration_hash)
<ide> end
<ide> end
<ide>
<ide><path>activerecord/test/cases/adapters/mysql2/schema_test.rb
<ide> class Mysql2SchemaTest < ActiveRecord::Mysql2TestCase
<ide>
<ide> def setup
<ide> @connection = ActiveRecord::Base.connection
<del> db = Post.connection_pool.spec.underlying_configuration_hash[:database]
<add> db = Post.connection_pool.spec.db_config.configuration_hash[:database]
<ide> table = Post.table_name
<ide> @db_name = db
<ide>
<ide><path>activerecord/test/cases/connection_adapters/adapter_leasing_test.rb
<ide> def test_expire_mutates_in_use
<ide>
<ide> def test_close
<ide> db_config = ActiveRecord::DatabaseConfigurations::HashConfig.new("test", "primary", {})
<del> pool = Pool.new(ConnectionSpecification.new("primary", db_config, nil))
<add> pool = Pool.new(ConnectionSpecification.new("primary", db_config))
<ide> pool.insert_connection_for_test! @adapter
<ide> @adapter.pool = pool
<ide>
<ide><path>activerecord/test/cases/connection_adapters/connection_handler_test.rb
<ide> def test_default_env_fall_back_to_default_env_when_rails_env_or_rack_env_is_empt
<ide> ENV["RACK_ENV"] = original_rack_env
<ide> end
<ide>
<del> def test_establish_connection_uses_spec_name
<add> def test_establish_connection_uses_config_hash_with_spec_name
<ide> old_config = ActiveRecord::Base.configurations
<ide> config = { "readonly" => { "adapter" => "sqlite3", "pool" => "5" } }
<ide> ActiveRecord::Base.configurations = config
<del> resolver = ConnectionAdapters::ConnectionSpecification::Resolver.new(ActiveRecord::Base.configurations)
<del> spec = resolver.spec(:readonly)
<del> @handler.establish_connection(spec.to_hash)
<add> resolver = ConnectionAdapters::Resolver.new(ActiveRecord::Base.configurations)
<add> config_hash = resolver.resolve(config["readonly"], "readonly").configuration_hash
<add> config_hash[:name] = "readonly"
<add> @handler.establish_connection(config_hash)
<ide>
<ide> assert_not_nil @handler.retrieve_connection_pool("readonly")
<ide> ensure
<ide> def test_establish_connection_using_3_levels_config
<ide> @handler.establish_connection(:readonly)
<ide>
<ide> assert_not_nil pool = @handler.retrieve_connection_pool("readonly")
<del> assert_equal "db/readonly.sqlite3", pool.spec.underlying_configuration_hash[:database]
<add> assert_equal "db/readonly.sqlite3", pool.spec.db_config.configuration_hash[:database]
<ide>
<ide> assert_not_nil pool = @handler.retrieve_connection_pool("primary")
<del> assert_equal "db/primary.sqlite3", pool.spec.underlying_configuration_hash[:database]
<add> assert_equal "db/primary.sqlite3", pool.spec.db_config.configuration_hash[:database]
<ide>
<ide> assert_not_nil pool = @handler.retrieve_connection_pool("common")
<del> assert_equal "db/common.sqlite3", pool.spec.underlying_configuration_hash[:database]
<add> assert_equal "db/common.sqlite3", pool.spec.db_config.configuration_hash[:database]
<ide> ensure
<ide> ActiveRecord::Base.configurations = @prev_configs
<ide> ENV["RAILS_ENV"] = previous_env
<ide> def test_establish_connection_using_3_level_config_defaults_to_default_env_prima
<ide>
<ide> ActiveRecord::Base.establish_connection
<ide>
<del> assert_match "db/primary.sqlite3", ActiveRecord::Base.connection.pool.spec.underlying_configuration_hash[:database]
<add> assert_match "db/primary.sqlite3", ActiveRecord::Base.connection.pool.spec.db_config.configuration_hash[:database]
<ide> ensure
<ide> ActiveRecord::Base.configurations = @prev_configs
<ide> ENV["RAILS_ENV"] = previous_env
<ide> def test_establish_connection_using_2_level_config_defaults_to_default_env_prima
<ide>
<ide> ActiveRecord::Base.establish_connection
<ide>
<del> assert_match "db/primary.sqlite3", ActiveRecord::Base.connection.pool.spec.underlying_configuration_hash[:database]
<add> assert_match "db/primary.sqlite3", ActiveRecord::Base.connection.pool.spec.db_config.configuration_hash[:database]
<ide> ensure
<ide> ActiveRecord::Base.configurations = @prev_configs
<ide> ENV["RAILS_ENV"] = previous_env
<ide> def test_establish_connection_using_two_level_configurations
<ide> @handler.establish_connection(:development)
<ide>
<ide> assert_not_nil pool = @handler.retrieve_connection_pool("development")
<del> assert_equal "db/primary.sqlite3", pool.spec.underlying_configuration_hash[:database]
<add> assert_equal "db/primary.sqlite3", pool.spec.db_config.configuration_hash[:database]
<ide> ensure
<ide> ActiveRecord::Base.configurations = @prev_configs
<ide> end
<ide> def test_establish_connection_using_top_level_key_in_two_level_config
<ide> @handler.establish_connection(:development_readonly)
<ide>
<ide> assert_not_nil pool = @handler.retrieve_connection_pool("development_readonly")
<del> assert_equal "db/readonly.sqlite3", pool.spec.underlying_configuration_hash[:database]
<add> assert_equal "db/readonly.sqlite3", pool.spec.db_config.configuration_hash[:database]
<ide> ensure
<ide> ActiveRecord::Base.configurations = @prev_configs
<ide> end
<ide> def test_a_class_using_custom_pool_and_switching_back_to_primary
<ide>
<ide> assert_same klass2.connection, ActiveRecord::Base.connection
<ide>
<del> pool = klass2.establish_connection(ActiveRecord::Base.connection_pool.spec.underlying_configuration_hash)
<add> pool = klass2.establish_connection(ActiveRecord::Base.connection_pool.spec.db_config.configuration_hash)
<ide> assert_same klass2.connection, pool.connection
<ide> assert_not_same klass2.connection, ActiveRecord::Base.connection
<ide>
<ide><path>activerecord/test/cases/connection_adapters/connection_handlers_multi_db_test.rb
<ide> def test_establish_connection_using_3_levels_config
<ide> ActiveRecord::Base.connects_to(database: { writing: :primary, reading: :readonly })
<ide>
<ide> assert_not_nil pool = ActiveRecord::Base.connection_handlers[:writing].retrieve_connection_pool("primary")
<del> assert_equal "db/primary.sqlite3", pool.spec.underlying_configuration_hash[:database]
<add> assert_equal "db/primary.sqlite3", pool.spec.db_config.configuration_hash[:database]
<ide>
<ide> assert_not_nil pool = ActiveRecord::Base.connection_handlers[:reading].retrieve_connection_pool("primary")
<del> assert_equal "db/readonly.sqlite3", pool.spec.underlying_configuration_hash[:database]
<add> assert_equal "db/readonly.sqlite3", pool.spec.db_config.configuration_hash[:database]
<ide> ensure
<ide> ActiveRecord::Base.configurations = @prev_configs
<ide> ActiveRecord::Base.establish_connection(:arunit)
<ide> def test_establish_connection_using_3_levels_config_with_non_default_handlers
<ide> ActiveRecord::Base.connects_to(database: { default: :primary, readonly: :readonly })
<ide>
<ide> assert_not_nil pool = ActiveRecord::Base.connection_handlers[:default].retrieve_connection_pool("primary")
<del> assert_equal "db/primary.sqlite3", pool.spec.underlying_configuration_hash[:database]
<add> assert_equal "db/primary.sqlite3", pool.spec.db_config.configuration_hash[:database]
<ide>
<ide> assert_not_nil pool = ActiveRecord::Base.connection_handlers[:readonly].retrieve_connection_pool("primary")
<del> assert_equal "db/readonly.sqlite3", pool.spec.underlying_configuration_hash[:database]
<add> assert_equal "db/readonly.sqlite3", pool.spec.db_config.configuration_hash[:database]
<ide> ensure
<ide> ActiveRecord::Base.configurations = @prev_configs
<ide> ActiveRecord::Base.establish_connection(:arunit)
<ide> def test_switching_connections_with_database_url
<ide> assert_equal handler, ActiveRecord::Base.connection_handlers[:writing]
<ide>
<ide> assert_not_nil pool = handler.retrieve_connection_pool("primary")
<del> assert_equal({ adapter: "postgresql", database: "bar", host: "localhost" }, pool.spec.underlying_configuration_hash)
<add> assert_equal({ adapter: "postgresql", database: "bar", host: "localhost" }, pool.spec.db_config.configuration_hash)
<ide> end
<ide> ensure
<ide> ActiveRecord::Base.establish_connection(:arunit)
<ide> def test_switching_connections_with_database_config_hash
<ide> assert_equal handler, ActiveRecord::Base.connection_handlers[:writing]
<ide>
<ide> assert_not_nil pool = handler.retrieve_connection_pool("primary")
<del> assert_equal(config, pool.spec.underlying_configuration_hash)
<add> assert_equal(config, pool.spec.db_config.configuration_hash)
<ide> end
<ide> ensure
<ide> ActiveRecord::Base.establish_connection(:arunit)
<ide> def test_switching_connections_with_database_symbol_uses_default_role
<ide> assert_equal handler, ActiveRecord::Base.connection_handlers[:writing]
<ide>
<ide> assert_not_nil pool = handler.retrieve_connection_pool("primary")
<del> assert_equal(config["default_env"]["animals"], pool.spec.underlying_configuration_hash)
<add> assert_equal(config["default_env"]["animals"], pool.spec.db_config.configuration_hash)
<ide> end
<ide> ensure
<ide> ActiveRecord::Base.configurations = @prev_configs
<ide> def test_switching_connections_with_database_hash_uses_passed_role_and_database
<ide> assert_equal handler, ActiveRecord::Base.connection_handlers[:writing]
<ide>
<ide> assert_not_nil pool = handler.retrieve_connection_pool("primary")
<del> assert_equal(config["default_env"]["primary"], pool.spec.underlying_configuration_hash)
<add> assert_equal(config["default_env"]["primary"], pool.spec.db_config.configuration_hash)
<ide> end
<ide> ensure
<ide> ActiveRecord::Base.configurations = @prev_configs
<ide> def test_connects_to_using_top_level_key_in_two_level_config
<ide> ActiveRecord::Base.connects_to database: { writing: :development, reading: :development_readonly }
<ide>
<ide> assert_not_nil pool = ActiveRecord::Base.connection_handlers[:reading].retrieve_connection_pool("primary")
<del> assert_equal "db/readonly.sqlite3", pool.spec.underlying_configuration_hash[:database]
<add> assert_equal "db/readonly.sqlite3", pool.spec.db_config.configuration_hash[:database]
<ide> ensure
<ide> ActiveRecord::Base.configurations = @prev_configs
<ide> ActiveRecord::Base.establish_connection(:arunit)
<ide><path>activerecord/test/cases/connection_adapters/connection_specification_test.rb
<del># frozen_string_literal: true
<del>
<del>require "cases/helper"
<del>
<del>module ActiveRecord
<del> module ConnectionAdapters
<del> class ConnectionSpecificationTest < ActiveRecord::TestCase
<del> def test_dup_deep_copy_config
<del> db_config = ActiveRecord::DatabaseConfigurations::HashConfig.new("development", "primary", { a: :b })
<del> spec = ConnectionSpecification.new("primary", db_config, "bar")
<del> assert_not_equal(spec.underlying_configuration_hash.object_id, spec.dup.underlying_configuration_hash.object_id)
<del> end
<del> end
<del> end
<del>end
<ide><path>activerecord/test/cases/connection_adapters/merge_and_resolve_default_url_config_test.rb
<ide> def resolve_config(config)
<ide>
<ide> def resolve_spec(spec, config)
<ide> configs = ActiveRecord::DatabaseConfigurations.new(config)
<del> resolver = ConnectionAdapters::ConnectionSpecification::Resolver.new(configs)
<add> resolver = ConnectionAdapters::Resolver.new(configs)
<ide> resolver.resolve(spec, spec).configuration_hash
<ide> end
<ide>
<ide><path>activerecord/test/cases/connection_pool_test.rb
<ide> def test_reap_inactive
<ide> def test_idle_timeout_configuration
<ide> @pool.disconnect!
<ide> spec = ActiveRecord::Base.connection_pool.spec
<del> spec.underlying_configuration_hash.merge!(idle_timeout: "0.02")
<add> spec.db_config.configuration_hash.merge!(idle_timeout: "0.02")
<ide> @pool = ConnectionPool.new(spec)
<ide> idle_conn = @pool.checkout
<ide> @pool.checkin(idle_conn)
<ide> def test_idle_timeout_configuration
<ide> def test_disable_flush
<ide> @pool.disconnect!
<ide> spec = ActiveRecord::Base.connection_pool.spec
<del> spec.underlying_configuration_hash.merge!(idle_timeout: -5)
<add> spec.db_config.configuration_hash.merge!(idle_timeout: -5)
<ide> @pool = ConnectionPool.new(spec)
<ide> idle_conn = @pool.checkout
<ide> @pool.checkin(idle_conn)
<ide> def test_public_connections_access_threadsafe
<ide>
<ide> private
<ide> def with_single_connection_pool
<del> one_conn_spec = ActiveRecord::Base.connection_pool.spec.dup
<del> one_conn_spec.underlying_configuration_hash[:pool] = 1 # this is safe to do, because .dupped ConnectionSpecification also auto-dups its config
<add> old_config = ActiveRecord::Base.connection_pool.spec.db_config.configuration_hash
<add> db_config = ActiveRecord::DatabaseConfigurations::HashConfig.new("arunit", "primary", old_config.dup)
<add> one_conn_spec = ConnectionSpecification.new("primary", db_config)
<add>
<add> one_conn_spec.db_config.configuration_hash[:pool] = 1 # safe to mutate: the configuration hash was explicitly dup'ed above
<add>
<ide> yield(pool = ConnectionPool.new(one_conn_spec))
<ide> ensure
<ide> pool.disconnect! if pool
<ide><path>activerecord/test/cases/connection_specification/resolver_test.rb
<ide> class ConnectionSpecification
<ide> class ResolverTest < ActiveRecord::TestCase
<ide> def resolve(spec, config = {})
<ide> configs = ActiveRecord::DatabaseConfigurations.new(config)
<del> resolver = ConnectionAdapters::ConnectionSpecification::Resolver.new(configs)
<add> resolver = ConnectionAdapters::Resolver.new(configs)
<ide> resolver.resolve(spec, spec).configuration_hash
<ide> end
<ide>
<ide> def spec(spec, config = {})
<ide> configs = ActiveRecord::DatabaseConfigurations.new(config)
<del> resolver = ConnectionAdapters::ConnectionSpecification::Resolver.new(configs)
<add> resolver = ConnectionAdapters::Resolver.new(configs)
<ide> resolver.spec(spec)
<ide> end
<ide>
<ide><path>activerecord/test/cases/helper.rb
<ide> def current_adapter?(*types)
<ide>
<ide> def in_memory_db?
<ide> current_adapter?(:SQLite3Adapter) &&
<del> ActiveRecord::Base.connection_pool.spec.underlying_configuration_hash[:database] == ":memory:"
<add> ActiveRecord::Base.connection_pool.spec.db_config.configuration_hash[:database] == ":memory:"
<ide> end
<ide>
<ide> def subsecond_precision_supported?
<ide><path>activerecord/test/cases/reaper_test.rb
<ide> def test_some_time
<ide> end
<ide>
<ide> def test_pool_has_reaper
<del> spec = ActiveRecord::Base.connection_pool.spec.dup
<add> config = ActiveRecord::Base.configurations.configs_for(env_name: "arunit", spec_name: "primary")
<add> spec = ConnectionSpecification.new("primary", config)
<ide> pool = ConnectionPool.new spec
<ide>
<ide> assert pool.reaper
<ide> def test_pool_has_reaper
<ide> end
<ide>
<ide> def test_reaping_frequency_configuration
<del> spec = ActiveRecord::Base.connection_pool.spec.dup
<del> spec.underlying_configuration_hash[:reaping_frequency] = "10.01"
<add> spec = duplicated_spec
<add> spec.db_config.configuration_hash[:reaping_frequency] = "10.01"
<add>
<ide> pool = ConnectionPool.new spec
<add>
<ide> assert_equal 10.01, pool.reaper.frequency
<ide> ensure
<ide> pool.discard!
<ide> end
<ide>
<ide> def test_connection_pool_starts_reaper
<del> spec = ActiveRecord::Base.connection_pool.spec.dup
<del> spec.underlying_configuration_hash[:reaping_frequency] = "0.0001"
<add> spec = duplicated_spec
<add> spec.db_config.configuration_hash[:reaping_frequency] = "0.0001"
<ide>
<ide> pool = ConnectionPool.new spec
<ide>
<ide> def test_connection_pool_starts_reaper
<ide> end
<ide>
<ide> def test_reaper_works_after_pool_discard
<del> spec = ActiveRecord::Base.connection_pool.spec.dup
<del> spec.underlying_configuration_hash[:reaping_frequency] = "0.0001"
<add> spec = duplicated_spec
<add> spec.db_config.configuration_hash[:reaping_frequency] = "0.0001"
<ide>
<ide> 2.times do
<ide> pool = ConnectionPool.new spec
<ide> def test_reaper_works_after_pool_discard
<ide> # This doesn't test the reaper directly, but we want to test the action
<ide> # it would take on a discarded pool
<ide> def test_reap_flush_on_discarded_pool
<del> spec = ActiveRecord::Base.connection_pool.spec.dup
<add> spec = duplicated_spec
<ide> pool = ConnectionPool.new spec
<ide>
<ide> pool.discard!
<ide> def test_reap_flush_on_discarded_pool
<ide> end
<ide>
<ide> def test_connection_pool_starts_reaper_in_fork
<del> spec = ActiveRecord::Base.connection_pool.spec.dup
<del> spec.underlying_configuration_hash[:reaping_frequency] = "0.0001"
<add> spec = duplicated_spec
<add> spec.db_config.configuration_hash[:reaping_frequency] = "0.0001"
<ide>
<ide> pool = ConnectionPool.new spec
<ide> pool.checkout
<ide> def test_reaper_does_not_reap_discarded_connection_pools
<ide> pool.discard!
<ide> end
<ide>
<del> def new_conn_in_thread(pool)
<del> event = Concurrent::Event.new
<del> conn = nil
<del>
<del> child = Thread.new do
<del> conn = pool.checkout
<del> event.set
<del> Thread.stop
<add> private
<add> def duplicated_spec
<add> old_config = ActiveRecord::Base.connection_pool.spec.db_config.configuration_hash
<add> db_config = ActiveRecord::DatabaseConfigurations::HashConfig.new("arunit", "primary", old_config.dup)
<add> ConnectionSpecification.new("primary", db_config)
<ide> end
<ide>
<del> event.wait
<del> [conn, child]
<del> end
<add> def new_conn_in_thread(pool)
<add> event = Concurrent::Event.new
<add> conn = nil
<ide>
<del> def wait_for_conn_idle(conn, timeout = 5)
<del> start = Process.clock_gettime(Process::CLOCK_MONOTONIC)
<del> while conn.in_use? && Process.clock_gettime(Process::CLOCK_MONOTONIC) - start < timeout
<del> Thread.pass
<add> child = Thread.new do
<add> conn = pool.checkout
<add> event.set
<add> Thread.stop
<add> end
<add>
<add> event.wait
<add> [conn, child]
<add> end
<add>
<add> def wait_for_conn_idle(conn, timeout = 5)
<add> start = Process.clock_gettime(Process::CLOCK_MONOTONIC)
<add> while conn.in_use? && Process.clock_gettime(Process::CLOCK_MONOTONIC) - start < timeout
<add> Thread.pass
<add> end
<ide> end
<del> end
<ide> end
<ide> end
<ide> end
<ide><path>railties/test/application/rake/multi_dbs_test.rb
<ide> class TwoMigration < ActiveRecord::Migration::Current
<ide> db_migrate_and_schema_dump_and_load "schema"
<ide>
<ide> app_file "db/seeds.rb", <<-RUBY
<del> print Book.connection.pool.spec.underlying_configuration_hash[:database]
<add> print Book.connection.pool.spec.db_config.configuration_hash[:database]
<ide> RUBY
<ide>
<ide> output = rails("db:seed") | 22 |
Javascript | Javascript | expose posix realpath on windows as well | 6332a4cf00425c63ae476d89f6705881eb06a3e1 | <ide><path>lib/fs.js
<ide> fs.unwatchFile = function(filename) {
<ide>
<ide> var normalize = pathModule.normalize;
<ide>
<del>if (isWindows) {
<del> // Node doesn't support symlinks / lstat on windows. Hence realpath is just
<del> // the same as path.resolve that fails if the path doesn't exists.
<del>
<del> // windows version
<del> fs.realpathSync = function realpathSync(p, cache) {
<del> p = pathModule.resolve(p);
<del> if (cache && Object.prototype.hasOwnProperty.call(cache, p)) {
<del> return cache[p];
<del> }
<del> fs.statSync(p);
<del> if (cache) cache[p] = p;
<del> return p;
<del> };
<del>
<del> // windows version
<del> fs.realpath = function(p, cache, cb) {
<del> if (typeof cb !== 'function') {
<del> cb = cache;
<del> cache = null;
<del> }
<del> p = pathModule.resolve(p);
<del> if (cache && Object.prototype.hasOwnProperty.call(cache, p)) {
<del> return cb(null, cache[p]);
<del> }
<del> fs.stat(p, function(err) {
<del> if (err) return cb(err);
<del> if (cache) cache[p] = p;
<del> cb(null, p);
<del> });
<del> };
<add>// Regexp that finds the next portion of a (partial) path
<add>// result is [base_with_slash, base], e.g. ['somedir/', 'somedir']
<add>var nextPartRe = /(.*?)(?:[\/]+|$)/g;
<ide>
<add>fs.realpathSync = function realpathSync(p, cache) {
<add> // make sure p is absolute
<add> p = pathModule.resolve(p);
<ide>
<del>} else /* posix */ {
<del>
<del> // Regexp that finds the next partion of a (partial) path
<del> // result is [base_with_slash, base], e.g. ['somedir/', 'somedir']
<del> var nextPartRe = /(.*?)(?:[\/]+|$)/g;
<add> if (cache && Object.prototype.hasOwnProperty.call(cache, p)) {
<add> return cache[p];
<add> }
<ide>
<del> // posix version
<del> fs.realpathSync = function realpathSync(p, cache) {
<del> // make p is absolute
<del> p = pathModule.resolve(p);
<add> var original = p,
<add> seenLinks = {},
<add> knownHard = {};
<ide>
<del> if (cache && Object.prototype.hasOwnProperty.call(cache, p)) {
<del> return cache[p];
<add> // current character position in p
<add> var pos = 0;
<add> // the partial path so far, including a trailing slash if any
<add> var current = '';
<add> // the partial path without a trailing slash
<add> var base = '';
<add> // the partial path scanned in the previous round, with slash
<add> var previous = '';
<add>
<add> // walk down the path, swapping out linked pathparts for their real
<add> // values
<add> // NB: p.length changes.
<add> while (pos < p.length) {
<add> // find the next part
<add> nextPartRe.lastIndex = pos;
<add> var result = nextPartRe.exec(p);
<add> previous = current;
<add> current += result[0];
<add> base = previous + result[1];
<add> pos = nextPartRe.lastIndex;
<add>
<add> // continue if not a symlink, or if root
<add> if (!base || knownHard[base] || (cache && cache[base] === base)) {
<add> continue;
<ide> }
<ide>
<del> var original = p,
<del> seenLinks = {},
<del> knownHard = {};
<del>
<del> // current character position in p
<del> var pos = 0;
<del> // the partial path so far, including a trailing slash if any
<del> var current = '';
<del> // the partial path without a trailing slash
<del> var base = '';
<del> // the partial path scanned in the previous round, with slash
<del> var previous = '';
<del>
<del> // walk down the path, swapping out linked pathparts for their real
<del> // values
<del> // NB: p.length changes.
<del> while (pos < p.length) {
<del> // find the next part
<del> nextPartRe.lastIndex = pos;
<del> var result = nextPartRe.exec(p);
<del> previous = current;
<del> current += result[0];
<del> base = previous + result[1];
<del> pos = nextPartRe.lastIndex;
<del>
<del> // continue if not a symlink, or if root
<del> if (!base || knownHard[base] || (cache && cache[base] === base)) {
<add> var resolvedLink;
<add> if (cache && Object.prototype.hasOwnProperty.call(cache, base)) {
<add> // some known symbolic link. no need to stat again.
<add> resolvedLink = cache[base];
<add> } else {
<add> var stat = fs.lstatSync(base);
<add> if (!stat.isSymbolicLink()) {
<add> knownHard[base] = true;
<add> if (cache) cache[base] = base;
<ide> continue;
<ide> }
<ide>
<del> var resolvedLink;
<del> if (cache && Object.prototype.hasOwnProperty.call(cache, base)) {
<del> // some known symbolic link. no need to stat again.
<del> resolvedLink = cache[base];
<del> } else {
<del> var stat = fs.lstatSync(base);
<del> if (!stat.isSymbolicLink()) {
<del> knownHard[base] = true;
<del> if (cache) cache[base] = base;
<del> continue;
<del> }
<del>
<del> // read the link if it wasn't read before
<del> var id = stat.dev.toString(32) + ':' + stat.ino.toString(32);
<del> if (!seenLinks[id]) {
<del> fs.statSync(base);
<del> seenLinks[id] = fs.readlinkSync(base);
<del> resolvedLink = pathModule.resolve(previous, seenLinks[id]);
<del> // track this, if given a cache.
<del> if (cache) cache[base] = resolvedLink;
<del> }
<add> // read the link if it wasn't read before
<add> var id = stat.dev.toString(32) + ':' + stat.ino.toString(32);
<add> if (!seenLinks[id]) {
<add> fs.statSync(base);
<add> seenLinks[id] = fs.readlinkSync(base);
<add> resolvedLink = pathModule.resolve(previous, seenLinks[id]);
<add> // track this, if given a cache.
<add> if (cache) cache[base] = resolvedLink;
<ide> }
<del>
<del> // resolve the link, then start over
<del> p = pathModule.resolve(resolvedLink, p.slice(pos));
<del> pos = 0;
<del> previous = base = current = '';
<ide> }
<ide>
<del> if (cache) cache[original] = p;
<add> // resolve the link, then start over
<add> p = pathModule.resolve(resolvedLink, p.slice(pos));
<add> pos = 0;
<add> previous = base = current = '';
<add> }
<ide>
<del> return p;
<del> };
<add> if (cache) cache[original] = p;
<ide>
<add> return p;
<add>};
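The component-walking loop of `realpathSync` above translates readily into other languages. For comparison, here is a POSIX-style Ruby re-sketch (illustrative only; `realpath_sketch` is a made-up helper, and it omits the cache and the `seenLinks` cycle bookkeeping of the JavaScript version):

```ruby
require "tmpdir" # used by the symlink demo in the accompanying usage

# Walk the path component by component; when a component is a symlink,
# splice in its target and restart the scan from the beginning,
# exactly as the JS loop resets pos/previous/base/current.
def realpath_sketch(path)
  p = File.expand_path(path)
  pos = 1 # skip the leading "/"
  while pos <= p.length
    slash = p.index("/", pos) || p.length
    base = p[0...slash] # the partial path scanned so far, without trailing slash
    if File.symlink?(base)
      target = File.readlink(base)
      # resolve the link relative to its directory, then start over
      p = File.expand_path(target, File.dirname(base)) + p[slash..-1].to_s
      pos = 1
    else
      pos = slash + 1
    end
  end
  p
end
```

Given a directory reachable through a symlink, `realpath_sketch` returns the same fully resolved path as `File.realpath` (on POSIX systems; symlink cycles would loop forever in this simplified version).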
<ide>
<del> // posix version
<del> fs.realpath = function realpath(p, cache, cb) {
<del> if (typeof cb !== 'function') {
<del> cb = cache;
<del> cache = null;
<del> }
<ide>
<del> // make p is absolute
<del> p = pathModule.resolve(p);
<add>fs.realpath = function realpath(p, cache, cb) {
<add> if (typeof cb !== 'function') {
<add> cb = cache;
<add> cache = null;
<add> }
<ide>
<del> if (cache && Object.prototype.hasOwnProperty.call(cache, p)) {
<del> return cb(null, cache[p]);
<del> }
<add> // make sure p is absolute
<add> p = pathModule.resolve(p);
<ide>
<del> var original = p,
<del> seenLinks = {},
<del> knownHard = {};
<del>
<del> // current character position in p
<del> var pos = 0;
<del> // the partial path so far, including a trailing slash if any
<del> var current = '';
<del> // the partial path without a trailing slash
<del> var base = '';
<del> // the partial path scanned in the previous round, with slash
<del> var previous = '';
<del>
<del> // walk down the path, swapping out linked pathparts for their real
<del> // values
<del> LOOP();
<del> function LOOP() {
<del> // stop if scanned past end of path
<del> if (pos >= p.length) {
<del> if (cache) cache[original] = p;
<del> return cb(null, p);
<del> }
<add> if (cache && Object.prototype.hasOwnProperty.call(cache, p)) {
<add> return cb(null, cache[p]);
<add> }
<ide>
<del> // find the next part
<del> nextPartRe.lastIndex = pos;
<del> var result = nextPartRe.exec(p);
<del> previous = current;
<del> current += result[0];
<del> base = previous + result[1];
<del> pos = nextPartRe.lastIndex;
<del>
<del> // continue if known to be hard or if root or in cache already.
<del> if (!base || knownHard[base] || (cache && cache[base] === base)) {
<del> return process.nextTick(LOOP);
<del> }
<add> var original = p,
<add> seenLinks = {},
<add> knownHard = {};
<ide>
<del> if (cache && Object.prototype.hasOwnProperty.call(cache, base)) {
<del> // known symbolic link. no need to stat again.
<del> return gotResolvedLink(cache[base]);
<del> }
<add> // current character position in p
<add> var pos = 0;
<add> // the partial path so far, including a trailing slash if any
<add> var current = '';
<add> // the partial path without a trailing slash
<add> var base = '';
<add> // the partial path scanned in the previous round, with slash
<add> var previous = '';
<add>
<add> // walk down the path, swapping out linked pathparts for their real
<add> // values
<add> LOOP();
<add> function LOOP() {
<add> // stop if scanned past end of path
<add> if (pos >= p.length) {
<add> if (cache) cache[original] = p;
<add> return cb(null, p);
<add> }
<add>
<add> // find the next part
<add> nextPartRe.lastIndex = pos;
<add> var result = nextPartRe.exec(p);
<add> previous = current;
<add> current += result[0];
<add> base = previous + result[1];
<add> pos = nextPartRe.lastIndex;
<ide>
<del> return fs.lstat(base, gotStat);
<add> // continue if known to be hard or if root or in cache already.
<add> if (!base || knownHard[base] || (cache && cache[base] === base)) {
<add> return process.nextTick(LOOP);
<ide> }
<ide>
<del> function gotStat(err, stat) {
<del> if (err) return cb(err);
<add> if (cache && Object.prototype.hasOwnProperty.call(cache, base)) {
<add> // known symbolic link. no need to stat again.
<add> return gotResolvedLink(cache[base]);
<add> }
<ide>
<del> // if not a symlink, skip to the next path part
<del> if (!stat.isSymbolicLink()) {
<del> knownHard[base] = true;
<del> if (cache) cache[base] = base;
<del> return process.nextTick(LOOP);
<del> }
<add> return fs.lstat(base, gotStat);
<add> }
<ide>
<del> // stat & read the link if not read before
<del> // call gotTarget as soon as the link target is known
<del> var id = stat.dev.toString(32) + ':' + stat.ino.toString(32);
<del> if (seenLinks[id]) {
<del> return gotTarget(null, seenLinks[id], base);
<del> }
<del> fs.stat(base, function(err) {
<del> if (err) return cb(err);
<add> function gotStat(err, stat) {
<add> if (err) return cb(err);
<ide>
<del> fs.readlink(base, function(err, target) {
<del> gotTarget(err, seenLinks[id] = target);
<del> });
<del> });
<add> // if not a symlink, skip to the next path part
<add> if (!stat.isSymbolicLink()) {
<add> knownHard[base] = true;
<add> if (cache) cache[base] = base;
<add> return process.nextTick(LOOP);
<ide> }
<ide>
<del> function gotTarget(err, target, base) {
<add> // stat & read the link if not read before
<add> // call gotTarget as soon as the link target is known
<add> var id = stat.dev.toString(32) + ':' + stat.ino.toString(32);
<add> if (seenLinks[id]) {
<add> return gotTarget(null, seenLinks[id], base);
<add> }
<add> fs.stat(base, function(err) {
<ide> if (err) return cb(err);
<ide>
<del> var resolvedLink = pathModule.resolve(previous, target);
<del> if (cache) cache[base] = resolvedLink;
<del> gotResolvedLink(resolvedLink);
<del> }
<add> fs.readlink(base, function(err, target) {
<add> gotTarget(err, seenLinks[id] = target);
<add> });
<add> });
<add> }
<ide>
<del> function gotResolvedLink(resolvedLink) {
<add> function gotTarget(err, target, base) {
<add> if (err) return cb(err);
<ide>
<del> // resolve the link, then start over
<del> p = pathModule.resolve(resolvedLink, p.slice(pos));
<del> pos = 0;
<del> previous = base = current = '';
<add> var resolvedLink = pathModule.resolve(previous, target);
<add> if (cache) cache[base] = resolvedLink;
<add> gotResolvedLink(resolvedLink);
<add> }
<ide>
<del> return process.nextTick(LOOP);
<del> }
<del> };
<add> function gotResolvedLink(resolvedLink) {
<add>
<add> // resolve the link, then start over
<add> p = pathModule.resolve(resolvedLink, p.slice(pos));
<add> pos = 0;
<add> previous = base = current = '';
<add>
<add> return process.nextTick(LOOP);
<add> }
<add>};
<ide>
<del>}
<ide>
<ide>
<ide> var pool; | 1 |
Python | Python | remove use of q objects | 894f63259880252ed5317ce485eb13c4429b65c1 | <ide><path>djangorestframework/mixins.py
<ide> from django.contrib.auth.models import AnonymousUser
<ide> from django.core.paginator import Paginator
<ide> from django.db.models.fields.related import ForeignKey
<del>from django.db.models.query import Q
<ide> from django.http import HttpResponse
<ide> from urlobject import URLObject
<ide>
<ide> class ModelMixin(object):
<ide>
<ide> queryset = None
<ide>
<del> def build_query(self, *args, **kwargs):
<del> """ Returns django.db.models.Q object to be used for the objects retrival.
<del>
<del> Arguments:
<del> - args: unnamed URL arguments
<del> - kwargs: named URL arguments
<del>
<del> If a URL passes any arguments to the view being the QueryMixin subclass
<del> build_query manages the arguments and provides the Q object that will be
<del> used for the objects retrival with filter/get queryset methods.
<del>
<del> Technically, neither args nor kwargs have to be provided, however the default
<del> behaviour is to map all kwargs as the query constructors so that if this
<del> method is not overriden only kwargs keys being model fields are valid.
<del>
<del> If positional args are provided, the last one argument is understood
<del> as the primary key. However this usage should be considered
<del> deperecated, and will be removed in a future version.
<add> def get_query_kwargs(self, *args, **kwargs):
<add> """
<add> Return a dict of kwargs that will be used to build the
<add> model instance retrieval or to filter querysets.
<ide> """
<ide>
<del> tmp = dict(kwargs)
<add> kwargs = dict(kwargs)
<ide>
<ide> # If the URLconf includes a .(?P<format>\w+) pattern to match against
<ide> # a .json, .xml suffix, then drop the 'format' kwarg before
<ide> # constructing the query.
<del> if BaseRenderer._FORMAT_QUERY_PARAM in tmp:
<del> del tmp[BaseRenderer._FORMAT_QUERY_PARAM]
<add> if BaseRenderer._FORMAT_QUERY_PARAM in kwargs:
<add> del kwargs[BaseRenderer._FORMAT_QUERY_PARAM]
<ide>
<del> return Q(**tmp)
<add> return kwargs
<ide>
<ide> def get_instance_data(self, model, content, **kwargs):
<ide> """
<del> Returns the dict with the data for model instance creation/update query.
<add> Returns the dict with the data for model instance creation/update.
<ide>
<ide> Arguments:
<ide> - model: model class (django.db.models.Model subclass) to work with
<ide> def get_instance_data(self, model, content, **kwargs):
<ide>
<ide> return all_kw_args
<ide>
<del> def get_object(self, *args, **kwargs):
<add> def get_instance(self, **kwargs):
<ide> """
<del> Get the instance object for read/update/delete requests.
<add> Get a model instance for read/update/delete requests.
<ide> """
<del> model = self.resource.model
<del> return model.objects.get(self.build_query(*args, **kwargs))
<add> return self.get_queryset().get(**kwargs)
<ide>
<ide> def get_queryset(self):
<ide> """
<ide> class ReadModelMixin(ModelMixin):
<ide> """
<ide> def get(self, request, *args, **kwargs):
<ide> model = self.resource.model
<add> query_kwargs = self.get_query_kwargs(request, *args, **kwargs)
<ide>
<ide> try:
<del> self.model_instance = self.get_object(*args, **kwargs)
<add> self.model_instance = self.get_instance(**query_kwargs)
<ide> except model.DoesNotExist:
<ide> raise ErrorResponse(status.HTTP_404_NOT_FOUND)
<ide>
<ide> return self.model_instance
<ide>
<del> def build_query(self, *args, **kwargs):
<del> # Build query is overriden to filter the kwargs priori
<del> # to use them as build_query argument
<del> filtered_keywords = kwargs.copy()
<del>
<del> return super(ReadModelMixin, self).build_query(*args, **filtered_keywords)
<del>
<ide>
<ide> class CreateModelMixin(ModelMixin):
<ide> """
<ide> class UpdateModelMixin(ModelMixin):
<ide> """
<ide> def put(self, request, *args, **kwargs):
<ide> model = self.resource.model
<add> query_kwargs = self.get_query_kwargs(request, *args, **kwargs)
<ide>
<ide> # TODO: update on the url of a non-existing resource url doesn't work
<ide> # correctly at the moment - will end up with a new url
<ide> try:
<del> self.model_instance = self.get_object(*args, **kwargs)
<add> self.model_instance = self.get_instance(**query_kwargs)
<ide>
<ide> for (key, val) in self.CONTENT.items():
<ide> setattr(self.model_instance, key, val)
<ide> class DeleteModelMixin(ModelMixin):
<ide> """
<ide> def delete(self, request, *args, **kwargs):
<ide> model = self.resource.model
<add> query_kwargs = self.get_query_kwargs(request, *args, **kwargs)
<ide>
<ide> try:
<del> instance = self.get_object(*args, **kwargs)
<add> instance = self.get_instance(**query_kwargs)
<ide> except model.DoesNotExist:
<ide> raise ErrorResponse(status.HTTP_404_NOT_FOUND, None, {})
<ide>
<ide> class ListModelMixin(ModelMixin):
<ide> def get(self, request, *args, **kwargs):
<ide> queryset = self.get_queryset()
<ide> ordering = self.get_ordering()
<add> query_kwargs = self.get_query_kwargs(request, *args, **kwargs)
<ide>
<del> queryset = queryset.filter(self.build_query(**kwargs))
<add> queryset = queryset.filter(**query_kwargs)
<ide> if ordering:
<ide> queryset = queryset.order_by(*ordering)
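The behaviour `get_query_kwargs` keeps from the removed `build_query` is small: pass the URLconf kwargs through unchanged, except for the renderer's format-suffix capture, and hand the result straight to `filter()`/`get()` instead of wrapping it in a `Q` object. A standalone sketch of that filtering step — `"format"` is assumed here as the value of `BaseRenderer._FORMAT_QUERY_PARAM`:

```python
FORMAT_QUERY_PARAM = "format"  # assumed stand-in for BaseRenderer._FORMAT_QUERY_PARAM

def get_query_kwargs(**kwargs):
    # Copy the URLconf kwargs and drop the .json/.xml suffix capture,
    # leaving only keys that can be passed to queryset.filter()/get().
    kwargs = dict(kwargs)
    kwargs.pop(FORMAT_QUERY_PARAM, None)
    return kwargs

print(get_query_kwargs(pk=4, format="json"))  # {'pk': 4}
```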
<ide> | 1 |
PHP | PHP | fix paginatorcomponent tests | b02a34366c884ccac96bc61d2e1b69e2179d7db1 | <ide><path>lib/Cake/Test/TestCase/Controller/Component/PaginatorComponentTest.php
<ide> <?php
<ide> /**
<del> * PaginatorComponentTest file
<del> *
<del> * Series of tests for paginator component.
<del> *
<del> * PHP 5
<del> *
<ide> * CakePHP(tm) Tests <http://book.cakephp.org/2.0/en/development/testing.html>
<ide> * Copyright 2005-2012, Cake Software Foundation, Inc. (http://cakefoundation.org)
<ide> *
<ide> *
<ide> * @copyright Copyright 2005-2012, Cake Software Foundation, Inc. (http://cakefoundation.org)
<ide> * @link http://book.cakephp.org/2.0/en/development/testing.html CakePHP(tm) Tests
<del> * @package Cake.Test.Case.Controller.Component
<ide> * @since CakePHP(tm) v 2.0
<ide> * @license MIT License (http://www.opensource.org/licenses/mit-license.php)
<ide> */
<ide> public function testMergeOptionsModelSpecific() {
<ide> * @return void
<ide> */
<ide> public function testMergeOptionsCustomFindKey() {
<del> $this->request->params['named'] = array(
<add> $this->request->query = [
<ide> 'page' => 10,
<ide> 'limit' => 10
<del> );
<del> $this->Paginator->settings = array(
<add> ];
<add> $this->Paginator->settings = [
<ide> 'page' => 1,
<ide> 'limit' => 20,
<ide> 'maxLimit' => 100,
<del> 'paramType' => 'named',
<ide> 'findType' => 'myCustomFind'
<del> );
<add> ];
<ide> $result = $this->Paginator->mergeOptions('Post');
<del> $expected = array('page' => 10, 'limit' => 10, 'maxLimit' => 100, 'paramType' => 'named', 'findType' => 'myCustomFind');
<add> $expected = array(
<add> 'page' => 10,
<add> 'limit' => 10,
<add> 'maxLimit' => 100,
<add> 'findType' => 'myCustomFind'
<add> );
<ide> $this->assertEquals($expected, $result);
<ide> }
<ide>
<ide> public function testValidateSortInvalidDirection() {
<ide> public function testOutOfRangePageNumberGetsClamped() {
<ide> $Controller = new PaginatorTestController($this->request);
<ide> $Controller->uses = array('PaginatorControllerPost');
<del> $Controller->query['page'] = 3000;
<add> $Controller->request->query['page'] = 3000;
<ide> $Controller->constructClasses();
<ide> $Controller->PaginatorControllerPost->recursive = 0;
<ide> $Controller->Paginator->paginate('PaginatorControllerPost');
<ide> public function testValidateSortMultiple() {
<ide> * @return void
<ide> */
<ide> public function testValidateSortNoSort() {
<del> $model = $this->getMock('Model');
<add> $model = $this->getMock('Cake\Model\Model');
<ide> $model->alias = 'model';
<ide> $model->expects($this->any())->method('hasField')->will($this->returnValue(true));
<ide> | 1 |
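The `testMergeOptionsCustomFindKey` expectation above — request query values layered over the paginator's configured defaults, with `limit` capped by `maxLimit` — reduces to a dict merge plus a clamp. A rough sketch in Python rather than PHP, with key names taken from the test and the clamping rule assumed from the expected result:

```python
def merge_options(defaults, query):
    # Request query values win over the configured defaults...
    merged = {**defaults, **query}
    # ...but limit never exceeds maxLimit.
    merged["limit"] = min(merged["limit"], merged.get("maxLimit", merged["limit"]))
    return merged

defaults = {"page": 1, "limit": 20, "maxLimit": 100, "findType": "myCustomFind"}
query = {"page": 10, "limit": 10}
print(merge_options(defaults, query))
# {'page': 10, 'limit': 10, 'maxLimit': 100, 'findType': 'myCustomFind'}
```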
Ruby | Ruby | rename the classes | 6d81eab13d65b7239444c9d0bfb14518755ddbc9 | <ide><path>activerecord/lib/active_record/connection_adapters.rb
<ide> module ConnectionAdapters
<ide> end
<ide>
<ide> autoload :Column
<del> autoload :Role
<del> autoload :RoleManager
<add> autoload :PoolConfig
<add> autoload :PoolManager
<ide> autoload :Resolver
<ide>
<ide> autoload_at "active_record/connection_adapters/abstract/schema_definitions" do
<ide><path>activerecord/lib/active_record/connection_adapters/abstract/connection_pool.rb
<ide> def run
<ide> include ConnectionAdapters::AbstractPool
<ide>
<ide> attr_accessor :automatic_reconnect, :checkout_timeout
<del> attr_reader :db_config, :size, :reaper, :role
<add> attr_reader :db_config, :size, :reaper, :pool_config
<ide>
<del> delegate :schema_cache, :schema_cache=, to: :role
<add> delegate :schema_cache, :schema_cache=, to: :pool_config
<ide>
<del> # Creates a new ConnectionPool object. +role+ is a Role
<add> # Creates a new ConnectionPool object. +pool_config+ is a PoolConfig
<ide> # object which describes database connection information (e.g. adapter,
<ide> # host name, username, password, etc), as well as the maximum size for
<ide> # this ConnectionPool.
<ide> #
<ide> # The default ConnectionPool maximum size is 5.
<del> def initialize(role)
<add> def initialize(pool_config)
<ide> super()
<ide>
<del> @role = role
<del> @db_config = role.db_config
<add> @pool_config = pool_config
<add> @db_config = pool_config.db_config
<ide>
<ide> @checkout_timeout = db_config.checkout_timeout
<ide> @idle_timeout = db_config.idle_timeout
<ide> class ConnectionHandler
<ide> private_constant :FINALIZER
<ide>
<ide> def initialize
<del> # These caches are keyed by role.connection_specification_name (Role#connection_specification_name).
<del> @owner_to_role_manager = Concurrent::Map.new(initial_capacity: 2)
<add> # These caches are keyed by pool_config.connection_specification_name (PoolConfig#connection_specification_name).
<add> @owner_to_pool_manager = Concurrent::Map.new(initial_capacity: 2)
<ide>
<ide> # Backup finalizer: if the forked child skipped Kernel#fork the early discard has not occurred
<ide> ObjectSpace.define_finalizer self, FINALIZER
<ide> def while_preventing_writes(enabled = true)
<ide> end
<ide>
<ide> def connection_pool_names # :nodoc:
<del> owner_to_role_manager.keys
<add> owner_to_pool_manager.keys
<ide> end
<ide>
<ide> def connection_pool_list
<del> owner_to_role_manager.values.compact.flat_map { |rc| rc.roles.map(&:pool) }
<add> owner_to_pool_manager.values.compact.flat_map { |m| m.pool_configs.map(&:pool) }
<ide> end
<ide> alias :connection_pools :connection_pool_list
<ide>
<del> def establish_connection(config, role_name = :default)
<add> def establish_connection(config, pool_key = :default)
<ide> resolver = Resolver.new(Base.configurations)
<del> role = resolver.resolve_role(config)
<del> db_config = role.db_config
<add> pool_config = resolver.resolve_pool_config(config)
<add> db_config = pool_config.db_config
<ide>
<del> remove_connection(role.connection_specification_name, role_name)
<add> remove_connection(pool_config.connection_specification_name, pool_key)
<ide>
<ide> message_bus = ActiveSupport::Notifications.instrumenter
<ide> payload = {
<ide> connection_id: object_id
<ide> }
<del> if role
<del> payload[:spec_name] = role.connection_specification_name
<add> if pool_config
<add> payload[:spec_name] = pool_config.connection_specification_name
<ide> payload[:config] = db_config.configuration_hash
<ide> end
<ide>
<del> owner_to_role_manager[role.connection_specification_name] ||= RoleManager.new
<del> role_manager = owner_to_role_manager[role.connection_specification_name]
<del> role_manager.set_role(role_name, role)
<add> owner_to_pool_manager[pool_config.connection_specification_name] ||= PoolManager.new
<add> pool_manager = owner_to_pool_manager[pool_config.connection_specification_name]
<add> pool_manager.set_pool_config(pool_key, pool_config)
<ide>
<ide> message_bus.instrument("!connection.active_record", payload) do
<del> role.pool
<add> pool_config.pool
<ide> end
<ide> end
<ide>
<ide> def retrieve_connection(spec_name) #:nodoc:
<ide>
<ide> # Returns true if a connection that's accessible to this class has
<ide> # already been opened.
<del> def connected?(spec_name, role_name = :default)
<del> pool = retrieve_connection_pool(spec_name, role_name)
<add> def connected?(spec_name, pool_key = :default)
<add> pool = retrieve_connection_pool(spec_name, pool_key)
<ide> pool && pool.connected?
<ide> end
<ide>
<ide> # Remove the connection for this class. This will close the active
<ide> # connection and the defined connection (if they exist). The result
<ide> # can be used as an argument for #establish_connection, for easily
<ide> # re-establishing the connection.
<del> def remove_connection(spec_name, role_name = :default)
<del> if role_manager = owner_to_role_manager[spec_name]
<del> role = role_manager.remove_role(role_name)
<add> def remove_connection(spec_name, pool_key = :default)
<add> if pool_manager = owner_to_pool_manager[spec_name]
<add> pool_config = pool_manager.remove_pool_config(pool_key)
<ide>
<del> if role
<del> role.disconnect!
<del> role.db_config.configuration_hash
<add> if pool_config
<add> pool_config.disconnect!
<add> pool_config.db_config.configuration_hash
<ide> end
<ide> end
<ide> end
<ide>
<del> # Retrieving the connection pool happens a lot, so we cache it in @owner_to_role_manager.
<add> # Retrieving the connection pool happens a lot, so we cache it in @owner_to_pool_manager.
<ide> # This makes retrieving the connection pool O(1) once the process is warm.
<ide> # When a connection is established or removed, we invalidate the cache.
<del> def retrieve_connection_pool(spec_name, role_name = :default)
<del> role = owner_to_role_manager[spec_name]&.get_role(role_name)
<del> role&.pool
<add> def retrieve_connection_pool(spec_name, pool_key = :default)
<add> pool_config = owner_to_pool_manager[spec_name]&.get_pool_config(pool_key)
<add> pool_config&.pool
<ide> end
<ide>
<ide> private
<del> attr_reader :owner_to_role_manager
<add> attr_reader :owner_to_pool_manager
<ide> end
<ide> end
<ide> end
<add><path>activerecord/lib/active_record/connection_adapters/pool_config.rb
<del><path>activerecord/lib/active_record/connection_adapters/role.rb
<ide>
<ide> module ActiveRecord
<ide> module ConnectionAdapters
<del> class Role # :nodoc:
<add> class PoolConfig # :nodoc:
<ide> include Mutex_m
<ide>
<ide> attr_reader :db_config, :connection_specification_name
<ide> def discard_pool!
<ide> end
<ide> end
<ide>
<del>ActiveSupport::ForkTracker.after_fork { ActiveRecord::ConnectionAdapters::Role.discard_pools! }
<add>ActiveSupport::ForkTracker.after_fork { ActiveRecord::ConnectionAdapters::PoolConfig.discard_pools! }
<ide><path>activerecord/lib/active_record/connection_adapters/pool_manager.rb
<add># frozen_string_literal: true
<add>
<add>module ActiveRecord
<add> module ConnectionAdapters
<add> class PoolManager # :nodoc:
<add> def initialize
<add> @name_to_pool_config = {}
<add> end
<add>
<add> def pool_configs
<add> @name_to_pool_config.values
<add> end
<add>
<add> def remove_pool_config(key)
<add> @name_to_pool_config.delete(key)
<add> end
<add>
<add> def get_pool_config(key)
<add> @name_to_pool_config[key]
<add> end
<add>
<add> def set_pool_config(key, pool_config)
<add> @name_to_pool_config[key] = pool_config
<add> end
<add> end
<add> end
<add>end
<ide><path>activerecord/lib/active_record/connection_adapters/resolver.rb
<ide>
<ide> module ActiveRecord
<ide> module ConnectionAdapters
<del> # Builds a Role from user input.
<add> # Builds a PoolConfig from user input.
<ide> class Resolver # :nodoc:
<ide> attr_reader :configurations
<ide>
<ide> def initialize(configurations)
<ide> @configurations = configurations
<ide> end
<ide>
<del> # Returns an instance of Role for a given adapter.
<add> # Returns an instance of PoolConfig for a given adapter.
<ide> # Accepts a hash one layer deep that contains all connection information.
<ide> #
<ide> # == Example
<ide> #
<ide> # config = { "production" => { "host" => "localhost", "database" => "foo", "adapter" => "sqlite3" } }
<del> # role = Resolver.new(config).resolve_role(:production)
<del> # role.db_config.configuration_hash
<add> # pool_config = Resolver.new(config).resolve_pool_config(:production)
<add> # pool_config.db_config.configuration_hash
<ide> # # => { host: "localhost", database: "foo", adapter: "sqlite3" }
<ide> #
<del> def resolve_role(config)
<add> def resolve_pool_config(config)
<ide> pool_name = config if config.is_a?(Symbol)
<ide>
<ide> db_config = resolve(config, pool_name)
<ide> def resolve_role(config)
<ide> raise AdapterNotFound, "database configuration specifies nonexistent #{db_config.adapter} adapter"
<ide> end
<ide>
<del> Role.new(db_config.configuration_hash.delete(:name) || "primary", db_config)
<add> PoolConfig.new(db_config.configuration_hash.delete(:name) || "primary", db_config)
<ide> end
<ide>
<ide> # Returns fully resolved connection, accepts hash, string or symbol.
<ide><path>activerecord/lib/active_record/connection_adapters/role_manager.rb
<del># frozen_string_literal: true
<del>
<del>module ActiveRecord
<del> module ConnectionAdapters
<del> class RoleManager # :nodoc:
<del> def initialize
<del> @name_to_role = {}
<del> end
<del>
<del> def roles
<del> @name_to_role.values
<del> end
<del>
<del> def remove_role(name)
<del> @name_to_role.delete(name)
<del> end
<del>
<del> def get_role(name)
<del> @name_to_role[name]
<del> end
<del>
<del> def set_role(name, role)
<del> @name_to_role[name] = role
<del> end
<del> end
<del> end
<del>end
<ide><path>activerecord/lib/active_record/test_fixtures.rb
<ide> def setup_shared_connection_pool
<ide> ActiveRecord::Base.connection_handlers.values.each do |handler|
<ide> if handler != writing_handler
<ide> handler.connection_pool_names.each do |name|
<del> writing_role_manager = writing_handler.send(:owner_to_role_manager)[name]
<del> writing_role = writing_role_manager.get_role(:default)
<add> writing_pool_manager = writing_handler.send(:owner_to_pool_manager)[name]
<add> writing_pool_config = writing_pool_manager.get_pool_config(:default)
<ide>
<del> role_manager = handler.send(:owner_to_role_manager)[name]
<del> role_manager.set_role(:default, writing_role)
<add> pool_manager = handler.send(:owner_to_pool_manager)[name]
<add> pool_manager.set_pool_config(:default, writing_pool_config)
<ide> end
<ide> end
<ide> end
<ide><path>activerecord/test/cases/connection_adapters/adapter_leasing_test.rb
<ide> def test_expire_mutates_in_use
<ide>
<ide> def test_close
<ide> db_config = ActiveRecord::DatabaseConfigurations::HashConfig.new("test", "primary", {})
<del> role = ActiveRecord::ConnectionAdapters::Role.new("primary", db_config)
<del> pool = Pool.new(role)
<add> pool_config = ActiveRecord::ConnectionAdapters::PoolConfig.new("primary", db_config)
<add> pool = Pool.new(pool_config)
<ide> pool.insert_connection_for_test! @adapter
<ide> @adapter.pool = pool
<ide>
<add><path>activerecord/test/cases/connection_adapters/connection_handlers_multi_pool_config_test.rb
<del><path>activerecord/test/cases/connection_adapters/connection_handlers_multi_role_test.rb
<ide>
<ide> module ActiveRecord
<ide> module ConnectionAdapters
<del> class ConnectionHandlersMultiRoleTest < ActiveRecord::TestCase
<add> class ConnectionHandlersMultiPoolConfigTest < ActiveRecord::TestCase
<ide> self.use_transactional_tests = false
<ide>
<ide> fixtures :people
<ide> def teardown
<ide> end
<ide>
<ide> unless in_memory_db?
<del> def test_establish_connection_with_roles
<add> def test_establish_connection_with_pool_configs
<ide> previous_env, ENV["RAILS_ENV"] = ENV["RAILS_ENV"], "default_env"
<ide>
<ide> config = {
<ide> def test_establish_connection_with_roles
<ide> @prev_configs, ActiveRecord::Base.configurations = ActiveRecord::Base.configurations, config
<ide>
<ide> @writing_handler.establish_connection(:primary)
<del> @writing_handler.establish_connection(:primary, :role_two)
<add> @writing_handler.establish_connection(:primary, :pool_config_two)
<ide>
<ide> default_pool = @writing_handler.retrieve_connection_pool("primary", :default)
<del> other_pool = @writing_handler.retrieve_connection_pool("primary", :role_two)
<add> other_pool = @writing_handler.retrieve_connection_pool("primary", :pool_config_two)
<ide>
<ide> assert_not_nil default_pool
<ide> assert_not_equal default_pool, other_pool
<ide> def test_remove_connection
<ide> @prev_configs, ActiveRecord::Base.configurations = ActiveRecord::Base.configurations, config
<ide>
<ide> @writing_handler.establish_connection(:primary)
<del> @writing_handler.establish_connection(:primary, :role_two)
<add> @writing_handler.establish_connection(:primary, :pool_config_two)
<ide>
<ide> # remove default
<ide> @writing_handler.remove_connection("primary")
<ide>
<ide> assert_nil @writing_handler.retrieve_connection_pool("primary")
<del> assert_not_nil @writing_handler.retrieve_connection_pool("primary", :role_two)
<add> assert_not_nil @writing_handler.retrieve_connection_pool("primary", :pool_config_two)
<ide> ensure
<ide> ActiveRecord::Base.configurations = @prev_configs
<ide> ActiveRecord::Base.establish_connection(:arunit)
<ide> def test_connected?
<ide> @prev_configs, ActiveRecord::Base.configurations = ActiveRecord::Base.configurations, config
<ide>
<ide> @writing_handler.establish_connection(:primary)
<del> @writing_handler.establish_connection(:primary, :role_two)
<add> @writing_handler.establish_connection(:primary, :pool_config_two)
<ide>
<ide> # connect to default
<ide> @writing_handler.connection_pool_list.first.checkout
<ide>
<ide> assert @writing_handler.connected?("primary")
<ide> assert @writing_handler.connected?("primary", :default)
<del> assert_not @writing_handler.connected?("primary", :role_two)
<add> assert_not @writing_handler.connected?("primary", :pool_config_two)
<ide> ensure
<ide> ActiveRecord::Base.configurations = @prev_configs
<ide> ActiveRecord::Base.establish_connection(:arunit)
<ide> ENV["RAILS_ENV"] = previous_env
<add> FileUtils.rm_rf "db"
<ide> end
<ide> end
<ide> end
<ide><path>activerecord/test/cases/connection_pool_test.rb
<ide> def setup
<ide>
<ide> # Keep a duplicate pool so we do not bother others
<ide> @db_config = ActiveRecord::Base.connection_pool.db_config
<del> @role = ActiveRecord::ConnectionAdapters::Role.new("primary", @db_config)
<del> @pool = ConnectionPool.new(@role)
<add> @pool_config = ActiveRecord::ConnectionAdapters::PoolConfig.new("primary", @db_config)
<add> @pool = ConnectionPool.new(@pool_config)
<ide>
<ide> if in_memory_db?
<ide> # Separate connections to an in-memory database create an entirely new database,
<ide> def test_reap_inactive
<ide> def test_idle_timeout_configuration
<ide> @pool.disconnect!
<ide> @db_config.configuration_hash.merge!(idle_timeout: "0.02")
<del> role = ActiveRecord::ConnectionAdapters::Role.new("primary", @db_config)
<del> @pool = ConnectionPool.new(role)
<add> pool_config = ActiveRecord::ConnectionAdapters::PoolConfig.new("primary", @db_config)
<add> @pool = ConnectionPool.new(pool_config)
<ide> idle_conn = @pool.checkout
<ide> @pool.checkin(idle_conn)
<ide>
<ide> def test_idle_timeout_configuration
<ide> def test_disable_flush
<ide> @pool.disconnect!
<ide> @db_config.configuration_hash.merge!(idle_timeout: -5)
<del> role = ActiveRecord::ConnectionAdapters::Role.new("primary", @db_config)
<del> @pool = ConnectionPool.new(role)
<add> pool_config = ActiveRecord::ConnectionAdapters::PoolConfig.new("primary", @db_config)
<add> @pool = ConnectionPool.new(pool_config)
<ide> idle_conn = @pool.checkout
<ide> @pool.checkin(idle_conn)
<ide>
<ide> def test_active_connection?
<ide> end
<ide>
<ide> def test_checkout_behaviour
<del> pool = ConnectionPool.new(@role)
<add> pool = ConnectionPool.new(@pool_config)
<ide> main_connection = pool.connection
<ide> assert_not_nil main_connection
<ide> threads = []
<ide> def test_checkout_fairness_by_group
<ide> end
<ide>
<ide> def test_automatic_reconnect_restores_after_disconnect
<del> pool = ConnectionPool.new(@role)
<add> pool = ConnectionPool.new(@pool_config)
<ide> assert pool.automatic_reconnect
<ide> assert pool.connection
<ide>
<ide> def test_automatic_reconnect_restores_after_disconnect
<ide> end
<ide>
<ide> def test_automatic_reconnect_can_be_disabled
<del> pool = ConnectionPool.new(@role)
<add> pool = ConnectionPool.new(@pool_config)
<ide> pool.disconnect!
<ide> pool.automatic_reconnect = false
<ide>
<ide> def with_single_connection_pool
<ide> old_config = @db_config.configuration_hash
<ide> db_config = ActiveRecord::DatabaseConfigurations::HashConfig.new("arunit", "primary", old_config.dup)
<ide>
<del> db_config.configuration_hash[:pool] = 1 # this is safe to do, because .dupped Role also auto-dups its config
<add> db_config.configuration_hash[:pool] = 1 # this is safe to do, because .dupped PoolConfig also auto-dups its config
<ide>
<del> role = ActiveRecord::ConnectionAdapters::Role.new("primary", db_config)
<del> yield(pool = ConnectionPool.new(role))
<add> pool_config = ActiveRecord::ConnectionAdapters::PoolConfig.new("primary", db_config)
<add> yield(pool = ConnectionPool.new(pool_config))
<ide> ensure
<ide> pool.disconnect! if pool
<ide> end
<ide><path>activerecord/test/cases/disconnected_test.rb
<ide> def setup
<ide>
<ide> teardown do
<ide> return if in_memory_db?
<del> role = ActiveRecord::Base.connection_config
<del> ActiveRecord::Base.establish_connection(role)
<add> config = ActiveRecord::Base.connection_config
<add> ActiveRecord::Base.establish_connection(config)
<ide> end
<ide>
<ide> unless in_memory_db?
<ide><path>activerecord/test/cases/fixtures_test.rb
<ide> class MultipleDatabaseFixturesTest < ActiveRecord::TestCase
<ide>
<ide> private
<ide> def with_temporary_connection_pool
<del> role = ActiveRecord::Base.connection_handler.send(:owner_to_role_manager).fetch("primary").get_role(:default)
<del> new_pool = ActiveRecord::ConnectionAdapters::ConnectionPool.new(role)
<add> pool_config = ActiveRecord::Base.connection_handler.send(:owner_to_pool_manager).fetch("primary").get_pool_config(:default)
<add> new_pool = ActiveRecord::ConnectionAdapters::ConnectionPool.new(pool_config)
<ide>
<del> role.stub(:pool, new_pool) do
<add> pool_config.stub(:pool, new_pool) do
<ide> yield
<ide> end
<ide> end
<add><path>activerecord/test/cases/pool_config/resolver_test.rb
<del><path>activerecord/test/cases/connection_specification/resolver_test.rb
<ide>
<ide> module ActiveRecord
<ide> module ConnectionAdapters
<del> class Role
<add> class PoolConfig
<ide> class ResolverTest < ActiveRecord::TestCase
<del> def resolve(role, config = {})
<add> def resolve(pool_config, config = {})
<ide> configs = ActiveRecord::DatabaseConfigurations.new(config)
<ide> resolver = ConnectionAdapters::Resolver.new(configs)
<del> resolver.resolve(role, role).configuration_hash
<add> resolver.resolve(pool_config, pool_config).configuration_hash
<ide> end
<ide>
<del> def resolve_role(role, config = {})
<add> def resolve_pool_config(pool_config, config = {})
<ide> configs = ActiveRecord::DatabaseConfigurations.new(config)
<ide> resolver = ConnectionAdapters::Resolver.new(configs)
<del> resolver.resolve_role(role)
<add> resolver.resolve_pool_config(pool_config)
<ide> end
<ide>
<ide> def test_url_invalid_adapter
<ide> error = assert_raises(LoadError) do
<del> resolve_role "ridiculous://foo?encoding=utf8"
<add> resolve_pool_config "ridiculous://foo?encoding=utf8"
<ide> end
<ide>
<ide> assert_match "Could not load the 'ridiculous' Active Record adapter. Ensure that the adapter is spelled correctly in config/database.yml and that you've added the necessary adapter gem to your Gemfile.", error.message
<ide> end
<ide>
<ide> def test_error_if_no_adapter_method
<ide> error = assert_raises(AdapterNotFound) do
<del> resolve_role "abstract://foo?encoding=utf8"
<add> resolve_pool_config "abstract://foo?encoding=utf8"
<ide> end
<ide>
<ide> assert_match "database configuration specifies nonexistent abstract adapter", error.message
<ide> def test_error_if_no_adapter_method
<ide> # checks that the adapter file can be required in.
<ide>
<ide> def test_url_from_environment
<del> role = resolve :production, "production" => "abstract://foo?encoding=utf8"
<add> pool_config = resolve :production, "production" => "abstract://foo?encoding=utf8"
<ide> assert_equal({
<ide> adapter: "abstract",
<ide> host: "foo",
<ide> encoding: "utf8",
<ide> name: "production"
<del> }, role)
<add> }, pool_config)
<ide> end
<ide>
<ide> def test_url_sub_key
<del> role = resolve :production, "production" => { "url" => "abstract://foo?encoding=utf8" }
<add> pool_config = resolve :production, "production" => { "url" => "abstract://foo?encoding=utf8" }
<ide> assert_equal({
<ide> adapter: "abstract",
<ide> host: "foo",
<ide> encoding: "utf8",
<ide> name: "production"
<del> }, role)
<add> }, pool_config)
<ide> end
<ide>
<ide> def test_url_sub_key_merges_correctly
<ide> hash = { "url" => "abstract://foo?encoding=utf8&", "adapter" => "sqlite3", "host" => "bar", "pool" => "3" }
<del> role = resolve :production, "production" => hash
<add> pool_config = resolve :production, "production" => hash
<ide> assert_equal({
<ide> adapter: "abstract",
<ide> host: "foo",
<ide> encoding: "utf8",
<ide> pool: "3",
<ide> name: "production"
<del> }, role)
<add> }, pool_config)
<ide> end
<ide>
<ide> def test_url_host_no_db
<del> role = resolve "abstract://foo?encoding=utf8"
<add> pool_config = resolve "abstract://foo?encoding=utf8"
<ide> assert_equal({
<ide> adapter: "abstract",
<ide> host: "foo",
<ide> encoding: "utf8"
<del> }, role)
<add> }, pool_config)
<ide> end
<ide>
<ide> def test_url_missing_scheme
<del> role = resolve "foo"
<del> assert_equal({ database: "foo" }, role)
<add> pool_config = resolve "foo"
<add> assert_equal({ database: "foo" }, pool_config)
<ide> end
<ide>
<ide> def test_url_host_db
<del> role = resolve "abstract://foo/bar?encoding=utf8"
<add> pool_config = resolve "abstract://foo/bar?encoding=utf8"
<ide> assert_equal({
<ide> adapter: "abstract",
<ide> database: "bar",
<ide> host: "foo",
<ide> encoding: "utf8"
<del> }, role)
<add> }, pool_config)
<ide> end
<ide>
<ide> def test_url_port
<del> role = resolve "abstract://foo:123?encoding=utf8"
<add> pool_config = resolve "abstract://foo:123?encoding=utf8"
<ide> assert_equal({
<ide> adapter: "abstract",
<ide> port: 123,
<ide> host: "foo",
<ide> encoding: "utf8"
<del> }, role)
<add> }, pool_config)
<ide> end
<ide>
<ide> def test_encoded_password
<ide> password = "am@z1ng_p@ssw0rd#!"
<ide> encoded_password = URI.encode_www_form_component(password)
<del> role = resolve "abstract://foo:#{encoded_password}@localhost/bar"
<del> assert_equal password, role[:password]
<add> pool_config = resolve "abstract://foo:#{encoded_password}@localhost/bar"
<add> assert_equal password, pool_config[:password]
<ide> end
<ide>
<ide> def test_url_with_authority_for_sqlite3
<del> role = resolve "sqlite3:///foo_test"
<del> assert_equal("/foo_test", role[:database])
<add> pool_config = resolve "sqlite3:///foo_test"
<add> assert_equal("/foo_test", pool_config[:database])
<ide> end
<ide>
<ide> def test_url_absolute_path_for_sqlite3
<del> role = resolve "sqlite3:/foo_test"
<del> assert_equal("/foo_test", role[:database])
<add> pool_config = resolve "sqlite3:/foo_test"
<add> assert_equal("/foo_test", pool_config[:database])
<ide> end
<ide>
<ide> def test_url_relative_path_for_sqlite3
<del> role = resolve "sqlite3:foo_test"
<del> assert_equal("foo_test", role[:database])
<add> pool_config = resolve "sqlite3:foo_test"
<add> assert_equal("foo_test", pool_config[:database])
<ide> end
<ide>
<ide> def test_url_memory_db_for_sqlite3
<del> role = resolve "sqlite3::memory:"
<del> assert_equal(":memory:", role[:database])
<add> pool_config = resolve "sqlite3::memory:"
<add> assert_equal(":memory:", pool_config[:database])
<ide> end
<ide>
<ide> def test_url_sub_key_for_sqlite3
<del> role = resolve :production, "production" => { "url" => "sqlite3:foo?encoding=utf8" }
<add> pool_config = resolve :production, "production" => { "url" => "sqlite3:foo?encoding=utf8" }
<ide> assert_equal({
<ide> adapter: "sqlite3",
<ide> database: "foo",
<ide> encoding: "utf8",
<ide> name: "production"
<del> }, role)
<add> }, pool_config)
<ide> end
<ide>
<del> def test_role_connection_specification_name_on_key_lookup
<del> role = resolve_role(:readonly, "readonly" => { "adapter" => "sqlite3" })
<del> assert_equal "readonly", role.connection_specification_name
<add> def test_pool_config_connection_specification_name_on_key_lookup
<add> pool_config = resolve_pool_config(:readonly, "readonly" => { "adapter" => "sqlite3" })
<add> assert_equal "readonly", pool_config.connection_specification_name
<ide> end
<ide>
<del> def test_role_connection_specification_name_with_inline_config
<del> role = resolve_role("adapter" => "sqlite3")
<del> assert_equal "primary", role.connection_specification_name, "should default to primary id"
<add> def test_pool_config_connection_specification_name_with_inline_config
<add> pool_config = resolve_pool_config("adapter" => "sqlite3")
<add> assert_equal "primary", pool_config.connection_specification_name, "should default to primary id"
<ide> end
<ide>
<del> def test_role_with_invalid_type
<add> def test_pool_config_with_invalid_type
<ide> assert_raises TypeError do
<del> resolve_role(Object.new)
<add> resolve_pool_config(Object.new)
<ide> end
<ide> end
<ide> end
<ide><path>activerecord/test/cases/query_cache_test.rb
<ide> def test_clear_query_cache_is_called_on_all_connections
<ide>
<ide> private
<ide> def with_temporary_connection_pool
<del> role = ActiveRecord::Base.connection_handler.send(:owner_to_role_manager).fetch("primary").get_role(:default)
<del> new_pool = ActiveRecord::ConnectionAdapters::ConnectionPool.new(role)
<add> pool_config = ActiveRecord::Base.connection_handler.send(:owner_to_pool_manager).fetch("primary").get_pool_config(:default)
<add> new_pool = ActiveRecord::ConnectionAdapters::ConnectionPool.new(pool_config)
<ide>
<del> role.stub(:pool, new_pool) do
<add> pool_config.stub(:pool, new_pool) do
<ide> yield
<ide> end
<ide> end
<ide><path>activerecord/test/cases/reaper_test.rb
<ide> def test_some_time
<ide>
<ide> def test_pool_has_reaper
<ide> config = ActiveRecord::Base.configurations.configs_for(env_name: "arunit", spec_name: "primary")
<del> role = Role.new("primary", config)
<del> pool = ConnectionPool.new(role)
<add> pool_config = PoolConfig.new("primary", config)
<add> pool = ConnectionPool.new(pool_config)
<ide>
<ide> assert pool.reaper
<ide> ensure
<ide> pool.discard!
<ide> end
<ide>
<ide> def test_reaping_frequency_configuration
<del> role = duplicated_role
<del> role.db_config.configuration_hash[:reaping_frequency] = "10.01"
<add> pool_config = duplicated_pool_config
<add> pool_config.db_config.configuration_hash[:reaping_frequency] = "10.01"
<ide>
<del> pool = ConnectionPool.new(role)
<add> pool = ConnectionPool.new(pool_config)
<ide>
<ide> assert_equal 10.01, pool.reaper.frequency
<ide> ensure
<ide> pool.discard!
<ide> end
<ide>
<ide> def test_connection_pool_starts_reaper
<del> role = duplicated_role
<del> role.db_config.configuration_hash[:reaping_frequency] = "0.0001"
<add> pool_config = duplicated_pool_config
<add> pool_config.db_config.configuration_hash[:reaping_frequency] = "0.0001"
<ide>
<del> pool = ConnectionPool.new(role)
<add> pool = ConnectionPool.new(pool_config)
<ide>
<ide> conn, child = new_conn_in_thread(pool)
<ide>
<ide> def test_connection_pool_starts_reaper
<ide> end
<ide>
<ide> def test_reaper_works_after_pool_discard
<del> role = duplicated_role
<del> role.db_config.configuration_hash[:reaping_frequency] = "0.0001"
<add> pool_config = duplicated_pool_config
<add> pool_config.db_config.configuration_hash[:reaping_frequency] = "0.0001"
<ide>
<ide> 2.times do
<del> pool = ConnectionPool.new(role)
<add> pool = ConnectionPool.new(pool_config)
<ide>
<ide> conn, child = new_conn_in_thread(pool)
<ide>
<ide> def test_reaper_works_after_pool_discard
<ide> # This doesn't test the reaper directly, but we want to test the action
<ide> # it would take on a discarded pool
<ide> def test_reap_flush_on_discarded_pool
<del> role = duplicated_role
<del> pool = ConnectionPool.new(role)
<add> pool_config = duplicated_pool_config
<add> pool = ConnectionPool.new(pool_config)
<ide>
<ide> pool.discard!
<ide> pool.reap
<ide> pool.flush
<ide> end
<ide>
<ide> def test_connection_pool_starts_reaper_in_fork
<del> role = duplicated_role
<del> role.db_config.configuration_hash[:reaping_frequency] = "0.0001"
<add> pool_config = duplicated_pool_config
<add> pool_config.db_config.configuration_hash[:reaping_frequency] = "0.0001"
<ide>
<del> pool = ConnectionPool.new(role)
<add> pool = ConnectionPool.new(pool_config)
<ide> pool.checkout
<ide>
<ide> pid = fork do
<del> pool = ConnectionPool.new(role)
<add> pool = ConnectionPool.new(pool_config)
<ide>
<ide> conn, child = new_conn_in_thread(pool)
<ide> child.terminate
<ide> def test_reaper_does_not_reap_discarded_connection_pools
<ide> end
<ide>
<ide> private
<del> def duplicated_role
<add> def duplicated_pool_config
<ide> old_config = ActiveRecord::Base.connection_pool.db_config.configuration_hash
<ide> db_config = ActiveRecord::DatabaseConfigurations::HashConfig.new("arunit", "primary", old_config.dup)
<del> Role.new("primary", db_config)
<add> PoolConfig.new("primary", db_config)
<ide> end
<ide>
<ide> def new_conn_in_thread(pool)
<ide><path>activerecord/test/cases/unconnected_test.rb
<ide> def setup
<ide> @specification = ActiveRecord::Base.remove_connection
<ide>
<ide> # Clear out connection info from other pids (like a fork parent) too
<del> ActiveRecord::ConnectionAdapters::Role.discard_pools!
<add> ActiveRecord::ConnectionAdapters::PoolConfig.discard_pools!
<ide> end
<ide>
<ide> teardown do | 16 |
Text | Text | add list of translatable footer links | 0a027ca2a5efa1e9e4921bc4da1b34eced415e05 | <ide><path>docs/language-lead-handbook.md
<ide> You can convert from one format to the other carefully changing it manually. Or
<ide> > [!TIP]
<ide> > A new workflow is being worked on, there will be only one place to change in the future.
<ide>
<add>## How to translate articles in the footer links
<add>
<add>There are some links listed at the bottom of the footer (About, Alumni Network, Open Source etc.) and some of them can be translated into your language in the same way as other articles.
<add>
<add>Articles that can be translated:
<add>
<add>- About
<add>- Support
<add>- Academic Honesty
<add>- Code of Conduct
<add>
<add>The following articles should **not** be translated:
<add>
<add>- Shop
<add>- Sponsors
<add>- Privacy Policy
<add>- Terms of Service
<add>- Copyright Policy
<add>
<add>The following links are pointing to external sites and cannot be translated:
<add>
<add>- Alumni Network
<add>- Open Source
<add>
<add>### Change the footer links in the news
<add>
<add>Once you have translated and published the articles listed as "can be translated" above, you can update the links in the footer for `/news` by editing the file at `news/config/i18n/locales/<your language>/links.json` in the [freeCodeCamp/news](https://github.com/freeCodeCamp/news) repository.
<add>
<add>> [!NOTE]
<add>> Pull requests to this repository are currently limited to staff only. If you want to update this file, ask someone on the staff team for help.
<add>
<add>Update the following part in the file:
<add>
<add>```json
<add>{
<add> ...
<add> "footer": {
<add> "about": "https://www.freecodecamp.org/news/about/",
<add> "support": "https://www.freecodecamp.org/news/support/",
<add> "honesty": "https://www.freecodecamp.org/news/academic-honesty-policy/",
<add> "coc": "https://www.freecodecamp.org/news/code-of-conduct/"
<add> }
<add>}
<add>```
<add>
<add>### Change the footer links in the curriculum
<add>
<add>When you have translated and published the articles listed as "can be translated" above, as well as when the curriculum in your language is ready for launch, you can update the links in the footer for `/learn` by editing the file at `client/i18n/locales/<your language>/links.json` in the [freeCodeCamp/freeCodeCamp](https://github.com/freeCodeCamp/freeCodeCamp) repository.
<add>
<add>> [!WARNING]
<add>> Only "About", "Support", "Academic Honesty", and "Code of Conduct" can be translated. Leave other URLs unchanged.
<add>
<add>Update the following part in the file:
<add>
<add>```json
<add>{
<add> ...
<add> "footer": {
<add> "about-url": "https://www.freecodecamp.org/news/about/",
<add> "shop-url": "https://www.freecodecamp.org/shop/",
<add> "support-url": "https://www.freecodecamp.org/news/support/",
<add> "sponsors-url": "https://www.freecodecamp.org/news/sponsors/",
<add> "honesty-url": "https://www.freecodecamp.org/news/academic-honesty-policy/",
<add> "coc-url": "https://www.freecodecamp.org/news/code-of-conduct/",
<add> "privacy-url": "https://www.freecodecamp.org/news/privacy-policy/",
<add> "tos-url": "https://www.freecodecamp.org/news/terms-of-service/",
<add> "copyright-url": "https://www.freecodecamp.org/news/copyright-policy/"
<add> },
<add> ...
<add>}
<add>```
<add>
<ide> ## How to translate the info boxes headers in the documentation
<ide>
<ide> You can find these boxes all around the documentation: | 1 |
Go | Go | fix the overlay cleanup in the multi-subnet case | 0b40559c694fe8077bc3e1c12971344579ccbf97 | <ide><path>libnetwork/drivers/overlay/joinleave.go
<ide> func (d *driver) Join(nid, eid string, sboxKey string, jinfo driverapi.JoinInfo,
<ide> return fmt.Errorf("subnet sandbox join failed for %q: %v", s.subnetIP.String(), err)
<ide> }
<ide>
<add> // joinSubnetSandbox gets called when an endpoint comes up on a new subnet in the
<add> // overlay network. Hence the Endpoint count should be updated outside joinSubnetSandbox
<add> n.incEndpointCount()
<add>
<ide> sbox := n.sandbox()
<ide>
<ide> name1, name2, err := createVethPair()
<ide><path>libnetwork/drivers/overlay/ov_network.go
<ide> func (d *driver) DeleteNetwork(nid string) error {
<ide> return n.releaseVxlanID()
<ide> }
<ide>
<del>func (n *network) joinSandbox() error {
<add>func (n *network) incEndpointCount() {
<ide> n.Lock()
<del> if n.joinCnt != 0 {
<del> n.joinCnt++
<del> n.Unlock()
<del> return nil
<del> }
<del> n.Unlock()
<add> defer n.Unlock()
<add> n.joinCnt++
<add>}
<ide>
<add>func (n *network) joinSandbox() error {
<ide> // If there is a race between two go routines here only one will win
<ide> // the other will wait.
<ide> n.once.Do(func() {
<ide> func (n *network) joinSandbox() error {
<ide> }
<ide>
<ide> func (n *network) joinSubnetSandbox(s *subnet) error {
<del>
<ide> s.once.Do(func() {
<ide> s.initErr = n.initSubnetSandbox(s)
<ide> })
<del> // Increment joinCnt in all the goroutines only when the one time initSandbox
<del> // was a success.
<del> n.Lock()
<del> if s.initErr == nil {
<del> n.joinCnt++
<del> }
<del> err := s.initErr
<del> n.Unlock()
<del>
<del> return err
<add> return s.initErr
<ide> }
<ide>
<ide> func (n *network) leaveSandbox() { | 2 |
Javascript | Javascript | fix alertios examples | a6bca4041bc95eb9fe43a61e1f4552c5551d9575 | <ide><path>Examples/UIExplorer/AlertIOSExample.js
<ide> exports.examples = [{
<ide> 'Hello World',
<ide> null,
<ide> [
<del> {text: 'OK', onPress: (text) => console.log('OK pressed')},
<del> ],
<del> 'default'
<add> {text: 'OK', onPress: (text) => console.log('OK pressed'), type: 'default'}
<add> ]
<ide> )}>
<ide>
<ide> <View style={styles.button}>
<ide> exports.examples = [{
<ide> 'Plain Text Entry',
<ide> null,
<ide> [
<del> {text: 'Submit', onPress: (text) => console.log('Text: ' + text)},
<del> ],
<del> 'plain-text'
<add> {text: 'Submit', onPress: (text) => console.log('Text: ' + text), type: 'plain-text'},
<add> {text: 'Cancel', onPress: () => console.log('Cancel'), style: 'cancel'}
<add> ]
<ide> )}>
<ide>
<ide> <View style={styles.button}>
<ide> exports.examples = [{
<ide> 'Secure Text Entry',
<ide> null,
<ide> [
<del> {text: 'Submit', onPress: (text) => console.log('Password: ' + text)},
<del> ],
<del> 'secure-text'
<add> {text: 'Submit', onPress: (text) => console.log('Password: ' + text), type: 'secure-text'},
<add> {text: 'Cancel', onPress: () => console.log('Cancel'), style: 'cancel'}
<add> ]
<ide> )}>
<ide>
<ide> <View style={styles.button}>
<ide> exports.examples = [{
<ide> 'Login & Password',
<ide> null,
<ide> [
<del> {text: 'Submit', onPress: (details) => console.log('Login: ' + details.login + '; Password: ' + details.password)},
<del> ],
<del> 'login-password'
<add> {text: 'Submit', onPress: (details) => console.log('Login: ' + details.login + '; Password: ' + details.password), type: 'login-password'},
<add> {text: 'Cancel', onPress: () => console.log('Cancel'), style: 'cancel'}
<add> ]
<ide> )}>
<ide>
<ide> <View style={styles.button}> | 1 |
Javascript | Javascript | remove unnecessary property and method | 6ad458b752e2c2818244714e499e560f6e668c87 | <ide><path>lib/module.js
<ide> Module.prototype._compile = function(content, filename) {
<ide> return Module._resolveFilename(request, self);
<ide> };
<ide>
<del> Object.defineProperty(require, 'paths', { get: function() {
<del> throw new Error('require.paths is removed. Use ' +
<del> 'node_modules folders, or the NODE_PATH ' +
<del> 'environment variable instead.');
<del> }});
<del>
<ide> require.main = process.mainModule;
<ide>
<ide> // Enable support to add extra extension types
<ide> require.extensions = Module._extensions;
<del> require.registerExtension = function() {
<del> throw new Error('require.registerExtension() removed. Use ' +
<del> 'require.extensions instead.');
<del> };
<ide>
<ide> require.cache = Module._cache;
<ide>
<ide><path>test/sequential/test-module-loading.js
<ide> assert.equal(require('../fixtures/registerExt2').custom, 'passed');
<ide> assert.equal(require('../fixtures/foo').foo, 'ok',
<ide> 'require module with no extension');
<ide>
<del>assert.throws(function() {
<del> require.paths;
<del>}, /removed/, 'Accessing require.paths should throw.');
<del>
<ide> // Should not attempt to load a directory
<ide> try {
<ide> require('../fixtures/empty'); | 2 |
Ruby | Ruby | handle devel-only correctly | 3f318b8ed4f33fa1f27f5ce1816ca72513196208 | <ide><path>Library/Homebrew/cmd/install.rb
<ide> def install
<ide> end
<ide>
<ide> ARGV.formulae.each do |f|
<del> # Building head-only without --HEAD is an error
<del> if not ARGV.build_head? and f.stable.nil?
<add> # head-only without --HEAD is an error
<add> if not ARGV.build_head? and f.stable.nil? and f.devel.nil?
<ide> raise <<-EOS.undent
<ide> #{f.name} is a head-only formula
<ide> Install with `brew install --HEAD #{f.name}`
<ide> EOS
<ide> end
<ide>
<del> # Building stable-only with --HEAD is an error
<add> # devel-only without --devel is an error
<add> if not ARGV.build_devel? and f.stable.nil?
<add> if f.head.nil?
<add> raise <<-EOS.undent
<add> #{f.name} is a devel-only formula
<add> Install with `brew install --devel #{f.name}`
<add> EOS
<add> else
<add> raise "#{f.name} has no stable download, please choose --devel or --HEAD"
<add> end
<add> end
<add>
<add> # --HEAD, fail with no head defined
<ide> if ARGV.build_head? and f.head.nil?
<ide> raise "No head is defined for #{f.name}"
<ide> end
<ide>
<del> # Building stable-only with --devel is an error
<add> # --devel, fail with no devel defined
<ide> if ARGV.build_devel? and f.devel.nil?
<ide> raise "No devel block is defined for #{f.name}"
<ide> end | 1 |
Python | Python | add assert that git version is available. | 0498afea94525ea157e52b8edc255f13ed7d4004 | <ide><path>setup.py
<ide> def _minimal_ext_cmd(cmd):
<ide> except (subprocess.SubprocessError, OSError):
<ide> GIT_REVISION = "Unknown"
<ide>
<add> if not GIT_REVISION:
<add> # this shouldn't happen but apparently can (see gh-8512)
<add> GIT_REVISION = "Unknown"
<add>
<ide> return GIT_REVISION
<ide>
<ide> # BEFORE importing setuptools, remove MANIFEST. Otherwise it may not be | 1 |
Text | Text | update 11.0.0 changelog with missing commit | b32c5f04084a20e0db950a0b74b7b69d776e5e3f | <ide><path>doc/changelogs/CHANGELOG_V11.md
<ide> * [[`42bded83e8`](https://github.com/nodejs/node/commit/42bded83e8)] - **(SEMVER-MAJOR)** **fs**: throw ERR\_INVALID\_ARG\_VALUE when buffer being written is empty (AdityaSrivast) [#21262](https://github.com/nodejs/node/pull/21262)
<ide> * [[`7bd48896e9`](https://github.com/nodejs/node/commit/7bd48896e9)] - **(SEMVER-MAJOR)** **fs**: move SyncWriteStream to end-of-life (James M Snell) [#20735](https://github.com/nodejs/node/pull/20735)
<ide> * [[`19374fd25b`](https://github.com/nodejs/node/commit/19374fd25b)] - **(SEMVER-MAJOR)** **fs**: improve argument handling for ReadStream (Ujjwal Sharma) [#19898](https://github.com/nodejs/node/pull/19898)
<add>* [[`f22c7c10ca`](https://github.com/nodejs/node/commit/f22c7c10ca)] - **(SEMVER-MAJOR)** **http**: always emit close on req and res (Robert Nagy) [#20611](https://github.com/nodejs/node/pull/20611)
<ide> * [[`1744205ff5`](https://github.com/nodejs/node/commit/1744205ff5)] - **(SEMVER-MAJOR)** **http**: move process.binding('http\_parser') to internalBinding (James M Snell) [#22329](https://github.com/nodejs/node/pull/22329)
<ide> * [[`4b00c4fafa`](https://github.com/nodejs/node/commit/4b00c4fafa)] - **(SEMVER-MAJOR)** **http**: make client `.aborted` boolean (Robert Nagy) [#20230](https://github.com/nodejs/node/pull/20230)
<ide> * [[`564048dc29`](https://github.com/nodejs/node/commit/564048dc29)] - **(SEMVER-MAJOR)** **http,https,tls**: switch to WHATWG URL parser (Hackzzila) [#20270](https://github.com/nodejs/node/pull/20270) | 1 |
Python | Python | add better error for failed model shortcut loading | 2a1fa86a0d90e555574801bda5d04a38b34b620a | <ide><path>spacy/cli/download.py
<ide> from ._util import app, Arg, Opt
<ide> from .. import about
<ide> from ..util import is_package, get_base_version, run_command
<del>
<del># These are the old shortcuts we previously supported in spacy download. As of
<del># v3, shortcuts are deprecated so we're not expecting to add anything to this
<del># list. It only exists to show users warnings.
<del>OLD_SHORTCUTS = {
<del> "en": "en_core_web_sm",
<del> "de": "de_core_news_sm",
<del> "es": "es_core_news_sm",
<del> "pt": "pt_core_news_sm",
<del> "fr": "fr_core_news_sm",
<del> "it": "it_core_news_sm",
<del> "nl": "nl_core_news_sm",
<del> "el": "el_core_news_sm",
<del> "nb": "nb_core_news_sm",
<del> "lt": "lt_core_news_sm",
<del> "xx": "xx_ent_wiki_sm",
<del>}
<add>from ..errors import OLD_MODEL_SHORTCUTS
<ide>
<ide>
<ide> @app.command(
<ide> def download(model: str, direct: bool = False, *pip_args) -> None:
<ide> download_model(dl_tpl.format(m=model_name, v=version), pip_args)
<ide> else:
<ide> model_name = model
<del> if model in OLD_SHORTCUTS:
<add> if model in OLD_MODEL_SHORTCUTS:
<ide> msg.warn(
<del> f"As of spaCy v3.0, shortcuts like '{model}' are deprecated. "
<del> f"Please use the full model name '{OLD_SHORTCUTS[model]}' instead."
<add> f"As of spaCy v3.0, shortcuts like '{model}' are deprecated. Please"
<add> f"use the full model name '{OLD_MODEL_SHORTCUTS[model]}' instead."
<ide> )
<del> model_name = OLD_SHORTCUTS[model]
<add> model_name = OLD_MODEL_SHORTCUTS[model]
<ide> compatibility = get_compatibility()
<ide> version = get_version(model_name, compatibility)
<ide> download_model(dl_tpl.format(m=model_name, v=version), pip_args)
<ide><path>spacy/errors.py
<ide> class Errors:
<ide> E199 = ("Unable to merge 0-length span at doc[{start}:{end}].")
<ide>
<ide> # TODO: fix numbering after merging develop into master
<add> E941 = ("Can't find model '{name}'. It looks like you're trying to load a "
<add> "model from a shortcut, which is deprecated as of spaCy v3.0. To "
<add> "load the model, use its full name instead. For example:\n\n"
<add> "nlp = spacy.load(\"{full}\")\n\nFor more details on the available "
<add> "models, see the models directory: https://spacy.io/models")
<ide> E942 = ("Executing after_{name} callback failed. Expected the function to "
<ide> "return an initialized nlp object but got: {value}. Maybe "
<ide> "you forgot to return the modified object in your function?")
<ide> class TempErrors:
<ide> "issue tracker: http://github.com/explosion/spaCy/issues")
<ide>
<ide>
<add># Deprecated model shortcuts, only used in errors and warnings
<add>OLD_MODEL_SHORTCUTS = {
<add> "en": "en_core_web_sm", "de": "de_core_news_sm", "es": "es_core_news_sm",
<add> "pt": "pt_core_news_sm", "fr": "fr_core_news_sm", "it": "it_core_news_sm",
<add> "nl": "nl_core_news_sm", "el": "el_core_news_sm", "nb": "nb_core_news_sm",
<add> "lt": "lt_core_news_sm", "xx": "xx_ent_wiki_sm"
<add>}
<add>
<add>
<ide> # fmt: on
<ide>
<ide>
<ide><path>spacy/util.py
<ide>
<ide> from .symbols import ORTH
<ide> from .compat import cupy, CudaStream, is_windows
<del>from .errors import Errors, Warnings
<add>from .errors import Errors, Warnings, OLD_MODEL_SHORTCUTS
<ide> from . import about
<ide>
<ide> if TYPE_CHECKING:
<ide> def load_model(
<ide> return load_model_from_path(Path(name), **kwargs)
<ide> elif hasattr(name, "exists"): # Path or Path-like to model data
<ide> return load_model_from_path(name, **kwargs)
<add> if name in OLD_MODEL_SHORTCUTS:
<add> raise IOError(Errors.E941.format(name=name, full=OLD_MODEL_SHORTCUTS[name]))
<ide> raise IOError(Errors.E050.format(name=name))
<ide>
<ide> | 3 |
Text | Text | update some literal translations in index.md | f48af0d2f66c9a4c05f806819cdb7aae64189513 | <ide><path>guide/spanish/agile/the-agile-manifesto/index.md
<ide> Estamos descubriendo mejores formas de desarrollar software haciéndolo y ayudan
<ide> A través de este trabajo, hemos llegado a valorar.
<ide>
<ide> * **Individuos e interacciones** sobre procesos y herramientas.
<del>* **Software de trabajo** sobre documentación integral.
<add>* **Software que funciona** sobre documentación integral.
<ide> * **Colaboración** con **clientes** sobre negociación de contratos.
<ide> * **Respondiendo al cambio** siguiendo un plan.
<ide>
<del>Es decir, mientras hay valor en los elementos de la derecha, valoramos más los elementos de la izquierda.
<add>Es decir, aunque hay valor en los elementos de la derecha, valoramos más los elementos de la izquierda.
<ide>
<ide> ### Doce principios de software ágil
<ide>
<ide> 1. Nuestra máxima prioridad es satisfacer al cliente a través de la entrega temprana y continua de software valioso.
<del>2. Bienvenido cambiando los requisitos, incluso tarde en el desarrollo. Los procesos ágiles aprovechan el cambio para la ventaja competitiva del cliente.
<add>2. Los cambios de los requisitos son bienvenidos, incluso tarde en el desarrollo. Los procesos ágiles aprovechan el cambio para la ventaja competitiva del cliente.
<ide> 3. Ofrezca software de trabajo con frecuencia, desde un par de semanas hasta un par de meses, con una preferencia por el plazo más corto.
<del>4. La gente de negocios y los desarrolladores deben trabajar juntos todos los días a lo largo del proyecto.
<add>4. La gente de negocio y los desarrolladores deben trabajar juntos todos los días a lo largo del proyecto.
<ide> 5. Construye proyectos alrededor de individuos motivados. Deles el ambiente y el apoyo que necesitan y confíen en ellos para hacer el trabajo.
<ide> 6. El método más eficiente y efectivo de transmitir información hacia y dentro de un equipo de desarrollo es la conversación cara a cara.
<del>7. El software de trabajo es la principal medida del progreso.
<add>7. El software que funciona es la principal medida del progreso.
<ide> 8. Los procesos ágiles promueven el desarrollo sostenible. Los patrocinadores, los desarrolladores y los usuarios deberían poder mantener un ritmo constante de forma indefinida.
<ide> 9. La atención continua a la excelencia técnica y el buen diseño mejora la agilidad.
<ide> 10. La simplicidad, el arte de maximizar la cantidad de trabajo no hecho, es esencial.
<ide> Es decir, mientras hay valor en los elementos de la derecha, valoramos más los
<ide> #### Más información:
<ide>
<ide> * [(1) Historia: El manifiesto ágil.](http://agilemanifesto.org/history.html)
<del>* [Manifiesto Ágil](http://agilemanifesto.org/)
<ide>\ No newline at end of file
<add>* [Manifiesto Ágil](http://agilemanifesto.org/) | 1 |
Text | Text | commit 7/8 rosetta tokenize | ca2b97f11e4d8a9067fe1f3458368e215fc8cee0 | <ide><path>curriculum/challenges/english/10-coding-interview-prep/rosetta-code/tokenize-a-string-with-escaping.english.md
<ide> tests:
<ide> <div id='js-seed'>
<ide>
<ide> ```js
<del>function tokenize(str, esc, sep) {
<add>function tokenize(str, sep, esc) {
<ide> return true;
<ide> }
<ide> ``` | 1 |
Python | Python | avoid recalculation of output in join merge | 5352e46e09c1f586cb173f6bb2f0d6a01a9aa0a9 | <ide><path>keras/layers/core.py
<ide> def get_output(self, train=False):
<ide> if X.name is None:
<ide> raise ValueError("merge_mode='join' only works with named inputs")
<ide> else:
<del> inputs[X.name] = self.layers[i].get_output(train)
<add> inputs[X.name] = X
<ide> return inputs
<ide> elif self.mode == 'mul':
<ide> s = self.layers[0].get_output(train) | 1 |