| content_type | main_lang | message | sha | patch | file_count |
|---|---|---|---|---|---|
Text | Text | add note on changes to configuration options | faa4a527440fb1a8f47bf066bb89bbff380b914d | <ide><path>UPDATING.md
<ide> https://developers.google.com/style/inclusive-documentation
<ide>
<ide> -->
<ide>
<add>### Change the configuration options for field masking
<add>
<add>We've improved masking for sensitive data in Web UI and logs. As part of it, the following configurations have been changed:
<add>
<add>* `hide_sensitive_variable_fields` option in `admin` section has been replaced by `hide_sensitive_var_conn_fields` option in `core` section,
<add>* `sensitive_variable_fields` option in `admin` section has been replaced by `sensitive_var_conn_names` option in `core` section.
<add>
<ide> ### Deprecated PodDefaults and add_xcom_sidecar in airflow.kubernetes.pod_generator
<ide>
<ide> We have moved PodDefaults from `airflow.kubernetes.pod_generator.PodDefaults` to | 1 |
Text | Text | fix typo in source maps documentation. | fa02b197f5254cfe9f0feabaa64e598d60d4745d | <ide><path>docs/advanced-features/source-maps.md
<ide> description: Enables browser source map generation during the production build.
<ide>
<ide> # Source Maps
<ide>
<del>Source Maps are enabled by default during development. During production builds they are disabled as generation source maps can significantly increase build times and memory usage while being generated.
<add>Source Maps are enabled by default during development. During production builds, they are disabled as generating source maps can significantly increase build times and memory usage while being generated.
<ide>
<ide> Next.js provides a configuration flag you can use to enable browser source map generation during the production build:
<ide>
<ide> module.exports = {
<ide> }
<ide> ```
<ide>
<del>When the `productionBrowserSourceMaps` option is enabled the source maps will be output in the same directory as the JavaScript files, Next.js will automatically serve these files when requested.
<add>When the `productionBrowserSourceMaps` option is enabled, the source maps will be output in the same directory as the JavaScript files. Next.js will automatically serve these files when requested.
<ide>
<ide> ## Caveats
<ide>
<del>- Can increase `next build` time
<add>- Adding source maps can increase `next build` time
<ide> - Increases memory usage during `next build` | 1 |
Ruby | Ruby | prevent void context warnings | 811fd0a6b47c83cd4f668127bb822dc5cdcfc0f5 | <ide><path>activesupport/test/autoloading_fixtures/raises_arbitrary_exception.rb
<ide> RaisesArbitraryException = 1
<del>A::B # Autoloading recursion, also expected to be watched and discarded.
<add>_ = A::B # Autoloading recursion, also expected to be watched and discarded.
<ide>
<ide> raise Exception, 'arbitray exception message'
<ide><path>activesupport/test/autoloading_fixtures/throws.rb
<ide> Throws = 1
<del>A::B # Autoloading recursion, expected to be discarded.
<add>_ = A::B # Autoloading recursion, expected to be discarded.
<ide>
<ide> throw :t | 2 |
Javascript | Javascript | define extra scope for switch-case | 42cb86bfdcc5c782cd393deede1fbdd104feb186 | <ide><path>lib/javascript/JavascriptParser.js
<ide> class JavascriptParser extends Parser {
<ide> if (switchCase.test) {
<ide> this.walkExpression(switchCase.test);
<ide> }
<del> this.walkStatements(switchCase.consequent);
<add> if (switchCase.consequent.length === 0) return;
<add> if (
<add> switchCase.consequent.length === 1 &&
<add> switchCase.consequent[0].type === "BlockStatement"
<add> ) {
<add> this.walkStatement(switchCase.consequent[0]);
<add> } else {
<add> this.inBlockScope(() => {
<add> this.blockPreWalkStatements(switchCase.consequent);
<add> this.walkStatements(switchCase.consequent);
<add> });
<add> }
<ide> }
<ide> }
<ide>
<ide><path>test/cases/parsing/issue-11283/A.js
<add>export default "B";
<ide><path>test/cases/parsing/issue-11283/index.js
<add>import A from "./A.js";
<add>
<add>function magicA() {
<add> // To be sure that future optimization
<add> // will not affect test suite
<add> return String.fromCharCode(65);
<add>}
<add>
<add>it("should parse switch case properly", () => {
<add> switch (1) {
<add> case 2:
<add> case 1:
<add> const A = magicA();
<add> expect(A).toBe("A");
<add> break;
<add> }
<add>
<add> switch (1) {
<add> case 2:
<add> case 1: {
<add> const A = magicA();
<add> expect(A).toBe("A");
<add> break;
<add> }
<add> }
<add>}); | 3 |
Javascript | Javascript | simplify array initialization | 44c6b6b22a6ea92b3307bfcfab933b75b9b70639 | <ide><path>test/parallel/test-buffer-alloc.js
<ide> assert.throws(() => Buffer.from('', 'buffer'), TypeError);
<ide> // Regression test for #6111. Constructing a buffer from another buffer
<ide> // should a) work, and b) not corrupt the source buffer.
<ide> {
<del> let a = [0];
<del> for (let i = 0; i < 7; ++i) a = a.concat(a);
<del> a = a.map((_, i) => { return i; });
<add> const a = [...Array(128).keys()]; // [0, 1, 2, 3, ... 126, 127]
<ide> const b = Buffer.from(a);
<ide> const c = Buffer.from(b);
<ide> assert.strictEqual(b.length, a.length); | 1 |
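The one-liner adopted in the patch above replaces a concat-and-map loop with a spread over an array iterator; as a quick sketch of how it builds the ascending index array:

```javascript
// `Array(128)` creates a sparse array of length 128; `.keys()` yields its
// indices 0..127, and spreading them materializes a real dense array.
const a = [...Array(128).keys()];
console.log(a[0], a[127], a.length); // 0 127 128
```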
Javascript | Javascript | explain blur events | 095627ad17c7f85f723a74c38ce96674140592ea | <ide><path>src/ng/directive/ngEventDirs.js
<ide> forEach(
<ide> * @description
<ide> * Specify custom behavior on blur event.
<ide> *
<add> * A [blur event](https://developer.mozilla.org/en-US/docs/Web/Events/blur) fires when
<add> * an element has lost focus.
<add> *
<ide> * Note: As the `blur` event is executed synchronously also during DOM manipulations
<ide> * (e.g. removing a focussed input),
<ide> * AngularJS executes the expression using `scope.$evalAsync` if the event is fired | 1 |
Javascript | Javascript | add assertion for the jquery#val method | 659ac9c155a9ecee03e635dde1661be7081322f3 | <ide><path>test/unit/attributes.js
<ide> if ( "value" in document.createElement("meter") &&
<ide> }
<ide>
<ide> var testVal = function( valueObj ) {
<del> expect( 8 );
<add> expect( 9 );
<ide>
<ide> jQuery("#text1").val( valueObj("test") );
<ide> equal( document.getElementById("text1").value, "test", "Check for modified (via val(String)) value of input element" );
<ide> var testVal = function( valueObj ) {
<ide> equal( document.getElementById("text1").value, "", "Check for modified (via val(null)) value of input element" );
<ide>
<ide> var j,
<del> $select1 = jQuery("#select1");
<add>		select = jQuery( "<select multiple><option value='1'/><option value='2'/></select>" ),
<add>		$select1 = jQuery("#select1");
<add>
<ide> $select1.val( valueObj("3") );
<ide> equal( $select1.val(), "3", "Check for modified (via val(String)) value of select element" );
<ide>
<ide> var testVal = function( valueObj ) {
<ide> j.val( valueObj( "asdf" ) );
<ide> equal( j.val(), "asdf", "Check node,textnode,comment with val()" );
<ide> j.removeAttr("value");
<add>
<add> select.val( valueObj( [ "1", "2" ] ) );
<add> deepEqual( select.val(), [ "1", "2" ], "Should set array of values" );
<ide> };
<ide>
<ide> test( "val(String/Number)", function() { | 1 |
Python | Python | remove ftphook from hooks | 05be03c9127644edf065b87ce6fbd9b8c30345de | <ide><path>airflow/hooks/__init__.py
<ide> from airflow.hooks.base_hook import BaseHook as _BaseHook
<ide>
<ide> _hooks = {
<del> 'ftp_hook': ['FTPHook'],
<ide> 'hive_hooks': [
<ide> 'HiveCliHook',
<ide> 'HiveMetastoreHook',
<ide><path>airflow/hooks/ftp_hook.py
<del>import datetime
<del>import ftplib
<del>import logging
<del>import os.path
<del>from airflow.hooks.base_hook import BaseHook
<del>from past.builtins import basestring
<del>
<del>
<del>def mlsd(conn, path="", facts=[]):
<del> '''
<del> BACKPORT FROM PYTHON3 FTPLIB
<del>
<del> List a directory in a standardized format by using MLSD
<del> command (RFC-3659). If path is omitted the current directory
<del> is assumed. "facts" is a list of strings representing the type
<del> of information desired (e.g. ["type", "size", "perm"]).
<del>
<del> Return a generator object yielding a tuple of two elements
<del> for every file found in path.
<del> First element is the file name, the second one is a dictionary
<del> including a variable number of "facts" depending on the server
<del> and whether "facts" argument has been provided.
<del> '''
<del> if facts:
<del> conn.sendcmd("OPTS MLST " + ";".join(facts) + ";")
<del> if path:
<del> cmd = "MLSD %s" % path
<del> else:
<del> cmd = "MLSD"
<del> lines = []
<del> conn.retrlines(cmd, lines.append)
<del> for line in lines:
<del> facts_found, _, name = line.rstrip(ftplib.CRLF).partition(' ')
<del> entry = {}
<del> for fact in facts_found[:-1].split(";"):
<del> key, _, value = fact.partition("=")
<del> entry[key.lower()] = value
<del> yield (name, entry)
<del>
<del>
<del>class FTPHook(BaseHook):
<del>
<del> """
<del> Interact with FTP.
<del>
<del> Errors that may occur throughout but should be handled
<del> downstream.
<del> """
<del>
<del> def __init__(self, ftp_conn_id='ftp_default'):
<del> self.ftp_conn_id = ftp_conn_id
<del> self.conn = None
<del>
<del> def get_conn(self):
<del> """
<del> Returns a FTP connection object
<del> """
<del> if self.conn is None:
<del> params = self.get_connection(self.ftp_conn_id)
<del> self.conn = ftplib.FTP(params.host, params.login, params.password)
<del>
<del> return self.conn
<del>
<del> def close_conn(self):
<del> """
<del> Closes the connection. An error will occur if the
<del> connection wasnt ever opened.
<del> """
<del> conn = self.conn
<del> conn.quit()
<del>
<del> def describe_directory(self, path):
<del> """
<del> Returns a dictionary of {filename: {attributes}} for all files
<del> on the remote system (where the MLSD command is supported).
<del>
<del> :param path: full path to the remote directory
<del> :type path: str
<del> """
<del> conn = self.get_conn()
<del> conn.cwd(path)
<del> try:
<del> # only works in Python 3
<del> files = dict(conn.mlsd())
<del> except AttributeError:
<del> files = dict(mlsd(conn))
<del> return files
<del>
<del> def list_directory(self, path, nlst=False):
<del> """
<del> Returns a list of files on the remote system.
<del>
<del> :param path: full path to the remote directory to list
<del> :type path: str
<del> """
<del> conn = self.get_conn()
<del> conn.cwd(path)
<del>
<del> files = conn.nlst()
<del> return files
<del>
<del> def create_directory(self, path):
<del> """
<del> Creates a directory on the remote system.
<del>
<del> :param path: full path to the remote directory to create
<del> :type path: str
<del> """
<del> conn = self.get_conn()
<del> conn.mkd(path)
<del>
<del> def delete_directory(self, path):
<del> """
<del> Deletes a directory on the remote system.
<del>
<del> :param path: full path to the remote directory to delete
<del> :type path: str
<del> """
<del> conn = self.get_conn()
<del> conn.rmd(path)
<del>
<del> def retrieve_file(self, remote_full_path, local_full_path_or_buffer):
<del> """
<del> Transfers the remote file to a local location.
<del>
<del> If local_full_path_or_buffer is a string path, the file will be put
<del> at that location; if it is a file-like buffer, the file will
<del> be written to the buffer but not closed.
<del>
<del> :param remote_full_path: full path to the remote file
<del> :type remote_full_path: str
<del> :param local_full_path_or_buffer: full path to the local file or a
<del> file-like buffer
<del> :type local_full_path: str or file-like buffer
<del> """
<del> conn = self.get_conn()
<del>
<del> is_path = isinstance(local_full_path_or_buffer, basestring)
<del>
<del> if is_path:
<del> output_handle = open(local_full_path_or_buffer, 'wb')
<del> else:
<del> output_handle = local_full_path_or_buffer
<del>
<del> remote_path, remote_file_name = os.path.split(remote_full_path)
<del> conn.cwd(remote_path)
<del> logging.info('Retrieving file from FTP: {}'.format(remote_full_path))
<del> conn.retrbinary('RETR %s' % remote_file_name, output_handle.write)
<del> logging.info('Finished etrieving file from FTP: {}'.format(
<del> remote_full_path))
<del>
<del> if is_path:
<del> output_handle.close()
<del>
<del> def store_file(self, remote_full_path, local_full_path_or_buffer):
<del> """
<del> Transfers a local file to the remote location.
<del>
<del> If local_full_path_or_buffer is a string path, the file will be read
<del> from that location; if it is a file-like buffer, the file will
<del> be read from the buffer but not closed.
<del>
<del> :param remote_full_path: full path to the remote file
<del> :type remote_full_path: str
<del> :param local_full_path_or_buffer: full path to the local file or a
<del> file-like buffer
<del> :type local_full_path_or_buffer: str or file-like buffer
<del> """
<del> conn = self.get_conn()
<del>
<del> is_path = isinstance(local_full_path_or_buffer, basestring)
<del>
<del> if is_path:
<del> input_handle = open(local_full_path_or_buffer, 'rb')
<del> else:
<del> input_handle = local_full_path_or_buffer
<del> remote_path, remote_file_name = os.path.split(remote_full_path)
<del> conn.cwd(remote_path)
<del> conn.storbinary('STOR %s' % remote_file_name, input_handle)
<del>
<del> if is_path:
<del> input_handle.close()
<del>
<del> def delete_file(self, path):
<del> """
<del> Removes a file on the FTP Server
<del>
<del> :param path: full path to the remote file
<del> :type path: str
<del> """
<del> conn = self.get_conn()
<del> conn.delete(path)
<del>
<del> def get_mod_time(self, path):
<del> conn = self.get_conn()
<del> ftp_mdtm = conn.sendcmd('MDTM ' + path)
<del> return datetime.datetime.strptime(ftp_mdtm[4:], '%Y%m%d%H%M%S') | 2 |
Text | Text | add missing docs about binary remote contexts | a5ba032c7421ef7a429e780d12d0f604a045258a | <ide><path>docs/reference/api/docker_remote_api_v1.19.md
<ide> or being killed.
<ide> ignored if `remote` is specified and points to an individual filename.
<ide> - **t** – A name and optional tag to apply to the image in the `name:tag` format.
<ide> If you omit the `tag` the default `latest` value is assumed.
<del>- **remote** – A Git repository URI or HTTP/HTTPS context URI. If the
<del> URI points to a single text file, the file's contents are placed into
<del> a file called `Dockerfile` and the image is built from that file. If
<del> the URI points to a tarball, the file is downloaded by the daemon and
<del> the contents therein used as the context for the build. If the URI
<del> points to a tarball and the `dockerfile` parameter is also specified,
<del> there must be a file with the corresponding path inside the tarball.
<add>- **remote** – A Git repository URI or HTTP/HTTPS URI build source. If the
<add> URI specifies a filename, the file's contents are placed into a file
<add> called `Dockerfile`.
<ide> - **q** – Suppress verbose build output.
<ide> - **nocache** – Do not use the cache when building the image.
<ide> - **pull** - Attempt to pull the image even if an older image exists locally.
<ide><path>docs/reference/api/docker_remote_api_v1.20.md
<ide> or being killed.
<ide>
<ide> **Query parameters**:
<ide>
<del>- **dockerfile** - Path within the build context to the Dockerfile. This is
<del> ignored if `remote` is specified and points to an individual filename.
<add>- **dockerfile** - Path within the build context to the `Dockerfile`. This is
<add> ignored if `remote` is specified and points to an external `Dockerfile`.
<ide> - **t** – A name and optional tag to apply to the image in the `name:tag` format.
<ide> If you omit the `tag` the default `latest` value is assumed.
<ide> - **remote** – A Git repository URI or HTTP/HTTPS context URI. If the
<ide> URI points to a single text file, the file's contents are placed into
<del> a file called `Dockerfile` and the image is built from that file.
<add> a file called `Dockerfile` and the image is built from that file. If
<add> the URI points to a tarball, the file is downloaded by the daemon and
<add> the contents therein used as the context for the build. If the URI
<add> points to a tarball and the `dockerfile` parameter is also specified,
<add> there must be a file with the corresponding path inside the tarball.
<ide> - **q** – Suppress verbose build output.
<ide> - **nocache** – Do not use the cache when building the image.
<ide> - **pull** - Attempt to pull the image even if an older image exists locally.
<ide><path>docs/reference/api/docker_remote_api_v1.21.md
<ide> or being killed.
<ide>
<ide> **Query parameters**:
<ide>
<del>- **dockerfile** - Path within the build context to the Dockerfile. This is
<del> ignored if `remote` is specified and points to an individual filename.
<add>- **dockerfile** - Path within the build context to the `Dockerfile`. This is
<add> ignored if `remote` is specified and points to an external `Dockerfile`.
<ide> - **t** – A name and optional tag to apply to the image in the `name:tag` format.
<ide> If you omit the `tag` the default `latest` value is assumed.
<ide> You can provide one or more `t` parameters.
<ide> - **remote** – A Git repository URI or HTTP/HTTPS context URI. If the
<ide> URI points to a single text file, the file's contents are placed into
<del> a file called `Dockerfile` and the image is built from that file.
<add> a file called `Dockerfile` and the image is built from that file. If
<add> the URI points to a tarball, the file is downloaded by the daemon and
<add> the contents therein used as the context for the build. If the URI
<add> points to a tarball and the `dockerfile` parameter is also specified,
<add> there must be a file with the corresponding path inside the tarball.
<ide> - **q** – Suppress verbose build output.
<ide> - **nocache** – Do not use the cache when building the image.
<ide> - **pull** - Attempt to pull the image even if an older image exists locally.
<ide><path>docs/reference/api/docker_remote_api_v1.22.md
<ide> or being killed.
<ide>
<ide> **Query parameters**:
<ide>
<del>- **dockerfile** - Path within the build context to the Dockerfile. This is
<del> ignored if `remote` is specified and points to an individual filename.
<add>- **dockerfile** - Path within the build context to the `Dockerfile`. This is
<add> ignored if `remote` is specified and points to an external `Dockerfile`.
<ide> - **t** – A name and optional tag to apply to the image in the `name:tag` format.
<ide> If you omit the `tag` the default `latest` value is assumed.
<ide> You can provide one or more `t` parameters.
<ide> - **remote** – A Git repository URI or HTTP/HTTPS context URI. If the
<ide> URI points to a single text file, the file's contents are placed into
<del> a file called `Dockerfile` and the image is built from that file.
<add> a file called `Dockerfile` and the image is built from that file. If
<add> the URI points to a tarball, the file is downloaded by the daemon and
<add> the contents therein used as the context for the build. If the URI
<add> points to a tarball and the `dockerfile` parameter is also specified,
<add> there must be a file with the corresponding path inside the tarball.
<ide> - **q** – Suppress verbose build output.
<ide> - **nocache** – Do not use the cache when building the image.
<ide> - **pull** - Attempt to pull the image even if an older image exists locally.
<ide><path>docs/reference/api/docker_remote_api_v1.23.md
<ide> or being killed.
<ide>
<ide> **Query parameters**:
<ide>
<del>- **dockerfile** - Path within the build context to the Dockerfile. This is
<del> ignored if `remote` is specified and points to an individual filename.
<add>- **dockerfile** - Path within the build context to the `Dockerfile`. This is
<add> ignored if `remote` is specified and points to an external `Dockerfile`.
<ide> - **t** – A name and optional tag to apply to the image in the `name:tag` format.
<ide> If you omit the `tag` the default `latest` value is assumed.
<ide> You can provide one or more `t` parameters.
<ide> - **remote** – A Git repository URI or HTTP/HTTPS context URI. If the
<ide> URI points to a single text file, the file's contents are placed into
<del> a file called `Dockerfile` and the image is built from that file.
<add> a file called `Dockerfile` and the image is built from that file. If
<add> the URI points to a tarball, the file is downloaded by the daemon and
<add> the contents therein used as the context for the build. If the URI
<add> points to a tarball and the `dockerfile` parameter is also specified,
<add> there must be a file with the corresponding path inside the tarball.
<ide> - **q** – Suppress verbose build output.
<ide> - **nocache** – Do not use the cache when building the image.
<ide> - **pull** - Attempt to pull the image even if an older image exists locally.
<ide><path>docs/reference/api/docker_remote_api_v1.24.md
<ide> or being killed.
<ide>
<ide> **Query parameters**:
<ide>
<del>- **dockerfile** - Path within the build context to the Dockerfile. This is
<del> ignored if `remote` is specified and points to an individual filename.
<add>- **dockerfile** - Path within the build context to the `Dockerfile`. This is
<add> ignored if `remote` is specified and points to an external `Dockerfile`.
<ide> - **t** – A name and optional tag to apply to the image in the `name:tag` format.
<ide> If you omit the `tag` the default `latest` value is assumed.
<ide> You can provide one or more `t` parameters.
<ide> - **remote** – A Git repository URI or HTTP/HTTPS context URI. If the
<ide> URI points to a single text file, the file's contents are placed into
<del> a file called `Dockerfile` and the image is built from that file.
<add> a file called `Dockerfile` and the image is built from that file. If
<add> the URI points to a tarball, the file is downloaded by the daemon and
<add> the contents therein used as the context for the build. If the URI
<add> points to a tarball and the `dockerfile` parameter is also specified,
<add> there must be a file with the corresponding path inside the tarball.
<ide> - **q** – Suppress verbose build output.
<ide> - **nocache** – Do not use the cache when building the image.
<ide> - **pull** - Attempt to pull the image even if an older image exists locally.
<ide><path>docs/reference/api/docker_remote_api_v1.25.md
<ide> or being killed.
<ide>
<ide> **Query parameters**:
<ide>
<del>- **dockerfile** - Path within the build context to the Dockerfile. This is
<del> ignored if `remote` is specified and points to an individual filename.
<add>- **dockerfile** - Path within the build context to the `Dockerfile`. This is
<add> ignored if `remote` is specified and points to an external `Dockerfile`.
<ide> - **t** – A name and optional tag to apply to the image in the `name:tag` format.
<ide> If you omit the `tag` the default `latest` value is assumed.
<ide> You can provide one or more `t` parameters.
<ide> - **remote** – A Git repository URI or HTTP/HTTPS context URI. If the
<ide> URI points to a single text file, the file's contents are placed into
<del> a file called `Dockerfile` and the image is built from that file.
<add> a file called `Dockerfile` and the image is built from that file. If
<add> the URI points to a tarball, the file is downloaded by the daemon and
<add> the contents therein used as the context for the build. If the URI
<add> points to a tarball and the `dockerfile` parameter is also specified,
<add> there must be a file with the corresponding path inside the tarball.
<ide> - **q** – Suppress verbose build output.
<ide> - **nocache** – Do not use the cache when building the image.
<ide> - **pull** - Attempt to pull the image even if an older image exists locally.
<ide><path>docs/reference/commandline/build.md
<ide> to any of the files in the context. For example, your build can use an
<ide> [*ADD*](../builder.md#add) instruction to reference a file in the
<ide> context.
<ide>
<del>The `URL` parameter can specify the location of a Git repository; the repository
<del>acts as the build context. The system recursively clones the repository and its
<del>submodules using a `git clone --depth 1 --recursive` command. This command runs
<del>in a temporary directory on your local host. After the command succeeds, the
<del>directory is sent to the Docker daemon as the context. Local clones give you the
<del>ability to access private repositories using local user credentials, VPNs, and
<del>so forth.
<add>The `URL` parameter can refer to three kinds of resources: Git repositories,
<add>pre-packaged tarball contexts and plain text files.
<add>
<add>### Git repositories
<add>
<add>When the `URL` parameter points to the location of a Git repository, the
<add>repository acts as the build context. The system recursively clones the
<add>repository and its submodules using a `git clone --depth 1 --recursive`
<add>command. This command runs in a temporary directory on your local host. After
<add>the command succeeds, the directory is sent to the Docker daemon as the
<add>context. Local clones give you the ability to access private repositories using
<add>local user credentials, VPNs, and so forth.
<ide>
<ide> Git URLs accept context configuration in their fragment section, separated by a
<ide> colon `:`. The first part represents the reference that Git will check out,
<ide> Build Syntax Suffix | Commit Used | Build Context Used
<ide> `myrepo.git#mybranch:myfolder` | `refs/heads/mybranch` | `/myfolder`
<ide> `myrepo.git#abcdef:myfolder` | `sha1 = abcdef` | `/myfolder`
<ide>
<add>
<add>### Tarball contexts
<add>
<add>If you pass a URL to a remote tarball, the URL itself is sent to the daemon:
<add>
<ide> Instead of specifying a context, you can pass a single Dockerfile in the `URL`
<ide> or pipe the file in via `STDIN`. To pipe a Dockerfile from `STDIN`:
<ide>
<add>```bash
<add>$ docker build http://server/context.tar.gz
<add>```
<add>
<add>The download operation will be performed on the host the Docker daemon is
<add>running on, which is not necessarily the same host from which the build command
<add>is being issued. The Docker daemon will fetch `context.tar.gz` and use it as the
<add>build context. Tarball contexts must be tar archives conforming to the standard
<add>`tar` UNIX format and can be compressed with any one of the 'xz', 'bzip2',
<add>'gzip' or 'identity' (no compression) formats.
<add>
<add>### Text files
<add>
<add>Instead of specifying a context, you can pass a single `Dockerfile` in the
<add>`URL` or pipe the file in via `STDIN`. To pipe a `Dockerfile` from `STDIN`:
<add>
<ide> ```bash
<ide> $ docker build - < Dockerfile
<ide> ```
<ide> With Powershell on Windows, you can run:
<ide> Get-Content Dockerfile | docker build -
<ide> ```
<ide>
<del>If you use STDIN or specify a `URL`, the system places the contents into a file
<del>called `Dockerfile`, and any `-f`, `--file` option is ignored. In this
<del>scenario, there is no context.
<add>If you use `STDIN` or specify a `URL` pointing to a plain text file, the system
<add>places the contents into a file called `Dockerfile`, and any `-f`, `--file`
<add>option is ignored. In this scenario, there is no context.
<ide>
<ide> By default the `docker build` command will look for a `Dockerfile` at the root
<ide> of the build context. The `-f`, `--file`, option lets you specify the path to
<ide> an alternative file to use instead. This is useful in cases where the same set
<ide> of files are used for multiple builds. The path must be to a file within the
<del>build context. If a relative path is specified then it must to be relative to
<del>the current directory.
<add>build context. If a relative path is specified then it is interpreted as
<add>relative to the root of the context.
<ide>
<ide> In most cases, it's best to put each Dockerfile in an empty directory. Then,
<ide> add to that directory only the files needed for building the Dockerfile. To
<ide> $ docker build github.com/creack/docker-firefox
<ide> ```
<ide>
<ide> This will clone the GitHub repository and use the cloned repository as context.
<del>The Dockerfile at the root of the repository is used as Dockerfile. Note that
<del>you can specify an arbitrary Git repository by using the `git://` or `git@`
<del>scheme.
<add>The Dockerfile at the root of the repository is used as Dockerfile. You can
<add>specify an arbitrary Git repository by using the `git://` or `git@` scheme.
<add>
<add>```bash
<add>$ docker build -f ctx/Dockerfile http://server/ctx.tar.gz
<add>
<add>Downloading context: http://server/ctx.tar.gz [===================>] 240 B/240 B
<add>Step 0 : FROM busybox
<add> ---> 8c2e06607696
<add>Step 1 : ADD ctx/container.cfg /
<add> ---> e7829950cee3
<add>Removing intermediate container b35224abf821
<add>Step 2 : CMD /bin/ls
<add> ---> Running in fbc63d321d73
<add> ---> 3286931702ad
<add>Removing intermediate container fbc63d321d73
<add>Successfully built 377c409b35e4
<add>```
<add>
<add>This sends the URL `http://server/ctx.tar.gz` to the Docker daemon, which
<add>downloads and extracts the referenced tarball. The `-f ctx/Dockerfile`
<add>parameter specifies a path inside `ctx.tar.gz` to the `Dockerfile` that is used
<add>to build the image. Any `ADD` commands in that `Dockerfile` that refer to local
<add>paths must be relative to the root of the contents inside `ctx.tar.gz`. In the
<add>example above, the tarball contains a directory `ctx/`, so the `ADD
<add>ctx/container.cfg /` operation works as expected.
<ide>
<ide> ### Build with -
<ide> | 8 |
Python | Python | add a test for simple rnn | bab230c0d6debd6b1ab2c521f21d133841c1a7f5 | <ide><path>tests/keras/layers/test_simplernn.py
<add>import theano
<add>import unittest
<add>from numpy.testing import assert_allclose
<add>import numpy as np
<add>from keras.layers.recurrent import SimpleRNN
<add>from mock import Mock
<add>
<add>floatX = theano.config.floatX
<add>
<add>__author__ = "Jeff Ye"
<add>
<add>
<add>class TestSimpleRNN(unittest.TestCase):
<add> left_padding_data = np.array(
<add> [
<add> [ # batch 1
<add> [0], [1], [2], [3]
<add> ],
<add> [ # batch 2
<add> [0], [0], [1], [2]
<add> ]
<add> ], dtype=floatX)
<add> left_padding_mask = np.array( # n_sample x n_time
<add> [
<add> [ # batch 1
<add> 0, 1, 1, 1
<add> ],
<add> [ # batch 2
<add> 0, 0, 1, 1
<add> ]
<add> ], dtype=np.int32)
<add>
<add> def setUp(self):
<add> W = np.array([[1]], dtype=floatX)
<add> U = np.array([[1]], dtype=floatX)
<add> b = np.array([0], dtype=floatX)
<add> weights = [W, U, b]
<add> self.forward = SimpleRNN(output_dim=1, activation='linear', weights=weights)
<add> self.backward = SimpleRNN(output_dim=1, activation='linear', weights=weights)
<add>
<add> previous = Mock()
<add> previous.nb_input = 1
<add> previous.nb_output = 1
<add> previous.output_shape = self.left_padding_data.shape
<add> previous.get_output_mask = Mock()
<add> self.previous = previous
<add>
<add> def test_left_padding(self):
<add> forward = self.forward
<add> forward.go_backwards = False
<add> forward.return_sequences = True
<add> self.previous.get_output.return_value = theano.shared(value=self.left_padding_data)
<add> self.previous.get_output_mask.return_value = theano.shared(value=self.left_padding_mask)
<add> forward.set_previous(self.previous)
<add> np.testing.assert_allclose(forward.get_output().eval(),
<add> np.array([
<add> [[0], [1], [3], [6]],
<add> [[0], [0], [1], [3]]]))
<add>
<add> backward = self.backward
<add> backward.go_backwards = True
<add> backward.return_sequences = True
<add> self.previous.get_output.return_value = theano.shared(value=self.left_padding_data)
<add> self.previous.get_output_mask.return_value = theano.shared(value=self.left_padding_mask)
<add> backward.set_previous(self.previous)
<add> np.testing.assert_allclose(backward.get_output().eval(),
<add> np.array([
<add> [[3], [5], [6], [0]],
<add> [[2], [3], [0], [0]]])) | 1 |
Text | Text | fix broken link | 6d54a250472a5bb303f69adc4db8970bb8b9e3e0 | <ide><path>share/doc/homebrew/Brew-Test-Bot-For-Core-Contributors.md
<ide> This job automatically builds any pull requests submitted to Homebrew/homebrew-c
<ide> ## [Homebrew Testing](https://bot.brew.sh/job/Homebrew%20Testing/)
<ide> This job is manually triggered to run [`brew test-bot`](https://github.com/Homebrew/brew/blob/master/Library/Homebrew/dev-cmd/test-bot.rb) with user-specified parameters. On a successful build it automatically uploads bottles.
<ide>
<del>You can manually start this job with parameters to run [`brew test-bot`](https://github.com/Homebrew/brew/blob/master/Library/Homebrew/cmd/test-bot.rb) with the same parameters. It's often useful to pass a pull request URL, a commit URL, a commit SHA-1 and/or formula names to have `brew-test-bot` test them, report the results and produce bottles.
<add>You can manually start this job with parameters to run [`brew test-bot`](https://github.com/Homebrew/brew/blob/master/Library/Homebrew/dev-cmd/test-bot.rb) with the same parameters. It's often useful to pass a pull request URL, a commit URL, a commit SHA-1 and/or formula names to have `brew-test-bot` test them, report the results and produce bottles.
<ide>
<ide> ## Bottling
<ide> To pull and bottle a pull request with `brew pull`: | 1 |
Javascript | Javascript | add myntra to showcase | 39bea429323248850dde609d821446b87ff3254d | <ide><path>website/src/react-native/showcase.js
<ide> var apps = [
<ide> link: 'https://itunes.apple.com/us/app/mr.-dapper-men-fashion-app/id989735184?ls=1&mt=8',
<ide> author: 'wei ping woon',
<ide> },
<add> {
<add> name: 'Myntra',
<add> icon: 'http://a5.mzstatic.com/us/r30/Purple6/v4/9c/78/df/9c78dfa6-0061-1af2-5026-3e1d5a073c94/icon350x350.png',
<add> link: 'https://itunes.apple.com/in/app/myntra-fashion-shopping-app/id907394059',
<add> author: 'Myntra Designs',
<add> },
<ide> {
<ide> name: 'Ncredible',
<ide> icon: 'http://a3.mzstatic.com/us/r30/Purple2/v4/a9/00/74/a9007400-7ccf-df10-553b-3b6cb67f3f5f/icon350x350.png', | 1 |
PHP | PHP | fix (aggregates with having) | 5a92f04b0287c609d025cd8ea87829b9c7b3e290 | <ide><path>src/Illuminate/Database/Query/Builder.php
<ide> public function average($column)
<ide> */
<ide> public function aggregate($function, $columns = ['*'])
<ide> {
<del> $results = $this->cloneWithout($this->unions ? [] : ['columns'])
<del> ->cloneWithoutBindings($this->unions ? [] : ['select'])
<add> $results = $this->cloneWithout($this->unions || $this->havings ? [] : ['columns'])
<add> ->cloneWithoutBindings($this->unions || $this->havings ? [] : ['select'])
<ide> ->setAggregate($function, $columns)
<ide> ->get($columns);
<ide>
<ide><path>src/Illuminate/Database/Query/Grammars/Grammar.php
<ide> class Grammar extends BaseGrammar
<ide> */
<ide> public function compileSelect(Builder $query)
<ide> {
<del> if ($query->unions && $query->aggregate) {
<add> if (($query->unions || $query->havings) && $query->aggregate) {
<ide> return $this->compileUnionAggregate($query);
<ide> }
<ide>
<ide><path>tests/Database/DatabaseQueryBuilderTest.php
<ide> public function testUnionAggregate()
<ide> $builder->from('posts')->union($this->getSqlServerBuilder()->from('videos'))->count();
<ide> }
<ide>
<add> public function testHavingAggregate()
<add> {
<add> $expected = 'select count(*) as aggregate from (select (select `count(*)` from `videos` where `posts`.`id` = `videos`.`post_id`) as `videos_count` from `posts` having `videos_count` > ?) as `temp_table`';
<add> $builder = $this->getMySqlBuilder();
<add> $builder->getConnection()->shouldReceive('getDatabaseName');
<add> $builder->getConnection()->shouldReceive('select')->once()->with($expected, [0 => 1], true)->andReturn([['aggregate' => 1]]);
<add> $builder->getProcessor()->shouldReceive('processSelect')->once()->andReturnUsing(function ($builder, $results) {
<add> return $results;
<add> });
<add>
<add> $builder->from('posts')->selectSub(function ($query) {
<add> $query->from('videos')->select('count(*)')->whereColumn('posts.id', '=', 'videos.post_id');
<add> }, 'videos_count')->having('videos_count', '>', 1);
<add> $builder->count();
<add> }
<add>
<ide> public function testSubSelectWhereIns()
<ide> {
<ide> $builder = $this->getBuilder(); | 3 |
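The grammar change above routes an aggregate through the derived-table path whenever a HAVING clause is present, so the original select — including its HAVING — is wrapped as a subquery instead of having its column list replaced. A hedged sketch of that string-level transformation (the function name and the unquoted `temp_table` alias are simplifications for illustration, not Laravel's actual compiler):

```python
def compile_aggregate(inner_sql, has_unions=False, has_havings=False):
    """Mirror of the Builder change: when the query carries a HAVING (or a
    UNION), the aggregate cannot simply replace the column list, so the
    original select is wrapped in a derived table instead."""
    if has_unions or has_havings:
        return f"select count(*) as aggregate from ({inner_sql}) as temp_table"
    # Otherwise the column list is swapped for the aggregate in place.
    return "select count(*) as aggregate " + inner_sql[inner_sql.index("from"):]

inner = ("select (select `count(*)` from `videos` where `posts`.`id` = "
         "`videos`.`post_id`) as `videos_count` from `posts` "
         "having `videos_count` > ?")
```

With `has_havings=True` this produces the same shape as the expected SQL in `testHavingAggregate` above.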
Javascript | Javascript | unify inputdiscretelane with synclane | acde654698659326af0ac4eded4a3ae7cb780a55 | <ide><path>packages/react-dom/src/__tests__/ReactDOMFiberAsync-test.js
<ide> describe('ReactDOMFiberAsync', () => {
<ide> });
<ide>
<ide> // @gate experimental
<del> it('ignores discrete events on a pending removed element', () => {
<add> it('ignores discrete events on a pending removed element', async () => {
<ide> const disableButtonRef = React.createRef();
<ide> const submitButtonRef = React.createRef();
<ide>
<del> let formSubmitted = false;
<del>
<ide> function Form() {
<ide> const [active, setActive] = React.useState(true);
<ide> function disableForm() {
<ide> setActive(false);
<ide> }
<del> function submitForm() {
<del> formSubmitted = true; // This should not get invoked
<del> }
<add>
<ide> return (
<ide> <div>
<ide> <button onClick={disableForm} ref={disableButtonRef}>
<ide> Disable
<ide> </button>
<del> {active ? (
<del> <button onClick={submitForm} ref={submitButtonRef}>
<del> Submit
<del> </button>
<del> ) : null}
<add> {active ? <button ref={submitButtonRef}>Submit</button> : null}
<ide> </div>
<ide> );
<ide> }
<ide>
<ide> const root = ReactDOM.unstable_createRoot(container);
<del> root.render(<Form />);
<del> // Flush
<del> Scheduler.unstable_flushAll();
<add> await act(async () => {
<add> root.render(<Form />);
<add> });
<ide>
<ide> const disableButton = disableButtonRef.current;
<ide> expect(disableButton.tagName).toBe('BUTTON');
<ide>
<add> const submitButton = submitButtonRef.current;
<add> expect(submitButton.tagName).toBe('BUTTON');
<add>
<ide> // Dispatch a click event on the Disable-button.
<ide> const firstEvent = document.createEvent('Event');
<ide> firstEvent.initEvent('click', true, true);
<ide> disableButton.dispatchEvent(firstEvent);
<ide>
<del> // There should now be a pending update to disable the form.
<del>
<del> // This should not have flushed yet since it's in concurrent mode.
<del> const submitButton = submitButtonRef.current;
<del> expect(submitButton.tagName).toBe('BUTTON');
<del>
<del> // In the meantime, we can dispatch a new client event on the submit button.
<del> const secondEvent = document.createEvent('Event');
<del> secondEvent.initEvent('click', true, true);
<del> // This should force the pending update to flush which disables the submit button before the event is invoked.
<del> submitButton.dispatchEvent(secondEvent);
<del>
<del> // Therefore the form should never have been submitted.
<del> expect(formSubmitted).toBe(false);
<del>
<del> expect(submitButtonRef.current).toBe(null);
<add> // The click event is flushed synchronously, even in concurrent mode.
<add> expect(submitButton.current).toBe(undefined);
<ide> });
<ide>
<ide> // @gate experimental
<ide><path>packages/react-dom/src/__tests__/ReactDOMNativeEventHeuristic-test.js
<ide> describe('ReactDOMNativeEventHeuristic-test', () => {
<ide> }
<ide>
<ide> const root = ReactDOM.unstable_createRoot(container);
<del> root.render(<Form />);
<del> // Flush
<del> Scheduler.unstable_flushAll();
<add> await act(() => {
<add> root.render(<Form />);
<add> });
<ide>
<ide> const disableButton = disableButtonRef.current;
<ide> expect(disableButton.tagName).toBe('BUTTON');
<ide> describe('ReactDOMNativeEventHeuristic-test', () => {
<ide> dispatchAndSetCurrentEvent(disableButton, firstEvent),
<ide> ).toErrorDev(['An update to Form inside a test was not wrapped in act']);
<ide>
<del> // There should now be a pending update to disable the form.
<del> // This should not have flushed yet since it's in concurrent mode.
<del> const submitButton = submitButtonRef.current;
<del> expect(submitButton.tagName).toBe('BUTTON');
<del>
<ide> // Discrete events should be flushed in a microtask.
<ide> // Verify that the second button was removed.
<ide> await null;
<ide><path>packages/react-dom/src/events/plugins/__tests__/ChangeEventPlugin-test.js
<ide> describe('ChangeEventPlugin', () => {
<ide> });
<ide>
<ide> // @gate experimental
<del> it('is async for non-input events', async () => {
<add> it('is sync for non-input events', async () => {
<ide> const root = ReactDOM.unstable_createRoot(container);
<ide> let input;
<ide>
<ide> describe('ChangeEventPlugin', () => {
<ide> input.dispatchEvent(
<ide> new Event('click', {bubbles: true, cancelable: true}),
<ide> );
<del> // Nothing should have changed
<del> expect(Scheduler).toHaveYielded([]);
<del> expect(input.value).toBe('initial');
<ide>
<ide> // Flush microtask queue.
<ide> await null;
<ide><path>packages/react-dom/src/events/plugins/__tests__/SimpleEventPlugin-test.js
<ide> describe('SimpleEventPlugin', function() {
<ide> });
<ide>
<ide> // @gate experimental
<del> it('flushes pending interactive work before extracting event handler', () => {
<add> it('flushes pending interactive work before exiting event handler', () => {
<ide> container = document.createElement('div');
<ide> const root = ReactDOM.unstable_createRoot(container);
<ide> document.body.appendChild(container);
<ide> describe('SimpleEventPlugin', function() {
<ide> expect(Scheduler).toHaveYielded([
<ide> // The handler fired
<ide> 'Side-effect',
<del> // but the component did not re-render yet, because it's async
<add> // The component re-rendered synchronously, even in concurrent mode.
<add> 'render button: disabled',
<ide> ]);
<ide>
<ide> // Click the button again
<ide> click();
<ide> expect(Scheduler).toHaveYielded([
<del> // Before handling this second click event, the previous interactive
<del> // update is flushed
<del> 'render button: disabled',
<del> // The event handler was removed from the button, so there's no second
<del> // side-effect
<add> // The event handler was removed from the button, so there's no effect.
<ide> ]);
<ide>
<ide> // The handler should not fire again no matter how many times we
<ide> describe('SimpleEventPlugin', function() {
<ide>
<ide> // Click the button a single time
<ide> click();
<del> // The counter should not have updated yet because it's async
<del> expect(button.textContent).toEqual('Count: 0');
<add> // The counter should update synchronously, even in concurrent mode.
<add> expect(button.textContent).toEqual('Count: 1');
<ide>
<ide> // Click the button many more times
<ide> await TestUtils.act(async () => {
<ide> describe('SimpleEventPlugin', function() {
<ide> button.dispatchEvent(event);
<ide> }
<ide>
<del> // Click the button a single time
<del> click();
<del> // Nothing should flush on the first click.
<del> expect(Scheduler).toHaveYielded([]);
<del> // Click again. This will force the previous discrete update to flush. But
<del> // only the high-pri count will increase.
<add> // Click the button a single time.
<add> // This will flush at the end of the event, even in concurrent mode.
<ide> click();
<ide> expect(Scheduler).toHaveYielded(['High-pri count: 1, Low-pri count: 0']);
<del> expect(button.textContent).toEqual('High-pri count: 1, Low-pri count: 0');
<ide>
<ide> // Click the button many more times
<ide> click();
<ide> describe('SimpleEventPlugin', function() {
<ide> click();
<ide> click();
<ide>
<del> // Flush the remaining work.
<add> // Each update should synchronously flush, even in concurrent mode.
<ide> expect(Scheduler).toHaveYielded([
<ide> 'High-pri count: 2, Low-pri count: 0',
<ide> 'High-pri count: 3, Low-pri count: 0',
<ide> describe('SimpleEventPlugin', function() {
<ide> 'High-pri count: 7, Low-pri count: 0',
<ide> ]);
<ide>
<del> // Flush the microtask queue
<del> await null;
<del>
<del> // At the end, both counters should equal the total number of clicks
<del> expect(Scheduler).toHaveYielded(['High-pri count: 8, Low-pri count: 0']);
<add> // Now flush the scheduler to apply the transition updates.
<add> // At the end, both counters should equal the total number of clicks.
<ide> expect(Scheduler).toFlushAndYield([
<del> 'High-pri count: 8, Low-pri count: 8',
<add> 'High-pri count: 7, Low-pri count: 7',
<ide> ]);
<ide>
<del> expect(button.textContent).toEqual('High-pri count: 8, Low-pri count: 8');
<add> expect(button.textContent).toEqual('High-pri count: 7, Low-pri count: 7');
<ide> });
<ide> });
<ide>
<ide><path>packages/react-reconciler/src/ReactFiberLane.new.js
<ide> export function findUpdateLane(lanePriority: LanePriority): Lane {
<ide> case SyncBatchedLanePriority:
<ide> return SyncBatchedLane;
<ide> case InputDiscreteLanePriority:
<del> return InputDiscreteLane;
<add> return SyncLane;
<ide> case InputContinuousLanePriority:
<ide> return InputContinuousLane;
<ide> case DefaultLanePriority:
<ide><path>packages/react-reconciler/src/ReactFiberLane.old.js
<ide> export function findUpdateLane(lanePriority: LanePriority): Lane {
<ide> case SyncBatchedLanePriority:
<ide> return SyncBatchedLane;
<ide> case InputDiscreteLanePriority:
<del> return InputDiscreteLane;
<add> return SyncLane;
<ide> case InputContinuousLanePriority:
<ide> return InputContinuousLane;
<ide> case DefaultLanePriority:
<ide><path>packages/react-reconciler/src/ReactFiberWorkLoop.new.js
<ide> import {
<ide> clearContainer,
<ide> getCurrentEventPriority,
<ide> supportsMicrotasks,
<del> scheduleMicrotask,
<ide> } from './ReactFiberHostConfig';
<ide>
<ide> import {
<ide> function ensureRootIsScheduled(root: FiberRoot, currentTime: number) {
<ide> ImmediateSchedulerPriority,
<ide> performSyncWorkOnRoot.bind(null, root),
<ide> );
<del> } else if (
<del> supportsMicrotasks &&
<del> newCallbackPriority === InputDiscreteLanePriority
<del> ) {
<del> scheduleMicrotask(performSyncWorkOnRoot.bind(null, root));
<del> newCallbackNode = null;
<ide> } else {
<ide> const schedulerPriorityLevel = lanePriorityToSchedulerPriority(
<ide> newCallbackPriority,
<ide><path>packages/react-reconciler/src/ReactFiberWorkLoop.old.js
<ide> import {
<ide> clearContainer,
<ide> getCurrentEventPriority,
<ide> supportsMicrotasks,
<del> scheduleMicrotask,
<ide> } from './ReactFiberHostConfig';
<ide>
<ide> import {
<ide> function ensureRootIsScheduled(root: FiberRoot, currentTime: number) {
<ide> ImmediateSchedulerPriority,
<ide> performSyncWorkOnRoot.bind(null, root),
<ide> );
<del> } else if (
<del> supportsMicrotasks &&
<del> newCallbackPriority === InputDiscreteLanePriority
<del> ) {
<del> scheduleMicrotask(performSyncWorkOnRoot.bind(null, root));
<del> newCallbackNode = null;
<ide> } else {
<ide> const schedulerPriorityLevel = lanePriorityToSchedulerPriority(
<ide> newCallbackPriority,
<ide><path>packages/react-reconciler/src/__tests__/SchedulingProfilerLabels-test.internal.js
<ide> describe('SchedulingProfiler labels', () => {
<ide> targetRef.current.click();
<ide> });
<ide> expect(clearedMarks).toContain(
<del> `--schedule-state-update-${formatLanes(
<del> ReactFiberLane.InputDiscreteLane,
<del> )}-App`,
<add> `--schedule-state-update-${formatLanes(ReactFiberLane.SyncLane)}-App`,
<ide> );
<ide> });
<ide> | 9 |
Text | Text | update image references | 1d6e3c73cfd9c2bce6682274f08a09fa2ff98b0b | <ide><path>docs/your-first-package.md
<ide> ol.entries .hide-me {
<ide> Refresh Atom, and run the `changer` command. You'll see all the non-changed
<ide> files disappear from the tree. Success!
<ide>
<del>
<add>![Changer File View][changer-file-view]
<ide>
<ide> There are a number of ways you can get the list back; let's just naively iterate
<ide> over the same elements and remove the class:
<ide> rootView.vertical.append(this)
<ide> If you refresh Atom and hit the key command, you'll see a box appear right underneath
<ide> the editor:
<ide>
<del>
<add>![Changer Panel][changer-panel-append]
<ide>
<ide> As you might have guessed, `rootView.vertical.append` tells Atom to append `this`
<ide> item (_i.e._, whatever is defined by`@content`) _vertically_ to the editor. If
<ide> else
<ide> for file in modifiedFiles
<ide> stat = fs.lstatSync(file)
<ide> mtime = stat.mtime
<del> @modifiedFilesList.append("<li>#{file} - Modified at #{mtime}")
<add> @modifiedFilesList.append("<li>#{file} - Modified at #{mtime}")
<ide> rootView.vertical.append(this)
<ide> ```
<ide>
<ide> When you toggle the modified files list, your pane is now populated with the
<ide> filenames and modified times of files in your project:
<ide>
<del>
<add>![Changer Panel][changer-panel-timestamps]
<ide>
<ide> You might notice that subsequent calls to this command reduplicate information.
<ide> We could provide an elegant way of rechecking files already in the list, but for
<ide> Using theme variables ensures that packages look great alongside any theme.
<ide> [node]: http://nodejs.org/
<ide> [path]: http://nodejs.org/docs/latest/api/path.html
<ide> [theme-vars]: ../theme-variables.html
<add>[changer-file-view]: https://f.cloud.github.com/assets/69169/1441187/d7a7cb46-41a7-11e3-8128-d93f70a5d5c1.png
<add>[changer-panel-append]: https://f.cloud.github.com/assets/69169/1441189/db0c74da-41a7-11e3-8286-b82dd9190c34.png
<add>[changer-panel-timestamps]: https://f.cloud.github.com/assets/69169/1441190/dcc8eeb6-41a7-11e3-830f-1f1b33072fcd.png | 1 |
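The CoffeeScript in the patch context builds one `<li>` entry per modified file from its `lstat` mtime. The same formatting logic, sketched in Python (the helper name and the injectable `stat` hook are assumptions for illustration):

```python
import os
import time

def modified_file_entries(paths, stat=os.lstat):
    """Build the list-item strings the Changer pane renders: one
    '<li>PATH - Modified at MTIME' entry per file."""
    entries = []
    for path in paths:
        mtime = time.ctime(stat(path).st_mtime)
        entries.append(f"<li>{path} - Modified at {mtime}")
    return entries
```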
Python | Python | move django.contrib.auth import out of compat | d4d9cc1d53fa38fdf6b38e2a64b4aa3a71a9760c | <ide><path>rest_framework/compat.py
<ide> import django
<ide> from django.apps import apps
<ide> from django.conf import settings
<del>from django.contrib.auth import views
<ide> from django.core.exceptions import ImproperlyConfigured, ValidationError
<ide> from django.core.validators import \
<ide> MaxLengthValidator as DjangoMaxLengthValidator
<ide> def authenticate(request=None, **credentials):
<ide> else:
<ide> return authenticate(request=request, **credentials)
<ide>
<del>if django.VERSION < (1, 11):
<del> login = views.login
<del> login_kwargs = {'template_name': 'rest_framework/login.html'}
<del> logout = views.logout
<del>else:
<del> login = views.LoginView.as_view(template_name='rest_framework/login.html')
<del> login_kwargs = {}
<del> logout = views.LogoutView.as_view()
<ide><path>rest_framework/urls.py
<ide> from __future__ import unicode_literals
<ide>
<ide> from django.conf.urls import url
<add>import django
<add>from django.contrib.auth import views
<add>
<add>if django.VERSION < (1, 11):
<add> login = views.login
<add> login_kwargs = {'template_name': 'rest_framework/login.html'}
<add> logout = views.logout
<add>else:
<add> login = views.LoginView.as_view(template_name='rest_framework/login.html')
<add> login_kwargs = {}
<add> logout = views.LogoutView.as_view()
<ide>
<del>from rest_framework.compat import login, login_kwargs, logout
<ide>
<ide> app_name = 'rest_framework'
<ide> urlpatterns = [ | 2 |
Javascript | Javascript | fix legacy suspense false positive | 269dd6ec5da85fc5ca819cfa010ce60dd1c83ec6 | <ide><path>packages/react-reconciler/src/ReactFiberHooks.new.js
<ide> export function renderWithHooks<Props, SecondArg>(
<ide> if (
<ide> current !== null &&
<ide> (current.flags & StaticMaskEffect) !==
<del> (workInProgress.flags & StaticMaskEffect)
<add> (workInProgress.flags & StaticMaskEffect) &&
<add> // Disable this warning in legacy mode, because legacy Suspense is weird
<add> // and creates false positives. To make this work in legacy mode, we'd
<add> // need to mark fibers that commit in an incomplete state, somehow. For
<add> // now I'll disable the warning that most of the bugs that would trigger
<add> // it are either exclusive to concurrent mode or exist in both.
<add> (current.mode & ConcurrentMode) !== NoMode
<ide> ) {
<ide> console.error(
<ide> 'Internal React error: Expected static flag was missing. Please ' +
<ide><path>packages/react-reconciler/src/ReactFiberHooks.old.js
<ide> export function renderWithHooks<Props, SecondArg>(
<ide> if (
<ide> current !== null &&
<ide> (current.flags & StaticMaskEffect) !==
<del> (workInProgress.flags & StaticMaskEffect)
<add> (workInProgress.flags & StaticMaskEffect) &&
<add> // Disable this warning in legacy mode, because legacy Suspense is weird
<add> // and creates false positives. To make this work in legacy mode, we'd
<add> // need to mark fibers that commit in an incomplete state, somehow. For
<add> // now I'll disable the warning that most of the bugs that would trigger
<add> // it are either exclusive to concurrent mode or exist in both.
<add> (current.mode & ConcurrentMode) !== NoMode
<ide> ) {
<ide> console.error(
<ide> 'Internal React error: Expected static flag was missing. Please ' +
<ide><path>packages/react-reconciler/src/__tests__/ReactSubtreeFlagsWarning-test.js
<add>let React;
<add>let ReactNoop;
<add>let Scheduler;
<add>let Suspense;
<add>let useEffect;
<add>let getCacheForType;
<add>
<add>let caches;
<add>let seededCache;
<add>
<add>describe('ReactSuspenseWithNoopRenderer', () => {
<add> beforeEach(() => {
<add> jest.resetModules();
<add>
<add> React = require('react');
<add> ReactNoop = require('react-noop-renderer');
<add> Scheduler = require('scheduler');
<add> Suspense = React.Suspense;
<add> useEffect = React.useEffect;
<add>
<add> getCacheForType = React.unstable_getCacheForType;
<add>
<add> caches = [];
<add> seededCache = null;
<add> });
<add>
<add> function createTextCache() {
<add> if (seededCache !== null) {
<add> // Trick to seed a cache before it exists.
<add> // TODO: Need a built-in API to seed data before the initial render (i.e.
<add> // not a refresh because nothing has mounted yet).
<add> const cache = seededCache;
<add> seededCache = null;
<add> return cache;
<add> }
<add>
<add> const data = new Map();
<add> const version = caches.length + 1;
<add> const cache = {
<add> version,
<add> data,
<add> resolve(text) {
<add> const record = data.get(text);
<add> if (record === undefined) {
<add> const newRecord = {
<add> status: 'resolved',
<add> value: text,
<add> };
<add> data.set(text, newRecord);
<add> } else if (record.status === 'pending') {
<add> const thenable = record.value;
<add> record.status = 'resolved';
<add> record.value = text;
<add> thenable.pings.forEach(t => t());
<add> }
<add> },
<add> reject(text, error) {
<add> const record = data.get(text);
<add> if (record === undefined) {
<add> const newRecord = {
<add> status: 'rejected',
<add> value: error,
<add> };
<add> data.set(text, newRecord);
<add> } else if (record.status === 'pending') {
<add> const thenable = record.value;
<add> record.status = 'rejected';
<add> record.value = error;
<add> thenable.pings.forEach(t => t());
<add> }
<add> },
<add> };
<add> caches.push(cache);
<add> return cache;
<add> }
<add>
<add> function readText(text) {
<add> const textCache = getCacheForType(createTextCache);
<add> const record = textCache.data.get(text);
<add> if (record !== undefined) {
<add> switch (record.status) {
<add> case 'pending':
<add> Scheduler.unstable_yieldValue(`Suspend! [${text}]`);
<add> throw record.value;
<add> case 'rejected':
<add> Scheduler.unstable_yieldValue(`Error! [${text}]`);
<add> throw record.value;
<add> case 'resolved':
<add> return textCache.version;
<add> }
<add> } else {
<add> Scheduler.unstable_yieldValue(`Suspend! [${text}]`);
<add>
<add> const thenable = {
<add> pings: [],
<add> then(resolve) {
<add> if (newRecord.status === 'pending') {
<add> thenable.pings.push(resolve);
<add> } else {
<add> Promise.resolve().then(() => resolve(newRecord.value));
<add> }
<add> },
<add> };
<add>
<add> const newRecord = {
<add> status: 'pending',
<add> value: thenable,
<add> };
<add> textCache.data.set(text, newRecord);
<add>
<add> throw thenable;
<add> }
<add> }
<add>
<add> function resolveMostRecentTextCache(text) {
<add> if (caches.length === 0) {
<add> throw Error('Cache does not exist.');
<add> } else {
<add> // Resolve the most recently created cache. An older cache can by
<add> // resolved with `caches[index].resolve(text)`.
<add> caches[caches.length - 1].resolve(text);
<add> }
<add> }
<add>
<add> const resolveText = resolveMostRecentTextCache;
<add>
<add> // @gate experimental
<add> it('regression: false positive for legacy suspense', async () => {
<add> // Wrapping in memo because regular function components go through the
<add> // mountIndeterminateComponent path, which acts like there's no `current`
<add> // fiber even though there is. `memo` is not indeterminate, so it goes
<add> // through the update path.
<add> const Child = React.memo(({text}) => {
<add> // If text hasn't resolved, this will throw and exit before the passive
<add> // static effect flag is added by the useEffect call below.
<add> readText(text);
<add>
<add> useEffect(() => {
<add> Scheduler.unstable_yieldValue('Effect');
<add> }, []);
<add>
<add> Scheduler.unstable_yieldValue(text);
<add> return text;
<add> });
<add>
<add> function App() {
<add> return (
<add> <Suspense fallback="Loading...">
<add> <Child text="Async" />
<add> </Suspense>
<add> );
<add> }
<add>
<add> const root = ReactNoop.createLegacyRoot(null);
<add>
<add> // On initial mount, the suspended component is committed in an incomplete
<add> // state, without a passive static effect flag.
<add> await ReactNoop.act(async () => {
<add> root.render(<App />);
<add> });
<add> expect(Scheduler).toHaveYielded(['Suspend! [Async]']);
<add> expect(root).toMatchRenderedOutput('Loading...');
<add>
<add> // When the promise resolves, a passive static effect flag is added. In the
<add> // regression, the "missing expected static flag" would fire, because the
<add> // previous fiber did not have one.
<add> await ReactNoop.act(async () => {
<add> resolveText('Async');
<add> });
<add> expect(Scheduler).toHaveYielded(['Async', 'Effect']);
<add> expect(root).toMatchRenderedOutput('Async');
<add> });
<add>}); | 3 |
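The regression test above hand-rolls a Suspense-style cache: reading a pending record throws a thenable, and resolving the record pings every callback registered through `then`. A sketch of that record lifecycle in Python (an exception stands in for the thrown thenable; the names are assumptions, and the resolved value is simplified to the text itself rather than a cache version):

```python
class SuspendSignal(Exception):
    """Stands in for throwing a thenable from render."""
    def __init__(self, thenable):
        self.thenable = thenable

class Thenable:
    def __init__(self):
        self.pings = []
    def then(self, callback):
        self.pings.append(callback)

class TextCache:
    def __init__(self):
        self.data = {}
    def read(self, text):
        rec = self.data.get(text)
        if rec is None:
            rec = {"status": "pending", "value": Thenable()}
            self.data[text] = rec
            raise SuspendSignal(rec["value"])   # component suspends
        if rec["status"] == "pending":
            raise SuspendSignal(rec["value"])   # still waiting
        if rec["status"] == "rejected":
            raise rec["value"]
        return rec["value"]                     # resolved
    def resolve(self, text):
        rec = self.data.setdefault(text, {"status": "resolved", "value": text})
        if rec["status"] == "pending":
            thenable = rec["value"]
            rec.update(status="resolved", value=text)
            for ping in thenable.pings:
                ping()                          # wake suspended readers
```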
Java | Java | add getprincipal to serverwebexchange | ec1eb14280e2a5b4584504938d657c63c938a0d0 | <ide><path>spring-web/src/main/java/org/springframework/web/server/ServerWebExchange.java
<ide>
<ide> package org.springframework.web.server;
<ide>
<add>import java.security.Principal;
<ide> import java.time.Instant;
<ide> import java.util.Map;
<ide> import java.util.Optional;
<ide> public interface ServerWebExchange {
<ide> */
<ide> Mono<WebSession> getSession();
<ide>
<add> /**
<add> * Return the authenticated user for the request, if any.
<add> */
<add> <T extends Principal> Optional<T> getPrincipal();
<add>
<ide> /**
<ide> * Returns {@code true} if the one of the {@code checkNotModified} methods
<ide> * in this contract were used and they returned true.
<ide><path>spring-web/src/main/java/org/springframework/web/server/ServerWebExchangeDecorator.java
<add>/*
<add> * Copyright 2002-2016 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>package org.springframework.web.server;
<add>
<add>import java.security.Principal;
<add>import java.time.Instant;
<add>import java.util.Map;
<add>import java.util.Optional;
<add>
<add>import reactor.core.publisher.Mono;
<add>
<add>import org.springframework.http.server.reactive.ServerHttpRequest;
<add>import org.springframework.http.server.reactive.ServerHttpResponse;
<add>import org.springframework.util.Assert;
<add>
<add>/**
<add> * Wraps another {@link ServerWebExchange} and delegates all methods to it.
<add> * Sub-classes can override specific methods, e.g. {@link #getPrincipal()} to
<add> * return the authenticated user for the request.
<add> *
<add> * @author Rossen Stoyanchev
<add> */
<add>public class ServerWebExchangeDecorator implements ServerWebExchange {
<add>
<add> private final ServerWebExchange delegate;
<add>
<add>
<add> public ServerWebExchangeDecorator(ServerWebExchange delegate) {
<add> Assert.notNull(delegate, "'delegate' is required.");
<add> this.delegate = delegate;
<add> }
<add>
<add>
<add> public ServerWebExchange getDelegate() {
<add> return this.delegate;
<add> }
<add>
<add> // ServerWebExchange delegation methods...
<add>
<add> @Override
<add> public ServerHttpRequest getRequest() {
<add> return this.getDelegate().getRequest();
<add> }
<add>
<add> @Override
<add> public ServerHttpResponse getResponse() {
<add> return this.getDelegate().getResponse();
<add> }
<add>
<add> @Override
<add> public Map<String, Object> getAttributes() {
<add> return this.getDelegate().getAttributes();
<add> }
<add>
<add> @Override
<add> public <T> Optional<T> getAttribute(String name) {
<add> return this.getDelegate().getAttribute(name);
<add> }
<add>
<add> @Override
<add> public Mono<WebSession> getSession() {
<add> return this.getDelegate().getSession();
<add> }
<add>
<add> @Override
<add> public <T extends Principal> Optional<T> getPrincipal() {
<add> return this.getDelegate().getPrincipal();
<add> }
<add>
<add> @Override
<add> public boolean isNotModified() {
<add> return this.getDelegate().isNotModified();
<add> }
<add>
<add> @Override
<add> public boolean checkNotModified(Instant lastModified) {
<add> return this.getDelegate().checkNotModified(lastModified);
<add> }
<add>
<add> @Override
<add> public boolean checkNotModified(String etag) {
<add> return this.getDelegate().checkNotModified(etag);
<add> }
<add>
<add> @Override
<add> public boolean checkNotModified(String etag, Instant lastModified) {
<add> return this.getDelegate().checkNotModified(etag, lastModified);
<add> }
<add>
<add>
<add> @Override
<add> public String toString() {
<add> return getClass().getSimpleName() + " [delegate=" + getDelegate() + "]";
<add> }
<add>
<add>}
<ide><path>spring-web/src/main/java/org/springframework/web/server/adapter/DefaultServerWebExchange.java
<ide>
<ide> package org.springframework.web.server.adapter;
<ide>
<add>import java.security.Principal;
<ide> import java.time.Instant;
<ide> import java.time.temporal.ChronoUnit;
<ide> import java.util.Arrays;
<ide> public Mono<WebSession> getSession() {
<ide> return this.sessionMono;
<ide> }
<ide>
<add> @Override
<add> public <T extends Principal> Optional<T> getPrincipal() {
<add> return Optional.empty();
<add> }
<add>
<ide> @Override
<ide> public boolean isNotModified() {
<ide> return this.notModified; | 3 |
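`ServerWebExchangeDecorator` delegates every method so that, for example, a security layer can wrap an exchange and override only `getPrincipal()` while all other calls fall through to the delegate. The same pattern, sketched in Python against a minimal stand-in interface (class and method names here are assumptions, not Spring's API):

```python
class Exchange:
    """Minimal stand-in for ServerWebExchange."""
    def get_principal(self):
        return None                      # default exchange: no authenticated user

class ExchangeDecorator(Exchange):
    """Wraps another exchange and delegates everything to it."""
    def __init__(self, delegate):
        if delegate is None:
            raise ValueError("'delegate' is required.")
        self.delegate = delegate
    def get_principal(self):             # delegate by default
        return self.delegate.get_principal()

class AuthenticatedExchange(ExchangeDecorator):
    """Override a single method, keep everything else delegated."""
    def __init__(self, delegate, principal):
        super().__init__(delegate)
        self._principal = principal
    def get_principal(self):
        return self._principal
```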
Python | Python | fix model call | 22a00338e8b51a714955ea6ba4a03c6d3cf86b66 | <ide><path>official/resnet/cifar10_main.py
<ide> def __init__(self, resnet_size, data_format=None, num_classes=_NUM_CLASSES,
<ide> conv_stride=1,
<ide> first_pool_size=None,
<ide> first_pool_stride=None,
<del> second_pool_size=8,
<del> second_pool_stride=1,
<ide> block_sizes=[num_blocks] * 3,
<ide> block_strides=[1, 2, 2],
<ide> final_size=64,
<ide><path>official/resnet/imagenet_main.py
<ide> def __init__(self, resnet_size, data_format=None, num_classes=_NUM_CLASSES,
<ide> conv_stride=2,
<ide> first_pool_size=3,
<ide> first_pool_stride=2,
<del> second_pool_size=7,
<del> second_pool_stride=1,
<ide> block_sizes=_get_block_sizes(resnet_size),
<ide> block_strides=[1, 2, 2, 2],
<ide> final_size=final_size, | 2 |
Java | Java | add refcount with count & disconnect timeout | cedfc538d05f8bdc5bfc6c644e258bdad946333f | <ide><path>src/main/java/io/reactivex/flowables/ConnectableFlowable.java
<ide>
<ide> package io.reactivex.flowables;
<ide>
<del>import io.reactivex.annotations.NonNull;
<add>import java.util.concurrent.TimeUnit;
<add>
<ide> import org.reactivestreams.Subscriber;
<ide>
<del>import io.reactivex.Flowable;
<add>import io.reactivex.*;
<add>import io.reactivex.annotations.*;
<ide> import io.reactivex.disposables.Disposable;
<ide> import io.reactivex.functions.Consumer;
<del>import io.reactivex.internal.functions.Functions;
<add>import io.reactivex.internal.functions.*;
<ide> import io.reactivex.internal.operators.flowable.*;
<ide> import io.reactivex.internal.util.ConnectConsumer;
<ide> import io.reactivex.plugins.RxJavaPlugins;
<add>import io.reactivex.schedulers.Schedulers;
<ide>
<ide> /**
<ide> * A {@code ConnectableFlowable} resembles an ordinary {@link Flowable}, except that it does not begin
<ide> public final Disposable connect() {
<ide> /**
<ide> * Returns a {@code Flowable} that stays connected to this {@code ConnectableFlowable} as long as there
<ide> * is at least one subscription to this {@code ConnectableFlowable}.
<del> *
<add> * <dl>
<add> * <dt><b>Backpressure:</b></dt>
<add> * <dd>The operator itself doesn't interfere with backpressure which is determined by the upstream
<add> * {@code ConnectableFlowable}'s backpressure behavior.</dd>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>This {@code refCount} overload does not operate on any particular {@link Scheduler}.</dd>
<add> * </dl>
<ide> * @return a {@link Flowable}
<ide> * @see <a href="http://reactivex.io/documentation/operators/refcount.html">ReactiveX documentation: RefCount</a>
<add> * @see #refCount(int)
<add> * @see #refCount(long, TimeUnit)
<add> * @see #refCount(int, long, TimeUnit)
<ide> */
<ide> @NonNull
<add> @CheckReturnValue
<add> @SchedulerSupport(SchedulerSupport.NONE)
<add> @BackpressureSupport(BackpressureKind.PASS_THROUGH)
<ide> public Flowable<T> refCount() {
<ide> return RxJavaPlugins.onAssembly(new FlowableRefCount<T>(this));
<ide> }
<ide>
<add> /**
<add> * Connects to the upstream {@code ConnectableFlowable} if the number of subscribed
<add> * subscriber reaches the specified count and disconnect if all subscribers have unsubscribed.
<add> * <dl>
<add> * <dt><b>Backpressure:</b></dt>
<add> * <dd>The operator itself doesn't interfere with backpressure which is determined by the upstream
<add> * {@code ConnectableFlowable}'s backpressure behavior.</dd>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>This {@code refCount} overload does not operate on any particular {@link Scheduler}.</dd>
<add> * </dl>
<add> * @param subscriberCount the number of subscribers required to connect to the upstream
<add> * @return the new Flowable instance
<add> * @since 2.1.14 - experimental
<add> */
<add> @CheckReturnValue
<add> @Experimental
<add> @SchedulerSupport(SchedulerSupport.NONE)
<add> @BackpressureSupport(BackpressureKind.PASS_THROUGH)
<add> public final Flowable<T> refCount(int subscriberCount) {
<add> return refCount(subscriberCount, 0, TimeUnit.NANOSECONDS, Schedulers.trampoline());
<add> }
<add>
<add> /**
<add> * Connects to the upstream {@code ConnectableFlowable} if the number of subscribed
<add> * subscriber reaches 1 and disconnect after the specified
<add> * timeout if all subscribers have unsubscribed.
<add> * <dl>
<add> * <dt><b>Backpressure:</b></dt>
<add> * <dd>The operator itself doesn't interfere with backpressure which is determined by the upstream
<add> * {@code ConnectableFlowable}'s backpressure behavior.</dd>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>This {@code refCount} overload operates on the {@code computation} {@link Scheduler}.</dd>
<add> * </dl>
<add> * @param timeout the time to wait before disconnecting after all subscribers unsubscribed
<add> * @param unit the time unit of the timeout
<add> * @return the new Flowable instance
<add> * @since 2.1.14 - experimental
<add> * @see #refCount(long, TimeUnit, Scheduler)
<add> */
<add> @CheckReturnValue
<add> @Experimental
<add> @SchedulerSupport(SchedulerSupport.COMPUTATION)
<add> @BackpressureSupport(BackpressureKind.PASS_THROUGH)
<add> public final Flowable<T> refCount(long timeout, TimeUnit unit) {
<add> return refCount(1, timeout, unit, Schedulers.computation());
<add> }
<add>
<add> /**
<add> * Connects to the upstream {@code ConnectableFlowable} if the number of subscribed
<add> * subscriber reaches 1 and disconnect after the specified
<add> * timeout if all subscribers have unsubscribed.
<add> * <dl>
<add> * <dt><b>Backpressure:</b></dt>
<add> * <dd>The operator itself doesn't interfere with backpressure which is determined by the upstream
<add> * {@code ConnectableFlowable}'s backpressure behavior.</dd>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>This {@code refCount} overload operates on the specified {@link Scheduler}.</dd>
<add> * </dl>
<add> * @param timeout the time to wait before disconnecting after all subscribers unsubscribed
<add> * @param unit the time unit of the timeout
<add> * @param scheduler the target scheduler to wait on before disconnecting
<add> * @return the new Flowable instance
<add> * @since 2.1.14 - experimental
<add> */
<add> @CheckReturnValue
<add> @Experimental
<add> @SchedulerSupport(SchedulerSupport.CUSTOM)
<add> @BackpressureSupport(BackpressureKind.PASS_THROUGH)
<add> public final Flowable<T> refCount(long timeout, TimeUnit unit, Scheduler scheduler) {
<add> return refCount(1, timeout, unit, scheduler);
<add> }
<add>
<add> /**
<add> * Connects to the upstream {@code ConnectableFlowable} if the number of subscribed
<add> * subscriber reaches the specified count and disconnect after the specified
<add> * timeout if all subscribers have unsubscribed.
<add> * <dl>
<add> * <dt><b>Backpressure:</b></dt>
<add> * <dd>The operator itself doesn't interfere with backpressure which is determined by the upstream
<add> * {@code ConnectableFlowable}'s backpressure behavior.</dd>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>This {@code refCount} overload operates on the {@code computation} {@link Scheduler}.</dd>
<add> * </dl>
<add> * @param subscriberCount the number of subscribers required to connect to the upstream
<add> * @param timeout the time to wait before disconnecting after all subscribers unsubscribed
<add> * @param unit the time unit of the timeout
<add> * @return the new Flowable instance
<add> * @since 2.1.14 - experimental
<add> * @see #refCount(int, long, TimeUnit, Scheduler)
<add> */
<add> @CheckReturnValue
<add> @Experimental
<add> @SchedulerSupport(SchedulerSupport.COMPUTATION)
<add> @BackpressureSupport(BackpressureKind.PASS_THROUGH)
<add> public final Flowable<T> refCount(int subscriberCount, long timeout, TimeUnit unit) {
<add> return refCount(subscriberCount, timeout, unit, Schedulers.computation());
<add> }
<add>
<add> /**
<add> * Connects to the upstream {@code ConnectableFlowable} if the number of subscribed
<add> * subscriber reaches the specified count and disconnect after the specified
<add> * timeout if all subscribers have unsubscribed.
<add> * <dl>
<add> * <dt><b>Backpressure:</b></dt>
<add> * <dd>The operator itself doesn't interfere with backpressure which is determined by the upstream
<add> * {@code ConnectableFlowable}'s backpressure behavior.</dd>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>This {@code refCount} overload operates on the specified {@link Scheduler}.</dd>
<add> * </dl>
<add> * @param subscriberCount the number of subscribers required to connect to the upstream
<add> * @param timeout the time to wait before disconnecting after all subscribers unsubscribed
<add> * @param unit the time unit of the timeout
<add> * @param scheduler the target scheduler to wait on before disconnecting
<add> * @return the new Flowable instance
<add> * @since 2.1.14 - experimental
<add> */
<add> @CheckReturnValue
<add> @Experimental
<add> @SchedulerSupport(SchedulerSupport.CUSTOM)
<add> @BackpressureSupport(BackpressureKind.PASS_THROUGH)
<add> public final Flowable<T> refCount(int subscriberCount, long timeout, TimeUnit unit, Scheduler scheduler) {
<add> ObjectHelper.verifyPositive(subscriberCount, "subscriberCount");
<add> ObjectHelper.requireNonNull(unit, "unit is null");
<add> ObjectHelper.requireNonNull(scheduler, "scheduler is null");
<add> return RxJavaPlugins.onAssembly(new FlowableRefCount<T>(this, subscriberCount, timeout, unit, scheduler));
<add> }
<add>
<ide> /**
<ide> * Returns a Flowable that automatically connects (at most once) to this ConnectableFlowable
<ide> * when the first Subscriber subscribes.
<ide><path>src/main/java/io/reactivex/observables/ConnectableObservable.java
<ide>
<ide> package io.reactivex.observables;
<ide>
<del>import io.reactivex.annotations.NonNull;
<add>import java.util.concurrent.TimeUnit;
<ide>
<ide> import io.reactivex.*;
<add>import io.reactivex.annotations.*;
<ide> import io.reactivex.disposables.Disposable;
<ide> import io.reactivex.functions.Consumer;
<del>import io.reactivex.internal.functions.Functions;
<add>import io.reactivex.internal.functions.*;
<ide> import io.reactivex.internal.operators.observable.*;
<ide> import io.reactivex.internal.util.ConnectConsumer;
<ide> import io.reactivex.plugins.RxJavaPlugins;
<add>import io.reactivex.schedulers.Schedulers;
<ide>
<ide> /**
<ide> * A {@code ConnectableObservable} resembles an ordinary {@link Observable}, except that it does not begin
<ide> public final Disposable connect() {
<ide> /**
<ide> * Returns an {@code Observable} that stays connected to this {@code ConnectableObservable} as long as there
<ide> * is at least one subscription to this {@code ConnectableObservable}.
<del> *
<add> * <dl>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>This {@code refCount} overload does not operate on any particular {@link Scheduler}.</dd>
<add> * </dl>
<ide> * @return an {@link Observable}
<ide> * @see <a href="http://reactivex.io/documentation/operators/refcount.html">ReactiveX documentation: RefCount</a>
<add> * @see #refCount(int)
<add> * @see #refCount(long, TimeUnit)
<add> * @see #refCount(int, long, TimeUnit)
<ide> */
<ide> @NonNull
<add> @CheckReturnValue
<add> @SchedulerSupport(SchedulerSupport.NONE)
<ide> public Observable<T> refCount() {
<ide> return RxJavaPlugins.onAssembly(new ObservableRefCount<T>(this));
<ide> }
<ide>
<add> /**
<add> * Connects to the upstream {@code ConnectableObservable} if the number of subscribed
<add> * subscriber reaches the specified count and disconnect if all subscribers have unsubscribed.
<add> * <dl>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>This {@code refCount} overload does not operate on any particular {@link Scheduler}.</dd>
<add> * </dl>
<add> * @param subscriberCount the number of subscribers required to connect to the upstream
<add> * @return the new Observable instance
<add> * @since 2.1.14 - experimental
<add> */
<add> @CheckReturnValue
<add> @Experimental
<add> @SchedulerSupport(SchedulerSupport.NONE)
<add> public final Observable<T> refCount(int subscriberCount) {
<add> return refCount(subscriberCount, 0, TimeUnit.NANOSECONDS, Schedulers.trampoline());
<add> }
<add>
<add> /**
<add> * Connects to the upstream {@code ConnectableObservable} if the number of subscribed
<add> * subscriber reaches 1 and disconnect after the specified
<add> * timeout if all subscribers have unsubscribed.
<add> * <dl>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>This {@code refCount} overload operates on the {@code computation} {@link Scheduler}.</dd>
<add> * </dl>
<add> * @param timeout the time to wait before disconnecting after all subscribers unsubscribed
<add> * @param unit the time unit of the timeout
<add> * @return the new Observable instance
<add> * @since 2.1.14 - experimental
<add> * @see #refCount(long, TimeUnit, Scheduler)
<add> */
<add> @CheckReturnValue
<add> @Experimental
<add> @SchedulerSupport(SchedulerSupport.COMPUTATION)
<add> public final Observable<T> refCount(long timeout, TimeUnit unit) {
<add> return refCount(1, timeout, unit, Schedulers.computation());
<add> }
<add>
<add> /**
<add> * Connects to the upstream {@code ConnectableObservable} if the number of subscribed
<add> * subscriber reaches 1 and disconnect after the specified
<add> * timeout if all subscribers have unsubscribed.
<add> * <dl>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>This {@code refCount} overload operates on the specified {@link Scheduler}.</dd>
<add> * </dl>
<add> * @param timeout the time to wait before disconnecting after all subscribers unsubscribed
<add> * @param unit the time unit of the timeout
<add> * @param scheduler the target scheduler to wait on before disconnecting
<add> * @return the new Observable instance
<add> * @since 2.1.14 - experimental
<add> */
<add> @CheckReturnValue
<add> @Experimental
<add> @SchedulerSupport(SchedulerSupport.CUSTOM)
<add> public final Observable<T> refCount(long timeout, TimeUnit unit, Scheduler scheduler) {
<add> return refCount(1, timeout, unit, scheduler);
<add> }
<add>
<add> /**
<add> * Connects to the upstream {@code ConnectableObservable} if the number of subscribed
<add> * subscriber reaches the specified count and disconnect after the specified
<add> * timeout if all subscribers have unsubscribed.
<add> * <dl>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>This {@code refCount} overload operates on the {@code computation} {@link Scheduler}.</dd>
<add> * </dl>
<add> * @param subscriberCount the number of subscribers required to connect to the upstream
<add> * @param timeout the time to wait before disconnecting after all subscribers unsubscribed
<add> * @param unit the time unit of the timeout
<add> * @return the new Observable instance
<add> * @since 2.1.14 - experimental
<add> * @see #refCount(int, long, TimeUnit, Scheduler)
<add> */
<add> @CheckReturnValue
<add> @Experimental
<add> @SchedulerSupport(SchedulerSupport.COMPUTATION)
<add> public final Observable<T> refCount(int subscriberCount, long timeout, TimeUnit unit) {
<add> return refCount(subscriberCount, timeout, unit, Schedulers.computation());
<add> }
<add>
<add> /**
<add> * Connects to the upstream {@code ConnectableObservable} if the number of subscribed
<add> * subscriber reaches the specified count and disconnect after the specified
<add> * timeout if all subscribers have unsubscribed.
<add> * <dl>
<add> * <dt><b>Scheduler:</b></dt>
<add> * <dd>This {@code refCount} overload operates on the specified {@link Scheduler}.</dd>
<add> * </dl>
<add> * @param subscriberCount the number of subscribers required to connect to the upstream
<add> * @param timeout the time to wait before disconnecting after all subscribers unsubscribed
<add> * @param unit the time unit of the timeout
<add> * @param scheduler the target scheduler to wait on before disconnecting
<add> * @return the new Observable instance
<add> * @since 2.1.14 - experimental
<add> */
<add> @CheckReturnValue
<add> @Experimental
<add> @SchedulerSupport(SchedulerSupport.CUSTOM)
<add> public final Observable<T> refCount(int subscriberCount, long timeout, TimeUnit unit, Scheduler scheduler) {
<add> ObjectHelper.verifyPositive(subscriberCount, "subscriberCount");
<add> ObjectHelper.requireNonNull(unit, "unit is null");
<add> ObjectHelper.requireNonNull(scheduler, "scheduler is null");
<add> return RxJavaPlugins.onAssembly(new ObservableRefCount<T>(this, subscriberCount, timeout, unit, scheduler));
<add> }
<add>
<ide> /**
<ide> * Returns an Observable that automatically connects (at most once) to this ConnectableObservable
<ide> * when the first Observer subscribes.
<ide><path>src/test/java/io/reactivex/internal/operators/flowable/FlowableRefCountTest.java
<ide> public void subscribe(FlowableEmitter<Object> emitter) throws Exception {
<ide> assertTrue(interrupted.get());
<ide> }
<ide>
<del> static <T> FlowableTransformer<T, T> refCount(final int n) {
<del> return refCount(n, 0, TimeUnit.NANOSECONDS, Schedulers.trampoline());
<del> }
<del>
<del> static <T> FlowableTransformer<T, T> refCount(final long time, final TimeUnit unit) {
<del> return refCount(1, time, unit, Schedulers.computation());
<del> }
<del>
<del> static <T> FlowableTransformer<T, T> refCount(final long time, final TimeUnit unit, final Scheduler scheduler) {
<del> return refCount(1, time, unit, scheduler);
<del> }
<del>
<del> static <T> FlowableTransformer<T, T> refCount(final int n, final long time, final TimeUnit unit) {
<del> return refCount(1, time, unit, Schedulers.computation());
<del> }
<del>
<del> static <T> FlowableTransformer<T, T> refCount(final int n, final long time, final TimeUnit unit, final Scheduler scheduler) {
<del> return new FlowableTransformer<T, T>() {
<del> @Override
<del> public Publisher<T> apply(Flowable<T> f) {
<del> return new FlowableRefCount<T>((ConnectableFlowable<T>)f, n, time, unit, scheduler);
<del> }
<del> };
<del> }
<del>
<ide> @Test
<ide> public void byCount() {
<ide> final int[] subscriptions = { 0 };
<ide> public void accept(Subscription s) throws Exception {
<ide> }
<ide> })
<ide> .publish()
<del> .compose(FlowableRefCountTest.<Integer>refCount(2));
<add> .refCount(2);
<ide>
<ide> for (int i = 0; i < 3; i++) {
<ide> TestSubscriber<Integer> ts1 = source.test();
<ide> public void accept(Subscription s) throws Exception {
<ide> }
<ide> })
<ide> .publish()
<del> .compose(FlowableRefCountTest.<Integer>refCount(500, TimeUnit.MILLISECONDS));
<add> .refCount(500, TimeUnit.MILLISECONDS);
<ide>
<ide> TestSubscriber<Integer> ts1 = source.test(0);
<ide>
<ide> public void accept(Subscription s) throws Exception {
<ide> }
<ide> })
<ide> .publish()
<del> .compose(FlowableRefCountTest.<Integer>refCount(1, 100, TimeUnit.MILLISECONDS));
<add> .refCount(1, 100, TimeUnit.MILLISECONDS);
<ide>
<ide> TestSubscriber<Integer> ts1 = source.test(0);
<ide>
<ide> public void accept(Subscription s) throws Exception {
<ide> public void error() {
<ide> Flowable.<Integer>error(new IOException())
<ide> .publish()
<del> .compose(FlowableRefCountTest.<Integer>refCount(500, TimeUnit.MILLISECONDS))
<add> .refCount(500, TimeUnit.MILLISECONDS)
<ide> .test()
<ide> .assertFailure(IOException.class);
<ide> }
<ide>
<del> @Test(expected = ClassCastException.class)
<del> public void badUpstream() {
<del> Flowable.range(1, 5)
<del> .compose(FlowableRefCountTest.<Integer>refCount(500, TimeUnit.MILLISECONDS, Schedulers.single()))
<del> ;
<del> }
<del>
<ide> @Test
<ide> public void comeAndGo() {
<ide> PublishProcessor<Integer> pp = PublishProcessor.create();
<ide>
<ide> Flowable<Integer> source = pp
<ide> .publish()
<del> .compose(FlowableRefCountTest.<Integer>refCount(1));
<add> .refCount(1);
<ide>
<ide> TestSubscriber<Integer> ts1 = source.test(0);
<ide>
<ide> public void unsubscribeSubscribeRace() {
<ide>
<ide> final Flowable<Integer> source = Flowable.range(1, 5)
<ide> .replay()
<del> .compose(FlowableRefCountTest.<Integer>refCount(1))
<add> .refCount(1)
<ide> ;
<ide>
<ide> final TestSubscriber<Integer> ts1 = source.test(0);
<ide> public void doubleOnX() {
<ide> }
<ide> }
<ide>
<add> @Test
<add> public void doubleOnXCount() {
<add> List<Throwable> errors = TestHelper.trackPluginErrors();
<add> try {
<add> new BadFlowableDoubleOnX()
<add> .refCount(1)
<add> .test()
<add> .assertResult();
<add>
<add> TestHelper.assertError(errors, 0, ProtocolViolationException.class);
<add> TestHelper.assertUndeliverable(errors, 1, TestException.class);
<add> } finally {
<add> RxJavaPlugins.reset();
<add> }
<add> }
<add>
<add> @Test
<add> public void doubleOnXTime() {
<add> List<Throwable> errors = TestHelper.trackPluginErrors();
<add> try {
<add> new BadFlowableDoubleOnX()
<add> .refCount(5, TimeUnit.SECONDS, Schedulers.single())
<add> .test()
<add> .assertResult();
<add>
<add> TestHelper.assertError(errors, 0, ProtocolViolationException.class);
<add> TestHelper.assertUndeliverable(errors, 1, TestException.class);
<add> } finally {
<add> RxJavaPlugins.reset();
<add> }
<add> }
<add>
<ide> @Test
<ide> public void cancelTerminateStateExclusion() {
<ide> FlowableRefCount<Object> o = (FlowableRefCount<Object>)PublishProcessor.create()
<ide><path>src/test/java/io/reactivex/internal/operators/observable/ObservableRefCountTest.java
<ide> public void subscribe(ObservableEmitter<Object> emitter) throws Exception {
<ide> assertTrue(interrupted.get());
<ide> }
<ide>
<del> static <T> ObservableTransformer<T, T> refCount(final int n) {
<del> return refCount(n, 0, TimeUnit.NANOSECONDS, Schedulers.trampoline());
<del> }
<del>
<del> static <T> ObservableTransformer<T, T> refCount(final long time, final TimeUnit unit) {
<del> return refCount(1, time, unit, Schedulers.computation());
<del> }
<del>
<del> static <T> ObservableTransformer<T, T> refCount(final long time, final TimeUnit unit, final Scheduler scheduler) {
<del> return refCount(1, time, unit, scheduler);
<del> }
<del>
<del> static <T> ObservableTransformer<T, T> refCount(final int n, final long time, final TimeUnit unit) {
<del> return refCount(1, time, unit, Schedulers.computation());
<del> }
<del>
<del> static <T> ObservableTransformer<T, T> refCount(final int n, final long time, final TimeUnit unit, final Scheduler scheduler) {
<del> return new ObservableTransformer<T, T>() {
<del> @Override
<del> public Observable<T> apply(Observable<T> f) {
<del> return new ObservableRefCount<T>((ConnectableObservable<T>)f, n, time, unit, scheduler);
<del> }
<del> };
<del> }
<del>
<ide> @Test
<ide> public void byCount() {
<ide> final int[] subscriptions = { 0 };
<ide> public void accept(Disposable s) throws Exception {
<ide> }
<ide> })
<ide> .publish()
<del> .compose(ObservableRefCountTest.<Integer>refCount(2));
<add> .refCount(2);
<ide>
<ide> for (int i = 0; i < 3; i++) {
<ide> TestObserver<Integer> to1 = source.test();
<ide> public void accept(Disposable s) throws Exception {
<ide> }
<ide> })
<ide> .publish()
<del> .compose(ObservableRefCountTest.<Integer>refCount(500, TimeUnit.MILLISECONDS));
<add> .refCount(500, TimeUnit.MILLISECONDS);
<ide>
<ide> TestObserver<Integer> to1 = source.test();
<ide>
<ide> public void accept(Disposable s) throws Exception {
<ide> }
<ide> })
<ide> .publish()
<del> .compose(ObservableRefCountTest.<Integer>refCount(1, 100, TimeUnit.MILLISECONDS));
<add> .refCount(1, 100, TimeUnit.MILLISECONDS);
<ide>
<ide> TestObserver<Integer> to1 = source.test();
<ide>
<ide> public void accept(Disposable s) throws Exception {
<ide> public void error() {
<ide> Observable.<Integer>error(new IOException())
<ide> .publish()
<del> .compose(ObservableRefCountTest.<Integer>refCount(500, TimeUnit.MILLISECONDS))
<add> .refCount(500, TimeUnit.MILLISECONDS)
<ide> .test()
<ide> .assertFailure(IOException.class);
<ide> }
<ide>
<del> @Test(expected = ClassCastException.class)
<del> public void badUpstream() {
<del> Observable.range(1, 5)
<del> .compose(ObservableRefCountTest.<Integer>refCount(500, TimeUnit.MILLISECONDS, Schedulers.single()))
<del> ;
<del> }
<del>
<ide> @Test
<ide> public void comeAndGo() {
<ide> PublishSubject<Integer> ps = PublishSubject.create();
<ide>
<ide> Observable<Integer> source = ps
<ide> .publish()
<del> .compose(ObservableRefCountTest.<Integer>refCount(1));
<add> .refCount(1);
<ide>
<ide> TestObserver<Integer> to1 = source.test();
<ide>
<ide> public void unsubscribeSubscribeRace() {
<ide>
<ide> final Observable<Integer> source = Observable.range(1, 5)
<ide> .replay()
<del> .compose(ObservableRefCountTest.<Integer>refCount(1))
<add> .refCount(1)
<ide> ;
<ide>
<ide> final TestObserver<Integer> to1 = source.test();
<ide> public void doubleOnX() {
<ide> }
<ide> }
<ide>
<add> @Test
<add> public void doubleOnXCount() {
<add> List<Throwable> errors = TestHelper.trackPluginErrors();
<add> try {
<add> new BadObservableDoubleOnX()
<add> .refCount(1)
<add> .test()
<add> .assertResult();
<add>
<add> TestHelper.assertError(errors, 0, ProtocolViolationException.class);
<add> TestHelper.assertUndeliverable(errors, 1, TestException.class);
<add> } finally {
<add> RxJavaPlugins.reset();
<add> }
<add> }
<add>
<add> @Test
<add> public void doubleOnXTime() {
<add> List<Throwable> errors = TestHelper.trackPluginErrors();
<add> try {
<add> new BadObservableDoubleOnX()
<add> .refCount(5, TimeUnit.SECONDS, Schedulers.single())
<add> .test()
<add> .assertResult();
<add>
<add> TestHelper.assertError(errors, 0, ProtocolViolationException.class);
<add> TestHelper.assertUndeliverable(errors, 1, TestException.class);
<add> } finally {
<add> RxJavaPlugins.reset();
<add> }
<add> }
<add>
<ide> @Test
<ide> public void cancelTerminateStateExclusion() {
<ide> ObservableRefCount<Object> o = (ObservableRefCount<Object>)PublishSubject.create() | 4 |
Javascript | Javascript | upgrade harmonymodulesplugin to es6 | 4e2c8a1d6e5ac3e3957aaca04c69e10f9f40f9b5 | <ide><path>lib/dependencies/HarmonyModulesPlugin.js
<ide> MIT License http://www.opensource.org/licenses/mit-license.php
<ide> Author Tobias Koppers @sokra
<ide> */
<del>var HarmonyImportDependency = require("./HarmonyImportDependency");
<del>var HarmonyImportSpecifierDependency = require("./HarmonyImportSpecifierDependency");
<del>var HarmonyCompatibilityDependency = require("./HarmonyCompatibilityDependency");
<del>var HarmonyExportHeaderDependency = require("./HarmonyExportHeaderDependency");
<del>var HarmonyExportExpressionDependency = require("./HarmonyExportExpressionDependency");
<del>var HarmonyExportSpecifierDependency = require("./HarmonyExportSpecifierDependency");
<del>var HarmonyExportImportedSpecifierDependency = require("./HarmonyExportImportedSpecifierDependency");
<del>var HarmonyAcceptDependency = require("./HarmonyAcceptDependency");
<del>var HarmonyAcceptImportDependency = require("./HarmonyAcceptImportDependency");
<del>
<del>var NullFactory = require("../NullFactory");
<del>
<del>var HarmonyDetectionParserPlugin = require("./HarmonyDetectionParserPlugin");
<del>var HarmonyImportDependencyParserPlugin = require("./HarmonyImportDependencyParserPlugin");
<del>var HarmonyExportDependencyParserPlugin = require("./HarmonyExportDependencyParserPlugin");
<del>
<del>function HarmonyModulesPlugin() {}
<del>module.exports = HarmonyModulesPlugin;
<add>"use strict";
<add>const HarmonyImportDependency = require("./HarmonyImportDependency");
<add>const HarmonyImportSpecifierDependency = require("./HarmonyImportSpecifierDependency");
<add>const HarmonyCompatiblilityDependency = require("./HarmonyCompatibilityDependency");
<add>const HarmonyExportHeaderDependency = require("./HarmonyExportHeaderDependency");
<add>const HarmonyExportExpressionDependency = require("./HarmonyExportExpressionDependency");
<add>const HarmonyExportSpecifierDependency = require("./HarmonyExportSpecifierDependency");
<add>const HarmonyExportImportedSpecifierDependency = require("./HarmonyExportImportedSpecifierDependency");
<add>const HarmonyAcceptDependency = require("./HarmonyAcceptDependency");
<add>const HarmonyAcceptImportDependency = require("./HarmonyAcceptImportDependency");
<add>
<add>const NullFactory = require("../NullFactory");
<add>
<add>const HarmonyDetectionParserPlugin = require("./HarmonyDetectionParserPlugin");
<add>const HarmonyImportDependencyParserPlugin = require("./HarmonyImportDependencyParserPlugin");
<add>const HarmonyExportDependencyParserPlugin = require("./HarmonyExportDependencyParserPlugin");
<ide>
<del>HarmonyModulesPlugin.prototype.apply = function(compiler) {
<del> compiler.plugin("compilation", function(compilation, params) {
<del> var normalModuleFactory = params.normalModuleFactory;
<add>class HarmonyModulesPlugin {
<ide>
<del> compilation.dependencyFactories.set(HarmonyImportDependency, normalModuleFactory);
<del> compilation.dependencyTemplates.set(HarmonyImportDependency, new HarmonyImportDependency.Template());
<add> apply(compiler) {
<add> compiler.plugin("compilation", (compilation, params) => {
<add> const normalModuleFactory = params.normalModuleFactory;
<ide>
<del> compilation.dependencyFactories.set(HarmonyImportSpecifierDependency, new NullFactory());
<del> compilation.dependencyTemplates.set(HarmonyImportSpecifierDependency, new HarmonyImportSpecifierDependency.Template());
<add> compilation.dependencyFactories.set(HarmonyImportDependency, normalModuleFactory);
<add> compilation.dependencyTemplates.set(HarmonyImportDependency, new HarmonyImportDependency.Template());
<ide>
<del> compilation.dependencyFactories.set(HarmonyCompatibilityDependency, new NullFactory());
<del> compilation.dependencyTemplates.set(HarmonyCompatibilityDependency, new HarmonyCompatibilityDependency.Template());
<add> compilation.dependencyFactories.set(HarmonyImportSpecifierDependency, new NullFactory());
<add> compilation.dependencyTemplates.set(HarmonyImportSpecifierDependency, new HarmonyImportSpecifierDependency.Template());
<ide>
<del> compilation.dependencyFactories.set(HarmonyExportHeaderDependency, new NullFactory());
<del> compilation.dependencyTemplates.set(HarmonyExportHeaderDependency, new HarmonyExportHeaderDependency.Template());
<add> compilation.dependencyFactories.set(HarmonyCompatiblilityDependency, new NullFactory());
<add> compilation.dependencyTemplates.set(HarmonyCompatiblilityDependency, new HarmonyCompatiblilityDependency.Template());
<ide>
<del> compilation.dependencyFactories.set(HarmonyExportExpressionDependency, new NullFactory());
<del> compilation.dependencyTemplates.set(HarmonyExportExpressionDependency, new HarmonyExportExpressionDependency.Template());
<add> compilation.dependencyFactories.set(HarmonyExportHeaderDependency, new NullFactory());
<add> compilation.dependencyTemplates.set(HarmonyExportHeaderDependency, new HarmonyExportHeaderDependency.Template());
<ide>
<del> compilation.dependencyFactories.set(HarmonyExportSpecifierDependency, new NullFactory());
<del> compilation.dependencyTemplates.set(HarmonyExportSpecifierDependency, new HarmonyExportSpecifierDependency.Template());
<add> compilation.dependencyFactories.set(HarmonyExportExpressionDependency, new NullFactory());
<add> compilation.dependencyTemplates.set(HarmonyExportExpressionDependency, new HarmonyExportExpressionDependency.Template());
<ide>
<del> compilation.dependencyFactories.set(HarmonyExportImportedSpecifierDependency, new NullFactory());
<del> compilation.dependencyTemplates.set(HarmonyExportImportedSpecifierDependency, new HarmonyExportImportedSpecifierDependency.Template());
<add> compilation.dependencyFactories.set(HarmonyExportSpecifierDependency, new NullFactory());
<add> compilation.dependencyTemplates.set(HarmonyExportSpecifierDependency, new HarmonyExportSpecifierDependency.Template());
<ide>
<del> compilation.dependencyFactories.set(HarmonyAcceptDependency, new NullFactory());
<del> compilation.dependencyTemplates.set(HarmonyAcceptDependency, new HarmonyAcceptDependency.Template());
<add> compilation.dependencyFactories.set(HarmonyExportImportedSpecifierDependency, new NullFactory());
<add> compilation.dependencyTemplates.set(HarmonyExportImportedSpecifierDependency, new HarmonyExportImportedSpecifierDependency.Template());
<ide>
<del> compilation.dependencyFactories.set(HarmonyAcceptImportDependency, normalModuleFactory);
<del> compilation.dependencyTemplates.set(HarmonyAcceptImportDependency, new HarmonyAcceptImportDependency.Template());
<add> compilation.dependencyFactories.set(HarmonyAcceptDependency, new NullFactory());
<add> compilation.dependencyTemplates.set(HarmonyAcceptDependency, new HarmonyAcceptDependency.Template());
<ide>
<del> params.normalModuleFactory.plugin("parser", function(parser, parserOptions) {
<add> compilation.dependencyFactories.set(HarmonyAcceptImportDependency, normalModuleFactory);
<add> compilation.dependencyTemplates.set(HarmonyAcceptImportDependency, new HarmonyAcceptImportDependency.Template());
<ide>
<del> if(typeof parserOptions.harmony !== "undefined" && !parserOptions.harmony)
<del> return;
<add> params.normalModuleFactory.plugin("parser", (parser, parserOptions) => {
<ide>
<del> parser.apply(
<del> new HarmonyDetectionParserPlugin(),
<del> new HarmonyImportDependencyParserPlugin(),
<del> new HarmonyExportDependencyParserPlugin()
<del> );
<add> if(typeof parserOptions.harmony !== "undefined" && !parserOptions.harmony)
<add> return;
<add>
<add> parser.apply(
<add> new HarmonyDetectionParserPlugin(),
<add> new HarmonyImportDependencyParserPlugin(),
<add> new HarmonyExportDependencyParserPlugin()
<add> );
<add> });
<ide> });
<del> });
<del>};
<add> }
<add>}
<add>module.exports = HarmonyModulesPlugin; | 1 |
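The patch above is a purely syntactic modernization: the `function HarmonyModulesPlugin() {}` constructor plus `HarmonyModulesPlugin.prototype.apply = function(compiler) {...}` becomes an ES6 `class` with an `apply(compiler)` method, with `var` replaced by `const` and callbacks by arrow functions; the registered dependency factories and templates are unchanged. A hedged Python sketch of the same plugin shape — registration deferred to `apply()` against a compiler that exposes named hooks (hypothetical names, not webpack's actual API):

```python
class Compiler:
    """Minimal stand-in for a tapable-style compiler: named hook lists."""

    def __init__(self):
        self._hooks = {}

    def plugin(self, name, callback):
        self._hooks.setdefault(name, []).append(callback)

    def fire(self, name, *args):
        for callback in self._hooks.get(name, []):
            callback(*args)


class HarmonyModulesPlugin:
    """Class-style plugin: all registration lives in apply(), mirroring the
    ES6 rewrite (the old style hung apply off the prototype instead)."""

    def __init__(self):
        self.compilations = []

    def apply(self, compiler):
        compiler.plugin("compilation",
                        lambda compilation: self.compilations.append(compilation))


compiler = Compiler()
plugin = HarmonyModulesPlugin()
plugin.apply(compiler)
compiler.fire("compilation", "compilation-0")
assert plugin.compilations == ["compilation-0"]
```

Either syntax produces an object whose `apply` hooks the compiler the same way, which is why a refactor like this can land with no behavioral test changes.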
PHP | PHP | use input() to get page number | abc7b9889be3154d400443b8a9ba0a9435157dd6 | <ide><path>src/Illuminate/Pagination/Environment.php
<ide> public function getPaginationView(Paginator $paginator, $view = null)
<ide> */
<ide> public function getCurrentPage()
<ide> {
<del> $page = (int) $this->currentPage ?: $this->request->query->get($this->pageName, 1);
<add> $page = (int) $this->currentPage ?: $this->request->input($this->pageName, 1);
<ide>
<ide> if ($page < 1 || filter_var($page, FILTER_VALIDATE_INT) === false)
<ide> { | 1 |
PHP | PHP | add binary type | 929202873c95e3f43068c56822da144fd72b65a6 | <ide><path>lib/Cake/Model/Datasource/Database/Type/BinaryType.php
<add><?php
<add>/**
<add> * PHP Version 5.4
<add> *
<add> * CakePHP(tm) : Rapid Development Framework (http://cakephp.org)
<add> * Copyright (c) Cake Software Foundation, Inc. (http://cakefoundation.org)
<add> *
<add> * Licensed under The MIT License
<add> * For full copyright and license information, please see the LICENSE.txt
<add> * Redistributions of files must retain the above copyright notice.
<add> *
<add> * @copyright Copyright (c) Cake Software Foundation, Inc. (http://cakefoundation.org)
<add> * @link http://cakephp.org CakePHP(tm) Project
<add> * @since CakePHP(tm) v 3.0.0
<add> * @license MIT License (http://www.opensource.org/licenses/mit-license.php)
<add> */
<add>namespace Cake\Model\Datasource\Database\Type;
<add>
<add>use Cake\Error;
<add>use Cake\Model\Datasource\Database\Driver;
<add>use \PDO;
<add>
<add>/**
<add> * Binary type converter.
<add> *
<add> * Use to convert binary data between PHP and the database types.
<add> */
<add>class BinaryType extends \Cake\Model\Datasource\Database\Type {
<add>
<add>/**
<add> * Convert binary data into the database format.
<add> *
<add> * Binary data is not altered before being inserted into the database,
<add> * as PDO will handle reading file handles.
<add> *
<add> * @param string|resource $value The value to convert.
<add> * @param Driver $driver The driver instance to convert with.
<add> * @return string|resource
<add> */
<add> public function toDatabase($value, Driver $driver) {
<add> return $value;
<add> }
<add>
<add>/**
<add> * Convert binary into resource handles
<add> *
<add> * @param null|string|resource $value The value to convert.
<add> * @param Driver $driver The driver instance to convert with.
<add> * @return resource
<add> * @throws Cake\Error\Exception
<add> */
<add> public function toPHP($value, Driver $driver) {
<add> if ($value === null) {
<add> return null;
<add> }
<add> if (is_string($value)) {
<add> return fopen('data:text/plain;base64,' . base64_encode($value), 'rb');
<add> }
<add> if (is_resource($value)) {
<add> return $value;
<add> }
<add> throw new Error\Exception(__d('cake_dev', 'Unable to convert %s into binary.', gettype($value)));
<add> }
<add>
<add>/**
<add> * Get the correct PDO binding type for Binary data.
<add> *
<add> * @param mixed $value The value being bound.
<add> * @param Driver $driver The driver.
<add> * @return integer
<add> */
<add> public function toStatement($value, Driver $driver) {
<add> return PDO::PARAM_LOB;
<add> }
<add>
<add>}
<ide><path>lib/Cake/Test/TestCase/Model/Datasource/Database/Type/BinaryTypeTest.php
<add><?php
<add>/**
<add> * PHP Version 5.4
<add> *
<add> * CakePHP(tm) : Rapid Development Framework (http://cakephp.org)
<add> * Copyright (c) Cake Software Foundation, Inc. (http://cakefoundation.org)
<add> *
<add> * Licensed under The MIT License
<add> * For full copyright and license information, please see the LICENSE.txt
<add> * Redistributions of files must retain the above copyright notice.
<add> *
<add> * @copyright Copyright (c) Cake Software Foundation, Inc. (http://cakefoundation.org)
<add> * @link http://cakephp.org CakePHP(tm) Project
<add> * @since CakePHP(tm) v 3.0.0
<add> * @license MIT License (http://www.opensource.org/licenses/mit-license.php)
<add> */
<add>namespace Cake\Test\TestCase\Model\Datasource\Database\Type;
<add>
<add>use Cake\Model\Datasource\Database\Type;
<add>use Cake\Model\Datasource\Database\Type\BinaryType;
<add>use Cake\TestSuite\TestCase;
<add>use \PDO;
<add>
<add>/**
<add> * Test for the Binary type.
<add> */
<add>class BinaryTypeTest extends TestCase {
<add>
<add>/**
<add> * Setup
<add> *
<add> * @return void
<add> */
<add> public function setUp() {
<add> parent::setUp();
<add> $this->type = Type::build('binary');
<add> $this->driver = $this->getMock('Cake\Model\Datasource\Database\Driver');
<add> }
<add>
<add>/**
<add> * Test toPHP
<add> *
<add> * @return void
<add> */
<add> public function testToPHP() {
<add> $this->assertNull($this->type->toPHP(null, $this->driver));
<add>
<add> $result = $this->type->toPHP('some data', $this->driver);
<add> $this->assertInternalType('resource', $result);
<add>
<add> $fh = fopen(__FILE__, 'r');
<add> $result = $this->type->toPHP($fh, $this->driver);
<add> $this->assertSame($fh, $result);
<add> fclose($fh);
<add> }
<add>
<add>/**
<add> * Test exceptions on invalid data.
<add> *
<add> * @expectedException \Cake\Error\Exception
<add> * @expectedExceptionMessage Unable to convert array into binary.
<add> */
<add> public function testToPHPFailure() {
<add> $this->type->toPHP([], $this->driver);
<add> }
<add>
<add>/**
<add> * Test converting to database format
<add> *
<add> * @return void
<add> */
<add> public function testToDatabase() {
<add> $value = 'some data';
<add> $result = $this->type->toDatabase($value, $this->driver);
<add> $this->assertEquals($value, $result);
<add>
<add> $fh = fopen(__FILE__, 'r');
<add> $result = $this->type->toDatabase($fh, $this->driver);
<add> $this->assertSame($fh, $result);
<add> }
<add>
<add>/**
<add> * Test that the PDO binding type is correct.
<add> *
<add> * @return void
<add> */
<add> public function testToStatement() {
<add> $this->assertEquals(PDO::PARAM_LOB, $this->type->toStatement('', $this->driver));
<add> }
<add>
<add>} | 2 |
Go | Go | add token cache | dd914f91d779f64e20ce86767ab4f84f40b9ef6a | <ide><path>registry/auth.go
<ide> import (
<ide> "os"
<ide> "path"
<ide> "strings"
<add> "sync"
<add> "time"
<ide>
<ide> log "github.com/Sirupsen/logrus"
<ide> "github.com/docker/docker/utils"
<ide> type RequestAuthorization struct {
<ide> resource string
<ide> scope string
<ide> actions []string
<add>
<add> tokenLock sync.Mutex
<add> tokenCache string
<add> tokenExpiration time.Time
<ide> }
<ide>
<ide> func NewRequestAuthorization(authConfig *AuthConfig, registryEndpoint *Endpoint, resource, scope string, actions []string) *RequestAuthorization {
<ide> func NewRequestAuthorization(authConfig *AuthConfig, registryEndpoint *Endpoint,
<ide> }
<ide>
<ide> func (auth *RequestAuthorization) getToken() (string, error) {
<del> // TODO check if already has token and before expiration
<add> auth.tokenLock.Lock()
<add> defer auth.tokenLock.Unlock()
<add> now := time.Now()
<add> if now.Before(auth.tokenExpiration) {
<add> log.Debugf("Using cached token for %s", auth.authConfig.Username)
<add> return auth.tokenCache, nil
<add> }
<add>
<ide> client := &http.Client{
<ide> Transport: &http.Transport{
<ide> DisableKeepAlives: true,
<ide> func (auth *RequestAuthorization) getToken() (string, error) {
<ide> if err != nil {
<ide> return "", err
<ide> }
<del> // TODO cache token and set expiration to one minute from now
<add> auth.tokenCache = token
<add> auth.tokenExpiration = now.Add(time.Minute)
<ide>
<ide> return token, nil
<ide> default:
<ide> log.Infof("Unsupported auth scheme: %q", challenge.Scheme)
<ide> }
<ide> }
<del> // TODO no expiration, do not reattempt to get a token
<add>
<add> // Do not expire cache since there are no challenges which use a token
<add> auth.tokenExpiration = time.Now().Add(time.Hour * 24)
<add>
<ide> return "", nil
<ide> }
<ide> | 1 |
Text | Text | fix typo `titlelize` -> `titleize` [ci skip] | b9fa0fd2f1060a1c2d3481a66a04d8bd71c04196 | <ide><path>activesupport/CHANGELOG.md
<del>* Update `titlelize` regex to allow apostrophes
<add>* Update `titleize` regex to allow apostrophes
<ide>
<del> In 4b685aa the regex in `titlelize` was updated to not match apostrophes to
<add> In 4b685aa the regex in `titleize` was updated to not match apostrophes to
<ide> better reflect the nature of the transformation. Unfortunately this had the
<ide> side effect of breaking capitalization on the first word of a sub-string, e.g:
<ide> | 1 |
Javascript | Javascript | update config pickup from env | 786626f21e99334d9d4ac4a3f934c891b78a5818 | <ide><path>client/src/components/Donation/PaypalButton.js
<ide> import PropTypes from 'prop-types';
<ide> import { connect } from 'react-redux';
<ide> import { createSelector } from 'reselect';
<ide> import { PayPalButton } from 'react-paypal-button-v2';
<del>import { paypalClientId } from '../../../config/env.json';
<add>import { paypalClientId, deploymentEnv } from '../../../config/env.json';
<ide> import { verifySubscriptionPaypal } from '../../utils/ajax';
<del>import { paypalConfigurator } from '../../../../config/donation-settings';
<add>import {
<add> paypalConfigurator,
<add> paypalConfigTypes
<add>} from '../../../../config/donation-settings';
<ide> import { signInLoadingSelector, userSelector } from '../../redux';
<ide>
<ide> export class PaypalButton extends Component {
<ide> export class PaypalButton extends Component {
<ide>
<ide> const configurationObj = paypalConfigurator(
<ide> donationAmount,
<del> donationDuration
<add> donationDuration,
<add> paypalConfigTypes[deploymentEnv || 'staging']
<ide> );
<ide> if (state === configurationObj) {
<ide> return null;
<ide><path>config/donation-settings.js
<del>const path = require('path');
<del>require('dotenv').config({ path: path.resolve(__dirname, '../.env') });
<del>
<ide> // Configuration for client side
<ide> const durationsConfig = {
<ide> year: 'yearly',
<ide> const paypalConfigTypes = {
<ide> planId: 'P-1L11422374370240ULZKX3PA'
<ide> },
<ide> '3500': {
<del> planId: 'P-1L11422374370240ULZKX3PA'
<add> planId: 'P-81U00703FF076883HLZ2PWMI'
<ide> },
<ide> '25000': {
<del> planId: 'P-1L11422374370240ULZKX3PA'
<add> planId: 'P-7M045671FN915794KLZ2PW6I'
<ide> }
<ide> },
<ide> year: {
<ide> const paypalConfigTypes = {
<ide> }
<ide> };
<ide>
<del>const paypalConfig =
<del> process.env.DEPLOYMENT_ENV && process.env.DEPLOYMENT_ENV === 'live'
<del> ? paypalConfigTypes['live']
<del> : paypalConfigTypes['staging'];
<del>
<del>const paypalConfigurator = (donationAmount, donationDuration) => {
<add>const paypalConfigurator = (donationAmount, donationDuration, paypalConfig) => {
<ide> if (donationDuration === 'onetime') {
<ide> return { amount: donationAmount, duration: donationDuration };
<ide> }
<ide> module.exports = {
<ide> donationOneTimeConfig,
<ide> donationSubscriptionConfig,
<ide> modalDefaultStateConfig,
<del> paypalConfig,
<add> paypalConfigTypes,
<ide> paypalConfigurator
<ide> };
<ide><path>config/env.js
<ide> const {
<ide> SERVICEBOT_ID: servicebotId,
<ide> ALGOLIA_APP_ID: algoliaAppId,
<ide> ALGOLIA_API_KEY: algoliaAPIKey,
<del> PAYPAL_CLIENT_ID: paypalClientId
<add> PAYPAL_CLIENT_ID: paypalClientId,
<add> DEPLOYMENT_ENV: deploymentEnv
<ide> } = process.env;
<ide>
<ide> const locations = {
<ide> const locations = {
<ide>
<ide> module.exports = Object.assign(locations, {
<ide> locale,
<add> deploymentEnv,
<ide> stripePublicKey:
<ide> !stripePublicKey || stripePublicKey === 'pk_from_stripe_dashboard'
<ide> ? null | 3 |
Python | Python | add tests for empty quotes and escaped quotechars | 1489805af8ec0f2b27e4f7439bdc4e48acfdaa6a | <ide><path>numpy/lib/tests/test_io.py
<ide> def test_loadtxt_structured_dtype_with_quotes():
<ide> )
<ide> res = np.loadtxt(data, dtype=dtype, delimiter=";", quotechar="'")
<ide> assert_array_equal(res, expected)
<add>
<add>
<add>def test_loadtxt_quoted_field_is_not_empty():
<add> txt = StringIO('1\n\n"4"\n""')
<add> expected = np.array(["1", "4", ""], dtype="U1")
<add> res = np.loadtxt(txt, delimiter=",", dtype="U1", quotechar='"')
<add> assert_equal(res, expected)
<add>
<add>
<add>def test_loadtxt_consecutive_quotechar_escaped():
<add> txt = TextIO('"Hello, my name is ""Monty""!"')
<add> expected = np.array('Hello, my name is "Monty"!', dtype="U40")
<add> res = np.loadtxt(txt, dtype="U40", delimiter=",", quotechar='"')
<add> assert_equal(res, expected) | 1 |
Text | Text | fix entry for slack channel in onboarding.md | 3d3aa5d8376e350343550f5c1819098b9115844c | <ide><path>onboarding.md
<ide> onboarding session.
<ide> * Watching the main repository will flood your inbox (several hundred
<ide> notifications on typical weekdays), so be prepared
<ide>
<del>The project has two venues for real-time discussion:
<add>The project has a venue for real-time discussion:
<ide>
<ide> * [`#nodejs-dev`](https://openjs-foundation.slack.com/archives/C019Y2T6STH) on
<del> the [OpenJS Foundation](https://slack-invite.openjsf.org/)
<add> the [OpenJS Foundation Slack](https://slack-invite.openjsf.org/)
<ide>
<ide> ## Project goals and values
<ide> | 1 |
Text | Text | add molow to triagers | eabd1c2dbc2c141ffa6688613bbc981f3f8223d3 | <ide><path>README.md
<ide> maintaining the Node.js project.
<ide> **Xuguang Mei** <<meixuguang@gmail.com>> (he/him)
<ide> * [Mesteery](https://github.com/Mesteery) -
<ide> **Mestery** <<mestery@protonmail.com>> (he/him)
<add>* [MoLow](https://github.com/MoLow) -
<add> **Moshe Atlow** <<moshe@atlow.co.il>> (he/him)
<ide> * [PoojaDurgad](https://github.com/PoojaDurgad) -
<ide> **Pooja Durgad** <<Pooja.D.P@ibm.com>>
<ide> * [RaisinTen](https://github.com/RaisinTen) - | 1 |
Python | Python | fix symlink function to check for windows | fefe6684cd9e0d9d24a895e48a19f5ab633fe7af | <ide><path>spacy/compat.py
<ide>
<ide>
<ide> def symlink_to(orig, dest):
<del> if is_python3:
<del> orig.symlink_to(dest)
<del>
<del> elif is_python2:
<add> if is_python2 and is_windows:
<ide> import subprocess
<ide> subprocess.call(['mklink', '/d', unicode(orig), unicode(dest)], shell=True)
<add> else:
<add> orig.symlink_to(dest)
<ide>
<ide>
<ide> def is_config(python2=None, python3=None, windows=None, linux=None, osx=None): | 1 |
Ruby | Ruby | fix regexp intervals | 8a04bd0c832a5fe3f248225a732db54392e38fe4 | <ide><path>railties/lib/rails/generators/generated_attribute.rb
<ide> def parse(column_definition)
<ide> # when declaring options curly brackets should be used
<ide> def parse_type_and_options(type)
<ide> case type
<del> when /(string|text|binary|integer){(\d+)}/
<add> when /(string|text|binary|integer)\{(\d+)\}/
<ide> return $1, :limit => $2.to_i
<del> when /decimal{(\d+),(\d+)}/
<add> when /decimal\{(\d+),(\d+)\}/
<ide> return :decimal, :precision => $1.to_i, :scale => $2.to_i
<ide> else
<ide> return type, {} | 1 |
Python | Python | remove unused code | 9f2c6d59bb43435b46c1192582e40e5f092c32b7 | <ide><path>numpy/core/setup.py
<ide> def testcode_mathlib():
<ide> """
<ide>
<ide> import sys
<del>def generate_testcode(target):
<del> if sys.platform == 'win32':
<del> target = target.replace('\\','\\\\')
<del> testcode = [r'''
<del>#include <Python.h>
<del>#include <limits.h>
<del>#include <stdio.h>
<del>
<del>int main(int argc, char **argv)
<del>{
<del>
<del> FILE *fp;
<del>
<del> fp = fopen("'''+target+'''","w");
<del> ''']
<del>
<del> c_size_test = r'''
<del>#ifndef %(sz)s
<del> fprintf(fp,"#define %(sz)s %%d\n", sizeof(%(type)s));
<del>#else
<del> fprintf(fp,"/* #define %(sz)s %%d */\n", %(sz)s);
<del>#endif
<del>'''
<del> for sz, t in [('SIZEOF_SHORT', 'short'),
<del> ('SIZEOF_INT', 'int'),
<del> ('SIZEOF_LONG', 'long'),
<del> ('SIZEOF_FLOAT', 'float'),
<del> ('SIZEOF_DOUBLE', 'double'),
<del> ('SIZEOF_LONG_DOUBLE', 'long double'),
<del> ('SIZEOF_PY_INTPTR_T', 'Py_intptr_t'),
<del> ]:
<del> testcode.append(c_size_test % {'sz' : sz, 'type' : t})
<del>
<del> testcode.append('#ifdef PY_LONG_LONG')
<del> testcode.append(c_size_test % {'sz' : 'SIZEOF_LONG_LONG',
<del> 'type' : 'PY_LONG_LONG'})
<del> testcode.append(c_size_test % {'sz' : 'SIZEOF_PY_LONG_LONG',
<del> 'type' : 'PY_LONG_LONG'})
<del>
<del>
<del> testcode.append(r'''
<del>#else
<del> fprintf(fp, "/* PY_LONG_LONG not defined */\n");
<del>#endif
<del>#ifndef CHAR_BIT
<del> {
<del> unsigned char var = 2;
<del> int i=0;
<del> while (var >= 2) {
<del> var = var << 1;
<del> i++;
<del> }
<del> fprintf(fp,"#define CHAR_BIT %d\n", i+1);
<del> }
<del>#else
<del> fprintf(fp, "/* #define CHAR_BIT %d */\n", CHAR_BIT);
<del>#endif
<del> fclose(fp);
<del> return 0;
<del>}
<del>''')
<del> testcode = '\n'.join(testcode)
<del> return testcode
<del>
<ide> def generate_numpyconfig_code(target):
<ide> """Return the source code as a string of the code to generate the
<ide> numpyconfig header file.""" | 1 |
Javascript | Javascript | remove hydrate() warning about empty container | 35e31336109a5277623652634703b358eceb193f | <ide><path>src/renderers/dom/fiber/ReactDOMFiberEntry.js
<ide> ReactGenericBatching.injection.injectFiberBatchedUpdates(
<ide> );
<ide>
<ide> var warnedAboutHydrateAPI = false;
<del>var warnedAboutEmptyContainer = false;
<ide>
<ide> function renderSubtreeIntoContainer(
<ide> parentComponent: ?ReactComponent<any, any, any>,
<ide> function renderSubtreeIntoContainer(
<ide> 'with ReactDOM.hydrate() if you want React to attach to the server HTML.',
<ide> );
<ide> }
<del> if (forceHydrate && !container.firstChild && !warnedAboutEmptyContainer) {
<del> warnedAboutEmptyContainer = true;
<del> warning(
<del> false,
<del> 'hydrate(): Expected to hydrate from server-rendered markup, but the passed ' +
<del> 'DOM container node was empty. React will create the DOM from scratch.',
<del> );
<del> }
<ide> }
<ide> const newRoot = DOMRenderer.createContainer(container);
<ide> root = container._reactRootContainer = newRoot;
<ide><path>src/renderers/dom/shared/__tests__/ReactDOMServerIntegration-test.js
<ide> describe('ReactDOMServerIntegration', () => {
<ide> expect(parent.childNodes[1].tagName).toBe('DIV');
<ide> expect(parent.childNodes[2].tagName).toBe('DIV');
<ide> });
<add>
<add> itRenders('emptyish values', async render => {
<add> let e = await render(0);
<add> expect(e.nodeType).toBe(TEXT_NODE_TYPE);
<add> expect(e.nodeValue).toMatch('0');
<add>
<add> // Empty string is special because client renders a node
<add> // but server returns empty HTML. So we compare parent text.
<add> expect((await render(<div>{''}</div>)).textContent).toBe('');
<add>
<add> expect(await render([])).toBe(null);
<add> expect(await render(false)).toBe(null);
<add> expect(await render(true)).toBe(null);
<add> expect(await render(undefined)).toBe(null);
<add> expect(await render([[[false]], undefined])).toBe(null);
<add> });
<ide> }
<ide> });
<ide>
<ide><path>src/renderers/dom/shared/__tests__/ReactRenderDocument-test.js
<ide> describe('rendering React components at document', () => {
<ide>
<ide> if (ReactDOMFeatureFlags.useFiber) {
<ide> describe('with new explicit hydration API', () => {
<del> it('warns if there is no server rendered markup to hydrate', () => {
<del> spyOn(console, 'error');
<del> const container = document.createElement('div');
<del> ReactDOM.hydrate(<div />, container);
<del> expectDev(console.error.calls.count()).toBe(1);
<del> expectDev(console.error.calls.argsFor(0)[0]).toContain(
<del> 'hydrate(): Expected to hydrate from server-rendered markup, but the passed ' +
<del> 'DOM container node was empty. React will create the DOM from scratch.',
<del> );
<del> });
<del>
<ide> it('should be able to adopt server markup', () => {
<ide> class Root extends React.Component {
<ide> render() { | 3 |
Javascript | Javascript | add monitization meta | ebcb34f3d952a7adf47cacdf1a61642c17aef82a | <ide><path>client/src/head/meta.js
<ide> const meta = [
<ide> 'companies including Google, Apple, Amazon, and Microsoft.'
<ide> }
<ide> name='twitter:description'
<del> />
<add> />,
<add> <meta content='$ilp.uphold.com/LJmbPn7WD4JB' name='monetization' />
<ide> ];
<ide>
<ide> export default meta; | 1 |
Javascript | Javascript | increase coverage for dns.promises.lookup() | 7167eb2f12a5070c79d4373a0ac0010e6154c22e | <ide><path>test/parallel/test-dns-lookup.js
<ide> const common = require('../common');
<ide> const assert = require('assert');
<ide> const { internalBinding } = require('internal/test/binding');
<ide> const cares = internalBinding('cares_wrap');
<add>
<add>// Stub `getaddrinfo` to *always* error. This has to be done before we load the
<add>// `dns` module to guarantee that the `dns` module uses the stub.
<add>cares.getaddrinfo = () => internalBinding('uv').UV_ENOMEM;
<add>
<ide> const dns = require('dns');
<ide> const dnsPromises = dns.promises;
<ide>
<del>// Stub `getaddrinfo` to *always* error.
<del>cares.getaddrinfo = () => internalBinding('uv').UV_ENOENT;
<del>
<ide> {
<ide> const err = {
<ide> code: 'ERR_INVALID_ARG_TYPE',
<ide> dns.lookup('127.0.0.1', {
<ide>
<ide> let tickValue = 0;
<ide>
<add>// Should fail due to stub.
<ide> dns.lookup('example.com', common.mustCall((error, result, addressType) => {
<ide> assert(error);
<ide> assert.strictEqual(tickValue, 1);
<del> assert.strictEqual(error.code, 'ENOENT');
<add> assert.strictEqual(error.code, 'ENOMEM');
<ide> const descriptor = Object.getOwnPropertyDescriptor(error, 'message');
<ide> // The error message should be non-enumerable.
<ide> assert.strictEqual(descriptor.enumerable, false);
<ide> }));
<ide>
<del>// Make sure that the error callback is called
<del>// on next tick.
<add>// Make sure that the error callback is called on next tick.
<ide> tickValue = 1;
<add>
<add>// Should fail due to stub.
<add>assert.rejects(dnsPromises.lookup('example.com'),
<add> { code: 'ENOMEM', hostname: 'example.com' }); | 1 |
PHP | PHP | add default link | b10ee6cc91a8e9f5edab9661fa48fb084f0e1ce6 | <ide><path>src/Illuminate/Foundation/Console/StorageLinkCommand.php
<ide> class StorageLinkCommand extends Command
<ide> */
<ide> public function handle()
<ide> {
<del> foreach ($this->laravel['config']['filesystems.links'] ?? [] as $link => $target) {
<add> foreach ($this->laravel['config']['filesystems.links'] ?? [public_path('storage') => storage_path('app/public')] as $link => $target) {
<ide> if (file_exists($link)) {
<ide> $this->error("The [$link] link already exists.");
<ide> } else { | 1 |
PHP | PHP | allow string values for templates option | ffc092bc39b566028d29d6da2e7d57b3cd190922 | <ide><path>src/View/Helper/FormHelper.php
<ide> public function input($fieldName, array $options = []) {
<ide>
<ide> if ($newTemplates) {
<ide> $templater->push();
<del> $templater->add($options['templates']);
<add> $templateMethod = is_string($options['templates']) ? 'load' : 'add';
<add> $templater->{$templateMethod}($options['templates']);
<ide> }
<ide> unset($options['templates']);
<ide>
<ide><path>tests/TestCase/View/Helper/FormHelperTest.php
<ide> public function testInputCustomization() {
<ide> $this->assertTags($result, $expected);
<ide> }
<ide>
<add>/**
<add> * Test that input() accepts a template file.
<add> *
<add> * @return void
<add> */
<add> public function testInputWithTemplateFile() {
<add> $result = $this->Form->input('field', array(
<add> 'templates' => 'htmlhelper_tags'
<add> ));
<add> $expected = array(
<add> 'label' => array('for' => 'field'),
<add> 'Field',
<add> '/label',
<add> 'input' => array(
<add> 'type' => 'text', 'name' => 'field',
<add> 'id' => 'field'
<add> ),
<add> );
<add> $this->assertTags($result, $expected);
<add>
<add> }
<add>
<ide> /**
<ide> * Test id prefix
<ide> *
<ide><path>tests/test_app/TestApp/Config/htmlhelper_tags.php
<ide> $config = [
<ide> 'formstart' => 'start form',
<ide> 'formend' => 'finish form',
<del> 'hiddenblock' => '<div class="hidden">{{content}}</div>'
<add> 'hiddenblock' => '<div class="hidden">{{content}}</div>',
<add> 'inputContainer' => '{{content}}'
<ide> ]; | 3 |
Javascript | Javascript | fix typo in tests | a900cfcd95509487ba0f4fd98582d3c22592dc3a | <ide><path>packages/ember-views/tests/views/view/transition_to_deprecation_test.js
<ide> test('deprecates when calling transitionTo', function() {
<ide> }, '');
<ide> });
<ide>
<del>test("doesn't deprecafte when calling _transitionTo", function() {
<add>test("doesn't deprecate when calling _transitionTo", function() {
<ide> expect(1);
<ide>
<ide> view = EmberView.create(); | 1 |
Javascript | Javascript | remove subscribable.mixin from react native core | afa6d9ba7bc9359070bbf76078d372b89e0a6427 | <ide><path>Libraries/Components/Subscribable.js
<del>/**
<del> * Copyright (c) Facebook, Inc. and its affiliates.
<del> *
<del> * This source code is licensed under the MIT license found in the
<del> * LICENSE file in the root directory of this source tree.
<del> *
<del> * @format
<del> * @flow
<del> */
<del>
<del>'use strict';
<del>
<del>import type EventEmitter from 'EventEmitter';
<del>
<del>/**
<del> * Subscribable provides a mixin for safely subscribing a component to an
<del> * eventEmitter
<del> *
<del> * This will be replaced with the observe interface that will be coming soon to
<del> * React Core
<del> */
<del>
<del>const Subscribable = {};
<del>
<del>Subscribable.Mixin = {
<del> UNSAFE_componentWillMount: function() {
<del> this._subscribableSubscriptions = [];
<del> },
<del>
<del> componentWillUnmount: function() {
<del> // This null check is a fix for a broken version of uglify-es. Should be deleted eventually
<del> // https://github.com/facebook/react-native/issues/17348
<del> this._subscribableSubscriptions &&
<del> this._subscribableSubscriptions.forEach(subscription =>
<del> subscription.remove(),
<del> );
<del> this._subscribableSubscriptions = null;
<del> },
<del>
<del> /**
<del> * Special form of calling `addListener` that *guarantees* that a
<del> * subscription *must* be tied to a component instance, and therefore will
<del> * be cleaned up when the component is unmounted. It is impossible to create
<del> * the subscription and pass it in - this method must be the one to create
<del> * the subscription and therefore can guarantee it is retained in a way that
<del> * will be cleaned up.
<del> *
<del> * @param {EventEmitter} eventEmitter emitter to subscribe to.
<del> * @param {string} eventType Type of event to listen to.
<del> * @param {function} listener Function to invoke when event occurs.
<del> * @param {object} context Object to use as listener context.
<del> */
<del> addListenerOn: function(
<del> eventEmitter: EventEmitter,
<del> eventType: string,
<del> listener: Function,
<del> context: Object,
<del> ) {
<del> this._subscribableSubscriptions.push(
<del> eventEmitter.addListener(eventType, listener, context),
<del> );
<del> },
<del>};
<del>
<del>module.exports = Subscribable; | 1 |
Ruby | Ruby | show doctor error if xcode-select path is invalid | abe0be8a2e7ae2a84f5bae859bce0f0faf7008b1 | <ide><path>Library/Homebrew/cmd/doctor.rb
<ide> def check_xcode_prefix
<ide> end
<ide> end
<ide>
<add>def check_xcode_select_path
<add> path = `xcode-select -print-path 2>/dev/null`.chomp
<add> unless File.directory? path and File.file? "#{path}/usr/bin/xcodebuild"
<add> # won't guess at the path they should use because it's too hard to get right
<add> ohai "Your Xcode is configured with an invalid path."
<add> puts <<-EOS.undent
<add> You should change it to the correct path. Please note that there is no correct
<add> path at this time if you have *only* installed the Command Line Tools for Xcode.
<add> If your Xcode is pre-4.3 or you installed the whole of Xcode 4.3 then one of
<add> these is (probably) what you want:
<add>
<add> sudo xcode-select -switch /Developer
<add> sudo xcode-select -switch /Application/Xcode.app
<add>
<add> EOS
<add> end
<add>end
<add>
<ide> def check_user_path
<ide> seen_prefix_bin = false
<ide> seen_prefix_sbin = false
<ide> def doctor
<ide> check_usr_bin_ruby
<ide> check_homebrew_prefix
<ide> check_xcode_prefix
<add> check_xcode_select_path
<ide> check_for_macgpg2
<ide> check_for_stray_dylibs
<ide> check_for_stray_static_libs | 1 |
Text | Text | add initial list of technical priorities | 8d6a02583f51368a76f4244584351b2e5d061e2a | <ide><path>doc/guides/technical-priorities.md
<add># Technical Priorities
<add>
<add>This list represents the current view of key technical priorities recognized
<add>by the project as important to ensure the ongoing success of Node.js.
<add>It is based on an understanding of the Node.js
<add>[constituencies](https://github.com/nodejs/next-10/blob/main/CONSTITUENCIES.md)
<add>and their [needs](https://github.com/nodejs/next-10/blob/main/CONSTITUENCY-NEEDS.md).
<add>
<add>The initial version was created based on the work of the
<add>[Next-10 team](https://github.com/nodejs/next-10) and the
<add>[mini-summit](https://github.com/nodejs/next-10/issues/76)
<add>on August 5th 2021.
<add>
<add>They will be updated regularly and will be reviewed by the next-10 team
<add>and the TSC on a 6-month basis.
<add>
<add>## Modern HTTP
<add>
<add>Base HTTP support is a key component of modern cloud-native applications
<add>and built-in support was part of what made Node.js a success in the first
<add>10 years. The current implementation is hard to support and a common
<add>source of vulnerabilities. We must work towards an
<add>implementation which is easier to support and makes it easier to integrate
<add>the new HTTP versions (HTTP3, QUIC) and to support efficient
<add>implementations of different versions concurrently.
<add>
<add>## Suitable types for end-users
<add>
<add>Using typings with JavaScript can allow a richer experience when using Visual
<add>Studio Code (or other IDE) environments, more complete documentation
<add>of APIs and the ability to identify and resolve errors earlier in the
<add>development process. These benefits are important to a large number of Node.js
<add>developers (maybe 50%). Further typing support may be important
<add>to enterprises that are considering expanding their preferred platforms to
<add>include Node.js. It is, therefore, important that the Node.js project work
<add>to ensure there are good typings available for the public Node.js APIs.
<add>
<add>## Documentation
<add>
<add>The current documentation is great for experienced developers or people
<add>who are aware of what they are looking for. On the other hand, for
<add>beginners this documentation can be quite hard to read and finding the
<add>desired information is difficult. We must have documentation
<add>that is suitable for beginners to continue the rapid growth in use.
<add>This documentation should include more concrete examples and a learning
<add>path for newcomers.
<add>
<add>## WebAssembly
<add>
<add>The use of WebAssembly has been growing over the last few years.
<add>To ensure Node.js continues to be part of solutions where a
<add>subset of the solution needs the performance that WebAssembly can
<add>deliver, Node.js must provide good support for running
<add>WebAssembly components along with the JavaScript that makes up the rest
<add>of the solution. This includes implementations of “host” APIs like WASI.
<add>
<add>## ESM
<add>
<add>The CommonJS module system was one of the key components that led to the success
<add>of Node.js in its first 10 years. ESM is the standard that has been adopted as
<add>the equivalent in the broader JavaScript ecosystem and Node.js must continue to
<add>develop and improve its ESM implementation to stay relevant and ensure
<add>continued growth for the next 10 years.
<add>
<add>## Support for features from the latest ECMAScript spec
<add>
<add>JavaScript developers are a fast moving group and need/want support for new ES
<add>JavaScript features in a timely manner. Node.js must continue
<add>to provide support for up to date ES versions to remain the runtime
<add>of choice and to ensure its continued growth for the next 10 years.
<add>
<add>## Observability
<add>
<add>The ability to investigate and resolve problems that occur in applications
<add>running in production is crucial for organizations. Tools that allow
<add>people to observe the current and past operation of the application are
<add>needed to support this. It is therefore important that the Node.js
<add>project work towards well understood and defined processes for observing
<add>the behavior of Node.js applications as well as ensuring there are well
<add>supported tools to implement those processes (logging, metrics and tracing).
<add>This includes support within the Node.js runtime itself (for example
<add>generating heap dumps, performance metrics, etc.) as well as support for
<add>applications on top of the runtime. In addition, it is also important to clearly
<add>document the use cases, problem determination methods and best
<add>practices for those tools.
<add>
<add>## Permissions/policies/security model
<add>
<add>Organizations will only choose technologies that allow them to sufficiently
<add>manage risk in their production deployments. For Node.js to
<add>continue its growth in product/enterprise deployments we need to ensure
<add>that we help them manage that risk. We must have a well-documented
<add>security model so that consumers understand what threats are/are
<add>not addressed by the Node.js runtime. We also need to provide
<add>functions/features which help them limit attack surfaces even if it does
<add>not result in 100% protection as this will still help organizations
<add>manage their overall risk level.
<add>
<add>## Better multithreaded support
<add>
<add>Today's servers support multiple threads of concurrent execution.
<add>Node.js deployments must be able to make full and efficient
<add>use of the available resources. The right answer is often to use
<add>technologies like containers to run multiple single threaded Node.js
<add>instances on the same server. However, there are important use cases
<add>where a single Node.js instance needs to make use of multiple threads
<add>to achieve a performant and efficient implementation. In addition,
<add>even when a Node.js instance only needs to consume a single thread to
<add>complete its work there can be issues. If that work is long running,
<add>blocking the event loop will interfere with other supporting work like
<add>metrics gathering and health checks. Node.js
<add>must provide good support for using multiple threads
<add>to ensure the continued growth and success of Node.js.
<add>
<add>## Single Executable Applications
<add>
<add>Node.js often loses out to other runtimes/languages in cases where
<add>being able to package a single, executable application simplifies
<add>distribution and management of what needs to be delivered. While there are
<add>components/approaches for doing this, they need to be better
<add>documented and evangelized so that this is not seen as a barrier
<add>for using Node.js in these situations. This is important to support
<add>the expansion of where/when Node.js is used in building solutions. | 1 |
Python | Python | fix race condition with dagrun callbacks | fb3031acf51f95384154143553aac1a40e568ebf | <ide><path>airflow/jobs/scheduler_job.py
<ide> def _do_scheduling(self, session) -> int:
<ide> # Bulk fetch the currently active dag runs for the dags we are
<ide> # examining, rather than making one query per DagRun
<ide>
<add> callback_tuples = []
<ide> for dag_run in dag_runs:
<ide> # Use try_except to not stop the Scheduler when a Serialized DAG is not found
<ide> # This takes care of Dynamic DAGs especially
<ide> def _do_scheduling(self, session) -> int:
<ide> # But this would take care of the scenario when the Scheduler is restarted after DagRun is
<ide> # created and the DAG is deleted / renamed
<ide> try:
<del> self._schedule_dag_run(dag_run, session)
<add> callback_to_run = self._schedule_dag_run(dag_run, session)
<add> callback_tuples.append((dag_run, callback_to_run))
<ide> except SerializedDagNotFound:
<ide> self.log.exception("DAG '%s' not found in serialized_dag table", dag_run.dag_id)
<ide> continue
<ide>
<ide> guard.commit()
<ide>
<add> # Send the callbacks after we commit to ensure the context is up to date when it gets run
<add> for dag_run, callback_to_run in callback_tuples:
<add> self._send_dag_callbacks_to_processor(dag_run, callback_to_run)
<add>
<ide> # Without this, the session has an invalid view of the DB
<ide> session.expunge_all()
<ide> # END: schedule TIs
<ide> def _schedule_dag_run(
<ide> self,
<ide> dag_run: DagRun,
<ide> session: Session,
<del> ) -> int:
<add> ) -> Optional[DagCallbackRequest]:
<ide> """
<ide> Make scheduling decisions about an individual dag run
<ide>
<ide> :param dag_run: The DagRun to schedule
<del> :return: Number of tasks scheduled
<add> :return: Callback that needs to be executed
<ide> """
<ide> dag = dag_run.dag = self.dagbag.get_dag(dag_run.dag_id, session=session)
<ide>
<ide> def _schedule_dag_run(
<ide> # TODO[HA]: Rename update_state -> schedule_dag_run, ?? something else?
<ide> schedulable_tis, callback_to_run = dag_run.update_state(session=session, execute_callbacks=False)
<ide>
<del> self._send_dag_callbacks_to_processor(dag_run, callback_to_run)
<del>
<ide> # This will do one query per dag run. We "could" build up a complex
<ide> # query to update all the TIs across all the execution dates and dag
<ide> # IDs in a single query, but it turns out that can be _very very slow_
<ide> # see #11147/commit ee90807ac for more details
<del> return dag_run.schedule_tis(schedulable_tis, session)
<add> dag_run.schedule_tis(schedulable_tis, session)
<add>
<add> return callback_to_run
<ide>
<ide> @provide_session
<ide> def _verify_integrity_if_dag_changed(self, dag_run: DagRun, session=None):
<ide><path>tests/dag_processing/test_processor.py
<ide> def setUpClass(cls):
<ide> non_serialized_dagbag.sync_to_db()
<ide> cls.dagbag = DagBag(read_dags_from_db=True)
<ide>
<add> @staticmethod
<add> def assert_scheduled_ti_count(session, count):
<add> assert count == session.query(TaskInstance).filter_by(state=State.SCHEDULED).count()
<add>
<ide> def test_dag_file_processor_sla_miss_callback(self):
<ide> """
<ide> Test that the dag file processor calls the sla miss callback
<ide> def test_dag_file_processor_process_task_instances(self, state, start_date, end_
<ide> ti.start_date = start_date
<ide> ti.end_date = end_date
<ide>
<del> count = self.scheduler_job._schedule_dag_run(dr, session)
<del> assert count == 1
<add> self.scheduler_job._schedule_dag_run(dr, session)
<add> self.assert_scheduled_ti_count(session, 1)
<ide>
<ide> session.refresh(ti)
<ide> assert ti.state == State.SCHEDULED
<ide> def test_dag_file_processor_process_task_instances_with_task_concurrency(
<ide> ti.start_date = start_date
<ide> ti.end_date = end_date
<ide>
<del> count = self.scheduler_job._schedule_dag_run(dr, session)
<del> assert count == 1
<add> self.scheduler_job._schedule_dag_run(dr, session)
<add> self.assert_scheduled_ti_count(session, 1)
<ide>
<ide> session.refresh(ti)
<ide> assert ti.state == State.SCHEDULED
<ide> def test_dag_file_processor_process_task_instances_depends_on_past(self, state,
<ide> ti.start_date = start_date
<ide> ti.end_date = end_date
<ide>
<del> count = self.scheduler_job._schedule_dag_run(dr, session)
<del> assert count == 2
<add> self.scheduler_job._schedule_dag_run(dr, session)
<add> self.assert_scheduled_ti_count(session, 2)
<ide>
<ide> session.refresh(tis[0])
<ide> session.refresh(tis[1])
<ide> def test_scheduler_job_add_new_task(self):
<ide> BashOperator(task_id='dummy2', dag=dag, owner='airflow', bash_command='echo test')
<ide> SerializedDagModel.write_dag(dag=dag)
<ide>
<del> scheduled_tis = self.scheduler_job._schedule_dag_run(dr, session)
<add> self.scheduler_job._schedule_dag_run(dr, session)
<add> self.assert_scheduled_ti_count(session, 2)
<ide> session.flush()
<del> assert scheduled_tis == 2
<ide>
<ide> drs = DagRun.find(dag_id=dag.dag_id, session=session)
<ide> assert len(drs) == 1
<ide><path>tests/jobs/test_scheduler_job.py
<ide> def test_dagrun_callbacks_are_called(self, state, expected_callback_msg):
<ide> ti = dr.get_task_instance('dummy')
<ide> ti.set_state(state, session)
<ide>
<del> self.scheduler_job._schedule_dag_run(dr, session)
<add> with mock.patch.object(settings, "USE_JOB_SCHEDULE", False):
<add> self.scheduler_job._do_scheduling(session)
<ide>
<ide> expected_callback = DagCallbackRequest(
<del> full_filepath=dr.dag.fileloc,
<add> full_filepath=dag.fileloc,
<ide> dag_id=dr.dag_id,
<ide> is_failure_callback=bool(state == State.FAILED),
<ide> execution_date=dr.execution_date,
<ide> def test_dagrun_callbacks_are_called(self, state, expected_callback_msg):
<ide> session.rollback()
<ide> session.close()
<ide>
<add> def test_dagrun_callbacks_commited_before_sent(self):
<add> """
<add> Tests that before any callbacks are sent to the processor, the session is committed. This ensures
<add> that the dagrun details are up to date when the callbacks are run.
<add> """
<add> dag = DAG(dag_id='test_dagrun_callbacks_commited_before_sent', start_date=DEFAULT_DATE)
<add> DummyOperator(task_id='dummy', dag=dag, owner='airflow')
<add>
<add> self.scheduler_job = SchedulerJob(subdir=os.devnull)
<add> self.scheduler_job.processor_agent = mock.Mock()
<add> self.scheduler_job._send_dag_callbacks_to_processor = mock.Mock()
<add> self.scheduler_job._schedule_dag_run = mock.Mock()
<add>
<add> # Sync DAG into DB
<add> with mock.patch.object(settings, "STORE_DAG_CODE", False):
<add> self.scheduler_job.dagbag.bag_dag(dag, root_dag=dag)
<add> self.scheduler_job.dagbag.sync_to_db()
<add>
<add> session = settings.Session()
<add> orm_dag = session.query(DagModel).get(dag.dag_id)
<add> assert orm_dag is not None
<add>
<add> # Create DagRun
<add> self.scheduler_job._create_dag_runs([orm_dag], session)
<add>
<add> drs = DagRun.find(dag_id=dag.dag_id, session=session)
<add> assert len(drs) == 1
<add> dr = drs[0]
<add>
<add> ti = dr.get_task_instance('dummy')
<add> ti.set_state(State.SUCCESS, session)
<add>
<add> with mock.patch.object(settings, "USE_JOB_SCHEDULE", False), mock.patch(
<add> "airflow.jobs.scheduler_job.prohibit_commit"
<add> ) as mock_gaurd:
<add> mock_gaurd.return_value.__enter__.return_value.commit.side_effect = session.commit
<add>
<add> def mock_schedule_dag_run(*args, **kwargs):
<add> mock_gaurd.reset_mock()
<add> return None
<add>
<add> def mock_send_dag_callbacks_to_processor(*args, **kwargs):
<add> mock_gaurd.return_value.__enter__.return_value.commit.assert_called_once()
<add>
<add> self.scheduler_job._send_dag_callbacks_to_processor.side_effect = (
<add> mock_send_dag_callbacks_to_processor
<add> )
<add> self.scheduler_job._schedule_dag_run.side_effect = mock_schedule_dag_run
<add>
<add> self.scheduler_job._do_scheduling(session)
<add>
<add> # Verify dag failure callback request is sent to file processor
<add> self.scheduler_job._send_dag_callbacks_to_processor.assert_called_once()
<add> # and mock_send_dag_callbacks_to_processor has asserted the callback was sent after a commit
<add>
<add> session.rollback()
<add> session.close()
<add>
<ide> @parameterized.expand([(State.SUCCESS,), (State.FAILED,)])
<ide> def test_dagrun_callbacks_are_not_added_when_callbacks_are_not_defined(self, state):
<ide> """
<ide> def test_dagrun_callbacks_are_not_added_when_callbacks_are_not_defined(self, sta
<ide> ti = dr.get_task_instance('test_task')
<ide> ti.set_state(state, session)
<ide>
<del> self.scheduler_job._schedule_dag_run(dr, session)
<add> with mock.patch.object(settings, "USE_JOB_SCHEDULE", False):
<add> self.scheduler_job._do_scheduling(session)
<ide>
<ide> # Verify Callback is not set (i.e is None) when no callbacks are set on DAG
<del> self.scheduler_job._send_dag_callbacks_to_processor.assert_called_once_with(dr, None)
<add> self.scheduler_job._send_dag_callbacks_to_processor.assert_called_once()
<add> call_args = self.scheduler_job._send_dag_callbacks_to_processor.call_args[0]
<add> assert call_args[0].dag_id == dr.dag_id
<add> assert call_args[0].execution_date == dr.execution_date
<add> assert call_args[1] is None
<ide>
<ide> session.rollback()
<ide> session.close()
<ide> def test_verify_integrity_if_dag_not_changed(self):
<ide>
<ide> # Verify that DagRun.verify_integrity is not called
<ide> with mock.patch('airflow.jobs.scheduler_job.DagRun.verify_integrity') as mock_verify_integrity:
<del> scheduled_tis = self.scheduler_job._schedule_dag_run(dr, session)
<add> self.scheduler_job._schedule_dag_run(dr, session)
<ide> mock_verify_integrity.assert_not_called()
<ide> session.flush()
<ide>
<del> assert scheduled_tis == 1
<del>
<ide> tis_count = (
<ide> session.query(func.count(TaskInstance.task_id))
<ide> .filter(
<ide> def test_verify_integrity_if_dag_changed(self):
<ide> dag_version_2 = SerializedDagModel.get_latest_version_hash(dr.dag_id, session=session)
<ide> assert dag_version_2 != dag_version_1
<ide>
<del> scheduled_tis = self.scheduler_job._schedule_dag_run(dr, session)
<add> self.scheduler_job._schedule_dag_run(dr, session)
<ide> session.flush()
<ide>
<del> assert scheduled_tis == 2
<del>
<ide> drs = DagRun.find(dag_id=dag.dag_id, session=session)
<ide> assert len(drs) == 1
<ide> dr = drs[0] | 3 |
Ruby | Ruby | remove a comment related to 920753f | 73b13f7dc908e419a404add8a331436fe5d67708 | <ide><path>activesupport/lib/active_support/core_ext/class/attribute.rb
<ide> class Class
<ide> # To opt out of both instance methods, pass <tt>instance_accessor: false</tt>.
<ide> def class_attribute(*attrs)
<ide> options = attrs.extract_options!
<del> # double assignment is used to avoid "assigned but unused variable" warning
<ide> instance_reader = options.fetch(:instance_accessor, true) && options.fetch(:instance_reader, true)
<ide> instance_writer = options.fetch(:instance_accessor, true) && options.fetch(:instance_writer, true)
<ide> instance_predicate = options.fetch(:instance_predicate, true) | 1 |
Ruby | Ruby | integrate brew home and cask home | ceb56df834d558d94251414672b97d1c7833ab95 | <ide><path>Library/Homebrew/cmd/home.rb
<ide> def home
<ide> if args.no_named?
<ide> exec_browser HOMEBREW_WWW
<ide> else
<del> exec_browser(*args.formulae.map(&:homepage))
<add> exec_browser(*args.formulae_and_casks.map(&:homepage))
<ide> end
<ide> end
<ide> end | 1 |
PHP | PHP | remove unnecessary property | 474e2732e55ef896b3a22333e1fd1f43c2ddab55 | <ide><path>src/Http/Middleware/CsrfProtectionMiddleware.php
<ide> class CsrfProtectionMiddleware implements MiddlewareInterface
<ide> {
<ide> /**
<del> * Default config for the CSRF handling.
<add> * Config for the CSRF handling.
<ide> *
<ide> * - `cookieName` The name of the cookie to send.
<ide> * - `expiry` A strotime compatible value of how long the CSRF token should last.
<ide> class CsrfProtectionMiddleware implements MiddlewareInterface
<ide> *
<ide> * @var array
<ide> */
<del> protected $_defaultConfig = [
<add> protected $_config = [
<ide> 'cookieName' => 'csrfToken',
<ide> 'expiry' => 0,
<ide> 'secure' => false,
<ide> 'httpOnly' => false,
<ide> 'field' => '_csrfToken',
<ide> ];
<ide>
<del> /**
<del> * Configuration
<del> *
<del> * @var array
<del> */
<del> protected $_config = [];
<del>
<ide> /**
<ide> * Callback for deciding whether or not to skip the token check for particular request.
<ide> *
<ide> class CsrfProtectionMiddleware implements MiddlewareInterface
<ide> /**
<ide> * Constructor
<ide> *
<del> * @param array $config Config options. See $_defaultConfig for valid keys.
<add> * @param array $config Config options. See $_config for valid keys.
<ide> */
<ide> public function __construct(array $config = [])
<ide> {
<del> $this->_config = $config + $this->_defaultConfig;
<add> $this->_config = $config + $this->_config;
<ide> }
<ide>
<ide> /** | 1 |
Python | Python | include trident.js in the extras_files build array | 39e993d50d77a54b692b9b72a8bdebbe42b61522 | <ide><path>utils/build.py
<ide> 'extras/io/BinaryLoader.js',
<ide> 'extras/io/SceneLoader.js',
<ide> 'extras/objects/MarchingCubes.js',
<add>'extras/objects/Trident.js',
<ide> 'extras/physics/Collisions.js',
<ide> 'extras/physics/CollisionUtils.js'
<ide> ] | 1 |
Text | Text | fix typo cppu | 1c47a28c6b365c448b8bbe00b453e32e4844c1d9 | <ide><path>guide/chinese/computer-hardware/cpu/index.md
<ide> CPU速度以千兆赫兹(GHz)为单位。对于每千兆赫的速度,CPU
<ide>
<ide> 千兆赫不是处理器实际速度的唯一决定因素,因为具有相同千兆赫速度(也称为时钟速度)的不同处理器可能会以不同的速度执行实际任务,因为使用不同的指令集来执行这些任务。这些指令集称为CPU架构。
<ide>
<del>大多数现代CPU使用64位架构,这意味着它们使用64位长的内存地址。较旧的CPU使用32位,16位甚至8位架构。 64位CPU可以存储的最大数量是18,446,744,073,709,552,000。 CPPU需要内存地址才能从RAM中获取指定的值。如果我们称内存的长度为n,则2 ^ n是CPU可以寻址的内存单元的数量。
<add>大多数现代CPU使用64位架构,这意味着它们使用64位长的内存地址。较旧的CPU使用32位,16位甚至8位架构。 64位CPU可以存储的最大数量是18,446,744,073,709,552,000。 CPU需要内存地址才能从RAM中获取指定的值。如果我们称内存的长度为n,则2 ^ n是CPU可以寻址的内存单元的数量。
<ide>
<ide> CPU的指令周期称为fetch-decode-execute循环 - 计算机从其内存中检索指令,确定它取出的指令及其作用,然后执行所述指令。
<ide>
<ide> CPU的指令周期称为fetch-decode-execute循环 - 计算机从其内存中检
<ide>
<ide> #### 更多信息:
<ide>
<del>[维基百科](https://en.wikipedia.org/wiki/Central_processing_unit)
<ide>\ No newline at end of file
<add>[维基百科](https://en.wikipedia.org/wiki/Central_processing_unit) | 1 |
Javascript | Javascript | move initial observer setup to finishpartial | 6c428ec4a890d8e2e58e545d503995b46c327f5f | <ide><path>packages/ember-metal/lib/mixin.js
<ide> function connectBindings(obj, m) {
<ide> }
<ide> }
<ide>
<add>function applyObservers(obj) {
<add> var meta = Ember.meta(obj),
<add> observers = meta.observers,
<add> beforeObservers = meta.beforeObservers,
<add> methodName, method, observerPaths, i, l;
<add>
<add> for (methodName in observers) {
<add> method = observers[methodName];
<add> observerPaths = getObserverPaths(method);
<add>
<add> for (i=0, l=observerPaths.length; i<l; i++) {
<add> Ember.addObserver(obj, observerPaths[i], null, methodName);
<add> }
<add> }
<add>
<add> for (methodName in beforeObservers) {
<add> method = beforeObservers[methodName];
<add> observerPaths = getBeforeObserverPaths(method);
<add>
<add> for (i=0, l=observerPaths.length; i<l; i++) {
<add> Ember.addBeforeObserver(obj, observerPaths[i], null, methodName);
<add> }
<add> }
<add>}
<add>
<ide> /** @private */
<ide> function applyMixin(obj, mixins, partial) {
<ide> var descs = {}, values = {}, m = Ember.meta(obj), req = m.required,
<ide> function applyMixin(obj, mixins, partial) {
<ide> if (desc === undefined && value === undefined) { continue; }
<ide> if (willApply) { willApply.call(obj, key); }
<ide>
<del> // If an observer replaces an existing superclass observer,
<del> // remove the superclass observers.
<del> var observerPaths = getObserverPaths(value),
<del> curObserverPaths = observerPaths && getObserverPaths(obj[key]),
<del> beforeObserverPaths = getBeforeObserverPaths(value),
<del> curBeforeObserverPaths = beforeObserverPaths && getBeforeObserverPaths(obj[key]),
<del> len, idx;
<del>
<del> if (curObserverPaths) {
<del> len = curObserverPaths.length;
<del> for (idx=0; idx < len; idx++) {
<del> Ember.removeObserver(obj, curObserverPaths[idx], null, key);
<del> }
<del> }
<add> var hasObservers = getObserverPaths(value),
<add> hasBeforeObservers = getBeforeObserverPaths(value);
<ide>
<del> if (curBeforeObserverPaths) {
<del> len = curBeforeObserverPaths.length;
<del> for (idx=0; idx < len; idx++) {
<del> Ember.removeBeforeObserver(obj, curBeforeObserverPaths[idx], null, key);
<del> }
<del> }
<add> if (hasObservers) { m.observers[key] = value; }
<add> if (hasBeforeObservers) { m.beforeObservers[key] = value; }
<ide>
<ide> detectBinding(obj, key, m);
<del>
<ide> Ember.defineProperty(obj, key, desc, value);
<ide>
<del> if (observerPaths) {
<del> len = observerPaths.length;
<del> for(idx=0; idx < len; idx++) {
<del> Ember.addObserver(obj, observerPaths[idx], null, key);
<del> }
<del> }
<del>
<del> if (beforeObserverPaths) {
<del> len = beforeObserverPaths.length;
<del> for(idx=0; idx < len; idx++) {
<del> Ember.addBeforeObserver(obj, beforeObserverPaths[idx], null, key);
<del> }
<del> }
<del>
<ide> if (req && req[key]) {
<ide> req = writableReq(obj);
<ide> req.__ember_count__--;
<ide> function applyMixin(obj, mixins, partial) {
<ide> }
<ide> }
<ide>
<del> if (!partial) { // don't apply to prototype
<del> value = connectBindings(obj, m);
<del> }
<add> //if (!partial) { // don't apply to prototype
<add> //value = connectBindings(obj, m);
<add> //}
<ide>
<ide> // Make sure no required attrs remain
<ide> if (!partial && req && req.__ember_count__>0) {
<ide> function applyMixin(obj, mixins, partial) {
<ide>
<ide> Ember.mixin = function(obj) {
<ide> var args = a_slice.call(arguments, 1);
<del> return applyMixin(obj, args, false);
<add> applyMixin(obj, args, false);
<add> Mixin.finishPartial(obj);
<add> return obj;
<ide> };
<ide>
<ide> /**
<ide> Mixin.applyPartial = function(obj) {
<ide>
<ide> Mixin.finishPartial = function(obj) {
<ide> connectBindings(obj);
<add> applyObservers(obj);
<ide> return obj;
<ide> };
<ide>
<ide> MixinPrototype.reopen = function() {
<ide> return this;
<ide> };
<ide>
<del>var TMP_ARRAY = [];
<ide> MixinPrototype.apply = function(obj) {
<del> TMP_ARRAY[0] = this;
<del> var ret = applyMixin(obj, TMP_ARRAY, false);
<del> TMP_ARRAY.length=0;
<del> return ret;
<add> applyMixin(obj, [this], false);
<add> Mixin.finishPartial(obj);
<add> return obj;
<ide> };
<ide>
<ide> MixinPrototype.applyPartial = function(obj) {
<del> TMP_ARRAY[0] = this;
<del> var ret = applyMixin(obj, TMP_ARRAY, true);
<del> TMP_ARRAY.length=0;
<del> return ret;
<add> applyMixin(obj, [this], true);
<add> return obj;
<ide> };
<ide>
<ide> /** @private */
<ide><path>packages/ember-metal/lib/utils.js
<ide> Ember.meta = function meta(obj, writable) {
<ide> ret = obj[META_KEY] = createMeta({
<ide> descs: {},
<ide> watching: {},
<add> observers: {},
<add> beforeObservers: {},
<ide> values: {},
<ide> lastSetValues: {},
<ide> cache: {},
<ide> Ember.meta = function meta(obj, writable) {
<ide> ret.descs = o_create(ret.descs);
<ide> ret.values = o_create(ret.values);
<ide> ret.watching = o_create(ret.watching);
<add> ret.observers = o_create(ret.observers);
<add> ret.beforeObservers = o_create(ret.beforeObservers);
<ide> ret.lastSetValues = {};
<ide> ret.cache = {};
<ide> ret.source = obj; | 2 |
Ruby | Ruby | inline error message | d4c8f83381b909c7448c435244745d9ad08cc54b | <ide><path>Library/Homebrew/global.rb
<ide> module Homebrew
<ide> require 'compat' unless ARGV.include? "--no-compat" or ENV['HOMEBREW_NO_COMPAT']
<ide>
<ide> ORIGINAL_PATHS = ENV['PATH'].split(File::PATH_SEPARATOR).map{ |p| Pathname.new(p).expand_path rescue nil }.compact.freeze
<del>
<del>SUDO_BAD_ERRMSG = <<-EOS.undent
<del> You can use brew with sudo, but only if the brew executable is owned by root.
<del> However, this is both not recommended and completely unsupported so do so at
<del> your own risk.
<del>EOS
<ide><path>Library/brew.rb
<ide> def require? path
<ide>
<ide> if sudo_check.include? cmd
<ide> if Process.uid.zero? and not File.stat(HOMEBREW_BREW_FILE).uid.zero?
<del> raise "Cowardly refusing to `sudo brew #{cmd}`\n#{SUDO_BAD_ERRMSG}"
<add> raise <<-EOS.undent
<add> Cowardly refusing to `sudo brew #{cmd}`
<add> You can use brew with sudo, but only if the brew executable is owned by root.
<add> However, this is both not recommended and completely unsupported so do so at
<add> your own risk.
<add> EOS
<ide> end
<ide> end
<ide> | 2 |
Go | Go | allow libcontainer to eval symlink destination | f25bbedc85e8a99c1389dbe8f48436907ce24526 | <ide><path>daemon/execdriver/native/create.go
<ide> import (
<ide> "errors"
<ide> "fmt"
<ide> "net"
<del> "path/filepath"
<ide> "strings"
<ide> "syscall"
<ide>
<ide> "github.com/docker/docker/daemon/execdriver"
<del> "github.com/docker/docker/pkg/symlink"
<ide> "github.com/docker/libcontainer/apparmor"
<ide> "github.com/docker/libcontainer/configs"
<ide> "github.com/docker/libcontainer/devices"
<ide> func (d *driver) setupMounts(container *configs.Config, c *execdriver.Command) e
<ide> container.Mounts = defaultMounts
<ide>
<ide> for _, m := range c.Mounts {
<del> dest, err := symlink.FollowSymlinkInScope(filepath.Join(c.Rootfs, m.Destination), c.Rootfs)
<del> if err != nil {
<del> return err
<del> }
<ide> flags := syscall.MS_BIND | syscall.MS_REC
<ide> if !m.Writable {
<ide> flags |= syscall.MS_RDONLY
<ide> }
<ide> if m.Slave {
<ide> flags |= syscall.MS_SLAVE
<ide> }
<del>
<ide> container.Mounts = append(container.Mounts, &configs.Mount{
<ide> Source: m.Source,
<del> Destination: dest,
<add> Destination: m.Destination,
<ide> Device: "bind",
<ide> Flags: flags,
<ide> })
<ide><path>integration-cli/docker_cli_run_test.go
<ide> func TestRunReadProcLatency(t *testing.T) {
<ide> }
<ide> logDone("run - read /proc/latency_stats")
<ide> }
<add>
<add>func TestMountIntoProc(t *testing.T) {
<add> defer deleteAllContainers()
<add> code, err := runCommand(exec.Command(dockerBinary, "run", "-v", "/proc//sys", "busybox", "true"))
<add> if err == nil || code == 0 {
<add> t.Fatal("container should not be able to mount into /proc")
<add> }
<add> logDone("run - mount into proc")
<add>}
<add>
<add>func TestMountIntoSys(t *testing.T) {
<add> defer deleteAllContainers()
<add> code, err := runCommand(exec.Command(dockerBinary, "run", "-v", "/sys/", "busybox", "true"))
<add> if err == nil || code == 0 {
<add> t.Fatal("container should not be able to mount into /sys")
<add> }
<add> logDone("run - mount into sys")
<add>} | 2 |
PHP | PHP | fix lint and failing tests | f5c242f4f77ab982e053981618ddda8ec4cec480 | <ide><path>src/Error/Debugger.php
<ide> public static function checkSecurityKeys(): void
<ide> $salt = Security::getSalt();
<ide> if ($salt === '__SALT__' || strlen($salt) < 32) {
<ide> trigger_error(
<del> "Please change the value of `Security.salt` in `ROOT/config/app.php` to a random value of at least 32 characters.",
<add> 'Please change the value of `Security.salt` in `ROOT/config/app.php` ' .
<add> 'to a random value of at least 32 characters.',
<ide> E_USER_NOTICE
<ide> );
<ide> }
<ide><path>tests/TestCase/Auth/FallbackPasswordHasherTest.php
<ide> use Cake\Auth\DefaultPasswordHasher;
<ide> use Cake\Auth\FallbackPasswordHasher;
<ide> use Cake\Auth\WeakPasswordHasher;
<add>use Cake\Utility\Security;
<ide> use Cake\TestSuite\TestCase;
<ide>
<ide> /**
<ide> * Test case for FallbackPasswordHasher
<ide> */
<ide> class FallbackPasswordHasherTest extends TestCase
<ide> {
<add> public function setUp(): void
<add> {
<add> parent::setUp();
<add> Security::setSalt('YJfIxfs2guVoUubWDYhG93b0qyJfIxfs2guwvniR2G0FgaC9mia1390as13dla8kjasdlwerpoiASf');
<add> }
<add>
<ide> /**
<ide> * Tests that only the first hasher is user for hashing a password
<ide> *
<ide><path>tests/TestCase/Auth/WeakPasswordHasherTest.php
<ide> public function setUp(): void
<ide> {
<ide> parent::setUp();
<ide>
<del> Security::setSalt('YJfIxfs2guVoUubWDYhG93b0qyJfIxfs2guwvniR2G0FgaC9mi');
<add> Security::setSalt('YJfIxfs2guVoUubWDYhG93b0qyJfIxfs2guwvniR2G0FgaC9mia1390as13dla8kjasdlwerpoiASf');
<ide> }
<ide>
<ide> /** | 3 |
Python | Python | fix flaky test | d870e45eb0e4b3d6a8c8441d797becde2d17ab4d | <ide><path>tests/keras/layers/test_embeddings.py
<ide> def test_unitnorm_constraint():
<ide> class_mode='binary')
<ide> lookup.train_on_batch(X1, np.array([[1], [0]], dtype='int32'))
<ide> norm = np.linalg.norm(K.get_value(lookup.params[0]), axis=1)
<del> assert_allclose(norm, np.ones_like(norm).astype('float32'))
<add> assert_allclose(norm, np.ones_like(norm).astype('float32'), rtol=1e-05)
<ide>
<ide>
<ide> if __name__ == '__main__': | 1 |
Python | Python | set version to v3.0.0a14 | 12e1279f6b4c8db6f7f9f399de6901a429d3aaca | <ide><path>spacy/about.py
<ide> # fmt: off
<ide> __title__ = "spacy-nightly"
<del>__version__ = "3.0.0a13"
<add>__version__ = "3.0.0a14"
<ide> __release__ = True
<ide> __download_url__ = "https://github.com/explosion/spacy-models/releases/download"
<ide> __compatibility__ = "https://raw.githubusercontent.com/explosion/spacy-models/master/compatibility.json" | 1 |
Go | Go | add todo for return error on registered() | 69b0913e1f47c1f1bdf8d191f6061202455f875b | <ide><path>distribution/xfer/download.go
<ide> type DownloadDescriptor interface {
<ide> // DownloadDescriptorWithRegistered is successful.
<ide> type DownloadDescriptorWithRegistered interface {
<ide> DownloadDescriptor
<add>
<add> // TODO existing implementations in distribution and builder-next swallow errors
<add> // when registering the diffID. Consider changing the Registered signature
<add> // to return the error.
<add>
<ide> Registered(diffID layer.DiffID)
<ide> }
<ide> | 1 |
Python | Python | use list to modify in-place | 89111adf059483780955f4176f536607d3e559a6 | <ide><path>airflow/jobs.py
<ide> def _execute(self):
<ide>
<ide> # Triggering what is ready to get triggered
<ide> while tasks_to_run:
<del> for key, ti in tasks_to_run.items():
<add> for key, ti in list(tasks_to_run.items()):
<ide> ti.refresh_from_db()
<ide> if ti.state == State.SUCCESS and key in tasks_to_run:
<ide> succeeded.append(key) | 1 |
Javascript | Javascript | fix ext commands to be double quoted | 82f067e60bb3eb87cc1119655ae0a5968e988326 | <ide><path>test/parallel/test-eval.js
<ide> var exec = require('child_process').exec;
<ide> var success_count = 0;
<ide> var error_count = 0;
<ide>
<del>var cmd = [process.execPath, '-e', '"console.error(process.argv)"',
<add>var cmd = ['"' + process.execPath + '"', '-e', '"console.error(process.argv)"',
<ide> 'foo', 'bar'].join(' ');
<ide> var expected = util.format([process.execPath, 'foo', 'bar']) + '\n';
<ide> var child = exec(cmd, function(err, stdout, stderr) {
<ide><path>test/parallel/test-fs-readfile-error.js
<ide> var callbacks = 0;
<ide>
<ide> function test(env, cb) {
<ide> var filename = path.join(common.fixturesDir, 'test-fs-readfile-error.js');
<del> var execPath = process.execPath + ' ' + filename;
<add> var execPath = '"' + process.execPath + '" "' + filename + '"';
<ide> var options = { env: env || {} };
<ide> exec(execPath, options, function(err, stdout, stderr) {
<ide> assert(err);
<ide><path>test/parallel/test-http-curl-chunk-problem.js
<ide> var count = 0;
<ide> function maybeMakeRequest() {
<ide> if (++count < 2) return;
<ide> console.log('making curl request');
<del> var cmd = 'curl http://127.0.0.1:' + common.PORT + '/ | ' + common.opensslCli + ' sha1';
<add> var cmd = 'curl http://127.0.0.1:' + common.PORT + '/ | ' +
<add> '"' + common.opensslCli + '" sha1';
<ide> cp.exec(cmd, function(err, stdout, stderr) {
<ide> if (err) throw err;
<ide> var hex = stdout.match(/([A-Fa-f0-9]{40})/)[0];
<ide><path>test/parallel/test-tls-ecdh-disable.js
<ide> var server = tls.createServer(options, function(conn) {
<ide> });
<ide>
<ide> server.listen(common.PORT, '127.0.0.1', function() {
<del> var cmd = common.opensslCli + ' s_client -cipher ' + options.ciphers +
<add> var cmd = '"' + common.opensslCli + '" s_client -cipher ' + options.ciphers +
<ide> ' -connect 127.0.0.1:' + common.PORT;
<ide>
<ide> exec(cmd, function(err, stdout, stderr) {
<ide><path>test/parallel/test-tls-ecdh.js
<ide> var server = tls.createServer(options, function(conn) {
<ide> });
<ide>
<ide> server.listen(common.PORT, '127.0.0.1', function() {
<del> var cmd = common.opensslCli + ' s_client -cipher ' + options.ciphers +
<add> var cmd = '"' + common.opensslCli + '" s_client -cipher ' + options.ciphers +
<ide> ' -connect 127.0.0.1:' + common.PORT;
<ide>
<ide> exec(cmd, function(err, stdout, stderr) {
<ide><path>test/parallel/test-tls-set-ciphers.js
<ide> var server = tls.createServer(options, function(conn) {
<ide> });
<ide>
<ide> server.listen(common.PORT, '127.0.0.1', function() {
<del> var cmd = common.opensslCli + ' s_client -cipher ' + options.ciphers +
<add> var cmd = '"' + common.opensslCli + '" s_client -cipher ' + options.ciphers +
<ide> ' -connect 127.0.0.1:' + common.PORT;
<ide>
<ide> exec(cmd, function(err, stdout, stderr) {
<ide><path>test/pummel/test-exec.js
<ide> var success_count = 0;
<ide> var error_count = 0;
<ide>
<ide>
<del>exec(process.execPath + ' -p -e process.versions',
<add>exec('"' + process.execPath + '" -p -e process.versions',
<ide> function(err, stdout, stderr) {
<ide> if (err) {
<ide> error_count++;
<ide><path>test/sequential/test-child-process-execsync.js
<ide> var err;
<ide> var caught = false;
<ide> try
<ide> {
<del> var cmd = util.format('%s -e "setTimeout(function(){}, %d);"',
<add> var cmd = util.format('"%s" -e "setTimeout(function(){}, %d);"',
<ide> process.execPath, SLEEP);
<ide> var ret = execSync(cmd, {timeout: TIMER});
<ide> } catch (e) {
<ide> var msg = 'foobar';
<ide> var msgBuf = new Buffer(msg + '\n');
<ide>
<ide> // console.log ends every line with just '\n', even on Windows.
<del>cmd = util.format('%s -e "console.log(\'%s\');"', process.execPath, msg);
<add>cmd = util.format('"%s" -e "console.log(\'%s\');"', process.execPath, msg);
<ide>
<ide> var ret = execSync(cmd);
<ide>
<ide><path>test/sequential/test-init.js
<ide> var envCopy = JSON.parse(JSON.stringify(process.env));
<ide> envCopy.TEST_INIT = 1;
<ide>
<del> child.exec(process.execPath + ' test-init', {env: envCopy},
<add> child.exec('"' + process.execPath + '" test-init', {env: envCopy},
<ide> function(err, stdout, stderr) {
<ide> assert.equal(stdout, 'Loaded successfully!',
<ide> '`node test-init` failed!');
<ide> });
<del> child.exec(process.execPath + ' test-init.js', {env: envCopy},
<add> child.exec('"' + process.execPath + '" test-init.js', {env: envCopy},
<ide> function(err, stdout, stderr) {
<ide> assert.equal(stdout, 'Loaded successfully!',
<ide> '`node test-init.js` failed!');
<ide> // test-init-index is in fixtures dir as requested by ry, so go there
<ide> process.chdir(common.fixturesDir);
<ide>
<del> child.exec(process.execPath + ' test-init-index', {env: envCopy},
<add> child.exec('"' + process.execPath + '" test-init-index', {env: envCopy},
<ide> function(err, stdout, stderr) {
<ide> assert.equal(stdout, 'Loaded successfully!',
<ide> '`node test-init-index failed!');
<ide> // expected in node
<ide> process.chdir(common.fixturesDir + '/test-init-native/');
<ide>
<del> child.exec(process.execPath + ' fs', {env: envCopy},
<add> child.exec('"' + process.execPath + '" fs', {env: envCopy},
<ide> function(err, stdout, stderr) {
<ide> assert.equal(stdout, 'fs loaded successfully',
<ide> '`node fs` failed!');
<ide><path>test/sequential/test-regress-GH-4015.js
<ide> var common = require('../common');
<ide> var assert = require('assert');
<ide> var exec = require('child_process').exec;
<ide>
<del>var cmd = process.execPath
<del> + ' '
<del> + common.fixturesDir
<del> + '/test-regress-GH-4015.js';
<add>var cmd = '"' + process.execPath + '" ' +
<add> '"' + common.fixturesDir + '/test-regress-GH-4015.js"';
<ide>
<ide> exec(cmd, function(err, stdout, stderr) {
<ide> assert(/RangeError: Maximum call stack size exceeded/.test(stderr)); | 10 |
Javascript | Javascript | update prefetch check to prevent re-prefetching | e360c105ab685caa62ef99c8227540a4c010df21 | <ide><path>packages/next/client/page-loader.js
<ide> export default class PageLoader {
<ide>
<ide> async prefetch (route) {
<ide> route = this.normalizeRoute(route)
<del> const scriptRoute = route === '/' ? '/index.js' : `${route}.js`
<del> if (this.prefetchCache.has(scriptRoute)) {
<add> const scriptRoute = `${route === '/' ? '/index' : route}.js`
<add>
<add> if (
<add> this.prefetchCache.has(scriptRoute) ||
<add> document.getElementById(`__NEXT_PAGE__${route}`)
<add> ) {
<ide> return
<ide> }
<ide> this.prefetchCache.add(scriptRoute)
<ide><path>test/integration/flying-shuttle/test/index.test.js
<ide> describe('Flying Shuttle', () => {
<ide> }
<ide> })
<ide>
<add> it('should not re-prefetch for the page its already on', async () => {
<add> const browser = await webdriver(appPort, '/')
<add> const links = await browser.elementsByCss('link[rel=preload]')
<add> let found = false
<add>
<add> for (const link of links) {
<add> const href = await link.getAttribute('href')
<add> if (href.endsWith('index.js')) {
<add> found = true
<add> break
<add> }
<add> }
<add> expect(found).toBe(false)
<add> if (browser) await browser.close()
<add> })
<add>
<ide> it('should render /index', async () => {
<ide> const html = await renderViaHTTP(secondAppPort, '/')
<ide> expect(html).toMatch(/omega lol/) | 2 |
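The fix above adds a second guard before prefetching: besides the in-memory cache, it skips routes whose script tag is already in the document. A framework-free sketch of that de-duplication idea; `makePrefetcher` and `alreadyOnPage` are illustrative stand-ins, not Next.js API (the real code checks `document.getElementById` for an existing `__NEXT_PAGE__` script):

```javascript
// De-duplicate prefetches against two sources of truth: an in-memory
// cache of script routes already requested, and a predicate standing in
// for the DOM check (a script for the route already on the page).
function makePrefetcher(fetchScript, alreadyOnPage) {
  var prefetchCache = new Set();
  return function prefetch(route) {
    var scriptRoute = (route === '/' ? '/index' : route) + '.js';
    if (prefetchCache.has(scriptRoute) || alreadyOnPage(route)) {
      return false; // skip: already cached, or already on the page
    }
    prefetchCache.add(scriptRoute);
    fetchScript(scriptRoute);
    return true;
  };
}

var fetched = [];
var prefetch = makePrefetcher(
  function (script) { fetched.push(script); },
  function (route) { return route === '/about'; } // pretend /about is the current page
);
prefetch('/');      // fetches "/index.js"
prefetch('/');      // skipped: already in the cache
prefetch('/about'); // skipped: already on the page
console.log(fetched); // [ '/index.js' ]
```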
Javascript | Javascript | move temporary state to node._tree | 6e901439fd9080c0bf29ee3552d2b1ca592e073e | <ide><path>d3.layout.js
<ide> d3.layout.tree = function() {
<ide> y1 = 0; // max depth
<ide>
<ide> function firstWalk(node, previousSibling) {
<del> var children = node.children;
<del> if (!children) {
<del> if (previousSibling) {
<del> node.prelim = previousSibling.prelim + separation(node, previousSibling);
<del> }
<del> } else {
<add> var children = node.children,
<add> layout = node._tree;
<add> if (children) {
<ide> var n = children.length,
<del> firstChild = children[0],
<del> lastChild = children[n - 1],
<del> ancestor = firstChild,
<add> ancestor = children[0],
<ide> previousChild,
<ide> child,
<ide> i = -1;
<ide> d3.layout.tree = function() {
<ide> previousChild = child;
<ide> }
<ide> d3_layout_treeShift(node);
<del> var midpoint = .5 * (firstChild.prelim + lastChild.prelim);
<add> var midpoint = .5 * (children[0]._tree.prelim + children[n - 1]._tree.prelim);
<ide> if (previousSibling) {
<del> node.prelim = previousSibling.prelim + separation(node, previousSibling);
<del> node.mod = node.prelim - midpoint;
<add> layout.prelim = previousSibling._tree.prelim + separation(node, previousSibling);
<add> layout.mod = layout.prelim - midpoint;
<ide> } else {
<del> node.prelim = midpoint;
<add> layout.prelim = midpoint;
<add> }
<add> } else {
<add> if (previousSibling) {
<add> layout.prelim = previousSibling._tree.prelim + separation(node, previousSibling);
<ide> }
<ide> }
<ide> }
<ide>
<ide> function secondWalk(node, x) {
<del> node.breadth = node.prelim + x;
<add> node.x = node._tree.prelim + x;
<ide> var children = node.children;
<ide> if (children) {
<ide> var i = -1,
<ide> n = children.length;
<del> x += node.mod;
<add> x += node._tree.mod;
<ide> while (++i < n) {
<ide> secondWalk(children[i], x);
<ide> }
<ide> }
<ide>
<ide> // Compute extent of breadth and depth.
<del> if (node.breadth < x0) x0 = node.breadth;
<del> if (node.breadth > x1) x1 = node.breadth;
<add> if (node.x < x0) x0 = node.x;
<add> if (node.x > x1) x1 = node.x;
<ide> if (node.depth > y1) y1 = node.depth;
<ide> }
<ide>
<ide> d3.layout.tree = function() {
<ide> vop = node,
<ide> vim = previousSibling,
<ide> vom = node.parent.children[0],
<del> sip = vip.mod,
<del> sop = vop.mod,
<del> sim = vim.mod,
<del> som = vom.mod,
<add> sip = vip._tree.mod,
<add> sop = vop._tree.mod,
<add> sim = vim._tree.mod,
<add> som = vom._tree.mod,
<ide> shift;
<ide> while (vim = d3_tree_layoutRight(vim), vip = d3_tree_layoutLeft(vip), vim && vip) {
<ide> vom = d3_tree_layoutLeft(vom);
<ide> vop = d3_tree_layoutRight(vop);
<del> vop.ancestor = node;
<del> shift = vim.prelim + sim - vip.prelim - sip + separation(vim, vip);
<add> vop._tree.ancestor = node;
<add> shift = vim._tree.prelim + sim - vip._tree.prelim - sip + separation(vim, vip);
<ide> if (shift > 0) {
<ide> d3_layout_treeMove(d3_layout_treeAncestor(vim, node, ancestor), node, shift);
<ide> sip += shift;
<ide> sop += shift;
<ide> }
<del> sim += vim.mod;
<del> sip += vip.mod;
<del> som += vom.mod;
<del> sop += vop.mod;
<add> sim += vim._tree.mod;
<add> sip += vip._tree.mod;
<add> som += vom._tree.mod;
<add> sop += vop._tree.mod;
<ide> }
<ide> if (vim && !d3_tree_layoutRight(vop)) {
<del> vop.thread = vim;
<del> vop.mod += sim - sop;
<add> vop._tree.thread = vim;
<add> vop._tree.mod += sim - sop;
<ide> }
<ide> if (vip && !d3_tree_layoutLeft(vom)) {
<del> vom.thread = vip;
<del> vom.mod += sip - som;
<add> vom._tree.thread = vip;
<add> vom._tree.mod += sip - som;
<ide> ancestor = node;
<ide> }
<ide> }
<ide> return ancestor;
<ide> }
<ide>
<del> // Initialize temporary layout variables. TODO store separately?
<add> // Initialize temporary layout variables.
<ide> d3_layout_treeVisitAfter(root, function(node, previousSibling) {
<del> node.ancestor = node;
<del> node.prelim = 0;
<del> node.mod = 0;
<del> node.change = 0;
<del> node.shift = 0;
<del> node.number = previousSibling ? previousSibling.number + 1 : 0;
<add> node._tree = {
<add> ancestor: node,
<add> prelim: 0,
<add> mod: 0,
<add> change: 0,
<add> shift: 0,
<add> number: previousSibling ? previousSibling._tree.number + 1 : 0
<add> };
<ide> });
<ide>
<ide> // Compute the layout using Buchheim et al.'s algorithm.
<ide> firstWalk(root);
<del> secondWalk(root, -root.prelim);
<add> secondWalk(root, -root._tree.prelim);
<ide>
<ide> // Clear temporary layout variables; transform depth and breadth.
<ide> d3_layout_treeVisitAfter(root, function(node) {
<del> node.x = ((node.breadth - x0) / (x1 - x0)) * size[0];
<add> node.x = ((node.x - x0) / (x1 - x0)) * size[0];
<ide> node.y = node.depth / y1 * size[1];
<del> delete node.breadth;
<del> delete node.ancestor;
<del> delete node.prelim;
<del> delete node.mod;
<del> delete node.change;
<del> delete node.shift;
<del> delete node.number;
<del> delete node.thread;
<add> delete node._tree;
<ide> });
<ide>
<ide> return nodes;
<ide> function d3_layout_treeSeparation(a, b) {
<ide> // }
<ide>
<ide> function d3_tree_layoutLeft(node) {
<del> return node.children ? node.children[0] : node.thread;
<add> return node.children ? node.children[0] : node._tree.thread;
<ide> }
<ide>
<ide> function d3_tree_layoutRight(node) {
<del> return node.children ? node.children[node.children.length - 1] : node.thread;
<add> return node.children ? node.children[node.children.length - 1] : node._tree.thread;
<ide> }
<ide>
<ide> function d3_layout_treeVisitAfter(node, callback) {
<ide> function d3_layout_treeShift(node) {
<ide> i = children.length,
<ide> child;
<ide> while (--i >= 0) {
<del> child = children[i];
<add> child = children[i]._tree;
<ide> child.prelim += shift;
<ide> child.mod += shift;
<ide> shift += child.shift + (change += child.change);
<ide> }
<ide> }
<ide>
<ide> function d3_layout_treeMove(ancestor, node, shift) {
<del> var subtrees = node.number - ancestor.number;
<del> node.change -= shift / subtrees;
<add> ancestor = ancestor._tree;
<add> node = node._tree;
<add> var change = shift / (node.number - ancestor.number);
<add> ancestor.change += change;
<add> node.change -= change;
<ide> node.shift += shift;
<del> ancestor.change += shift / subtrees;
<ide> node.prelim += shift;
<ide> node.mod += shift;
<ide> }
<ide>
<ide> function d3_layout_treeAncestor(vim, node, ancestor) {
<del> return vim.ancestor.parent == node.parent ? vim.ancestor : ancestor;
<add> return vim._tree.ancestor.parent == node.parent
<add> ? vim._tree.ancestor
<add> : ancestor;
<ide> }
<ide> // Squarified Treemaps by Mark Bruls, Kees Huizing, and Jarke J. van Wijk
<ide> d3.layout.treemap = function() {
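The diff above replaces seven ad-hoc node properties (`prelim`, `mod`, `change`, `shift`, `number`, `ancestor`, `thread`) with a single `_tree` object that is created in one post-order pass and removed with one `delete` at the end. A trimmed-down sketch of that pattern; `visitAfter` here mirrors `d3_layout_treeVisitAfter`, and the annotation shown is reduced for illustration:

```javascript
// Post-order traversal that passes each node's previous sibling to the
// callback, mirroring d3_layout_treeVisitAfter.
function visitAfter(node, callback) {
  function visit(n, previousSibling) {
    var children = n.children;
    if (children) {
      var previous = null;
      for (var i = 0; i < children.length; i++) {
        visit(children[i], previous);
        previous = children[i];
      }
    }
    callback(n, previousSibling);
  }
  visit(node, null);
}

var root = { children: [{}, { children: [{}] }] };

// Pass 1: stash all temporary layout state under one property.
visitAfter(root, function (node, previousSibling) {
  node._tree = {
    prelim: 0,
    mod: 0,
    number: previousSibling ? previousSibling._tree.number + 1 : 0
  };
});

// ...the layout walks read and write node._tree.* here...

// Pass 2: one delete per node replaces seven.
visitAfter(root, function (node) { delete node._tree; });
```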
<ide><path>d3.layout.min.js
<del>(function(){function p(a,b,c){return a.ancestor.parent==b.parent?a.ancestor:c}function o(a,b,c){var d=b.number-a.number;b.change-=c/d,b.shift+=c,a.change+=c/d,b.prelim+=c,b.mod+=c}function n(a){var b=0,c=0,d=a.children,e=d.length,f;while(--e>=0)f=d[e],f.prelim+=b,f.mod+=b,b+=f.shift+(c+=f.change)}function m(a,b){function c(a,d){var e=a.children;if(e){var f,g=null,h=-1,i=e.length;while(++h<i)f=e[h],c(f,g),g=f}b(a,d)}c(a,null)}function l(a){return a.children?a.children[a.children.length-1]:a.thread}function k(a){return a.children?a.children[0]:a.thread}function j(a,b){return a.parent==b.parent?1:2}function i(a,b){return b.value-a.value}function h(a){return a.value}function g(a){return a.children}function f(a,b){return a+b.y}function e(a){var b=1,c=0,d=a[0].y,e,f=a.length;for(;b<f;++b)(e=a[b].y)>d&&(c=b,d=e);return c}function c(a){return a.reduce(f,0)}d3.layout={},d3.layout.chord=function(){function k(){b.sort(function(a,b){a=Math.min(a.source.value,a.target.value),b=Math.min(b.source.value,b.target.value);return i(a,b)})}function j(){var a={},j=[],l=d3.range(e),m=[],n,o,p,q,r;b=[],c=[],n=0,q=-1;while(++q<e){o=0,r=-1;while(++r<e)o+=d[q][r];j.push(o),m.push(d3.range(e)),n+=o}g&&l.sort(function(a,b){return g(j[a],j[b])}),h&&m.forEach(function(a,b){a.sort(function(a,c){return h(d[b][a],d[b][c])})}),n=(2*Math.PI-f*e)/n,o=0,q=-1;while(++q<e){p=o,r=-1;while(++r<e){var s=l[q],t=m[q][r],u=d[s][t];a[s+"-"+t]={index:s,subindex:t,startAngle:o,endAngle:o+=u*n,value:u}}c.push({index:s,startAngle:p,endAngle:o,value:(o-p)/n}),o+=f}q=-1;while(++q<e){r=q-1;while(++r<e){var v=a[q+"-"+r],w=a[r+"-"+q];(v.value||w.value)&&b.push({source:v,target:w})}}i&&k()}var a={},b,c,d,e,f=0,g,h,i;a.matrix=function(f){if(!arguments.length)return d;e=(d=f)&&d.length,b=c=null;return a},a.padding=function(d){if(!arguments.length)return f;f=d,b=c=null;return a},a.sortGroups=function(d){if(!arguments.length)return g;g=d,b=c=null;return a},a.sortSubgroups=function(c){if(!arguments.length)return 
h;h=c,b=null;return a},a.sortChords=function(c){if(!arguments.length)return i;i=c,b&&k();return a},a.chords=function(){b||j();return b},a.groups=function(){c||j();return c};return a},d3.layout.force=function(){function j(){var a=i.length,c,f,g,h,j,k,l;for(c=0;c<a;++c){f=i[c],g=f.source,h=f.target,k=h.x-g.x,l=h.y-g.y;if(j=Math.sqrt(k*k+l*l))j=d/(f.distance*f.distance)*(j-e*f.distance)/j,k*=j,l*=j,h.fixed||(h.x-=k,h.y-=l),g.fixed||(g.x+=k,g.y+=l)}b.tick.dispatch({type:"tick"});return(d*=.99)<.005}var a={},b=d3.dispatch("tick"),c=[1,1],d=.5,e=30,f,g,h,i;a.on=function(c,d){b[c].add(d);return a},a.nodes=function(b){if(!arguments.length)return g;g=b;return a},a.links=function(b){if(!arguments.length)return h;h=b;return a},a.size=function(b){if(!arguments.length)return c;c=b;return a},a.distance=function(b){if(!arguments.length)return e;e=b;return a},a.start=function(){var b,d,e,f=g.length,k=h.length,l=c[0],m=c[1],n,o=[];for(b=0;b<f;++b){n=g[b],n.x=n.x||Math.random()*l,n.y=n.y||Math.random()*m,n.fixed=0,o[b]=[];for(d=0;d<f;++d)o[b][d]=Infinity;o[b][b]=0}for(b=0;b<k;++b)n=h[b],o[n.source][n.target]=1,o[n.target][n.source]=1,n.source=g[n.source],n.target=g[n.target];for(e=0;e<f;++e)for(b=0;b<f;++b)for(d=0;d<f;++d)o[b][d]=Math.min(o[b][d],o[b][e]+o[e][d]);i=[];for(b=0;b<f;++b)for(d=b+1;d<f;++d)i.push({source:g[b],target:g[d],distance:o[b][d]*o[b][d]});i.sort(function(a,b){return a.distance-b.distance}),d3.timer(j);return a},a.resume=function(){d=.1,d3.timer(j);return a},a.stop=function(){d=0;return a},a.drag=function(){function f(){!b||(e(),b.fixed=!1,b=c=null)}function e(){if(!!b){var d=d3.svg.mouse(c);b.x=d[0],b.y=d[1],a.resume()}}function d(a){(b=a).fixed=!0,c=this,d3.event.preventDefault()}var b,c;this.on("mouseover",function(a){a.fixed=!0}).on("mouseout",function(a){a!=b&&(a.fixed=!1)}).on("mousedown",d),d3.select(window).on("mousemove",e).on("mouseup",f);return a};return a},d3.layout.partition=function(){function e(e,f){var 
g=a.call(this,e,f);c(g[0],0,b[0],b[1]/d(g[0]));return g}function d(a){var b=a.children,c=0;if(b){var e=-1,f=b.length;while(++e<f)c=Math.max(c,d(b[e]))}return 1+c}function c(a,b,d,e){var f=a.children;a.x=b,a.y=a.depth*e,a.dx=d,a.dy=e;if(f){var g=-1,h=f.length,i,j;d/=a.value;while(++g<h)c(i=f[g],b,j=i.value*d,e),b+=j}}var a=d3.layout.hierarchy(),b=[1,1];e.sort=d3.rebind(e,a.sort),e.children=d3.rebind(e,a.children),e.value=d3.rebind(e,a.value),e.size=function(a){if(!arguments.length)return b;b=a;return e};return e},d3.layout.pie=function(){function f(f,g){var h=+(typeof c=="function"?c.apply(this,arguments):c),i=(typeof e=="function"?e.apply(this,arguments):e)-c,j=d3.range(f.length);b!=null&&j.sort(function(a,c){return b(f[a],f[c])});var k=f.map(a);i/=k.reduce(function(a,b){return a+b},0);var l=j.map(function(a){return{value:d=k[a],startAngle:h,endAngle:h+=d*i}});return f.map(function(a,b){return l[j[b]]})}var a=Number,b=null,c=0,e=2*Math.PI;f.value=function(b){if(!arguments.length)return a;a=b;return f},f.sort=function(a){if(!arguments.length)return b;b=a;return f},f.startAngle=function(a){if(!arguments.length)return c;c=a;return f},f.endAngle=function(a){if(!arguments.length)return e;e=a;return f};return f},d3.layout.stack=function(){function e(e){var f=e.length,g=e[0].length,h,i,j,k=a[c](e);b[d](e,k);for(i=0;i<g;++i)for(h=1,j=e[k[0]][i].y0;h<f;++h)e[k[h]][i].y0=j+=e[k[h-1]][i].y;return e}var c="default",d="zero";e.order=function(a){if(!arguments.length)return c;c=a;return e},e.offset=function(a){if(!arguments.length)return d;d=a;return e};return e};var a={"inside-out":function(a){var b=a.length,d,f,g=a.map(e),h=a.map(c),i=d3.range(b).sort(function(a,b){return g[a]-g[b]}),j=0,k=0,l=[],m=[];for(d=0;d<b;d++)f=i[d],j<k?(j+=h[f],l.push(f)):(k+=h[f],m.push(f));return m.reverse().concat(l)},reverse:function(a){return d3.range(a.length).reverse()},"default":function(a){return d3.range(a.length)}},b={silhouette:function(a,b){var 
c=a.length,d=a[0].length,e=[],f=0,g,h,i;for(h=0;h<d;++h){for(g=0,i=0;g<c;g++)i+=a[g][h].y;i>f&&(f=i),e.push(i)}for(h=0,g=b[0];h<d;++h)a[g][h].y0=(f-e[h])/2},wiggle:function(a,b){var c=a.length,d=a[0],e=d.length,f=0,g,h,i,j,k,l=b[0],m,n,o,p,q,r;a[l][0].y0=q=r=0;for(h=1;h<e;++h){for(g=0,m=0;g<c;++g)m+=a[g][h].y;for(g=0,n=0,p=d[h].x-d[h-1].x;g<c;++g){for(i=0,j=b[g],o=(a[j][h].y-a[j][h-1].y)/(2*p);i<g;++i)o+=(a[k=b[i]][h].y-a[k][h-1].y)/p;n+=o*a[j][h].y}a[l][h].y0=q-=m?n/m*p:0,q<r&&(r=q)}for(h=0;h<e;++h)a[l][h].y0-=r},zero:function(a,b){var c=0,d=a[0].length,e=b[0];for(;c<d;++c)a[e][c].y0=0}};d3.layout.hierarchy=function(){function j(a){var b=[];e(a,0,b);return b}function f(a,b){var d=a.children,e=0;if(d){var g=-1,h=d.length,i=b+1;while(++g<h)e+=f(d[g],i)}else e=c.call(j,a.data,b);return a.value=e}function e(f,g,h){var i=b.call(j,f,g),k={depth:g,data:f};h.push(k);if(i){var l=-1,m=i.length,n=k.children=[],o=0,p=g+1;while(++l<m)d=e(i[l],p,h),d.value>0&&(n.push(d),o+=d.value,d.parent=k);a&&n.sort(a),k.value=o}else k.value=c.call(j,f,g);return k}var a=i,b=g,c=h;j.sort=function(b){if(!arguments.length)return a;a=b;return j},j.children=function(a){if(!arguments.length)return b;b=a;return j},j.value=function(a){if(!arguments.length)return c;c=a;return j},j.revalue=function(a){f(a,0);return a};return j},d3.layout.tree=function(){function d(d,e){function s(a,c,d){if(c){var e=a,f=a,g=c,h=a.parent.children[0],i=e.mod,j=f.mod,m=g.mod,n=h.mod,q;while(g=l(g),e=k(e),g&&e)h=k(h),f=l(f),f.ancestor=a,q=g.prelim+m-e.prelim-i+b(g,e),q>0&&(o(p(g,a,d),a,q),i+=q,j+=q),m+=g.mod,i+=e.mod,n+=h.mod,j+=f.mod;g&&!l(f)&&(f.thread=g,f.mod+=m-j),e&&!k(h)&&(h.thread=e,h.mod+=i-n,d=a)}return d}function r(a,b){a.breadth=a.prelim+b;var c=a.children;if(c){var d=-1,e=c.length;b+=a.mod;while(++d<e)r(c[d],b)}a.breadth<h&&(h=a.breadth),a.breadth>i&&(i=a.breadth),a.depth>j&&(j=a.depth)}function q(a,c){var d=a.children;if(!d)c&&(a.prelim=c.prelim+b(a,c));else{var 
e=d.length,f=d[0],g=d[e-1],h=f,i,j,k=-1;while(++k<e)j=d[k],q(j,i),h=s(j,i,h),i=j;n(a);var l=.5*(f.prelim+g.prelim);c?(a.prelim=c.prelim+b(a,c),a.mod=a.prelim-l):a.prelim=l}}var f=a.call(this,d,e),g=f[0],h=0,i=0,j=0;m(g,function(a,b){a.ancestor=a,a.prelim=0,a.mod=0,a.change=0,a.shift=0,a.number=b?b.number+1:0}),q(g),r(g,-g.prelim),m(g,function(a){a.x=(a.breadth-h)/(i-h)*c[0],a.y=a.depth/j*c[1],delete a.breadth,delete a.ancestor,delete a.prelim,delete a.mod,delete a.change,delete a.shift,delete a.number,delete a.thread});return f}var a=d3.layout.hierarchy(),b=j,c=[1,1];d.sort=d3.rebind(d,a.sort),d.children=d3.rebind(d,a.children),d.separation=function(a){if(!arguments.length)return b;b=a;return d},d.size=function(a){if(!arguments.length)return c;c=a;return d};return d},d3.layout.treemap=function(){function k(b){var i=e||a(b),j=i[0];j.x=0,j.y=0,j.dx=c[0],j.dy=c[1],e&&a.revalue(j),f(j,c[0]*c[1]/j.value),(e?h:g)(j),d&&(e=i);return i}function j(a,c,d,e){var f=-1,g=a.length,h=d.x,i=d.y,j=c?b(a.area/c):0,k;if(c==d.dx){if(e||j>d.dy)j=d.dy;while(++f<g)k=a[f],k.x=h,k.y=i,k.dy=j,h+=k.dx=b(k.area/j);k.z=!0,k.dx+=d.x+d.dx-h,d.y+=j,d.dy-=j}else{if(e||j>d.dx)j=d.dx;while(++f<g)k=a[f],k.x=h,k.y=i,k.dx=j,i+=k.dy=b(k.area/j);k.z=!1,k.dy+=d.y+d.dy-i,d.x+=j,d.dx-=j}}function i(a,b){var c=a.area,d,e=0,f=Infinity,g=-1,h=a.length;while(++g<h)d=a[g].area,d<f&&(f=d),d>e&&(e=d);c*=c,b*=b;return Math.max(b*e/c,c/(b*f))}function h(a){if(!!a.children){var b={x:a.x,y:a.y,dx:a.dx,dy:a.dy},c=a.children.slice(),d,e=[];e.area=0;while(d=c.pop())e.push(d),e.area+=d.area,d.z!=null&&(j(e,d.z?b.dx:b.dy,b,!c.length),e.length=e.area=0);a.children.forEach(h)}}function g(a){if(!!a.children){var 
b={x:a.x,y:a.y,dx:a.dx,dy:a.dy},c=[],d=a.children.slice(),e,f=Infinity,h,k=Math.min(b.dx,b.dy),l;c.area=0;while((l=d.length)>0)c.push(e=d[l-1]),c.area+=e.area,(h=i(c,k))<=f?(d.pop(),f=h):(c.area-=c.pop().area,j(c,k,b,!1),k=Math.min(b.dx,b.dy),c.length=c.area=0,f=Infinity);c.length&&(j(c,k,b,!0),c.length=c.area=0),a.children.forEach(g)}}function f(a,b){var c=a.children;a.area=a.value*b;if(c){var d=-1,e=c.length;while(++d<e)f(c[d],b)}}var a=d3.layout.hierarchy(),b=Math.round,c=[1,1],d=!1,e;k.sort=d3.rebind(k,a.sort),k.children=d3.rebind(k,a.children),k.value=d3.rebind(k,a.value),k.size=function(a){if(!arguments.length)return c;c=a;return k},k.round=function(a){if(!arguments.length)return b!=Number;b=a?Math.round:Number;return k},k.sticky=function(a){if(!arguments.length)return d;d=a,e=null;return k};return k}})()
<ide>\ No newline at end of file
<add>(function(){function p(a,b,c){return a._tree.ancestor.parent==b.parent?a._tree.ancestor:c}function o(a,b,c){a=a._tree,b=b._tree;var d=c/(b.number-a.number);a.change+=d,b.change-=d,b.shift+=c,b.prelim+=c,b.mod+=c}function n(a){var b=0,c=0,d=a.children,e=d.length,f;while(--e>=0)f=d[e]._tree,f.prelim+=b,f.mod+=b,b+=f.shift+(c+=f.change)}function m(a,b){function c(a,d){var e=a.children;if(e){var f,g=null,h=-1,i=e.length;while(++h<i)f=e[h],c(f,g),g=f}b(a,d)}c(a,null)}function l(a){return a.children?a.children[a.children.length-1]:a._tree.thread}function k(a){return a.children?a.children[0]:a._tree.thread}function j(a,b){return a.parent==b.parent?1:2}function i(a,b){return b.value-a.value}function h(a){return a.value}function g(a){return a.children}function f(a,b){return a+b.y}function e(a){var b=1,c=0,d=a[0].y,e,f=a.length;for(;b<f;++b)(e=a[b].y)>d&&(c=b,d=e);return c}function c(a){return a.reduce(f,0)}d3.layout={},d3.layout.chord=function(){function k(){b.sort(function(a,b){a=Math.min(a.source.value,a.target.value),b=Math.min(b.source.value,b.target.value);return i(a,b)})}function j(){var a={},j=[],l=d3.range(e),m=[],n,o,p,q,r;b=[],c=[],n=0,q=-1;while(++q<e){o=0,r=-1;while(++r<e)o+=d[q][r];j.push(o),m.push(d3.range(e)),n+=o}g&&l.sort(function(a,b){return g(j[a],j[b])}),h&&m.forEach(function(a,b){a.sort(function(a,c){return h(d[b][a],d[b][c])})}),n=(2*Math.PI-f*e)/n,o=0,q=-1;while(++q<e){p=o,r=-1;while(++r<e){var s=l[q],t=m[q][r],u=d[s][t];a[s+"-"+t]={index:s,subindex:t,startAngle:o,endAngle:o+=u*n,value:u}}c.push({index:s,startAngle:p,endAngle:o,value:(o-p)/n}),o+=f}q=-1;while(++q<e){r=q-1;while(++r<e){var v=a[q+"-"+r],w=a[r+"-"+q];(v.value||w.value)&&b.push({source:v,target:w})}}i&&k()}var a={},b,c,d,e,f=0,g,h,i;a.matrix=function(f){if(!arguments.length)return d;e=(d=f)&&d.length,b=c=null;return a},a.padding=function(d){if(!arguments.length)return f;f=d,b=c=null;return a},a.sortGroups=function(d){if(!arguments.length)return g;g=d,b=c=null;return 
a},a.sortSubgroups=function(c){if(!arguments.length)return h;h=c,b=null;return a},a.sortChords=function(c){if(!arguments.length)return i;i=c,b&&k();return a},a.chords=function(){b||j();return b},a.groups=function(){c||j();return c};return a},d3.layout.force=function(){function j(){var a=i.length,c,f,g,h,j,k,l;for(c=0;c<a;++c){f=i[c],g=f.source,h=f.target,k=h.x-g.x,l=h.y-g.y;if(j=Math.sqrt(k*k+l*l))j=d/(f.distance*f.distance)*(j-e*f.distance)/j,k*=j,l*=j,h.fixed||(h.x-=k,h.y-=l),g.fixed||(g.x+=k,g.y+=l)}b.tick.dispatch({type:"tick"});return(d*=.99)<.005}var a={},b=d3.dispatch("tick"),c=[1,1],d=.5,e=30,f,g,h,i;a.on=function(c,d){b[c].add(d);return a},a.nodes=function(b){if(!arguments.length)return g;g=b;return a},a.links=function(b){if(!arguments.length)return h;h=b;return a},a.size=function(b){if(!arguments.length)return c;c=b;return a},a.distance=function(b){if(!arguments.length)return e;e=b;return a},a.start=function(){var b,d,e,f=g.length,k=h.length,l=c[0],m=c[1],n,o=[];for(b=0;b<f;++b){n=g[b],n.x=n.x||Math.random()*l,n.y=n.y||Math.random()*m,n.fixed=0,o[b]=[];for(d=0;d<f;++d)o[b][d]=Infinity;o[b][b]=0}for(b=0;b<k;++b)n=h[b],o[n.source][n.target]=1,o[n.target][n.source]=1,n.source=g[n.source],n.target=g[n.target];for(e=0;e<f;++e)for(b=0;b<f;++b)for(d=0;d<f;++d)o[b][d]=Math.min(o[b][d],o[b][e]+o[e][d]);i=[];for(b=0;b<f;++b)for(d=b+1;d<f;++d)i.push({source:g[b],target:g[d],distance:o[b][d]*o[b][d]});i.sort(function(a,b){return a.distance-b.distance}),d3.timer(j);return a},a.resume=function(){d=.1,d3.timer(j);return a},a.stop=function(){d=0;return a},a.drag=function(){function f(){!b||(e(),b.fixed=!1,b=c=null)}function e(){if(!!b){var d=d3.svg.mouse(c);b.x=d[0],b.y=d[1],a.resume()}}function d(a){(b=a).fixed=!0,c=this,d3.event.preventDefault()}var b,c;this.on("mouseover",function(a){a.fixed=!0}).on("mouseout",function(a){a!=b&&(a.fixed=!1)}).on("mousedown",d),d3.select(window).on("mousemove",e).on("mouseup",f);return a};return 
a},d3.layout.partition=function(){function e(e,f){var g=a.call(this,e,f);c(g[0],0,b[0],b[1]/d(g[0]));return g}function d(a){var b=a.children,c=0;if(b){var e=-1,f=b.length;while(++e<f)c=Math.max(c,d(b[e]))}return 1+c}function c(a,b,d,e){var f=a.children;a.x=b,a.y=a.depth*e,a.dx=d,a.dy=e;if(f){var g=-1,h=f.length,i,j;d/=a.value;while(++g<h)c(i=f[g],b,j=i.value*d,e),b+=j}}var a=d3.layout.hierarchy(),b=[1,1];e.sort=d3.rebind(e,a.sort),e.children=d3.rebind(e,a.children),e.value=d3.rebind(e,a.value),e.size=function(a){if(!arguments.length)return b;b=a;return e};return e},d3.layout.pie=function(){function f(f,g){var h=+(typeof c=="function"?c.apply(this,arguments):c),i=(typeof e=="function"?e.apply(this,arguments):e)-c,j=d3.range(f.length);b!=null&&j.sort(function(a,c){return b(f[a],f[c])});var k=f.map(a);i/=k.reduce(function(a,b){return a+b},0);var l=j.map(function(a){return{value:d=k[a],startAngle:h,endAngle:h+=d*i}});return f.map(function(a,b){return l[j[b]]})}var a=Number,b=null,c=0,e=2*Math.PI;f.value=function(b){if(!arguments.length)return a;a=b;return f},f.sort=function(a){if(!arguments.length)return b;b=a;return f},f.startAngle=function(a){if(!arguments.length)return c;c=a;return f},f.endAngle=function(a){if(!arguments.length)return e;e=a;return f};return f},d3.layout.stack=function(){function e(e){var f=e.length,g=e[0].length,h,i,j,k=a[c](e);b[d](e,k);for(i=0;i<g;++i)for(h=1,j=e[k[0]][i].y0;h<f;++h)e[k[h]][i].y0=j+=e[k[h-1]][i].y;return e}var c="default",d="zero";e.order=function(a){if(!arguments.length)return c;c=a;return e},e.offset=function(a){if(!arguments.length)return d;d=a;return e};return e};var a={"inside-out":function(a){var b=a.length,d,f,g=a.map(e),h=a.map(c),i=d3.range(b).sort(function(a,b){return g[a]-g[b]}),j=0,k=0,l=[],m=[];for(d=0;d<b;d++)f=i[d],j<k?(j+=h[f],l.push(f)):(k+=h[f],m.push(f));return m.reverse().concat(l)},reverse:function(a){return d3.range(a.length).reverse()},"default":function(a){return 
d3.range(a.length)}},b={silhouette:function(a,b){var c=a.length,d=a[0].length,e=[],f=0,g,h,i;for(h=0;h<d;++h){for(g=0,i=0;g<c;g++)i+=a[g][h].y;i>f&&(f=i),e.push(i)}for(h=0,g=b[0];h<d;++h)a[g][h].y0=(f-e[h])/2},wiggle:function(a,b){var c=a.length,d=a[0],e=d.length,f=0,g,h,i,j,k,l=b[0],m,n,o,p,q,r;a[l][0].y0=q=r=0;for(h=1;h<e;++h){for(g=0,m=0;g<c;++g)m+=a[g][h].y;for(g=0,n=0,p=d[h].x-d[h-1].x;g<c;++g){for(i=0,j=b[g],o=(a[j][h].y-a[j][h-1].y)/(2*p);i<g;++i)o+=(a[k=b[i]][h].y-a[k][h-1].y)/p;n+=o*a[j][h].y}a[l][h].y0=q-=m?n/m*p:0,q<r&&(r=q)}for(h=0;h<e;++h)a[l][h].y0-=r},zero:function(a,b){var c=0,d=a[0].length,e=b[0];for(;c<d;++c)a[e][c].y0=0}};d3.layout.hierarchy=function(){function j(a){var b=[];e(a,0,b);return b}function f(a,b){var d=a.children,e=0;if(d){var g=-1,h=d.length,i=b+1;while(++g<h)e+=f(d[g],i)}else e=c.call(j,a.data,b);return a.value=e}function e(f,g,h){var i=b.call(j,f,g),k={depth:g,data:f};h.push(k);if(i){var l=-1,m=i.length,n=k.children=[],o=0,p=g+1;while(++l<m)d=e(i[l],p,h),d.value>0&&(n.push(d),o+=d.value,d.parent=k);a&&n.sort(a),k.value=o}else k.value=c.call(j,f,g);return k}var a=i,b=g,c=h;j.sort=function(b){if(!arguments.length)return a;a=b;return j},j.children=function(a){if(!arguments.length)return b;b=a;return j},j.value=function(a){if(!arguments.length)return c;c=a;return j},j.revalue=function(a){f(a,0);return a};return j},d3.layout.tree=function(){function d(d,e){function s(a,c,d){if(c){var e=a,f=a,g=c,h=a.parent.children[0],i=e._tree.mod,j=f._tree.mod,m=g._tree.mod,n=h._tree.mod,q;while(g=l(g),e=k(e),g&&e)h=k(h),f=l(f),f._tree.ancestor=a,q=g._tree.prelim+m-e._tree.prelim-i+b(g,e),q>0&&(o(p(g,a,d),a,q),i+=q,j+=q),m+=g._tree.mod,i+=e._tree.mod,n+=h._tree.mod,j+=f._tree.mod;g&&!l(f)&&(f._tree.thread=g,f._tree.mod+=m-j),e&&!k(h)&&(h._tree.thread=e,h._tree.mod+=i-n,d=a)}return d}function r(a,b){a.x=a._tree.prelim+b;var c=a.children;if(c){var 
d=-1,e=c.length;b+=a._tree.mod;while(++d<e)r(c[d],b)}a.x<h&&(h=a.x),a.x>i&&(i=a.x),a.depth>j&&(j=a.depth)}function q(a,c){var d=a.children,e=a._tree;if(d){var f=d.length,g=d[0],h,i,j=-1;while(++j<f)i=d[j],q(i,h),g=s(i,h,g),h=i;n(a);var k=.5*(d[0]._tree.prelim+d[f-1]._tree.prelim);c?(e.prelim=c._tree.prelim+b(a,c),e.mod=e.prelim-k):e.prelim=k}else c&&(e.prelim=c._tree.prelim+b(a,c))}var f=a.call(this,d,e),g=f[0],h=0,i=0,j=0;m(g,function(a,b){a._tree={ancestor:a,prelim:0,mod:0,change:0,shift:0,number:b?b._tree.number+1:0}}),q(g),r(g,-g._tree.prelim),m(g,function(a){a.x=(a.x-h)/(i-h)*c[0],a.y=a.depth/j*c[1],delete a._tree});return f}var a=d3.layout.hierarchy(),b=j,c=[1,1];d.sort=d3.rebind(d,a.sort),d.children=d3.rebind(d,a.children),d.separation=function(a){if(!arguments.length)return b;b=a;return d},d.size=function(a){if(!arguments.length)return c;c=a;return d};return d},d3.layout.treemap=function(){function k(b){var i=e||a(b),j=i[0];j.x=0,j.y=0,j.dx=c[0],j.dy=c[1],e&&a.revalue(j),f(j,c[0]*c[1]/j.value),(e?h:g)(j),d&&(e=i);return i}function j(a,c,d,e){var f=-1,g=a.length,h=d.x,i=d.y,j=c?b(a.area/c):0,k;if(c==d.dx){if(e||j>d.dy)j=d.dy;while(++f<g)k=a[f],k.x=h,k.y=i,k.dy=j,h+=k.dx=b(k.area/j);k.z=!0,k.dx+=d.x+d.dx-h,d.y+=j,d.dy-=j}else{if(e||j>d.dx)j=d.dx;while(++f<g)k=a[f],k.x=h,k.y=i,k.dx=j,i+=k.dy=b(k.area/j);k.z=!1,k.dy+=d.y+d.dy-i,d.x+=j,d.dx-=j}}function i(a,b){var c=a.area,d,e=0,f=Infinity,g=-1,h=a.length;while(++g<h)d=a[g].area,d<f&&(f=d),d>e&&(e=d);c*=c,b*=b;return Math.max(b*e/c,c/(b*f))}function h(a){if(!!a.children){var b={x:a.x,y:a.y,dx:a.dx,dy:a.dy},c=a.children.slice(),d,e=[];e.area=0;while(d=c.pop())e.push(d),e.area+=d.area,d.z!=null&&(j(e,d.z?b.dx:b.dy,b,!c.length),e.length=e.area=0);a.children.forEach(h)}}function g(a){if(!!a.children){var 
b={x:a.x,y:a.y,dx:a.dx,dy:a.dy},c=[],d=a.children.slice(),e,f=Infinity,h,k=Math.min(b.dx,b.dy),l;c.area=0;while((l=d.length)>0)c.push(e=d[l-1]),c.area+=e.area,(h=i(c,k))<=f?(d.pop(),f=h):(c.area-=c.pop().area,j(c,k,b,!1),k=Math.min(b.dx,b.dy),c.length=c.area=0,f=Infinity);c.length&&(j(c,k,b,!0),c.length=c.area=0),a.children.forEach(g)}}function f(a,b){var c=a.children;a.area=a.value*b;if(c){var d=-1,e=c.length;while(++d<e)f(c[d],b)}}var a=d3.layout.hierarchy(),b=Math.round,c=[1,1],d=!1,e;k.sort=d3.rebind(k,a.sort),k.children=d3.rebind(k,a.children),k.value=d3.rebind(k,a.value),k.size=function(a){if(!arguments.length)return c;c=a;return k},k.round=function(a){if(!arguments.length)return b!=Number;b=a?Math.round:Number;return k},k.sticky=function(a){if(!arguments.length)return d;d=a,e=null;return k};return k}})()
<ide>\ No newline at end of file
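The `d3_layout_treeMove` / `d3_layout_treeShift` pair in the patch is Buchheim et al.'s trick for spreading a subtree shift over the siblings in between without touching each one at move time: `moveSubtree` records the shift plus a proportional `change` on the two endpoints, and a single right-to-left pass later applies the accumulated amounts. A standalone sketch against simplified sibling records (plain objects instead of `node._tree`):

```javascript
// Record a shift on `node`, charging a proportional negative `change`
// there and the matching positive `change` on `ancestor`, so siblings
// in between each pick up an equal share during the sweep.
function moveSubtree(ancestor, node, shift) {
  var change = shift / (node.number - ancestor.number);
  ancestor.change += change;
  node.change -= change;
  node.shift += shift;
  node.prelim += shift;
  node.mod += shift;
}

// Right-to-left pass applying all recorded shifts in one sweep,
// mirroring d3_layout_treeShift.
function executeShifts(children) {
  var shift = 0, change = 0;
  for (var i = children.length - 1; i >= 0; i--) {
    var child = children[i];
    child.prelim += shift;
    child.mod += shift;
    change += child.change;
    shift += child.shift + change;
  }
}

// Four siblings one unit apart; shift the last one right by 3.
var siblings = [0, 1, 2, 3].map(function (x, i) {
  return { number: i, prelim: x, mod: 0, change: 0, shift: 0 };
});
moveSubtree(siblings[0], siblings[3], 3);
executeShifts(siblings);
// The intermediate siblings are spread evenly:
console.log(siblings.map(function (s) { return s.prelim; })); // [ 0, 2, 4, 6 ]
```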
<ide><path>src/layout/tree.js
<ide> d3.layout.tree = function() {
<ide> y1 = 0; // max depth
<ide>
<ide> function firstWalk(node, previousSibling) {
<del> var children = node.children;
<del> if (!children) {
<del> if (previousSibling) {
<del> node.prelim = previousSibling.prelim + separation(node, previousSibling);
<del> }
<del> } else {
<add> var children = node.children,
<add> layout = node._tree;
<add> if (children) {
<ide> var n = children.length,
<del> firstChild = children[0],
<del> lastChild = children[n - 1],
<del> ancestor = firstChild,
<add> ancestor = children[0],
<ide> previousChild,
<ide> child,
<ide> i = -1;
<ide> d3.layout.tree = function() {
<ide> previousChild = child;
<ide> }
<ide> d3_layout_treeShift(node);
<del> var midpoint = .5 * (firstChild.prelim + lastChild.prelim);
<add> var midpoint = .5 * (children[0]._tree.prelim + children[n - 1]._tree.prelim);
<ide> if (previousSibling) {
<del> node.prelim = previousSibling.prelim + separation(node, previousSibling);
<del> node.mod = node.prelim - midpoint;
<add> layout.prelim = previousSibling._tree.prelim + separation(node, previousSibling);
<add> layout.mod = layout.prelim - midpoint;
<ide> } else {
<del> node.prelim = midpoint;
<add> layout.prelim = midpoint;
<add> }
<add> } else {
<add> if (previousSibling) {
<add> layout.prelim = previousSibling._tree.prelim + separation(node, previousSibling);
<ide> }
<ide> }
<ide> }
<ide>
<ide> function secondWalk(node, x) {
<del> node.breadth = node.prelim + x;
<add> node.x = node._tree.prelim + x;
<ide> var children = node.children;
<ide> if (children) {
<ide> var i = -1,
<ide> n = children.length;
<del> x += node.mod;
<add> x += node._tree.mod;
<ide> while (++i < n) {
<ide> secondWalk(children[i], x);
<ide> }
<ide> }
<ide>
<ide> // Compute extent of breadth and depth.
<del> if (node.breadth < x0) x0 = node.breadth;
<del> if (node.breadth > x1) x1 = node.breadth;
<add> if (node.x < x0) x0 = node.x;
<add> if (node.x > x1) x1 = node.x;
<ide> if (node.depth > y1) y1 = node.depth;
<ide> }
<ide>
<ide> d3.layout.tree = function() {
<ide> vop = node,
<ide> vim = previousSibling,
<ide> vom = node.parent.children[0],
<del> sip = vip.mod,
<del> sop = vop.mod,
<del> sim = vim.mod,
<del> som = vom.mod,
<add> sip = vip._tree.mod,
<add> sop = vop._tree.mod,
<add> sim = vim._tree.mod,
<add> som = vom._tree.mod,
<ide> shift;
<ide> while (vim = d3_tree_layoutRight(vim), vip = d3_tree_layoutLeft(vip), vim && vip) {
<ide> vom = d3_tree_layoutLeft(vom);
<ide> vop = d3_tree_layoutRight(vop);
<del> vop.ancestor = node;
<del> shift = vim.prelim + sim - vip.prelim - sip + separation(vim, vip);
<add> vop._tree.ancestor = node;
<add> shift = vim._tree.prelim + sim - vip._tree.prelim - sip + separation(vim, vip);
<ide> if (shift > 0) {
<ide> d3_layout_treeMove(d3_layout_treeAncestor(vim, node, ancestor), node, shift);
<ide> sip += shift;
<ide> sop += shift;
<ide> }
<del> sim += vim.mod;
<del> sip += vip.mod;
<del> som += vom.mod;
<del> sop += vop.mod;
<add> sim += vim._tree.mod;
<add> sip += vip._tree.mod;
<add> som += vom._tree.mod;
<add> sop += vop._tree.mod;
<ide> }
<ide> if (vim && !d3_tree_layoutRight(vop)) {
<del> vop.thread = vim;
<del> vop.mod += sim - sop;
<add> vop._tree.thread = vim;
<add> vop._tree.mod += sim - sop;
<ide> }
<ide> if (vip && !d3_tree_layoutLeft(vom)) {
<del> vom.thread = vip;
<del> vom.mod += sip - som;
<add> vom._tree.thread = vip;
<add> vom._tree.mod += sip - som;
<ide> ancestor = node;
<ide> }
<ide> }
<ide> return ancestor;
<ide> }
<ide>
<del> // Initialize temporary layout variables. TODO store separately?
<add> // Initialize temporary layout variables.
<ide> d3_layout_treeVisitAfter(root, function(node, previousSibling) {
<del> node.ancestor = node;
<del> node.prelim = 0;
<del> node.mod = 0;
<del> node.change = 0;
<del> node.shift = 0;
<del> node.number = previousSibling ? previousSibling.number + 1 : 0;
<add> node._tree = {
<add> ancestor: node,
<add> prelim: 0,
<add> mod: 0,
<add> change: 0,
<add> shift: 0,
<add> number: previousSibling ? previousSibling._tree.number + 1 : 0
<add> };
<ide> });
<ide>
<ide> // Compute the layout using Buchheim et al.'s algorithm.
<ide> firstWalk(root);
<del> secondWalk(root, -root.prelim);
<add> secondWalk(root, -root._tree.prelim);
<ide>
<ide> // Clear temporary layout variables; transform depth and breadth.
<ide> d3_layout_treeVisitAfter(root, function(node) {
<del> node.x = ((node.breadth - x0) / (x1 - x0)) * size[0];
<add> node.x = ((node.x - x0) / (x1 - x0)) * size[0];
<ide> node.y = node.depth / y1 * size[1];
<del> delete node.breadth;
<del> delete node.ancestor;
<del> delete node.prelim;
<del> delete node.mod;
<del> delete node.change;
<del> delete node.shift;
<del> delete node.number;
<del> delete node.thread;
<add> delete node._tree;
<ide> });
<ide>
<ide> return nodes;
<ide> function d3_layout_treeSeparation(a, b) {
<ide> // }
<ide>
<ide> function d3_tree_layoutLeft(node) {
<del> return node.children ? node.children[0] : node.thread;
<add> return node.children ? node.children[0] : node._tree.thread;
<ide> }
<ide>
<ide> function d3_tree_layoutRight(node) {
<del> return node.children ? node.children[node.children.length - 1] : node.thread;
<add> return node.children ? node.children[node.children.length - 1] : node._tree.thread;
<ide> }
<ide>
<ide> function d3_layout_treeVisitAfter(node, callback) {
<ide> function d3_layout_treeShift(node) {
<ide> i = children.length,
<ide> child;
<ide> while (--i >= 0) {
<del> child = children[i];
<add> child = children[i]._tree;
<ide> child.prelim += shift;
<ide> child.mod += shift;
<ide> shift += child.shift + (change += child.change);
<ide> }
<ide> }
<ide>
<ide> function d3_layout_treeMove(ancestor, node, shift) {
<del> var subtrees = node.number - ancestor.number;
<del> node.change -= shift / subtrees;
<add> ancestor = ancestor._tree;
<add> node = node._tree;
<add> var change = shift / (node.number - ancestor.number);
<add> ancestor.change += change;
<add> node.change -= change;
<ide> node.shift += shift;
<del> ancestor.change += shift / subtrees;
<ide> node.prelim += shift;
<ide> node.mod += shift;
<ide> }
<ide>
<ide> function d3_layout_treeAncestor(vim, node, ancestor) {
<del> return vim.ancestor.parent == node.parent ? vim.ancestor : ancestor;
<add> return vim._tree.ancestor.parent == node.parent
<add> ? vim._tree.ancestor
<add> : ancestor;
<ide> } | 3 |
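The tree.js patch above replaces eight per-node scratch attributes (`prelim`, `mod`, `shift`, …) with a single `_tree` namespace object, so teardown shrinks from eight `delete` statements to one. A minimal Python sketch of that pattern — the `Node` class and field set here are hypothetical, not d3's actual API:

```python
class Node:
    def __init__(self, children=None):
        self.children = children or []

def init_layout(node):
    # One namespace object per node for all scratch fields, like the
    # commit's node._tree, instead of several separate attributes.
    node.tree_tmp = {"ancestor": node, "prelim": 0.0, "mod": 0.0,
                     "change": 0.0, "shift": 0.0}
    for child in node.children:
        init_layout(child)

def clear_layout(node):
    # Teardown is now a single attribute delete per node.
    del node.tree_tmp
    for child in node.children:
        clear_layout(child)

root = Node([Node(), Node([Node()])])
init_layout(root)
assert root.children[1].tree_tmp["mod"] == 0.0
clear_layout(root)
assert not hasattr(root, "tree_tmp")
```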
Python | Python | fix mypy in amazon provider for sagemaker operator | dd12cfcfe9034b8c11fe9e2c3e504bae2036bade | <ide><path>airflow/providers/amazon/aws/operators/sagemaker.py
<ide>
<ide> import json
<ide> import sys
<del>from typing import TYPE_CHECKING, List, Optional, Sequence
<add>from typing import TYPE_CHECKING, Any, List, Optional, Sequence
<ide>
<ide> from botocore.exceptions import ClientError
<ide>
<ide> class SageMakerBaseOperator(BaseOperator):
<ide> template_ext: Sequence[str] = ()
<ide> template_fields_renderers = {'config': 'json'}
<ide> ui_color = '#ededed'
<del> integer_fields = []
<add> integer_fields: List[List[Any]] = []
<ide>
<ide> def __init__(self, *, config: dict, aws_conn_id: str = 'aws_default', **kwargs):
<ide> super().__init__(**kwargs) | 1 |
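The sagemaker fix annotates an empty class-level default so mypy infers a usable element type for subclass overrides. A hedged Python sketch of why the annotation matters — the class names and field paths below are invented for illustration:

```python
from typing import Any, List

class SageMakerBaseSketch:
    # Without the annotation, mypy infers an unhelpful element type for
    # the empty default; annotating the class attribute lets subclasses
    # override it with concrete field paths without type errors.
    integer_fields: List[List[Any]] = []

class TrainingSketch(SageMakerBaseSketch):
    integer_fields = [["ResourceConfig", "InstanceCount"]]

assert SageMakerBaseSketch.integer_fields == []
assert TrainingSketch.integer_fields[0] == ["ResourceConfig", "InstanceCount"]
```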
Mixed | Go | remove redundant parameter and fix typos | 713cae7ca2b8a5fa36e25f3fb30531f6400fafc3 | <ide><path>cli/command/container/exec.go
<ide> func NewExecCommand(dockerCli *command.DockerCli) *cobra.Command {
<ide> }
<ide>
<ide> func runExec(dockerCli *command.DockerCli, opts *execOptions, container string, execCmd []string) error {
<del> execConfig, err := parseExec(opts, container, execCmd)
<add> execConfig, err := parseExec(opts, execCmd)
<ide> // just in case the ParseExec does not exit
<ide> if container == "" || err != nil {
<ide> return cli.StatusError{StatusCode: 1}
<ide> func getExecExitCode(ctx context.Context, client apiclient.ContainerAPIClient, e
<ide>
<ide> // parseExec parses the specified args for the specified command and generates
<ide> // an ExecConfig from it.
<del>func parseExec(opts *execOptions, container string, execCmd []string) (*types.ExecConfig, error) {
<add>func parseExec(opts *execOptions, execCmd []string) (*types.ExecConfig, error) {
<ide> execConfig := &types.ExecConfig{
<ide> User: opts.user,
<ide> Privileged: opts.privileged,
<ide> Tty: opts.tty,
<ide> Cmd: execCmd,
<ide> Detach: opts.detach,
<del> // container is not used here
<ide> }
<ide>
<ide> // If -d is not set, attach to everything by default
<ide><path>cli/command/container/exec_test.go
<ide> import (
<ide> )
<ide>
<ide> type arguments struct {
<del> options execOptions
<del> container string
<del> execCmd []string
<add> options execOptions
<add> execCmd []string
<ide> }
<ide>
<ide> func TestParseExec(t *testing.T) {
<ide> func TestParseExec(t *testing.T) {
<ide> }
<ide>
<ide> for valid, expectedExecConfig := range valids {
<del> execConfig, err := parseExec(&valid.options, valid.container, valid.execCmd)
<add> execConfig, err := parseExec(&valid.options, valid.execCmd)
<ide> if err != nil {
<ide> t.Fatal(err)
<ide> }
<ide><path>cli/command/stack/list.go
<ide> func printTable(out io.Writer, stacks []*stack) {
<ide>
<ide> type stack struct {
<ide> // Name is the name of the stack
<del> Name string
<add> Name string
<ide> // Services is the number of the services
<ide> Services int
<ide> }
<ide><path>docs/reference/commandline/diff.md
<ide> Options:
<ide> --help Print usage
<ide> ```
<ide>
<del>List the changed files and directories in a container᾿s filesystem
<add>List the changed files and directories in a container᾿s filesystem.
<ide> There are 3 events that are listed in the `diff`:
<ide>
<ide> 1. `A` - Add | 4 |
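The Go change drops a parameter that `parseExec` accepted but never used. A tiny Python sketch of the slimmed-down signature — the option names are illustrative, not Docker's real flag set:

```python
def parse_exec(opts, exec_cmd):
    # The container name is validated by the caller and never consulted
    # when building the exec config, so it is no longer passed in at all.
    return {
        "user": opts.get("user", ""),
        "tty": opts.get("tty", False),
        "detach": opts.get("detach", False),
        "cmd": list(exec_cmd),
    }

config = parse_exec({"user": "root", "tty": True}, ["ls", "-l"])
assert config["cmd"] == ["ls", "-l"]
assert config["user"] == "root" and config["tty"] is True
```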
Ruby | Ruby | remove surprise if from show_exception middleware | 0f0630aaaeef2a7f9b57e09906a420b99a4f862f | <ide><path>actionpack/lib/action_dispatch/middleware/show_exceptions.rb
<ide> def initialize(app, exceptions_app)
<ide> def call(env)
<ide> @app.call(env)
<ide> rescue Exception => exception
<del> raise exception if env['action_dispatch.show_exceptions'] == false
<del> render_exception(env, exception)
<add> if env['action_dispatch.show_exceptions'] == false
<add> raise exception
<add> else
<add> render_exception(env, exception)
<add> end
<ide> end
<ide>
<ide> private | 1 |
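The Rails patch trades a trailing `raise ... if` for an explicit `if/else`, making both outcomes — re-raise versus render — visible at the same level. A hypothetical Python sketch of the same middleware shape:

```python
class ShowExceptions:
    """Sketch of the pattern in the diff above: re-raise when the caller
    opted out of exception pages, otherwise render an error response."""
    def __init__(self, app):
        self.app = app

    def call(self, env):
        try:
            return self.app(env)
        except Exception as exc:
            if env.get("show_exceptions") is False:
                raise  # surface the exception to the caller unchanged
            return self.render_exception(env, exc)

    def render_exception(self, env, exc):
        return (500, f"error: {exc}")

def boom(env):
    raise ValueError("kaboom")

mw = ShowExceptions(boom)
assert mw.call({"show_exceptions": True})[0] == 500
```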
Mixed | Javascript | allow `options` object as constructor arg | ce58df58d0360779d16d60ce3bb0e9979ec5fdf4 | <ide><path>doc/api/console.md
<ide> const { Console } = console;
<ide> ```
<ide>
<ide> ### new Console(stdout[, stderr][, ignoreErrors])
<add>### new Console(options)
<ide> <!-- YAML
<ide> changes:
<ide> - version: v8.0.0
<ide> pr-url: https://github.com/nodejs/node/pull/9744
<ide> description: The `ignoreErrors` option was introduced.
<add> - version: REPLACEME
<add> pr-url: https://github.com/nodejs/node/pull/19372
<add> description: The `Console` constructor now supports an `options` argument.
<ide> -->
<ide>
<del>* `stdout` {stream.Writable}
<del>* `stderr` {stream.Writable}
<del>* `ignoreErrors` {boolean} Ignore errors when writing to the underlying streams.
<del> Defaults to `true`.
<add>* `options` {Object}
<add> * `stdout` {stream.Writable}
<add> * `stderr` {stream.Writable}
<add> * `ignoreErrors` {boolean} Ignore errors when writing to the underlying
<add> streams. **Default:** `true`.
<ide>
<ide> Creates a new `Console` with one or two writable stream instances. `stdout` is a
<ide> writable stream to print log or info output. `stderr` is used for warning or
<ide> error output. If `stderr` is not provided, `stdout` is used for `stderr`.
<ide> const output = fs.createWriteStream('./stdout.log');
<ide> const errorOutput = fs.createWriteStream('./stderr.log');
<ide> // custom simple logger
<del>const logger = new Console(output, errorOutput);
<add>const logger = new Console({ stdout: output, stderr: errorOutput });
<ide> // use it like console
<ide> const count = 5;
<ide> logger.log('count: %d', count);
<ide> The global `console` is a special `Console` whose output is sent to
<ide> [`process.stdout`][] and [`process.stderr`][]. It is equivalent to calling:
<ide>
<ide> ```js
<del>new Console(process.stdout, process.stderr);
<add>new Console({ stdout: process.stdout, stderr: process.stderr });
<ide> ```
<ide>
<ide> ### console.assert(value[, ...message])
<ide><path>lib/console.js
<ide> const {
<ide> // Track amount of indentation required via `console.group()`.
<ide> const kGroupIndent = Symbol('groupIndent');
<ide>
<del>function Console(stdout, stderr, ignoreErrors = true) {
<add>function Console(options /* or: stdout, stderr, ignoreErrors = true */) {
<ide> if (!(this instanceof Console)) {
<del> return new Console(stdout, stderr, ignoreErrors);
<add> return new Console(...arguments);
<ide> }
<add>
<add> let stdout, stderr, ignoreErrors;
<add> if (options && typeof options.write !== 'function') {
<add> ({
<add> stdout,
<add> stderr = stdout,
<add> ignoreErrors = true
<add> } = options);
<add> } else {
<add> stdout = options;
<add> stderr = arguments[1];
<add> ignoreErrors = arguments[2] === undefined ? true : arguments[2];
<add> }
<add>
<ide> if (!stdout || typeof stdout.write !== 'function') {
<ide> throw new ERR_CONSOLE_WRITABLE_STREAM('stdout');
<ide> }
<del> if (!stderr) {
<del> stderr = stdout;
<del> } else if (typeof stderr.write !== 'function') {
<add> if (!stderr || typeof stderr.write !== 'function') {
<ide> throw new ERR_CONSOLE_WRITABLE_STREAM('stderr');
<ide> }
<ide>
<ide> Console.prototype.table = function(tabularData, properties) {
<ide> return final(keys, values);
<ide> };
<ide>
<del>module.exports = new Console(process.stdout, process.stderr);
<add>module.exports = new Console({
<add> stdout: process.stdout,
<add> stderr: process.stderr
<add>});
<ide> module.exports.Console = Console;
<ide>
<ide> function noop() {} | 2 |
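The Node change lets `Console` take either an options object or the legacy positional `(stdout, stderr, ignoreErrors)` arguments, distinguishing the two by whether the first argument has a `write` method. A Python sketch of that dual-signature detection — the function name and defaults are assumptions, not Node's API:

```python
import io

def make_console(options, stderr=None, ignore_errors=True):
    # Mirrors the detection above: anything without a .write method is
    # treated as the new options mapping; a writable stream means the
    # legacy positional form, where stderr defaults to stdout.
    if not hasattr(options, "write"):
        stdout = options.get("stdout")
        stderr = options.get("stderr", stdout)
        ignore_errors = options.get("ignore_errors", True)
    else:
        stdout = options
        stderr = stderr if stderr is not None else stdout
    if not hasattr(stdout, "write") or not hasattr(stderr, "write"):
        raise TypeError("writable streams required")
    return {"stdout": stdout, "stderr": stderr, "ignore_errors": ignore_errors}

out, err = io.StringIO(), io.StringIO()
c1 = make_console({"stdout": out, "stderr": err})
c2 = make_console(out)  # legacy form: stderr falls back to stdout
assert c1["stderr"] is err
assert c2["stderr"] is out and c2["ignore_errors"] is True
```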
Ruby | Ruby | remove unused require | cf6b13b2be78c5728b23f8a69aad0bf6eaee5fcb | <ide><path>actionpack/lib/action_controller/metal/mime_responds.rb
<del>require 'active_support/core_ext/array/extract_options'
<ide> require 'abstract_controller/collector'
<ide>
<ide> module ActionController #:nodoc: | 1 |
Text | Text | explain advanced contribution workflow more | b4d7b0f8657cd29147dd3215876b9f739ead7097 | <ide><path>docs/sources/project/advanced-contributing.md
<ide> The following provides greater detail on the process:
<ide>
<ide> 14. Acceptance and merge!
<ide>
<add>## About the Advanced process
<add>
<add>Docker is a large project. Our core team gets a great many design proposals.
<add>Design proposal discussions can span days, weeks, and longer. The number of comments can reach the 100s.
<add>In that situation, following the discussion flow and the decisions reached is crucial.
<add>
<add>Making a pull request with a design proposal simplifies this process:
<add>* you can leave comments on specific design proposal line
<add>* replies around line are easy to track
<add>* as a proposal changes and is updated, pages reset as line items resolve
<add>* Github maintains the entire history
<add>
<add>While proposals in pull requests do not end up merged into a master repository, they provide a convenient tool for managing the design process. | 1 |
Mixed | Python | remove sql like function in base_hook | cb0bf4a142656ee40b43a01660b6f6b08a9840fa | <ide><path>UPDATING.md
<ide> https://developers.google.com/style/inclusive-documentation
<ide>
<ide> -->
<ide>
<add>### Remove SQL support in base_hook
<add>
<add>Remove ``get_records`` and ``get_pandas_df`` and ``run`` from base_hook, which only apply for sql like hook,
<add>If want to use them, or your custom hook inherit them, please use ``dbapi_hook``
<add>
<ide> ### Changes to SalesforceHook
<ide>
<ide> Replace parameter ``sandbox`` with ``domain``. According to change in simple-salesforce package
<ide><path>airflow/hooks/base_hook.py
<ide> def get_hook(cls, conn_id: str) -> "BaseHook":
<ide> def get_conn(self):
<ide> """Returns connection for the hook."""
<ide> raise NotImplementedError()
<del>
<del> def get_records(self, sql):
<del> """Returns records for the sql query (for hooks that support SQL)."""
<del> # TODO: move it out from the base hook. It belongs to some common SQL hook most likely
<del> raise NotImplementedError()
<del>
<del> def get_pandas_df(self, sql):
<del> """Returns pandas dataframe for the sql query (for hooks that support SQL)."""
<del> # TODO: move it out from the base hook. It belongs to some common SQL hook most likely
<del> raise NotImplementedError()
<del>
<del> def run(self, sql):
<del> """Runs SQL query (for hooks that support SQL)."""
<del> # TODO: move it out from the base hook. It belongs to some common SQL hook most likely
<del> raise NotImplementedError()
<ide><path>airflow/providers/apache/hive/hooks/hive.py
<ide> from airflow.configuration import conf
<ide> from airflow.exceptions import AirflowException
<ide> from airflow.hooks.base_hook import BaseHook
<add>from airflow.hooks.dbapi_hook import DbApiHook
<ide> from airflow.security import utils
<ide> from airflow.utils.helpers import as_flattened_list
<ide> from airflow.utils.operator_helpers import AIRFLOW_VAR_NAME_FORMAT_MAPPING
<ide> def table_exists(self, table_name, db='default'):
<ide> return False
<ide>
<ide>
<del>class HiveServer2Hook(BaseHook):
<add>class HiveServer2Hook(DbApiHook):
<ide> """
<ide> Wrapper around the pyhive library
<ide>
<ide> class HiveServer2Hook(BaseHook):
<ide> are using impala you may need to set it to false in the
<ide> ``extra`` of your connection in the UI
<ide> """
<del> def __init__(self, hiveserver2_conn_id='hiveserver2_default'):
<del> super().__init__()
<del> self.hiveserver2_conn_id = hiveserver2_conn_id
<add> conn_name_attr = 'hiveserver2_conn_id'
<add> default_conn_name = 'hiveserver2_default'
<add> supports_autocommit = False
<ide>
<ide> def get_conn(self, schema=None):
<ide> """
<ide> Returns a Hive connection object.
<ide> """
<del> db = self.get_connection(self.hiveserver2_conn_id)
<add> db = self.get_connection(self.hiveserver2_conn_id) # pylint: disable=no-member
<ide> auth_mechanism = db.extra_dejson.get('authMechanism', 'NONE')
<ide> if auth_mechanism == 'NONE' and db.login is None:
<ide> # we need to give a username
<ide> def get_conn(self, schema=None):
<ide> self.log.warning(
<ide> "Detected deprecated 'GSSAPI' for authMechanism "
<ide> "for %s. Please use 'KERBEROS' instead",
<del> self.hiveserver2_conn_id
<add> self.hiveserver2_conn_id # pylint: disable=no-member
<ide> )
<ide> auth_mechanism = 'KERBEROS'
<ide>
<ide> def _get_results(self, hql, schema='default', fetch_size=None, hive_conf=None):
<ide> cur.arraysize = fetch_size or 1000
<ide>
<ide> # not all query services (e.g. impala AIRFLOW-4434) support the set command
<del> db = self.get_connection(self.hiveserver2_conn_id)
<add> db = self.get_connection(self.hiveserver2_conn_id) # pylint: disable=no-member
<ide> if db.extra_dejson.get('run_set_variable_statements', True):
<ide> env_context = get_context_from_env_var()
<ide> if hive_conf:
<ide><path>airflow/providers/grpc/hooks/grpc.py
<ide> def get_conn(self):
<ide> return channel
<ide>
<ide> def run(self, stub_class, call_func, streaming=False, data=None):
<add> """
<add> Call gRPC function and yield response to caller
<add> """
<ide> if data is None:
<ide> data = {}
<ide> with self.get_conn() as channel: | 4 |
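The Airflow refactor removes the SQL-only stubs from the base hook so they live solely on the DB-API subclass. A minimal Python sketch of the resulting hierarchy — class names echo the diff, but the method bodies are placeholders:

```python
class BaseHookSketch:
    # After the refactor, the base hook only promises a connection;
    # SQL helpers are no longer stubbed here with NotImplementedError.
    def get_conn(self):
        raise NotImplementedError

class DbApiHookSketch(BaseHookSketch):
    # SQL-flavoured methods belong on the DB-API subclass instead.
    def get_records(self, sql):
        return [("stub-row", sql)]

    def run(self, sql):
        return f"ran: {sql}"

assert not hasattr(BaseHookSketch, "get_records")
assert not hasattr(BaseHookSketch, "run")
assert DbApiHookSketch().get_records("SELECT 1") == [("stub-row", "SELECT 1")]
```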
Ruby | Ruby | remove useless import | 7eef8d35d7bda87d642eac47472aee5352ec0dea | <ide><path>activejob/test/jobs/application_job.rb
<del>require_relative "../support/job_buffer"
<del>
<ide> class ApplicationJob < ActiveJob::Base
<ide> end | 1 |
Ruby | Ruby | update repology for changes to github module | ed23eb1fab44268a2ace3d5bafd2c45b38a6ce60 | <ide><path>Library/Homebrew/test/dev-cmd/bump_spec.rb
<ide>
<ide> require "cmd/shared_examples/args_parse"
<ide>
<del>describe "Homebrew.bump_args" do
<del> it_behaves_like "parseable arguments"
<add>describe "brew bump" do
<add> describe "Homebrew.bump_args" do
<add> it_behaves_like "parseable arguments"
<add> end
<add>
<add> describe "formula", :integration_test do
<add> it "returns data for valid specified formula" do
<add> install_test_formula "testball"
<add>
<add> expect { brew "bump", "--formula=testball" }
<add> .to output.to_stdout
<add> .and not_to_output.to_stderr
<add> .and be_a_success
<add> end
<add> end
<ide> end
<ide><path>Library/Homebrew/test/utils/repology_spec.rb
<ide> require "utils/repology"
<ide>
<ide> describe Repology do
<del> describe "formula_data", :integration_test do
<add> describe "formula_data" do
<ide> it "returns nil for invalid Homebrew Formula" do
<ide> expect(described_class.formula_data("invalidName")).to be_nil
<ide> end
<del>
<del> it "validates Homebrew Formula by name" do
<del> install_test_formula "testball"
<del> expect(described_class.formula_data("testball")).not_to be_nil
<del> end
<ide> end
<ide>
<ide> describe "query_api" do
<ide> expect(response).to be_a(Hash)
<ide> end
<ide> end
<del>
<del> describe "format_package", :integration_test do
<del> it "returns nil if package is not a valid formula" do
<del> invalid_formula_response = described_class.format_package("invalidName", "5.5.5")
<del>
<del> expect(invalid_formula_response).to be_nil
<del> end
<del>
<del> it "returns hash with data for valid formula" do
<del> install_test_formula "testball"
<del> formatted_data = described_class.format_package("testball", "0.1")
<del>
<del> expect(formatted_data).not_to be_nil
<del> expect(formatted_data).to be_a(Hash)
<del> expect(formatted_data[:repology_latest_version]).not_to be_nil
<del> expect(formatted_data[:current_formula_version]).not_to be_nil
<del> expect(formatted_data[:current_formula_version]).to eq("0.1")
<del> expect(formatted_data).to include(:livecheck_latest_version)
<del> expect(formatted_data).to include(:open_pull_requests)
<del> end
<del> end
<ide> end
<ide><path>Library/Homebrew/utils/repology.rb
<ide> def validate_and_format_packages(outdated_repology_packages, limit)
<ide>
<ide> next if repology_homebrew_repo.blank?
<ide>
<del> latest_version = repositories.find { |repo| repo["status"] == "newest" }["version"]
<add> latest_version = repositories.find { |repo| repo["status"] == "newest" }
<add>
<add> next if latest_version.blank?
<add>
<add> latest_version = latest_version["version"]
<ide> srcname = repology_homebrew_repo["srcname"]
<ide> package_details = format_package(srcname, latest_version)
<ide> packages[srcname] = package_details unless package_details.nil?
<ide> def format_package(package_name, latest_version)
<ide>
<ide> return if formula.blank?
<ide>
<add> formula_name = formula.to_s
<ide> tap_full_name = formula.tap&.full_name
<ide> current_version = formula.version.to_s
<ide> livecheck_response = LivecheckFormula.init(package_name)
<del> pull_requests = GitHub.check_for_duplicate_pull_requests(formula, tap_full_name, latest_version)
<add> pull_requests = GitHub.fetch_pull_requests(formula_name, tap_full_name, state: "open")
<ide>
<ide> if pull_requests.try(:any?)
<del> pull_requests = pull_requests.map { |pr| "#{pr[:title]} (#{Formatter.url(pr[:url])})" }.join(", ")
<add> pull_requests = pull_requests.map { |pr| "#{pr["title"]} (#{Formatter.url(pr["url"])})" }.join(", ")
<ide> end
<ide>
<add> pull_requests = "None" if pull_requests.empty?
<add>
<ide> {
<ide> repology_latest_version: latest_version,
<ide> current_formula_version: current_version.to_s, | 3 |
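The Homebrew patch adds a guard for packages with no repository in the `newest` state before indexing into the match. The same find-then-bail pattern in Python — the data shapes are assumed from the diff, not Repology's full schema:

```python
def newest_version(repositories):
    # Mirrors the added guard: not every package has a repo whose status
    # is "newest", so return None instead of indexing into a missing match.
    match = next((r for r in repositories if r.get("status") == "newest"), None)
    if match is None:
        return None
    return match["version"]

assert newest_version([{"status": "newest", "version": "2.0"}]) == "2.0"
assert newest_version([{"status": "outdated", "version": "1.0"}]) is None
```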
Mixed | Javascript | use requirenativecomponent in view.js | 7cd7591f0466bc8aca617f75f5fd2dee53712846 | <ide><path>Libraries/Components/View/View.js
<ide> var ReactNativeViewAttributes = require('ReactNativeViewAttributes');
<ide> var StyleSheetPropType = require('StyleSheetPropType');
<ide> var ViewStylePropTypes = require('ViewStylePropTypes');
<ide>
<del>var createReactNativeComponentClass = require('createReactNativeComponentClass');
<add>var requireNativeComponent = require('requireNativeComponent');
<ide>
<ide> var stylePropType = StyleSheetPropType(ViewStylePropTypes);
<ide>
<ide> var View = React.createClass({
<ide> },
<ide> });
<ide>
<del>var RCTView = createReactNativeComponentClass({
<del> validAttributes: ReactNativeViewAttributes.RCTView,
<del> uiViewClassName: 'RCTView',
<add>var RCTView = requireNativeComponent('RCTView', View, {
<add> nativeOnly: {
<add> nativeBackgroundAndroid: true,
<add> }
<ide> });
<del>RCTView.propTypes = View.propTypes;
<add>
<ide> if (__DEV__) {
<ide> var viewConfig = RCTUIManager.viewConfigs && RCTUIManager.viewConfigs.RCTView || {};
<ide> for (var prop in viewConfig.nativeProps) {
<ide><path>Libraries/ReactIOS/verifyPropTypes.js
<ide> 'use strict';
<ide>
<ide> var ReactNativeStyleAttributes = require('ReactNativeStyleAttributes');
<del>var View = require('View');
<ide>
<ide> export type ComponentInterface = ReactClass<any, any, any> | {
<ide> name?: string;
<ide> function verifyPropTypes(
<ide> var nativeProps = viewConfig.NativeProps;
<ide> for (var prop in nativeProps) {
<ide> if (!componentInterface.propTypes[prop] &&
<del> !View.propTypes[prop] &&
<ide> !ReactNativeStyleAttributes[prop] &&
<ide> (!nativePropsToIgnore || !nativePropsToIgnore[prop])) {
<ide> throw new Error(
<ide><path>ReactAndroid/src/main/java/com/facebook/react/views/view/ReactViewManager.java
<ide> public void setBorderColor(ReactViewGroup view, int index, Integer color) {
<ide> color == null ? CSSConstants.UNDEFINED : (float) color);
<ide> }
<ide>
<add> @ReactProp(name = ViewProps.COLLAPSABLE)
<add> public void setCollapsable(ReactViewGroup view, boolean collapsable) {
<add> // no-op: it's here only so that "collapsable" property is exported to JS. The value is actually
<add> // handled in NativeViewHierarchyOptimizer
<add> }
<add>
<ide> @Override
<ide> public String getName() {
<ide> return REACT_CLASS; | 3 |
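The React Native change registers the view through `requireNativeComponent` and marks `nativeBackgroundAndroid` as native-only so prop verification skips it. A hedged Python sketch of that skip-list check — the names and error shape are illustrative:

```python
def verify_prop_types(component_props, native_props, native_only=frozenset()):
    # Props implemented only on the native side are declared up front and
    # excluded from the JS-side check, rather than tripping the verifier.
    missing = [p for p in sorted(native_props)
               if p not in component_props and p not in native_only]
    if missing:
        raise ValueError(f"missing propTypes: {missing}")
    return True

assert verify_prop_types(
    {"style", "onLayout"},
    {"style", "onLayout", "nativeBackgroundAndroid"},
    native_only={"nativeBackgroundAndroid"},
)
```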
Java | Java | return 500 (not 406) if content-type was preset | 37f9ce5cc90644c8b157508c3370d1673c92af51 | <ide><path>spring-webflux/src/main/java/org/springframework/web/reactive/result/method/annotation/AbstractMessageWriterResultHandler.java
<ide> protected Mono<Void> writeBody(@Nullable Object body, MethodParameter bodyParame
<ide> }
<ide> }
<ide>
<add> MediaType contentType = exchange.getResponse().getHeaders().getContentType();
<add> if (contentType != null && contentType.equals(bestMediaType)) {
<add> return Mono.error(new IllegalStateException(
<add> "No Encoder for [" + elementType + "] with preset Content-Type '" + contentType + "'"));
<add> }
<add>
<ide> List<MediaType> mediaTypes = getMediaTypesFor(elementType);
<ide> if (bestMediaType == null && mediaTypes.isEmpty()) {
<ide> return Mono.error(new IllegalStateException("No HttpMessageWriter for " + elementType));
<ide> }
<add>
<ide> return Mono.error(new NotAcceptableStatusException(mediaTypes));
<ide> }
<ide>
<ide><path>spring-webflux/src/test/java/org/springframework/web/reactive/result/method/annotation/ResponseEntityResultHandlerTests.java
<ide> public void handleMonoWithWildcardBodyTypeAndNullBody() throws Exception {
<ide> }
<ide>
<ide> @Test // SPR-17082
<del> public void handleResponseEntityWithExistingResponseHeaders() throws Exception {
<add> public void handleWithPresetContentType() {
<ide> ResponseEntity<Void> value = ResponseEntity.ok().contentType(MediaType.APPLICATION_JSON).build();
<ide> MethodParameter returnType = on(TestController.class).resolveReturnType(entity(Void.class));
<ide> HandlerResult result = handlerResult(value, returnType);
<ide> public void handleResponseEntityWithExistingResponseHeaders() throws Exception {
<ide> assertResponseBodyIsEmpty(exchange);
<ide> }
<ide>
<add> @Test // gh-23205
<add> public void handleWithPresetContentTypeShouldFailWithServerError() {
<add> ResponseEntity<String> value = ResponseEntity.ok().contentType(MediaType.APPLICATION_XML).body("<foo/>");
<add> MethodParameter returnType = on(TestController.class).resolveReturnType(entity(String.class));
<add> HandlerResult result = handlerResult(value, returnType);
<add>
<add> MockServerWebExchange exchange = MockServerWebExchange.from(get("/path"));
<add> ResponseEntityResultHandler resultHandler = new ResponseEntityResultHandler(
<add> Collections.singletonList(new EncoderHttpMessageWriter<>(CharSequenceEncoder.textPlainOnly())),
<add> new RequestedContentTypeResolverBuilder().build()
<add> );
<add>
<add> StepVerifier.create(resultHandler.handleResult(exchange, result))
<add> .consumeErrorWith(ex -> assertThat(ex)
<add> .isInstanceOf(IllegalStateException.class)
<add> .hasMessageContaining("with preset Content-Type"))
<add> .verify();
<add> }
<ide>
<ide>
<ide> private void testHandle(Object returnValue, MethodParameter returnType) {
<ide><path>spring-webmvc/src/main/java/org/springframework/web/servlet/mvc/method/annotation/AbstractMessageConverterMethodProcessor.java
<ide> protected <T> void writeWithMessageConverters(@Nullable T value, MethodParameter
<ide>
<ide> MediaType selectedMediaType = null;
<ide> MediaType contentType = outputMessage.getHeaders().getContentType();
<del> if (contentType != null && contentType.isConcrete()) {
<add> boolean isContentTypePreset = contentType != null && contentType.isConcrete();
<add> if (isContentTypePreset) {
<ide> if (logger.isDebugEnabled()) {
<ide> logger.debug("Found 'Content-Type:" + contentType + "' in response");
<ide> }
<ide> else if (mediaType.isPresentIn(ALL_APPLICATION_MEDIA_TYPES)) {
<ide> }
<ide>
<ide> if (body != null) {
<add> if (isContentTypePreset) {
<add> throw new IllegalStateException(
<add> "No converter for [" + valueType + "] with preset Content-Type '" + contentType + "'");
<add> }
<ide> throw new HttpMediaTypeNotAcceptableException(this.allSupportedMediaTypes);
<ide> }
<ide> }
<ide><path>spring-webmvc/src/test/java/org/springframework/web/servlet/mvc/method/annotation/HttpEntityMethodProcessorMockTests.java
<ide> import static java.time.format.DateTimeFormatter.RFC_1123_DATE_TIME;
<ide> import static org.assertj.core.api.Assertions.assertThat;
<ide> import static org.assertj.core.api.Assertions.assertThatExceptionOfType;
<add>import static org.assertj.core.api.Assertions.assertThatIllegalStateException;
<ide> import static org.mockito.ArgumentMatchers.any;
<ide> import static org.mockito.ArgumentMatchers.anyCollection;
<ide> import static org.mockito.ArgumentMatchers.argThat;
<ide> public void shouldFailHandlingWhenContentTypeNotSupported() throws Exception {
<ide> processor.handleReturnValue(returnValue, returnTypeResponseEntity, mavContainer, webRequest));
<ide> }
<ide>
<add> @Test // gh-23205
<add> public void shouldFailWithServerErrorIfContentTypeFromResponseEntity() {
<add> ResponseEntity<String> returnValue = ResponseEntity.ok()
<add> .contentType(MediaType.APPLICATION_XML)
<add> .body("<foo/>");
<add>
<add> given(stringHttpMessageConverter.canWrite(String.class, null)).willReturn(true);
<add> given(stringHttpMessageConverter.getSupportedMediaTypes()).willReturn(Collections.singletonList(TEXT_PLAIN));
<add>
<add> assertThatIllegalStateException()
<add> .isThrownBy(() ->
<add> processor.handleReturnValue(returnValue, returnTypeResponseEntity, mavContainer, webRequest))
<add> .withMessageContaining("with preset Content-Type");
<add> }
<add>
<ide> @Test
<ide> public void shouldFailHandlingWhenConverterCannotWrite() throws Exception {
<ide> String body = "Foo"; | 4 |
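The Spring fix distinguishes two failure modes when no converter matches: a response `Content-Type` the handler preset itself is a server-side bug (500), while an unsatisfiable negotiation is the client's 406. A toy Python sketch of that branching, with the status handling deliberately simplified:

```python
def choose_error_status(has_body, content_type_preset):
    # No converter matched. If the handler pinned the Content-Type itself,
    # the mismatch is a server-side bug -> 500; otherwise the client's
    # Accept header simply cannot be satisfied -> 406.
    if has_body and content_type_preset:
        return 500  # IllegalStateException -> internal server error
    if has_body:
        return 406  # HttpMediaTypeNotAcceptableException
    return 200      # nothing to write, nothing to fail

assert choose_error_status(True, True) == 500
assert choose_error_status(True, False) == 406
```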
Ruby | Ruby | remove needless print | 933bbb9c372d3365be57aa11fdf5922b139c93dc | <ide><path>railties/test/application/test_runner_test.rb
<ide> def test_run_in_parallel_with_threads
<ide> app_path("/test/test_helper.rb") do |file_name|
<ide> file = File.read(file_name)
<ide> file.sub!(/parallelize\(([^\)]*)\)/, "parallelize(\\1, with: :threads)")
<del> puts file
<ide> File.write(file_name, file)
<ide> end
<ide> | 1 |
Ruby | Ruby | add dependencies to load_path | c3ea073a07cdf14b40102c72bf65419af83bb6ef | <ide><path>Library/Homebrew/utils/gems.rb
<ide> def install_gem!(name, version: nil, setup_gem_environment: true)
<ide> specs = Gem.install name, version, document: []
<ide> end
<ide>
<del> # Add the new specs to the $LOAD_PATH.
<add> specs += specs.flat_map(&:runtime_dependencies)
<add> .flat_map(&:to_specs)
<add>
<add> # Add the specs to the $LOAD_PATH.
<ide> specs.each do |spec|
<ide> spec.require_paths.each do |path|
<ide> full_path = File.join(spec.full_gem_path, path) | 1 |
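The Homebrew gems fix flattens each installed spec's runtime dependencies into the list before wiring up `$LOAD_PATH`, so transitive requires resolve. A one-level Python sketch of that expansion — the gem names below are just examples:

```python
def with_runtime_deps(specs, dep_graph):
    # One-level expansion, mirroring specs += specs.flat_map(&:runtime_dependencies):
    # every freshly installed spec contributes its runtime dependencies to
    # the list whose require paths get added to the load path.
    expanded = list(specs)
    for spec in specs:
        expanded.extend(dep_graph.get(spec, []))
    return expanded

deps = {"mechanize": ["nokogiri", "webrick"]}
assert with_runtime_deps(["mechanize"], deps) == ["mechanize", "nokogiri", "webrick"]
```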
Text | Text | update informations about proptypes lib | 7476fa1ca4dbd36605f7a05cb08d313ee7d6dfcb | <ide><path>guide/portuguese/react/what-are-react-props/index.md
<ide> ---
<del>title: React TypeChecking with PropTypes
<del>localeTitle: Verificação de Tipo em React com PropTypes
<add>title: Typechecking With PropTypes
<add>localeTitle: Checagem de tipo com PropTypes
<ide> ---
<add>
<ide> ## React PropTypes
<ide>
<del>Estes servem como um método de typechecking (verificação de tipo) à medida que um aplicativo tende a crescer, com isso uma base muito grande de bugs tende a ser corrigida com o uso desse recurso.
<add>PropTypes servem como um método de typechecking à medida que um aplicativo tende a crescer, com isso uma base muito grande de bugs tende a ser corrigida com o uso desse recurso.
<ide>
<del>## Como obter PropTypes
<add>## Como instalar PropTypes
<ide>
<ide> Começando com o React versão 15.5, esse recurso foi movido para um pacote separado chamado prop-types.
<ide>
<del>Para usá-lo, é necessário que ele seja adicionado ao projeto como uma dependência, emitindo o seguinte comando em um console.
<add>Para usá-lo, é necessário que ele seja adicionado ao projeto como uma dependência, digitando o seguinte comando em um console.
<ide>
<ide> ```sh
<ide> npm install --save prop-types
<ide> import PropTypes from 'prop-types';
<ide>
<ide> Como parte deste recurso, também é possível definir valores padrão para qualquer componente definido durante o desenvolvimento.
<ide>
<del>Eles garantem que o propore tenha um valor, mesmo que não especificado pelo componente pai.
<add>Eles garantem que a propriedade tenha um valor, mesmo que não especificado pelo componente pai.
<ide>
<ide> O código abaixo ilustra como usar essa funcionalidade.
<ide>
<ide> import React,{Component} from 'react';
<ide> };
<ide> ```
<ide>
<del>Para obter mais informações sobre PropTypes e outros documentos no React.
<add>Para obter mais informações sobre PropTypes e outros recursos do React.
<ide>
<del>Vá para o [site oficial](https://reactjs.org/) e leia os documentos ou o [Github Repo](https://github.com/facebook/react/)
<add>Vá para o [site oficial](https://reactjs.org/) ou o [Github Repo](https://github.com/facebook/react/) | 1 |
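The guide in the patch above describes runtime prop typechecking with default values. As a rough, library-free sketch of the same idea (this is *not* the real `prop-types` API — the `checkProps` helper and its `'string'`/`'number'` type tags are invented for illustration):

```javascript
// Simplified illustration of prop typechecking with defaults.
// `propTypes` maps a prop name to an expected `typeof` result.
function checkProps(propTypes, defaultProps, props) {
  // Defaults guarantee the prop has a value even when the parent omits it.
  const merged = Object.assign({}, defaultProps, props);
  const errors = [];
  for (const name of Object.keys(propTypes)) {
    const expected = propTypes[name];
    const actual = typeof merged[name];
    if (actual !== expected) {
      errors.push('prop `' + name + '`: expected ' + expected + ', got ' + actual);
    }
  }
  return { props: merged, errors };
}
```

In a real React app you would declare `Component.propTypes` and `Component.defaultProps` with the `prop-types` package instead of hand-rolling a checker.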
PHP | PHP | fix phpcs and linting errors | fc35c1564a44eec357844af7c923d05ba693d8cd | <ide><path>src/TestSuite/Stub/TestExceptionRenderer.php
<ide> public function __construct(Throwable $exception)
<ide> }
<ide>
<ide> /**
<del> * {@inheritDoc}
<add> * @inheritDoc
<ide> */
<ide> public function render(): ResponseInterface
<ide> {
<ide> throw new LogicException('You cannot use this class to render exceptions.');
<ide> }
<ide>
<ide> /**
<del> * {@inheritDoc}
<add> * Part of upcoming interface requirements
<add> *
<add> * @param \Psr\Http\Message\ResponseInterface|string $output The output or response to send.
<add> * @return void
<ide> */
<ide> public function write($output): void
<ide> {
<del> echo $output;
<ide> }
<ide> }
<ide><path>tests/TestCase/Error/Middleware/ErrorHandlerMiddlewareTest.php
<ide> use Cake\Core\Configure;
<ide> use Cake\Datasource\Exception\RecordNotFoundException;
<ide> use Cake\Error\ErrorHandler;
<del>use Cake\Error\ExceptionTrap;
<ide> use Cake\Error\ExceptionRendererInterface;
<add>use Cake\Error\ExceptionTrap;
<ide> use Cake\Error\Middleware\ErrorHandlerMiddleware;
<ide> use Cake\Http\Exception\MissingControllerException;
<ide> use Cake\Http\Exception\NotFoundException; | 2 |
Javascript | Javascript | add support for assets | 4ab4df07afb9766fda373138897bbbecb1d2d90d | <ide><path>packager/src/ModuleGraph/types.flow.js
<ide> export type File = {|
<ide> type: FileTypes,
<ide> |};
<ide>
<del>type FileTypes = 'module' | 'script';
<add>type FileTypes = 'module' | 'script' | 'asset';
<ide>
<ide> export type GraphFn = (
<ide> entryPoints: Iterable<string>,
<ide><path>packager/src/ModuleGraph/worker/transform-module.js
<ide> function transformModule(
<ide> return;
<ide> }
<ide>
<add> if (options.filename.endsWith('.png')) {
<add> transformAsset(code, options, callback);
<add> return;
<add> }
<add>
<ide> const {filename, transformer, variants = defaultVariants} = options;
<ide> const tasks = {};
<ide> Object.keys(variants).forEach(name => {
<ide> function transformJSON(json, options, callback) {
<ide> file: filename,
<ide> hasteID: value.name,
<ide> transformed,
<del> type: 'module',
<add> type: 'asset',
<ide> };
<ide>
<ide> if (basename(filename) === 'package.json') {
<ide> function transformJSON(json, options, callback) {
<ide> callback(null, result);
<ide> }
<ide>
<add>function transformAsset(data, options, callback) {
<add> callback(null, {
<add> code: data,
<add> file: options.filename,
<add> hasteID: null,
<add> transformed: {},
<add> type: 'module',
<add> });
<add>}
<add>
<ide> function makeResult(ast, filename, sourceCode, isPolyfill = false) {
<ide> let dependencies, dependencyMapName, file;
<ide> if (isPolyfill) { | 2 |
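The patch above routes `.png` files to a new `transformAsset` path while JSON and JS files keep their existing paths. A minimal sketch of that extension-based dispatch (the helper name is invented; the real worker also threads through options and a callback):

```javascript
// Pick a transform kind from the filename extension (simplified).
function transformKindFor(filename) {
  if (filename.endsWith('.png')) return 'asset';
  if (filename.endsWith('.json')) return 'json';
  return 'module'; // default: treat the file as a JS module
}
```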
PHP | PHP | fix cs errors and use correct exceptions | e45abe3449c24043b9ab888d2a4b0a29aef94996 | <ide><path>src/Auth/ControllerAuthorize.php
<ide> public function __construct(ComponentRegistry $registry, array $config = array()
<ide> *
<ide> * @param Controller $controller null to get, a controller to set.
<ide> * @return \Cake\Controller\Controller
<del> * @throws \Cake\Error\Exception If controller does not have method `isAuthorized()`.
<add> * @throws \Cake\Core\Exception\Exception If controller does not have method `isAuthorized()`.
<ide> */
<ide> public function controller(Controller $controller = null) {
<ide> if ($controller) {
<ide><path>src/Console/ShellDispatcher.php
<ide> public static function alias($short, $original = null) {
<ide> static::$_aliases[$short] = $original;
<ide> }
<ide>
<del> return isset(static::$_aliases[$short]) ? static::$_aliases[$short] : false;
<add> return isset(static::$_aliases[$short]) ? static::$_aliases[$short] : false;
<ide> }
<ide>
<ide> /**
<ide><path>src/Core/Plugin.php
<ide> public static function classPath($plugin) {
<ide> */
<ide> public static function configPath($plugin) {
<ide> if (empty(static::$_plugins[$plugin])) {
<del> throw new Error\MissingPluginException(['plugin' => $plugin]);
<add> throw new Exception\MissingPluginException(['plugin' => $plugin]);
<ide> }
<ide> return static::$_plugins[$plugin]['configPath'];
<ide> }
<ide><path>src/Database/Driver.php
<ide> public function supportsSavePoints() {
<ide> return true;
<ide> }
<ide>
<del>
<ide> /**
<ide> * Returns a value in a safe representation to be used in a query string
<ide> *
<ide><path>src/I18n/Number.php
<ide> public static function currency($value, $currency = null, array $options = array
<ide> /**
<ide> * Getter/setter for default currency
<ide> *
<del> * @param string|boolean $currency Default currency string to be used by currency()
<add> * @param string|bool $currency Default currency string to be used by currency()
<ide> * if $currency argument is not provided. If boolean false is passed, it will clear the
<ide> * currently stored value
<ide> * @return string Currency
<ide><path>src/I18n/Parser/MoFileParser.php
<ide> public function parse($resource) {
<ide> * Reads an unsigned long from stream respecting endianess.
<ide> *
<ide> * @param resource $stream The File being read.
<del> * @param boolean $isBigEndian Whether or not the current platfomr is Big Endian
<add> * @param bool $isBigEndian Whether or not the current platform is Big Endian

<ide> * @return int
<ide> */
<ide> protected function _readLong($stream, $isBigEndian) {
<ide><path>src/I18n/PluralRules.php
<ide> class PluralRules {
<ide> * to the countable provided in $n.
<ide> *
<ide> * @param string $locale The locale to get the rule calculated for.
<del> * @param integer|float $n The number to apply the rules to.
<del> * @return integer The plural rule number that should be used.
<add> * @param int|float $n The number to apply the rules to.
<add> * @return int The plural rule number that should be used.
<ide> * @link http://localization-guide.readthedocs.org/en/latest/l10n/pluralforms.html
<ide> */
<ide> public static function calculate($locale, $n) {
<ide><path>src/Utility/Security.php
<ide> protected static function _checkKey($key, $method) {
<ide> * @param string $key The 256 bit/32 byte key to use as a cipher key.
<ide> * @param string $hmacSalt The salt to use for the HMAC process. Leave null to use Security.salt.
<ide> * @return string Decrypted data. Any trailing null bytes will be removed.
<del> * @throws \Cake\Core\Exception\Exception On invalid data or key.
<add> * @throws InvalidArgumentException On invalid data or key.
<ide> */
<ide> public static function decrypt($cipher, $key, $hmacSalt = null) {
<ide> self::_checkKey($key, 'decrypt()');
<ide><path>src/Validation/Validation.php
<ide> namespace Cake\Validation;
<ide>
<ide> use Cake\Utility\String;
<add>use LogicException;
<ide> use RuntimeException;
<ide>
<ide> /**
<ide><path>src/View/Helper/NumberHelper.php
<ide> public function formatDelta($value, array $options = array()) {
<ide> /**
<ide> * Getter/setter for default currency
<ide> *
<del> * @param string|boolean $currency Default currency string to be used by currency()
<add> * @param string|bool $currency Default currency string to be used by currency()
<ide> * if $currency argument is not provided. If boolean false is passed, it will clear the
<ide> * currently stored value
<ide> * @return string Currency
<ide><path>tests/TestCase/Error/ErrorHandlerTest.php
<ide> protected function _clearOutput() {
<ide> protected function _sendResponse($response) {
<ide> $this->response = $response;
<ide> }
<add>
<ide> }
<ide>
<ide> /**
<ide><path>tests/TestCase/Event/EventManagerTraitTest.php
<ide> public function testSettingEventManager() {
<ide> * @return void
<ide> */
<ide> public function testDispatchEvent() {
<del>
<ide> $event = $this->subject->dispatchEvent('some.event', ['foo' => 'bar']);
<ide>
<ide> $this->assertInstanceOf('\Cake\Event\Event', $event);
<ide><path>tests/TestCase/ORM/MarshallerTest.php
<ide> public function testOneAccessibleFieldsOption() {
<ide> $this->assertTrue($result->not_in_schema);
<ide> }
<ide>
<del>
<ide> /**
<ide> * test one() with a wrapping model name.
<ide> *
<ide><path>tests/TestCase/Validation/ValidationTest.php
<ide> public function testUploadedFileWithDifferentFileParametersOrder() {
<ide> $this->assertTrue(Validation::uploadedFile($file, $options), 'Wrong order');
<ide> }
<ide>
<del>
<ide> } | 14 |
Javascript | Javascript | remove disablehiddenpropdeprioritization flag | 64f50c667a778c85dc8f1d56e26d881fada4c85a | <ide><path>packages/react-dom/src/client/ReactDOMHostConfig.js
<ide> import {
<ide> enableModernEventSystem,
<ide> enableCreateEventHandleAPI,
<ide> enableScopeAPI,
<del> disableHiddenPropDeprioritization,
<ide> } from 'shared/ReactFeatureFlags';
<ide> import {HostComponent, HostText} from 'react-reconciler/src/ReactWorkTags';
<ide> import {TOP_BEFORE_BLUR, TOP_AFTER_BLUR} from '../events/DOMTopLevelEventTypes';
<ide> export function shouldSetTextContent(type: string, props: Props): boolean {
<ide> }
<ide>
<ide> export function shouldDeprioritizeSubtree(type: string, props: Props): boolean {
<del> if (disableHiddenPropDeprioritization) {
<del> // This is obnoxiously specific so that nobody uses it, but we can still opt
<del> // in via an infra-level userspace abstraction.
<del> return props.hidden === 'unstable-do-not-use-legacy-hidden';
<del> } else {
<del> // Legacy behavior. Any truthy value works.
<del> return !!props.hidden;
<del> }
<add> // This is obnoxiously specific so that nobody uses it, but we can still opt
<add> // in via an infra-level userspace abstraction.
<add> return props.hidden === 'unstable-do-not-use-legacy-hidden';
<ide> }
<ide>
<ide> export function createTextInstance(
<ide><path>packages/shared/ReactFeatureFlags.js
<ide> export const enableLegacyFBSupport = false;
<ide> // expiration time is currently rendering. Remove this flag once we have
<ide> // migrated to the new behavior.
<ide> export const deferRenderPhaseUpdateToNextBatch = true;
<del>
<del>// Flag used by www build so we can log occurrences of legacy hidden API
<del>export const disableHiddenPropDeprioritization = true;
<ide><path>packages/shared/forks/ReactFeatureFlags.native-fb.js
<ide> export const enableFilterEmptyStringAttributesDOM = false;
<ide>
<ide> export const enableNewReconciler = false;
<ide> export const deferRenderPhaseUpdateToNextBatch = true;
<del>export const disableHiddenPropDeprioritization = true;
<ide>
<ide> // Flow magic to verify the exports of this file match the original version.
<ide> // eslint-disable-next-line no-unused-vars
<ide><path>packages/shared/forks/ReactFeatureFlags.native-oss.js
<ide> export const enableFilterEmptyStringAttributesDOM = false;
<ide>
<ide> export const enableNewReconciler = false;
<ide> export const deferRenderPhaseUpdateToNextBatch = true;
<del>export const disableHiddenPropDeprioritization = true;
<ide>
<ide> // Flow magic to verify the exports of this file match the original version.
<ide> // eslint-disable-next-line no-unused-vars
<ide><path>packages/shared/forks/ReactFeatureFlags.test-renderer.js
<ide> export const enableFilterEmptyStringAttributesDOM = false;
<ide>
<ide> export const enableNewReconciler = false;
<ide> export const deferRenderPhaseUpdateToNextBatch = true;
<del>export const disableHiddenPropDeprioritization = true;
<ide>
<ide> // Flow magic to verify the exports of this file match the original version.
<ide> // eslint-disable-next-line no-unused-vars
<ide><path>packages/shared/forks/ReactFeatureFlags.test-renderer.www.js
<ide> export const enableFilterEmptyStringAttributesDOM = false;
<ide>
<ide> export const enableNewReconciler = false;
<ide> export const deferRenderPhaseUpdateToNextBatch = true;
<del>export const disableHiddenPropDeprioritization = true;
<ide>
<ide> // Flow magic to verify the exports of this file match the original version.
<ide> // eslint-disable-next-line no-unused-vars
<ide><path>packages/shared/forks/ReactFeatureFlags.testing.js
<ide> export const enableFilterEmptyStringAttributesDOM = false;
<ide>
<ide> export const enableNewReconciler = false;
<ide> export const deferRenderPhaseUpdateToNextBatch = true;
<del>export const disableHiddenPropDeprioritization = true;
<ide>
<ide> // Flow magic to verify the exports of this file match the original version.
<ide> // eslint-disable-next-line no-unused-vars
<ide><path>packages/shared/forks/ReactFeatureFlags.testing.www.js
<ide> export const enableFilterEmptyStringAttributesDOM = false;
<ide>
<ide> export const enableNewReconciler = false;
<ide> export const deferRenderPhaseUpdateToNextBatch = true;
<del>export const disableHiddenPropDeprioritization = true;
<ide>
<ide> // Flow magic to verify the exports of this file match the original version.
<ide> // eslint-disable-next-line no-unused-vars
<ide><path>packages/shared/forks/ReactFeatureFlags.www-dynamic.js
<ide> export const enableModernEventSystem = __VARIANT__;
<ide> export const enableLegacyFBSupport = __VARIANT__;
<ide> export const enableDebugTracing = !__VARIANT__;
<ide>
<del>// Temporary flag, in case we need to re-enable this feature.
<del>export const disableHiddenPropDeprioritization = __VARIANT__;
<del>
<ide> // This only has an effect in the new reconciler. But also, the new reconciler
<ide> // is only enabled when __VARIANT__ is true. So this is set to the opposite of
<ide> // __VARIANT__ so that it's `false` when running against the new reconciler.
<ide><path>packages/shared/forks/ReactFeatureFlags.www.js
<ide> export const {
<ide> enableLegacyFBSupport,
<ide> enableDebugTracing,
<ide> deferRenderPhaseUpdateToNextBatch,
<del> disableHiddenPropDeprioritization,
<ide> } = dynamicFeatureFlags;
<ide>
<ide> // On WWW, __EXPERIMENTAL__ is used for a new modern build. | 10 |
Javascript | Javascript | remove dup property | 155687cb7ee23d7777d1072c4c2956564bff8ac4 | <ide><path>lib/_stream_writable.js
<ide> function WritableState(options, stream) {
<ide> // if _final has been called
<ide> this.finalCalled = false;
<ide>
<del> // if _final has been called
<del> this.finalCalled = false;
<del>
<ide> // drain event flag.
<ide> this.needDrain = false;
<ide> // at the start of calling end() | 1 |
Python | Python | fix flaky test | 5bb5eb1657ea7517179b35f03949f4673b5c32d5 | <ide><path>tests/keras/test_optimizers.py
<ide> def get_model(input_dim, nb_hidden, output_dim):
<ide> return model
<ide>
<ide>
<del>def _test_optimizer(optimizer, target=0.9):
<add>def _test_optimizer(optimizer, target=0.89):
<ide> model = get_model(X_train.shape[1], 10, y_train.shape[1])
<ide> model.compile(loss='categorical_crossentropy',
<ide> optimizer=optimizer,
<ide> def _test_optimizer(optimizer, target=0.9):
<ide> validation_data=(X_test, y_test), verbose=2)
<ide> config = optimizer.get_config()
<ide> assert type(config) == dict
<del> assert history.history['val_acc'][-1] > target
<add> assert history.history['val_acc'][-1] >= target
<ide>
<ide>
<ide> def test_sgd(): | 1 |
Ruby | Ruby | check compilerselectionerror twice | 5883f1675d71c09a722343ef3f71280c246bc1bb | <ide><path>Library/Contributions/cmd/brew-test-bot.rb
<ide> def single_commit? start_revision, end_revision
<ide> end
<ide> end
<ide>
<add> def skip formula
<add> puts "#{Tty.blue}==>#{Tty.white} SKIPPING: #{formula}#{Tty.reset}"
<add> end
<add>
<ide> def setup
<ide> @category = __method__
<ide> return if ARGV.include? "--skip-setup"
<ide> def formula formula
<ide> requirements = formula_object.recursive_requirements
<ide> unsatisfied_requirements = requirements.reject {|r| r.satisfied? or r.default_formula?}
<ide> unless unsatisfied_requirements.empty?
<del> puts "#{Tty.blue}==>#{Tty.white} SKIPPING: #{formula}#{Tty.reset}"
<add> skip formula
<ide> unsatisfied_requirements.each {|r| puts r.message}
<ide> return
<ide> end
<ide>
<add> installed_gcc = false
<ide> begin
<ide> CompilerSelector.new(formula_object).compiler
<del> rescue CompilerSelectionError
<del> test "brew install apple-gcc42"
<add> rescue CompilerSelectionError => e
<add> unless installed_gcc
<add> test "brew install apple-gcc42"
<add> installed_gcc = true
<add> retry
<add> end
<add> skip formula
<add> puts e.message
<add> return
<ide> end
<ide>
<ide> test "brew fetch --retry #{dependencies}" unless dependencies.empty? | 1 |
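The `rescue CompilerSelectionError ... retry` block above runs a one-time fallback (installing `apple-gcc42`) and retries once before skipping the formula. The same control flow, transposed to JavaScript for illustration (the helper names are invented):

```javascript
// Run `action`; on the first failure run `onFirstFailure` and retry once.
function withOneRetry(action, onFirstFailure) {
  let retried = false;
  for (;;) {
    try {
      return action();
    } catch (err) {
      if (retried) throw err; // second failure: give up
      retried = true;
      onFirstFailure(err);    // e.g. install a fallback compiler
    }
  }
}
```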
PHP | PHP | fix insertion errors from the previous change | cafb4adfef041007a8c24c26d0411b18b074aa69 | <ide><path>src/Database/Expression/ValuesExpression.php
<ide> public function sql(ValueBinder $generator)
<ide> }
<ide>
<ide> $i = 0;
<del> $defaults = array_fill_keys($this->_columns, null);
<add> $columns = [];
<add>
<add> // Remove identifier quoting so column names match keys.
<add> foreach ($this->_columns as $col) {
<add> $columns[] = trim($col, '`[]"');
<add> }
<add> $defaults = array_fill_keys($columns, null);
<ide> $placeholders = [];
<ide>
<ide> foreach ($this->_values as $row) {
<ide> public function sql(ValueBinder $generator)
<ide> if ($this->query()) {
<ide> return ' ' . $this->query()->sql($generator);
<ide> }
<del>
<ide> return sprintf(' VALUES (%s)', implode('), (', $placeholders));
<ide> }
<ide>
<ide><path>src/Database/Query.php
<ide> public function insert(array $columns, array $types = [])
<ide> $this->_dirty();
<ide> $this->_type = 'insert';
<ide> $this->_parts['insert'][1] = $columns;
<del> $this->_parts['values'] = new ValuesExpression($columns, $this->typeMap()->types($types));
<del>
<add> if (!$this->_parts['values']) {
<add> $this->_parts['values'] = new ValuesExpression($columns, $this->typeMap()->types($types));
<add> } else {
<add> $this->_parts['values']->columns($columns);
<add> }
<ide> return $this;
<ide> }
<ide> | 2 |
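The PHP `trim()` call in the patch above strips identifier quoting — backticks, square brackets, and double quotes — so column names match the keys of each row. An equivalent sketch in JavaScript:

```javascript
// Strip leading/trailing identifier quoting: `col`, [col], "col" -> col
function unquoteIdentifier(col) {
  return col.replace(/^[`"\[\]]+|[`"\[\]]+$/g, '');
}
```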
Text | Text | add rowspan and colspan with example | 63c30eb4627f2d78745b845dc0ad6c899fbe31c1 | <ide><path>guide/english/html/tables/index.md
<ide> Result:
<ide> </tr>
<ide> </tfoot>
<ide> </table>
<add>
<add>
<add>### Using Row Span and Col Span Attribute
<add>With **Row Span** allows a single table cell to span the height of more than one cell or row.
<add>
<add>Example:
<add>
<add>```html
<add><table>
<add> <tr>
<add> <th>Firstname</th>
<add> <th>Lastname</th>
<add> <th>Age</th>
<add> </tr>
<add> <tr>
<add> <td>Jill</td>
<add> <td>Smith</td>
<add> <td rowspan="2">50</td>
<add> </tr>
<add> <tr>
<add> <td>Eve</td>
<add> <td>Jackson</td>
<add> </tr>
<add></table>
<add>```
<add>
<add>With **Col Span** allows a single table cell to span the width of more than one cell or column.
<add>
<add>Example:
<add>
<add>```html
<add><table>
<add> <tr>
<add> <th>Firstname</th>
<add> <th>Lastname</th>
<add> <th>Age</th>
<add> </tr>
<add> <tr>
<add> <td>Jill</td>
<add> <td>Smith</td>
<add> <td>50</td>
<add> </tr>
<add> <tr>
<add> <td>Eve</td>
<add> <td>Jackson</td>
<add> <td>50</td>
<add> </tr>
<add> <tr>
<add> <td colspan="3">Total: 2 Response</td>
<add> </tr>
<add></table>
<add>```
<ide>
<ide> ## Adding/Removing table border
<ide> The table border width can be increased/decreased using the table border attribute. | 1 |
Javascript | Javascript | denormalize directive templates | dfe99836cd98c2a1b0f9bde6216bd44088de275a | <ide><path>src/ng/compile.js
<ide> function $CompileProvider($provide) {
<ide> }
<ide> };
<ide>
<add> var startSymbol = $interpolate.startSymbol(),
<add> endSymbol = $interpolate.endSymbol(),
<add> denormalizeTemplate = (startSymbol == '{{' || endSymbol == '}}')
<add> ? identity
<add> : function denormalizeTemplate(template) {
<add> return template.replace(/\{\{/g, startSymbol).replace(/}}/g, endSymbol);
<add> };
<add>
<add>
<ide> return compile;
<ide>
<ide> //================================
<ide> function $CompileProvider($provide) {
<ide> if ((directiveValue = directive.template)) {
<ide> assertNoDuplicate('template', templateDirective, directive, $compileNode);
<ide> templateDirective = directive;
<add> directiveValue = denormalizeTemplate(directiveValue);
<ide>
<ide> if (directive.replace) {
<ide> $template = jqLite('<div>' +
<ide> function $CompileProvider($provide) {
<ide> success(function(content) {
<ide> var compileNode, tempTemplateAttrs, $template;
<ide>
<add> content = denormalizeTemplate(content);
<add>
<ide> if (replace) {
<ide> $template = jqLite('<div>' + trim(content) + '</div>').contents();
<ide> compileNode = $template[0];
<ide><path>test/ng/compileSpec.js
<ide> describe('$compile', function() {
<ide> '<option>Greet Misko!</option>' +
<ide> '</select>');
<ide> }));
<add>
<add>
<add> it('should support custom start/end interpolation symbols in template and directive template',
<add> function() {
<add> module(function($interpolateProvider, $compileProvider) {
<add> $interpolateProvider.startSymbol('##').endSymbol(']]');
<add> $compileProvider.directive('myDirective', function() {
<add> return {
<add> template: '<span>{{hello}}|{{hello|uppercase}}</span>'
<add> };
<add> });
<add> });
<add>
<add> inject(function($compile, $rootScope) {
<add> element = $compile('<div>##hello|uppercase]]|<div my-directive></div></div>')($rootScope);
<add> $rootScope.hello = 'ahoj';
<add> $rootScope.$digest();
<add> expect(element.text()).toBe('AHOJ|ahoj|AHOJ');
<add> });
<add> });
<add>
<add>
<add> it('should support custom start/end interpolation symbols in async directive template',
<add> function() {
<add> module(function($interpolateProvider, $compileProvider) {
<add> $interpolateProvider.startSymbol('##').endSymbol(']]');
<add> $compileProvider.directive('myDirective', function() {
<add> return {
<add> templateUrl: 'myDirective.html'
<add> };
<add> });
<add> });
<add>
<add> inject(function($compile, $rootScope, $templateCache) {
<add> $templateCache.put('myDirective.html', '<span>{{hello}}|{{hello|uppercase}}</span>');
<add> element = $compile('<div>##hello|uppercase]]|<div my-directive></div></div>')($rootScope);
<add> $rootScope.hello = 'ahoj';
<add> $rootScope.$digest();
<add> expect(element.text()).toBe('AHOJ|ahoj|AHOJ');
<add> });
<add> });
<ide> });
<ide>
<ide> | 2 |
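The `denormalizeTemplate` logic in the patch above can be isolated into a small factory: templates and directive templates are always authored with the default `{{ }}` symbols and rewritten to whatever start/end symbols the app configured. A standalone sketch of that logic:

```javascript
// Build a function that rewrites {{ }} to custom interpolation symbols.
// When the defaults are configured, return an identity function.
function makeDenormalizer(startSymbol, endSymbol) {
  if (startSymbol === '{{' && endSymbol === '}}') {
    return function identity(template) { return template; };
  }
  return function denormalizeTemplate(template) {
    return template.replace(/\{\{/g, startSymbol).replace(/}}/g, endSymbol);
  };
}
```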
Python | Python | add one nested compound dtype | 4f35a210e275600bd3f9c0a6df7be4fcb15c5101 | <ide><path>numpy/core/tests/test_dtype.py
<ide> def test_nonequivalent_record(self):
<ide>
<ide> class TestMonsterType(TestCase):
<ide> """Test deeply nested subtypes."""
<del> pass
<add> def test1(self):
<add> simple1 = np.dtype({'names': ['r','b'], 'formats': ['u1', 'u1'],
<add> 'titles': ['Red pixel', 'Blue pixel']})
<add> a = np.dtype([('yo', np.int), ('ye', simple1),
<add> ('yi', np.dtype((np.int, (3, 2))))])
<add> b = np.dtype([('yo', np.int), ('ye', simple1),
<add> ('yi', np.dtype((np.int, (3, 2))))])
<add> self.failUnless(hash(a) == hash(b))
<add>
<add> c = np.dtype([('yo', np.int), ('ye', simple1),
<add> ('yi', np.dtype((a, (3, 2))))])
<add> d = np.dtype([('yo', np.int), ('ye', simple1),
<add> ('yi', np.dtype((a, (3, 2))))])
<add> self.failUnless(hash(c) == hash(d))
<ide>
<ide> if __name__ == "__main__":
<ide> run_module_suite() | 1 |
PHP | PHP | fix failing test on sqlserver | b8c00d4f5250692a2d96f0d75156921b23baf559 | <ide><path>src/Database/Expression/CaseExpression.php
<ide> class CaseExpression implements ExpressionInterface {
<ide> * @param string|array|ExpressionInterface $trueValues Value of each condition if that condition is true
<ide> * @param string|array|ExpressionInterface $defaultValue Default value if none of the conditiosn are true
<ide> */
<del> public function __construct($conditions = [], $trueValues = [], $defaultValue = '0') {
<add> public function __construct($conditions = [], $trueValues = [], $defaultValue = 0) {
<ide> if (!empty($conditions)) {
<ide> $this->add($conditions, $trueValues);
<ide> }
<ide> protected function _addExpressions($conditions, $trueValues) {
<ide> continue;
<ide> }
<ide>
<del> $trueValue = isset($trueValues[$k]) ? $trueValues[$k] : 1;
<add> $trueValue = !empty($trueValues[$k]) ? $trueValues[$k] : 1;
<ide>
<ide> if ($trueValue === 'literal') {
<ide> $trueValue = $k;
<ide> protected function _addExpressions($conditions, $trueValues) {
<ide> 'value' => $trueValue,
<ide> 'type' => null
<ide> ];
<del> } elseif (empty($trueValue)) {
<del> $trueValue = 1;
<ide> }
<ide>
<ide> $this->_conditions[] = $c;
<ide><path>src/Database/Expression/QueryExpression.php
<ide> public function in($field, $values, $type = null) {
<ide> *
<ide> * @return QueryExpression
<ide> */
<del> public function addCase($conditions, $trueValues = [], $defaultValue = '0') {
<add> public function addCase($conditions, $trueValues = [], $defaultValue = 0) {
<ide> return $this->add(new CaseExpression($conditions, $trueValues, $defaultValue));
<ide> }
<ide>
<ide><path>tests/TestCase/Database/QueryTest.php
<ide> public function testDirectIsNull() {
<ide> * @return void
<ide> */
<ide> public function testSqlCaseStatement() {
<add> $convert = $this->connection->driver() instanceof \Cake\Database\Driver\Sqlserver;
<add>
<ide> $query = new Query($this->connection);
<ide> $publishedCase = $query
<ide> ->newExpr()
<ide> public function testSqlCaseStatement() {
<ide> ->newExpr()
<ide> ->add(['published' => 'N'])
<ide> );
<add>
<add> //SQLServer requires the case statements to be converted to int
<add> if ($convert) {
<add> $publishedCase = $query->func()
<add> ->convert([
<add> 'INT' => 'literal',
<add> $publishedCase
<add> ]);
<add> $notPublishedCase = $query->func()
<add> ->convert([
<add> 'INT' => 'literal',
<add> $notPublishedCase
<add> ]);
<add> }
<add>
<ide> $results = $query
<ide> ->select([
<ide> 'published' => $query->func()->sum($publishedCase), | 3 |
Python | Python | fix test_build_dependencies by ignoring new libs | 617977427897bd2c2bb1fce9ff190a3045169cf9 | <ide><path>spacy/language.py
<ide> def _fix_pretrained_vectors_name(nlp):
<ide> else:
<ide> raise ValueError(Errors.E092)
<ide> if nlp.vocab.vectors.size != 0:
<del> link_vectors_to_models(nlp.vocab, skip_rank=True)
<add> link_vectors_to_models(nlp.vocab)
<ide> for name, proc in nlp.pipeline:
<ide> if not hasattr(proc, "cfg"):
<ide> continue
<ide><path>spacy/tests/package/test_requirements.py
<ide> def test_build_dependencies():
<ide> "mock",
<ide> "flake8",
<ide> ]
<del> libs_ignore_setup = ["fugashi", "natto-py", "pythainlp"]
<add> libs_ignore_setup = ["fugashi", "natto-py", "pythainlp", "sudachipy", "sudachidict_core"]
<ide>
<ide> # check requirements.txt
<ide> req_dict = {} | 2 |
Ruby | Ruby | remove consts on cache clear | da9e42f3125d191ef73eabc0db03868229904231 | <ide><path>Library/Homebrew/formulary.rb
<ide> def self.formula_class_get(path)
<ide> cache.fetch(path)
<ide> end
<ide>
<add> def self.clear_cache
<add> cache.each do |key, klass|
<add> next if key == :formulary_factory
<add>
<add> namespace = klass.name.deconstantize
<add> next if namespace.deconstantize != name
<add>
<add> remove_const(namespace.demodulize)
<add> end
<add>
<add> super
<add> end
<add>
<ide> def self.load_formula(name, path, contents, namespace, flags:)
<ide> raise "Formula loading disabled by HOMEBREW_DISABLE_LOAD_FORMULA!" if Homebrew::EnvConfig.disable_load_formula?
<ide> | 1 |
Javascript | Javascript | add unit test for getdata | e00b986bd35856a86b07739284ae6431c46e5693 | <ide><path>test/unit/api_spec.js
<ide> describe('api', function() {
<ide> expect(metadata.metadata.get('dc:title')).toEqual('Basic API Test');
<ide> });
<ide> });
<add> it('gets data', function() {
<add> var promise = doc.getData();
<add> waitsForPromise(promise, function (data) {
<add> expect(true).toEqual(true);
<add> });
<add> });
<ide> });
<ide> describe('Page', function() {
<ide> var resolvePromise; | 1 |
Javascript | Javascript | add errno property to exceptions | ca6ededbd1af5cc759ec98952b84c0b34edb2d40 | <ide><path>lib/child_process.js
<ide> function setupChannel(target, channel) {
<ide> var writeReq = channel.write(buffer, 0, buffer.length, sendHandle);
<ide>
<ide> if (!writeReq) {
<del> throw new Error(errno + 'cannot write to IPC channel.');
<add> throw errnoException(errno, 'write', 'cannot write to IPC channel.');
<ide> }
<ide>
<ide> writeReq.oncomplete = nop;
<ide> ChildProcess.prototype.spawn = function(options) {
<ide> };
<ide>
<ide>
<del>function errnoException(errorno, syscall) {
<add>function errnoException(errorno, syscall, errmsg) {
<ide> // TODO make this more compatible with ErrnoException from src/node.cc
<ide> // Once all of Node is using this function the ErrnoException from
<ide> // src/node.cc should be removed.
<del> var e = new Error(syscall + ' ' + errorno);
<add> var message = syscall + ' ' + errorno;
<add> if (errmsg) {
<add> message += ' - ' + errmsg;
<add> }
<add> var e = new Error(message);
<ide> e.errno = e.code = errorno;
<ide> e.syscall = syscall;
<ide> return e; | 1 |
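The helper added in the patch above builds an `Error` carrying `errno`, `code`, and `syscall` properties plus an optional human-readable suffix. Extracted here verbatim so the resulting error shape is easy to inspect:

```javascript
// Build an Error tagged with errno/code/syscall, as in the patch above.
function errnoException(errorno, syscall, errmsg) {
  var message = syscall + ' ' + errorno;
  if (errmsg) {
    message += ' - ' + errmsg;
  }
  var e = new Error(message);
  e.errno = e.code = errorno;
  e.syscall = syscall;
  return e;
}
```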
Text | Text | fix version numbers in developers section of docs | 8e640f4c710fde95d4542abc75cb779d6bd1c737 | <ide><path>docs/docs/developers/index.md
<ide> Thanks to [BrowserStack](https://browserstack.com) for allowing our team to test
<ide>
<ide> ## Previous versions
<ide>
<del>To migrate from version 3 to version 2, please see [the v3 migration guide](../getting-started/v3-migration).
<add>To migrate from version 2 to version 3, please see [the v3 migration guide](../getting-started/v3-migration).
<ide>
<del>Version 2 has a completely different API than earlier versions.
<add>Version 3 has a largely different API than earlier versions.
<ide>
<ide> Most earlier version options have current equivalents or are the same.
<ide> | 1 |
Ruby | Ruby | add support for more http cache controls | c94a00757dac150b17d9272b72288217c66f0a2d | <ide><path>actionpack/lib/action_controller/metal/conditional_get.rb
<ide> def expires_in(seconds, options = {})
<ide> response.cache_control.merge!(
<ide> max_age: seconds,
<ide> public: options.delete(:public),
<del> must_revalidate: options.delete(:must_revalidate)
<add> must_revalidate: options.delete(:must_revalidate),
<add> stale_while_revalidate: options.delete(:stale_while_revalidate),
<add> stale_if_error: options.delete(:stale_if_error),
<ide> )
<ide> options.delete(:private)
<ide>
<ide><path>actionpack/lib/action_dispatch/http/cache.rb
<ide> def merge_and_normalize_cache_control!(cache_control)
<ide> self._cache_control = _cache_control + ", #{control[:extras].join(', ')}"
<ide> end
<ide> else
<del> extras = control[:extras]
<add> extras = control[:extras]
<ide> max_age = control[:max_age]
<add> stale_while_revalidate = control[:stale_while_revalidate]
<add> stale_if_error = control[:stale_if_error]
<ide>
<ide> options = []
<ide> options << "max-age=#{max_age.to_i}" if max_age
<ide> options << (control[:public] ? PUBLIC : PRIVATE)
<ide> options << MUST_REVALIDATE if control[:must_revalidate]
<add> options << "stale-while-revalidate=#{stale_while_revalidate.to_i}" if stale_while_revalidate
<add> options << "stale-if-error=#{stale_if_error.to_i}" if stale_if_error
<ide> options.concat(extras) if extras
<ide>
<ide> self._cache_control = options.join(", ")
<ide><path>actionpack/test/controller/render_test.rb
<ide> def conditional_hello_with_expires_in_with_public_and_must_revalidate
<ide> render action: "hello_world"
<ide> end
<ide>
<add> def conditional_hello_with_expires_in_with_stale_while_revalidate
<add> expires_in 1.minute, public: true, stale_while_revalidate: 5.minutes
<add> render action: "hello_world"
<add> end
<add>
<add> def conditional_hello_with_expires_in_with_stale_if_error
<add> expires_in 1.minute, public: true, stale_if_error: 5.minutes
<add> render action: "hello_world"
<add> end
<add>
<ide> def conditional_hello_with_expires_in_with_public_with_more_keys
<ide> expires_in 1.minute, :public => true, "s-maxage" => 5.hours
<ide> render action: "hello_world"
<ide> def test_expires_in_header_with_public_and_must_revalidate
<ide> assert_equal "max-age=60, public, must-revalidate", @response.headers["Cache-Control"]
<ide> end
<ide>
<add> def test_expires_in_header_with_stale_while_revalidate
<add> get :conditional_hello_with_expires_in_with_stale_while_revalidate
<add> assert_equal "max-age=60, public, stale-while-revalidate=300", @response.headers["Cache-Control"]
<add> end
<add>
<add> def test_expires_in_header_with_stale_if_error
<add> get :conditional_hello_with_expires_in_with_stale_if_error
<add> assert_equal "max-age=60, public, stale-if-error=300", @response.headers["Cache-Control"]
<add> end
<add>
<ide> def test_expires_in_header_with_additional_headers
<ide> get :conditional_hello_with_expires_in_with_public_with_more_keys
<ide> assert_equal "max-age=60, public, s-maxage=18000", @response.headers["Cache-Control"] | 3 |
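Editor's note: the Rails patch above extends `expires_in` with `stale_while_revalidate` and `stale_if_error`, which get serialized into the `Cache-Control` header. A minimal Python sketch of that serialization order (function and parameter names are illustrative, not the Rails internals):

```python
def build_cache_control(max_age=None, public=False, must_revalidate=False,
                        stale_while_revalidate=None, stale_if_error=None,
                        extras=None):
    """Serialize cache options into a Cache-Control header value,
    mirroring the directive order used in the patch above."""
    options = []
    if max_age is not None:
        options.append(f"max-age={int(max_age)}")
    options.append("public" if public else "private")
    if must_revalidate:
        options.append("must-revalidate")
    if stale_while_revalidate is not None:
        options.append(f"stale-while-revalidate={int(stale_while_revalidate)}")
    if stale_if_error is not None:
        options.append(f"stale-if-error={int(stale_if_error)}")
    if extras:
        options.extend(extras)
    return ", ".join(options)

print(build_cache_control(max_age=60, public=True, stale_while_revalidate=300))
# max-age=60, public, stale-while-revalidate=300
```

The expected values match the assertions in the commit's own tests (`max-age=60, public, stale-while-revalidate=300` and `max-age=60, public, stale-if-error=300`).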
PHP | PHP | prevent negative offsets | 54073e4bbc9f606f28b499b1e10d34db0b1ef53f | <ide><path>src/ORM/Behavior/TreeBehavior.php
<ide> protected function _moveUp($node, $number)
<ide> if (!$number) {
<ide> return false;
<ide> }
<add> if ($number < 0) {
<add> return $node;
<add> }
<ide>
<ide> $config = $this->config();
<ide> list($parent, $left, $right) = [$config['parent'], $config['left'], $config['right']];
<ide> protected function _moveDown($node, $number)
<ide> if (!$number) {
<ide> return false;
<ide> }
<add> if ($number < 0) {
<add> return $node;
<add> }
<ide>
<ide> $config = $this->config();
<ide> list($parent, $left, $right) = [$config['parent'], $config['left'], $config['right']]; | 1 |
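Editor's note: the CakePHP guard above returns the node unchanged for a negative move count instead of corrupting the tree. The control flow, sketched in Python (names are hypothetical; the real implementation manipulates nested-set left/right boundaries):

```python
def move_up(node, number):
    """Move a node up `number` positions among its siblings.

    Returns False for a zero count and the unchanged node for a
    negative count, matching the guards added in the patch above.
    """
    if not number:
        return False
    if number < 0:
        return node  # refuse negative offsets; leave the node as-is
    # A stand-in for the real nested-set shuffling:
    node["position"] -= number
    return node

node = {"name": "child", "position": 5}
assert move_up(node, 0) is False          # no-op request
assert move_up(node, -2) is node          # negative offset: unchanged
assert node["position"] == 5
assert move_up(node, 2)["position"] == 3  # normal move
```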
Javascript | Javascript | use defaultagent.protocol in protocol check | 9f23fe11418b66ab5812bed61e207af4e8f18efb | <ide><path>lib/_http_client.js
<ide> function ClientRequest(options, cb) {
<ide> }
<ide> self.agent = agent;
<ide>
<add> var protocol = options.protocol || defaultAgent.protocol;
<add> var expectedProtocol = defaultAgent.protocol;
<add> if (self.agent && self.agent.protocol)
<add> expectedProtocol = self.agent.protocol;
<add>
<ide> if (options.path && / /.test(options.path)) {
<ide> // The actual regex is more like /[^A-Za-z0-9\-._~!$&'()*+,;=/:@]/
<ide> // with an additional rule for ignoring percentage-escaped characters
<ide> function ClientRequest(options, cb) {
<ide> // why it only scans for spaces because those are guaranteed to create
<ide> // an invalid request.
<ide> throw new TypeError('Request path contains unescaped characters.');
<del> } else if (options.protocol && options.protocol !== self.agent.protocol) {
<del> throw new Error('Protocol:' + options.protocol + ' not supported.');
<add> } else if (protocol !== expectedProtocol) {
<add> throw new Error('Protocol:' + protocol + ' not supported.');
<ide> }
<ide>
<ide> var defaultPort = options.defaultPort || self.agent && self.agent.defaultPort;
<ide><path>test/simple/test-http-agent-no-protocol.js
<add>// Copyright Joyent, Inc. and other Node contributors.
<add>//
<add>// Permission is hereby granted, free of charge, to any person obtaining a
<add>// copy of this software and associated documentation files (the
<add>// "Software"), to deal in the Software without restriction, including
<add>// without limitation the rights to use, copy, modify, merge, publish,
<add>// distribute, sublicense, and/or sell copies of the Software, and to permit
<add>// persons to whom the Software is furnished to do so, subject to the
<add>// following conditions:
<add>//
<add>// The above copyright notice and this permission notice shall be included
<add>// in all copies or substantial portions of the Software.
<add>//
<add>// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
<add>// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
<add>// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
<add>// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
<add>// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
<add>// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
<add>// USE OR OTHER DEALINGS IN THE SOFTWARE.
<add>
<add>var common = require('../common');
<add>var assert = require('assert');
<add>var http = require('http');
<add>var url = require('url');
<add>
<add>var request = 0;
<add>var response = 0;
<add>process.on('exit', function() {
<add> assert.equal(1, request, 'http server "request" callback was not called');
<add> assert.equal(1, response, 'http client "response" callback was not called');
<add>});
<add>
<add>var server = http.createServer(function(req, res) {
<add> res.end();
<add> request++;
<add>}).listen(common.PORT, '127.0.0.1', function() {
<add> var opts = url.parse('http://127.0.0.1:' + common.PORT + '/');
<add>
<add> // remove the `protocol` field… the `http` module should fall back
<add> // to "http:", as defined by the global, default `http.Agent` instance.
<add> opts.agent = new http.Agent();
<add> opts.agent.protocol = null;
<add>
<add> http.get(opts, function (res) {
<add> response++;
<add> res.resume();
<add> server.close();
<add> });
<add>}); | 2 |
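Editor's note: the Node.js fix makes the protocol check fall back to the default agent's protocol when either the request options or a custom agent omit one, so an agent without a `protocol` no longer crashes the comparison. A hedged Python sketch of that resolution order (class and function names are illustrative):

```python
class Agent:
    """Stand-in for an HTTP agent that may or may not declare a protocol."""
    def __init__(self, protocol=None):
        self.protocol = protocol

DEFAULT_AGENT = Agent(protocol="http:")

def check_protocol(options, agent):
    """Resolve requested vs. expected protocol with the same fallbacks
    as the patch: missing values default to the global agent's protocol."""
    protocol = options.get("protocol") or DEFAULT_AGENT.protocol
    expected = DEFAULT_AGENT.protocol
    if agent is not None and agent.protocol:
        expected = agent.protocol
    if protocol != expected:
        raise ValueError(f"Protocol:{protocol} not supported.")
    return protocol

# An agent with protocol = None falls back instead of raising:
assert check_protocol({}, Agent(protocol=None)) == "http:"
```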
Javascript | Javascript | expose plugins required by worker-loader | ba7cedf79f4323b9c70fa38dca78d27a944ff616 | <ide><path>lib/webpack.js
<ide> exportPlugins(exports, {
<ide> "NamedModulesPlugin": () => require("./NamedModulesPlugin"),
<ide> "NamedChunksPlugin": () => require("./NamedChunksPlugin"),
<ide> "HashedModuleIdsPlugin": () => require("./HashedModuleIdsPlugin"),
<add> "SingleEntryPlugin": () => require("./SingleEntryPlugin"),
<ide> "ModuleFilenameHelpers": () => require("./ModuleFilenameHelpers")
<ide> });
<ide> exportPlugins(exports.optimize = {}, {
<ide> exportPlugins(exports.web = {}, {
<ide> "JsonpTemplatePlugin": () => require("./web/JsonpTemplatePlugin"),
<ide> "FetchCompileWasmTemplatePlugin": () => require("./web/FetchCompileWasmTemplatePlugin"),
<ide> });
<add>exportPlugins(exports.webworker = {}, {
<add> "WebWorkerTemplatePlugin": () => require("./webworker/WebWorkerTemplatePlugin"),
<add>});
<ide> exportPlugins(exports.node = {}, {
<ide> "NodeTemplatePlugin": () => require("./node/NodeTemplatePlugin"),
<ide> "ReadFileCompileWasmTemplatePlugin": () => require("./node/ReadFileCompileWasmTemplatePlugin"), | 1 |
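Editor's note: the webpack change registers extra plugins in an export map where each entry is a thunk (`() => require(...)`), so a plugin's module is only loaded on first access. The same lazy-export pattern, sketched in Python with a per-map cache (illustrative, not webpack's code):

```python
def export_plugins(target, plugins):
    """Attach lazily-evaluated plugin factories to `target`,
    loading and caching each plugin on first access."""
    cache = {}

    def getter(name):
        if name not in cache:
            cache[name] = plugins[name]()  # "require" happens here, once
        return cache[name]

    for name in plugins:
        target[name] = lambda n=name: getter(n)

exports = {}
loads = []
export_plugins(exports, {
    "SingleEntryPlugin": lambda: loads.append("SingleEntryPlugin") or "plugin",
})
assert loads == []                            # nothing loaded at registration
assert exports["SingleEntryPlugin"]() == "plugin"
exports["SingleEntryPlugin"]()                # cached: no second load
assert loads == ["SingleEntryPlugin"]
```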
Java | Java | update javadoc in simpmessagesendingoperations | 96563c7eea7f33e5a1fc37b6491ca333b2cca2c9 | <ide><path>spring-messaging/src/main/java/org/springframework/messaging/simp/SimpMessageSendingOperations.java
<ide> * the Spring Framework support for Simple Messaging Protocols (like STOMP).
<ide> *
<ide> * <p>For more on user destinations see
<del> * {@link org.springframework.messaging.simp.user.UserDestinationResolver}.
<add> * {@link org.springframework.messaging.simp.user.UserDestinationResolver
<add> * UserDestinationResolver}.
<add> *
<add> * <p>Generally it is expected the user is the one authenticated with the
<add> * WebSocket session (or by extension the user authenticated with the
<add> * handshake request that started the session). However if the session is
<add> * not authenticated, it is also possible to pass the session id (if known)
<add> * in place of the user name. Keep in mind though that in that scenario,
<add> * you must use one of the overloaded methods that accept headers making sure the
<add> * {@link org.springframework.messaging.simp.SimpMessageHeaderAccessor#setSessionId
<add> * sessionId} header has been set accordingly.
<ide> *
<ide> * @author Rossen Stoyanchev
<ide> * @since 4.0 | 1 |
Javascript | Javascript | remove usage of require('util') | 1500e5de64e4f7cfb482ad353293645a5599d93f | <ide><path>lib/internal/child_process.js
<ide> const { validateString } = require('internal/validators');
<ide> const EventEmitter = require('events');
<ide> const net = require('net');
<ide> const dgram = require('dgram');
<del>const util = require('util');
<add>const inspect = require('internal/util/inspect').inspect;
<ide> const assert = require('internal/assert');
<ide>
<ide> const { Process } = internalBinding('process_wrap');
<ide> function _validateStdio(stdio, sync) {
<ide> throw new ERR_INVALID_OPT_VALUE('stdio', stdio);
<ide> }
<ide> } else if (!Array.isArray(stdio)) {
<del> throw new ERR_INVALID_OPT_VALUE('stdio', util.inspect(stdio));
<add> throw new ERR_INVALID_OPT_VALUE('stdio', inspect(stdio));
<ide> }
<ide>
<ide> // At least 3 stdio will be created
<ide> function _validateStdio(stdio, sync) {
<ide> } else if (isArrayBufferView(stdio) || typeof stdio === 'string') {
<ide> if (!sync) {
<ide> cleanup();
<del> throw new ERR_INVALID_SYNC_FORK_INPUT(util.inspect(stdio));
<add> throw new ERR_INVALID_SYNC_FORK_INPUT(inspect(stdio));
<ide> }
<ide> } else {
<ide> // Cleanup
<ide> cleanup();
<del> throw new ERR_INVALID_OPT_VALUE('stdio', util.inspect(stdio));
<add> throw new ERR_INVALID_OPT_VALUE('stdio', inspect(stdio));
<ide> }
<ide>
<ide> return acc; | 1 |
Javascript | Javascript | add inspect function to tupleorigin | 883173289d23fb9ad86a39dc074e77ee9e6969ee | <ide><path>lib/internal/url.js
<ide> class TupleOrigin {
<ide> result += `:${this.port}`;
<ide> return result;
<ide> }
<add>
<add> inspect() {
<add> return `TupleOrigin {
<add> scheme: ${this.scheme},
<add> host: ${this.host},
<add> port: ${this.port},
<add> domain: ${this.domain}
<add> }`;
<add> }
<ide> }
<ide>
<ide> class URL {
<ide><path>test/parallel/test-util-inspect-tuple-origin.js
<add>'use strict';
<add>
<add>require('../common');
<add>const assert = require('assert');
<add>const inspect = require('util').inspect;
<add>const URL = require('url').URL;
<add>
<add>assert.strictEqual(
<add> inspect(URL.originFor('http://test.com:8000')),
<add> `TupleOrigin {
<add> scheme: http,
<add> host: test.com,
<add> port: 8000,
<add> domain: null
<add> }`
<add> );
<add>
<add>assert.strictEqual(
<add> inspect(URL.originFor('http://test.com')),
<add> `TupleOrigin {
<add> scheme: http,
<add> host: test.com,
<add> port: undefined,
<add> domain: null
<add> }`
<add> );
<add>
<add>
<add>assert.strictEqual(
<add> inspect(URL.originFor('https://test.com')),
<add> `TupleOrigin {
<add> scheme: https,
<add> host: test.com,
<add> port: undefined,
<add> domain: null
<add> }`
<add> ); | 2 |
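Editor's note: the patch gives `TupleOrigin` an `inspect` method so `util.inspect` prints its fields instead of an opaque object. The closest Python analogue is defining `__repr__` (a sketch; the field names follow the patch, the single-line layout is a simplification):

```python
class TupleOrigin:
    """Minimal stand-in for the URL origin tuple in the patch above."""
    def __init__(self, scheme, host, port=None, domain=None):
        self.scheme, self.host = scheme, host
        self.port, self.domain = port, domain

    def __repr__(self):
        # Custom representation, analogous to the inspect() hook above.
        return (f"TupleOrigin {{ scheme: {self.scheme}, host: {self.host}, "
                f"port: {self.port}, domain: {self.domain} }}")

origin = TupleOrigin("http", "test.com", 8000)
assert repr(origin) == "TupleOrigin { scheme: http, host: test.com, port: 8000, domain: None }"
```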
Text | Text | remove link to google cert | 307eda4e1cc8ac0847ac42385271fdf2a2c0d242 | <ide><path>guide/english/certifications/index.md
<ide> Upon completion of all six certificates, the freeCodeCamp [Full Stack Developmen
<ide> For more information about freeCodeCamp, visit [about.freecodecamp.org](https://about.freecodecamp.org/).
<ide>
<ide> For more information about the new certification program, see the forum post [here](https://www.freecodecamp.org/forum/t/freecodecamps-new-certificates-heres-how-were-rolling-them-out/141618).
<del>
<del>### Associate Android Developer Certification
<del>
<del>A great way to test one's Android skills and certify that by none other than Google.
<del>The link to the certification and its details are [here](https://developers.google.com/training/certification/). | 1 |
Go | Go | fix a minor typo | ffcc4a1e52def3b1552ae3c7ba4ee1fb47f92cea | <ide><path>daemon/execdriver/native/driver.go
<ide> func (d *driver) Run(c *execdriver.Command, pipes *execdriver.Pipes, startCallba
<ide> oom := notifyOnOOM(cont)
<ide> waitF := p.Wait
<ide> if nss := cont.Config().Namespaces; !nss.Contains(configs.NEWPID) {
<del> // we need such hack for tracking processes with inerited fds,
<add> // we need such hack for tracking processes with inherited fds,
<ide> // because cmd.Wait() waiting for all streams to be copied
<ide> waitF = waitInPIDHost(p, cont)
<ide> } | 1 |
Ruby | Ruby | preserve chained method punctuation | d399ee93329a72584009239f0fc010701361ba24 | <ide><path>activerecord/lib/active_record/callbacks.rb
<ide> def before_validation_on_update() end
<ide> # existing objects that have a record.
<ide> def after_validation_on_update() end
<ide>
<del> def valid_with_callbacks #:nodoc:
<add> def valid_with_callbacks? #:nodoc:
<ide> return false if callback(:before_validation) == false
<ide> if new_record? then result = callback(:before_validation_on_create) else result = callback(:before_validation_on_update) end
<ide> return false if result == false | 1 |
Ruby | Ruby | fix cache invalidation | 05568420c0f6290cf09c5448c2891e5134360629 | <ide><path>Library/Homebrew/formula_installer.rb
<ide> def finish
<ide> # Updates the cache for a particular formula after doing an install
<ide> CacheStoreDatabase.use(:linkage) do |db|
<ide> break unless db.created?
<del> LinkageChecker.new(keg, formula, cache_db: db)
<add> LinkageChecker.new(keg, formula, cache_db: db, rebuild_cache: true)
<ide> end
<ide>
<ide> # Update tab with actual runtime dependencies
<ide><path>Library/Homebrew/keg.rb
<ide> def remove_opt_record
<ide> end
<ide>
<ide> def uninstall
<add> CacheStoreDatabase.use(:linkage) do |db|
<add> break unless db.created?
<add> LinkageCacheStore.new(path, db).flush_cache!
<add> end
<add>
<ide> path.rmtree
<ide> path.parent.rmdir_if_possible
<ide> remove_opt_record if optlinked?
<ide><path>Library/Homebrew/linkage_cache_store.rb
<ide> # by the `brew linkage` command
<ide> #
<ide> class LinkageCacheStore < CacheStore
<del> # @param [String] keg_name
<add> # @param [String] keg_path
<ide> # @param [CacheStoreDatabase] database
<ide> # @return [nil]
<del> def initialize(keg_name, database)
<del> @keg_name = keg_name
<add> def initialize(keg_path, database)
<add> @keg_path = keg_path
<ide> super(database)
<ide> end
<ide>
<del> # Returns `true` if the database has any value for the current `keg_name`
<add> # Returns `true` if the database has any value for the current `keg_path`
<ide> #
<ide> # @return [Boolean]
<ide> def keg_exists?
<del> !database.get(@keg_name).nil?
<add> !database.get(@keg_path).nil?
<ide> end
<ide>
<ide> # Inserts dylib-related information into the cache if it does not exist or
<ide> def update!(hash_values)
<ide> EOS
<ide> end
<ide>
<del> database.set @keg_name, ruby_hash_to_json_string(hash_values)
<add> database.set @keg_path, ruby_hash_to_json_string(hash_values)
<ide> end
<ide>
<ide> # @param [Symbol] the type to fetch from the `LinkageCacheStore`
<ide> def fetch_type(type)
<ide>
<ide> # @return [nil]
<ide> def flush_cache!
<del> database.delete(@keg_name)
<add> database.delete(@keg_path)
<ide> end
<ide>
<ide> private
<ide> def flush_cache!
<ide> # @param [Symbol] type
<ide> # @return [Hash]
<ide> def fetch_hash_values(type)
<del> keg_cache = database.get(@keg_name)
<add> keg_cache = database.get(@keg_path)
<ide> return {} unless keg_cache
<ide> json_string_to_ruby_hash(keg_cache)[type.to_s]
<ide> end
<ide><path>Library/Homebrew/linkage_checker.rb
<ide> class LinkageChecker
<ide> attr_reader :undeclared_deps
<ide>
<ide> def initialize(keg, formula = nil, cache_db:,
<del> use_cache: !ENV["HOMEBREW_LINKAGE_CACHE"].nil?)
<add> use_cache: !ENV["HOMEBREW_LINKAGE_CACHE"].nil?,
<add> rebuild_cache: false)
<ide> @keg = keg
<ide> @formula = formula || resolve_formula(keg)
<del> @store = LinkageCacheStore.new(keg.name, cache_db) if use_cache
<add> @store = LinkageCacheStore.new(keg.to_s, cache_db) if use_cache
<ide>
<ide> @system_dylibs = Set.new
<ide> @broken_dylibs = Set.new
<ide> def initialize(keg, formula = nil, cache_db:,
<ide> @undeclared_deps = []
<ide> @unnecessary_deps = []
<ide>
<del> check_dylibs
<add> check_dylibs(rebuild_cache: rebuild_cache)
<ide> end
<ide>
<ide> def display_normal_output
<ide> def dylib_to_dep(dylib)
<ide> Regexp.last_match(2)
<ide> end
<ide>
<del> def check_dylibs
<add> def check_dylibs(rebuild_cache:)
<add> keg_files_dylibs = nil
<add>
<add> if rebuild_cache
<add> store&.flush_cache!
<add> else
<add> keg_files_dylibs = store&.fetch_type(:keg_files_dylibs)
<add> end
<add>
<ide> keg_files_dylibs_was_empty = false
<del> keg_files_dylibs = store&.fetch_type(:keg_files_dylibs)
<ide> keg_files_dylibs ||= {}
<ide> if keg_files_dylibs.empty?
<ide> keg_files_dylibs_was_empty = true | 4 |
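Editor's note: the Homebrew fix keys the linkage cache by the keg's full path rather than its name, flushes the entry on uninstall, and rebuilds it after install, so stale linkage data is never served for a reinstalled keg. A minimal Python sketch of that cache lifecycle (a hypothetical store, not Homebrew's API):

```python
class LinkageCacheStore:
    """Cache keyed by keg path; flushed on uninstall, rebuilt on install."""
    def __init__(self):
        self._db = {}

    def update(self, keg_path, values, rebuild=False):
        if rebuild:
            self.flush(keg_path)  # drop the stale entry before recomputing
        self._db[keg_path] = values

    def fetch(self, keg_path):
        return self._db.get(keg_path)

    def flush(self, keg_path):
        self._db.pop(keg_path, None)

store = LinkageCacheStore()
store.update("/usr/local/Cellar/foo/1.0", {"dylibs": ["libfoo"]})
assert store.fetch("/usr/local/Cellar/foo/1.0") == {"dylibs": ["libfoo"]}
store.flush("/usr/local/Cellar/foo/1.0")   # keg uninstalled
assert store.fetch("/usr/local/Cellar/foo/1.0") is None
```

Keying by the full path also means two kegs with the same name in different cellars cannot collide in the cache, which is the invalidation bug the commit addresses.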