| content_type (stringclasses, 8 values) | main_lang (stringclasses, 7 values) | message (stringlengths, 1–50) | sha (stringlengths, 40) | patch (stringlengths, 52–962k) | file_count (int64, 1–300) |
|---|---|---|---|---|---|
Javascript | Javascript | update example to use ngmodel best practices | e13224b2dff2cbaeee5cc5845d81b37d4a4839c1 | <ide><path>src/ng/directive/input.js
<ide> var inputType = {
<ide> <script>
<ide> angular.module('checkboxExample', [])
<ide> .controller('ExampleController', ['$scope', function($scope) {
<del> $scope.value1 = true;
<del> $scope.value2 = 'YES'
<add> $scope.checkboxModel = {
<add> value1 : true,
<add> value2 : 'YES'
<add> };
<ide> }]);
<ide> </script>
<ide> <form name="myForm" ng-controller="ExampleController">
<del> Value1: <input type="checkbox" ng-model="value1"> <br/>
<del> Value2: <input type="checkbox" ng-model="value2"
<add> Value1: <input type="checkbox" ng-model="checkboxModel.value1"> <br/>
<add> Value2: <input type="checkbox" ng-model="checkboxModel.value2"
<ide> ng-true-value="'YES'" ng-false-value="'NO'"> <br/>
<del> <tt>value1 = {{value1}}</tt><br/>
<del> <tt>value2 = {{value2}}</tt><br/>
<add> <tt>value1 = {{checkboxModel.value1}}</tt><br/>
<add> <tt>value2 = {{checkboxModel.value2}}</tt><br/>
<ide> </form>
<ide> </file>
<ide> <file name="protractor.js" type="protractor">
<ide> it('should change state', function() {
<del> var value1 = element(by.binding('value1'));
<del> var value2 = element(by.binding('value2'));
<add> var value1 = element(by.binding('checkboxModel.value1'));
<add> var value2 = element(by.binding('checkboxModel.value2'));
<ide>
<ide> expect(value1.getText()).toContain('true');
<ide> expect(value2.getText()).toContain('YES');
<ide>
<del> element(by.model('value1')).click();
<del> element(by.model('value2')).click();
<add> element(by.model('checkboxModel.value1')).click();
<add> element(by.model('checkboxModel.value2')).click();
<ide>
<ide> expect(value1.getText()).toContain('false');
<ide> expect(value2.getText()).toContain('NO'); | 1 |
Ruby | Ruby | remove invalid comment | bbbef23e1f7ece2eebe860d687931b8998b1a926 | <ide><path>activerecord/lib/active_record/connection_adapters/mysql_adapter.rb
<ide> def client_encoding
<ide> end
<ide>
<ide> def exec_query(sql, name = 'SQL', binds = [])
<del> # If the configuration sets prepared_statements:false, binds will
<del> # always be empty, since the bind variables will have been already
<del> # substituted and removed from binds by BindVisitor, so this will
<del> # effectively disable prepared statement usage completely.
<ide> if without_prepared_statement?(binds)
<ide> result_set, affected_rows = exec_without_stmt(sql, name)
<ide> else | 1 |
Text | Text | fix a typo in ar query interface [ci skip] | 6140d0ca3fc763b20beeefd4a0b8212ebe9149a6 | <ide><path>guides/source/active_record_querying.md
<ide> articles, all the articles would still be loaded. By using `joins` (an INNER
<ide> JOIN), the join conditions **must** match, otherwise no records will be
<ide> returned.
<ide>
<del>NOTE: If an association is eager loaded as part of a join, any fields from a custom select clause will not present be on the loaded models.
<add>NOTE: If an association is eager loaded as part of a join, any fields from a custom select clause will not be present on the loaded models.
<ide> This is because it is ambiguous whether they should appear on the parent record, or the child.
<ide>
<ide> Scopes | 1 |
Python | Python | fix sub-classing of matrices. | ba41f2f5d864551b4462a9f53e8f23f626b11105 | <ide><path>numpy/core/defmatrix.py
<ide> def __new__(subtype, data, dtype=None, copy=True):
<ide> intype = data.dtype
<ide> else:
<ide> intype = N.dtype(dtype)
<del> new = data.view(matrix)
<add> new = data.view(subtype)
<ide> if intype != data.dtype:
<ide> return new.astype(intype)
<ide> if copy: return new.copy() | 1 |
Text | Text | change faq for dynamic routing | 0e01435f648d266198ed5de0a64ca1603f76c5a0 | <ide><path>packages/next/README.md
<ide> - [Custom server and routing](#custom-server-and-routing)
<ide> - [Disabling file-system routing](#disabling-file-system-routing)
<ide> - [Dynamic assetPrefix](#dynamic-assetprefix)
<add> - [Changing x-powered-by](#changing-x-powered-by)
<ide> - [Dynamic Import](#dynamic-import)
<ide> - [Basic Usage (Also does SSR)](#basic-usage-also-does-ssr)
<ide> - [With named exports](#with-named-exports)
<ide> As a result, we were able to introduce a very simple approach to routing that co
<ide> <details>
<ide> <summary>How do I define a custom fancy route?</summary>
<ide>
<del>We [added](#custom-server-and-routing) the ability to map between an arbitrary URL and any component by supplying a request handler.
<add>Next.js provide [dynamic routing](#dynamic-routing) solution out of the box. This allows to use pretty links in url.
<ide>
<del>On the client side, we have a parameter call `as` on `<Link>` that _decorates_ the URL differently from the URL it _fetches_.
<add>You can check an [exmaple](https://github.com/zeit/next.js/tree/canary/examples/dynamic-routing) how it works.
<ide>
<ide> </details>
<ide> | 1 |
Ruby | Ruby | remove useless empty string | 308ed01f14d89f84793d471c40325dc5fe25efda | <ide><path>activesupport/lib/active_support/number_helper/number_to_phone_converter.rb
<ide> module ActiveSupport
<ide> module NumberHelper
<ide> class NumberToPhoneConverter < NumberConverter
<ide> def convert
<del> str = ''
<del> str << country_code(opts[:country_code])
<add> str = country_code(opts[:country_code])
<ide> str << convert_to_phone_number(number.to_s.strip)
<ide> str << phone_ext(opts[:extension])
<ide> end | 1 |
Ruby | Ruby | remove duplicates from argv.named | 6dbdc9ab0c5c87d94569bb418c4838edceff09b7 | <ide><path>Library/Homebrew/ARGV+yeast.rb
<ide> #
<ide> module HomebrewArgvExtension
<ide> def named
<del> reject {|arg| arg[0..0] == '-'}
<add> reject{|arg| arg[0..0] == '-'}.collect{|arg| arg.downcase}.uniq
<ide> end
<ide> def options
<ide> select {|arg| arg[0..0] == '-'}
<ide><path>Library/Homebrew/unittest.rb
<ide> def test_abstract_formula
<ide> assert_raises(RuntimeError) { f.prefix }
<ide> nostdout { assert_raises(ExecutionError) { f.brew } }
<ide> end
<add>
<add> def test_no_ARGV_dupes
<add> ARGV.unshift'foo'
<add> ARGV.unshift'foo'
<add> n=0
<add> ARGV.named.each{|arg| n+=1 if arg == 'foo'}
<add> assert_equal 1, n
<add> end
<ide> end | 2 |
Javascript | Javascript | remove messages in assert.strictequal | 5843b2c945c66d3c21541ae9d15b9fbd9dde6622 | <ide><path>test/parallel/test-tls-pfx-gh-5100-regr.js
<ide> const server = tls.createServer({
<ide> requestCert: true,
<ide> rejectUnauthorized: false
<ide> }, common.mustCall(function(c) {
<del> assert.strictEqual(
<del> c.authorizationError,
<del> null,
<del> 'authorizationError must be null'
<del> );
<add> assert.strictEqual(c.authorizationError, null);
<ide> c.end();
<ide> })).listen(0, function() {
<ide> const client = tls.connect({ | 1 |
Javascript | Javascript | fix synchronous thenable rejection | 10a7a5b5ced683125d68584a78084faac3197846 | <ide><path>packages/react-reconciler/src/ReactFiberLazyComponent.js
<ide> export function readLazyComponentType<T>(lazyComponent: LazyComponent<T>): T {
<ide> }
<ide> },
<ide> );
<del> // Check if it resolved synchronously
<del> if (lazyComponent._status === Resolved) {
<del> return lazyComponent._result;
<add> // Handle synchronous thenables.
<add> switch (lazyComponent._status) {
<add> case Resolved:
<add> return lazyComponent._result;
<add> case Rejected:
<add> throw lazyComponent._result;
<ide> }
<ide> lazyComponent._result = thenable;
<ide> throw thenable;
<ide><path>packages/react-reconciler/src/__tests__/ReactLazy-test.internal.js
<ide> describe('ReactLazy', () => {
<ide> expect(root).toMatchRenderedOutput('Hi');
<ide> });
<ide>
<add> it('can reject synchronously without suspending', async () => {
<add> const LazyText = lazy(() => ({
<add> then(resolve, reject) {
<add> reject(new Error('oh no'));
<add> },
<add> }));
<add>
<add> class ErrorBoundary extends React.Component {
<add> state = {};
<add> static getDerivedStateFromError(error) {
<add> return {message: error.message};
<add> }
<add> render() {
<add> return this.state.message
<add> ? `Error: ${this.state.message}`
<add> : this.props.children;
<add> }
<add> }
<add>
<add> const root = ReactTestRenderer.create(
<add> <ErrorBoundary>
<add> <Suspense fallback={<Text text="Loading..." />}>
<add> <LazyText text="Hi" />
<add> </Suspense>
<add> </ErrorBoundary>,
<add> );
<add> expect(ReactTestRenderer).toHaveYielded([]);
<add> expect(root).toMatchRenderedOutput('Error: oh no');
<add> });
<add>
<ide> it('multiple lazy components', async () => {
<ide> function Foo() {
<ide> return <Text text="Foo" />; | 2 |
PHP | PHP | fix cs error | e4b4eedf9cff24e137090cfa7f6976eda9285fe3 | <ide><path>src/Collection/Iterator/MapReduce.php
<ide> public function emitIntermediate($val, $bucket): void
<ide> */
<ide> public function emit($val, $key = null): void
<ide> {
<del> $this->_result[$key === null ? $this->_counter : $key] = $val;
<add> $this->_result[$key ?? $this->_counter] = $val;
<ide> $this->_counter++;
<ide> }
<ide>
<ide><path>src/Core/InstanceConfigTrait.php
<ide> public function getConfig(?string $key = null, $default = null)
<ide>
<ide> $return = $this->_configRead($key);
<ide>
<del> return $return === null ? $default : $return;
<add> return $return ?? $default;
<ide> }
<ide>
<ide> /**
<ide><path>src/I18n/DateFormatTrait.php
<ide> public function i18nFormat($format = null, $timezone = null, $locale = null)
<ide> $time = $time->timezone($timezone);
<ide> }
<ide>
<del> $format = $format !== null ? $format : static::$_toStringFormat;
<add> $format = $format ?? static::$_toStringFormat;
<ide> $locale = $locale ?: static::$defaultLocale;
<ide>
<ide> return $this->_formatObject($time, $format, $locale);
<ide><path>src/ORM/EagerLoader.php
<ide> protected function _buildAssociationsMap(array $map, array $level, bool $matchin
<ide> 'canBeJoined' => $canBeJoined,
<ide> 'entityClass' => $instance->getTarget()->getEntityClass(),
<ide> 'nestKey' => $canBeJoined ? $assoc : $meta->aliasPath(),
<del> 'matching' => $forMatching !== null ? $forMatching : $matching,
<add> 'matching' => $forMatching ?? $matching,
<ide> 'targetProperty' => $meta->targetProperty(),
<ide> ];
<ide> if ($canBeJoined && $associations) { | 4 |
Python | Python | restore meta information for each search field. | 3ca0b15b17182f046555da9348f5c58796b54851 | <ide><path>rest_framework/filters.py
<ide> def must_call_distinct(self, queryset, search_fields):
<ide> """
<ide> Return True if 'distinct()' should be used to query the given lookups.
<ide> """
<del> opts = queryset.model._meta
<ide> for search_field in search_fields:
<add> opts = queryset.model._meta
<ide> if search_field[0] in self.lookup_prefixes:
<ide> search_field = search_field[1:]
<ide> parts = search_field.split(LOOKUP_SEP)
<ide><path>tests/test_filters.py
<ide> class AttributeModel(models.Model):
<ide> label = models.CharField(max_length=32)
<ide>
<ide>
<add>class SearchFilterModelFk(models.Model):
<add> title = models.CharField(max_length=20)
<add> attribute = models.ForeignKey(AttributeModel)
<add>
<add>
<add>class SearchFilterFkSerializer(serializers.ModelSerializer):
<add> class Meta:
<add> model = SearchFilterModelFk
<add> fields = '__all__'
<add>
<add>
<add>class SearchFilterFkTests(TestCase):
<add>
<add> def test_must_call_distinct(self):
<add> filter_ = filters.SearchFilter()
<add> prefixes = [''] + list(filter_.lookup_prefixes)
<add> for prefix in prefixes:
<add> self.assertFalse(
<add> filter_.must_call_distinct(
<add> SearchFilterModelFk._meta, ["%stitle" % prefix]
<add> )
<add> )
<add> self.assertFalse(
<add> filter_.must_call_distinct(
<add> SearchFilterModelFk._meta, ["%stitle" % prefix, "%sattribute__label" % prefix]
<add> )
<add> )
<add>
<add> def test_must_call_distinct_restores_meta_for_each_field(self):
<add> # In this test case the attribute of the fk model comes first in the
<add> # list of search fields.
<add> filter_ = filters.SearchFilter()
<add> prefixes = [''] + list(filter_.lookup_prefixes)
<add> for prefix in prefixes:
<add> self.assertFalse(
<add> filter_.must_call_distinct(
<add> SearchFilterModelFk._meta, ["%sattribute__label" % prefix, "%stitle" % prefix]
<add> )
<add> )
<add>
<add>
<ide> class SearchFilterModelM2M(models.Model):
<ide> title = models.CharField(max_length=20)
<ide> text = models.CharField(max_length=100) | 2 |
Python | Python | remove log group from aws batch system test | 52c9a888285e7b5a95fe015fb20e4f2754df437c | <ide><path>tests/system/providers/amazon/aws/example_batch.py
<ide> BatchSensor,
<ide> )
<ide> from airflow.utils.trigger_rule import TriggerRule
<del>from tests.system.providers.amazon.aws.utils import ENV_ID_KEY, SystemTestContextBuilder, split_string
<add>from tests.system.providers.amazon.aws.utils import (
<add> ENV_ID_KEY,
<add> SystemTestContextBuilder,
<add> purge_logs,
<add> split_string,
<add>)
<ide>
<ide> DAG_ID = 'example_batch'
<ide>
<ide> def delete_job_queue(job_queue_name):
<ide> )
<ide>
<ide>
<add>@task(trigger_rule=TriggerRule.ALL_DONE)
<add>def delete_logs(env_id: str) -> None:
<add> generated_log_groups: list[tuple[str, str | None]] = [
<add> # Format: ('log group name', 'log stream prefix')
<add> ('/aws/batch/job', env_id),
<add> ]
<add>
<add> purge_logs(generated_log_groups)
<add>
<add>
<ide> with DAG(
<ide> dag_id=DAG_ID,
<ide> schedule='@once',
<ide> def delete_job_queue(job_queue_name):
<ide> catchup=False,
<ide> ) as dag:
<ide> test_context = sys_test_context_task()
<add> env_id = test_context[ENV_ID_KEY]
<ide>
<del> batch_job_name: str = f'{test_context[ENV_ID_KEY]}-test-job'
<del> batch_job_definition_name: str = f'{test_context[ENV_ID_KEY]}-test-job-definition'
<del> batch_job_compute_environment_name: str = f'{test_context[ENV_ID_KEY]}-test-job-compute-environment'
<del> batch_job_queue_name: str = f'{test_context[ENV_ID_KEY]}-test-job-queue'
<add> batch_job_name: str = f'{env_id}-test-job'
<add> batch_job_definition_name: str = f'{env_id}-test-job-definition'
<add> batch_job_compute_environment_name: str = f'{env_id}-test-job-compute-environment'
<add> batch_job_queue_name: str = f'{env_id}-test-job-queue'
<ide>
<ide> security_groups = split_string(test_context[SECURITY_GROUPS_KEY])
<ide> subnets = split_string(test_context[SUBNETS_KEY])
<ide> def delete_job_queue(job_queue_name):
<ide> wait_for_compute_environment_disabled,
<ide> delete_compute_environment(batch_job_compute_environment_name),
<ide> delete_job_definition(batch_job_definition_name),
<add> delete_logs(env_id),
<ide> )
<ide>
<ide> from tests.system.utils.watcher import watcher | 1 |
Text | Text | change node ls and service ls api and docs | 48c3fcedfa95f7c12868b218bd831dac8e04864f | <ide><path>docs/reference/api/docker_remote_api_v1.24.md
<ide> List nodes
<ide> - **filters** – a JSON encoded value of the filters (a `map[string][]string`) to process on the
<ide> nodes list. Available filters:
<ide> - `id=<node id>`
<add> - `label=<engine label>`
<add> - `membership=`(`accepted`|`pending`)`
<ide> - `name=<node name>`
<del> - `membership=`(`pending`|`accepted`|`rejected`)`
<del> - `role=`(`worker`|`manager`)`
<add> - `role=`(`manager`|`worker`)`
<ide>
<ide> **Status codes**:
<ide>
<ide> List services
<ide>
<ide> - **filters** – a JSON encoded value of the filters (a `map[string][]string`) to process on the
<ide> services list. Available filters:
<del> - `id=<node id>`
<del> - `name=<node name>`
<add> - `id=<service id>`
<add> - `label=<service label>`
<add> - `name=<service name>`
<ide>
<ide> **Status codes**:
<ide>
<ide><path>docs/reference/api/docker_remote_api_v1.25.md
<ide> List nodes
<ide> - **filters** – a JSON encoded value of the filters (a `map[string][]string`) to process on the
<ide> nodes list. Available filters:
<ide> - `id=<node id>`
<add> - `label=<engine label>`
<add> - `membership=`(`accepted`|`pending`)`
<ide> - `name=<node name>`
<del> - `membership=`(`pending`|`accepted`|`rejected`)`
<del> - `role=`(`worker`|`manager`)`
<add> - `role=`(`manager`|`worker`)`
<ide>
<ide> **Status codes**:
<ide>
<ide> List services
<ide>
<ide> - **filters** – a JSON encoded value of the filters (a `map[string][]string`) to process on the
<ide> services list. Available filters:
<del> - `id=<node id>`
<del> - `name=<node name>`
<add> - `id=<service id>`
<add> - `label=<service label>`
<add> - `name=<service name>`
<ide>
<ide> **Status codes**:
<ide>
<ide><path>docs/reference/commandline/node_ls.md
<ide> than one filter, then pass multiple flags (e.g., `--filter "foo=bar" --filter "b
<ide>
<ide> The currently supported filters are:
<ide>
<del>* name
<del>* id
<del>* label
<add>* [id](node_ls.md#id)
<add>* [label](node_ls.md#label)
<add>* [membership](node_ls.md#membership)
<add>* [name](node_ls.md#name)
<add>* [role](node_ls.md#role)
<ide>
<del>### name
<add>#### ID
<ide>
<del>The `name` filter matches on all or part of a node name.
<del>
<del>The following filter matches the node with a name equal to `swarm-master` string.
<add>The `id` filter matches all or part of a node's id.
<ide>
<ide> ```bash
<del>$ docker node ls -f name=swarm-manager1
<add>$ docker node ls -f id=1
<ide>
<del>ID HOSTNAME STATUS AVAILABILITY MANAGER STATUS
<del>e216jshn25ckzbvmwlnh5jr3g * swarm-manager1 Ready Active Leader
<add>ID HOSTNAME STATUS AVAILABILITY MANAGER STATUS
<add>1bcef6utixb0l0ca7gxuivsj0 swarm-worker2 Ready Active
<ide> ```
<ide>
<del>### id
<add>#### Label
<ide>
<del>The `id` filter matches all or part of a node's id.
<add>The `label` filter matches nodes based on engine labels and on the presence of a `label` alone or a `label` and a value. Node labels are currently not used for filtering.
<add>
<add>The following filter matches nodes with the `foo` label regardless of its value.
<ide>
<ide> ```bash
<del>$ docker node ls -f id=1
<add>$ docker node ls -f "label=foo"
<ide>
<ide> ID HOSTNAME STATUS AVAILABILITY MANAGER STATUS
<ide> 1bcef6utixb0l0ca7gxuivsj0 swarm-worker2 Ready Active
<ide> ```
<ide>
<del>#### label
<add>#### Membership
<ide>
<del>The `label` filter matches tasks based on the presence of a `label` alone or a `label` and a
<del>value.
<add>The `membership` filter matches nodes based on the presence of a `membership` and a value
<add>`accepted` or `pending`.
<ide>
<del>The following filter matches nodes with the `usage` label regardless of its value.
<add>The following filter matches nodes with the `membership` of `accepted`.
<ide>
<ide> ```bash
<del>$ docker node ls -f "label=foo"
<add>$ docker node ls -f "membership=accepted"
<ide>
<del>ID HOSTNAME STATUS AVAILABILITY MANAGER STATUS
<del>1bcef6utixb0l0ca7gxuivsj0 swarm-worker2 Ready Active
<add>ID HOSTNAME STATUS AVAILABILITY MANAGER STATUS
<add>1bcef6utixb0l0ca7gxuivsj0 swarm-worker2 Ready Active
<add>38ciaotwjuritcdtn9npbnkuz swarm-worker1 Ready Active
<ide> ```
<ide>
<add>#### Name
<add>
<add>The `name` filter matches on all or part of a node hostname.
<add>
<add>The following filter matches the nodes with a name equal to `swarm-master` string.
<add>
<add>```bash
<add>$ docker node ls -f name=swarm-manager1
<add>
<add>ID HOSTNAME STATUS AVAILABILITY MANAGER STATUS
<add>e216jshn25ckzbvmwlnh5jr3g * swarm-manager1 Ready Active Leader
<add>```
<add>
<add>#### Role
<add>
<add>The `role` filter matches nodes based on the presence of a `role` and a value `worker` or `manager`.
<add>
<add>The following filter matches nodes with the `manager` role.
<add>
<add>```bash
<add>$ docker node ls -f "role=manager"
<add>
<add>ID HOSTNAME STATUS AVAILABILITY MANAGER STATUS
<add>e216jshn25ckzbvmwlnh5jr3g * swarm-manager1 Ready Active Leader
<add>```
<ide>
<ide> ## Related information
<ide>
<ide><path>docs/reference/commandline/service_ls.md
<ide> than one filter, then pass multiple flags (e.g., `--filter "foo=bar" --filter "b
<ide>
<ide> The currently supported filters are:
<ide>
<del>* [id](#id)
<del>* [label](#label)
<del>* [name](#name)
<add>* [id](service_ls.md#id)
<add>* [label](service_ls.md#label)
<add>* [name](service_ls.md#name)
<ide>
<ide> #### ID
<ide>
<ide> ID NAME MODE REPLICAS IMAGE
<ide>
<ide> #### Name
<ide>
<del>The `name` filter matches on all or part of a tasks's name.
<add>The `name` filter matches on all or part of a service's name.
<ide>
<ide> The following filter matches services with a name containing `redis`.
<ide> | 4 |
Go | Go | remove unused runconfig.config.securityopt field | e39646d2e1becb652ff0a8a7d131067f38a94247 | <ide><path>runconfig/config.go
<ide> type Config struct {
<ide> NetworkDisabled bool
<ide> MacAddress string
<ide> OnBuild []string
<del> SecurityOpt []string
<ide> Labels map[string]string
<ide> }
<ide> | 1 |
Text | Text | fix a typo in the contributing.md | f6d90a912b8b29ab4d32c0892d1d501ebec509b0 | <ide><path>CONTRIBUTING.md
<ide> and built upon.
<ide> #### Respect the stability index
<ide>
<ide> The rules for the master branch are less strict; consult the
<del>[stability index](./doc/api/documentation..md#stability-index) for details.
<add>[stability index](./doc/api/documentation.md#stability-index) for details.
<ide>
<ide> In a nutshell, modules are at varying levels of API stability. Bug fixes are
<ide> always welcome but API or behavioral changes to modules at stability level 3 | 1 |
Go | Go | skip graphdriver data for snapshotters | 313a7d716da2e98a7b7bdd8eaf5fdbd148f0dedd | <ide><path>daemon/inspect.go
<ide> func (daemon *Daemon) getInspectData(container *container.Container) (*types.Con
<ide>
<ide> contJSONBase.GraphDriver.Name = container.Driver
<ide>
<add> if daemon.UsesSnapshotter() {
<add> // Additional information only applies to graphDrivers, so we're done.
<add> return contJSONBase, nil
<add> }
<add>
<ide> if container.RWLayer == nil {
<ide> if container.Dead {
<ide> return contJSONBase, nil | 1 |
Javascript | Javascript | make pdfhistory optional in pdflinkservice | 20881dc99abd520274766e9161d772c6277d8b17 | <ide><path>web/pdf_link_service.js
<ide> var PDFLinkService = (function () {
<ide> }
<ide> self.pdfViewer.scrollPageIntoView(pageNumber, dest);
<ide>
<del> // Update the browsing history.
<del> self.pdfHistory.push({
<del> dest: dest,
<del> hash: destString,
<del> page: pageNumber
<del> });
<add> if (self.pdfHistory) {
<add> // Update the browsing history.
<add> self.pdfHistory.push({
<add> dest: dest,
<add> hash: destString,
<add> page: pageNumber
<add> });
<add> }
<ide> } else {
<ide> self.pdfDocument.getPageIndex(destRef).then(function (pageIndex) {
<ide> var pageNum = pageIndex + 1;
<ide> var PDFLinkService = (function () {
<ide> var params = parseQueryString(hash);
<ide> // borrowing syntax from "Parameters for Opening PDF Files"
<ide> if ('nameddest' in params) {
<del> this.pdfHistory.updateNextHashParam(params.nameddest);
<add> if (this.pdfHistory) {
<add> this.pdfHistory.updateNextHashParam(params.nameddest);
<add> }
<ide> this.navigateTo(params.nameddest);
<ide> return;
<ide> }
<ide> var PDFLinkService = (function () {
<ide> } else if (/^\d+$/.test(hash)) { // page number
<ide> this.page = hash;
<ide> } else { // named destination
<del> this.pdfHistory.updateNextHashParam(unescape(hash));
<add> if (this.pdfHistory) {
<add> this.pdfHistory.updateNextHashParam(unescape(hash));
<add> }
<ide> this.navigateTo(unescape(hash));
<ide> }
<ide> },
<ide> var PDFLinkService = (function () {
<ide> // See PDF reference, table 8.45 - Named action
<ide> switch (action) {
<ide> case 'GoBack':
<del> this.pdfHistory.back();
<add> if (this.pdfHistory) {
<add> this.pdfHistory.back();
<add> }
<ide> break;
<ide>
<ide> case 'GoForward':
<del> this.pdfHistory.forward();
<add> if (this.pdfHistory) {
<add> this.pdfHistory.forward();
<add> }
<ide> break;
<ide>
<ide> case 'NextPage': | 1 |
Javascript | Javascript | simplify socket constructor | aa6eaae0aa8ba46ca2a25bd0b1e8eec5498df847 | <ide><path>lib/net.js
<ide> function initSocket (self) {
<ide> self.writable = false;
<ide> }
<ide>
<del>function Socket (peerInfo) {
<add>function Socket (fd) {
<ide> process.EventEmitter.call(this);
<ide>
<del> if (peerInfo) {
<add> if (fd) {
<ide> initSocket(this);
<ide>
<del> this.fd = peerInfo.fd;
<del> this.remoteAddress = peerInfo.remoteAddress;
<del> this.remotePort = peerInfo.remotePort;
<add> this.fd = fd;
<ide>
<ide> this.resume();
<ide> this.readable = true;
<ide> function Server (listener) {
<ide>
<ide> self.watcher = new IOWatcher();
<ide> self.watcher.host = self;
<del> self.watcher.callback = function (readable, writeable) {
<add> self.watcher.callback = function () {
<ide> while (self.fd) {
<ide> var peerInfo = accept(self.fd);
<ide> if (!peerInfo) return;
<del> var peer = new Socket(peerInfo);
<del> peer.type = self.type;
<del> peer.server = self;
<del> self.emit('connection', peer);
<add> var s = new Socket(peerInfo.fd);
<add> s.remoteAddress = peerInfo.remoteAddress;
<add> s.remotePort = peerInfo.remotePort;
<add> s.type = self.type;
<add> s.server = self;
<add> self.emit('connection', s);
<ide> // The 'connect' event probably should be removed for server-side
<ide> // sockets. It's redundent.
<del> peer.emit('connect');
<del> timeout.active(peer);
<add> s.emit('connect');
<add> timeout.active(s);
<ide> }
<ide> };
<ide> } | 1 |
Go | Go | add test for restarting entire swarm cluster | ae4137ae3cc6ee479f5e7f86f9859b485473285a | <ide><path>integration-cli/docker_api_swarm_test.go
<ide> package main
<ide>
<ide> import (
<ide> "net/http"
<add> "os"
<add> "path/filepath"
<ide> "strconv"
<ide> "strings"
<add> "sync"
<ide> "syscall"
<ide> "time"
<ide>
<ide> func setGlobalMode(s *swarm.Service) {
<ide> Global: &swarm.GlobalService{},
<ide> }
<ide> }
<add>
<add>func checkClusterHealth(c *check.C, cl []*SwarmDaemon, managerCount, workerCount int) {
<add> var totalMCount, totalWCount int
<add> for _, d := range cl {
<add> info, err := d.info()
<add> c.Assert(err, check.IsNil)
<add> if !info.ControlAvailable {
<add> totalWCount++
<add> continue
<add> }
<add> var leaderFound bool
<add> totalMCount++
<add> var mCount, wCount int
<add> for _, n := range d.listNodes(c) {
<add> c.Assert(n.Status.State, checker.Equals, swarm.NodeStateReady, check.Commentf("state of node %s, reported by %s", n.ID, d.Info.NodeID))
<add> c.Assert(n.Spec.Availability, checker.Equals, swarm.NodeAvailabilityActive, check.Commentf("availability of node %s, reported by %s", n.ID, d.Info.NodeID))
<add> c.Assert(n.Spec.Membership, checker.Equals, swarm.NodeMembershipAccepted, check.Commentf("membership of node %s, reported by %s", n.ID, d.Info.NodeID))
<add> if n.Spec.Role == swarm.NodeRoleManager {
<add> c.Assert(n.ManagerStatus, checker.NotNil, check.Commentf("manager status of node %s (manager), reported by %s", n.ID, d.Info.NodeID))
<add> if n.ManagerStatus.Leader {
<add> leaderFound = true
<add> }
<add> mCount++
<add> } else {
<add> c.Assert(n.ManagerStatus, checker.IsNil, check.Commentf("manager status of node %s (worker), reported by %s", n.ID, d.Info.NodeID))
<add> wCount++
<add> }
<add> }
<add> c.Assert(leaderFound, checker.True, check.Commentf("lack of leader reported by node %s", info.NodeID))
<add> c.Assert(mCount, checker.Equals, managerCount, check.Commentf("managers count reported by node %s", info.NodeID))
<add> c.Assert(wCount, checker.Equals, workerCount, check.Commentf("workers count reported by node %s", info.NodeID))
<add> }
<add> c.Assert(totalMCount, checker.Equals, managerCount)
<add> c.Assert(totalWCount, checker.Equals, workerCount)
<add>}
<add>
<add>func (s *DockerSwarmSuite) TestApiSwarmRestartCluster(c *check.C) {
<add> mCount, wCount := 5, 1
<add>
<add> var nodes []*SwarmDaemon
<add> for i := 0; i < mCount; i++ {
<add> manager := s.AddDaemon(c, true, true)
<add> info, err := manager.info()
<add> c.Assert(err, checker.IsNil)
<add> c.Assert(info.ControlAvailable, checker.True)
<add> c.Assert(info.LocalNodeState, checker.Equals, swarm.LocalNodeStateActive)
<add> nodes = append(nodes, manager)
<add> }
<add>
<add> for i := 0; i < wCount; i++ {
<add> worker := s.AddDaemon(c, true, false)
<add> info, err := worker.info()
<add> c.Assert(err, checker.IsNil)
<add> c.Assert(info.ControlAvailable, checker.False)
<add> c.Assert(info.LocalNodeState, checker.Equals, swarm.LocalNodeStateActive)
<add> nodes = append(nodes, worker)
<add> }
<add>
<add> // stop whole cluster
<add> {
<add> var wg sync.WaitGroup
<add> wg.Add(len(nodes))
<add> errs := make(chan error, len(nodes))
<add>
<add> for _, d := range nodes {
<add> go func(daemon *SwarmDaemon) {
<add> defer wg.Done()
<add> if err := daemon.Stop(); err != nil {
<add> errs <- err
<add> }
<add> if root := os.Getenv("DOCKER_REMAP_ROOT"); root != "" {
<add> daemon.root = filepath.Dir(daemon.root)
<add> }
<add> }(d)
<add> }
<add> wg.Wait()
<add> close(errs)
<add> for err := range errs {
<add> c.Assert(err, check.IsNil)
<add> }
<add> }
<add>
<add> // start whole cluster
<add> {
<add> var wg sync.WaitGroup
<add> wg.Add(len(nodes))
<add> errs := make(chan error, len(nodes))
<add>
<add> for _, d := range nodes {
<add> go func(daemon *SwarmDaemon) {
<add> defer wg.Done()
<add> if err := daemon.Start("--iptables=false"); err != nil {
<add> errs <- err
<add> }
<add> }(d)
<add> }
<add> wg.Wait()
<add> close(errs)
<add> for err := range errs {
<add> c.Assert(err, check.IsNil)
<add> }
<add> }
<add>
<add> checkClusterHealth(c, nodes, mCount, wCount)
<add>} | 1 |
Text | Text | update changelog for 2.7 beta 3 | 0ffcf8209be2e11b8f2fa5c4b5b7acc11a9fbc45 | <ide><path>CHANGELOG.md
<ide> # Ember Changelog
<ide>
<add>### 2.7.0-beta.3 (July 5, 2016)
<add>
<add>- [#13768](https://github.com/emberjs/ember.js/pull/13768) [BUGFIX] Update route-recognizer to v0.2.0. This addresses a large number of per-existing bugs related to URL encoding. However, in doing so, it might inevitably break existing workarounds in this area. Please refer to the linked pull request for more details.
<add>
<ide> ### 2.7.0-beta.2 (June 27, 2016)
<ide>
<ide> - [#13634](https://github.com/emberjs/ember.js/pull/13634) [BUGFIX] Fix issues with rerendering blockless and tagless components. | 1 |
Javascript | Javascript | remove extra 'hours' from /donate page | b083b5891375639bf07bb12afe13c31c0f64c906 | <ide><path>client/src/components/Donation/DonateForm.js
<ide> class DonateForm extends Component {
<ide> }
<ide>
<ide> convertToTimeContributed(amount) {
<del> return `${numToCommas((amount / 100) * 50)} hours`;
<add> return numToCommas((amount / 100) * 50);
<ide> }
<ide>
<ide> getFormattedAmountLabel(amount) { | 1 |
Mixed | Ruby | add a route to handle mandrill's webhook url check | fce29be3356269d2a0f586e15dcaf91ae680278a | <ide><path>actionmailbox/CHANGELOG.md
<add>* Update Mandrill inbound email route to respond appropriately to HEAD requests for URL health checks from Mandrill.
<add>
<add> *Bill Cromie*
<add>
<ide> * Add way to deliver emails via source instead of filling out a form through the conductor interface.
<ide>
<ide> *DHH*
<ide><path>actionmailbox/app/controllers/action_mailbox/ingresses/mandrill/inbound_emails_controller.rb
<ide> module ActionMailbox
<ide> # - <tt>500 Server Error</tt> if the Mandrill API key is missing, or one of the Active Record database,
<ide> # the Active Storage service, or the Active Job backend is misconfigured or unavailable
<ide> class Ingresses::Mandrill::InboundEmailsController < ActionMailbox::BaseController
<del> before_action :authenticate
<add> before_action :authenticate, except: :health_check
<ide>
<ide> def create
<ide> raw_emails.each { |raw_email| ActionMailbox::InboundEmail.create_and_extract_message_id! raw_email }
<ide> def create
<ide> head :unprocessable_entity
<ide> end
<ide>
<add> def health_check
<add> head :ok
<add> end
<add>
<ide> private
<ide> def raw_emails
<ide> events.select { |event| event["event"] == "inbound" }.collect { |event| event.dig("msg", "raw_msg") }
<ide><path>actionmailbox/config/routes.rb
<ide>
<ide> Rails.application.routes.draw do
<ide> scope "/rails/action_mailbox", module: "action_mailbox/ingresses" do
<del> post "/mandrill/inbound_emails" => "mandrill/inbound_emails#create", as: :rails_mandrill_inbound_emails
<ide> post "/postmark/inbound_emails" => "postmark/inbound_emails#create", as: :rails_postmark_inbound_emails
<ide> post "/relay/inbound_emails" => "relay/inbound_emails#create", as: :rails_relay_inbound_emails
<ide> post "/sendgrid/inbound_emails" => "sendgrid/inbound_emails#create", as: :rails_sendgrid_inbound_emails
<ide>
<add> # Mandrill checks for the existence of a URL with a HEAD request before it will create the webhook.
<add> get "/mandrill/inbound_emails" => "mandrill/inbound_emails#health_check", as: :rails_mandrill_inbound_health_check
<add> post "/mandrill/inbound_emails" => "mandrill/inbound_emails#create", as: :rails_mandrill_inbound_emails
<add>
<ide> # Mailgun requires that a webhook's URL end in 'mime' for it to receive the raw contents of emails.
<ide> post "/mailgun/inbound_emails/mime" => "mailgun/inbound_emails#create", as: :rails_mailgun_inbound_emails
<ide> end
<ide><path>actionmailbox/test/controllers/ingresses/mandrill/inbound_emails_controller_test.rb
<ide> class ActionMailbox::Ingresses::Mandrill::InboundEmailsControllerTest < ActionDi
<ide> @events = JSON.generate([{ event: "inbound", msg: { raw_msg: file_fixture("../files/welcome.eml").read } }])
<ide> end
<ide>
<add> test "verifying existence of Mandrill inbound route" do
<add> # Mandrill uses a HEAD request to verify if a URL exists before creating the ingress webhook
<add> head rails_mandrill_inbound_health_check_url
<add> assert_response :ok
<add> end
<add>
<ide> test "receiving an inbound email from Mandrill" do
<ide> assert_difference -> { ActionMailbox::InboundEmail.count }, +1 do
<ide> post rails_mandrill_inbound_emails_url,
<ide><path>railties/test/application/rake/routes_test.rb
<ide> class RakeRoutesTest < ActiveSupport::TestCase
<ide> assert_equal <<~MESSAGE, run_rake_routes
<ide> Prefix Verb URI Pattern Controller#Action
<ide> cart GET /cart(.:format) cart#show
<del> rails_mandrill_inbound_emails POST /rails/action_mailbox/mandrill/inbound_emails(.:format) action_mailbox/ingresses/mandrill/inbound_emails#create
<ide> rails_postmark_inbound_emails POST /rails/action_mailbox/postmark/inbound_emails(.:format) action_mailbox/ingresses/postmark/inbound_emails#create
<ide> rails_relay_inbound_emails POST /rails/action_mailbox/relay/inbound_emails(.:format) action_mailbox/ingresses/relay/inbound_emails#create
<ide> rails_sendgrid_inbound_emails POST /rails/action_mailbox/sendgrid/inbound_emails(.:format) action_mailbox/ingresses/sendgrid/inbound_emails#create
<add> rails_mandrill_inbound_health_check GET /rails/action_mailbox/mandrill/inbound_emails(.:format) action_mailbox/ingresses/mandrill/inbound_emails#health_check
<add> rails_mandrill_inbound_emails POST /rails/action_mailbox/mandrill/inbound_emails(.:format) action_mailbox/ingresses/mandrill/inbound_emails#create
<ide> rails_mailgun_inbound_emails POST /rails/action_mailbox/mailgun/inbound_emails/mime(.:format) action_mailbox/ingresses/mailgun/inbound_emails#create
<ide> rails_conductor_inbound_emails GET /rails/conductor/action_mailbox/inbound_emails(.:format) rails/conductor/action_mailbox/inbound_emails#index
<ide> POST /rails/conductor/action_mailbox/inbound_emails(.:format) rails/conductor/action_mailbox/inbound_emails#create
<ide><path>railties/test/commands/routes_test.rb
<ide> class Rails::Command::RoutesTest < ActiveSupport::TestCase
<ide> assert_equal <<~MESSAGE, run_routes_command([ "-g", "POST" ])
<ide> Prefix Verb URI Pattern Controller#Action
<ide> POST /cart(.:format) cart#create
<del> rails_mandrill_inbound_emails POST /rails/action_mailbox/mandrill/inbound_emails(.:format) action_mailbox/ingresses/mandrill/inbound_emails#create
<ide> rails_postmark_inbound_emails POST /rails/action_mailbox/postmark/inbound_emails(.:format) action_mailbox/ingresses/postmark/inbound_emails#create
<ide> rails_relay_inbound_emails POST /rails/action_mailbox/relay/inbound_emails(.:format) action_mailbox/ingresses/relay/inbound_emails#create
<ide> rails_sendgrid_inbound_emails POST /rails/action_mailbox/sendgrid/inbound_emails(.:format) action_mailbox/ingresses/sendgrid/inbound_emails#create
<add> rails_mandrill_inbound_emails POST /rails/action_mailbox/mandrill/inbound_emails(.:format) action_mailbox/ingresses/mandrill/inbound_emails#create
<ide> rails_mailgun_inbound_emails POST /rails/action_mailbox/mailgun/inbound_emails/mime(.:format) action_mailbox/ingresses/mailgun/inbound_emails#create
<ide> POST /rails/conductor/action_mailbox/inbound_emails(.:format) rails/conductor/action_mailbox/inbound_emails#create
<ide> rails_conductor_inbound_email_sources POST /rails/conductor/action_mailbox/inbound_emails/sources(.:format) rails/conductor/action_mailbox/inbound_emails/sources#create
<ide> class Rails::Command::RoutesTest < ActiveSupport::TestCase
<ide>
<ide> assert_equal <<~MESSAGE, run_routes_command
<ide> Prefix Verb URI Pattern Controller#Action
<del> rails_mandrill_inbound_emails POST /rails/action_mailbox/mandrill/inbound_emails(.:format) action_mailbox/ingresses/mandrill/inbound_emails#create
<ide> rails_postmark_inbound_emails POST /rails/action_mailbox/postmark/inbound_emails(.:format) action_mailbox/ingresses/postmark/inbound_emails#create
<ide> rails_relay_inbound_emails POST /rails/action_mailbox/relay/inbound_emails(.:format) action_mailbox/ingresses/relay/inbound_emails#create
<ide> rails_sendgrid_inbound_emails POST /rails/action_mailbox/sendgrid/inbound_emails(.:format) action_mailbox/ingresses/sendgrid/inbound_emails#create
<add> rails_mandrill_inbound_health_check GET /rails/action_mailbox/mandrill/inbound_emails(.:format) action_mailbox/ingresses/mandrill/inbound_emails#health_check
<add> rails_mandrill_inbound_emails POST /rails/action_mailbox/mandrill/inbound_emails(.:format) action_mailbox/ingresses/mandrill/inbound_emails#create
<ide> rails_mailgun_inbound_emails POST /rails/action_mailbox/mailgun/inbound_emails/mime(.:format) action_mailbox/ingresses/mailgun/inbound_emails#create
<ide> rails_conductor_inbound_emails GET /rails/conductor/action_mailbox/inbound_emails(.:format) rails/conductor/action_mailbox/inbound_emails#index
<ide> POST /rails/conductor/action_mailbox/inbound_emails(.:format) rails/conductor/action_mailbox/inbound_emails#create
<ide> class Rails::Command::RoutesTest < ActiveSupport::TestCase
<ide> URI | /cart(.:format)
<ide> Controller#Action | cart#show
<ide> --[ Route 2 ]--------------
<del> Prefix | rails_mandrill_inbound_emails
<del> Verb | POST
<del> URI | /rails/action_mailbox/mandrill/inbound_emails(.:format)
<del> Controller#Action | action_mailbox/ingresses/mandrill/inbound_emails#create
<del> --[ Route 3 ]--------------
<ide> Prefix | rails_postmark_inbound_emails
<ide> Verb | POST
<ide> URI | /rails/action_mailbox/postmark/inbound_emails(.:format)
<ide> Controller#Action | action_mailbox/ingresses/postmark/inbound_emails#create
<del> --[ Route 4 ]--------------
<add> --[ Route 3 ]--------------
<ide> Prefix | rails_relay_inbound_emails
<ide> Verb | POST
<ide> URI | /rails/action_mailbox/relay/inbound_emails(.:format)
<ide> Controller#Action | action_mailbox/ingresses/relay/inbound_emails#create
<del> --[ Route 5 ]--------------
<add> --[ Route 4 ]--------------
<ide> Prefix | rails_sendgrid_inbound_emails
<ide> Verb | POST
<ide> URI | /rails/action_mailbox/sendgrid/inbound_emails(.:format)
<ide> Controller#Action | action_mailbox/ingresses/sendgrid/inbound_emails#create
<add> --[ Route 5 ]--------------
<add> Prefix | rails_mandrill_inbound_health_check
<add> Verb | GET
<add> URI | /rails/action_mailbox/mandrill/inbound_emails(.:format)
<add> Controller#Action | action_mailbox/ingresses/mandrill/inbound_emails#health_check
<ide> --[ Route 6 ]--------------
<add> Prefix | rails_mandrill_inbound_emails
<add> Verb | POST
<add> URI | /rails/action_mailbox/mandrill/inbound_emails(.:format)
<add> Controller#Action | action_mailbox/ingresses/mandrill/inbound_emails#create
<add> --[ Route 7 ]--------------
<ide> Prefix | rails_mailgun_inbound_emails
<ide> Verb | POST
<ide> URI | /rails/action_mailbox/mailgun/inbound_emails/mime(.:format)
<ide> Controller#Action | action_mailbox/ingresses/mailgun/inbound_emails#create
<del> --[ Route 7 ]--------------
<add> --[ Route 8 ]--------------
<ide> Prefix | rails_conductor_inbound_emails
<ide> Verb | GET
<ide> URI | /rails/conductor/action_mailbox/inbound_emails(.:format)
<ide> Controller#Action | rails/conductor/action_mailbox/inbound_emails#index
<del> --[ Route 8 ]--------------
<add> --[ Route 9 ]--------------
<ide> Prefix |
<ide> Verb | POST
<ide> URI | /rails/conductor/action_mailbox/inbound_emails(.:format)
<ide> Controller#Action | rails/conductor/action_mailbox/inbound_emails#create
<del> --[ Route 9 ]--------------
<add> --[ Route 10 ]-------------
<ide> Prefix | new_rails_conductor_inbound_email
<ide> Verb | GET
<ide> URI | /rails/conductor/action_mailbox/inbound_emails/new(.:format)
<ide> Controller#Action | rails/conductor/action_mailbox/inbound_emails#new
<del> --[ Route 10 ]-------------
<add> --[ Route 11 ]-------------
<ide> Prefix | edit_rails_conductor_inbound_email
<ide> Verb | GET
<ide> URI | /rails/conductor/action_mailbox/inbound_emails/:id/edit(.:format)
<ide> Controller#Action | rails/conductor/action_mailbox/inbound_emails#edit
<del> --[ Route 11 ]-------------
<add> --[ Route 12 ]-------------
<ide> Prefix | rails_conductor_inbound_email
<ide> Verb | GET
<ide> URI | /rails/conductor/action_mailbox/inbound_emails/:id(.:format)
<ide> Controller#Action | rails/conductor/action_mailbox/inbound_emails#show
<del> --[ Route 12 ]-------------
<add> --[ Route 13 ]-------------
<ide> Prefix |
<ide> Verb | PATCH
<ide> URI | /rails/conductor/action_mailbox/inbound_emails/:id(.:format)
<ide> Controller#Action | rails/conductor/action_mailbox/inbound_emails#update
<del> --[ Route 13 ]-------------
<add> --[ Route 14 ]-------------
<ide> Prefix |
<ide> Verb | PUT
<ide> URI | /rails/conductor/action_mailbox/inbound_emails/:id(.:format)
<ide> Controller#Action | rails/conductor/action_mailbox/inbound_emails#update
<del> --[ Route 14 ]-------------
<add> --[ Route 15 ]-------------
<ide> Prefix |
<ide> Verb | DELETE
<ide> URI | /rails/conductor/action_mailbox/inbound_emails/:id(.:format)
<ide> Controller#Action | rails/conductor/action_mailbox/inbound_emails#destroy
<del> --[ Route 15 ]-------------
<add> --[ Route 16 ]-------------
<ide> Prefix | new_rails_conductor_inbound_email_source
<ide> Verb | GET
<ide> URI | /rails/conductor/action_mailbox/inbound_emails/sources/new(.:format)
<ide> Controller#Action | rails/conductor/action_mailbox/inbound_emails/sources#new
<del> --[ Route 16 ]-------------
<add> --[ Route 17 ]-------------
<ide> Prefix | rails_conductor_inbound_email_sources
<ide> Verb | POST
<ide> URI | /rails/conductor/action_mailbox/inbound_emails/sources(.:format)
<ide> Controller#Action | rails/conductor/action_mailbox/inbound_emails/sources#create
<del> --[ Route 17 ]-------------
<add> --[ Route 18 ]-------------
<ide> Prefix | rails_conductor_inbound_email_reroute
<ide> Verb | POST
<ide> URI | /rails/conductor/action_mailbox/:inbound_email_id/reroute(.:format)
<ide> Controller#Action | rails/conductor/action_mailbox/reroutes#create
<del> --[ Route 18 ]-------------
<add> --[ Route 19 ]-------------
<ide> Prefix | rails_service_blob
<ide> Verb | GET
<ide> URI | /rails/active_storage/blobs/:signed_id/*filename(.:format)
<ide> Controller#Action | active_storage/blobs#show
<del> --[ Route 19 ]-------------
<add> --[ Route 20 ]-------------
<ide> Prefix | rails_blob_representation
<ide> Verb | GET
<ide> URI | /rails/active_storage/representations/:signed_blob_id/:variation_key/*filename(.:format)
<ide> Controller#Action | active_storage/representations#show
<del> --[ Route 20 ]-------------
<add> --[ Route 21 ]-------------
<ide> Prefix | rails_disk_service
<ide> Verb | GET
<ide> URI | /rails/active_storage/disk/:encoded_key/*filename(.:format)
<ide> Controller#Action | active_storage/disk#show
<del> --[ Route 21 ]-------------
<add> --[ Route 22 ]-------------
<ide> Prefix | update_rails_disk_service
<ide> Verb | PUT
<ide> URI | /rails/active_storage/disk/:encoded_token(.:format)
<ide> Controller#Action | active_storage/disk#update
<del> --[ Route 22 ]-------------
<add> --[ Route 23 ]-------------
<ide> Prefix | rails_direct_uploads
<ide> Verb | POST
<ide> URI | /rails/active_storage/direct_uploads(.:format) | 6 |
Ruby | Ruby | update documentation to match change in [ci skip] | e57ad6dc03efd269d289fcd447aafcf6e3058904 | <ide><path>activemodel/lib/active_model/validations/confirmation.rb
<ide> module HelperMethods
<ide> #
<ide> # Configuration options:
<ide> # * <tt>:message</tt> - A custom error message (default is: "doesn't match
<del> # confirmation").
<add> # <tt>%{translated_attribute_name}</tt>").
<ide> #
<ide> # There is also a list of default options supported by every validator:
<ide> # +:if+, +:unless+, +:on+, +:allow_nil+, +:allow_blank+, and +:strict+. | 1 |
PHP | PHP | fix failing unit tests | 4a08af5a338fb140d380f91b772b49130208bc04 | <ide><path>tests/TestCase/Database/ConnectionTest.php
<ide> public function testUpdateWithConditionsAndTypes()
<ide> $title = 'changed the title!';
<ide> $body = new \DateTime('2012-01-01');
<ide> $values = compact('title', 'body');
<del> $this->connection->update('things', $values, ['id' => '1-string-parsed-as-int'], ['body' => 'date', 'id' => 'integer']);
<add> $this->connection->update('things', $values, ['id' => '1'], ['body' => 'date', 'id' => 'integer']);
<ide> $result = $this->connection->execute('SELECT * FROM things WHERE title = :title AND body = :body', $values, ['body' => 'date']);
<ide> $this->assertCount(1, $result);
<ide> $row = $result->fetch('assoc');
<ide> public function testDeleteNoConditions()
<ide> */
<ide> public function testDeleteWithConditions()
<ide> {
<del> $this->connection->delete('things', ['id' => '1-rest-is-omitted'], ['id' => 'integer']);
<add> $this->connection->delete('things', ['id' => '1'], ['id' => 'integer']);
<ide> $result = $this->connection->execute('SELECT * FROM things');
<ide> $this->assertCount(1, $result);
<ide> $result->closeCursor();
<ide>
<del> $this->connection->delete('things', ['id' => '1-rest-is-omitted'], ['id' => 'integer']);
<add> $this->connection->delete('things', ['id' => '1'], ['id' => 'integer']);
<ide> $result = $this->connection->execute('SELECT * FROM things');
<ide> $this->assertCount(1, $result);
<ide> $result->closeCursor();
<ide>
<del> $this->connection->delete('things', ['id' => '2-rest-is-omitted'], ['id' => 'integer']);
<add> $this->connection->delete('things', ['id' => '2'], ['id' => 'integer']);
<ide> $result = $this->connection->execute('SELECT * FROM things');
<ide> $this->assertCount(0, $result);
<ide> $result->closeCursor();
<ide><path>tests/TestCase/Database/QueryTest.php
<ide> public function testSelectWhereTypes()
<ide> ->from('comments')
<ide> ->where(
<ide> [
<del> 'id' => '3something-crazy',
<add> 'id' => '3',
<ide> 'created <' => new \DateTime('2013-01-01 12:00')
<ide> ],
<ide> ['created' => 'datetime', 'id' => 'integer']
<ide> public function testSelectWhereTypes()
<ide> ->from('comments')
<ide> ->where(
<ide> [
<del> 'id' => '1something-crazy',
<add> 'id' => '1',
<ide> 'created <' => new \DateTime('2013-01-01 12:00')
<ide> ],
<ide> ['created' => 'datetime', 'id' => 'integer'] | 2 |
Ruby | Ruby | fix one more time order_hacks method for oracle | 237e580c4920b2414ef1bb72c14250688b479b9e | <ide><path>lib/arel/visitors/oracle.rb
<ide> def order_hacks o
<ide> /DISTINCT.*FIRST_VALUE/ === projection
<ide> end
<ide> end
<del> # FIXME: previous version with join and split broke ORDER BY clause
<add> # Previous version with join and split broke ORDER BY clause
<ide> # if it contained functions with several arguments (separated by ',').
<del> # Currently splitting is done only if there is no function calls
<ide> #
<ide> # orders = o.orders.map { |x| visit x }.join(', ').split(',')
<ide> orders = o.orders.map do |x|
<ide> string = visit x
<del> # if there is function call
<del> if string.include?('(')
<del> string
<del> # if no function call then comma splits several ORDER BY columns
<del> elsif string.include?(',')
<del> string.split(',')
<add> if string.include?(',')
<add> split_order_string(string)
<ide> else
<ide> string
<ide> end
<ide> def order_hacks o
<ide> end
<ide> o
<ide> end
<add>
<add> # Split string by commas but count opening and closing brackets
<add> # and ignore commas inside brackets.
<add> def split_order_string(string)
<add> array = []
<add> i = 0
<add> string.split(',').each do |part|
<add> array[i] ||= ""
<add> array[i] << part
<add> i += 1 if array[i].count('(') == array[i].count(')')
<add> end
<add> array
<add> end
<add>
<ide> end
<ide> end
<ide> end
<ide><path>test/visitors/test_oracle.rb
<ide> module Visitors
<ide> }
<ide> end
<ide>
<add> it 'splits orders with commas and function calls' do
<add> # *sigh*
<add> select = "DISTINCT foo.id, FIRST_VALUE(projects.name) OVER (foo) AS alias_0__"
<add> stmt = Nodes::SelectStatement.new
<add> stmt.cores.first.projections << Nodes::SqlLiteral.new(select)
<add> stmt.orders << Nodes::SqlLiteral.new('NVL(LOWER(bar, foo), foo) DESC, UPPER(baz)')
<add> sql = @visitor.accept(stmt)
<add> sql.must_be_like %{
<add> SELECT #{select} ORDER BY alias_0__ DESC, alias_1__
<add> }
<add> end
<add>
<ide> describe 'Nodes::SelectStatement' do
<ide> describe 'limit' do
<ide> it 'adds a rownum clause' do | 2 |
PHP | PHP | use expectedexception annotations in the tests | b106c76dee0f7f2a40ab7c46428d2e69ed4b4fba | <ide><path>tests/Cache/ClearCommandTest.php
<ide> public function testClearWithStoreOption()
<ide> $this->runCommand($command, ['store' => 'foo']);
<ide> }
<ide>
<add> /**
<add> * @expectedException InvalidArgumentException
<add> */
<ide> public function testClearWithInvalidStoreOption()
<ide> {
<ide> $command = new ClearCommandTestStub(
<ide> public function testClearWithInvalidStoreOption()
<ide> $app = new Application();
<ide> $command->setLaravel($app);
<ide>
<del> $cacheManager->shouldReceive('store')->once()->with('bar')->andThrow('\InvalidArgumentException');
<add> $cacheManager->shouldReceive('store')->once()->with('bar')->andThrow('InvalidArgumentException');
<ide> $cacheRepository->shouldReceive('flush')->never();
<del> $this->setExpectedException('InvalidArgumentException');
<ide>
<ide> $this->runCommand($command, ['store' => 'bar']);
<ide> }
<ide><path>tests/Container/ContainerTest.php
<ide> public function testCreatingBoundConcreteClassPassesParameters()
<ide> $this->assertEquals($parameters, $instance->receivedParameters);
<ide> }
<ide>
<add> /**
<add> * @expectedException Illuminate\Contracts\Container\BindingResolutionException
<add> * @expectedExceptionMessage Unresolvable dependency resolving [Parameter #0 [ <required> $first ]] in class ContainerMixedPrimitiveStub
<add> */
<ide> public function testInternalClassWithDefaultParameters()
<ide> {
<del> $this->setExpectedException('Illuminate\Contracts\Container\BindingResolutionException', 'Unresolvable dependency resolving [Parameter #0 [ <required> $first ]] in class ContainerMixedPrimitiveStub');
<ide> $container = new Container;
<del> $parameters = [];
<del> $container->make('ContainerMixedPrimitiveStub', $parameters);
<add> $container->make('ContainerMixedPrimitiveStub', []);
<ide> }
<ide>
<ide> public function testCallWithDependencies()
<ide><path>tests/Http/HttpRedirectResponseTest.php
<ide> public function testMagicCall()
<ide> $response->withFoo('bar');
<ide> }
<ide>
<add> /**
<add> * @expectedException BadMethodCallException
<add> */
<ide> public function testMagicCallException()
<ide> {
<del> $this->setExpectedException('BadMethodCallException');
<ide> $response = new RedirectResponse('foo.bar');
<ide> $response->doesNotExist('bar');
<ide> }
<ide><path>tests/Http/HttpRequestTest.php
<ide> public function testFormatReturnsAcceptsCharset()
<ide> $this->assertTrue($request->accepts('application/baz+json'));
<ide> }
<ide>
<add> /**
<add> * @expectedException RuntimeException
<add> */
<ide> public function testSessionMethod()
<ide> {
<del> $this->setExpectedException('RuntimeException');
<ide> $request = Request::create('/', 'GET');
<ide> $request->session();
<ide> }
<ide><path>tests/Http/HttpResponseTest.php
<ide> public function testMagicCall()
<ide> $response->withFoo('bar');
<ide> }
<ide>
<add> /**
<add> * @expectedException BadMethodCallException
<add> */
<ide> public function testMagicCallException()
<ide> {
<del> $this->setExpectedException('BadMethodCallException');
<ide> $response = new RedirectResponse('foo.bar');
<ide> $response->doesNotExist('bar');
<ide> }
<ide><path>tests/Session/SessionStoreTest.php
<ide> public function testSessionIsLoadedFromHandler()
<ide> $this->assertTrue($session->has('baz'));
<ide> }
<ide>
<add> /**
<add> * @expectedException InvalidArgumentException
<add> */
<ide> public function testSessionGetBagException()
<ide> {
<del> $this->setExpectedException('InvalidArgumentException');
<ide> $session = $this->getSession();
<ide> $session->getBag('doesNotExist');
<ide> }
<ide><path>tests/View/ViewEngineResolverTest.php
<ide> public function testResolversMayBeResolved()
<ide> $this->assertEquals(spl_object_hash($result), spl_object_hash($resolver->resolve('foo')));
<ide> }
<ide>
<add> /**
<add> * @expectedException InvalidArgumentException
<add> */
<ide> public function testResolverThrowsExceptionOnUnknownEngine()
<ide> {
<del> $this->setExpectedException('InvalidArgumentException');
<ide> $resolver = new Illuminate\View\Engines\EngineResolver;
<ide> $resolver->resolve('foo');
<ide> }
<ide><path>tests/View/ViewFactoryTest.php
<ide> public function testMakeWithAlias()
<ide> $this->assertEquals('real', $view->getName());
<ide> }
<ide>
<add> /**
<add> * @expectedException InvalidArgumentException
<add> */
<ide> public function testExceptionIsThrownForUnknownExtension()
<ide> {
<del> $this->setExpectedException('InvalidArgumentException');
<ide> $factory = $this->getFactory();
<ide> $factory->getFinder()->shouldReceive('find')->once()->with('view')->andReturn('view.foo');
<ide> $factory->make('view');
<ide> }
<ide>
<add> /**
<add> * @expectedException Exception
<add> * @expectedExceptionMessage section exception message
<add> */
<ide> public function testExceptionsInSectionsAreThrown()
<ide> {
<del> $engine = new \Illuminate\View\Engines\CompilerEngine(m::mock('Illuminate\View\Compilers\CompilerInterface'));
<add> $engine = new Illuminate\View\Engines\CompilerEngine(m::mock('Illuminate\View\Compilers\CompilerInterface'));
<ide> $engine->getCompiler()->shouldReceive('getCompiledPath')->andReturnUsing(function ($path) { return $path; });
<ide> $engine->getCompiler()->shouldReceive('isExpired')->twice()->andReturn(false);
<ide> $factory = $this->getFactory();
<ide> public function testExceptionsInSectionsAreThrown()
<ide> $factory->getFinder()->shouldReceive('find')->once()->with('view')->andReturn(__DIR__.'/fixtures/section-exception.php');
<ide> $factory->getDispatcher()->shouldReceive('fire')->times(4);
<ide>
<del> $this->setExpectedException('Exception', 'section exception message');
<ide> $factory->make('view')->render();
<ide> }
<ide>
<add> /**
<add> * @expectedException InvalidArgumentException
<add> * @expectedExceptionMessage Cannot end a section without first starting one.
<add> */
<ide> public function testExtraStopSectionCallThrowsException()
<ide> {
<ide> $factory = $this->getFactory();
<ide> $factory->startSection('foo');
<ide> $factory->stopSection();
<ide>
<del> $this->setExpectedException('InvalidArgumentException', 'Cannot end a section without first starting one.');
<ide> $factory->stopSection();
<ide> }
<ide>
<add> /**
<add> * @expectedException InvalidArgumentException
<add> * @expectedExceptionMessage Cannot end a section without first starting one.
<add> */
<ide> public function testExtraAppendSectionCallThrowsException()
<ide> {
<ide> $factory = $this->getFactory();
<ide> $factory->startSection('foo');
<ide> $factory->stopSection();
<ide>
<del> $this->setExpectedException('InvalidArgumentException', 'Cannot end a section without first starting one.');
<ide> $factory->appendSection();
<ide> }
<ide>
<ide><path>tests/View/ViewTest.php
<ide> public function testViewMagicMethods()
<ide> $this->assertFalse($view->offsetExists('foo'));
<ide> }
<ide>
<add> /**
<add> * @expectedException BadMethodCallException
<add> */
<ide> public function testViewBadMethod()
<ide> {
<del> $this->setExpectedException('BadMethodCallException');
<ide> $view = $this->getView();
<ide> $view->badMethodCall();
<ide> } | 9 |
Javascript | Javascript | replace string concatenation with template | 8c6c06091888785f33da453c3e4177f6364f3c88 | <ide><path>tools/doc/type-parser.js
<ide> 'use strict';
<ide> const nodeDocUrl = '';
<ide> const jsDocPrefix = 'https://developer.mozilla.org/en-US/docs/Web/JavaScript/';
<del>const jsDocUrl = jsDocPrefix + 'Reference/Global_Objects/';
<del>const jsPrimitiveUrl = jsDocPrefix + 'Data_structures';
<add>const jsDocUrl = `${jsDocPrefix}Reference/Global_Objects/`;
<add>const jsPrimitiveUrl = `${jsDocPrefix}Data_structures`;
<ide> const jsPrimitives = {
<ide> 'boolean': 'Boolean',
<ide> 'integer': 'Number', // not a primitive, used for clarification
<ide> const jsGlobalTypes = [
<ide> 'AsyncFunction', 'SharedArrayBuffer'
<ide> ];
<ide> const typeMap = {
<del> 'Iterable': jsDocPrefix +
<del> 'Reference/Iteration_protocols#The_iterable_protocol',
<del> 'Iterator': jsDocPrefix +
<del> 'Reference/Iteration_protocols#The_iterator_protocol',
<add> 'Iterable':
<add> `${jsDocPrefix}Reference/Iteration_protocols#The_iterable_protocol`,
<add> 'Iterator':
<add> `${jsDocPrefix}Reference/Iteration_protocols#The_iterator_protocol`,
<ide>
<ide> 'Buffer': 'buffer.html#buffer_class_buffer',
<ide>
<ide> module.exports = {
<ide> }
<ide>
<ide> if (typeUrl) {
<del> typeLinks.push('<a href="' + typeUrl + '" class="type"><' +
<del> typeTextFull + '></a>');
<add> typeLinks.push(`
<add> <a href="${typeUrl}" class="type"><${typeTextFull}></a>`);
<ide> } else {
<del> typeLinks.push('<span class="type"><' + typeTextFull +
<del> '></span>');
<add> typeLinks.push(`<span class="type"><${typeTextFull}></span>`);
<ide> }
<ide> }
<ide> }); | 1 |
Python | Python | catch error in breeze when docker is not running | f662b7de8c5e61f640f150d4e68bde21dcdd09b4 | <ide><path>dev/breeze/src/airflow_breeze/shell/enter_shell.py
<ide> SOURCE_OF_DEFAULT_VALUES_FOR_VARIABLES,
<ide> VARIABLES_IN_CACHE,
<ide> check_docker_compose_version,
<add> check_docker_is_running,
<ide> check_docker_resources,
<ide> check_docker_version,
<ide> construct_env_variables_docker_compose_command,
<ide> def enter_shell(**kwargs):
<ide> """
<ide> verbose = kwargs['verbose']
<ide> dry_run = kwargs['dry_run']
<add> if not check_docker_is_running(verbose):
<add> console.print(
<add> '[red]Docker is not running.[/]\n'
<add> '[bright_yellow]Please make sure Docker is installed and running.[/]'
<add> )
<add> sys.exit(1)
<ide> check_docker_version(verbose)
<ide> check_docker_compose_version(verbose)
<ide> updated_kwargs = synchronize_cached_params(kwargs)
<ide><path>dev/breeze/src/airflow_breeze/utils/docker_command_utils.py
<ide> def compare_version(current_version: str, min_version: str) -> bool:
<ide> return version.parse(current_version) >= version.parse(min_version)
<ide>
<ide>
<add>def check_docker_is_running(verbose: bool) -> bool:
<add> """
<add> Checks if docker is running. Suppressed Dockers stdout and stderr output.
<add> :param verbose: print commands when running
<add> :return: False if docker is not running.
<add> """
<add> response = run_command(
<add> ["docker", "info"],
<add> verbose=verbose,
<add> no_output_dump_on_exception=True,
<add> text=False,
<add> check=True,
<add> stdout=subprocess.DEVNULL,
<add> stderr=subprocess.STDOUT,
<add> )
<add> if not response:
<add> return False
<add> return True
<add>
<add>
<ide> def check_docker_version(verbose: bool):
<ide> """
<ide> Checks if the docker compose version is as expected (including some specific modifications done by
<ide> some vendors such as Microsoft (they might have modified version of docker-compose/docker in their
<ide> cloud. In case docker compose version is wrong we continue but print warning for the user.
<ide>
<add>
<ide> :param verbose: print commands when running
<ide> """
<ide> permission_denied = check_docker_permission(verbose) | 2 |
Go | Go | make network ls output order | 838ed1866bf285759efdc01a97ae837f0f1c326a | <ide><path>api/client/network.go
<ide> package client
<ide> import (
<ide> "fmt"
<ide> "net"
<add> "sort"
<ide> "strings"
<ide> "text/tabwriter"
<ide>
<ide> func (cli *DockerCli) CmdNetworkLs(args ...string) error {
<ide> if !*quiet {
<ide> fmt.Fprintln(wr, "NETWORK ID\tNAME\tDRIVER")
<ide> }
<del>
<add> sort.Sort(byNetworkName(networkResources))
<ide> for _, networkResource := range networkResources {
<ide> ID := networkResource.ID
<ide> netName := networkResource.Name
<ide> func (cli *DockerCli) CmdNetworkLs(args ...string) error {
<ide> return nil
<ide> }
<ide>
<add>type byNetworkName []types.NetworkResource
<add>
<add>func (r byNetworkName) Len() int { return len(r) }
<add>func (r byNetworkName) Swap(i, j int) { r[i], r[j] = r[j], r[i] }
<add>func (r byNetworkName) Less(i, j int) bool { return r[i].Name < r[j].Name }
<add>
<ide> // CmdNetworkInspect inspects the network object for more details
<ide> //
<ide> // Usage: docker network inspect [OPTIONS] <NETWORK> [NETWORK...]
<ide><path>integration-cli/docker_cli_network_unix_test.go
<ide> import (
<ide> "net/http"
<ide> "net/http/httptest"
<ide> "os"
<del> "sort"
<ide> "strings"
<ide> "time"
<ide>
<ide> func assertNwList(c *check.C, out string, expectNws []string) {
<ide> // wrap all network name in nwList
<ide> nwList = append(nwList, netFields[1])
<ide> }
<del> // first need to sort out and expected
<del> sort.StringSlice(nwList).Sort()
<del> sort.StringSlice(expectNws).Sort()
<ide>
<ide> // network ls should contains all expected networks
<ide> c.Assert(nwList, checker.DeepEquals, expectNws)
<ide> func (s *DockerNetworkSuite) TestDockerNetworkLsFilter(c *check.C) {
<ide> // filter with partial ID and partial name
<ide> // only show 'bridge' and 'dev' network
<ide> out, _ = dockerCmd(c, "network", "ls", "-f", "id="+networkID[0:5], "-f", "name=dge")
<del> assertNwList(c, out, []string{"dev", "bridge"})
<add> assertNwList(c, out, []string{"bridge", "dev"})
<ide>
<ide> // only show built-in network (bridge, none, host)
<ide> out, _ = dockerCmd(c, "network", "ls", "-f", "type=builtin")
<del> assertNwList(c, out, []string{"bridge", "none", "host"})
<add> assertNwList(c, out, []string{"bridge", "host", "none"})
<ide>
<ide> // only show custom networks (dev)
<ide> out, _ = dockerCmd(c, "network", "ls", "-f", "type=custom")
<ide> func (s *DockerNetworkSuite) TestDockerNetworkLsFilter(c *check.C) {
<ide> // show all networks with filter
<ide> // it should be equivalent of ls without option
<ide> out, _ = dockerCmd(c, "network", "ls", "-f", "type=custom", "-f", "type=builtin")
<del> assertNwList(c, out, []string{"dev", "bridge", "host", "none"})
<add> assertNwList(c, out, []string{"bridge", "dev", "host", "none"})
<ide> }
<ide>
<ide> func (s *DockerNetworkSuite) TestDockerNetworkCreateDelete(c *check.C) { | 2 |
Javascript | Javascript | remove duplicated word in comment | b2a77654b7761e6708844611d0464797211e52f4 | <ide><path>lib/timers.js
<ide> const TIMEOUT_MAX = 2147483647; // 2^31-1
<ide> //
<ide> // Timers are crucial to Node.js. Internally, any TCP I/O connection creates a
<ide> // timer so that we can time out of connections. Additionally, many user
<del>// user libraries and applications also use timers. As such there may be a
<add>// libraries and applications also use timers. As such there may be a
<ide> // significantly large amount of timeouts scheduled at any given time.
<ide> // Therefore, it is very important that the timers implementation is performant
<ide> // and efficient. | 1 |
Text | Text | add comment about tf version to readme.md | 754c5a90bc295fb5ef47e16298a65c268da0a698 | <ide><path>im2txt/README.md
<ide> approximately 10 times slower.
<ide> First ensure that you have installed the following required packages:
<ide>
<ide> * **Bazel** ([instructions](http://bazel.io/docs/install.html)).
<del>* **TensorFlow** ([instructions](https://www.tensorflow.org/versions/r0.10/get_started/os_setup.html)).
<add>* **TensorFlow** ([instructions](https://www.tensorflow.org/versions/master/get_started/os_setup.html#installing-from-sources)).
<add> * Note: the model currently requires TensorFlow built from source because it
<add> uses features added after release r0.10.
<ide> * **NumPy** ([instructions](http://www.scipy.org/install.html)).
<ide> * **Natural Language Toolkit (NLTK)**:
<ide> * First install NLTK ([instructions](http://www.nltk.org/install.html)). | 1 |
PHP | PHP | fix wrong variable assignment | 3240a2860f56f25271db33e25eadb1df34cfa8fa | <ide><path>src/Routing/RouteBuilder.php
<ide> protected function _methodRoute($method, $template, $target, $name)
<ide> $name = $this->_namePrefix . $name;
<ide> }
<ide> $options = [
<del> '_method' => $method,
<ide> '_ext' => $this->_extensions,
<ide> 'routeClass' => $this->_routeClass,
<ide> '_name' => $name,
<ide> ];
<ide>
<add> $target['_method'] = $method;
<add>
<ide> $route = $this->_makeRoute($template, $target, $options);
<ide> $this->_collection->add($route, $options);
<ide>
<ide><path>tests/TestCase/Routing/RouteBuilderTest.php
<ide> public function testHttpMethods($method)
<ide> 'route-name'
<ide> );
<ide> $this->assertInstanceOf(Route::class, $route, 'Should return a route');
<del> $this->assertSame($method, $route->options['_method']);
<add> $this->assertSame($method, $route->defaults['_method']);
<ide> $this->assertSame('app:route-name', $route->options['_name']);
<ide> $this->assertSame('/bookmarks/:id', $route->template);
<ide> $this->assertEquals(
<del> ['plugin' => null, 'controller' => 'Bookmarks', 'action' => 'view'],
<add> ['plugin' => null, 'controller' => 'Bookmarks', 'action' => 'view', '_method' => $method],
<ide> $route->defaults
<ide> );
<ide> } | 2 |
Python | Python | remove init from path | 0a1ee109db2fc30d98e41b269a751b0d3dcd8168
<ide> def train_cli(
<ide> config = util.load_config(config_path, overrides=overrides, interpolate=False)
<ide> msg.divider("Initializing pipeline")
<ide> with show_validation_error(config_path, hint_fill=False):
<del> nlp = init_pipeline(config, output_path, use_gpu=use_gpu)
<add> nlp = init_nlp(config, use_gpu=use_gpu)
<ide> msg.divider("Training pipeline")
<ide> train(nlp, output_path, use_gpu=use_gpu, silent=False)
<ide>
<ide>
<del>def init_pipeline(
<del> config: Config, output_path: Optional[Path], *, use_gpu: int = -1
<del>) -> Language:
<del> init_kwargs = {"use_gpu": use_gpu}
<del> if output_path is not None:
<del> init_path = output_path / "model-initial"
<del> if not init_path.exists():
<del> msg.info(f"Initializing the pipeline in {init_path}")
<del> nlp = init_nlp(config, **init_kwargs)
<del> nlp.to_disk(init_path)
<del> msg.good(f"Saved initialized pipeline to {init_path}")
<del> else:
<del> nlp = util.load_model(init_path)
<del> if must_reinitialize(config, nlp.config):
<del> msg.warn("Config has changed: need to re-initialize pipeline")
<del> nlp = init_nlp(config, **init_kwargs)
<del> nlp.to_disk(init_path)
<del> msg.good(f"Re-initialized pipeline in {init_path}")
<del> else:
<del> msg.good(f"Loaded initialized pipeline from {init_path}")
<del> return nlp
<del> return init_nlp(config, **init_kwargs)
<del>
<del>
<ide> def verify_cli_args(config_path: Path, output_path: Optional[Path] = None) -> None:
<ide> # Make sure all files and paths exists if they are needed
<ide> if not config_path or not config_path.exists(): | 1 |
PHP | PHP | add default function | 5691f12b63f3cbe41d950a26cc0f64d8ce4a7cd3 | <ide><path>src/Illuminate/Support/ServiceProvider.php
<ide> public function isDeferred()
<ide> return $this->defer;
<ide> }
<ide>
<add> /**
<add> * Get a list of files that should be compiled for the package.
<add> *
<add> * @return array
<add> */
<add> public static function compiles()
<add> {
<add> return [];
<add> }
<add>
<ide> } | 1 |
PHP | PHP | implement meridian select | afabfcad6c235975b88f6e098e704d545b153ff0 | <ide><path>src/View/Input/DateTime.php
<ide> class DateTime implements InputInterface {
<ide> 'hour',
<ide> 'minute',
<ide> 'second',
<add> 'meridian',
<ide> ];
<ide>
<ide> /**
<ide> public function __construct($templates, $selectBox) {
<ide> *
<ide> * @param array $data Data to render with.
<ide> * @return string A generated select box.
<del> * @throws \RuntimeException when the name attribute is empty.
<add> * @throws \RuntimeException When option data is invalid.
<ide> */
<ide> public function render(array $data) {
<ide> $data += [
<ide> public function render(array $data) {
<ide> 'hour' => [],
<ide> 'minute' => [],
<ide> 'second' => [],
<add> 'meridian' => null,
<ide> ];
<ide>
<ide> $selected = $this->_deconstuctDate($data['val'], $data);
<ide>
<add> if (!isset($data['meridian']) &&
<add> isset($data['hour']['format']) &&
<add> $data['hour']['format'] == 12
<add> ) {
<add> $data['meridian'] = [];
<add> }
<add>
<ide> $templateOptions = [];
<ide> foreach ($this->_selects as $select) {
<ide> if ($data[$select] === false || $data[$select] === null) {
<ide> $templateOptions[$select] = '';
<ide> unset($data[$select]);
<ide> continue;
<ide> }
<add> if (!is_array($data[$select])) {
<add> throw \RuntimeException(sprintf(
<add> 'Options for "%s" must be an array|false|null',
<add> $select
<add> ));
<add> }
<ide> $method = $select . 'Select';
<ide> $data[$select]['name'] = $data['name'] . "[" . $select . "]";
<ide> $data[$select]['val'] = $selected[$select];
<ide> protected function _deconstuctDate($value, $options) {
<ide> 'day' => $date->format('d'),
<ide> 'hour' => $date->format('H'),
<ide> 'minute' => $date->format('i'),
<del> 'second' => $date->format('s')
<add> 'second' => $date->format('s'),
<add> 'meridian' => $date->format('a'),
<ide> ];
<ide> }
<ide>
<ide> public function secondSelect($options = []) {
<ide> return $this->_select->render($options);
<ide> }
<ide>
<add>/**
<add> * Generates a meridian select
<add> *
<add> * @param array $options
<add> * @return string
<add> */
<add> public function meridianSelect($options = []) {
<add> $options += [
<add> 'name' => '',
<add> 'val' => null,
<add> 'options' => ['am' => 'am', 'pm' => 'pm']
<add> ];
<add> return $this->_select->render($options);
<add> }
<add>
<ide> /**
<ide> * Returns a translated list of month names
<ide> *
<ide><path>tests/TestCase/View/Input/DateTimeTest.php
<ide> public function setUp() {
<ide> 'selectMultiple' => '<select name="{{name}}[]" multiple="multiple"{{attrs}}>{{content}}</select>',
<ide> 'option' => '<option value="{{value}}"{{attrs}}>{{text}}</option>',
<ide> 'optgroup' => '<optgroup label="{{label}}"{{attrs}}>{{content}}</optgroup>',
<del> 'dateWidget' => '{{year}}{{month}}{{day}}{{hour}}{{minute}}{{second}}'
<add> 'dateWidget' => '{{year}}{{month}}{{day}}{{hour}}{{minute}}{{second}}{{meridian}}'
<ide> ];
<ide> $this->templates = new StringTemplate($templates);
<ide> $this->selectBox = new SelectBox($this->templates);
<ide> public function testRenderHourWidget24() {
<ide> $this->assertNotContains('date[day]', $result, 'No day select.');
<ide> $this->assertNotContains('value="0"', $result, 'No zero hour');
<ide> $this->assertNotContains('value="25"', $result, 'No 25th hour');
<add> $this->assertNotContains('<select name="date[meridian]">', $result);
<add> $this->assertNotContains('<option value="pm" selected="selected">pm</option>', $result);
<ide> }
<ide>
<ide> /**
<ide> public function testRenderHourWidget12() {
<ide> );
<ide> $this->assertNotContains('date[day]', $result, 'No day select.');
<ide> $this->assertNotContains('value="0"', $result, 'No zero hour');
<add>
<add> $this->assertContains('<select name="date[meridian]">', $result);
<add> $this->assertContains('<option value="pm" selected="selected">pm</option>', $result);
<ide> }
<ide>
<ide> /**
<ide> public function testRenderSecondsWidget() {
<ide> * @return void
<ide> */
<ide> public function testRenderMeridianWidget() {
<del> $this->markTestIncomplete();
<add> $now = new \DateTime('2010-09-09 13:00:25');
<add> $result = $this->DateTime->render([
<add> 'name' => 'date',
<add> 'year' => false,
<add> 'month' => false,
<add> 'day' => false,
<add> 'hour' => false,
<add> 'minute' => false,
<add> 'second' => false,
<add> 'meridian' => [],
<add> 'val' => $now,
<add> ]);
<add> $expected = [
<add> 'select' => ['name' => 'date[meridian]'],
<add> ['option' => ['value' => 'am']], 'am', '/option',
<add> ['option' => ['value' => 'pm', 'selected' => 'selected']], 'pm', '/option',
<add> '/select',
<add> ];
<add> $this->assertTags($result, $expected);
<add>
<add> $now = new \DateTime('2010-09-09 09:00:25');
<add> $result = $this->DateTime->render([
<add> 'name' => 'date',
<add> 'year' => false,
<add> 'month' => false,
<add> 'day' => false,
<add> 'hour' => false,
<add> 'minute' => false,
<add> 'second' => false,
<add> 'meridian' => [],
<add> 'val' => $now,
<add> ]);
<add> $expected = [
<add> 'select' => ['name' => 'date[meridian]'],
<add> ['option' => ['value' => 'am', 'selected' => 'selected']], 'am', '/option',
<add> ['option' => ['value' => 'pm']], 'pm', '/option',
<add> '/select',
<add> ];
<add> $this->assertTags($result, $expected);
<ide> }
<ide>
<ide> /** | 2 |
Ruby | Ruby | add notes for `define_model_callbacks` [ci skip] | f50430ee3e14fbde1dd83c38f55afe70e567f77d | <ide><path>activemodel/lib/active_model/callbacks.rb
<ide> def self.extended(base) #:nodoc:
<ide> # # obj is the MyModel instance that the callback is being called on
<ide> # end
<ide> # end
<add> #
<add> # NOTE: +method_name+ passed to `define_model_callbacks` must not end with
<add> # `!`, `?` and `=`.
<ide> def define_model_callbacks(*callbacks)
<ide> options = callbacks.extract_options!
<ide> options = {
<ide><path>activesupport/lib/active_support/callbacks.rb
<ide> def reset_callbacks(name)
<ide> # define_callbacks :save, scope: [:name]
<ide> #
<ide> # would call <tt>Audit#save</tt>.
<add> #
<add> # NOTE: +method_name+ passed to `define_model_callbacks` must not end with
<add> # `!`, `?` and `=`.
<ide> def define_callbacks(*names)
<ide> options = names.extract_options!
<ide> | 2 |
Text | Text | add citation for model garden | f5e8557daa4b0f7a8f0bc8376bd1893090324955 | <ide><path>official/README.md
<ide> pip3 install --user -r official/requirements.txt
<ide> ## Contributions
<ide>
<ide> If you want to contribute, please review the [contribution guidelines](https://github.com/tensorflow/models/wiki/How-to-contribute).
<add>
<add>## Citing TF Official Model Garden
<add>
<add>To cite this repository:
<add>
<add>```
<add>@software{tfmodels2020github,
<add> author = {Chen Chen and Xianzhi Du and Le Hou and Jaeyoun Kim and Pengchong
<add> Jin and Jing Li and Yeqing Li and Abdullah Rashwan and Hongkun Yu},
<add> title = {TensorFlow Official Model Garden},
<add> url = {https://github.com/tensorflow/models/tree/master/official},
<add> year = {2020},
<add>}
<add>``` | 1 |
Ruby | Ruby | fix rubocop warnings | 6d1d3ff013dbf8e6bedbb9710fa15f4749fcc4e3 | <ide><path>Library/Homebrew/os/mac/cctools_mach.rb
<ide> def parse_otool_L_output
<ide> id = libs.shift[OTOOL_RX, 1] if path.dylib?
<ide> libs.map! { |lib| lib[OTOOL_RX, 1] }.compact!
<ide>
<del> return id, libs
<add> [id, libs]
<ide> end
<ide> end
<ide> | 1 |
PHP | PHP | fix conversion of empty string to empty array | 4b643ba47e2f2a9910e1163a826abaeb3753ad1b | <ide><path>src/Controller/ControllerFactory.php
<ide> protected function coerceStringToType(string $argument, ReflectionNamedType $typ
<ide> case 'bool':
<ide> return $argument === '0' ? false : ($argument === '1' ? true : null);
<ide> case 'array':
<del> return explode(',', $argument);
<add> return $argument === '' ? [] : explode(',', $argument);
<ide> }
<ide>
<ide> return null;
<ide><path>tests/TestCase/Controller/ControllerFactoryTest.php
<ide> public function testInvokePassedParametersCoercion(): void
<ide>
<ide> $this->assertNotNull($data);
<ide> $this->assertSame(['one' => 1.0, 'two' => 2, 'three' => false, 'four' => ['8', '9']], $data);
<add>
<add> $request = new ServerRequest([
<add> 'url' => 'test_plugin_three/dependencies/requiredTyped',
<add> 'params' => [
<add> 'plugin' => null,
<add> 'controller' => 'Dependencies',
<add> 'action' => 'requiredTyped',
<add> 'pass' => ['1.0', '02', '0', ''],
<add> ],
<add> ]);
<add> $controller = $this->factory->create($request);
<add>
<add> $result = $this->factory->invoke($controller);
<add> $data = json_decode((string)$result->getBody(), true);
<add>
<add> $this->assertNotNull($data);
<add> $this->assertSame(['one' => 1.0, 'two' => 2, 'three' => false, 'four' => []], $data);
<ide> }
<ide>
<ide> /** | 2 |
PHP | PHP | allow processing assets of only specified plugin | 20c09860caa9e4029c4bf3ce890c27f0936a0230 | <ide><path>src/Shell/PluginAssetsShell.php
<ide> class PluginAssetsShell extends Shell {
<ide> * fallback to copying the assets. For vendor namespaced plugin, parent folder
<ide> * for vendor name are created if required.
<ide> *
<add> * @param string|string $name Name of plugin for which to symlink assets.
<add> * If null all plugins will be processed.
<ide> * @return void
<ide> */
<del> public function symlink() {
<del> $this->_process($this->_list());
<add> public function symlink($name = null) {
<add> $this->_process($this->_list($name));
<ide> }
<ide>
<ide> /**
<ide> * Get list of plugins to process. Plugins without a webroot directory are skipped.
<ide> *
<del> * @return array
<add> * @param string|string $name Name of plugin for which to symlink assets.
<add> * If null all plugins will be processed.
<add> * @return array List of plugins with meta data.
<ide> */
<del> protected function _list() {
<add> protected function _list($name = null) {
<add> if ($name === null) {
<add> $pluginsList = Plugin::loaded();
<add> } else {
<add> $pluginsList = [$name];
<add> }
<add>
<ide> $plugins = [];
<del> foreach (Plugin::loaded() as $plugin) {
<add>
<add> foreach ($pluginsList as $plugin) {
<ide> $path = Plugin::path($plugin) . 'webroot';
<ide> if (!is_dir($path)) {
<ide> $this->out('', 1, Shell::VERBOSE);
<ide> protected function _list() {
<ide> 'namespaced' => $namespaced
<ide> ];
<ide> }
<add>
<ide> return $plugins;
<ide> }
<ide>
<ide> public function getOptionParser() {
<ide> $parser = parent::getOptionParser();
<ide>
<ide> $parser->addSubcommand('symlink', [
<del> 'help' => 'Symlink / copy assets to app\'s webroot'
<add> 'help' => 'Symlink / copy assets to app\'s webroot.'
<add> ])->addArgument('name', [
<add> 'help' => 'A specific plugin you want to symlink assets for.',
<add> 'optional' => true,
<ide> ]);
<ide>
<ide> return $parser;
<ide><path>tests/TestCase/Shell/PluginAssetsShellTest.php
<ide> public function setUp() {
<ide> parent::setUp();
<ide>
<ide> $this->skipIf(
<del> DIRECTORY_SEPARATOR === '\\',
<add> DS === '\\',
<ide> 'Skip PluginAssetsShell tests on windows to prevent side effects for UrlHelper tests on AppVeyor.'
<ide> );
<ide>
<ide> public function testSymlink() {
<ide> $path = WWW_ROOT . 'test_plugin';
<ide> $link = new \SplFileInfo($path);
<ide> $this->assertTrue(file_exists($path . DS . 'root.js'));
<del> if (DIRECTORY_SEPARATOR === '\\') {
<add> if (DS === '\\') {
<ide> $this->assertTrue($link->isDir());
<ide> $folder = new Folder($path);
<ide> $folder->delete();
<ide> public function testSymlinkWhenVendorDirectoryExits() {
<ide> $this->shell->symlink();
<ide> $path = WWW_ROOT . 'company' . DS . 'test_plugin_three';
<ide> $link = new \SplFileInfo($path);
<del> if (DIRECTORY_SEPARATOR === '\\') {
<add> if (DS === '\\') {
<ide> $this->assertTrue($link->isDir());
<ide> } else {
<ide> $this->assertTrue($link->isLink());
<ide> public function testForPluginWithoutWebroot() {
<ide> $this->assertFalse(file_exists(WWW_ROOT . 'test_plugin_two'));
<ide> }
<ide>
<add>/**
<add> * testSymlinkingSpecifiedPlugin
<add> *
<add> * @return void
<add> */
<add> public function testSymlinkingSpecifiedPlugin() {
<add> Plugin::load('TestPlugin');
<add> Plugin::load('Company/TestPluginThree');
<add>
<add> $this->shell->symlink('TestPlugin');
<add>
<add> $path = WWW_ROOT . 'test_plugin';
<add> $link = new \SplFileInfo($path);
<add> $this->assertTrue(file_exists($path . DS . 'root.js'));
<add>
<add> $path = WWW_ROOT . 'company' . DS . 'test_plugin_three';
<add> $link = new \SplFileInfo($path);
<add> $this->assertFalse($link->isDir());
<add> $this->assertFalse($link->isLink());
<add> }
<add>
<ide> } | 2 |
Text | Text | fix typo in cli.md for report-dir | 641c466671556b179f1cdb7f9cbb36f4750c253d | <ide><path>doc/api/cli.md
<ide> Write reports in a compact format, single-line JSON, more easily consumable
<ide> by log processing systems than the default multi-line format designed for
<ide> human consumption.
<ide>
<del>### --report-dir=directory`, `report-directory=directory`
<add>### `--report-dir=directory`, `report-directory=directory`
<ide> <!-- YAML
<ide> added: v11.8.0
<ide> changes: | 1 |
Javascript | Javascript | remove unused var from stream2 test | c8eb62fcee0d4e26c5bfe9b3405b40d5d07793f5 | <ide><path>test/parallel/test-stream2-readable-legacy-drain.js
<ide> r.on('end', function() {
<ide>
<ide> var w = new Stream();
<ide> w.writable = true;
<del>var writes = 0;
<ide> var buffered = 0;
<ide> w.write = function(c) {
<del> writes += c.length;
<ide> buffered += c.length;
<ide> process.nextTick(drain);
<ide> return false; | 1 |
Javascript | Javascript | fix fs.realpathsync on windows | 8d70294c31da06ff04279e47739c57a1b57cd5aa | <ide><path>lib/fs.js
<ide> if (isWindows) {
<ide> // the same as path.resolve that fails if the path doesn't exists.
<ide>
<ide> // windows version
<del> fs.realpathSync = function realpathSync(p) {
<add> fs.realpathSync = function realpathSync(p, cache) {
<ide> var p = path.resolve(p);
<ide> if (cache && Object.prototype.hasOwnProperty.call(cache, p)) {
<ide> return cache[p]; | 1 |
Javascript | Javascript | add getintoption function to reduce dupl | d4726d2f3f7a411b2d22d2ef04fe42f8d5e22fcf | <ide><path>lib/internal/crypto/sig.js
<ide> Sign.prototype.update = function update(data, encoding) {
<ide> return this;
<ide> };
<ide>
<add>function getPadding(options) {
<add> return getIntOption('padding', RSA_PKCS1_PADDING, options);
<add>}
<add>
<add>function getSaltLength(options) {
<add> return getIntOption('saltLength', RSA_PSS_SALTLEN_AUTO, options);
<add>}
<add>
<add>function getIntOption(name, defaultValue, options) {
<add> if (options.hasOwnProperty(name)) {
<add> if (options[name] === options[name] >> 0) {
<add> return options[name];
<add> } else {
<add> throw new ERR_INVALID_OPT_VALUE(name, options[name]);
<add> }
<add> }
<add> return defaultValue;
<add>}
<add>
<ide> Sign.prototype.sign = function sign(options, encoding) {
<ide> if (!options)
<ide> throw new ERR_CRYPTO_SIGN_KEY_REQUIRED();
<ide> Sign.prototype.sign = function sign(options, encoding) {
<ide> var passphrase = options.passphrase || null;
<ide>
<ide> // Options specific to RSA
<del> var rsaPadding = RSA_PKCS1_PADDING;
<del> if (options.hasOwnProperty('padding')) {
<del> if (options.padding === options.padding >> 0) {
<del> rsaPadding = options.padding;
<del> } else {
<del> throw new ERR_INVALID_OPT_VALUE('padding', options.padding);
<del> }
<del> }
<add> var rsaPadding = getPadding(options);
<ide>
<del> var pssSaltLength = RSA_PSS_SALTLEN_AUTO;
<del> if (options.hasOwnProperty('saltLength')) {
<del> if (options.saltLength === options.saltLength >> 0) {
<del> pssSaltLength = options.saltLength;
<del> } else {
<del> throw new ERR_INVALID_OPT_VALUE('saltLength', options.saltLength);
<del> }
<del> }
<add> var pssSaltLength = getSaltLength(options);
<ide>
<ide> key = toBuf(key);
<ide> if (!isArrayBufferView(key)) {
<ide> Verify.prototype.verify = function verify(options, signature, sigEncoding) {
<ide> sigEncoding = sigEncoding || getDefaultEncoding();
<ide>
<ide> // Options specific to RSA
<del> var rsaPadding = RSA_PKCS1_PADDING;
<del> if (options.hasOwnProperty('padding')) {
<del> if (options.padding === options.padding >> 0) {
<del> rsaPadding = options.padding;
<del> } else {
<del> throw new ERR_INVALID_OPT_VALUE('padding', options.padding);
<del> }
<del> }
<add> var rsaPadding = getPadding(options);
<ide>
<del> var pssSaltLength = RSA_PSS_SALTLEN_AUTO;
<del> if (options.hasOwnProperty('saltLength')) {
<del> if (options.saltLength === options.saltLength >> 0) {
<del> pssSaltLength = options.saltLength;
<del> } else {
<del> throw new ERR_INVALID_OPT_VALUE('saltLength', options.saltLength);
<del> }
<del> }
<add> var pssSaltLength = getSaltLength(options);
<ide>
<ide> key = toBuf(key);
<ide> if (!isArrayBufferView(key)) { | 1 |
Javascript | Javascript | fix parallel/test-repl with new v8 | 6e9d1c868474273b3b5891508c28aa13f70ff465 | <ide><path>test/parallel/test-repl.js
<ide> function error_test() {
<ide> expect: /^SyntaxError: Delete of an unqualified identifier in strict mode/ },
<ide> { client: client_unix, send: '(function() { "use strict"; eval = 17; })()',
<ide> expect: /^SyntaxError: Unexpected eval or arguments in strict mode/ },
<del> { client: client_unix, send: '(function() { "use strict"; if (true){ function f() { } } })()',
<add> { client: client_unix, send: '(function() { "use strict"; if (true) function f() { } })()',
<ide> expect: /^SyntaxError: In strict mode code, functions can only be declared at top level or immediately within another function/ },
<ide> // Named functions can be used:
<ide> { client: client_unix, send: 'function blah() { return 1; }', | 1 |
Go | Go | ignore ping errors in notary repository setup | 5e11cd43aa21a9d0eb1f5f205f05dc7b14ee4d43 | <ide><path>api/client/trust.go
<ide> func (cli *DockerCli) getNotaryRepository(repoInfo *registry.RepositoryInfo, aut
<ide> if err != nil {
<ide> return nil, err
<ide> }
<add>
<add> challengeManager := auth.NewSimpleChallengeManager()
<add>
<ide> resp, err := pingClient.Do(req)
<ide> if err != nil {
<del> return nil, err
<del> }
<del> defer resp.Body.Close()
<add> // Ignore error on ping to operate in offline mode
<add> logrus.Debugf("Error pinging notary server %q: %s", endpointStr, err)
<add> } else {
<add> defer resp.Body.Close()
<ide>
<del> challengeManager := auth.NewSimpleChallengeManager()
<del> if err := challengeManager.AddResponse(resp); err != nil {
<del> return nil, err
<add> // Add response to the challenge manager to parse out
<add> // authentication header and register authentication method
<add> if err := challengeManager.AddResponse(resp); err != nil {
<add> return nil, err
<add> }
<ide> }
<ide>
<ide> creds := simpleCredentialStore{auth: authConfig}
<ide> func notaryError(err error) error {
<ide> return fmt.Errorf("remote repository out-of-date: %v", err)
<ide> case trustmanager.ErrKeyNotFound:
<ide> return fmt.Errorf("signing keys not found: %v", err)
<add> case *net.OpError:
<add> return fmt.Errorf("error contacting notary server: %v", err)
<ide> }
<ide>
<ide> return err
<ide><path>integration-cli/docker_cli_pull_trusted_test.go
<ide> func (s *DockerTrustSuite) TestTrustedPullWithExpiredSnapshot(c *check.C) {
<ide> }
<ide> })
<ide> }
<add>
<add>func (s *DockerTrustSuite) TestTrustedOfflinePull(c *check.C) {
<add> repoName := s.setupTrustedImage(c, "trusted-offline-pull")
<add>
<add> pullCmd := exec.Command(dockerBinary, "pull", repoName)
<add> s.trustedCmdWithServer(pullCmd, "https://invalidnotaryserver")
<add> out, _, err := runCommandWithOutput(pullCmd)
<add> if err == nil {
<add> c.Fatalf("Expected error pulling with invalid notary server:\n%s", out)
<add> }
<add>
<add> if !strings.Contains(string(out), "error contacting notary server") {
<add> c.Fatalf("Missing expected output on trusted pull:\n%s", out)
<add> }
<add>
<add> // Do valid trusted pull to warm cache
<add> pullCmd = exec.Command(dockerBinary, "pull", repoName)
<add> s.trustedCmd(pullCmd)
<add> out, _, err = runCommandWithOutput(pullCmd)
<add> if err != nil {
<add> c.Fatalf("Error running trusted pull: %s\n%s", err, out)
<add> }
<add>
<add> if !strings.Contains(string(out), "Tagging") {
<add> c.Fatalf("Missing expected output on trusted push:\n%s", out)
<add> }
<add>
<add> dockerCmd(c, "rmi", repoName)
<add>
<add> // Try pull again with invalid notary server, should use cache
<add> pullCmd = exec.Command(dockerBinary, "pull", repoName)
<add> s.trustedCmdWithServer(pullCmd, "https://invalidnotaryserver")
<add> out, _, err = runCommandWithOutput(pullCmd)
<add> if err != nil {
<add> c.Fatalf("Error running trusted pull: %s\n%s", err, out)
<add> }
<add>
<add> if !strings.Contains(string(out), "Tagging") {
<add> c.Fatalf("Missing expected output on trusted push:\n%s", out)
<add> }
<add>}
<ide><path>integration-cli/docker_cli_push_test.go
<ide> func (s *DockerTrustSuite) TestTrustedPushWithFaillingServer(c *check.C) {
<ide> c.Fatalf("Missing error while running trusted push w/ no server")
<ide> }
<ide>
<del> if !strings.Contains(string(out), "Error establishing connection to notary repository") {
<add> if !strings.Contains(string(out), "error contacting notary server") {
<ide> c.Fatalf("Missing expected output on trusted push:\n%s", out)
<ide> }
<ide> } | 3 |
Ruby | Ruby | remove tests for #with_scope (it's now deprecated) | 40a711ce29505e95d5033fe02c22ba7469206f06 | <ide><path>activerecord/test/cases/associations/eager_test.rb
<ide> def test_eager_with_has_many_and_limit_and_scoped_conditions_on_the_eagers
<ide> end
<ide> end
<ide>
<del> def test_eager_with_has_many_and_limit_and_scoped_and_explicit_conditions_on_the_eagers
<del> Post.send(:with_scope, :find => { :conditions => "1=1" }) do
<del> posts =
<del> authors(:david).posts
<del> .includes(:comments)
<del> .where("comments.body like 'Normal%' OR comments.#{QUOTED_TYPE}= 'SpecialComment'")
<del> .references(:comments)
<del> .limit(2)
<del> .to_a
<del> assert_equal 2, posts.size
<del>
<del> count = Post.includes(:comments, :author)
<del> .where("authors.name = 'David' AND (comments.body like 'Normal%' OR comments.#{QUOTED_TYPE}= 'SpecialComment')")
<del> .references(:authors, :comments)
<del> .limit(2)
<del> .count
<del> assert_equal count, posts.size
<del> end
<del> end
<del>
<del> def test_eager_with_scoped_order_using_association_limiting_without_explicit_scope
<del> posts_with_explicit_order =
<del> Post.where('comments.id is not null').references(:comments)
<del> .includes(:comments).order('posts.id DESC').limit(2)
<del>
<del> posts_with_scoped_order = Post.order('posts.id DESC').scoping do
<del> Post.where('comments.id is not null').references(:comments)
<del> .includes(:comments).limit(2)
<del> end
<del>
<del> assert_equal posts_with_explicit_order, posts_with_scoped_order
<del> end
<del>
<ide> def test_eager_association_loading_with_habtm
<ide> posts = Post.find(:all, :include => :categories, :order => "posts.id")
<ide> assert_equal 2, posts[0].categories.size
<ide><path>activerecord/test/cases/associations/join_model_test.rb
<ide> def test_has_many_through_with_custom_primary_key_on_has_many_source
<ide> assert_equal [authors(:david), authors(:bob)], posts(:thinking).authors_using_custom_pk.order('authors.id')
<ide> end
<ide>
<del> def test_both_scoped_and_explicit_joins_should_be_respected
<del> assert_nothing_raised do
<del> Post.send(:with_scope, :find => {:joins => "left outer join comments on comments.id = posts.id"}) do
<del> Post.find :all, :select => "comments.id, authors.id", :joins => "left outer join authors on authors.id = posts.author_id"
<del> end
<del> end
<del> end
<del>
<ide> def test_belongs_to_polymorphic_with_counter_cache
<ide> assert_equal 1, posts(:welcome)[:taggings_count]
<ide> tagging = posts(:welcome).taggings.create(:tag => tags(:general))
<ide><path>activerecord/test/cases/base_test.rb
<ide> def test_count_with_join
<ide> assert_equal res6, res7
<ide> end
<ide>
<del> def test_scoped_find_conditions
<del> scoped_developers = Developer.send(:with_scope, :find => { :conditions => 'salary > 90000' }) do
<del> Developer.find(:all, :conditions => 'id < 5')
<del> end
<del> assert !scoped_developers.include?(developers(:david)) # David's salary is less than 90,000
<del> assert_equal 3, scoped_developers.size
<del> end
<del>
<ide> def test_no_limit_offset
<ide> assert_nothing_raised do
<ide> Developer.find(:all, :offset => 2)
<ide> end
<ide> end
<ide>
<del> def test_scoped_find_limit_offset
<del> scoped_developers = Developer.send(:with_scope, :find => { :limit => 3, :offset => 2 }) do
<del> Developer.find(:all, :order => 'id')
<del> end
<del> assert !scoped_developers.include?(developers(:david))
<del> assert !scoped_developers.include?(developers(:jamis))
<del> assert_equal 3, scoped_developers.size
<del>
<del> # Test without scoped find conditions to ensure we get the whole thing
<del> developers = Developer.find(:all, :order => 'id')
<del> assert_equal Developer.count, developers.size
<del> end
<del>
<del> def test_scoped_find_order
<del> # Test order in scope
<del> scoped_developers = Developer.send(:with_scope, :find => { :limit => 1, :order => 'salary DESC' }) do
<del> Developer.find(:all)
<del> end
<del> assert_equal 'Jamis', scoped_developers.first.name
<del> assert scoped_developers.include?(developers(:jamis))
<del> # Test scope without order and order in find
<del> scoped_developers = Developer.send(:with_scope, :find => { :limit => 1 }) do
<del> Developer.find(:all, :order => 'salary DESC')
<del> end
<del> # Test scope order + find order, order has priority
<del> scoped_developers = Developer.send(:with_scope, :find => { :limit => 3, :order => 'id DESC' }) do
<del> Developer.find(:all, :order => 'salary ASC')
<del> end
<del> assert scoped_developers.include?(developers(:poor_jamis))
<del> assert ! scoped_developers.include?(developers(:david))
<del> assert ! scoped_developers.include?(developers(:jamis))
<del> assert_equal 3, scoped_developers.size
<del>
<del> # Test without scoped find conditions to ensure we get the right thing
<del> assert ! scoped_developers.include?(Developer.find(1))
<del> assert scoped_developers.include?(Developer.find(11))
<del> end
<del>
<del> def test_scoped_find_limit_offset_including_has_many_association
<del> topics = Topic.send(:with_scope, :find => {:limit => 1, :offset => 1, :include => :replies}) do
<del> Topic.find(:all, :order => "topics.id")
<del> end
<del> assert_equal 1, topics.size
<del> assert_equal 2, topics.first.id
<del> end
<del>
<del> def test_scoped_find_order_including_has_many_association
<del> developers = Developer.send(:with_scope, :find => { :order => 'developers.salary DESC', :include => :projects }) do
<del> Developer.find(:all)
<del> end
<del> assert developers.size >= 2
<del> (1...developers.size).each do |i|
<del> assert developers[i-1].salary >= developers[i].salary
<del> end
<del> end
<del>
<del> def test_scoped_find_with_group_and_having
<del> developers = Developer.send(:with_scope, :find => { :group => 'developers.salary', :having => "SUM(salary) > 10000", :select => "SUM(salary) as salary" }) do
<del> Developer.find(:all)
<del> end
<del> assert_equal 3, developers.size
<del> end
<del>
<ide> def test_find_last
<ide> last = Developer.find :last
<ide> assert_equal last, Developer.find(:first, :order => 'id desc')
<ide> def test_find_symbol_ordered_last
<ide> assert_equal last, Developer.find(:all, :order => :salary).last
<ide> end
<ide>
<del> def test_find_scoped_ordered_last
<del> last_developer = Developer.send(:with_scope, :find => { :order => 'developers.salary ASC' }) do
<del> Developer.find(:last)
<del> end
<del> assert_equal last_developer, Developer.find(:all, :order => 'developers.salary ASC').last
<del> end
<del>
<ide> def test_abstract_class
<ide> assert !ActiveRecord::Base.abstract_class?
<ide> assert LoosePerson.abstract_class?
<ide><path>activerecord/test/cases/calculations_test.rb
<ide> def test_should_get_maximum_of_field_with_include
<ide> assert_equal 55, Account.where("companies.name != 'Summit'").references(:companies).includes(:firm).maximum(:credit_limit)
<ide> end
<ide>
<del> def test_should_get_maximum_of_field_with_scoped_include
<del> Account.send :with_scope, :find => { :include => :firm } do
<del> assert_equal 55, Account.where("companies.name != 'Summit'").references(:companies).maximum(:credit_limit)
<del> end
<del> end
<del>
<ide> def test_should_get_minimum_of_field
<ide> assert_equal 50, Account.minimum(:credit_limit)
<ide> end
<ide><path>activerecord/test/cases/finder_test.rb
<ide> def test_exists_with_aggregate_having_three_mappings_with_one_difference
<ide> Address.new(existing_address.street + "1", existing_address.city, existing_address.country))
<ide> end
<ide>
<del> def test_exists_with_scoped_include
<del> Developer.send(:with_scope, :find => { :include => :projects, :order => "projects.name" }) do
<del> assert Developer.exists?
<del> end
<del> end
<del>
<ide> def test_exists_does_not_instantiate_records
<ide> Developer.expects(:instantiate).never
<ide> Developer.exists?
<ide> def test_with_limiting_with_custom_select
<ide> assert_equal [0, 1, 1], posts.map(&:author_id).sort
<ide> end
<ide>
<del> def test_finder_with_scoped_from
<del> all_topics = Topic.find(:all)
<del>
<del> Topic.send(:with_scope, :find => { :from => 'fake_topics' }) do
<del> assert_equal all_topics, Topic.from('topics').to_a
<del> end
<del> end
<del>
<ide> def test_find_one_message_with_custom_primary_key
<ide> Toy.primary_key = :name
<ide> begin
<ide><path>activerecord/test/cases/locking_test.rb
<ide> def test_sane_find_with_lock
<ide> end
<ide> end
<ide>
<del> # Test scoped lock.
<del> def test_sane_find_with_scoped_lock
<del> assert_nothing_raised do
<del> Person.transaction do
<del> Person.send(:with_scope, :find => { :lock => true }) do
<del> Person.find 1
<del> end
<del> end
<del> end
<del> end
<del>
<ide> # PostgreSQL protests SELECT ... FOR UPDATE on an outer join.
<ide> unless current_adapter?(:PostgreSQLAdapter)
<ide> # Test locked eager find.
<ide><path>activerecord/test/cases/method_scoping_test.rb
<del># This file can be removed once with_exclusive_scope and with_scope are removed.
<del># All the tests were already ported to relation_scoping_test.rb when the new
<del># relation scoping API was added.
<del>
<del>require "cases/helper"
<del>require 'models/post'
<del>require 'models/author'
<del>require 'models/developer'
<del>require 'models/project'
<del>require 'models/comment'
<del>
<del>class MethodScopingTest < ActiveRecord::TestCase
<del> fixtures :authors, :developers, :projects, :comments, :posts, :developers_projects
<del>
<del> def test_set_conditions
<del> Developer.send(:with_scope, :find => { :conditions => 'just a test...' }) do
<del> assert_match '(just a test...)', Developer.scoped.to_sql
<del> end
<del> end
<del>
<del> def test_scoped_find
<del> Developer.send(:with_scope, :find => { :conditions => "name = 'David'" }) do
<del> assert_nothing_raised { Developer.find(1) }
<del> end
<del> end
<del>
<del> def test_scoped_find_first
<del> Developer.send(:with_scope, :find => { :conditions => "salary = 100000" }) do
<del> assert_equal Developer.find(10), Developer.find(:first, :order => 'name')
<del> end
<del> end
<del>
<del> def test_scoped_find_last
<del> highest_salary = Developer.find(:first, :order => "salary DESC")
<del>
<del> Developer.send(:with_scope, :find => { :order => "salary" }) do
<del> assert_equal highest_salary, Developer.last
<del> end
<del> end
<del>
<del> def test_scoped_find_last_preserves_scope
<del> lowest_salary = Developer.find(:first, :order => "salary ASC")
<del> highest_salary = Developer.find(:first, :order => "salary DESC")
<del>
<del> Developer.send(:with_scope, :find => { :order => "salary" }) do
<del> assert_equal highest_salary, Developer.last
<del> assert_equal lowest_salary, Developer.first
<del> end
<del> end
<del>
<del> def test_scoped_find_combines_conditions
<del> Developer.send(:with_scope, :find => { :conditions => "salary = 9000" }) do
<del> assert_equal developers(:poor_jamis), Developer.find(:first, :conditions => "name = 'Jamis'")
<del> end
<del> end
<del>
<del> def test_scoped_find_sanitizes_conditions
<del> Developer.send(:with_scope, :find => { :conditions => ['salary = ?', 9000] }) do
<del> assert_equal developers(:poor_jamis), Developer.find(:first)
<del> end
<del> end
<del>
<del> def test_scoped_find_combines_and_sanitizes_conditions
<del> Developer.send(:with_scope, :find => { :conditions => ['salary = ?', 9000] }) do
<del> assert_equal developers(:poor_jamis), Developer.find(:first, :conditions => ['name = ?', 'Jamis'])
<del> end
<del> end
<del>
<del> def test_scoped_find_all
<del> Developer.send(:with_scope, :find => { :conditions => "name = 'David'" }) do
<del> assert_equal [developers(:david)], Developer.all
<del> end
<del> end
<del>
<del> def test_scoped_find_select
<del> Developer.send(:with_scope, :find => { :select => "id, name" }) do
<del> developer = Developer.find(:first, :conditions => "name = 'David'")
<del> assert_equal "David", developer.name
<del> assert !developer.has_attribute?(:salary)
<del> end
<del> end
<del>
<del> def test_scope_select_concatenates
<del> Developer.send(:with_scope, :find => { :select => "name" }) do
<del> developer = Developer.find(:first, :select => 'id, salary', :conditions => "name = 'David'")
<del> assert_equal 80000, developer.salary
<del> assert developer.has_attribute?(:id)
<del> assert developer.has_attribute?(:name)
<del> assert developer.has_attribute?(:salary)
<del> end
<del> end
<del>
<del> def test_scoped_count
<del> Developer.send(:with_scope, :find => { :conditions => "name = 'David'" }) do
<del> assert_equal 1, Developer.count
<del> end
<del>
<del> Developer.send(:with_scope, :find => { :conditions => 'salary = 100000' }) do
<del> assert_equal 8, Developer.count
<del> assert_equal 1, Developer.count(:conditions => "name LIKE 'fixture_1%'")
<del> end
<del> end
<del>
<del> def test_scoped_find_include
<del> # with the include, will retrieve only developers for the given project
<del> scoped_developers = Developer.send(:with_scope, :find => { :include => :projects }) do
<del> Developer.find(:all, :conditions => { 'projects.id' => 2 })
<del> end
<del> assert scoped_developers.include?(developers(:david))
<del> assert !scoped_developers.include?(developers(:jamis))
<del> assert_equal 1, scoped_developers.size
<del> end
<del>
<del> def test_scoped_find_joins
<del> scoped_developers = Developer.send(:with_scope, :find => { :joins => 'JOIN developers_projects ON id = developer_id' } ) do
<del> Developer.find(:all, :conditions => 'developers_projects.project_id = 2')
<del> end
<del> assert scoped_developers.include?(developers(:david))
<del> assert !scoped_developers.include?(developers(:jamis))
<del> assert_equal 1, scoped_developers.size
<del> assert_equal developers(:david).attributes, scoped_developers.first.attributes
<del> end
<del>
<del> def test_scoped_find_using_new_style_joins
<del> scoped_developers = Developer.send(:with_scope, :find => { :joins => :projects }) do
<del> Developer.find(:all, :conditions => 'projects.id = 2')
<del> end
<del> assert scoped_developers.include?(developers(:david))
<del> assert !scoped_developers.include?(developers(:jamis))
<del> assert_equal 1, scoped_developers.size
<del> assert_equal developers(:david).attributes, scoped_developers.first.attributes
<del> end
<del>
<del> def test_scoped_find_merges_old_style_joins
<del> scoped_authors = Author.send(:with_scope, :find => { :joins => 'INNER JOIN posts ON authors.id = posts.author_id ' }) do
<del> Author.find(:all, :select => 'DISTINCT authors.*', :joins => 'INNER JOIN comments ON posts.id = comments.post_id', :conditions => 'comments.id = 1')
<del> end
<del> assert scoped_authors.include?(authors(:david))
<del> assert !scoped_authors.include?(authors(:mary))
<del> assert_equal 1, scoped_authors.size
<del> assert_equal authors(:david).attributes, scoped_authors.first.attributes
<del> end
<del>
<del> def test_scoped_find_merges_new_style_joins
<del> scoped_authors = Author.send(:with_scope, :find => { :joins => :posts }) do
<del> Author.find(:all, :select => 'DISTINCT authors.*', :joins => :comments, :conditions => 'comments.id = 1')
<del> end
<del> assert scoped_authors.include?(authors(:david))
<del> assert !scoped_authors.include?(authors(:mary))
<del> assert_equal 1, scoped_authors.size
<del> assert_equal authors(:david).attributes, scoped_authors.first.attributes
<del> end
<del>
<del> def test_scoped_find_merges_new_and_old_style_joins
<del> scoped_authors = Author.send(:with_scope, :find => { :joins => :posts }) do
<del> Author.find(:all, :select => 'DISTINCT authors.*', :joins => 'JOIN comments ON posts.id = comments.post_id', :conditions => 'comments.id = 1')
<del> end
<del> assert scoped_authors.include?(authors(:david))
<del> assert !scoped_authors.include?(authors(:mary))
<del> assert_equal 1, scoped_authors.size
<del> assert_equal authors(:david).attributes, scoped_authors.first.attributes
<del> end
<del>
<del> def test_scoped_find_merges_string_array_style_and_string_style_joins
<del> scoped_authors = Author.send(:with_scope, :find => { :joins => ["INNER JOIN posts ON posts.author_id = authors.id"]}) do
<del> Author.find(:all, :select => 'DISTINCT authors.*', :joins => 'INNER JOIN comments ON posts.id = comments.post_id', :conditions => 'comments.id = 1')
<del> end
<del> assert scoped_authors.include?(authors(:david))
<del> assert !scoped_authors.include?(authors(:mary))
<del> assert_equal 1, scoped_authors.size
<del> assert_equal authors(:david).attributes, scoped_authors.first.attributes
<del> end
<del>
<del> def test_scoped_find_merges_string_array_style_and_hash_style_joins
<del> scoped_authors = Author.send(:with_scope, :find => { :joins => :posts}) do
<del> Author.find(:all, :select => 'DISTINCT authors.*', :joins => ['INNER JOIN comments ON posts.id = comments.post_id'], :conditions => 'comments.id = 1')
<del> end
<del> assert scoped_authors.include?(authors(:david))
<del> assert !scoped_authors.include?(authors(:mary))
<del> assert_equal 1, scoped_authors.size
<del> assert_equal authors(:david).attributes, scoped_authors.first.attributes
<del> end
<del>
<del> def test_scoped_find_merges_joins_and_eliminates_duplicate_string_joins
<del> scoped_authors = Author.send(:with_scope, :find => { :joins => 'INNER JOIN posts ON posts.author_id = authors.id'}) do
<del> Author.find(:all, :select => 'DISTINCT authors.*', :joins => ["INNER JOIN posts ON posts.author_id = authors.id", "INNER JOIN comments ON posts.id = comments.post_id"], :conditions => 'comments.id = 1')
<del> end
<del> assert scoped_authors.include?(authors(:david))
<del> assert !scoped_authors.include?(authors(:mary))
<del> assert_equal 1, scoped_authors.size
<del> assert_equal authors(:david).attributes, scoped_authors.first.attributes
<del> end
<del>
<del> def test_scoped_find_strips_spaces_from_string_joins_and_eliminates_duplicate_string_joins
<del> scoped_authors = Author.send(:with_scope, :find => { :joins => ' INNER JOIN posts ON posts.author_id = authors.id '}) do
<del> Author.find(:all, :select => 'DISTINCT authors.*', :joins => ['INNER JOIN posts ON posts.author_id = authors.id'], :conditions => 'posts.id = 1')
<del> end
<del> assert scoped_authors.include?(authors(:david))
<del> assert !scoped_authors.include?(authors(:mary))
<del> assert_equal 1, scoped_authors.size
<del> assert_equal authors(:david).attributes, scoped_authors.first.attributes
<del> end
<del>
<del> def test_scoped_count_include
<del> # with the include, will retrieve only developers for the given project
<del> Developer.send(:with_scope, :find => { :include => :projects }) do
<del> assert_equal 1, Developer.count(:conditions => { 'projects.id' => 2 })
<del> end
<del> end
<del>
<del> def test_scope_for_create_only_uses_equal
<del> table = VerySpecialComment.arel_table
<del> relation = VerySpecialComment.scoped
<del> relation.where_values << table[:id].not_eq(1)
<del> assert_equal({:type => "VerySpecialComment"}, relation.send(:scope_for_create))
<del> end
<del>
<del> def test_scoped_create
<del> new_comment = nil
<del>
<del> VerySpecialComment.send(:with_scope, :create => { :post_id => 1 }) do
<del> assert_equal({:post_id => 1, :type => 'VerySpecialComment' }, VerySpecialComment.scoped.send(:scope_for_create))
<del> new_comment = VerySpecialComment.create :body => "Wonderful world"
<del> end
<del>
<del> assert Post.find(1).comments.include?(new_comment)
<del> end
<del>
<del> def test_scoped_create_with_join_and_merge
<del> Comment.where(:body => "but Who's Buying?").joins(:post).merge(Post.where(:body => 'Peace Sells...')).with_scope do
<del> assert_equal({:body => "but Who's Buying?"}, Comment.scoped.scope_for_create)
<del> end
<del> end
<del>
<del> def test_immutable_scope
<del> options = { :conditions => "name = 'David'" }
<del> Developer.send(:with_scope, :find => options) do
<del> assert_equal %w(David), Developer.all.map(&:name)
<del> options[:conditions] = "name != 'David'"
<del> assert_equal %w(David), Developer.all.map(&:name)
<del> end
<del>
<del> scope = { :find => { :conditions => "name = 'David'" }}
<del> Developer.send(:with_scope, scope) do
<del> assert_equal %w(David), Developer.all.map(&:name)
<del> scope[:find][:conditions] = "name != 'David'"
<del> assert_equal %w(David), Developer.all.map(&:name)
<del> end
<del> end
<del>
<del> def test_scoped_with_duck_typing
<del> scoping = Struct.new(:current_scope).new(:find => { :conditions => ["name = ?", 'David'] })
<del> Developer.send(:with_scope, scoping) do
<del> assert_equal %w(David), Developer.all.map(&:name)
<del> end
<del> end
<del>
<del> def test_ensure_that_method_scoping_is_correctly_restored
<del> begin
<del> Developer.send(:with_scope, :find => { :conditions => "name = 'Jamis'" }) do
<del> raise "an exception"
<del> end
<del> rescue
<del> end
<del>
<del> assert !Developer.scoped.where_values.include?("name = 'Jamis'")
<del> end
<del>end
<del>
<del>class NestedScopingTest < ActiveRecord::TestCase
<del> fixtures :authors, :developers, :projects, :comments, :posts
<del>
<del> def test_merge_options
<del> Developer.send(:with_scope, :find => { :conditions => 'salary = 80000' }) do
<del> Developer.send(:with_scope, :find => { :limit => 10 }) do
<del> devs = Developer.scoped
<del> assert_match '(salary = 80000)', devs.to_sql
<del> assert_equal 10, devs.taken
<del> end
<del> end
<del> end
<del>
<del> def test_merge_inner_scope_has_priority
<del> Developer.send(:with_scope, :find => { :limit => 5 }) do
<del> Developer.send(:with_scope, :find => { :limit => 10 }) do
<del> assert_equal 10, Developer.scoped.taken
<del> end
<del> end
<del> end
<del>
<del> def test_replace_options
<del> Developer.send(:with_scope, :find => { :conditions => {:name => 'David'} }) do
<del> Developer.send(:with_exclusive_scope, :find => { :conditions => {:name => 'Jamis'} }) do
<del> assert_equal 'Jamis', Developer.scoped.send(:scope_for_create)[:name]
<del> end
<del>
<del> assert_equal 'David', Developer.scoped.send(:scope_for_create)[:name]
<del> end
<del> end
<del>
<del> def test_with_exclusive_scope_with_relation
<del> assert_raise(ArgumentError) do
<del> Developer.all_johns
<del> end
<del> end
<del>
<del> def test_append_conditions
<del> Developer.send(:with_scope, :find => { :conditions => "name = 'David'" }) do
<del> Developer.send(:with_scope, :find => { :conditions => 'salary = 80000' }) do
<del> devs = Developer.scoped
<del> assert_match "(name = 'David') AND (salary = 80000)", devs.to_sql
<del> assert_equal(1, Developer.count)
<del> end
<del> Developer.send(:with_scope, :find => { :conditions => "name = 'Maiha'" }) do
<del> assert_equal(0, Developer.count)
<del> end
<del> end
<del> end
<del>
<del> def test_merge_and_append_options
<del> Developer.send(:with_scope, :find => { :conditions => 'salary = 80000', :limit => 10 }) do
<del> Developer.send(:with_scope, :find => { :conditions => "name = 'David'" }) do
<del> devs = Developer.scoped
<del> assert_match "(salary = 80000) AND (name = 'David')", devs.to_sql
<del> assert_equal 10, devs.taken
<del> end
<del> end
<del> end
<del>
<del> def test_nested_scoped_find
<del> Developer.send(:with_scope, :find => { :conditions => "name = 'Jamis'" }) do
<del> Developer.send(:with_exclusive_scope, :find => { :conditions => "name = 'David'" }) do
<del> assert_nothing_raised { Developer.find(1) }
<del> assert_equal('David', Developer.find(:first).name)
<del> end
<del> assert_equal('Jamis', Developer.find(:first).name)
<del> end
<del> end
<del>
<del> def test_nested_scoped_find_include
<del> Developer.send(:with_scope, :find => { :include => :projects }) do
<del> Developer.send(:with_scope, :find => { :conditions => { 'projects.id' => 2 } }) do
<del> assert_nothing_raised { Developer.find(1) }
<del> assert_equal('David', Developer.find(:first).name)
<del> end
<del> end
<del> end
<del>
<del> def test_nested_scoped_find_merged_include
<del> # :include's remain unique and don't "double up" when merging
<del> Developer.send(:with_scope, :find => { :include => :projects, :conditions => { 'projects.id' => 2 } }) do
<del> Developer.send(:with_scope, :find => { :include => :projects }) do
<del> assert_equal 1, Developer.scoped.includes_values.uniq.length
<del> assert_equal 'David', Developer.find(:first).name
<del> end
<del> end
<del>
<del> # the nested scope doesn't remove the first :include
<del> Developer.send(:with_scope, :find => { :include => :projects, :conditions => { 'projects.id' => 2 } }) do
<del> Developer.send(:with_scope, :find => { :include => [] }) do
<del> assert_equal 1, Developer.scoped.includes_values.uniq.length
<del> assert_equal('David', Developer.find(:first).name)
<del> end
<del> end
<del>
<del> # mixing array and symbol include's will merge correctly
<del> Developer.send(:with_scope, :find => { :include => [:projects], :conditions => { 'projects.id' => 2 } }) do
<del> Developer.send(:with_scope, :find => { :include => :projects }) do
<del> assert_equal 1, Developer.scoped.includes_values.uniq.length
<del> assert_equal('David', Developer.find(:first).name)
<del> end
<del> end
<del> end
<del>
<del> def test_nested_scoped_find_replace_include
<del> Developer.send(:with_scope, :find => { :include => :projects }) do
<del> Developer.send(:with_exclusive_scope, :find => { :include => [] }) do
<del> assert_equal 0, Developer.scoped.includes_values.length
<del> end
<del> end
<del> end
<del>
<del> def test_three_level_nested_exclusive_scoped_find
<del> Developer.send(:with_scope, :find => { :conditions => "name = 'Jamis'" }) do
<del> assert_equal('Jamis', Developer.find(:first).name)
<del>
<del> Developer.send(:with_exclusive_scope, :find => { :conditions => "name = 'David'" }) do
<del> assert_equal('David', Developer.find(:first).name)
<del>
<del> Developer.send(:with_exclusive_scope, :find => { :conditions => "name = 'Maiha'" }) do
<del> assert_equal(nil, Developer.find(:first))
<del> end
<del>
<del> # ensure that scoping is restored
<del> assert_equal('David', Developer.find(:first).name)
<del> end
<del>
<del> # ensure that scoping is restored
<del> assert_equal('Jamis', Developer.find(:first).name)
<del> end
<del> end
<del>
<del> def test_merged_scoped_find
<del> poor_jamis = developers(:poor_jamis)
<del> Developer.send(:with_scope, :find => { :conditions => "salary < 100000" }) do
<del> Developer.send(:with_scope, :find => { :offset => 1, :order => 'id asc' }) do
<del> # Oracle adapter does not generated space after asc therefore trailing space removed from regex
<del> assert_sql(/ORDER BY\s+id asc/) do
<del> assert_equal(poor_jamis, Developer.find(:first, :order => 'id asc'))
<del> end
<del> end
<del> end
<del> end
<del>
<del> def test_merged_scoped_find_sanitizes_conditions
<del> Developer.send(:with_scope, :find => { :conditions => ["name = ?", 'David'] }) do
<del> Developer.send(:with_scope, :find => { :conditions => ['salary = ?', 9000] }) do
<del> assert_raise(ActiveRecord::RecordNotFound) { developers(:poor_jamis) }
<del> end
<del> end
<del> end
<del>
<del> def test_nested_scoped_find_combines_and_sanitizes_conditions
<del> Developer.send(:with_scope, :find => { :conditions => ["name = ?", 'David'] }) do
<del> Developer.send(:with_exclusive_scope, :find => { :conditions => ['salary = ?', 9000] }) do
<del> assert_equal developers(:poor_jamis), Developer.find(:first)
<del> assert_equal developers(:poor_jamis), Developer.find(:first, :conditions => ['name = ?', 'Jamis'])
<del> end
<del> end
<del> end
<del>
<del> def test_merged_scoped_find_combines_and_sanitizes_conditions
<del> Developer.send(:with_scope, :find => { :conditions => ["name = ?", 'David'] }) do
<del> Developer.send(:with_scope, :find => { :conditions => ['salary > ?', 9000] }) do
<del> assert_equal %w(David), Developer.all.map(&:name)
<del> end
<del> end
<del> end
<del>
<del> def test_nested_scoped_create
<del> comment = nil
<del> Comment.send(:with_scope, :create => { :post_id => 1}) do
<del> Comment.send(:with_scope, :create => { :post_id => 2}) do
<del> assert_equal({:post_id => 2}, Comment.scoped.send(:scope_for_create))
<del> comment = Comment.create :body => "Hey guys, nested scopes are broken. Please fix!"
<del> end
<del> end
<del> assert_equal 2, comment.post_id
<del> end
<del>
<del> def test_nested_exclusive_scope_for_create
<del> comment = nil
<del>
<del> Comment.send(:with_scope, :create => { :body => "Hey guys, nested scopes are broken. Please fix!" }) do
<del> Comment.send(:with_exclusive_scope, :create => { :post_id => 1 }) do
<del> assert_equal({:post_id => 1}, Comment.scoped.send(:scope_for_create))
<del> assert_blank Comment.new.body
<del> comment = Comment.create :body => "Hey guys"
<del> end
<del> end
<del> assert_equal 1, comment.post_id
<del> assert_equal 'Hey guys', comment.body
<del> end
<del>
<del> def test_merged_scoped_find_on_blank_conditions
<del> [nil, " ", [], {}].each do |blank|
<del> Developer.send(:with_scope, :find => {:conditions => blank}) do
<del> Developer.send(:with_scope, :find => {:conditions => blank}) do
<del> assert_nothing_raised { Developer.find(:first) }
<del> end
<del> end
<del> end
<del> end
<del>
<del> def test_merged_scoped_find_on_blank_bind_conditions
<del> [ [""], ["",{}] ].each do |blank|
<del> Developer.send(:with_scope, :find => {:conditions => blank}) do
<del> Developer.send(:with_scope, :find => {:conditions => blank}) do
<del> assert_nothing_raised { Developer.find(:first) }
<del> end
<del> end
<del> end
<del> end
<del>
<del> def test_immutable_nested_scope
<del> options1 = { :conditions => "name = 'Jamis'" }
<del> options2 = { :conditions => "name = 'David'" }
<del> Developer.send(:with_scope, :find => options1) do
<del> Developer.send(:with_exclusive_scope, :find => options2) do
<del> assert_equal %w(David), Developer.all.map(&:name)
<del> options1[:conditions] = options2[:conditions] = nil
<del> assert_equal %w(David), Developer.all.map(&:name)
<del> end
<del> end
<del> end
<del>
<del> def test_immutable_merged_scope
<del> options1 = { :conditions => "name = 'Jamis'" }
<del> options2 = { :conditions => "salary > 10000" }
<del> Developer.send(:with_scope, :find => options1) do
<del> Developer.send(:with_scope, :find => options2) do
<del> assert_equal %w(Jamis), Developer.all.map(&:name)
<del> options1[:conditions] = options2[:conditions] = nil
<del> assert_equal %w(Jamis), Developer.all.map(&:name)
<del> end
<del> end
<del> end
<del>
<del> def test_ensure_that_method_scoping_is_correctly_restored
<del> Developer.send(:with_scope, :find => { :conditions => "name = 'David'" }) do
<del> begin
<del> Developer.send(:with_scope, :find => { :conditions => "name = 'Maiha'" }) do
<del> raise "an exception"
<del> end
<del> rescue
<del> end
<del>
<del> assert Developer.scoped.where_values.include?("name = 'David'")
<del> assert !Developer.scoped.where_values.include?("name = 'Maiha'")
<del> end
<del> end
<del>
<del> def test_nested_scoped_find_merges_old_style_joins
<del> scoped_authors = Author.send(:with_scope, :find => { :joins => 'INNER JOIN posts ON authors.id = posts.author_id' }) do
<del> Author.send(:with_scope, :find => { :joins => 'INNER JOIN comments ON posts.id = comments.post_id' }) do
<del> Author.find(:all, :select => 'DISTINCT authors.*', :conditions => 'comments.id = 1')
<del> end
<del> end
<del> assert scoped_authors.include?(authors(:david))
<del> assert !scoped_authors.include?(authors(:mary))
<del> assert_equal 1, scoped_authors.size
<del> assert_equal authors(:david).attributes, scoped_authors.first.attributes
<del> end
<del>
<del> def test_nested_scoped_find_merges_new_style_joins
<del> scoped_authors = Author.send(:with_scope, :find => { :joins => :posts }) do
<del> Author.send(:with_scope, :find => { :joins => :comments }) do
<del> Author.find(:all, :select => 'DISTINCT authors.*', :conditions => 'comments.id = 1')
<del> end
<del> end
<del> assert scoped_authors.include?(authors(:david))
<del> assert !scoped_authors.include?(authors(:mary))
<del> assert_equal 1, scoped_authors.size
<del> assert_equal authors(:david).attributes, scoped_authors.first.attributes
<del> end
<del>
<del> def test_nested_scoped_find_merges_new_and_old_style_joins
<del> scoped_authors = Author.send(:with_scope, :find => { :joins => :posts }) do
<del> Author.send(:with_scope, :find => { :joins => 'INNER JOIN comments ON posts.id = comments.post_id' }) do
<del> Author.find(:all, :select => 'DISTINCT authors.*', :joins => '', :conditions => 'comments.id = 1')
<del> end
<del> end
<del> assert scoped_authors.include?(authors(:david))
<del> assert !scoped_authors.include?(authors(:mary))
<del> assert_equal 1, scoped_authors.size
<del> assert_equal authors(:david).attributes, scoped_authors.first.attributes
<del> end
<del>end
<ide><path>activerecord/test/cases/readonly_test.rb
<ide> def test_has_many_with_through_is_not_implicitly_marked_readonly_while_finding_l
<ide> end
<ide>
<ide> def test_readonly_scoping
<del> Post.send(:with_scope, :find => { :conditions => '1=1' }) do
<add> Post.where('1=1').scoped do
<ide> assert !Post.find(1).readonly?
<ide> assert Post.readonly(true).find(1).readonly?
<ide> assert !Post.readonly(false).find(1).readonly?
<ide> end
<ide>
<del> Post.send(:with_scope, :find => { :joins => ' ' }) do
<add> Post.joins(' ').scoped do
<ide> assert !Post.find(1).readonly?
<ide> assert Post.readonly.find(1).readonly?
<ide> assert !Post.readonly(false).find(1).readonly?
<ide> def test_readonly_scoping
<ide> # Oracle barfs on this because the join includes unqualified and
<ide> # conflicting column names
<ide> unless current_adapter?(:OracleAdapter)
<del> Post.send(:with_scope, :find => { :joins => ', developers' }) do
<add> Post.joins(', developers').scoped do
<ide> assert Post.find(1).readonly?
<ide> assert Post.readonly.find(1).readonly?
<ide> assert !Post.readonly(false).find(1).readonly?
<ide> end
<ide> end
<ide>
<del> Post.send(:with_scope, :find => { :readonly => true }) do
<add> Post.readonly(true).scoped do
<ide> assert Post.find(1).readonly?
<ide> assert Post.readonly.find(1).readonly?
<ide> assert !Post.readonly(false).find(1).readonly?
<ide><path>activerecord/test/cases/relation_scoping_test.rb
<ide> def test_default_scope_with_multiple_calls
<ide> assert_equal 50000, wheres[:salary]
<ide> end
<ide>
<del> def test_method_scope
<del> expected = Developer.find(:all, :order => 'salary DESC, name DESC').collect { |dev| dev.salary }
<del> received = DeveloperOrderedBySalary.all_ordered_by_name.collect { |dev| dev.salary }
<del> assert_equal expected, received
<del> end
<del>
<del> def test_nested_scope
<del> expected = Developer.find(:all, :order => 'salary DESC, name DESC').collect { |dev| dev.salary }
<del> received = DeveloperOrderedBySalary.send(:with_scope, :find => { :order => 'name DESC'}) do
<del> DeveloperOrderedBySalary.find(:all).collect { |dev| dev.salary }
<del> end
<del> assert_equal expected, received
<del> end
<del>
<ide> def test_scope_overwrites_default
<ide> expected = Developer.find(:all, :order => 'salary DESC, name DESC').collect { |dev| dev.name }
<ide> received = DeveloperOrderedBySalary.by_name.find(:all).collect { |dev| dev.name }
<ide> def test_order_after_reorder_combines_orders
<ide> assert_equal expected, received
<ide> end
<ide>
<del> def test_nested_exclusive_scope
<del> expected = Developer.find(:all, :limit => 100).collect { |dev| dev.salary }
<del> received = DeveloperOrderedBySalary.send(:with_exclusive_scope, :find => { :limit => 100 }) do
<del> DeveloperOrderedBySalary.find(:all).collect { |dev| dev.salary }
<del> end
<del> assert_equal expected, received
<del> end
<del>
<ide> def test_order_in_default_scope_should_prevail
<ide> expected = Developer.find(:all, :order => 'salary desc').collect { |dev| dev.salary }
<ide> received = DeveloperOrderedBySalary.find(:all, :order => 'salary').collect { |dev| dev.salary }
<ide><path>activerecord/test/cases/relations_test.rb
<ide> def test_respond_to_dynamic_finders
<ide> end
<ide>
<ide> def test_respond_to_class_methods_and_scopes
<del> assert DeveloperOrderedBySalary.scoped.respond_to?(:all_ordered_by_name)
<ide> assert Topic.scoped.respond_to?(:by_lifo)
<ide> end
<ide>
<ide><path>activerecord/test/cases/validations/uniqueness_validation_test.rb
<ide> def test_validate_uniqueness_with_non_standard_table_names
<ide> assert i1.errors[:value].any?, "Should not be empty"
<ide> end
<ide>
<del> def test_validates_uniqueness_inside_with_scope
<add> def test_validates_uniqueness_inside_scoping
<ide> Topic.validates_uniqueness_of(:title)
<ide>
<del> Topic.send(:with_scope, :find => { :conditions => { :author_name => "David" } }) do
<add> Topic.where(:author_name => "David").scoping do
<ide> t1 = Topic.new("title" => "I'm unique!", "author_name" => "Mary")
<ide> assert t1.save
<ide> t2 = Topic.new("title" => "I'm unique!", "author_name" => "David")
<ide><path>activerecord/test/cases/validations_test.rb
<ide> def test_exception_on_create_bang_many_with_block
<ide> end
<ide> end
<ide>
<del> def test_scoped_create_without_attributes
<del> WrongReply.send(:with_scope, :create => {}) do
<del> assert_raise(ActiveRecord::RecordInvalid) { WrongReply.create! }
<del> end
<del> end
<del>
<del> def test_create_with_exceptions_using_scope_for_protected_attributes
<del> assert_nothing_raised do
<del> ProtectedPerson.send(:with_scope, :create => { :first_name => "Mary" } ) do
<del> person = ProtectedPerson.create! :addon => "Addon"
<del> assert_equal person.first_name, "Mary", "scope should ignore attr_protected"
<del> end
<del> end
<del> end
<del>
<del> def test_create_with_exceptions_using_scope_and_empty_attributes
<del> assert_nothing_raised do
<del> ProtectedPerson.send(:with_scope, :create => { :first_name => "Mary" } ) do
<del> person = ProtectedPerson.create!
<del> assert_equal person.first_name, "Mary", "should be ok when no attributes are passed to create!"
<del> end
<del> end
<del> end
<del>
<ide> def test_create_without_validation
<ide> reply = WrongReply.new
<ide> assert !reply.save
<ide><path>activerecord/test/models/developer.rb
<ide> class DeveloperOrderedBySalary < ActiveRecord::Base
<ide> default_scope { order('salary DESC') }
<ide>
<ide> scope :by_name, -> { order('name DESC') }
<del>
<del> def self.all_ordered_by_name
<del> with_scope(:find => { :order => 'name DESC' }) do
<del> find(:all)
<del> end
<del> end
<ide> end
<ide>
<ide> class DeveloperCalledDavid < ActiveRecord::Base | 13 |
Javascript | Javascript | use template instead of render buffer | f11e02afce4056734ec7ebb994a8860200cb384c | <ide><path>packages/ember-views/tests/system/ext_test.js
<ide> import run from "ember-metal/run_loop";
<ide> import View from "ember-views/views/view";
<add>import { compile } from "ember-template-compiler";
<ide>
<ide> QUnit.module("Ember.View additions to run queue");
<ide>
<del>QUnit.skip("View hierarchy is done rendering to DOM when functions queued in afterRender execute", function() {
<add>QUnit.test("View hierarchy is done rendering to DOM when functions queued in afterRender execute", function() {
<ide> var didInsert = 0;
<ide> var childView = View.create({
<ide> elementId: 'child_view',
<ide> QUnit.skip("View hierarchy is done rendering to DOM when functions queued in aft
<ide> });
<ide> var parentView = View.create({
<ide> elementId: 'parent_view',
<del> render(buffer) {
<del> this.appendChild(childView);
<del> },
<add> template: compile("{{view view.childView}}"),
<add> childView: childView,
<ide> didInsertElement() {
<ide> didInsert++;
<ide> } | 1 |
Python | Python | update instance list | 752688eb432106fe019868a75e842af8e820505e
<ide> 't2.micro',
<ide> 't2.small',
<ide> 't2.medium',
<del> 't2.large'
<add> 't2.large',
<add> 'x1.32xlarge'
<ide> ]
<ide>
<ide> # Maps EC2 region name to region name used in the pricing file | 1 |
Go | Go | implement cleanup unix sockets after serving | 0c0e9836c4bed0190718b0ddd0028790b19c200c | <ide><path>api/server/server.go
<ide> func ServeApi(job *engine.Job) engine.Status {
<ide> chErrors <- err
<ide> return
<ide> }
<add> job.Eng.OnShutdown(func() {
<add> if err := srv.Close(); err != nil {
<add> log.Errorf("%s", err.Error())
<add> }
<add> })
<ide> chErrors <- srv.Serve()
<ide> }()
<ide> }
<ide><path>api/server/server_linux.go
<ide> import (
<ide>
<ide> "github.com/docker/docker/engine"
<ide> "github.com/docker/docker/pkg/systemd"
<add> "net"
<ide> )
<ide>
<add>type UnixHttpServer struct {
<add> srv *http.Server
<add> l net.Listener
<add>}
<add>
<add>func (s *UnixHttpServer) Serve() error {
<add> return s.srv.Serve(s.l)
<add>}
<add>func (s *UnixHttpServer) Close() error {
<add> if err := s.l.Close(); err != nil {
<add> return err
<add> }
<add> if _, err := os.Stat(s.srv.Addr); err != nil {
<add> return fmt.Errorf("Error removing unix socket %s: %s", s.srv.Addr, err.Error())
<add> }
<add> if err := os.Remove(s.srv.Addr); err != nil {
<add> return fmt.Errorf("Error removing unix socket %s: %s", s.srv.Addr, err.Error())
<add> }
<add> return nil
<add>}
<add>
<ide> // NewServer sets up the required Server and does protocol specific checking.
<ide> func NewServer(proto, addr string, job *engine.Job) (Server, error) {
<ide> // Basic error and sanity checking
<ide> func NewServer(proto, addr string, job *engine.Job) (Server, error) {
<ide> }
<ide> }
<ide>
<del>func setupUnixHttp(addr string, job *engine.Job) (*HttpServer, error) {
<add>func setupUnixHttp(addr string, job *engine.Job) (*UnixHttpServer, error) {
<ide> r := createRouter(job.Eng, job.GetenvBool("Logging"), job.GetenvBool("EnableCors"), job.Getenv("CorsHeaders"), job.Getenv("Version"))
<ide>
<ide> if err := syscall.Unlink(addr); err != nil && !os.IsNotExist(err) {
<ide> func setupUnixHttp(addr string, job *engine.Job) (*HttpServer, error) {
<ide> return nil, err
<ide> }
<ide>
<del> return &HttpServer{&http.Server{Addr: addr, Handler: r}, l}, nil
<add> return &UnixHttpServer{&http.Server{Addr: addr, Handler: r}, l}, nil
<ide> }
<ide>
<ide> // serveFd creates an http.Server and sets it up to serve given a socket activated | 2 |
Java | Java | fix checkstyle errors | 2f732a8dea7c056a24afbafd59747118d56f5106 | <ide><path>spring-web/src/main/java/org/springframework/http/client/reactive/ReactorClientHttpConnector.java
<ide> import reactor.netty.http.client.HttpClientResponse;
<ide> import reactor.netty.resources.ConnectionProvider;
<ide> import reactor.netty.resources.LoopResources;
<del>import reactor.netty.tcp.TcpClient;
<ide>
<ide> import org.springframework.http.HttpMethod;
<ide> import org.springframework.util.Assert;
<ide> public class ReactorClientHttpConnector implements ClientHttpConnector {
<ide>
<ide>
<ide> /**
<del> * Default constructor that initializes an {@link HttpClient} with:
<add> * Default constructor. Initializes {@link HttpClient} via:
<ide> * <pre class="code">
<ide> * HttpClient.create().compress()
<ide> * </pre> | 1 |
Javascript | Javascript | avoid unneeded .textcontent | 47f2c1472303f7f439bc7b56daaf5c8e1066aea8 | <ide><path>web/text_layer_builder.js
<ide> var TextLayerBuilder = (function TextLayerBuilderClosure() {
<ide> // We don't bother scaling single-char text divs, because it has very
<ide> // little effect on text highlighting. This makes scrolling on docs with
<ide> // lots of such divs a lot faster.
<del> if (textDiv.textContent.length > 1) {
<add> if (geom.str.length > 1) {
<ide> if (style.vertical) {
<ide> textDiv.dataset.canvasWidth = geom.height * this.viewport.scale;
<ide> } else { | 1 |
PHP | PHP | rename some classes for consistency | 8d80cc42fa368364a40d201cd2aea3b6b9147a12 | <ide><path>src/Illuminate/Auth/Console/MakeRemindersCommand.php
<ide> use Illuminate\Console\Command;
<ide> use Illuminate\Filesystem\Filesystem;
<ide>
<del>class MakeRemindersCommand extends Command {
<add>class RemindersTableCommand extends Command {
<ide>
<ide> /**
<ide> * The console command name.
<ide><path>src/Illuminate/Auth/Reminders/ReminderServiceProvider.php
<ide> <?php namespace Illuminate\Auth\Reminders;
<ide>
<ide> use Illuminate\Support\ServiceProvider;
<del>use Illuminate\Auth\Console\MakeRemindersCommand;
<add>use Illuminate\Auth\Console\RemindersTableCommand;
<ide> use Illuminate\Auth\Console\ClearRemindersCommand;
<ide> use Illuminate\Auth\Console\RemindersControllerCommand;
<ide> use Illuminate\Auth\Reminders\DatabaseReminderRepository as DbRepository;
<ide> protected function registerCommands()
<ide> {
<ide> $this->app->bindShared('command.auth.reminders', function($app)
<ide> {
<del> return new MakeRemindersCommand($app['files']);
<add> return new RemindersTableCommand($app['files']);
<ide> });
<ide>
<ide> $this->app->bindShared('command.auth.reminders.clear', function($app)
<add><path>src/Illuminate/Database/Console/Migrations/MigrateMakeCommand.php
<del><path>src/Illuminate/Database/Console/Migrations/MakeCommand.php
<ide> use Symfony\Component\Console\Input\InputArgument;
<ide> use Illuminate\Database\Migrations\MigrationCreator;
<ide>
<del>class MakeCommand extends BaseCommand {
<add>class MigrateMakeCommand extends BaseCommand {
<ide>
<ide> /**
<ide> * The console command name.
<ide> public function fire()
<ide> $table = $this->input->getOption('table');
<ide>
<ide> $create = $this->input->getOption('create');
<del>
<add>
<ide> if ( ! $table && is_string($create))
<ide> {
<ide> $table = $create;
<ide><path>src/Illuminate/Database/MigrationServiceProvider.php
<ide> use Illuminate\Support\ServiceProvider;
<ide> use Illuminate\Database\Migrations\Migrator;
<ide> use Illuminate\Database\Migrations\MigrationCreator;
<del>use Illuminate\Database\Console\Migrations\MakeCommand;
<ide> use Illuminate\Database\Console\Migrations\ResetCommand;
<ide> use Illuminate\Database\Console\Migrations\RefreshCommand;
<ide> use Illuminate\Database\Console\Migrations\InstallCommand;
<ide> use Illuminate\Database\Console\Migrations\MigrateCommand;
<ide> use Illuminate\Database\Console\Migrations\RollbackCommand;
<add>use Illuminate\Database\Console\Migrations\MigrateMakeCommand;
<ide> use Illuminate\Database\Migrations\DatabaseMigrationRepository;
<ide>
<ide> class MigrationServiceProvider extends ServiceProvider {
<ide> protected function registerMakeCommand()
<ide>
<ide> $packagePath = $app['path.base'].'/vendor';
<ide>
<del> return new MakeCommand($creator, $packagePath);
<add> return new MigrateMakeCommand($creator, $packagePath);
<ide> });
<ide> }
<ide>
<ide><path>src/Illuminate/Session/CommandsServiceProvider.php
<ide> public function register()
<ide> {
<ide> $this->app->bindShared('command.session.database', function($app)
<ide> {
<del> return new Console\MakeTableCommand($app['files']);
<add> return new Console\SessionTableCommand($app['files']);
<ide> });
<ide>
<ide> $this->commands('command.session.database');
<add><path>src/Illuminate/Session/Console/SessionTableCommand.php
<del><path>src/Illuminate/Session/Console/MakeTableCommand.php
<ide> use Illuminate\Console\Command;
<ide> use Illuminate\Filesystem\Filesystem;
<ide>
<del>class MakeTableCommand extends Command {
<add>class SessionTableCommand extends Command {
<ide>
<ide> /**
<ide> * The console command name.
<ide><path>tests/Database/DatabaseMigrationMakeCommandTest.php
<ide> <?php
<ide>
<ide> use Mockery as m;
<del>use Illuminate\Database\Console\Migrations\MakeCommand;
<add>use Illuminate\Database\Console\Migrations\MigrateMakeCommand;
<ide>
<ide> class DatabaseMigrationMakeCommandTest extends PHPUnit_Framework_TestCase {
<ide>
<ide> protected function runCommand($command, $input = array())
<ide>
<ide>
<ide>
<del>class DatabaseMigrationMakeCommandTestStub extends MakeCommand
<add>class DatabaseMigrationMakeCommandTestStub extends MigrateMakeCommand
<ide> {
<ide> public function call($command, array $arguments = array())
<ide> { | 7 |
Ruby | Ruby | simplify ruby_version checking in tests | 5aae2b27832abb0f2238300d055fbb5fe461e8f7 | <ide><path>activerecord/test/cases/adapters/postgresql/range_test.rb
<ide> def test_infinity_values
<ide> assert_equal(-Float::INFINITY...Float::INFINITY, record.float_range)
<ide> end
<ide>
<del> if Gem::Version.new(RUBY_VERSION) >= Gem::Version.new("2.6.0")
<add> if RUBY_VERSION >= "2.6"
<ide> def test_endless_range_values
<ide> record = PostgresqlRange.create!(
<ide> int4_range: eval("1.."),
<ide><path>activerecord/test/cases/arel/attributes/attribute_test.rb
<ide> class AttributeTest < Arel::Spec
<ide> )
<ide> end
<ide>
<del> if Gem::Version.new("2.7.0") <= Gem::Version.new(RUBY_VERSION)
<add> if RUBY_VERSION >= "2.7"
<ide> it "can be constructed with a range implicitly starting at Infinity" do
<ide> attribute = Attribute.new nil, nil
<ide> node = attribute.between(eval("..0")) # eval for backwards compatibility
<ide> class AttributeTest < Arel::Spec
<ide> end
<ide> end
<ide>
<del> if Gem::Version.new("2.6.0") <= Gem::Version.new(RUBY_VERSION)
<add> if RUBY_VERSION >= "2.6"
<ide> it "can be constructed with a range implicitly ending at Infinity" do
<ide> attribute = Attribute.new nil, nil
<ide> node = attribute.between(eval("0..")) # Use eval for compatibility with Ruby < 2.6 parser
<ide> class AttributeTest < Arel::Spec
<ide> )
<ide> end
<ide>
<del> if Gem::Version.new("2.7.0") <= Gem::Version.new(RUBY_VERSION)
<add> if RUBY_VERSION >= "2.7"
<ide> it "can be constructed with a range implicitly starting at Infinity" do
<ide> attribute = Attribute.new nil, nil
<ide> node = attribute.not_between(eval("..0")) # eval for backwards compatibility
<ide> class AttributeTest < Arel::Spec
<ide> end
<ide> end
<ide>
<del> if Gem::Version.new("2.6.0") <= Gem::Version.new(RUBY_VERSION)
<add> if RUBY_VERSION >= "2.6"
<ide> it "can be constructed with a range implicitly ending at Infinity" do
<ide> attribute = Attribute.new nil, nil
<ide> node = attribute.not_between(eval("0..")) # Use eval for compatibility with Ruby < 2.6 parser
<ide><path>activesupport/test/core_ext/range_ext_test.rb
<ide> def test_should_include_other_with_exclusive_end
<ide> assert((1..10).include?(1...11))
<ide> end
<ide>
<del> if Gem::Version.new(RUBY_VERSION) >= Gem::Version.new("2.6.0")
<add> if RUBY_VERSION >= "2.6"
<ide> def test_include_with_endless_range
<ide> assert(eval("1..").include?(2))
<ide> end
<ide> def test_should_not_include_range_with_endless_range
<ide> end
<ide> end
<ide>
<del> if Gem::Version.new(RUBY_VERSION) >= Gem::Version.new("2.7.0")
<add> if RUBY_VERSION >= "2.7"
<ide> def test_include_with_beginless_range
<ide> assert(eval("..2").include?(1))
<ide> end
<ide> def test_should_compare_other_with_exclusive_end
<ide> assert((1..10) === (1...11))
<ide> end
<ide>
<del> if Gem::Version.new(RUBY_VERSION) >= Gem::Version.new("2.6.0")
<add> if RUBY_VERSION >= "2.6"
<ide> def test_should_compare_range_with_endless_range
<ide> assert(eval("1..") === (2..4))
<ide> end
<ide> def test_should_not_compare_range_with_endless_range
<ide> end
<ide> end
<ide>
<del> if Gem::Version.new(RUBY_VERSION) >= Gem::Version.new("2.7.0")
<add> if RUBY_VERSION >= "2.7"
<ide> def test_should_compare_range_with_beginless_range
<ide> assert(eval("..2") === (-1..1))
<ide> end
<ide> def test_should_cover_other_with_exclusive_end
<ide> assert((1..10).cover?(1...11))
<ide> end
<ide>
<del> if Gem::Version.new(RUBY_VERSION) >= Gem::Version.new("2.6.0")
<add> if RUBY_VERSION >= "2.6"
<ide> def test_should_cover_range_with_endless_range
<ide> assert(eval("1..").cover?(2..4))
<ide> end
<ide> def test_should_not_cover_range_with_endless_range
<ide> end
<ide> end
<ide>
<del> if Gem::Version.new(RUBY_VERSION) >= Gem::Version.new("2.7.0")
<add> if RUBY_VERSION >= "2.7"
<ide> def test_should_cover_range_with_beginless_range
<ide> assert(eval("..2").cover?(-1..1))
<ide> end | 3 |
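The simplification in the commit above swaps `Gem::Version` objects for plain string comparison on `RUBY_VERSION`. A standalone sketch (plain Ruby, not part of the Rails test suite) shows why that is safe for the versions these guards check, and where the shortcut would break down:

```ruby
# Compare the two styles used before/after the commit above.
requirement = "2.6"
versions = ["2.5.8", "2.6.0", "2.7.2"]

string_results = versions.map { |v| v >= requirement }
gem_results = versions.map { |v| Gem::Version.new(v) >= Gem::Version.new(requirement) }

# For the 2.5/2.6/2.7 series the tests care about, both forms agree.
raise "mismatch" unless string_results == gem_results
raise "unexpected" unless string_results == [false, true, true]

# Caveat: string comparison is lexicographic, so a two-digit minor
# version would compare wrongly -- the shortcut only holds while the
# versions being compared stay single-digit per segment.
raise unless ("2.10" >= "2.6") == false
raise unless Gem::Version.new("2.10") >= Gem::Version.new("2.6")
```

In other words, the change trades generality for brevity, which is fine in test guards that only ever compare against "2.6" and "2.7".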
Javascript | Javascript | add paypal donation to modals and certificates | 585527d145b0c27c2f99809bd0ae9e063cc6439c | <ide><path>api-server/server/utils/donation.js
<ide> export function createDonationObj(body) {
<ide> const {
<ide> resource: {
<ide> id,
<del> start_time,
<add> status_update_time,
<ide> subscriber: { email_address } = {
<ide> email_address: null
<ide> }
<ide> export function createDonationObj(body) {
<ide> provider: 'paypal',
<ide> subscriptionId: id,
<ide> customerId: email_address,
<del> startDate: new Date(start_time).toISOString()
<add> startDate: new Date(status_update_time).toISOString()
<ide> };
<ide> return donation;
<ide> }
<ide><path>client/src/client-only-routes/ShowCertification.js
<ide> class ShowCertification extends Component {
<ide> <Grid className='donation-section'>
<ide> {!isDonationSubmitted && (
<ide> <Row>
<del> <Col sm={10} smOffset={1} xs={12}>
<add> <Col lg={8} lgOffset={2} sm={10} smOffset={1} xs={12}>
<ide> <p>
<ide> Only you can see this message. Congratulations on earning this
<ide> certification. It’s no easy task. Running freeCodeCamp isn’t
<ide><path>client/src/components/Donation/DonateCompletion.js
<ide> function DonateCompletion({ processing, reset, success, error = null }) {
<ide> Your donations will support free technology education for people
<ide> all over the world.
<ide> </p>
<del> <p>
<del> You can update your supporter status at any time from your
<del> settings page.
<del> </p>
<ide> </div>
<ide> )}
<ide> {error && <p>{error}</p>}
<ide><path>client/src/components/Donation/DonateForm.js
<ide> class DonateForm extends Component {
<ide> this.getDonationButtonLabel = this.getDonationButtonLabel.bind(this);
<ide> this.handleSelectAmount = this.handleSelectAmount.bind(this);
<ide> this.handleSelectDuration = this.handleSelectDuration.bind(this);
<del> this.handleSelectPaymentType = this.handleSelectPaymentType.bind(this);
<ide> this.hideAmountOptionsCB = this.hideAmountOptionsCB.bind(this);
<ide> }
<ide>
<ide> class DonateForm extends Component {
<ide> this.setState({ donationAmount });
<ide> }
<ide>
<del> handleSelectPaymentType(e) {
<del> this.setState({
<del> paymentType: e.currentTarget.value
<del> });
<del> }
<del>
<ide> renderAmountButtons(duration) {
<ide> return this.amounts[duration].map(amount => (
<ide> <ToggleButton
<ide> class DonateForm extends Component {
<ide>
<ide> renderDonationOptions() {
<ide> const { stripe, handleProcessing } = this.props;
<del> const { donationAmount, donationDuration, paymentType } = this.state;
<add> const { donationAmount, donationDuration } = this.state;
<ide> return (
<ide> <div>
<del> {paymentType === 'Card' ? (
<del> <StripeProvider stripe={stripe}>
<del> <Elements>
<del> <DonateFormChildViewForHOC
<del> defaultTheme='default'
<del> donationAmount={donationAmount}
<del> donationDuration={donationDuration}
<del> getDonationButtonLabel={this.getDonationButtonLabel}
<del> handleProcessing={handleProcessing}
<del> hideAmountOptionsCB={this.hideAmountOptionsCB}
<del> />
<del> </Elements>
<del> </StripeProvider>
<del> ) : (
<del> <p>
<del> PayPal is currently unavailable. Please use a Credit/Debit card
<del> instead.
<del> </p>
<del> )}
<add> <StripeProvider stripe={stripe}>
<add> <Elements>
<add> <DonateFormChildViewForHOC
<add> defaultTheme='default'
<add> donationAmount={donationAmount}
<add> donationDuration={donationDuration}
<add> getDonationButtonLabel={this.getDonationButtonLabel}
<add> handleProcessing={handleProcessing}
<add> hideAmountOptionsCB={this.hideAmountOptionsCB}
<add> />
<add> </Elements>
<add> </StripeProvider>
<add> <Spacer size={2} />
<ide> </div>
<ide> );
<ide> }
<ide><path>client/src/components/Donation/DonationModal.js
<ide> function DonateModal({
<ide> });
<ide> }
<ide>
<del> const donationText = (
<del> <b>
<del> Become a $5 / month supporter and help us create even more learning
<del> resources for you and your family.
<del> </b>
<del> );
<add> const donationText = <b>Become an annual supporter of our nonprofit.</b>;
<ide> const blockDonationText = (
<ide> <div className='block-modal-text'>
<ide> <div className='donation-icon-container'>
<ide> function DonateModal({
<ide> {!closeLabel && (
<ide> <Col sm={10} smOffset={1} xs={12}>
<ide> <b>Nicely done. You just completed {blockNameify(block)}. </b>
<add> <br />
<ide> {donationText}
<ide> </Col>
<ide> )}
<ide><path>client/src/components/Donation/MinimalDonateForm.js
<ide> class MinimalDonateForm extends Component {
<ide> return (
<ide> <Row>
<ide> <Col lg={8} lgOffset={2} sm={10} smOffset={1} xs={12}>
<add> <Spacer />
<add> <b>Confirm your donation of $60 / year with PayPal:</b>
<add> <Spacer />
<ide> <PaypalButton
<ide> handleProcessing={handleProcessing}
<ide> onDonationStateChange={this.onDonationStateChange}
<ide> />
<ide> </Col>
<del> <Col sm={10} smOffset={1} xs={12}>
<add> <Col lg={8} lgOffset={2} sm={10} smOffset={1} xs={12}>
<ide> <Spacer />
<ide> <b>Or donate with a credit card:</b>
<ide> <Spacer />
<del> </Col>
<del> <Col lg={8} lgOffset={2} sm={10} smOffset={1} xs={12}>
<ide> <StripeProvider stripe={stripe}>
<ide> <Elements>
<ide> <DonateFormChildViewForHOC
<ide> defaultTheme={defaultTheme}
<ide> donationAmount={donationAmount}
<ide> donationDuration={donationDuration}
<ide> getDonationButtonLabel={() =>
<del> `Confirm your donation of $5 / month`
<add> `Confirm your donation of $60 / year`
<ide> }
<ide> handleProcessing={handleProcessing}
<ide> />
<ide><path>client/src/pages/donate.js
<ide> import DonateForm from '../components/Donation/DonateForm';
<ide> import DonateText from '../components/Donation/DonateText';
<ide> import { signInLoadingSelector, userSelector, executeGA } from '../redux';
<ide> import { stripeScriptLoader } from '../utils/scriptLoaders';
<del>import { PaypalButton } from '../components/Donation/PaypalButton';
<ide>
<ide> const propTypes = {
<ide> executeGA: PropTypes.func,
<ide> export class DonatePage extends Component {
<ide> handleProcessing={this.handleProcessing}
<ide> stripe={stripe}
<ide> />
<del> <Spacer size={2} />
<del> <Row>
<del> <Col sm={10} smOffset={1} xs={12}>
<del> <b>Or donate with:</b>
<del> <Spacer />
<del> <PaypalButton />
<del> </Col>
<del> </Row>
<ide> </Col>
<ide> <Col md={6}>
<ide> <DonateText />
<ide><path>client/src/redux/index.js
<ide> export const shouldRequestDonationSelector = state => {
<ide> const isDonating = isDonatingSelector(state);
<ide> const canRequestBlockDonation = canRequestBlockDonationSelector(state);
<ide>
<add> const debugModal = localStorage.getItem('DEBUG_DONATE_MODAL', false);
<add> console.log(debugModal);
<add> if (debugModal === 'yes-please') {
<add> return true;
<add> }
<add>
<ide> // don't request donation if already donating
<ide> if (isDonating) return false;
<ide>
<ide><path>config/donation-settings.js
<ide> const durationsConfig = {
<ide> onetime: 'one-time'
<ide> };
<ide> const amountsConfig = {
<del> year: [100000, 25000, 3500],
<add> year: [100000, 25000, 6000],
<ide> month: [5000, 3500, 500],
<ide> onetime: [100000, 25000, 3500]
<ide> };
<ide> const defaultAmount = {
<del> year: 25000,
<add> year: 6000,
<ide> month: 3500,
<ide> onetime: 25000
<ide> };
<ide> const defaultStateConfig = {
<ide> donationAmount: defaultAmount['month'],
<del> donationDuration: 'month',
<del> paymentType: 'Card'
<add> donationDuration: 'month'
<ide> };
<ide> const modalDefaultStateConfig = {
<del> donationAmount: 500,
<del> donationDuration: 'month',
<del> paymentType: 'Card'
<add> donationAmount: 6000,
<add> donationDuration: 'year'
<ide> };
<ide>
<ide> // Configuration for server side | 9 |
Javascript | Javascript | enable cache in test renderer | b1acff0cc27a4c9844cb8049fae2ec6576a0f956 | <ide><path>packages/shared/forks/ReactFeatureFlags.test-renderer.www.js
<ide> export const enableUpdaterTracking = false;
<ide> export const enableSuspenseServerRenderer = false;
<ide> export const enableSelectiveHydration = false;
<ide> export const enableLazyElements = false;
<del>export const enableCache = false;
<add>export const enableCache = true;
<ide> export const enableSchedulerDebugging = false;
<ide> export const disableJavaScriptURLs = false;
<ide> export const disableInputAttributeSyncing = false; | 1 |
Text | Text | fix example jsx output | 86bac33408de1e0284c27d5c82d4559301b831ae | <ide><path>docs/docs/06-transferring-props.md
<ide> It's a common pattern in React to wrap a component in an abstraction. The outer
<ide> You can use [JSX spread attributes](/react/docs/jsx-spread.html) to merge the old props with additional values:
<ide>
<ide> ```javascript
<del>return <Component {...this.props} more="values" />;
<add><Component {...this.props} more="values" />
<ide> ```
<ide>
<ide> If you don't use JSX, you can use any object helper such as ES6 `Object.assign` or Underscore `_.extend`:
<ide>
<ide> ```javascript
<del>return Component(Object.assign({}, this.props, { more: 'values' }));
<add>React.createElement(Component, Object.assign({}, this.props, { more: 'values' }));
<ide> ```
<ide>
<ide> The rest of this tutorial explains best practices. It uses JSX and experimental ES7 syntax. | 1 |
Python | Python | fix lazy init to stop hiding errors in import | 8560b55b5e6ee62f673ed63e159f89df4a702154 | <ide><path>src/transformers/file_utils.py
<ide> def __getattr__(self, name: str) -> Any:
<ide> return value
<ide>
<ide> def _get_module(self, module_name: str):
<del> return importlib.import_module("." + module_name, self.__name__)
<add> try:
<add> return importlib.import_module("." + module_name, self.__name__)
<add> except Exception as e:
<add> raise RuntimeError(
<add> f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its traceback):\n{e}"
<add> ) from e
<ide>
<ide> def __reduce__(self):
<ide> return (self.__class__, (self._name, self.__file__, self._import_structure)) | 1 |
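The fix above wraps the lazy submodule import so that failures name the module being loaded instead of surfacing as a bare error deep in the import machinery. A minimal self-contained sketch of the same pattern (the `LazyModule` class and the `"nope"` submodule name are made up for illustration; the real class in the patched file is larger):

```python
import importlib
import types


class LazyModule(types.ModuleType):
    """Sketch of the lazy-import pattern: submodules are imported on first
    attribute access, and any failure is re-raised with the parent module
    name attached, mirroring the error-wrapping added in the commit above."""

    def __init__(self, name, submodules):
        super().__init__(name)
        self._submodules = set(submodules)

    def __getattr__(self, item):
        if item in self._submodules:
            return self._get_module(item)
        raise AttributeError(f"module {self.__name__!r} has no attribute {item!r}")

    def _get_module(self, module_name):
        try:
            return importlib.import_module("." + module_name, self.__name__)
        except Exception as e:
            # Keep the original traceback via `from e`, but say which
            # lazily-loaded module actually failed.
            raise RuntimeError(
                f"Failed to import {self.__name__}.{module_name} because of the "
                f"following error (look up to see its traceback):\n{e}"
            ) from e


# Usage against the stdlib: "json.decoder" loads fine; a missing
# submodule now reports the parent package in the error message.
lazy_json = LazyModule("json", {"decoder", "nope"})
assert lazy_json.decoder.JSONDecoder is not None

try:
    lazy_json.nope
except RuntimeError as err:
    assert "json.nope" in str(err)
else:
    raise AssertionError("expected RuntimeError")
```

Without the `try`/`except`, the underlying `ModuleNotFoundError` would propagate with no hint that it came from a lazy attribute lookup, which is exactly the "hiding errors" problem the commit message describes.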
Javascript | Javascript | replace console.log() with debuglog() | 579e6b95aee561d8ddd3e3663fa7a588c6449334 | <ide><path>test/parallel/test-process-exit-code.js
<ide> 'use strict';
<ide> require('../common');
<ide> const assert = require('assert');
<add>const debug = require('util').debuglog('test');
<ide>
<ide> const testCases = require('../fixtures/process-exit-code-cases');
<ide>
<ide> if (!process.argv[2]) {
<ide> } else {
<ide> const i = parseInt(process.argv[2]);
<ide> if (Number.isNaN(i)) {
<del> console.log('Invalid test case index');
<add> debug('Invalid test case index');
<ide> process.exit(100);
<ide> return;
<ide> }
<ide> function parent() {
<ide> assert.strictEqual(
<ide> code, exit,
<ide> `wrong exit for ${arg}-${name}\nexpected:${exit} but got:${code}`);
<del> console.log(`ok - ${arg} exited with ${exit}`);
<add> debug(`ok - ${arg} exited with ${exit}`);
<ide> });
<ide> };
<ide> | 1 |
Javascript | Javascript | add protractor tests for docs app | 039b990d8dc73e59418114b49c36797e44127c76 | <ide><path>protractor-conf.js
<ide> exports.config = {
<ide>
<ide> specs: [
<ide> 'build/docs/ptore2e/**/*.js',
<add> 'test/e2e/docsAppE2E.js'
<ide> ],
<ide>
<ide> capabilities: {
<ide><path>test/e2e/docsAppE2E.js
<add>describe('docs.angularjs.org', function () {
<add> describe('App', function () {
<add> // it('should filter the module list when searching', function () {
<add> // browser.get();
<add> // browser.waitForAngular();
<add>
<add> // var search = element(by.input('q'));
<add> // search.clear();
<add> // search.sendKeys('ngBind');
<add>
<add> // var firstModule = element(by.css('.search-results a'));
<add> // expect(firstModule.getText()).toEqual('ngBind');
<add> // });
<add>
<add>
<add> it('should change the page content when clicking a link to a service', function () {
<add> browser.get('');
<add>
<add> var ngBindLink = element(by.css('.definition-table td a[href="api/ng.directive:ngClick"]'));
<add> ngBindLink.click();
<add>
<add> var pageBody = element(by.css('.content h1 code'));
<add> expect(pageBody.getText()).toEqual('ngClick');
<add> });
<add>
<add>
<add> it('should show the functioning input directive example', function () {
<add> browser.get('index-nocache.html#!/api/ng.directive:input');
<add> //Wait for animation
<add> browser.sleep(500);
<add>
<add> var nameInput = element(by.input('user.name'));
<add> nameInput.click();
<add> nameInput.sendKeys('!!!');
<add>
<add> var code = element(by.css('.doc-example-live tt'));
<add> expect(code.getText()).toContain('guest!!!');
<add> });
<add> });
<add>}) | 2 |
Ruby | Ruby | add migrated_at column to schema_migrations table | c283cdd63cafdb04784cfcc5094da41c9268c20c | <ide><path>activerecord/lib/active_record/connection_adapters/abstract/schema_statements.rb
<ide> def dump_schema_information #:nodoc:
<ide> def initialize_schema_migrations_table
<ide> sm_table = ActiveRecord::Migrator.schema_migrations_table_name
<ide>
<del> unless table_exists?(sm_table)
<add> if table_exists?(sm_table)
<add> cols = columns(sm_table).collect { |col| col.name }
<add> unless cols.include?("migrated_at")
<add> add_column sm_table, :migrated_at, :datetime
<add> update "UPDATE #{quote_table_name(sm_table)} SET migrated_at = '#{quoted_date(Time.now)}' WHERE migrated_at IS NULL"
<add> change_column sm_table, :migrated_at, :datetime, :null => false
<add> end
<add> else
<ide> create_table(sm_table, :id => false) do |schema_migrations_table|
<ide> schema_migrations_table.column :version, :string, :null => false
<add> schema_migrations_table.column :migrated_at, :datetime, :null => false
<ide> end
<ide> add_index sm_table, :version, :unique => true,
<ide> :name => "#{Base.table_name_prefix}unique_schema_migrations#{Base.table_name_suffix}"
<ide><path>activerecord/test/cases/migration_test.rb
<ide> def puts(text="")
<ide> end
<ide>
<ide> class MigrationTableAndIndexTest < ActiveRecord::TestCase
<del> def test_add_schema_info_respects_prefix_and_suffix
<del> conn = ActiveRecord::Base.connection
<add> def setup
<add> @conn = ActiveRecord::Base.connection
<add> @conn.drop_table(ActiveRecord::Migrator.schema_migrations_table_name) if @conn.table_exists?(ActiveRecord::Migrator.schema_migrations_table_name)
<add> end
<ide>
<del> conn.drop_table(ActiveRecord::Migrator.schema_migrations_table_name) if conn.table_exists?(ActiveRecord::Migrator.schema_migrations_table_name)
<add> def test_add_schema_migrations_respects_prefix_and_suffix
<ide> # Use shorter prefix and suffix as in Oracle database identifier cannot be larger than 30 characters
<ide> ActiveRecord::Base.table_name_prefix = 'p_'
<ide> ActiveRecord::Base.table_name_suffix = '_s'
<del> conn.drop_table(ActiveRecord::Migrator.schema_migrations_table_name) if conn.table_exists?(ActiveRecord::Migrator.schema_migrations_table_name)
<add> @conn.drop_table(ActiveRecord::Migrator.schema_migrations_table_name) if @conn.table_exists?(ActiveRecord::Migrator.schema_migrations_table_name)
<ide>
<del> conn.initialize_schema_migrations_table
<add> @conn.initialize_schema_migrations_table
<ide>
<del> assert_equal "p_unique_schema_migrations_s", conn.indexes(ActiveRecord::Migrator.schema_migrations_table_name)[0][:name]
<add> assert_equal "p_unique_schema_migrations_s", @conn.indexes(ActiveRecord::Migrator.schema_migrations_table_name)[0][:name]
<ide> ensure
<ide> ActiveRecord::Base.table_name_prefix = ""
<ide> ActiveRecord::Base.table_name_suffix = ""
<ide> end
<add>
<add> def test_schema_migrations_columns
<add> @conn.initialize_schema_migrations_table
<add>
<add> columns = @conn.columns(ActiveRecord::Migrator.schema_migrations_table_name).collect(&:name)
<add> %w[version migrated_at].each { |col| assert columns.include?(col) }
<add> end
<add>
<add> def test_add_migrated_at_to_exisiting_schema_migrations
<add> sm_table = ActiveRecord::Migrator.schema_migrations_table_name
<add> @conn.create_table(sm_table, :id => false) do |schema_migrations_table|
<add> schema_migrations_table.column :version, :string, :null => false
<add> end
<add> @conn.insert "INSERT INTO #{@conn.quote_table_name(sm_table)} (version) VALUES (100)"
<add> @conn.insert "INSERT INTO #{@conn.quote_table_name(sm_table)} (version) VALUES (200)"
<add>
<add> @conn.initialize_schema_migrations_table
<add>
<add> m_ats = @conn.select_values("SELECT migrated_at FROM #{@conn.quote_table_name(sm_table)}")
<add> assert_equal 2, m_ats.length
<add> assert_equal 2, m_ats.compact.length
<add> end
<ide> end
<ide>
<ide> class MigrationTest < ActiveRecord::TestCase | 2 |
PHP | PHP | extract relativetimeformatter from time & date | 1c6ac85d070357da524c09de6628f6d47078a44c | <ide><path>src/I18n/Date.php
<ide> class Date extends MutableDate implements JsonSerializable
<ide> */
<ide> public function timeAgoInWords(array $options = [])
<ide> {
<del> $date = $this;
<del>
<del> $options += [
<del> 'from' => static::now(),
<del> 'timezone' => null,
<del> 'format' => static::$wordFormat,
<del> 'accuracy' => static::$wordAccuracy,
<del> 'end' => static::$wordEnd,
<del> 'relativeString' => __d('cake', '%s ago'),
<del> 'absoluteString' => __d('cake', 'on %s'),
<del> ];
<del> if (is_string($options['accuracy'])) {
<del> foreach (static::$wordAccuracy as $key => $level) {
<del> $options[$key] = $options['accuracy'];
<del> }
<del> } else {
<del> $options['accuracy'] += static::$wordAccuracy;
<del> }
<del> if ($options['timezone']) {
<del> $date = $date->timezone($options['timezone']);
<del> }
<del>
<del> $now = $options['from']->format('U');
<del> $inSeconds = $date->format('U');
<del> $backwards = ($inSeconds > $now);
<del>
<del> $futureTime = $now;
<del> $pastTime = $inSeconds;
<del> if ($backwards) {
<del> $futureTime = $inSeconds;
<del> $pastTime = $now;
<del> }
<del> $diff = $futureTime - $pastTime;
<del>
<del> if (!$diff) {
<del> return __d('cake', 'today');
<del> }
<del>
<del> if ($diff > abs($now - (new static($options['end']))->format('U'))) {
<del> return sprintf($options['absoluteString'], $date->i18nFormat($options['format']));
<del> }
<del>
<del> // If more than a week, then take into account the length of months
<del> if ($diff >= 604800) {
<del> list($future['H'], $future['i'], $future['s'], $future['d'], $future['m'], $future['Y']) = explode('/', date('H/i/s/d/m/Y', $futureTime));
<del>
<del> list($past['H'], $past['i'], $past['s'], $past['d'], $past['m'], $past['Y']) = explode('/', date('H/i/s/d/m/Y', $pastTime));
<del> $weeks = $days = $hours = $minutes = $seconds = 0;
<del>
<del> $years = $future['Y'] - $past['Y'];
<del> $months = $future['m'] + ((12 * $years) - $past['m']);
<del>
<del> if ($months >= 12) {
<del> $years = floor($months / 12);
<del> $months = $months - ($years * 12);
<del> }
<del> if ($future['m'] < $past['m'] && $future['Y'] - $past['Y'] === 1) {
<del> $years--;
<del> }
<del>
<del> if ($future['d'] >= $past['d']) {
<del> $days = $future['d'] - $past['d'];
<del> } else {
<del> $daysInPastMonth = date('t', $pastTime);
<del> $daysInFutureMonth = date('t', mktime(0, 0, 0, $future['m'] - 1, 1, $future['Y']));
<del>
<del> if (!$backwards) {
<del> $days = ($daysInPastMonth - $past['d']) + $future['d'];
<del> } else {
<del> $days = ($daysInFutureMonth - $past['d']) + $future['d'];
<del> }
<del>
<del> if ($future['m'] != $past['m']) {
<del> $months--;
<del> }
<del> }
<del>
<del> if (!$months && $years >= 1 && $diff < ($years * 31536000)) {
<del> $months = 11;
<del> $years--;
<del> }
<del>
<del> if ($months >= 12) {
<del> $years = $years + 1;
<del> $months = $months - 12;
<del> }
<del>
<del> if ($days >= 7) {
<del> $weeks = floor($days / 7);
<del> $days = $days - ($weeks * 7);
<del> }
<del> } else {
<del> $years = $months = $weeks = 0;
<del> $days = floor($diff / 86400);
<del>
<del> $diff = $diff - ($days * 86400);
<del>
<del> $hours = floor($diff / 3600);
<del> $diff = $diff - ($hours * 3600);
<del>
<del> $minutes = floor($diff / 60);
<del> $diff = $diff - ($minutes * 60);
<del> $seconds = $diff;
<del> }
<del>
<del> $fWord = $options['accuracy']['day'];
<del> if ($years > 0) {
<del> $fWord = $options['accuracy']['year'];
<del> } elseif (abs($months) > 0) {
<del> $fWord = $options['accuracy']['month'];
<del> } elseif (abs($weeks) > 0) {
<del> $fWord = $options['accuracy']['week'];
<del> } elseif (abs($days) > 0) {
<del> $fWord = $options['accuracy']['day'];
<del> }
<del>
<del> $fNum = str_replace(['year', 'month', 'week', 'day'], [1, 2, 3, 4], $fWord);
<del>
<del> $relativeDate = '';
<del> if ($fNum >= 1 && $years > 0) {
<del> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} year', '{0} years', $years, $years);
<del> }
<del> if ($fNum >= 2 && $months > 0) {
<del> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} month', '{0} months', $months, $months);
<del> }
<del> if ($fNum >= 3 && $weeks > 0) {
<del> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} week', '{0} weeks', $weeks, $weeks);
<del> }
<del> if ($fNum >= 4 && $days > 0) {
<del> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} day', '{0} days', $days, $days);
<del> }
<del>
<del> // When time has passed
<del> if (!$backwards && $relativeDate) {
<del> return sprintf($options['relativeString'], $relativeDate);
<del> }
<del> if (!$backwards) {
<del> $aboutAgo = [
<del> 'day' => __d('cake', 'about a day ago'),
<del> 'week' => __d('cake', 'about a week ago'),
<del> 'month' => __d('cake', 'about a month ago'),
<del> 'year' => __d('cake', 'about a year ago')
<del> ];
<del>
<del> return $aboutAgo[$fWord];
<del> }
<del>
<del> // When time is to come
<del> if (!$relativeDate) {
<del> $aboutIn = [
<del> 'day' => __d('cake', 'in about a day'),
<del> 'week' => __d('cake', 'in about a week'),
<del> 'month' => __d('cake', 'in about a month'),
<del> 'year' => __d('cake', 'in about a year')
<del> ];
<del>
<del> return $aboutIn[$fWord];
<del> }
<del>
<del> return $relativeDate;
<add> return (new RelativeTimeFormatter($this))->dateAgoInWords($options);
<ide> }
<ide> }
<ide><path>src/I18n/RelativeTimeFormatter.php
<add><?php
<add>namespace Cake\I18n;
<add>
<add>use Cake\I18n\FrozenDate;
<add>use Cake\I18n\FrozenTime;
<add>
<add>class RelativeTimeFormatter
<add>{
<add> protected $_time;
<add>
<add> public function __construct($time)
<add> {
<add> $this->_time = $time;
<add> }
<add>
<add> /**
<add> * Returns either a relative or a formatted absolute date depending
<add> * on the difference between the current time and this object.
<add> *
<add> * ### Options:
<add> *
<add> * - `from` => another Time object representing the "now" time
<add> * - `format` => a fall back format if the relative time is longer than the duration specified by end
<add> * - `accuracy` => Specifies how accurate the date should be described (array)
<add> * - year => The format if years > 0 (default "day")
<add> * - month => The format if months > 0 (default "day")
<add> * - week => The format if weeks > 0 (default "day")
<add> * - day => The format if days > 0 (default "hour")
<add> * - hour => The format if hours > 0 (default "minute")
<add> * - minute => The format if minutes > 0 (default "minute")
<add> * - second => The format if seconds > 0 (default "second")
<add> * - `end` => The end of relative time telling
<add> * - `relativeString` => The printf compatible string when outputting relative time
<add> * - `absoluteString` => The printf compatible string when outputting absolute time
<add> * - `timezone` => The user timezone the timestamp should be formatted in.
<add> *
<add> * Relative dates look something like this:
<add> *
<add> * - 3 weeks, 4 days ago
<add> * - 15 seconds ago
<add> *
<add> * Default date formatting is d/M/YY e.g: on 18/2/09. Formatting is done internally using
<add> * `i18nFormat`, see the method for the valid formatting strings
<add> *
<add> * The returned string includes 'ago' or 'on' and assumes you'll properly add a word
<add> * like 'Posted ' before the function output.
<add> *
<add> * NOTE: If the difference is one week or more, the lowest level of accuracy is day
<add> *
<add> * @param array $options Array of options.
<add> * @return string Relative time string.
<add> */
<add> public function timeAgoInWords(array $options = [])
<add> {
<add> $time = $this->_time;
<add>
<add> $timezone = null;
<add> // TODO use options like below.
<add> $format = FrozenTime::$wordFormat;
<add> $end = FrozenTime::$wordEnd;
<add> $relativeString = __d('cake', '%s ago');
<add> $absoluteString = __d('cake', 'on %s');
<add> $accuracy = FrozenTime::$wordAccuracy;
<add> $from = FrozenTime::now();
<add> $opts = ['timezone', 'format', 'end', 'relativeString', 'absoluteString', 'from'];
<add>
<add> foreach ($opts as $option) {
<add> if (isset($options[$option])) {
<add> ${$option} = $options[$option];
<add> unset($options[$option]);
<add> }
<add> }
<add>
<add> if (isset($options['accuracy'])) {
<add> if (is_array($options['accuracy'])) {
<add> $accuracy = $options['accuracy'] + $accuracy;
<add> } else {
<add> foreach ($accuracy as $key => $level) {
<add> $accuracy[$key] = $options['accuracy'];
<add> }
<add> }
<add> }
<add>
<add> if ($timezone) {
<add> $time = $time->timezone($timezone);
<add> }
<add>
<add> $now = $from->format('U');
<add> $inSeconds = $time->format('U');
<add> $backwards = ($inSeconds > $now);
<add>
<add> $futureTime = $now;
<add> $pastTime = $inSeconds;
<add> if ($backwards) {
<add> $futureTime = $inSeconds;
<add> $pastTime = $now;
<add> }
<add> $diff = $futureTime - $pastTime;
<add>
<add> if (!$diff) {
<add> return __d('cake', 'just now', 'just now');
<add> }
<add>
<add> if ($diff > abs($now - (new FrozenTime($end))->format('U'))) {
<add> return sprintf($absoluteString, $time->i18nFormat($format));
<add> }
<add>
<add> // If more than a week, then take into account the length of months
<add> if ($diff >= 604800) {
<add> list($future['H'], $future['i'], $future['s'], $future['d'], $future['m'], $future['Y']) = explode('/', date('H/i/s/d/m/Y', $futureTime));
<add>
<add> list($past['H'], $past['i'], $past['s'], $past['d'], $past['m'], $past['Y']) = explode('/', date('H/i/s/d/m/Y', $pastTime));
<add> $weeks = $days = $hours = $minutes = $seconds = 0;
<add>
<add> $years = $future['Y'] - $past['Y'];
<add> $months = $future['m'] + ((12 * $years) - $past['m']);
<add>
<add> if ($months >= 12) {
<add> $years = floor($months / 12);
<add> $months = $months - ($years * 12);
<add> }
<add> if ($future['m'] < $past['m'] && $future['Y'] - $past['Y'] === 1) {
<add> $years--;
<add> }
<add>
<add> if ($future['d'] >= $past['d']) {
<add> $days = $future['d'] - $past['d'];
<add> } else {
<add> $daysInPastMonth = date('t', $pastTime);
<add> $daysInFutureMonth = date('t', mktime(0, 0, 0, $future['m'] - 1, 1, $future['Y']));
<add>
<add> if (!$backwards) {
<add> $days = ($daysInPastMonth - $past['d']) + $future['d'];
<add> } else {
<add> $days = ($daysInFutureMonth - $past['d']) + $future['d'];
<add> }
<add>
<add> if ($future['m'] != $past['m']) {
<add> $months--;
<add> }
<add> }
<add>
<add> if (!$months && $years >= 1 && $diff < ($years * 31536000)) {
<add> $months = 11;
<add> $years--;
<add> }
<add>
<add> if ($months >= 12) {
<add> $years = $years + 1;
<add> $months = $months - 12;
<add> }
<add>
<add> if ($days >= 7) {
<add> $weeks = floor($days / 7);
<add> $days = $days - ($weeks * 7);
<add> }
<add> } else {
<add> $years = $months = $weeks = 0;
<add> $days = floor($diff / 86400);
<add>
<add> $diff = $diff - ($days * 86400);
<add>
<add> $hours = floor($diff / 3600);
<add> $diff = $diff - ($hours * 3600);
<add>
<add> $minutes = floor($diff / 60);
<add> $diff = $diff - ($minutes * 60);
<add> $seconds = $diff;
<add> }
<add>
<add> $fWord = $accuracy['second'];
<add> if ($years > 0) {
<add> $fWord = $accuracy['year'];
<add> } elseif (abs($months) > 0) {
<add> $fWord = $accuracy['month'];
<add> } elseif (abs($weeks) > 0) {
<add> $fWord = $accuracy['week'];
<add> } elseif (abs($days) > 0) {
<add> $fWord = $accuracy['day'];
<add> } elseif (abs($hours) > 0) {
<add> $fWord = $accuracy['hour'];
<add> } elseif (abs($minutes) > 0) {
<add> $fWord = $accuracy['minute'];
<add> }
<add>
<add> $fNum = str_replace(['year', 'month', 'week', 'day', 'hour', 'minute', 'second'], [1, 2, 3, 4, 5, 6, 7], $fWord);
<add>
<add> $relativeDate = '';
<add> if ($fNum >= 1 && $years > 0) {
<add> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} year', '{0} years', $years, $years);
<add> }
<add> if ($fNum >= 2 && $months > 0) {
<add> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} month', '{0} months', $months, $months);
<add> }
<add> if ($fNum >= 3 && $weeks > 0) {
<add> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} week', '{0} weeks', $weeks, $weeks);
<add> }
<add> if ($fNum >= 4 && $days > 0) {
<add> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} day', '{0} days', $days, $days);
<add> }
<add> if ($fNum >= 5 && $hours > 0) {
<add> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} hour', '{0} hours', $hours, $hours);
<add> }
<add> if ($fNum >= 6 && $minutes > 0) {
<add> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} minute', '{0} minutes', $minutes, $minutes);
<add> }
<add> if ($fNum >= 7 && $seconds > 0) {
<add> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} second', '{0} seconds', $seconds, $seconds);
<add> }
<add>
<add> // When time has passed
<add> if (!$backwards && $relativeDate) {
<add> return sprintf($relativeString, $relativeDate);
<add> }
<add> if (!$backwards) {
<add> $aboutAgo = [
<add> 'second' => __d('cake', 'about a second ago'),
<add> 'minute' => __d('cake', 'about a minute ago'),
<add> 'hour' => __d('cake', 'about an hour ago'),
<add> 'day' => __d('cake', 'about a day ago'),
<add> 'week' => __d('cake', 'about a week ago'),
<add> 'year' => __d('cake', 'about a year ago')
<add> ];
<add>
<add> return $aboutAgo[$fWord];
<add> }
<add>
<add> // When time is to come
<add> if (!$relativeDate) {
<add> $aboutIn = [
<add> 'second' => __d('cake', 'in about a second'),
<add> 'minute' => __d('cake', 'in about a minute'),
<add> 'hour' => __d('cake', 'in about an hour'),
<add> 'day' => __d('cake', 'in about a day'),
<add> 'week' => __d('cake', 'in about a week'),
<add> 'year' => __d('cake', 'in about a year')
<add> ];
<add>
<add> return $aboutIn[$fWord];
<add> }
<add>
<add> return $relativeDate;
<add> }
<add>
<add> public function dateAgoInWords(array $options = [])
<add> {
<add> $date = $this->_time;
<add> $options += [
<add> 'from' => FrozenDate::now(),
<add> 'timezone' => null,
<add> 'format' => FrozenDate::$wordFormat,
<add> 'accuracy' => FrozenDate::$wordAccuracy,
<add> 'end' => FrozenDate::$wordEnd,
<add> 'relativeString' => __d('cake', '%s ago'),
<add> 'absoluteString' => __d('cake', 'on %s'),
<add> ];
<add> if (is_string($options['accuracy'])) {
<add> foreach (FrozenDate::$wordAccuracy as $key => $level) {
<add> $options[$key] = $options['accuracy'];
<add> }
<add> } else {
<add> $options['accuracy'] += FrozenDate::$wordAccuracy;
<add> }
<add> if ($options['timezone']) {
<add> $date = $date->timezone($options['timezone']);
<add> }
<add>
<add> $now = $options['from']->format('U');
<add> $inSeconds = $date->format('U');
<add> $backwards = ($inSeconds > $now);
<add>
<add> $futureTime = $now;
<add> $pastTime = $inSeconds;
<add> if ($backwards) {
<add> $futureTime = $inSeconds;
<add> $pastTime = $now;
<add> }
<add> $diff = $futureTime - $pastTime;
<add>
<add> if (!$diff) {
<add> return __d('cake', 'today');
<add> }
<add>
<add> if ($diff > abs($now - (new FrozenDate($options['end']))->format('U'))) {
<add> return sprintf($options['absoluteString'], $date->i18nFormat($options['format']));
<add> }
<add>
<add> // If more than a week, then take into account the length of months
<add> if ($diff >= 604800) {
<add> list($future['H'], $future['i'], $future['s'], $future['d'], $future['m'], $future['Y']) = explode('/', date('H/i/s/d/m/Y', $futureTime));
<add>
<add> list($past['H'], $past['i'], $past['s'], $past['d'], $past['m'], $past['Y']) = explode('/', date('H/i/s/d/m/Y', $pastTime));
<add> $weeks = $days = $hours = $minutes = $seconds = 0;
<add>
<add> $years = $future['Y'] - $past['Y'];
<add> $months = $future['m'] + ((12 * $years) - $past['m']);
<add>
<add> if ($months >= 12) {
<add> $years = floor($months / 12);
<add> $months = $months - ($years * 12);
<add> }
<add> if ($future['m'] < $past['m'] && $future['Y'] - $past['Y'] === 1) {
<add> $years--;
<add> }
<add>
<add> if ($future['d'] >= $past['d']) {
<add> $days = $future['d'] - $past['d'];
<add> } else {
<add> $daysInPastMonth = date('t', $pastTime);
<add> $daysInFutureMonth = date('t', mktime(0, 0, 0, $future['m'] - 1, 1, $future['Y']));
<add>
<add> if (!$backwards) {
<add> $days = ($daysInPastMonth - $past['d']) + $future['d'];
<add> } else {
<add> $days = ($daysInFutureMonth - $past['d']) + $future['d'];
<add> }
<add>
<add> if ($future['m'] != $past['m']) {
<add> $months--;
<add> }
<add> }
<add>
<add> if (!$months && $years >= 1 && $diff < ($years * 31536000)) {
<add> $months = 11;
<add> $years--;
<add> }
<add>
<add> if ($months >= 12) {
<add> $years = $years + 1;
<add> $months = $months - 12;
<add> }
<add>
<add> if ($days >= 7) {
<add> $weeks = floor($days / 7);
<add> $days = $days - ($weeks * 7);
<add> }
<add> } else {
<add> $years = $months = $weeks = 0;
<add> $days = floor($diff / 86400);
<add>
<add> $diff = $diff - ($days * 86400);
<add>
<add> $hours = floor($diff / 3600);
<add> $diff = $diff - ($hours * 3600);
<add>
<add> $minutes = floor($diff / 60);
<add> $diff = $diff - ($minutes * 60);
<add> $seconds = $diff;
<add> }
<add>
<add> $fWord = $options['accuracy']['day'];
<add> if ($years > 0) {
<add> $fWord = $options['accuracy']['year'];
<add> } elseif (abs($months) > 0) {
<add> $fWord = $options['accuracy']['month'];
<add> } elseif (abs($weeks) > 0) {
<add> $fWord = $options['accuracy']['week'];
<add> } elseif (abs($days) > 0) {
<add> $fWord = $options['accuracy']['day'];
<add> }
<add>
<add> $fNum = str_replace(['year', 'month', 'week', 'day'], [1, 2, 3, 4], $fWord);
<add>
<add> $relativeDate = '';
<add> if ($fNum >= 1 && $years > 0) {
<add> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} year', '{0} years', $years, $years);
<add> }
<add> if ($fNum >= 2 && $months > 0) {
<add> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} month', '{0} months', $months, $months);
<add> }
<add> if ($fNum >= 3 && $weeks > 0) {
<add> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} week', '{0} weeks', $weeks, $weeks);
<add> }
<add> if ($fNum >= 4 && $days > 0) {
<add> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} day', '{0} days', $days, $days);
<add> }
<add>
<add> // When time has passed
<add> if (!$backwards && $relativeDate) {
<add> return sprintf($options['relativeString'], $relativeDate);
<add> }
<add> if (!$backwards) {
<add> $aboutAgo = [
<add> 'day' => __d('cake', 'about a day ago'),
<add> 'week' => __d('cake', 'about a week ago'),
<add> 'month' => __d('cake', 'about a month ago'),
<add> 'year' => __d('cake', 'about a year ago')
<add> ];
<add>
<add> return $aboutAgo[$fWord];
<add> }
<add>
<add> // When time is to come
<add> if (!$relativeDate) {
<add> $aboutIn = [
<add> 'day' => __d('cake', 'in about a day'),
<add> 'week' => __d('cake', 'in about a week'),
<add> 'month' => __d('cake', 'in about a month'),
<add> 'year' => __d('cake', 'in about a year')
<add> ];
<add>
<add> return $aboutIn[$fWord];
<add> }
<add> return $relativeDate;
<add> }
<add>}
<ide><path>src/I18n/Time.php
<ide> public function __construct($time = null, $tz = null)
<ide> */
<ide> public function timeAgoInWords(array $options = [])
<ide> {
<del> $time = $this;
<del>
<del> $timezone = null;
<del> $format = static::$wordFormat;
<del> $end = static::$wordEnd;
<del> $relativeString = __d('cake', '%s ago');
<del> $absoluteString = __d('cake', 'on %s');
<del> $accuracy = static::$wordAccuracy;
<del> $from = static::now();
<del> $opts = ['timezone', 'format', 'end', 'relativeString', 'absoluteString', 'from'];
<del>
<del> foreach ($opts as $option) {
<del> if (isset($options[$option])) {
<del> ${$option} = $options[$option];
<del> unset($options[$option]);
<del> }
<del> }
<del>
<del> if (isset($options['accuracy'])) {
<del> if (is_array($options['accuracy'])) {
<del> $accuracy = $options['accuracy'] + $accuracy;
<del> } else {
<del> foreach ($accuracy as $key => $level) {
<del> $accuracy[$key] = $options['accuracy'];
<del> }
<del> }
<del> }
<del>
<del> if ($timezone) {
<del> $time = $time->timezone($timezone);
<del> }
<del>
<del> $now = $from->format('U');
<del> $inSeconds = $time->format('U');
<del> $backwards = ($inSeconds > $now);
<del>
<del> $futureTime = $now;
<del> $pastTime = $inSeconds;
<del> if ($backwards) {
<del> $futureTime = $inSeconds;
<del> $pastTime = $now;
<del> }
<del> $diff = $futureTime - $pastTime;
<del>
<del> if (!$diff) {
<del> return __d('cake', 'just now', 'just now');
<del> }
<del>
<del> if ($diff > abs($now - (new static($end))->format('U'))) {
<del> return sprintf($absoluteString, $time->i18nFormat($format));
<del> }
<del>
<del> // If more than a week, then take into account the length of months
<del> if ($diff >= 604800) {
<del> list($future['H'], $future['i'], $future['s'], $future['d'], $future['m'], $future['Y']) = explode('/', date('H/i/s/d/m/Y', $futureTime));
<del>
<del> list($past['H'], $past['i'], $past['s'], $past['d'], $past['m'], $past['Y']) = explode('/', date('H/i/s/d/m/Y', $pastTime));
<del> $weeks = $days = $hours = $minutes = $seconds = 0;
<del>
<del> $years = $future['Y'] - $past['Y'];
<del> $months = $future['m'] + ((12 * $years) - $past['m']);
<del>
<del> if ($months >= 12) {
<del> $years = floor($months / 12);
<del> $months = $months - ($years * 12);
<del> }
<del> if ($future['m'] < $past['m'] && $future['Y'] - $past['Y'] === 1) {
<del> $years--;
<del> }
<del>
<del> if ($future['d'] >= $past['d']) {
<del> $days = $future['d'] - $past['d'];
<del> } else {
<del> $daysInPastMonth = date('t', $pastTime);
<del> $daysInFutureMonth = date('t', mktime(0, 0, 0, $future['m'] - 1, 1, $future['Y']));
<del>
<del> if (!$backwards) {
<del> $days = ($daysInPastMonth - $past['d']) + $future['d'];
<del> } else {
<del> $days = ($daysInFutureMonth - $past['d']) + $future['d'];
<del> }
<del>
<del> if ($future['m'] != $past['m']) {
<del> $months--;
<del> }
<del> }
<del>
<del> if (!$months && $years >= 1 && $diff < ($years * 31536000)) {
<del> $months = 11;
<del> $years--;
<del> }
<del>
<del> if ($months >= 12) {
<del> $years = $years + 1;
<del> $months = $months - 12;
<del> }
<del>
<del> if ($days >= 7) {
<del> $weeks = floor($days / 7);
<del> $days = $days - ($weeks * 7);
<del> }
<del> } else {
<del> $years = $months = $weeks = 0;
<del> $days = floor($diff / 86400);
<del>
<del> $diff = $diff - ($days * 86400);
<del>
<del> $hours = floor($diff / 3600);
<del> $diff = $diff - ($hours * 3600);
<del>
<del> $minutes = floor($diff / 60);
<del> $diff = $diff - ($minutes * 60);
<del> $seconds = $diff;
<del> }
<del>
<del> $fWord = $accuracy['second'];
<del> if ($years > 0) {
<del> $fWord = $accuracy['year'];
<del> } elseif (abs($months) > 0) {
<del> $fWord = $accuracy['month'];
<del> } elseif (abs($weeks) > 0) {
<del> $fWord = $accuracy['week'];
<del> } elseif (abs($days) > 0) {
<del> $fWord = $accuracy['day'];
<del> } elseif (abs($hours) > 0) {
<del> $fWord = $accuracy['hour'];
<del> } elseif (abs($minutes) > 0) {
<del> $fWord = $accuracy['minute'];
<del> }
<del>
<del> $fNum = str_replace(['year', 'month', 'week', 'day', 'hour', 'minute', 'second'], [1, 2, 3, 4, 5, 6, 7], $fWord);
<del>
<del> $relativeDate = '';
<del> if ($fNum >= 1 && $years > 0) {
<del> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} year', '{0} years', $years, $years);
<del> }
<del> if ($fNum >= 2 && $months > 0) {
<del> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} month', '{0} months', $months, $months);
<del> }
<del> if ($fNum >= 3 && $weeks > 0) {
<del> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} week', '{0} weeks', $weeks, $weeks);
<del> }
<del> if ($fNum >= 4 && $days > 0) {
<del> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} day', '{0} days', $days, $days);
<del> }
<del> if ($fNum >= 5 && $hours > 0) {
<del> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} hour', '{0} hours', $hours, $hours);
<del> }
<del> if ($fNum >= 6 && $minutes > 0) {
<del> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} minute', '{0} minutes', $minutes, $minutes);
<del> }
<del> if ($fNum >= 7 && $seconds > 0) {
<del> $relativeDate .= ($relativeDate ? ', ' : '') . __dn('cake', '{0} second', '{0} seconds', $seconds, $seconds);
<del> }
<del>
<del> // When time has passed
<del> if (!$backwards && $relativeDate) {
<del> return sprintf($relativeString, $relativeDate);
<del> }
<del> if (!$backwards) {
<del> $aboutAgo = [
<del> 'second' => __d('cake', 'about a second ago'),
<del> 'minute' => __d('cake', 'about a minute ago'),
<del> 'hour' => __d('cake', 'about an hour ago'),
<del> 'day' => __d('cake', 'about a day ago'),
<del> 'week' => __d('cake', 'about a week ago'),
<del> 'year' => __d('cake', 'about a year ago')
<del> ];
<del>
<del> return $aboutAgo[$fWord];
<del> }
<del>
<del> // When time is to come
<del> if (!$relativeDate) {
<del> $aboutIn = [
<del> 'second' => __d('cake', 'in about a second'),
<del> 'minute' => __d('cake', 'in about a minute'),
<del> 'hour' => __d('cake', 'in about an hour'),
<del> 'day' => __d('cake', 'in about a day'),
<del> 'week' => __d('cake', 'in about a week'),
<del> 'year' => __d('cake', 'in about a year')
<del> ];
<del>
<del> return $aboutIn[$fWord];
<del> }
<del>
<del> return $relativeDate;
<del> }
<del>
<del> /**
<del> * Returns the difference between this date and the provided one in a human
<del> * readable format.
<del> *
<del> * See `Time::timeAgoInWords()` for a full list of options that can be passed
<del> * to this method.
<del> *
<del> * @param \Cake\Chronos\ChronosInterface|null $other the date to diff with
<del> * @param array $options options accepted by timeAgoInWords
<del> * @return string
<del> * @see Time::timeAgoInWords()
<del> */
<del> public function diffForHumans(ChronosInterface $other = null, array $options = [])
<del> {
<del> $options = ['from' => $other] + $options;
<del> return $this->timeAgoInWords($options);
<add> return (new RelativeTimeFormatter($this))->timeAgoInWords($options);
<ide> }
<ide>
<ide> /**
<ide><path>tests/TestCase/I18n/TimeTest.php
<ide> public function testDiffForHumans()
<ide> {
<ide> $time = new Time('2014-04-20 10:10:10');
<ide> $other = new Time('2014-04-27 10:10:10');
<del> $this->assertEquals('1 week ago', $time->diffForHumans($other));
<add> $this->assertEquals('1 week before', $time->diffForHumans($other));
<ide>
<ide> $other = new Time('2014-04-21 09:10:10');
<del> $this->assertEquals('23 hours ago', $time->diffForHumans($other));
<add> $this->assertEquals('23 hours before', $time->diffForHumans($other));
<ide>
<ide> $other = new Time('2014-04-13 09:10:10');
<del> $this->assertEquals('1 week', $time->diffForHumans($other));
<add> $this->assertEquals('1 week after', $time->diffForHumans($other));
<ide> }
<ide>
<ide> /** | 4 |
Ruby | Ruby | prefer xcode 6.1 on 10.10 | 8205ebabcb3d090901e983b7946de1978df108f4 | <ide><path>Library/Homebrew/os/mac.rb
<ide> def preferred_arch
<ide> "5.1" => { :clang => "5.1", :clang_build => 503 },
<ide> "5.1.1" => { :clang => "5.1", :clang_build => 503 },
<ide> "6.0" => { :clang => "6.0", :clang_build => 600 },
<add> "6.1" => { :clang => "6.0", :clang_build => 600 },
<ide> }
<ide>
<ide> def compilers_standard?
<ide><path>Library/Homebrew/os/mac/xcode.rb
<ide> def latest_version
<ide> when "10.7" then "4.6.3"
<ide> when "10.8" then "5.1.1"
<ide> when "10.9" then "5.1.1"
<del> when "10.10" then "6.0"
<add> when "10.10" then "6.1"
<ide> else
<ide> # Default to newest known version of Xcode for unreleased OSX versions.
<ide> if MacOS.version > "10.10"
<del> "6.0"
<add> "6.1"
<ide> else
<ide> raise "Mac OS X '#{MacOS.version}' is invalid"
<ide> end | 2 |
Python | Python | improve tests for python integer special cases | d47b09c047c2128e820f4d4f27f10b3fd7d8b05b | <ide><path>numpy/core/tests/test_nep50_promotions.py
<ide> def test_nep50_examples():
<ide> assert res.dtype == np.float64
<ide>
<ide>
<del>def test_nep50_without_warnings():
<del> # Test that avoid the "warn" method, since that may lead to different
<del> # code paths in some cases.
<del> # Set promotion to weak (no warning), the auto-fixture will reset it.
<add>@pytest.mark.parametrize("dtype", np.typecodes["AllInteger"])
<add>def test_nep50_weak_integers(dtype):
<add> # Avoids warning (different code path for scalars)
<ide> np._set_promotion_state("weak")
<add> scalar_type = np.dtype(dtype).type
<add>
<add> maxint = int(np.iinfo(dtype).max)
<add>
<ide> with np.errstate(over="warn"):
<ide> with pytest.warns(RuntimeWarning):
<del> res = np.uint8(100) + 200
<del> assert res.dtype == np.uint8
<add> res = scalar_type(100) + maxint
<add> assert res.dtype == dtype
<ide>
<del> with pytest.warns(RuntimeWarning):
<del> res = np.float32(1) + 3e100
<del> assert res.dtype == np.float32
<add> # Array operations are not expected to warn, but should give the same
<add> # result dtype.
<add> res = np.array(100, dtype=dtype) + maxint
<add> assert res.dtype == dtype
<add>
<add>
<add>@pytest.mark.parametrize("dtype", np.typecodes["AllFloat"])
<add>def test_nep50_weak_integers_with_inexact(dtype):
<add> # Avoids warning (different code path for scalars)
<add> np._set_promotion_state("weak")
<add> scalar_type = np.dtype(dtype).type
<add>
<add> too_big_int = int(np.finfo(dtype).max) * 2
<add>
<add> if dtype in "dDG":
<add> # These dtypes currently convert to Python float internally, which
<add> # raises an OverflowError, while the other dtypes overflow to inf.
<add> # NOTE: It may make sense to normalize the behavior!
<add> with pytest.raises(OverflowError):
<add> scalar_type(1) + too_big_int
<add>
<add> with pytest.raises(OverflowError):
<add> np.array(1, dtype=dtype) + too_big_int
<add> else:
<add> # Otherwise, we overflow to infinity:
<add> with pytest.warns(RuntimeWarning):
<add> res = scalar_type(1) + too_big_int
<add> assert res.dtype == dtype
<add> assert res == np.inf
<add>
<add> with pytest.warns(RuntimeWarning):
<add> res = np.array(1, dtype=dtype) + too_big_int
<add> assert res.dtype == dtype
<add> assert res == np.inf
<ide>
<ide>
<ide> def test_nep50_integer_conversion_errors(): | 1 |
Ruby | Ruby | relocate homebrew_repository when necessary | 997ccb044d21fccb1a95ef4eefad0fa892289e02 | <ide><path>Library/Homebrew/dev-cmd/bottle.rb
<ide> def bottle_formula(f)
<ide> tar_path = Pathname.pwd/tar_filename
<ide>
<ide> prefix = HOMEBREW_PREFIX.to_s
<add> repository = HOMEBREW_REPOSITORY.to_s
<ide> cellar = HOMEBREW_CELLAR.to_s
<ide>
<ide> ohai "Bottling #{filename}..."
<ide> def bottle_formula(f)
<ide> keg.relocate_dynamic_linkage prefix, Keg::PREFIX_PLACEHOLDER,
<ide> cellar, Keg::CELLAR_PLACEHOLDER
<ide> keg.relocate_text_files prefix, Keg::PREFIX_PLACEHOLDER,
<del> cellar, Keg::CELLAR_PLACEHOLDER
<add> cellar, Keg::CELLAR_PLACEHOLDER,
<add> repository, Keg::REPOSITORY_PLACEHOLDER
<ide> end
<ide>
<ide> keg.delete_pyc_files!
<ide> def bottle_formula(f)
<ide> skip_relocation = true
<ide> else
<ide> relocatable = false if keg_contain?(prefix_check, keg, ignores)
<add> relocatable = false if keg_contain?(repository, keg, ignores)
<ide> relocatable = false if keg_contain?(cellar, keg, ignores)
<ide> if prefix != prefix_check
<ide> relocatable = false if keg_contain_absolute_symlink_starting_with?(prefix, keg)
<ide> def bottle_formula(f)
<ide> keg.relocate_dynamic_linkage Keg::PREFIX_PLACEHOLDER, prefix,
<ide> Keg::CELLAR_PLACEHOLDER, cellar
<ide> keg.relocate_text_files Keg::PREFIX_PLACEHOLDER, prefix,
<del> Keg::CELLAR_PLACEHOLDER, cellar
<add> Keg::CELLAR_PLACEHOLDER, cellar,
<add> Keg::REPOSITORY_PLACEHOLDER, repository
<ide> end
<ide> end
<ide> end
<ide><path>Library/Homebrew/formula_installer.rb
<ide> def pour
<ide> Keg::CELLAR_PLACEHOLDER, HOMEBREW_CELLAR.to_s
<ide> end
<ide> keg.relocate_text_files Keg::PREFIX_PLACEHOLDER, HOMEBREW_PREFIX.to_s,
<del> Keg::CELLAR_PLACEHOLDER, HOMEBREW_CELLAR.to_s
<add> Keg::CELLAR_PLACEHOLDER, HOMEBREW_CELLAR.to_s,
<add> Keg::REPOSITORY_PLACEHOLDER, HOMEBREW_REPOSITORY.to_s
<ide>
<ide> Pathname.glob("#{formula.bottle_prefix}/{etc,var}/**/*") do |path|
<ide> path.extend(InstallRenamed)
<ide><path>Library/Homebrew/keg_relocate.rb
<ide> class Keg
<ide> PREFIX_PLACEHOLDER = "@@HOMEBREW_PREFIX@@".freeze
<ide> CELLAR_PLACEHOLDER = "@@HOMEBREW_CELLAR@@".freeze
<add> REPOSITORY_PLACEHOLDER = "@@HOMEBREW_REPOSITORY@@".freeze
<ide>
<ide> def fix_dynamic_linkage
<ide> symlink_files.each do |file|
<ide> def relocate_dynamic_linkage(_old_prefix, _new_prefix, _old_cellar, _new_cellar)
<ide> []
<ide> end
<ide>
<del> def relocate_text_files(old_prefix, new_prefix, old_cellar, new_cellar)
<add> def relocate_text_files(old_prefix, new_prefix, old_cellar, new_cellar,
<add> old_repository, new_repository)
<ide> files = text_files | libtool_files
<ide>
<ide> files.group_by { |f| f.stat.ino }.each_value do |first, *rest|
<ide> s = first.open("rb", &:read)
<ide> changed = s.gsub!(old_cellar, new_cellar)
<ide> changed = s.gsub!(old_prefix, new_prefix) || changed
<add> changed = s.gsub!(old_repository, new_repository) || changed
<ide>
<ide> next unless changed
<ide> | 3 |
Java | Java | introduce lookuppath in webflux request routing | cf1bc8119999e2bf3f41ae572f5b83536685ce5e | <ide><path>spring-web/src/main/java/org/springframework/web/cors/reactive/UrlBasedCorsConfigurationSource.java
<ide> import org.springframework.util.PathMatcher;
<ide> import org.springframework.web.cors.CorsConfiguration;
<ide> import org.springframework.web.server.ServerWebExchange;
<del>import org.springframework.web.server.support.HttpRequestPathHelper;
<ide> import org.springframework.web.util.pattern.ParsingPathMatcher;
<add>import org.springframework.web.server.support.LookupPath;
<ide>
<ide> /**
<ide> * Provide a per reactive request {@link CorsConfiguration} instance based on a
<ide> public class UrlBasedCorsConfigurationSource implements CorsConfigurationSource
<ide>
<ide> private PathMatcher pathMatcher = new ParsingPathMatcher();
<ide>
<del> private HttpRequestPathHelper pathHelper = new HttpRequestPathHelper();
<del>
<ide>
<ide> /**
<ide> * Set the PathMatcher implementation to use for matching URL paths
<ide> public void setPathMatcher(PathMatcher pathMatcher) {
<ide> this.pathMatcher = pathMatcher;
<ide> }
<ide>
<del> /**
<del> * Set if context path and request URI should be URL-decoded. Both are returned
<del> * <i>undecoded</i> by the Servlet API, in contrast to the servlet path.
<del> * <p>Uses either the request encoding or the default encoding according
<del> * to the Servlet spec (ISO-8859-1).
<del> * @see HttpRequestPathHelper#setUrlDecode
<del> */
<del> public void setUrlDecode(boolean urlDecode) {
<del> this.pathHelper.setUrlDecode(urlDecode);
<del> }
<del>
<del> /**
<del> * Set the UrlPathHelper to use for resolution of lookup paths.
<del> * <p>Use this to override the default UrlPathHelper with a custom subclass.
<del> */
<del> public void setHttpRequestPathHelper(HttpRequestPathHelper pathHelper) {
<del> Assert.notNull(pathHelper, "HttpRequestPathHelper must not be null");
<del> this.pathHelper = pathHelper;
<del> }
<del>
<ide> /**
<ide> * Set CORS configuration based on URL patterns.
<ide> */
<ide> public void registerCorsConfiguration(String path, CorsConfiguration config) {
<ide>
<ide> @Override
<ide> public CorsConfiguration getCorsConfiguration(ServerWebExchange exchange) {
<del> String lookupPath = this.pathHelper.getLookupPathForRequest(exchange);
<add>		String lookupPath = exchange.<LookupPath>getAttribute(LookupPath.LOOKUP_PATH_ATTRIBUTE)
<add>				.get().getPath();
<ide> for (Map.Entry<String, CorsConfiguration> entry : this.corsConfigurations.entrySet()) {
<ide> if (this.pathMatcher.match(entry.getKey(), lookupPath)) {
<ide> return entry.getValue();
<ide><path>spring-web/src/main/java/org/springframework/web/server/support/HttpRequestPathHelper.java
<ide> public boolean shouldUrlDecode() {
<ide> }
<ide>
<ide>
<del> public String getLookupPathForRequest(ServerWebExchange exchange) {
<add> public LookupPath getLookupPathForRequest(ServerWebExchange exchange) {
<ide> String path = getPathWithinApplication(exchange.getRequest());
<del> return (shouldUrlDecode() ? decode(exchange, path) : path);
<add> path = (shouldUrlDecode() ? decode(exchange, path) : path);
<add> int begin = path.lastIndexOf('/') + 1;
<add> int end = path.length();
<add> int paramIndex = path.indexOf(';', begin);
<add>		int extIndex = path.lastIndexOf('.', paramIndex != -1 ? paramIndex : end);
<add>		if (extIndex < begin) {
<add>			// ignore dots in earlier path segments, e.g. "/foo.d/bar"
<add>			extIndex = -1;
<add>		}
<add> return new LookupPath(path, extIndex, paramIndex);
<ide> }
<ide>
<ide> private String getPathWithinApplication(ServerHttpRequest request) {
<ide><path>spring-web/src/main/java/org/springframework/web/server/support/LookupPath.java
<add>/*
<add> * Copyright 2002-2017 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>package org.springframework.web.server.support;
<add>
<add>import org.springframework.lang.Nullable;
<add>import org.springframework.web.server.ServerWebExchange;
<add>
<add>/**
<add> * Lookup path information of an incoming HTTP request.
<add> *
<add> * @author Brian Clozel
<add> * @since 5.0
<add> * @see HttpRequestPathHelper
<add> */
<add>public final class LookupPath {
<add>
<add> public static final String LOOKUP_PATH_ATTRIBUTE = LookupPath.class.getName();
<add>
<add> private final String path;
<add>
<add> private final int fileExtensionIndex;
<add>
<add> private final int pathParametersIndex;
<add>
<add> public LookupPath(String path, int fileExtensionIndex, int pathParametersIndex) {
<add> this.path = path;
<add> this.fileExtensionIndex = fileExtensionIndex;
<add> this.pathParametersIndex = pathParametersIndex;
<add> }
<add>
<add>	public String getPath() {
<add>		// TODO: variant without the path parameter information?
<add>		//return this.path.substring(0, this.pathParametersIndex);
<add>		return this.path;
<add>	}
<add>
<add> public String getPathWithoutExtension() {
<add> if (this.fileExtensionIndex != -1) {
<add> return this.path.substring(0, this.fileExtensionIndex);
<add> }
<add> else {
<add> return this.path;
<add> }
<add> }
<add>
<add> @Nullable
<add> public String getFileExtension() {
<add> if (this.fileExtensionIndex == -1) {
<add> return null;
<add> }
<add> else if (this.pathParametersIndex == -1) {
<add> return this.path.substring(this.fileExtensionIndex);
<add> }
<add> else {
<add> return this.path.substring(this.fileExtensionIndex, this.pathParametersIndex);
<add> }
<add> }
<add>
<add> @Nullable
<add> public String getPathParameters() {
<add> return this.pathParametersIndex == -1 ?
<add> null : this.path.substring(this.pathParametersIndex + 1);
<add> }
<add>
<add>}
<ide><path>spring-web/src/test/java/org/springframework/web/cors/reactive/UrlBasedCorsConfigurationSourceTests.java
<ide> import org.springframework.mock.http.server.reactive.test.MockServerHttpRequest;
<ide> import org.springframework.web.cors.CorsConfiguration;
<ide> import org.springframework.web.server.ServerWebExchange;
<add>import org.springframework.web.server.support.HttpRequestPathHelper;
<add>import org.springframework.web.server.support.LookupPath;
<ide>
<ide> import static org.junit.Assert.assertEquals;
<ide> import static org.junit.Assert.assertNull;
<ide> public class UrlBasedCorsConfigurationSourceTests {
<ide> @Test
<ide> public void empty() {
<ide> ServerWebExchange exchange = MockServerHttpRequest.get("/bar/test.html").toExchange();
<add> setLookupPathAttribute(exchange);
<ide> assertNull(this.configSource.getCorsConfiguration(exchange));
<ide> }
<ide>
<ide> public void registerAndMatch() {
<ide> this.configSource.registerCorsConfiguration("/bar/**", config);
<ide>
<ide> ServerWebExchange exchange = MockServerHttpRequest.get("/foo/test.html").toExchange();
<add> setLookupPathAttribute(exchange);
<ide> assertNull(this.configSource.getCorsConfiguration(exchange));
<ide>
<ide> exchange = MockServerHttpRequest.get("/bar/test.html").toExchange();
<add> setLookupPathAttribute(exchange);
<ide> assertEquals(config, this.configSource.getCorsConfiguration(exchange));
<ide> }
<ide>
<ide> public void unmodifiableConfigurationsMap() {
<ide> this.configSource.getCorsConfigurations().put("/**", new CorsConfiguration());
<ide> }
<ide>
<add>	private void setLookupPathAttribute(ServerWebExchange exchange) {
<add> HttpRequestPathHelper helper = new HttpRequestPathHelper();
<add> exchange.getAttributes().put(LookupPath.LOOKUP_PATH_ATTRIBUTE,
<add> helper.getLookupPathForRequest(exchange));
<add> }
<add>
<ide> }
<ide><path>spring-web/src/test/java/org/springframework/web/server/support/LookupPathTests.java
<add>/*
<add> * Copyright 2002-2017 the original author or authors.
<add> *
<add> * Licensed under the Apache License, Version 2.0 (the "License");
<add> * you may not use this file except in compliance with the License.
<add> * You may obtain a copy of the License at
<add> *
<add> * http://www.apache.org/licenses/LICENSE-2.0
<add> *
<add> * Unless required by applicable law or agreed to in writing, software
<add> * distributed under the License is distributed on an "AS IS" BASIS,
<add> * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
<add> * See the License for the specific language governing permissions and
<add> * limitations under the License.
<add> */
<add>
<add>package org.springframework.web.server.support;
<add>
<add>import org.junit.Test;
<add>
<add>import org.springframework.mock.http.server.reactive.test.MockServerHttpRequest;
<add>import org.springframework.web.server.ServerWebExchange;
<add>
<add>import static org.junit.Assert.assertEquals;
<add>
<add>/**
<add> * Unit tests for {@link LookupPath}.
<add> * @author Brian Clozel
<add> */
<add>public class LookupPathTests {
<add>
<add> @Test
<add> public void parsePath() {
<add> LookupPath path = create("/foo");
<add> assertEquals("/foo", path.getPath());
<add> assertEquals("/foo", path.getPathWithoutExtension());
<add> }
<add>
<add> @Test
<add> public void parsePathWithExtension() {
<add> LookupPath path = create("/foo.txt");
<add> assertEquals("/foo.txt", path.getPath());
<add> assertEquals("/foo", path.getPathWithoutExtension());
<add> assertEquals(".txt", path.getFileExtension());
<add> }
<add>
<add> @Test
<add> public void parsePathWithParams() {
<add> LookupPath path = create("/test/foo.txt;foo=bar?framework=spring");
<add> assertEquals("/test/foo.txt;foo=bar", path.getPath());
<add> assertEquals("/test/foo", path.getPathWithoutExtension());
<add> assertEquals(".txt", path.getFileExtension());
<add> assertEquals("foo=bar", path.getPathParameters());
<add> }
<add>
<add> private LookupPath create(String path) {
<add> HttpRequestPathHelper helper = new HttpRequestPathHelper();
<add> ServerWebExchange exchange = MockServerHttpRequest.get(path).build().toExchange();
<add> return helper.getLookupPathForRequest(exchange);
<add> }
<add>}
<ide><path>spring-webflux/src/main/java/org/springframework/web/reactive/handler/AbstractHandlerMapping.java
<ide> package org.springframework.web.reactive.handler;
<ide>
<ide> import java.util.Map;
<add>import java.util.Optional;
<ide>
<ide> import reactor.core.publisher.Mono;
<ide>
<ide> import org.springframework.web.cors.reactive.DefaultCorsProcessor;
<ide> import org.springframework.web.cors.reactive.UrlBasedCorsConfigurationSource;
<ide> import org.springframework.web.reactive.HandlerMapping;
<add>import org.springframework.web.server.support.LookupPath;
<ide> import org.springframework.web.server.ServerWebExchange;
<ide> import org.springframework.web.server.WebHandler;
<ide> import org.springframework.web.server.support.HttpRequestPathHelper;
<ide> *
<ide> * @author Rossen Stoyanchev
<ide> * @author Juergen Hoeller
<add> * @author Brian Clozel
<ide> * @since 5.0
<ide> */
<ide> public abstract class AbstractHandlerMapping extends ApplicationObjectSupport implements HandlerMapping, Ordered {
<ide> public Mono<Object> getHandler(ServerWebExchange exchange) {
<ide> });
<ide> }
<ide>
<add> protected LookupPath getLookupPath(ServerWebExchange exchange) {
<add> Optional<LookupPath> attribute = exchange.getAttribute(LookupPath.LOOKUP_PATH_ATTRIBUTE);
<add> return attribute.orElseGet(() -> {
<add> LookupPath lookupPath = createLookupPath(exchange);
<add> exchange.getAttributes().put(LookupPath.LOOKUP_PATH_ATTRIBUTE, lookupPath);
<add> return lookupPath;
<add> });
<add> }
<add>
<add> protected LookupPath createLookupPath(ServerWebExchange exchange) {
<add> return getPathHelper().getLookupPathForRequest(exchange);
<add> }
<add>
<ide> /**
<ide> * Look up a handler for the given request, returning an empty {@code Mono}
<ide> * if no specific one is found. This method is called by {@link #getHandler}.
<ide><path>spring-webflux/src/main/java/org/springframework/web/reactive/handler/AbstractUrlHandlerMapping.java
<ide> import org.springframework.beans.BeansException;
<ide> import org.springframework.lang.Nullable;
<ide> import org.springframework.util.Assert;
<add>import org.springframework.web.server.support.LookupPath;
<ide> import org.springframework.web.server.ServerWebExchange;
<ide>
<ide> /**
<ide> *
<ide> * @author Rossen Stoyanchev
<ide> * @author Juergen Hoeller
<add> * @author Brian Clozel
<ide> * @since 5.0
<ide> */
<ide> public abstract class AbstractUrlHandlerMapping extends AbstractHandlerMapping {
<ide> public final Map<String, Object> getHandlerMap() {
<ide>
<ide> @Override
<ide> public Mono<Object> getHandlerInternal(ServerWebExchange exchange) {
<del> String lookupPath = getPathHelper().getLookupPathForRequest(exchange);
<add> LookupPath lookupPath = getLookupPath(exchange);
<ide> Object handler;
<ide> try {
<ide> handler = lookupHandler(lookupPath, exchange);
<ide> public Mono<Object> getHandlerInternal(ServerWebExchange exchange) {
<ide> }
<ide>
<ide> if (handler != null && logger.isDebugEnabled()) {
<del> logger.debug("Mapping [" + lookupPath + "] to " + handler);
<add> logger.debug("Mapping [" + lookupPath.getPath() + "] to " + handler);
<ide> }
<ide> else if (handler == null && logger.isTraceEnabled()) {
<del> logger.trace("No handler mapping found for [" + lookupPath + "]");
<add> logger.trace("No handler mapping found for [" + lookupPath.getPath() + "]");
<ide> }
<ide>
<ide> return Mono.justOrEmpty(handler);
<ide> }
<ide>
<ide> /**
<del> * Look up a handler instance for the given URL path.
<add> * Look up a handler instance for the given URL lookup path.
<add> *
<ide> * <p>Supports direct matches, e.g. a registered "/test" matches "/test",
<del> * and various Ant-style pattern matches, e.g. a registered "/t*" matches
<del> * both "/test" and "/team". For details, see the AntPathMatcher class.
<del> * <p>Looks for the most exact pattern, where most exact is defined as
<del> * the longest path pattern.
<del> * @param urlPath URL the bean is mapped to
<add> * and various path pattern matches, e.g. a registered "/t*" matches
<add> * both "/test" and "/team". For details, see the PathPattern class.
<add> *
<add>	 * @param lookupPath the lookup path of the current request
<ide> * @param exchange the current exchange
<ide> * @return the associated handler instance, or {@code null} if not found
<ide> * @see org.springframework.web.util.pattern.ParsingPathMatcher
<ide> */
<ide> @Nullable
<del> protected Object lookupHandler(String urlPath, ServerWebExchange exchange) throws Exception {
<add> protected Object lookupHandler(LookupPath lookupPath, ServerWebExchange exchange) throws Exception {
<ide> // Direct match?
<add> String urlPath = lookupPath.getPath();
<ide> Object handler = this.handlerMap.get(urlPath);
<ide> if (handler != null) {
<ide> return handleMatch(handler, urlPath, urlPath, exchange);
<ide> else if (useTrailingSlashMatch()) {
<ide> if (!matches.isEmpty()) {
<ide> Collections.sort(matches, comparator);
<ide> if (logger.isDebugEnabled()) {
<del> logger.debug("Matching patterns for request [" + urlPath + "] are " + matches);
<add>				logger.debug("Matching patterns for request [" + lookupPath.getPath() + "] are " + matches);
<ide> }
<ide> bestMatch = matches.get(0);
<ide> }
<ide><path>spring-webflux/src/main/java/org/springframework/web/reactive/resource/ResourceUrlProvider.java
<ide> import org.springframework.web.reactive.handler.SimpleUrlHandlerMapping;
<ide> import org.springframework.web.server.ServerWebExchange;
<ide> import org.springframework.web.server.support.HttpRequestPathHelper;
<add>import org.springframework.web.server.support.LookupPath;
<ide> import org.springframework.web.util.pattern.ParsingPathMatcher;
<ide>
<ide> /**
<ide> public final Mono<String> getForRequestUrl(ServerWebExchange exchange, String re
<ide> private int getLookupPathIndex(ServerWebExchange exchange) {
<ide> ServerHttpRequest request = exchange.getRequest();
<ide> String requestPath = request.getURI().getPath();
<del> String lookupPath = getPathHelper().getLookupPathForRequest(exchange);
<del> return requestPath.indexOf(lookupPath);
<add> LookupPath lookupPath = getPathHelper().getLookupPathForRequest(exchange);
<add> return requestPath.indexOf(lookupPath.getPath());
<ide> }
<ide>
<ide> private int getEndPathIndex(String lookupPath) {
<ide><path>spring-webflux/src/main/java/org/springframework/web/reactive/result/condition/PatternsRequestCondition.java
<ide>
<ide> import org.springframework.util.PathMatcher;
<ide> import org.springframework.util.StringUtils;
<add>import org.springframework.web.server.support.LookupPath;
<ide> import org.springframework.web.server.ServerWebExchange;
<del>import org.springframework.web.server.support.HttpRequestPathHelper;
<ide> import org.springframework.web.util.pattern.ParsingPathMatcher;
<ide>
<ide> /**
<ide> public final class PatternsRequestCondition extends AbstractRequestCondition<Pat
<ide>
<ide> private final Set<String> patterns;
<ide>
<del> private final HttpRequestPathHelper pathHelper;
<del>
<ide> private final PathMatcher pathMatcher;
<ide>
<ide> private final boolean useSuffixPatternMatch;
<ide> public final class PatternsRequestCondition extends AbstractRequestCondition<Pat
<ide> * @param patterns 0 or more URL patterns; if 0 the condition will match to every request.
<ide> */
<ide> public PatternsRequestCondition(String... patterns) {
<del> this(asList(patterns), null, null, true, true, null);
<add> this(asList(patterns), null, true, true, null);
<ide> }
<ide>
<ide> /**
<ide> * Creates a new instance with the given URL patterns.
<ide> * Each pattern that is not empty and does not start with "/" is pre-pended with "/".
<ide> * @param patterns the URL patterns to use; if 0, the condition will match to every request.
<del> * @param pathHelper to determine the lookup path for a request
<ide> * @param pathMatcher for pattern path matching
<ide> * @param useSuffixPatternMatch whether to enable matching by suffix (".*")
<ide> * @param useTrailingSlashMatch whether to match irrespective of a trailing slash
<ide> * @param extensions file extensions to consider for path matching
<ide> */
<del> public PatternsRequestCondition(String[] patterns, HttpRequestPathHelper pathHelper,
<del> PathMatcher pathMatcher, boolean useSuffixPatternMatch, boolean useTrailingSlashMatch,
<add> public PatternsRequestCondition(String[] patterns, PathMatcher pathMatcher,
<add> boolean useSuffixPatternMatch, boolean useTrailingSlashMatch,
<ide> Set<String> extensions) {
<ide>
<del> this(asList(patterns), pathHelper, pathMatcher, useSuffixPatternMatch, useTrailingSlashMatch, extensions);
<add> this(asList(patterns), pathMatcher, useSuffixPatternMatch, useTrailingSlashMatch, extensions);
<ide> }
<ide>
<ide> /**
<ide> * Private constructor accepting a collection of patterns.
<ide> */
<del> private PatternsRequestCondition(Collection<String> patterns, HttpRequestPathHelper pathHelper,
<del> PathMatcher pathMatcher, boolean useSuffixPatternMatch, boolean useTrailingSlashMatch,
<add> private PatternsRequestCondition(Collection<String> patterns, PathMatcher pathMatcher,
<add> boolean useSuffixPatternMatch, boolean useTrailingSlashMatch,
<ide> Set<String> fileExtensions) {
<ide>
<ide> this.patterns = Collections.unmodifiableSet(prependLeadingSlash(patterns));
<del> this.pathHelper = (pathHelper != null ? pathHelper : new HttpRequestPathHelper());
<ide> this.pathMatcher = (pathMatcher != null ? pathMatcher : new ParsingPathMatcher());
<ide> this.useSuffixPatternMatch = useSuffixPatternMatch;
<ide> this.useTrailingSlashMatch = useTrailingSlashMatch;
<ide> else if (!other.patterns.isEmpty()) {
<ide> else {
<ide> result.add("");
<ide> }
<del> return new PatternsRequestCondition(result, this.pathHelper, this.pathMatcher, this.useSuffixPatternMatch,
<add> return new PatternsRequestCondition(result, this.pathMatcher, this.useSuffixPatternMatch,
<ide> this.useTrailingSlashMatch, this.fileExtensions);
<ide> }
<ide>
<ide> public PatternsRequestCondition getMatchingCondition(ServerWebExchange exchange)
<ide> return this;
<ide> }
<ide>
<del> String lookupPath = this.pathHelper.getLookupPathForRequest(exchange);
<add> LookupPath lookupPath = exchange
<add> .<LookupPath>getAttribute(LookupPath.LOOKUP_PATH_ATTRIBUTE).get();
<ide> List<String> matches = getMatchingPatterns(lookupPath);
<ide>
<ide> return matches.isEmpty() ? null :
<del> new PatternsRequestCondition(matches, this.pathHelper, this.pathMatcher, this.useSuffixPatternMatch,
<del> this.useTrailingSlashMatch, this.fileExtensions);
<add> new PatternsRequestCondition(matches, this.pathMatcher, this.useSuffixPatternMatch,
<add> this.useTrailingSlashMatch, this.fileExtensions);
<ide> }
<ide>
<ide> /**
<ide> public PatternsRequestCondition getMatchingCondition(ServerWebExchange exchange)
<ide> * @param lookupPath the lookup path to match to existing patterns
<ide> * @return a collection of matching patterns sorted with the closest match at the top
<ide> */
<del> public List<String> getMatchingPatterns(String lookupPath) {
<add> public List<String> getMatchingPatterns(LookupPath lookupPath) {
<ide> List<String> matches = new ArrayList<>();
<ide> for (String pattern : this.patterns) {
<ide> String match = getMatchingPattern(pattern, lookupPath);
<ide> if (match != null) {
<ide> matches.add(match);
<ide> }
<ide> }
<del> Collections.sort(matches, this.pathMatcher.getPatternComparator(lookupPath));
<add> Collections.sort(matches, this.pathMatcher.getPatternComparator(lookupPath.getPath()));
<ide> return matches;
<ide> }
<ide>
<del> private String getMatchingPattern(String pattern, String lookupPath) {
<del> if (pattern.equals(lookupPath)) {
<add> private String getMatchingPattern(String pattern, LookupPath lookupPath) {
<add> if (pattern.equals(lookupPath.getPath())) {
<ide> return pattern;
<ide> }
<ide> if (this.useSuffixPatternMatch) {
<del> if (!this.fileExtensions.isEmpty() && lookupPath.indexOf('.') != -1) {
<del> for (String extension : this.fileExtensions) {
<del> if (this.pathMatcher.match(pattern + extension, lookupPath)) {
<del> return pattern + extension;
<del> }
<add> if (!this.fileExtensions.isEmpty() && lookupPath.getFileExtension() != null) {
<add> if (this.fileExtensions.contains(lookupPath.getFileExtension()) &&
<add> this.pathMatcher.match(pattern, lookupPath.getPathWithoutExtension())) {
<add> return pattern + lookupPath.getFileExtension();
<ide> }
<ide> }
<ide> else {
<del> boolean hasSuffix = pattern.indexOf('.') != -1;
<del> if (!hasSuffix && this.pathMatcher.match(pattern + ".*", lookupPath)) {
<add>				if (lookupPath.getFileExtension() != null &&
<add>						this.pathMatcher.match(pattern, lookupPath.getPathWithoutExtension())) {
<ide> return pattern + ".*";
<ide> }
<ide> }
<ide> }
<del> if (this.pathMatcher.match(pattern, lookupPath)) {
<add> if (this.pathMatcher.match(pattern, lookupPath.getPath())) {
<ide> return pattern;
<ide> }
<ide> if (this.useTrailingSlashMatch) {
<del> if (!pattern.endsWith("/") && this.pathMatcher.match(pattern + "/", lookupPath)) {
<add> if (!pattern.endsWith("/") && this.pathMatcher.match(pattern + "/", lookupPath.getPath())) {
<ide> return pattern +"/";
<ide> }
<ide> }
<ide> private String getMatchingPattern(String pattern, String lookupPath) {
<ide> */
<ide> @Override
<ide> public int compareTo(PatternsRequestCondition other, ServerWebExchange exchange) {
<del> String lookupPath = this.pathHelper.getLookupPathForRequest(exchange);
<del> Comparator<String> patternComparator = this.pathMatcher.getPatternComparator(lookupPath);
<add> LookupPath lookupPath = exchange
<add> .<LookupPath>getAttribute(LookupPath.LOOKUP_PATH_ATTRIBUTE).get();
<add> Comparator<String> patternComparator = this.pathMatcher.getPatternComparator(lookupPath.getPath());
<ide> Iterator<String> iterator = this.patterns.iterator();
<ide> Iterator<String> iteratorOther = other.patterns.iterator();
<ide> while (iterator.hasNext() && iteratorOther.hasNext()) {
<ide><path>spring-webflux/src/main/java/org/springframework/web/reactive/result/method/AbstractHandlerMethodMapping.java
<ide> import org.springframework.web.reactive.HandlerMapping;
<ide> import org.springframework.web.reactive.handler.AbstractHandlerMapping;
<ide> import org.springframework.web.server.ServerWebExchange;
<add>import org.springframework.web.server.support.LookupPath;
<ide>
<ide> /**
<ide> * Abstract base class for {@link HandlerMapping} implementations that define
<ide> * subclasses defining the details of the mapping type {@code <T>}.
<ide> *
<ide> * @author Rossen Stoyanchev
<add> * @author Brian Clozel
<ide> * @since 5.0
<ide> * @param <T> The mapping for a {@link HandlerMethod} containing the conditions
<ide> * needed to match the handler method to incoming request.
<ide> protected void handlerMethodsInitialized(Map<T, HandlerMethod> handlerMethods) {
<ide> */
<ide> @Override
<ide> public Mono<HandlerMethod> getHandlerInternal(ServerWebExchange exchange) {
<del> String lookupPath = getPathHelper().getLookupPathForRequest(exchange);
<add> LookupPath lookupPath = getLookupPath(exchange);
<ide> if (logger.isDebugEnabled()) {
<del>			logger.debug("Looking up handler method for path " + lookupPath);
<add>			logger.debug("Looking up handler method for path " + lookupPath.getPath());
<ide> }
<ide> public Mono<HandlerMethod> getHandlerInternal(ServerWebExchange exchange) {
<ide> /**
<ide> * Look up the best-matching handler method for the current request.
<ide> * If multiple matches are found, the best match is selected.
<del> * @param lookupPath mapping lookup path within the current servlet mapping
<add> * @param lookupPath the lookup path within the current mapping
<ide> * @param exchange the current exchange
<ide> * @return the best-matching handler method, or {@code null} if no match
<del> * @see #handleMatch(Object, String, ServerWebExchange)
<del> * @see #handleNoMatch(Set, String, ServerWebExchange)
<add> * @see #handleMatch(Object, LookupPath, ServerWebExchange)
<add> * @see #handleNoMatch(Set, LookupPath, ServerWebExchange)
<ide> */
<ide> @Nullable
<del> protected HandlerMethod lookupHandlerMethod(String lookupPath, ServerWebExchange exchange)
<add> protected HandlerMethod lookupHandlerMethod(LookupPath lookupPath, ServerWebExchange exchange)
<ide> throws Exception {
<ide>
<ide> List<Match> matches = new ArrayList<Match>();
<del> List<T> directPathMatches = this.mappingRegistry.getMappingsByUrl(lookupPath);
<add> List<T> directPathMatches = this.mappingRegistry.getMappingsByUrl(lookupPath.getPath());
<ide> if (directPathMatches != null) {
<ide> addMatchingMappings(directPathMatches, matches, exchange);
<ide> }
<ide> private void addMatchingMappings(Collection<T> mappings, List<Match> matches, Se
<ide> /**
<ide> * Invoked when a matching mapping is found.
<ide> * @param mapping the matching mapping
<del> * @param lookupPath mapping lookup path within the current servlet mapping
<add> * @param lookupPath the lookup path within the current mapping
<ide> * @param exchange the current exchange
<ide> */
<del> protected void handleMatch(T mapping, String lookupPath, ServerWebExchange exchange) {
<add> protected void handleMatch(T mapping, LookupPath lookupPath, ServerWebExchange exchange) {
<ide> }
<ide>
<ide> /**
<del>	 * Invoked when no matching mapping is not found.
<add>	 * Invoked when no matching mapping is found.
<ide> * @param mappings all registered mappings
<del> * @param lookupPath mapping lookup path within the current servlet mapping
<add> * @param lookupPath the lookup path within the current mapping
<ide> * @param exchange the current exchange
<ide> * @return an alternative HandlerMethod or {@code null}
<ide> * @throws Exception provides details that can be translated into an error status code
<ide> */
<ide> @Nullable
<del> protected HandlerMethod handleNoMatch(Set<T> mappings, String lookupPath, ServerWebExchange exchange)
<add> protected HandlerMethod handleNoMatch(Set<T> mappings, LookupPath lookupPath, ServerWebExchange exchange)
<ide> throws Exception {
<ide>
<ide> return null;
<ide><path>spring-webflux/src/main/java/org/springframework/web/reactive/result/method/RequestMappingInfo.java
<ide> import org.springframework.web.reactive.result.condition.RequestConditionHolder;
<ide> import org.springframework.web.reactive.result.condition.RequestMethodsRequestCondition;
<ide> import org.springframework.web.server.ServerWebExchange;
<del>import org.springframework.web.server.support.HttpRequestPathHelper;
<ide>
<ide> /**
<ide> * Encapsulates the following request mapping conditions:
<ide> public RequestMappingInfo build() {
<ide> RequestedContentTypeResolver contentTypeResolver = this.options.getContentTypeResolver();
<ide>
<ide> PatternsRequestCondition patternsCondition = new PatternsRequestCondition(
<del> this.paths, this.options.getPathHelper(), this.options.getPathMatcher(),
<del> this.options.useSuffixPatternMatch(), this.options.useTrailingSlashMatch(),
<del> this.options.getFileExtensions());
<add> this.paths, this.options.getPathMatcher(), this.options.useSuffixPatternMatch(),
<add> this.options.useTrailingSlashMatch(), this.options.getFileExtensions());
<ide>
<ide> return new RequestMappingInfo(this.mappingName, patternsCondition,
<ide> new RequestMethodsRequestCondition(methods),
<ide> public RequestMappingInfo build() {
<ide> */
<ide> public static class BuilderConfiguration {
<ide>
<del> private HttpRequestPathHelper pathHelper;
<del>
<ide> private PathMatcher pathMatcher;
<ide>
<ide> private boolean trailingSlashMatch = true;
<ide> public static class BuilderConfiguration {
<ide>
<ide> private RequestedContentTypeResolver contentTypeResolver;
<ide>
<del> /**
<del> * Set a custom UrlPathHelper to use for the PatternsRequestCondition.
<del> * <p>By default this is not set.
<del> */
<del> public void setPathHelper(HttpRequestPathHelper pathHelper) {
<del> this.pathHelper = pathHelper;
<del> }
<del>
<del> public HttpRequestPathHelper getPathHelper() {
<del> return this.pathHelper;
<del> }
<del>
<ide> /**
<ide> * Set a custom PathMatcher to use for the PatternsRequestCondition.
<ide> * <p>By default this is not set.
<ide><path>spring-webflux/src/main/java/org/springframework/web/reactive/result/method/RequestMappingInfoHandlerMapping.java
<ide> import java.util.Map;
<ide> import java.util.Map.Entry;
<ide> import java.util.Set;
<del>import java.util.StringTokenizer;
<ide> import java.util.stream.Collectors;
<ide>
<ide> import org.springframework.http.HttpHeaders;
<ide> import org.springframework.http.HttpMethod;
<ide> import org.springframework.http.InvalidMediaTypeException;
<ide> import org.springframework.http.MediaType;
<ide> import org.springframework.http.server.reactive.ServerHttpRequest;
<del>import org.springframework.util.LinkedMultiValueMap;
<ide> import org.springframework.util.MultiValueMap;
<del>import org.springframework.util.StringUtils;
<ide> import org.springframework.web.method.HandlerMethod;
<ide> import org.springframework.web.reactive.HandlerMapping;
<ide> import org.springframework.web.reactive.result.condition.NameValueExpression;
<ide> import org.springframework.web.server.ServerWebExchange;
<ide> import org.springframework.web.server.ServerWebInputException;
<ide> import org.springframework.web.server.UnsupportedMediaTypeStatusException;
<add>import org.springframework.web.server.support.LookupPath;
<add>import org.springframework.web.util.WebUtils;
<ide>
<ide> /**
<ide> * Abstract base class for classes for which {@link RequestMappingInfo} defines
<ide> protected Comparator<RequestMappingInfo> getMappingComparator(final ServerWebExc
<ide> * @see HandlerMapping#PRODUCIBLE_MEDIA_TYPES_ATTRIBUTE
<ide> */
<ide> @Override
<del> protected void handleMatch(RequestMappingInfo info, String lookupPath, ServerWebExchange exchange) {
<add> protected void handleMatch(RequestMappingInfo info, LookupPath lookupPath, ServerWebExchange exchange) {
<ide> super.handleMatch(info, lookupPath, exchange);
<ide>
<ide> String bestPattern;
<ide> protected void handleMatch(RequestMappingInfo info, String lookupPath, ServerWeb
<ide>
<ide> Set<String> patterns = info.getPatternsCondition().getPatterns();
<ide> if (patterns.isEmpty()) {
<del> bestPattern = lookupPath;
<add> bestPattern = lookupPath.getPath();
<ide> uriVariables = Collections.emptyMap();
<ide> decodedUriVariables = Collections.emptyMap();
<ide> }
<ide> else {
<ide> bestPattern = patterns.iterator().next();
<del> uriVariables = getPathMatcher().extractUriTemplateVariables(bestPattern, lookupPath);
<add> uriVariables = getPathMatcher().extractUriTemplateVariables(bestPattern, lookupPath.getPath());
<ide> decodedUriVariables = getPathHelper().decodePathVariables(exchange, uriVariables);
<ide> }
<ide>
<ide> private Map<String, MultiValueMap<String, String>> extractMatrixVariables(
<ide> uriVariables.put(uriVar.getKey(), uriVarValue.substring(0, semicolonIndex));
<ide> }
<ide>
<del> MultiValueMap<String, String> vars = parseMatrixVariables(matrixVariables);
<add> MultiValueMap<String, String> vars = WebUtils.parseMatrixVariables(matrixVariables);
<ide> result.put(uriVar.getKey(), getPathHelper().decodeMatrixVariables(exchange, vars));
<ide> }
<ide> return result;
<ide> }
<ide>
<del> /**
<del> * Parse the given string with matrix variables. An example string would look
<del> * like this {@code "q1=a;q1=b;q2=a,b,c"}. The resulting map would contain
<del> * keys {@code "q1"} and {@code "q2"} with values {@code ["a","b"]} and
<del> * {@code ["a","b","c"]} respectively.
<del> * @param matrixVariables the unparsed matrix variables string
<del> * @return a map with matrix variable names and values (never {@code null})
<del> */
<del> private static MultiValueMap<String, String> parseMatrixVariables(String matrixVariables) {
<del> MultiValueMap<String, String> result = new LinkedMultiValueMap<>();
<del> if (!StringUtils.hasText(matrixVariables)) {
<del> return result;
<del> }
<del> StringTokenizer pairs = new StringTokenizer(matrixVariables, ";");
<del> while (pairs.hasMoreTokens()) {
<del> String pair = pairs.nextToken();
<del> int index = pair.indexOf('=');
<del> if (index != -1) {
<del> String name = pair.substring(0, index);
<del> String rawValue = pair.substring(index + 1);
<del> for (String value : StringUtils.commaDelimitedListToStringArray(rawValue)) {
<del> result.add(name, value);
<del> }
<del> }
<del> else {
<del> result.add(pair, "");
<del> }
<del> }
<del> return result;
<del> }
<del>
<ide> /**
<ide> * Iterate all RequestMappingInfos once again, look if any match by URL at
<ide> * least and raise exceptions accordingly.
<ide> private static MultiValueMap<String, String> parseMatrixVariables(String matrixV
<ide> * method but not by query parameter conditions
<ide> */
<ide> @Override
<del> protected HandlerMethod handleNoMatch(Set<RequestMappingInfo> infos, String lookupPath,
<add> protected HandlerMethod handleNoMatch(Set<RequestMappingInfo> infos, LookupPath lookupPath,
<ide> ServerWebExchange exchange) throws Exception {
<ide>
<ide> PartialMatchHelper helper = new PartialMatchHelper(infos, exchange);
<ide><path>spring-webflux/src/main/java/org/springframework/web/reactive/result/method/annotation/RequestMappingHandlerMapping.java
<ide> import org.springframework.web.reactive.result.condition.RequestCondition;
<ide> import org.springframework.web.reactive.result.method.RequestMappingInfo;
<ide> import org.springframework.web.reactive.result.method.RequestMappingInfoHandlerMapping;
<add>import org.springframework.web.server.support.LookupPath;
<add>import org.springframework.web.server.ServerWebExchange;
<ide>
<ide> /**
<ide> * An extension of {@link RequestMappingInfoHandlerMapping} that creates
<ide> public void setEmbeddedValueResolver(StringValueResolver resolver) {
<ide> @Override
<ide> public void afterPropertiesSet() {
<ide> this.config = new RequestMappingInfo.BuilderConfiguration();
<del> this.config.setPathHelper(getPathHelper());
<ide> this.config.setPathMatcher(getPathMatcher());
<ide> this.config.setSuffixPatternMatch(this.useSuffixPatternMatch);
<del> this.config.setTrailingSlashMatch(this.useTrailingSlashMatch);
<ide> this.config.setRegisteredSuffixPatternMatch(this.useRegisteredSuffixPatternMatch);
<ide> this.config.setContentTypeResolver(getContentTypeResolver());
<ide>
<ide> public Set<String> getFileExtensions() {
<ide> return this.config.getFileExtensions();
<ide> }
<ide>
<del>
<ide> /**
<ide> * {@inheritDoc}
<ide> * Expects a handler to have a type-level @{@link Controller} annotation.
<ide> protected boolean isHandler(Class<?> beanType) {
<ide> AnnotatedElementUtils.hasAnnotation(beanType, RequestMapping.class));
<ide> }
<ide>
<add> @Override
<add> protected LookupPath createLookupPath(ServerWebExchange exchange) {
<add> return getPathHelper().getLookupPathForRequest(exchange);
<add> }
<add>
<ide> /**
<ide> * Uses method and type-level @{@link RequestMapping} annotations to create
<ide> * the RequestMappingInfo.
<ide><path>spring-webflux/src/main/java/org/springframework/web/reactive/result/view/ViewResolutionResultHandler.java
<ide> import org.springframework.web.reactive.result.HandlerResultHandlerSupport;
<ide> import org.springframework.web.server.NotAcceptableStatusException;
<ide> import org.springframework.web.server.ServerWebExchange;
<del>import org.springframework.web.server.support.HttpRequestPathHelper;
<add>import org.springframework.web.server.support.LookupPath;
<ide>
<ide> /**
<ide> * {@code HandlerResultHandler} that encapsulates the view resolution algorithm
<ide> public class ViewResolutionResultHandler extends HandlerResultHandlerSupport
<ide>
<ide> private final List<View> defaultViews = new ArrayList<>(4);
<ide>
<del> private final HttpRequestPathHelper pathHelper = new HttpRequestPathHelper();
<del>
<ide>
<ide> /**
<ide> * Basic constructor with a default {@link ReactiveAdapterRegistry}.
<ide> else if (View.class.isAssignableFrom(clazz)) {
<ide> * Use the request path the leading and trailing slash stripped.
<ide> */
<ide> private String getDefaultViewName(ServerWebExchange exchange) {
<del> String path = this.pathHelper.getLookupPathForRequest(exchange);
<add> String path = exchange.<LookupPath>getAttribute(LookupPath.LOOKUP_PATH_ATTRIBUTE).get().getPath();
<ide> if (path.startsWith("/")) {
<ide> path = path.substring(1);
<ide> }
<ide><path>spring-webflux/src/test/java/org/springframework/web/reactive/result/condition/PatternsRequestConditionTests.java
<ide> import org.junit.Test;
<ide>
<ide> import org.springframework.mock.http.server.reactive.test.MockServerWebExchange;
<add>import org.springframework.web.server.support.HttpRequestPathHelper;
<add>import org.springframework.web.server.support.LookupPath;
<ide> import org.springframework.web.server.ServerWebExchange;
<ide>
<ide> import static org.junit.Assert.assertEquals;
<ide> public void combineMultiplePatterns() {
<ide> @Test
<ide> public void matchDirectPath() throws Exception {
<ide> PatternsRequestCondition condition = new PatternsRequestCondition("/foo");
<del> PatternsRequestCondition match = condition.getMatchingCondition(get("/foo").toExchange());
<add> PatternsRequestCondition match = condition.getMatchingCondition(createExchange("/foo"));
<ide>
<ide> assertNotNull(match);
<ide> }
<ide>
<ide> @Test
<ide> public void matchPattern() throws Exception {
<ide> PatternsRequestCondition condition = new PatternsRequestCondition("/foo/*");
<del> PatternsRequestCondition match = condition.getMatchingCondition(get("/foo/bar").toExchange());
<add> PatternsRequestCondition match = condition.getMatchingCondition(createExchange("/foo/bar"));
<ide>
<ide> assertNotNull(match);
<ide> }
<ide>
<ide> @Test
<ide> public void matchSortPatterns() throws Exception {
<ide> PatternsRequestCondition condition = new PatternsRequestCondition("/*/*", "/foo/bar", "/foo/*");
<del> PatternsRequestCondition match = condition.getMatchingCondition(get("/foo/bar").toExchange());
<add> PatternsRequestCondition match = condition.getMatchingCondition(createExchange("/foo/bar"));
<ide> PatternsRequestCondition expected = new PatternsRequestCondition("/foo/bar", "/foo/*", "/*/*");
<ide>
<ide> assertEquals(expected, match);
<ide> }
<ide>
<ide> @Test
<ide> public void matchSuffixPattern() throws Exception {
<del> ServerWebExchange exchange = get("/foo.html").toExchange();
<add> ServerWebExchange exchange = createExchange("/foo.html");
<ide>
<ide> PatternsRequestCondition condition = new PatternsRequestCondition("/{foo}");
<ide> PatternsRequestCondition match = condition.getMatchingCondition(exchange);
<ide>
<ide> assertNotNull(match);
<ide> assertEquals("/{foo}.*", match.getPatterns().iterator().next());
<ide>
<del> condition = new PatternsRequestCondition(new String[] {"/{foo}"}, null, null, false, false, null);
<add> condition = new PatternsRequestCondition(new String[] {"/{foo}"}, null, false, false, null);
<ide> match = condition.getMatchingCondition(exchange);
<ide>
<ide> assertNotNull(match);
<ide> public void matchSuffixPattern() throws Exception {
<ide> public void matchSuffixPatternUsingFileExtensions() throws Exception {
<ide> String[] patterns = new String[] {"/jobs/{jobName}"};
<ide> Set<String> extensions = Collections.singleton("json");
<del> PatternsRequestCondition condition = new PatternsRequestCondition(patterns, null, null, true, false, extensions);
<add> PatternsRequestCondition condition = new PatternsRequestCondition(patterns, null, true, false, extensions);
<ide>
<del> MockServerWebExchange exchange = get("/jobs/my.job").toExchange();
<add> MockServerWebExchange exchange = createExchange("/jobs/my.job");
<ide> PatternsRequestCondition match = condition.getMatchingCondition(exchange);
<ide>
<ide> assertNotNull(match);
<ide> assertEquals("/jobs/{jobName}", match.getPatterns().iterator().next());
<ide>
<del> exchange = get("/jobs/my.job.json").toExchange();
<add> exchange = createExchange("/jobs/my.job.json");
<ide> match = condition.getMatchingCondition(exchange);
<ide>
<ide> assertNotNull(match);
<ide> public void matchSuffixPatternUsingFileExtensions() throws Exception {
<ide> @Test
<ide> public void matchSuffixPatternUsingFileExtensions2() throws Exception {
<ide> PatternsRequestCondition condition1 = new PatternsRequestCondition(
<del> new String[] {"/prefix"}, null, null, true, false, Collections.singleton("json"));
<add> new String[] {"/prefix"}, null, true, false, Collections.singleton("json"));
<ide>
<ide> PatternsRequestCondition condition2 = new PatternsRequestCondition(
<del> new String[] {"/suffix"}, null, null, true, false, null);
<add> new String[] {"/suffix"}, null, true, false, null);
<ide>
<ide> PatternsRequestCondition combined = condition1.combine(condition2);
<ide>
<del> MockServerWebExchange exchange = get("/prefix/suffix.json").toExchange();
<add> MockServerWebExchange exchange = createExchange("/prefix/suffix.json");
<ide> PatternsRequestCondition match = combined.getMatchingCondition(exchange);
<ide>
<ide> assertNotNull(match);
<ide> }
<ide>
<ide> @Test
<ide> public void matchTrailingSlash() throws Exception {
<del> MockServerWebExchange exchange = get("/foo/").toExchange();
<add> MockServerWebExchange exchange = createExchange("/foo/");
<ide>
<ide> PatternsRequestCondition condition = new PatternsRequestCondition("/foo");
<ide> PatternsRequestCondition match = condition.getMatchingCondition(exchange);
<ide>
<ide> assertNotNull(match);
<ide> assertEquals("Should match by default", "/foo/", match.getPatterns().iterator().next());
<ide>
<del> condition = new PatternsRequestCondition(new String[] {"/foo"}, null, null, false, true, null);
<add> condition = new PatternsRequestCondition(new String[] {"/foo"}, null, false, true, null);
<ide> match = condition.getMatchingCondition(exchange);
<ide>
<ide> assertNotNull(match);
<ide> assertEquals("Trailing slash should be insensitive to useSuffixPatternMatch settings (SPR-6164, SPR-5636)",
<ide> "/foo/", match.getPatterns().iterator().next());
<ide>
<del> condition = new PatternsRequestCondition(new String[] {"/foo"}, null, null, false, false, null);
<add> exchange = createExchange("/foo/");
<add> condition = new PatternsRequestCondition(new String[] {"/foo"}, null, false, false, null);
<ide> match = condition.getMatchingCondition(exchange);
<ide>
<ide> assertNull(match);
<ide> public void matchTrailingSlash() throws Exception {
<ide> @Test
<ide> public void matchPatternContainsExtension() throws Exception {
<ide> PatternsRequestCondition condition = new PatternsRequestCondition("/foo.jpg");
<del> PatternsRequestCondition match = condition.getMatchingCondition(get("/foo.html").toExchange());
<add> PatternsRequestCondition match = condition.getMatchingCondition(createExchange("/foo.html"));
<ide>
<ide> assertNull(match);
<ide> }
<ide> public void compareEqualPatterns() throws Exception {
<ide> PatternsRequestCondition c1 = new PatternsRequestCondition("/foo*");
<ide> PatternsRequestCondition c2 = new PatternsRequestCondition("/foo*");
<ide>
<del> assertEquals(0, c1.compareTo(c2, get("/foo").toExchange()));
<add> assertEquals(0, c1.compareTo(c2, createExchange("/foo")));
<ide> }
<ide>
<ide> @Test
<ide> public void comparePatternSpecificity() throws Exception {
<ide> PatternsRequestCondition c1 = new PatternsRequestCondition("/fo*");
<ide> PatternsRequestCondition c2 = new PatternsRequestCondition("/foo");
<ide>
<del> assertEquals(1, c1.compareTo(c2, get("/foo").toExchange()));
<add> assertEquals(1, c1.compareTo(c2, createExchange("/foo")));
<ide> }
<ide>
<ide> @Test
<ide> public void compareNumberOfMatchingPatterns() throws Exception {
<del> ServerWebExchange exchange = get("/foo.html").toExchange();
<add> ServerWebExchange exchange = createExchange("/foo.html");
<ide>
<ide> PatternsRequestCondition c1 = new PatternsRequestCondition("/foo", "*.jpeg");
<ide> PatternsRequestCondition c2 = new PatternsRequestCondition("/foo", "*.html");
<ide> public void compareNumberOfMatchingPatterns() throws Exception {
<ide> assertEquals(1, match1.compareTo(match2, exchange));
<ide> }
<ide>
<add> private MockServerWebExchange createExchange(String path) {
<add> MockServerWebExchange exchange = get(path).toExchange();
<add> HttpRequestPathHelper helper = new HttpRequestPathHelper();
<add> exchange.getAttributes().put(LookupPath.LOOKUP_PATH_ATTRIBUTE, helper.getLookupPathForRequest(exchange));
<add> return exchange;
<add> }
<ide>
<ide> }
<ide><path>spring-webflux/src/test/java/org/springframework/web/reactive/result/condition/RequestMappingInfoTests.java
<ide> import org.springframework.mock.http.server.reactive.test.MockServerWebExchange;
<ide> import org.springframework.web.bind.annotation.RequestMethod;
<ide> import org.springframework.web.reactive.result.method.RequestMappingInfo;
<add>import org.springframework.web.server.support.HttpRequestPathHelper;
<add>import org.springframework.web.server.support.LookupPath;
<ide> import org.springframework.web.server.ServerWebExchange;
<ide>
<ide> import static java.util.Arrays.asList;
<ide> public void createEmpty() {
<ide> @Test
<ide> public void matchPatternsCondition() {
<ide> MockServerWebExchange exchange = MockServerHttpRequest.get("/foo").toExchange();
<add> setLookupPathAttribute(exchange);
<ide>
<ide> RequestMappingInfo info = paths("/foo*", "/bar").build();
<ide> RequestMappingInfo expected = paths("/foo*").build();
<ide> public void matchPatternsCondition() {
<ide> @Test
<ide> public void matchParamsCondition() {
<ide> ServerWebExchange exchange = MockServerHttpRequest.get("/foo?foo=bar").toExchange();
<add> setLookupPathAttribute(exchange);
<ide>
<ide> RequestMappingInfo info = paths("/foo").params("foo=bar").build();
<ide> RequestMappingInfo match = info.getMatchingCondition(exchange);
<ide> public void matchParamsCondition() {
<ide> @Test
<ide> public void matchHeadersCondition() {
<ide> ServerWebExchange exchange = MockServerHttpRequest.get("/foo").header("foo", "bar").toExchange();
<add> setLookupPathAttribute(exchange);
<ide>
<ide> RequestMappingInfo info = paths("/foo").headers("foo=bar").build();
<ide> RequestMappingInfo match = info.getMatchingCondition(exchange);
<ide> public void matchHeadersCondition() {
<ide> @Test
<ide> public void matchConsumesCondition() {
<ide> ServerWebExchange exchange = MockServerHttpRequest.post("/foo").contentType(MediaType.TEXT_PLAIN).toExchange();
<add> setLookupPathAttribute(exchange);
<ide>
<ide> RequestMappingInfo info = paths("/foo").consumes("text/plain").build();
<ide> RequestMappingInfo match = info.getMatchingCondition(exchange);
<ide> public void matchConsumesCondition() {
<ide> @Test
<ide> public void matchProducesCondition() {
<ide> ServerWebExchange exchange = MockServerHttpRequest.get("/foo").accept(MediaType.TEXT_PLAIN).toExchange();
<add> setLookupPathAttribute(exchange);
<ide>
<ide> RequestMappingInfo info = paths("/foo").produces("text/plain").build();
<ide> RequestMappingInfo match = info.getMatchingCondition(exchange);
<ide> public void matchProducesCondition() {
<ide> @Test
<ide> public void matchCustomCondition() {
<ide> ServerWebExchange exchange = MockServerHttpRequest.get("/foo?foo=bar").toExchange();
<del>
<add> setLookupPathAttribute(exchange);
<add>
<ide> RequestMappingInfo info = paths("/foo").params("foo=bar").build();
<ide> RequestMappingInfo match = info.getMatchingCondition(exchange);
<ide>
<ide> public void compareTwoHttpMethodsOneParam() {
<ide> RequestMappingInfo oneMethodOneParam = paths().methods(RequestMethod.GET).params("foo").build();
<ide>
<ide> ServerWebExchange exchange = MockServerHttpRequest.get("/foo").toExchange();
<add> setLookupPathAttribute(exchange);
<ide> Comparator<RequestMappingInfo> comparator = (info, otherInfo) -> info.compareTo(otherInfo, exchange);
<ide>
<ide> List<RequestMappingInfo> list = asList(none, oneMethod, oneMethodOneParam);
<ide> public void preFlightRequest() throws Exception {
<ide> assertNull("Pre-flight should match the ACCESS_CONTROL_REQUEST_METHOD", match);
<ide> }
<ide>
<add> public void setLookupPathAttribute(ServerWebExchange exchange) {
<add> HttpRequestPathHelper helper = new HttpRequestPathHelper();
<add> exchange.getAttributes().put(LookupPath.LOOKUP_PATH_ATTRIBUTE, helper.getLookupPathForRequest(exchange));
<add> }
<add>
<ide> }
<ide><path>spring-webflux/src/test/java/org/springframework/web/reactive/result/method/RequestMappingInfoHandlerMappingTests.java
<ide> import org.springframework.web.server.ServerWebInputException;
<ide> import org.springframework.web.server.UnsupportedMediaTypeStatusException;
<ide> import org.springframework.web.server.support.HttpRequestPathHelper;
<add>import org.springframework.web.server.support.LookupPath;
<ide>
<ide> import static org.hamcrest.CoreMatchers.*;
<ide> import static org.junit.Assert.*;
<ide> public class RequestMappingInfoHandlerMappingTests {
<ide>
<ide> private TestRequestMappingInfoHandlerMapping handlerMapping;
<ide>
<add> private HttpRequestPathHelper pathHelper;
<add>
<ide>
<ide> @Before
<ide> public void setup() throws Exception {
<ide> this.handlerMapping = new TestRequestMappingInfoHandlerMapping();
<ide> this.handlerMapping.registerHandler(new TestController());
<add> this.pathHelper = new HttpRequestPathHelper();
<add> this.handlerMapping.setPathHelper(this.pathHelper);
<ide> }
<ide>
<ide>
<ide> public void getHandlerProducibleMediaTypesAttribute() throws Exception {
<ide> @Test
<ide> @SuppressWarnings("unchecked")
<ide> public void handleMatchUriTemplateVariables() throws Exception {
<del> String lookupPath = "/1/2";
<del> ServerWebExchange exchange = get(lookupPath).toExchange();
<add> ServerWebExchange exchange = get("/1/2").toExchange();
<add> LookupPath lookupPath = this.pathHelper.getLookupPathForRequest(exchange);
<ide>
<ide> RequestMappingInfo key = paths("/{path1}/{path2}").build();
<ide> this.handlerMapping.handleMatch(key, lookupPath, exchange);
<ide> public void handleMatchUriTemplateVariablesDecode() throws Exception {
<ide> URI url = URI.create("/group/a%2Fb");
<ide> ServerWebExchange exchange = MockServerHttpRequest.method(HttpMethod.GET, url).toExchange();
<ide>
<del> HttpRequestPathHelper pathHelper = new HttpRequestPathHelper();
<del> pathHelper.setUrlDecode(false);
<del> String lookupPath = pathHelper.getLookupPathForRequest(exchange);
<del>
<del> this.handlerMapping.setPathHelper(pathHelper);
<add> this.pathHelper.setUrlDecode(false);
<add> LookupPath lookupPath = this.pathHelper.getLookupPathForRequest(exchange);
<add> this.handlerMapping.setPathHelper(this.pathHelper);
<add>
<ide> this.handlerMapping.handleMatch(key, lookupPath, exchange);
<ide>
<ide> String name = HandlerMapping.URI_TEMPLATE_VARIABLES_ATTRIBUTE;
<ide> public void handleMatchUriTemplateVariablesDecode() throws Exception {
<ide> public void handleMatchBestMatchingPatternAttribute() throws Exception {
<ide> RequestMappingInfo key = paths("/{path1}/2", "/**").build();
<ide> ServerWebExchange exchange = get("/1/2").toExchange();
<del> this.handlerMapping.handleMatch(key, "/1/2", exchange);
<add> LookupPath lookupPath = this.pathHelper.getLookupPathForRequest(exchange);
<add> this.handlerMapping.handleMatch(key, lookupPath, exchange);
<ide>
<ide> assertEquals("/{path1}/2", exchange.getAttributes().get(HandlerMapping.BEST_MATCHING_PATTERN_ATTRIBUTE));
<ide> }
<ide> public void handleMatchBestMatchingPatternAttribute() throws Exception {
<ide> public void handleMatchBestMatchingPatternAttributeNoPatternsDefined() throws Exception {
<ide> RequestMappingInfo key = paths().build();
<ide> ServerWebExchange exchange = get("/1/2").toExchange();
<del>
<del> this.handlerMapping.handleMatch(key, "/1/2", exchange);
<add> LookupPath lookupPath = this.pathHelper.getLookupPathForRequest(exchange);
<add> this.handlerMapping.handleMatch(key, lookupPath, exchange);
<ide>
<ide> assertEquals("/1/2", exchange.getAttributes().get(HandlerMapping.BEST_MATCHING_PATTERN_ATTRIBUTE));
<ide> }
<ide> public void handleMatchMatrixVariables() throws Exception {
<ide> MultiValueMap<String, String> matrixVariables;
<ide> Map<String, String> uriVariables;
<ide>
<del> ServerWebExchange exchange = get("/").toExchange();
<del> handleMatch(exchange, "/{cars}", "/cars;colors=red,blue,green;year=2012");
<add> ServerWebExchange exchange = get("/cars;colors=red,blue,green;year=2012").toExchange();
<add> handleMatch(exchange, "/{cars}");
<ide>
<ide> matrixVariables = getMatrixVariables(exchange, "cars");
<ide> uriVariables = getUriTemplateVariables(exchange);
<ide> public void handleMatchMatrixVariables() throws Exception {
<ide> assertEquals("2012", matrixVariables.getFirst("year"));
<ide> assertEquals("cars", uriVariables.get("cars"));
<ide>
<del> exchange = get("/").toExchange();
<del> handleMatch(exchange, "/{cars:[^;]+}{params}", "/cars;colors=red,blue,green;year=2012");
<add> exchange = get("/cars;colors=red,blue,green;year=2012").toExchange();
<add> handleMatch(exchange, "/{cars:[^;]+}{params}");
<ide>
<ide> matrixVariables = getMatrixVariables(exchange, "params");
<ide> uriVariables = getUriTemplateVariables(exchange);
<ide> public void handleMatchMatrixVariables() throws Exception {
<ide> assertEquals("cars", uriVariables.get("cars"));
<ide> assertEquals(";colors=red,blue,green;year=2012", uriVariables.get("params"));
<ide>
<del> exchange = get("/").toExchange();
<del> handleMatch(exchange, "/{cars:[^;]+}{params}", "/cars");
<add> exchange = get("/cars").toExchange();
<add> handleMatch(exchange, "/{cars:[^;]+}{params}");
<ide>
<ide> matrixVariables = getMatrixVariables(exchange, "params");
<ide> uriVariables = getUriTemplateVariables(exchange);
<ide> public void handleMatchMatrixVariablesDecoding() throws Exception {
<ide> urlPathHelper.setUrlDecode(false);
<ide> this.handlerMapping.setPathHelper(urlPathHelper);
<ide>
<del> ServerWebExchange exchange = get("/").toExchange();
<del> handleMatch(exchange, "/path{filter}", "/path;mvar=a%2fb");
<add> ServerWebExchange exchange = get("/path;mvar=a%2fb").toExchange();
<add> handleMatch(exchange, "/path{filter}");
<ide>
<ide> MultiValueMap<String, String> matrixVariables = getMatrixVariables(exchange, "filter");
<ide> Map<String, String> uriVariables = getUriTemplateVariables(exchange);
<ide> private void testMediaTypeNotAcceptable(String url) throws Exception {
<ide> ex.getSupportedMediaTypes()));
<ide> }
<ide>
<del> private void handleMatch(ServerWebExchange exchange, String pattern, String lookupPath) {
<add> private void handleMatch(ServerWebExchange exchange, String pattern) {
<ide> RequestMappingInfo info = paths(pattern).build();
<add> LookupPath lookupPath = this.pathHelper.getLookupPathForRequest(exchange);
<ide> this.handlerMapping.handleMatch(info, lookupPath, exchange);
<ide> }
<ide>
<ide> protected RequestMappingInfo getMappingForMethod(Method method, Class<?> handler
<ide> RequestMapping annot = AnnotatedElementUtils.findMergedAnnotation(method, RequestMapping.class);
<ide> if (annot != null) {
<ide> BuilderConfiguration options = new BuilderConfiguration();
<del> options.setPathHelper(getPathHelper());
<ide> options.setPathMatcher(getPathMatcher());
<ide> options.setSuffixPatternMatch(true);
<ide> options.setTrailingSlashMatch(true);
<ide><path>spring-webflux/src/test/java/org/springframework/web/reactive/result/view/ViewResolutionResultHandlerTests.java
<ide> import org.springframework.web.reactive.accept.RequestedContentTypeResolver;
<ide> import org.springframework.web.server.NotAcceptableStatusException;
<ide> import org.springframework.web.server.ServerWebExchange;
<add>import org.springframework.web.server.support.HttpRequestPathHelper;
<add>import org.springframework.web.server.support.LookupPath;
<ide>
<ide> import static java.nio.charset.StandardCharsets.UTF_8;
<ide> import static org.junit.Assert.assertEquals;
<ide> private void testDefaultViewName(Object returnValue, MethodParameter returnType)
<ide> ViewResolutionResultHandler handler = resultHandler(new TestViewResolver("account"));
<ide>
<ide> MockServerWebExchange exchange = get("/account").toExchange();
<add> addLookupPathAttribute(exchange);
<ide> handler.handleResult(exchange, result).block(Duration.ofMillis(5000));
<ide> assertResponseBody(exchange, "account: {id=123}");
<ide>
<ide> exchange = get("/account/").toExchange();
<add> addLookupPathAttribute(exchange);
<ide> handler.handleResult(exchange, result).block(Duration.ofMillis(5000));
<ide> assertResponseBody(exchange, "account: {id=123}");
<ide>
<ide> exchange = get("/account.123").toExchange();
<add> addLookupPathAttribute(exchange);
<ide> handler.handleResult(exchange, result).block(Duration.ofMillis(5000));
<ide> assertResponseBody(exchange, "account: {id=123}");
<ide> }
<ide> public void contentNegotiation() throws Exception {
<ide> HandlerResult handlerResult = new HandlerResult(new Object(), value, returnType, this.bindingContext);
<ide>
<ide> MockServerWebExchange exchange = get("/account").accept(APPLICATION_JSON).toExchange();
<del>
<add> addLookupPathAttribute(exchange);
<add>
<ide> TestView defaultView = new TestView("jsonView", APPLICATION_JSON);
<ide>
<ide> resultHandler(Collections.singletonList(defaultView), new TestViewResolver("account"))
<ide> public void contentNegotiationWithRedirect() throws Exception {
<ide> assertEquals("/", response.getHeaders().getLocation().toString());
<ide> }
<ide>
<add> private void addLookupPathAttribute(ServerWebExchange exchange) {
<add> HttpRequestPathHelper helper = new HttpRequestPathHelper();
<add> exchange.getAttributes().put(LookupPath.LOOKUP_PATH_ATTRIBUTE, helper.getLookupPathForRequest(exchange));
<add> }
<add>
<ide>
<ide> private ViewResolutionResultHandler resultHandler(ViewResolver... resolvers) {
<ide> return resultHandler(Collections.emptyList(), resolvers);
<ide> private ServerWebExchange testHandle(String path, MethodParameter returnType, Ob
<ide> model.addAttribute("id", "123");
<ide> HandlerResult result = new HandlerResult(new Object(), returnValue, returnType, this.bindingContext);
<ide> MockServerWebExchange exchange = get(path).toExchange();
<add> addLookupPathAttribute(exchange);
<ide> resultHandler(resolvers).handleResult(exchange, result).block(Duration.ofSeconds(5));
<ide> assertResponseBody(exchange, responseBody);
<ide> return exchange; | 18 |
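The helper deleted in the Spring patch above was replaced by a call to `WebUtils.parseMatrixVariables`; its Javadoc documents the contract: `"q1=a;q1=b;q2=a,b,c"` yields keys `q1` and `q2` with values `["a","b"]` and `["a","b","c"]`. A minimal Python sketch of that parsing contract, for illustration only (not Spring's implementation):

```python
def parse_matrix_variables(matrix_variables):
    """Parse "q1=a;q1=b;q2=a,b,c" into {"q1": ["a", "b"], "q2": ["a", "b", "c"]}."""
    result = {}
    if not matrix_variables.strip():
        return result
    for pair in matrix_variables.split(";"):
        if not pair:
            continue  # StringTokenizer-style: skip empty segments
        name, eq, raw_value = pair.partition("=")
        if eq:
            # Comma-delimited values each become a separate entry for the name.
            result.setdefault(name, []).extend(raw_value.split(","))
        else:
            # A bare token maps to an empty-string value.
            result.setdefault(name, []).append("")
    return result

print(parse_matrix_variables("q1=a;q1=b;q2=a,b,c"))
```

The multimap semantics (repeated `q1` keys accumulating values) mirror the `MultiValueMap` the Java version returned.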
Text | Text | fix translate to russian | eec1a140a1b9f91edb737e147d165839c5b81f1e | <ide><path>guide/russian/agile/index.md
<ide> localeTitle: Гибкие методологии
<ide> В Agile акцент делается на «гибкость» - способность быстро реагировать на отзывы пользователей и другие изменяющиеся обстоятельства.
<ide>
<ide>
<del>
<add>
<add>Agile подчеркивает, спрашивая конечных пользователей, что они хотят, и часто демонстрирует им демонстрации продукта по мере его разработки. Это, в отличие от подхода «Водопад», разработки, основанного на спецификации, и того, что специалисты Agile называют «Big Up-Front Design». В этих подходах функции планируются и предусматриваются в бюджете до начала разработки.
<add>
<ide>
<ide> Гибкие методологии разработки программного обеспечения делает упор на четырех основных ценностях:
<ide> 1. Предпочтение группового и индивидуального взаимодействия над инструментами и процессами. | 1 |
Javascript | Javascript | reduce memory usage | d4f3a388f79123f7221e6ff08d2454c73f5d4e8e | <ide><path>examples/js/pmrem/PMREMCubeUVPacker.js
<ide> THREE.PMREMCubeUVPacker = function ( cubeTextureLods, numLods ) {
<ide> this.CubeUVRenderTarget = new THREE.WebGLRenderTarget( size, size, params );
<ide> this.CubeUVRenderTarget.texture.name = "PMREMCubeUVPacker.cubeUv";
<ide> this.CubeUVRenderTarget.texture.mapping = THREE.CubeUVReflectionMapping;
<del> this.camera = new THREE.OrthographicCamera( - size * 0.5, size * 0.5, - size * 0.5, size * 0.5, 0.0, 1000 );
<add> this.camera = new THREE.OrthographicCamera( - size * 0.5, size * 0.5, - size * 0.5, size * 0.5, 0, 1 ); // top and bottom are swapped for some reason?
<ide>
<ide> this.scene = new THREE.Scene();
<del> this.scene.add( this.camera );
<ide>
<ide> this.objects = [];
<ide>
<add> var geometry = new THREE.PlaneBufferGeometry( 1, 1 );
<add>
<ide> var faceOffsets = [];
<ide> faceOffsets.push( new THREE.Vector2( 0, 0 ) );
<ide> faceOffsets.push( new THREE.Vector2( 1, 0 ) );
<ide> THREE.PMREMCubeUVPacker = function ( cubeTextureLods, numLods ) {
<ide> material.envMap = this.cubeLods[ i ].texture;
<ide> material.uniforms[ 'faceIndex' ].value = k;
<ide> material.uniforms[ 'mapSize' ].value = mipSize;
<del> var planeMesh = new THREE.Mesh(
<del> new THREE.PlaneGeometry( mipSize, mipSize, 0 ),
<del> material );
<add>
<add> var planeMesh = new THREE.Mesh( geometry, material );
<ide> planeMesh.position.x = faceOffsets[ k ].x * mipSize - offset1 + mipOffsetX;
<ide> planeMesh.position.y = faceOffsets[ k ].y * mipSize - offset1 + offset2 + mipOffsetY;
<del> planeMesh.material.side = THREE.DoubleSide;
<add> planeMesh.material.side = THREE.BackSide;
<add> planeMesh.scale.set( mipSize, mipSize );
<ide> this.scene.add( planeMesh );
<ide> this.objects.push( planeMesh );
<ide> | 1 |
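The `clone()` fix above swaps a shared reference for `this.materials.slice(0)`, a shallow copy, so pushing a material onto a clone no longer mutates the original. The aliasing pitfall and the fix, sketched in Python for illustration (not three.js code):

```python
class FaceMaterial:
    def __init__(self, materials=None):
        self.materials = materials if materials is not None else []

    def clone_shared(self):
        # Buggy version: the clone aliases the very same list object.
        c = FaceMaterial()
        c.materials = self.materials
        return c

    def clone_copied(self):
        # Fixed version: a slice makes a new list, like JS .slice(0).
        c = FaceMaterial()
        c.materials = self.materials[:]
        return c

original = FaceMaterial(["lambert"])
original.clone_shared().materials.append("phong")
print(original.materials)  # mutation leaked back: ['lambert', 'phong']

original2 = FaceMaterial(["lambert"])
original2.clone_copied().materials.append("phong")
print(original2.materials)  # original untouched: ['lambert']
```

Note the copy is shallow in both JS and this sketch: the clone gets a new list, but the material objects inside it are still shared.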
Javascript | Javascript | write test for feb 29 0000 validity | 9245568deb5b085278323bdee0ba0cbf3183975d | <ide><path>src/test/moment/is_valid.js
<ide> test('array good month', function (assert) {
<ide> }
<ide> });
<ide>
<add>test('Feb 29 0000 is valid', function (assert) {
<add> // https://github.com/moment/moment/issues/3358
<add> assert.ok(moment({year:0, month:1, date:29}).isValid(), 'Feb 29 0000 must be valid');
<add> assert.ok(moment({year:0, month:1, date:28}).add(1, 'd').isValid(), 'Feb 28 0000 + 1 day must be valid');
<add>});
<add>
<ide> test('array bad date', function (assert) {
<ide> var tests = [
<ide> moment([2010, 0, 0]), | 1 |
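The new test relies on year 0 being a leap year under the proleptic Gregorian calendar that moment extends backwards: 0 is divisible by 400, so Feb 29 0000 exists. The rule can be checked directly in Python (illustrative, not moment.js itself):

```python
import calendar

def is_leap(year):
    # Gregorian rule: divisible by 4, except century years not divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap(0))            # True: 0 is divisible by 400
print(calendar.isleap(0))    # the stdlib agrees
print(is_leap(1900), is_leap(2000))  # False True
```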
Javascript | Javascript | pass explicit execargv to workers | bc4de008d3d9d8f1e8e3684513ea5175a747f27b | <ide><path>packager/src/JSTransformer/index.js
<ide> function makeFarm(worker, methods, timeout, maxConcurrentWorkers) {
<ide> return workerFarm(
<ide> {
<ide> autoStart: true,
<add> execArgv: [],
<ide> maxConcurrentCallsPerWorker: 1,
<ide> maxConcurrentWorkers,
<ide> maxCallsPerWorker: MAX_CALLS_PER_WORKER,
<ide><path>packager/src/worker-farm/lib/farm.js
<ide> const extend = require('xtend')
<ide> , ProcessTerminatedError = require('errno').create('ProcessTerminatedError')
<ide> , MaxConcurrentCallsError = require('errno').create('MaxConcurrentCallsError')
<ide>
<del>function Farm (options: {}, path: string) {
<add>function Farm (options: {+execArgv: Array<string>}, path: string) {
<ide> this.options = extend(DEFAULT_OPTIONS, options)
<ide> this.path = path
<ide> this.activeCalls = 0
<ide> Farm.prototype.onExit = function (childId) {
<ide> Farm.prototype.startChild = function () {
<ide> this.childId++
<ide>
<del> var forked = fork(this.path)
<add> var forked = fork(this.path, {execArgv: this.options.execArgv})
<ide> , id = this.childId
<ide> , c = {
<ide> send : forked.send
<ide><path>packager/src/worker-farm/lib/fork.js
<ide> const childProcess = require('child_process');
<ide> const childModule = require.resolve('./child/index');
<ide>
<del>function fork(forkModule: string) {
<add>function fork(forkModule: string, options: {|+execArgv: Array<string>|}) {
<ide> const child = childProcess.fork(childModule, {
<ide> env: process.env,
<ide> cwd: process.cwd(),
<add> execArgv: options.execArgv,
<ide> });
<ide>
<ide> child.send({module: forkModule});
<ide><path>packager/src/worker-farm/lib/index.js
<ide> const Farm = require('./farm')
<ide>
<ide> var farms = [] // keep record of farms so we can end() them if required
<ide>
<del>function farm(options: {}, path: string, methods: Array<string>): {[name: string]: Function} {
<add>function farm(
<add> options: {+execArgv: Array<string>},
<add> path: string,
<add> methods: Array<string>,
<add>): {[name: string]: Function} {
<ide> var f = new Farm(options, path)
<ide> , api = f.setup(methods)
<ide> | 4 |
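Setting `execArgv: []` and forwarding `options.execArgv` into `fork` keeps workers from inheriting the parent's interpreter flags, commonly done so that flags such as `--inspect` do not leak into every child process. A Python analogue of supplying interpreter flags explicitly to a child (the flag shown is Python's `-O`, not a Node flag):

```python
import subprocess
import sys

def spawn_worker(exec_argv):
    # Interpreter-level flags are supplied explicitly, never inherited,
    # mirroring fork(path, {execArgv: options.execArgv}).
    proc = subprocess.run(
        [sys.executable, *exec_argv, "-c", "import sys; print(sys.flags.optimize)"],
        capture_output=True, text=True, check=True,
    )
    return proc.stdout.strip()

print(spawn_worker([]))      # "0": child runs with no extra flags
print(spawn_worker(["-O"]))  # "1": flag applied only where requested
```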
Python | Python | fix failing test in testschedulerjob | 5e0a2d772b2b095de277bedcaafce8dd98805d48 | <ide><path>tests/jobs/test_scheduler_job.py
<ide> def test_scheduler_loop_should_change_state_for_tis_without_dagrun(
<ide> scheduler.processor_agent = processor
<ide>
<ide> with mock.patch.object(settings, "USE_JOB_SCHEDULE", False), conf_vars(
<del> {('scheduler', 'clean_tis_without_dagrun'): '0.001'}
<add> {('scheduler', 'clean_tis_without_dagrun_interval'): '0.001'}
<ide> ):
<ide> scheduler._run_scheduler_loop()
<ide> | 1 |
Javascript | Javascript | make clone of materials | 8d40dba44b5d711185994162c6cf1c52e5a78baf | <ide><path>src/materials/MeshFaceMaterial.js
<ide> THREE.MeshFaceMaterial = function ( materials ) {
<ide> THREE.MeshFaceMaterial.prototype.clone = function () {
<ide>
<ide> var material = new THREE.MeshFaceMaterial();
<del> material.materials = this.materials;
<add> material.materials = this.materials.slice(0);
<ide> return material;
<ide>
<ide> }; | 1 |
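The one-line change matters because `slice(0)` gives the clone its own materials array; with a plain assignment, pushing a material onto the clone would also mutate the original. A standalone illustration of the difference:

```javascript
// Shared reference: mutating the "clone" also mutates the source.
const source = ['basic', 'phong'];
const sharedClone = source;
sharedClone.push('lambert');
console.log(source.length); // → 3: the source changed too

// Shallow copy via slice(0): the clone gets an independent array.
// The elements themselves are still shared, which is fine for
// material objects that are meant to be reused between meshes.
const source2 = ['basic', 'phong'];
const copiedClone = source2.slice(0);
copiedClone.push('lambert');
console.log(source2.length); // → 2: the source is untouched
```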
Python | Python | remove broken link (resolves ) | 1a23a0f87e693011670ac8d869d2578ad20ef2b4 | <ide><path>examples/training/train_textcat.py
<ide> spacy.pipeline, and predictions are available via `doc.cats`. For more details,
<ide> see the documentation:
<ide> * Training: https://spacy.io/usage/training
<del>* Text classification: https://spacy.io/usage/text-classification
<ide>
<ide> Compatible with: spaCy v2.0.0+
<ide> """ | 1 |
Python | Python | fix merge conflict | f28ef61a87f097b129b18792ae64f509ef636af1 | <ide><path>examples/lstm_stateful.py
<ide> def gen_uniform_amp(amp=1, xn=10000):
<ide> -amp and +amp
<ide> and of length xn
<ide>
<del> Arguments:
<add> # Arguments
<ide> amp: maximum/minimum range of uniform data
<ide> xn: length of series
<ide> """
<ide><path>keras/activations.py
<ide> def hard_sigmoid(x):
<ide> def exponential(x):
<ide> """Exponential (base e) activation function.
<ide>
<del> # Arguments:
<add> # Arguments
<ide> x: Input tensor.
<ide>
<ide> # Returns
<ide><path>keras/backend/cntk_backend.py
<ide> def ones(shape, dtype=None, name=None):
<ide> def eye(size, dtype=None, name=None):
<ide> if dtype is None:
<ide> dtype = floatx()
<del> return variable(np.eye(size), dtype, name)
<add> if isinstance(size, (list, tuple)):
<add> n, m = size
<add> else:
<add> n, m = size, size
<add> return variable(np.eye(n, m), dtype, name)
<ide>
<ide>
<ide> def zeros_like(x, dtype=None, name=None):
<ide> def cumsum(x, axis=0):
<ide>
<ide>
<ide> def cumprod(x, axis=0):
<del> raise NotImplementedError
<add> shape = x.shape
<add> out = x
<add> for rep in range(shape[axis] - 1):
<add> sliced_shape = list(shape)
<add> sliced_shape[axis] = rep + 1
<add> if axis == 0:
<add> _x = x[rep:(rep + 1)]
<add> elif axis == 1:
<add> _x = x[:, rep:(rep + 1)]
<add> elif axis == 2:
<add> _x = x[:, :, rep:(rep + 1)]
<add> y = concatenate([ones(sliced_shape, dtype=x.dtype),
<add> repeat_elements(_x, rep=shape[axis] - 1 - rep, axis=axis)],
<add> axis=axis)
<add> out = C.element_times(out, y)
<add> return out
<ide>
<ide>
<ide> def arange(start, stop=None, step=1, dtype='int32'):
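The CNTK backend above builds `cumprod` without a native op: starting from `out = x`, each step multiplies by a tensor that is all ones up to position `rep` and repeats `x[rep]` afterwards, so `out[i]` ends up accumulating every `x[rep]` with `rep < i`. The same loop in plain Python lists (axis 0 only, for brevity — a sketch of the algorithm, not the CNTK code):

```python
def cumprod_by_masked_multiplies(x):
    """Cumulative product along axis 0, mirroring the CNTK trick above:
    repeatedly multiply by [1, ..., 1, x[rep], ..., x[rep]]."""
    out = list(x)
    n = len(x)
    for rep in range(n - 1):
        # ones up to index rep leave out[:rep+1] alone;
        # x[rep] repeated folds it into every later position.
        y = [1.0] * (rep + 1) + [x[rep]] * (n - 1 - rep)
        out = [a * b for a, b in zip(out, y)]
    return out

print(cumprod_by_masked_multiplies([2.0, 3.0, 4.0, 5.0]))  # → [2.0, 6.0, 24.0, 120.0]
```

The real implementation does the same thing with `C.element_times`, slicing and `repeat_elements` so it works on symbolic tensors along any axis.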
<ide><path>keras/backend/numpy_backend.py
<ide> def ones_like(x, dtype=floatx(), name=None):
<ide>
<ide>
<ide> def eye(size, dtype=None, name=None):
<del> return np.eye(size, dtype=dtype)
<add> if isinstance(size, (list, tuple)):
<add> n, m = size
<add> else:
<add> n, m = size, size
<add> return np.eye(n, m, dtype=dtype)
<ide>
<ide>
<ide> def resize_images(x, height_factor, width_factor, data_format):
<ide><path>keras/backend/tensorflow_backend.py
<ide> def eye(size, dtype=None, name=None):
<ide> """Instantiate an identity matrix and returns it.
<ide>
<ide> # Arguments
<del> size: Integer, number of rows/columns.
<add> size: Tuple, number of rows and columns. If Integer, number of rows.
<ide> dtype: String, data type of returned Keras variable.
<ide> name: String, name of returned Keras variable.
<ide>
<ide> def eye(size, dtype=None, name=None):
<ide> # Example
<ide> ```python
<ide> >>> from keras import backend as K
<del> >>> kvar = K.eye(3)
<del> >>> K.eval(kvar)
<add> >>> K.eval(K.eye(3))
<ide> array([[ 1., 0., 0.],
<ide> [ 0., 1., 0.],
<ide> [ 0., 0., 1.]], dtype=float32)
<add> >>> K.eval(K.eye((2, 3)))
<add> array([[1., 0., 0.],
<add> [0., 1., 0.]], dtype=float32)
<ide> ```
<ide> {{np_implementation}}
<ide> """
<ide> if dtype is None:
<ide> dtype = floatx()
<add> if isinstance(size, (list, tuple)):
<add> n, m = size
<add> else:
<add> n, m = size, size
<ide> with tf_ops.init_scope():
<del> return variable(tf.eye(size, dtype=dtype), dtype, name)
<add> return variable(tf.eye(n, m, dtype=tf_dtype), dtype, name)
<ide>
<ide>
<ide> @symbolic
<ide><path>keras/backend/theano_backend.py
<ide> def eye(size, dtype=None, name=None):
<ide> """
<ide> if dtype is None:
<ide> dtype = floatx()
<del> return variable(np.eye(size), dtype, name)
<add> if isinstance(size, (list, tuple)):
<add> n, m = size
<add> else:
<add> n, m = size, size
<add> return variable(np.eye(n, m), dtype, name)
<ide>
<ide>
<ide> def ones_like(x, dtype=None, name=None):
<ide> def reverse(x, axes):
<ide> """
<ide> if isinstance(axes, int):
<ide> axes = [axes]
<add> elif isinstance(axes, tuple):
<add> axes = list(axes)
<add> for i in range(len(axes)):
<add> if axes[i] == -1:
<add> axes[i] = x.ndim - 1
<ide> slices = []
<ide> for i in range(x.ndim):
<ide> if i in axes:
<ide><path>keras/engine/network.py
<ide> def from_config(cls, config, custom_objects=None):
<ide> def add_unprocessed_node(layer, node_data):
<ide> """Add node to layer list
<ide>
<del> Args:
<add> # Arguments
<ide> layer: layer object
<ide> node_data: Node data specifying layer call
<ide> """
<ide> def add_unprocessed_node(layer, node_data):
<ide> def process_node(layer, node_data):
<ide> """Reconstruct node by linking to inbound layers
<ide>
<del> Args:
<add> # Arguments
<ide> layer: Layer to process
<ide> node_data: List of layer configs
<ide>
<del> Raises:
<add> # Raises
<ide> ValueError: For incorrect layer config
<ide> LookupError: If layer required is not found
<ide> """
<ide><path>keras/engine/training_utils.py
<ide> def is_sequence(seq):
<ide> def should_run_validation(validation_freq, epoch):
<ide> """Checks if validation should be run this epoch.
<ide>
<del> Arguments:
<del> validation_freq: Integer or list. If an integer, specifies how many training
<del> epochs to run before a new validation run is performed. If a list,
<del> specifies the epochs on which to run validation.
<del> epoch: Integer, the number of the training epoch just completed.
<del>
<del> Returns:
<del> Bool, True if validation should be run.
<del>
<del> Raises:
<del> ValueError: if `validation_freq` is an Integer and less than 1, or if
<del> it is neither an Integer nor a Sequence.
<add> # Arguments
<add> validation_freq: Integer or list. If an integer, specifies how many training
<add> epochs to run before a new validation run is performed. If a list,
<add> specifies the epochs on which to run validation.
<add> epoch: Integer, the number of the training epoch just completed.
<add>
<add> # Returns
<add> Bool, True if validation should be run.
<add>
<add> # Raises
<add> ValueError: if `validation_freq` is an Integer and less than 1, or if
<add> it is neither an Integer nor a Sequence.
<ide> """
<ide> # `epoch` is 0-indexed internally but 1-indexed in the public API.
<ide> one_indexed_epoch = epoch + 1
<ide><path>keras/utils/test_utils.py
<ide> class tf_file_io_proxy(object):
<ide> recommended to use method `get_filepath(filename)` in tests to make them
<ide> pass with and without a real GCS bucket during testing. See example below.
<ide>
<del> Arguments:
<add> # Arguments
<ide> file_io_module: String identifier of the file_io module import to patch. E.g
<ide> 'keras.engine.saving.tf_file_io'
<ide> bucket_name: String identifier of *a real* GCS bucket (with or without the
<ide> 'gs://' prefix). A bucket name provided with argument precedes what is
<ide> specified using the GCS_TEST_BUCKET environment variable.
<ide>
<del> Example:
<add> # Example
<ide> ```python
<ide> model = Sequential()
<ide> model.add(Dense(2, input_shape=(3,)))
<ide><path>keras/utils/vis_utils.py
<ide> def plot_model(model,
<ide> expand_nested: whether to expand nested models into clusters.
<ide> dpi: dot DPI.
<ide>
<del> # Returns:
<add> # Returns
<ide> A Jupyter notebook Image object if Jupyter is installed.
<ide> This enables in-line display of the model plots in notebooks.
<ide> """
<ide><path>tests/keras/backend/backend_test.py
<ide> def test_set_learning_phase(self):
<ide> with pytest.raises(ValueError):
<ide> K.set_learning_phase(2)
<ide>
<del> def test_eye(self):
<add> def test_creation_operations(self):
<ide> check_single_tensor_operation('eye', 3, WITH_NP, shape_or_val=False)
<add> check_single_tensor_operation('eye', (3, 2), WITH_NP, shape_or_val=False)
<add> check_single_tensor_operation('eye', (3, 4), WITH_NP, shape_or_val=False)
<ide>
<del> def test_ones(self):
<ide> check_single_tensor_operation('ones', (3, 5, 10, 8),
<ide> WITH_NP, shape_or_val=False)
<del>
<del> def test_zeros(self):
<ide> check_single_tensor_operation('zeros', (3, 5, 10, 8),
<ide> WITH_NP, shape_or_val=False)
<ide>
<del> def test_ones_like(self):
<del> check_single_tensor_operation('ones_like', (3, 5, 10, 8),
<del> WITH_NP, shape_or_val=True)
<del>
<del> def test_zeros_like(self):
<del> check_single_tensor_operation('zeros_like', (3, 5, 10, 8),
<del> WITH_NP, shape_or_val=True)
<add> check_single_tensor_operation('ones_like', (3, 5, 10, 8), WITH_NP)
<add> check_single_tensor_operation('zeros_like', (3, 5, 10, 8), WITH_NP)
<ide>
<ide> def test_linear_operations(self):
<ide> check_two_tensor_operation('dot', (4, 2), (2, 4), WITH_NP)
<ide> def test_linear_operations(self):
<ide>
<ide> check_single_tensor_operation('transpose', (4, 2), WITH_NP)
<ide> check_single_tensor_operation('reverse', (4, 3, 2), WITH_NP, axes=1)
<del> if K.backend() != 'cntk':
<del> check_single_tensor_operation('reverse', (4, 3, 2), WITH_NP, axes=(1, 2))
<add> check_single_tensor_operation('reverse', (4, 3, 2), WITH_NP, axes=(1, 2))
<add> check_single_tensor_operation('reverse', (4, 3, 2), WITH_NP, axes=(0, -1))
<ide>
<ide> def test_random_variables(self):
<ide> check_single_tensor_operation('random_uniform_variable', (2, 3), WITH_NP,
<ide> def test_reset_uids(self):
<ide> K.reset_uids()
<ide> assert K.get_uid() == first
<ide>
<del> @pytest.mark.skipif(K.backend() == 'cntk', reason='cntk does not support '
<del> 'cumsum and cumprod yet')
<ide> def test_cumsum(self):
<ide> check_single_tensor_operation('cumsum', (4, 2), WITH_NP)
<ide> check_single_tensor_operation('cumsum', (4, 2), WITH_NP, axis=1)
<ide>
<del> @pytest.mark.skipif(K.backend() == 'cntk', reason='cntk does not support '
<del> 'cumprod yet')
<ide> def test_cumprod(self):
<ide> check_single_tensor_operation('cumprod', (4, 2), WITH_NP)
<ide> check_single_tensor_operation('cumprod', (4, 2), WITH_NP, axis=1)
<ide> def test_nn_operations(self):
<ide> check_single_tensor_operation('softmax', (4, 5, 3), WITH_NP, axis=1)
<ide> check_single_tensor_operation('softmax', (4, 5, 3, 10), WITH_NP, axis=2)
<ide>
<del> check_two_tensor_operation('binary_crossentropy', (4, 2), (4, 2),
<del> WITH_NP, from_logits=True)
<del> # cross_entropy call require the label is a valid probability distribution,
<del> # otherwise it is garbage in garbage out...
<del> # due to the algo difference, we can't guarantee CNTK has the same result
<del> # on the garbage input.
<del> # so create a separate test case for valid label input
<del> if K.backend() != 'cntk':
<del> check_two_tensor_operation('categorical_crossentropy', (4, 2), (4, 2),
<del> WITH_NP, from_logits=True)
<del> xval = np.asarray([[0.26157712, 0.0432167], [-0.43380741, 0.30559841],
<del> [0.20225059, -0.38956559], [-0.13805378, 0.08506755]],
<del> dtype=np.float32)
<del> yval = np.asarray([[0.46221867, 0.53778133], [0.51228984, 0.48771016],
<del> [0.64916514, 0.35083486], [0.47028078, 0.52971922]],
<del> dtype=np.float32)
<del> check_two_tensor_operation('categorical_crossentropy', yval, xval, WITH_NP,
<del> cntk_two_dynamicity=True, from_logits=True)
<del> check_two_tensor_operation('binary_crossentropy', (4, 2), (4, 2),
<del> WITH_NP, from_logits=False)
<del> check_two_tensor_operation('categorical_crossentropy', (4, 2), (4, 2),
<del> WITH_NP, from_logits=False)
<del>
<ide> check_single_tensor_operation('l2_normalize', (4, 3), WITH_NP, axis=-1)
<ide> check_single_tensor_operation('l2_normalize', (4, 3), WITH_NP, axis=1)
<ide>
<add> def test_crossentropy(self):
<add> # toy label matrix (4 samples, 2 classes)
<add> label = np.array([[.4, .6], [.3, .7], [.1, .9], [.2, .8]], dtype=np.float32)
<add> check_two_tensor_operation('binary_crossentropy', label, (4, 2), WITH_NP)
<add> check_two_tensor_operation('binary_crossentropy', label, (4, 2),
<add> WITH_NP, from_logits=True)
<add> check_two_tensor_operation('categorical_crossentropy', label, (4, 2),
<add> WITH_NP, cntk_two_dynamicity=True)
<add> check_two_tensor_operation('categorical_crossentropy', label, (4, 2),
<add> WITH_NP, cntk_two_dynamicity=True,
<add> from_logits=True)
<add>
<add> # toy label matrix (2 samples, 3 classes)
<add> label = np.array([[.4, .1, .5], [.2, .6, .2]], dtype=np.float32)
<add> check_two_tensor_operation('categorical_crossentropy', label, (2, 3),
<add> WITH_NP, cntk_two_dynamicity=True)
<add> check_two_tensor_operation('categorical_crossentropy', label, (2, 3),
<add> WITH_NP, cntk_two_dynamicity=True,
<add> from_logits=True)
<add>
<ide> @pytest.mark.skipif(K.backend() == 'cntk', reason='Bug in CNTK')
<ide> def test_in_top_k(self):
<ide> batch_size = 20 | 11 |
Ruby | Ruby | break conditional branches into separate methods | 8e9e424fa99769b86c1d3df958ef534868543316 | <ide><path>activerecord/lib/active_record/associations/association_scope.rb
<ide> def join_type
<ide> end
<ide>
<ide> def self.get_bind_values(owner, chain)
<del> bvs = []
<del> chain.each_with_index do |reflection, i|
<del> if reflection == chain.last
<del> bvs << reflection.join_id_for(owner)
<del> if reflection.type
<del> bvs << owner.class.base_class.name
<del> end
<del> else
<del> if reflection.type
<del> bvs << chain[i + 1].klass.base_class.name
<del> end
<add> binds = []
<add> last_reflection = chain.last
<add>
<add> binds << last_reflection.join_id_for(owner)
<add> if last_reflection.type
<add> binds << owner.class.base_class.name
<add> end
<add>
<add> chain.each_cons(2).each do |reflection, next_reflection|
<add> if reflection.type
<add> binds << next_reflection.klass.base_class.name
<ide> end
<ide> end
<del> bvs
<add> binds
<ide> end
<ide>
<ide> private
<ide> def bind(scope, table_name, column_name, value, tracker)
<ide> bind_value scope, column, value, tracker
<ide> end
<ide>
<add> def last_chain_scope(scope, table, reflection, owner, tracker, assoc_klass)
<add> join_keys = reflection.join_keys(assoc_klass)
<add> key = join_keys.key
<add> foreign_key = join_keys.foreign_key
<add>
<add> bind_val = bind scope, table.table_name, key.to_s, owner[foreign_key], tracker
<add> scope = scope.where(table[key].eq(bind_val))
<add>
<add> if reflection.type
<add> value = owner.class.base_class.name
<add> bind_val = bind scope, table.table_name, reflection.type, value, tracker
<add> scope = scope.where(table[reflection.type].eq(bind_val))
<add> else
<add> scope
<add> end
<add> end
<add>
<add> def next_chain_scope(scope, table, reflection, chain, tracker, assoc_klass, foreign_table, next_reflection)
<add> join_keys = reflection.join_keys(assoc_klass)
<add> key = join_keys.key
<add> foreign_key = join_keys.foreign_key
<add>
<add> constraint = table[key].eq(foreign_table[foreign_key])
<add>
<add> if reflection.type
<add> value = next_reflection.klass.base_class.name
<add> bind_val = bind scope, table.table_name, reflection.type, value, tracker
<add> scope = scope.where(table[reflection.type].eq(bind_val))
<add> end
<add>
<add> scope = scope.joins(join(foreign_table, constraint))
<add> end
<add>
<ide> def add_constraints(scope, owner, assoc_klass, refl, tracker)
<ide> chain = refl.chain
<ide> scope_chain = refl.scope_chain
<ide>
<ide> tables = construct_tables(chain, assoc_klass, refl, tracker)
<ide>
<add> reflection = chain.last
<add> table = tables.last
<add> scope = last_chain_scope(scope, table, reflection, owner, tracker, assoc_klass)
<add>
<ide> chain.each_with_index do |reflection, i|
<ide> table, foreign_table = tables.shift, tables.first
<ide>
<del> join_keys = reflection.join_keys(assoc_klass)
<del> key = join_keys.key
<del> foreign_key = join_keys.foreign_key
<del>
<del> if reflection == chain.last
<del> bind_val = bind scope, table.table_name, key.to_s, owner[foreign_key], tracker
<del> scope = scope.where(table[key].eq(bind_val))
<del>
<del> if reflection.type
<del> value = owner.class.base_class.name
<del> bind_val = bind scope, table.table_name, reflection.type, value, tracker
<del> scope = scope.where(table[reflection.type].eq(bind_val))
<del> end
<del> else
<del> constraint = table[key].eq(foreign_table[foreign_key])
<del>
<del> if reflection.type
<del> value = chain[i + 1].klass.base_class.name
<del> bind_val = bind scope, table.table_name, reflection.type, value, tracker
<del> scope = scope.where(table[reflection.type].eq(bind_val))
<del> end
<del>
<del> scope = scope.joins(join(foreign_table, constraint))
<add> unless reflection == chain.last
<add> next_reflection = chain[i + 1]
<add> scope = next_chain_scope(scope, table, reflection, chain, tracker, assoc_klass, foreign_table, next_reflection)
<ide> end
<ide>
<ide> is_first_chain = i == 0 | 1 |
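The refactor replaces `each_with_index` plus `chain[i + 1]` lookups with `each_cons(2)`, which yields every reflection together with its successor. A standalone sketch of that pairing — the hashes and the `'owner-join-id'` placeholder stand in for the reflection objects and join ids in the patch:

```ruby
# Each hash stands in for a reflection in the association chain;
# :type is the polymorphic type column (nil when not polymorphic).
chain = [
  { name: 'taggings', type: 'taggable_type' },
  { name: 'posts',    type: nil },
  { name: 'authors',  type: nil }
]

binds = []

# The last link contributes the owner's join id (a placeholder here),
# plus the owner's base class name when it is polymorphic.
binds << 'owner-join-id'
binds << 'OwnerBaseClass' if chain.last[:type]

# each_cons(2) pairs every reflection with its successor, replacing
# the index arithmetic (chain[i + 1]) from the old implementation.
chain.each_cons(2) do |reflection, next_reflection|
  binds << next_reflection[:name] if reflection[:type]
end

p binds # => ["owner-join-id", "posts"]
```

Handling the last link before the loop also removes the `if reflection == chain.last` branch that the old version re-evaluated on every iteration.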
PHP | PHP | update email.blade.php | 3c397ea1f4c7af4346f4f83144ea8d3ff307f405 | <ide><path>src/Illuminate/Notifications/resources/views/email.blade.php
<ide> @component('mail::subcopy')
<ide> @lang(
<ide> "If you’re having trouble clicking the \":actionText\" button, copy and paste the URL below\n".
<del> 'into your web browser: [:actionURL](:actionURL)',
<add> 'into your web browser: ',
<ide> [
<del> 'actionText' => $actionText,
<del> 'actionURL' => $actionUrl
<add> 'actionText' => $actionText
<ide> ]
<ide> )
<add>[{{ $actionUrl }}]({!! $actionUrl !!})
<ide> @endcomponent
<ide> @endisset
<ide> @endcomponent | 1 |
Javascript | Javascript | refactor the code in test-http-connect | 72a8488e8e0108e92422443364752b788d58ea56 | <ide><path>test/parallel/test-http-connect.js
<ide> 'use strict';
<ide> const common = require('../common');
<del>var assert = require('assert');
<del>var http = require('http');
<add>const assert = require('assert');
<add>const http = require('http');
<ide>
<del>var serverGotConnect = false;
<del>var clientGotConnect = false;
<add>const server = http.createServer(common.fail);
<ide>
<del>var server = http.createServer(common.fail);
<del>server.on('connect', function(req, socket, firstBodyChunk) {
<del> assert.equal(req.method, 'CONNECT');
<del> assert.equal(req.url, 'google.com:443');
<del> console.error('Server got CONNECT request');
<del> serverGotConnect = true;
<add>server.on('connect', common.mustCall((req, socket, firstBodyChunk) => {
<add> assert.strictEqual(req.method, 'CONNECT');
<add> assert.strictEqual(req.url, 'google.com:443');
<ide>
<ide> socket.write('HTTP/1.1 200 Connection established\r\n\r\n');
<ide>
<del> var data = firstBodyChunk.toString();
<del> socket.on('data', function(buf) {
<add> let data = firstBodyChunk.toString();
<add> socket.on('data', (buf) => {
<ide> data += buf.toString();
<ide> });
<del> socket.on('end', function() {
<add>
<add> socket.on('end', common.mustCall(() => {
<ide> socket.end(data);
<del> });
<del>});
<del>server.listen(0, function() {
<del> var req = http.request({
<add> }));
<add>}));
<add>
<add>server.listen(0, common.mustCall(function() {
<add> const req = http.request({
<ide> port: this.address().port,
<ide> method: 'CONNECT',
<ide> path: 'google.com:443'
<ide> }, common.fail);
<ide>
<del> var clientRequestClosed = false;
<del> req.on('close', function() {
<del> clientRequestClosed = true;
<del> });
<del>
<del> req.on('connect', function(res, socket, firstBodyChunk) {
<del> console.error('Client got CONNECT request');
<del> clientGotConnect = true;
<add> req.on('close', common.mustCall(() => {}));
<ide>
<add> req.on('connect', common.mustCall((res, socket, firstBodyChunk) => {
<ide> // Make sure this request got removed from the pool.
<del> var name = 'localhost:' + server.address().port;
<add> const name = 'localhost:' + server.address().port;
<ide> assert(!http.globalAgent.sockets.hasOwnProperty(name));
<ide> assert(!http.globalAgent.requests.hasOwnProperty(name));
<ide>
<ide> // Make sure this socket has detached.
<ide> assert(!socket.ondata);
<ide> assert(!socket.onend);
<del> assert.equal(socket.listeners('connect').length, 0);
<del> assert.equal(socket.listeners('data').length, 0);
<add> assert.strictEqual(socket.listeners('connect').length, 0);
<add> assert.strictEqual(socket.listeners('data').length, 0);
<ide>
<ide> // the stream.Duplex onend listener
<ide> // allow 0 here, so that i can run the same test on streams1 impl
<ide> assert(socket.listeners('end').length <= 1);
<ide>
<del> assert.equal(socket.listeners('free').length, 0);
<del> assert.equal(socket.listeners('close').length, 0);
<del> assert.equal(socket.listeners('error').length, 0);
<del> assert.equal(socket.listeners('agentRemove').length, 0);
<add> assert.strictEqual(socket.listeners('free').length, 0);
<add> assert.strictEqual(socket.listeners('close').length, 0);
<add> assert.strictEqual(socket.listeners('error').length, 0);
<add> assert.strictEqual(socket.listeners('agentRemove').length, 0);
<ide>
<del> var data = firstBodyChunk.toString();
<del> socket.on('data', function(buf) {
<add> let data = firstBodyChunk.toString();
<add> socket.on('data', (buf) => {
<ide> data += buf.toString();
<ide> });
<del> socket.on('end', function() {
<del> assert.equal(data, 'HeadBody');
<del> assert(clientRequestClosed);
<add>
<add> socket.on('end', common.mustCall(() => {
<add> assert.strictEqual(data, 'HeadBody');
<ide> server.close();
<del> });
<add> }));
<add>
<ide> socket.write('Body');
<ide> socket.end();
<del> });
<add> }));
<ide>
<ide> // It is legal for the client to send some data intended for the server
<ide> // before the "200 Connection established" (or any other success or
<ide> // error code) is received.
<ide> req.write('Head');
<ide> req.end();
<del>});
<del>
<del>process.on('exit', function() {
<del> assert.ok(serverGotConnect);
<del> assert.ok(clientGotConnect);
<del>});
<add>})); | 1 |
Java | Java | improve TestRecursiveScheduler2 determinism | ed30a269900bf1166819c48a161ffc13bd93e0be
<ide> public Subscription call(Scheduler scheduler, BooleanSubscription cancel) {
<ide> observer.onNext(42);
<ide> latch.countDown();
<ide>
<del> try {
<del> Thread.sleep(1);
<del> } catch (InterruptedException e) {
<del> e.printStackTrace();
<del> }
<del>
<add> // this will recursively schedule this task for execution again
<ide> scheduler.schedule(cancel, this);
<ide>
<ide> return cancel;
<ide> public void onNext(Integer args) {
<ide> fail("Timed out waiting on completion latch");
<ide> }
<ide>
<del> assertEquals(10, count.get()); // wondering if this could be 11 in a race condition (which would be okay due to how unsubscribe works ... just it would make this test non-deterministic)
<add> // the count can be 10 or higher due to thread scheduling of the unsubscribe vs the scheduler looping to emit the count
<add> assertTrue(count.get() >= 10);
<ide> assertTrue(completed.get());
<ide> }
<ide> | 1 |
Ruby | Ruby | improve the deprecation message for Errors#on | bc1dd0b82eb994a9948c038e01db7ced5a48ffea
<ide> module ActiveModel
<ide> module DeprecatedErrorMethods
<ide> def on(attribute)
<ide> message = "Errors#on have been deprecated, use Errors#[] instead.\n"
<del> message << "Also note that the behaviour of Errors#[] has changed. Errors#[] now always returns an Array and an empty Array when "
<del> message << "there are not errors on the specified attribute."
<add> message << "Also note that the behaviour of Errors#[] has changed. Errors#[] now always returns an Array. An empty Array is "
<add> message << "returned when there are no errors on the specified attribute."
<ide> ActiveSupport::Deprecation.warn(message)
<ide>
<ide> errors = self[attribute] | 1 |
Javascript | Javascript | add a fallback for fullstack cert | 4c14e3f3965be359936fc05006773d9031ebc25d | <ide><path>api-server/server/boot/certificate.js
<ide> import {
<ide> infosecQaId,
<ide> fullStackId
<ide> } from '../utils/constantStrings.json';
<add>import { oldDataVizId } from '../../../config/misc';
<ide> import certTypes from '../utils/certTypes.json';
<ide> import superBlockCertTypeMap from '../utils/superBlockCertTypeMap';
<ide> import { completeCommitment$ } from '../utils/commit';
<ide> function createShowCert(app) {
<ide> let { completedDate = new Date() } = certChallenge || {};
<ide>
<ide> // the challenge id has been rotated for isDataVisCert
<del> // so we need to check for id 561add10cb82ac38a17513b3
<ide> if (certType === 'isDataVisCert' && !certChallenge) {
<del> console.log('olderId');
<ide> let oldDataVisIdChall = _.find(
<ide> completedChallenges,
<del> ({ id }) => '561add10cb82ac38a17513b3' === id
<add> ({ id }) => oldDataVizId === id
<ide> );
<ide>
<ide> if (oldDataVisIdChall) {
<ide> completedDate = oldDataVisIdChall.completedDate || completedDate;
<ide> }
<ide> }
<ide>
<add> // if fullcert is not found, return the latest completedDate
<add> if (certType === 'isFullStackCert' && !certChallenge) {
<add> var chalIds = [...Object.values(certIds), oldDataVizId];
<add>
<add> const latestCertDate = completedChallenges
<add> .filter(chal => chalIds.includes(chal.id))
<add> .sort((a, b) => a.completedDate < b.completedDate)[0].completedDate;
<add>
<add> completedDate = latestCertDate ? latestCertDate : completedDate;
<add> }
<add>
<ide> const { username, name } = user;
<ide> return res.json({
<ide> certTitle,
<ide><path>config/misc.js
<add>exports.oldDataVizId = '561add10cb82ac38a17513b3'; | 2 |
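One thing to watch in the fallback above: the comparator `(a, b) => a.completedDate < b.completedDate` returns a boolean, but `Array.prototype.sort` expects a negative/zero/positive number, so the resulting order is engine-dependent. A comparator that reliably puts the latest `completedDate` first subtracts the timestamps — the records below are hypothetical, just to show the pattern:

```javascript
// Hypothetical completed-challenge records with millisecond timestamps.
const completedChallenges = [
  { id: 'a', completedDate: 1000 },
  { id: 'b', completedDate: 3000 },
  { id: 'c', completedDate: 2000 }
];

// Numeric comparator: b - a sorts descending, so index 0 is the
// most recently completed challenge.
const latest = completedChallenges
  .slice() // avoid mutating the source array (sort works in place)
  .sort((a, b) => b.completedDate - a.completedDate)[0];

console.log(latest.id); // → b
```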
Mixed | Javascript | fix some typos | 7ce9710f336c2ad2359466dc9357979c4d562446 | <ide><path>docs/docs/configuration/legend.md
<ide> The legend label configuration is nested below the legend configuration using th
<ide> | `generateLabels` | `function` | | Generates legend items for each thing in the legend. Default implementation returns the text + styling for the color box. See [Legend Item](#legend-item-interface) for details.
<ide> | `filter` | `function` | `null` | Filters legend items out of the legend. Receives 2 parameters, a [Legend Item](#legend-item-interface) and the chart data.
<ide> | `sort` | `function` | `null` | Sorts legend items. Receives 3 parameters, two [Legend Items](#legend-item-interface) and the chart data.
<del>| `usePointStyle` | `boolean` | `false` | Label style will match corresponding point style (size is based on the mimimum value between boxWidth and font.size).
<add>| `usePointStyle` | `boolean` | `false` | Label style will match corresponding point style (size is based on the minimum value between boxWidth and font.size).
<ide>
<ide> ## Legend Title Configuration
<ide>
<ide><path>docs/docs/general/performance.md
<ide> new Chart(ctx, {
<ide>
<ide> ## Parallel rendering with web workers (Chrome only)
<ide>
<del>Chome (in version 69) added the ability to [transfer rendering control of a canvas](https://developer.mozilla.org/en-US/docs/Web/API/HTMLCanvasElement/transferControlToOffscreen) to a web worker. Web workers can use the [OffscreenCanvas API](https://developer.mozilla.org/en-US/docs/Web/API/OffscreenCanvas) to render from a web worker onto canvases in the DOM. Chart.js is a canvas-based library and supports rendering in a web worker - just pass an OffscreenCanvas into the Chart constructor instead of a Canvas element. Note that as of today, this API is only supported in Chrome.
<add>Chrome (in version 69) added the ability to [transfer rendering control of a canvas](https://developer.mozilla.org/en-US/docs/Web/API/HTMLCanvasElement/transferControlToOffscreen) to a web worker. Web workers can use the [OffscreenCanvas API](https://developer.mozilla.org/en-US/docs/Web/API/OffscreenCanvas) to render from a web worker onto canvases in the DOM. Chart.js is a canvas-based library and supports rendering in a web worker - just pass an OffscreenCanvas into the Chart constructor instead of a Canvas element. Note that as of today, this API is only supported in Chrome.
<ide>
<ide> By moving all Chart.js calculations onto a separate thread, the main thread can be freed up for other uses. Some tips and tricks when using Chart.js in a web worker:
<ide> * Transferring data between threads can be expensive, so ensure that your config and data objects are as small as possible. Try generating them on the worker side if you can (workers can make HTTP requests!) or passing them to your worker as ArrayBuffers, which can be transferred quickly from one thread to another.
<ide> new Chart(ctx, {
<ide> });
<ide> ```
<ide>
<del>## When transpiling with Babel, cosider using `loose` mode
<add>## When transpiling with Babel, consider using `loose` mode
<ide>
<ide> Babel 7.9 changed the way classes are constructed. It is slow, unless used with `loose` mode.
<ide> [More information](https://github.com/babel/babel/issues/11356)
<ide><path>docs/docs/general/responsive.md
<ide> chart.canvas.parentNode.style.width = '128px';
<ide>
<ide> Note that in order for the above code to correctly resize the chart height, the [`maintainAspectRatio`](#configuration-options) option must also be set to `false`.
<ide>
<del>## Printing Resizeable Charts
<add>## Printing Resizable Charts
<ide>
<ide> CSS media queries allow changing styles when printing a page. The CSS applied from these media queries may cause charts to need to resize. However, the resize won't happen automatically. To support resizing charts when printing, one needs to hook the [onbeforeprint](https://developer.mozilla.org/en-US/docs/Web/API/WindowEventHandlers/onbeforeprint) event and manually trigger resizing of each chart.
<ide>
<ide><path>src/core/core.animations.js
<ide> export default class Animations {
<ide> const animations = this._createAnimations(options, newOptions);
<ide> if (newOptions.$shared && !options.$shared) {
<ide> // Going from distinct options to shared options:
<del> // After all animations are done, assing the shared options object to the element
<add> // After all animations are done, assign the shared options object to the element
<ide> // So any new updates to the shared options are observed
<ide> awaitAll(target.options.$animations, newOptions).then(() => {
<ide> target.options = newOptions;
<ide><path>src/core/core.plugins.js
<ide> function createDescriptors(plugins, options) {
<ide> /**
<ide> * @method IPlugin#beforeDraw
<ide> * @desc Called before drawing `chart` at every animation frame. If any plugin returns `false`,
<del> * the frame drawing is cancelled untilanother `render` is triggered.
<add> * the frame drawing is cancelled until another `render` is triggered.
<ide> * @param {Chart} chart - The chart instance.
<ide> * @param {object} options - The plugin options.
<ide> * @returns {boolean} `false` to cancel the chart drawing.
<ide><path>src/scales/scale.category.js
<ide> export default class CategoryScale extends Scale {
<ide> return me.getPixelForDecimal((value - me._startValue) / me._valueRange);
<ide> }
<ide>
<del> // Must override base implementation becuase it calls getPixelForValue
<add> // Must override base implementation because it calls getPixelForValue
<ide> // and category scale can have duplicate values
<ide> getPixelForTick(index) {
<ide> const me = this; | 6 |
PHP | PHP | apply scopes on increment/decrement | 4a70fd555b6dd0c81acf5a45955c1b7e82ea375b | <ide><path>src/Illuminate/Database/Eloquent/Builder.php
<ide> public function increment($column, $amount = 1, array $extra = [])
<ide> {
<ide> $extra = $this->addUpdatedAtColumn($extra);
<ide>
<del> return $this->query->increment($column, $amount, $extra);
<add> return $this->toBase()->increment($column, $amount, $extra);
<ide> }
<ide>
<ide> /**
<ide> public function decrement($column, $amount = 1, array $extra = [])
<ide> {
<ide> $extra = $this->addUpdatedAtColumn($extra);
<ide>
<del> return $this->query->decrement($column, $amount, $extra);
<add> return $this->toBase()->decrement($column, $amount, $extra);
<ide> }
<ide>
<ide> /**
<ide><path>tests/Database/DatabaseEloquentSoftDeletesIntegrationTest.php
<ide> public function testSoftDeletesAreNotRetrievedFromBuilderHelpers()
<ide>
<ide> $query = SoftDeletesTestUser::query();
<ide> $this->assertCount(1, $query->simplePaginate(2)->all());
<add>
<add> $this->assertEquals(0, SoftDeletesTestUser::where('email', 'taylorotwell@gmail.com')->increment('id'));
<add> $this->assertEquals(0, SoftDeletesTestUser::where('email', 'taylorotwell@gmail.com')->decrement('id'));
<ide> }
<ide>
<ide> public function testWithTrashedReturnsAllRecords() | 2 |
Ruby | Ruby | move anonymous class to the top, add documentation | fae8f72076c7feef271176636aef9e2dca3930fe | <ide><path>actionpack/lib/action_controller/base.rb
<ide> require "action_controller/metal/params_wrapper"
<ide>
<ide> module ActionController
<add> # The <tt>metal</tt> anonymous class is a simple workaround for the ordering issues that exist with modules.
<add> # They need to be included in a specific order, which makes it impossible for 3rd party libs (like ActiveRecord)
<add> # to hook in their own functionality. Having an anonymous superclass of Metal with <tt>AbstractController::Rendering</tt>
<add> # included allows us to include <tt>ActionView::Rendering</tt> (which implements the <tt>AbstractController::Rendering</tt> interface)
<add> # after <tt>AbstractController::Rendering</tt> and before <tt>ActionController::Rendering</tt>.
<add> metal = Class.new(Metal) do
<add> include AbstractController::Rendering
<add> end
<add>
<ide> # Action Controllers are the core of a web request in \Rails. They are made up of one or more actions that are executed
<ide> # on request and then either it renders a template or redirects to another action. An action is defined as a public method
<ide> # on the controller, which will automatically be made accessible to the web-server through \Rails Routes.
<ide> module ActionController
<ide> # render action: "overthere" # won't be called if monkeys is nil
<ide> # end
<ide> #
<del> metal = Class.new(Metal) do
<del> include AbstractController::Rendering
<del> end
<del>
<ide> class Base < metal
<ide> abstract!
<ide> | 1 |
Ruby | Ruby | simplify bind parameter logging | 61b69338b27e3d865d4ca87269a7c12a74ea32da | <ide><path>activerecord/lib/active_record/log_subscriber.rb
<ide> def sql(event)
<ide> binds = nil
<ide>
<ide> unless (payload[:binds] || []).empty?
<del> binds = " {" + payload[:binds].map { |col,v|
<del> "#{col.name.inspect} => #{v.inspect}"
<del> }.join(", ") + "}"
<add> binds = " #{Hash[payload[:binds].map { |col,v| [col.name, v] }]}"
<ide> end
<ide>
<ide> if odd?
<ide><path>activerecord/test/cases/bind_parameter_test.rb
<ide> def debug str
<ide> }.new
<ide>
<ide> logger.sql event
<del> assert_match("{#{pk.name.inspect} => #{10.inspect}}", logger.debugs.first)
<add> assert_match({pk.name => 10}.inspect, logger.debugs.first)
<ide> end
<ide> end
<ide> end | 2 |
Python | Python | set version to 2.0.17 | db08b168a34ac1d32decb8d2b45e57b5f45f4b2f | <ide><path>spacy/about.py
<ide> # https://github.com/pypa/warehouse/blob/master/warehouse/__about__.py
<ide>
<ide> __title__ = 'spacy'
<del>__version__ = '2.0.17.dev1'
<add>__version__ = '2.0.17'
<ide> __summary__ = 'Industrial-strength Natural Language Processing (NLP) with Python and Cython'
<ide> __uri__ = 'https://spacy.io'
<ide> __author__ = 'Explosion AI'
<ide> __email__ = 'contact@explosion.ai'
<ide> __license__ = 'MIT'
<del>__release__ = False
<add>__release__ = True
<ide>
<ide> __download_url__ = 'https://github.com/explosion/spacy-models/releases/download'
<ide> __compatibility__ = 'https://raw.githubusercontent.com/explosion/spacy-models/master/compatibility.json' | 1 |
Javascript | Javascript | remove old loader | a14d4d60eb4d7fcf9a20a23eeeff19a80dcce5bf | <ide><path>examples/js/loaders/ABINloader.js
<del>
<del>(function(){
<del>
<del>
<del>var Virtulous = {};
<del>
<del>
<del>Virtulous.KeyFrame = function( time, matrix ) {
<del>
<del> this.time = time;
<del> this.matrix = matrix.clone();
<del> this.position = new THREE.Vector3();
<del> this.quaternion = new THREE.Quaternion();
<del> this.scale = new THREE.Vector3( 1, 1, 1 );
<del> this.matrix.decompose( this.position, this.quaternion, this.scale );
<del> this.clone = function() {
<del>
<del> var n = new Virtulous.KeyFrame( this.time, this.matrix );
<del> return n;
<del>
<del> }
<del> this.lerp = function( nextKey, time ) {
<del>
<del> time -= this.time;
<del> var dist = ( nextKey.time - this.time );
<del> var l = time / dist;
<del> var l2 = 1 - l;
<del> var keypos = this.position;
<del> var keyrot = this.quaternion;
<del> // var keyscl = key.parentspaceScl || key.scl;
<del> var key2pos = nextKey.position;
<del> var key2rot = nextKey.quaternion
<del> // var key2scl = key2.parentspaceScl || key2.scl;
<del> Virtulous.KeyFrame.tempAniPos.x = keypos.x * l2 + key2pos.x * l;
<del> Virtulous.KeyFrame.tempAniPos.y = keypos.y * l2 + key2pos.y * l;
<del> Virtulous.KeyFrame.tempAniPos.z = keypos.z * l2 + key2pos.z * l;
<del> // tempAniScale.x = keyscl[0] * l2 + key2scl[0] * l;
<del> // tempAniScale.y = keyscl[1] * l2 + key2scl[1] * l;
<del> // tempAniScale.z = keyscl[2] * l2 + key2scl[2] * l;
<del> Virtulous.KeyFrame.tempAniQuat.set( keyrot.x, keyrot.y, keyrot.z, keyrot.w );
<del> Virtulous.KeyFrame.tempAniQuat.slerp( key2rot, l );
<del> return Virtulous.KeyFrame.tempAniMatrix.compose( Virtulous.KeyFrame.tempAniPos, Virtulous.KeyFrame.tempAniQuat, Virtulous.KeyFrame.tempAniScale );
<del>
<del> }
<del>
<del>}
<del>Virtulous.KeyFrame.tempAniPos = new THREE.Vector3();
<del>Virtulous.KeyFrame.tempAniQuat = new THREE.Quaternion();
<del>Virtulous.KeyFrame.tempAniScale = new THREE.Vector3( 1, 1, 1 );
<del>Virtulous.KeyFrame.tempAniMatrix = new THREE.Matrix4();
<del>Virtulous.KeyFrameTrack = function() {
<del>
<del> this.keys = [];
<del> this.target = null;
<del> this.time = 0;
<del> this.length = 0;
<del> this._accelTable = {};
<del> this.fps = 20;
<del> this.addKey = function( key ) {
<del>
<del> this.keys.push( key );
<del>
<del> }
<del> this.init = function() {
<del>
<del> this.sortKeys();
<del> if ( this.keys.length > 0 )
<del> this.length = this.keys[ this.keys.length - 1 ].time;
<del> else
<del> this.length = 0;
<del> if ( !this.fps ) return;
<del> for ( var j = 0; j < this.length * this.fps; j++ ) {
<del>
<del> for ( var i = 0; i < this.keys.length; i++ ) {
<del>
<del> if ( this.keys[ i ].time == j ) {
<del>
<del> this._accelTable[ j ] = i;
<del> break;
<del> } else if ( this.keys[ i ].time < j / this.fps && this.keys[ i + 1 ] && this.keys[ i + 1 ].time >= j / this.fps ) {
<del>
<del> this._accelTable[ j ] = i;
<del> break;
<del>
<del> }
<del>
<del> }
<del>
<del> }
<del>
<del> }
<del> this.parseFromThree = function( data ) {
<del>
<del> var fps = data.fps;
<del> this.target = data.node;
<del> var track = data.hierarchy[ 0 ].keys;
<del> for ( var i = 0; i < track.length; i++ ) {
<del>
<del> this.addKey( new Virtulous.KeyFrame( i / fps || track[ i ].time, track[ i ].targets[ 0 ].data ) )
<del>
<del> }
<del> this.init();
<del>
<del> }
<del> this.parseFromCollada = function( data ) {
<del>
<del> var track = data.keys;
<del> var fps = this.fps;
<del> for ( var i = 0; i < track.length; i++ ) {
<del>
<del> this.addKey( new Virtulous.KeyFrame( i / fps || track[ i ].time, track[ i ].matrix ) )
<del>
<del> }
<del> this.init();
<del>
<del> }
<del> this.sortKeys = function() {
<del>
<del> this.keys.sort( this.keySortFunc )
<del>
<del> }
<del> this.keySortFunc = function( a, b ) {
<del>
<del> return a.time - b.time;
<del> }
<del>
<del> this.clone = function() {
<del>
<del> var t = new Virtulous.KeyFrameTrack();
<del> t.target = this.target;
<del> t.time = this.time;
<del> t.length = this.length;
<del> for ( var i = 0; i < this.keys.length; i++ ) {
<del>
<del> t.addKey( this.keys[ i ].clone() );
<del> }
<del>
<del> t.init();
<del> return t;
<del> }
<del>
<del> this.reTarget = function( root, compareitor ) {
<del>
<del> if ( !compareitor ) compareitor = Virtulous.TrackTargetNodeNameCompare;
<del> this.target = compareitor( root, this.target );
<del>
<del> }
<del> this.keySearchAccel = function( time ) {
<del>
<del> time *= this.fps;
<del> time = Math.floor( time );
<del> return this._accelTable[ time ] || 0;
<del> }
<del>
<del> this.setTime = function( time ) {
<del>
<del> time = Math.abs( time );
<del> if ( this.length )
<del> time = time % this.length + .05;
<del> var key0 = null;
<del> var key1 = null;
<del> for ( var i = this.keySearchAccel( time ); i < this.keys.length; i++ ) {
<del>
<del> if ( this.keys[ i ].time == time ) {
<del>
<del> key0 = this.keys[ i ];
<del> key1 = this.keys[ i ];
<del> break;
<del>
<del> } else if ( this.keys[ i ].time < time && this.keys[ i + 1 ] && this.keys[ i + 1 ].time > time ) {
<del>
<del> key0 = this.keys[ i ];
<del> key1 = this.keys[ i + 1 ];
<del> break;
<del>
<del> } else if ( this.keys[ i ].time < time && i == this.keys.length - 1 ) {
<del>
<del> key0 = this.keys[ i ];
<del> key1 = this.keys[ 0 ].clone();
<del> key1.time += this.length + .05;
<del> break;
<del>
<del> }
<del>
<del> }
<del> if ( key0 && key1 && key0 !== key1 ) {
<del>
<del> this.target.matrixAutoUpdate = false;
<del> this.target.matrix.copy( key0.lerp( key1, time ) );
<del> this.target.matrixWorldNeedsUpdate = true;
<del> return;
<del>
<del> }
<del> if ( key0 && key1 && key0 == key1 ) {
<del>
<del> this.target.matrixAutoUpdate = false;
<del> this.target.matrix.copy( key0.matrix );
<del> this.target.matrixWorldNeedsUpdate = true;
<del> return;
<del>
<del> }
<del> }
<del>}
<del>Virtulous.TrackTargetNodeNameCompare = function( root, target ) {
<del>
<del> function find( node, name ) {
<del>
<del> if ( node.name == name )
<del> return node;
<del> for ( var i = 0; i < node.children.length; i++ ) {
<del>
<del> var r = find( node.children[ i ], name )
<del> if ( r ) return r;
<del>
<del> }
<del>
<del> return null;
<del>
<del> }
<del>
<del> return find( root, target.name );
<del>
<del>}
<del>Virtulous.Animation = function() {
<del>
<del> this.tracks = [];
<del> this.length = 0;
<del> this.addTrack = function( track ) {
<del>
<del> this.tracks.push( track );
<del> this.length = Math.max( track.length, this.length );
<del>
<del> }
<del> this.setTime = function( time ) {
<del>
<del> this.time = time;
<del> for ( var i = 0; i < this.tracks.length; i++ )
<del> this.tracks[ i ].setTime( time );
<del>
<del> }
<del>
<del> this.clone = function( target, compareitor ) {
<del>
<del> if ( !compareitor ) compareitor = Virtulous.TrackTargetNodeNameCompare;
<del> var n = new Virtulous.Animation();
<del> n.target = target;
<del> for ( var i = 0; i < this.tracks.length; i++ ) {
<del>
<del> var track = this.tracks[ i ].clone();
<del> track.reTarget( target, compareitor );
<del> n.addTrack( track );
<del>
<del> }
<del>
<del> return n;
<del>
<del> }
<del>
<del>}
<del>
<del>var ASSBIN_CHUNK_AICAMERA = 0x1234;
<del>var ASSBIN_CHUNK_AILIGHT = 0x1235;
<del>var ASSBIN_CHUNK_AITEXTURE = 0x1236;
<del>var ASSBIN_CHUNK_AIMESH = 0x1237;
<del>var ASSBIN_CHUNK_AINODEANIM = 0x1238;
<del>var ASSBIN_CHUNK_AISCENE = 0x1239;
<del>var ASSBIN_CHUNK_AIBONE = 0x123a;
<del>var ASSBIN_CHUNK_AIANIMATION = 0x123b;
<del>var ASSBIN_CHUNK_AINODE = 0x123c;
<del>var ASSBIN_CHUNK_AIMATERIAL = 0x123d;
<del>var ASSBIN_CHUNK_AIMATERIALPROPERTY = 0x123e;
<del>var ASSBIN_MESH_HAS_POSITIONS = 0x1;
<del>var ASSBIN_MESH_HAS_NORMALS = 0x2;
<del>var ASSBIN_MESH_HAS_TANGENTS_AND_BITANGENTS = 0x4;
<del>var ASSBIN_MESH_HAS_TEXCOORD_BASE = 0x100;
<del>var ASSBIN_MESH_HAS_COLOR_BASE = 0x10000;
<del>var AI_MAX_NUMBER_OF_COLOR_SETS = 1;
<del>var AI_MAX_NUMBER_OF_TEXTURECOORDS = 4;
<del>var aiLightSource_UNDEFINED = 0x0;
<del>//! A directional light source has a well-defined direction
<del>//! but is infinitely far away. That's quite a good
<del>//! approximation for sun light.
<del>var aiLightSource_DIRECTIONAL = 0x1;
<del>//! A point light source has a well-defined position
<del>//! in space but no direction - it emits light in all
<del>//! directions. A normal bulb is a point light.
<del>var aiLightSource_POINT = 0x2;
<del>//! A spot light source emits light in a specific
<del>//! angle. It has a position and a direction it is pointing to.
<del>//! A good example for a spot light is a light spot in
<del>//! sport arenas.
<del>var aiLightSource_SPOT = 0x3;
<del>//! The generic light level of the world, including the bounces
<del>//! of all other lightsources.
<del>//! Typically, there's at most one ambient light in a scene.
<del>//! This light type doesn't have a valid position, direction, or
<del>//! other properties, just a color.
<del>var aiLightSource_AMBIENT = 0x4;
<del>/** Flat shading. Shading is done on per-face base,
<del> * diffuse only. Also known as 'faceted shading'.
<del> */
<del>var aiShadingMode_Flat = 0x1;
<del>/** Simple Gouraud shading.
<del> */
<del>var aiShadingMode_Gouraud = 0x2;
<del>/** Phong-Shading -
<del> */
<del>var aiShadingMode_Phong = 0x3;
<del>/** Phong-Blinn-Shading
<del> */
<del>var aiShadingMode_Blinn = 0x4;
<del>/** Toon-Shading per pixel
<del> *
<del> * Also known as 'comic' shader.
<del> */
<del>var aiShadingMode_Toon = 0x5;
<del>/** OrenNayar-Shading per pixel
<del> *
<del> * Extension to standard Lambertian shading, taking the
<del> * roughness of the material into account
<del> */
<del>var aiShadingMode_OrenNayar = 0x6;
<del>/** Minnaert-Shading per pixel
<del> *
<del> * Extension to standard Lambertian shading, taking the
<del> * "darkness" of the material into account
<del> */
<del>var aiShadingMode_Minnaert = 0x7;
<del>/** CookTorrance-Shading per pixel
<del> *
<del> * Special shader for metallic surfaces.
<del> */
<del>var aiShadingMode_CookTorrance = 0x8;
<del>/** No shading at all. Constant light influence of 1.0.
<del> */
<del>var aiShadingMode_NoShading = 0x9;
<del>/** Fresnel shading
<del> */
<del>var aiShadingMode_Fresnel = 0xa;
<del>var aiTextureType_NONE = 0x0;
<del>/** The texture is combined with the result of the diffuse
<del> * lighting equation.
<del> */
<del>var aiTextureType_DIFFUSE = 0x1;
<del>/** The texture is combined with the result of the specular
<del> * lighting equation.
<del> */
<del>var aiTextureType_SPECULAR = 0x2;
<del>/** The texture is combined with the result of the ambient
<del> * lighting equation.
<del> */
<del>var aiTextureType_AMBIENT = 0x3;
<del>/** The texture is added to the result of the lighting
<del> * calculation. It isn't influenced by incoming light.
<del> */
<del>var aiTextureType_EMISSIVE = 0x4;
<del>/** The texture is a height map.
<del> *
<del> * By convention, higher gray-scale values stand for
<del> * higher elevations from the base height.
<del> */
<del>var aiTextureType_HEIGHT = 0x5;
<del>/** The texture is a (tangent space) normal-map.
<del> *
<del> * Again, there are several conventions for tangent-space
<del> * normal maps. Assimp does (intentionally) not
<del> * distinguish here.
<del> */
<del>var aiTextureType_NORMALS = 0x6;
<del>/** The texture defines the glossiness of the material.
<del> *
<del> * The glossiness is in fact the exponent of the specular
<del> * (phong) lighting equation. Usually there is a conversion
<del> * function defined to map the linear color values in the
<del> * texture to a suitable exponent. Have fun.
<del> */
<del>var aiTextureType_SHININESS = 0x7;
<del>/** The texture defines per-pixel opacity.
<del> *
<del> * Usually 'white' means opaque and 'black' means
<del> * 'transparency'. Or quite the opposite. Have fun.
<del> */
<del>var aiTextureType_OPACITY = 0x8;
<del>/** Displacement texture
<del> *
<del> * The exact purpose and format is application-dependent.
<del> * Higher color values stand for higher vertex displacements.
<del> */
<del>var aiTextureType_DISPLACEMENT = 0x9;
<del>/** Lightmap texture (aka Ambient Occlusion)
<del> *
<del> * Both 'Lightmaps' and dedicated 'ambient occlusion maps' are
<del> * covered by this material property. The texture contains a
<del> * scaling value for the final color value of a pixel. Its
<del> * intensity is not affected by incoming light.
<del> */
<del>var aiTextureType_LIGHTMAP = 0xA;
<del>/** Reflection texture
<del> *
<del> * Contains the color of a perfect mirror reflection.
<del> * Rarely used, almost never for real-time applications.
<del> */
<del>var aiTextureType_REFLECTION = 0xB;
<del>/** Unknown texture
<del> *
<del> * A texture reference that does not match any of the definitions
<del> * above is considered to be 'unknown'. It is still imported,
<del> * but is excluded from any further postprocessing.
<del> */
<del>var aiTextureType_UNKNOWN = 0xC;
<del>var BONESPERVERT = 4;
<del>
<del>function ASSBIN_MESH_HAS_TEXCOORD( n ) {
<del>
<del> return ( ASSBIN_MESH_HAS_TEXCOORD_BASE << n )
<del>
<del>}
<del>
<del>function ASSBIN_MESH_HAS_COLOR( n ) {
<del>
<del> return ( ASSBIN_MESH_HAS_COLOR_BASE << n )
<del>
<del>}
<del>
<del>function markBones( scene ) {
<del>
<del> for ( var i in scene.mMeshes ) {
<del>
<del> var mesh = scene.mMeshes[ i ];
<del> for ( var k in mesh.mBones ) {
<del>
<del> var boneNode = scene.findNode( mesh.mBones[ k ].mName );
<del> if ( boneNode )
<del> boneNode.isBone = true;
<del>
<del> }
<del>
<del> }
<del>
<del>}
<del>function cloneTreeToBones( root, scene ) {
<del>
<del> var rootBone = new THREE.Bone();
<del> rootBone.matrix.copy( root.matrix );
<del> rootBone.matrixWorld.copy( root.matrixWorld );
<del> rootBone.position.copy( root.position );
<del> rootBone.quaternion.copy( root.quaternion );
<del> rootBone.scale.copy( root.scale );
<del> scene.nodeCount++;
<del> rootBone.name = "bone_" + root.name + scene.nodeCount.toString();
<del>
<del> if ( !scene.nodeToBoneMap[ root.name ] )
<del> scene.nodeToBoneMap[ root.name ] = [];
<del> scene.nodeToBoneMap[ root.name ].push( rootBone );
<del> for ( var i in root.children ) {
<del>
<del> var child = cloneTreeToBones( root.children[ i ], scene );
<del> if ( child )
<del> rootBone.add( child );
<del>
<del> }
<del>
<del> return rootBone;
<del>
<del>}
<del>
<del>function aiAnimation() {
<del>
<del> this.mName = "";
<del> this.mDuration = 0;
<del> this.mTicksPerSecond = 0;
<del> this.mNumChannels = 0;
<del> this.mChannels = [];
<del>
<del>}
<del>
<del>function sortWeights( indexes, weights ) {
<del>
<del> var pairs = [];
<del> for ( var i = 0; i < indexes.length; i++ ) {
<del>
<del> pairs.push( {
<del>
<del> i: indexes[ i ],
<del> w: weights[ i ]
<del> } )
<del> }
<del> pairs.sort( function( a, b ) {
<del>
<del> return b.w - a.w;
<del> } )
<del> while ( pairs.length < 4 ) {
<del>
<del> pairs.push( {
<del>
<del> i: 0,
<del> w: 0
<del> } )
<del> };
<del> if ( pairs.length > 4 )
<del> pairs.length = 4;
<del> var sum = 0;
<del> for ( var i = 0; i < 4; i++ ) {
<del>
<del> sum += pairs[ i ].w * pairs[ i ].w;
<del> }
<del> sum = Math.sqrt( sum );
<del> for ( var i = 0; i < 4; i++ ) {
<del>
<del> pairs[ i ].w = pairs[ i ].w / sum;
<del> indexes[ i ] = pairs[ i ].i;
<del> weights[ i ] = pairs[ i ].w;
<del> }
<del>
<del>}
<del>
<del>function findMatchingBone( root, name ) {
<del>
<del> if ( root.name.indexOf( "bone_" + name ) == 0 )
<del> return root;
<del>
<del> for ( var i in root.children ) {
<del>
<del> var ret = findMatchingBone( root.children[ i ], name )
<del> if ( ret )
<del> return ret;
<del>
<del> }
<del>
<del> return undefined;
<del>
<del>}
<del>
<del>function aiMesh() {
<del>
<del> this.mPrimitiveTypes = 0;
<del> this.mNumVertices = 0;
<del> this.mNumFaces = 0;
<del> this.mNumBones = 0;
<del> this.mMaterialIndex = 0;
<del> this.mVertices = [];
<del> this.mNormals = [];
<del> this.mTangents = [];
<del> this.mBitangents = [];
<del> this.mColors = [
<del> []
<del> ];
<del> this.mTextureCoords = [
<del> []
<del> ];
<del> this.mFaces = [];
<del> this.mBones = [];
<del> this.hookupSkeletons = function(scene, threeScene)
<del> {
<del> if (this.mBones.length == 0) return
<del> var allBones = [];
<del> var offsetMatrix = [];
<del> var skeletonRoot = scene.findNode(this.mBones[0].mName);
<del>
<del> while (skeletonRoot.mParent && skeletonRoot.mParent.isBone)
<del> {
<del> skeletonRoot = skeletonRoot.mParent;
<del> }
<del> var threeSkeletonRoot = skeletonRoot.toTHREE(scene);
<del> var threeSkeletonRootBone = cloneTreeToBones(threeSkeletonRoot,scene);
<del> this.threeNode.add(threeSkeletonRootBone);
<del> for (var i = 0; i < this.mBones.length; i++)
<del> {
<del> var bone = findMatchingBone(threeSkeletonRootBone, this.mBones[i].mName);
<del> if (bone)
<del> {
<del> var tbone = bone;
<del> allBones.push(tbone);
<del> //tbone.matrixAutoUpdate = false;
<del> offsetMatrix.push(this.mBones[i].mOffsetMatrix.toTHREE());
<del> }
<del> else
<del> {
<del> var skeletonRoot = scene.findNode(this.mBones[i].mName);
<del> if (!skeletonRoot) return;
<del> var threeSkeletonRoot = skeletonRoot.toTHREE(scene);
<del> var threeSkeletonRootParent = threeSkeletonRoot.parent;
<del> var threeSkeletonRootBone = cloneTreeToBones(threeSkeletonRoot,scene);
<del> this.threeNode.add(threeSkeletonRootBone);
<del> var bone = findMatchingBone(threeSkeletonRootBone, this.mBones[i].mName);
<del> var tbone = bone;
<del> allBones.push(tbone);
<del> //tbone.matrixAutoUpdate = false;
<del> offsetMatrix.push(this.mBones[i].mOffsetMatrix.toTHREE());
<del> }
<del> }
<del> var skeleton = new THREE.Skeleton(allBones, offsetMatrix);
<del> this.threeNode.bind(skeleton, new THREE.Matrix4());
<del> this.threeNode.material.skinning = true;
<del> }
<del> this.toTHREE = function( scene ) {
<del>
<del> if ( this.threeNode ) return this.threeNode;
<del> var geometry = new THREE.BufferGeometry();
<del> var mat;
<del> if ( scene.mMaterials[ this.mMaterialIndex ] )
<del> mat = scene.mMaterials[ this.mMaterialIndex ].toTHREE( scene );
<del> else
<del> mat = new THREE.MeshLambertMaterial();
<del> geometry.addAttribute( 'position', new THREE.BufferAttribute( this.mVertexBuffer, 3 ) );
<del> geometry.setIndex(new THREE.BufferAttribute( new Uint32Array( this.mIndexArray ), 1 ) );
<del> if ( this.mNormalBuffer.length > 0 )
<del> geometry.addAttribute( 'normal', new THREE.BufferAttribute( this.mNormalBuffer, 3 ) );
<del> if ( this.mColorBuffer && this.mColorBuffer.length > 0 )
<del> geometry.addAttribute( 'color', new THREE.BufferAttribute( this.mColorBuffer, 4 ) );
<del> if ( this.mTexCoordsBuffers[ 0 ] && this.mTexCoordsBuffers[ 0 ].length > 0 )
<del> geometry.addAttribute( 'uv', new THREE.BufferAttribute( new Float32Array( this.mTexCoordsBuffers[ 0 ] ), 2 ) );
<del> if ( this.mTexCoordsBuffers[ 1 ] && this.mTexCoordsBuffers[ 1 ] && this.mTextureCoords[ 1 ].length > 0 )
<del> geometry.addAttribute( 'uv1', new THREE.BufferAttribute( new Float32Array( this.mTexCoordsBuffers[ 1 ] ), 2 ) );
<del> if ( this.mTangentBuffer && this.mTangentBuffer.length > 0 )
<del> geometry.addAttribute( 'tangents', new THREE.BufferAttribute( this.mTangentBuffer, 3 ) );
<del> if ( this.mBitangentBuffer && this.mBitangentBuffer.length > 0 )
<del> geometry.addAttribute( 'bitangents', new THREE.BufferAttribute( this.mBitangentBuffer, 3 ) );
<del> if ( this.mBones.length > 0 ) {
<del>
<del> var weights = [];
<del> var bones = [];
<del> for ( var i = 0 ; i < this.mBones.length; i++ ) {
<del>
<del> for ( var j = 0; j < this.mBones[ i ].mWeights.length; j++ ) {
<del>
<del> var weight = this.mBones[ i ].mWeights[ j ];
<del> if ( weight ) {
<del>
<del> if ( !weights[ weight.mVertexId ] ) weights[ weight.mVertexId ] = [];
<del> if ( !bones[ weight.mVertexId ] ) bones[ weight.mVertexId ] = [];
<del> weights[ weight.mVertexId ].push( weight.mWeight );
<del> bones[ weight.mVertexId ].push( parseInt( i ) );
<del> }
<del> }
<del> }
<del> for ( var i in bones ) {
<del>
<del> sortWeights( bones[ i ], weights[ i ] );
<del> }
<del> var _weights = [];
<del> var _bones = [];
<del> for ( var i = 0; i < weights.length; i++ )
<del> for ( var j = 0; j < 4; j++ ) {
<del>
<del> if ( weights[ i ] && bones[ i ] ) {
<del>
<del> _weights.push( weights[ i ][ j ] );
<del> _bones.push( bones[ i ][ j ] );
<del> } else {
<del>
<del> _weights.push( 0 );
<del> _bones.push( 0 );
<del> }
<del> }
<del> geometry.addAttribute( 'skinWeight', new THREE.BufferAttribute( new Float32Array( _weights ), BONESPERVERT ) );
<del> geometry.addAttribute( 'skinIndex', new THREE.BufferAttribute( new Float32Array( _bones ), BONESPERVERT ) );
<del> }
<del> var mesh;
<del> if ( this.mBones.length == 0 )
<del> mesh = new THREE.Mesh( geometry, mat );
<del> if ( this.mBones.length > 0 ) {
<del>
<del> mesh = new THREE.SkinnedMesh( geometry, mat );
<del> }
<del> this.threeNode = mesh;
<del> //mesh.matrixAutoUpdate = false;
<del> return mesh;
<del> }
<del>
<del>}
<del>
<del>function aiFace() {
<del>
<del> this.mNumIndices = 0;
<del> this.mIndices = [];
<del>
<del>}
<del>
<del>function aiVector3D() {
<del>
<del> this.x = 0;
<del> this.y = 0;
<del> this.z = 0;
<del> this.toTHREE = function() {
<del>
<del> return new THREE.Vector3( this.x, this.y, this.z );
<del> }
<del>
<del>}
<del>
<del>function aiVector2D() {
<del>
<del> this.x = 0;
<del> this.y = 0;
<del> this.toTHREE = function() {
<del>
<del> return new THREE.Vector2( this.x, this.y );
<del> }
<del>
<del>}
<del>
<del>function aiVector4D() {
<del>
<del> this.w = 0;
<del> this.x = 0;
<del> this.y = 0;
<del> this.z = 0;
<del> this.toTHREE = function() {
<del>
<del> return new THREE.Vector4( this.w, this.x, this.y, this.z );
<del> }
<del>
<del>}
<del>
<del>function aiColor4D() {
<del>
<del> this.r = 0;
<del> this.g = 0;
<del> this.b = 0;
<del> this.a = 0;
<del> this.toTHREE = function() {
<del>
<del> return new THREE.Color( this.r, this.g, this.b, this.a );
<del> }
<del>
<del>}
<del>
<del>function aiColor3D() {
<del>
<del> this.r = 0;
<del> this.g = 0;
<del> this.b = 0;
<del> this.a = 0;
<del> this.toTHREE = function() {
<del>
<del> return new THREE.Color( this.r, this.g, this.b, 1 );
<del> }
<del>
<del>}
<del>
<del>function aiQuaternion()
<del>{
<del> this.x = 0;
<del> this.y = 0;
<del> this.z = 0;
<del> this.w = 0;
<del> this.toTHREE = function()
<del> {
<del>
<del> return new THREE.Quaternion(this.x, this.y, this.z, this.w);
<del> }
<del>}
<del>
<del>function aiVertexWeight() {
<del>
<del> this.mVertexId = 0;
<del> this.mWeight = 0;
<del>
<del>}
<del>
<del>function aiString() {
<del>
<del> this.data = [];
<del> this.toString = function() {
<del>
<del> var str = '';
<del> this.data.forEach( function( i ) {
<del>
<del> str += ( String.fromCharCode( i ) )
<del> } );
<del> return str.replace( /[^\x20-\x7E]+/g, '' );
<del> }
<del>
<del>}
<del>
<del>function aiVectorKey() {
<del>
<del> this.mTime = 0;
<del> this.mValue = null;
<del>
<del>}
<del>
<del>function aiQuatKey() {
<del>
<del> this.mTime = 0;
<del> this.mValue = null;
<del>
<del>}
<del>
<del>function aiNode() {
<del>
<del> this.mName = '';
<del> this.mTransformation = [];
<del> this.mNumChildren = 0;
<del> this.mNumMeshes = 0;
<del> this.mMeshes = [];
<del> this.mChildren = [];
<del> this.toTHREE = function( scene ) {
<del>
<del> if ( this.threeNode ) return this.threeNode;
<del> var o = new THREE.Object3D();
<del> o.name = this.mName;
<del> o.matrix = this.mTransformation.toTHREE();
<del> for ( var i = 0; i < this.mChildren.length; i++ ) {
<del>
<del> o.add( this.mChildren[ i ].toTHREE( scene ) );
<del> }
<del> for ( var i = 0; i < this.mMeshes.length; i++ ) {
<del>
<del> o.add( scene.mMeshes[ this.mMeshes[ i ] ].toTHREE( scene ) );
<del> }
<del> this.threeNode = o;
<del> //o.matrixAutoUpdate = false;
<del> o.matrix.decompose( o.position, o.quaternion, o.scale );
<del> return o;
<del> }
<del>
<del>}
<del>
<del>function aiBone() {
<del>
<del> this.mName = '';
<del> this.mNumWeights = 0;
<del> this.mOffsetMatrix = 0;
<del>
<del>}
<del>
<del>function aiMaterialProperty() {
<del>
<del> this.mKey = "";
<del> this.mSemantic = 0;
<del> this.mIndex = 0;
<del> this.mData = [];
<del> this.mDataLength = 0;
<del> this.mType = 0;
<del> this.dataAsColor = function() {
<del>
<del> var array = ( new Uint8Array( this.mData ) ).buffer;
<del> var reader = new DataView( array );
<del> var r = reader.getFloat32( 0, true );
<del> var g = reader.getFloat32( 4, true );
<del> var b = reader.getFloat32( 8, true );
<del> //var a = reader.getFloat32(12, true);
<del> return new THREE.Color( r, g, b );
<del> }
<del> this.dataAsFloat = function() {
<del>
<del> var array = ( new Uint8Array( this.mData ) ).buffer;
<del> var reader = new DataView( array );
<del> var r = reader.getFloat32( 0, true );
<del> return r;
<del> }
<del> this.dataAsBool = function() {
<del>
<del> var array = ( new Uint8Array( this.mData ) ).buffer;
<del> var reader = new DataView( array );
<del> var r = reader.getFloat32( 0, true );
<del> return !!r;
<del> }
<del> this.dataAsString = function() {
<del>
<del> var s = new aiString();
<del> s.data = this.mData;
<del> return s.toString();
<del> }
<del> this.dataAsMap = function( scene ) {
<del>
<del> var baseURL = scene.baseURL;
<del> baseURL = baseURL.substr(0, baseURL.lastIndexOf( "/" ) + 1 )
<del> var s = new aiString();
<del> s.data = this.mData;
<del> var path = s.toString();
<del> path = path.replace( /\\/g, '/' );
<del> if ( path.indexOf( "/" ) != -1 ) {
<del>
<del> path = path.substr( path.lastIndexOf( "/" ) + 1 );
<del> }
<del>
<del> return THREE.ImageUtils.loadTexture(baseURL + path );
<del> }
<del>
<del>}
<del>var namePropMapping = {
<del>
<del> "?mat.name": "name",
<del> "$mat.shadingm": "shading",
<del> "$mat.twosided": "twoSided",
<del> "$mat.wireframe": "wireframe",
<del> "$clr.ambient": "ambient",
<del> "$clr.diffuse": "color",
<del> "$clr.specular": "specular",
<del> "$clr.emissive": "emissive",
<del> "$clr.transparent": "transparent",
<del> "$clr.reflective": "reflect",
<del> "$mat.shininess": "shininess",
<del> "$mat.reflectivity": "reflectivity",
<del> "$mat.refracti": "refraction",
<del> "$tex.file": "map"
<del>
<del>}
<del>var nameTexMapping = {
<del>
<del> "$tex.ambient": "ambientMap",
<del> "$clr.diffuse": "map",
<del> "$clr.specular": "specMap",
<del> "$clr.emissive": "emissive",
<del> "$clr.transparent": "alphaMap",
<del> "$clr.reflective": "reflectMap",
<del>
<del>}
<del>var nameTypeMapping = {
<del>
<del> "?mat.name": "string",
<del> "$mat.shadingm": "bool",
<del> "$mat.twosided": "bool",
<del> "$mat.wireframe": "bool",
<del> "$clr.ambient": "color",
<del> "$clr.diffuse": "color",
<del> "$clr.specular": "color",
<del> "$clr.emissive": "color",
<del> "$clr.transparent": "color",
<del> "$clr.reflective": "color",
<del> "$mat.shininess": "float",
<del> "$mat.reflectivity": "float",
<del> "$mat.refracti": "float",
<del> "$tex.file": "map"
<del>
<del>}
<del>
<del>function aiMaterial() {
<del>
<del> this.mNumAllocated = 0;
<del> this.mNumProperties = 0;
<del> this.mProperties = [];
<del> this.toTHREE = function( scene ) {
<del>
<del> var name = this.mProperties[ 0 ].dataAsString();
<del> var mat = new THREE.MeshPhongMaterial();
<del> for ( var i = 0; i < this.mProperties.length; i++ ) {
<del>
<del>
<del> if ( nameTypeMapping[ this.mProperties[ i ].mKey ] == 'float' )
<del> mat[ namePropMapping[ this.mProperties[ i ].mKey ] ] = this.mProperties[ i ].dataAsFloat();
<del> if ( nameTypeMapping[ this.mProperties[ i ].mKey ] == 'color' )
<del> mat[ namePropMapping[ this.mProperties[ i ].mKey ] ] = this.mProperties[ i ].dataAsColor();
<del> if ( nameTypeMapping[ this.mProperties[ i ].mKey ] == 'bool' )
<del> mat[ namePropMapping[ this.mProperties[ i ].mKey ] ] = this.mProperties[ i ].dataAsBool();
<del> if ( nameTypeMapping[ this.mProperties[ i ].mKey ] == 'string' )
<del> mat[ namePropMapping[ this.mProperties[ i ].mKey ] ] = this.mProperties[ i ].dataAsString();
<del> if ( nameTypeMapping[ this.mProperties[ i ].mKey ] == 'map' ) {
<del>
<del> var prop = this.mProperties[ i ];
<del> if ( prop.mSemantic == aiTextureType_DIFFUSE )
<del> mat.map = this.mProperties[ i ].dataAsMap( scene );
<del> if ( prop.mSemantic == aiTextureType_NORMALS )
<del> mat.normalMap = this.mProperties[ i ].dataAsMap( scene );
<del> if ( prop.mSemantic == aiTextureType_LIGHTMAP )
<del> mat.lightMap = this.mProperties[ i ].dataAsMap( scene );
<del> if ( prop.mSemantic == aiTextureType_OPACITY )
<del> mat.alphaMap = this.mProperties[ i ].dataAsMap( scene );
<del> }
<del> }
<del> mat.ambient.r = .53;
<del> mat.ambient.g = .53;
<del> mat.ambient.b = .53;
<del> mat.color.r = 1;
<del> mat.color.g = 1;
<del> mat.color.b = 1;
<del> return mat;
<del> }
<del>
<del>}
<del>
<del>
<del>function veclerp(v1,v2,l) {
<del>
<del> var v = new THREE.Vector3();
<del> var lm1 = 1-l;
<del> v.x = v1.x * l + v2.x * lm1;
<del> v.y = v1.y * l + v2.y * lm1;
<del> v.z = v1.z * l + v2.z * lm1;
<del> return v;
<del>
<del>}
<del>
<del>function quatlerp(q1,q2,l) {
<del>
<del> return q1.clone().slerp(q2,1-l);
<del>
<del>}
<del>
<del>function sampleTrack(keys,time,lne,lerp) {
<del>
<del> if(keys.length == 1)
<del> return keys[0].mValue.toTHREE();
<del> var dist = Infinity;
<del> var key = null;
<del> var nextKey = null;
<del>
<del> for(var i=0; i < keys.length; i++)
<del> {
<del> var timeDist = Math.abs(keys[i].mTime - time);
<del> if( timeDist < dist && keys[i].mTime <= time) {
<del>
<del> dist = timeDist;
<del> key = keys[i];
<del> nextKey = keys[i+1];
<del>
<del> }
<del>
<del> }
<del> if(!key)
<del> return null;
<del> if(key && nextKey) {
<del>
<del> var dT = nextKey.mTime - key.mTime;
<del> var T = key.mTime - time;
<del> var l = T/dT;
<del> return lerp(key.mValue.toTHREE(),nextKey.mValue.toTHREE(),l)
<del>
<del> }
<del>
<del> nextKey = keys[0].clone();
<del> nextKey.mTime += lne;
<del> var dT = nextKey.mTime - key.mTime;
<del> var T = key.mTime - time;
<del> var l = T/dT;
<del> return lerp(key.mValue.toTHREE(),nextKey.mValue.toTHREE(),l)
<del>
<del>}
<del>
<del>function aiNodeAnim() {
<del> this.mNodeName = "";
<del> this.mNumPositionKeys = 0;
<del> this.mNumRotationKeys = 0;
<del> this.mNumScalingKeys = 0;
<del> this.mPositionKeys = [];
<del> this.mRotationKeys = [];
<del> this.mScalingKeys = [];
<del> this.mPreState = "";
<del> this.mPostState = "";
<del> this.init = function(tps) {
<del>
<del> if(!tps)
<del> tps = 1;
<del> function t(t) {
<del>
<del> t.mTime /= tps;
<del> }
<del> this.mPositionKeys.forEach(t);
<del> this.mRotationKeys.forEach(t);
<del> this.mScalingKeys.forEach(t);
<del>
<del> }
<del> this.sortKeys = function() {
<del>
<del> function comp(a,b) {
<del>
<del> return a.mTime - b.mTime;
<del>
<del> }
<del> this.mPositionKeys.sort(comp);
<del> this.mRotationKeys.sort(comp);
<del> this.mScalingKeys.sort(comp);
<del>
<del> }
<del> this.getLength = function() {
<del>
<del> return Math.max(
<del> Math.max.apply(null,this.mPositionKeys.map(function(a){return a.mTime;})),
<del> Math.max.apply(null,this.mRotationKeys.map(function(a){return a.mTime;})),
<del> Math.max.apply(null,this.mScalingKeys.map(function(a){return a.mTime;}))
<del> )
<del>
<del> }
<del> this.toTHREE = function(o,tps) {
<del>
<del> this.sortKeys();
<del> var length = this.getLength();
<del> var track = new Virtulous.KeyFrameTrack();
<del>
<del>
<del> for(var i = 0; i < length; i+=.05)
<del> {
<del> var matrix = new THREE.Matrix4();
<del> var time = i;
<del> var pos = sampleTrack(this.mPositionKeys,time,length,veclerp);
<del> var scale = sampleTrack(this.mScalingKeys,time,length,veclerp);
<del> var rotation = sampleTrack(this.mRotationKeys,time,length,quatlerp);
<del> matrix.compose(pos,rotation,scale);
<del> var key = new Virtulous.KeyFrame(time,matrix);
<del> track.addKey(key);
<del> }
<del> track.target = o.findNode(this.mNodeName).toTHREE();
<del> var tracks = [track];
<del> if( o.nodeToBoneMap[this.mNodeName])
<del> {
<del> for(var i=0; i < o.nodeToBoneMap[this.mNodeName].length; i++)
<del> {
<del> var t2 = track.clone();
<del> t2.target = o.nodeToBoneMap[this.mNodeName][i];
<del> tracks.push(t2);
<del> }
<del> }
<del>
<del> return tracks;
<del> }
<del>}
<del>
<del>
<del>function aiAnimation() {
<del>
<del> this.mName = "";
<del> this.mDuration = 0;
<del> this.mTicksPerSecond = 0;
<del> this.mNumChannels = 0;
<del> this.mChannels = [];
<del> this.toTHREE = function(root) {
<del>
<del> var animationHandle = new Virtulous.Animation();
<del> for(var i in this.mChannels) {
<del>
<del> this.mChannels[i].init(this.mTicksPerSecond)
<del> var tracks = this.mChannels[i].toTHREE(root);
<del> for(var j in tracks)
<del> {
<del> tracks[j].init();
<del> animationHandle.addTrack(tracks[j]);
<del> }
<del>
<del> }
<del> animationHandle.length = Math.max.apply(null,animationHandle.tracks.map(function(e){return e.length}));
<del> return animationHandle;
<del>
<del> }
<del>
<del>}
<del>
<del>function aiTexture() {
<del>
<del> this.mWidth = 0;
<del> this.mHeight = 0;
<del> this.texAchFormatHint = [];
<del> this.pcData = [];
<del>
<del>}
<del>
<del>function aiLight() {
<del>
<del> this.mName = '';
<del> this.mType = 0;
<del> this.mAttenuationConstant = 0;
<del> this.mAttenuationLinear = 0;
<del> this.mAttenuationQuadratic = 0;
<del> this.mAngleInnerCone = 0;
<del> this.mAngleOuterCone = 0;
<del> this.mColorDiffuse = null;
<del> this.mColorSpecular = null;
<del> this.mColorAmbient = null;
<del>
<del>}
<del>
<del>function aiCamera() {
<del>
<del> this.mName = '';
<del> this.mPosition = null;
<del> this.mLookAt = null;
<del> this.mUp = null;
<del> this.mHorizontalFOV = 0;
<del> this.mClipPlaneNear = 0;
<del> this.mClipPlaneFar = 0;
<del> this.mAspect = 0;
<del>
<del>}
<del>
<del>function aiScene() {
<del>
<del> this.mFlags = 0;
<del> this.mNumMeshes = 0;
<del> this.mNumMaterials = 0;
<del> this.mNumAnimations = 0;
<del> this.mNumTextures = 0;
<del> this.mNumLights = 0;
<del> this.mNumCameras = 0;
<del> this.mRootNode = null;
<del> this.mMeshes = [];
<del> this.mMaterials = [];
<del> this.mAnimations = [];
<del> this.mLights = [];
<del> this.mCameras = [];
<del> this.nodeToBoneMap = {};
<del> this.findNode = function( name, root ) {
<del>
<del> if ( !root ) {
<del>
<del> root = this.mRootNode;
<del> }
<del> if ( root.mName == name ) {
<del>
<del> return root;
<del> }
<del> for ( var i = 0; i < root.mChildren.length; i++ ) {
<del>
<del> var ret = this.findNode( name, root.mChildren[ i ] )
<del> if ( ret ) return ret;
<del> }
<del> return null;
<del> }
<del> this.toTHREE = function()
<del> {
<del> this.nodeCount = 0;
<del> markBones(this);
<del> var o = this.mRootNode.toTHREE(this);
<del>
<del> for (var i in this.mMeshes)
<del> this.mMeshes[i].hookupSkeletons(this, o);
<del> if(this.mAnimations.length > 0)
<del> {
<del> var a = this.mAnimations[0].toTHREE(this);
<del> }
<del> return {object:o,animation:a};
<del> }
<del>
<del>}
<del>
<del>function aiMatrix4() {
<del>
<del> this.elements = [
<del> [],
<del> [],
<del> [],
<del> []
<del> ];
<del> this.toTHREE = function() {
<del>
<del> var m = new THREE.Matrix4();
<del> for ( var i = 0; i < 4; ++i ) {
<del>
<del> for ( var i2 = 0; i2 < 4; ++i2 ) {
<del>
<del> m.elements[ i * 4 + i2 ] = this.elements[ i2 ][ i ]
<del> }
<del> }
<del> return m;
<del> }
<del>
<del>}
<del>var littleEndian = true;
<del>
<del>function readFloat( dataview ) {
<del>
<del> var val = dataview.getFloat32( dataview.readOffset, littleEndian );
<del> dataview.readOffset += 4;
<del> return val;
<del>
<del>}
<del>
<del>function Read_double( dataview ) {
<del>
<del> var val = dataview.getFloat64( dataview.readOffset, littleEndian );
<del> dataview.readOffset += 8;
<del> return val;
<del>
<del>}
<del>
<del>function Read_uint8_t( dataview ) {
<del>
<del> var val = dataview.getUint8( dataview.readOffset );
<del> dataview.readOffset += 1;
<del> return val;
<del>
<del>}
<del>
<del>function Read_uint16_t( dataview ) {
<del>
<del> var val = dataview.getUint16( dataview.readOffset, littleEndian );
<del> dataview.readOffset += 2;
<del> return val;
<del>
<del>}
<del>
<del>function Read_unsigned_int( dataview ) {
<del>
<del> var val = dataview.getUint32( dataview.readOffset, littleEndian );
<del> dataview.readOffset += 4;
<del> return val;
<del>
<del>}
<del>
<del>function Read_uint32_t( dataview ) {
<del>
<del> var val = dataview.getUint32( dataview.readOffset, littleEndian );
<del> dataview.readOffset += 4;
<del> return val;
<del>
<del>}
<del>
<del>function Read_aiVector3D( stream ) {
<del>
<del> v = new aiVector3D();
<del> v.x = readFloat( stream );
<del> v.y = readFloat( stream );
<del> v.z = readFloat( stream );
<del> return v;
<del>
<del>}
<del>
<del>function Read_aiVector2D( stream ) {
<del>
<del> v = new aiVector2D();
<del> v.x = readFloat( stream );
<del> v.y = readFloat( stream );
<del> return v;
<del>
<del>}
<del>
<del>function Read_aiVector4D( stream ) {
<del>
<del> v = new aiVector4D();
<del> v.w = readFloat( stream );
<del> v.x = readFloat( stream );
<del> v.y = readFloat( stream );
<del> v.z = readFloat( stream );
<del> return v;
<del>
<del>}
<del>
<del>function Read_aiColor3D( stream ) {
<del>
<del> var c = new aiColor3D();
<del> c.r = readFloat( stream );
<del> c.g = readFloat( stream );
<del> c.b = readFloat( stream );
<del> return c;
<del>
<del>}
<del>
<del>function Read_aiColor4D( stream ) {
<del>
<del> var c = new aiColor4D();
<del> c.r = readFloat( stream );
<del> c.g = readFloat( stream );
<del> c.b = readFloat( stream );
<del> c.a = readFloat( stream );
<del> return c;
<del>
<del>}
<del>
<del>function Read_aiQuaternion( stream ) {
<del>
<del> var v = new aiQuaternion();
<del> v.w = readFloat( stream );
<del> v.x = readFloat( stream );
<del> v.y = readFloat( stream );
<del> v.z = readFloat( stream );
<del> return v;
<del>
<del>}
<del>
<del>function Read_aiString( stream ) {
<del>
<del> var s = new aiString();
<del> var stringlengthbytes = Read_unsigned_int( stream );
<del> stream.ReadBytes( s.data, 1, stringlengthbytes );
<del> return s.toString();
<del>
<del>}
<del>
<del>function Read_aiVertexWeight( stream ) {
<del>
<del> var w = new aiVertexWeight();
<del> w.mVertexId = Read_unsigned_int( stream );
<del> w.mWeight = readFloat( stream );
<del> return w;
<del>
<del>}
<del>
<del>function Read_aiMatrix4x4( stream ) {
<del>
<del> var m = new aiMatrix4();
<del> for ( var i = 0; i < 4; ++i ) {
<del>
<del> for ( var i2 = 0; i2 < 4; ++i2 ) {
<del>
<del> m.elements[ i ][ i2 ] = readFloat( stream );
<del> }
<del> }
<del> return m;
<del>
<del>}
<del>
<del>function Read_aiVectorKey( stream ) {
<del>
<del> var v = new aiVectorKey();
<del> v.mTime = Read_double( stream );
<del> v.mValue = Read_aiVector3D( stream );
<del> return v;
<del>
<del>}
<del>
<del>function Read_aiQuatKey( stream ) {
<del>
<del> var v = new aiQuatKey();
<del> v.mTime = Read_double( stream );
<del> v.mValue = Read_aiQuaternion( stream );
<del> return v;
<del>
<del>}
<del>
<del>function ReadArray( stream, data, size ) {
<del>
<del> for ( var i = 0; i < size; i++ ) data[ i ] = Read( stream );
<del>
<del>}
<del>
<del>function ReadArray_aiVector2D( stream, data, size ) {
<del>
<del> for ( var i = 0; i < size; i++ ) data[ i ] = Read_aiVector2D( stream );
<del>
<del>}
<del>
<del>function ReadArray_aiVector3D( stream, data, size ) {
<del>
<del> for ( var i = 0; i < size; i++ ) data[ i ] = Read_aiVector3D( stream );
<del>
<del>}
<del>
<del>function ReadArray_aiVector4D( stream, data, size ) {
<del>
<del> for ( var i = 0; i < size; i++ ) data[ i ] = Read_aiVector4D( stream );
<del>
<del>}
<del>
<del>function ReadArray_aiVertexWeight( stream, data, size ) {
<del>
<del> for ( var i = 0; i < size; i++ ) data[ i ] = Read_aiVertexWeight( stream );
<del>
<del>}
<del>
<del>function ReadArray_aiColor4D( stream, data, size ) {
<del>
<del> for ( var i = 0; i < size; i++ ) data[ i ] = Read_aiColor4D( stream );
<del>
<del>}
<del>
<del>function ReadArray_aiVectorKey( stream, data, size ) {
<del>
<del> for ( var i = 0; i < size; i++ ) data[ i ] = Read_aiVectorKey( stream );
<del>
<del>}
<del>
<del>function ReadArray_aiQuatKey( stream, data, size ) {
<del>
<del> for ( var i = 0; i < size; i++ ) data[ i ] = Read_aiQuatKey( stream );
<del>
<del>}
<del>
<del>function ReadBounds( stream, T /*p*/ , n ) {
<del>
<del> // not sure what to do here, the data isn't really useful.
<del> return stream.Seek( sizeof( T ) * n, aiOrigin_CUR );
<del>
<del>}
<del>
<del>function ai_assert( bool ) {
<del>
<del> if ( !bool )
<del> throw ( "asset failed" );
<del>
<del>}
<del>
<del>function ReadBinaryNode( stream, parent, depth ) {
<del>
<del> var chunkID = Read_uint32_t( stream );
<del> ai_assert( chunkID == ASSBIN_CHUNK_AINODE );
<del> /*uint32_t size =*/
<del> Read_uint32_t( stream );
<del> var node = new aiNode();
<del> node.mParent = parent;
<del> node.mDepth = depth;
<del> node.mName = Read_aiString( stream );
<del> node.mTransformation = Read_aiMatrix4x4( stream );
<del> node.mNumChildren = Read_unsigned_int( stream );
<del> node.mNumMeshes = Read_unsigned_int( stream );
<del> if ( node.mNumMeshes ) {
<del>
<del> node.mMeshes = []
<del> for ( var i = 0; i < node.mNumMeshes; ++i ) {
<del>
<del> node.mMeshes[ i ] = Read_unsigned_int( stream );
<del> }
<del> }
<del> if ( node.mNumChildren ) {
<del>
<del> node.mChildren = [];
<del> for ( var i = 0; i < node.mNumChildren; ++i ) {
<del>
<del> var node2 = ReadBinaryNode( stream, node, depth++ );
<del> node.mChildren[ i ] = node2;
<del> }
<del> }
<del> return node;
<del>
<del>}
<del>// -----------------------------------------------------------------------------------
<del>function ReadBinaryBone( stream, b ) {
<del>
<del> var chunkID = Read_uint32_t( stream );
<del> ai_assert( chunkID == ASSBIN_CHUNK_AIBONE );
<del> /*uint32_t size =*/
<del> Read_uint32_t( stream );
<del> b.mName = Read_aiString( stream );
<del> b.mNumWeights = Read_unsigned_int( stream );
<del> b.mOffsetMatrix = Read_aiMatrix4x4( stream );
<del> // for the moment we write dumb min/max values for the bones, too.
<del> // maybe I'll add a better, hash-like solution later
<del> if ( shortened ) {
<del>
<del> ReadBounds( stream, b.mWeights, b.mNumWeights );
<del> } // else write as usual
<del> else {
<del>
<del> b.mWeights = [];
<del> ReadArray_aiVertexWeight( stream, b.mWeights, b.mNumWeights );
<del> }
<del> return b;
<del>
<del>}
<del>
<del>function ReadBinaryMesh( stream, mesh ) {
<del>
<del> var chunkID = Read_uint32_t( stream );
<del> ai_assert( chunkID == ASSBIN_CHUNK_AIMESH );
<del> /*uint32_t size =*/
<del> Read_uint32_t( stream );
<del> mesh.mPrimitiveTypes = Read_unsigned_int( stream );
<del> mesh.mNumVertices = Read_unsigned_int( stream );
<del> mesh.mNumFaces = Read_unsigned_int( stream );
<del> mesh.mNumBones = Read_unsigned_int( stream );
<del> mesh.mMaterialIndex = Read_unsigned_int( stream );
<del> mesh.mNumUVComponents = [];
<del> // first of all, write bits for all existent vertex components
<del> var c = Read_unsigned_int( stream );
<del> if ( c & ASSBIN_MESH_HAS_POSITIONS ) {
<del>
<del> if ( shortened ) {
<del>
<del> ReadBounds( stream, mesh.mVertices, mesh.mNumVertices );
<del> } // else write as usual
<del> else {
<del>
<del> mesh.mVertices = [];
<del> mesh.mVertexBuffer = stream.subArray32( stream.readOffset, stream.readOffset + mesh.mNumVertices * 3 * 4 );
<del> stream.Seek( mesh.mNumVertices * 3 * 4, aiOrigin_CUR );
<del> }
<del> }
<del> if ( c & ASSBIN_MESH_HAS_NORMALS ) {
<del>
<del> if ( shortened ) {
<del>
<del> ReadBounds( stream, mesh.mNormals, mesh.mNumVertices );
<del> } // else write as usual
<del> else {
<del>
<del> mesh.mNormals = [];
<del> mesh.mNormalBuffer = stream.subArray32( stream.readOffset, stream.readOffset + mesh.mNumVertices * 3 * 4 );
<del> stream.Seek( mesh.mNumVertices * 3 * 4, aiOrigin_CUR );
<del> }
<del> }
<del> if ( c & ASSBIN_MESH_HAS_TANGENTS_AND_BITANGENTS ) {
<del>
<del> if ( shortened ) {
<del>
<del> ReadBounds( stream, mesh.mTangents, mesh.mNumVertices );
<del> ReadBounds( stream, mesh.mBitangents, mesh.mNumVertices );
<del> } // else write as usual
<del> else {
<del>
<del> mesh.mTangents = [];
<del> mesh.mTangentBuffer = stream.subArray32( stream.readOffset, stream.readOffset + mesh.mNumVertices * 3 * 4 );
<del> stream.Seek( mesh.mNumVertices * 3 * 4, aiOrigin_CUR );
<del> mesh.mBitangents = [];
<del> mesh.mBitangentBuffer = stream.subArray32( stream.readOffset, stream.readOffset + mesh.mNumVertices * 3 * 4 );
<del> stream.Seek( mesh.mNumVertices * 3 * 4, aiOrigin_CUR );
<del> }
<del> }
<del> for ( var n = 0; n < AI_MAX_NUMBER_OF_COLOR_SETS; ++n ) {
<del>
<del> if ( !( c & ASSBIN_MESH_HAS_COLOR( n ) ) )
<del> break;
<del> if ( shortened ) {
<del>
<del> ReadBounds( stream, mesh.mColors[ n ], mesh.mNumVertices );
<del> } // else write as usual
<del> else {
<del>
<del> mesh.mColors[ n ] = [];
<del> mesh.mColorBuffer = stream.subArray32( stream.readOffset, stream.readOffset + mesh.mNumVertices * 4 * 4 );
<del> stream.Seek( mesh.mNumVertices * 4 * 4, aiOrigin_CUR );
<del> }
<del> }
<del> mesh.mTexCoordsBuffers = [];
<del> for ( var n = 0; n < AI_MAX_NUMBER_OF_TEXTURECOORDS; ++n ) {
<del>
<del> if ( !( c & ASSBIN_MESH_HAS_TEXCOORD( n ) ) )
<del> break;
<del> // write number of UV components
<del> mesh.mNumUVComponents[ n ] = Read_unsigned_int( stream );
<del> if ( shortened ) {
<del>
<del> ReadBounds( stream, mesh.mTextureCoords[ n ], mesh.mNumVertices );
<del> } // else write as usual
<del> else {
<del>
<del> mesh.mTextureCoords[ n ] = [];
<del> //note that assbin always writes 3d texcoords
<del> mesh.mTexCoordsBuffers[ n ] = [];
<del> for ( var uv = 0; uv < mesh.mNumVertices; uv++ ) {
<del>
<del> mesh.mTexCoordsBuffers[ n ].push( readFloat( stream ) )
<del> mesh.mTexCoordsBuffers[ n ].push( readFloat( stream ) )
<del> readFloat( stream )
<del> }
<del> }
<del> }
<del> // write faces. There are no floating-point calculations involved
<del> // in these, so we can write a simple hash over the face data
<del> // to the dump file. We generate a single 32 Bit hash for 512 faces
<del> // using Assimp's standard hashing function.
<del> if ( shortened ) {
<del>
<del> Read_unsigned_int( stream );
<del> } else // else write as usual
<del> {
<del>
<del> // if there are less than 2^16 vertices, we can simply use 16 bit integers ...
<del> mesh.mFaces = [];
<del> var indexCounter = 0;
<del> mesh.mIndexArray = [];
<del> for ( var i = 0; i < mesh.mNumFaces; ++i ) {
<del>
<del> var f = mesh.mFaces[ i ] = new aiFace();
<del> // BOOST_STATIC_ASSERT(AI_MAX_FACE_INDICES <= 0xffff);
<del> f.mNumIndices = Read_uint16_t( stream );
<del> f.mIndices = [];
<del> for ( var a = 0; a < f.mNumIndices; ++a ) {
<del>
<del> if ( mesh.mNumVertices < ( 1 << 16 ) ) {
<del>
<del> f.mIndices[ a ] = Read_uint16_t( stream );
<del> } else {
<del>
<del> f.mIndices[ a ] = Read_unsigned_int( stream );
<del> }
<del> mesh.mIndexArray.push( f.mIndices[ a ] );
<del> }
<del> }
<del> }
<del> // write bones
<del> if ( mesh.mNumBones ) {
<del>
<del> mesh.mBones = [];
<del> for ( var a = 0; a < mesh.mNumBones; ++a ) {
<del>
<del> mesh.mBones[ a ] = new aiBone();
<del> ReadBinaryBone( stream, mesh.mBones[ a ] );
<del> }
<del> }
<del>
<del>}
<del>
<del>
<del>
<del>function ReadBinaryMaterialProperty( stream, prop ) {
<del>
<del> var chunkID = Read_uint32_t( stream );
<del> ai_assert( chunkID == ASSBIN_CHUNK_AIMATERIALPROPERTY );
<del> /*uint32_t size =*/
<del> Read_uint32_t( stream );
<del> prop.mKey = Read_aiString( stream );
<del> prop.mSemantic = Read_unsigned_int( stream );
<del> prop.mIndex = Read_unsigned_int( stream );
<del> prop.mDataLength = Read_unsigned_int( stream );
<del> prop.mType = Read_unsigned_int( stream );
<del> prop.mData = [];
<del> stream.ReadBytes( prop.mData, 1, prop.mDataLength );
<del>
<del>}
<del>// -----------------------------------------------------------------------------------
<del>function ReadBinaryMaterial( stream, mat ) {
<del>
<del> var chunkID = Read_uint32_t( stream );
<del> ai_assert( chunkID == ASSBIN_CHUNK_AIMATERIAL );
<del> /*uint32_t size =*/
<del> Read_uint32_t( stream );
<del> mat.mNumAllocated = mat.mNumProperties = Read_unsigned_int( stream );
<del> if ( mat.mNumProperties ) {
<del>
<del> if ( mat.mProperties ) {
<del>
<del> delete mat.mProperties;
<del> }
<del> mat.mProperties = [];
<del> for ( var i = 0; i < mat.mNumProperties; ++i ) {
<del>
<del> mat.mProperties[ i ] = new aiMaterialProperty();
<del> ReadBinaryMaterialProperty( stream, mat.mProperties[ i ] );
<del> }
<del> }
<del>
<del>}
<del>// -----------------------------------------------------------------------------------
<del>function ReadBinaryNodeAnim( stream, nd ) {
<del>
<del> var chunkID = Read_uint32_t( stream );
<del> ai_assert( chunkID == ASSBIN_CHUNK_AINODEANIM );
<del> /*uint32_t size =*/
<del> Read_uint32_t( stream );
<del> nd.mNodeName = Read_aiString( stream );
<del> nd.mNumPositionKeys = Read_unsigned_int( stream );
<del> nd.mNumRotationKeys = Read_unsigned_int( stream );
<del> nd.mNumScalingKeys = Read_unsigned_int( stream );
<del> nd.mPreState = Read_unsigned_int( stream );
<del> nd.mPostState = Read_unsigned_int( stream );
<del> if ( nd.mNumPositionKeys ) {
<del>
<del> if ( shortened ) {
<del>
<del> ReadBounds( stream, nd.mPositionKeys, nd.mNumPositionKeys );
<del> } // else write as usual
<del> else {
<del>
<del> nd.mPositionKeys = [];
<del> ReadArray_aiVectorKey( stream, nd.mPositionKeys, nd.mNumPositionKeys );
<del> }
<del> }
<del> if ( nd.mNumRotationKeys ) {
<del>
<del> if ( shortened ) {
<del>
<del> ReadBounds( stream, nd.mRotationKeys, nd.mNumRotationKeys );
<del> } // else write as usual
<del> else {
<del>
<del> nd.mRotationKeys = [];
<del> ReadArray_aiQuatKey( stream, nd.mRotationKeys, nd.mNumRotationKeys );
<del> }
<del> }
<del> if ( nd.mNumScalingKeys ) {
<del>
<del> if ( shortened ) {
<del>
<del> ReadBounds( stream, nd.mScalingKeys, nd.mNumScalingKeys );
<del> } // else write as usual
<del> else {
<del>
<del> nd.mScalingKeys = [];
<del> ReadArray_aiVectorKey( stream, nd.mScalingKeys, nd.mNumScalingKeys );
<del> }
<del> }
<del>
<del>}
<del>// -----------------------------------------------------------------------------------
<del>function ReadBinaryAnim( stream, anim ) {
<del>
<del> var chunkID = Read_uint32_t( stream );
<del> ai_assert( chunkID == ASSBIN_CHUNK_AIANIMATION );
<del> /*uint32_t size =*/
<del> Read_uint32_t( stream );
<del> anim.mName = Read_aiString( stream );
<del> anim.mDuration = Read_double( stream );
<del> anim.mTicksPerSecond = Read_double( stream );
<del> anim.mNumChannels = Read_unsigned_int( stream );
<del> if ( anim.mNumChannels ) {
<del>
<del> anim.mChannels = [];
<del> for ( var a = 0; a < anim.mNumChannels; ++a ) {
<del>
<del> anim.mChannels[ a ] = new aiNodeAnim();
<del> ReadBinaryNodeAnim( stream, anim.mChannels[ a ] );
<del> }
<del> }
<del>
<del>}
<del>
<del>function ReadBinaryTexture( stream, tex ) {
<del>
<del> var chunkID = Read_uint32_t( stream );
<del> ai_assert( chunkID == ASSBIN_CHUNK_AITEXTURE );
<del> /*uint32_t size =*/
<del> Read_uint32_t( stream );
<del> tex.mWidth = Read_unsigned_int( stream );
<del> tex.mHeight = Read_unsigned_int( stream );
<del> stream.ReadBytes( tex.achFormatHint, 1, 4 );
<del> if ( !shortened ) {
<del>
<del> if ( !tex.mHeight ) {
<del>
<del> tex.pcData = [];
<del> stream.ReadBytes( tex.pcData, 1, tex.mWidth );
<del> } else {
<del>
<del> tex.pcData = []
<del> stream.ReadBytes( tex.pcData, 1, tex.mWidth * tex.mHeight * 4 );
<del> }
<del> }
<del>
<del>}
<del>// -----------------------------------------------------------------------------------
<del>function ReadBinaryLight( stream, l ) {
<del>
<del> var chunkID = Read_uint32_t( stream );
<del> ai_assert( chunkID == ASSBIN_CHUNK_AILIGHT );
<del> /*uint32_t size =*/
<del> Read_uint32_t( stream );
<del> l.mName = Read_aiString( stream );
<del> l.mType = Read_unsigned_int( stream );
<del> if ( l.mType != aiLightSource_DIRECTIONAL ) {
<del>
<del> l.mAttenuationConstant = readFloat( stream );
<del> l.mAttenuationLinear = readFloat( stream );
<del> l.mAttenuationQuadratic = readFloat( stream );
<del> }
<del> l.mColorDiffuse = Read_aiColor3D( stream );
<del> l.mColorSpecular = Read_aiColor3D( stream );
<del> l.mColorAmbient = Read_aiColor3D( stream );
<del> if ( l.mType == aiLightSource_SPOT ) {
<del>
<del> l.mAngleInnerCone = readFloat( stream );
<del> l.mAngleOuterCone = readFloat( stream );
<del> }
<del>
<del>}
<del>// -----------------------------------------------------------------------------------
<del>function ReadBinaryCamera( stream, cam ) {
<del>
<del> var chunkID = Read_uint32_t( stream );
<del> ai_assert( chunkID == ASSBIN_CHUNK_AICAMERA );
<del> /*uint32_t size =*/
<del> Read_uint32_t( stream );
<del> cam.mName = Read_aiString( stream );
<del> cam.mPosition = Read_aiVector3D( stream );
<del> cam.mLookAt = Read_aiVector3D( stream );
<del> cam.mUp = Read_aiVector3D( stream );
<del> cam.mHorizontalFOV = readFloat( stream );
<del> cam.mClipPlaneNear = readFloat( stream );
<del> cam.mClipPlaneFar = readFloat( stream );
<del> cam.mAspect = readFloat( stream );
<del>
<del>}
<del>
<del>function ReadBinaryScene( stream, scene ) {
<del>
<del> var chunkID = Read_uint32_t( stream );
<del> ai_assert( chunkID == ASSBIN_CHUNK_AISCENE );
<del> /*uint32_t size =*/
<del> Read_uint32_t( stream );
<del> scene.mFlags = Read_unsigned_int( stream );
<del> scene.mNumMeshes = Read_unsigned_int( stream );
<del> scene.mNumMaterials = Read_unsigned_int( stream );
<del> scene.mNumAnimations = Read_unsigned_int( stream );
<del> scene.mNumTextures = Read_unsigned_int( stream );
<del> scene.mNumLights = Read_unsigned_int( stream );
<del> scene.mNumCameras = Read_unsigned_int( stream );
<del> // Read node graph
<del> scene.mRootNode = new aiNode();
<del> scene.mRootNode = ReadBinaryNode( stream, null, 0 );
<del> // Read all meshes
<del> if ( scene.mNumMeshes ) {
<del>
<del> scene.mMeshes = [];
<del> for ( var i = 0; i < scene.mNumMeshes; ++i ) {
<del>
<del> scene.mMeshes[ i ] = new aiMesh();
<del> ReadBinaryMesh( stream, scene.mMeshes[ i ] );
<del> }
<del> }
<del> // Read materials
<del> if ( scene.mNumMaterials ) {
<del>
<del> scene.mMaterials = [];
<del> for ( var i = 0; i < scene.mNumMaterials; ++i ) {
<del>
<del> scene.mMaterials[ i ] = new aiMaterial();
<del> ReadBinaryMaterial( stream, scene.mMaterials[ i ] );
<del> }
<del> }
<del> // Read all animations
<del> if ( scene.mNumAnimations ) {
<del>
<del> scene.mAnimations = [];
<del> for ( var i = 0; i < scene.mNumAnimations; ++i ) {
<del>
<del> scene.mAnimations[ i ] = new aiAnimation();
<del> ReadBinaryAnim( stream, scene.mAnimations[ i ] );
<del> }
<del> }
<del> // Read all textures
<del> if ( scene.mNumTextures ) {
<del>
<del> scene.mTextures = [];
<del> for ( var i = 0; i < scene.mNumTextures; ++i ) {
<del>
<del> scene.mTextures[ i ] = new aiTexture();
<del> ReadBinaryTexture( stream, scene.mTextures[ i ] );
<del> }
<del> }
<del> // Read lights
<del> if ( scene.mNumLights ) {
<del>
<del> scene.mLights = [];
<del> for ( var i = 0; i < scene.mNumLights; ++i ) {
<del>
<del> scene.mLights[ i ] = new aiLight();
<del> ReadBinaryLight( stream, scene.mLights[ i ] );
<del> }
<del> }
<del> // Read cameras
<del> if ( scene.mNumCameras ) {
<del>
<del> scene.mCameras = [];
<del> for ( var i = 0; i < scene.mNumCameras; ++i ) {
<del>
<del> scene.mCameras[ i ] = new aiCamera();
<del> ReadBinaryCamera( stream, scene.mCameras[ i ] );
<del> }
<del> }
<del>
<del>}
<del>var aiOrigin_CUR = 0;
<del>var aiOrigin_BEG = 1;
<del>
<del>function extendStream( stream ) {
<del>
<del> stream.readOffset = 0;
<del> stream.Seek = function( off, ori ) {
<del>
<del> if ( ori == aiOrigin_CUR ) {
<del>
<del> stream.readOffset += off;
<del> }
<del> if ( ori == aiOrigin_BEG ) {
<del>
<del> stream.readOffset = off;
<del> }
<del> }
<del> stream.ReadBytes = function( buff, size, n ) {
<del>
<del> var bytes = size * n;
<del> for ( var i = 0; i < bytes; i++ )
<del> buff[ i ] = Read_uint8_t( this );
<del> }
<del> stream.subArray32 = function( start, end ) {
<del>
<del> var buff = this.buffer;
<del> var newbuff = buff.slice( start, end );
<del> return new Float32Array( newbuff );
<del> }
<del> stream.subArrayUint16 = function( start, end ) {
<del>
<del> var buff = this.buffer;
<del> var newbuff = buff.slice( start, end );
<del> return new Uint16Array( newbuff );
<del> }
<del> stream.subArrayUint8 = function( start, end ) {
<del>
<del> var buff = this.buffer;
<del> var newbuff = buff.slice( start, end );
<del> return new Uint8Array( newbuff );
<del> }
<del> stream.subArrayUint32 = function( start, end ) {
<del>
<del> var buff = this.buffer;
<del> var newbuff = buff.slice( start, end );
<del> return new Uint32Array( newbuff );
<del> }
<del>
<del>}
<del>
<del>function AssimpLoader() {
<del>
<del> this.load = function( url, callback ) {
<del>
<del> var xhr = new XMLHttpRequest();
<del> xhr.open( 'GET', url, true );
<del> xhr.responseType = 'arraybuffer';
<del> xhr.onerror = function( e ) {
<del>
<del> callback( e );
<del> }
<del> xhr.onload = function( e ) {
<del>
<del> try {
<del>
<del> var time = performance.now();
<del> // response is unsigned 8 bit integer
<del> var node = InternReadFile( this.response, url );
<del> console.info( "Parse in " + ( performance.now() - time ) );
<del> callback(null, node);
<del>
<del> } catch ( e ) {
<del>
<del> callback(e);
<del>
<del> }
<del> };
<del> xhr.send();
<del> }
<del>
<del>}
<del>
<del>function InternReadFile( pFiledata, url ) {
<del>
<del> var pScene = new aiScene();
<del> pScene.baseURL = url;
<del> var stream = new DataView( pFiledata );
<del> extendStream( stream );
<del> stream.Seek( 44, aiOrigin_CUR ); // signature
<del> /*unsigned int versionMajor =*/
<del> var versionMajor = Read_unsigned_int( stream );
<del> /*unsigned int versionMinor =*/
<del> var versionMinor = Read_unsigned_int( stream );
<del> /*unsigned int versionRevision =*/
<del> var versionRevision = Read_unsigned_int( stream );
<del> /*unsigned int compileFlags =*/
<del> var compileFlags = Read_unsigned_int( stream );
<del> shortened = Read_uint16_t( stream ) > 0;
<del> compressed = Read_uint16_t( stream ) > 0;
<del> if ( shortened )
<del> throw "Shortened binaries are not supported!";
<del> stream.Seek( 256, aiOrigin_CUR ); // original filename
<del> stream.Seek( 128, aiOrigin_CUR ); // options
<del> stream.Seek( 64, aiOrigin_CUR ); // padding
<del> if ( compressed ) {
<del>
<del> var uncompressedSize = Read_uint32_t( stream );
<del> var compressedSize = stream.FileSize() - stream.Tell();
<del> var compressedData = [];
<del> stream.Read( compressedData, 1, compressedSize );
<del> var uncompressedData = [];
<del> uncompress( uncompressedData, uncompressedSize, compressedData, compressedSize );
<del> var buff = new ArrayBuffer( uncompressedData );
<del> ReadBinaryScene( buff, pScene );
<del> } else {
<del>
<del> ReadBinaryScene( stream, pScene );
<del> return pScene.toTHREE();
<del> }
<del>
<del>}
<del>
<del>THREE.AssimpLoader = AssimpLoader
<del>
<del>})()
<ide>\ No newline at end of file | 1 |
Ruby | Ruby | upgrade virtualenv to 16.5.0 | e3615add8c8289e3baa99d6b316ab78b54b3edc4 | <ide><path>Library/Homebrew/language/python_virtualenv_constants.rb
<ide> # frozen_string_literal: true
<ide>
<ide> PYTHON_VIRTUALENV_URL =
<del> "https://files.pythonhosted.org/packages/37/db" \
<del> "/89d6b043b22052109da35416abc3c397655e4bd3cff031446ba02b9654fa" \
<del> "/virtualenv-16.4.3.tar.gz"
<add> "https://files.pythonhosted.org/packages/6a/56" \
<add> "/74dce1fdeeabbcc0a3fcf299115f6814bd9c39fc4161658b513240a75ea7" \
<add> "/virtualenv-16.5.0.tar.gz"
<ide> PYTHON_VIRTUALENV_SHA256 =
<del> "984d7e607b0a5d1329425dd8845bd971b957424b5ba664729fab51ab8c11bc39"
<add> "15ee248d13e4001a691d9583948ad3947bcb8a289775102e4c4aa98a8b7a6d73" | 1 |
Javascript | Javascript | add unstable_strictmodelevel to test renderer | de0ee76dbde0a5ec507cf4a62c9690f67b56d7b6 | <ide><path>packages/react-test-renderer/src/ReactTestRenderer.js
<ide> const {IsSomeRendererActing} = ReactSharedInternals;
<ide> type TestRendererOptions = {
<ide> createNodeMock: (element: React$Element<any>) => any,
<ide> unstable_isConcurrent: boolean,
<add> unstable_strictModeLevel: number,
<ide> ...
<ide> };
<ide>
<ide> function propsMatch(props: Object, filter: Object): boolean {
<ide> function create(element: React$Element<any>, options: TestRendererOptions) {
<ide> let createNodeMock = defaultTestOptions.createNodeMock;
<ide> let isConcurrent = false;
<add> let strictModeLevel = null;
<ide> if (typeof options === 'object' && options !== null) {
<ide> if (typeof options.createNodeMock === 'function') {
<ide> createNodeMock = options.createNodeMock;
<ide> }
<ide> if (options.unstable_isConcurrent === true) {
<ide> isConcurrent = true;
<ide> }
<add> if (options.unstable_strictModeLevel !== undefined) {
<add> strictModeLevel = options.unstable_strictModeLevel;
<add> }
<ide> }
<ide> let container = {
<ide> children: [],
<ide> function create(element: React$Element<any>, options: TestRendererOptions) {
<ide> isConcurrent ? ConcurrentRoot : LegacyRoot,
<ide> false,
<ide> null,
<del> null,
<add> strictModeLevel,
<ide> );
<ide> invariant(root != null, 'something went wrong');
<ide> updateContainer(element, root, null, null); | 1 |
Python | Python | fix seterr example for resetting to old settings | 5803ec43faf46d5117ddddfc7026f098ea5b3b89 | <ide><path>numpy/core/numeric.py
<ide> def seterr(all=None, divide=None, over=None, under=None, invalid=None):
<ide> >>> np.seterr(over='raise')
<ide> {'over': 'ignore', 'divide': 'ignore', 'invalid': 'ignore',
<ide> 'under': 'ignore'}
<del> >>> np.seterr(all='ignore') # reset to default
<add> >>> np.seterr(**old_settings) # reset to default
<ide> {'over': 'raise', 'divide': 'ignore', 'invalid': 'ignore', 'under': 'ignore'}
<ide>
<ide> >>> np.int16(32000) * np.int16(3) | 1 |
Javascript | Javascript | fix uiexplorer list | 38157f017583e3c787f5caca018ba6a49bd590fe | <ide><path>Examples/UIExplorer/js/UIExplorerExampleList.js
<ide> const ds = new ListView.DataSource({
<ide> });
<ide>
<ide> class UIExplorerExampleList extends React.Component {
<del> constuctor(props: {
<add>
<add> state = {filter: ''};
<add>
<add> constructor(props: {
<ide> disableTitleRow: ?boolean,
<ide> onNavigate: Function,
<ide> filter: ?string,
<ide> class UIExplorerExampleList extends React.Component {
<ide> },
<ide> style: ?any,
<ide> }) {
<del>
<add> super(props);
<ide> }
<ide>
<ide> static makeRenderable(example: any): ReactClass<any> {
<ide> class UIExplorerExampleList extends React.Component {
<ide> }
<ide>
<ide> render(): ?ReactElement<any> {
<del> const filterText = this.props.filter || '';
<add> const filterText = this.state.filter || '';
<ide> const filterRegex = new RegExp(String(filterText), 'i');
<ide> const filter = (example) => filterRegex.test(example.module.title);
<ide>
<ide> class UIExplorerExampleList extends React.Component {
<ide> autoCorrect={false}
<ide> clearButtonMode="always"
<ide> onChangeText={text => {
<del> this.props.onNavigate(UIExplorerActions.ExampleListWithFilter(text));
<add> this.setState({filter: text});
<ide> }}
<ide> placeholder="Search..."
<ide> style={[styles.searchTextInput, this.props.searchTextInputStyle]}
<ide> testID="explorer_search"
<del> value={this.props.filter}
<add> value={this.state.filter}
<ide> />
<ide> </View>
<ide> ); | 1 |
Text | Text | add 1.7.4 release notes | e25f84296f459f5e1dc86c8720f655f906bc8845 | <ide><path>CHANGELOG.md
<add><a name="1.7.4"></a>
<add># 1.7.4 interstellar-exploration (2018-09-07)
<add>
<add>## Bug Fixes
<add>- **ngAria.ngClick:** prevent default event on space/enter only for non-interactive elements
<add> ([61b335](https://github.com/angular/angular.js/commit/61b33543ff8e7f32464dec98a46bf0a35e9b03a4),
<add> [#16664](https://github.com/angular/angular.js/issues/16664),
<add> [#16680](https://github.com/angular/angular.js/issues/16680))
<add>- **ngAnimate:** remove the "prepare" classes with multiple structural animations
<add> ([3105b2](https://github.com/angular/angular.js/commit/3105b2c26a71594c4e7904efc18f4b2e9da25b1b),
<add> [#16681](https://github.com/angular/angular.js/issues/16681),
<add> [#16677](https://github.com/angular/angular.js/issues/16677))
<add>- **$route:** correctly extract path params if the path contains a question mark or a hash
<add> ([2ceeb7](https://github.com/angular/angular.js/commit/2ceeb739f35e01fcebcabac4beeeb7684ae9f86d))
<add>- **ngHref:** allow numbers and other objects in interpolation
<add> ([30084c](https://github.com/angular/angular.js/commit/30084c13699c814ff6703d7aa2d3947a9b2f7067),
<add> [#16652](https://github.com/angular/angular.js/issues/16652),
<add> [#16626](https://github.com/angular/angular.js/issues/16626))
<add>- **select:** allow to select first option with value `undefined`
<add> ([668a33](https://github.com/angular/angular.js/commit/668a33da3439f17e61dfa8f6d9b114ebde8c9d87),
<add> [#16653](https://github.com/angular/angular.js/issues/16653),
<add> [#16656](https://github.com/angular/angular.js/issues/16656))
<add>
<add>
<ide> <a name="1.7.3"></a>
<ide> # 1.7.3 eventful-proposal (2018-08-03)
<ide> | 1 |
Javascript | Javascript | sanitize srcset attribute | ab80cd90661396dbb1c94c5f4dd2d11ee8f6b6af | <ide><path>src/ng/compile.js
<ide> function $CompileProvider($provide, $$sanitizeUriProvider) {
<ide>
<ide> nodeName = nodeName_(this.$$element);
<ide>
<del> // sanitize a[href] and img[src] values
<ide> if ((nodeName === 'a' && key === 'href') ||
<ide> (nodeName === 'img' && key === 'src')) {
<add> // sanitize a[href] and img[src] values
<ide> this[key] = value = $$sanitizeUri(value, key === 'src');
<add> } else if (nodeName === 'img' && key === 'srcset') {
<add> // sanitize img[srcset] values
<add> var result = "";
<add>
<add> // first check if there are spaces because it's not the same pattern
<add> var trimmedSrcset = trim(value);
<add> // ( 999x ,| 999w ,| ,|, )
<add> var srcPattern = /(\s+\d+x\s*,|\s+\d+w\s*,|\s+,|,\s+)/;
<add> var pattern = /\s/.test(trimmedSrcset) ? srcPattern : /(,)/;
<add>
<add> // split srcset into tuple of uri and descriptor except for the last item
<add> var rawUris = trimmedSrcset.split(pattern);
<add>
<add> // for each tuples
<add> var nbrUrisWith2parts = Math.floor(rawUris.length / 2);
<add> for (var i=0; i<nbrUrisWith2parts; i++) {
<add> var innerIdx = i*2;
<add> // sanitize the uri
<add> result += $$sanitizeUri(trim( rawUris[innerIdx]), true);
<add> // add the descriptor
<add> result += ( " " + trim(rawUris[innerIdx+1]));
<add> }
<add>
<add> // split the last item into uri and descriptor
<add> var lastTuple = trim(rawUris[i*2]).split(/\s/);
<add>
<add> // sanitize the last uri
<add> result += $$sanitizeUri(trim(lastTuple[0]), true);
<add>
<add> // and add the last descriptor if any
<add> if( lastTuple.length === 2) {
<add> result += (" " + trim(lastTuple[1]));
<add> }
<add> this[key] = value = result;
<ide> }
<ide>
<ide> if (writeAttr !== false) {
<ide><path>test/ng/compileSpec.js
<ide> describe('$compile', function() {
<ide> });
<ide> });
<ide>
<add> describe('img[srcset] sanitization', function() {
<add>
<add> it('should NOT require trusted values for img srcset', inject(function($rootScope, $compile, $sce) {
<add> element = $compile('<img srcset="{{testUrl}}"></img>')($rootScope);
<add> $rootScope.testUrl = 'http://example.com/image.png';
<add> $rootScope.$digest();
<add> expect(element.attr('srcset')).toEqual('http://example.com/image.png');
<add> // But it should accept trusted values anyway.
<add> $rootScope.testUrl = $sce.trustAsUrl('http://example.com/image2.png');
<add> $rootScope.$digest();
<add> expect(element.attr('srcset')).toEqual('http://example.com/image2.png');
<add> }));
<add>
<add> it('should use $$sanitizeUri', function() {
<add> var $$sanitizeUri = jasmine.createSpy('$$sanitizeUri');
<add> module(function($provide) {
<add> $provide.value('$$sanitizeUri', $$sanitizeUri);
<add> });
<add> inject(function($compile, $rootScope) {
<add> element = $compile('<img srcset="{{testUrl}}"></img>')($rootScope);
<add> $rootScope.testUrl = "someUrl";
<add>
<add> $$sanitizeUri.andReturn('someSanitizedUrl');
<add> $rootScope.$apply();
<add> expect(element.attr('srcset')).toBe('someSanitizedUrl');
<add> expect($$sanitizeUri).toHaveBeenCalledWith($rootScope.testUrl, true);
<add> });
<add> });
<add>
<add> it('should sanitize all uris in srcset', inject(function($rootScope, $compile) {
<add> /*jshint scripturl:true*/
<add> element = $compile('<img srcset="{{testUrl}}"></img>')($rootScope);
<add> var testSet = {
<add> 'http://example.com/image.png':'http://example.com/image.png',
<add> ' http://example.com/image.png':'http://example.com/image.png',
<add> 'http://example.com/image.png ':'http://example.com/image.png',
<add> 'http://example.com/image.png 128w':'http://example.com/image.png 128w',
<add> 'http://example.com/image.png 2x':'http://example.com/image.png 2x',
<add> 'http://example.com/image.png 1.5x':'http://example.com/image.png 1.5x',
<add> 'http://example.com/image1.png 1x,http://example.com/image2.png 2x':'http://example.com/image1.png 1x,http://example.com/image2.png 2x',
<add> 'http://example.com/image1.png 1x ,http://example.com/image2.png 2x':'http://example.com/image1.png 1x ,http://example.com/image2.png 2x',
<add> 'http://example.com/image1.png 1x, http://example.com/image2.png 2x':'http://example.com/image1.png 1x,http://example.com/image2.png 2x',
<add> 'http://example.com/image1.png 1x , http://example.com/image2.png 2x':'http://example.com/image1.png 1x ,http://example.com/image2.png 2x',
<add> 'http://example.com/image1.png 48w,http://example.com/image2.png 64w':'http://example.com/image1.png 48w,http://example.com/image2.png 64w',
<add> //Test regex to make sure doesn't mistake parts of url for width descriptors
<add> 'http://example.com/image1.png?w=48w,http://example.com/image2.png 64w':'http://example.com/image1.png?w=48w,http://example.com/image2.png 64w',
<add> 'http://example.com/image1.png 1x,http://example.com/image2.png 64w':'http://example.com/image1.png 1x,http://example.com/image2.png 64w',
<add> 'http://example.com/image1.png,http://example.com/image2.png':'http://example.com/image1.png ,http://example.com/image2.png',
<add> 'http://example.com/image1.png ,http://example.com/image2.png':'http://example.com/image1.png ,http://example.com/image2.png',
<add> 'http://example.com/image1.png, http://example.com/image2.png':'http://example.com/image1.png ,http://example.com/image2.png',
<add> 'http://example.com/image1.png , http://example.com/image2.png':'http://example.com/image1.png ,http://example.com/image2.png',
<add> 'http://example.com/image1.png 1x, http://example.com/image2.png 2x, http://example.com/image3.png 3x':
<add> 'http://example.com/image1.png 1x,http://example.com/image2.png 2x,http://example.com/image3.png 3x',
<add> 'javascript:doEvilStuff() 2x': 'unsafe:javascript:doEvilStuff() 2x',
<add> 'http://example.com/image1.png 1x,javascript:doEvilStuff() 2x':'http://example.com/image1.png 1x,unsafe:javascript:doEvilStuff() 2x',
<add> 'http://example.com/image1.jpg?x=a,b 1x,http://example.com/ima,ge2.jpg 2x':'http://example.com/image1.jpg?x=a,b 1x,http://example.com/ima,ge2.jpg 2x',
<add> //Test regex to make sure doesn't mistake parts of url for pixel density descriptors
<add> 'http://example.com/image1.jpg?x=a2x,b 1x,http://example.com/ima,ge2.jpg 2x':'http://example.com/image1.jpg?x=a2x,b 1x,http://example.com/ima,ge2.jpg 2x'
<add> };
<add>
<add> forEach( testSet, function( ref, url) {
<add> $rootScope.testUrl = url;
<add> $rootScope.$digest();
<add> expect(element.attr('srcset')).toEqual(ref);
<add> });
<add>
<add> }));
<add> });
<ide>
<ide> describe('a[href] sanitization', function() {
<ide>
<ide><path>test/ng/directive/ngSrcSpec.js
<ide> describe('ngSrc', function() {
<ide> dealoc(element);
<ide> });
<ide>
<del> it('should not result empty string in img src', inject(function($rootScope, $compile) {
<del> $rootScope.image = {};
<del> element = $compile('<img ng-src="{{image.url}}">')($rootScope);
<del> $rootScope.$digest();
<del> expect(element.attr('src')).not.toBe('');
<del> expect(element.attr('src')).toBe(undefined);
<del> }));
<add> describe('img[ng-src]', function() {
<add> it('should not result empty string in img src', inject(function($rootScope, $compile) {
<add> $rootScope.image = {};
<add> element = $compile('<img ng-src="{{image.url}}">')($rootScope);
<add> $rootScope.$digest();
<add> expect(element.attr('src')).not.toBe('');
<add> expect(element.attr('src')).toBe(undefined);
<add> }));
<add>
<add> it('should sanitize url', inject(function($rootScope, $compile) {
<add> $rootScope.imageUrl = 'javascript:alert(1);';
<add> element = $compile('<img ng-src="{{imageUrl}}">')($rootScope);
<add> $rootScope.$digest();
<add> expect(element.attr('src')).toBe('unsafe:javascript:alert(1);');
<add> }));
<add> });
<ide>
<ide> describe('iframe[ng-src]', function() {
<ide> it('should pass through src attributes for the same domain', inject(function($compile, $rootScope) {
<ide><path>test/ng/directive/ngSrcsetSpec.js
<add>/*jshint scripturl:true*/
<ide> 'use strict';
<ide>
<ide> describe('ngSrcset', function() {
<ide> describe('ngSrcset', function() {
<ide> $rootScope.$digest();
<ide> expect(element.attr('srcset')).toBeUndefined();
<ide> }));
<add>
<add> it('should sanitize good urls', inject(function($rootScope, $compile) {
<add> $rootScope.imageUrl = 'http://example.com/image1.png 1x, http://example.com/image2.png 2x';
<add> element = $compile('<img ng-srcset="{{imageUrl}}">')($rootScope);
<add> $rootScope.$digest();
<add> expect(element.attr('srcset')).toBe('http://example.com/image1.png 1x,http://example.com/image2.png 2x');
<add> }));
<add>
<add> it('should sanitize evil url', inject(function($rootScope, $compile) {
<add> $rootScope.imageUrl = 'http://example.com/image1.png 1x, javascript:doEvilStuff() 2x';
<add> element = $compile('<img ng-srcset="{{imageUrl}}">')($rootScope);
<add> $rootScope.$digest();
<add> expect(element.attr('srcset')).toBe('http://example.com/image1.png 1x,unsafe:javascript:doEvilStuff() 2x');
<add> }));
<ide> });
<add> | 4 |
Text | Text | add changelogs for cluster | dc10c1aa29db556fcb02918c6e565ece4a2128c0 | <ide><path>doc/api/cluster.md
<ide> It is not emitted in the worker.
<ide> ### worker.disconnect()
<ide> <!-- YAML
<ide> added: v0.7.7
<add>changes:
<add> - version: v7.3.0
<add> pr-url: https://github.com/nodejs/node/pull/10019
<add> description: This method now returns a reference to `worker`.
<ide> -->
<ide>
<ide> * Returns: {Worker} A reference to `worker`.
<ide> accidental disconnection.
<ide> ### worker.send(message[, sendHandle][, callback])
<ide> <!-- YAML
<ide> added: v0.7.0
<add>changes:
<add> - version: v4.0.0
<add> pr-url: https://github.com/nodejs/node/pull/2620
<add> description: The `callback` parameter is supported now.
<ide> -->
<ide>
<ide> * `message` {Object}
<ide> if (cluster.isMaster) {
<ide> <!-- YAML
<ide> added: v0.7.0
<ide> deprecated: v6.0.0
<add>changes:
<add> - version: v7.0.0
<add> pr-url: https://github.com/nodejs/node/pull/3747
<add> description: Accessing this property will now emit a deprecation warning.
<ide> -->
<ide>
<ide> > Stability: 0 - Deprecated: Use [`worker.exitedAfterDisconnect`][] instead.
<ide> The `addressType` is one of:
<ide> * `"udp4"` or `"udp6"` (UDP v4 or v6)
<ide>
<ide> ## Event: 'message'
<add><!-- YAML
<add>added: v2.5.0
<add>changes:
<add> - version: v6.0.0
<add> pr-url: https://github.com/nodejs/node/pull/5361
<add> description: The `worker` parameter is passed now; see below for details.
<add>-->
<ide>
<ide> * `worker` {cluster.Worker}
<ide> * `message` {Object}
<ide> values are `"rr"` and `"none"`.
<ide> ## cluster.settings
<ide> <!-- YAML
<ide> added: v0.7.1
<add>changes:
<add> - version: v6.4.0
<add> pr-url: https://github.com/nodejs/node/pull/7838
<add> description: The `stdio` option is supported now.
<ide> -->
<ide>
<ide> * {Object}
<ide> This object is not supposed to be changed or set manually, by you.
<ide> ## cluster.setupMaster([settings])
<ide> <!-- YAML
<ide> added: v0.7.1
<add>changes:
<add> - version: v6.4.0
<add> pr-url: https://github.com/nodejs/node/pull/7838
<add> description: The `stdio` option is supported now.
<ide> -->
<ide>
<ide> * `settings` {Object} | 1 |
PHP | PHP | apply fixes from styleci | e68c3d81411fe0e167d851328a66b1d334034469 | <ide><path>src/Illuminate/Routing/Router.php
<ide> public function pushMiddlewareToGroup($group, $middleware)
<ide> $this->middlewareGroups[$group] = [];
<ide> }
<ide>
<del> if ( ! in_array($middleware, $this->middlewareGroups[$group])) {
<add> if (! in_array($middleware, $this->middlewareGroups[$group])) {
<ide> $this->middlewareGroups[$group][] = $middleware;
<ide> }
<ide> | 1 |
Text | Text | add v3.14.0 to changelog | 1ffa71d499e4a5dce7370304accae473936695c1 | <ide><path>CHANGELOG.md
<ide> # Ember Changelog
<ide>
<del>### v3.14.0-beta.5 (October 14, 2019)
<add>### v3.14.0 (October 29, 2019)
<ide>
<del>- [#18476](https://github.com/emberjs/ember.js/pull/18476) [BUGFIX] Ensure model can be observed by sync observers
<add>- [#18345](https://github.com/emberjs/ember.js/pull/18345) / [#18363](https://github.com/emberjs/ember.js/pull/18363) [FEATURE] Implement the [Provide @model named argument to route templates](https://github.com/emberjs/rfcs/blob/master/text/0523-model-argument-for-route-templates.md RFC.
<ide> - [#18458](https://github.com/emberjs/ember.js/pull/18458) [BUGFIX] Using query params helper outside of link-to
<del>
<del>### v3.14.0-beta.4 (October 7, 2019)
<del>
<del>- [#18462](https://github.com/emberjs/ember.js/pull/18462) [BUGFIX] Prevents observer re-entry
<del>
<del>### v3.14.0-beta.3 (October 1, 2019)
<del>
<ide> - [#18429](https://github.com/emberjs/ember.js/pull/18429) [BUGFIX] Fix incorrect error message for octane features.
<del>
<del>### v3.14.0-beta.2 (September 24, 2019)
<del>
<del>- [#18273](https://github.com/emberjs/ember.js/pull/18273) [BUGFIX] Fix issues with SSR rehydration of <title>.
<ide> - [#18415](https://github.com/emberjs/ember.js/pull/18415) [BUGFIX] Fix hbs import path in test blueprint.
<del>- [#18418](https://github.com/emberjs/ember.js/pull/18418) / [#18419](https://github.com/emberjs/ember.js/pull/18419) [BUGFIX] Require Octane features when using Octane preview.
<del>
<del>### v3.14.0-beta.1 (September 19, 2019)
<del>
<del>- [#18345](https://github.com/emberjs/ember.js/pull/18345) / [#18363](https://github.com/emberjs/ember.js/pull/18363) [FEATURE] Implement the [Provide @model named argument to route templates](https://github.com/emberjs/rfcs/blob/master/text/0523-model-argument-for-route-templates.md RFC.
<ide> - [#18387](https://github.com/emberjs/ember.js/pull/18387) [BUGFIX] Ensure `updateComponent` is fired consistently
<del>- [#18372](https://github.com/emberjs/ember.js/pull/18372) Debug Render Tree (for Ember Inspector)
<ide> - [#18381](https://github.com/emberjs/ember.js/pull/18381) Drop Node 6 and 11 support.
<ide> - [#18410](https://github.com/emberjs/ember.js/pull/18410) Use ember-cli-htmlbars for inline precompilation if possible.
<ide> | 1 |
PHP | PHP | remove dead test | 7d39beff68c5ebf5666e8433a76d84f779914c91 | <ide><path>tests/test_app/templates/layout/rss/default.php
<del><?php
<del>echo $this->Rss->header();
<del>
<del>if (!isset($channel)) {
<del> $channel = [];
<del>}
<del>if (!isset($channel['title'])) {
<del> $channel['title'] = $this->fetch('title');
<del>}
<del>
<del>echo $this->Rss->document(
<del> $this->Rss->channel(
<del> [], $channel, $this->fetch('content')
<del> )
<del>);
<del>
<del>?> | 1 |
Text | Text | remove subsystem from pull request template | c72e540cde6695176bd7d218a1ed6584e2a0b9c7 | <ide><path>.github/PULL_REQUEST_TEMPLATE.md
<ide> Contributors guide: https://github.com/nodejs/node/blob/master/CONTRIBUTING.md
<ide> - [ ] tests and/or benchmarks are included
<ide> - [ ] documentation is changed or added
<ide> - [ ] commit message follows [commit guidelines](https://github.com/nodejs/node/blob/master/doc/guides/contributing/pull-requests.md#commit-message-guidelines)
<del>
<del>##### Affected core subsystem(s)
<del><!-- Provide affected core subsystem(s) (like doc, cluster, crypto, etc). --> | 1 |
Ruby | Ruby | add more tests for `find_signed/!` methods | 21889ff55048737a2fa2a3c75412518e2b2aef51 | <ide><path>activerecord/test/cases/signed_id_test.rb
<ide> class SignedIdTest < ActiveRecord::TestCase
<ide> assert_equal Company.first, Company.find_signed(Company.first.signed_id)
<ide> end
<ide>
<del> test "raise UnknownPrimaryKey when model have no primary key" do
<add> test "find signed record raises UnknownPrimaryKey when a model has no primary key" do
<ide> error = assert_raises(ActiveRecord::UnknownPrimaryKey) do
<ide> Matey.find_signed("this will not be even verified")
<ide> end
<ide> class SignedIdTest < ActiveRecord::TestCase
<ide> end
<ide> end
<ide>
<add> test "find signed record with a bang with custom primary key" do
<add> assert_equal @toy, Toy.find_signed!(@toy.signed_id)
<add> end
<add>
<add> test "find signed record with a bang for single table inheritance (STI Models)" do
<add> assert_equal Company.first, Company.find_signed!(Company.first.signed_id)
<add> end
<add>
<ide> test "fail to find record from broken signed id" do
<ide> assert_nil Account.find_signed("this won't find anything")
<ide> end
<ide> class SignedIdTest < ActiveRecord::TestCase
<ide> assert_nil Account.find_signed signed_id
<ide> end
<ide>
<add> test "find signed record with purpose" do
<add> assert_equal @account, Account.find_signed(@account.signed_id(purpose: :v1), purpose: :v1)
<add> end
<add>
<add> test "fail to find signed record with purpose" do
<add> assert_nil Account.find_signed(@account.signed_id(purpose: :v1))
<add>
<add> assert_nil Account.find_signed(@account.signed_id(purpose: :v1), purpose: :v2)
<add> end
<add>
<ide> test "finding record from broken signed id raises on the bang" do
<ide> assert_raises(ActiveSupport::MessageVerifier::InvalidSignature) do
<ide> Account.find_signed! "this will blow up"
<ide> end
<ide> end
<ide>
<add> test "find signed record with a bang within expiration date" do
<add> assert_equal @account, Account.find_signed!(@account.signed_id(expires_in: 1.minute))
<add> end
<add>
<ide> test "finding signed record outside expiration date raises on the bang" do
<ide> signed_id = @account.signed_id(expires_in: 1.minute)
<ide> travel 2.minutes
<ide> class SignedIdTest < ActiveRecord::TestCase
<ide> end
<ide> end
<ide>
<add> test "find signed record with bang with purpose" do
<add> assert_equal @account, Account.find_signed!(@account.signed_id(purpose: :v1), purpose: :v1)
<add> end
<add>
<add> test "find signed record with bang with purpose raises" do
<add> assert_raises(ActiveSupport::MessageVerifier::InvalidSignature) do
<add> Account.find_signed!(@account.signed_id(purpose: :v1))
<add> end
<add>
<add> assert_raises(ActiveSupport::MessageVerifier::InvalidSignature) do
<add> Account.find_signed!(@account.signed_id(purpose: :v1), purpose: :v2)
<add> end
<add> end
<add>
<ide> test "fail to work without a signed_id_verifier_secret" do
<ide> ActiveRecord::Base.signed_id_verifier_secret = nil
<ide> Account.instance_variable_set :@signed_id_verifier, nil | 1 |
Go | Go | add swarmkit fields to stack service | 13384ba34b8d57d9cc6e68ff9c8b0b72bc72f9c3 | <ide><path>cli/command/stack/deploy.go
<ide> import (
<ide> "io/ioutil"
<ide> "os"
<ide> "strings"
<del> "time"
<ide>
<ide> "github.com/spf13/cobra"
<ide> "golang.org/x/net/context"
<ide> import (
<ide> "github.com/docker/docker/cli"
<ide> "github.com/docker/docker/cli/command"
<ide> servicecmd "github.com/docker/docker/cli/command/service"
<add> runconfigopts "github.com/docker/docker/runconfig/opts"
<add> "github.com/docker/docker/opts"
<ide> "github.com/docker/go-connections/nat"
<ide> )
<ide>
<ide> func convertService(
<ide> return swarm.ServiceSpec{}, err
<ide> }
<ide>
<add> resources, err := convertResources(service.Deploy.Resources)
<add> if err != nil {
<add> return swarm.ServiceSpec{}, err
<add> }
<add>
<add> restartPolicy, err := convertRestartPolicy(
<add> service.Restart, service.Deploy.RestartPolicy)
<add> if err != nil {
<add> return swarm.ServiceSpec{}, err
<add> }
<add>
<ide> serviceSpec := swarm.ServiceSpec{
<ide> Annotations: swarm.Annotations{
<ide> Name: name,
<del> Labels: getStackLabels(namespace, service.Labels),
<add> Labels: getStackLabels(namespace, service.Deploy.Labels),
<ide> },
<ide> TaskTemplate: swarm.TaskSpec{
<ide> ContainerSpec: swarm.ContainerSpec{
<del> Image: service.Image,
<del> Command: service.Entrypoint,
<del> Args: service.Command,
<del> Hostname: service.Hostname,
<del> Env: convertEnvironment(service.Environment),
<del> Labels: getStackLabels(namespace, service.Deploy.Labels),
<del> Dir: service.WorkingDir,
<del> User: service.User,
<del> Mounts: mounts,
<add> Image: service.Image,
<add> Command: service.Entrypoint,
<add> Args: service.Command,
<add> Hostname: service.Hostname,
<add> Env: convertEnvironment(service.Environment),
<add> Labels: getStackLabels(namespace, service.Labels),
<add> Dir: service.WorkingDir,
<add> User: service.User,
<add> Mounts: mounts,
<add> StopGracePeriod: service.StopGracePeriod,
<ide> },
<add> Resources: resources,
<add> RestartPolicy: restartPolicy,
<ide> Placement: &swarm.Placement{
<ide> Constraints: service.Deploy.Placement.Constraints,
<ide> },
<ide> },
<ide> EndpointSpec: endpoint,
<ide> Mode: mode,
<ide> Networks: convertNetworks(service.Networks, namespace, service.Name),
<add> UpdateConfig: convertUpdateConfig(service.Deploy.UpdateConfig),
<ide> }
<ide>
<del> if service.StopGracePeriod != nil {
<del> stopGrace, err := time.ParseDuration(*service.StopGracePeriod)
<add> return serviceSpec, nil
<add>}
<add>
<add>func convertRestartPolicy(restart string, source *composetypes.RestartPolicy) (*swarm.RestartPolicy, error) {
<add> // TODO: log if restart is being ignored
<add> if source == nil {
<add> policy, err := runconfigopts.ParseRestartPolicy(restart)
<ide> if err != nil {
<del> return swarm.ServiceSpec{}, err
<add> return nil, err
<add> }
<add> // TODO: is this an accurate convertion?
<add> switch {
<add> case policy.IsNone(), policy.IsAlways(), policy.IsUnlessStopped():
<add> return nil, nil
<add> case policy.IsOnFailure():
<add> attempts := uint64(policy.MaximumRetryCount)
<add> return &swarm.RestartPolicy{
<add> Condition: swarm.RestartPolicyConditionOnFailure,
<add> MaxAttempts: &attempts,
<add> }, nil
<ide> }
<del> serviceSpec.TaskTemplate.ContainerSpec.StopGracePeriod = &stopGrace
<ide> }
<add> return &swarm.RestartPolicy{
<add> Condition: swarm.RestartPolicyCondition(source.Condition),
<add> Delay: source.Delay,
<add> MaxAttempts: source.MaxAttempts,
<add> Window: source.Window,
<add> }, nil
<add>}
<ide>
<del> // TODO: convert mounts
<del> return serviceSpec, nil
<add>func convertUpdateConfig(source *composetypes.UpdateConfig) *swarm.UpdateConfig {
<add> if source == nil {
<add> return nil
<add> }
<add> return &swarm.UpdateConfig{
<add> Parallelism: source.Parallelism,
<add> Delay: source.Delay,
<add> FailureAction: source.FailureAction,
<add> Monitor: source.Monitor,
<add> MaxFailureRatio: source.MaxFailureRatio,
<add> }
<add>}
<add>
<add>func convertResources(source composetypes.Resources) (*swarm.ResourceRequirements, error) {
<add> resources := &swarm.ResourceRequirements{}
<add> if source.Limits != nil {
<add> cpus, err := opts.ParseCPUs(source.Limits.NanoCPUs)
<add> if err != nil {
<add> return nil, err
<add> }
<add> resources.Limits = &swarm.Resources{
<add> NanoCPUs: cpus,
<add> MemoryBytes: int64(source.Limits.MemoryBytes),
<add> }
<add> }
<add> if source.Reservations != nil {
<add> cpus, err := opts.ParseCPUs(source.Reservations.NanoCPUs)
<add> if err != nil {
<add> return nil, err
<add> }
<add> resources.Reservations = &swarm.Resources{
<add> NanoCPUs: cpus,
<add> MemoryBytes: int64(source.Reservations.MemoryBytes),
<add> }
<add> }
<add> return resources, nil
<ide> }
<ide>
<ide> func convertEndpointSpec(source []string) (*swarm.EndpointSpec, error) {
<ide> func convertEnvironment(source map[string]string) []string {
<ide> return output
<ide> }
<ide>
<del>func convertDeployMode(mode string, replicas uint64) (swarm.ServiceMode, error) {
<add>func convertDeployMode(mode string, replicas *uint64) (swarm.ServiceMode, error) {
<ide> serviceMode := swarm.ServiceMode{}
<ide>
<ide> switch mode {
<ide> case "global":
<del> if replicas != 0 {
<add> if replicas != nil {
<ide> return serviceMode, fmt.Errorf("replicas can only be used with replicated mode")
<ide> }
<ide> serviceMode.Global = &swarm.GlobalService{}
<ide> case "replicated":
<del> serviceMode.Replicated = &swarm.ReplicatedService{Replicas: &replicas}
<add> serviceMode.Replicated = &swarm.ReplicatedService{Replicas: replicas}
<ide> default:
<ide> return serviceMode, fmt.Errorf("Unknown mode: %s", mode)
<ide> }
<ide><path>opts/opts.go
<ide> func (c *NanoCPUs) String() string {
<ide>
<ide> // Set sets the value of the NanoCPU by passing a string
<ide> func (c *NanoCPUs) Set(value string) error {
<del> cpu, ok := new(big.Rat).SetString(value)
<del> if !ok {
<del> return fmt.Errorf("Failed to parse %v as a rational number", value)
<del> }
<del> nano := cpu.Mul(cpu, big.NewRat(1e9, 1))
<del> if !nano.IsInt() {
<del> return fmt.Errorf("value is too precise")
<del> }
<del> *c = NanoCPUs(nano.Num().Int64())
<del> return nil
<add> cpus, err := ParseCPUs(value)
<add> *c = NanoCPUs(cpus)
<add> return err
<ide> }
<ide>
<ide> // Type returns the type
<ide> func (c *NanoCPUs) Type() string {
<ide> func (c *NanoCPUs) Value() int64 {
<ide> return int64(*c)
<ide> }
<add>
<add>// ParseCPUs takes a string ratio and returns an integer value of nano cpus
<add>func ParseCPUs(value string) (int64, error) {
<add> cpu, ok := new(big.Rat).SetString(value)
<add> if !ok {
<add> return 0, fmt.Errorf("failed to parse %v as a rational number", value)
<add> }
<add> nano := cpu.Mul(cpu, big.NewRat(1e9, 1))
<add> if !nano.IsInt() {
<add> return 0, fmt.Errorf("value is too precise")
<add> }
<add> return nano.Num().Int64(), nil
<add>} | 2 |
Java | Java | avoid unescape for connect and connected frames | 5d91560f9230a1673036ee0a9b5e4aa75c4f148c | <ide><path>spring-messaging/src/main/java/org/springframework/messaging/simp/stomp/StompDecoder.java
<ide> private Message<byte[]> decodeMessage(ByteBuffer byteBuffer, @Nullable MultiValu
<ide> StompCommand stompCommand = StompCommand.valueOf(command);
<ide> headerAccessor = StompHeaderAccessor.create(stompCommand);
<ide> initHeaders(headerAccessor);
<del> readHeaders(byteBuffer, headerAccessor);
<add> readHeaders(stompCommand, byteBuffer, headerAccessor);
<ide> payload = readPayload(byteBuffer, headerAccessor);
<ide> }
<ide> if (payload != null) {
<ide> private String readCommand(ByteBuffer byteBuffer) {
<ide> return StreamUtils.copyToString(command, StandardCharsets.UTF_8);
<ide> }
<ide>
<del> private void readHeaders(ByteBuffer byteBuffer, StompHeaderAccessor headerAccessor) {
<add> private void readHeaders(StompCommand stompCommand, ByteBuffer byteBuffer, StompHeaderAccessor headerAccessor) {
<add> boolean shouldUnescape = (stompCommand != StompCommand.CONNECT && stompCommand != StompCommand.STOMP
<add> && stompCommand != StompCommand.CONNECTED);
<ide> while (true) {
<ide> ByteArrayOutputStream headerStream = new ByteArrayOutputStream(256);
<ide> boolean headerComplete = false;
<ide> private void readHeaders(ByteBuffer byteBuffer, StompHeaderAccessor headerAccess
<ide> }
<ide> }
<ide> else {
<del> String headerName = unescape(header.substring(0, colonIndex));
<del> String headerValue = unescape(header.substring(colonIndex + 1));
<add> String headerName = shouldUnescape ? unescape(header.substring(0, colonIndex)) : header.substring(0, colonIndex);
<add> String headerValue = shouldUnescape ? unescape(header.substring(colonIndex + 1)) : header.substring(colonIndex + 1);
<ide> try {
<ide> headerAccessor.addNativeHeader(headerName, headerValue);
<ide> }
<ide><path>spring-messaging/src/test/java/org/springframework/messaging/simp/stomp/StompDecoderTests.java
<ide> public void decodeFrameWithEscapedHeaders() {
<ide> assertThat(headers.getFirstNativeHeader("a:\r\n\\b")).isEqualTo("alpha:bravo\r\n\\");
<ide> }
<ide>
<add> @Test
<add> public void decodeFrameWithHeaderWithBackslashValue() {
<add> String accept = "accept-version:1.1\n";
<add> String keyAndValueWithBackslash = "key:\\value\n";
<add>
<add> Message<byte[]> frame = decode("CONNECT\n" + accept + keyAndValueWithBackslash + "\n\0");
<add> StompHeaderAccessor headers = StompHeaderAccessor.wrap(frame);
<add>
<add> assertThat(headers.getCommand()).isEqualTo(StompCommand.CONNECT);
<add>
<add> assertThat(headers.toNativeHeaderMap().size()).isEqualTo(2);
<add> assertThat(headers.getFirstNativeHeader("accept-version")).isEqualTo("1.1");
<add> assertThat(headers.getFirstNativeHeader("key")).isEqualTo("\\value");
<add>
<add> assertThat(frame.getPayload().length).isEqualTo(0);
<add> }
<add>
<ide> @Test
<ide> public void decodeFrameBodyNotAllowed() {
<ide> assertThatExceptionOfType(StompConversionException.class).isThrownBy(() -> | 2 |