content_type
stringclasses
8 values
main_lang
stringclasses
7 values
message
stringlengths
1
50
sha
stringlengths
40
40
patch
stringlengths
52
962k
file_count
int64
1
300
Text
Text
add link to minitest rdoc & github
04e1281990fd1d18d95c6b609690380711d0b48a
<ide><path>guides/source/testing.md <ide> Ideally, you would like to include a test for everything which could possibly br <ide> By now you've caught a glimpse of some of the assertions that are available. Assertions are the worker bees of testing. They are the ones that actually perform the checks to ensure that things are going as planned. <ide> <ide> There are a bunch of different types of assertions you can use. <del>Here's an extract of the assertions you can use with `minitest`, the default testing library used by Rails. The `[msg]` parameter is an optional string message you can specify to make your test failure messages clearer. It's not required. <add>Here's an extract of the assertions you can use with [`Minitest`](https://github.com/seattlerb/minitest), the default testing library used by Rails. The `[msg]` parameter is an optional string message you can specify to make your test failure messages clearer. It's not required. <ide> <ide> | Assertion | Purpose | <ide> | ---------------------------------------------------------------- | ------- | <ide> Here's an extract of the assertions you can use with `minitest`, the default tes <ide> | `assert_send( array, [msg] )` | Ensures that executing the method listed in `array[1]` on the object in `array[0]` with the parameters of `array[2 and up]` is true. This one is weird eh?| <ide> | `flunk( [msg] )` | Ensures failure. This is useful to explicitly mark a test that isn't finished yet.| <ide> <add>The above are a subset of the assertions that Minitest supports. For an exhaustive and more up-to-date list, please check the [Minitest API documentation](http://docs.seattlerb.org/minitest/), specifically [`Minitest::Assertions`](http://docs.seattlerb.org/minitest/Minitest/Assertions.html). <add> <ide> Because of the modular nature of the testing framework, it is possible to create your own assertions. In fact, that's exactly what Rails does. It includes some specialized assertions to make your life easier.
<ide> <ide> NOTE: Creating your own assertions is an advanced topic that we won't cover in this tutorial.
1
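The Rails guide text in this record notes that Minitest's modular design lets you write custom assertions, each taking an optional `[msg]` failure message. As a cross-language illustration (Python is used for the examples here, and the `CustomAssertions`/`assert_between` names are made up for the sketch), the same idea in `unittest` looks like this:

```python
import unittest

class CustomAssertions:
    """Hypothetical mixin: a user-defined assertion with an optional
    failure message, mirroring Minitest's `[msg]` parameter."""

    def assert_between(self, value, low, high, msg=None):
        # Raise AssertionError on failure, preferring the caller's message.
        if not (low <= value <= high):
            raise AssertionError(msg or f"{value!r} is not in [{low!r}, {high!r}]")

class RangeTest(CustomAssertions, unittest.TestCase):
    def test_in_range(self):
        self.assert_between(5, 1, 10, "5 should be within range")
```

Rails' own specialized assertions (e.g. `assert_difference`) are layered on the base assertion set in essentially this way: plain methods that raise on failure.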
Python
Python
extend layer api to multi inputs / outputs
eac3bf8b587936782952539bafc6dfb1c9d49fa2
<ide><path>keras/layers/containers.py <ide> def __init__(self): <ide> self.constraints = [] <ide> self.updates = [] <ide> <del> def set_previous(self, layer): <del> if len(self.inputs) != 1 or len(self.outputs) != 1: <del> raise Exception('The Graph container can only be used as a layer \ <del> when it has exactly one input and one output.') <del> self.inputs[self.input_order[0]].set_previous(layer) <add> @property <add> def nb_input(self): <add> return len(self.inputs) <add> <add> @property <add> def nb_output(self): <add> return len(self.outputs) <add> <add> def set_previous(self, layer, connection_map={}): <add> if self.nb_input != layer.nb_output: <add> raise Exception('Cannot connect layers: input count does not match output count.') <add> if self.nb_input == 1: <add> self.inputs[self.input_order[0]].set_previous(layer) <add> else: <add> if not connection_map: <add> raise Exception('Cannot attach multi-input layer: no connection_map provided.') <add> for k, v in connection_map.items(): <add> if k in self.inputs and v in layer.outputs: <add> self.inputs[k].set_previous(layer.outputs[v]) <add> else: <add> raise Exception('Invalid connection map.') <ide> <ide> def get_input(self, train=False): <ide> if len(self.inputs) != 1 or len(self.outputs) != 1: <ide><path>keras/layers/core.py <ide> def __init__(self): <ide> def init_updates(self): <ide> self.updates = [] <ide> <del> def set_previous(self, layer): <add> def set_previous(self, layer, connection_map={}): <add> assert self.nb_input == layer.nb_output == 1, "Cannot connect layers: input count and output count should be 1."
<ide> if not self.supports_masked_input() and layer.get_output_mask() is not None: <del> raise Exception("Attached non-masking layer to layer with masked output") <add> raise Exception("Cannot connect non-masking layer to layer with masked output") <ide> self.previous = layer <ide> <add> @property <add> def nb_input(self): <add> return 1 <add> <add> @property <add> def nb_output(self): <add> return 1 <add> <ide> def get_output(self, train=False): <ide> return self.get_input(train) <ide>
2
Text
Text
add some extra details to webhook docs
b4b899264ed892818ef31c3626acad8fb110aabb
<ide><path>docs/sources/docker-hub/builds.md <ide> repo on the Docker Hub. <ide> ### Webhooks <ide> <ide> Automated Builds also include a Webhooks feature. Webhooks can be called <del>after a successful repository push is made. <add>after a successful repository push is made. This includes when a new tag is added <add>to an existing image. <ide> <ide> The webhook call will generate a HTTP POST with the following JSON <ide> payload: <ide> <ide> ``` <ide> { <del> "push_data":{ <del> "pushed_at":1385141110, <del> "images":[ <del> "imagehash1", <del> "imagehash2", <del> "imagehash3" <del> ], <del> "pusher":"username" <del> }, <del> "repository":{ <del> "status":"Active", <del> "description":"my docker repo that does cool things", <del> "is_automated":false, <del> "full_description":"This is my full description", <del> "repo_url":"https://registry.hub.docker.com/u/username/reponame/", <del> "owner":"username", <del> "is_official":false, <del> "is_private":false, <del> "name":"reponame", <del> "namespace":"username", <del> "star_count":1, <del> "comment_count":1, <del> "date_created":1370174400, <del> "dockerfile":"my full dockerfile is listed here", <del> "repo_name":"username/reponame" <del> } <add> "callback_url": "https://registry.hub.docker.com/u/svendowideit/testhook/hook/2141b5bi5i5b02bec211i4eeih0242eg11000a/", <add> "push_data": { <add> "images": [], <add> "pushed_at": 1.417566161e+09, <add> "pusher": "trustedbuilder" <add> }, <add> "repository": { <add> "comment_count": 0, <add> "date_created": 1.417494799e+09, <add> "description": "", <add> "dockerfile": "#\n# BUILD\u0009\u0009docker build -t svendowideit/apt-cacher .\n# RUN\u0009\u0009docker run -d -p 3142:3142 -name apt-cacher-run apt-cacher\n#\n# and then you can run containers with:\n# \u0009\u0009docker run -t -i -rm -e http_proxy http://192.168.1.2:3142/ debian 
bash\n#\nFROM\u0009\u0009ubuntu\nMAINTAINER\u0009SvenDowideit@home.org.au\n\n\nVOLUME\u0009\u0009[\"/var/cache/apt-cacher-ng\"]\nRUN\u0009\u0009apt-get update ; apt-get install -yq apt-cacher-ng\n\nEXPOSE \u0009\u00093142\nCMD\u0009\u0009chmod 777 /var/cache/apt-cacher-ng ; /etc/init.d/apt-cacher-ng start ; tail -f /var/log/apt-cacher-ng/*\n", <add> "full_description": "Docker Hub based automated build from a GitHub repo", <add> "is_official": false, <add> "is_private": true, <add> "is_trusted": true, <add> "name": "testhook", <add> "namespace": "svendowideit", <add> "owner": "svendowideit", <add> "repo_name": "svendowideit/testhook", <add> "repo_url": "https://registry.hub.docker.com/u/svendowideit/testhook/", <add> "star_count": 0, <add> "status": "Active" <add> } <ide> } <ide> ``` <ide> <del>Webhooks are available under the Settings menu of each Automated <del>Build's repo. <add>Webhooks are available under the Settings menu of each Repository. <ide> <ide> > **Note:** If you want to test your webhook out we recommend using <ide> > a tool like [requestb.in](http://requestb.in/). <ide> <add>### Webhook chains <add> <add>Webhook chains allow you to chain calls to multiple services. For example, <add>you can use this to trigger a deployment of your container only after <add>it has been successfully tested, then update a separate Changelog once the <add>deployment is complete. <add>After clicking the "Add webhook" button, simply add as many URLs as necessary <add>in your chain. <add> <add>The first webhook in a chain will be called after a successful push. Subsequent <add>URLs will be contacted after the callback has been validated. <add> <add>#### Validating a callback <add> <add>In order to validate a callback in a webhook chain, you need to <add> <add>1. Retrieve the `callback_url` value in the request's JSON payload. <add>1. Send a POST request to this URL containing a valid JSON body. 
<add> <add>> **Note**: A chain request will only be considered complete once the last <add>> callback has been validated. <add> <add>To help you debug or simply view the results of your webhook(s), <add>view the "History" of the webhook available on its settings page. <add> <add>#### Callback JSON data <add> <add>The following parameters are recognized in callback data: <add> <add>* `state` (required): Accepted values are `success`, `failure` and `error`. <add> If the state isn't `success`, the webhook chain will be interrupted. <add>* `description`: A string containing miscellaneous information that will be <add> available on the Docker Hub. Maximum 255 characters. <add>* `context`: A string containing the context of the operation. Can be retrieved <add> from the Docker Hub. Maximum 100 characters. <add>* `target_url`: The URL where the results of the operation can be found. Can be <add> retrieved on the Docker Hub. <add> <add>*Example callback payload:* <add> <add> { <add> "state": "success", <add> "description": "387 tests PASSED", <add> "context": "Continuous integration by Acme CI", <add> "target_url": "http://ci.acme.com/results/afd339c1c3d27" <add> } <ide> <ide> ### Repository links <ide> <ide><path>docs/sources/docker-hub/repos.md <ide> similar to the example shown below. 
<ide> <ide> *Example webhook JSON payload:* <ide> <del> { <del> "push_data":{ <del> "pushed_at":1385141110, <del> "images":[ <del> "imagehash1", <del> "imagehash2", <del> "imagehash3" <del> ], <del> "pusher":"username" <del> }, <del> "repository":{ <del> "status":"Active", <del> "description":"my docker repo that does cool things", <del> "is_automated":false, <del> "full_description":"This is my full description", <del> "repo_url":"https://registry.hub.docker.com/u/username/reponame/", <del> "owner":"username", <del> "is_official":false, <del> "is_private":false, <del> "name":"reponame", <del> "namespace":"username", <del> "star_count":1, <del> "comment_count":1, <del> "date_created":1370174400, <del> "dockerfile":"my full dockerfile is listed here", <del> "repo_name":"username/reponame" <del> } <del> } <add>``` <add>{ <add> "callback_url": "https://registry.hub.docker.com/u/svendowideit/busybox/hook/2141bc0cdec4hebec411i4c1g40242eg110020/", <add> "push_data": { <add> "images": [], <add> "pushed_at": 1.417566822e+09, <add> "pusher": "svendowideit" <add> }, <add> "repository": { <add> "comment_count": 0, <add> "date_created": 1.417566665e+09, <add> "description": "", <add> "full_description": "webhook triggered from a 'docker push'", <add> "is_official": false, <add> "is_private": false, <add> "is_trusted": false, <add> "name": "busybox", <add> "namespace": "svendowideit", <add> "owner": "svendowideit", <add> "repo_name": "svendowideit/busybox", <add> "repo_url": "https://registry.hub.docker.com/u/svendowideit/busybox/", <add> "star_count": 0, <add> "status": "Active" <add> } <add>} <add>``` <ide> <ide> Webhooks allow you to notify people, services and other applications of <ide> new updates to your images and repositories. To get started adding webhooks, <ide> deployment is complete. <ide> After clicking the "Add webhook" button, simply add as many URLs as necessary <ide> in your chain. <ide> <del>The first webhook in a chain will be called after a successful push.
Subsequent URLs will be contacted after the callback has been validated. <add>The first webhook in a chain will be called after a successful push. Subsequent <add>URLs will be contacted after the callback has been validated. <ide> <ide> #### Validating a callback <ide> <ide> view the "History" of the webhook available on its settings page. <ide> <ide> The following parameters are recognized in callback data: <ide> <del>* `state` (required): Accepted values are `success`, `failure` and `error`. If the state isn't `success`, the webhook chain will be interrupted. <del>* `description`: A string containing miscellaneous information that will be available on the Docker Hub. Maximum 255 characters. <del>* `context`: A string containing the context of the operation. Can be retrieved on the Docker Hub. Maximum 100 characters. <del>* `target_url`: The URL where the results of the operation can be found. Can be retrieved on the Docker Hub. <add>* `state` (required): Accepted values are `success`, `failure` and `error`. <add> If the state isn't `success`, the webhook chain will be interrupted. <add>* `description`: A string containing miscellaneous information that will be <add> available on the Docker Hub. Maximum 255 characters. <add>* `context`: A string containing the context of the operation. Can be retrieved <add> from the Docker Hub. Maximum 100 characters. <add>* `target_url`: The URL where the results of the operation can be found. Can be <add> retrieved on the Docker Hub. <ide> <ide> *Example callback payload:* <ide>
2
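The callback data described in this record (`state`, `description`, `context`, `target_url`, with length caps) is easy to get wrong when building a chain step by hand. A minimal sketch of a payload builder, assuming only the field names and limits quoted in the patch (the helper name is invented):

```python
import json

VALID_STATES = ("success", "failure", "error")

def build_callback_payload(state, description="", context="", target_url=""):
    """Build the JSON body POSTed back to a webhook chain's `callback_url`.
    Field names and length limits are taken from the docs in the patch."""
    if state not in VALID_STATES:
        raise ValueError("state must be one of %r" % (VALID_STATES,))
    return json.dumps({
        "state": state,
        "description": description[:255],  # Docker Hub caps this at 255 chars
        "context": context[:100],          # ...and this at 100 chars
        "target_url": target_url,
    })
```

A chain step would POST this body to the `callback_url` found in the incoming payload; per the docs, any state other than `success` interrupts the chain.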
Javascript
Javascript
add todo comments
41343d1763d0a2088c43d0ffba775b9a886c2216
<ide><path>lib/Compilation.js <ide> class Compilation extends Tapable { <ide> this.dependencyFactories = new Map(); <ide> /** @type {Map<DepConstructor|string, DependencyTemplate|string>} */ <ide> this.dependencyTemplates = new Map(); <add> // TODO refactor this in webpack 5 to a custom DependencyTemplates class with a hash property <ide> this.dependencyTemplates.set("hash", ""); <ide> this.childrenCounters = {}; <ide> /** @type {Set<number|string>} */ <ide><path>lib/NormalModule.js <ide> class NormalModule extends Module { <ide> } <ide> <ide> getHashDigest(dependencyTemplates) { <add> // TODO webpack 5 refactor <ide> let dtHash = dependencyTemplates.get("hash"); <ide> return `${this.hash}-${dtHash}`; <ide> } <ide><path>lib/optimize/ConcatenatedModule.js <ide> class ConcatenatedModule extends Module { <ide> <ide> // Must use full identifier in our cache here to ensure that the source <ide> // is updated should our dependencies list change. <add> // TODO webpack 5 refactor <ide> innerDependencyTemplates.set( <ide> "hash", <ide> innerDependencyTemplates.get("hash") + this.identifier()
3
Java
Java
fix warnings from the sniff task
cb40908a7d3c6fb330d3813692f7679501d9a54b
<ide><path>spring-websocket/src/main/java/org/springframework/web/socket/config/WebSocketNamespaceUtils.java <ide> package org.springframework.web.socket.config; <ide> <ide> import org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler; <add>import org.springframework.util.ClassUtils; <add>import org.springframework.web.socket.config.annotation.WebSocketConfigurationSupport; <ide> import org.w3c.dom.Element; <ide> <ide> import org.springframework.beans.factory.config.BeanDefinition; <ide> */ <ide> class WebSocketNamespaceUtils { <ide> <add> // Check for setRemoveOnCancelPolicy method - available on JDK 7 and higher <add> private static boolean hasRemoveOnCancelPolicyMethod = ClassUtils.hasMethod( <add> WebSocketConfigurationSupport.class, "setRemoveOnCancelPolicy", boolean.class); <add> <add> <ide> public static RuntimeBeanReference registerHandshakeHandler(Element element, ParserContext parserContext, Object source) { <ide> RuntimeBeanReference handlerRef; <ide> Element handlerElem = DomUtils.getChildElementByTagName(element, "handshake-handler"); <ide> private static RuntimeBeanReference registerSockJsTaskScheduler(String scheduler <ide> taskSchedulerDef.setRole(BeanDefinition.ROLE_INFRASTRUCTURE); <ide> taskSchedulerDef.getPropertyValues().add("poolSize", Runtime.getRuntime().availableProcessors()); <ide> taskSchedulerDef.getPropertyValues().add("threadNamePrefix", schedulerName + "-"); <del> taskSchedulerDef.getPropertyValues().add("removeOnCancelPolicy", true); <add> if (hasRemoveOnCancelPolicyMethod) { <add> taskSchedulerDef.getPropertyValues().add("removeOnCancelPolicy", true); <add> } <ide> parserContext.getRegistry().registerBeanDefinition(schedulerName, taskSchedulerDef); <ide> parserContext.registerComponent(new BeanComponentDefinition(taskSchedulerDef, schedulerName)); <ide> } <ide><path>spring-websocket/src/main/java/org/springframework/web/socket/config/annotation/WebSocketConfigurationSupport.java <ide> <ide> import 
org.springframework.context.annotation.Bean; <ide> import org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler; <add>import org.springframework.util.ClassUtils; <ide> import org.springframework.web.servlet.HandlerMapping; <ide> <ide> /** <ide> */ <ide> public class WebSocketConfigurationSupport { <ide> <add> // Check for setRemoveOnCancelPolicy method - available on JDK 7 and higher <add> private static boolean hasRemoveOnCancelPolicyMethod = ClassUtils.hasMethod( <add> WebSocketConfigurationSupport.class, "setRemoveOnCancelPolicy", boolean.class); <add> <add> <ide> @Bean <ide> public HandlerMapping webSocketHandlerMapping() { <ide> ServletWebSocketHandlerRegistry registry = new ServletWebSocketHandlerRegistry(defaultSockJsTaskScheduler()); <ide> public ThreadPoolTaskScheduler defaultSockJsTaskScheduler() { <ide> ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler(); <ide> scheduler.setThreadNamePrefix("SockJS-"); <ide> scheduler.setPoolSize(Runtime.getRuntime().availableProcessors()); <del> scheduler.setRemoveOnCancelPolicy(true); <add> if (hasRemoveOnCancelPolicyMethod) { <add> scheduler.setRemoveOnCancelPolicy(true); <add> } <ide> return scheduler; <ide> } <ide> <ide><path>spring-websocket/src/main/java/org/springframework/web/socket/config/annotation/WebSocketMessageBrokerConfigurationSupport.java <ide> import org.springframework.messaging.simp.SimpSessionScope; <ide> import org.springframework.messaging.simp.config.AbstractMessageBrokerConfiguration; <ide> import org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler; <add>import org.springframework.util.ClassUtils; <ide> import org.springframework.web.servlet.HandlerMapping; <ide> import org.springframework.web.socket.WebSocketHandler; <ide> import org.springframework.web.socket.messaging.SubProtocolWebSocketHandler; <ide> */ <ide> public abstract class WebSocketMessageBrokerConfigurationSupport extends AbstractMessageBrokerConfiguration { <ide> <add> // Check for 
setRemoveOnCancelPolicy method - available on JDK 7 and higher <add> private static boolean hasRemoveOnCancelPolicyMethod = ClassUtils.hasMethod( <add> WebSocketConfigurationSupport.class, "setRemoveOnCancelPolicy", boolean.class); <add> <ide> private WebSocketTransportRegistration transportRegistration; <ide> <ide> <ide> public ThreadPoolTaskScheduler messageBrokerSockJsTaskScheduler() { <ide> ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler(); <ide> scheduler.setThreadNamePrefix("MessageBrokerSockJS-"); <ide> scheduler.setPoolSize(Runtime.getRuntime().availableProcessors()); <del> scheduler.setRemoveOnCancelPolicy(true); <add> if (hasRemoveOnCancelPolicyMethod) { <add> scheduler.setRemoveOnCancelPolicy(true); <add> } <ide> return scheduler; <ide> } <ide>
3
Javascript
Javascript
fix the argument order in `assert.strictequal`
b4af0c49eaa2f479476f021c5ff8dbd7a05fe38b
<ide><path>test/parallel/test-buffer-alloc.js <ide> assert.strictEqual((Buffer.from('Man')).toString('base64'), 'TWFu'); <ide> 'dWVkIGFuZCBpbmRlZmF0aWdhYmxlIGdlbmVyYXRpb24gb2Yga25vd2xlZ' + <ide> 'GdlLCBleGNlZWRzIHRoZSBzaG9ydCB2ZWhlbWVuY2Ugb2YgYW55IGNhcm' + <ide> '5hbCBwbGVhc3VyZS4='; <del> assert.strictEqual(expected, (Buffer.from(quote)).toString('base64')); <add> assert.strictEqual((Buffer.from(quote)).toString('base64'), expected); <ide> <ide> let b = Buffer.allocUnsafe(1024); <ide> let bytesWritten = b.write(expected, 0, 'base64');
1
Ruby
Ruby
calculate model class on construction
0ebbf6be8a5aa9b5c817b3ee0dfdd8135c525ac7
<ide><path>activerecord/lib/active_record/fixtures.rb <ide> def self.identify(label) <ide> Zlib.crc32(label.to_s) % MAX_ID <ide> end <ide> <del> attr_reader :table_name, :name, :fixtures <add> attr_reader :table_name, :name, :fixtures, :model_class <ide> <ide> def initialize(connection, table_name, class_name, fixture_path, file_filter = DEFAULT_FILTER_RE) <ide> @connection = connection <ide> def initialize(connection, table_name, class_name, fixture_path, file_filter = D <ide> <ide> @fixtures = ActiveSupport::OrderedHash.new <ide> @table_name = "#{ActiveRecord::Base.table_name_prefix}#{@table_name}#{ActiveRecord::Base.table_name_suffix}" <del> @table_name = class_name.table_name if class_name.respond_to?(:table_name) <del> @connection = class_name.connection if class_name.respond_to?(:connection) <add> <add> # Should be an AR::Base type class <add> if class_name.is_a?(Class) <add> @table_name = class_name.table_name <add> @connection = class_name.connection <add> @model_class = class_name <add> else <add> @model_class = class_name.constantize rescue nil <add> end <add> <ide> read_fixture_files <ide> end <ide> <ide> class HabtmFixtures < ::Fixtures #:nodoc: <ide> def read_fixture_files; end <ide> end <ide> <del> def model_class <del> unless defined?(@model_class) <del> @model_class = <del> if @class_name.nil? || @class_name.is_a?(Class) <del> @class_name <del> else <del> @class_name.constantize rescue nil <del> end <del> end <del> <del> @model_class <del> end <del> <ide> def primary_key_name <ide> @primary_key_name ||= model_class && model_class.primary_key <ide> end
1
PHP
PHP
replace spaces with tabs
c35c9957a276960254274929bc966ec00a0ee18f
<ide><path>tests/Database/DatabaseQueryBuilderMemoryLeakTest.php <ide> <ide> class DatabaseJoinMemoryLeakTest extends PHPUnit_Framework_TestCase { <ide> <del> public function tearDown() <del> { <del> m::close(); <del> } <add> public function tearDown() <add> { <add> m::close(); <add> } <ide> <del> public function testItDoesNotLeakMemoryOnNewQuery() <del> { <del> $builderMain = $this->getBuilder(); <add> public function testItDoesNotLeakMemoryOnNewQuery() <add> { <add> $builderMain = $this->getBuilder(); <ide> <del> $this->runMemoryTest(function() use($builderMain){ <del> $builder = $builderMain->newQuery(); <del> $builder->select('*')->from('users'); <add> $this->runMemoryTest(function() use($builderMain){ <add> $builder = $builderMain->newQuery(); <add> $builder->select('*')->from('users'); <ide> <del> }); <del> } <add> }); <add> } <ide> <del> public function testItDoesNotLeakMemoryOnNewQueryWithJoin() <del> { <del> $builderMain = $this->getBuilder(); <add> public function testItDoesNotLeakMemoryOnNewQueryWithJoin() <add> { <add> $builderMain = $this->getBuilder(); <ide> <del> $this->runMemoryTest(function() use($builderMain){ <del> $builder = $builderMain->newQuery(); <del> $builder->select('*')->join('new', 'col', '=', 'col2')->from('users'); <add> $this->runMemoryTest(function() use($builderMain){ <add> $builder = $builderMain->newQuery(); <add> $builder->select('*')->join('new', 'col', '=', 'col2')->from('users'); <ide> <del> }); <del> } <add> }); <add> } <ide> <del> protected function runMemoryTest(\Closure $callback) <del> { <del> $i = 5; <add> protected function runMemoryTest(\Closure $callback) <add> { <add> $i = 5; <ide> <del> $last = null; <add> $last = null; <ide> <del> gc_collect_cycles(); <add> gc_collect_cycles(); <ide> <del> while($i--) <del> { <del> $callback(); <add> while($i--) <add> { <add> $callback(); <ide> <del> $prev = $last; <del> $last = memory_get_usage(); <del> } <add> $prev = $last; <add> $last = memory_get_usage(); <add> } <ide> <del> 
$this->assertEquals($prev, $last); <del> } <add> $this->assertEquals($prev, $last); <add> } <ide> <ide> <del> protected function getBuilder() <del> { <del> $grammar = new Illuminate\Database\Query\Grammars\SqlServerGrammar; <del> $processor = m::mock('Illuminate\Database\Query\Processors\Processor'); <del> return new Builder(m::mock('Illuminate\Database\ConnectionInterface'), $grammar, $processor); <del> } <add> protected function getBuilder() <add> { <add> $grammar = new Illuminate\Database\Query\Grammars\SqlServerGrammar; <add> $processor = m::mock('Illuminate\Database\Query\Processors\Processor'); <add> return new Builder(m::mock('Illuminate\Database\ConnectionInterface'), $grammar, $processor); <add> } <ide> <ide> }
1
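The Laravel test in this record encodes a useful pattern: run the operation several times, snapshot memory after each run, and assert the last two snapshots match. A rough Python equivalent using `tracemalloc` (the helper name and default tolerance are my own; a small tolerance replaces the PHP test's strict equality because allocator noise can make byte-exact comparison brittle):

```python
import gc
import tracemalloc

def assert_no_leak(callback, iterations=5, tolerance=256):
    """Run `callback` repeatedly and check that traced memory stabilizes,
    mirroring the runMemoryTest() helper in the patch above."""
    callback()                 # warm-up, so one-time caches are not counted
    gc.collect()
    tracemalloc.start()
    prev = last = None
    for _ in range(iterations):
        callback()
        gc.collect()
        prev = last
        last = tracemalloc.get_traced_memory()[0]  # currently allocated bytes
    tracemalloc.stop()
    assert abs(last - prev) <= tolerance, f"memory grew: {prev} -> {last}"

# A leak-free operation passes: the temporary list is freed each iteration.
assert_no_leak(lambda: [i * i for i in range(1000)])
```

As in the PHP version, comparing the last two snapshots (rather than first and last) ignores any one-time allocation that happens on the first pass.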
Text
Text
add symlink information for process.execpath
8c177c467ac42dc8def8c90a3104dd0a80369c70
<ide><path>doc/api/process.md <ide> added: v0.1.100 <ide> * {string} <ide> <ide> The `process.execPath` property returns the absolute pathname of the executable <del>that started the Node.js process. <add>that started the Node.js process. Symbolic links, if any, are resolved. <ide> <ide> <!-- eslint-disable semi --> <ide> ```js
1
Javascript
Javascript
expose the actual transformer in the config
e4621f4ce142c23067ef5ab83465c01ebce34ff4
<ide><path>local-cli/bundle/buildBundle.js <ide> async function buildBundle( <ide> sourceMapUrl = path.basename(sourceMapUrl); <ide> } <ide> <del> config.transformModulePath = args.transformer <add> config.transformerPath = args.transformer <ide> ? path.resolve(args.transformer) <del> : config.transformModulePath; <add> : config.transformerPath; <ide> <ide> const requestOpts: RequestOptions = { <ide> entryFile: args.entryFile, <ide><path>local-cli/dependencies/dependencies.js <ide> async function dependencies(argv, configPromise, args, packagerInstance) { <ide> } <ide> <ide> config.cacheStores = []; <del> config.transformModulePath = args.transformer <del> ? path.resolve(args.transformer) <del> : config.transformModulePath; <add> if (args.transformer) { <add> config.transformer.babelTransformerPath = path.resolve(args.transformer); <add> } <ide> <ide> const relativePath = path.relative( <ide> config.projectRoot, <ide><path>local-cli/util/Config.js <ide> const Config = { <ide> ], <ide> getPolyfills, <ide> }, <add> transformer: { <add> babelTransformerPath: require.resolve('metro/src/reactNativeTransformer'), <add> }, <ide> watchFolders: getWatchFolders(), <del> transformModulePath: require.resolve('metro/src/reactNativeTransformer'), <ide> }, <ide> <ide> async load(configFile: ?string): Promise<ConfigT> {
3
Javascript
Javascript
fix error fonts (#826)
3813f73ece6d5e541dd59d751d430a0963a5970c
<ide><path>lib/error-debug.js <ide> const styles = { <ide> }, <ide> <ide> message: { <del> fontFamily: '"SF Mono", "Roboto Mono", "Fira Mono", menlo-regular, monospace', <del> fontSize: '10px', <add> fontFamily: '"SF Mono", "Roboto Mono", "Fira Mono", consolas, menlo-regular, monospace', <add> fontSize: '14px', <ide> color: '#fbe7f1', <ide> margin: 0, <ide> whiteSpace: 'pre-wrap', <ide> const styles = { <ide> <ide> heading: { <ide> fontFamily: '-apple-system, BlinkMacSystemFont, Roboto, "Segoe UI", "Fira Sans", Avenir, "Helvetica Neue", "Lucida Grande", sans-serif', <del> fontSize: '13px', <add> fontSize: '16px', <ide> fontWeight: 'bold', <ide> color: '#ff84bf', <ide> marginBottom: '20px'
1
Java
Java
remove reacttextviewmanager subclasses
bb460468a4bff2b6cdab7f8d65d739654ed456d2
<ide><path>ReactAndroid/src/main/java/com/facebook/react/views/text/ReactTextViewManager.java <ide> protected @Nullable ReactTextViewManagerCallback mReactTextViewManagerCallback; <ide> <ide> public ReactTextViewManager() { <del> super(); <add> this(null); <add> } <ide> <add> public ReactTextViewManager(@Nullable ReactTextViewManagerCallback reactTextViewManagerCallback) { <add> mReactTextViewManagerCallback = reactTextViewManagerCallback; <ide> setupViewRecycling(); <ide> } <ide>
1
PHP
PHP
remove unused route
f17ac36e6a3ea284aee6344e2432205b0ffe08e7
<ide><path>app/Http/Middleware/AuthMiddleware.php <ide> <?php namespace App\Http\Middleware; <ide> <ide> use Closure; <del>use Illuminate\Routing\Route; <ide> use Illuminate\Contracts\Auth\Authenticator; <ide> use Illuminate\Contracts\Routing\Middleware; <ide> use Illuminate\Contracts\Routing\ResponseFactory;
1
Javascript
Javascript
remove obsolete tests with new tree-walking
5becbe3f08f5286b8db772519144bf9f1e832d2b
<ide><path>src/renderers/dom/client/__tests__/ReactMount-test.js <ide> describe('ReactMount', function() { <ide> }); <ide> } <ide> <del> it('warns when using two copies of React before throwing', function() { <del> jest.resetModuleRegistry(); <del> var RD1 = require('ReactDOM'); <del> jest.resetModuleRegistry(); <del> var RD2 = require('ReactDOM'); <del> <del> var X = React.createClass({ <del> render: function() { <del> return <div />; <del> }, <del> }); <del> <del> var container = document.createElement('div'); <del> spyOn(console, 'error'); <del> var component = RD1.render(<X />, container); <del> expect(console.error.argsForCall.length).toBe(0); <del> <del> // This fails but logs a warning first <del> expect(function() { <del> RD2.findDOMNode(component); <del> }).toThrow(); <del> expect(console.error.argsForCall.length).toBe(1); <del> expect(console.error.argsForCall[0][0]).toContain('two copies of React'); <del> }); <del> <ide> it('should warn if render removes React-rendered children', function() { <ide> var container = document.createElement('container'); <ide> var Component = React.createClass({ <ide><path>src/renderers/shared/reconciler/__tests__/ReactMultiChild-test.js <ide> describe('ReactMultiChild', function() { <ide> expect(mockUnmount.mock.calls.length).toBe(1); <ide> }); <ide> }); <del> <del> describe('innerHTML', function() { <del> var setInnerHTML; <del> <del> // Only run this suite if `Element.prototype.innerHTML` can be spied on. 
<del> var innerHTMLDescriptor = Object.getOwnPropertyDescriptor( <del> Element.prototype, <del> 'innerHTML' <del> ); <del> if (!innerHTMLDescriptor) { <del> return; <del> } <del> <del> beforeEach(function() { <del> var ReactDOMFeatureFlags = require('ReactDOMFeatureFlags'); <del> ReactDOMFeatureFlags.useCreateElement = false; <del> <del> Object.defineProperty(Element.prototype, 'innerHTML', { <del> set: setInnerHTML = jasmine.createSpy().andCallFake( <del> innerHTMLDescriptor.set <del> ), <del> }); <del> }); <del> <del> it('should only set `innerHTML` once on update', function() { <del> var container = document.createElement('div'); <del> <del> ReactDOM.render( <del> <div> <del> <p><span /></p> <del> <p><span /></p> <del> <p><span /></p> <del> </div>, <del> container <del> ); <del> // Warm the cache used by `getMarkupWrap`. <del> ReactDOM.render( <del> <div> <del> <p><span /><span /></p> <del> <p><span /><span /></p> <del> <p><span /><span /></p> <del> </div>, <del> container <del> ); <del> expect(setInnerHTML).toHaveBeenCalled(); <del> var callCountOnMount = setInnerHTML.calls.length; <del> <del> ReactDOM.render( <del> <div> <del> <p><span /><span /><span /></p> <del> <p><span /><span /><span /></p> <del> <p><span /><span /><span /></p> <del> </div>, <del> container <del> ); <del> expect(setInnerHTML.calls.length).toBe(callCountOnMount + 1); <del> }); <del> }); <ide> });
2
PHP
PHP
apply changes from feedback
e7e2e2954d12041bd8642c19154fe7bdfa61e30c
<ide><path>src/TestSuite/HttpClientTrait.php <ide> public function cleanupMockResponses(): void <ide> * @param string $body The body for the response. <ide> * @return \Cake\Http\Client\Response <ide> */ <del> public function newResponse(int $code = 200, array $headers = [], string $body = ''): Response <add> public function newClientResponse(int $code = 200, array $headers = [], string $body = ''): Response <ide> { <ide> $headers = array_merge(["HTTP/1.1 {$code}"], $headers); <ide> <ide><path>tests/TestCase/TestSuite/HttpClientTraitTest.php <ide> class HttpClientTraitTest extends TestCase <ide> /** <ide> * Provider for http methods. <ide> * <del> * @return void <add> * @return array<array> <ide> */ <ide> public static function methodProvider(): array <ide> { <ide> public function testRequestMethods(string $httpMethod) <ide> $traitMethod = "mockClient{$httpMethod}"; <ide> $clientMethod = strtolower($httpMethod); <ide> <del> $response = $this->newResponse(200, ['Content-Type: application/json'], '{"ok":true}'); <add> $response = $this->newClientResponse(200, ['Content-Type: application/json'], '{"ok":true}'); <ide> $this->{$traitMethod}('http://example.com', $response); <ide> <ide> $client = new Client();
2
Python
Python
use unicode.translate to speed up js escaping
44767f2caf028d89e1a283d04bb552d0e18bb936
<ide><path>django/utils/html.py <ide> def escape(text): <ide> return mark_safe(force_text(text).replace('&', '&amp;').replace('<', '&lt;').replace('>', '&gt;').replace('"', '&quot;').replace("'", '&#39;')) <ide> escape = allow_lazy(escape, six.text_type) <ide> <del>_base_js_escapes = ( <del> ('\\', '\\u005C'), <del> ('\'', '\\u0027'), <del> ('"', '\\u0022'), <del> ('>', '\\u003E'), <del> ('<', '\\u003C'), <del> ('&', '\\u0026'), <del> ('=', '\\u003D'), <del> ('-', '\\u002D'), <del> (';', '\\u003B'), <del> ('\u2028', '\\u2028'), <del> ('\u2029', '\\u2029') <del>) <add>_js_escapes = { <add> ord('\\'): '\\u005C', <add> ord('\''): '\\u0027', <add> ord('"'): '\\u0022', <add> ord('>'): '\\u003E', <add> ord('<'): '\\u003C', <add> ord('&'): '\\u0026', <add> ord('='): '\\u003D', <add> ord('-'): '\\u002D', <add> ord(';'): '\\u003B', <add> ord('\u2028'): '\\u2028', <add> ord('\u2029'): '\\u2029' <add>} <ide> <ide> # Escape every ASCII character with a value less than 32. <del>_js_escapes = (_base_js_escapes + <del> tuple([('%c' % z, '\\u%04X' % z) for z in range(32)])) <add>_js_escapes.update((ord('%c' % z), '\\u%04X' % z) for z in range(32)) <ide> <ide> def escapejs(value): <ide> """Hex encodes characters for use in JavaScript strings.""" <del> for bad, good in _js_escapes: <del> value = mark_safe(force_text(value).replace(bad, good)) <del> return value <add> return mark_safe(force_text(value).translate(_js_escapes)) <ide> escapejs = allow_lazy(escapejs, six.text_type) <ide> <ide> def conditional_escape(text):
1
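The Django patch above replaces a chain of per-pair `str.replace` calls with a single `unicode.translate` pass over a code-point map, so the input is walked once regardless of how many escape rules exist. A minimal standalone sketch of the same technique (abridged escape map, without Django's `mark_safe`/`allow_lazy` wrappers):

```python
# Translation table: maps code points (ints) to replacement strings.
# str.translate walks the input once, so cost no longer scales with
# the number of escape rules, unlike chained str.replace calls.
_js_escapes = {
    ord('\\'): '\\u005C',
    ord('\''): '\\u0027',
    ord('"'): '\\u0022',
    ord('<'): '\\u003C',
    ord('>'): '\\u003E',
    ord('&'): '\\u0026',
}
# Escape every ASCII control character (value < 32) as well.
_js_escapes.update((z, '\\u%04X' % z) for z in range(32))

def escapejs(value):
    """Hex-encode characters for safe use inside JavaScript strings."""
    return str(value).translate(_js_escapes)
```

The map keys must be integers (`ord(...)`), which is why the patch converts the old string pairs to `ord(...)` keys.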
PHP
PHP
fix cs error
df161adb4e98d4645cfa0ce40e656c0aa5c45b5a
<ide><path>tests/TestCase/Command/RoutesCommandTest.php <ide> public function testCheckHelp() <ide> $this->assertErrorEmpty(); <ide> } <ide> <del> <ide> /** <ide> * Ensure routes check with no input <ide> *
1
Javascript
Javascript
set default easing using jquery.easing._default
5f2ea402582c7b8f4773771e1529d60587f3435e
<ide><path>src/effects.js <ide> function Animation( elem, properties, options ) { <ide> animation = deferred.promise({ <ide> elem: elem, <ide> props: jQuery.extend( {}, properties ), <del> opts: jQuery.extend( true, { specialEasing: {} }, options ), <add> opts: jQuery.extend( true, { <add> specialEasing: {}, <add> easing: jQuery.easing._default <add> }, options ), <ide> originalProperties: properties, <ide> originalOptions: options, <ide> startTime: fxNow || createFxNow(), <ide><path>src/effects/Tween.js <ide> Tween.prototype = { <ide> init: function( elem, options, prop, end, easing, unit ) { <ide> this.elem = elem; <ide> this.prop = prop; <del> this.easing = easing || "swing"; <add> this.easing = easing || jQuery.easing._default; <ide> this.options = options; <ide> this.start = this.now = this.cur(); <ide> this.end = end; <ide> jQuery.easing = { <ide> }, <ide> swing: function( p ) { <ide> return 0.5 - Math.cos( p * Math.PI ) / 2; <del> } <add> }, <add> _default: "swing" <ide> }; <ide> <ide> jQuery.fx = Tween.prototype.init; <ide><path>test/unit/effects.js <ide> test("animate with per-property easing", function(){ <ide> test("animate with CSS shorthand properties", function(){ <ide> expect(11); <ide> <del> var _default_count = 0, <del> _special_count = 0, <add> var easeAnimation_count = 0, <add> easeProperty_count = 0, <ide> propsBasic = { "padding": "10 20 30" }, <del> propsSpecial = { "padding": [ "1 2 3", "_special" ] }; <add> propsSpecial = { "padding": [ "1 2 3", "propertyScope" ] }; <ide> <del> jQuery.easing._default = function(p) { <add> jQuery.easing.animationScope = function(p) { <ide> if ( p >= 1 ) { <del> _default_count++; <add> easeAnimation_count++; <ide> } <ide> return p; <ide> }; <ide> <del> jQuery.easing._special = function(p) { <add> jQuery.easing.propertyScope = function(p) { <ide> if ( p >= 1 ) { <del> _special_count++; <add> easeProperty_count++; <ide> } <ide> return p; <ide> }; <ide> <ide> jQuery("#foo") <del> .animate( propsBasic, 200, 
"_default", function() { <add> .animate( propsBasic, 200, "animationScope", function() { <ide> equal( this.style.paddingTop, "10px", "padding-top was animated" ); <ide> equal( this.style.paddingLeft, "20px", "padding-left was animated" ); <ide> equal( this.style.paddingRight, "20px", "padding-right was animated" ); <ide> equal( this.style.paddingBottom, "30px", "padding-bottom was animated" ); <del> equal( _default_count, 4, "per-animation default easing called for each property" ); <del> _default_count = 0; <add> equal( easeAnimation_count, 4, "per-animation default easing called for each property" ); <add> easeAnimation_count = 0; <ide> }) <del> .animate( propsSpecial, 200, "_default", function() { <add> .animate( propsSpecial, 200, "animationScope", function() { <ide> equal( this.style.paddingTop, "1px", "padding-top was animated again" ); <ide> equal( this.style.paddingLeft, "2px", "padding-left was animated again" ); <ide> equal( this.style.paddingRight, "2px", "padding-right was animated again" ); <ide> equal( this.style.paddingBottom, "3px", "padding-bottom was animated again" ); <del> equal( _default_count, 0, "per-animation default easing not called" ); <del> equal( _special_count, 4, "special easing called for each property" ); <add> equal( easeAnimation_count, 0, "per-animation default easing not called" ); <add> equal( easeProperty_count, 4, "special easing called for each property" ); <ide> <ide> jQuery(this).css("padding", "0"); <del> delete jQuery.easing._default; <del> delete jQuery.easing._special; <add> delete jQuery.easing.animationScope; <add> delete jQuery.easing.propertyScope; <ide> }); <ide> this.clock.tick( 400 ); <ide> }); <ide> test( "Animation should go to its end state if document.hidden = true", 1, funct <ide> } <ide> }); <ide> <add>test( "jQuery.easing._default (#2218)", 2, function() { <add> jQuery( "#foo" ) <add> .animate({ width: "5px" }, { <add> duration: 5, <add> start: function( anim ) { <add> equal( anim.opts.easing, 
jQuery.easing._default, <add> "anim.opts.easing should be equal to jQuery.easing._default when the easing argument is not given" ); <add> } <add> }) <add> .animate({ height: "5px" }, { <add> duration: 5, <add> easing: "linear", <add> start: function( anim ) { <add> equal( anim.opts.easing, "linear", <add> "anim.opts.easing should be equal to the easing argument" ); <add> } <add> }) <add> .stop(); <add> this.clock.tick( 25 ); <add>}); <ide> <ide> })();
3
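The jQuery change above stops hard-coding `"swing"` at each call site and instead reads a configurable `jQuery.easing._default` entry from the easing registry. The fallback pattern can be sketched in Python like this (a sketch only; the real code merges the default into the animation's options object rather than resolving it per call):

```python
import math

# Easing registry modelled on jQuery.easing; the "_default" entry names
# which easing to fall back to when the caller passes none, instead of
# hard-coding "swing" at every call site.
easing = {
    "linear": lambda p: p,
    "swing": lambda p: 0.5 - math.cos(p * math.pi) / 2,
    "_default": "swing",
}

def resolve_easing(name=None):
    """Return the easing function for `name`, falling back to the
    registry-level default when none is given."""
    return easing[name or easing["_default"]]
```

Because the default lives in the registry, user code can retarget it (`easing["_default"] = "linear"`) without touching any call sites, which is exactly what the `#2218` test in the patch verifies.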
PHP
PHP
remove beta tag from front controller
13d5c301b447b2f5b6b5732c52f8789bb0e6312e
<ide><path>public/index.php <ide> * Laravel - A clean and classy framework for PHP web development. <ide> * <ide> * @package Laravel <del> * @version 2.0.0 Beta 2 <add> * @version 2.0.0 <ide> * @author Taylor Otwell <taylorotwell@gmail.com> <ide> * @link http://laravel.com <ide> */
1
Text
Text
add dana indonesia to the companies uses airflow
7c9cc41f85a909645840c7c307954b7e7420b916
<ide><path>INTHEWILD.md <ide> Currently, **officially** using Airflow: <ide> 1. [Cyscale](https://cyscale.com) [[@ocical](https://github.com/ocical)] <ide> 1. [Dailymotion](http://www.dailymotion.com/fr) [[@germaintanguy](https://github.com/germaintanguy) & [@hc](https://github.com/hc)] <ide> 1. [Danamica](https://www.danamica.dk) [[@testvinder](https://github.com/testvinder)] <add>1. [DANA](https://www.dana.id/) [[@imamdigmi](https://github.com/imamdigmi)] <ide> 1. [DataCamp](https://datacamp.com/) [[@dgrtwo](https://github.com/dgrtwo)] <ide> 1. [DataFox](https://www.datafox.com/) [[@sudowork](https://github.com/sudowork)] <ide> 1. [Datamaran](https://www.datamaran.com) [[@valexharo](https://github.com/valexharo)]
1
Mixed
Ruby
update repology parser and bump command
32656502796910d573b1788ba10ab90187abf231
<ide><path>Library/Homebrew/dev-cmd/bump.rb <ide> def bump_args <ide> <ide> def bump <ide> bump_args.parse <del> puts "command run" <del> # parse_repology_api() <add> # puts "command run" <add> outdated_repology_pacakges = RepologyParser.parse_api_response() <add> puts RepologyParser.validate__packages(outdated_repology_pacakges) <ide> end <ide> end <ide><path>Library/Homebrew/utils/livecheck.rb <ide> <ide> module Livecheck <ide> def livecheck_formula_response(formula_name) <del> puts "- livecheck formula : #{formula_name}" <add> ohai "- livecheck formula : #{formula_name}" <ide> command_args = [ <ide> "brew", <ide> "livecheck", <ide> formula_name, <del> "--quiet" <add> "--quiet", <ide> ] <ide> <ide> response = Open3.capture2e(*command_args) <ide> parse_livecheck_response(response) <ide> end <ide> <ide> def parse_livecheck_response(response) <del> output = response.first.gsub(' ', '').split(/:|==>|\n/) <add> output = response.first.delete(" ").split(/:|==>|\n/) <ide> <ide> # eg: ["burp", "2.2.18", "2.2.18"] <ide> package_name, brew_version, latest_version = output <ide> <del> {'name' => package_name, 'current_brew_version' => brew_version, 'livecheck_latest_version' => latest_version} <add> { "name" => package_name, "current_brew_version" => brew_version, <add> "livecheck_latest_version" => latest_version } <ide> end <ide> end <ide><path>Library/Homebrew/utils/repology.rb <ide> # frozen_string_literal: true <ide> <del>require "net/http" <del>require "json" <add>require "utils/curl" <add>require "formula_info" <ide> <ide> module RepologyParser <del> def call_api(url) <del> puts "- Calling API #{url}" <del> uri = URI(url) <del> response = Net::HTTP.get(uri) <add> module_function <ide> <del> puts "- Parsing response" <del> JSON.parse(response) <del> end <del> <del> def query_repology_api(last_package_in_response = "") <add> def query_api(last_package_in_response = "") <ide> url = 
"https://repology.org/api/v1/projects/#{last_package_in_response}?inrepo=homebrew&outdated=1" <add> ohai "Calling API #{url}" if Homebrew.args.verbose? <ide> <del> call_api(url) <add> output, errors, status = curl_output(url.to_s) <add> output = JSON.parse(output) <ide> end <ide> <del> def parse_repology_api <del> puts "\n-------- Query outdated packages from Repology --------" <add> def parse_api_response() <add> ohai "Querying outdated packages from Repology" <ide> page_no = 1 <del> puts "\n- Paginating repology api page: #{page_no}" <add> ohai "Paginating repology api page: #{page_no}" if Homebrew.args.verbose? <ide> <del> outdated_packages = query_repology_api("") <add> outdated_packages = query_api() <ide> last_pacakge_index = outdated_packages.size - 1 <ide> response_size = outdated_packages.size <add> page_limit = 15 <ide> <del> while response_size > 1 <add> while response_size > 1 && page_no <= page_limit <ide> page_no += 1 <del> puts "\n- Paginating repology api page: #{page_no}" <add> ohai "Paginating repology api page: #{page_no}" if Homebrew.args.verbose? 
<ide> <ide> last_package_in_response = outdated_packages.keys[last_pacakge_index] <del> response = query_repology_api("#{last_package_in_response}/") <add> response = query_api("#{last_package_in_response}/") <ide> <ide> response_size = response.size <ide> outdated_packages.merge!(response) <ide> last_pacakge_index = outdated_packages.size - 1 <ide> end <ide> <del> puts "\n- #{outdated_packages.size} outdated packages identified by repology" <add> ohai "#{outdated_packages.size} outdated packages identified" <ide> <ide> outdated_packages <ide> end <add> <add> def validate__packages(outdated_repology_packages) <add> ohai "Verifying outdated repology packages as Homebrew Formulae" <add> <add> packages = {} <add> outdated_repology_packages.each do |_name, repositories| <add> # identify homebrew repo <add> repology_homebrew_repo = repositories.find do |repo| <add> repo["repo"] == "homebrew" <add> end <add> <add> next if repology_homebrew_repo.empty? <add> latest_version = nil <add> <add> # identify latest version amongst repology repos <add> repositories.each do |repo| <add> latest_version = repo["version"] if repo["status"] == "newest" <add> end <add> <add> info = FormulaInfo.lookup(repology_homebrew_repo["srcname"]) <add> next unless info <add> current_version = info.pkg_version <add> <add> packages[repology_homebrew_repo["srcname"]] = { <add> "repology_latest_version" => latest_version, <add> "current_formula_version" => current_version.to_s <add> } <add> puts packages <add> end <add> # hash of hashes {"aacgain"=>{"repology_latest_version"=>"1.9", "current_formula_version"=>"1.8"}, ...} <add> packages <add> end <ide> end <ide><path>Library/Homebrew/utils/versions.rb <ide> def bump_formula_pr(formula_name, url) <ide> parse_formula_bump_response(response) <ide> end <ide> <del> def parse_formula_bump_response(response) <del> response, status = formula_bump_response <add> def parse_formula_bump_response(formula_bump_response) <add> response, _status = 
formula_bump_response <ide> response <ide> end <ide> <ide> def check_for_open_pr(formula_name, download_url) <del> puts "- Checking for open PRs for formula : #{formula_name}" <add> ohai "- Checking for open PRs for formula : #{formula_name}" <ide> <ide> response = bump_formula_pr(formula_name, download_url) <del> !response.include? 'Error: These open pull requests may be duplicates' <add> !response.include? "Error: These open pull requests may be duplicates" <ide> end <ide> end <ide><path>scripts/README.md <del># 0.0.1-API-Parser <del> <del>Parser for fixing this: https://github.com/Homebrew/brew/issues/5725 <del> <del>## Overview <del> <del>Homebrew is used to install software (packages). Homebrew uses 'formulae' to determine how a package is installed. <del>This project will automatically check which packages have had newer versions released, whether the package has an open PR on homebrew, and display the results. <del> <del>## High-level Solution <del> <del>- Fetch latest package version information from [repology.org](https://repology.org/) and store on file system. <del>- Fetch Homebrew Formulae information from [HomeBrew Formulae](https://formulae.brew.sh) <del>- Compare Current Homebrew Formulae version numbers and those coming from Repology's API and Livecheck. <del>- Determine whether package has open PR. <del>- Display results. <del> <del>## Details <del> <del>- This project can be run automatically at set intervals via GitHub Actions. <del>- Executing `ruby printPackageUpdates.rb` from the command line will query <del> both the Repology and Homebrew APIs. Homebrew's current version of each <del> package will be compared to the latest version of the package, per Repology's response. <del>- Homebrew's livecheck is also queried for each package, and that data is parsed, if available. <del>- Checks whether there is open PR for package. 
<del>- Each outdated package will be displayed to the console like so: <del>- Note that some packages will not be included in the Livecheck response. Those will have a 'Livecheck latest:' value of 'Not found'. <del> <del>``` <del>Package: openclonk <del>Brew current: 7.0 <del>Repology latest: 8.1 <del>Livecheck latest: 8.1 <del>Has Open PR?: true <del> <del>Package: openjdk <del>Brew current: 13.0.2+8 <del>Repology latest: 15.0.0.0~14 <del>Livecheck latest: Not found. <del>Has Open PR?: false <del> <del>Package: opentsdb <del>Brew current: 2.3.1 <del>Repology latest: 2.4.0 <del>Livecheck latest: 2.4.0 <del>Has Open PR?: true <del>``` <ide><path>scripts/bumpFormulae.rb <del>require_relative 'helpers/parsed_file' <del>require_relative 'helpers/brew_commands.rb' <add>require_relative "helpers/parsed_file" <add>require_relative "helpers/brew_commands.rb" <ide> <ide> brew_commands = BrewCommands.new <ide> <ide> puts "\n bumping package: #{line_hash['name']} formula" <ide> <ide> begin <del> bump_pr_response, bump_pr_status = brew_commands.bump_formula_pr(line_hash['name'], line_hash['download_url'], line_hash['checksum']) <add> bump_pr_response, bump_pr_status = brew_commands.bump_formula_pr(line_hash["name"], line_hash["download_url"], line_hash["checksum"]) <ide> puts "#{bump_pr_response}" <ide> rescue <del> puts "- An error occured whilst bumping package #{line_hash['name']} \n" <add> puts "- An error occured whilst bumping package #{line_hash["name"]} \n" <ide> return <ide> end <ide> end <ide><path>scripts/helpers/api_parser.rb <del>require 'net/http' <del>require 'json' <del> <del>require_relative 'brew_commands' <del>require_relative 'homebrew_formula' <del> <del>class ApiParser <del> def call_api(url) <del> puts "- Calling API #{url}" <del> uri = URI(url) <del> response = Net::HTTP.get(uri) <del> <del> puts "- Parsing response" <del> JSON.parse(response) <del> end <del> <del> def query_repology_api(last_package_in_response = '') <del> url = 
'https://repology.org/api/v1/projects/' + last_package_in_response + '?inrepo=homebrew&outdated=1' <del> <del> self.call_api(url) <del> end <del> <del> def parse_repology_api() <del> puts "\n-------- Query outdated packages from Repology --------" <del> page_no = 1 <del> puts "\n- Paginating repology api page: #{page_no}" <del> <del> outdated_packages = self.query_repology_api('') <del> last_pacakge_index = outdated_packages.size - 1 <del> response_size = outdated_packages.size <del> <del> while response_size > 1 do <del> page_no += 1 <del> puts "\n- Paginating repology api page: #{page_no}" <del> <del> last_package_in_response = outdated_packages.keys[last_pacakge_index] <del> response = self.query_repology_api("#{last_package_in_response}/") <del> <del> response_size = response.size <del> outdated_packages.merge!(response) <del> last_pacakge_index = outdated_packages.size - 1 <del> end <del> <del> puts "\n- #{outdated_packages.size} outdated pacakges identified by repology" <del> outdated_packages <del> end <del> <del> def query_homebrew <del> puts "\n-------- Get Homebrew Formulas --------" <del> self.call_api('https://formulae.brew.sh/api/formula.json') <del> end <del> <del> def parse_homebrew_formulas() <del> formulas = self.query_homebrew() <del> parsed_homebrew_formulas = {} <del> <del> formulas.each do |formula| <del> parsed_homebrew_formulas[formula['name']] = { <del> "fullname" => formula["full_name"], <del> "oldname" => formula["oldname"], <del> "version" => formula["versions"]['stable'], <del> "download_url" => formula["urls"]['stable']['url'], <del> } <del> end <del> <del> parsed_homebrew_formulas <del> end <del> <del> def validate_packages(outdated_repology_packages, brew_formulas) <del> puts "\n-------- Verify Outdated Repology packages as Homebrew Formulas --------" <del> packages = {} <del> <del> outdated_repology_packages.each do |package_name, repo_using_package| <del> # Identify homebrew repo <del> repology_homebrew_repo = 
repo_using_package.select { |repo| repo['repo'] == 'homebrew' }[0] <del> next if repology_homebrew_repo.empty? <del> <del> latest_version = nil <del> <del> # Identify latest version amongst repos <del> repo_using_package.each do |repo| <del> latest_version = repo['version'] if repo['status'] == 'newest' <del> end <del> <del> repology_homebrew_repo['latest_version'] = latest_version if latest_version <del> homebrew_package_details = brew_formulas[repology_homebrew_repo['srcname']] <del> <del> # Format package <del> packages[repology_homebrew_repo['srcname']] = format_package(homebrew_package_details, repology_homebrew_repo) <del> end <del> <del> packages <del> end <del> <del> <del> def format_package(homebrew_details, repology_details) <del> puts "- Formatting package: #{repology_details['srcname']}" <del> <del> homebrew_formula = HomebrewFormula.new <del> new_download_url = homebrew_formula.generate_new_download_url(homebrew_details['download_url'], homebrew_details['version'], repology_details['latest_version']) <del> <del> brew_commands = BrewCommands.new <del> livecheck_response = brew_commands.livecheck_check_formula(repology_details['srcname']) <del> has_open_pr = brew_commands.check_for_open_pr(repology_details['srcname'], new_download_url) <del> <del> formatted_package = { <del> 'fullname'=> homebrew_details['fullname'], <del> 'repology_version' => repology_details['latest_version'], <del> 'homebrew_version' => homebrew_details['version'], <del> 'livecheck_latest_version' => livecheck_response['livecheck_latest_version'], <del> 'current_download_url' => homebrew_details['download_url'], <del> 'latest_download_url' => new_download_url, <del> 'repology_latest_version' => repology_details['latest_version'], <del> 'has_open_pr' => has_open_pr <del> } <del> <del> formatted_package <del> end <del> <del> def display_version_data(outdated_packages) <del> puts "==============Formatted outdated packages============\n" <del> <del> outdated_packages.each do 
|package_name, package_details| <del> puts "" <del> puts "Package: #{package_name}" <del> puts "Brew current: #{package_details['homebrew_version']}" <del> puts "Repology latest: #{package_details['repology_version']}" <del> puts "Livecheck latest: #{package_details['livecheck_latest_version']}" <del> puts "Has Open PR?: #{package_details['has_open_pr']}" <del> end <del> end <del> <del>end <ide><path>scripts/helpers/brew_commands.rb <ide> def livecheck_check_formula(formula_name) <ide> end <ide> <ide> def parse_livecheck_response(livecheck_output) <del> livecheck_output = livecheck_output.first.gsub(' ', '').split(/:|==>|\n/) <add> livecheck_output = livecheck_output.first.gsub(" ", "").split(/:|==>|\n/) <ide> <ide> # eg: ["burp", "2.2.18", "2.2.18"] <ide> package_name, brew_version, latest_version = livecheck_output <ide> <del> {'name' => package_name, 'current_brew_version' => brew_version, 'livecheck_latest_version' => latest_version} <add> {"name" => package_name, "current_brew_version" => brew_version, "livecheck_latest_version" => latest_version} <ide> end <ide> <ide> def bump_formula_pr(formula_name, url) <ide> def check_for_open_pr(formula_name, download_url) <ide> <ide> response = bump_formula_pr(formula_name, download_url) <ide> <del> !response.include? 'Error: These open pull requests may be duplicates' <add> !response.include? 
"Error: These open pull requests may be duplicates" <ide> end <ide> <ide> end <ide><path>scripts/helpers/homebrew_formula.rb <del>require 'net/http' <del>require 'open-uri' <add>require "net/http" <add>require "open-uri" <ide> <ide> class HomebrewFormula <ide> <ide><path>scripts/helpers/parsed_file.rb <del>require 'fileutils' <del> <del>class ParsedFile <del> <del> def get_latest_file(directory) <del> puts "- retrieving latest file in directory: #{directory}" <del> Dir.glob("#{directory}/*").max_by(1) {|f| File.mtime(f)}[0] <del> end <del> <del> def save_to(directory, data) <del> # Create directory if does not exist <del> FileUtils.mkdir_p directory unless Dir.exists?(directory) <del> <del> puts "- Generating datetime stamp" <del> #Include time to the filename for uniqueness when fetching multiple times a day <del> date_time = Time.new.strftime("%Y-%m-%dT%H_%M_%S") <del> <del> # Writing parsed data to file <del> puts "- Writing data to file" <del> File.write("#{directory}/#{date_time}.txt", data) <del> end <del> <del>end <ide>\ No newline at end of file <ide><path>scripts/printPackageUpdates.rb <del>require_relative 'helpers/api_parser' <add>require_relative "helpers/api_parser" <ide> <ide> api_parser = ApiParser.new <ide>
11
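The `parse_api_response` added above pages through Repology's cursor-style API: each follow-up request starts from the last package name of the previous page, and paging stops when a page holds a single entry (the cursor itself) or when a page limit is hit. A hedged Python sketch of that loop, with a stubbed fetch standing in for the HTTP call (the stub and its data are illustrative, not part of the commit):

```python
def paginate(fetch, page_limit=15):
    """Collect package entries across pages.

    `fetch(cursor)` stands in for the real HTTP call and returns an
    ordered dict of {name: info}.  Each follow-up request starts from
    the last package of the previous page (Repology's
    /projects/<start>/ cursor); paging stops when a page holds only the
    cursor entry itself, or when the page limit is reached.
    """
    results = fetch("")
    page = results
    page_no = 1
    while len(page) > 1 and page_no < page_limit:
        page_no += 1
        start = list(results)[-1]      # last package seen so far
        page = fetch(start + "/")
        results.update(page)           # cursor entry merges in place
    return results

# Stand-in for the real API: six packages served three per page, with
# the cursor package repeated at the top of each follow-up page, the
# way Repology's paginated endpoint behaves.
_NAMES = ["a", "b", "c", "d", "e", "f"]

def fake_fetch(cursor):
    start = cursor.rstrip("/")
    tail = [n for n in _NAMES if not start or n >= start]
    return {n: {"repo": "homebrew"} for n in tail[:3]}
```

The `len(page) > 1` stop condition works because the final page contains only the cursor package, mirroring the `response_size > 1` check in the patch.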
Python
Python
map nr to propn
aad0610a853b4731806397537f867878fec5efa8
<ide><path>spacy/lang/zh/tag_map.py <ide> from __future__ import unicode_literals <ide> <ide> from ...symbols import POS, PUNCT, ADJ, SCONJ, CCONJ, NUM, DET, ADV, ADP, X <del>from ...symbols import NOUN, PART, INTJ, PRON, VERB, SPACE <add>from ...symbols import NOUN, PART, INTJ, PRON, VERB, SPACE, PROPN <ide> <ide> # The Chinese part-of-speech tagger uses the OntoNotes 5 version of the Penn <ide> # Treebank tag set. We also map the tags to the simpler Universal Dependencies <ide> "URL": {POS: X}, <ide> "INF": {POS: X}, <ide> "NN": {POS: NOUN}, <del> "NR": {POS: NOUN}, <add> "NR": {POS: PROPN}, <ide> "NT": {POS: NOUN}, <ide> "VA": {POS: VERB}, <ide> "VC": {POS: VERB},
1
Text
Text
fix typo in the new issue template
f9e07fcaab781d9d7e37348f5af37a832cd914ea
<ide><path>.github/ISSUE_TEMPLATE.md <ide> If Homebrew was updated on Aug 10-11th 2016 and `brew update` always says `Alrea <ide> <ide> - [ ] Ran `brew update` and retried your prior step? <ide> - [ ] Ran `brew doctor`, fixed as many issues as possible and retried your prior step? <del>- [ ] Confirmed this is problem with Homebrew/brew and not specific formulae? If it's a formulae-specific problem please file this issue at https://github.com/Homebrew/homebrew-core/issues/new <add>- [ ] Confirmed this is a problem with Homebrew/brew and not specific formulae? If it's a formulae-specific problem please file this issue at https://github.com/Homebrew/homebrew-core/issues/new <ide> <ide> _You can erase any parts of this template not applicable to your Issue._ <ide>
1
Javascript
Javascript
add jsdoc typings for child_process
9d3a592fffc117c08e4fbff7bb6be3b05a112bfd
<ide><path>lib/child_process.js <ide> const { <ide> <ide> const MAX_BUFFER = 1024 * 1024; <ide> <add>/** <add> * Spawns a new Node.js process + fork. <add> * @param {string} modulePath <add> * @param {string[]} [args] <add> * @param {{ <add> * cwd?: string; <add> * detached?: boolean; <add> * env?: Object; <add> * execPath?: string; <add> * execArgv?: string[]; <add> * gid?: number; <add> * serialization?: string; <add> * signal?: AbortSignal; <add> * killSignal?: string | number; <add> * silent?: boolean; <add> * stdio?: Array | string; <add> * uid?: number; <add> * windowsVerbatimArguments?: boolean; <add> * timeout?: number; <add> * }} [options] <add> * @returns {ChildProcess} <add> */ <ide> function fork(modulePath /* , args, options */) { <ide> validateString(modulePath, 'modulePath'); <ide> <ide> function normalizeExecArgs(command, options, callback) { <ide> }; <ide> } <ide> <del> <add>/** <add> * Spawns a shell executing the given command. <add> * @param {string} command <add> * @param {{ <add> * cmd?: string; <add> * env?: Object; <add> * encoding?: string; <add> * shell?: string; <add> * signal?: AbortSignal; <add> * timeout?: number; <add> * maxBuffer?: number; <add> * killSignal?: string | number; <add> * uid?: number; <add> * gid?: number; <add> * windowsHide?: boolean; <add> * }} [options] <add> * @param {( <add> * error?: Error, <add> * stdout?: string | Buffer, <add> * stderr?: string | Buffer <add> * ) => any} [callback] <add> * @returns {ChildProcess} <add> */ <ide> function exec(command, options, callback) { <ide> const opts = normalizeExecArgs(command, options, callback); <ide> return module.exports.execFile(opts.file, <ide> ObjectDefineProperty(exec, promisify.custom, { <ide> value: customPromiseExecFunction(exec) <ide> }); <ide> <add>/** <add> * Spawns the specified file as a shell. 
<add> * @param {string} file <add> * @param {string[]} [args] <add> * @param {{ <add> * cwd?: string; <add> * env?: Object; <add> * encoding?: string; <add> * timeout?: number; <add> * maxBuffer?: number; <add> * killSignal?: string | number; <add> * uid?: number; <add> * gid?: number; <add> * windowsHide?: boolean; <add> * windowsVerbatimArguments?: boolean; <add> * shell?: boolean | string; <add> * signal?: AbortSignal; <add> * }} [options] <add> * @param {( <add> * error?: Error, <add> * stdout?: string | Buffer, <add> * stderr?: string | Buffer <add> * ) => any} [callback] <add> * @returns {ChildProcess} <add> */ <ide> function execFile(file /* , args, options, callback */) { <ide> let args = []; <ide> let callback; <ide> function abortChildProcess(child, killSignal) { <ide> } <ide> } <ide> <del> <add>/** <add> * Spawns a new process using the given `file`. <add> * @param {string} file <add> * @param {string[]} [args] <add> * @param {{ <add> * cwd?: string; <add> * env?: Object; <add> * argv0?: string; <add> * stdio?: Array | string; <add> * detached?: boolean; <add> * uid?: number; <add> * gid?: number; <add> * serialization?: string; <add> * shell?: boolean | string; <add> * windowsVerbatimArguments?: boolean; <add> * windowsHide?: boolean; <add> * signal?: AbortSignal; <add> * timeout?: number; <add> * killSignal?: string | number; <add> * }} [options] <add> * @returns {ChildProcess} <add> */ <ide> function spawn(file, args, options) { <ide> options = normalizeSpawnArguments(file, args, options); <ide> validateTimeout(options.timeout); <ide> function spawn(file, args, options) { <ide> return child; <ide> } <ide> <add>/** <add> * Spawns a new process synchronously using the given `file`. 
<add> * @param {string} file <add> * @param {string[]} [args] <add> * @param {{ <add> * cwd?: string; <add> * input?: string | Buffer | TypedArray | DataView; <add> * argv0?: string; <add> * stdio?: string | Array; <add> * env?: Object; <add> * uid?: number; <add> * gid?: number; <add> * timeout?: number; <add> * killSignal?: string | number; <add> * maxBuffer?: number; <add> * encoding?: string; <add> * shell?: boolean | string; <add> * windowsVerbatimArguments?: boolean; <add> * windowsHide?: boolean; <add> * }} [options] <add> * @returns {{ <add> * pid: number; <add> * output: Array; <add> * stdout: Buffer | string; <add> * stderr: Buffer | string; <add> * status: number | null; <add> * signal: string | null; <add> * error: Error; <add> * }} <add> */ <ide> function spawnSync(file, args, options) { <ide> options = { <ide> maxBuffer: MAX_BUFFER, <ide> function checkExecSyncError(ret, args, cmd) { <ide> return err; <ide> } <ide> <del> <add>/** <add> * Spawns a file as a shell synchronously. <add> * @param {string} command <add> * @param {string[]} [args] <add> * @param {{ <add> * cwd?: string; <add> * input?: string | Buffer | TypedArray | DataView; <add> * stdio?: string | Array; <add> * env?: Object; <add> * uid?: number; <add> * gid?: number; <add> * timeout?: number; <add> * killSignal?: string | number; <add> * maxBuffer?: number; <add> * encoding?: string; <add> * windowsHide?: boolean; <add> * shell?: boolean | string; <add> * }} [options] <add> * @returns {Buffer | string} <add> */ <ide> function execFileSync(command, args, options) { <ide> options = normalizeSpawnArguments(command, args, options); <ide> <ide> function execFileSync(command, args, options) { <ide> return ret.stdout; <ide> } <ide> <del> <add>/** <add> * Spawns a shell executing the given `command` synchronously. 
<add> * @param {string} command <add> * @param {{ <add> * cwd?: string; <add> * input?: string | Buffer | TypedArray | DataView; <add> * stdio?: string | Array; <add> * env?: Object; <add> * shell?: string; <add> * uid?: number; <add> * gid?: number; <add> * timeout?: number; <add> * killSignal?: string | number; <add> * maxBuffer?: number; <add> * encoding?: string; <add> * windowsHide?: boolean; <add> * }} [options] <add> * @returns {Buffer | string} <add> */ <ide> function execSync(command, options) { <ide> const opts = normalizeExecArgs(command, options, null); <ide> const inheritStderr = !opts.options.stdio;
1
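The Node.js patch above is documentation-only: it attaches JSDoc `@param {{...}}` blocks describing each function's options bag. The closest Python analogue of that pattern is a `TypedDict` documenting the option names and types in one place (a sketch with hypothetical names, not Node's actual API surface):

```python
from typing import List, Optional, TypedDict, Union

class SpawnOptions(TypedDict, total=False):
    """Documented option bag for a spawn-style call, mirroring the
    JSDoc `@param {{...}} [options]` block added in the patch.
    total=False makes every key optional, like `[options]` in JSDoc."""
    cwd: str
    env: dict
    detached: bool
    uid: int
    gid: int
    timeout: int
    shell: Union[bool, str]

def spawn(file: str,
          args: Optional[List[str]] = None,
          options: Optional[SpawnOptions] = None) -> dict:
    """Sketch only: echo back what would be launched, to show how the
    typed option bag is consumed."""
    return {"file": file, "args": args or [], "options": options or {}}
```

As with JSDoc, the annotations change no runtime behaviour; they exist so editors and type checkers can validate the option names.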
Javascript
Javascript
simplify browserslist syntax
d1970115a418e97f3fb95d8511c973a3843549e4
<ide><path>lib/config/browserslistTargetHandler.js <ide> <ide> "use strict"; <ide> <add>const path = require("path"); <ide> const browserslist = require("browserslist"); <ide> <del>// ?query|[///path/to/config][:env] <del>const inputRx = /^(?:\?(.+?)|(?:\/\/(.+?))?(?::(.+?))?)$/; <add>// query|[/path/to/config][:env] <add>const inputRx = /^(?:(.+?))?(?::(.+?))?$/; <ide> <ide> /** <ide> * @typedef {Object} BrowserslistHandlerConfig <ide> const parse = input => { <ide> return null; <ide> } <ide> <add> if (path.isAbsolute(input)) { <add> const [, configPath, env] = inputRx.exec(input) || []; <add> return { configPath, env }; <add> } <add> <add> const [, queryOrEnv] = inputRx.exec(input) || []; <add> const config = browserslist.findConfig(process.cwd()); <add> <add> if (config && Object.keys(config).includes(queryOrEnv)) { <add> return { env: queryOrEnv }; <add> } <ide> <del> const [, query, configPath, env] = inputRx.exec(input) || []; <ide> <del> return { query, configPath, env }; <add> return { query: queryOrEnv }; <ide> }; <ide> <ide> /** <ide><path>lib/config/target.js <ide> const versionDependent = (major, minor) => { <ide> const TARGETS = [ <ide> // todo find out how we can do this thru a plugin <ide> [ <del> "browserslist[:[///path-to-config][>/path-to-browserslist][?query][:browserslist-env]]", <add> "browserslist / browserslist:env / browserslist:query / browserslist:path-to-config / browserslist:path-to-config:env", <ide> "Resolve features from browserslist. Will resolve browserslist config automatically. `browserslist:modern` to use `modern` environment from browserslist config", <ide> /^browserslist(?::(.+))?$/, <ide> rest => { <ide><path>test/configCases/ecmaVersion/browserslist-query/index.js <add>it("should compile and run the test", function() {}); <ide><path>test/configCases/ecmaVersion/browserslist-query/webpack.config.js <add>/** @type {import("../../../../").Configuration} */ <add>module.exports = { <add> target: ["web", `browserslist:ie 11`] <add>}; <ide><path>test/configCases/ecmaVersion/browserslist/webpack.config.js <ide> const path = require("path"); <ide> module.exports = { <ide> target: [ <ide> "web", <del> `browserslist://${path.join(__dirname, ".browserslistrc")}:modern` <add> `browserslist:${path.join(__dirname, ".browserslistrc")}:modern` <ide> ] <ide> };
5
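The parse logic in the patch above checks for an absolute config path first, then asks `browserslist.findConfig` whether the token names an environment, and otherwise treats it as a raw query. A small Python sketch of that precedence — `parse_browserslist_input` and `known_envs` are hypothetical names for illustration; the real webpack code consults the discovered browserslist config instead of taking a tuple:

```python
import os.path

def parse_browserslist_input(value, known_envs=()):
    """Sketch of webpack's parse(): absolute path beats env beats query."""
    if os.path.isabs(value):
        # "/path/to/.browserslistrc:modern" -> config path plus optional env
        config_path, _, env = value.partition(":")
        return {"configPath": config_path, "env": env or None}
    if value in known_envs:           # env name found in a discovered config
        return {"env": value}
    return {"query": value}           # otherwise treat it as a raw query
```

Note the `partition(":")` split is a simplification that assumes POSIX-style paths without colons, which is also why the commit dropped the old `//`-prefix syntax in favor of `path.isAbsolute`.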
PHP
PHP
fix pluralization of "chef"
11f2e58794714db053cdad0f558dd8dfc71bcd58
<ide><path>src/Utility/Inflector.php <ide> class Inflector <ide> '/(x|ch|ss|sh)$/i' => '\1es', <ide> '/([^aeiouy]|qu)y$/i' => '\1ies', <ide> '/(hive)$/i' => '\1s', <add> '/(chef)$/i' => '\1s', <ide> '/(?:([^f])fe|([lre])f)$/i' => '\1\2ves', <ide> '/sis$/i' => 'ses', <ide> '/([ti])um$/i' => '\1a', <ide><path>tests/TestCase/Utility/InflectorTest.php <ide> public function testInflectingPlurals() <ide> $this->assertEquals('Addresses', Inflector::pluralize('Address')); <ide> $this->assertEquals('sieves', Inflector::pluralize('sieve')); <ide> $this->assertEquals('blue_octopuses', Inflector::pluralize('blue_octopus')); <add> $this->assertEquals('chefs', Inflector::pluralize('chef')); <ide> $this->assertEquals('', Inflector::pluralize('')); <ide> } <ide>
2
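The new `/(chef)$/i` rule has to sit above the generic f/fe → "ves" rule, otherwise that rule matches the trailing "ef" of "chef" and produces "cheves". A small Python sketch of the ordered-rule idea — this is an illustration, not CakePHP's actual `Inflector` implementation, and it assumes Python 3.5+, where unmatched groups substitute as empty strings in `re.sub`:

```python
import re

# Ordered (pattern, replacement) rules, mimicking Inflector::$_plural.
# The chef rule must come before the generic f/fe -> ves rule.
RULES = [
    (r"(chef)$", r"\1s"),
    (r"(?:([^f])fe|([lre])f)$", r"\1\2ves"),
    (r"$", r"s"),  # fallback: just append "s"
]

def pluralize(word):
    for pattern, repl in RULES:
        if re.search(pattern, word, re.IGNORECASE):
            return re.sub(pattern, repl, word, flags=re.IGNORECASE)
    return word
```

Without the first rule, `re.sub(r"(?:([^f])fe|([lre])f)$", r"\1\2ves", "chef")` yields "cheves", which is exactly the mispluralization the commit fixes.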
Python
Python
use gray and hsv colormaps in examples
a55e197cf4164926f0a16954a2d2ec3e19e119ef
<ide><path>numpy/core/code_generators/ufunc_docstrings.py <ide> def add_newdoc(place, name, doc): <ide> Plot the function over the complex plane: <ide> <ide> >>> xx = x + 1j * x[:, np.newaxis] <del> >>> plt.imshow(np.abs(xx), extent=[-10, 10, -10, 10]) <add> >>> plt.imshow(np.abs(xx), extent=[-10, 10, -10, 10], cmap='gray') <ide> >>> plt.show() <ide> <ide> """) <ide> def add_newdoc(place, name, doc): <ide> <ide> >>> plt.subplot(121) <ide> >>> plt.imshow(np.abs(out), <del> ... extent=[-2*np.pi, 2*np.pi, -2*np.pi, 2*np.pi]) <add> ... extent=[-2*np.pi, 2*np.pi, -2*np.pi, 2*np.pi], cmap='gray') <ide> >>> plt.title('Magnitude of exp(x)') <ide> <ide> >>> plt.subplot(122) <ide> >>> plt.imshow(np.angle(out), <del> ... extent=[-2*np.pi, 2*np.pi, -2*np.pi, 2*np.pi]) <add> ... extent=[-2*np.pi, 2*np.pi, -2*np.pi, 2*np.pi], cmap='hsv') <ide> >>> plt.title('Phase (angle) of exp(x)') <ide> >>> plt.show() <ide>
1
Javascript
Javascript
add comments to pipeline implementation
960be159ac3b21bf8a4f8c3bca7e733d483fb7d9
<ide><path>lib/internal/streams/pipeline.js <ide> function pipeline(...streams) { <ide> PassThrough = require('_stream_passthrough'); <ide> } <ide> <add> // If the last argument to pipeline is not a stream <add> // we must create a proxy stream so that pipeline(...) <add> // always returns a stream which can be further <add> // composed through `.pipe(stream)`. <add> <ide> const pt = new PassThrough(); <ide> if (isPromise(ret)) { <ide> ret <ide> function pipeline(...streams) { <ide> } <ide> } <ide> <add> // TODO(ronag): Consider returning a Duplex proxy if the first argument <add> // is a writable. Would improve composability. <add> // See, https://github.com/nodejs/node/issues/32020 <ide> return ret; <ide> } <ide>
1
PHP
PHP
allow scheduling of queued jobs
52016015ad9d51278bbe91310b9cf5315d2a7289
<ide><path>src/Illuminate/Console/Scheduling/Schedule.php <ide> <ide> use Illuminate\Console\Application; <ide> use Illuminate\Container\Container; <add>use Illuminate\Contracts\Queue\ShouldQueue; <ide> use Symfony\Component\Process\ProcessUtils; <ide> use Illuminate\Contracts\Cache\Repository as Cache; <ide> <ide> public function call($callback, array $parameters = []) <ide> return $event; <ide> } <ide> <add> /** <add> * Add a new queued job callback event to the schedule. <add> * <add> * @param \Illuminate\Contracts\Queue\ShouldQueue $job <add> * @return \Illuminate\Console\Scheduling\Event <add> */ <add> public function job(ShouldQueue $job) <add> { <add> return $this->call(function() use($job) { <add> dispatch($job); <add> })->name(get_class($job)); <add> } <add> <ide> /** <ide> * Add a new Artisan command event to the schedule. <ide> *
1
Javascript
Javascript
fix atomenvironment tests
011766768a77e543bac0c23cde20ff211288c5da
<ide><path>spec/atom-environment-spec.js <ide> describe('AtomEnvironment', () => { <ide> const promise = new Promise((r) => { resolve = r }) <ide> envLoaded = () => { <ide> resolve() <del> promise <add> return promise <ide> } <ide> atomEnvironment = new AtomEnvironment({ <ide> applicationDelegate: atom.applicationDelegate, <ide><path>src/atom-environment.js <ide> class AtomEnvironment { <ide> } <ide> <ide> addProjectFolder () { <del> this.pickFolder((selectedPaths = []) => { <del> this.addToProject(selectedPaths) <add> return new Promise((resolve) => { <add> this.pickFolder((selectedPaths) => { <add> this.addToProject(selectedPaths || []).then(resolve) <add> }) <ide> }) <ide> } <ide>
2
Javascript
Javascript
check getreport when error with one line stack
c25cccf1302e779cebbefc9948f2fae700c5296f
<ide><path>test/report/test-report-getreport.js <ide> common.expectWarning('ExperimentalWarning', <ide> assert.deepStrictEqual(helper.findReports(process.pid, process.cwd()), []); <ide> } <ide> <add>{ <add> // Test with an error with one line stack <add> const error = new Error(); <add> error.stack = 'only one line'; <add> helper.validateContent(process.report.getReport(error)); <add> assert.deepStrictEqual(helper.findReports(process.pid, process.cwd()), []); <add>} <add> <ide> // Test with an invalid error argument. <ide> [null, 1, Symbol(), function() {}, 'foo'].forEach((error) => { <ide> common.expectsError(() => {
1
Javascript
Javascript
fix fallback after new download
87d9651eb71d574b5d979f61375faf23938f5dd2
<ide><path>extensions/firefox/components/PdfStreamConverter.js <ide> ChromeActions.prototype = { <ide> searchEnabled: function() { <ide> return getBoolPref(PREF_PREFIX + '.searchEnabled', false); <ide> }, <del> fallback: function(url) { <add> fallback: function(url, sendResponse) { <ide> var self = this; <ide> var domWindow = this.domWindow; <ide> var strings = getLocalizedStrings('chrome.properties'); <ide> ChromeActions.prototype = { <ide> var win = Services.wm.getMostRecentWindow('navigator:browser'); <ide> var browser = win.gBrowser.getBrowserForDocument(domWindow.top.document); <ide> var notificationBox = win.gBrowser.getNotificationBox(browser); <add> // Flag so we don't call the response callback twice, since if the user <add> // clicks open with different viewer both the button callback and <add> // eventCallback will be called. <add> var sentResponse = false; <ide> var buttons = [{ <ide> label: getLocalizedString(strings, 'open_with_different_viewer'), <ide> accessKey: getLocalizedString(strings, 'open_with_different_viewer', <ide> 'accessKey'), <ide> callback: function() { <del> self.download(url); <add> sentResponse = true; <add> sendResponse(true); <ide> } <ide> }]; <ide> notificationBox.appendNotification(message, 'pdfjs-fallback', null, <ide> notificationBox.PRIORITY_WARNING_LOW, <del> buttons); <add> buttons, <add> function eventsCallback(eventType) { <add> // Currently there is only one event "removed" but if there are any other <add> // added in the future we still only care about removed at the moment. <add> if (eventType !== 'removed') <add> return; <add> // Don't send a response again if we already responded when the button was <add> // clicked. <add> if (!sentResponse) <add> sendResponse(false); <add> }); <ide> } <ide> }; <ide> <ide><path>web/viewer.js <ide> var PDFView = { <ide> return; <ide> this.fellback = true; <ide> var url = this.url.split('#')[0]; <del> FirefoxCom.request('fallback', url); <add> FirefoxCom.request('fallback', url, function response(download) { <add> if (!download) <add> return; <add> PDFView.download(); <add> }); <ide> }, <ide> <ide> navigateTo: function pdfViewNavigateTo(dest) {
2
Python
Python
fix some tests for py3
87cf71e97cc256fb6e86c921c0c5d6c54a037910
<ide><path>numpy/compat/py3k.py <ide> <ide> """ <ide> <del>__all__ = ['bytes', 'asbytes', 'isfileobj', 'getexception'] <add>__all__ = ['bytes', 'asbytes', 'isfileobj', 'getexception', 'strchar'] <ide> <ide> import sys <ide> <ide> def asbytes(s): <ide> return s.encode('iso-8859-1') <ide> def isfileobj(f): <ide> return isinstance(f, io.IOBase) <add> strchar = 'U' <ide> else: <ide> bytes = str <ide> asbytes = str <add> strchar = 'S' <ide> def isfileobj(f): <ide> return isinstance(f, file) <ide> <ide><path>numpy/core/tests/test_multiarray.py <ide> from numpy.core import * <ide> from numpy.core.multiarray_tests import test_neighborhood_iterator, test_neighborhood_iterator_oob <ide> <del>from numpy.compat import asbytes, getexception <add>from numpy.compat import asbytes, getexception, strchar <ide> <ide> from test_print import in_foreign_locale <ide> <ide> def test_sort_order(self): <ide> strtype = '>i2' <ide> else: <ide> strtype = '<i2' <del> mydtype = [('name', 'S5'),('col2',strtype)] <add> mydtype = [('name', strchar + '5'),('col2',strtype)] <ide> r = np.array([('a', 1),('b', 255), ('c', 3), ('d', 258)], <ide> dtype= mydtype) <ide> r.sort(order='col2') <ide> def test_bytes_fields(self): <ide> assert_raises(TypeError, np.dtype, [(('b', asbytes('a')), int)]) <ide> <ide> dt = np.dtype([((asbytes('a'), 'b'), int)]) <del> assert_raises(KeyError, dt.__getitem__, asbytes('a')) <add> assert_raises(ValueError, dt.__getitem__, asbytes('a')) <ide> <ide> x = np.array([(1,), (2,), (3,)], dtype=dt) <del> assert_raises(KeyError, x.__getitem__, asbytes('a')) <add> assert_raises(ValueError, x.__getitem__, asbytes('a')) <ide> <ide> y = x[0] <del> assert_raises(KeyError, y.__getitem__, asbytes('a')) <add> assert_raises(IndexError, y.__getitem__, asbytes('a')) <ide> else: <ide> def test_unicode_field_titles(self): <ide> # Unicode field titles are added to field dict on Py2 <ide> def test_unicode_field_titles(self): <ide> def test_unicode_field_names(self): <ide> # Unicode field names are not allowed on Py2 <ide> title = unicode('b') <del> assert_raises(TypeError, np.dtype, [(title, int)]) <del> assert_raises(TypeError, np.dtype, [(('a', title), int)]) <add> assert_raises(ValueError, np.dtype, [(title, int)]) <add> assert_raises(ValueError, np.dtype, [(('a', title), int)]) <ide> <ide> class TestView(TestCase): <ide> def test_basic(self):
2
Python
Python
fix typo in layer.add_loss.
a095c1b16f8368a996ec345c4f3fd0a77cdf54f3
<ide><path>keras/engine/topology.py <ide> def add_loss(self, losses, inputs=None): <ide> if hasattr(self, '_losses'): <ide> self._losses += losses <ide> # Update self._per_input_updates <del> if isinstance(input, list) and inputs == []: <add> if isinstance(inputs, list) and inputs == []: <ide> inputs = None <ide> if inputs is not None: <ide> inputs_hash = _object_list_uid(inputs)
1
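The one-character fix above matters because `input` (without the trailing "s") resolves to Python's builtin `input` function, which is never a list, so the normalization branch could never run. A minimal sketch of the corrected check, outside of Keras:

```python
def normalize_inputs(inputs):
    # The original bug tested isinstance(input, list) -- `input` there is
    # the *builtin* function, never a list, so the branch never fired.
    if isinstance(inputs, list) and inputs == []:
        inputs = None
    return inputs
```

Shadowing bugs like this are a good argument for avoiding parameter names that collide with builtins in the first place.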
Python
Python
handle list when serializing expand_kwargs
b816a6b243d16da87ca00e443619c75e9f6f5816
<ide><path>airflow/serialization/serialized_objects.py <ide> """Serialized DAG and BaseOperator""" <ide> from __future__ import annotations <ide> <add>import collections.abc <ide> import datetime <ide> import enum <ide> import logging <ide> import warnings <ide> import weakref <ide> from dataclasses import dataclass <ide> from inspect import Parameter, signature <del>from typing import TYPE_CHECKING, Any, Iterable, NamedTuple, Type <add>from typing import TYPE_CHECKING, Any, Collection, Iterable, Mapping, NamedTuple, Type, Union <ide> <ide> import cattr <ide> import lazy_object_proxy <ide> def deref(self, dag: DAG) -> XComArg: <ide> return deserialize_xcom_arg(self.data, dag) <ide> <ide> <add># These two should be kept in sync. Note that these are intentionally not using <add># the type declarations in expandinput.py so we always remember to update <add># serialization logic when adding new ExpandInput variants. If you add things to <add># the unions, be sure to update _ExpandInputRef to match. <add>_ExpandInputOriginalValue = Union[ <add> # For .expand(**kwargs). <add> Mapping[str, Any], <add> # For expand_kwargs(arg). <add> XComArg, <add> Collection[Union[XComArg, Mapping[str, Any]]], <add>] <add>_ExpandInputSerializedValue = Union[ <add> # For .expand(**kwargs). <add> Mapping[str, Any], <add> # For expand_kwargs(arg). <add> _XComRef, <add> Collection[Union[_XComRef, Mapping[str, Any]]], <add>] <add> <add> <ide> class _ExpandInputRef(NamedTuple): <ide> """Used to store info needed to create a mapped operator's expand input. <ide> <ide> class _ExpandInputRef(NamedTuple): <ide> """ <ide> <ide> key: str <del> value: _XComRef | dict[str, Any] <add> value: _ExpandInputSerializedValue <add> <add> @classmethod <add> def validate_expand_input_value(cls, value: _ExpandInputOriginalValue) -> None: <add> """Validate we've covered all ``ExpandInput.value`` types. <add> <add> This function does not actually do anything, but is called during <add> serialization so Mypy will *statically* check we have handled all <add> possible ExpandInput cases. <add> """ <ide> <ide> def deref(self, dag: DAG) -> ExpandInput: <add> """De-reference into a concrete ExpandInput object. <add> <add> If you add more cases here, be sure to update _ExpandInputOriginalValue <add> and _ExpandInputSerializedValue to match the logic. <add> """ <ide> if isinstance(self.value, _XComRef): <ide> value: Any = self.value.deref(dag) <del> else: <add> elif isinstance(self.value, collections.abc.Mapping): <ide> value = {k: v.deref(dag) if isinstance(v, _XComRef) else v for k, v in self.value.items()} <add> else: <add> value = [v.deref(dag) if isinstance(v, _XComRef) else v for v in self.value] <ide> return create_expand_input(self.key, value) <ide> <ide> <ide> def serialize_mapped_operator(cls, op: MappedOperator) -> dict[str, Any]: <ide> serialized_op = cls._serialize_node(op, include_deps=op.deps != MappedOperator.deps_for(BaseOperator)) <ide> # Handle expand_input and op_kwargs_expand_input. <ide> expansion_kwargs = op._get_specified_expand_input() <add> if TYPE_CHECKING: # Let Mypy check the input type for us! <add> _ExpandInputRef.validate_expand_input_value(expansion_kwargs.value) <ide> serialized_op[op._expand_input_attr] = { <ide> "type": get_map_type_key(expansion_kwargs), <ide> "value": cls.serialize(expansion_kwargs.value), <ide><path>tests/serialization/test_dag_serialization.py <ide> def test_operator_expand_xcomarg_serde(): <ide> <ide> <ide> @pytest.mark.parametrize("strict", [True, False]) <del>def test_operator_expand_kwargs_serde(strict): <add>def test_operator_expand_kwargs_literal_serde(strict): <add> from airflow.models.xcom_arg import PlainXComArg, XComArg <add> from airflow.serialization.serialized_objects import _XComRef <add> <add> with DAG("test-dag", start_date=datetime(2020, 1, 1)) as dag: <add> task1 = BaseOperator(task_id="op1") <add> mapped = MockOperator.partial(task_id='task_2').expand_kwargs( <add> [{"a": "x"}, {"a": XComArg(task1)}], <add> strict=strict, <add> ) <add> <add> serialized = SerializedBaseOperator.serialize(mapped) <add> assert serialized == { <add> '_is_empty': False, <add> '_is_mapped': True, <add> '_task_module': 'tests.test_utils.mock_operators', <add> '_task_type': 'MockOperator', <add> 'downstream_task_ids': [], <add> 'expand_input': { <add> "type": "list-of-dicts", <add> "value": [ <add> {"__type": "dict", "__var": {"a": "x"}}, <add> { <add> "__type": "dict", <add> "__var": {"a": {'__type': 'xcomref', '__var': {'task_id': 'op1', 'key': 'return_value'}}}, <add> }, <add> ], <add> }, <add> 'partial_kwargs': {}, <add> 'task_id': 'task_2', <add> 'template_fields': ['arg1', 'arg2'], <add> 'template_ext': [], <add> 'template_fields_renderers': {}, <add> 'operator_extra_links': [], <add> 'ui_color': '#fff', <add> 'ui_fgcolor': '#000', <add> "_disallow_kwargs_override": strict, <add> '_expand_input_attr': 'expand_input', <add> } <add> <add> op = SerializedBaseOperator.deserialize_operator(serialized) <add> assert op.deps is MappedOperator.deps_for(BaseOperator) <add> assert op._disallow_kwargs_override == strict <add> <add> # The XComArg can't be deserialized before the DAG is. <add> expand_value = op.expand_input.value <add> assert expand_value == [{"a": "x"}, {"a": _XComRef({"task_id": "op1", "key": XCOM_RETURN_KEY})}] <add> <add> serialized_dag: DAG = SerializedDAG.from_dict(SerializedDAG.to_dict(dag)) <add> <add> resolved_expand_value = serialized_dag.task_dict['task_2'].expand_input.value <add> resolved_expand_value == [{"a": "x"}, {"a": PlainXComArg(serialized_dag.task_dict['op1'])}] <add> <add> <add>@pytest.mark.parametrize("strict", [True, False]) <add>def test_operator_expand_kwargs_xcomarg_serde(strict): <ide> from airflow.models.xcom_arg import PlainXComArg, XComArg <ide> from airflow.serialization.serialized_objects import _XComRef <ide>
2
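The `deref` change above dispatches on three shapes of serialized value: a bare XCom reference, a mapping (from `.expand(**kwargs)`), and a collection (from `expand_kwargs(...)`). A minimal Python sketch of that dispatch — `XComRef` and the plain-dict "dag" are illustrative stand-ins, not Airflow's real `_XComRef`/`DAG` types:

```python
from collections.abc import Mapping

class XComRef:
    """Stand-in for Airflow's _XComRef placeholder (illustrative only)."""
    def __init__(self, task_id):
        self.task_id = task_id

    def deref(self, dag):
        return dag[self.task_id]

def deref_expand_value(value, dag):
    """Sketch of _ExpandInputRef.deref: ref, mapping, or list of items."""
    if isinstance(value, XComRef):
        return value.deref(dag)
    if isinstance(value, Mapping):          # .expand(**kwargs) shape
        return {k: v.deref(dag) if isinstance(v, XComRef) else v
                for k, v in value.items()}
    # expand_kwargs(...) shape: a list mixing refs and literal dicts
    return [v.deref(dag) if isinstance(v, XComRef) else v for v in value]
```

Checking `Mapping` before falling through to the list branch is the key point of the commit: a dict is also iterable, so order of the `isinstance` checks matters.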
Python
Python
remove outdated comment in serializermethodfield
62ae241894fc49a7c6261cb1b6e3b9c98768ecf0
<ide><path>rest_framework/fields.py <ide> def __init__(self, method_name=None, **kwargs): <ide> super().__init__(**kwargs) <ide> <ide> def bind(self, field_name, parent): <del> # In order to enforce a consistent style, we error if a redundant <del> # 'method_name' argument has been used. For example: <del> # my_field = serializer.SerializerMethodField(method_name='get_my_field') <del> default_method_name = 'get_{field_name}'.format(field_name=field_name) <del> <del> # The method name should default to `get_{field_name}`. <add> # The method name defaults to `get_{field_name}`. <ide> if self.method_name is None: <del> self.method_name = default_method_name <add> self.method_name = 'get_{field_name}'.format(field_name=field_name) <ide> <ide> super().bind(field_name, parent) <ide>
1
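The simplified `bind` above keeps only the defaulting step. A minimal runnable sketch of that behavior — a stand-in class, not the real DRF field:

```python
class SerializerMethodField:
    """Sketch of DRF's SerializerMethodField method-name defaulting."""
    def __init__(self, method_name=None):
        self.method_name = method_name

    def bind(self, field_name):
        # The method name defaults to `get_{field_name}`.
        if self.method_name is None:
            self.method_name = 'get_{field_name}'.format(field_name=field_name)
```

An explicitly passed `method_name` is left untouched, which is what made the removed "redundant argument" comment outdated.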
Text
Text
remove note about python 3.10 support availability
b7f60d326718c3f19c19a41c90afeb25582695e2
<ide><path>README.md <ide> MariaDB is not tested/recommended. <ide> **Note**: SQLite is used in Airflow tests. Do not use it in production. We recommend <ide> using the latest stable version of SQLite for local development. <ide> <del>**Note**: Support for Python v3.10 will be available from Airflow 2.3.0. The `main` (development) branch <del>already supports Python 3.10. <del> <ide> **Note**: Airflow currently can be run on POSIX-compliant Operating Systems. For development it is regularly <ide> tested on fairly modern Linux Distros and recent versions of MacOS. <ide> On Windows you can run it via WSL2 (Windows Subsystem for Linux 2) or via Linux Containers.
1
Python
Python
replace field-errors with field_errors
904f197474ad3561d3d273961a70ab19d8175aaf
<ide><path>djangorestframework/tests/validators.py <ide> def validation_failed_due_to_no_content_returns_appropriate_message(self, valida <ide> try: <ide> validator.validate_request(content, None) <ide> except ErrorResponse, exc: <del> self.assertEqual(exc.response.raw_content, {'field-errors': {'qwerty': ['This field is required.']}}) <add> self.assertEqual(exc.response.raw_content, {'field_errors': {'qwerty': ['This field is required.']}}) <ide> else: <ide> self.fail('ResourceException was not raised') #pragma: no cover <ide> <ide> def validation_failed_due_to_field_error_returns_appropriate_message(self, valid <ide> try: <ide> validator.validate_request(content, None) <ide> except ErrorResponse, exc: <del> self.assertEqual(exc.response.raw_content, {'field-errors': {'qwerty': ['This field is required.']}}) <add> self.assertEqual(exc.response.raw_content, {'field_errors': {'qwerty': ['This field is required.']}}) <ide> else: <ide> self.fail('ResourceException was not raised') #pragma: no cover <ide> <ide> def validation_failed_due_to_invalid_field_returns_appropriate_message(self, val <ide> try: <ide> validator.validate_request(content, None) <ide> except ErrorResponse, exc: <del> self.assertEqual(exc.response.raw_content, {'field-errors': {'extra': ['This field does not exist.']}}) <add> self.assertEqual(exc.response.raw_content, {'field_errors': {'extra': ['This field does not exist.']}}) <ide> else: <ide> self.fail('ResourceException was not raised') #pragma: no cover <ide> <ide> def validation_failed_due_to_multiple_errors_returns_appropriate_message(self, v <ide> try: <ide> validator.validate_request(content, None) <ide> except ErrorResponse, exc: <del> self.assertEqual(exc.response.raw_content, {'field-errors': {'qwerty': ['This field is required.'], <add> self.assertEqual(exc.response.raw_content, {'field_errors': {'qwerty': ['This field is required.'], <ide> 'extra': ['This field does not exist.']}}) <ide> else: <ide> self.fail('ResourceException was not raised') #pragma: no cover
1
Ruby
Ruby
use symbol for mail preview format, not string
663548845b265d97118999eea83bf05a7d41161f
<ide><path>railties/lib/rails/mailers_controller.rb <ide> def preview <ide> end <ide> else <ide> @part = find_preferred_part(request.format, Mime[:html], Mime[:text]) <del> render action: "email", layout: false, formats: %w[html] <add> render action: "email", layout: false, formats: [:html] <ide> end <ide> else <ide> raise AbstractController::ActionNotFound, "Email '#{@email_action}' not found in #{@preview.name}"
1
Javascript
Javascript
update script.aculo.us scripts to fix some bugs
c9f2389c010ba9364a4454b45b3cedd4dd273c38
<ide><path>actionpack/lib/action_view/helpers/javascripts/controls.js <ide> Ajax.Autocompleter.prototype = (new Ajax.Base()).extend({ <ide> onComplete: function(request) { <ide> if(!this.changed) { <ide> this.update.innerHTML = request.responseText; <add> Element.cleanWhitespace(this.update.firstChild); <ide> <ide> if(this.update.firstChild && this.update.firstChild.childNodes) { <ide> this.entry_count = <ide> Ajax.Autocompleter.prototype = (new Ajax.Base()).extend({ <ide> case Event.KEY_UP: <ide> this.mark_previous(); <ide> this.render(); <add> if(navigator.appVersion.indexOf('AppleWebKit')>0) Event.stop(event); <ide> return; <ide> case Event.KEY_DOWN: <ide> this.mark_next(); <ide> this.render(); <add> if(navigator.appVersion.indexOf('AppleWebKit')>0) Event.stop(event); <ide> return; <ide> } <ide> else <ide><path>actionpack/lib/action_view/helpers/javascripts/effects.js <ide> Element.setContentZoom = function(element, percent) { <ide> <ide> element.style.fontSize = sizeEm*(percent/100) + "em"; <ide> if(navigator.appVersion.indexOf('AppleWebKit')>0) window.scrollBy(0,0); <del>} <ide>\ No newline at end of file <add>} <ide><path>actionpack/lib/action_view/helpers/javascripts/prototype.js <ide> var Position = { <ide> return [valueL, valueT]; <ide> }, <ide> <add> cumulative_offset: function(element) { <add> var valueT = 0; var valueL = 0; <add> do { <add> valueT += element.offsetTop || 0; <add> valueL += element.offsetLeft || 0; <add> element = element.offsetParent; <add> } while(element); <add> return [valueL, valueT]; <add> }, <add> <ide> // caches x/y coordinate pair to use with overlap <ide> within: function(element, x, y) { <ide> if(this.include_scroll_offsets) <del> return within_including_scrolloffsets(element, x, y); <add> return this.within_including_scrolloffsets(element, x, y); <ide> this.xcomp = x; <ide> this.ycomp = y; <del> var offsettop = element.offsetTop; <del> var offsetleft = element.offsetLeft; <del> return (y>=offsettop && <del> y<offsettop+element.offsetHeight && <del> x>=offsetleft && <del> x<offsetleft+element.offsetWidth); <add> this.offset = this.cumulative_offset(element); <add> <add> return (y>=this.offset[1] && <add> y<this.offset[1]+element.offsetHeight && <add> x>=this.offset[0] && <add> x<this.offset[0]+element.offsetWidth); <ide> }, <ide> <ide> within_including_scrolloffsets: function(element, x, y) { <ide> var offsetcache = this.real_offset(element); <del> this.xcomp = x + offsetcache[0] - this.deltaX; <del> this.ycomp = y + offsetcache[1] - this.deltaY; <del> this.xcomp = x; <del> this.ycomp = y; <del> var offsettop = element.offsetTop; <del> var offsetleft = element.offsetLeft; <del> return (y>=offsettop && <del> y<offsettop+element.offsetHeight && <del> x>=offsetleft && <del> x<offsetleft+element.offsetWidth); <add> this.offset = this.cumulative_offset(element); <add> this.xcomp = x + offsetcache[0] - this.deltaX + this.offset[0]; <add> this.ycomp = y + offsetcache[1] - this.deltaY + this.offset[1]; <add> <add> return (this.ycomp>=this.offset[1] && <add> this.ycomp<this.offset[1]+element.offsetHeight && <add> this.xcomp>=this.offset[0] && <add> this.xcomp<this.offset[0]+element.offsetWidth); <ide> }, <ide> <ide> // within must be called directly before <ide> overlap: function(mode, element) { <ide> if(!mode) return 0; <ide> if(mode == 'vertical') <del> return ((element.offsetTop+element.offsetHeight)-this.ycomp) / element.offsetHeight; <add> return ((this.offset[1]+element.offsetHeight)-this.ycomp) / element.offsetHeight; <ide> if(mode == 'horizontal') <del> return ((element.offsetLeft+element.offsetWidth)-this.xcomp) / element.offsetWidth; <add> return ((this.offset[0]+element.offsetWidth)-this.xcomp) / element.offsetWidth; <ide> }, <ide> <ide> clone: function(source, target) {
3
Ruby
Ruby
persist glob when replacing a path
0f4d005501c4230fcdf8d64d530639f5bcda6086
<ide><path>railties/lib/rails/paths.rb <ide> def initialize(path) <ide> end <ide> <ide> def []=(path, value) <del> add(path, :with => value) <add> glob = self[path] ? self[path].glob : nil <add> add(path, :with => value, :glob => glob) <ide> end <ide> <ide> def add(path, options={}) <ide><path>railties/test/paths_test.rb <ide> def setup <ide> assert_equal "*.rb", @root["app"].glob <ide> end <ide> <add> test "it should be possible to replace a path and persist the original paths glob" do <add> @root.add "app", :glob => "*.rb" <add> @root["app"] = "app2" <add> assert_equal ["/foo/bar/app2"], @root["app"].paths <add> assert_equal "*.rb", @root["app"].glob <add> end <add> <ide> test "a path can be added to the load path" do <ide> @root["app"] = "app" <ide> @root["app"].load_path!
2
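The Ruby fix above makes `[]=` look up the existing entry's glob before re-adding the path. The same idea in a small Python sketch — `PathRoot` is a hypothetical stand-in for `Rails::Paths::Root`, not its real API:

```python
class PathRoot:
    """Sketch of a path registry whose []= preserves per-entry metadata."""
    def __init__(self):
        self._paths = {}  # name -> (path, glob)

    def add(self, name, path, glob=None):
        self._paths[name] = (path, glob)

    def __setitem__(self, name, path):
        # Persist the original entry's glob when the path is replaced,
        # mirroring the Rails commit; new entries get no glob.
        glob = self._paths[name][1] if name in self._paths else None
        self.add(name, path, glob=glob)

    def __getitem__(self, name):
        return self._paths[name]
```

The design point is the same as in the commit's test: replacing `root["app"]` should change where the path points without silently discarding the `*.rb` glob attached when it was first added.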
Python
Python
add test for unicode, parametrize for chunksize
370792b3929aa9c66403c9509f08cb7921347352
<ide><path>numpy/lib/tests/test_loadtxt.py <ide> def test_control_character_newline_raises(nl): <ide> np.loadtxt(txt, quotechar=nl) <ide> <ide> <del>def test_datetime_parametric_unit_discovery(): <add>@pytest.mark.parametrize( <add> ("generic_data", "long_datum", "unitless_dtype", "expected_dtype"), <add> [ <add> ("2012-03", "2013-01-15", "M8", "M8[D]"), # Datetimes <add> ("spam-a-lot", "tis_but_a_scratch", "U", "U17"), # str <add> ], <add>) <add>@pytest.mark.parametrize("nrows", (10, 50000, 60000)) # lt, eq, gt chunksize <add>def test_datetime_parametric_unit_discovery( <add> generic_data, long_datum, unitless_dtype, expected_dtype, nrows <add>): <ide> """Check that the correct unit (e.g. month, day, second) is discovered from <ide> the data when a user specifies a unitless datetime.""" <ide> # Unit should be "D" (days) due to last entry <del> data = ["2012-03"] * 50000 + ["2013-01-15"] <del> expected = np.array(data, dtype="M8[D]") <add> data = [generic_data] * 50000 + [long_datum] <add> expected = np.array(data, dtype=expected_dtype) <ide> <ide> # file-like path <ide> txt = StringIO("\n".join(data)) <del> a = np.loadtxt(txt, dtype="M8") <add> a = np.loadtxt(txt, dtype=unitless_dtype) <ide> assert a.dtype == expected.dtype <ide> assert_equal(a, expected) <ide> <ide> # file-obj path <ide> fd, fname = mkstemp() <ide> with open(fname, "w") as fh: <ide> fh.write("\n".join(data)) <del> a = np.loadtxt(fname, dtype="M8") <add> a = np.loadtxt(fname, dtype=unitless_dtype) <ide> assert a.dtype == expected.dtype <ide> assert_equal(a, expected)
1
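The behavior under test — a unitless `M8` dtype resolving to the finest unit required by the data — can also be seen directly with `np.array`; a small sketch (assumes a reasonably recent NumPy with datetime unit discovery from strings):

```python
import numpy as np

# "2012-03" alone would parse at month resolution, but the day-resolution
# datum forces the whole array to datetime64[D].
data = ["2012-03"] * 5 + ["2013-01-15"]
arr = np.array(data, dtype="M8")
```

The month-only entries are then stored as the first day of their month, which is the same promotion `np.loadtxt` relies on in the test above.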
Text
Text
add dev notes
eb14a670e7b95fac283bf996f6225aad507a2d61
<ide><path>DEVELOPMENT.md <add># DEVELOPMENT NOTES <add> <add>## archiving old versions <add> <add>(hopefully automated someday) <add> <add>1. make repo on github <add>2. in threejsfundamentals.org <add> 1. git fetch origin gh-pages <add> 2. git checkout gh-pages <add> 3. git rebase origin/gh-pages? <add> 4. git clone --branch gh-pages ../old.threejsfundamentals/rXXX <add>3. cd ../old.threejsfundamentals.rXXX <add>4. git remote rm origin <add>5. git remote add origin <new-github-repo> <add>6. edit CNAME <add>7. delete <add> * robots.txt <add> * sitemap.xml <add> * atom.xml <add> * Gruntfile.js <add>8. s/\/threejsfundamentals.org/\/rXXX.threejsfundamentals.org/g <add>9. git push -u origin gh-pages <add>10. Update DNS (cloudflare)
1
Text
Text
add upgrading guide from next.js 9 to 10
feb0cb497b61946f98bb00520704e09e23209730
<ide><path>docs/upgrading.md <ide> description: Learn how to upgrade Next.js. <ide> <ide> # Upgrade Guide <ide> <del>## Upgrading from version 8 to 9.0.x <add>## Upgrading from version 9 to 10 <add> <add>There were no breaking changes between version 9 and 10. <add> <add>To upgrade run the following command: <add> <add>``` <add>npm install next@latest <add>``` <add> <add>## Upgrading from version 8 to 9 <ide> <ide> ### Preamble <ide>
1
Python
Python
fix weight saving and loading
983c68bf4738f54e74921a52eb2b717b8b9ada41
<ide><path>keras/layers/core.py <ide> def connect(self, node): <ide> <ide> def get_weights(self): <ide> weights = [] <del> for m in encoders + decoders: <add> for m in self.encoders + self.decoders: <ide> weights += m.get_weights() <ide> return weights <ide> <ide> def set_weights(self, weights): <del> models = encoders + decoders <add> models = self.encoders + self.decoders <ide> for i in range(len(models)): <ide> nb_param = len(models[i].params) <ide> models[i].set_weights(weights[:nb_param])
1
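The bug fixed above is a plain `NameError`: `get_weights` iterated over the bare names `encoders + decoders` instead of the instance attributes. A minimal Python sketch of the corrected method — `StubLayer` and `Autoencoder` here are illustrative stand-ins, not Keras classes:

```python
class StubLayer:
    """Tiny stand-in for a Keras layer that owns some weights."""
    def __init__(self, weights):
        self._weights = list(weights)

    def get_weights(self):
        return list(self._weights)

class Autoencoder:
    def __init__(self, encoders, decoders):
        self.encoders = encoders
        self.decoders = decoders

    def get_weights(self):
        # Fixed: the buggy version read `encoders + decoders`, which are
        # not local names, so the call raised NameError at runtime.
        weights = []
        for m in self.encoders + self.decoders:
            weights += m.get_weights()
        return weights
```

Because the broken code only failed when the method was actually called, a save/load round-trip was the first place it surfaced — which is why the commit message is about weight saving and loading.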
Python
Python
fix bigquery_dts parameter docstring typo
73eb24f25c2d60fb3a2d8fe2ed64b3c165f8d4c6
<ide><path>airflow/providers/google/cloud/operators/bigquery_dts.py <ide> class BigQueryCreateDataTransferOperator(BaseOperator): <ide> :param project_id: The BigQuery project id where the transfer configuration should be <ide> created. If set to None or missing, the default project_id from the Google Cloud connection <ide> is used. <del> :param: location: BigQuery Transfer Service location for regional transfers. <add> :param location: BigQuery Transfer Service location for regional transfers. <ide> :param authorization_code: authorization code to use with this transfer configuration. <ide> This is required if new credentials are needed. <ide> :param retry: A retry object used to retry requests. If `None` is <ide> specified, requests will not be retried. <ide> :param timeout: The amount of time, in seconds, to wait for the request to <ide> class BigQueryDeleteDataTransferConfigOperator(BaseOperator): <ide> :param transfer_config_id: Id of transfer config to be used. <ide> :param project_id: The BigQuery project id where the transfer configuration should be <ide> created. If set to None or missing, the default project_id from the Google Cloud connection is used. <del> :param: location: BigQuery Transfer Service location for regional transfers. <add> :param location: BigQuery Transfer Service location for regional transfers. <ide> :param retry: A retry object used to retry requests. If `None` is <ide> specified, requests will not be retried. <ide> :param timeout: The amount of time, in seconds, to wait for the request to <ide> class BigQueryDataTransferServiceStartTransferRunsOperator(BaseOperator): <ide> `~google.cloud.bigquery_datatransfer_v1.types.Timestamp` <ide> :param project_id: The BigQuery project id where the transfer configuration should be <ide> created. If set to None or missing, the default project_id from the Google Cloud connection is used. <del> :param: location: BigQuery Transfer Service location for regional transfers. <add> :param location: BigQuery Transfer Service location for regional transfers. <ide> :param retry: A retry object used to retry requests. If `None` is <ide> specified, requests will not be retried. <ide> :param timeout: The amount of time, in seconds, to wait for the request to
1
Javascript
Javascript
update $location api close #62
1cad16c6f9144113e2b458ce91af8fcb0211c173
<ide><path>src/services.js <ide> angularServiceInject("$location", function(browser) { <ide> scope.$eval(); <ide> } <ide> }); <del> <add> <ide> this.$onEval(PRIORITY_FIRST, updateBrowser); <ide> this.$onEval(PRIORITY_LAST, updateBrowser); <del> <add> <ide> update(lastLocationHref); <ide> lastLocationHash = location.hash; <del> <add> <ide> return location; <del> <add> <ide> // PUBLIC METHODS <del> <add> <ide> /** <ide> * Update location object <ide> * Does not immediately update the browser <ide> * Browser is updated at the end of $eval() <del> * <add> * <ide> * @example <ide> * scope.$location.update('http://www.angularjs.org/path#hash?search=x'); <ide> * scope.$location.update({host: 'www.google.com', protocol: 'https'}); <ide> * scope.$location.update({hashPath: '/path', hashSearch: {a: 'b', x: true}}); <del> * <add> * <ide> * @param {String | Object} Full href as a string or hash object with properties <ide> */ <ide> function update(href) { <ide> if (isString(href)) { <ide> extend(location, parseHref(href)); <del> } <del> else { <add> } else { <ide> if (isDefined(href.hash)) { <ide> extend(href, parseHash(href.hash)); <ide> } <del> <add> <ide> extend(location, href); <del> <add> <ide> if (isDefined(href.hashPath || href.hashSearch)) { <ide> location.hash = composeHash(location); <ide> } <del> <add> <ide> location.href = composeHref(location); <ide> } <ide> } <del> <add> <ide> /** <ide> * Update location hash <ide> * @see update() <del> * <add> * <ide> * @example <ide> * scope.$location.updateHash('/hp') <ide> * ==> update({hashPath: '/hp'}) <del> * <add> * <ide> * scope.$location.updateHash({a: true, b: 'val'}) <ide> * ==> update({hashSearch: {a: true, b: 'val'}}) <del> * <add> * <ide> * scope.$location.updateHash('/hp', {a: true}) <ide> * ==> update({hashPath: '/hp', hashSearch: {a: true}}) <del> * <add> * <ide> * @param {String | Object} hashPath as String or hashSearch as Object <ide> * @param {String | Object} hashPath as String or hashSearch as Object [optional] <ide> */ <ide> angularServiceInject("$location", function(browser) { <ide> } <ide> update(hash); <ide> } <del> <add> <ide> /** <ide> * Returns string representation - href <del> * <add> * <ide> * @return {String} Location's href property <ide> */ <ide> function toString() { <ide> updateLocation(); <ide> return location.href; <ide> } <del> <add> <ide> /** <ide> * Cancel change of the location <del> * <add> * <ide> * Calling update(), updateHash() or setting a property does not immediately <ide> * change the browser's url. Url is changed at the end of $eval() <del> * <add> * <ide> * By calling this method, you can cancel the change (before end of $eval()) <del> * <add> * <ide> */ <ide> function cancel() { <ide> update(lastLocationHref); <ide> } <del> <add> <ide> // INNER METHODS <ide> <ide> /** <ide> * Update location object <del> * <add> * <ide> * User is allowed to change properties, so after property change, <ide> * location object is not in consistent state. <del> * <add> * <ide> * @example <ide> * scope.$location.href = 'http://www.angularjs.org/path#a/b' <ide> * immediately after this call, other properties are still the old ones... <del> * <add> * <ide> * This method checks the changes and update location to the consistent state <ide> */ <ide> function updateLocation() { <ide> angularServiceInject("$location", function(browser) { <ide> } <ide> update(location.href); <ide> } <del> <add> <ide> /** <ide> * If location has changed, update the browser <ide> * This method is called at the end of $eval() phase <ide> */ <ide> function updateBrowser() { <ide> updateLocation(); <del> <add> <ide> if (location.href != lastLocationHref) { <ide> browser.setUrl(lastLocationHref = location.href); <ide> lastLocationHash = location.hash; <ide> angularServiceInject("$location", function(browser) { <ide> <ide> /** <ide> * Compose href string from a location object <del> * <add> * <ide> * @param {Object} Location object with all properties <ide> * @return {String} Composed href <ide> */ <ide> angularServiceInject("$location", function(browser) { <ide> (port ? ':' + port : '') + loc.path + <ide> (url ? '?' + url : '') + (loc.hash ? '#' + loc.hash : ''); <ide> } <del> <add> <ide> /** <ide> * Compose hash string from location object <del> * <add> * <ide> * @param {Object} Object with hashPath and hashSearch properties <ide> * @return {String} Hash string <ide> */ <ide> angularServiceInject("$location", function(browser) { <ide> <ide> /** <ide> * Parse href string into location object <del> * <add> * <ide> * @param {String} Href <ide> * @return {Object} Location <ide> */ <ide> function parseHref(href) { <ide> var loc = {}; <ide> var match = URL_MATCH.exec(href); <del> <add> <ide> if (match) { <ide> loc.href = href.replace('#$', ''); <ide> loc.protocol = match[1]; <ide> angularServiceInject("$location", function(browser) { <ide> loc.path = match[6]; <ide> loc.search = parseKeyValue(match[8]); <ide> loc.hash = match[10] || ''; <del> <add> <ide> extend(loc, parseHash(loc.hash)); <ide> } <del> <add> <ide> return loc; <ide> } <del> <add> <ide> /** <ide> * Parse hash string into object <del> * <add> * <ide> * @param {String} Hash <ide> * @param {Object} Object with hashPath and hashSearch properties <ide> */ <ide> function parseHash(hash) { <ide> var h = {}; <ide> var match = HASH_MATCH.exec(hash); <del> <add> <ide> if (match) { <ide> h.hash = hash; <ide> h.hashPath = unescape(match[1] || ''); <ide> h.hashSearch = parseKeyValue(match[3]); <ide> } <del> <add> <ide> return h; <ide> } <ide> }, ['$browser'], EAGER_PUBLISHED);
1
Ruby
Ruby
use formula#tap to get the repository name
944ac75b2460859151c103efc2b49b31e133e9ec
<ide><path>Library/Contributions/cmd/brew-gist-logs.rb <ide> def gist_logs f <ide> puts 'and then set HOMEBREW_GITHUB_API_TOKEN to use --new-issue option.' <ide> exit 1 <ide> end <del> repo = repo_name(f) <ide> end <ide> <ide> files = load_logs(f.name) <ide> def gist_logs f <ide> url = create_gist(files) <ide> <ide> if ARGV.include? '--new-issue' <del> url = new_issue(repo, "#{f.name} failed to build on #{MACOS_FULL_VERSION}", url) <add> url = new_issue(f.tap, "#{f.name} failed to build on #{MACOS_FULL_VERSION}", url) <ide> end <ide> <ide> ensure puts url if url <ide> def initialize response <ide> end <ide> end <ide> <del>def repo_name f <del> dir = f.path.dirname <del> url = dir.cd { `git config --get remote.origin.url` } <del> unless url =~ %r{github.com(?:/|:)([\w\d]+)/([\-\w\d]+)} <del> raise 'Unable to determine formula repository.' <del> end <del> "#{$1}/#{$2}" <del>end <del> <ide> def usage <ide> puts "usage: brew gist-logs [options] <formula>" <ide> puts
1
Javascript
Javascript
add spec for datepickerandroid
34d88bc5976870e0de93f217f1c29c83e4648213
<ide><path>Libraries/Components/DatePickerAndroid/DatePickerAndroid.android.js <ide> <ide> 'use strict'; <ide> <del>const DatePickerModule = require('../../BatchedBridge/NativeModules') <del> .DatePickerAndroid; <ide> import type {Options, DatePickerOpenAction} from './DatePickerAndroidTypes'; <add>import NativeDatePickerAndroid from './NativeDatePickerAndroid'; <ide> <ide> /** <ide> * Convert a Date to a timestamp. <ide> class DatePickerAndroid { <ide> _toMillis(optionsMs, 'minDate'); <ide> _toMillis(optionsMs, 'maxDate'); <ide> } <del> return DatePickerModule.open(options); <add> return NativeDatePickerAndroid.open(options); <ide> } <ide> <ide> /** <ide><path>Libraries/Components/DatePickerAndroid/NativeDatePickerAndroid.js <add>/** <add> * Copyright (c) Facebook, Inc. and its affiliates. <add> * <add> * This source code is licensed under the MIT license found in the <add> * LICENSE file in the root directory of this source tree. <add> * <add> * @flow <add> * @format <add> */ <add> <add>'use strict'; <add> <add>import type {TurboModule} from 'RCTExport'; <add>import * as TurboModuleRegistry from 'TurboModuleRegistry'; <add> <add>export interface Spec extends TurboModule { <add> +open: (options: Object) => Promise<Object>; <add>} <add> <add>export default TurboModuleRegistry.getEnforcing<Spec>('DatePickerAndroid');
2
Python
Python
fix display of minutes in time+
168cdef81887b4df9b8033f2f590512ed5c53273
<ide><path>glances/glances.py <ide> def displayProcess(self, processcount, processlist, log_count=0): <ide> process_time = processlist[processes]['proc_time'] <ide> dtime = timedelta(seconds=sum(process_time)) <ide> dtime = "{0}:{1}.{2}".format( <del> dtime.seconds // 60 % 60, <add> str(dtime.seconds // 60 % 60).zfill(2), <ide> str(dtime.seconds % 60).zfill(2), <ide> str(dtime.microseconds)[:2]) <ide> self.term_window.addnstr(
1
Text
Text
update pre-req to node.js 18
eba1ea58a2b3bcbdf335e9c32c28d28cfa29e582
<ide><path>docs/how-to-setup-freecodecamp-locally.md <ide> Some community members also develop on Windows 10 natively with Git for Windows <ide> <ide> | Prerequisite | Version | Notes | <ide> | --------------------------------------------------------------------------------------------- | ------- | ------------------------------------------------------------------------------------------- | <del>| [Node.js](http://nodejs.org) | `16.x` | We use the "Active LTS" version, See [LTS Schedule](https://nodejs.org/en/about/releases/). | <add>| [Node.js](http://nodejs.org) | `18.x` | We use the "Active LTS" version, See [LTS Schedule](https://nodejs.org/en/about/releases/). | <ide> | npm (comes bundled with Node) | `8.x` | We use the version bundled with Node.js Active LTS. | <ide> | [MongoDB Community Server](https://docs.mongodb.com/manual/administration/install-community/) | `4.2.x` | - | <ide>
1
Python
Python
fix `pool3d` padding of theano
1c9a49781da2101507db23e2014e4e5d16bd2e52
<ide><path>keras/backend/theano_backend.py <ide> def pool3d(x, pool_size, strides=(1, 1, 1), padding='valid', <ide> w_pad = pool_size[0] - 2 if pool_size[0] % 2 == 1 else pool_size[0] - 1 <ide> h_pad = pool_size[1] - 2 if pool_size[1] % 2 == 1 else pool_size[1] - 1 <ide> d_pad = pool_size[2] - 2 if pool_size[2] % 2 == 1 else pool_size[2] - 1 <del> padding = (w_pad, h_pad, d_pad) <add> pad = (w_pad, h_pad, d_pad) <ide> elif padding == 'valid': <del> padding = (0, 0, 0) <add> pad = (0, 0, 0) <ide> else: <ide> raise ValueError('Invalid padding:', padding) <ide> <ide> def pool3d(x, pool_size, strides=(1, 1, 1), padding='valid', <ide> if pool_mode == 'max': <ide> pool_out = pool.pool_3d(x, ws=pool_size, stride=strides, <ide> ignore_border=True, <del> pad=padding, <add> pad=pad, <ide> mode='max') <ide> elif pool_mode == 'avg': <ide> pool_out = pool.pool_3d(x, ws=pool_size, stride=strides, <ide> ignore_border=True, <del> pad=padding, <add> pad=pad, <ide> mode='average_exc_pad') <ide> else: <ide> raise ValueError('Invalid pooling mode:', pool_mode)
1
Javascript
Javascript
run the support test once only
1f16b168594f5b98341fd7db0b9fb5b3e84217c8
<ide><path>src/css.js <ide> jQuery.cssHooks.marginRight = { <ide> delete jQuery.cssHooks.marginRight; <ide> return; <ide> } <del> if ( computed ) { <del> // Support: Android 2.3 <del> // WebKit Bug 13343 - getComputedStyle returns wrong value for margin-right <del> // Work around by temporarily setting element display to inline-block <del> return jQuery.swap( elem, { "display": "inline-block" }, <del> curCSS, [ elem, "marginRight" ] ); <del> } <add> <add> jQuery.cssHooks.marginRight.get = function( elem, computed ) { <add> if ( computed ) { <add> // Support: Android 2.3 <add> // WebKit Bug 13343 - getComputedStyle returns wrong value for margin-right <add> // Work around by temporarily setting element display to inline-block <add> return jQuery.swap( elem, { "display": "inline-block" }, <add> curCSS, [ elem, "marginRight" ] ); <add> } <add> }; <add> <add> return jQuery.cssHooks.marginRight.get( elem, computed ); <ide> } <ide> }; <ide> <ide> jQuery.each( [ "top", "left" ], function( i, prop ) { <ide> delete jQuery.cssHooks[ prop ]; <ide> return; <ide> } <add> <ide> jQuery.cssHooks[ prop ].get = function ( i, prop ) { <ide> if ( computed ) { <ide> computed = curCSS( elem, prop ); <ide> jQuery.each( [ "top", "left" ], function( i, prop ) { <ide> computed; <ide> } <ide> }; <add> <ide> return jQuery.cssHooks[ prop ].get( i, prop ); <ide> } <ide> };
1
Python
Python
remove datetime_parser argument from datetimefield
6ce9e3baf04eb8587e53ae05979c3c198cd51110
<ide><path>rest_framework/fields.py <ide> class DateTimeField(Field): <ide> default_timezone = timezone.get_default_timezone() if settings.USE_TZ else None <ide> datetime_parser = datetime.datetime.strptime <ide> <del> def __init__(self, format=empty, datetime_parser=None, input_formats=None, default_timezone=None, *args, **kwargs): <add> def __init__(self, format=empty, input_formats=None, default_timezone=None, *args, **kwargs): <ide> self.format = format if format is not empty else self.format <ide> self.input_formats = input_formats if input_formats is not None else self.input_formats <ide> self.default_timezone = default_timezone if default_timezone is not None else self.default_timezone <del> self.datetime_parser = datetime_parser if datetime_parser is not None else self.datetime_parser <ide> super(DateTimeField, self).__init__(*args, **kwargs) <ide> <ide> def enforce_timezone(self, value):
1
Go
Go
ignore false positives
7c91fd4240ed7024cd1f23ac0a27a5338671c251
<ide><path>integration/build/build_session_test.go <ide> func TestBuildWithSession(t *testing.T) { <ide> assert.Check(t, is.Equal(du.BuilderSize, int64(0))) <ide> } <ide> <add>//nolint:unused // false positive: linter detects this as "unused" <ide> func testBuildWithSession(t *testing.T, client dclient.APIClient, daemonHost string, dir, dockerfile string) (outStr string) { <ide> ctx := context.Background() <ide> sess, err := session.NewSession(ctx, "foo1", "foo") <ide><path>integration/container/checkpoint_test.go <ide> import ( <ide> "gotest.tools/v3/skip" <ide> ) <ide> <add>//nolint:unused // false positive: linter detects this as "unused" <ide> func containerExec(t *testing.T, client client.APIClient, cID string, cmd []string) { <ide> t.Logf("Exec: %s", cmd) <ide> ctx := context.Background() <ide><path>integration/network/service_test.go <ide> func TestServiceRemoveKeepsIngressNetwork(t *testing.T) { <ide> assert.Assert(t, ok, "ingress-sbox not present in ingress network") <ide> } <ide> <add>//nolint:unused // for some reason, the "unused" linter marks this function as "unused" <ide> func swarmIngressReady(client client.NetworkAPIClient) func(log poll.LogT) poll.Result { <ide> return func(log poll.LogT) poll.Result { <ide> netInfo, err := client.NetworkInspect(context.Background(), ingressNet, types.NetworkInspectOptions{
3
Python
Python
use func.count to count rows
2111d73a9277c8e036279f2cc59f146270ef8e5b
<ide><path>airflow/models/trigger.py <ide> def assign_unassigned(cls, triggerer_id, capacity, session=None): <ide> """ <ide> from airflow.jobs.base_job import BaseJob # To avoid circular import <ide> <del> count = session.query(cls.id).filter(cls.triggerer_id == triggerer_id).count() <add> count = session.query(func.count(cls.id)).filter(cls.triggerer_id == triggerer_id).scalar() <ide> capacity -= count <ide> <ide> if capacity <= 0: <ide><path>airflow/utils/db.py <ide> def check_run_id_null(session: Session) -> Iterable[str]: <ide> dagrun_table.c.run_id.is_(None), <ide> dagrun_table.c.execution_date.is_(None), <ide> ) <del> invalid_dagrun_count = session.query(dagrun_table.c.id).filter(invalid_dagrun_filter).count() <add> invalid_dagrun_count = session.query(func.count(dagrun_table.c.id)).filter(invalid_dagrun_filter).scalar() <ide> if invalid_dagrun_count > 0: <ide> dagrun_dangling_table_name = _format_airflow_moved_table_name(dagrun_table.name, '2.2', 'dangling') <ide> if dagrun_dangling_table_name in inspect(session.get_bind()).get_table_names():
2
Javascript
Javascript
improve coverage at `lib/internal/vm/module.js`
146f0fc495d1ceb19542dfc145e55060e0ad4eec
<ide><path>test/parallel/test-vm-module-synthetic.js <ide> const assert = require('assert'); <ide> code: 'ERR_VM_MODULE_STATUS', <ide> }); <ide> } <add> <add> { <add> assert.throws(() => { <add> SyntheticModule.prototype.setExport.call({}, 'foo'); <add> }, { <add> code: 'ERR_VM_MODULE_NOT_MODULE', <add> message: /Provided module is not an instance of Module/ <add> }); <add> } <add> <ide> })().then(common.mustCall());
1
Javascript
Javascript
improve test coverage
22b839845c204310e8d78e28dd0cf9f6b3fff6aa
<ide><path>test/simple/test-string-decoder.js <ide> var common = require('../common'); <ide> var assert = require('assert'); <ide> var StringDecoder = require('string_decoder').StringDecoder; <del>var decoder = new StringDecoder('utf8'); <del> <del> <del> <del>var buffer = new Buffer('$'); <del>assert.deepEqual('$', decoder.write(buffer)); <del> <del>buffer = new Buffer('¢'); <del>assert.deepEqual('', decoder.write(buffer.slice(0, 1))); <del>assert.deepEqual('¢', decoder.write(buffer.slice(1, 2))); <del> <del>buffer = new Buffer('€'); <del>assert.deepEqual('', decoder.write(buffer.slice(0, 1))); <del>assert.deepEqual('', decoder.write(buffer.slice(1, 2))); <del>assert.deepEqual('€', decoder.write(buffer.slice(2, 3))); <del> <del>buffer = new Buffer([0xF0, 0xA4, 0xAD, 0xA2]); <del>var s = ''; <del>s += decoder.write(buffer.slice(0, 1)); <del>s += decoder.write(buffer.slice(1, 2)); <del>s += decoder.write(buffer.slice(2, 3)); <del>s += decoder.write(buffer.slice(3, 4)); <del>assert.ok(s.length > 0); <del> <del>// CESU-8 <del>buffer = new Buffer('EDA0BDEDB18D', 'hex'); // THUMBS UP SIGN (in CESU-8) <del>var s = ''; <del>s += decoder.write(buffer.slice(0, 1)); <del>s += decoder.write(buffer.slice(1, 2)); <del>s += decoder.write(buffer.slice(2, 3)); // complete lead surrogate <del>assert.equal(s, ''); <del>s += decoder.write(buffer.slice(3, 4)); <del>s += decoder.write(buffer.slice(4, 5)); <del>s += decoder.write(buffer.slice(5, 6)); // complete trail surrogate <del>assert.equal(s, '\uD83D\uDC4D'); // THUMBS UP SIGN (in UTF-16) <del> <del>var s = ''; <del>s += decoder.write(buffer.slice(0, 2)); <del>s += decoder.write(buffer.slice(2, 4)); // complete lead surrogate <del>assert.equal(s, ''); <del>s += decoder.write(buffer.slice(4, 6)); // complete trail surrogate <del>assert.equal(s, '\uD83D\uDC4D'); // THUMBS UP SIGN (in UTF-16) <del> <del>var s = ''; <del>s += decoder.write(buffer.slice(0, 3)); // complete lead surrogate <del>assert.equal(s, ''); <del>s += decoder.write(buffer.slice(3, 6)); // complete trail surrogate <del>assert.equal(s, '\uD83D\uDC4D'); // THUMBS UP SIGN (in UTF-16) <del> <del>var s = ''; <del>s += decoder.write(buffer.slice(0, 4)); // complete lead surrogate <del>assert.equal(s, ''); <del>s += decoder.write(buffer.slice(4, 5)); <del>s += decoder.write(buffer.slice(5, 6)); // complete trail surrogate <del>assert.equal(s, '\uD83D\uDC4D'); // THUMBS UP SIGN (in UTF-16) <del> <del>var s = ''; <del>s += decoder.write(buffer.slice(0, 5)); // complete lead surrogate <del>assert.equal(s, ''); <del>s += decoder.write(buffer.slice(5, 6)); // complete trail surrogate <del>assert.equal(s, '\uD83D\uDC4D'); // THUMBS UP SIGN (in UTF-16) <del> <del>var s = ''; <del>s += decoder.write(buffer.slice(0, 6)); <del>assert.equal(s, '\uD83D\uDC4D'); // THUMBS UP SIGN (in UTF-16) <del> <del> <del>// UCS-2 <del>decoder = new StringDecoder('ucs2'); <del>buffer = new Buffer('ab', 'ucs2'); <del>assert.equal(decoder.write(buffer), 'ab'); // 2 complete chars <del>buffer = new Buffer('abc', 'ucs2'); <del>assert.equal(decoder.write(buffer.slice(0, 3)), 'a'); // 'a' and first of 'b' <del>assert.equal(decoder.write(buffer.slice(3, 6)), 'bc'); // second of 'b' and 'c' <del> <del> <del>// UTF-16LE <del>buffer = new Buffer('3DD84DDC', 'hex'); // THUMBS UP SIGN (in CESU-8) <del>var s = ''; <del>s += decoder.write(buffer.slice(0, 1)); <del>s += decoder.write(buffer.slice(1, 2)); // complete lead surrogate <del>assert.equal(s, ''); <del>s += decoder.write(buffer.slice(2, 3)); <del>s += decoder.write(buffer.slice(3, 4)); // complete trail surrogate <del>assert.equal(s, '\uD83D\uDC4D'); // THUMBS UP SIGN (in UTF-16) <del> <del>var s = ''; <del>s += decoder.write(buffer.slice(0, 2)); // complete lead surrogate <del>assert.equal(s, ''); <del>s += decoder.write(buffer.slice(2, 4)); // complete trail surrogate <del>assert.equal(s, '\uD83D\uDC4D'); // THUMBS UP SIGN (in UTF-16) <del> <del>var s = ''; <del>s += decoder.write(buffer.slice(0, 3)); // complete lead surrogate <del>assert.equal(s, ''); <del>s += decoder.write(buffer.slice(3, 4)); // complete trail surrogate <del>assert.equal(s, '\uD83D\uDC4D'); // THUMBS UP SIGN (in UTF-16) <del> <del>var s = ''; <del>s += decoder.write(buffer.slice(0, 4)); <del>assert.equal(s, '\uD83D\uDC4D'); // THUMBS UP SIGN (in UTF-16) <ide> <add>process.stdout.write('scanning '); <ide> <add>// UTF-8 <add>test('utf-8', new Buffer('$', 'utf-8'), '$'); <add>test('utf-8', new Buffer('¢', 'utf-8'), '¢'); <add>test('utf-8', new Buffer('€', 'utf-8'), '€'); <add>test('utf-8', new Buffer('𤭢', 'utf-8'), '𤭢'); <ide> // A mixed ascii and non-ascii string <ide> // Test stolen from deps/v8/test/cctest/test-strings.cc <ide> // U+02E4 -> CB A4 <ide> // U+0064 -> 64 <ide> // U+12E4 -> E1 8B A4 <ide> // U+0030 -> 30 <ide> // U+3045 -> E3 81 85 <del>var expected = '\u02e4\u0064\u12e4\u0030\u3045'; <del>var buffer = new Buffer([0xCB, 0xA4, 0x64, 0xE1, 0x8B, 0xA4, <del> 0x30, 0xE3, 0x81, 0x85]); <del>var charLengths = [0, 0, 1, 2, 2, 2, 3, 4, 4, 4, 5, 5]; <add>test( <add> 'utf-8', <add> new Buffer([0xCB, 0xA4, 0x64, 0xE1, 0x8B, 0xA4, 0x30, 0xE3, 0x81, 0x85]), <add> '\u02e4\u0064\u12e4\u0030\u3045' <add>); <add> <add>// CESU-8 <add>test('utf-8', new Buffer('EDA0BDEDB18D', 'hex'), '\ud83d\udc4d'); // thumbs up <add> <add>// UCS-2 <add>test('ucs2', new Buffer('ababc', 'ucs2'), 'ababc'); <add> <add>// UTF-16LE <add>test('ucs2', new Buffer('3DD84DDC', 'hex'), '\ud83d\udc4d'); // thumbs up <ide> <del>// Split the buffer into 3 segments <del>// |----|------|-------| <del>// 0 i j buffer.length <del>// Scan through every possible 3 segment combination <del>// and make sure that the string is always parsed. <del>common.print('scanning '); <del>for (var j = 2; j < buffer.length; j++) { <del> for (var i = 1; i < j; i++) { <del> var decoder = new StringDecoder('utf8'); <add>console.log(' crayon!'); <ide> <del> var sum = decoder.write(buffer.slice(0, i)); <add>// test verifies that StringDecoder will correctly decode the given input <add>// buffer with the given encoding to the expected output. It will attempt all <add>// possible ways to write() the input buffer, see writeSequences(). The <add>// singleSequence allows for easy debugging of a specific sequence which is <add>// useful in case of test failures. <add>function test(encoding, input, expected, singleSequence) { <add> var sequences; <add> if (!singleSequence) { <add> sequences = writeSequences(input.length); <add> } else { <add> sequences = [singleSequence]; <add> } <add> sequences.forEach(function(sequence) { <add> var decoder = new StringDecoder(encoding); <add> var output = ''; <add> sequence.forEach(function(write) { <add> output += decoder.write(input.slice(write[0], write[1])); <add> }); <add> process.stdout.write('.'); <add> if (output !== expected) { <add> var message = <add> 'Expected "'+unicodeEscape(expected)+'", '+ <add> 'but got "'+unicodeEscape(output)+'"\n'+ <add> 'Write sequence: '+JSON.stringify(sequence)+'\n'+ <add> 'Decoder charBuffer: 0x'+decoder.charBuffer.toString('hex')+'\n'+ <add> 'Full Decoder State: '+JSON.stringify(decoder, null, 2); <add> assert.fail(output, expected, message); <add> } <add> }); <add>} <ide> <del> // just check that we've received the right amount <del> // after the first write <del> assert.equal(charLengths[i], sum.length); <add>// unicodeEscape prints the str contents as unicode escape codes. <add>function unicodeEscape(str) { <add> var r = ''; <add> for (var i = 0; i < str.length; i++) { <add> r += '\\u'+str.charCodeAt(i).toString(16); <add> } <add> return r; <add>} <ide> <del> sum += decoder.write(buffer.slice(i, j)); <del> sum += decoder.write(buffer.slice(j, buffer.length)); <del> assert.equal(expected, sum); <del> common.print('.'); <add>// writeSequences returns an array of arrays that describes all possible ways a <add>// buffer of the given length could be split up and passed to sequential write <add>// calls. <add>// <add>// e.G. writeSequences(3) will return: [ <add>// [ [ 0, 3 ] ], <add>// [ [ 0, 2 ], [ 2, 3 ] ], <add>// [ [ 0, 1 ], [ 1, 3 ] ], <add>// [ [ 0, 1 ], [ 1, 2 ], [ 2, 3 ] ] <add>// ] <add>function writeSequences(length, start, sequence) { <add> if (start === undefined) { <add> start = 0; <add> sequence = [] <add> } else if (start === length) { <add> return [sequence]; <ide> } <add> var sequences = []; <add> for (var end = length; end > start; end--) { <add> var subSequence = sequence.concat([[start, end]]); <add> var subSequences = writeSequences(length, end, subSequence, sequences); <add> sequences = sequences.concat(subSequences); <add> } <add> return sequences; <ide> } <del>console.log(' crayon!'); <ide>
1
Ruby
Ruby
preinstall any pre-fetch dependencies
508b48d19e1627db45d33b82e1a2159a010594a5
<ide><path>Library/Homebrew/dependency_collector.rb <ide> def fetch(spec) <ide> end <ide> <ide> def cache_key(spec) <del> if spec.is_a?(Resource) && spec.download_strategy == CurlDownloadStrategy <add> if spec.is_a?(Resource) && spec.download_strategy <= CurlDownloadStrategy <ide> File.extname(spec.url) <ide> else <ide> spec <ide> def resource_dep(spec, tags) <ide> strategy = spec.download_strategy <ide> <ide> if strategy <= HomebrewCurlDownloadStrategy <del> brewed_curl_dep_if_needed(tags) <add> @deps << brewed_curl_dep_if_needed(tags) <ide> parse_url_spec(spec.url, tags) <ide> elsif strategy <= CurlDownloadStrategy <ide> parse_url_spec(spec.url, tags) <ide><path>Library/Homebrew/formula_installer.rb <ide> def prelude <ide> forbidden_license_check <ide> <ide> check_install_sanity <add> install_fetch_deps unless ignore_deps? <ide> end <ide> <ide> sig { void } <ide> def check_install_sanity <ide> "#{formula.full_name} requires the latest version of pinned dependencies" <ide> end <ide> <add> sig { void } <add> def install_fetch_deps <add> return if @compute_dependencies.blank? <add> <add> compute_dependencies(use_cache: false) if @compute_dependencies.any? do |dep, options| <add> next false unless dep.tags == [:build, :test] <add> <add> fetch_dependencies <add> install_dependency(dep, options) <add> true <add> end <add> end <add> <ide> def build_bottle_preinstall <ide> @etc_var_dirs ||= [HOMEBREW_PREFIX/"etc", HOMEBREW_PREFIX/"var"] <ide> @etc_var_preinstall = Find.find(*@etc_var_dirs.select(&:directory?)).to_a
2
Python
Python
add backward compatible `deprecate_with_doc`
8dc57cd463d7ed28b1ec7bdd87a6e2f930f54dda
<ide><path>numpy/lib/utils.py <ide> from numpy.core import product, ndarray <ide> <ide> __all__ = ['issubclass_', 'get_numpy_include', 'issubsctype', <del> 'issubdtype', 'deprecate', 'get_numarray_include', <del> 'get_include', 'info', 'source', 'who', 'lookfor', <add> 'issubdtype', 'deprecate', 'deprecate_with_doc', <add> 'get_numarray_include', 'get_include', <add> 'info', 'source', 'who', 'lookfor', <ide> 'byte_bounds', 'may_share_memory', 'safe_eval'] <ide> <ide> def get_include(): <ide> def deprecate(*args, **kwargs): <ide> fn = args[0] <ide> args = args[1:] <ide> <add> # backward compatibility -- can be removed <add> # after next release <add> if 'newname' in kwargs: <add> kwargs['new_name'] = kwargs.pop('newname') <add> if 'oldname' in kwargs: <add> kwargs['old_name'] = kwargs.pop('oldname') <add> <ide> return _Deprecate(*args, **kwargs)(fn) <ide> else: <ide> return _Deprecate(*args, **kwargs) <ide> <add>deprecate_with_doc = lambda msg: _Deprecate(message=msg) <ide> get_numpy_include = deprecate(get_include, 'get_numpy_include', 'get_include') <ide> <ide>
1
PHP
PHP
apply fixes from styleci
8cbdf8fa668898c98b750c394e555c2db682b2b2
<ide><path>tests/Integration/Foundation/Fixtures/EventDiscovery/UnionListeners/UnionListener.php <ide> <ide> class UnionListener <ide> { <del> public function handle(EventOne|EventTwo $event) <add> public function handle(EventOne | EventTwo $event) <ide> { <ide> // <ide> }
1
Ruby
Ruby
fix strategy error
a920f74671d0a9acf5bc50939333dbce98468b0c
<ide><path>Library/Homebrew/dev-cmd/pr-pull.rb <ide> def pr_pull <ide> next <ide> end <ide> <del> GitHub.fetch_artifact(user, repo, pr, dir, workflow_id: workflow, artifact_name: artifact) <add> GitHub.fetch_artifact(user, repo, pr, dir, workflow_id: workflow, <add> artifact_name: artifact, <add> strategy: CurlNoResumeDownloadStrategy) <ide> <ide> if Homebrew.args.dry_run? <ide> puts "brew bottle --merge --write #{Dir["*.json"].join " "}" <ide> def pr_pull <ide> if Homebrew.args.dry_run? <ide> puts "Upload bottles described by these JSON files to Bintray:\n #{Dir["*.json"].join("\n ")}" <ide> else <del> bintray.upload_bottle_json Dir["*.json"], <del> publish_package: !args.no_publish?, <del> strategy: CurlNoResumeDownloadStrategy <add> bintray.upload_bottle_json Dir["*.json"], publish_package: !args.no_publish? <ide> end <ide> end <ide> end
1
Ruby
Ruby
follow the pattern more closely
769cab7e4f936c8ebb04852b1f6202479a34710f
<ide><path>Library/Homebrew/extend/os/dependency_collector.rb <add>require "dependency_collector" <add> <ide> if OS.mac? <ide> require "extend/os/mac/dependency_collector" <ide> elsif OS.linux? <ide><path>Library/Homebrew/extend/os/linux/dependency_collector.rb <del>def ant_dep(spec, tags) <del> Dependency.new(spec.to_s, tags) <add>class DependencyCollector <add> def ant_dep(spec, tags) <add> Dependency.new(spec.to_s, tags) <add> end <ide> end <ide><path>Library/Homebrew/extend/os/mac/dependency_collector.rb <del>def ant_dep(spec, tags) <del> if MacOS.version >= :mavericks <del> Dependency.new(spec.to_s, tags) <add>class DependencyCollector <add> def ant_dep(spec, tags) <add> if MacOS.version >= :mavericks <add> Dependency.new(spec.to_s, tags) <add> end <ide> end <ide> end
3
Javascript
Javascript
increase http2 coverage
340b3be1df9e8ade94c4767c7ff6d7c6d3003ea6
<ide><path>test/parallel/test-http2-getpackedsettings.js <ide> assert.doesNotThrow(() => http2.getPackedSettings({ enablePush: false })); <ide> assert.deepStrictEqual(packed, check); <ide> } <ide> <add>// check for not passing settings <add>{ <add> const packed = http2.getPackedSettings(); <add> assert.strictEqual(packed.length, 0); <add>} <add> <ide> { <ide> const packed = Buffer.from([ <ide> 0x00, 0x01, 0x00, 0x00, 0x00, 0x64, 0x00, 0x03, 0x00, 0x00, <ide> assert.doesNotThrow(() => http2.getPackedSettings({ enablePush: false })); <ide> assert.strictEqual(settings.enablePush, true); <ide> } <ide> <add>//check for what happens if passing {validate: true} and no errors happen <add>{ <add> const packed = Buffer.from([ <add> 0x00, 0x01, 0x00, 0x00, 0x00, 0x64, 0x00, 0x03, 0x00, 0x00, <add> 0x00, 0xc8, 0x00, 0x05, 0x00, 0x00, 0x4e, 0x20, 0x00, 0x04, <add> 0x00, 0x00, 0x00, 0x64, 0x00, 0x06, 0x00, 0x00, 0x00, 0x64, <add> 0x00, 0x02, 0x00, 0x00, 0x00, 0x01]); <add> <add> assert.doesNotThrow(() => { <add> http2.getUnpackedSettings(packed, { validate: true }); <add> }); <add>} <add> <add>// check for maxFrameSize failing the max number <add>{ <add> const packed = Buffer.from([0x00, 0x05, 0x01, 0x00, 0x00, 0x00]); <add> <add> assert.throws(() => { <add> http2.getUnpackedSettings(packed, { validate: true }); <add> }, common.expectsError({ <add> code: 'ERR_HTTP2_INVALID_SETTING_VALUE', <add> type: RangeError, <add> message: 'Invalid value for setting "maxFrameSize": 16777216' <add> })); <add>} <add> <add>// check for maxConcurrentStreams failing the max number <ide> { <ide> const packed = Buffer.from([0x00, 0x03, 0xFF, 0xFF, 0xFF, 0xFF]); <ide>
1
Python
Python
fix use of mems in transformer-xl
812def00c9497513f5da4f51c795465b3cacd887
<ide><path>src/transformers/modeling_transfo_xl.py <ide> def get_output_embeddings(self): <ide> return self.crit.out_layers[-1] <ide> <ide> def prepare_inputs_for_generation(self, input_ids, past, **model_kwargs): <del> inputs = {"input_ids": input_ids} <add> inputs = {} <ide> <ide> # if past is defined in model kwargs then use it for faster decoding <ide> if past: <ide> inputs["mems"] = past <add> inputs["input_ids"] = input_ids[:, -1].unsqueeze(-1) <add> else: <add> inputs["input_ids"] = input_ids <ide> <ide> return inputs <ide>
1
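The patch above changes `prepare_inputs_for_generation` so that, when cached memories (`mems`) are passed, only the most recent token is fed to the model. A framework-free sketch of that logic (plain nested lists stand in for torch tensors, so this is illustrative only, not the actual Transformers code):

```python
def prepare_inputs_for_generation(input_ids, past=None):
    """Sketch of the fixed logic: with cached "mems", only the last
    generated token per sequence is fed (shape [batch, 1]); without
    them, the full sequence is used."""
    inputs = {}
    if past:
        inputs["mems"] = past
        # Equivalent of input_ids[:, -1].unsqueeze(-1) for list-of-lists
        inputs["input_ids"] = [[row[-1]] for row in input_ids]
    else:
        inputs["input_ids"] = input_ids
    return inputs
```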
Python
Python
add copyto example
cf2b314a4c5fb7bb05c1d265d2ba3e22bd859203
<ide><path>numpy/core/multiarray.py <ide> def copyto(dst, src, casting=None, where=None): <ide> A boolean array which is broadcasted to match the dimensions <ide> of `dst`, and selects elements to copy from `src` to `dst` <ide> wherever it contains the value True. <add> <add> Examples <add> -------- <add> >>> A = np.array([4, 5, 6]) <add> >>> B = [1, 2, 3] <add> >>> np.copyto(A, B) <add> >>> A <add> array([1, 2, 3]) <add> <add> >>> A = np.array([[1, 2, 3], [4, 5, 6]]) <add> >>> B = [[4, 5, 6], [7, 8, 9]] <add> >>> np.copyto(A, B) <add> >>> A <add> array([[4, 5, 6], <add> [7, 8, 9]]) <add> <ide> """ <ide> return (dst, src, where) <ide>
1
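The docstring examples added above cover plain copies; the `where` parameter described in the parameter list can be exercised the same way. A hedged follow-up sketch (assumes NumPy is importable; this is not part of the committed docstring):

```python
import numpy as np

# Selective copy: `where` masks which destination elements get overwritten.
A = np.array([1, 2, 3, 4])
B = np.array([10, 20, 30, 40])
np.copyto(A, B, where=(A % 2 == 0))  # copy only where A holds an even value
# A is now [1, 20, 3, 40]; odd positions were left untouched
```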
Javascript
Javascript
throw tojson errors when formatting %j
455e6f1dd88dc43b9e6d95fadb24c2cad3798ac7
<ide><path>lib/util.js <ide> const inspectDefaultOptions = Object.seal({ <ide> breakLength: 60 <ide> }); <ide> <add>const CIRCULAR_ERROR_MESSAGE = 'Converting circular structure to JSON'; <add> <ide> var Debug; <ide> <ide> function tryStringify(arg) { <ide> try { <ide> return JSON.stringify(arg); <del> } catch (_) { <del> return '[Circular]'; <add> } catch (err) { <add> if (err.name === 'TypeError' && err.message === CIRCULAR_ERROR_MESSAGE) <add> return '[Circular]'; <add> throw err; <ide> } <ide> } <ide> <ide><path>test/parallel/test-util-format.js <ide> assert.strictEqual(util.format('o: %j, a: %j'), 'o: %j, a: %j'); <ide> assert.strictEqual(util.format('%j', o), '[Circular]'); <ide> } <ide> <add>{ <add> const o = { <add> toJSON() { <add> throw new Error('Not a circular object but still not serializable'); <add> } <add> }; <add> assert.throws(() => util.format('%j', o), <add> /^Error: Not a circular object but still not serializable$/); <add>} <add> <ide> // Errors <ide> const err = new Error('foo'); <ide> assert.strictEqual(util.format(err), err.stack);
2
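The commit narrows the `catch` so that only a genuine circular-structure error maps to `'[Circular]'`, while an error thrown from `toJSON` propagates. The same guard pattern in Python, keyed on `json.dumps`'s circular-reference `ValueError` (the message text checked here is an assumption about CPython's `json` module, analogous to the `CIRCULAR_ERROR_MESSAGE` constant above):

```python
import json

CIRCULAR_ERROR_MESSAGE = "Circular reference detected"

def try_stringify(value):
    """Return the JSON form of value, mapping only circular-structure
    failures to the '[Circular]' placeholder; anything else re-raises."""
    try:
        return json.dumps(value)
    except ValueError as err:
        if CIRCULAR_ERROR_MESSAGE in str(err):
            return "[Circular]"
        raise
```

Non-serializable values such as sets raise `TypeError`, which deliberately escapes the handler, mirroring the Node change.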
Go
Go
use strings.index instead of strings.split
5702a89db6ce055c243a4197ba80738028aa5792
<ide><path>container/container.go <ide> func (container *Container) CreateDaemonEnvironment(tty bool, linkedEnv []string <ide> if os == "" { <ide> os = runtime.GOOS <ide> } <del> env := []string{} <add> <add> // Figure out what size slice we need so we can allocate this all at once. <add> envSize := len(container.Config.Env) <ide> if runtime.GOOS != "windows" || (runtime.GOOS == "windows" && os == "linux") { <del> env = []string{ <del> "PATH=" + system.DefaultPathEnv(os), <del> "HOSTNAME=" + container.Config.Hostname, <del> } <add> envSize += 2 + len(linkedEnv) <add> } <add> if tty { <add> envSize++ <add> } <add> <add> env := make([]string, 0, envSize) <add> if runtime.GOOS != "windows" || (runtime.GOOS == "windows" && os == "linux") { <add> env = append(env, "PATH="+system.DefaultPathEnv(os)) <add> env = append(env, "HOSTNAME="+container.Config.Hostname) <ide> if tty { <ide> env = append(env, "TERM=xterm") <ide> } <ide><path>container/env.go <ide> import ( <ide> func ReplaceOrAppendEnvValues(defaults, overrides []string) []string { <ide> cache := make(map[string]int, len(defaults)) <ide> for i, e := range defaults { <del> parts := strings.SplitN(e, "=", 2) <del> cache[parts[0]] = i <add> index := strings.Index(e, "=") <add> cache[e[:index]] = i <ide> } <ide> <ide> for _, value := range overrides { <ide> // Values w/o = means they want this env to be removed/unset. 
<del> if !strings.Contains(value, "=") { <add> index := strings.Index(value, "=") <add> if index < 0 { <add> // no "=" in value <ide> if i, exists := cache[value]; exists { <ide> defaults[i] = "" // Used to indicate it should be removed <ide> } <ide> continue <ide> } <ide> <del> // Just do a normal set/update <del> parts := strings.SplitN(value, "=", 2) <del> if i, exists := cache[parts[0]]; exists { <add> if i, exists := cache[value[:index]]; exists { <ide> defaults[i] = value <ide> } else { <ide> defaults = append(defaults, value) <ide><path>container/env_test.go <ide> package container // import "github.com/docker/docker/container" <ide> <del>import "testing" <add>import ( <add> "crypto/rand" <add> "testing" <add> <add> "gotest.tools/v3/assert" <add>) <ide> <ide> func TestReplaceAndAppendEnvVars(t *testing.T) { <ide> var ( <ide> func TestReplaceAndAppendEnvVars(t *testing.T) { <ide> t.Fatalf("expected TERM=xterm got '%s'", env[1]) <ide> } <ide> } <add> <add>func BenchmarkReplaceOrAppendEnvValues(b *testing.B) { <add> b.Run("0", func(b *testing.B) { <add> benchmarkReplaceOrAppendEnvValues(b, 0) <add> }) <add> b.Run("100", func(b *testing.B) { <add> benchmarkReplaceOrAppendEnvValues(b, 100) <add> }) <add> b.Run("1000", func(b *testing.B) { <add> benchmarkReplaceOrAppendEnvValues(b, 1000) <add> }) <add> b.Run("10000", func(b *testing.B) { <add> benchmarkReplaceOrAppendEnvValues(b, 10000) <add> }) <add>} <add> <add>func benchmarkReplaceOrAppendEnvValues(b *testing.B, extraEnv int) { <add> b.StopTimer() <add> // remove FOO from env <add> // remove BAR from env (nop) <add> o := []string{"HOME=/root", "TERM=xterm", "FOO", "BAR"} <add> <add> if extraEnv > 0 { <add> buf := make([]byte, 5) <add> for i := 0; i < extraEnv; i++ { <add> n, err := rand.Read(buf) <add> assert.NilError(b, err) <add> key := string(buf[:n]) <add> <add> n, err = rand.Read(buf) <add> assert.NilError(b, err) <add> val := string(buf[:n]) <add> <add> o = append(o, key+"="+val) <add> } <add> } <add> }
<add> d := make([]string, 0, len(o)+2) <add> d = append(d, []string{"HOME=/", "FOO=foo_default"}...) <add> <add> b.StartTimer() <add> for i := 0; i < b.N; i++ { <add> _ = ReplaceOrAppendEnvValues(d, o) <add> } <add>}
3
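The Go change above swaps `strings.SplitN` for a single `strings.Index` so the key is sliced out without allocating a two-element slice per entry. The merge semantics can be sketched in Python with `str.partition` (filtering removed entries at the end is a simplification of the `""` markers the Go code leaves in place):

```python
def replace_or_append_env_values(defaults, overrides):
    """Merge KEY=VALUE overrides into defaults; a bare KEY with no '='
    requests removal of that key from the defaults."""
    cache = {}
    for i, entry in enumerate(defaults):
        key, _, _ = entry.partition("=")  # one scan, no per-entry list
        cache[key] = i
    merged = list(defaults)
    for value in overrides:
        key, sep, _ = value.partition("=")
        if not sep:  # no "=" in value: mark the matching default removed
            if key in cache:
                merged[cache[key]] = ""
            continue
        if key in cache:
            merged[cache[key]] = value  # replace existing entry in place
        else:
            merged.append(value)
    return [e for e in merged if e]
```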
PHP
PHP
remove session.auto_start configuration
faa2cbd3c3fc1bbf83064727847789123110b8e3
<ide><path>lib/Cake/Model/Datasource/CakeSession.php <ide> protected static function _defaultConfig($name) { <ide> 'session.serialize_handler' => 'php', <ide> 'session.use_cookies' => 1, <ide> 'session.cookie_path' => self::$path, <del> 'session.auto_start' => 0, <ide> 'session.save_path' => TMP . 'sessions', <ide> 'session.save_handler' => 'files' <ide> ) <ide> protected static function _defaultConfig($name) { <ide> 'ini' => array( <ide> 'session.use_trans_sid' => 0, <ide> 'url_rewriter.tags' => '', <del> 'session.auto_start' => 0, <ide> 'session.use_cookies' => 1, <ide> 'session.cookie_path' => self::$path, <ide> 'session.save_handler' => 'user', <ide> protected static function _defaultConfig($name) { <ide> 'ini' => array( <ide> 'session.use_trans_sid' => 0, <ide> 'url_rewriter.tags' => '', <del> 'session.auto_start' => 0, <ide> 'session.use_cookies' => 1, <ide> 'session.cookie_path' => self::$path, <ide> 'session.save_handler' => 'user',
1
PHP
PHP
fix doc block
b88df5d77d490bf76befe022b67423358b16de62
<ide><path>src/Illuminate/Events/Dispatcher.php <ide> protected function addInterfaceListeners($eventName, array $listeners = []) <ide> /** <ide> * Register an event listener with the dispatcher. <ide> * <del> * @param \Closure|string $listener <add> * @param \Closure|string|array $listener <ide> * @param bool $wildcard <ide> * @return \Closure <ide> */
1
Ruby
Ruby
fix initialization process of the query tags
18cd634e07be1ebea1d877b470d4076ef6c550a9
<ide><path>actionpack/lib/action_controller/railtie.rb <ide> class Railtie < Rails::Railtie # :nodoc: <ide> end <ide> <ide> initializer "action_controller.query_log_tags" do |app| <del> ActiveSupport.on_load(:active_record) do <del> if app.config.active_record.query_log_tags_enabled && app.config.action_controller.log_query_tags_around_actions != false <add> if app.config.active_record.query_log_tags_enabled && app.config.action_controller.log_query_tags_around_actions != false <add> app.config.active_record.query_log_tags += [:controller, :action] <add> <add> ActiveSupport.on_load(:action_controller) do <add> include ActionController::QueryTags <add> end <add> <add> ActiveSupport.on_load(:active_record) do <ide> ActiveRecord::QueryLogs.taggings.merge!( <ide> controller: -> { context[:controller]&.controller_name }, <ide> action: -> { context[:controller]&.action_name }, <ide> namespaced_controller: -> { context[:controller]&.class&.name } <ide> ) <del> <del> ActiveRecord::QueryLogs.tags + [:controller, :action] <del> <del> ActiveSupport.on_load(:action_controller) do <del> include ActionController::QueryTags <del> end <ide> end <ide> end <ide> end <ide><path>activejob/lib/active_job/railtie.rb <ide> class Railtie < Rails::Railtie # :nodoc: <ide> end <ide> <ide> initializer "active_job.query_log_tags" do |app| <add> if app.config.active_record.query_log_tags_enabled && app.config.active_job.log_query_tags_around_perform != false <add> app.config.active_record.query_log_tags << :job <add> <add> ActiveSupport.on_load(:active_job) do <add> include ActiveJob::QueryTags <add> end <add> <ide> ActiveSupport.on_load(:active_record) do <del> if app.config.active_record.query_log_tags_enabled && app.config.active_job.log_query_tags_around_perform != false <ide> ActiveRecord::QueryLogs.taggings[:job] = -> { context[:job]&.class&.name } <del> ActiveRecord::QueryLogs.tags << :job <del> <del> ActiveSupport.on_load(:active_job) do <del> include ActiveJob::QueryTags <del> end 
<ide> end <ide> end <ide> end
2
Javascript
Javascript
inline expectations in tests as they are used once
58c9fda94655327a35e239a2770031f93b8e9948
<ide><path>src/isomorphic/classic/element/__tests__/ReactElement-test.js <ide> describe('ReactElement', function() { <ide> expect(element.type).toBe(ComponentClass); <ide> expect(element.key).toBe(null); <ide> expect(element.ref).toBe(null); <del> var expectation = {}; <ide> expect(Object.isFrozen(element)).toBe(true); <ide> expect(Object.isFrozen(element.props)).toBe(true); <del> expect(element.props).toEqual(expectation); <add> expect(element.props).toEqual({}); <ide> }); <ide> <ide> it('should warn when `key` is being accessed on createClass element', function() { <ide> describe('ReactElement', function() { <ide> expect(element.type).toBe('div'); <ide> expect(element.key).toBe(null); <ide> expect(element.ref).toBe(null); <del> var expectation = {}; <ide> expect(Object.isFrozen(element)).toBe(true); <ide> expect(Object.isFrozen(element.props)).toBe(true); <del> expect(element.props).toEqual(expectation); <add> expect(element.props).toEqual({}); <ide> }); <ide> <ide> it('returns an immutable element', function() { <ide> describe('ReactElement', function() { <ide> expect(element.type).toBe(ComponentClass); <ide> expect(element.key).toBe('12'); <ide> expect(element.ref).toBe('34'); <del> var expectation = {foo: '56'}; <ide> expect(Object.isFrozen(element)).toBe(true); <ide> expect(Object.isFrozen(element.props)).toBe(true); <del> expect(element.props).toEqual(expectation); <add> expect(element.props).toEqual({foo: '56'}); <ide> }); <ide> <ide> it('extracts null key and ref', function() { <ide> describe('ReactElement', function() { <ide> expect(element.type).toBe(ComponentClass); <ide> expect(element.key).toBe('null'); <ide> expect(element.ref).toBe(null); <del> var expectation = {foo: '12'}; <ide> expect(Object.isFrozen(element)).toBe(true); <ide> expect(Object.isFrozen(element.props)).toBe(true); <del> expect(element.props).toEqual(expectation); <add> expect(element.props).toEqual({foo: '12'}); <ide> }); <ide> <ide> it('ignores undefined key and ref', function() { 
<ide> describe('ReactElement', function() { <ide> expect(element.type).toBe(ComponentClass); <ide> expect(element.key).toBe(null); <ide> expect(element.ref).toBe(null); <del> var expectation = {foo: '56'}; <ide> expect(Object.isFrozen(element)).toBe(true); <ide> expect(Object.isFrozen(element.props)).toBe(true); <del> expect(element.props).toEqual(expectation); <add> expect(element.props).toEqual({foo: '56'}); <ide> }); <ide> <ide> it('ignores key and ref warning getters', function() { <ide> describe('ReactElement', function() { <ide> expect(element.type).toBe(ComponentClass); <ide> expect(element.key).toBe('12'); <ide> expect(element.ref).toBe(null); <del> var expectation = {foo: '56'}; <ide> expect(Object.isFrozen(element)).toBe(true); <ide> expect(Object.isFrozen(element.props)).toBe(true); <del> expect(element.props).toEqual(expectation); <add> expect(element.props).toEqual({foo: '56'}); <ide> }); <ide> <ide> it('preserves the owner on the element', function() { <ide><path>src/isomorphic/classic/element/__tests__/ReactElementClone-test.js <ide> describe('ReactElementClone', function() { <ide> expect(clone.type).toBe(ComponentClass); <ide> expect(clone.key).toBe('12'); <ide> expect(clone.ref).toBe('34'); <del> var expectation = {foo: 'ef'}; <ide> expect(Object.isFrozen(element)).toBe(true); <ide> expect(Object.isFrozen(element.props)).toBe(true); <del> expect(clone.props).toEqual(expectation); <add> expect(clone.props).toEqual({foo: 'ef'}); <ide> }); <ide> <ide> it('should extract null key and ref', function() { <ide> describe('ReactElementClone', function() { <ide> expect(clone.type).toBe(ComponentClass); <ide> expect(clone.key).toBe('null'); <ide> expect(clone.ref).toBe(null); <del> var expectation = {foo: 'ef'}; <ide> expect(Object.isFrozen(element)).toBe(true); <ide> expect(Object.isFrozen(element.props)).toBe(true); <del> expect(clone.props).toEqual(expectation); <add> expect(clone.props).toEqual({foo: 'ef'}); <ide> }); <ide> <ide> });
2
Java
Java
update copyright header for xstreammarshaller
c85b6116009ba41202ce6100aa23efc336be1120
<ide><path>spring-oxm/src/main/java/org/springframework/oxm/xstream/XStreamMarshaller.java <ide> /* <del> * Copyright 2002-2011 the original author or authors. <add> * Copyright 2002-2012 the original author or authors. <ide> * <ide> * Licensed under the Apache License, Version 2.0 (the "License"); <ide> * you may not use this file except in compliance with the License.
1
Text
Text
fix some links
f88178906bd0e9678f70e8f95522192adc5b3872
<ide><path>doc/api/assert.md <ide> Primitive values are compared with the [Abstract Equality Comparison][] <ide> Only [enumerable "own" properties][] are considered. The <ide> [`assert.deepEqual()`][] implementation does not test the <ide> [`[[Prototype]]`][prototype-spec] of objects or enumerable own [`Symbol`][] <del>properties. For such checks, consider using [assert.deepStrictEqual()][] <add>properties. For such checks, consider using [`assert.deepStrictEqual()`][] <ide> instead. [`assert.deepEqual()`][] can have potentially surprising results. The <ide> following example does not throw an `AssertionError` because the properties on <del>the [RegExp][] object are not enumerable: <add>the [`RegExp`][] object are not enumerable: <ide> <ide> ```js <ide> // WARNING: This does not throw an AssertionError! <ide><path>doc/api/async_hooks.md <ide> respective asynchronous event during a resource's lifetime. <ide> All callbacks are optional. So, for example, if only resource cleanup needs to <ide> be tracked then only the `destroy` callback needs to be passed. The <ide> specifics of all functions that can be passed to `callbacks` is in the section <del>[`Hook Callbacks`][]. <add>[Hook Callbacks][]. <ide> <ide> ```js <ide> const async_hooks = require('async_hooks'); <ide> constructor. 
<ide> [`after` callback]: #async_hooks_after_asyncid <ide> [`before` callback]: #async_hooks_before_asyncid <ide> [`destroy` callback]: #async_hooks_before_asyncid <del>[`Hook Callbacks`]: #async_hooks_hook_callbacks <ide> [`init` callback]: #async_hooks_init_asyncid_type_triggerasyncid_resource <add>[Hook Callbacks]: #async_hooks_hook_callbacks <ide><path>doc/api/http2.md <ide> added: v8.5.0 <ide> * `request` {http2.Http2ServerRequest} <ide> * `response` {http2.Http2ServerResponse} <ide> <del>If a [`'request'`][] listener is registered or [`'http2.createServer()'`][] is <add>If a [`'request'`][] listener is registered or [`http2.createServer()`][] is <ide> supplied a callback function, the `'checkContinue'` event is emitted each time <ide> a request with an HTTP `Expect: 100-continue` is received. If this event is <ide> not listened for, the server will automatically respond with a status <ide> added: v8.5.0 <ide> * `request` {http2.Http2ServerRequest} <ide> * `response` {http2.Http2ServerResponse} <ide> <del>If a [`'request'`][] listener is registered or [`'http2.createSecureServer()'`][] <add>If a [`'request'`][] listener is registered or [`http2.createSecureServer()`][] <ide> is supplied a callback function, the `'checkContinue'` event is emitted each <ide> time a request with an HTTP `Expect: 100-continue` is received. If this event <ide> is not listened for, the server will automatically respond with a status <ide> set on `request.stream`. <ide> `setTimeout` method will be called on `request.stream.session`. <ide> <ide> `pause`, `read`, `resume`, and `write` will throw an error with code <del>`ERR_HTTP2_NO_SOCKET_MANIPULATION`. See [`Http2Session and Sockets`][] for <add>`ERR_HTTP2_NO_SOCKET_MANIPULATION`. See [Http2Session and Sockets][] for <ide> more information. <ide> <ide> All other interactions will be routed directly to the socket. With TLS support, <ide> set on `response.stream`. <ide> `setTimeout` method will be called on `response.stream.session`. 
<ide> <ide> `pause`, `read`, `resume`, and `write` will throw an error with code <del>`ERR_HTTP2_NO_SOCKET_MANIPULATION`. See [`Http2Session and Sockets`][] for <add>`ERR_HTTP2_NO_SOCKET_MANIPULATION`. See [Http2Session and Sockets][] for <ide> more information. <ide> <ide> All other interactions will be routed directly to the socket. <ide><path>doc/api/n-api.md <ide> NAPI_EXTERN napi_status napi_run_script(napi_env env, <ide> [Working with JavaScript Values]: #n_api_working_with_javascript_values <ide> [Working with JavaScript Values - Abstract Operations]: #n_api_working_with_javascript_values_abstract_operations <ide> <add>[`napi_async_init`]: #n_api_napi_async_init <ide> [`napi_cancel_async_work`]: #n_api_napi_cancel_async_work <ide> [`napi_close_escapable_handle_scope`]: #n_api_napi_close_escapable_handle_scope <ide> [`napi_close_handle_scope`]: #n_api_napi_close_handle_scope
4
Python
Python
fix barthez tokenizer
245cdb469d2a7f47316926fdbac925e0ed149332
<ide><path>src/transformers/models/auto/tokenization_auto.py <ide> (AlbertConfig, (AlbertTokenizer, AlbertTokenizerFast)), <ide> (CamembertConfig, (CamembertTokenizer, CamembertTokenizerFast)), <ide> (PegasusConfig, (PegasusTokenizer, PegasusTokenizerFast)), <del> (MBartConfig, (BarthezTokenizer, BarthezTokenizerFast)), <ide> (MBartConfig, (MBartTokenizer, MBartTokenizerFast)), <ide> (XLMRobertaConfig, (XLMRobertaTokenizer, XLMRobertaTokenizerFast)), <ide> (MarianConfig, (MarianTokenizer, None)), <ide> HerbertTokenizer, <ide> HerbertTokenizerFast, <ide> PhobertTokenizer, <add> BarthezTokenizer, <ide> ] <ide> <ide>
1
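The removed line mattered because the dispatch table is scanned in order and `MBartConfig` appeared twice, so the earlier Barthez row shadowed the intended MBart one. A minimal illustration of that first-match shadowing (the names are illustrative, not the real Transformers registry):

```python
# Two rows keyed by the same config: first-match lookup hits the wrong one.
TOKENIZER_MAPPING = [
    ("MBartConfig", "BarthezTokenizer"),  # the buggy duplicate row
    ("MBartConfig", "MBartTokenizer"),
]

def tokenizer_for(config_name, mapping):
    """Return the first tokenizer registered for config_name."""
    for config, tokenizer in mapping:
        if config == config_name:
            return tokenizer
    return None
```

Dropping the duplicate row, as the patch does, lets the lookup resolve to the MBart tokenizer again.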
Ruby
Ruby
convert `digest` to only use kwargs
e4c3225b2c9475ebfac474c7196dec2feb290e5a
<ide><path>actionview/lib/action_view/digestor.rb <ide> class << self <ide> # * <tt>finder</tt> - An instance of <tt>ActionView::LookupContext</tt> <ide> # * <tt>dependencies</tt> - An array of dependent views <ide> # * <tt>partial</tt> - Specifies whether the template is a partial <del> def digest(name:, finder:, **options) <del> options.assert_valid_keys(:dependencies, :partial) <del> <del> dependencies = Array.wrap(options[:dependencies]) <add> def digest(name:, finder:, dependencies: []) <add> dependencies ||= [] <ide> cache_key = ([ name, finder.details_key.hash ].compact + dependencies).join('.') <ide> <ide> # this is a correctly done double-checked locking idiom
1
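The Ruby change replaces a catch-all `**options` hash plus `assert_valid_keys` with a declared keyword, letting the language itself reject unknown keys. The same move in Python uses keyword-only arguments (the body below is a stand-in, not Rails' real digest computation):

```python
def digest(*, name, finder, dependencies=None):
    """Keyword-only signature: an unknown keyword now raises TypeError
    automatically instead of needing manual key validation."""
    dependencies = dependencies or []
    # Stand-in cache key; the real method hashes template details.
    return ".".join([name, str(hash(finder))] + list(dependencies))
```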
Ruby
Ruby
add set union to options
b2ccbfe6af2adaf7cc4f0e2af691f428704d5658
<ide><path>Library/Homebrew/options.rb <ide> def &(o) <ide> Options.new(@options & o) <ide> end <ide> <add> def |(o) <add> Options.new(@options | o) <add> end <add> <ide> def *(arg) <ide> @options.to_a * arg <ide> end <ide><path>Library/Homebrew/test/test_options.rb <ide> def test_intersection <ide> assert_equal [foo], (@options & options).to_a <ide> end <ide> <add> def test_set_union <add> foo, bar, baz = %w{foo bar baz}.map { |o| Option.new(o) } <add> options = Options.new << foo << bar <add> @options << foo << baz <add> assert_equal [foo, bar, baz].sort, (@options | options).to_a.sort <add> end <add> <ide> def test_coerce_with_options <ide> assert_same @options, Options.coerce(@options) <ide> end
2
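Homebrew's `Options` gains `|` to mirror its existing `&`. Operator overloading for a set-backed wrapper looks much the same in Python (a toy stand-in, not the Homebrew class):

```python
class Options:
    """Thin wrapper exposing set algebra over a collection of options."""
    def __init__(self, options=()):
        self._options = set(options)

    def __and__(self, other):   # intersection, like Ruby's &
        return Options(self._options & other._options)

    def __or__(self, other):    # the newly added set union
        return Options(self._options | other._options)

    def to_a(self):
        return sorted(self._options)
```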
Javascript
Javascript
avoid linear scan in _unrefactive
e5bb66886bfa40818de7a96e982c5964eef9eb78
<ide><path>lib/timers.js <ide> function unrefTimeout() { <ide> <ide> debug('unrefTimer fired'); <ide> <del> var diff, domain, first, threw; <del> while (first = L.peek(unrefList)) { <del> diff = now - first._idleStart; <add> var timeSinceLastActive; <add> var nextTimeoutTime; <add> var nextTimeoutDuration; <add> var minNextTimeoutTime; <add> var itemToDelete; <add> <add> // The actual timer fired and has not yet been rearmed, <add> // let's consider its next firing time is invalid for now. <add> // It may be set to a relevant time in the future once <add> // we scanned through the whole list of timeouts and if <add> // we find a timeout that needs to expire. <add> unrefTimer.when = -1; <ide> <del> if (diff < first._idleTimeout) { <del> diff = first._idleTimeout - diff; <del> unrefTimer.start(diff, 0); <del> unrefTimer.when = now + diff; <del> debug('unrefTimer rescheudling for later'); <del> return; <add> // Iterate over the list of timeouts, <add> // call the onTimeout callback for those expired, <add> // and rearm the actual timer if the next timeout to expire <add> // will expire before the current actual timer. <add> var cur = unrefList._idlePrev; <add> while (cur != unrefList) { <add> timeSinceLastActive = now - cur._idleStart; <add> <add> if (timeSinceLastActive < cur._idleTimeout) { <add> // This timer hasn't expired yet, but check if its expiring time is <add> // earlier than the actual timer's expiring time <add> <add> nextTimeoutDuration = cur._idleTimeout - timeSinceLastActive; <add> nextTimeoutTime = now + nextTimeoutDuration; <add> if (minNextTimeoutTime == null || <add> (nextTimeoutTime < minNextTimeoutTime)) { <add> // We found a timeout that will expire earlier, <add> // store its next timeout time now so that we <add> // can rearm the actual timer accordingly when <add> // we scanned through the whole list. 
<add> minNextTimeoutTime = nextTimeoutTime; <add> } <add> <add> // This timer hasn't expired yet, skipping <add> cur = cur._idlePrev; <add> continue; <ide> } <ide> <del> L.remove(first); <add> // We found a timer that expired <add> var domain = cur.domain; <ide> <del> domain = first.domain; <add> if (!cur._onTimeout) continue; <ide> <del> if (!first._onTimeout) continue; <del> if (domain && domain._disposed) continue; <add> if (domain && domain._disposed) <add> continue; <ide> <ide> try { <add> var threw = true; <add> <ide> if (domain) domain.enter(); <del> threw = true; <add> <add> itemToDelete = cur; <add> // Move to the previous item before calling the _onTimeout callback, <add> // as it can mutate the list. <add> cur = cur._idlePrev; <add> <add> // Remove the timeout from the list because it expired. <add> L.remove(itemToDelete); <add> <ide> debug('unreftimer firing timeout'); <del> first._called = true; <del> first._onTimeout(); <add> itemToDelete._called = true; <add> itemToDelete._onTimeout(); <add> <ide> threw = false; <add> <ide> if (domain) <ide> domain.exit(); <ide> } finally { <ide> if (threw) process.nextTick(unrefTimeout); <ide> } <ide> } <ide> <del> debug('unrefList is empty'); <del> unrefTimer.when = -1; <add> // Rearm the actual timer with the timeout delay <add> // of the earliest timeout found. 
<add> if (minNextTimeoutTime != null) { <add> unrefTimer.start(minNextTimeoutTime - now, 0); <add> unrefTimer.when = minNextTimeoutTime; <add> debug('unrefTimer rescheduled'); <add> } else if (L.isEmpty(unrefList)) { <add> debug('unrefList is empty'); <add> } <ide> } <ide> <ide> <ide> exports._unrefActive = function(item) { <ide> var now = Timer.now(); <ide> item._idleStart = now; <ide> <del> if (L.isEmpty(unrefList)) { <del> debug('unrefList empty'); <del> L.append(unrefList, item); <add> var when = now + msecs; <ide> <add> // If the actual timer is set to fire too late, or not set to fire at all, <add> // we need to make it fire earlier <add> if (unrefTimer.when === -1 || unrefTimer.when > when) { <ide> unrefTimer.start(msecs, 0); <del> unrefTimer.when = now + msecs; <add> unrefTimer.when = when; <ide> debug('unrefTimer scheduled'); <del> return; <del> } <del> <del> var when = now + msecs; <del> <del> debug('unrefList find where we can insert'); <del> <del> var cur, them; <del> <del> for (cur = unrefList._idlePrev; cur != unrefList; cur = cur._idlePrev) { <del> them = cur._idleStart + cur._idleTimeout; <del> <del> if (when < them) { <del> debug('unrefList inserting into middle of list'); <del> <del> L.append(cur, item); <del> <del> if (unrefTimer.when > when) { <del> debug('unrefTimer is scheduled to fire too late, reschedule'); <del> unrefTimer.start(msecs, 0); <del> unrefTimer.when = when; <del> } <del> <del> return; <del> } <ide> } <ide> <ide> debug('unrefList append to end'); <ide><path>test/parallel/test-timers-unref-active.js <add>'use strict'; <add> <add>/* <add> * This test is aimed at making sure that unref timers queued with <add> * timers._unrefActive work correctly. <add> * <add> * Basically, it queues one timer in the unref queue, and then queues <add> * it again each time its timeout callback is fired until the callback <add> * has been called ten times. 
<add> * <add> * At that point, it unenrolls the unref timer so that its timeout callback <add> * is not fired ever again. <add> * <add> * Finally, a ref timeout is used with a delay large enough to make sure that <add> * all 10 timeouts had the time to expire. <add> */ <add> <add>const common = require('../common'); <add>const timers = require('timers'); <add>const assert = require('assert'); <add> <add>var someObject = {}; <add>var nbTimeouts = 0; <add> <add>/* <add> * libuv 0.10.x uses GetTickCount on Windows to implement timers, which uses <add> * system's timers whose resolution is between 10 and 16ms. See <add> * http://msdn.microsoft.com/en-us/library/windows/desktop/ms724408.aspx <add> * for more information. That's the lowest resolution for timers across all <add> * supported platforms. We're using it as the lowest common denominator, <add> * and thus expect 5 timers to be able to fire in under 100 ms. <add> */ <add>const N = 5; <add>const TEST_DURATION = 100; <add> <add>timers.unenroll(someObject); <add>timers.enroll(someObject, 1); <add> <add>someObject._onTimeout = function _onTimeout() { <add> ++nbTimeouts; <add> <add> if (nbTimeouts === N) timers.unenroll(someObject); <add> <add> timers._unrefActive(someObject); <add>}; <add> <add>timers._unrefActive(someObject); <add> <add>setTimeout(function() { <add> assert.equal(nbTimeouts, N); <add>}, TEST_DURATION);
2
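The rewritten `_unrefActive` no longer walks the sorted list on every insert: it appends in O(1) and, at most, rearms the single backing timer for an earlier deadline, deferring the expiry scan to the moment that timer fires. A compact sketch of that scheduling strategy (deadlines are plain numbers and `fire` is invoked manually, standing in for the real wall-clock timer):

```python
class LazyTimerList:
    """Many timeouts backed by one timer: inserts are cheap, and the
    scan for expired items happens only when the backing timer fires."""
    def __init__(self):
        self.items = []   # (deadline, callback) pairs, deliberately unsorted
        self.when = -1    # deadline the backing timer is armed for; -1 = idle

    def unref_active(self, deadline, callback):
        self.items.append((deadline, callback))
        # Rearm only if this deadline is earlier; no list traversal needed.
        if self.when == -1 or deadline < self.when:
            self.when = deadline

    def fire(self, now):
        """Run expired callbacks, then rearm for the earliest survivor."""
        remaining, next_when = [], -1
        for deadline, callback in self.items:
            if deadline <= now:
                callback()
            else:
                remaining.append((deadline, callback))
                if next_when == -1 or deadline < next_when:
                    next_when = deadline
        self.items, self.when = remaining, next_when
```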
Text
Text
add sca for guerda
5d067bcc5e480ed6b446e0e80ddf97b6a42cc80e
<ide><path>.github/contributors/guerda.md <add># spaCy contributor agreement <add> <add>This spaCy Contributor Agreement (**"SCA"**) is based on the <add>[Oracle Contributor Agreement](http://www.oracle.com/technetwork/oca-405177.pdf). <add>The SCA applies to any contribution that you make to any product or project <add>managed by us (the **"project"**), and sets out the intellectual property rights <add>you grant to us in the contributed materials. The term **"us"** shall mean <add>[ExplosionAI GmbH](https://explosion.ai/legal). The term <add>**"you"** shall mean the person or entity identified below. <add> <add>If you agree to be bound by these terms, fill in the information requested <add>below and include the filled-in version with your first pull request, under the <add>folder [`.github/contributors/`](/.github/contributors/). The name of the file <add>should be your GitHub username, with the extension `.md`. For example, the user <add>example_user would create the file `.github/contributors/example_user.md`. <add> <add>Read this agreement carefully before signing. These terms and conditions <add>constitute a binding legal agreement. <add> <add>## Contributor Agreement <add> <add>1. The term "contribution" or "contributed materials" means any source code, <add>object code, patch, tool, sample, graphic, specification, manual, <add>documentation, or any other material posted or submitted by you to the project. <add> <add>2. With respect to any worldwide copyrights, or copyright applications and <add>registrations, in your contribution: <add> <add> * you hereby assign to us joint ownership, and to the extent that such <add> assignment is or becomes invalid, ineffective or unenforceable, you hereby <add> grant to us a perpetual, irrevocable, non-exclusive, worldwide, no-charge, <add> royalty-free, unrestricted license to exercise all rights under those <add> copyrights. 
This includes, at our option, the right to sublicense these same <add> rights to third parties through multiple levels of sublicensees or other <add> licensing arrangements; <add> <add> * you agree that each of us can do all things in relation to your <add> contribution as if each of us were the sole owners, and if one of us makes <add> a derivative work of your contribution, the one who makes the derivative <add> work (or has it made will be the sole owner of that derivative work; <add> <add> * you agree that you will not assert any moral rights in your contribution <add> against us, our licensees or transferees; <add> <add> * you agree that we may register a copyright in your contribution and <add> exercise all ownership rights associated with it; and <add> <add> * you agree that neither of us has any duty to consult with, obtain the <add> consent of, pay or render an accounting to the other for any use or <add> distribution of your contribution. <add> <add>3. With respect to any patents you own, or that you can license without payment <add>to any third party, you hereby grant to us a perpetual, irrevocable, <add>non-exclusive, worldwide, no-charge, royalty-free license to: <add> <add> * make, have made, use, sell, offer to sell, import, and otherwise transfer <add> your contribution in whole or in part, alone or in combination with or <add> included in any product, work or materials arising out of the project to <add> which your contribution was submitted, and <add> <add> * at our option, to sublicense these same rights to third parties through <add> multiple levels of sublicensees or other licensing arrangements. <add> <add>4. Except as set out above, you keep all right, title, and interest in your <add>contribution. The rights that you grant to us under these terms are effective <add>on the date you first submitted a contribution to us, even if your submission <add>took place before the date you sign these terms. <add> <add>5. 
You covenant, represent, warrant and agree that: <add> <add> * Each contribution that you submit is and shall be an original work of <add> authorship and you can legally grant the rights set out in this SCA; <add> <add> * to the best of your knowledge, each contribution will not violate any <add> third party's copyrights, trademarks, patents, or other intellectual <add> property rights; and <add> <add> * each contribution shall be in compliance with U.S. export control laws and <add> other applicable export and import laws. You agree to notify us if you <add> become aware of any circumstance which would make any of the foregoing <add> representations inaccurate in any respect. We may publicly disclose your <add> participation in the project, including the fact that you have signed the SCA. <add> <add>6. This SCA is governed by the laws of the State of California and applicable <add>U.S. Federal law. Any choice of law rules will not apply. <add> <add>7. Please place an “x” on one of the applicable statement below. Please do NOT <add>mark both statements: <add> <add> * [ ] I am signing on behalf of myself as an individual and no other person <add> or entity, including my employer, has or will have rights with respect to my <add> contributions. <add> <add> * [ ] I am signing on behalf of my employer or a legal entity and I have the <add> actual authority to contractually bind that entity. <add> <add>## Contributor Details <add> <add>| Field | Entry | <add>|------------------------------- | -------------------- | <add>| Name | Philip Gillißen | <add>| Company name (if applicable) | | <add>| Title or role (if applicable) | | <add>| Date | 2020-03-24 | <add>| GitHub username | guerda | <add>| Website (optional) | |
1
Text
Text
fix typo in doc/api/packages.md
d5c3ed0b09f74e3d76fd844e6684ad9d54c87f36
<ide><path>doc/api/packages.md <ide> New conditions definitions may be added to this list by creating a PR to the <ide> [Node.js documentation for this section][]. The requirements for listing a <ide> new condition definition here are that: <ide> <del>* The definition should be clear and unambigious for all implementers. <add>* The definition should be clear and unambiguous for all implementers. <ide> * The use case for why the condition is needed should be clearly justified. <ide> * There should exist sufficient existing implementation usage. <ide> * The condition name should not conflict with another condition definition or
1
Go
Go
ensure workdir handled in platform semantics
6c56f917d3facec1438d03712f8a26d929fbe5ea
<ide><path>daemon/daemon.go <ide> func (daemon *Daemon) verifyContainerSettings(hostConfig *runconfig.HostConfig, <ide> <ide> // First perform verification of settings common across all platforms. <ide> if config != nil { <del> if config.WorkingDir != "" && !system.IsAbs(config.WorkingDir) { <del> return nil, fmt.Errorf("The working directory '%s' is invalid. It needs to be an absolute path.", config.WorkingDir) <add> if config.WorkingDir != "" { <add> config.WorkingDir = filepath.FromSlash(config.WorkingDir) // Ensure in platform semantics <add> if !system.IsAbs(config.WorkingDir) { <add> return nil, fmt.Errorf("The working directory '%s' is invalid. It needs to be an absolute path.", config.WorkingDir) <add> } <ide> } <ide> } <ide>
1
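The Go change above converts the configured working directory into platform path semantics (`filepath.FromSlash`) before checking that it is absolute. A rough Python sketch of the same check — the function name and error text are illustrative, not Docker's actual API:

```python
import os

def validate_working_dir(workdir):
    """Normalize a configured working directory into platform semantics,
    then require it to be absolute (a sketch of the daemon-side check)."""
    if not workdir:
        return workdir  # empty means "use the image default"; nothing to verify
    # counterpart of Go's filepath.FromSlash: forward slashes -> host separator
    workdir = workdir.replace("/", os.sep)
    if not os.path.isabs(workdir):
        raise ValueError(
            "The working directory '%s' is invalid. "
            "It needs to be an absolute path." % workdir)
    return workdir
```

Normalizing first matters on Windows, where `/foo/bar` is not absolute until it becomes `\foo\bar`.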
PHP
PHP
add a comment about the route
5049b298140846001963ab621e078b134b159d3a
<ide><path>src/Illuminate/Auth/Console/RemindersControllerCommand.php <ide> public function fire() <ide> $this->files->copy(__DIR__.'/stubs/controller.stub', $destination); <ide> <ide> $this->info('Password reminders controller created successfully!'); <add> <add> $this->comment("Route: Route::controller('password', 'RemindersController')"); <ide> } <ide> else <ide> {
1
Text
Text
add animation for heap sort
a170997eafc15733baa70a858600a47c34daacf2
<ide><path>README.md <ide> __Properties__ <ide> <ide> <ide> ### Heap <add>![alt text][heapsort-image] <ide> <ide> **Heapsort** is a _comparison-based_ sorting algorithm. It can be thought of as an improved selection sort. It divides its input into a sorted and an unsorted region, and it iteratively shrinks the unsorted region by extracting the largest element and moving that to the sorted region. <ide> <ide> where {\displaystyle \oplus } \oplus denotes the exclusive disjunction (XOR) op <ide> [quick-wiki]: https://en.wikipedia.org/wiki/Quicksort <ide> [quick-image]: https://upload.wikimedia.org/wikipedia/commons/6/6a/Sorting_quicksort_anim.gif "Quick Sort" <ide> <add>[heapsort-image]: https://upload.wikimedia.org/wikipedia/commons/4/4d/Heapsort-example.gif "Heap Sort" <ide> [heap-wiki]: https://en.wikipedia.org/wiki/Heapsort <ide> <ide> [radix-wiki]: https://en.wikipedia.org/wiki/Radix_sort
1
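The README text above describes heapsort's sorted and unsorted regions; as a concrete sketch (not code from that repository), a plain array-based max-heap version looks like this:

```python
def heapsort(items):
    """In-place heapsort: build a max-heap, then repeatedly swap the root
    (largest element) to the end of the shrinking unsorted region."""
    def sift_down(a, start, end):
        root = start
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            # pick the larger of the two children
            if child + 1 <= end and a[child] < a[child + 1]:
                child += 1
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                return

    n = len(items)
    for start in range(n // 2 - 1, -1, -1):  # heapify bottom-up
        sift_down(items, start, n - 1)
    for end in range(n - 1, 0, -1):          # extract the max repeatedly
        items[0], items[end] = items[end], items[0]
        sift_down(items, 0, end - 1)
    return items
```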
Javascript
Javascript
add more export tests for amp
20e4a5049e5b1f04107df0389946849f28fc657b
<ide><path>packages/next/export/index.js <ide> import { resolve, join } from 'path' <ide> import { existsSync, readFileSync } from 'fs' <ide> import loadConfig from 'next-server/next-config' <ide> import { tryAmp } from 'next-server/dist/server/require' <add>import { cleanAmpPath } from 'next-server/dist/server/utils' <ide> import { PHASE_EXPORT, SERVER_DIRECTORY, PAGES_MANIFEST, CONFIG_FILE, BUILD_ID_FILE, CLIENT_STATIC_FILES_PATH } from 'next-server/constants' <ide> import createProgress from 'tty-aware-progress' <ide> import { promisify } from 'util' <ide> export default async function (dir, options, configuration) { <ide> <ide> if (isAmp) { <ide> defaultPathMap[path].query = { amp: 1 } <del> if (!defaultPathMap[path.split('.amp')[0]]) { <add> const nonAmp = cleanAmpPath(path).replace(/\/$/, '') <add> if (!defaultPathMap[nonAmp]) { <ide> defaultPathMap[path].query.ampOnly = true <ide> } <ide> } else { <ide><path>packages/next/export/worker.js <ide> process.on( <ide> path = cleanAmpPath(path) <ide> } <ide> <add> // replace /docs/index.amp with /docs.amp <add> path = path.replace(/(?<!^)\/index\.amp$/, '.amp') <add> <ide> let htmlFilename = `${path}${sep}index.html` <ide> const pageExt = extname(page) <ide> const pathExt = extname(path) <ide><path>test/integration/export-default-map/pages/docs/index.amp.js <del>export default () => ( <del> <p>I'm an AMP page</p> <del>) <add>export { default } from './index' <ide><path>test/integration/export-default-map/pages/docs/index.js <add>import { useAmp } from 'next/amp' <add> <add>export default () => ( <add> <p>I'm an {useAmp() ? 'AMP' : 'normal'} page</p> <add>) <ide><path>test/integration/export-default-map/pages/info.js <add>import { useAmp } from 'next/amp' <add> <add>export default () => ( <add> <p>I'm an {useAmp() ? 
'AMP' : 'normal'} page</p> <add>) <ide><path>test/integration/export-default-map/pages/info/index.amp.js <add>export { default } from '../info' <ide><path>test/integration/export-default-map/pages/just-amp/index.amp.js <add>export default () => ( <add> <p>I am an AMP only page</p> <add>) <ide><path>test/integration/export-default-map/test/index.test.js <ide> describe('Export with default map', () => { <ide> await expect(access(join(outdir, 'some/index.html'))).resolves.toBe(undefined) <ide> await expect(access(join(outdir, 'some.amp/index.html'))).resolves.toBe(undefined) <ide> }) <add> <add> it('should export nested hybrid amp page correctly', async () => { <add> expect.assertions(2) <add> await expect(access(join(outdir, 'docs/index.html'))).resolves.toBe(undefined) <add> await expect(access(join(outdir, 'docs.amp/index.html'))).resolves.toBe(undefined) <add> }) <add> <add> it('should export nested hybrid amp page correctly with folder', async () => { <add> expect.assertions(2) <add> await expect(access(join(outdir, 'info/index.html'))).resolves.toBe(undefined) <add> await expect(access(join(outdir, 'info.amp/index.html'))).resolves.toBe(undefined) <add> }) <ide> })
8
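The worker change rewrites `/docs/index.amp` to `/docs.amp` while leaving a root-level `/index.amp` alone — that is what the `(?<!^)` lookbehind in `path.replace(/(?<!^)\/index\.amp$/, '.amp')` guards against. The same rule expressed in Python, as an illustrative sketch:

```python
def collapse_amp_index(path):
    """Rewrite ".../index.amp" to "....amp", except at the root of the
    export (mirrors the JS regex replacement in worker.js)."""
    suffix = "/index.amp"
    if path.endswith(suffix) and path != suffix:
        return path[:-len(suffix)] + ".amp"
    return path
```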
Python
Python
add a couple of tests for nextafter
19b5dea4e52c96c08ee364295137f69af15c724e
<ide><path>numpy/core/tests/test_umath.py <ide> def test_copysign(): <ide> assert np.signbit(np.copysign(np.nan, -1)) <ide> assert not np.signbit(np.copysign(np.nan, 1)) <ide> <add>def test_nextafter(): <add> for t in [np.float32, np.float64, np.longdouble]: <add> one = t(1) <add> two = t(2) <add> zero = t(0) <add> eps = np.finfo(t).eps <add> assert np.nextafter(one, two) - one == eps <add> assert one - np.nextafter(one, zero) == -eps <add> assert np.isnan(np.nextafter(np.nan, one)) <add> assert np.isnan(np.nextafter(one, np.nan)) <add> assert np.nextafter(one, one) == one <add> <ide> def test_pos_nan(): <ide> """Check np.nan is a positive nan.""" <ide> assert np.signbit(np.nan) == 0
1
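The numpy tests probe the spacing of adjacent floats around 1.0. The standard library exposes the same primitive for Python floats (`math.nextafter`, Python ≥ 3.9); note that the gap just below 1.0 is half the gap just above it, because the exponent drops:

```python
import math
import sys

one = 1.0
eps = sys.float_info.epsilon  # spacing of doubles in [1.0, 2.0)

# the next representable float above 1.0 sits exactly eps away
assert math.nextafter(one, 2.0) - one == eps
# below 1.0 the exponent decreases, so the gap is only eps / 2
assert one - math.nextafter(one, 0.0) == eps / 2
# NaN propagates through either argument, and nextafter(x, x) == x
assert math.isnan(math.nextafter(math.nan, one))
assert math.isnan(math.nextafter(one, math.nan))
assert math.nextafter(one, one) == one
```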
Javascript
Javascript
move suppression to primary locations in xplat/js
a3d9e912036f6a87fb0479bc90703611ebbe5f4e
<ide><path>Libraries/Animated/src/nodes/AnimatedInterpolation.js <ide> type ExtrapolateType = 'extend' | 'identity' | 'clamp'; <ide> <ide> export type InterpolationConfigType = { <ide> inputRange: $ReadOnlyArray<number>, <del> /* $FlowFixMe(>=0.38.0 site=react_native_fb,react_native_oss) - Flow error <del> * detected during the deployment of v0.38.0. To see the error, remove this <del> * comment and run flow <del> */ <ide> outputRange: $ReadOnlyArray<number> | $ReadOnlyArray<string>, <ide> easing?: (input: number) => number, <ide> extrapolate?: ExtrapolateType, <ide> function createInterpolationFromStringOutputRange( <ide> }); <ide> }); <ide> <del> /* $FlowFixMe(>=0.18.0): `outputRange[0].match()` can return `null`. Need to <del> * guard against this possibility. <del> */ <ide> const interpolations = outputRange[0] <ide> .match(stringShapeRegex) <add> /* $FlowFixMe(>=0.18.0): `outputRange[0].match()` can return `null`. Need <add> * to guard against this possibility. */ <ide> .map((value, i) => { <ide> return createInterpolation({ <ide> ...config, <ide> class AnimatedInterpolation extends AnimatedWithChildren { <ide> return { <ide> inputRange: this._config.inputRange, <ide> // Only the `outputRange` can contain strings so we don't need to transform `inputRange` here <add> /* $FlowFixMe(>=0.38.0) - Flow error detected during the deployment of <add> * v0.38.0. To see the error, remove this comment and run flow */ <ide> outputRange: this.__transformDataType(this._config.outputRange), <ide> extrapolateLeft: <ide> this._config.extrapolateLeft || this._config.extrapolate || 'extend', <ide><path>Libraries/Components/TextInput/TextInput.js <ide> function InternalTextInput(props: Props): React.Node { <ide> onBlur={_onBlur} <ide> onChange={_onChange} <ide> onFocus={_onFocus} <add> /* $FlowFixMe the types for AndroidTextInput don't match up exactly <add> * with the props for TextInput. 
This will need to get fixed */ <ide> onScroll={_onScroll} <ide> onSelectionChange={_onSelectionChange} <ide> selection={selection} <ide><path>Libraries/Lists/FlatList.js <ide> class FlatList<ItemT> extends React.PureComponent<Props<ItemT>, void> { <ide> }), <ide> ); <ide> } else if (this.props.onViewableItemsChanged) { <del> /* $FlowFixMe(>=0.63.0 site=react_native_fb) This comment suppresses an <del> * error found when Flow v0.63 was deployed. To see the error delete this <del> * comment and run Flow. */ <ide> this._virtualizedListPairs.push({ <add> /* $FlowFixMe(>=0.63.0 site=react_native_fb) This comment suppresses an <add> * error found when Flow v0.63 was deployed. To see the error delete <add> * this comment and run Flow. */ <ide> viewabilityConfig: this.props.viewabilityConfig, <ide> onViewableItemsChanged: this._createOnViewableItemsChanged( <ide> this.props.onViewableItemsChanged, <ide><path>Libraries/Lists/ViewabilityHelper.js <ide> export type ViewabilityConfig = {| <ide> class ViewabilityHelper { <ide> _config: ViewabilityConfig; <ide> _hasInteracted: boolean = false; <del> /* $FlowFixMe(>=0.63.0 site=react_native_fb) This comment suppresses an error <del> * found when Flow v0.63 was deployed. To see the error delete this comment <del> * and run Flow. */ <ide> _timers: Set<number> = new Set(); <ide> _viewableIndices: Array<number> = []; <ide> _viewableItems: Map<string, ViewToken> = new Map(); <ide> class ViewabilityHelper { <ide> * Cleanup, e.g. on unmount. Clears any pending timers. <ide> */ <ide> dispose() { <add> /* $FlowFixMe(>=0.63.0 site=react_native_fb) This comment suppresses an <add> * error found when Flow v0.63 was deployed. To see the error delete this <add> * comment and run Flow. 
*/ <ide> this._timers.forEach(clearTimeout); <ide> } <ide> <ide> class ViewabilityHelper { <ide> this._viewableIndices = viewableIndices; <ide> if (this._config.minimumViewTime) { <ide> const handle = setTimeout(() => { <add> /* $FlowFixMe(>=0.63.0 site=react_native_fb) This comment suppresses an <add> * error found when Flow v0.63 was deployed. To see the error delete <add> * this comment and run Flow. */ <ide> this._timers.delete(handle); <ide> this._onUpdateSync( <ide> viewableIndices, <ide> onViewableItemsChanged, <ide> createViewToken, <ide> ); <ide> }, this._config.minimumViewTime); <add> /* $FlowFixMe(>=0.63.0 site=react_native_fb) This comment suppresses an <add> * error found when Flow v0.63 was deployed. To see the error delete this <add> * comment and run Flow. */ <ide> this._timers.add(handle); <ide> } else { <ide> this._onUpdateSync( <ide><path>Libraries/Lists/__flowtests__/FlatList-flowtest.js <ide> module.exports = { <ide> testBadDataWithTypicalItem(): React.Node { <ide> const data = [ <ide> { <del> // $FlowExpectedError - bad title type 6, should be string <ide> title: 6, <ide> key: 1, <ide> }, <ide> ]; <add> // $FlowExpectedError - bad title type 6, should be string <ide> return <FlatList renderItem={renderMyListItem} data={data} />; <ide> }, <ide> <ide> module.exports = { <ide> />, <ide> // EverythingIsFine <ide> <FlatList <add> // $FlowExpectedError - bad title type number, should be string <ide> renderItem={(info: {item: {title: string, ...}, ...}) => <span />} <ide> data={data} <ide> />, <ide><path>Libraries/Lists/__flowtests__/SectionList-flowtest.js <ide> module.exports = { <ide> }, <ide> ]; <ide> return [ <del> // $FlowExpectedError - title should be inside `item` <ide> <SectionList <add> // $FlowExpectedError - title should be inside `item` <ide> renderItem={(info: {title: string, ...}) => <span />} <ide> sections={sections} <ide> />, <ide> module.exports = { <ide> <ide> testBadSectionsShape(): React.Element<*> { <ide> const sections = [ 
<del> /* $FlowFixMe(>=0.63.0 site=react_native_fb) This comment suppresses an <del> * error found when Flow v0.63 was deployed. To see the error delete this <del> * comment and run Flow. */ <ide> { <ide> key: 'a', <ide> items: [ <ide> module.exports = { <ide> const sections = [ <ide> { <ide> key: 'a', <del> // $FlowExpectedError - section has bad meta data `fooNumber` field of type string <ide> fooNumber: 'string', <ide> data: [ <ide> { <ide> module.exports = { <ide> <SectionList <ide> renderSectionHeader={renderMyHeader} <ide> renderItem={renderMyListItem} <add> /* $FlowExpectedError - section has bad meta data `fooNumber` field of <add> * type string */ <ide> sections={sections} <ide> /> <ide> ); <ide><path>Libraries/LogBox/UI/LogBoxInspectorSourceMapStatus.js <ide> function LogBoxInspectorSourceMapStatus(props: Props): React.Node { <ide> animation, <ide> rotate: animated.interpolate({ <ide> inputRange: [0, 1], <add> /* $FlowFixMe(>=0.38.0) - Flow error detected during the deployment <add> * of v0.38.0. To see the error, remove this comment and run flow <add> */ <ide> outputRange: ['0deg', '360deg'], <ide> }), <ide> }); <ide><path>Libraries/StyleSheet/__flowtests__/StyleSheet-flowtest.js <ide> module.exports = { <ide> textStyle, <ide> ): ImageStyleProp); <ide> <add> // $FlowExpectedError - Incompatible type. <ide> (StyleSheet.compose( <del> // $FlowExpectedError - Incompatible type. <ide> [textStyle], <ide> null, <ide> ): ImageStyleProp); <ide><path>RNTester/js/RNTesterApp.android.js <ide> class RNTesterApp extends React.Component<Props, RNTesterNavigationState> { <ide> /* $FlowFixMe(>=0.78.0 site=react_native_android_fb) This issue was found <ide> * when making Flow check .android.js files. */ <ide> this._exampleRef && <add> /* $FlowFixMe(>=0.78.0 site=react_native_android_fb) This issue was found <add> * when making Flow check .android.js files. 
*/ <ide> this._exampleRef.handleBackAction && <ide> /* $FlowFixMe(>=0.78.0 site=react_native_android_fb) This issue was found <ide> * when making Flow check .android.js files. */ <ide><path>RNTester/js/examples/MaskedView/MaskedViewExample.js <ide> class AnimatedMaskExample extends React.Component<Props> { <ide> { <ide> rotate: this._maskRotateAnimatedValue.interpolate({ <ide> inputRange: [0, 360], <add> /* $FlowFixMe(>=0.38.0) - Flow error detected during the <add> * deployment of v0.38.0. To see the error, remove this <add> * comment and run flow */ <ide> outputRange: ['0deg', '360deg'], <ide> }), <ide> }, <ide><path>RNTester/js/examples/NativeAnimation/NativeAnimationsExample.js <ide> class LoopExample extends React.Component<{...}, $FlowFixMeState> { <ide> { <ide> opacity: this.state.value.interpolate({ <ide> inputRange: [0, 0.5, 1], <add> /* $FlowFixMe(>=0.38.0) - Flow error detected during the <add> * deployment of v0.38.0. To see the error, remove this comment <add> * and run flow */ <ide> outputRange: [0, 1, 0], <ide> }), <ide> }, <ide> class EventExample extends React.Component<{...}, $FlowFixMeState> { <ide> { <ide> rotate: this.state.anim.interpolate({ <ide> inputRange: [0, 1], <add> /* $FlowFixMe(>=0.38.0) - Flow error detected during the <add> * deployment of v0.38.0. To see the error, remove this <add> * comment and run flow */ <ide> outputRange: ['0deg', '1deg'], <ide> }), <ide> }, <ide><path>RNTester/js/examples/ScrollView/ScrollViewAnimatedExample.js <ide> class ScrollViewAnimatedExample extends Component<{...}> { <ide> render(): React.Node { <ide> const interpolated = this._scrollViewPos.interpolate({ <ide> inputRange: [0, 1], <add> /* $FlowFixMe(>=0.38.0) - Flow error detected during the deployment of <add> * v0.38.0. 
To see the error, remove this comment and run flow */ <ide> outputRange: [0, 0.1], <ide> }); <ide> const interpolated2 = this._scrollViewPos.interpolate({ <ide> inputRange: [0, 1], <add> /* $FlowFixMe(>=0.38.0) - Flow error detected during the deployment of <add> * v0.38.0. To see the error, remove this comment and run flow */ <ide> outputRange: ['0deg', '1deg'], <ide> }); <ide> return ( <ide><path>RNTester/js/examples/TextInput/TextInputExample.android.js <ide> class AutogrowingTextInputExample extends React.Component<{...}> { <ide> * found when making Flow check .android.js files. */ <ide> this.setState({contentSize: event.nativeEvent.contentSize}) <ide> } <add> /* $FlowFixMe(>=0.78.0 site=react_native_android_fb) This issue was <add> * found when making Flow check .android.js files. */ <ide> {...props} <ide> /> <ide> <Text>Plain text value representation:</Text> <ide><path>RNTester/js/examples/Transform/TransformExample.js <ide> function AnimateTansformSingleProp() { <ide> { <ide> rotate: theta.interpolate({ <ide> inputRange: [0, 100], <add> /* $FlowFixMe(>=0.38.0) - Flow error detected during the <add> * deployment of v0.38.0. To see the error, remove this <add> * comment and run flow */ <ide> outputRange: ['0deg', '360deg'], <ide> }), <ide> }, <ide> function Flip() { <ide> { <ide> rotateX: theta.interpolate({ <ide> inputRange: [0, 180], <add> /* $FlowFixMe(>=0.38.0) - Flow error detected during the <add> * deployment of v0.38.0. To see the error, remove this <add> * comment and run flow */ <ide> outputRange: ['0deg', '180deg'], <ide> }), <ide> }, <ide> function Flip() { <ide> { <ide> rotateX: theta.interpolate({ <ide> inputRange: [0, 180], <add> /* $FlowFixMe(>=0.38.0) - Flow error detected during the <add> * deployment of v0.38.0. To see the error, remove this <add> * comment and run flow */ <ide> outputRange: ['180deg', '360deg'], <ide> }), <ide> },
14
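The theme of the diff above is that a `$FlowFixMe` suppression only silences the line it is attached to, so it must sit on the exact expression that raises the error rather than a neighbouring one. The same rule applies to mypy's `# type: ignore` in Python — a deliberately mistyped, illustrative sketch:

```python
from typing import List

def head(xs: List[int]) -> int:
    """Return the first element; annotated to accept ints only."""
    return xs[0]

# The suppression silences only the line it is on, so it belongs on the
# call that actually violates the annotation, not above or below it:
value = head(["a", "b"])  # type: ignore[arg-type]
```

At runtime this is legal Python; the comment only affects the type checker.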
Python
Python
add image_and_boxes_tensor to helper doc string
d3c73c21d939d5c2a736741e2303b5058bbd5f39
<ide><path>research/object_detection/exporter_lib_v2.py <ide> def __call__(self, input_tensor): <ide> _decode_tf_example) <ide> return self._run_inference_on_images(images, true_shapes) <ide> <del>DETECTION_MODULE_MAP = { <del> 'image_tensor': DetectionFromImageModule, <del> 'encoded_image_string_tensor': <del> DetectionFromEncodedImageModule, <del> 'tf_example': DetectionFromTFExampleModule, <del> 'float_image_tensor': DetectionFromFloatImageModule <del>} <del> <ide> <ide> def export_inference_graph(input_type, <ide> pipeline_config, <ide> def __call__(self, input_tensor, boxes): <ide> return self._run_segmentation_on_images(input_tensor, boxes) <ide> <ide> <del>DETECTION_MODULE_MAP.update({ <add>DETECTION_MODULE_MAP = { <add> 'image_tensor': DetectionFromImageModule, <add> 'encoded_image_string_tensor': <add> DetectionFromEncodedImageModule, <add> 'tf_example': DetectionFromTFExampleModule, <add> 'float_image_tensor': DetectionFromFloatImageModule, <ide> 'image_and_boxes_tensor': DetectionFromImageAndBoxModule, <del>}) <add>} <ide><path>research/object_detection/exporter_main_v2.py <ide> * `tf_example`: Accepts a 1-D string tensor of shape [None] containing <ide> serialized TFExample protos. Image resolutions are expected to be the same <ide> if more than 1 image is provided. <add> * `image_and_boxes_tensor`: Accepts a 4-D image tensor of size <add> [1, None, None, 3] and a boxes tensor of size [1, None, 4] of normalized <add> bounding boxes. To be able to support this option, the model needs <add> to implement a predict_masks_from_boxes method. See the documentation <add> for DetectionFromImageAndBoxModule for details. <ide> <ide> and the following output nodes returned by the model.postprocess(..): <ide> * `num_detections`: Outputs float32 tensors of the form [batch] <ide> <ide> flags.DEFINE_string('input_type', 'image_tensor', 'Type of input node. 
Can be ' <ide> 'one of [`image_tensor`, `encoded_image_string_tensor`, ' <del> '`tf_example`, `float_image_tensor`]') <add> '`tf_example`, `float_image_tensor`, ' <add> '`image_and_boxes_tensor`]') <ide> flags.DEFINE_string('pipeline_config_path', None, <ide> 'Path to a pipeline_pb2.TrainEvalPipelineConfig config ' <ide> 'file.')
2
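The patch defines `DETECTION_MODULE_MAP` once, after every module class exists, instead of defining part of it early and `.update()`-ing it later. The general shape of such a dispatch table — handler names below are made up for illustration:

```python
def from_image(payload):
    return ("image", payload)

def from_tf_example(payload):
    return ("tf_example", payload)

# single definition point, placed after all handlers are defined
HANDLER_MAP = {
    "image_tensor": from_image,
    "tf_example": from_tf_example,
}

def dispatch(input_type, payload):
    """Look up the handler for input_type, failing loudly on unknown keys."""
    try:
        handler = HANDLER_MAP[input_type]
    except KeyError:
        raise ValueError("unknown input_type %r; expected one of %s"
                         % (input_type, sorted(HANDLER_MAP)))
    return handler(payload)
```

Building the map in one place keeps the supported keys auditable at a glance and avoids import-order surprises.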
Ruby
Ruby
compare sorted arrays in relations_test
acef8feafa8a44271eb28685e180f8c28b7e4a0f
<ide><path>activerecord/test/cases/relations_test.rb <ide> def test_finding_with_group <ide> <ide> def test_select_with_block <ide> even_ids = Developer.scoped.select {|d| d.id % 2 == 0 }.map(&:id) <del> assert_equal [2, 4, 6, 8, 10], even_ids <add> assert_equal [2, 4, 6, 8, 10], even_ids.sort <ide> end <ide> <ide> def test_finding_with_hash_conditions_on_joined_table
1
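Databases rarely guarantee row order without an explicit ORDER BY, which is why the Rails test sorts before comparing. A small helper capturing the same idiom (names are illustrative):

```python
def assert_same_elements(actual, expected):
    """Order-insensitive equality: sort both sides before comparing, so a
    nondeterministic result order cannot make the test flaky."""
    assert sorted(actual) == sorted(expected), (
        "element mismatch: %r vs %r" % (sorted(actual), sorted(expected)))

# rows may come back in any order; the assertion still holds
assert_same_elements([10, 2, 6, 4, 8], [2, 4, 6, 8, 10])
```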
Python
Python
remove extraenous whitespace using pystrip
2723ca06da0171116cb443249f90a20c8cce7f1b
<ide><path>celery/backends/__init__.py <ide> def get_backend_cls(backend): <ide> <ide> """ <ide> .. class:: DefaultBackend <del> <add> <ide> The default backend class used for storing task results and status, <ide> specified in :setting:`CELERY_BACKEND`. <ide> <ide><path>celery/backends/base.py <ide> def find_nearest_pickleable_exception(exc): <ide> not go below :exc:`Exception` (i.e. it skips :exc:`Exception`, <ide> :class:`BaseException` and :class:`object`). If that happens <ide> you should use :exc:`UnpickleableException` instead. <del> <add> <ide> :param exc: An exception instance. <ide> <ide> :returns: the nearest exception if it's not :exc:`Exception` or below, <ide> def find_nearest_pickleable_exception(exc): <ide> <ide> class UnpickleableExceptionWrapper(Exception): <ide> """Wraps unpickleable exceptions. <del> <add> <ide> :param exc_module: see :attr:`exc_module`. <ide> <ide> :param exc_cls_name: see :attr:`exc_cls_name`. <del> <add> <ide> :param exc_args: see :attr:`exc_args` <ide> <ide> .. attribute:: exc_module <ide> def prepare_exception(self, exc): <ide> return excwrapper <ide> else: <ide> return exc <del> <add> <ide> def exception_to_python(self, exc): <ide> if isinstance(exc, UnpickleableExceptionWrapper): <ide> exc_cls = self.create_exception_cls(exc.exc_cls_name, <ide><path>celery/conf.py <ide> <ide> """ <ide> .. data:: LOG_LEVELS <del> <add> <ide> Mapping of log level names to :mod:`logging` module constants. <ide> <ide> """ <ide> <ide> """ <ide> .. data:: LOG_FORMAT <del> <add> <ide> The format to use for log messages. <ide> Default is ``[%(asctime)s: %(levelname)s/%(processName)s] %(message)s`` <ide> <ide> <ide> """ <ide> .. data:: DAEMON_LOG_FILE <del> <add> <ide> The path to the deamon log file (if not set, ``stderr`` is used). <ide> <ide> """ <ide> <ide> """ <ide> .. data:: DAEMON_LOG_LEVEL <del> <add> <ide> Celery daemon log level, can be any of ``DEBUG``, ``INFO``, ``WARNING``, <ide> ``ERROR``, ``CRITICAL``, or ``FATAL``. 
See the :mod:`logging` module <ide> for more information. <ide> <ide> """ <ide> .. data:: QUEUE_WAKEUP_AFTER <del> <add> <ide> The time (in seconds) the celery worker should sleep when there's <ide> no messages left on the queue. After the time is slept, the worker <ide> wakes up and checks the queue again. <ide> <ide> """ <ide> .. data:: EMPTY_MSG_EMIT_EVERY <del> <add> <ide> How often the celery daemon should write a log message saying there are no <ide> messages in the queue. If this is ``None`` or ``0``, it will never print <ide> this message. <ide> <ide> """ <ide> .. data:: DAEMON_PID_FILE <del> <add> <ide> Full path to the daemon pidfile. <ide> <ide> """ <ide> <ide> """ <ide> .. data:: DAEMON_CONCURRENCY <del> <add> <ide> The number of concurrent worker processes, executing tasks simultaneously. <ide> <ide> """ <ide> .. data:: AMQP_EXCHANGE_TYPE <ide> <ide> The type of exchange. If the exchange type is ``direct``, all messages <del>receives all tasks. However, if the exchange type is ``topic``, you can <add>receives all tasks. However, if the exchange type is ``topic``, you can <ide> route e.g some tasks to one server, and others to the rest. <ide> See `Exchange types and the effect of bindings`_. <ide> <ide> <ide> """ <ide> .. data:: AMQP_PUBLISHER_ROUTING_KEY <del> <add> <ide> The default AMQP routing key used when publishing tasks. <ide> <ide> """ <ide> <ide> """ <ide> .. data:: AMQP_CONSUMER_ROUTING_KEY <del> <add> <ide> The AMQP routing key used when consuming tasks. <ide> <ide> """ <ide> <ide> """ <ide> .. data:: AMQP_CONSUMER_QUEUE <del> <add> <ide> The name of the AMQP queue. 
<ide> <ide> """ <ide><path>celery/datastructures.py <ide> def apply_async(self, target, args, kwargs, task_name, task_id): <ide> self._start() <ide> <ide> self._processed_total = self._process_counter.next() <del> <add> <ide> on_return = lambda r: self.on_return(r, task_name, task_id) <ide> <ide> result = self._pool.apply_async(target, args, kwargs, <ide> def add(self, result, task_name, task_id): <ide> :param task_id: Id of the task executed. <ide> <ide> """ <del> <add> <ide> self._processes[task_id] = [result, task_name] <ide> <ide> if self.full(): <ide><path>celery/result.py <ide> class AsyncResult(BaseAsyncResult): <ide> <ide> :param task_id: see :attr:`task_id`. <ide> <del> <add> <ide> .. attribute:: task_id <ide> <ide> The unique identifier for this task. <ide><path>celery/task.py <ide> def apply_async(cls, args=None, kwargs=None, **options): <ide> :rtype: :class:`celery.result.AsyncResult` <ide> <ide> See :func:`apply_async`. <del> <add> <ide> """ <ide> return apply_async(cls, args, kwargs, **options) <ide> <ide><path>celery/worker.py <ide> class UnknownTask(Exception): <ide> <ide> <ide> def jail(task_id, func, args, kwargs): <del> """Wraps the task in a jail, which catches all exceptions, and <add> """Wraps the task in a jail, which catches all exceptions, and <ide> saves the status and result of the task execution to the task <ide> meta backend. <ide> <ide> def jail(task_id, func, args, kwargs): <ide> <ide> class TaskWrapper(object): <ide> """Class wrapping a task to be run. <del> <add> <ide> :param task_name: see :attr:`task_name`. <ide> <ide> :param task_id: see :attr:`task_id`. <ide> class TaskWrapper(object): <ide> .. attribute:: task_name <ide> <ide> Kind of task. Must be a name registered in the task registry. <del> <add> <ide> .. attribute:: task_id <ide> <ide> UUID of the task. <ide> class TaskWrapper(object): <ide> .. attribute:: kwargs <ide> <ide> Mapping of keyword arguments to apply to the task. 
<del> <add> <ide> """ <ide> <ide> def __init__(self, task_name, task_id, task_func, args, kwargs): <ide> def extend_with_default_kwargs(self, loglevel, logfile): <ide> def execute(self, loglevel=None, logfile=None): <ide> """Execute the task in a :func:`jail` and store return value <ide> and status in the task meta backend. <del> <add> <ide> :keyword loglevel: The loglevel used by the task. <ide> <ide> :keyword logfile: The logfile used by the task. <ide> def execute_using_pool(self, pool, loglevel=None, logfile=None): <ide> <ide> class WorkController(object): <ide> """Executes tasks waiting in the task queue. <del> <add> <ide> :param concurrency: see :attr:`concurrency`. <del> <add> <ide> :param logfile: see :attr:`logfile`. <ide> <ide> :param loglevel: see :attr:`loglevel`. <ide> def __init__(self, concurrency=None, logfile=None, loglevel=None, <ide> def reset_connection(self): <ide> """Reset the AMQP connection, and reinitialize the <ide> :class:`celery.messaging.TaskConsumer` instance. <del> <add> <ide> Resets the task consumer in :attr:`task_consumer`. <ide> <ide> """ <ide> def execute_next_task(self): <ide> self.logger.debug("Trying to fetch a task.") <ide> task, message = self.fetch_next_task() <ide> self.logger.debug("Got a task: %s. Trying to execute it..." % task) <del> <add> <ide> result = task.execute_using_pool(self.pool, self.loglevel, <ide> self.logfile) <ide>
7
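A minimal version of what a trailing-whitespace stripper like the one used in this commit does (a sketch, not the actual pystrip tool):

```python
def strip_trailing_whitespace(text):
    """Remove trailing spaces and tabs from every line, leaving the
    line structure itself untouched."""
    return "\n".join(line.rstrip() for line in text.splitlines())
```

Run over each source file, this produces exactly the kind of `<del>`/`<add>` pairs seen above, where only invisible characters change.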
Ruby
Ruby
expand requirements only once
12158b201dc67e96cf8cceb700f569d16aab756d
<ide><path>Library/Homebrew/build.rb <ide> def main <ide> end <ide> <ide> class Build <del> attr_reader :f, :deps <add> attr_reader :f, :deps, :reqs <ide> <ide> def initialize(f) <ide> @f = f <ide> @deps = expand_deps <add> @reqs = f.recursive_requirements <ide> end <ide> <ide> def post_superenv_hacks <ide> # Only allow Homebrew-approved directories into the PATH, unless <ide> # a formula opts-in to allowing the user's path. <del> if f.env.userpaths? or f.recursive_requirements.any? { |rq| rq.env.userpaths? } <add> if f.env.userpaths? || reqs.any? { |rq| rq.env.userpaths? } <ide> ENV.userpaths! <ide> end <ide> end <ide> def install <ide> if superenv? <ide> ENV.keg_only_deps = keg_only_deps.map(&:to_s) <ide> ENV.deps = deps.map(&:to_s) <del> ENV.x11 = f.recursive_requirements.detect { |rq| rq.kind_of?(X11Dependency) } <add> ENV.x11 = reqs.any? { |rq| rq.kind_of?(X11Dependency) } <ide> ENV.setup_build_environment <ide> post_superenv_hacks <del> f.recursive_requirements.each(&:modify_build_environment) <add> reqs.each(&:modify_build_environment) <ide> else <ide> ENV.setup_build_environment <del> f.recursive_requirements.each(&:modify_build_environment) <add> reqs.each(&:modify_build_environment) <ide> <ide> keg_only_deps.each do |dep| <ide> opt = dep.opt_prefix
1
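The Ruby change resolves `recursive_requirements` once in the constructor and reuses the cached `reqs`, rather than re-walking the dependency tree at every call site. The shape of that refactor in Python — class and method names here are invented for illustration:

```python
class Build:
    def __init__(self, formula):
        self.formula = formula
        # expensive recursive walks happen exactly once, at construction
        self.deps = formula.expand_deps()
        self.reqs = formula.recursive_requirements()

    def needs_user_paths(self):
        # every later check reads the cached list instead of re-walking
        return any(req.get("userpaths") for req in self.reqs)
```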
Javascript
Javascript
change callback function to arrow function
f63dc27eecf21b004e1656b779622742a863deb9
<ide><path>test/parallel/test-zlib-invalid-input.js <ide> const unzips = [ <ide> <ide> nonStringInputs.forEach(common.mustCall((input) => { <ide> // zlib.gunzip should not throw an error when called with bad input. <del> zlib.gunzip(input, function(err, buffer) { <add> zlib.gunzip(input, (err, buffer) => { <ide> // zlib.gunzip should pass the error to the callback. <ide> assert.ok(err); <ide> });
1
Text
Text
add container list filtering to api docs
37bdb05615763f94f7877cce3426752d43b48ff7
<ide><path>docs/sources/reference/api/docker_remote_api_v1.14.md <ide> Query Parameters: <ide> - **since** – Show only containers created since Id, include non-running ones. <ide> - **before** – Show only containers created before Id, include non-running ones. <ide> - **size** – 1/True/true or 0/False/false, Show the containers sizes <add>- **filters** - a json encoded value of the filters (a map[string][]string) to process on the containers list. <ide> <ide> Status Codes: <ide> <ide><path>docs/sources/reference/api/docker_remote_api_v1.15.md <ide> Query Parameters: <ide> non-running ones. <ide> - **size** – 1/True/true or 0/False/false, Show the containers <ide> sizes <add>- **filters** - a json encoded value of the filters (a map[string][]string) to process on the containers list. <ide> <ide> Status Codes: <ide> <ide><path>docs/sources/reference/api/docker_remote_api_v1.16.md <ide> Query Parameters: <ide> non-running ones. <ide> - **size** – 1/True/true or 0/False/false, Show the containers <ide> sizes <add>- **filters** - a json encoded value of the filters (a map[string][]string) to process on the containers list. <ide> <ide> Status Codes: <ide>
3
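Per the docs change, `filters` travels as a JSON-encoded `map[string][]string` inside the query string. Building such a request parameter in Python (the surrounding parameters are only illustrative):

```python
import json
from urllib.parse import urlencode, parse_qs

# map of filter name -> list of accepted values, JSON-encoded as the API expects
filters = {"status": ["running"], "label": ["env=prod"]}
query = urlencode({"all": 1, "filters": json.dumps(filters)})

# a server decodes it back the same way
decoded = json.loads(parse_qs(query)["filters"][0])
```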
Python
Python
log the found compiler version if too old
639085e410e805303b4511edee4c2a315006a7f8
<ide><path>configure.py <ide> def check_compiler(o): <ide> if not ok: <ide> warn('failed to autodetect C++ compiler version (CXX=%s)' % CXX) <ide> elif clang_version < (8, 0, 0) if is_clang else gcc_version < (6, 3, 0): <del> warn('C++ compiler too old, need g++ 6.3.0 or clang++ 8.0.0 (CXX=%s)' % CXX) <add> warn('C++ compiler (CXX=%s, %s) too old, need g++ 6.3.0 or clang++ 8.0.0' % <add> (CXX, ".".join(map(str, clang_version if is_clang else gcc_version)))) <ide> <ide> ok, is_clang, clang_version, gcc_version = try_check_compiler(CC, 'c') <ide> if not ok: <ide> def check_compiler(o): <ide> # clang 3.2 is a little white lie because any clang version will probably <ide> # do for the C bits. However, we might as well encourage people to upgrade <ide> # to a version that is not completely ancient. <del> warn('C compiler too old, need gcc 4.2 or clang 3.2 (CC=%s)' % CC) <add> warn('C compiler (CC=%s, %s) too old, need gcc 4.2 or clang 3.2' % <add> (CC, ".".join(map(str, gcc_version)))) <ide> <ide> o['variables']['llvm_version'] = get_llvm_version(CC) if is_clang else '0.0' <ide>
1
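The configure.py change above enriches the "compiler too old" warning by joining the detected version tuple into a dotted string. A minimal sketch of that pattern (function name and threshold are illustrative, not Node.js's actual code):

```python
# Join a detected (major, minor, patch) tuple into a dotted string so the
# warning reports which version was found, not just that it was too old.
def too_old_warning(cc, version, minimum=(6, 3, 0)):
    version_str = ".".join(map(str, version))
    if version < minimum:  # tuples compare element-wise, so this is safe
        return "C++ compiler (CXX=%s, %s) too old, need g++ 6.3.0" % (cc, version_str)
    return None

print(too_old_warning("g++", (5, 4, 0)))  # includes "5.4.0" in the message
print(too_old_warning("g++", (9, 2, 0)))  # None: new enough
```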
Python
Python
add more type annotations to aws hooks
383a118d2df618e46d81c520cd2c4a31d81b33dd
<ide><path>airflow/providers/amazon/aws/hooks/aws_dynamodb.py <ide> """ <ide> This module contains the AWS DynamoDB hook <ide> """ <add>from typing import Iterable, List, Optional <add> <ide> from airflow.exceptions import AirflowException <ide> from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook <ide> <ide> class AwsDynamoDBHook(AwsBaseHook): <ide> :type table_name: str <ide> """ <ide> <del> def __init__(self, *args, table_keys=None, table_name=None, **kwargs): <add> def __init__( <add> self, *args, table_keys: Optional[List] = None, table_name: Optional[str] = None, **kwargs <add> ) -> None: <ide> self.table_keys = table_keys <ide> self.table_name = table_name <del> super().__init__(resource_type='dynamodb', *args, **kwargs) <add> kwargs["resource_type"] = "dynamodb" <add> super().__init__(*args, **kwargs) <ide> <del> def write_batch_data(self, items): <add> def write_batch_data(self, items: Iterable): <ide> """ <ide> Write batch items to DynamoDB table with provisioned throughput capacity.
<ide> """ <ide> def write_batch_data(self, items): <ide> return True <ide> except Exception as general_error: <ide> raise AirflowException( <del> 'Failed to insert items in dynamodb, error: {error}'.format(error=str(general_error)) <add> "Failed to insert items in dynamodb, error: {error}".format(error=str(general_error)) <ide> ) <ide><path>airflow/providers/amazon/aws/hooks/base_aws.py <ide> <ide> <ide> class _SessionFactory(LoggingMixin): <del> def __init__(self, conn: Connection, region_name: str, config: Config): <add> def __init__(self, conn: Connection, region_name: Optional[str], config: Config) -> None: <ide> super().__init__() <ide> self.conn = conn <ide> self.region_name = region_name <ide> def _assume_role_with_saml( <ide> RoleArn=role_arn, PrincipalArn=principal_arn, SAMLAssertion=saml_assertion, **assume_role_kwargs <ide> ) <ide> <del> def _fetch_saml_assertion_using_http_spegno_auth(self, saml_config: Dict[str, Any]): <add> def _fetch_saml_assertion_using_http_spegno_auth(self, saml_config: Dict[str, Any]) -> str: <ide> import requests <ide> <ide> # requests_gssapi will need paramiko > 2.6 since you'll need <ide> def __init__( <ide> self.config = config <ide> <ide> if not (self.client_type or self.resource_type): <del> raise AirflowException('Either client_type or resource_type' ' must be provided.') <add> raise AirflowException('Either client_type or resource_type must be provided.') <ide> <del> def _get_credentials(self, region_name): <add> def _get_credentials(self, region_name: Optional[str]) -> Tuple[boto3.session.Session, Optional[str]]: <ide> <ide> if not self.aws_conn_id: <ide> session = boto3.session.Session(region_name=region_name) <ide> def _get_credentials(self, region_name): <ide> session = boto3.session.Session(region_name=region_name) <ide> return session, None <ide> <del> def get_client_type(self, client_type, region_name=None, config=None): <add> def get_client_type( <add> self, client_type: str, region_name: Optional[str] = None, 
config: Optional[Config] = None, <add> ) -> boto3.client: <ide> """Get the underlying boto3 client using boto3 session""" <ide> session, endpoint_url = self._get_credentials(region_name) <ide> <ide> def get_client_type(self, client_type, region_name=None, config=None): <ide> <ide> return session.client(client_type, endpoint_url=endpoint_url, config=config, verify=self.verify) <ide> <del> def get_resource_type(self, resource_type, region_name=None, config=None): <add> def get_resource_type( <add> self, resource_type: str, region_name: Optional[str] = None, config: Optional[Config] = None, <add> ) -> boto3.resource: <ide> """Get the underlying boto3 resource using boto3 session""" <ide> session, endpoint_url = self._get_credentials(region_name) <ide> <ide> def get_resource_type(self, resource_type, region_name=None, config=None): <ide> return session.resource(resource_type, endpoint_url=endpoint_url, config=config, verify=self.verify) <ide> <ide> @cached_property <del> def conn(self): <add> def conn(self) -> Union[boto3.client, boto3.resource]: <ide> """ <ide> Get the underlying boto3 client/resource (cached) <ide> <ide> def conn(self): <ide> # Rare possibility - subclasses have not specified a client_type or resource_type <ide> raise NotImplementedError('Could not get boto3 connection!') <ide> <del> def get_conn(self): <add> def get_conn(self) -> Union[boto3.client, boto3.resource]: <ide> """ <ide> Get the underlying boto3 client/resource (cached) <ide> <ide> def get_conn(self): <ide> # Compat shim <ide> return self.conn <ide> <del> def get_session(self, region_name=None): <add> def get_session(self, region_name: Optional[str] = None) -> boto3.session.Session: <ide> """Get the underlying boto3.session.""" <ide> session, _ = self._get_credentials(region_name) <ide> return session <ide> <del> def get_credentials(self, region_name=None): <add> def get_credentials(self, region_name: Optional[str] = None) -> Tuple[Optional[str], Optional[str]]: <ide> """ <ide> Get the 
underlying `botocore.Credentials` object. <ide> <ide> def get_credentials(self, region_name=None): <ide> # See https://stackoverflow.com/a/36291428/8283373 <ide> return session.get_credentials().get_frozen_credentials() <ide> <del> def expand_role(self, role): <add> def expand_role(self, role: str) -> str: <ide> """ <ide> If the IAM role is a role name, get the Amazon Resource Name (ARN) for the role. <ide> If IAM role is already an IAM role ARN, no change is made. <ide> def expand_role(self, role): <ide> return self.get_client_type("iam").get_role(RoleName=role)["Role"]["Arn"] <ide> <ide> <del>def _parse_s3_config(config_file_name, config_format="boto", profile=None): <add>def _parse_s3_config( <add> config_file_name: str, config_format: Optional[str] = "boto", profile: Optional[str] = None <add>) -> Tuple[Optional[str], Optional[str]]: <ide> """ <ide> Parses a config file for s3 credentials. Can currently <ide> parse boto, s3cmd.conf and AWS SDK config formats <ide><path>airflow/providers/amazon/aws/hooks/ec2.py <ide> class EC2Hook(AwsBaseHook): <ide> :class:`~airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook` <ide> """ <ide> <del> def __init__(self, *args, **kwargs): <del> super().__init__(resource_type="ec2", *args, **kwargs) <add> def __init__(self, *args, **kwargs) -> None: <add> kwargs["resource_type"] = "ec2" <add> super().__init__(*args, **kwargs) <ide> <ide> def get_instance(self, instance_id: str): <ide> """ <ide><path>airflow/providers/amazon/aws/hooks/emr.py <ide> # KIND, either express or implied. See the License for the <ide> # specific language governing permissions and limitations <ide> # under the License. 
<add>from typing import Dict, List, Optional <ide> <ide> from airflow.exceptions import AirflowException <ide> from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook <ide> class EmrHook(AwsBaseHook): <ide> :class:`~airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook` <ide> """ <ide> <del> def __init__(self, emr_conn_id=None, *args, **kwargs): <add> def __init__(self, emr_conn_id: Optional[str] = None, *args, **kwargs) -> None: <ide> self.emr_conn_id = emr_conn_id <del> super().__init__(client_type='emr', *args, **kwargs) <add> kwargs["client_type"] = "emr" <add> super().__init__(*args, **kwargs) <ide> <del> def get_cluster_id_by_name(self, emr_cluster_name, cluster_states): <add> def get_cluster_id_by_name(self, emr_cluster_name: str, cluster_states: List[str]) -> Optional[str]: <ide> """ <del> Fetch id of EMR cluster with given name and (optional) states. Will return only if single id is found. <add> Fetch id of EMR cluster with given name and (optional) states. <add> Will return only if single id is found. <ide> <ide> :param emr_cluster_name: Name of a cluster to find <ide> :type emr_cluster_name: str <ide> def get_cluster_id_by_name(self, emr_cluster_name, cluster_states): <ide> self.log.info('No cluster found for name %s', emr_cluster_name) <ide> return None <ide> <del> def create_job_flow(self, job_flow_overrides): <add> def create_job_flow(self, job_flow_overrides: Dict): <ide> """ <ide> Creates a job flow using the config from the EMR connection. 
<ide> Keys of the json extra hash may have the arguments of the boto3 <ide><path>airflow/providers/amazon/aws/hooks/kinesis.py <ide> """ <ide> This module contains AWS Firehose hook <ide> """ <add>from typing import Iterable <add> <ide> from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook <ide> <ide> <ide> class AwsFirehoseHook(AwsBaseHook): <ide> :type delivery_stream: str <ide> """ <ide> <del> def __init__(self, delivery_stream, *args, **kwargs): <add> def __init__(self, delivery_stream, *args, **kwargs) -> None: <ide> self.delivery_stream = delivery_stream <del> super().__init__(client_type='firehose', *args, **kwargs) <add> kwargs["client_type"] = "firehose" <add> super().__init__(*args, **kwargs) <ide> <del> def put_records(self, records): <add> def put_records(self, records: Iterable): <ide> """ <ide> Write batch records to Kinesis Firehose <ide> """ <ide><path>airflow/providers/amazon/aws/hooks/lambda_function.py <ide> class AwsLambdaHook(AwsBaseHook): <ide> <ide> def __init__( <ide> self, <del> function_name, <del> log_type='None', <del> qualifier='$LATEST', <del> invocation_type='RequestResponse', <add> function_name: str, <add> log_type: str = 'None', <add> qualifier: str = '$LATEST', <add> invocation_type: str = 'RequestResponse', <ide> *args, <ide> **kwargs, <del> ): <add> ) -> None: <ide> self.function_name = function_name <ide> self.log_type = log_type <ide> self.invocation_type = invocation_type <ide> self.qualifier = qualifier <del> super().__init__(client_type='lambda', *args, **kwargs) <add> kwargs["client_type"] = "lambda" <add> super().__init__(*args, **kwargs) <ide> <del> def invoke_lambda(self, payload): <add> def invoke_lambda(self, payload: str) -> str: <ide> """ <ide> Invoke Lambda Function <ide> """ <ide><path>airflow/providers/amazon/aws/hooks/logs.py <ide> This module contains a hook (AwsLogsHook) with some very basic <ide> functionality for interacting with AWS CloudWatch. 
<ide> """ <add>from typing import Dict, Generator, Optional <ide> <ide> from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook <ide> <ide> class AwsLogsHook(AwsBaseHook): <ide> :class:`~airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook` <ide> """ <ide> <del> def __init__(self, *args, **kwargs): <del> super().__init__(client_type='logs', *args, **kwargs) <del> <del> def get_log_events(self, log_group, log_stream_name, start_time=0, skip=0, start_from_head=True): <add> def __init__(self, *args, **kwargs) -> None: <add> kwargs["client_type"] = "logs" <add> super().__init__(*args, **kwargs) <add> <add> def get_log_events( <add> self, <add> log_group: str, <add> log_stream_name: str, <add> start_time: int = 0, <add> skip: int = 0, <add> start_from_head: bool = True, <add> ) -> Generator: <ide> """ <ide> A generator for log items in a single stream. This will yield all the <ide> items that are available at the current moment. <ide> def get_log_events(self, log_group, log_stream_name, start_time=0, skip=0, start <ide> event_count = 1 <ide> while event_count > 0: <ide> if next_token is not None: <del> token_arg = {'nextToken': next_token} <add> token_arg: Optional[Dict[str, str]] = {'nextToken': next_token} <ide> else: <ide> token_arg = {} <ide> <ide><path>airflow/providers/amazon/aws/hooks/redshift.py <ide> class RedshiftHook(AwsBaseHook): <ide> :class:`~airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook` <ide> """ <ide> <del> def __init__(self, *args, **kwargs): <del> super().__init__(client_type='redshift', *args, **kwargs) <add> def __init__(self, *args, **kwargs) -> None: <add> kwargs["client_type"] = "redshift" <add> super().__init__(*args, **kwargs) <ide> <ide> # TODO: Wrap create_cluster_snapshot <ide> def cluster_status(self, cluster_identifier: str) -> str: <ide><path>airflow/providers/amazon/aws/hooks/sqs.py <ide> """ <ide> This module contains AWS SQS hook <ide> """ <add>from typing import Dict, Optional <add> <ide> from 
airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook <ide> <ide> <ide> class SQSHook(AwsBaseHook): <ide> :class:`~airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook` <ide> """ <ide> <del> def __init__(self, *args, **kwargs): <del> super().__init__(client_type='sqs', *args, **kwargs) <add> def __init__(self, *args, **kwargs) -> None: <add> kwargs["client_type"] = "sqs" <add> super().__init__(*args, **kwargs) <ide> <del> def create_queue(self, queue_name, attributes=None): <add> def create_queue(self, queue_name: str, attributes: Optional[Dict] = None) -> Dict: <ide> """ <ide> Create queue using connection object <ide> <ide> def create_queue(self, queue_name, attributes=None): <ide> """ <ide> return self.get_conn().create_queue(QueueName=queue_name, Attributes=attributes or {}) <ide> <del> def send_message(self, queue_url, message_body, delay_seconds=0, message_attributes=None): <add> def send_message( <add> self, <add> queue_url: str, <add> message_body: str, <add> delay_seconds: int = 0, <add> message_attributes: Optional[Dict] = None, <add> ) -> Dict: <ide> """ <ide> Send message to the queue <ide> <ide><path>airflow/providers/amazon/aws/hooks/step_function.py <ide> class StepFunctionHook(AwsBaseHook): <ide> :class:`~airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook` <ide> """ <ide> <del> def __init__(self, region_name=None, *args, **kwargs): <del> super().__init__(client_type='stepfunctions', *args, **kwargs) <add> def __init__(self, region_name: Optional[str] = None, *args, **kwargs) -> None: <add> kwargs["client_type"] = "stepfunctions" <add> super().__init__(*args, **kwargs) <ide> <ide> def start_execution( <ide> self,
10
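Besides the type annotations, the Airflow patch above consistently rewrites `super().__init__(resource_type='dynamodb', *args, **kwargs)` as an assignment into `kwargs` before delegating. A minimal sketch of why (class names are illustrative stand-ins for the `AwsBaseHook` subclasses):

```python
# With the old form, a caller who also passes resource_type triggers
# "TypeError: got multiple values for keyword argument 'resource_type'".
# Injecting the value into kwargs first lets the subclass fix it instead.
from typing import Optional


class BaseHook:
    def __init__(self, *args, resource_type: Optional[str] = None, **kwargs) -> None:
        self.resource_type = resource_type


class DynamoDBHook(BaseHook):
    def __init__(self, *args, **kwargs) -> None:
        kwargs["resource_type"] = "dynamodb"  # fixed by the subclass
        super().__init__(*args, **kwargs)


print(DynamoDBHook().resource_type)  # dynamodb
```

Note the trade-off: a caller-supplied `resource_type` is now silently overridden rather than rejected, which matches what the patch does.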
Text
Text
add readme for http package
8702a908a9b275f66bae7e49838899e6bdd400d7
<ide><path>src/Http/README.md <add>[![Total Downloads](https://img.shields.io/packagist/dt/cakephp/http.svg?style=flat-square)](https://packagist.org/packages/cakephp/http) <add>[![License](https://img.shields.io/badge/license-MIT-blue.svg?style=flat-square)](LICENSE.txt) <add> <add># CakePHP Http Library <add> <add>This library provides a PSR-15 Http middleware server, PSR-7 Request and <add>Response objects, and a PSR-18 Http Client. Together these classes let you <add>handle incoming server requests and send outgoing HTTP requests. <add> <add>## Using the Http Client <add> <add>Sending requests is straightforward. Doing a GET request looks like: <add> <add>```php <add>use Cake\Http\Client; <add> <add>$http = new Client(); <add> <add>// Simple get <add>$response = $http->get('http://example.com/test.html'); <add> <add>// Simple get with querystring <add>$response = $http->get('http://example.com/search', ['q' => 'widget']); <add> <add>// Simple get with querystring & additional headers <add>$response = $http->get('http://example.com/search', ['q' => 'widget'], [ <add> 'headers' => ['X-Requested-With' => 'XMLHttpRequest'] <add>]); <add>``` <add> <add>To learn more read the [Http Client documentation](https://book.cakephp.org/4.0/en/core-libraries/httpclient.html). <add> <add>## Using the Http Server <add> <add>The Http Server allows an `HttpApplicationInterface` to process requests and <add>emit responses. To get started, first implement the <add>`Cake\Http\HttpApplicationInterface`, or extend `Cake\Http\BaseApplication` if <add>you are also using CakePHP's console libraries. A minimal example could <add>look like: <add> <add>```php <add>namespace App; <add> <add>use Cake\Http\HttpApplicationInterface; <add>use Cake\Http\MiddlewareQueue; <add> <add>class Application implements HttpApplicationInterface <add>{ <add> /** <add> * Load all the application configuration and bootstrap logic. <add> * <add> * @return void <add> */ <add> public function bootstrap(): void <add> { <add> // Load configuration here.
This is the first <add> // method Cake\Http\Server will call on your application. <add> } <add> <add> /** <add> * Define the HTTP middleware layers for an application. <add> * <add> * @param \Cake\Http\MiddlewareQueue $middleware The middleware queue to set in your App Class <add> * @return \Cake\Http\MiddlewareQueue <add> */ <add> public function middleware(MiddlewareQueue $middleware): MiddlewareQueue <add> { <add> // Add middleware for your application. <add> return $middleware; <add> } <add>} <add>``` <add> <add>Once you have an application with some middleware, you can start accepting <add>requests. In your application's webroot, you can add an `index.php` and process <add>requests: <add> <add>```php <add>require dirname(__DIR__) . '/vendor/autoload.php'; <add> <add>use App\Application; <add>use Cake\Http\Server; <add> <add>// Bind your application to the server. <add>$server = new Server(new Application()); <add> <add>// Run the request/response through the application and emit the response. <add>$server->emit($server->run()); <add>``` <add> <add>For more information on middleware, [consult the <add>documentation](https://book.cakephp.org/4.0/en/controllers/middleware.html).
1
Text
Text
fix error of a file name
a2fa326139bca583e97159d656b2ca06c2dbb7f4
<ide><path>guides/source/initialization.md <ide> load Gem.bin_path('railties', 'rails', version) <ide> ``` <ide> <ide> If you try out this command in a Rails console, you would see that this loads <del>`railties/exe/rails`. A part of the file `railties/exe/rails.rb` has the <add>`railties/exe/rails`. A part of the file `railties/exe/rails` has the <ide> following code: <ide> <ide> ```ruby
1
Text
Text
fix typo in "with-eslint" example docs
429f63dd76817f4a275a0caa73d3a0c90497c617
<ide><path>examples/with-eslint/README.md <ide> Deploy the example using [Vercel](https://vercel.com?utm_source=github&utm_mediu <ide> Execute [`create-next-app`](https://github.com/vercel/next.js/tree/canary/packages/create-next-app) with [npm](https://docs.npmjs.com/cli/init) or [Yarn](https://yarnpkg.com/lang/en/docs/cli/create/) to bootstrap the example: <ide> <ide> ```bash <del>npx create-next-app --example with-eslint with-eslint <add>npx create-next-app --example with-eslint with-eslint-app <ide> # or <del>yarn create next-app --example with-eslint with-eslint <add>yarn create next-app --example with-eslint with-eslint-app <ide> ``` <ide> <ide> Deploy it to the cloud with [Vercel](https://vercel.com/new?utm_source=github&utm_medium=readme&utm_campaign=next-example) ([Documentation](https://nextjs.org/docs/deployment)).
1
Text
Text
remove usage of "node" in favor of "node.js"
d2d5c970158d01a4da8e644da3202a2666414d91
<ide><path>SECURITY.md <ide> the HackerOne platform. See <https://hackerone.com/nodejs> for further details. <ide> ## Reporting a Bug in a third party module <ide> <ide> Security bugs in third party modules should be reported to their respective <del>maintainers and should also be coordinated through the Node Ecosystem Security <del>Team via [HackerOne](https://hackerone.com/nodejs-ecosystem). <add>maintainers and should also be coordinated through the Node.js Ecosystem <add>Security Team via [HackerOne](https://hackerone.com/nodejs-ecosystem). <ide> <ide> Details regarding this process can be found in the <ide> [Security Working Group repository](https://github.com/nodejs/security-wg/blob/master/processes/third_party_vuln_process.md). <ide><path>doc/STYLE_GUIDE.md <ide> * OK: JavaScript, Google's V8 <ide> <!--lint disable prohibited-strings remark-lint--> <ide> * NOT OK: Javascript, Google's v8 <del> <!-- lint enable prohibited-strings remark-lint--> <ide> <ide> * Use _Node.js_ and not _Node_, _NodeJS_, or similar variants. <add> <!-- lint enable prohibited-strings remark-lint--> <ide> * When referring to the executable, _`node`_ is acceptable. <ide> <ide> See also API documentation structure overview in [doctools README][]. <ide><path>doc/api/fs.md <ide> completion callback. <ide> <ide> The `type` argument is only available on Windows and ignored on other platforms. <ide> It can be set to `'dir'`, `'file'`, or `'junction'`. If the `type` argument is <del>not set, Node will autodetect `target` type and use `'file'` or `'dir'`. If the <del>`target` does not exist, `'file'` will be used. Windows junction points require <del>the destination path to be absolute. When using `'junction'`, the `target` <del>argument will automatically be normalized to absolute path. <add>not set, Node.js will autodetect `target` type and use `'file'` or `'dir'`. If <add>the `target` does not exist, `'file'` will be used. 
Windows junction points <add>require the destination path to be absolute. When using `'junction'`, the <add>`target` argument will automatically be normalized to absolute path. <ide> <ide> Relative targets are relative to the link’s parent directory. <ide> <ide><path>doc/guides/contributing/pull-requests.md <ide> In case of doubt, open an issue in the <ide> Node.js has two IRC channels: <ide> [#Node.js](https://webchat.freenode.net/?channels=node.js) for general help and <ide> questions, and <del>[#Node-dev](https://webchat.freenode.net/?channels=node-dev) for development of <add>[#node-dev](https://webchat.freenode.net/?channels=node-dev) for development of <ide> Node.js core specifically. <ide> <ide> ## Setting up your local environment
4
Text
Text
remove broken wiki link from test/common doc
cd075f488aedbbe712a2672e82ca2d1aea3ba5d0
<ide><path>test/common/README.md <ide> See [the WPT tests README][] for details. <ide> [Web Platform Tests]: https://github.com/web-platform-tests/wpt <ide> [`hijackstdio.hijackStdErr()`]: #hijackstderrlistener <ide> [`hijackstdio.hijackStdOut()`]: #hijackstdoutlistener <del>[internationalization]: https://github.com/nodejs/node/wiki/Intl <add>[internationalization]: ../../doc/api/intl.md <ide> [the WPT tests README]: ../wpt/README.md
1
Ruby
Ruby
fix broken mysql test [frederick.cheung@gmail.com]
1abe5a2dedc25d0dcfdff74c65aebc20ccf3011f
<ide><path>activerecord/test/migration_test.rb <ide> def test_unabstracted_database_dependent_types <ide> Person.delete_all <ide> <ide> ActiveRecord::Migration.add_column :people, :intelligence_quotient, :tinyint <add> Person.reset_column_information <ide> Person.create :intelligence_quotient => 300 <ide> jonnyg = Person.find(:first) <ide> assert_equal 127, jonnyg.intelligence_quotient
1
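The one-line fix above works because ActiveRecord memoizes a model's column list, so a column added after the cache is warmed stays invisible until `reset_column_information` clears it. A minimal Python sketch of that invalidation pattern (class and attribute names are illustrative, not the real Rails internals):

```python
# A model caches the table's column list on first access; a schema change
# made afterwards is invisible until the cache is reset.
class Person:
    _columns_cache = None
    _schema = ["id", "first_name"]  # stands in for the real database schema

    @classmethod
    def columns(cls):
        if cls._columns_cache is None:
            cls._columns_cache = list(cls._schema)  # memoized on first read
        return cls._columns_cache

    @classmethod
    def reset_column_information(cls):
        cls._columns_cache = None


Person.columns()                                # warm the cache
Person._schema.append("intelligence_quotient")  # the migration's add_column
print("intelligence_quotient" in Person.columns())  # False: stale cache
Person.reset_column_information()
print("intelligence_quotient" in Person.columns())  # True: cache rebuilt
```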