column        type           min    max
commit        stringlengths  40     40
old_file      stringlengths  4      237
new_file      stringlengths  4      237
old_contents  stringlengths  1      4.24k
new_contents  stringlengths  5      4.84k
subject       stringlengths  15     778
message       stringlengths  16     6.86k
lang          stringlengths  1      30
license       stringclasses  13 values
repos         stringlengths  5      116k
config        stringlengths  1      30
content       stringlengths  105    8.72k
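The column constraints above can be checked programmatically. The sketch below is illustrative only: the bounds mirror the schema (treating "4.24k" as roughly 4240 characters, and so on — the exact cutoffs are approximations from the rounded values), and the sample record uses shortened stand-in values, not a real row from the dataset.

```python
# Approximate per-column length bounds taken from the schema above.
# "k" values are rounded, so these are indicative, not exact.
SCHEMA_BOUNDS = {
    "commit": (40, 40),
    "old_file": (4, 237),
    "new_file": (4, 237),
    "old_contents": (1, 4240),
    "new_contents": (5, 4840),
    "subject": (15, 778),
    "message": (16, 6860),
    "lang": (1, 30),
    "repos": (5, 116000),
    "config": (1, 30),
    "content": (105, 8720),
}

def validate_record(record: dict) -> list:
    """Return the names of columns whose string lengths fall outside the bounds."""
    violations = []
    for column, (lo, hi) in SCHEMA_BOUNDS.items():
        value = record.get(column, "")
        if not lo <= len(value) <= hi:
            violations.append(column)
    return violations

# A stand-in record (shortened dummy values, not actual dataset content).
sample = {
    "commit": "a" * 40,
    "old_file": "Cargo.toml",
    "new_file": "Cargo.toml",
    "old_contents": "[package]",
    "new_contents": "[package]\nname = 'x'",
    "subject": "Add description field",
    "message": "Add description field",
    "lang": "TOML",
    "repos": "google/stm32-bootloader-client-rs",
    "config": "toml",
    "content": "## Code Before: ... ## Instruction: ... ## Code After: ..." + "x" * 60,
}
print(validate_record(sample))  # an empty list means the record conforms
```

Each record's `content` field concatenates `old_contents`, the instruction (`message`), and `new_contents` under `## Code Before:` / `## Instruction:` / `## Code After:` headings, which is why its minimum length exceeds the individual fields'.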
a6f291a3beb7ecb7d67b81fe92e7cca6db2139dc
example_scraper.py
example_scraper.py
import json import requests API = 'http://localhost:8000/api/1.0' AUTH_PARAMS = { 'email': 'panda@pandaproject.net', 'api_key': 'edfe6c5ffd1be4d3bf22f69188ac6bc0fc04c84b' } # Create dataset dataset = { 'name': 'Test Dataset from API', 'schema': [{ 'column': 'A', 'type': 'unicode' }, { 'column': 'B', 'type': 'unicode' }, { 'column': 'C', 'type': 'unicode' }] } response = requests.post(API + '/dataset/', json.dumps(dataset), params=AUTH_PARAMS, headers={ 'Content-Type': 'application/json' }) dataset = json.loads(response.content) # Write data data = { 'objects': [{ 'data': ['The', 'PANDA', 'lives.'] }, { 'data': ['More', 'data', 'here.'] }]} response = requests.put(API + '/dataset/%s/data/' % dataset['slug'], json.dumps(data), params=AUTH_PARAMS, headers={ 'Content-Type': 'application/json' }) print response.content
import json import requests API = 'http://localhost:8000/api/1.0' AUTH_PARAMS = { 'email': 'panda@pandaproject.net', 'api_key': 'edfe6c5ffd1be4d3bf22f69188ac6bc0fc04c84b' } DATASET_SLUG = 'test-dataset' # Check if dataset exists response = requests.get(API + '/dataset/%s/' % DATASET_SLUG, params=AUTH_PARAMS) # Create dataset if necessary if response.status_code == 404: dataset = { 'name': 'Test Dataset from API', 'schema': [{ 'column': 'A', 'type': 'unicode' }, { 'column': 'B', 'type': 'unicode' }, { 'column': 'C', 'type': 'unicode' }] } response = requests.put(API + '/dataset/%s/' % DATASET_SLUG, json.dumps(dataset), params=AUTH_PARAMS, headers={ 'Content-Type': 'application/json' }) # Write data data = { 'objects': [{ 'data': ['The', 'PANDA', 'lives.'] }, { 'data': ['More', 'data', 'here.'] }]} response = requests.put(API + '/dataset/%s/data/' % DATASET_SLUG, json.dumps(data), params=AUTH_PARAMS, headers={ 'Content-Type': 'application/json' })
Update example scraper to use known slug.
Update example scraper to use known slug.
Python
mit
PalmBeachPost/panda,pandaproject/panda,PalmBeachPost/panda,newsapps/panda,ibrahimcesar/panda,pandaproject/panda,NUKnightLab/panda,ibrahimcesar/panda,NUKnightLab/panda,datadesk/panda,PalmBeachPost/panda,PalmBeachPost/panda,NUKnightLab/panda,ibrahimcesar/panda,datadesk/panda,ibrahimcesar/panda,pandaproject/panda,newsapps/panda,NUKnightLab/panda,datadesk/panda,pandaproject/panda,ibrahimcesar/panda,pandaproject/panda,datadesk/panda,newsapps/panda,PalmBeachPost/panda,newsapps/panda,datadesk/panda
python
## Code Before: import json import requests API = 'http://localhost:8000/api/1.0' AUTH_PARAMS = { 'email': 'panda@pandaproject.net', 'api_key': 'edfe6c5ffd1be4d3bf22f69188ac6bc0fc04c84b' } # Create dataset dataset = { 'name': 'Test Dataset from API', 'schema': [{ 'column': 'A', 'type': 'unicode' }, { 'column': 'B', 'type': 'unicode' }, { 'column': 'C', 'type': 'unicode' }] } response = requests.post(API + '/dataset/', json.dumps(dataset), params=AUTH_PARAMS, headers={ 'Content-Type': 'application/json' }) dataset = json.loads(response.content) # Write data data = { 'objects': [{ 'data': ['The', 'PANDA', 'lives.'] }, { 'data': ['More', 'data', 'here.'] }]} response = requests.put(API + '/dataset/%s/data/' % dataset['slug'], json.dumps(data), params=AUTH_PARAMS, headers={ 'Content-Type': 'application/json' }) print response.content ## Instruction: Update example scraper to use known slug. ## Code After: import json import requests API = 'http://localhost:8000/api/1.0' AUTH_PARAMS = { 'email': 'panda@pandaproject.net', 'api_key': 'edfe6c5ffd1be4d3bf22f69188ac6bc0fc04c84b' } DATASET_SLUG = 'test-dataset' # Check if dataset exists response = requests.get(API + '/dataset/%s/' % DATASET_SLUG, params=AUTH_PARAMS) # Create dataset if necessary if response.status_code == 404: dataset = { 'name': 'Test Dataset from API', 'schema': [{ 'column': 'A', 'type': 'unicode' }, { 'column': 'B', 'type': 'unicode' }, { 'column': 'C', 'type': 'unicode' }] } response = requests.put(API + '/dataset/%s/' % DATASET_SLUG, json.dumps(dataset), params=AUTH_PARAMS, headers={ 'Content-Type': 'application/json' }) # Write data data = { 'objects': [{ 'data': ['The', 'PANDA', 'lives.'] }, { 'data': ['More', 'data', 'here.'] }]} response = requests.put(API + '/dataset/%s/data/' % DATASET_SLUG, json.dumps(data), params=AUTH_PARAMS, headers={ 'Content-Type': 'application/json' })
12b8da460a88e94fd832d6ff20763ec82972bd10
.github/workflows/.vscode/settings.json
.github/workflows/.vscode/settings.json
{ "cSpell.words": [ "SHARPLAB" ] }
{ "cSpell.words": [ "azcliversion", "Brunner", "creds", "mirrorsharp", "pwsh", "SHARPLAB", "slpublic" ] }
Add custom words to dictionary
Add custom words to dictionary
JSON
bsd-2-clause
ashmind/SharpLab,ashmind/SharpLab,ashmind/SharpLab,ashmind/SharpLab,ashmind/SharpLab,ashmind/SharpLab
json
## Code Before: { "cSpell.words": [ "SHARPLAB" ] } ## Instruction: Add custom words to dictionary ## Code After: { "cSpell.words": [ "azcliversion", "Brunner", "creds", "mirrorsharp", "pwsh", "SHARPLAB", "slpublic" ] }
aa9717b62f1a0c0e41f06e9f54393d8cd55769f9
templates/demo/config/namesystem.js
templates/demo/config/namesystem.js
module.exports = { default: { available_providers: ["ens", "ipns"], provider: "ens", register: { rootDomain: "embark.eth", subdomains: { 'status': '0x1a2f3b98e434c02363f3dac3174af93c1d690914' } } } };
module.exports = { default: { available_providers: ["ens", "ipns"], provider: "ens" }, development: { register: { rootDomain: "embark.eth", subdomains: { 'status': '0x1a2f3b98e434c02363f3dac3174af93c1d690914' } } } };
Move name config to development
[CHORES] Move name config to development
JavaScript
mit
iurimatias/embark-framework,iurimatias/embark-framework
javascript
## Code Before: module.exports = { default: { available_providers: ["ens", "ipns"], provider: "ens", register: { rootDomain: "embark.eth", subdomains: { 'status': '0x1a2f3b98e434c02363f3dac3174af93c1d690914' } } } }; ## Instruction: [CHORES] Move name config to development ## Code After: module.exports = { default: { available_providers: ["ens", "ipns"], provider: "ens" }, development: { register: { rootDomain: "embark.eth", subdomains: { 'status': '0x1a2f3b98e434c02363f3dac3174af93c1d690914' } } } };
90440c20d9e9880de07aeec39da65873f988e166
plugin-interface/src/main/scala/mesosphere/marathon/plugin/RunSpec.scala
plugin-interface/src/main/scala/mesosphere/marathon/plugin/RunSpec.scala
package mesosphere.marathon.plugin /** * A Marathon RunSpec Definition */ trait RunSpec { /** * The uniqie id of this run specification */ val id: PathId /** * The Mesos resource roles that are accepted */ val acceptedResourceRoles: Set[String] /** * All defined secret definitions */ val secrets: Map[String, Secret] } /** * An application is a run spec that launches a single task. */ trait ApplicationSpec extends RunSpec { /** * The user to execute the container task */ val user: Option[String] /** * The environment of this container. */ val env: Map[String, EnvVarValue] /** * The labels in that container */ val labels: Map[String, String] } /** * A Marathon Container definition */ trait ContainerSpec { /** * The name of the container spec. */ val name: String /** * The user to execute the container task */ val user: Option[String] /** * The environment of this container. */ val env: Map[String, EnvVarValue] /** * The labels in that container */ val labels: Map[String, String] } /** * A pod is a run spec that launches a task group. */ trait PodSpec extends RunSpec { /** * All containers of this run specification */ val containers: Seq[ContainerSpec] }
package mesosphere.marathon.plugin /** * A Marathon RunSpec Definition */ trait RunSpec { /** * The uniqie id of this run specification */ val id: PathId /** * The Mesos resource roles that are accepted */ val acceptedResourceRoles: Set[String] /** * All defined secret definitions */ val secrets: Map[String, Secret] } /** * An application is a run spec that launches a single task. */ trait ApplicationSpec extends RunSpec { /** * The user to execute the container task */ val user: Option[String] /** * The environment of this container. */ val env: Map[String, EnvVarValue] /** * The labels in that container */ val labels: Map[String, String] } /** * A Marathon Container definition */ trait ContainerSpec { /** * The name of the container spec. */ val name: String /** * The user to execute the container task */ val user: Option[String] /** * The environment of this container. */ val env: Map[String, EnvVarValue] /** * The labels in that container */ val labels: Map[String, String] } /** * A pod is a run spec that launches a task group. */ trait PodSpec extends RunSpec { /** * All containers of this run specification */ val containers: Seq[ContainerSpec] /** * The environment shared for all containers inside this pod. */ val env: Map[String, EnvVarValue] }
Make pod.env available in the plugin definition.
Make pod.env available in the plugin definition. Summary: Users can define env vars on the level of pods (for every container) or on the level of containers. The same structure should be available to plugin developers. Test Plan: sbt test Reviewers: unterstein, meichstedt Reviewed By: unterstein, meichstedt Subscribers: marathon-team Differential Revision: https://phabricator.mesosphere.com/D210
Scala
apache-2.0
meln1k/marathon,Caerostris/marathon,mesosphere/marathon,meln1k/marathon,janisz/marathon,guenter/marathon,mesosphere/marathon,janisz/marathon,natemurthy/marathon,Caerostris/marathon,natemurthy/marathon,gsantovena/marathon,guenter/marathon,Caerostris/marathon,gsantovena/marathon,Caerostris/marathon,janisz/marathon,meln1k/marathon,janisz/marathon,mesosphere/marathon,guenter/marathon,gsantovena/marathon,guenter/marathon,natemurthy/marathon,gsantovena/marathon,meln1k/marathon,mesosphere/marathon,mesosphere/marathon,janisz/marathon,gsantovena/marathon,guenter/marathon,Caerostris/marathon,meln1k/marathon
scala
## Code Before: package mesosphere.marathon.plugin /** * A Marathon RunSpec Definition */ trait RunSpec { /** * The uniqie id of this run specification */ val id: PathId /** * The Mesos resource roles that are accepted */ val acceptedResourceRoles: Set[String] /** * All defined secret definitions */ val secrets: Map[String, Secret] } /** * An application is a run spec that launches a single task. */ trait ApplicationSpec extends RunSpec { /** * The user to execute the container task */ val user: Option[String] /** * The environment of this container. */ val env: Map[String, EnvVarValue] /** * The labels in that container */ val labels: Map[String, String] } /** * A Marathon Container definition */ trait ContainerSpec { /** * The name of the container spec. */ val name: String /** * The user to execute the container task */ val user: Option[String] /** * The environment of this container. */ val env: Map[String, EnvVarValue] /** * The labels in that container */ val labels: Map[String, String] } /** * A pod is a run spec that launches a task group. */ trait PodSpec extends RunSpec { /** * All containers of this run specification */ val containers: Seq[ContainerSpec] } ## Instruction: Make pod.env available in the plugin definition. Summary: Users can define env vars on the level of pods (for every container) or on the level of containers. The same structure should be available to plugin developers. Test Plan: sbt test Reviewers: unterstein, meichstedt Reviewed By: unterstein, meichstedt Subscribers: marathon-team Differential Revision: https://phabricator.mesosphere.com/D210 ## Code After: package mesosphere.marathon.plugin /** * A Marathon RunSpec Definition */ trait RunSpec { /** * The uniqie id of this run specification */ val id: PathId /** * The Mesos resource roles that are accepted */ val acceptedResourceRoles: Set[String] /** * All defined secret definitions */ val secrets: Map[String, Secret] } /** * An application is a run spec that launches a single task. */ trait ApplicationSpec extends RunSpec { /** * The user to execute the container task */ val user: Option[String] /** * The environment of this container. */ val env: Map[String, EnvVarValue] /** * The labels in that container */ val labels: Map[String, String] } /** * A Marathon Container definition */ trait ContainerSpec { /** * The name of the container spec. */ val name: String /** * The user to execute the container task */ val user: Option[String] /** * The environment of this container. */ val env: Map[String, EnvVarValue] /** * The labels in that container */ val labels: Map[String, String] } /** * A pod is a run spec that launches a task group. */ trait PodSpec extends RunSpec { /** * All containers of this run specification */ val containers: Seq[ContainerSpec] /** * The environment shared for all containers inside this pod. */ val env: Map[String, EnvVarValue] }
0969be07f45b533919783427f2077b9d75978d0a
Cargo.toml
Cargo.toml
[package] name = "stm32-bootloader-client" version = "0.1.0" edition = "2021" license = "BSD-3-Clause" [dependencies] embedded-hal = "0.2.6" defmt = { version = "0.2.1", optional = true } log = { version = "0.4.14", optional = true } [dev-dependencies] anyhow = "1.0.38" embedded-hal-mock = "0.8.0" mcp2221 = "0.1.0" [features] default = ["std"] std = []
[package] name = "stm32-bootloader-client" description = "A library for communicating with the STM32 system bootloader" repository = "https://github.com/google/stm32-bootloader-client-rs" version = "0.1.0" edition = "2021" license = "BSD-3-Clause" [dependencies] embedded-hal = "0.2.6" defmt = { version = "0.2.1", optional = true } log = { version = "0.4.14", optional = true } [dev-dependencies] anyhow = "1.0.38" embedded-hal-mock = "0.8.0" mcp2221 = "0.1.0" [features] default = ["std"] std = []
Add description and repository link
Add description and repository link
TOML
apache-2.0
google/stm32-bootloader-client-rs
toml
## Code Before: [package] name = "stm32-bootloader-client" version = "0.1.0" edition = "2021" license = "BSD-3-Clause" [dependencies] embedded-hal = "0.2.6" defmt = { version = "0.2.1", optional = true } log = { version = "0.4.14", optional = true } [dev-dependencies] anyhow = "1.0.38" embedded-hal-mock = "0.8.0" mcp2221 = "0.1.0" [features] default = ["std"] std = [] ## Instruction: Add description and repository link ## Code After: [package] name = "stm32-bootloader-client" description = "A library for communicating with the STM32 system bootloader" repository = "https://github.com/google/stm32-bootloader-client-rs" version = "0.1.0" edition = "2021" license = "BSD-3-Clause" [dependencies] embedded-hal = "0.2.6" defmt = { version = "0.2.1", optional = true } log = { version = "0.4.14", optional = true } [dev-dependencies] anyhow = "1.0.38" embedded-hal-mock = "0.8.0" mcp2221 = "0.1.0" [features] default = ["std"] std = []
cfef82ffb57292d98fed2a6334b9352b606d2f6b
metadata/org.piwik.mobile2.txt
metadata/org.piwik.mobile2.txt
Categories:Internet License:GPL-3.0-only Web Site:http://www.piwik.org Source Code:https://github.com/piwik/piwik-mobile-2 Issue Tracker: Summary:Web Analytics Description: Piwik is a free and open source web analytics application written by a team of international developers that runs on a PHP/MySQL webserver. . Repo Type:git Repo:https://github.com/piwik/piwik-mobile-2.git Build:2.0.0,1 disable=needs titanium commit=92ca99c389 Auto Update Mode:None Update Check Mode:RepoManifest Current Version:2.0.1 Current Version Code:1
Categories:Internet License:GPL-3.0-only Web Site:http://www.piwik.org Source Code:https://github.com/piwik/piwik-mobile-2 Issue Tracker:https://github.com/matomo-org/matomo-mobile-2/issues Changelog:https://matomo.org/blog/category/piwik-mobile-changelog/ Summary:Web Analytics Description: Piwik is a free and open source web analytics application written by a team of international developers that runs on a PHP/MySQL webserver. . Repo Type:git Repo:https://github.com/piwik/piwik-mobile-2.git Build:2.0.0,1 disable=needs titanium commit=92ca99c389 Auto Update Mode:None Update Check Mode:RepoManifest Current Version:2.0.1 Current Version Code:1
Add changelog + issues to Piwik2
Add changelog + issues to Piwik2
Text
agpl-3.0
f-droid/fdroiddata,f-droid/fdroiddata,f-droid/fdroid-data
text
## Code Before: Categories:Internet License:GPL-3.0-only Web Site:http://www.piwik.org Source Code:https://github.com/piwik/piwik-mobile-2 Issue Tracker: Summary:Web Analytics Description: Piwik is a free and open source web analytics application written by a team of international developers that runs on a PHP/MySQL webserver. . Repo Type:git Repo:https://github.com/piwik/piwik-mobile-2.git Build:2.0.0,1 disable=needs titanium commit=92ca99c389 Auto Update Mode:None Update Check Mode:RepoManifest Current Version:2.0.1 Current Version Code:1 ## Instruction: Add changelog + issues to Piwik2 ## Code After: Categories:Internet License:GPL-3.0-only Web Site:http://www.piwik.org Source Code:https://github.com/piwik/piwik-mobile-2 Issue Tracker:https://github.com/matomo-org/matomo-mobile-2/issues Changelog:https://matomo.org/blog/category/piwik-mobile-changelog/ Summary:Web Analytics Description: Piwik is a free and open source web analytics application written by a team of international developers that runs on a PHP/MySQL webserver. . Repo Type:git Repo:https://github.com/piwik/piwik-mobile-2.git Build:2.0.0,1 disable=needs titanium commit=92ca99c389 Auto Update Mode:None Update Check Mode:RepoManifest Current Version:2.0.1 Current Version Code:1
b208aa675ec1745af4fa2c2b461ddb2c7367b2e7
.travis.yml
.travis.yml
language: ruby matrix: include: - env: TAGS= rvm: 2.3.3 # - env: TAGS=tcell # rvm: 2.3.3 install: - sudo add-apt-repository -y ppa:pi-rho/dev - sudo apt-add-repository -y ppa:fish-shell/release-2 - sudo apt-get update - sudo apt-get install -y tmux zsh fish script: | make test install && ./install --all && tmux new "ruby test/test_go.rb > out && touch ok" && cat out && [ -e ok ]
language: ruby dist: trusty sudo: required matrix: include: - env: TAGS= rvm: 2.3.3 # - env: TAGS=tcell # rvm: 2.3.3 install: - sudo add-apt-repository -y ppa:pi-rho/dev - sudo apt-add-repository -y ppa:fish-shell/release-2 - sudo apt-get update - sudo apt-get install -y tmux zsh fish script: | make test install && ./install --all && tmux new "ruby test/test_go.rb > out && touch ok" && cat out && [ -e ok ]
Update Travis build to run on Trusty
Update Travis build to run on Trusty
YAML
mit
infokiller/fzf,wilywampa/fzf,wilywampa/fzf,junegunn/fzf,janlazo/fzf,infokiller/fzf,janlazo/fzf,infokiller/fzf,janlazo/fzf,wilywampa/fzf,junegunn/fzf,junegunn/fzf,stuha/wbr-zsh
yaml
## Code Before: language: ruby matrix: include: - env: TAGS= rvm: 2.3.3 # - env: TAGS=tcell # rvm: 2.3.3 install: - sudo add-apt-repository -y ppa:pi-rho/dev - sudo apt-add-repository -y ppa:fish-shell/release-2 - sudo apt-get update - sudo apt-get install -y tmux zsh fish script: | make test install && ./install --all && tmux new "ruby test/test_go.rb > out && touch ok" && cat out && [ -e ok ] ## Instruction: Update Travis build to run on Trusty ## Code After: language: ruby dist: trusty sudo: required matrix: include: - env: TAGS= rvm: 2.3.3 # - env: TAGS=tcell # rvm: 2.3.3 install: - sudo add-apt-repository -y ppa:pi-rho/dev - sudo apt-add-repository -y ppa:fish-shell/release-2 - sudo apt-get update - sudo apt-get install -y tmux zsh fish script: | make test install && ./install --all && tmux new "ruby test/test_go.rb > out && touch ok" && cat out && [ -e ok ]
ab273709876f4e25d03f43cc96c77aa46bcc9f4e
config/symplify.php
config/symplify.php
<?php declare(strict_types=1); use PhpCsFixer\Fixer\ClassNotation\FinalInternalClassFixer; use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator; return static function (ContainerConfigurator $containerConfigurator): void { $containerConfigurator->import(__DIR__ . '/config.php'); $services = $containerConfigurator->services(); $services->defaults() ->public() ->autowire() ->autoconfigure(); $services->set(FinalInternalClassFixer::class); $services->load('Symplify\CodingStandard\Fixer\\', __DIR__ . '/../src/Fixer') ->exclude([__DIR__ . '/../src/Fixer/Annotation']); };
<?php declare(strict_types=1); use PhpCsFixer\Fixer\ClassNotation\FinalInternalClassFixer; use PhpCsFixer\Fixer\Phpdoc\GeneralPhpdocAnnotationRemoveFixer; use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator; return static function (ContainerConfigurator $containerConfigurator): void { $containerConfigurator->import(__DIR__ . '/config.php'); $services = $containerConfigurator->services(); $services->defaults() ->public() ->autowire() ->autoconfigure(); $services->set(FinalInternalClassFixer::class); $services->load('Symplify\CodingStandard\Fixer\\', __DIR__ . '/../src/Fixer') ->exclude([__DIR__ . '/../src/Fixer/Annotation']); $services->set(GeneralPhpdocAnnotationRemoveFixer::class) ->call('configure', [[ 'annotations' => ['throws', 'author', 'package', 'group', 'covers'], ]]); };
Add general doc removal for thorws
[ECS] Add general doc removal for thorws
PHP
mit
Symplify/CodingStandard
php
## Code Before: <?php declare(strict_types=1); use PhpCsFixer\Fixer\ClassNotation\FinalInternalClassFixer; use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator; return static function (ContainerConfigurator $containerConfigurator): void { $containerConfigurator->import(__DIR__ . '/config.php'); $services = $containerConfigurator->services(); $services->defaults() ->public() ->autowire() ->autoconfigure(); $services->set(FinalInternalClassFixer::class); $services->load('Symplify\CodingStandard\Fixer\\', __DIR__ . '/../src/Fixer') ->exclude([__DIR__ . '/../src/Fixer/Annotation']); }; ## Instruction: [ECS] Add general doc removal for thorws ## Code After: <?php declare(strict_types=1); use PhpCsFixer\Fixer\ClassNotation\FinalInternalClassFixer; use PhpCsFixer\Fixer\Phpdoc\GeneralPhpdocAnnotationRemoveFixer; use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator; return static function (ContainerConfigurator $containerConfigurator): void { $containerConfigurator->import(__DIR__ . '/config.php'); $services = $containerConfigurator->services(); $services->defaults() ->public() ->autowire() ->autoconfigure(); $services->set(FinalInternalClassFixer::class); $services->load('Symplify\CodingStandard\Fixer\\', __DIR__ . '/../src/Fixer') ->exclude([__DIR__ . '/../src/Fixer/Annotation']); $services->set(GeneralPhpdocAnnotationRemoveFixer::class) ->call('configure', [[ 'annotations' => ['throws', 'author', 'package', 'group', 'covers'], ]]); };
22159527d2ec81497d37cb75ea7770f5cfa311b8
README.md
README.md
[![Build Status](https://travis-ci.org/Flink/lita-gitlab-ci-hipchat.svg?branch=develop)](https://travis-ci.org/Flink/lita-gitlab-ci-hipchat) [![Coverage Status](https://img.shields.io/coveralls/Flink/lita-gitlab-ci-hipchat.svg)](https://coveralls.io/r/Flink/lita-gitlab-ci-hipchat) Receive and display nicely web hooks from GitLab CI in HipChat. ## Installation Add lita-gitlab-ci-hipchat to your Lita instance's Gemfile: ``` ruby gem 'lita-gitlab-ci-hipchat' ``` ## Configuration ```ruby Lita.configure do |config| # The API token for your bot’s user config.handlers.gitlab_ci_hipchat.api_token = 'token' # The room to be notified (HipChat name, not JID) config.handlers.gitlab_ci_hipchat.room = 'my_room' end ``` ## Usage This handler add a HTTP route at `/gitlab-ci`. So you have to add a web hook pointing to that URL (http://lita-bot.tld/gitlab-ci). ## License [MIT](http://opensource.org/licenses/MIT)
[![Build Status](https://travis-ci.org/Flink/lita-gitlab-ci-hipchat.svg?branch=develop)](https://travis-ci.org/Flink/lita-gitlab-ci-hipchat) [![Coverage Status](https://coveralls.io/repos/Flink/lita-gitlab-ci-hipchat/badge.png?branch=develop)](https://coveralls.io/r/Flink/lita-gitlab-ci-hipchat?branch=develop) Receive and display nicely web hooks from GitLab CI in HipChat. ## Installation Add lita-gitlab-ci-hipchat to your Lita instance's Gemfile: ``` ruby gem 'lita-gitlab-ci-hipchat' ``` ## Configuration ```ruby Lita.configure do |config| # The API token for your bot’s user config.handlers.gitlab_ci_hipchat.api_token = 'token' # The room to be notified (HipChat name, not JID) config.handlers.gitlab_ci_hipchat.room = 'my_room' end ``` ## Usage This handler add a HTTP route at `/gitlab-ci`. So you have to add a web hook pointing to that URL (http://lita-bot.tld/gitlab-ci). ## License [MIT](http://opensource.org/licenses/MIT)
Use PNG badge for Coveralls instead of SVG one
Use PNG badge for Coveralls instead of SVG one
Markdown
mit
Flink/lita-gitlab-ci-hipchat,Flink/lita-gitlab-ci-hipchat
markdown
## Code Before: [![Build Status](https://travis-ci.org/Flink/lita-gitlab-ci-hipchat.svg?branch=develop)](https://travis-ci.org/Flink/lita-gitlab-ci-hipchat) [![Coverage Status](https://img.shields.io/coveralls/Flink/lita-gitlab-ci-hipchat.svg)](https://coveralls.io/r/Flink/lita-gitlab-ci-hipchat) Receive and display nicely web hooks from GitLab CI in HipChat. ## Installation Add lita-gitlab-ci-hipchat to your Lita instance's Gemfile: ``` ruby gem 'lita-gitlab-ci-hipchat' ``` ## Configuration ```ruby Lita.configure do |config| # The API token for your bot’s user config.handlers.gitlab_ci_hipchat.api_token = 'token' # The room to be notified (HipChat name, not JID) config.handlers.gitlab_ci_hipchat.room = 'my_room' end ``` ## Usage This handler add a HTTP route at `/gitlab-ci`. So you have to add a web hook pointing to that URL (http://lita-bot.tld/gitlab-ci). ## License [MIT](http://opensource.org/licenses/MIT) ## Instruction: Use PNG badge for Coveralls instead of SVG one ## Code After: [![Build Status](https://travis-ci.org/Flink/lita-gitlab-ci-hipchat.svg?branch=develop)](https://travis-ci.org/Flink/lita-gitlab-ci-hipchat) [![Coverage Status](https://coveralls.io/repos/Flink/lita-gitlab-ci-hipchat/badge.png?branch=develop)](https://coveralls.io/r/Flink/lita-gitlab-ci-hipchat?branch=develop) Receive and display nicely web hooks from GitLab CI in HipChat. ## Installation Add lita-gitlab-ci-hipchat to your Lita instance's Gemfile: ``` ruby gem 'lita-gitlab-ci-hipchat' ``` ## Configuration ```ruby Lita.configure do |config| # The API token for your bot’s user config.handlers.gitlab_ci_hipchat.api_token = 'token' # The room to be notified (HipChat name, not JID) config.handlers.gitlab_ci_hipchat.room = 'my_room' end ``` ## Usage This handler add a HTTP route at `/gitlab-ci`. So you have to add a web hook pointing to that URL (http://lita-bot.tld/gitlab-ci). ## License [MIT](http://opensource.org/licenses/MIT)
1220784cd1d05b7f2ca12a05a5797a4806cdb223
settings/setup-interface.el
settings/setup-interface.el
;; Font (set-face-attribute 'default nil :family "Menlo" :height 120) (set-face-attribute 'fixed-pitch nil :family 'unspecified :inherit 'default) ;; Turn off mouse interface early in startup to avoid momentary display (if (fboundp 'menu-bar-mode) (menu-bar-mode -1)) (if (fboundp 'tabbar-mode) (tabbar-mode -1)) (if (fboundp 'tool-bar-mode) (tool-bar-mode -1)) (if (fboundp 'scroll-bar-mode) (scroll-bar-mode -1)) ;; vertical window split (setq split-width-threshold 190) (setq split-height-threshold nil) ;; Line wrap indicator (setq-default visual-line-fringe-indicators '(left-curly-arrow nil)) (show-paren-mode 1) (provide 'setup-interface)
;; Font (set-face-attribute 'default nil :family "Menlo" :height 160) (set-face-attribute 'fixed-pitch nil :family 'unspecified :inherit 'default) ;; Turn off mouse interface early in startup to avoid momentary display (if (fboundp 'menu-bar-mode) (menu-bar-mode -1)) (if (fboundp 'tabbar-mode) (tabbar-mode -1)) (if (fboundp 'tool-bar-mode) (tool-bar-mode -1)) (if (fboundp 'scroll-bar-mode) (scroll-bar-mode -1)) ;; vertical window split (setq split-width-threshold 190) (setq split-height-threshold nil) ;; Line wrap indicator (setq-default visual-line-fringe-indicators '(left-curly-arrow nil)) ;; Fill column (setq fill-column 80) (show-paren-mode 1) (provide 'setup-interface)
Increase font height and set fill-column to 80
Increase font height and set fill-column to 80
Emacs Lisp
mit
hsienchiaolee/aquamacs
emacs-lisp
## Code Before: ;; Font (set-face-attribute 'default nil :family "Menlo" :height 120) (set-face-attribute 'fixed-pitch nil :family 'unspecified :inherit 'default) ;; Turn off mouse interface early in startup to avoid momentary display (if (fboundp 'menu-bar-mode) (menu-bar-mode -1)) (if (fboundp 'tabbar-mode) (tabbar-mode -1)) (if (fboundp 'tool-bar-mode) (tool-bar-mode -1)) (if (fboundp 'scroll-bar-mode) (scroll-bar-mode -1)) ;; vertical window split (setq split-width-threshold 190) (setq split-height-threshold nil) ;; Line wrap indicator (setq-default visual-line-fringe-indicators '(left-curly-arrow nil)) (show-paren-mode 1) (provide 'setup-interface) ## Instruction: Increase font height and set fill-column to 80 ## Code After: ;; Font (set-face-attribute 'default nil :family "Menlo" :height 160) (set-face-attribute 'fixed-pitch nil :family 'unspecified :inherit 'default) ;; Turn off mouse interface early in startup to avoid momentary display (if (fboundp 'menu-bar-mode) (menu-bar-mode -1)) (if (fboundp 'tabbar-mode) (tabbar-mode -1)) (if (fboundp 'tool-bar-mode) (tool-bar-mode -1)) (if (fboundp 'scroll-bar-mode) (scroll-bar-mode -1)) ;; vertical window split (setq split-width-threshold 190) (setq split-height-threshold nil) ;; Line wrap indicator (setq-default visual-line-fringe-indicators '(left-curly-arrow nil)) ;; Fill column (setq fill-column 80) (show-paren-mode 1) (provide 'setup-interface)
a3e0a4afff7799dd6b524c896cc0e88e92612004
src/components/Posts/posts.html
src/components/Posts/posts.html
<div v-if="posts.length"> <h1 class="mb-4"><i class="fa fa-file-text-o"></i> Posts</h1> <div class="list-group"> <router-link class="list-group-item" v-for="(post, index) in posts" :to="{ name: 'post', params: { id: post.id }}">#{{ index+1 }} {{ post.title }} </router-link> </div> </div>
<div v-if="posts.length"> <div v-show="posts.length" class="mb-4"> <h1 class="mb-4"><i class="fa fa-file-text-o"></i> Posts</h1> <div class="list-group"> <router-link class="list-group-item" v-for="(post, index) in posts" :to="{ name: 'post', params: { id: post.id }}">#{{ index+1 }} {{ post.title }} </router-link> </div> </div>
Add a bit spacing below heading
Add a bit spacing below heading
HTML
mit
villeristi/vue.js-starter-template,villeristi/vue.js-starter-template
html
## Code Before: <div v-if="posts.length"> <h1 class="mb-4"><i class="fa fa-file-text-o"></i> Posts</h1> <div class="list-group"> <router-link class="list-group-item" v-for="(post, index) in posts" :to="{ name: 'post', params: { id: post.id }}">#{{ index+1 }} {{ post.title }} </router-link> </div> </div> ## Instruction: Add a bit spacing below heading ## Code After: <div v-if="posts.length"> <div v-show="posts.length" class="mb-4"> <h1 class="mb-4"><i class="fa fa-file-text-o"></i> Posts</h1> <div class="list-group"> <router-link class="list-group-item" v-for="(post, index) in posts" :to="{ name: 'post', params: { id: post.id }}">#{{ index+1 }} {{ post.title }} </router-link> </div> </div>
12d91af46eec64ab4f04088d37cebb653eaca32f
app/helpers/application_helper/button/utilization_download.rb
app/helpers/application_helper/button/utilization_download.rb
class ApplicationHelper::Button::UtilizationDownload < ApplicationHelper::Button::Basic def disabled? # to enable the button we are in the "Utilization" and have trend report return false if @layout == 'miq_capacity_utilization' && @sb[:active_tab] == 'report' && !@sb.fetch_path(:trend_rpt).table.data.empty? # otherwise the button is off @error_message = _('No records found for this report') @error_message.present? end end
class ApplicationHelper::Button::UtilizationDownload < ApplicationHelper::Button::Basic def disabled? # to enable the button we are in the "Utilization" and have trend report return false if @layout == 'miq_capacity_utilization' && @sb[:active_tab] == 'report' && @sb.fetch_path(:trend_rpt, :table, :data).present? # otherwise the button is off @error_message = _('No records found for this report') @error_message.present? end end
Fix error about empty? called on nil object
Fix error about empty? called on nil object
Ruby
apache-2.0
ManageIQ/manageiq-ui-classic,ManageIQ/manageiq-ui-classic,ManageIQ/manageiq-ui-classic,ManageIQ/manageiq-ui-classic
ruby
## Code Before: class ApplicationHelper::Button::UtilizationDownload < ApplicationHelper::Button::Basic def disabled? # to enable the button we are in the "Utilization" and have trend report return false if @layout == 'miq_capacity_utilization' && @sb[:active_tab] == 'report' && !@sb.fetch_path(:trend_rpt).table.data.empty? # otherwise the button is off @error_message = _('No records found for this report') @error_message.present? end end ## Instruction: Fix error about empty? called on nil object ## Code After: class ApplicationHelper::Button::UtilizationDownload < ApplicationHelper::Button::Basic def disabled? # to enable the button we are in the "Utilization" and have trend report return false if @layout == 'miq_capacity_utilization' && @sb[:active_tab] == 'report' && @sb.fetch_path(:trend_rpt, :table, :data).present? # otherwise the button is off @error_message = _('No records found for this report') @error_message.present? end end
623f06fc14ab813ea128ad5c1d8ff65bef95eb26
run_tests.sh
run_tests.sh
set -ex [[ ! "$GOVERSION" ]] && GOVERSION=1.12 REPO=dcrwallet # To run on docker on windows, symlink /mnt/c to /c and then execute the script # from the repo path under /c. See: # https://github.com/Microsoft/BashOnWindows/issues/1854 # for more details. testrepo () { go version env GORACE='halt_on_error=1' CC=gcc GOTESTFLAGS='-race -short' bash ./testmodules.sh } DOCKER= [[ "$1" == "docker" ]] && DOCKER=docker [[ "$1" == "podman" ]] && DOCKER=podman if [[ ! "$DOCKER" ]]; then testrepo exit fi DOCKER_IMAGE_TAG=decred-golang-builder-$GOVERSION $DOCKER pull decred/$DOCKER_IMAGE_TAG $DOCKER run --rm -it -v $(pwd):/src:Z decred/$DOCKER_IMAGE_TAG /bin/bash -c "\ cp -R /src ~/src && \ cd ~/src && \ env GOVERSION=$GOVERSION bash ./run_tests.sh"
set -ex [[ ! "$GOVERSION" ]] && GOVERSION=1.12 REPO=dcrwallet # To run on docker on windows, symlink /mnt/c to /c and then execute the script # from the repo path under /c. See: # https://github.com/Microsoft/BashOnWindows/issues/1854 # for more details. testrepo () { go version env CC=gcc GOTESTFLAGS='-short' bash ./testmodules.sh } DOCKER= [[ "$1" == "docker" ]] && DOCKER=docker [[ "$1" == "podman" ]] && DOCKER=podman if [[ ! "$DOCKER" ]]; then testrepo exit fi DOCKER_IMAGE_TAG=decred-golang-builder-$GOVERSION $DOCKER pull decred/$DOCKER_IMAGE_TAG $DOCKER run --rm -it -v $(pwd):/src:Z decred/$DOCKER_IMAGE_TAG /bin/bash -c "\ cp -R /src ~/src && \ cd ~/src && \ env GOVERSION=$GOVERSION bash ./run_tests.sh"
Remove -race from CI test scripts
Remove -race from CI test scripts Race-enabled testing is an order of magnitude slower. This causes issues for pull requests which add more tests, due to elapsed timeouts on the CI infrastructure, and it is more desirable to have a comprehensive test suite running without -race than few tests with it. Further improvements may be made by caching the Go build cache and/or investigating other infra. To test with -race locally, just run go test -race as usual, or (better) build and run a race-enabled binary.
Shell
isc
jrick/dcrwallet,decred/dcrwallet,jrick/btcwallet,jrick/dcrwallet,jrick/btcwallet,decred/dcrwallet
shell
## Code Before: set -ex [[ ! "$GOVERSION" ]] && GOVERSION=1.12 REPO=dcrwallet # To run on docker on windows, symlink /mnt/c to /c and then execute the script # from the repo path under /c. See: # https://github.com/Microsoft/BashOnWindows/issues/1854 # for more details. testrepo () { go version env GORACE='halt_on_error=1' CC=gcc GOTESTFLAGS='-race -short' bash ./testmodules.sh } DOCKER= [[ "$1" == "docker" ]] && DOCKER=docker [[ "$1" == "podman" ]] && DOCKER=podman if [[ ! "$DOCKER" ]]; then testrepo exit fi DOCKER_IMAGE_TAG=decred-golang-builder-$GOVERSION $DOCKER pull decred/$DOCKER_IMAGE_TAG $DOCKER run --rm -it -v $(pwd):/src:Z decred/$DOCKER_IMAGE_TAG /bin/bash -c "\ cp -R /src ~/src && \ cd ~/src && \ env GOVERSION=$GOVERSION bash ./run_tests.sh" ## Instruction: Remove -race from CI test scripts Race-enabled testing is an order of magnitude slower. This causes issues for pull requests which add more tests, due to elapsed timeouts on the CI infrastructure, and it is more desirable to have a comprehensive test suite running without -race than few tests with it. Further improvements may be made by caching the Go build cache and/or investigating other infra. To test with -race locally, just run go test -race as usual, or (better) build and run a race-enabled binary. ## Code After: set -ex [[ ! "$GOVERSION" ]] && GOVERSION=1.12 REPO=dcrwallet # To run on docker on windows, symlink /mnt/c to /c and then execute the script # from the repo path under /c. See: # https://github.com/Microsoft/BashOnWindows/issues/1854 # for more details. testrepo () { go version env CC=gcc GOTESTFLAGS='-short' bash ./testmodules.sh } DOCKER= [[ "$1" == "docker" ]] && DOCKER=docker [[ "$1" == "podman" ]] && DOCKER=podman if [[ ! "$DOCKER" ]]; then testrepo exit fi DOCKER_IMAGE_TAG=decred-golang-builder-$GOVERSION $DOCKER pull decred/$DOCKER_IMAGE_TAG $DOCKER run --rm -it -v $(pwd):/src:Z decred/$DOCKER_IMAGE_TAG /bin/bash -c "\ cp -R /src ~/src && \ cd ~/src && \ env GOVERSION=$GOVERSION bash ./run_tests.sh"
17403a197c48b54fe6085bb883b28a7934577ac9
README.md
README.md
[![version: Bintray](https://api.bintray.com/packages/jfreeman/jfreeman/autocheck%3Ajfreeman/images/download.svg)](https://bintray.com/jfreeman/jfreeman/autocheck%3Ajfreeman/_latestVersion) [![build status: Linux and Mac OSX](https://travis-ci.org/thejohnfreeman/autocheck.svg?branch=master)](https://travis-ci.org/thejohnfreeman/autocheck) [![build status: Windows](https://ci.appveyor.com/api/projects/status/github/thejohnfreeman/autocheck?branch=master&svg=true)](https://ci.appveyor.com/project/thejohnfreeman/autocheck) Header-only C++11 library for QuickCheck (and later, SmallCheck) testing. Please consult the [wiki][] for documentation. [wiki]: http://github.com/thejohnfreeman/autocheck/wiki ## Install ```sh conan remote add autocheck https://api.bintray.com/conan/thejohnfreeman/autocheck conan install autocheck/[*]@autocheck/stable ```
[![version: Bintray](https://api.bintray.com/packages/jfreeman/jfreeman/autocheck%3Ajfreeman/images/download.svg)](https://bintray.com/jfreeman/jfreeman/autocheck%3Ajfreeman/_latestVersion) [![build status: Linux and Mac OSX](https://travis-ci.org/thejohnfreeman/autocheck.svg?branch=master)](https://travis-ci.org/thejohnfreeman/autocheck) [![build status: Windows](https://ci.appveyor.com/api/projects/status/github/thejohnfreeman/autocheck?branch=master&svg=true)](https://ci.appveyor.com/project/thejohnfreeman/autocheck) Header-only C++11 library for QuickCheck (and later, SmallCheck) testing. Please consult the [wiki][] for documentation. [wiki]: http://github.com/thejohnfreeman/autocheck/wiki ## Install ```sh conan remote add jfreeman https://api.bintray.com/conan/jfreeman/jfreeman conan install autocheck/[*]@jfreeman/stable ```
Update installation instructions for jfreeman Conan repository
Update installation instructions for jfreeman Conan repository [skip ci]
Markdown
isc
thejohnfreeman/autocheck,thejohnfreeman/autocheck
markdown
## Code Before: [![version: Bintray](https://api.bintray.com/packages/jfreeman/jfreeman/autocheck%3Ajfreeman/images/download.svg)](https://bintray.com/jfreeman/jfreeman/autocheck%3Ajfreeman/_latestVersion) [![build status: Linux and Mac OSX](https://travis-ci.org/thejohnfreeman/autocheck.svg?branch=master)](https://travis-ci.org/thejohnfreeman/autocheck) [![build status: Windows](https://ci.appveyor.com/api/projects/status/github/thejohnfreeman/autocheck?branch=master&svg=true)](https://ci.appveyor.com/project/thejohnfreeman/autocheck) Header-only C++11 library for QuickCheck (and later, SmallCheck) testing. Please consult the [wiki][] for documentation. [wiki]: http://github.com/thejohnfreeman/autocheck/wiki ## Install ```sh conan remote add autocheck https://api.bintray.com/conan/thejohnfreeman/autocheck conan install autocheck/[*]@autocheck/stable ``` ## Instruction: Update installation instructions for jfreeman Conan repository [skip ci] ## Code After: [![version: Bintray](https://api.bintray.com/packages/jfreeman/jfreeman/autocheck%3Ajfreeman/images/download.svg)](https://bintray.com/jfreeman/jfreeman/autocheck%3Ajfreeman/_latestVersion) [![build status: Linux and Mac OSX](https://travis-ci.org/thejohnfreeman/autocheck.svg?branch=master)](https://travis-ci.org/thejohnfreeman/autocheck) [![build status: Windows](https://ci.appveyor.com/api/projects/status/github/thejohnfreeman/autocheck?branch=master&svg=true)](https://ci.appveyor.com/project/thejohnfreeman/autocheck) Header-only C++11 library for QuickCheck (and later, SmallCheck) testing. Please consult the [wiki][] for documentation. [wiki]: http://github.com/thejohnfreeman/autocheck/wiki ## Install ```sh conan remote add jfreeman https://api.bintray.com/conan/jfreeman/jfreeman conan install autocheck/[*]@jfreeman/stable ```
eff47fe4b2d57fde12d03eecbdac0f68a09f39d0
lib/nord-atom-ui-vertical-tabs.coffee
lib/nord-atom-ui-vertical-tabs.coffee
module.exports = NordAtomUiVerticalTabs = activate: (state) -> deactivate: ->
selfName = 'nord-atom-ui-vertical-tabs' setDirection = (isLeft) -> panes = document.querySelectorAll('atom-pane') for pane, i in panes pane.style.flexDirection = if isLeft then 'row' else 'row-reverse' atom.config.onDidChange(selfName + '.isLeftTab', (isLeft) -> setDirection(isLeft['newValue']); ) module.exports = NordAtomUiVerticalTabs = config: isLeftTab: title: 'Left tabs' description: 'Show tabs in "left side" of editor' type: 'boolean' default: true activate: (state) -> isLeft = atom.config.get(this.selfName + '.isLeftTab') setDirection(isLeft) deactivate: -> panes = document.querySelectorAll('atom-pane') for pane, i in panes pane.style.flexDirection = 'column'
Support switching side of tabs
Support switching side of tabs
CoffeeScript
mit
hmsk/nord-atom-ui-vertical-tabs
coffeescript
## Code Before: module.exports = NordAtomUiVerticalTabs = activate: (state) -> deactivate: -> ## Instruction: Support switching side of tabs ## Code After: selfName = 'nord-atom-ui-vertical-tabs' setDirection = (isLeft) -> panes = document.querySelectorAll('atom-pane') for pane, i in panes pane.style.flexDirection = if isLeft then 'row' else 'row-reverse' atom.config.onDidChange(selfName + '.isLeftTab', (isLeft) -> setDirection(isLeft['newValue']); ) module.exports = NordAtomUiVerticalTabs = config: isLeftTab: title: 'Left tabs' description: 'Show tabs in "left side" of editor' type: 'boolean' default: true activate: (state) -> isLeft = atom.config.get(this.selfName + '.isLeftTab') setDirection(isLeft) deactivate: -> panes = document.querySelectorAll('atom-pane') for pane, i in panes pane.style.flexDirection = 'column'
c256b5165cd3bc582b357a9d98b3aacc8c4bc338
ArgusWeb/app/js/services/downloadHelper.js
ArgusWeb/app/js/services/downloadHelper.js
/** * Created by liuxizi.xu on 2/2/17. */ 'use strict'; /*global angular:false */ angular.module('argus.services.downloadHelper', []) .service('DownloadHelper', function () { this.downloadFile = function (data, filename) { var blob = new Blob([data], {type: "text/plain;charset=utf-8"}); saveAs(blob, filename); }; });
/** * Created by liuxizi.xu on 2/2/17. */ 'use strict'; /*global angular:false */ angular.module('argus.services.downloadHelper', []) .service('DownloadHelper', function () { this.downloadFile = function (data, filename) { var blob = new Blob([data], {type: "text/plain"}); saveAs(blob, filename); }; });
Fix downloaded file has invisible characters issue
Fix downloaded file has invisible characters issue
JavaScript
bsd-3-clause
dilipdevaraj-sfdc/Argus-1,xizi-xu/Argus,xizi-xu/Argus,SalesforceEng/Argus,rajsarkapally/Argus,SalesforceEng/Argus,rajsarkapally-sfdc/Argus,xizi-xu/Argus,dilipdevaraj-sfdc/Argus-1,dilipdevaraj-sfdc/Argus-1,salesforce/Argus,rajsarkapally/Argus,rajsarkapally-sfdc/Argus,dilipdevaraj-sfdc/Argus-1,salesforce/Argus,rajsarkapally/Argus,salesforce/Argus,rajsarkapally-sfdc/Argus,SalesforceEng/Argus,rajsarkapally/Argus,salesforce/Argus,rajsarkapally-sfdc/Argus,xizi-xu/Argus,SalesforceEng/Argus,rajsarkapally/Argus
javascript
## Code Before: /** * Created by liuxizi.xu on 2/2/17. */ 'use strict'; /*global angular:false */ angular.module('argus.services.downloadHelper', []) .service('DownloadHelper', function () { this.downloadFile = function (data, filename) { var blob = new Blob([data], {type: "text/plain;charset=utf-8"}); saveAs(blob, filename); }; }); ## Instruction: Fix downloaded file has invisible characters issue ## Code After: /** * Created by liuxizi.xu on 2/2/17. */ 'use strict'; /*global angular:false */ angular.module('argus.services.downloadHelper', []) .service('DownloadHelper', function () { this.downloadFile = function (data, filename) { var blob = new Blob([data], {type: "text/plain"}); saveAs(blob, filename); }; });
e0d0f575a2e83d88e9f2049de6ed39cc86fa2dc1
Pod/Classes/BetterSegmentedControl+PredefinedStyles.swift
Pod/Classes/BetterSegmentedControl+PredefinedStyles.swift
// // BetterSegmentedControl+PredefinedStyles.swift // BetterSegmentedControl // // Created by George Marmaridis on 18.10.20. // import UIKit public extension BetterSegmentedControl { class func appleStyled(frame: CGRect, titles: [String]) -> BetterSegmentedControl { let control = BetterSegmentedControl( frame: frame, segments: LabelSegment.segments(withTitles: titles, normalFont: .systemFont(ofSize: 13.0), normalTextColor: .appleSegmentedControlDefaultSegmentText, selectedFont: .systemFont(ofSize: 13.0, weight: .medium), selectedTextColor: .appleSegmentedControlDefaultSegmentText), index: 0, options: [.backgroundColor(.appleSegmentedControlDefaultControlBackground), .indicatorViewBackgroundColor(.appleSegmentedControlDefaultIndicatorBackground), .cornerRadius(8), .indicatorViewInset(2)]) control.indicatorView.layer.shadowColor = UIColor.black.cgColor control.indicatorView.layer.shadowOpacity = 0.1 control.indicatorView.layer.shadowOffset = CGSize(width: 1, height: 1) control.indicatorView.layer.shadowRadius = 2 return control } }
// // BetterSegmentedControl+PredefinedStyles.swift // BetterSegmentedControl // // Created by George Marmaridis on 18.10.20. // import UIKit public extension BetterSegmentedControl { class func appleStyled(frame: CGRect, titles: [String]) -> BetterSegmentedControl { let control = BetterSegmentedControl( frame: frame, segments: LabelSegment.segments(withTitles: titles), options: [.cornerRadius(8)]) control.indicatorView.layer.shadowColor = UIColor.black.cgColor control.indicatorView.layer.shadowOpacity = 0.1 control.indicatorView.layer.shadowOffset = CGSize(width: 1, height: 1) control.indicatorView.layer.shadowRadius = 2 return control } }
Simplify implementation of `appleStyled` extension
Simplify implementation of `appleStyled` extension
Swift
mit
gmarm/BetterSegmentedControl,gmarm/BetterSegmentedControl
swift
## Code Before: // // BetterSegmentedControl+PredefinedStyles.swift // BetterSegmentedControl // // Created by George Marmaridis on 18.10.20. // import UIKit public extension BetterSegmentedControl { class func appleStyled(frame: CGRect, titles: [String]) -> BetterSegmentedControl { let control = BetterSegmentedControl( frame: frame, segments: LabelSegment.segments(withTitles: titles, normalFont: .systemFont(ofSize: 13.0), normalTextColor: .appleSegmentedControlDefaultSegmentText, selectedFont: .systemFont(ofSize: 13.0, weight: .medium), selectedTextColor: .appleSegmentedControlDefaultSegmentText), index: 0, options: [.backgroundColor(.appleSegmentedControlDefaultControlBackground), .indicatorViewBackgroundColor(.appleSegmentedControlDefaultIndicatorBackground), .cornerRadius(8), .indicatorViewInset(2)]) control.indicatorView.layer.shadowColor = UIColor.black.cgColor control.indicatorView.layer.shadowOpacity = 0.1 control.indicatorView.layer.shadowOffset = CGSize(width: 1, height: 1) control.indicatorView.layer.shadowRadius = 2 return control } } ## Instruction: Simplify implementation of `appleStyled` extension ## Code After: // // BetterSegmentedControl+PredefinedStyles.swift // BetterSegmentedControl // // Created by George Marmaridis on 18.10.20. // import UIKit public extension BetterSegmentedControl { class func appleStyled(frame: CGRect, titles: [String]) -> BetterSegmentedControl { let control = BetterSegmentedControl( frame: frame, segments: LabelSegment.segments(withTitles: titles), options: [.cornerRadius(8)]) control.indicatorView.layer.shadowColor = UIColor.black.cgColor control.indicatorView.layer.shadowOpacity = 0.1 control.indicatorView.layer.shadowOffset = CGSize(width: 1, height: 1) control.indicatorView.layer.shadowRadius = 2 return control } }
e8a0b8351b2576d7bde4ec31e56773b99c844317
packages/ra/rank-product.yaml
packages/ra/rank-product.yaml
homepage: http://github.com/GregorySchwartz/rank-product#readme changelog-type: '' hash: baa32db0358a5c8cb9db1fdb0d60205f6bd8c2ed4a4605cc513c8879e994a4bd test-bench-deps: {} maintainer: gsch@mail.med.upenn.edu synopsis: Find the rank product of a data set. changelog: '' basic-deps: base: ! '>=4.7 && <5' random-fu: -any all-versions: - '0.1.0.0' - '0.1.0.1' - '0.1.0.2' author: Gregory W. Schwartz latest: '0.1.0.2' description-type: haddock description: Find the rank product of a data set and get the p-value from a permutation test. license-name: GPL-3
homepage: http://github.com/GregorySchwartz/rank-product#readme changelog-type: '' hash: b19c42db895c32198f1505fc881e7dbe4f3d12618cf40053583faf6d495a6514 test-bench-deps: {} maintainer: gsch@mail.med.upenn.edu synopsis: Find the rank product of a data set. changelog: '' basic-deps: base: ! '>=4.7 && <5' random-fu: -any all-versions: - '0.1.0.3' author: Gregory W. Schwartz latest: '0.1.0.3' description-type: haddock description: Find the rank product of a data set and get the p-value from a permutation test. license-name: GPL-3
Update from Hackage at 2017-04-05T19:48:48Z
Update from Hackage at 2017-04-05T19:48:48Z
YAML
mit
commercialhaskell/all-cabal-metadata
yaml
## Code Before: homepage: http://github.com/GregorySchwartz/rank-product#readme changelog-type: '' hash: baa32db0358a5c8cb9db1fdb0d60205f6bd8c2ed4a4605cc513c8879e994a4bd test-bench-deps: {} maintainer: gsch@mail.med.upenn.edu synopsis: Find the rank product of a data set. changelog: '' basic-deps: base: ! '>=4.7 && <5' random-fu: -any all-versions: - '0.1.0.0' - '0.1.0.1' - '0.1.0.2' author: Gregory W. Schwartz latest: '0.1.0.2' description-type: haddock description: Find the rank product of a data set and get the p-value from a permutation test. license-name: GPL-3 ## Instruction: Update from Hackage at 2017-04-05T19:48:48Z ## Code After: homepage: http://github.com/GregorySchwartz/rank-product#readme changelog-type: '' hash: b19c42db895c32198f1505fc881e7dbe4f3d12618cf40053583faf6d495a6514 test-bench-deps: {} maintainer: gsch@mail.med.upenn.edu synopsis: Find the rank product of a data set. changelog: '' basic-deps: base: ! '>=4.7 && <5' random-fu: -any all-versions: - '0.1.0.3' author: Gregory W. Schwartz latest: '0.1.0.3' description-type: haddock description: Find the rank product of a data set and get the p-value from a permutation test. license-name: GPL-3
fa8e69618a305104152a4eaa2c1dea96e979f293
index.js
index.js
'use strict' module.exports = tempest tempest.compile = compile tempest.render = render function tempest (s) { for (var i = 0, t = 0, p = [], e = [], f, l, o = '{{', c = '}}';;) { f = s.indexOf(o, t) if (f < 0) break p[i] = s.slice(t, f) l = s.indexOf(c, f) e[i++] = s.slice(f + 2, l) t = l + 2 } p[i] = s.slice(t) return [p, e] } function compile (t, d) { for (var i = 0, s = '', p = t[0], e = t[1], l = e.length; i < l; s += p[i] + (d[e[i++]] || '')); return s + p[i] } function render (s, d) { return compile(tempest(s), d) }
'use strict' module.exports = tempest tempest.compile = compile tempest.render = render function tempest (s) { for (var i = 0, p = [], e = [], f, l, o = '{{', c = '}}';;) { f = s.indexOf(o, l) if (f < 0) break p[i] = s.slice(l, f) l = s.indexOf(c, f) e[i++] = s.slice(f + 2, l) l = l + 2 } p[i] = s.slice(l) return [p, e] } function compile (t, d) { for (var i = 0, s = '', p = t[0], e = t[1], l = e.length; i < l; s += p[i] + (d[e[i++]] || '')); return s + p[i] } function render (s, d) { return compile(tempest(s), d) }
Use l instead of t
Use l instead of t
JavaScript
mit
baggo/tempest
javascript
## Code Before: 'use strict' module.exports = tempest tempest.compile = compile tempest.render = render function tempest (s) { for (var i = 0, t = 0, p = [], e = [], f, l, o = '{{', c = '}}';;) { f = s.indexOf(o, t) if (f < 0) break p[i] = s.slice(t, f) l = s.indexOf(c, f) e[i++] = s.slice(f + 2, l) t = l + 2 } p[i] = s.slice(t) return [p, e] } function compile (t, d) { for (var i = 0, s = '', p = t[0], e = t[1], l = e.length; i < l; s += p[i] + (d[e[i++]] || '')); return s + p[i] } function render (s, d) { return compile(tempest(s), d) } ## Instruction: Use l instead of t ## Code After: 'use strict' module.exports = tempest tempest.compile = compile tempest.render = render function tempest (s) { for (var i = 0, p = [], e = [], f, l, o = '{{', c = '}}';;) { f = s.indexOf(o, l) if (f < 0) break p[i] = s.slice(l, f) l = s.indexOf(c, f) e[i++] = s.slice(f + 2, l) l = l + 2 } p[i] = s.slice(l) return [p, e] } function compile (t, d) { for (var i = 0, s = '', p = t[0], e = t[1], l = e.length; i < l; s += p[i] + (d[e[i++]] || '')); return s + p[i] } function render (s, d) { return compile(tempest(s), d) }
a641880a13adc7f6e201ea25dbcf57e1997eb1bd
test/integration/LogSpec.groovy
test/integration/LogSpec.groovy
import grails.plugin.spock.* import collectiontypes.list.* import spock.lang.* class LogSpec extends IntegrationSpec { def fixtureLoader def "log available in the fixtureLoader"() { when: def f = fixtureLoader.load { log.info "Shouldn't fail" } then: true //a log should be available } def "log available in file"() { when: def f = fixtureLoader.load("log/**/*") then: true //a log should be available } }
import grails.plugin.spock.* import collectiontypes.list.* import spock.lang.* class LogSpec extends IntegrationSpec { def fixtureLoader def "log available in the fixtureLoader"() { when: def f = fixtureLoader.load { log.info "Shouldn't fail" } then: notThrown(Exception) } def "log available in file"() { when: def f = fixtureLoader.load("log/**/*") then: notThrown(Exception) } }
Use more idiomatic technique for asserting that an exception was not thrown in this test.
Use more idiomatic technique for asserting that an exception was not thrown in this test.
Groovy
apache-2.0
sbglasius/fixtures,gpc/fixtures
groovy
## Code Before: import grails.plugin.spock.* import collectiontypes.list.* import spock.lang.* class LogSpec extends IntegrationSpec { def fixtureLoader def "log available in the fixtureLoader"() { when: def f = fixtureLoader.load { log.info "Shouldn't fail" } then: true //a log should be available } def "log available in file"() { when: def f = fixtureLoader.load("log/**/*") then: true //a log should be available } } ## Instruction: Use more idiomatic technique for asserting that an exception was not thrown in this test. ## Code After: import grails.plugin.spock.* import collectiontypes.list.* import spock.lang.* class LogSpec extends IntegrationSpec { def fixtureLoader def "log available in the fixtureLoader"() { when: def f = fixtureLoader.load { log.info "Shouldn't fail" } then: notThrown(Exception) } def "log available in file"() { when: def f = fixtureLoader.load("log/**/*") then: notThrown(Exception) } }
18099d1ad54e2f3dc43c1326e67c3f7e8ca2e4c4
pkgs/development/libraries/libcaca/default.nix
pkgs/development/libraries/libcaca/default.nix
{stdenv, fetchurl, ncurses}: stdenv.mkDerivation { name = "libcaca-0.99-beta13b"; src = fetchurl { url = http://libcaca.zoy.org/files/libcaca-0.99.beta13b.tar.gz; sha256 = "0xy8pcnljnj5la97bzbwwyzyqa7dr3v9cyw8gdjzdfgqywvac1vg"; }; configureFlags = "--disable-x11 --disable-imlib2 --disable-doc"; propagatedBuildInputs = [ncurses]; }
{stdenv, fetchurl, ncurses}: stdenv.mkDerivation rec { name = "libcaca-0.99-beta13b"; src = fetchurl { name = "${name}.tar.gz"; url = http://libcaca.zoy.org/attachment/wiki/libcaca/libcaca-0.99.beta13b.tar.gz?format=raw; sha256 = "0xy8pcnljnj5la97bzbwwyzyqa7dr3v9cyw8gdjzdfgqywvac1vg"; }; configureFlags = "--disable-x11 --disable-imlib2 --disable-doc"; propagatedBuildInputs = [ncurses]; meta = { homepage = http://libcaca.zoy.org/; description = "A graphics library that outputs text instead of pixels."; license = "WTFPL"; # http://sam.zoy.org/wtfpl/ }; }
Fix the url of libcaca.
Fix the url of libcaca. svn path=/nixpkgs/trunk/; revision=12033
Nix
mit
triton/triton,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,triton/triton,triton/triton,SymbiFlow/nixpkgs,triton/triton,SymbiFlow/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,triton/triton,triton/triton,triton/triton,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,triton/triton,NixOS/nixpkgs
nix
## Code Before: {stdenv, fetchurl, ncurses}: stdenv.mkDerivation { name = "libcaca-0.99-beta13b"; src = fetchurl { url = http://libcaca.zoy.org/files/libcaca-0.99.beta13b.tar.gz; sha256 = "0xy8pcnljnj5la97bzbwwyzyqa7dr3v9cyw8gdjzdfgqywvac1vg"; }; configureFlags = "--disable-x11 --disable-imlib2 --disable-doc"; propagatedBuildInputs = [ncurses]; } ## Instruction: Fix the url of libcaca. svn path=/nixpkgs/trunk/; revision=12033 ## Code After: {stdenv, fetchurl, ncurses}: stdenv.mkDerivation rec { name = "libcaca-0.99-beta13b"; src = fetchurl { name = "${name}.tar.gz"; url = http://libcaca.zoy.org/attachment/wiki/libcaca/libcaca-0.99.beta13b.tar.gz?format=raw; sha256 = "0xy8pcnljnj5la97bzbwwyzyqa7dr3v9cyw8gdjzdfgqywvac1vg"; }; configureFlags = "--disable-x11 --disable-imlib2 --disable-doc"; propagatedBuildInputs = [ncurses]; meta = { homepage = http://libcaca.zoy.org/; description = "A graphics library that outputs text instead of pixels."; license = "WTFPL"; # http://sam.zoy.org/wtfpl/ }; }
625989822ed8cfab8df1cdb15a4f387899420890
src/Event/LoggedInUserListener.php
src/Event/LoggedInUserListener.php
<?php namespace Ceeram\Blame\Event; use ArrayObject; use Cake\Controller\Component\AuthComponent; use Cake\Event\Event; use Cake\Event\EventListenerInterface; use Cake\ORM\Entity; /** * Class LoggedInUserListener * * Licensed under The MIT License * For full copyright and license information, please see the LICENSE.txt */ class LoggedInUserListener implements EventListenerInterface { /** * @var AuthComponent */ protected $_Auth; /** * Constructor * * @param \Cake\Controller\Component\AuthComponent $Auth Authcomponent */ public function __construct(AuthComponent $Auth) { $this->_Auth = $Auth; } /** * {@inheritDoc} */ public function implementedEvents() { return [ 'Model.beforeSave' => [ 'callable' => 'beforeSave', 'priority' => -100 ] ]; } /** * Before save listener. * * @param \Cake\Event\Event $event The beforeSave event that was fired * @param \Cake\ORM\Entity $entity The entity that is going to be saved * @param \ArrayObject $options the options passed to the save method * @return void */ public function beforeSave(Event $event, Entity $entity, ArrayObject $options) { $options['loggedInUser'] = $this->_Auth->user('id'); } }
<?php namespace Ceeram\Blame\Event; use ArrayObject; use Cake\Controller\Component\AuthComponent; use Cake\Event\Event; use Cake\Event\EventListenerInterface; use Cake\ORM\Entity; /** * Class LoggedInUserListener * * Licensed under The MIT License * For full copyright and license information, please see the LICENSE.txt */ class LoggedInUserListener implements EventListenerInterface { /** * @var AuthComponent */ protected $_Auth; /** * Constructor * * @param \Cake\Controller\Component\AuthComponent $Auth Authcomponent */ public function __construct(AuthComponent $Auth) { $this->_Auth = $Auth; } /** * {@inheritDoc} */ public function implementedEvents() { return [ 'Model.beforeSave' => [ 'callable' => 'beforeSave', 'priority' => -100 ] ]; } /** * Before save listener. * * @param \Cake\Event\Event $event The beforeSave event that was fired * @param \Cake\ORM\Entity $entity The entity that is going to be saved * @param \ArrayObject $options the options passed to the save method * @return void */ public function beforeSave(Event $event, Entity $entity, ArrayObject $options) { if (empty($options['loggedInUser'])) { $options['loggedInUser'] = $this->_Auth->user('id'); } } }
Allow manually setting of loggedInUser
Allow manually setting of loggedInUser Sometimes it is necessary to manually override the `loggedInUser`.
PHP
mit
ceeram/blame
php
## Code Before: <?php namespace Ceeram\Blame\Event; use ArrayObject; use Cake\Controller\Component\AuthComponent; use Cake\Event\Event; use Cake\Event\EventListenerInterface; use Cake\ORM\Entity; /** * Class LoggedInUserListener * * Licensed under The MIT License * For full copyright and license information, please see the LICENSE.txt */ class LoggedInUserListener implements EventListenerInterface { /** * @var AuthComponent */ protected $_Auth; /** * Constructor * * @param \Cake\Controller\Component\AuthComponent $Auth Authcomponent */ public function __construct(AuthComponent $Auth) { $this->_Auth = $Auth; } /** * {@inheritDoc} */ public function implementedEvents() { return [ 'Model.beforeSave' => [ 'callable' => 'beforeSave', 'priority' => -100 ] ]; } /** * Before save listener. * * @param \Cake\Event\Event $event The beforeSave event that was fired * @param \Cake\ORM\Entity $entity The entity that is going to be saved * @param \ArrayObject $options the options passed to the save method * @return void */ public function beforeSave(Event $event, Entity $entity, ArrayObject $options) { $options['loggedInUser'] = $this->_Auth->user('id'); } } ## Instruction: Allow manually setting of loggedInUser Sometimes it is necessary to manually override the `loggedInUser`. 
## Code After: <?php namespace Ceeram\Blame\Event; use ArrayObject; use Cake\Controller\Component\AuthComponent; use Cake\Event\Event; use Cake\Event\EventListenerInterface; use Cake\ORM\Entity; /** * Class LoggedInUserListener * * Licensed under The MIT License * For full copyright and license information, please see the LICENSE.txt */ class LoggedInUserListener implements EventListenerInterface { /** * @var AuthComponent */ protected $_Auth; /** * Constructor * * @param \Cake\Controller\Component\AuthComponent $Auth Authcomponent */ public function __construct(AuthComponent $Auth) { $this->_Auth = $Auth; } /** * {@inheritDoc} */ public function implementedEvents() { return [ 'Model.beforeSave' => [ 'callable' => 'beforeSave', 'priority' => -100 ] ]; } /** * Before save listener. * * @param \Cake\Event\Event $event The beforeSave event that was fired * @param \Cake\ORM\Entity $entity The entity that is going to be saved * @param \ArrayObject $options the options passed to the save method * @return void */ public function beforeSave(Event $event, Entity $entity, ArrayObject $options) { if (empty($options['loggedInUser'])) { $options['loggedInUser'] = $this->_Auth->user('id'); } } }
344f0f4535cbdc5ba43a3159534d084ae47643f6
lib/poper/cli.rb
lib/poper/cli.rb
require 'thor' module Poper class CLI < Thor require 'poper' require 'poper/version' class << self def is_thor_reserved_word?(word, type) return false if word == 'run' super end end desc 'run COMMIT', 'Run Poper, checking commits from HEAD to it' def run(commit) Runner.new(commit).run.each do |message| puts "#{message.commit[0..6]}: #{message.message}" end end desc 'version', 'Show the Poper version' map %w(-v --version) => :version def version puts "Poper version #{::Poper::VERSION}" end end end
require 'thor' module Poper class CLI < Thor require 'poper' require 'poper/version' class << self def is_thor_reserved_word?(word, type) return false if word == 'run' super end end desc 'run COMMIT', 'Run Poper, checking commits from HEAD to it' def run(commit) Runner.new(commit).run.each do |message| # message.commit and message.message are Strings # prints first 7 characteres of the commit sha1 hash # followed by the associated message puts "#{message.commit[0..6]}: #{message.message}" end end desc 'version', 'Show the Poper version' map %w(-v --version) => :version def version puts "Poper version #{::Poper::VERSION}" end end end
Add a comment to clarify a piece of code
Add a comment to clarify a piece of code
Ruby
mit
mmozuras/poper
ruby
## Code Before: require 'thor' module Poper class CLI < Thor require 'poper' require 'poper/version' class << self def is_thor_reserved_word?(word, type) return false if word == 'run' super end end desc 'run COMMIT', 'Run Poper, checking commits from HEAD to it' def run(commit) Runner.new(commit).run.each do |message| puts "#{message.commit[0..6]}: #{message.message}" end end desc 'version', 'Show the Poper version' map %w(-v --version) => :version def version puts "Poper version #{::Poper::VERSION}" end end end ## Instruction: Add a comment to clarify a piece of code ## Code After: require 'thor' module Poper class CLI < Thor require 'poper' require 'poper/version' class << self def is_thor_reserved_word?(word, type) return false if word == 'run' super end end desc 'run COMMIT', 'Run Poper, checking commits from HEAD to it' def run(commit) Runner.new(commit).run.each do |message| # message.commit and message.message are Strings # prints first 7 characteres of the commit sha1 hash # followed by the associated message puts "#{message.commit[0..6]}: #{message.message}" end end desc 'version', 'Show the Poper version' map %w(-v --version) => :version def version puts "Poper version #{::Poper::VERSION}" end end end
2dd4661bce61cadbfc0f7d6b792ed2c6455bd62d
src/lib/utils/utils.coffee
src/lib/utils/utils.coffee
goog.provide 'spark.utils' counter = Math.floor Math.random() * 2147483648 ###* Returns a unique id. Ported from TartJS. https://github.com/tart/tartJS/blob/master/tart/tart.js#L26 -- amcalar <3 @export @return {string} Unique id. ### spark.utils.getUid = -> return (counter++).toString 36 ###* Concats strings together with a space. @export @param {...*} var_args A list of strings to concatenate. @return {string} The concatenation of {@code var_args}. ### spark.utils.concatString = (var_args) -> strings = [] for arg in arguments when typeof arg is 'string' if arg.indexOf(' ') > -1 for item in arg.split ' ' strings.push item if strings.indexOf(item) is -1 else strings.push arg return Array.prototype.join.call strings, ' '
goog.provide 'spark.utils' counter = Math.floor Math.random() * 2147483648 ###* Returns a unique id. Ported from TartJS. https://github.com/tart/tartJS/blob/master/tart/tart.js#L26 -- amcalar <3 @export @return {string} Unique id. ### spark.utils.getUid = -> return (counter++).toString 36 ###* Concats strings together with a space. @export @param {...*} var_args A list of strings to concatenate. @return {string} The concatenation of {@code var_args}. ### spark.utils.concatString = (var_args) -> strings = [] for arg in arguments when typeof arg is 'string' if arg.indexOf(' ') > -1 for item in arg.split ' ' strings.push item if strings.indexOf(item) is -1 else strings.push arg return strings.join ' '
Use native array.join to concat strings.
Utils: Use native array.join to concat strings. cihangir <3
CoffeeScript
mit
fatihacet/spark,fatihacet/spark
coffeescript
## Code Before: goog.provide 'spark.utils' counter = Math.floor Math.random() * 2147483648 ###* Returns a unique id. Ported from TartJS. https://github.com/tart/tartJS/blob/master/tart/tart.js#L26 -- amcalar <3 @export @return {string} Unique id. ### spark.utils.getUid = -> return (counter++).toString 36 ###* Concats strings together with a space. @export @param {...*} var_args A list of strings to concatenate. @return {string} The concatenation of {@code var_args}. ### spark.utils.concatString = (var_args) -> strings = [] for arg in arguments when typeof arg is 'string' if arg.indexOf(' ') > -1 for item in arg.split ' ' strings.push item if strings.indexOf(item) is -1 else strings.push arg return Array.prototype.join.call strings, ' ' ## Instruction: Utils: Use native array.join to concat strings. cihangir <3 ## Code After: goog.provide 'spark.utils' counter = Math.floor Math.random() * 2147483648 ###* Returns a unique id. Ported from TartJS. https://github.com/tart/tartJS/blob/master/tart/tart.js#L26 -- amcalar <3 @export @return {string} Unique id. ### spark.utils.getUid = -> return (counter++).toString 36 ###* Concats strings together with a space. @export @param {...*} var_args A list of strings to concatenate. @return {string} The concatenation of {@code var_args}. ### spark.utils.concatString = (var_args) -> strings = [] for arg in arguments when typeof arg is 'string' if arg.indexOf(' ') > -1 for item in arg.split ' ' strings.push item if strings.indexOf(item) is -1 else strings.push arg return strings.join ' '
d0b40c90bd5af1ba9ef0d617c10700566d4e3775
tests/unit/directory/test_directory.py
tests/unit/directory/test_directory.py
"""Contains the unit tests for the inner directory package""" import unittest class TestDirectory(unittest.TestCase): pass if __name__ == "__main__": unittest.main()
"""Contains the unit tests for the inner directory package""" import unittest class TestDirectory(unittest.TestCase): def setUp(self): self.fake_path = os.path.abspath("hello-world-dir") return if __name__ == "__main__": unittest.main()
Add TestDirectory.setUp() to the directory package's test file
Add TestDirectory.setUp() to the directory package's test file
Python
mit
SizzlingVortex/classyfd
python
## Code Before: """Contains the unit tests for the inner directory package""" import unittest class TestDirectory(unittest.TestCase): pass if __name__ == "__main__": unittest.main() ## Instruction: Add TestDirectory.setUp() to the directory package's test file ## Code After: """Contains the unit tests for the inner directory package""" import unittest class TestDirectory(unittest.TestCase): def setUp(self): self.fake_path = os.path.abspath("hello-world-dir") return if __name__ == "__main__": unittest.main()
f83c3b1a959026992302bf34031cef86518a1796
README.rst
README.rst
Jubatus Java Client =================== Template of Jubatus client in Java. See `the maven repository <http://download.jubat.us/maven/>`_ for the latest release. All codes are generated by `MessagePack IDL <https://github.com/msgpack/msgpack-haskell/tree/master/msgpack-idl>`_ . Install ======= If your project uses Maven, please add these lines to your pom.xml. .. code-block:: xml <repositories> <repository> <id>jubat.us</id> <name>Jubatus Repository for Maven</name> <url>http://download.jubat.us/maven</url> </repository> </repositories> <dependencies> <dependency> <groupId>us.jubat</groupId> <artifactId>jubatus</artifactId> <version>x.x.x</version> </dependency> </dependencies> License ======= MIT License
Jubatus Java Client =================== Template of Jubatus client in Java. See `the maven repository <http://download.jubat.us/maven/>`_ for the latest release. All codes are generated by ``jenerator`` . Install ======= If your project uses Maven, please add these lines to your pom.xml. .. code-block:: xml <repositories> <repository> <id>jubat.us</id> <name>Jubatus Repository for Maven</name> <url>http://download.jubat.us/maven</url> </repository> </repositories> <dependencies> <dependency> <groupId>us.jubat</groupId> <artifactId>jubatus</artifactId> <version>x.x.x</version> </dependency> </dependencies> License ======= MIT License
Update description of code generator
Update description of code generator
reStructuredText
mit
jubatus/jubatus-java-client,jubatus/jubatus-java-client
restructuredtext
## Code Before: Jubatus Java Client =================== Template of Jubatus client in Java. See `the maven repository <http://download.jubat.us/maven/>`_ for the latest release. All codes are generated by `MessagePack IDL <https://github.com/msgpack/msgpack-haskell/tree/master/msgpack-idl>`_ . Install ======= If your project uses Maven, please add these lines to your pom.xml. .. code-block:: xml <repositories> <repository> <id>jubat.us</id> <name>Jubatus Repository for Maven</name> <url>http://download.jubat.us/maven</url> </repository> </repositories> <dependencies> <dependency> <groupId>us.jubat</groupId> <artifactId>jubatus</artifactId> <version>x.x.x</version> </dependency> </dependencies> License ======= MIT License ## Instruction: Update description of code generator ## Code After: Jubatus Java Client =================== Template of Jubatus client in Java. See `the maven repository <http://download.jubat.us/maven/>`_ for the latest release. All codes are generated by ``jenerator`` . Install ======= If your project uses Maven, please add these lines to your pom.xml. .. code-block:: xml <repositories> <repository> <id>jubat.us</id> <name>Jubatus Repository for Maven</name> <url>http://download.jubat.us/maven</url> </repository> </repositories> <dependencies> <dependency> <groupId>us.jubat</groupId> <artifactId>jubatus</artifactId> <version>x.x.x</version> </dependency> </dependencies> License ======= MIT License
ada5bae2f932bb8d4ed8b4e1ee9d8292bbbe2fe4
jsrepl_dev.js
jsrepl_dev.js
// Set default paths to the plugin dirs Importer.defaultPaths = ['js/src', 'build/default/plugins/sqlite3', 'build/default/plugins/curl', 'build/default/plugins/environment', 'build/default/src']; Importer.preload['xml'] = function() { return this.$importer.load('flusspferd-xml'); } Importer.preload['io'] = function() { return this.$importer.load('flusspferd-io'); } prelude = 'prelude.js';
// Set default paths to the plugin dirs Importer.defaultPaths = ['js/src', 'plugins/sqlite3', 'build/default/plugins/sqlite3', 'plugins/curl', 'build/default/plugins/curl', 'build/default/plugins/environment', 'build/default/src']; Importer.preload['xml'] = function() { return this.$importer.load('flusspferd-xml'); } Importer.preload['io'] = function() { return this.$importer.load('flusspferd-io'); } prelude = 'prelude.js';
Add plugin src dirs into search path to pick up .js files in dev
Shell: Add plugin src dirs into search path to pick up .js files in dev
JavaScript
mit
Flusspferd/flusspferd,Flusspferd/flusspferd,Flusspferd/flusspferd,Flusspferd/flusspferd,Flusspferd/flusspferd
javascript
## Code Before: // Set default paths to the plugin dirs Importer.defaultPaths = ['js/src', 'build/default/plugins/sqlite3', 'build/default/plugins/curl', 'build/default/plugins/environment', 'build/default/src']; Importer.preload['xml'] = function() { return this.$importer.load('flusspferd-xml'); } Importer.preload['io'] = function() { return this.$importer.load('flusspferd-io'); } prelude = 'prelude.js'; ## Instruction: Shell: Add plugin src dirs into search path to pick up .js files in dev ## Code After: // Set default paths to the plugin dirs Importer.defaultPaths = ['js/src', 'plugins/sqlite3', 'build/default/plugins/sqlite3', 'plugins/curl', 'build/default/plugins/curl', 'build/default/plugins/environment', 'build/default/src']; Importer.preload['xml'] = function() { return this.$importer.load('flusspferd-xml'); } Importer.preload['io'] = function() { return this.$importer.load('flusspferd-io'); } prelude = 'prelude.js';
eea647cf05d7143d800f834dd77aeafc32522100
groundstation/settings.py
groundstation/settings.py
PORT=1248 BEACON_TIMEOUT=5 DEFAULT_BUFSIZE=8192
PORT=1248 BEACON_TIMEOUT=5 DEFAULT_BUFSIZE=8192 DEFAULT_CACHE_LIFETIME=900
Add config key for default cache lifetime
Add config key for default cache lifetime
Python
mit
richo/groundstation,richo/groundstation,richo/groundstation,richo/groundstation,richo/groundstation
python
## Code Before: PORT=1248 BEACON_TIMEOUT=5 DEFAULT_BUFSIZE=8192 ## Instruction: Add config key for default cache lifetime ## Code After: PORT=1248 BEACON_TIMEOUT=5 DEFAULT_BUFSIZE=8192 DEFAULT_CACHE_LIFETIME=900
0126da430bbbe0b6c21e351cc808ab8e29a50765
index.js
index.js
var xml = require('libxmljs'), request = require('request') var Client = exports.Client = function Client(appKey) { this.appKey = appKey } Client.prototype.query = function(input, cb) { if(!this.appKey) { return cb("Application key not set", null) } var uri = 'http://api.wolframalpha.com/v2/query?input=' + encodeURIComponent(input) + '&primary=true&appid=' + this.appKey request(uri, function(error, response, body) { if(!error && response.statusCode == 200) { var doc = xml.parseXml(body), root = doc.root() if(root.attr('error').value() != 'false') { var message = root.get('//error/msg').text() return cb(message, null) } else { var pods = root.find('pod').map(function(pod) { var subpods = pod.find('subpod').map(function(node) { return { title: node.attr('title').value(), value: node.get('plaintext').text(), image: node.get('img').attr('src').value() } }) var primary = (pod.attr('primary') && pod.attr('primary').value()) == 'true' return { subpods: subpods, primary: primary } }) return cb(null, pods) } } }) } exports.createClient = function(appKey) { return new Client(appKey) }
var xml = require('libxmljs'), request = require('request') var Client = exports.Client = function Client(appKey) { this.appKey = appKey } Client.prototype.query = function(input, cb) { if(!this.appKey) { return cb("Application key not set", null) } var uri = 'http://api.wolframalpha.com/v2/query?input=' + encodeURIComponent(input) + '&primary=true&appid=' + this.appKey request(uri, function(error, response, body) { if(!error && response.statusCode == 200) { var doc = xml.parseXml(body), root = doc.root() if(root.attr('error').value() != 'false') { var message = root.get('//error/msg').text() return cb(message, null) } else { var pods = root.find('pod').map(function(pod) { var subpods = pod.find('subpod').map(function(node) { return { title: node.attr('title').value(), value: node.get('plaintext').text(), image: node.get('img').attr('src').value() } }) var primary = (pod.attr('primary') && pod.attr('primary').value()) == 'true' return { subpods: subpods, primary: primary } }) return cb(null, pods) } } else { return cb(error, null) } }) } exports.createClient = function(appKey) { return new Client(appKey) }
Handle network errors instead of failing silently
Handle network errors instead of failing silently
JavaScript
mit
cycomachead/node-wolfram,strax/node-wolfram
javascript
## Code Before: var xml = require('libxmljs'), request = require('request') var Client = exports.Client = function Client(appKey) { this.appKey = appKey } Client.prototype.query = function(input, cb) { if(!this.appKey) { return cb("Application key not set", null) } var uri = 'http://api.wolframalpha.com/v2/query?input=' + encodeURIComponent(input) + '&primary=true&appid=' + this.appKey request(uri, function(error, response, body) { if(!error && response.statusCode == 200) { var doc = xml.parseXml(body), root = doc.root() if(root.attr('error').value() != 'false') { var message = root.get('//error/msg').text() return cb(message, null) } else { var pods = root.find('pod').map(function(pod) { var subpods = pod.find('subpod').map(function(node) { return { title: node.attr('title').value(), value: node.get('plaintext').text(), image: node.get('img').attr('src').value() } }) var primary = (pod.attr('primary') && pod.attr('primary').value()) == 'true' return { subpods: subpods, primary: primary } }) return cb(null, pods) } } }) } exports.createClient = function(appKey) { return new Client(appKey) } ## Instruction: Handle network errors instead of failing silently ## Code After: var xml = require('libxmljs'), request = require('request') var Client = exports.Client = function Client(appKey) { this.appKey = appKey } Client.prototype.query = function(input, cb) { if(!this.appKey) { return cb("Application key not set", null) } var uri = 'http://api.wolframalpha.com/v2/query?input=' + encodeURIComponent(input) + '&primary=true&appid=' + this.appKey request(uri, function(error, response, body) { if(!error && response.statusCode == 200) { var doc = xml.parseXml(body), root = doc.root() if(root.attr('error').value() != 'false') { var message = root.get('//error/msg').text() return cb(message, null) } else { var pods = root.find('pod').map(function(pod) { var subpods = pod.find('subpod').map(function(node) { return { title: node.attr('title').value(), value: node.get('plaintext').text(), image: node.get('img').attr('src').value() } }) var primary = (pod.attr('primary') && pod.attr('primary').value()) == 'true' return { subpods: subpods, primary: primary } }) return cb(null, pods) } } else { return cb(error, null) } }) } exports.createClient = function(appKey) { return new Client(appKey) }
bcc7bbde415527bceda9b7362ec2424250c7ca5a
bower.json
bower.json
{ "name": "cherrytree", "version": "0.1.0", "main": "router.js", "ignore": [ "**/.*", "node_modules", "components", "bower_components", "test", "tests" ], "dependencies": { "router.js": "tildeio/router.js#6254eae384ae378373e3e1a86f7c9abd0c0455af", "route-recognizer": "tildeio/route-recognizer#7d78b62279f0ceff22ec7783c43e36efe4a8e336", "rsvp.js": "tildeio/rsvp.js#78e4ef8613e64bc9a14c3375c58560138f37e000", "underscore": ">=1.4.4" } }
{ "name": "cherrytree", "version": "0.1.0", "main": "router.js", "ignore": [ "**/.*", "node_modules", "components", "bower_components", "test", "tests" ], "dependencies": { "router.js": "KidkArolis/router.js#1e31941370d6152b181cddece6b5be48223ef04a", "route-recognizer": "tildeio/route-recognizer#7d78b62279f0ceff22ec7783c43e36efe4a8e336", "rsvp.js": "tildeio/rsvp.js#78e4ef8613e64bc9a14c3375c58560138f37e000", "underscore": ">=1.4.4" } }
Use the forked router.js until the issue with noop transitions is fixed
Use the forked router.js until the issue with noop transitions is fixed
JSON
mit
QubitProducts/cherrytree,nathanboktae/cherrytree,QubitProducts/cherrytree,nathanboktae/cherrytree
json
## Code Before: { "name": "cherrytree", "version": "0.1.0", "main": "router.js", "ignore": [ "**/.*", "node_modules", "components", "bower_components", "test", "tests" ], "dependencies": { "router.js": "tildeio/router.js#6254eae384ae378373e3e1a86f7c9abd0c0455af", "route-recognizer": "tildeio/route-recognizer#7d78b62279f0ceff22ec7783c43e36efe4a8e336", "rsvp.js": "tildeio/rsvp.js#78e4ef8613e64bc9a14c3375c58560138f37e000", "underscore": ">=1.4.4" } } ## Instruction: Use the forked router.js until the issue with noop transitions is fixed ## Code After: { "name": "cherrytree", "version": "0.1.0", "main": "router.js", "ignore": [ "**/.*", "node_modules", "components", "bower_components", "test", "tests" ], "dependencies": { "router.js": "KidkArolis/router.js#1e31941370d6152b181cddece6b5be48223ef04a", "route-recognizer": "tildeio/route-recognizer#7d78b62279f0ceff22ec7783c43e36efe4a8e336", "rsvp.js": "tildeio/rsvp.js#78e4ef8613e64bc9a14c3375c58560138f37e000", "underscore": ">=1.4.4" } }
29bc4322531210e398b565f04b4b18cb4b8b1e2d
content/_recipes/tooltips.md
content/_recipes/tooltips.md
--- title: Tooltips --- Tooltips provide a quick definition for an item. There are two ways of creating tooltip: automatically from the glossary, and via a link override. # Automatic Tooltips Automatic tooltips reference glossary entries. If a glossary article by the name of "Tooltips" exists, then a tooltip will be available for the following item: [Tooltips](# 'presidium-tooltip') ```md [Tooltips](# 'presidium-tooltip') ``` # Link Override You may also supply an internal article as a source for a tooltip. Presidium will use the article's first paragraph to construct the tooltip. You are required to ensure, however, that the first paragraph is semantically sufficient to be used as a tooltip. Note that the text used for the demarcation of a tooltip need not match the article title like [the following text on article templates]({{site.baseurl}}/best-practices/#use-article-templates 'presidium-tooltip'). ```md [the following text on article templates]({{site.baseurl}}/best-practices/#use-article-templates 'presidium-tooltip') ```
--- title: Tooltips --- Tooltips provide a quick definition for an item. There are two ways of creating tooltip: automatically from the glossary, and via a link override. # Automatic Tooltips Automatic tooltips reference glossary entries. If a glossary article by the name of "Tooltips" exists, then a tooltip will be available for the following item: [Tooltips](# 'presidium-tooltip') ```md [Tooltips](# 'presidium-tooltip') ``` # Link Override You may also supply an internal article as a source for a tooltip. Presidium will use the article's first paragraph to construct the tooltip. You are required to ensure, however, that the first paragraph is semantically sufficient to be used as a tooltip. Note that the text used for the demarcation of a tooltip need not match the article title like [the following text on article templates]({{site.baseurl}}/best-practices/#use-article-templates 'presidium-tooltip'). {% assign tooltips_example = '{{ site.baseurl }}' %} ```md [the following text on article templates]({{ tooltips_example }}/best-practices/#use-article-templates 'presidium-tooltip') ```
Add hidden liquid from example.
Add hidden liquid from example.
Markdown
apache-2.0
SPANDigital/presidium
markdown
## Code Before: --- title: Tooltips --- Tooltips provide a quick definition for an item. There are two ways of creating tooltip: automatically from the glossary, and via a link override. # Automatic Tooltips Automatic tooltips reference glossary entries. If a glossary article by the name of "Tooltips" exists, then a tooltip will be available for the following item: [Tooltips](# 'presidium-tooltip') ```md [Tooltips](# 'presidium-tooltip') ``` # Link Override You may also supply an internal article as a source for a tooltip. Presidium will use the article's first paragraph to construct the tooltip. You are required to ensure, however, that the first paragraph is semantically sufficient to be used as a tooltip. Note that the text used for the demarcation of a tooltip need not match the article title like [the following text on article templates]({{site.baseurl}}/best-practices/#use-article-templates 'presidium-tooltip'). ```md [the following text on article templates]({{site.baseurl}}/best-practices/#use-article-templates 'presidium-tooltip') ``` ## Instruction: Add hidden liquid from example. ## Code After: --- title: Tooltips --- Tooltips provide a quick definition for an item. There are two ways of creating tooltip: automatically from the glossary, and via a link override. # Automatic Tooltips Automatic tooltips reference glossary entries. If a glossary article by the name of "Tooltips" exists, then a tooltip will be available for the following item: [Tooltips](# 'presidium-tooltip') ```md [Tooltips](# 'presidium-tooltip') ``` # Link Override You may also supply an internal article as a source for a tooltip. Presidium will use the article's first paragraph to construct the tooltip. You are required to ensure, however, that the first paragraph is semantically sufficient to be used as a tooltip. Note that the text used for the demarcation of a tooltip need not match the article title like [the following text on article templates]({{site.baseurl}}/best-practices/#use-article-templates 'presidium-tooltip'). {% assign tooltips_example = '{{ site.baseurl }}' %} ```md [the following text on article templates]({{ tooltips_example }}/best-practices/#use-article-templates 'presidium-tooltip') ```
3083cd22b8f023bd76c7f018f33c16509eadb471
.travis.yml
.travis.yml
env: global: - ENCRYPTION_LABEL: "8c7035aad7a2" install: true sudo: false language: java jdk: - oraclejdk8 branches: only: - master addons: apt: packages: - gnome-themes-standard - metacity - libwebkit-dev before_install: - openssl aes-256-cbc -K $encrypted_8c7035aad7a2_key -iv $encrypted_8c7035aad7a2_iv -in deploy-key.enc -out deploy-key -d - chmod 600 deploy-key - cp deploy-key ~/.ssh/id_rsa script: - mvn clean verify -q -e -Dtarget-platform-classifier=eclipse-4.5 && mvn clean verify -q -e -Dtarget-platform-classifier=eclipse-4.6 && mvn clean verify -q -e -Dtarget-platform-classifier=eclipse-4.7 before_script: - export DISPLAY=:99.0 - sh -e /etc/init.d/xvfb start - sleep 5 - metacity --sm-disable --replace & - sleep 5 after_success: - chmod a+x .travis-deploy.sh - "./.travis-deploy.sh" after_failure: - find . -path "*/surefire-reports/*.txt" | xargs cat
env: global: - ENCRYPTION_LABEL: "8c7035aad7a2" install: true sudo: false language: java jdk: - oraclejdk8 branches: only: - master addons: apt: packages: - gnome-themes-standard - metacity - libwebkit-dev before_install: - openssl aes-256-cbc -K $encrypted_8c7035aad7a2_key -iv $encrypted_8c7035aad7a2_iv -in deploy-key.enc -out deploy-key -d - chmod 600 deploy-key - cp deploy-key ~/.ssh/id_rsa script: - mvn clean verify -e -Dtarget-platform-classifier=eclipse-4.7 before_script: - export DISPLAY=:99.0 - sh -e /etc/init.d/xvfb start - sleep 5 - metacity --sm-disable --replace & - sleep 5 after_success: - chmod a+x .travis-deploy.sh - "./.travis-deploy.sh" after_failure: - find . -path "*/surefire-reports/*.txt" | xargs cat
Reduce number of maven builds for testing
Reduce number of maven builds for testing
YAML
epl-1.0
rherrmann/eclipse-extras,rherrmann/eclipse-extras
yaml
## Code Before: env: global: - ENCRYPTION_LABEL: "8c7035aad7a2" install: true sudo: false language: java jdk: - oraclejdk8 branches: only: - master addons: apt: packages: - gnome-themes-standard - metacity - libwebkit-dev before_install: - openssl aes-256-cbc -K $encrypted_8c7035aad7a2_key -iv $encrypted_8c7035aad7a2_iv -in deploy-key.enc -out deploy-key -d - chmod 600 deploy-key - cp deploy-key ~/.ssh/id_rsa script: - mvn clean verify -q -e -Dtarget-platform-classifier=eclipse-4.5 && mvn clean verify -q -e -Dtarget-platform-classifier=eclipse-4.6 && mvn clean verify -q -e -Dtarget-platform-classifier=eclipse-4.7 before_script: - export DISPLAY=:99.0 - sh -e /etc/init.d/xvfb start - sleep 5 - metacity --sm-disable --replace & - sleep 5 after_success: - chmod a+x .travis-deploy.sh - "./.travis-deploy.sh" after_failure: - find . -path "*/surefire-reports/*.txt" | xargs cat ## Instruction: Reduce number of maven builds for testing ## Code After: env: global: - ENCRYPTION_LABEL: "8c7035aad7a2" install: true sudo: false language: java jdk: - oraclejdk8 branches: only: - master addons: apt: packages: - gnome-themes-standard - metacity - libwebkit-dev before_install: - openssl aes-256-cbc -K $encrypted_8c7035aad7a2_key -iv $encrypted_8c7035aad7a2_iv -in deploy-key.enc -out deploy-key -d - chmod 600 deploy-key - cp deploy-key ~/.ssh/id_rsa script: - mvn clean verify -e -Dtarget-platform-classifier=eclipse-4.7 before_script: - export DISPLAY=:99.0 - sh -e /etc/init.d/xvfb start - sleep 5 - metacity --sm-disable --replace & - sleep 5 after_success: - chmod a+x .travis-deploy.sh - "./.travis-deploy.sh" after_failure: - find . -path "*/surefire-reports/*.txt" | xargs cat
d5d5cb8e4aa3b98eb32cdfe5c536e1e20835b8ae
.travis.yml
.travis.yml
language: python services: redis-server install: - pip install coveralls tox script: - tox - coveralls
language: python services: redis-server install: - pip install coveralls tox script: - tox - coveralls deploy: provider: pypi user: playpauseandstop password: secure: adlnY9sKpfJPdcpnh0SuhQ4iIbZozQ4uNMJ/UgAra4ft4FfiDhYKqqbz58mX0PSKhLcaA0RAeLeesMqqiKrEBRmmQeu7k/pM5kw3Z57B7H/zMIoHDqgbdUMvM3Jf6PZrgx5qat48ECGZHz0ImvQzwfr98i1k+wmJM5on2X6pWgU= distributions: sdist bdist_wheel on: tags: true
Allow Travis to deploy tags as new releases on PyPI.
Allow Travis to deploy tags as new releases on PyPI.
YAML
bsd-3-clause
playpauseandstop/Flask-And-Redis,playpauseandstop/Flask-And-Redis
yaml
## Code Before: language: python services: redis-server install: - pip install coveralls tox script: - tox - coveralls ## Instruction: Allow Travis to deploy tags as new releases on PyPI. ## Code After: language: python services: redis-server install: - pip install coveralls tox script: - tox - coveralls deploy: provider: pypi user: playpauseandstop password: secure: adlnY9sKpfJPdcpnh0SuhQ4iIbZozQ4uNMJ/UgAra4ft4FfiDhYKqqbz58mX0PSKhLcaA0RAeLeesMqqiKrEBRmmQeu7k/pM5kw3Z57B7H/zMIoHDqgbdUMvM3Jf6PZrgx5qat48ECGZHz0ImvQzwfr98i1k+wmJM5on2X6pWgU= distributions: sdist bdist_wheel on: tags: true
3e9328671e5d3d328bc4229b274342f97716560e
base/poses/standing06.meta
base/poses/standing06.meta
tag Standing name Standing06-pose description Standing pose author MakeHuman license CC0 copyright (c) 2016 Data Collection AB, Joel Palmius, Jonas Hauquier
name Standing06-pose description Standing pose author MakeHuman license CC0 copyright (c) 2016 Data Collection AB, Joel Palmius, Jonas Hauquier tag MakeHuman™ tag Standing
Add tag MakeHuman™ to pose .meta files
Add tag MakeHuman™ to pose .meta files
Unity3D Asset
cc0-1.0
makehumancommunity/makehuman-assets
unity3d-asset
## Code Before: tag Standing name Standing06-pose description Standing pose author MakeHuman license CC0 copyright (c) 2016 Data Collection AB, Joel Palmius, Jonas Hauquier ## Instruction: Add tag MakeHuman™ to pose .meta files ## Code After: name Standing06-pose description Standing pose author MakeHuman license CC0 copyright (c) 2016 Data Collection AB, Joel Palmius, Jonas Hauquier tag MakeHuman™ tag Standing
a8e9b105e1250ba9221258c8bb7ab849d8e4a373
.travis.yml
.travis.yml
before_install: gem update bundler language: ruby rvm: - 2.1 - 2.2 script: - bundle exec rake test_app - bundle exec rake spec
before_install: gem update bundler language: ruby rvm: - 2.2 - 2.3 - 2.4 - 2.5 script: - bundle exec rake test_app - bundle exec rake spec
Drop Ruby 2.1 support, but allow 2.3, 2.4 and 2.5
Drop Ruby 2.1 support, but allow 2.3, 2.4 and 2.5
YAML
bsd-3-clause
berkes/spree_billing_sisow,berkes/spree_billing_sisow
yaml
## Code Before: before_install: gem update bundler language: ruby rvm: - 2.1 - 2.2 script: - bundle exec rake test_app - bundle exec rake spec ## Instruction: Drop Ruby 2.1 support, but allow 2.3, 2.4 and 2.5 ## Code After: before_install: gem update bundler language: ruby rvm: - 2.2 - 2.3 - 2.4 - 2.5 script: - bundle exec rake test_app - bundle exec rake spec
e23914f0d21f3882aa6e4720d2a139cc697a29c4
Util/bukkit/src/main/java/tc/oc/commons/bukkit/chat/LegacyPlayerAudience.java
Util/bukkit/src/main/java/tc/oc/commons/bukkit/chat/LegacyPlayerAudience.java
package tc.oc.commons.bukkit.chat; import net.md_5.bungee.api.chat.BaseComponent; import org.bukkit.entity.Player; import tc.oc.commons.core.chat.Component; import tc.oc.commons.core.chat.Components; public class LegacyPlayerAudience extends PlayerAudience { private BaseComponent recentHotbarMessage; public LegacyPlayerAudience(Player player) { super(player); } @Override public void sendHotbarMessage(BaseComponent message) { // Do not spam hot bar messages, as the protocol converts // them to regular chat messages. if(!Components.equals(message, recentHotbarMessage)) { super.sendHotbarMessage(message); recentHotbarMessage = message; } } @Override public void showTitle(BaseComponent title, BaseComponent subtitle, int inTicks, int stayTicks, int outTicks) { super.sendMessage(new Component(title).extra(" ").extra(subtitle)); } }
package tc.oc.commons.bukkit.chat; import net.md_5.bungee.api.chat.BaseComponent; import org.bukkit.entity.Player; import tc.oc.commons.core.chat.Components; import tc.oc.commons.core.stream.Collectors; import java.util.stream.Stream; public class LegacyPlayerAudience extends PlayerAudience { private BaseComponent recentHotbarMessage; public LegacyPlayerAudience(Player player) { super(player); } @Override public void sendHotbarMessage(BaseComponent message) { // Do not spam hot bar messages, as the protocol converts // them to regular chat messages. if(!Components.equals(message, recentHotbarMessage)) { emphasize(message); recentHotbarMessage = message; } } @Override public void showTitle(BaseComponent title, BaseComponent subtitle, int inTicks, int stayTicks, int outTicks) { emphasize(Components.join(Components.space(), Stream.of(title, subtitle).filter(msg -> msg != null).collect(Collectors.toImmutableList()))); } protected void emphasize(BaseComponent message) { sendMessage(Components.blank()); sendMessage(message); sendMessage(Components.blank()); } }
Make titles and hotbars more visible
Make titles and hotbars more visible
Java
agpl-3.0
StratusNetwork/ProjectAres,PotatoStealer/ProjectAres
java
## Code Before: package tc.oc.commons.bukkit.chat; import net.md_5.bungee.api.chat.BaseComponent; import org.bukkit.entity.Player; import tc.oc.commons.core.chat.Component; import tc.oc.commons.core.chat.Components; public class LegacyPlayerAudience extends PlayerAudience { private BaseComponent recentHotbarMessage; public LegacyPlayerAudience(Player player) { super(player); } @Override public void sendHotbarMessage(BaseComponent message) { // Do not spam hot bar messages, as the protocol converts // them to regular chat messages. if(!Components.equals(message, recentHotbarMessage)) { super.sendHotbarMessage(message); recentHotbarMessage = message; } } @Override public void showTitle(BaseComponent title, BaseComponent subtitle, int inTicks, int stayTicks, int outTicks) { super.sendMessage(new Component(title).extra(" ").extra(subtitle)); } } ## Instruction: Make titles and hotbars more visible ## Code After: package tc.oc.commons.bukkit.chat; import net.md_5.bungee.api.chat.BaseComponent; import org.bukkit.entity.Player; import tc.oc.commons.core.chat.Components; import tc.oc.commons.core.stream.Collectors; import java.util.stream.Stream; public class LegacyPlayerAudience extends PlayerAudience { private BaseComponent recentHotbarMessage; public LegacyPlayerAudience(Player player) { super(player); } @Override public void sendHotbarMessage(BaseComponent message) { // Do not spam hot bar messages, as the protocol converts // them to regular chat messages. if(!Components.equals(message, recentHotbarMessage)) { emphasize(message); recentHotbarMessage = message; } } @Override public void showTitle(BaseComponent title, BaseComponent subtitle, int inTicks, int stayTicks, int outTicks) { emphasize(Components.join(Components.space(), Stream.of(title, subtitle).filter(msg -> msg != null).collect(Collectors.toImmutableList()))); } protected void emphasize(BaseComponent message) { sendMessage(Components.blank()); sendMessage(message); sendMessage(Components.blank()); } }
3975ffed38b419a7ee6e88846a20de9c06c52c08
packages/fireplace/lib/model/promise_model.js
packages/fireplace/lib/model/promise_model.js
var get = Ember.get; FP.PromiseModel = Ember.ObjectProxy.extend(Ember.PromiseProxyMixin, { // forward on all content's functions where it makes sense to do so _setupContentForwarding: function() { var obj = get(this, "content"); if (!obj) { return; } for (var prop in obj) { if (!this[prop] && typeof obj[prop] === "function") { this._forwardToContent(prop); } } }.observes("content").on("init"), _forwardToContent: function(prop) { this[prop] = function() { var content = this.get("content"); return content[prop].apply(content, arguments); }; } });
var set = Ember.set, get = Ember.get, resolve = Ember.RSVP.resolve; // reimplemented private method from Ember, but with setting // _settingFromFirebase so we can avoid extra saves down the line function observePromise(proxy, promise) { promise.then(function(value) { set(proxy, 'isFulfilled', true); value._settingFromFirebase = true; set(proxy, 'content', value); value._settingFromFirebase = false; }, function(reason) { set(proxy, 'isRejected', true); set(proxy, 'reason', reason); // don't re-throw, as we are merely observing }); } FP.PromiseModel = Ember.ObjectProxy.extend(Ember.PromiseProxyMixin, { // forward on all content's functions where it makes sense to do so _setupContentForwarding: function() { var obj = get(this, "content"); if (!obj) { return; } for (var prop in obj) { if (!this[prop] && typeof obj[prop] === "function") { this._forwardToContent(prop); } } }.observes("content").on("init"), _forwardToContent: function(prop) { this[prop] = function() { var content = this.get("content"); return content[prop].apply(content, arguments); }; }, // re-implemented from Ember so we can call our own observePromise promise: Ember.computed(function(key, promise) { if (arguments.length === 2) { promise = resolve(promise); observePromise(this, promise); return promise.then(); // fork the promise. } else { throw new Ember.Error("PromiseProxy's promise must be set"); } }) });
Set _settingFromFirebase on the PromiseModel when it's resolved
Set _settingFromFirebase on the PromiseModel when it's resolved
JavaScript
mit
rlivsey/fireplace,rlivsey/fireplace,pk4media/fireplace,pk4media/fireplace
javascript
## Code Before: var get = Ember.get; FP.PromiseModel = Ember.ObjectProxy.extend(Ember.PromiseProxyMixin, { // forward on all content's functions where it makes sense to do so _setupContentForwarding: function() { var obj = get(this, "content"); if (!obj) { return; } for (var prop in obj) { if (!this[prop] && typeof obj[prop] === "function") { this._forwardToContent(prop); } } }.observes("content").on("init"), _forwardToContent: function(prop) { this[prop] = function() { var content = this.get("content"); return content[prop].apply(content, arguments); }; } }); ## Instruction: Set _settingFromFirebase on the PromiseModel when it's resolved ## Code After: var set = Ember.set, get = Ember.get, resolve = Ember.RSVP.resolve; // reimplemented private method from Ember, but with setting // _settingFromFirebase so we can avoid extra saves down the line function observePromise(proxy, promise) { promise.then(function(value) { set(proxy, 'isFulfilled', true); value._settingFromFirebase = true; set(proxy, 'content', value); value._settingFromFirebase = false; }, function(reason) { set(proxy, 'isRejected', true); set(proxy, 'reason', reason); // don't re-throw, as we are merely observing }); } FP.PromiseModel = Ember.ObjectProxy.extend(Ember.PromiseProxyMixin, { // forward on all content's functions where it makes sense to do so _setupContentForwarding: function() { var obj = get(this, "content"); if (!obj) { return; } for (var prop in obj) { if (!this[prop] && typeof obj[prop] === "function") { this._forwardToContent(prop); } } }.observes("content").on("init"), _forwardToContent: function(prop) { this[prop] = function() { var content = this.get("content"); return content[prop].apply(content, arguments); }; }, // re-implemented from Ember so we can call our own observePromise promise: Ember.computed(function(key, promise) { if (arguments.length === 2) { promise = resolve(promise); observePromise(this, promise); return promise.then(); // fork the promise. } else { throw new Ember.Error("PromiseProxy's promise must be set"); } }) });
820e3adb3fd7f3586fae8898df2c9bdd31d0f8a8
app/models/spree/address_decorator.rb
app/models/spree/address_decorator.rb
Spree::Address.class_eval do belongs_to :user attr_accessible :user_id, :deleted_at def self.required_fields validator = Spree::Address.validators.find{|v| v.kind_of?(ActiveModel::Validations::PresenceValidator)} validator ? validator.attributes : [] end # can modify an address if it's not been used in an order def editable? new_record? || (shipments.empty? && (Spree::Order.where("bill_address_id = ?", self.id).count + Spree::Order.where("bill_address_id = ?", self.id).count <= 1) && Spree::Order.complete.where("bill_address_id = ? OR ship_address_id = ?", self.id, self.id).count == 0) end def can_be_deleted? shipments.empty? && Spree::Order.where("bill_address_id = ? OR ship_address_id = ?", self.id, self.id).count == 0 end def to_s "#{firstname} #{lastname}: #{full_address}, #{city}, #{state || state_name}, #{country}, #{zipcode}" end def full_address [address1, address2].reject(&:blank?).to_a.join(', ') end def destroy_with_saving_used if can_be_deleted? destroy_without_saving_used else update_attribute(:deleted_at, Time.now) end end alias_method_chain :destroy, :saving_used end
Spree::Address.class_eval do belongs_to :user attr_accessible :user_id, :deleted_at def self.required_fields validator = Spree::Address.validators.find{|v| v.kind_of?(ActiveModel::Validations::PresenceValidator)} validator ? validator.attributes : [] end # can modify an address if it's not been used in an order def editable? new_record? || (shipments.empty? && (Spree::Order.where("bill_address_id = ?", self.id).count + Spree::Order.where("bill_address_id = ?", self.id).count <= 1) && Spree::Order.complete.where("bill_address_id = ? OR ship_address_id = ?", self.id, self.id).count == 0) end def can_be_deleted? shipments.empty? && Spree::Order.where("bill_address_id = ? OR ship_address_id = ?", self.id, self.id).count == 0 end def to_s "#{firstname} #{lastname}: #{full_address}, #{city}, #{state || state_name}, #{country}, #{zipcode}" end def full_address [address1, address2].reject(&:blank?).to_a.join(', ') end def destroy_with_saving_used if can_be_deleted? destroy_without_saving_used else update_attribute(:user_id, nil) end end alias_method_chain :destroy, :saving_used end
Remove user association from address instead of setting deleted at flag
Remove user association from address instead of setting deleted at flag
Ruby
bsd-3-clause
DynamoMTL/spree_address_book,DynamoMTL/spree_address_book
ruby
## Code Before: Spree::Address.class_eval do belongs_to :user attr_accessible :user_id, :deleted_at def self.required_fields validator = Spree::Address.validators.find{|v| v.kind_of?(ActiveModel::Validations::PresenceValidator)} validator ? validator.attributes : [] end # can modify an address if it's not been used in an order def editable? new_record? || (shipments.empty? && (Spree::Order.where("bill_address_id = ?", self.id).count + Spree::Order.where("bill_address_id = ?", self.id).count <= 1) && Spree::Order.complete.where("bill_address_id = ? OR ship_address_id = ?", self.id, self.id).count == 0) end def can_be_deleted? shipments.empty? && Spree::Order.where("bill_address_id = ? OR ship_address_id = ?", self.id, self.id).count == 0 end def to_s "#{firstname} #{lastname}: #{full_address}, #{city}, #{state || state_name}, #{country}, #{zipcode}" end def full_address [address1, address2].reject(&:blank?).to_a.join(', ') end def destroy_with_saving_used if can_be_deleted? destroy_without_saving_used else update_attribute(:deleted_at, Time.now) end end alias_method_chain :destroy, :saving_used end ## Instruction: Remove user association from address instead of setting deleted at flag ## Code After: Spree::Address.class_eval do belongs_to :user attr_accessible :user_id, :deleted_at def self.required_fields validator = Spree::Address.validators.find{|v| v.kind_of?(ActiveModel::Validations::PresenceValidator)} validator ? validator.attributes : [] end # can modify an address if it's not been used in an order def editable? new_record? || (shipments.empty? && (Spree::Order.where("bill_address_id = ?", self.id).count + Spree::Order.where("bill_address_id = ?", self.id).count <= 1) && Spree::Order.complete.where("bill_address_id = ? OR ship_address_id = ?", self.id, self.id).count == 0) end def can_be_deleted? shipments.empty? && Spree::Order.where("bill_address_id = ? OR ship_address_id = ?", self.id, self.id).count == 0 end def to_s "#{firstname} #{lastname}: #{full_address}, #{city}, #{state || state_name}, #{country}, #{zipcode}" end def full_address [address1, address2].reject(&:blank?).to_a.join(', ') end def destroy_with_saving_used if can_be_deleted? destroy_without_saving_used else update_attribute(:user_id, nil) end end alias_method_chain :destroy, :saving_used end
b7b0436fb866c5a19769458d47c72a0d2c55f8c0
test/default/auth.c
test/default/auth.c
/* "Test Case 2" from RFC 4231 */ unsigned char key[32] = "Jefe"; unsigned char c[] = "what do ya want for nothing?"; unsigned char a[32]; int main(void) { int i; crypto_auth(a,c,sizeof c - 1U,key); for (i = 0;i < 32;++i) { printf(",0x%02x",(unsigned int) a[i]); if (i % 8 == 7) printf("\n"); } assert(crypto_auth_bytes() > 0U); assert(crypto_auth_keybytes() > 0U); assert(strcmp(crypto_auth_primitive(), "hmacsha512256") == 0); return 0; }
/* "Test Case 2" from RFC 4231 */ unsigned char key[32] = "Jefe"; unsigned char c[] = "what do ya want for nothing?"; unsigned char a[32]; int main(void) { int i; crypto_auth(a,c,sizeof c - 1U,key); for (i = 0;i < 32;++i) { printf(",0x%02x",(unsigned int) a[i]); if (i % 8 == 7) printf("\n"); } assert(crypto_auth_bytes() > 0U); assert(crypto_auth_keybytes() > 0U); assert(strcmp(crypto_auth_primitive(), "hmacsha512256") == 0); assert(crypto_auth_hmacsha512256_bytes() > 0U); assert(crypto_auth_hmacsha512256_keybytes() > 0U); return 0; }
Test the presence of some extra functions
Test the presence of some extra functions
C
isc
rustyhorde/libsodium,SpiderOak/libsodium,netroby/libsodium,JackWink/libsodium,Payshares/libsodium,eburkitt/libsodium,netroby/libsodium,akkakks/libsodium,CyanogenMod/android_external_dnscrypt_libsodium,zhuqling/libsodium,rustyhorde/libsodium,pyparallel/libsodium,paragonie-scott/libsodium,eburkitt/libsodium,akkakks/libsodium,GreatFruitOmsk/libsodium,kytvi2p/libsodium,pmienk/libsodium,mvduin/libsodium,pyparallel/libsodium,akkakks/libsodium,Payshares/libsodium,mvduin/libsodium,donpark/libsodium,JackWink/libsodium,optedoblivion/android_external_libsodium,SpiderOak/libsodium,JackWink/libsodium,tml/libsodium,rustyhorde/libsodium,tml/libsodium,Payshare/libsodium,kytvi2p/libsodium,mvduin/libsodium,pyparallel/libsodium,Payshare/libsodium,eburkitt/libsodium,CyanogenMod/android_external_dnscrypt_libsodium,HappyYang/libsodium,CyanogenMod/android_external_dnscrypt_libsodium,zhuqling/libsodium,GreatFruitOmsk/libsodium,HappyYang/libsodium,donpark/libsodium,optedoblivion/android_external_libsodium,Payshare/libsodium,soumith/libsodium,GreatFruitOmsk/libsodium,HappyYang/libsodium,paragonie-scott/libsodium,soumith/libsodium,kytvi2p/libsodium,akkakks/libsodium,SpiderOak/libsodium,paragonie-scott/libsodium,Payshares/libsodium,SpiderOak/libsodium,optedoblivion/android_external_libsodium,netroby/libsodium,pmienk/libsodium,rustyhorde/libsodium,zhuqling/libsodium,pmienk/libsodium,tml/libsodium,donpark/libsodium,soumith/libsodium
c
## Code Before: /* "Test Case 2" from RFC 4231 */ unsigned char key[32] = "Jefe"; unsigned char c[] = "what do ya want for nothing?"; unsigned char a[32]; int main(void) { int i; crypto_auth(a,c,sizeof c - 1U,key); for (i = 0;i < 32;++i) { printf(",0x%02x",(unsigned int) a[i]); if (i % 8 == 7) printf("\n"); } assert(crypto_auth_bytes() > 0U); assert(crypto_auth_keybytes() > 0U); assert(strcmp(crypto_auth_primitive(), "hmacsha512256") == 0); return 0; } ## Instruction: Test the presence of some extra functions ## Code After: /* "Test Case 2" from RFC 4231 */ unsigned char key[32] = "Jefe"; unsigned char c[] = "what do ya want for nothing?"; unsigned char a[32]; int main(void) { int i; crypto_auth(a,c,sizeof c - 1U,key); for (i = 0;i < 32;++i) { printf(",0x%02x",(unsigned int) a[i]); if (i % 8 == 7) printf("\n"); } assert(crypto_auth_bytes() > 0U); assert(crypto_auth_keybytes() > 0U); assert(strcmp(crypto_auth_primitive(), "hmacsha512256") == 0); assert(crypto_auth_hmacsha512256_bytes() > 0U); assert(crypto_auth_hmacsha512256_keybytes() > 0U); return 0; }
f9358b0dcf022005883dc89190450fb598c8dcd0
.gitlab-ci.yml
.gitlab-ci.yml
image: ruby:2.3.1 services: - postgres:latest variables: POSTGRES_DB: wulin_app_test POSTGRES_USER: postgres POSTGRES_PASSWORD: '' before_script: - apt-get update -qy - apt-get install -qy nodejs - apt-get install -qy libfontconfig1 # - export PHANTOM_JS="phantomjs-2.1.1-linux-x86_64" # - wget https://bitbucket.org/ariya/phantomjs/downloads/$PHANTOM_JS.tar.bz2 --referer=https://bitbucket.org -P /cache/ # - tar xjf /cache/$PHANTOM_JS.tar.bz2 -C /cache # - mv /cache/$PHANTOM_JS /usr/local/share # - ln -sf /usr/local/share/$PHANTOM_JS/bin/phantomjs /usr/local/bin - export PHANTOM_JS_VERSION="2.1.1" - wget https://github.com/paladox/phantomjs/archive/v$PHANTOM_JS_VERSION.tar.gz -P /cache/phantomjs/ - tar vxzf /cache/phantomjs/v$PHANTOM_JS_VERSION.tar.gz -C /cache - mv /cache/phantomjs-$PHANTOM_JS_VERSION /usr/local/share - ln -sf /usr/local/share/phantomjs-$PHANTOM_JS_VERSION/bin/phantomjs /usr/local/bin - gem install bundler - cp spec/wulin_app/config/database.ci.yml spec/wulin_app/config/database.yml - bundle install --path /cache test: script: - bundle exec rspec - bundle exec cucumber
image: ruby:2.3.1 services: - postgres:latest variables: POSTGRES_DB: wulin_app_test POSTGRES_USER: postgres POSTGRES_PASSWORD: '' before_script: - apt-get update -qy - apt-get install -qy nodejs - apt-get install -qy libfontconfig1 - export PHANTOM_JS_VERSION="2.1.1" - wget -q https://github.com/paladox/phantomjs/archive/v$PHANTOM_JS_VERSION.tar.gz -P /cache/phantomjs/ - tar xzf /cache/phantomjs/v$PHANTOM_JS_VERSION.tar.gz -C /cache - mv /cache/phantomjs-$PHANTOM_JS_VERSION /usr/local/share - ln -sf /usr/local/share/phantomjs-$PHANTOM_JS_VERSION/bin/phantomjs /usr/local/bin - gem install bundler - cp spec/wulin_app/config/database.ci.yml spec/wulin_app/config/database.yml - bundle install --path /cache test: script: - bundle exec rspec - bundle exec cucumber
Remove quiet/verbose flags in wget/tar
Remove quiet/verbose flags in wget/tar
YAML
mit
ekohe/wulin_master,ekohe/wulin_master,ekohe/wulin_master
yaml
## Code Before: image: ruby:2.3.1 services: - postgres:latest variables: POSTGRES_DB: wulin_app_test POSTGRES_USER: postgres POSTGRES_PASSWORD: '' before_script: - apt-get update -qy - apt-get install -qy nodejs - apt-get install -qy libfontconfig1 # - export PHANTOM_JS="phantomjs-2.1.1-linux-x86_64" # - wget https://bitbucket.org/ariya/phantomjs/downloads/$PHANTOM_JS.tar.bz2 --referer=https://bitbucket.org -P /cache/ # - tar xjf /cache/$PHANTOM_JS.tar.bz2 -C /cache # - mv /cache/$PHANTOM_JS /usr/local/share # - ln -sf /usr/local/share/$PHANTOM_JS/bin/phantomjs /usr/local/bin - export PHANTOM_JS_VERSION="2.1.1" - wget https://github.com/paladox/phantomjs/archive/v$PHANTOM_JS_VERSION.tar.gz -P /cache/phantomjs/ - tar vxzf /cache/phantomjs/v$PHANTOM_JS_VERSION.tar.gz -C /cache - mv /cache/phantomjs-$PHANTOM_JS_VERSION /usr/local/share - ln -sf /usr/local/share/phantomjs-$PHANTOM_JS_VERSION/bin/phantomjs /usr/local/bin - gem install bundler - cp spec/wulin_app/config/database.ci.yml spec/wulin_app/config/database.yml - bundle install --path /cache test: script: - bundle exec rspec - bundle exec cucumber ## Instruction: Remove quiet/verbose flags in wget/tar ## Code After: image: ruby:2.3.1 services: - postgres:latest variables: POSTGRES_DB: wulin_app_test POSTGRES_USER: postgres POSTGRES_PASSWORD: '' before_script: - apt-get update -qy - apt-get install -qy nodejs - apt-get install -qy libfontconfig1 - export PHANTOM_JS_VERSION="2.1.1" - wget -q https://github.com/paladox/phantomjs/archive/v$PHANTOM_JS_VERSION.tar.gz -P /cache/phantomjs/ - tar xzf /cache/phantomjs/v$PHANTOM_JS_VERSION.tar.gz -C /cache - mv /cache/phantomjs-$PHANTOM_JS_VERSION /usr/local/share - ln -sf /usr/local/share/phantomjs-$PHANTOM_JS_VERSION/bin/phantomjs /usr/local/bin - gem install bundler - cp spec/wulin_app/config/database.ci.yml spec/wulin_app/config/database.yml - bundle install --path /cache test: script: - bundle exec rspec - bundle exec cucumber
e8a1c96335982d9862d6abb76dda8f00ffa2b795
lib/travis/build/script/clojure.rb
lib/travis/build/script/clojure.rb
require 'travis/build/script/shared/jdk' module Travis module Build class Script class Clojure < Script include Jdk DEFAULTS = { lein: 'lein', jdk: 'default' } def announce super sh.cmd "#{lein} version" end def install sh.cmd "#{lein} deps", fold: 'install', retry: true end def script sh.cmd "#{lein} test" end def cache_slug super << '--lein-' << lein end private def lein config[:lein].to_s end end end end end
require 'travis/build/script/shared/jdk' module Travis module Build class Script class Clojure < Script include Jdk DEFAULTS = { lein: 'lein', jdk: 'default' } LEIN_VERSION = '2.6.1' def configure super update_lein LEIN_VERSION end def announce super sh.cmd "#{lein} version" end def install sh.cmd "#{lein} deps", fold: 'install', retry: true end def script sh.cmd "#{lein} test" end def cache_slug super << '--lein-' << lein end private def lein config[:lein].to_s end def update_lein(version) sh.if "! -f $HOME/.lein/self-installs/leiningen-#{version}-standalone.jar" do sh.cmd "env LEIN_ROOT=true curl -L -o /usr/local/bin/lein https://raw.githubusercontent.com/technomancy/leiningen/#{version}/bin/lein", echo: true, assert: true, sudo: true sh.cmd "rm -rf $HOME/.lein", echo: false sh.cmd "lein self-install", echo: true, assert: true end end end end end end
Update leiningen on the fly
Update leiningen on the fly Note that we the version to update to is fixed at this time.
Ruby
mit
craigcitro/travis-build,kevmoo/travis-build,andyli/travis-build,kevmoo/travis-build,JuliaCI/travis-build,JuliaCI/travis-build,andyli/travis-build,craigcitro/travis-build
ruby
## Code Before: require 'travis/build/script/shared/jdk' module Travis module Build class Script class Clojure < Script include Jdk DEFAULTS = { lein: 'lein', jdk: 'default' } def announce super sh.cmd "#{lein} version" end def install sh.cmd "#{lein} deps", fold: 'install', retry: true end def script sh.cmd "#{lein} test" end def cache_slug super << '--lein-' << lein end private def lein config[:lein].to_s end end end end end ## Instruction: Update leiningen on the fly Note that we the version to update to is fixed at this time. ## Code After: require 'travis/build/script/shared/jdk' module Travis module Build class Script class Clojure < Script include Jdk DEFAULTS = { lein: 'lein', jdk: 'default' } LEIN_VERSION = '2.6.1' def configure super update_lein LEIN_VERSION end def announce super sh.cmd "#{lein} version" end def install sh.cmd "#{lein} deps", fold: 'install', retry: true end def script sh.cmd "#{lein} test" end def cache_slug super << '--lein-' << lein end private def lein config[:lein].to_s end def update_lein(version) sh.if "! -f $HOME/.lein/self-installs/leiningen-#{version}-standalone.jar" do sh.cmd "env LEIN_ROOT=true curl -L -o /usr/local/bin/lein https://raw.githubusercontent.com/technomancy/leiningen/#{version}/bin/lein", echo: true, assert: true, sudo: true sh.cmd "rm -rf $HOME/.lein", echo: false sh.cmd "lein self-install", echo: true, assert: true end end end end end end
9b4930028a8451fc5eaba509ee0887a093b0a325
app/views/compositions/show.json.jbuilder
app/views/compositions/show.json.jbuilder
json.composition do if @composition.persisted? json.id @composition.id end json.slug @composition.slug json.name @composition.name json.notes @composition.notes json.map do json.id @composition.map.id json.name @composition.map.name json.type @composition.map.map_type json.segments @composition.map.segments do |map_segment| json.id map_segment.id json.name map_segment.name end end json.availablePlayers @available_players do |player| json.id player.id json.name player.name end json.players @builder.rows do |row| if row.player json.id row.player.id json.name row.player.name else json.name '' end end # Hash of player ID => heroes json.heroes do @builder.rows.each do |row| if row.player json.set! row.player.id, row.heroes do |hero| json.id hero.id json.slug hero.slug json.name hero.name end end end end # Hash of player ID => map segment ID => hero ID json.selections do @builder.rows.each do |row| if row.player json.set! row.player.id do @builder.map_segments.each do |map_segment| json.set! map_segment.id, row.selected_hero(map_segment) end end end end end end
json.composition do if @composition.persisted? json.id @composition.id end json.user do json.battletag @composition.user.try(:battletag) end json.slug @composition.slug json.name @composition.name json.notes @composition.notes json.map do json.id @composition.map.id json.name @composition.map.name json.type @composition.map.map_type json.segments @composition.map.segments do |map_segment| json.id map_segment.id json.name map_segment.name end end json.availablePlayers @available_players do |player| json.id player.id json.name player.name end json.players @builder.rows do |row| if row.player json.id row.player.id json.name row.player.name else json.name '' end end # Hash of player ID => heroes json.heroes do @builder.rows.each do |row| if row.player json.set! row.player.id, row.heroes do |hero| json.id hero.id json.slug hero.slug json.name hero.name end end end end # Hash of player ID => map segment ID => hero ID json.selections do @builder.rows.each do |row| if row.player json.set! row.player.id do @builder.map_segments.each do |map_segment| json.set! map_segment.id, row.selected_hero(map_segment) end end end end end end
Include creator info in composition JSON
Include creator info in composition JSON
Ruby
mit
cheshire137/overwatch-team-comps,cheshire137/overwatch-team-comps,cheshire137/overwatch-team-comps
ruby
## Code Before: json.composition do if @composition.persisted? json.id @composition.id end json.slug @composition.slug json.name @composition.name json.notes @composition.notes json.map do json.id @composition.map.id json.name @composition.map.name json.type @composition.map.map_type json.segments @composition.map.segments do |map_segment| json.id map_segment.id json.name map_segment.name end end json.availablePlayers @available_players do |player| json.id player.id json.name player.name end json.players @builder.rows do |row| if row.player json.id row.player.id json.name row.player.name else json.name '' end end # Hash of player ID => heroes json.heroes do @builder.rows.each do |row| if row.player json.set! row.player.id, row.heroes do |hero| json.id hero.id json.slug hero.slug json.name hero.name end end end end # Hash of player ID => map segment ID => hero ID json.selections do @builder.rows.each do |row| if row.player json.set! row.player.id do @builder.map_segments.each do |map_segment| json.set! map_segment.id, row.selected_hero(map_segment) end end end end end end ## Instruction: Include creator info in composition JSON ## Code After: json.composition do if @composition.persisted? json.id @composition.id end json.user do json.battletag @composition.user.try(:battletag) end json.slug @composition.slug json.name @composition.name json.notes @composition.notes json.map do json.id @composition.map.id json.name @composition.map.name json.type @composition.map.map_type json.segments @composition.map.segments do |map_segment| json.id map_segment.id json.name map_segment.name end end json.availablePlayers @available_players do |player| json.id player.id json.name player.name end json.players @builder.rows do |row| if row.player json.id row.player.id json.name row.player.name else json.name '' end end # Hash of player ID => heroes json.heroes do @builder.rows.each do |row| if row.player json.set! row.player.id, row.heroes do |hero| json.id hero.id json.slug hero.slug json.name hero.name end end end end # Hash of player ID => map segment ID => hero ID json.selections do @builder.rows.each do |row| if row.player json.set! row.player.id do @builder.map_segments.each do |map_segment| json.set! map_segment.id, row.selected_hero(map_segment) end end end end end end
b40f35d34739e43d5006594e183d4f5fef9a69db
.travis.yml
.travis.yml
language: python python: - "2.7" - "3.3" - "3.4" - "3.5" - "3.6" - "pypy" - "pypy3" script: scripts/runtest.sh
sudo: false language: python cache: pip python: - '2.7' - '3.3' - '3.4' - '3.5' - '3.6' - pypy - pypy3 script: scripts/runtest.sh deploy: provider: pypi user: uncovertruth password: secure: EbuYCrzrV4X9cJhj832zikWP9KUn6mEpf+/Ajm40amMS2jY5j9sJaLZulf3D+Ff3/I5K2fIjYrzzr+M+UxoD56eg2wfMmn3tvsngtDMhwMU5CCMWKJrcponEmCi5PYwp5lSQgmCjZlgaWS2reSzsDQe4oVS8ivejSUbZgE+o9vk= distributions: sdist bdist_wheel skip_upload_docs: true on: repo: uncovertruth/py-geohex3 tags: true python: 3.6
Add auto deployment to pypi
Add auto deployment to pypi
YAML
mit
uncovertruth/py-geohex3,uncovertruth/py-geohex3
yaml
## Code Before: language: python python: - "2.7" - "3.3" - "3.4" - "3.5" - "3.6" - "pypy" - "pypy3" script: scripts/runtest.sh ## Instruction: Add auto deployment to pypi ## Code After: sudo: false language: python cache: pip python: - '2.7' - '3.3' - '3.4' - '3.5' - '3.6' - pypy - pypy3 script: scripts/runtest.sh deploy: provider: pypi user: uncovertruth password: secure: EbuYCrzrV4X9cJhj832zikWP9KUn6mEpf+/Ajm40amMS2jY5j9sJaLZulf3D+Ff3/I5K2fIjYrzzr+M+UxoD56eg2wfMmn3tvsngtDMhwMU5CCMWKJrcponEmCi5PYwp5lSQgmCjZlgaWS2reSzsDQe4oVS8ivejSUbZgE+o9vk= distributions: sdist bdist_wheel skip_upload_docs: true on: repo: uncovertruth/py-geohex3 tags: true python: 3.6
2e0ce2eed7e2713f7c479f724f453adcd88ee226
tests-arquillian/src/test/java/org/jboss/weld/tests/ejb/EJBCallTest.java
tests-arquillian/src/test/java/org/jboss/weld/tests/ejb/EJBCallTest.java
package org.jboss.weld.tests.ejb; import javax.ejb.Stateless; import javax.inject.Inject; import org.jboss.arquillian.container.test.api.Deployment; import org.jboss.arquillian.junit.Arquillian; import org.jboss.shrinkwrap.api.ArchivePaths; import org.jboss.shrinkwrap.api.ShrinkWrap; import org.jboss.shrinkwrap.api.asset.EmptyAsset; import org.jboss.shrinkwrap.api.spec.JavaArchive; import org.junit.Assert; import org.junit.Test; import org.junit.runner.RunWith; @RunWith(Arquillian.class) public class EJBCallTest { @Deployment public static JavaArchive createTestArchive() { return ShrinkWrap .create(JavaArchive.class, "test.jar") .addAsManifestResource(EmptyAsset.INSTANCE, ArchivePaths.create("beans.xml")); } @Stateless public static class SomeService { public String someMethod() { return "test"; } } @Inject SomeService someService; @Test public void testStatelessCall() { Assert.assertEquals("test", someService.someMethod()); } }
package org.jboss.weld.tests.ejb; import javax.ejb.Stateless; import javax.inject.Inject; import org.jboss.arquillian.container.test.api.Deployment; import org.jboss.arquillian.junit.Arquillian; import org.jboss.shrinkwrap.api.ArchivePaths; import org.jboss.shrinkwrap.api.ShrinkWrap; import org.jboss.shrinkwrap.api.asset.EmptyAsset; import org.jboss.shrinkwrap.api.spec.JavaArchive; import org.junit.Assert; import org.junit.Ignore; import org.junit.Test; import org.junit.runner.RunWith; @RunWith(Arquillian.class) public class EJBCallTest { @Deployment public static JavaArchive createTestArchive() { return ShrinkWrap .create(JavaArchive.class, "test.jar") .addAsManifestResource(EmptyAsset.INSTANCE, ArchivePaths.create("beans.xml")); } @Stateless public static class SomeService { public String someMethod() { return "test"; } } @Inject SomeService someService; @Test @Ignore("WELD-1086") public void testStatelessCall() { Assert.assertEquals("test", someService.someMethod()); } }
Exclude the test for WELD-1086 temporarily
Exclude the test for WELD-1086 temporarily
Java
apache-2.0
manovotn/core,antoinesd/weld-core,antoinesd/weld-core,weld/core,weld/core,antoinesd/weld-core,manovotn/core,manovotn/core
java
## Code Before: package org.jboss.weld.tests.ejb; import javax.ejb.Stateless; import javax.inject.Inject; import org.jboss.arquillian.container.test.api.Deployment; import org.jboss.arquillian.junit.Arquillian; import org.jboss.shrinkwrap.api.ArchivePaths; import org.jboss.shrinkwrap.api.ShrinkWrap; import org.jboss.shrinkwrap.api.asset.EmptyAsset; import org.jboss.shrinkwrap.api.spec.JavaArchive; import org.junit.Assert; import org.junit.Test; import org.junit.runner.RunWith; @RunWith(Arquillian.class) public class EJBCallTest { @Deployment public static JavaArchive createTestArchive() { return ShrinkWrap .create(JavaArchive.class, "test.jar") .addAsManifestResource(EmptyAsset.INSTANCE, ArchivePaths.create("beans.xml")); } @Stateless public static class SomeService { public String someMethod() { return "test"; } } @Inject SomeService someService; @Test public void testStatelessCall() { Assert.assertEquals("test", someService.someMethod()); } } ## Instruction: Exclude the test for WELD-1086 temporarily ## Code After: package org.jboss.weld.tests.ejb; import javax.ejb.Stateless; import javax.inject.Inject; import org.jboss.arquillian.container.test.api.Deployment; import org.jboss.arquillian.junit.Arquillian; import org.jboss.shrinkwrap.api.ArchivePaths; import org.jboss.shrinkwrap.api.ShrinkWrap; import org.jboss.shrinkwrap.api.asset.EmptyAsset; import org.jboss.shrinkwrap.api.spec.JavaArchive; import org.junit.Assert; import org.junit.Ignore; import org.junit.Test; import org.junit.runner.RunWith; @RunWith(Arquillian.class) public class EJBCallTest { @Deployment public static JavaArchive createTestArchive() { return ShrinkWrap .create(JavaArchive.class, "test.jar") .addAsManifestResource(EmptyAsset.INSTANCE, ArchivePaths.create("beans.xml")); } @Stateless public static class SomeService { public String someMethod() { return "test"; } } @Inject SomeService someService; @Test @Ignore("WELD-1086") public void testStatelessCall() { Assert.assertEquals("test", someService.someMethod()); } }
170e145d6620e0b1e49634614c4860c10530ac26
.travis.yml
.travis.yml
language: ruby rvm: - jruby-1.7.11 script: bundle exec rspec && (bundle exec rake release || echo "Not released") before_deploy: deploy: provider: rubygems api_key: secure: I9uNSIU6Ob+UlxXzSdUnzg9YVT8tsZULQwYDd0ENLrTE1CdCDiShw7oXP3AYAr/SOQg8yrURkNIW0QV5h4nKoZb+rGpZBMyY9wTtFpek4daYNRJKxUsrFX/Jv84zjqEO/AFEwTtlVKhpUOcCOp26bdCVaBaFgJMZPcObmm7z2bc= gem: pacer on: Release from master branch branch: master tags: true repo: pangloss/pacer
language: ruby rvm: - jruby-1.7.11 - jruby-1.7.18 script: bundle exec rspec && (bundle exec rake release || echo "Not released") before_deploy: deploy: provider: rubygems api_key: secure: I9uNSIU6Ob+UlxXzSdUnzg9YVT8tsZULQwYDd0ENLrTE1CdCDiShw7oXP3AYAr/SOQg8yrURkNIW0QV5h4nKoZb+rGpZBMyY9wTtFpek4daYNRJKxUsrFX/Jv84zjqEO/AFEwTtlVKhpUOcCOp26bdCVaBaFgJMZPcObmm7z2bc= gem: pacer on: Release from master branch branch: master tags: true repo: pangloss/pacer
Test on JRuby 1.7.18 too
Test on JRuby 1.7.18 too
YAML
bsd-3-clause
xnlogic/pacer,xnlogic/pacer
yaml
## Code Before: language: ruby rvm: - jruby-1.7.11 script: bundle exec rspec && (bundle exec rake release || echo "Not released") before_deploy: deploy: provider: rubygems api_key: secure: I9uNSIU6Ob+UlxXzSdUnzg9YVT8tsZULQwYDd0ENLrTE1CdCDiShw7oXP3AYAr/SOQg8yrURkNIW0QV5h4nKoZb+rGpZBMyY9wTtFpek4daYNRJKxUsrFX/Jv84zjqEO/AFEwTtlVKhpUOcCOp26bdCVaBaFgJMZPcObmm7z2bc= gem: pacer on: Release from master branch branch: master tags: true repo: pangloss/pacer ## Instruction: Test on JRuby 1.7.18 too ## Code After: language: ruby rvm: - jruby-1.7.11 - jruby-1.7.18 script: bundle exec rspec && (bundle exec rake release || echo "Not released") before_deploy: deploy: provider: rubygems api_key: secure: I9uNSIU6Ob+UlxXzSdUnzg9YVT8tsZULQwYDd0ENLrTE1CdCDiShw7oXP3AYAr/SOQg8yrURkNIW0QV5h4nKoZb+rGpZBMyY9wTtFpek4daYNRJKxUsrFX/Jv84zjqEO/AFEwTtlVKhpUOcCOp26bdCVaBaFgJMZPcObmm7z2bc= gem: pacer on: Release from master branch branch: master tags: true repo: pangloss/pacer
b0df098bc0f9a69720357aedb1c4acfd9a55f066
tests/python/PythonTests.cmake
tests/python/PythonTests.cmake
if(WIN32) set(_separator "\\;") else() set(_separator ":") endif() function(add_python_test case) set(name "python_${case}") set(module ${case}_test) set(_one_value_args PYTHONPATH) cmake_parse_arguments(fn "" "${_one_value_args}" "" ${ARGN}) add_test( NAME ${name} WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}" COMMAND "${PYTHON_EXECUTABLE}" -m unittest -v ${module} ) set(_pythonpath "${tomviz_python_binary_dir}") set(_pythonpath "${_pythonpath}${_separator}${fn_PYTHONPATH}") set(_pythonpath "$ENV{PYTHONPATH}${_separator}${_pythonpath}") if (WIN32) string(REPLACE "\\;" ";" _pythonpath "${_pythonpath}") string(REPLACE ";" "\\;" _pythonpath "${_pythonpath}") endif() set_property(TEST ${name} PROPERTY ENVIRONMENT "PYTHONPATH=${_pythonpath}" ) endfunction()
if(WIN32) set(_separator "\\;") else() set(_separator ":") endif() function(add_python_test case) set(name "python_${case}") set(module ${case}_test) set(_one_value_args PYTHONPATH) cmake_parse_arguments(fn "" "${_one_value_args}" "" ${ARGN}) set(_python_executable "${PYTHON_EXECUTABLE}") # Allow a different python environment to be used other than # the one Tomviz was built with. For example one that has testing # packages installed such as mock. if(DEFINED ENV{TOMVIZ_TEST_PYTHON_EXECUTABLE}) set(_python_executable "$ENV{TOMVIZ_TEST_PYTHON_EXECUTABLE}") endif() add_test( NAME ${name} WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}" COMMAND "${_python_executable}" -m unittest -v ${module} ) set(_pythonpath "${tomviz_python_binary_dir}") set(_pythonpath "${_pythonpath}${_separator}${fn_PYTHONPATH}") set(_pythonpath "$ENV{PYTHONPATH}${_separator}${_pythonpath}") if (WIN32) string(REPLACE "\\;" ";" _pythonpath "${_pythonpath}") string(REPLACE ";" "\\;" _pythonpath "${_pythonpath}") endif() set_property(TEST ${name} PROPERTY ENVIRONMENT "PYTHONPATH=${_pythonpath}" ) endfunction()
Allow python executable for tests to be specified
Allow python executable for tests to be specified
CMake
bsd-3-clause
OpenChemistry/tomviz,OpenChemistry/tomviz,mathturtle/tomviz,cryos/tomviz,cryos/tomviz,cryos/tomviz,OpenChemistry/tomviz,mathturtle/tomviz,OpenChemistry/tomviz,mathturtle/tomviz
cmake
## Code Before: if(WIN32) set(_separator "\\;") else() set(_separator ":") endif() function(add_python_test case) set(name "python_${case}") set(module ${case}_test) set(_one_value_args PYTHONPATH) cmake_parse_arguments(fn "" "${_one_value_args}" "" ${ARGN}) add_test( NAME ${name} WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}" COMMAND "${PYTHON_EXECUTABLE}" -m unittest -v ${module} ) set(_pythonpath "${tomviz_python_binary_dir}") set(_pythonpath "${_pythonpath}${_separator}${fn_PYTHONPATH}") set(_pythonpath "$ENV{PYTHONPATH}${_separator}${_pythonpath}") if (WIN32) string(REPLACE "\\;" ";" _pythonpath "${_pythonpath}") string(REPLACE ";" "\\;" _pythonpath "${_pythonpath}") endif() set_property(TEST ${name} PROPERTY ENVIRONMENT "PYTHONPATH=${_pythonpath}" ) endfunction() ## Instruction: Allow python executable for tests to be specified ## Code After: if(WIN32) set(_separator "\\;") else() set(_separator ":") endif() function(add_python_test case) set(name "python_${case}") set(module ${case}_test) set(_one_value_args PYTHONPATH) cmake_parse_arguments(fn "" "${_one_value_args}" "" ${ARGN}) set(_python_executable "${PYTHON_EXECUTABLE}") # Allow a different python environment to be used other than # the one Tomviz was built with. For example one that has testing # packages installed such as mock. if(DEFINED ENV{TOMVIZ_TEST_PYTHON_EXECUTABLE}) set(_python_executable "$ENV{TOMVIZ_TEST_PYTHON_EXECUTABLE}") endif() add_test( NAME ${name} WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}" COMMAND "${_python_executable}" -m unittest -v ${module} ) set(_pythonpath "${tomviz_python_binary_dir}") set(_pythonpath "${_pythonpath}${_separator}${fn_PYTHONPATH}") set(_pythonpath "$ENV{PYTHONPATH}${_separator}${_pythonpath}") if (WIN32) string(REPLACE "\\;" ";" _pythonpath "${_pythonpath}") string(REPLACE ";" "\\;" _pythonpath "${_pythonpath}") endif() set_property(TEST ${name} PROPERTY ENVIRONMENT "PYTHONPATH=${_pythonpath}" ) endfunction()
30296de7f20ddf637f4bd2d6fef2f617a93e26b6
app/views/tasks/_new.html.erb
app/views/tasks/_new.html.erb
<h3>Create a New Task:</h3> <%= form_for @task do |f| %> <div><%= f.hidden_field :group_id, value: @group.id %></div> <div>Task: <%= f.text_field :task %></div> <div>Date Due: <%= f.date_select :due_date %></div> <div><%= f.select :priority, [["high", 3], ["medium", 2], ["low", 1]]%></div> <%= f.submit "create" %> <% end %>
<h3>Create a New Task:</h3> <%= form_for @task do |f| %> <div><%= f.hidden_field :group_id, value: @group.id %></div> <div>Task: <%= f.text_field :task %></div> <div>Date Due: <%= f.date_field :due_date, class: "datepicker" %></div> <div><%= f.select :priority, [["high", 3], ["medium", 2], ["low", 1]], {}, class: "browser-default"%></div> <%= f.submit "create" %> <% end %>
Add fixed form for new task
Add fixed form for new task
HTML+ERB
mit
yisusans/taskmate,yisusans/taskmate,yisusans/taskmate,yisusans/taskmate
html+erb
## Code Before: <h3>Create a New Task:</h3> <%= form_for @task do |f| %> <div><%= f.hidden_field :group_id, value: @group.id %></div> <div>Task: <%= f.text_field :task %></div> <div>Date Due: <%= f.date_select :due_date %></div> <div><%= f.select :priority, [["high", 3], ["medium", 2], ["low", 1]]%></div> <%= f.submit "create" %> <% end %> ## Instruction: Add fixed form for new task ## Code After: <h3>Create a New Task:</h3> <%= form_for @task do |f| %> <div><%= f.hidden_field :group_id, value: @group.id %></div> <div>Task: <%= f.text_field :task %></div> <div>Date Due: <%= f.date_field :due_date, class: "datepicker" %></div> <div><%= f.select :priority, [["high", 3], ["medium", 2], ["low", 1]], {}, class: "browser-default"%></div> <%= f.submit "create" %> <% end %>
40ae74fef08a6463072951731a9173282d920aee
src/tasks/nuget-restore.js
src/tasks/nuget-restore.js
import gulp from 'gulp'; import nugetRestore from 'gulp-nuget-restore'; export default { /** * Task name * @type {String} */ name: 'sitecore:nuget-restore', /** * Task description * @type {String} */ description: 'Restore all nuget packages for solution.', /** * Task default configuration * @type {Object} */ config: { deps: [], }, /** * Task help options * @type {Object} */ help: { 'solution, -s': 'Solution filepath', }, /** * Task function * @param {object} config * @param {Function} end * @param {Function} error */ fn(config, end) { gulp.src(config.solution) .pipe(nugetRestore()) .on('end', end); }, };
import gulp from 'gulp'; import nugetRestore from 'gulp-nuget-restore'; export default { /** * Task name * @type {String} */ name: 'sitecore:nuget-restore', /** * Task description * @type {String} */ description: 'Restore all nuget packages for solution.', /** * Task default configuration * @type {Object} */ config: { deps: [], }, /** * Task help options * @type {Object} */ help: { 'solution, -s': 'Solution file path', }, /** * Task function * @param {object} config * @param {Function} end * @param {Function} error */ fn(config, end, error) { if (!config.solution) { error('A solution file path was not set.'); return; } gulp.src(config.solution) .pipe(nugetRestore()) .on('end', end); }, };
Add error handling to nuget restore task
Add error handling to nuget restore task
JavaScript
mit
jscarmona/gulp-ignite-sitecore
javascript
## Code Before: import gulp from 'gulp'; import nugetRestore from 'gulp-nuget-restore'; export default { /** * Task name * @type {String} */ name: 'sitecore:nuget-restore', /** * Task description * @type {String} */ description: 'Restore all nuget packages for solution.', /** * Task default configuration * @type {Object} */ config: { deps: [], }, /** * Task help options * @type {Object} */ help: { 'solution, -s': 'Solution filepath', }, /** * Task function * @param {object} config * @param {Function} end * @param {Function} error */ fn(config, end) { gulp.src(config.solution) .pipe(nugetRestore()) .on('end', end); }, }; ## Instruction: Add error handling to nuget restore task ## Code After: import gulp from 'gulp'; import nugetRestore from 'gulp-nuget-restore'; export default { /** * Task name * @type {String} */ name: 'sitecore:nuget-restore', /** * Task description * @type {String} */ description: 'Restore all nuget packages for solution.', /** * Task default configuration * @type {Object} */ config: { deps: [], }, /** * Task help options * @type {Object} */ help: { 'solution, -s': 'Solution file path', }, /** * Task function * @param {object} config * @param {Function} end * @param {Function} error */ fn(config, end, error) { if (!config.solution) { error('A solution file path was not set.'); return; } gulp.src(config.solution) .pipe(nugetRestore()) .on('end', end); }, };
10b38ef62b87be022dda65ec03cc6fd60a12cac7
resources/tev.desktop
resources/tev.desktop
[Desktop Entry] Type=Application Encoding=UTF-8 Name=tev Comment=High dynamic range (HDR) image comparison tool for graphics people. Supports primarily OpenEXR files. Exec=tev Icon=tev.png Terminal=false
[Desktop Entry] Type=Application Encoding=UTF-8 Name=tev Comment=High dynamic range (HDR) image comparison tool for graphics people. Supports primarily OpenEXR files. Exec=tev Icon=tev.png Terminal=false MimeType=image/*;
Support all image MIME types
Support all image MIME types
desktop
bsd-3-clause
Tom94/tev,Tom94/tev,Tom94/tev,Tom94/tev
desktop
## Code Before: [Desktop Entry] Type=Application Encoding=UTF-8 Name=tev Comment=High dynamic range (HDR) image comparison tool for graphics people. Supports primarily OpenEXR files. Exec=tev Icon=tev.png Terminal=false ## Instruction: Support all image MIME types ## Code After: [Desktop Entry] Type=Application Encoding=UTF-8 Name=tev Comment=High dynamic range (HDR) image comparison tool for graphics people. Supports primarily OpenEXR files. Exec=tev Icon=tev.png Terminal=false MimeType=image/*;
81cd197e95e89dd37797c489774f34496ecea259
server/pushlanding.py
server/pushlanding.py
import logging import os from django.http import HttpResponse, Http404 from django.views.decorators.csrf import csrf_exempt from twilio.rest import TwilioRestClient logger = logging.getLogger('django') @csrf_exempt def handle(request): if (request.method != 'POST'): raise Http404 return HttpResponse("Hello, world. You're at the push page.") def testSms(request): # Your Account Sid and Auth Token from twilio.com/user/account account_sid = os.environ['TWILIO_SID'] auth_token = os.environ['TWILIO_AUTH_TOKEN'] client = TwilioRestClient(account_sid, auth_token) message = client.messages.create( body="Jenny please?! I love you <3", to="+19172679225", # Replace with your phone number from_=os.environ['TWILIO_PHONE'] ) logger.info(message.sid) return HttpResponse(message.sid)
import logging import os import json from django.http import HttpResponse, Http404 from django.views.decorators.csrf import csrf_exempt from twilio.rest import TwilioRestClient logger = logging.getLogger('django') @csrf_exempt def handle(request): if (request.method != 'POST'): raise Http404 logger.info("Received a push notification") rawCheckin = request.POST['checkin'] checkin = json.loads(rawCheckin) logger.info(rawCheckin) return HttpResponse("Hello, world. You're at the push page.") def testSms(request): # Your Account Sid and Auth Token from twilio.com/user/account account_sid = os.environ['TWILIO_SID'] auth_token = os.environ['TWILIO_AUTH_TOKEN'] client = TwilioRestClient(account_sid, auth_token) message = client.messages.create( body="Jenny please?! I love you <3", to="+19172679225", # Replace with your phone number from_=os.environ['TWILIO_PHONE'] ) logger.info(message.sid) return HttpResponse(message.sid)
Add debug logging for push landing
Add debug logging for push landing
Python
mit
zackzachariah/scavenger,zackzachariah/scavenger
python
## Code Before: import logging import os from django.http import HttpResponse, Http404 from django.views.decorators.csrf import csrf_exempt from twilio.rest import TwilioRestClient logger = logging.getLogger('django') @csrf_exempt def handle(request): if (request.method != 'POST'): raise Http404 return HttpResponse("Hello, world. You're at the push page.") def testSms(request): # Your Account Sid and Auth Token from twilio.com/user/account account_sid = os.environ['TWILIO_SID'] auth_token = os.environ['TWILIO_AUTH_TOKEN'] client = TwilioRestClient(account_sid, auth_token) message = client.messages.create( body="Jenny please?! I love you <3", to="+19172679225", # Replace with your phone number from_=os.environ['TWILIO_PHONE'] ) logger.info(message.sid) return HttpResponse(message.sid) ## Instruction: Add debug logging for push landing ## Code After: import logging import os import json from django.http import HttpResponse, Http404 from django.views.decorators.csrf import csrf_exempt from twilio.rest import TwilioRestClient logger = logging.getLogger('django') @csrf_exempt def handle(request): if (request.method != 'POST'): raise Http404 logger.info("Received a push notification") rawCheckin = request.POST['checkin'] checkin = json.loads(rawCheckin) logger.info(rawCheckin) return HttpResponse("Hello, world. You're at the push page.") def testSms(request): # Your Account Sid and Auth Token from twilio.com/user/account account_sid = os.environ['TWILIO_SID'] auth_token = os.environ['TWILIO_AUTH_TOKEN'] client = TwilioRestClient(account_sid, auth_token) message = client.messages.create( body="Jenny please?! I love you <3", to="+19172679225", # Replace with your phone number from_=os.environ['TWILIO_PHONE'] ) logger.info(message.sid) return HttpResponse(message.sid)
690e2989a0abc0820fbf5c3e26e67cd08c76e37b
src/discordcr/mappings.cr
src/discordcr/mappings.cr
require "json" module Discordcr module Mappings end end
require "json" module Discordcr module REST # A response to the Get Gateway REST API call. class GatewayResponse JSON.mapping ( url: String ) end end end
Create a mapping for the gateway request response
Create a mapping for the gateway request response
Crystal
mit
meew0/discordcr
crystal
## Code Before: require "json" module Discordcr module Mappings end end ## Instruction: Create a mapping for the gateway request response ## Code After: require "json" module Discordcr module REST # A response to the Get Gateway REST API call. class GatewayResponse JSON.mapping ( url: String ) end end end
eb8f9f780f6574d9079fa0be036b0ae4cc35a075
README.md
README.md
Open Threat Exchange is an open community that allows participants to learn about the latest threats, research indicators of compromise observed in their environments, share threats they have identified, and automatically update their security infrastructure with the latest indicators to defend their environment. OTX Direct Connect agents provide a way to automatically update your security infrastructure with pulses you have subscribed to from with Open Threat Exchange. By using Direct Connect, the indicators contained within the pulses you have subscribed to can be downloaded and made locally available for other applications such as Intrusion Detection Systems, Firewalls, and other security-focused applications. # OTX DirectConnect Python SDK OTX DirectConnect provides a mechanism to automatically pull indicators of compromise from the Open Threat Exchange portal into your environment. The DirectConnect API provides access to all _Pulses_ that you have subscribed to in Open Threat Exchange (https://otx.alienvault.com). 1. Clone this repo 2. Run 'python setup.py install' 3. Integrate into your codebase (see Python Notebook example below) ## Installation and Python Notebook Usage 1. Clone this repo 2. Install pandas ``` pip install pandas``` 2. Install python notebook (http://jupyter.readthedocs.org/en/latest/install.html) ``` pip install jupyter ``` 3. Run notebook ```jupyter notebook howto_use_python_otx_api.ipynb```
Open Threat Exchange is an open community that allows participants to learn about the latest threats, research indicators of compromise observed in their environments, share threats they have identified, and automatically update their security infrastructure with the latest indicators to defend their environment. OTX Direct Connect agents provide a way to automatically update your security infrastructure with pulses you have subscribed to from with Open Threat Exchange. By using Direct Connect, the indicators contained within the pulses you have subscribed to can be downloaded and made locally available for other applications such as Intrusion Detection Systems, Firewalls, and other security-focused applications. # OTX DirectConnect Python SDK OTX DirectConnect provides a mechanism to automatically pull indicators of compromise from the Open Threat Exchange portal into your environment. The DirectConnect API provides access to all _Pulses_ that you have subscribed to in Open Threat Exchange (https://otx.alienvault.com). 1. Clone this repo 2. Run (from the root directory) ``` pip install . ``` or ``` python setup.py install ``` 3. Integrate into your codebase (see Python Notebook example below) ## Installation and Python Notebook Usage 1. Clone this repo 2. Install pandas ``` pip install pandas``` 2. Install python notebook (http://jupyter.readthedocs.org/en/latest/install.html) ``` pip install jupyter ``` 3. Run notebook ```jupyter notebook howto_use_python_otx_api.ipynb```
Update readme for pip install instructions.
Update readme for pip install instructions.
Markdown
apache-2.0
gcrahay/otx_misp,AlienVault-Labs/OTX-Python-SDK
markdown
## Code Before: Open Threat Exchange is an open community that allows participants to learn about the latest threats, research indicators of compromise observed in their environments, share threats they have identified, and automatically update their security infrastructure with the latest indicators to defend their environment. OTX Direct Connect agents provide a way to automatically update your security infrastructure with pulses you have subscribed to from with Open Threat Exchange. By using Direct Connect, the indicators contained within the pulses you have subscribed to can be downloaded and made locally available for other applications such as Intrusion Detection Systems, Firewalls, and other security-focused applications. # OTX DirectConnect Python SDK OTX DirectConnect provides a mechanism to automatically pull indicators of compromise from the Open Threat Exchange portal into your environment. The DirectConnect API provides access to all _Pulses_ that you have subscribed to in Open Threat Exchange (https://otx.alienvault.com). 1. Clone this repo 2. Run 'python setup.py install' 3. Integrate into your codebase (see Python Notebook example below) ## Installation and Python Notebook Usage 1. Clone this repo 2. Install pandas ``` pip install pandas``` 2. Install python notebook (http://jupyter.readthedocs.org/en/latest/install.html) ``` pip install jupyter ``` 3. Run notebook ```jupyter notebook howto_use_python_otx_api.ipynb``` ## Instruction: Update readme for pip install instructions. ## Code After: Open Threat Exchange is an open community that allows participants to learn about the latest threats, research indicators of compromise observed in their environments, share threats they have identified, and automatically update their security infrastructure with the latest indicators to defend their environment. OTX Direct Connect agents provide a way to automatically update your security infrastructure with pulses you have subscribed to from with Open Threat Exchange. By using Direct Connect, the indicators contained within the pulses you have subscribed to can be downloaded and made locally available for other applications such as Intrusion Detection Systems, Firewalls, and other security-focused applications. # OTX DirectConnect Python SDK OTX DirectConnect provides a mechanism to automatically pull indicators of compromise from the Open Threat Exchange portal into your environment. The DirectConnect API provides access to all _Pulses_ that you have subscribed to in Open Threat Exchange (https://otx.alienvault.com). 1. Clone this repo 2. Run (from the root directory) ``` pip install . ``` or ``` python setup.py install ``` 3. Integrate into your codebase (see Python Notebook example below) ## Installation and Python Notebook Usage 1. Clone this repo 2. Install pandas ``` pip install pandas``` 2. Install python notebook (http://jupyter.readthedocs.org/en/latest/install.html) ``` pip install jupyter ``` 3. Run notebook ```jupyter notebook howto_use_python_otx_api.ipynb```
d153a2533060af060059323483d3e56ce8441a2f
bin/run.js
bin/run.js
'use strict' const path = require('path') const fs = require('fs') const SCRIPT_API_PATH = './node_modules/runjs/lib/script.js' try { fs.accessSync(path.resolve(SCRIPT_API_PATH)) } catch (error) { console.error('RunJS not found. Do "npm install runjs" or "yarn add runjs" first') process.exit(1) } const script = require(path.resolve(SCRIPT_API_PATH)) script.main()
'use strict' const path = require('path') const fs = require('fs') const SCRIPT_API_PATH = './node_modules/runjs/lib/script.js' try { fs.accessSync(path.resolve(SCRIPT_API_PATH)) } catch (error) { console.error('RunJS not found. Do "npm install runjs --save-dev" or "yarn add --dev runjs" first.') process.exit(1) } const script = require(path.resolve(SCRIPT_API_PATH)) script.main()
Make better message for RunJS not found
Make better message for RunJS not found
JavaScript
mit
pawelgalazka/runjs,pawelgalazka/runjs
javascript
## Code Before: 'use strict' const path = require('path') const fs = require('fs') const SCRIPT_API_PATH = './node_modules/runjs/lib/script.js' try { fs.accessSync(path.resolve(SCRIPT_API_PATH)) } catch (error) { console.error('RunJS not found. Do "npm install runjs" or "yarn add runjs" first') process.exit(1) } const script = require(path.resolve(SCRIPT_API_PATH)) script.main() ## Instruction: Make better message for RunJS not found ## Code After: 'use strict' const path = require('path') const fs = require('fs') const SCRIPT_API_PATH = './node_modules/runjs/lib/script.js' try { fs.accessSync(path.resolve(SCRIPT_API_PATH)) } catch (error) { console.error('RunJS not found. Do "npm install runjs --save-dev" or "yarn add --dev runjs" first.') process.exit(1) } const script = require(path.resolve(SCRIPT_API_PATH)) script.main()
0cdf7de630dbb0e6a99033e71e1201167a4d809e
lib/rom/sql/mapper_compiler.rb
lib/rom/sql/mapper_compiler.rb
require 'rom/mapper_compiler' module ROM module SQL class MapperCompiler < ROM::MapperCompiler def visit_attribute(node) name, _, meta = node if meta[:wrapped] [name, from: self.alias] else [name] end end end end end
require 'rom/mapper_compiler' module ROM module SQL class MapperCompiler < ROM::MapperCompiler def visit_attribute(node) name, _, meta_options = node if meta_options[:wrapped] [name, from: meta_options[:alias]] else [name] end end end end end
Fix mapper compiler reading attribute alias
Fix mapper compiler reading attribute alias After merging #324 to master, `rom` test suite (once `rom-sql` dependency was updated to the new master ref) was failing due to a change introduced by error on reading the attribute alias from the mapper compiler.
Ruby
mit
rom-rb/rom-sql,rom-rb/rom-sql
ruby
## Code Before: require 'rom/mapper_compiler' module ROM module SQL class MapperCompiler < ROM::MapperCompiler def visit_attribute(node) name, _, meta = node if meta[:wrapped] [name, from: self.alias] else [name] end end end end end ## Instruction: Fix mapper compiler reading attribute alias After merging #324 to master, `rom` test suite (once `rom-sql` dependency was updated to the new master ref) was failing due to a change introduced by error on reading the attribute alias from the mapper compiler. ## Code After: require 'rom/mapper_compiler' module ROM module SQL class MapperCompiler < ROM::MapperCompiler def visit_attribute(node) name, _, meta_options = node if meta_options[:wrapped] [name, from: meta_options[:alias]] else [name] end end end end end
d7b8bfb393b3f084f0bc16eb3c8a34bb7a43fdb0
l10n_ch_payment_slip/company_view.xml
l10n_ch_payment_slip/company_view.xml
<?xml version="1.0"?> <openerp> <data> <record model="ir.ui.view" id="company_form_view"> <field name="name">res.company.form.inherit.bvr</field> <field name="model">res.company</field> <field name="inherit_id" ref="base.view_company_form"/> <field name="arch" type="xml"> <xpath expr="//notebook" position="inside"> <page string="BVR Data"> <group colspan="4"> <field name="bvr_delta_horz"/> <field name="bvr_delta_vert"/> <field name="bvr_scan_line_horz"/> <field name="bvr_scan_line_vert"/> <field name="bvr_scan_line_font_size"/> <field name="bvr_scan_line_letter_spacing"/> <field name="bvr_add_horz"/> <field name="bvr_add_vert"/> <field name="bvr_background"/> </group> </page> </xpath> </field> </record> </data> </openerp>
<?xml version="1.0"?> <openerp> <data> <record model="ir.ui.view" id="company_form_view"> <field name="name">res.company.form.inherit.bvr</field> <field name="model">res.company</field> <field name="inherit_id" ref="base.view_company_form"/> <field name="arch" type="xml"> <xpath expr="//notebook" position="inside"> <page string="BVR Data"> <group colspan="4"> <field name="bvr_delta_horz"/> <field name="bvr_delta_vert"/> <field name="bvr_scan_line_horz"/> <field name="bvr_scan_line_vert"/> <field name="bvr_scan_line_font_size"/> <field name="bvr_scan_line_letter_spacing"/> <field name="bvr_add_horz"/> <field name="bvr_add_vert"/> <field name="bvr_background"/> <field name="merge_mode"/> </group> </page> </xpath> </field> </record> </data> </openerp>
Add merge option in views
Add merge option in views
XML
agpl-3.0
BT-jmichaud/l10n-switzerland,brain-tec/l10n-switzerland,brain-tec/l10n-switzerland,brain-tec/l10n-switzerland
xml
## Code Before: <?xml version="1.0"?> <openerp> <data> <record model="ir.ui.view" id="company_form_view"> <field name="name">res.company.form.inherit.bvr</field> <field name="model">res.company</field> <field name="inherit_id" ref="base.view_company_form"/> <field name="arch" type="xml"> <xpath expr="//notebook" position="inside"> <page string="BVR Data"> <group colspan="4"> <field name="bvr_delta_horz"/> <field name="bvr_delta_vert"/> <field name="bvr_scan_line_horz"/> <field name="bvr_scan_line_vert"/> <field name="bvr_scan_line_font_size"/> <field name="bvr_scan_line_letter_spacing"/> <field name="bvr_add_horz"/> <field name="bvr_add_vert"/> <field name="bvr_background"/> </group> </page> </xpath> </field> </record> </data> </openerp> ## Instruction: Add merge option in views ## Code After: <?xml version="1.0"?> <openerp> <data> <record model="ir.ui.view" id="company_form_view"> <field name="name">res.company.form.inherit.bvr</field> <field name="model">res.company</field> <field name="inherit_id" ref="base.view_company_form"/> <field name="arch" type="xml"> <xpath expr="//notebook" position="inside"> <page string="BVR Data"> <group colspan="4"> <field name="bvr_delta_horz"/> <field name="bvr_delta_vert"/> <field name="bvr_scan_line_horz"/> <field name="bvr_scan_line_vert"/> <field name="bvr_scan_line_font_size"/> <field name="bvr_scan_line_letter_spacing"/> <field name="bvr_add_horz"/> <field name="bvr_add_vert"/> <field name="bvr_background"/> <field name="merge_mode"/> </group> </page> </xpath> </field> </record> </data> </openerp>
864f2dceff4580620d21630107000795d5382a9c
files/ca.conf.erb
files/ca.conf.erb
certificate-authority: { certificate-status: { client-whitelist: [ <%= @dashboard_fqdn -%> <% @cert_fqdn_names.each do |fqdn_name| -%> <%= fqdn_name.chomp %> <% end -%> ] } }
certificate-authority: { certificate-status: { client-whitelist: [ <%= @console_client_certname %>, example.cert.com ] } }
Correct template file to be dropped in puppet_enterprise module template dir
Correct template file to be dropped in puppet_enterprise module template dir
HTML+ERB
apache-2.0
DBMoUK/certwhitelist,DBMoUK/certwhitelist
html+erb
## Code Before: certificate-authority: { certificate-status: { client-whitelist: [ <%= @dashboard_fqdn -%> <% @cert_fqdn_names.each do |fqdn_name| -%> <%= fqdn_name.chomp %> <% end -%> ] } } ## Instruction: Correct template file to be dropped in puppet_enterprise module template dir ## Code After: certificate-authority: { certificate-status: { client-whitelist: [ <%= @console_client_certname %>, example.cert.com ] } }
5047d40d2cb428cd3c872f1ef91464e40b9c8458
README.md
README.md
- an ever-growing list of things I want to save for later reference - all generally related to graphic design ### why - it's much more readable and organized in the format of a website as opposed to a raw markdown doc - sharing is cool ### to do - ~~Add deploy task~~ - Add .travis.yml
- an ever-growing list of things I'm saving for later reference - all generally related to graphic design ### why - it's much more readable and organized in the format of a website as opposed to a raw markdown doc - sharing is cool # summary - built with [Jekyll](https://jekyllrb.com/) - automated with [Gulp](http://gulpjs.com/) ([see gulpfile](https://github.com/jckfa/silly.graphics/blob/master/gulpfile.js)) - styled with [cssnext](http://cssnext.io/) - run on [Nginx](http://nginx.org/) ([see configuration](https://github.com/jckfa/nginx-config/blob/master/sites-available/silly.graphics)) - hosted on [Digital Ocean](https://www.digitalocean.com/) - Encrypted with [Let's Encrypt](https://letsencrypt.org/) ### deploying Instead of pushing source files to my server and building there, I build on my development machine and push from _site. And I have a gulp task (deploy) to automate this. ### compatibility - show/hide functionality on mobile requires [classList](http://caniuse.com/#search=classlist) - relies on [vh](http://caniuse.com/#search=vh) and [flexbox](http://caniuse.com/#search=flex) for the text bordering the window - only uses [woff](http://caniuse.com/#search=woff) and [woff2](http://caniuse.com/#search=woff2) for webfonts - exclusively [TLSv1.2](https://github.com/jckfa/nginx-config/blob/master/conf.d/directive-only/tls.conf) (>IE10, Android 4.3)
Add more stuff to readme
Add more stuff to readme
Markdown
isc
jckfa/silly.graphics,jckfa/silly.graphics,jckfa/silly.graphics
markdown
## Code Before: - an ever-growing list of things I want to save for later reference - all generally related to graphic design ### why - it's much more readable and organized in the format of a website as opposed to a raw markdown doc - sharing is cool ### to do - ~~Add deploy task~~ - Add .travis.yml ## Instruction: Add more stuff to readme ## Code After: - an ever-growing list of things I'm saving for later reference - all generally related to graphic design ### why - it's much more readable and organized in the format of a website as opposed to a raw markdown doc - sharing is cool # summary - built with [Jekyll](https://jekyllrb.com/) - automated with [Gulp](http://gulpjs.com/) ([see gulpfile](https://github.com/jckfa/silly.graphics/blob/master/gulpfile.js)) - styled with [cssnext](http://cssnext.io/) - run on [Nginx](http://nginx.org/) ([see configuration](https://github.com/jckfa/nginx-config/blob/master/sites-available/silly.graphics)) - hosted on [Digital Ocean](https://www.digitalocean.com/) - Encrypted with [Let's Encrypt](https://letsencrypt.org/) ### deploying Instead of pushing source files to my server and building there, I build on my development machine and push from _site. And I have a gulp task (deploy) to automate this. ### compatibility - show/hide functionality on mobile requires [classList](http://caniuse.com/#search=classlist) - relies on [vh](http://caniuse.com/#search=vh) and [flexbox](http://caniuse.com/#search=flex) for the text bordering the window - only uses [woff](http://caniuse.com/#search=woff) and [woff2](http://caniuse.com/#search=woff2) for webfonts - exclusively [TLSv1.2](https://github.com/jckfa/nginx-config/blob/master/conf.d/directive-only/tls.conf) (>IE10, Android 4.3)
702fabfc96b3b07efb4d5c30a6807031b803a551
kolibri/deployment/default/wsgi.py
kolibri/deployment/default/wsgi.py
import logging import os import time from django.core.wsgi import get_wsgi_application from django.db.utils import OperationalError os.environ.setdefault( "DJANGO_SETTINGS_MODULE", "kolibri.deployment.default.settings.base" ) logger = logging.getLogger(__name__) application = None tries_remaining = 6 interval = 10 while not application and tries_remaining: try: logger.info("Starting Kolibri") application = get_wsgi_application() except OperationalError: # an OperationalError happens when sqlite vacuum is being executed. the db is locked logger.info("DB busy. Trying again in {}s".format(interval)) tries_remaining -= 1 time.sleep(interval) application = get_wsgi_application() # try again one last time if not application: logger.error("Could not start Kolibri")
import logging import os import time from django.core.wsgi import get_wsgi_application from django.db.utils import OperationalError import kolibri os.environ.setdefault( "DJANGO_SETTINGS_MODULE", "kolibri.deployment.default.settings.base" ) logger = logging.getLogger(__name__) application = None tries_remaining = 6 interval = 10 while not application and tries_remaining: try: logger.info("Starting Kolibri {version}".format(version=kolibri.__version__)) application = get_wsgi_application() except OperationalError: # an OperationalError happens when sqlite vacuum is being executed. the db is locked logger.info("DB busy. Trying again in {}s".format(interval)) tries_remaining -= 1 time.sleep(interval) application = get_wsgi_application() # try again one last time if not application: logger.error("Could not start Kolibri")
Add kolibri version to server start logging.
Add kolibri version to server start logging.
Python
mit
indirectlylit/kolibri,learningequality/kolibri,learningequality/kolibri,indirectlylit/kolibri,learningequality/kolibri,mrpau/kolibri,learningequality/kolibri,mrpau/kolibri,indirectlylit/kolibri,indirectlylit/kolibri,mrpau/kolibri,mrpau/kolibri
python
## Code Before: import logging import os import time from django.core.wsgi import get_wsgi_application from django.db.utils import OperationalError os.environ.setdefault( "DJANGO_SETTINGS_MODULE", "kolibri.deployment.default.settings.base" ) logger = logging.getLogger(__name__) application = None tries_remaining = 6 interval = 10 while not application and tries_remaining: try: logger.info("Starting Kolibri") application = get_wsgi_application() except OperationalError: # an OperationalError happens when sqlite vacuum is being executed. the db is locked logger.info("DB busy. Trying again in {}s".format(interval)) tries_remaining -= 1 time.sleep(interval) application = get_wsgi_application() # try again one last time if not application: logger.error("Could not start Kolibri") ## Instruction: Add kolibri version to server start logging. ## Code After: import logging import os import time from django.core.wsgi import get_wsgi_application from django.db.utils import OperationalError import kolibri os.environ.setdefault( "DJANGO_SETTINGS_MODULE", "kolibri.deployment.default.settings.base" ) logger = logging.getLogger(__name__) application = None tries_remaining = 6 interval = 10 while not application and tries_remaining: try: logger.info("Starting Kolibri {version}".format(version=kolibri.__version__)) application = get_wsgi_application() except OperationalError: # an OperationalError happens when sqlite vacuum is being executed. the db is locked logger.info("DB busy. Trying again in {}s".format(interval)) tries_remaining -= 1 time.sleep(interval) application = get_wsgi_application() # try again one last time if not application: logger.error("Could not start Kolibri")
940fc6d5658ee472346ae16e855fdaf8b906f0ae
lib/kosmos/web/app.rb
lib/kosmos/web/app.rb
require 'sinatra' module Kosmos module Web class App < Sinatra::Application set server: 'thin', connection: nil get '/' do send_file(File.join(settings.public_folder, 'index.html')) end get '/stream', provides: 'text/event-stream' do stream :keep_open do |out| settings.connection = out end end post '/' do Kosmos.configure do |config| config.output_method = Proc.new do |str| str.split("\n").each do |line| settings.connection << "data: #{line}\n\n" end end end kosmos_params = params[:params].split(' ') kosmos_command = %w(init install uninstall list).find do |command| command == params[:command] end UserInterface.send(kosmos_command, kosmos_params) 204 end end end end
require 'sinatra' module Kosmos module Web class App < Sinatra::Application set server: 'thin', connection: nil get '/' do send_file(File.join(settings.public_folder, 'index.html')) end get '/stream', provides: 'text/event-stream' do stream :keep_open do |out| settings.connection = out end end post '/' do Kosmos.configure do |config| config.output_method = Proc.new do |str| # Send to STDOUT puts str # And to the in-browser UI str.split("\n").each do |line| settings.connection << "data: #{line}\n\n" end end end kosmos_params = params[:params].split(' ') kosmos_command = %w(init install uninstall list).find do |command| command == params[:command] end UserInterface.send(kosmos_command, kosmos_params) 204 end end end end
Send Util.log to both STDOUT and UI when in server mode.
Send Util.log to both STDOUT and UI when in server mode.
Ruby
mit
kmc/kmc,kmc/kmc,kmc/kmc
ruby
## Code Before: require 'sinatra' module Kosmos module Web class App < Sinatra::Application set server: 'thin', connection: nil get '/' do send_file(File.join(settings.public_folder, 'index.html')) end get '/stream', provides: 'text/event-stream' do stream :keep_open do |out| settings.connection = out end end post '/' do Kosmos.configure do |config| config.output_method = Proc.new do |str| str.split("\n").each do |line| settings.connection << "data: #{line}\n\n" end end end kosmos_params = params[:params].split(' ') kosmos_command = %w(init install uninstall list).find do |command| command == params[:command] end UserInterface.send(kosmos_command, kosmos_params) 204 end end end end ## Instruction: Send Util.log to both STDOUT and UI when in server mode. ## Code After: require 'sinatra' module Kosmos module Web class App < Sinatra::Application set server: 'thin', connection: nil get '/' do send_file(File.join(settings.public_folder, 'index.html')) end get '/stream', provides: 'text/event-stream' do stream :keep_open do |out| settings.connection = out end end post '/' do Kosmos.configure do |config| config.output_method = Proc.new do |str| # Send to STDOUT puts str # And to the in-browser UI str.split("\n").each do |line| settings.connection << "data: #{line}\n\n" end end end kosmos_params = params[:params].split(' ') kosmos_command = %w(init install uninstall list).find do |command| command == params[:command] end UserInterface.send(kosmos_command, kosmos_params) 204 end end end end
05f26a20d81d3fe2add61016a4279ac988a5dc9b
spec/classes/hitch_spec.rb
spec/classes/hitch_spec.rb
require 'spec_helper' describe 'hitch' do on_supported_os.each do |os, os_facts| context "on #{os}" do let(:facts) { os_facts } it { is_expected.to compile } end end end
require 'spec_helper' describe 'hitch' do on_supported_os.each do |os, os_facts| context "on #{os}" do let(:facts) { os_facts } context 'defaults' do it { is_expected.to compile } it { is_expected.to contain_exec('hitch::config generate dhparams') } it { is_expected.to contain_file('/etc/hitch/dhparams.pem') .without_content } end context 'with dhparams_content' do let(:params) { { dhparams_content: 'BEGIN DH PARAMETERS' } } it { is_expected.not_to contain_exec('hitch::config generate dhparams') } it { is_expected.to contain_file('/etc/hitch/dhparams.pem') .with_content(%r{BEGIN DH PARAMETERS}) } end end end end
Add test for missing dhparams content
Add test for missing dhparams content This should fail until #6 is fixed.
Ruby
apache-2.0
ssm/ssm-hitch,ssm/ssm-hitch
ruby
## Code Before: require 'spec_helper' describe 'hitch' do on_supported_os.each do |os, os_facts| context "on #{os}" do let(:facts) { os_facts } it { is_expected.to compile } end end end ## Instruction: Add test for missing dhparams content This should fail until #6 is fixed. ## Code After: require 'spec_helper' describe 'hitch' do on_supported_os.each do |os, os_facts| context "on #{os}" do let(:facts) { os_facts } context 'defaults' do it { is_expected.to compile } it { is_expected.to contain_exec('hitch::config generate dhparams') } it { is_expected.to contain_file('/etc/hitch/dhparams.pem') .without_content } end context 'with dhparams_content' do let(:params) { { dhparams_content: 'BEGIN DH PARAMETERS' } } it { is_expected.not_to contain_exec('hitch::config generate dhparams') } it { is_expected.to contain_file('/etc/hitch/dhparams.pem') .with_content(%r{BEGIN DH PARAMETERS}) } end end end end
afdd154d09974c130aabc06046032f7d9fefdb5f
benchmark/Results.md
benchmark/Results.md
Benchmark Results ================= - Benchmark Mode Cnt Score Error Units - EventBenchmark.testMethodHandleSpeed thrpt 20 31023302.051 ± 113011.697 ops/s - EventBenchmark.testReflectionSpeed thrpt 20 33397973.784 ± 57074.901 ops/s
Benchmark Results ================= | Benchmark | Mode | Cnt | Score | Error | Units | | --------------------------------------|-------|-----|--------------|--------------|-------| | EventBenchmark.testMethodHandleSpeed | thrpt | 20 | 31023302.051 | ± 113011.697 | ops/s | | EventBenchmark.testReflectionSpeed | thrpt | 20 | 33397973.784 | ± 57074.901 | ops/s |
Use pretty table for benchmark
Use pretty table for benchmark
Markdown
mit
Techcable/Event4J
markdown
## Code Before: Benchmark Results ================= - Benchmark Mode Cnt Score Error Units - EventBenchmark.testMethodHandleSpeed thrpt 20 31023302.051 ± 113011.697 ops/s - EventBenchmark.testReflectionSpeed thrpt 20 33397973.784 ± 57074.901 ops/s ## Instruction: Use pretty table for benchmark ## Code After: Benchmark Results ================= | Benchmark | Mode | Cnt | Score | Error | Units | | --------------------------------------|-------|-----|--------------|--------------|-------| | EventBenchmark.testMethodHandleSpeed | thrpt | 20 | 31023302.051 | ± 113011.697 | ops/s | | EventBenchmark.testReflectionSpeed | thrpt | 20 | 33397973.784 | ± 57074.901 | ops/s |
819843bae496b16fcb0dff8f26211dd7aaa47004
contents/includes/buttons/buttons-types-ghost.html
contents/includes/buttons/buttons-types-ghost.html
<div class="cf nr2 nl2 mb3"> <div class="fl-ns w-50-ns ph2"> <p> Ghost buttons are used with icons only. </p> <p> Add a background if there is not enough cue that the button is actionable. </p> </div> <div class="fl-ns w-50-ns ph2"> <img class="db w-100-ns outline outline--black-01 outline-offset--1" src="../images/buttons/button-ghost.svg" alt=""> </div> </div>
<div class="cf nr2 nl2 mb3"> <div class="fl-ns w-50-ns ph2"> <p> Ghost buttons include an icon with no accompanying text. </p> <p> Add a background if there is not enough cue that the button is actionable. </p> </div> <div class="fl-ns w-50-ns ph2"> <img class="db w-100-ns outline outline--black-01 outline-offset--1" src="../images/buttons/button-ghost.svg" alt=""> </div> </div>
Update Ghost to: Ghost buttons include an icon with no accompanying text. Add a background if there is not enough cue that the button is actionable.
[edit] Update Ghost to: Ghost buttons include an icon with no accompanying text. Add a background if there is not enough cue that the button is actionable. according to brjones’ notes
HTML
mpl-2.0
FirefoxUX/photon,bwinton/photon,FirefoxUX/photon,FirefoxUX/photon,bwinton/photon,bwinton/photon,FirefoxUX/photon
html
## Code Before: <div class="cf nr2 nl2 mb3"> <div class="fl-ns w-50-ns ph2"> <p> Ghost buttons are used with icons only. </p> <p> Add a background if there is not enough cue that the button is actionable. </p> </div> <div class="fl-ns w-50-ns ph2"> <img class="db w-100-ns outline outline--black-01 outline-offset--1" src="../images/buttons/button-ghost.svg" alt=""> </div> </div> ## Instruction: [edit] Update Ghost to: Ghost buttons include an icon with no accompanying text. Add a background if there is not enough cue that the button is actionable. according to brjones’ notes ## Code After: <div class="cf nr2 nl2 mb3"> <div class="fl-ns w-50-ns ph2"> <p> Ghost buttons include an icon with no accompanying text. </p> <p> Add a background if there is not enough cue that the button is actionable. </p> </div> <div class="fl-ns w-50-ns ph2"> <img class="db w-100-ns outline outline--black-01 outline-offset--1" src="../images/buttons/button-ghost.svg" alt=""> </div> </div>
9d8b6087969a87768a0922f4a9ce049efb66b7a2
webui_config.js
webui_config.js
window.config = { pppCoreUrl: 'http://core.frontend.askplatyp.us/', pppLoggerUrl: 'http://logger.frontend.askplatyp.us/' };
window.config = { pppCoreUrl: '//core.frontend.askplatyp.us/', pppLoggerUrl: '//logger.frontend.askplatyp.us/' };
Use relative protocol (http/https) for querying the API.
Use relative protocol (http/https) for querying the API.
JavaScript
cc0-1.0
ProjetPP/Deployment,ProjetPP/Deployment,ProjetPP/Deployment
javascript
## Code Before: window.config = { pppCoreUrl: 'http://core.frontend.askplatyp.us/', pppLoggerUrl: 'http://logger.frontend.askplatyp.us/' }; ## Instruction: Use relative protocol (http/https) for querying the API. ## Code After: window.config = { pppCoreUrl: '//core.frontend.askplatyp.us/', pppLoggerUrl: '//logger.frontend.askplatyp.us/' };
81ab68681e6c0836238fb1b4f28e0c405e126773
recipes-debian/rdesktop/rdesktop_debian.bb
recipes-debian/rdesktop/rdesktop_debian.bb
DESCRIPTION = "Rdesktop rdp client for X" HOMEPAGE = "http://www.rdesktop.org" LICENSE = "GPLv3" LIC_FILES_CHKSUM = "file://COPYING;md5=f27defe1e96c2e1ecd4e0c9be8967949" PR = "r0" inherit debian-package DEPENDS = "virtual/libx11 openssl libgssglue alsa-lib" inherit autotools-brokensep pkgconfig # Currently, we have no recipe for pcsclite, so temporary disable smartcard. EXTRA_OECONF = "--with-openssl=${STAGING_LIBDIR}/.. \ --with-ipv6 --with-sound=alsa \ --disable-smartcard \ " INSANE_SKIP_rdesktop_forcevariable = " already-stripped"
DESCRIPTION = "Rdesktop rdp client for X" HOMEPAGE = "http://www.rdesktop.org" LICENSE = "GPLv3" LIC_FILES_CHKSUM = "file://COPYING;md5=f27defe1e96c2e1ecd4e0c9be8967949" PR = "r1" inherit debian-package DEPENDS = "virtual/libx11 openssl libgssglue alsa-lib" inherit autotools-brokensep pkgconfig # Currently, we have no recipe for pcsclite, so temporary disable smartcard. EXTRA_OECONF = "--with-openssl=${STAGING_LIBDIR}/.. \ --with-ipv6 --with-sound=alsa \ --disable-smartcard \ " # We are cross compiling, set 'fu_cv_sys_stat_statvfs64=cross' # to prevent sys_stat_statvfs64 is 'yes' by running test code in configure, # because it make error when build for qemuppc target: # | disk.c:726:18: error: storage size of 'stat_fs' isn't known # | struct STATFS_T stat_fs; CACHED_CONFIGUREVARS += "fu_cv_sys_stat_statvfs64=cross" INSANE_SKIP_rdesktop_forcevariable = " already-stripped"
Fix error when build for qemuppc
rdesktop: Fix error when build for qemuppc We are cross compiling, set 'fu_cv_sys_stat_statvfs64=cross' to prevent sys_stat_statvfs64 is 'yes' by running test code in configure, because it make error when build for qemuppc target: | disk.c:726:18: error: storage size of 'stat_fs' isn't known | struct STATFS_T stat_fs; Signed-off-by: Trung Do <64081e6b7eb2b24289b8a4e51cef9836ddf28af2@toshiba-tsdv.com>
BitBake
mit
nghiaht-tsdv/meta-debian,tienlee/meta-debian,tienlee/meta-debian,tienlee/meta-debian,nghiaht-tsdv/meta-debian,nghiaht-tsdv/meta-debian,meta-debian/meta-debian,nghiaht-tsdv/meta-debian,nghiaht-tsdv/meta-debian,nghiaht-tsdv/meta-debian,tienlee/meta-debian,tienlee/meta-debian,meta-debian/meta-debian,meta-debian/meta-debian,meta-debian/meta-debian,tienlee/meta-debian,meta-debian/meta-debian,meta-debian/meta-debian
bitbake
## Code Before: DESCRIPTION = "Rdesktop rdp client for X" HOMEPAGE = "http://www.rdesktop.org" LICENSE = "GPLv3" LIC_FILES_CHKSUM = "file://COPYING;md5=f27defe1e96c2e1ecd4e0c9be8967949" PR = "r0" inherit debian-package DEPENDS = "virtual/libx11 openssl libgssglue alsa-lib" inherit autotools-brokensep pkgconfig # Currently, we have no recipe for pcsclite, so temporary disable smartcard. EXTRA_OECONF = "--with-openssl=${STAGING_LIBDIR}/.. \ --with-ipv6 --with-sound=alsa \ --disable-smartcard \ " INSANE_SKIP_rdesktop_forcevariable = " already-stripped" ## Instruction: rdesktop: Fix error when build for qemuppc We are cross compiling, set 'fu_cv_sys_stat_statvfs64=cross' to prevent sys_stat_statvfs64 is 'yes' by running test code in configure, because it make error when build for qemuppc target: | disk.c:726:18: error: storage size of 'stat_fs' isn't known | struct STATFS_T stat_fs; Signed-off-by: Trung Do <64081e6b7eb2b24289b8a4e51cef9836ddf28af2@toshiba-tsdv.com> ## Code After: DESCRIPTION = "Rdesktop rdp client for X" HOMEPAGE = "http://www.rdesktop.org" LICENSE = "GPLv3" LIC_FILES_CHKSUM = "file://COPYING;md5=f27defe1e96c2e1ecd4e0c9be8967949" PR = "r1" inherit debian-package DEPENDS = "virtual/libx11 openssl libgssglue alsa-lib" inherit autotools-brokensep pkgconfig # Currently, we have no recipe for pcsclite, so temporary disable smartcard. EXTRA_OECONF = "--with-openssl=${STAGING_LIBDIR}/.. \ --with-ipv6 --with-sound=alsa \ --disable-smartcard \ " # We are cross compiling, set 'fu_cv_sys_stat_statvfs64=cross' # to prevent sys_stat_statvfs64 is 'yes' by running test code in configure, # because it make error when build for qemuppc target: # | disk.c:726:18: error: storage size of 'stat_fs' isn't known # | struct STATFS_T stat_fs; CACHED_CONFIGUREVARS += "fu_cv_sys_stat_statvfs64=cross" INSANE_SKIP_rdesktop_forcevariable = " already-stripped"
da41223f531203c24d9c433d18d4d45dee469920
lib/wiki-plugins/wikiplugin_usercount.php
lib/wiki-plugins/wikiplugin_usercount.php
<?php // (c) Copyright 2002-2010 by authors of the Tiki Wiki/CMS/Groupware Project // // All Rights Reserved. See copyright.txt for details and a complete list of authors. // Licensed under the GNU LESSER GENERAL PUBLIC LICENSE. See license.txt for details. // $Id$ // Displays the number of total users or the number of users in a group // Use: // {USERCOUNT()}groupname{USERCOUNT} // // If no groupname is given returns all users function wikiplugin_usercount_help() { return tra("Displays the number of registered users").":<br />~np~{USERCOUNT()}groupname{USERCOUNT}~/np~"; } function wikiplugin_usercount_info() { return array( 'name' => tra('User Count'), 'documentation' => 'PluginUserCount', 'description' => tra('Displays the number of registered users'), 'prefs' => array( 'wikiplugin_usercount' ), 'body' => tra('Group name'), 'icon' => 'pics/icons/group_gear.png', 'params' => array( ), ); } function wikiplugin_usercount($data, $params) { global $tikilib; global $userlib; extract ($params,EXTR_SKIP); $numusers = $userlib->count_users($data); return $numusers; }
<?php // (c) Copyright 2002-2010 by authors of the Tiki Wiki/CMS/Groupware Project // // All Rights Reserved. See copyright.txt for details and a complete list of authors. // Licensed under the GNU LESSER GENERAL PUBLIC LICENSE. See license.txt for details. // $Id$ // Displays the number of total users or the number of users in a group // Use: // {USERCOUNT()}groupname{USERCOUNT} // // If no groupname is given returns all users function wikiplugin_usercount_help() { return tra("Displays the number of registered users").":<br />~np~{USERCOUNT()}groupname{USERCOUNT}~/np~"; } function wikiplugin_usercount_info() { return array( 'name' => tra('User Count'), 'documentation' => tra('PluginUserCount'), 'description' => tra('Displays the number of users that are registered or within a group'), 'prefs' => array( 'wikiplugin_usercount' ), 'body' => tra('Group name'), 'icon' => 'pics/icons/group_gear.png', 'params' => array( ), ); } function wikiplugin_usercount($data, $params) { global $tikilib; global $userlib; extract ($params,EXTR_SKIP); $numusers = $userlib->count_users($data); return $numusers; }
Add tra and some rewording for plugin info used in plugin edit window
Add tra and some rewording for plugin info used in plugin edit window git-svn-id: 9bac41f8ebc9458fc3e28d41abfab39641e8bd1c@30966 b456876b-0849-0410-b77d-98878d47e9d5
PHP
lgpl-2.1
tikiorg/tiki,tikiorg/tiki,changi67/tiki,oregional/tiki,tikiorg/tiki,changi67/tiki,changi67/tiki,changi67/tiki,oregional/tiki,tikiorg/tiki,oregional/tiki,changi67/tiki,oregional/tiki
php
## Code Before: <?php // (c) Copyright 2002-2010 by authors of the Tiki Wiki/CMS/Groupware Project // // All Rights Reserved. See copyright.txt for details and a complete list of authors. // Licensed under the GNU LESSER GENERAL PUBLIC LICENSE. See license.txt for details. // $Id$ // Displays the number of total users or the number of users in a group // Use: // {USERCOUNT()}groupname{USERCOUNT} // // If no groupname is given returns all users function wikiplugin_usercount_help() { return tra("Displays the number of registered users").":<br />~np~{USERCOUNT()}groupname{USERCOUNT}~/np~"; } function wikiplugin_usercount_info() { return array( 'name' => tra('User Count'), 'documentation' => 'PluginUserCount', 'description' => tra('Displays the number of registered users'), 'prefs' => array( 'wikiplugin_usercount' ), 'body' => tra('Group name'), 'icon' => 'pics/icons/group_gear.png', 'params' => array( ), ); } function wikiplugin_usercount($data, $params) { global $tikilib; global $userlib; extract ($params,EXTR_SKIP); $numusers = $userlib->count_users($data); return $numusers; } ## Instruction: Add tra and some rewording for plugin info used in plugin edit window git-svn-id: 9bac41f8ebc9458fc3e28d41abfab39641e8bd1c@30966 b456876b-0849-0410-b77d-98878d47e9d5 ## Code After: <?php // (c) Copyright 2002-2010 by authors of the Tiki Wiki/CMS/Groupware Project // // All Rights Reserved. See copyright.txt for details and a complete list of authors. // Licensed under the GNU LESSER GENERAL PUBLIC LICENSE. See license.txt for details. // $Id$ // Displays the number of total users or the number of users in a group // Use: // {USERCOUNT()}groupname{USERCOUNT} // // If no groupname is given returns all users function wikiplugin_usercount_help() { return tra("Displays the number of registered users").":<br />~np~{USERCOUNT()}groupname{USERCOUNT}~/np~"; } function wikiplugin_usercount_info() { return array( 'name' => tra('User Count'), 'documentation' => tra('PluginUserCount'), 'description' => tra('Displays the number of users that are registered or within a group'), 'prefs' => array( 'wikiplugin_usercount' ), 'body' => tra('Group name'), 'icon' => 'pics/icons/group_gear.png', 'params' => array( ), ); } function wikiplugin_usercount($data, $params) { global $tikilib; global $userlib; extract ($params,EXTR_SKIP); $numusers = $userlib->count_users($data); return $numusers; }
07a196be3dca5125454262bb96967d5895081c56
app/views/admin/application_settings/_repository_storage.html.haml
app/views/admin/application_settings/_repository_storage.html.haml
= form_for @application_setting, url: admin_application_settings_path(anchor: 'js-repository-storage-settings'), html: { class: 'fieldset-form' } do |f| = form_errors(@application_setting) %fieldset .sub-section .form-group .form-check = f.check_box :hashed_storage_enabled, class: 'form-check-input qa-hashed-storage-checkbox' = f.label :hashed_storage_enabled, class: 'form-check-label' do Use hashed storage paths for newly created and renamed projects .form-text.text-muted Enable immutable, hash-based paths and repository names to store repositories on disk. This prevents repositories from having to be moved or renamed when the Project URL changes and may improve disk I/O performance. %em (EXPERIMENTAL) .form-group = f.label :repository_storages, 'Storage paths for new projects', class: 'label-bold' = f.select :repository_storages, repository_storages_options_for_select(@application_setting.repository_storages), {include_hidden: false}, multiple: true, class: 'form-control' .form-text.text-muted Manage repository storage paths. Learn more in the = succeed "." do = link_to "repository storages documentation", help_page_path("administration/repository_storage_paths") = f.submit 'Save changes', class: "btn btn-success qa-save-changes-button"
= form_for @application_setting, url: admin_application_settings_path(anchor: 'js-repository-storage-settings'), html: { class: 'fieldset-form' } do |f| = form_errors(@application_setting) %fieldset .sub-section .form-group .form-check = f.check_box :hashed_storage_enabled, class: 'form-check-input qa-hashed-storage-checkbox' = f.label :hashed_storage_enabled, class: 'form-check-label' do Use hashed storage paths for newly created and renamed projects .form-text.text-muted Enable immutable, hash-based paths and repository names to store repositories on disk. This prevents repositories from having to be moved or renamed when the Project URL changes and may improve disk I/O performance. .form-group = f.label :repository_storages, 'Storage paths for new projects', class: 'label-bold' = f.select :repository_storages, repository_storages_options_for_select(@application_setting.repository_storages), {include_hidden: false}, multiple: true, class: 'form-control' .form-text.text-muted Manage repository storage paths. Learn more in the = succeed "." do = link_to "repository storages documentation", help_page_path("administration/repository_storage_paths") = f.submit 'Save changes', class: "btn btn-success qa-save-changes-button"
Remove "Experimental" text from Hashed Storage settings page
Remove "Experimental" text from Hashed Storage settings page
Haml
mit
mmkassem/gitlabhq,stoplightio/gitlabhq,axilleas/gitlabhq,axilleas/gitlabhq,axilleas/gitlabhq,mmkassem/gitlabhq,stoplightio/gitlabhq,stoplightio/gitlabhq,iiet/iiet-git,iiet/iiet-git,mmkassem/gitlabhq,stoplightio/gitlabhq,iiet/iiet-git,mmkassem/gitlabhq,axilleas/gitlabhq,iiet/iiet-git
haml
## Code Before: = form_for @application_setting, url: admin_application_settings_path(anchor: 'js-repository-storage-settings'), html: { class: 'fieldset-form' } do |f| = form_errors(@application_setting) %fieldset .sub-section .form-group .form-check = f.check_box :hashed_storage_enabled, class: 'form-check-input qa-hashed-storage-checkbox' = f.label :hashed_storage_enabled, class: 'form-check-label' do Use hashed storage paths for newly created and renamed projects .form-text.text-muted Enable immutable, hash-based paths and repository names to store repositories on disk. This prevents repositories from having to be moved or renamed when the Project URL changes and may improve disk I/O performance. %em (EXPERIMENTAL) .form-group = f.label :repository_storages, 'Storage paths for new projects', class: 'label-bold' = f.select :repository_storages, repository_storages_options_for_select(@application_setting.repository_storages), {include_hidden: false}, multiple: true, class: 'form-control' .form-text.text-muted Manage repository storage paths. Learn more in the = succeed "." do = link_to "repository storages documentation", help_page_path("administration/repository_storage_paths") = f.submit 'Save changes', class: "btn btn-success qa-save-changes-button" ## Instruction: Remove "Experimental" text from Hashed Storage settings page ## Code After: = form_for @application_setting, url: admin_application_settings_path(anchor: 'js-repository-storage-settings'), html: { class: 'fieldset-form' } do |f| = form_errors(@application_setting) %fieldset .sub-section .form-group .form-check = f.check_box :hashed_storage_enabled, class: 'form-check-input qa-hashed-storage-checkbox' = f.label :hashed_storage_enabled, class: 'form-check-label' do Use hashed storage paths for newly created and renamed projects .form-text.text-muted Enable immutable, hash-based paths and repository names to store repositories on disk. This prevents repositories from having to be moved or renamed when the Project URL changes and may improve disk I/O performance. .form-group = f.label :repository_storages, 'Storage paths for new projects', class: 'label-bold' = f.select :repository_storages, repository_storages_options_for_select(@application_setting.repository_storages), {include_hidden: false}, multiple: true, class: 'form-control' .form-text.text-muted Manage repository storage paths. Learn more in the = succeed "." do = link_to "repository storages documentation", help_page_path("administration/repository_storage_paths") = f.submit 'Save changes', class: "btn btn-success qa-save-changes-button"
16812eadadecdb4449f796f453e891d1adecf95d
setup.py
setup.py
from setuptools import find_packages, setup import kitchen.release setup(name='kitchen', version=kitchen.release.__version__, description=kitchen.release.DESCRIPTION, author=kitchen.release.AUTHOR, author_email=kitchen.release.EMAIL, license=kitchen.release.LICENSE, url=kitchen.release.URL, download_url=kitchen.release.DOWNLOAD_URL, keywords='Useful Small Code Snippets', classifiers=[ 'Development Status :: 3 - Alpha', 'License :: OSI Approved :: GNU Library or Lesser General Public License (LGPL)', 'Programming Language :: Python :: 2.3', 'Programming Language :: Python :: 2.4', 'Programming Language :: Python :: 2.5', 'Programming Language :: Python :: 2.6', 'Topic :: Software Development :: Libraries :: Python Modules', 'Topic :: Text Processing :: General', ], packages=find_packages(), data_files = [], )
from setuptools import find_packages, setup import kitchen.release setup(name='kitchen', version=kitchen.release.__version__, description=kitchen.release.DESCRIPTION, author=kitchen.release.AUTHOR, author_email=kitchen.release.EMAIL, license=kitchen.release.LICENSE, url=kitchen.release.URL, download_url=kitchen.release.DOWNLOAD_URL, keywords='Useful Small Code Snippets', classifiers=[ 'Development Status :: 3 - Alpha', 'License :: OSI Approved :: GNU Library or Lesser General Public License (LGPL)', 'Programming Language :: Python :: 2.3', 'Programming Language :: Python :: 2.4', 'Programming Language :: Python :: 2.5', 'Programming Language :: Python :: 2.6', 'Programming Language :: Python :: 2.7', 'Topic :: Software Development :: Libraries :: Python Modules', 'Topic :: Text Processing :: General', ], packages=find_packages(), data_files = [], )
Add Python-2.7 as a platform kitchen runs on
Add Python-2.7 as a platform kitchen runs on
Python
lgpl-2.1
fedora-infra/kitchen,fedora-infra/kitchen
python
## Code Before: from setuptools import find_packages, setup import kitchen.release setup(name='kitchen', version=kitchen.release.__version__, description=kitchen.release.DESCRIPTION, author=kitchen.release.AUTHOR, author_email=kitchen.release.EMAIL, license=kitchen.release.LICENSE, url=kitchen.release.URL, download_url=kitchen.release.DOWNLOAD_URL, keywords='Useful Small Code Snippets', classifiers=[ 'Development Status :: 3 - Alpha', 'License :: OSI Approved :: GNU Library or Lesser General Public License (LGPL)', 'Programming Language :: Python :: 2.3', 'Programming Language :: Python :: 2.4', 'Programming Language :: Python :: 2.5', 'Programming Language :: Python :: 2.6', 'Topic :: Software Development :: Libraries :: Python Modules', 'Topic :: Text Processing :: General', ], packages=find_packages(), data_files = [], ) ## Instruction: Add Python-2.7 as a platform kitchen runs on ## Code After: from setuptools import find_packages, setup import kitchen.release setup(name='kitchen', version=kitchen.release.__version__, description=kitchen.release.DESCRIPTION, author=kitchen.release.AUTHOR, author_email=kitchen.release.EMAIL, license=kitchen.release.LICENSE, url=kitchen.release.URL, download_url=kitchen.release.DOWNLOAD_URL, keywords='Useful Small Code Snippets', classifiers=[ 'Development Status :: 3 - Alpha', 'License :: OSI Approved :: GNU Library or Lesser General Public License (LGPL)', 'Programming Language :: Python :: 2.3', 'Programming Language :: Python :: 2.4', 'Programming Language :: Python :: 2.5', 'Programming Language :: Python :: 2.6', 'Programming Language :: Python :: 2.7', 'Topic :: Software Development :: Libraries :: Python Modules', 'Topic :: Text Processing :: General', ], packages=find_packages(), data_files = [], )
0bdb00e5c616f25705a414c3bdac20ffcd6e7d6b
config/initializers/omniauth.rb
config/initializers/omniauth.rb
OmniAuth.config.logger = Rails.logger Rails.application.config.middleware.use OmniAuth::Builder do options = { scope: 'email,user_location', image_size: :normal } provider :facebook, ENV['FACEBOOK_APP_ID'], ENV['FACEBOOK_APP_SECRET'], options provider :browser_id provider :developer, fields: %i[name], uid_field: :name if Rails.env.development? end # in the development environment, redirect instead of raising an # exception when the user cancels third-party authentication # http://stackoverflow.com/questions/10963286/callback-denied-with-omniauth OmniAuth.config.on_failure = proc { |env| OmniAuth::FailureEndpoint.new(env).redirect_to_failure }
OmniAuth.config.logger = Rails.logger Rails.application.config.middleware.use OmniAuth::Builder do options = { scope: 'email,user_location', image_size: :normal, client_options: { site: 'https://graph.facebook.com/v2.6', authorize_url: "https://www.facebook.com/v2.6/dialog/oauth", }, token_params: { parse: :json }, } provider :facebook, ENV['FACEBOOK_APP_ID'], ENV['FACEBOOK_APP_SECRET'], options provider :browser_id provider :developer, fields: %i[name], uid_field: :name if Rails.env.development? end # in the development environment, redirect instead of raising an # exception when the user cancels third-party authentication # http://stackoverflow.com/questions/10963286/callback-denied-with-omniauth OmniAuth.config.on_failure = proc { |env| OmniAuth::FailureEndpoint.new(env).redirect_to_failure }
Use v2.6 of the Facebook Graph API.
Use v2.6 of the Facebook Graph API. We'd been using v2.0, which is getting shut down in a few months. We'll see whether this helps.
Ruby
mit
patbl/Skillshare.im,patbl/Skillshare.im,patbl/Skillshare.im
ruby
## Code Before: OmniAuth.config.logger = Rails.logger Rails.application.config.middleware.use OmniAuth::Builder do options = { scope: 'email,user_location', image_size: :normal } provider :facebook, ENV['FACEBOOK_APP_ID'], ENV['FACEBOOK_APP_SECRET'], options provider :browser_id provider :developer, fields: %i[name], uid_field: :name if Rails.env.development? end # in the development environment, redirect instead of raising an # exception when the user cancels third-party authentication # http://stackoverflow.com/questions/10963286/callback-denied-with-omniauth OmniAuth.config.on_failure = proc { |env| OmniAuth::FailureEndpoint.new(env).redirect_to_failure } ## Instruction: Use v2.6 of the Facebook Graph API. We'd been using v2.0, which is getting shut down in a few months. We'll see whether this helps. ## Code After: OmniAuth.config.logger = Rails.logger Rails.application.config.middleware.use OmniAuth::Builder do options = { scope: 'email,user_location', image_size: :normal, client_options: { site: 'https://graph.facebook.com/v2.6', authorize_url: "https://www.facebook.com/v2.6/dialog/oauth", }, token_params: { parse: :json }, } provider :facebook, ENV['FACEBOOK_APP_ID'], ENV['FACEBOOK_APP_SECRET'], options provider :browser_id provider :developer, fields: %i[name], uid_field: :name if Rails.env.development? end # in the development environment, redirect instead of raising an # exception when the user cancels third-party authentication # http://stackoverflow.com/questions/10963286/callback-denied-with-omniauth OmniAuth.config.on_failure = proc { |env| OmniAuth::FailureEndpoint.new(env).redirect_to_failure }
4882c1a4ea100da20b9ca6f7ec7a423e99eba3c1
Omise/Payment/Gateway/Http/Client/AbstractOmiseClient.php
Omise/Payment/Gateway/Http/Client/AbstractOmiseClient.php
<?php namespace Omise\Payment\Gateway\Http\Client; use Magento\Payment\Gateway\Http\ClientInterface; use Magento\Payment\Gateway\Http\TransferInterface; define('OMISE_PUBLIC_KEY', 'pkey_test_51fl8dfabqmt3m27vl7'); define('OMISE_SECRET_KEY', 'skey_test_51fl8dfabe7sqnj8th2'); abstract class AbstractOmiseClient implements ClientInterface { /** * Client request status represented to initiating step. * * @var string */ const PROCESS_STATUS_INIT = 'initiate_request'; /** * Client request status represented to successful request step. * * @var string */ const PROCESS_STATUS_SUCCESSFUL = 'successful'; /** * Client request status represented to failed request step. * * @var string */ const PROCESS_STATUS_FAILED = 'failed'; /** * @param \Magento\Payment\Gateway\Http\TransferInterface $transferObject * * @return array */ public function placeRequest(TransferInterface $transferObject) { $payload = []; $response = [ 'status' => self::PROCESS_STATUS_INIT, 'api' => null ]; try { $response['api'] = $this->request($payload); $response['status'] = self::PROCESS_STATUS_SUCCESSFUL; } catch (Exception $e) { $response['status'] = self::PROCESS_STATUS_FAILED; } return $response; } }
<?php namespace Omise\Payment\Gateway\Http\Client; use Magento\Payment\Gateway\Http\ClientInterface; use Magento\Payment\Gateway\Http\TransferInterface; use Omise\Payment\Model\Ui\OmiseConfigProvider; abstract class AbstractOmiseClient implements ClientInterface { /** * Client request status represented to initiating step. * * @var string */ const PROCESS_STATUS_INIT = 'initiate_request'; /** * Client request status represented to successful request step. * * @var string */ const PROCESS_STATUS_SUCCESSFUL = 'successful'; /** * Client request status represented to failed request step. * * @var string */ const PROCESS_STATUS_FAILED = 'failed'; protected $publicKey; protected $secretKey; public function __construct(OmiseConfigProvider $config) { $this->publicKey = $config->getPublicKey(); $this->secretKey = $config->getSecretKey(); } /** * @param \Magento\Payment\Gateway\Http\TransferInterface $transferObject * * @return array */ public function placeRequest(TransferInterface $transferObject) { $payload = []; $response = [ 'status' => self::PROCESS_STATUS_INIT, 'api' => null ]; try { $response['api'] = $this->request($payload); $response['status'] = self::PROCESS_STATUS_SUCCESSFUL; } catch (Exception $e) { $response['status'] = self::PROCESS_STATUS_FAILED; } return $response; } }
Change from constant public and secret key to retrieve it from database
Change from constant public and secret key to retrieve it from database Change the constant of public key and secret key that was fixed, to retrieve it from database by using the class OmiseConfigProvider.
PHP
mit
omise/omise-magento,omise/omise-magento,omise/omise-magento
php
## Code Before: <?php namespace Omise\Payment\Gateway\Http\Client; use Magento\Payment\Gateway\Http\ClientInterface; use Magento\Payment\Gateway\Http\TransferInterface; define('OMISE_PUBLIC_KEY', 'pkey_test_51fl8dfabqmt3m27vl7'); define('OMISE_SECRET_KEY', 'skey_test_51fl8dfabe7sqnj8th2'); abstract class AbstractOmiseClient implements ClientInterface { /** * Client request status represented to initiating step. * * @var string */ const PROCESS_STATUS_INIT = 'initiate_request'; /** * Client request status represented to successful request step. * * @var string */ const PROCESS_STATUS_SUCCESSFUL = 'successful'; /** * Client request status represented to failed request step. * * @var string */ const PROCESS_STATUS_FAILED = 'failed'; /** * @param \Magento\Payment\Gateway\Http\TransferInterface $transferObject * * @return array */ public function placeRequest(TransferInterface $transferObject) { $payload = []; $response = [ 'status' => self::PROCESS_STATUS_INIT, 'api' => null ]; try { $response['api'] = $this->request($payload); $response['status'] = self::PROCESS_STATUS_SUCCESSFUL; } catch (Exception $e) { $response['status'] = self::PROCESS_STATUS_FAILED; } return $response; } } ## Instruction: Change from constant public and secret key to retrieve it from database Change the constant of public key and secret key that was fixed, to retrieve it from database by using the class OmiseConfigProvider. ## Code After: <?php namespace Omise\Payment\Gateway\Http\Client; use Magento\Payment\Gateway\Http\ClientInterface; use Magento\Payment\Gateway\Http\TransferInterface; use Omise\Payment\Model\Ui\OmiseConfigProvider; abstract class AbstractOmiseClient implements ClientInterface { /** * Client request status represented to initiating step. * * @var string */ const PROCESS_STATUS_INIT = 'initiate_request'; /** * Client request status represented to successful request step. * * @var string */ const PROCESS_STATUS_SUCCESSFUL = 'successful'; /** * Client request status represented to failed request step. * * @var string */ const PROCESS_STATUS_FAILED = 'failed'; protected $publicKey; protected $secretKey; public function __construct(OmiseConfigProvider $config) { $this->publicKey = $config->getPublicKey(); $this->secretKey = $config->getSecretKey(); } /** * @param \Magento\Payment\Gateway\Http\TransferInterface $transferObject * * @return array */ public function placeRequest(TransferInterface $transferObject) { $payload = []; $response = [ 'status' => self::PROCESS_STATUS_INIT, 'api' => null ]; try { $response['api'] = $this->request($payload); $response['status'] = self::PROCESS_STATUS_SUCCESSFUL; } catch (Exception $e) { $response['status'] = self::PROCESS_STATUS_FAILED; } return $response; } }
ab9de946dd312e18ff232e7def39890ba2be436e
test/contents_test.js
test/contents_test.js
'use strict'; var fs = require('fs'); function read(filename) { return fs.readFileSync(filename, {'encoding': 'utf8'}); } exports.contents = { 'filter': function(test) { var actual = read('tmp/helpers/filter.html'); var expected = read('test/expected/helpers/filter.html'); test.equal(actual, expected, 'should filter all entries'); test.done(); }, 'list': function(test) { var actual = read('tmp/helpers/list.html'); var expected = read('test/expected/helpers/list.html'); test.equal(actual, expected, 'should list all entries'); test.done(); }, };
'use strict'; const fs = require('fs'); function read(filename) { return fs.readFileSync(filename, {'encoding': 'utf8'}); } exports.contents = { 'filter': (test) => { const actual = read('tmp/helpers/filter.html'); const expected = read('test/expected/helpers/filter.html'); test.equal(actual, expected, 'should filter all entries'); test.done(); }, 'list': (test) => { const actual = read('tmp/helpers/list.html'); const expected = read('test/expected/helpers/list.html'); test.equal(actual, expected, 'should list all entries'); test.done(); }, };
Use ES6 features in test
Use ES6 features in test
JavaScript
mit
xavierdutreilh/wintersmith-contents,xavierdutreilh/wintersmith-contents
javascript
## Code Before: 'use strict'; var fs = require('fs'); function read(filename) { return fs.readFileSync(filename, {'encoding': 'utf8'}); } exports.contents = { 'filter': function(test) { var actual = read('tmp/helpers/filter.html'); var expected = read('test/expected/helpers/filter.html'); test.equal(actual, expected, 'should filter all entries'); test.done(); }, 'list': function(test) { var actual = read('tmp/helpers/list.html'); var expected = read('test/expected/helpers/list.html'); test.equal(actual, expected, 'should list all entries'); test.done(); }, }; ## Instruction: Use ES6 features in test ## Code After: 'use strict'; const fs = require('fs'); function read(filename) { return fs.readFileSync(filename, {'encoding': 'utf8'}); } exports.contents = { 'filter': (test) => { const actual = read('tmp/helpers/filter.html'); const expected = read('test/expected/helpers/filter.html'); test.equal(actual, expected, 'should filter all entries'); test.done(); }, 'list': (test) => { const actual = read('tmp/helpers/list.html'); const expected = read('test/expected/helpers/list.html'); test.equal(actual, expected, 'should list all entries'); test.done(); }, };
3269d7295f34b03096814fcd474a4b7bbfe4ad23
packages/xm/xml-verify.yaml
packages/xm/xml-verify.yaml
homepage: https://github.com/jotron-as/xml-verify changelog-type: markdown hash: e8184fb4a21be6ab75f6068b30de11b121c9cac47edf218b9ad839c1fe6ba6fc test-bench-deps: {} maintainer: james.hobson@jotron.com synopsis: Verifying XML signatures changelog: | # Revision history for xml-verify ## 0.1.0.0 -- YYYY-mm-dd * First version. Released on an unsuspecting world. basic-deps: bytestring: '>=0.10.10.1 && <0.11' base: '>=4.13.0.0 && <=5.0' pem: '>=0.2.4 && <0.3' hxt: '>=9.3.1.22 && <9.4' x509: '>=1.7.5 && <1.8' cryptostore: '>=0.2.1.0 && <0.3' mtl: '>=2.2.2 && <2.3' all-versions: - 0.1.0.0 author: James Hobson latest: 0.1.0.0 description-type: haddock description: A small library, that calls xmlsec, for verifying XML. It also contains a wrapper for use with HXT license-name: BSD-3-Clause
homepage: https://github.com/jotron-as/xml-verify changelog-type: markdown hash: 0908ce343c5c676b59303b6cc295c08dc6bab52b0288b4a52ff6ec52affe5a66 test-bench-deps: {} maintainer: james.hobson@jotron.com synopsis: Verifying XML signatures changelog: | # Revision history for xml-verify ## 0.1.0.0 -- YYYY-mm-dd * First version. Released on an unsuspecting world. basic-deps: bytestring: '>=0.10.10.1 && <0.11' base: '>=4.13.0.0 && <=5.0' pem: '>=0.2.4 && <0.3' hxt: '>=9.3.1.22 && <9.4' x509: '>=1.7.5 && <1.8' cryptostore: '>=0.2.1.0 && <0.3' mtl: '>=2.2.2 && <2.3' all-versions: - 0.1.0.0 - 0.1.0.1 author: James Hobson latest: 0.1.0.1 description-type: haddock description: A small library, that calls xmlsec, for verifying XML. It also contains a wrapper for use with HXT license-name: BSD-3-Clause
Update from Hackage at 2021-09-23T07:27:41Z
Update from Hackage at 2021-09-23T07:27:41Z
YAML
mit
commercialhaskell/all-cabal-metadata
yaml
## Code Before: homepage: https://github.com/jotron-as/xml-verify changelog-type: markdown hash: e8184fb4a21be6ab75f6068b30de11b121c9cac47edf218b9ad839c1fe6ba6fc test-bench-deps: {} maintainer: james.hobson@jotron.com synopsis: Verifying XML signatures changelog: | # Revision history for xml-verify ## 0.1.0.0 -- YYYY-mm-dd * First version. Released on an unsuspecting world. basic-deps: bytestring: '>=0.10.10.1 && <0.11' base: '>=4.13.0.0 && <=5.0' pem: '>=0.2.4 && <0.3' hxt: '>=9.3.1.22 && <9.4' x509: '>=1.7.5 && <1.8' cryptostore: '>=0.2.1.0 && <0.3' mtl: '>=2.2.2 && <2.3' all-versions: - 0.1.0.0 author: James Hobson latest: 0.1.0.0 description-type: haddock description: A small library, that calls xmlsec, for verifying XML. It also contains a wrapper for use with HXT license-name: BSD-3-Clause ## Instruction: Update from Hackage at 2021-09-23T07:27:41Z ## Code After: homepage: https://github.com/jotron-as/xml-verify changelog-type: markdown hash: 0908ce343c5c676b59303b6cc295c08dc6bab52b0288b4a52ff6ec52affe5a66 test-bench-deps: {} maintainer: james.hobson@jotron.com synopsis: Verifying XML signatures changelog: | # Revision history for xml-verify ## 0.1.0.0 -- YYYY-mm-dd * First version. Released on an unsuspecting world. basic-deps: bytestring: '>=0.10.10.1 && <0.11' base: '>=4.13.0.0 && <=5.0' pem: '>=0.2.4 && <0.3' hxt: '>=9.3.1.22 && <9.4' x509: '>=1.7.5 && <1.8' cryptostore: '>=0.2.1.0 && <0.3' mtl: '>=2.2.2 && <2.3' all-versions: - 0.1.0.0 - 0.1.0.1 author: James Hobson latest: 0.1.0.1 description-type: haddock description: A small library, that calls xmlsec, for verifying XML. It also contains a wrapper for use with HXT license-name: BSD-3-Clause
1d0fbb435fb05fbbf5257e7a56516153af74e678
source/CMakeLists.txt
source/CMakeLists.txt
configure_file(version.h.in ${CMAKE_CURRENT_BINARY_DIR}/include/${META_PROJECT_NAME}/${META_PROJECT_NAME}-version.h) # # Sub-projects # # Libraries set(IDE_FOLDER "") add_subdirectory(baselib) add_subdirectory(fiblib) # Examples set(IDE_FOLDER "Examples") add_subdirectory(examples) # Tests if(OPTION_BUILD_TESTS) set(IDE_FOLDER "Tests") add_subdirectory(tests) endif() # # Deployment # # Deploy generated headers install(DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}/include/${META_PROJECT_NAME} DESTINATION include COMPONENT dev)
configure_file(version.h.in ${CMAKE_CURRENT_BINARY_DIR}/include/${META_PROJECT_NAME}/${META_PROJECT_NAME}-version.h) # # Sub-projects # # Libraries set(IDE_FOLDER "") add_subdirectory(baselib) add_subdirectory(fiblib) # Examples set(IDE_FOLDER "Examples") add_subdirectory(examples) # Tests if(OPTION_BUILD_TESTS AND NOT MINGW) set(IDE_FOLDER "Tests") add_subdirectory(tests) endif() # # Deployment # # Deploy generated headers install(DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}/include/${META_PROJECT_NAME} DESTINATION include COMPONENT dev)
Disable testing on mingw, as gtest seems not to be compatible with mingw
Disable testing on mingw, as gtest seems not to be compatible with mingw
Text
mit
j-o/cmake-init,hpicgs/cmake-init,hpicgs/cmake-init,j-o/cmake-init,hpicgs/cmake-init,cginternals/cmake-init,cginternals/cmake-init,j-o/cmake-init,j-o/cmake-init,hpicgs/cmake-init
text
## Code Before: configure_file(version.h.in ${CMAKE_CURRENT_BINARY_DIR}/include/${META_PROJECT_NAME}/${META_PROJECT_NAME}-version.h) # # Sub-projects # # Libraries set(IDE_FOLDER "") add_subdirectory(baselib) add_subdirectory(fiblib) # Examples set(IDE_FOLDER "Examples") add_subdirectory(examples) # Tests if(OPTION_BUILD_TESTS) set(IDE_FOLDER "Tests") add_subdirectory(tests) endif() # # Deployment # # Deploy generated headers install(DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}/include/${META_PROJECT_NAME} DESTINATION include COMPONENT dev) ## Instruction: Disable testing on mingw, as gtest seems not to be compatible with mingw ## Code After: configure_file(version.h.in ${CMAKE_CURRENT_BINARY_DIR}/include/${META_PROJECT_NAME}/${META_PROJECT_NAME}-version.h) # # Sub-projects # # Libraries set(IDE_FOLDER "") add_subdirectory(baselib) add_subdirectory(fiblib) # Examples set(IDE_FOLDER "Examples") add_subdirectory(examples) # Tests if(OPTION_BUILD_TESTS AND NOT MINGW) set(IDE_FOLDER "Tests") add_subdirectory(tests) endif() # # Deployment # # Deploy generated headers install(DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}/include/${META_PROJECT_NAME} DESTINATION include COMPONENT dev)
178a1a17c33cd5dada60679e4b6819fec117081e
setup/base_setup.sh
setup/base_setup.sh
echo =================================== echo = echo = base setup for vm echo = echo =================================== dir="$PWD" . "$dir/environmental_variables.sh" . "$dir/common_build_dependencies.sh" . "$dir/encodings.sh" . "$dir/app_user_as_linux_user.sh" . "$dir/nodejs.sh" . "$dir/postgres_bootstrap.sh" echo --- echo base setup for vm is complete! echo ===================================
echo =================================== echo = echo = base setup for vm echo = echo =================================== dir="/vagrant/setup" . "$dir/environmental_variables.sh" mkdir -p $ROOT_PROV_DIR . "$dir/common_build_dependencies.sh" . "$dir/encodings.sh" . "$dir/app_user_as_linux_user.sh" . "$dir/nodejs.sh" . "$dir/postgres_bootstrap.sh" echo --- echo base setup for vm is complete! echo ===================================
Clean up env variable use
Clean up env variable use
Shell
mit
dpneumo/Vagrant_Rails_PG
shell
## Code Before: echo =================================== echo = echo = base setup for vm echo = echo =================================== dir="$PWD" . "$dir/environmental_variables.sh" . "$dir/common_build_dependencies.sh" . "$dir/encodings.sh" . "$dir/app_user_as_linux_user.sh" . "$dir/nodejs.sh" . "$dir/postgres_bootstrap.sh" echo --- echo base setup for vm is complete! echo =================================== ## Instruction: Clean up env variable use ## Code After: echo =================================== echo = echo = base setup for vm echo = echo =================================== dir="/vagrant/setup" . "$dir/environmental_variables.sh" mkdir -p $ROOT_PROV_DIR . "$dir/common_build_dependencies.sh" . "$dir/encodings.sh" . "$dir/app_user_as_linux_user.sh" . "$dir/nodejs.sh" . "$dir/postgres_bootstrap.sh" echo --- echo base setup for vm is complete! echo ===================================
ddf099ff2f61d0a9e089de0e68efe9eb5f6391ce
src/templates/common/elements/how_to_cite.html
src/templates/common/elements/how_to_cite.html
<p> {{ author_str }}, {{ year_str }} “{{ title }}”, {{ journal_str|safe }} {{ issue_str }}. {{ pages_str|safe }} {{ doi_str|safe }} </p>
<p> {{ author_str }}, {{ year_str }} “{{ title }}”, {{ journal_str|safe }} {{ issue_str }}{% if pages_str %},{% else %}.{% endif %} {{ pages_str|safe }} {{ doi_str|safe }} </p>
Fix citation separator for pages
Fix citation separator for pages
HTML
agpl-3.0
BirkbeckCTP/janeway,BirkbeckCTP/janeway,BirkbeckCTP/janeway,BirkbeckCTP/janeway
html
## Code Before: <p> {{ author_str }}, {{ year_str }} “{{ title }}”, {{ journal_str|safe }} {{ issue_str }}. {{ pages_str|safe }} {{ doi_str|safe }} </p> ## Instruction: Fix citation separator for pages ## Code After: <p> {{ author_str }}, {{ year_str }} “{{ title }}”, {{ journal_str|safe }} {{ issue_str }}{% if pages_str %},{% else %}.{% endif %} {{ pages_str|safe }} {{ doi_str|safe }} </p>
c00671977f00458288dd76d6b939bd76c7cd2490
dpl-puppet_forge.gemspec
dpl-puppet_forge.gemspec
require './gemspec_helper' gemspec_for 'puppet_forge', [['puppet'], ['puppet-blacksmith']]
require './gemspec_helper' gemspec_for 'puppet_forge', [['puppet'], ['puppet-blacksmith'], ['json_pure']]
Add json_pure to puppet_forge runtime dependency
Add json_pure to puppet_forge runtime dependency
Ruby
mit
testfairy/dpl,travis-ci/dpl,travis-ci/dpl,testfairy/dpl,travis-ci/dpl,computology/dpl,testfairy/dpl
ruby
## Code Before: require './gemspec_helper' gemspec_for 'puppet_forge', [['puppet'], ['puppet-blacksmith']] ## Instruction: Add json_pure to puppet_forge runtime dependency ## Code After: require './gemspec_helper' gemspec_for 'puppet_forge', [['puppet'], ['puppet-blacksmith'], ['json_pure']]
9de26028bc6b441554fc60bc3f49cd9b0dcd0ad9
recipes/zen3geo/meta.yaml
recipes/zen3geo/meta.yaml
{% set name = "zen3geo" %} {% set version = "0.3.0" %} package: name: {{ name|lower }} version: {{ version }} source: url: https://pypi.io/packages/source/{{ name[0] }}/{{ name }}/zen3geo-{{ version }}.tar.gz sha256: 09e84306809d03899155510f75dff83023e301c95bb0835b7bf5c34e58c72720 build: noarch: python script: {{ PYTHON }} -m pip install . -vv number: 0 requirements: host: - pip - python >=3.8,<3.12 run: - python >=3.8,<3.12 - rioxarray >=0.10.0 - torchdata >=0.4.0 test: imports: - zen3geo commands: - pip check requires: - pip about: home: None summary: The 🌏 data science library you've been waiting for~ license: LGPL-3.0-or-later license_file: - LICENSE.md extra: recipe-maintainers: - weiji14
{% set name = "zen3geo" %} {% set version = "0.3.0" %} package: name: {{ name|lower }} version: {{ version }} source: url: https://pypi.io/packages/source/{{ name[0] }}/{{ name }}/zen3geo-{{ version }}.tar.gz sha256: 09e84306809d03899155510f75dff83023e301c95bb0835b7bf5c34e58c72720 build: noarch: python script: {{ PYTHON }} -m pip install . -vv number: 0 requirements: host: - pip - poetry-core >=1.0.0 - python >=3.8,<3.12 run: - python >=3.8,<3.12 - rioxarray >=0.10.0 - torchdata >=0.4.0 test: imports: - zen3geo commands: - pip check requires: - pip about: home: None summary: The 🌏 data science library you've been waiting for~ license: LGPL-3.0-or-later license_file: - LICENSE.md extra: recipe-maintainers: - weiji14
Add poetry-core as host dependency
Add poetry-core as host dependency Try to resolve `ModuleNotFoundError: No module named 'poetry'`.
YAML
bsd-3-clause
conda-forge/staged-recipes,ocefpaf/staged-recipes,johanneskoester/staged-recipes,conda-forge/staged-recipes,johanneskoester/staged-recipes,ocefpaf/staged-recipes
yaml
## Code Before: {% set name = "zen3geo" %} {% set version = "0.3.0" %} package: name: {{ name|lower }} version: {{ version }} source: url: https://pypi.io/packages/source/{{ name[0] }}/{{ name }}/zen3geo-{{ version }}.tar.gz sha256: 09e84306809d03899155510f75dff83023e301c95bb0835b7bf5c34e58c72720 build: noarch: python script: {{ PYTHON }} -m pip install . -vv number: 0 requirements: host: - pip - python >=3.8,<3.12 run: - python >=3.8,<3.12 - rioxarray >=0.10.0 - torchdata >=0.4.0 test: imports: - zen3geo commands: - pip check requires: - pip about: home: None summary: The 🌏 data science library you've been waiting for~ license: LGPL-3.0-or-later license_file: - LICENSE.md extra: recipe-maintainers: - weiji14 ## Instruction: Add poetry-core as host dependency Try to resolve `ModuleNotFoundError: No module named 'poetry'`. ## Code After: {% set name = "zen3geo" %} {% set version = "0.3.0" %} package: name: {{ name|lower }} version: {{ version }} source: url: https://pypi.io/packages/source/{{ name[0] }}/{{ name }}/zen3geo-{{ version }}.tar.gz sha256: 09e84306809d03899155510f75dff83023e301c95bb0835b7bf5c34e58c72720 build: noarch: python script: {{ PYTHON }} -m pip install . -vv number: 0 requirements: host: - pip - poetry-core >=1.0.0 - python >=3.8,<3.12 run: - python >=3.8,<3.12 - rioxarray >=0.10.0 - torchdata >=0.4.0 test: imports: - zen3geo commands: - pip check requires: - pip about: home: None summary: The 🌏 data science library you've been waiting for~ license: LGPL-3.0-or-later license_file: - LICENSE.md extra: recipe-maintainers: - weiji14
350c5a6e8989aa41507e8a5ac4442cb05ef3b18b
server/src/elastic/queries.js
server/src/elastic/queries.js
const fp = require('lodash/fp') const elastic = require('./index') const toBody = message => fp.pick([ 'timestamp', 'from', 'to', 'text', ], message) const indexMessage = message => { return elastic.index({ index: 'messages', type: 'message', id: message.id, body: toBody(message), }) } const indexMessages = messages => { return elastic.bulk({ body: fp.flatMap(message => [ { index: { _index: 'messages', _type: 'message', _id: message.id }}, toBody(message), ], messages), }) } const searchMessages = async (channel, query, limit, afterTimestamp) => { let body = { size: limit, sort: [{ timestamp: { order: 'desc' }}], query: { bool: { filter: fp.concat( { term: { to: channel }}, !query.nick ? [] : { term: { from: query.nick }} ), must: !query.text ? [] : [{ match: { text: query.text }, }], }, }, } if (afterTimestamp) { body.search_after = [+afterTimestamp] } const { hits } = await elastic.search({ index: 'messages', type: 'message', body, }) return hits } module.exports = { indexMessage, indexMessages, searchMessages }
const fp = require('lodash/fp') const elastic = require('./index') const toBody = message => fp.pick([ 'timestamp', 'from', 'to', 'text', ], message) const indexMessage = message => { return elastic.index({ index: 'messages', type: 'message', id: message.id, body: toBody(message), }) } const indexMessages = messages => { return elastic.bulk({ body: fp.flatMap(message => [ { index: { _index: 'messages', _type: 'message', _id: message.id }}, toBody(message), ], messages), }) } const searchMessages = async (channel, query, limit, afterTimestamp) => { let body = { size: limit, sort: [{ timestamp: { order: 'desc' }}], query: { bool: { filter: fp.concat( { term: { to: channel }}, !query.nick ? [] : { term: { from: query.nick }} ), must: !query.text ? [] : [{ match: { text: { query: query.text, operator: 'and', fuzziness: 'auto', } }, }], }, }, } if (afterTimestamp) { body.search_after = [+afterTimestamp] } const { hits } = await elastic.search({ index: 'messages', type: 'message', body, }) return hits } module.exports = { indexMessage, indexMessages, searchMessages }
Enable fuzziness + AND phrases when searching text
Enable fuzziness + AND phrases when searching text
JavaScript
mit
daGrevis/msks,daGrevis/msks,daGrevis/msks
javascript
## Code Before: const fp = require('lodash/fp') const elastic = require('./index') const toBody = message => fp.pick([ 'timestamp', 'from', 'to', 'text', ], message) const indexMessage = message => { return elastic.index({ index: 'messages', type: 'message', id: message.id, body: toBody(message), }) } const indexMessages = messages => { return elastic.bulk({ body: fp.flatMap(message => [ { index: { _index: 'messages', _type: 'message', _id: message.id }}, toBody(message), ], messages), }) } const searchMessages = async (channel, query, limit, afterTimestamp) => { let body = { size: limit, sort: [{ timestamp: { order: 'desc' }}], query: { bool: { filter: fp.concat( { term: { to: channel }}, !query.nick ? [] : { term: { from: query.nick }} ), must: !query.text ? [] : [{ match: { text: query.text }, }], }, }, } if (afterTimestamp) { body.search_after = [+afterTimestamp] } const { hits } = await elastic.search({ index: 'messages', type: 'message', body, }) return hits } module.exports = { indexMessage, indexMessages, searchMessages } ## Instruction: Enable fuzziness + AND phrases when searching text ## Code After: const fp = require('lodash/fp') const elastic = require('./index') const toBody = message => fp.pick([ 'timestamp', 'from', 'to', 'text', ], message) const indexMessage = message => { return elastic.index({ index: 'messages', type: 'message', id: message.id, body: toBody(message), }) } const indexMessages = messages => { return elastic.bulk({ body: fp.flatMap(message => [ { index: { _index: 'messages', _type: 'message', _id: message.id }}, toBody(message), ], messages), }) } const searchMessages = async (channel, query, limit, afterTimestamp) => { let body = { size: limit, sort: [{ timestamp: { order: 'desc' }}], query: { bool: { filter: fp.concat( { term: { to: channel }}, !query.nick ? [] : { term: { from: query.nick }} ), must: !query.text ? [] : [{ match: { text: { query: query.text, operator: 'and', fuzziness: 'auto', } }, }], }, }, } if (afterTimestamp) { body.search_after = [+afterTimestamp] } const { hits } = await elastic.search({ index: 'messages', type: 'message', body, }) return hits } module.exports = { indexMessage, indexMessages, searchMessages }
910d1288adddd0c8dd500c1be5e488502c1ed335
localflavor/nl/forms.py
localflavor/nl/forms.py
"""NL-specific Form helpers.""" from __future__ import unicode_literals from django import forms from django.utils import six from .nl_provinces import PROVINCE_CHOICES from .validators import NLBSNFieldValidator, NLZipCodeFieldValidator class NLZipCodeField(forms.CharField): """A Dutch zip code field.""" default_validators = [NLZipCodeFieldValidator()] def clean(self, value): if isinstance(value, six.string_types): value = value.upper().replace(' ', '') if len(value) == 6: value = '%s %s' % (value[:4], value[4:]) return super(NLZipCodeField, self).clean(value) class NLProvinceSelect(forms.Select): """A Select widget that uses a list of provinces of the Netherlands as it's choices.""" def __init__(self, attrs=None): super(NLProvinceSelect, self).__init__(attrs, choices=PROVINCE_CHOICES) class NLBSNFormField(forms.CharField): """ A Dutch social security number (BSN) field. http://nl.wikipedia.org/wiki/Sofinummer .. versionadded:: 1.6 """ default_validators = [NLBSNFieldValidator()] def __init__(self, *args, **kwargs): kwargs['max_length'] = 9 super(NLBSNFormField, self).__init__(*args, **kwargs)
"""NL-specific Form helpers."""
from __future__ import unicode_literals

from django import forms
from django.utils import six

from .nl_provinces import PROVINCE_CHOICES
from .validators import NLBSNFieldValidator, NLZipCodeFieldValidator


class NLZipCodeField(forms.CharField):
    """A Dutch zip code field."""

    default_validators = [NLZipCodeFieldValidator()]

    def clean(self, value):
        if isinstance(value, six.string_types):
            value = value.upper().replace(' ', '')

            if len(value) == 6:
                value = '%s %s' % (value[:4], value[4:])

        return super(NLZipCodeField, self).clean(value)


class NLProvinceSelect(forms.Select):
    """A Select widget that uses a list of provinces of the Netherlands as it's choices."""

    def __init__(self, attrs=None):
        super(NLProvinceSelect, self).__init__(attrs, choices=PROVINCE_CHOICES)


class NLBSNFormField(forms.CharField):
    """
    A Dutch social security number (BSN) field.

    https://nl.wikipedia.org/wiki/Burgerservicenummer

    Note that you may only process the BSN if you have a legal basis to do so!

    .. versionadded:: 1.6
    """

    default_validators = [NLBSNFieldValidator()]

    def __init__(self, *args, **kwargs):
        kwargs['max_length'] = 9
        super(NLBSNFormField, self).__init__(*args, **kwargs)
Fix the wikipedia link and include a warning
Fix the wikipedia link and include a warning
Python
bsd-3-clause
django/django-localflavor,rsalmaso/django-localflavor
python
bb4701103101c698f6afdc05bce02a186d228d89
celery/views.py
celery/views.py
"""celery.views"""
from django.http import HttpResponse
from celery.task import is_done, delay_task
from celery.result import AsyncResult
from carrot.serialization import serialize as JSON_dump


def is_task_done(request, task_id):
    """Returns task execute status in JSON format."""
    response_data = {"task": {"id": task_id, "executed": is_done(task_id)}}
    return HttpResponse(JSON_dump(response_data))


def task_status(request, task_id):
    """Returns task status and result in JSON format."""
    async_result = AsyncResult(task_id)
    status = async_result.status
    if status == "FAILURE":
        response_data = {
            "id": task_id,
            "status": status,
            "result": async_result.result.args[0],
        }
    else:
        response_data = {
            "id": task_id,
            "status": status,
            "result": async_result.result,
        }
    return HttpResponse(JSON_dump({"task": response_data}))
"""celery.views"""
from django.http import HttpResponse
from celery.task import is_done, delay_task
from celery.result import AsyncResult
from carrot.serialization import serialize as JSON_dump


def is_task_done(request, task_id):
    """Returns task execute status in JSON format."""
    response_data = {"task": {"id": task_id, "executed": is_done(task_id)}}
    return HttpResponse(JSON_dump(response_data), mimetype="application/json")


def task_status(request, task_id):
    """Returns task status and result in JSON format."""
    async_result = AsyncResult(task_id)
    status = async_result.status
    if status == "FAILURE":
        response_data = {
            "id": task_id,
            "status": status,
            "result": async_result.result.args[0],
        }
    else:
        response_data = {
            "id": task_id,
            "status": status,
            "result": async_result.result,
        }
    return HttpResponse(JSON_dump({"task": response_data}),
                        mimetype="application/json")
Send back a mimetype for JSON response.
Send back a mimetype for JSON response.
Python
bsd-3-clause
frac/celery,ask/celery,frac/celery,mitsuhiko/celery,cbrepo/celery,cbrepo/celery,mitsuhiko/celery,ask/celery,WoLpH/celery,WoLpH/celery
python
7060e3f1b1e8bda4c96cdc4b0c84ae344ac81c76
Sketches/MPS/test/test_Selector.py
Sketches/MPS/test/test_Selector.py
import unittest
import sys; sys.path.append("../")
from Selector import Selector

if __name__=="__main__":
    unittest.main()
import unittest
import sys; sys.path.append("../")
from Selector import Selector

class SmokeTests_Selector(unittest.TestCase):
    def test_SmokeTest(self):
        """__init__ - Called with no arguments succeeds"""
        S = Selector()
        self.assert_(isinstance(S, Axon.Component.component))

if __name__=="__main__":
    unittest.main()
Add the most basic smoke test. We make a check that the resulting object is a minimal component at least.
Add the most basic smoke test. We make a check that the resulting object is a minimal component at least.
Python
apache-2.0
sparkslabs/kamaelia,sparkslabs/kamaelia,sparkslabs/kamaelia,sparkslabs/kamaelia,sparkslabs/kamaelia,sparkslabs/kamaelia,sparkslabs/kamaelia,sparkslabs/kamaelia,sparkslabs/kamaelia,sparkslabs/kamaelia
python
3ec71d3925a3551f6f25fc25e827c88caaff1fdd
tests/integration/test_redirection_external.py
tests/integration/test_redirection_external.py
"""Check external REDIRECTIONS"""

import pytest

from nikola import __main__

from .helper import append_config, cd
from .test_demo_build import prepare_demo_site
from .test_empty_build import (  # NOQA
    test_archive_exists,
    test_avoid_double_slash_in_rss,
    test_check_files,
    test_check_links,
    test_index_in_sitemap,
)


@pytest.fixture(scope="module")
def build(target_dir):
    """Fill the site with demo content and build it."""
    prepare_demo_site(target_dir)

    append_config(
        target_dir,
        """
REDIRECTIONS = [
    ("external.html", "http://www.example.com/"),
]
""",
    )

    with cd(target_dir):
        __main__.main(["build"])
"""Check external REDIRECTIONS"""

import os

import pytest

from nikola import __main__

from .helper import append_config, cd
from .test_demo_build import prepare_demo_site
from .test_empty_build import (  # NOQA
    test_archive_exists,
    test_avoid_double_slash_in_rss,
    test_check_files,
    test_check_links,
    test_index_in_sitemap,
)


def test_external_redirection(build, output_dir):
    ext_link = os.path.join(output_dir, 'external.html')

    assert os.path.exists(ext_link)
    with open(ext_link) as ext_link_fd:
        ext_link_content = ext_link_fd.read()

    redirect_tag = '<meta http-equiv="refresh" content="0; url=http://www.example.com/">'
    assert redirect_tag in ext_link_content


@pytest.fixture(scope="module")
def build(target_dir):
    """Fill the site with demo content and build it."""
    prepare_demo_site(target_dir)

    append_config(
        target_dir,
        """
REDIRECTIONS = [
    ("external.html", "http://www.example.com/"),
]
""",
    )

    with cd(target_dir):
        __main__.main(["build"])
Add test for external redirection.
Add test for external redirection.
Python
mit
okin/nikola,okin/nikola,okin/nikola,getnikola/nikola,getnikola/nikola,getnikola/nikola,okin/nikola,getnikola/nikola
python
9b1828dbdbd8bf83db734c6c7ae30edadf1337ee
tests/test-amp.php
tests/test-amp.php
<?php
/**
 * Tests for amp.php.
 *
 * @package AMP
 */

/**
 * Tests for amp.php.
 */
class Test_AMP extends WP_UnitTestCase {

	/**
	 * Tear down and clean up.
	 */
	public function tearDown() {
		parent::tearDown();
		remove_theme_support( 'amp' );
	}

	/**
	 * Test amp_is_canonical().
	 *
	 * @covers amp_is_canonical()
	 */
	public function test_amp_is_canonical() {
		remove_theme_support( 'amp' );
		$this->assertFalse( amp_is_canonical() );

		add_theme_support( 'amp' );
		$this->assertTrue( amp_is_canonical() );

		remove_theme_support( 'amp' );
		add_theme_support( 'amp', array(
			'template_path' => get_template_directory() . 'amp-templates/',
		) );
		$this->assertFalse( amp_is_canonical() );

		remove_theme_support( 'amp' );
		add_theme_support( 'amp', array(
			'custom_prop' => 'something',
		) );
		$this->assertTrue( amp_is_canonical() );
	}
}
<?php
/**
 * Tests for amp.php.
 *
 * @package AMP
 */

/**
 * Tests for amp.php.
 */
class Test_AMP extends WP_UnitTestCase {

	/**
	 * Tear down and clean up.
	 */
	public function tearDown() {
		parent::tearDown();
		remove_theme_support( 'amp' );
	}

	/**
	 * Test amp_is_canonical().
	 *
	 * @covers amp_is_canonical()
	 */
	public function test_amp_is_canonical() {
		remove_theme_support( 'amp' );
		$this->assertFalse( amp_is_canonical() );

		add_theme_support( 'amp' );
		$this->assertTrue( amp_is_canonical() );

		remove_theme_support( 'amp' );
		add_theme_support( 'amp', array(
			'template_dir' => 'amp-templates',
		) );
		$this->assertFalse( amp_is_canonical() );

		remove_theme_support( 'amp' );
		add_theme_support( 'amp', array(
			'custom_prop' => 'something',
		) );
		$this->assertTrue( amp_is_canonical() );
	}
}
Update test_amp_is_canonical() with new naming.
Update test_amp_is_canonical() with new naming.
PHP
apache-2.0
ampproject/amp-toolbox-php,ampproject/amp-toolbox-php
php
fcd346f7ef0fabbf61b555e7ba3cd0a1b6ea4528
geo-query.php
geo-query.php
<?php
/**
 * Plugin Name: Geo Query
 * Description: Modify the WP_Query to support the geo_query parameter. Uses the Haversine SQL implementation by Ollie Jones.
 * Plugin URI: https://github.com/birgire/geo-query
 * Author: Birgir Erlendsson (birgire)
 * Version: 0.0.1
 * Licence: MIT
 */

namespace Birgir\Geo;

/**
 * Autoload
 */
\add_action( 'plugins_loaded', function() {
    require __DIR__ . '/vendor/autoload.php';
});

/**
 * Init
 */
\add_action( 'init', function() {
    if( class_exists( __NAMESPACE__ . '\\GeoQueryContext' ) ) {
        $o = new GeoQueryContext();
        $o->setup( $GLOBALS['wpdb'] )->activate();
    }
});
<?php
/**
 * Plugin Name: Geo Query
 * Description: Modify the WP_Query to support the geo_query parameter. Uses the Haversine SQL implementation by Ollie Jones.
 * Plugin URI: https://github.com/birgire/geo-query
 * GitHub Plugin URI: https://github.com/birgire/geo-query.git
 * Author: Birgir Erlendsson (birgire)
 * Version: 0.0.1
 * Licence: MIT
 */

namespace Birgir\Geo;

/**
 * Autoload
 */
\add_action( 'plugins_loaded', function() {
    require __DIR__ . '/vendor/autoload.php';
});

/**
 * Init
 */
\add_action( 'init', function() {
    if( class_exists( __NAMESPACE__ . '\\GeoQueryContext' ) ) {
        $o = new GeoQueryContext();
        $o->setup( $GLOBALS['wpdb'] )->activate();
    }
});
Support for the GitHub Updater.
Added: Support for the GitHub Updater.
PHP
mit
birgire/geo-query,birgire/geo-query
php
346cd4a8f568004c05ba2dee08fd3164103d51b1
app/controllers/admin/user_registration_forms_controller.rb
app/controllers/admin/user_registration_forms_controller.rb
module Admin
  class UserRegistrationFormsController < AdminController
    before_action -> { require_scope('write:user_registration_forms') }

    permits :active_days

    def index
      @user_registration_forms = UserRegistrationForm.includes(admin: { user: { member: :avatar } }).all.reverse_order
    end

    def new
      @user_registration_form = current_user.admin.user_registration_forms.new
    end

    def create(user_registration_form)
      @user_registration_form = current_user.admin.user_registration_forms.new(user_registration_form)

      if @user_registration_form.save
        AdminActivityNotifyJob.perform_later(
          user: current_user,
          operation: '作成しました',
          object: @user_registration_form,
          detail: @user_registration_form.as_json,
          url: admin_user_registration_forms_url,
        )
        redirect_to admin_user_registration_forms_path, notice: "ID: #{@user_registration_form.id} を作成しました"
      else
        render :new, status: :unprocessable_entity
      end
    end

    def destroy(id)
      user_registration_form = UserRegistrationForm.find(id)
      user_registration_form.destroy!
      AdminActivityNotifyJob.perform_now(
        user: current_user,
        operation: '削除しました',
        object: user_registration_form,
        detail: user_registration_form.as_json,
      )
      redirect_to admin_user_registration_forms_path, notice: "ID: #{user_registration_form.id} を削除しました"
    end
  end
end
module Admin
  class UserRegistrationFormsController < AdminController
    before_action -> { require_scope('write:user_registration_forms') }

    permits :active_days

    def index
      @user_registration_forms = UserRegistrationForm.includes(admin: { user: { member: :avatar } }).all.reverse_order
    end

    def new
      @user_registration_form = current_user.admin.user_registration_forms.new
    end

    def create(user_registration_form)
      @user_registration_form = current_user.admin.user_registration_forms.new(user_registration_form)

      if @user_registration_form.save
        AdminActivityNotifyJob.perform_later(
          user: current_user,
          operation: '作成しました',
          object: @user_registration_form,
          detail: @user_registration_form.as_json(except: :token),
          url: admin_user_registration_forms_url,
        )
        redirect_to admin_user_registration_forms_path, notice: "ID: #{@user_registration_form.id} を作成しました"
      else
        render :new, status: :unprocessable_entity
      end
    end

    def destroy(id)
      user_registration_form = UserRegistrationForm.find(id)
      user_registration_form.destroy!
      AdminActivityNotifyJob.perform_now(
        user: current_user,
        operation: '削除しました',
        object: user_registration_form,
        detail: user_registration_form.as_json,
      )
      redirect_to admin_user_registration_forms_path, notice: "ID: #{user_registration_form.id} を削除しました"
    end
  end
end
Remove user_registration_forms.token value from Slack notification
Remove user_registration_forms.token value from Slack notification
Ruby
mit
sankichi92/LiveLog,sankichi92/LiveLog,sankichi92/LiveLog,sankichi92/LiveLog
ruby
15c3168595c364e935307fbb91385ec6d1a86edd
TWLight/users/templates/users/redesigned_my_library.html
TWLight/users/templates/users/redesigned_my_library.html
{% extends "new_base.html" %}
{% load i18n %}

{% block content %}
  {% include "header_partial_b4.html" %}
  <div class="alert alert-dark text-center" role="alert">
    We're in the process of redesigning the library - further changes will be made over the coming weeks!
    We would love to hear your thoughts and feedback on our
    <a href="https://meta.wikimedia.org/wiki/Talk:Library_Card_platform/Design_improvements" class="alert-link" target="_blank" rel="noopener">
      project page
    </a>
    .
  </div>
  <div class="row">
    <div class="col-lg-2 col-md-3">
      {% include "users/filter_section.html" %}
    </div>
    <div class="v-divider"></div>
    <div class="col-lg-9 col-md-8 col-sm-12">
      {% include "users/collections_section.html" %}
    </div>
  </div>
{% endblock %}

{% block javascript %}
  <script type="text/javascript">
    // Dynamically add classes to the django-filter generated labels
    labelLanguages = document.querySelector("[for=id_languages]")
    labelTags = document.querySelector("[for=id_tags]")

    labelLanguages.classList.add("collection-filter-label");
    labelTags.classList.add("collection-filter-label");
  </script>
{% endblock %}
{% extends "new_base.html" %}
{% load i18n %}

{% block content %}
  {% include "header_partial_b4.html" %}
  <div class="alert alert-dark text-center" role="alert">
    {% comment %}Translators: This message invites users to give feedback on the new My Library UI.{% endcomment %}
    {% blocktranslate trimmed %}
      We're in the process of redesigning the library - further changes will be made over the coming weeks!
      We would love to hear your thoughts and feedback on our
      <a href="https://meta.wikimedia.org/wiki/Talk:Library_Card_platform/Design_improvements" class="alert-link" target="_blank" rel="noopener">
        project page
      </a>
      .
    {% endblocktranslate %}
  </div>
  <div class="row">
    <div class="col-lg-2 col-md-3">
      {% include "users/filter_section.html" %}
    </div>
    <div class="v-divider"></div>
    <div class="col-lg-9 col-md-8 col-sm-12">
      {% include "users/collections_section.html" %}
    </div>
  </div>
{% endblock %}

{% block javascript %}
  <script type="text/javascript">
    // Dynamically add classes to the django-filter generated labels
    labelLanguages = document.querySelector("[for=id_languages]")
    labelTags = document.querySelector("[for=id_tags]")

    labelLanguages.classList.add("collection-filter-label");
    labelTags.classList.add("collection-filter-label");
  </script>
{% endblock %}
Add blocktranslate tag on banner
Add blocktranslate tag on banner
HTML
mit
WikipediaLibrary/TWLight,WikipediaLibrary/TWLight,WikipediaLibrary/TWLight,WikipediaLibrary/TWLight,WikipediaLibrary/TWLight
html
f79734300f55f791c0e08d4cdf92ba16663a7cb2
.travis.yml
.travis.yml
language: elixir
elixir:
  - 1.1
  - 1.2
  - 1.3
  - 1.4
  - 1.5
  - 1.6
  - 1.7
  - 1.8
sudo: false
before_script:
  - mix deps.get --only test
env:
  - MIX_ENV=test
script:
  - mix test
  - mix coveralls.travis
after_script:
  - mix deps.get --only docs
  - MIX_ENV=docs mix inch.report
language: elixir
elixir:
  - 1.1
  - 1.2
  - 1.3
  - 1.4
  - 1.5
  - 1.6
  - 1.7
  - 1.8
sudo: false
before_script:
  - mix deps.get --only test
env:
  - MIX_ENV=test
script:
  - mix coveralls.travis
after_script:
  - mix deps.get --only docs
  - MIX_ENV=docs mix inch.report
Remove redundant call to mix test
Remove redundant call to mix test
YAML
mit
beatrichartz/parallel_stream
yaml
0d40f465eaf0b5cdc92f042d8c7a1a7dbcf36074
fontello.js
fontello.js
/*global nodeca*/

"use strict";

var app = require('nlib').Application.create({
  name: 'fontomas',
  root: __dirname
});

//
// Preset application version
//

nodeca.runtime.version = require('./package.json').version;

//
// Catch unexpected exceptions
//

process.on('uncaughtException', function (err) {
  nodeca.logger.warn('Uncaught exception');
  nodeca.logger.error(err);
});

//
// Run application
//

app.run();
/*global nodeca*/

"use strict";

var app = require('nlib').Application.create({
  name: 'fontomas',
  root: __dirname
});

//
// Preset application version
//

nodeca.runtime.version = require('./package.json').version;

//
// Catch unexpected exceptions
//

process.on('uncaughtException', function (err) {
  try {
    nodeca.logger.warn('Uncaught exception');
    nodeca.logger.error(err);
  } catch (err) {
    // THIS SHOULD NEVER-EVER-EVER HAPPEN -- THIS IS A WORST CASE SCENARIO
    // USAGE: ./fontello.js 2>/var/log/fontello-cf.log
    console.error(err.stack || err.toString());
  }
});

//
// Run application
//

app.run();
Add try/catch into unhandled exception handler
Add try/catch into unhandled exception handler
JavaScript
mit
marcia-kubik/fontello,Sivalingaamorthy/fontello,modulexcite/fontello,SKing7/fontellomt,asrin475/fontello,cnbin/fontello,vinod1988/fontello,cnbin/fontello,liguangmings/fontello,CapeSepias/fontello,CapeSepias/fontello,fontello/fontello,Sivalingaamorthy/fontello,liguangmings/fontello,modulexcite/fontello,CapeSepias/fontello,blackgirl/fontello,katgironpe/fontello,vinod1988/fontello,katgironpe/fontello,bartuspan/fontello,fontello/fontello,katgironpe/fontello,marcia-kubik/fontello,asrin475/fontello,fontello/fontello,blackgirl/fontello,bartuspan/fontello,liguangmings/fontello,Sivalingaamorthy/fontello,blackgirl/fontello,marcia-kubik/fontello,SKing7/fontellomt,asrin475/fontello,cnbin/fontello,modulexcite/fontello,bartuspan/fontello,vinod1988/fontello
javascript
## Code Before: /*global nodeca*/ "use strict"; var app = require('nlib').Application.create({ name: 'fontomas', root: __dirname }); // // Preset application version // nodeca.runtime.version = require('./package.json').version; // // Catch unexpected exceptions // process.on('uncaughtException', function (err) { nodeca.logger.warn('Uncaught exception'); nodeca.logger.error(err); }); // // Run application // app.run(); ## Instruction: Add try/catch into unhandled exception handler ## Code After: /*global nodeca*/ "use strict"; var app = require('nlib').Application.create({ name: 'fontomas', root: __dirname }); // // Preset application version // nodeca.runtime.version = require('./package.json').version; // // Catch unexpected exceptions // process.on('uncaughtException', function (err) { try { nodeca.logger.warn('Uncaught exception'); nodeca.logger.error(err); } catch (err) { // THIS SHOULD NEVER-EVER-EVER HAPPEN -- THIS IS A WORST CASE SCENARIO // USAGE: ./fontello.js 2>/var/log/fontello-cf.log console.error(err.stack || err.toString()); } }); // // Run application // app.run();
68d547ec5f8cf5f52b17bd0f608187d64f41ac46
main.go
main.go
package main import ( "flag" "fmt" "net" "os" "time" "strings" ) var argvPort = flag.Int("port", 8117, "port to listen") var argvCandidates = flag.String("nodes", "", "comma separated list of nodes.") var argvRestBind = flag.String("rest", "127.0.0.1:8080", "Network address which will be bind to a restful service") func main() { flag.Parse() bindAddr := fmt.Sprintf("0.0.0.0:%v", *argvPort) fmt.Printf("Bind addr: %v\n", bindAddr) ln, err := net.Listen("tcp", bindAddr) if err != nil { fmt.Fprintf(os.Stderr, "%v\n", err) return } bully := NewBully(ln, nil) nodeAddr := strings.Split(*argvCandidates, ",") dialTimtout := 5 * time.Second for _, node := range nodeAddr { err := bully.AddCandidate(node, nil, dialTimtout) if err != nil { fmt.Fprintf(os.Stderr, "%v cannot be added: %v\n", node, err) } } web := NewWebAPI(bully) web.Run(*argvRestBind) }
package main import ( "flag" "fmt" "net" "os" "time" "strings" ) var argvPort = flag.Int("port", 8117, "port to listen") var argvCandidates = flag.String("nodes", "", "comma separated list of nodes.") var argvRestBind = flag.String("rest", "127.0.0.1:8080", "Network address which will be bind to a restful service") func main() { flag.Parse() bindAddr := fmt.Sprintf("0.0.0.0:%v", *argvPort) ln, err := net.Listen("tcp", bindAddr) if err != nil { fmt.Fprintf(os.Stderr, "%v\n", err) return } bully := NewBully(ln, nil) nodeAddr := strings.Split(*argvCandidates, ",") dialTimtout := 5 * time.Second for _, node := range nodeAddr { err := bully.AddCandidate(node, nil, dialTimtout) if err != nil { fmt.Fprintf(os.Stderr, "%v cannot be added: %v\n", node, err) } } fmt.Printf("My ID: %v\n", bully.MyId()) web := NewWebAPI(bully) web.Run(*argvRestBind) bully.Finalize() }
Print my id on start
[Mod] Print my id on start
Go
apache-2.0
monnand/bully
go
## Code Before: package main import ( "flag" "fmt" "net" "os" "time" "strings" ) var argvPort = flag.Int("port", 8117, "port to listen") var argvCandidates = flag.String("nodes", "", "comma separated list of nodes.") var argvRestBind = flag.String("rest", "127.0.0.1:8080", "Network address which will be bind to a restful service") func main() { flag.Parse() bindAddr := fmt.Sprintf("0.0.0.0:%v", *argvPort) fmt.Printf("Bind addr: %v\n", bindAddr) ln, err := net.Listen("tcp", bindAddr) if err != nil { fmt.Fprintf(os.Stderr, "%v\n", err) return } bully := NewBully(ln, nil) nodeAddr := strings.Split(*argvCandidates, ",") dialTimtout := 5 * time.Second for _, node := range nodeAddr { err := bully.AddCandidate(node, nil, dialTimtout) if err != nil { fmt.Fprintf(os.Stderr, "%v cannot be added: %v\n", node, err) } } web := NewWebAPI(bully) web.Run(*argvRestBind) } ## Instruction: [Mod] Print my id on start ## Code After: package main import ( "flag" "fmt" "net" "os" "time" "strings" ) var argvPort = flag.Int("port", 8117, "port to listen") var argvCandidates = flag.String("nodes", "", "comma separated list of nodes.") var argvRestBind = flag.String("rest", "127.0.0.1:8080", "Network address which will be bind to a restful service") func main() { flag.Parse() bindAddr := fmt.Sprintf("0.0.0.0:%v", *argvPort) ln, err := net.Listen("tcp", bindAddr) if err != nil { fmt.Fprintf(os.Stderr, "%v\n", err) return } bully := NewBully(ln, nil) nodeAddr := strings.Split(*argvCandidates, ",") dialTimtout := 5 * time.Second for _, node := range nodeAddr { err := bully.AddCandidate(node, nil, dialTimtout) if err != nil { fmt.Fprintf(os.Stderr, "%v cannot be added: %v\n", node, err) } } fmt.Printf("My ID: %v\n", bully.MyId()) web := NewWebAPI(bully) web.Run(*argvRestBind) bully.Finalize() }
38f67bfa5cbef74d47bb16e2bc54c5b59fab3578
packages/storybook/examples/Form.SwitchRow/BasicUsage.js
packages/storybook/examples/Form.SwitchRow/BasicUsage.js
import React from 'react'; import List from '@ichef/gypcrete/src/List'; import SwitchRow from '@ichef/gypcrete-form/src/SwitchRow'; const DESC = ` Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nunc risus risus, gravida in nisl ac, iaculis aliquam dui. Nunc dictum ipsum eu sapien lacinia, eu finibus nibh vestibulum. `; function BasicUsage() { return ( <List title="Switch rows"> <SwitchRow label="Module default state on iPad" asideOn="Turned on by default" asideOff="Turned off by default" desc={DESC} /> <SwitchRow disabled label="Disabled row" /> <SwitchRow readOnly label="Read-only row" /> <SwitchRow label="World peace" asideOn="There will be peace" asideOff="There will be war" status="error" errorMsg="Cannot declare a war." /> </List> ); } export default BasicUsage;
import React from 'react'; import List from '@ichef/gypcrete/src/List'; import SwitchRow from '@ichef/gypcrete-form/src/SwitchRow'; const DESC = ` Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nunc risus risus, gravida in nisl ac, iaculis aliquam dui. Nunc dictum ipsum eu sapien lacinia, eu finibus nibh vestibulum. `; function BasicUsage() { return ( <List title="Switch rows"> <SwitchRow label="Module default state on iPad" asideOn="Turned on by default" asideOff="Turned off by default" desc={DESC} /> <SwitchRow disabled label="Disabled row" /> <SwitchRow checked label="World peace" asideOn="There will be peace" asideOff="There will be war" status="error" errorMsg="Cannot declare a war." /> </List> ); } export default BasicUsage;
Remove *readonly* example for <SwitchRow>
Remove *readonly* example for <SwitchRow> Because setting 'readonly' on a checkbox input does not prevent user from checking/unchecking on it. It does not make sense to include that in examples for now.
JavaScript
apache-2.0
iCHEF/gypcrete,iCHEF/gypcrete
javascript
## Code Before: import React from 'react'; import List from '@ichef/gypcrete/src/List'; import SwitchRow from '@ichef/gypcrete-form/src/SwitchRow'; const DESC = ` Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nunc risus risus, gravida in nisl ac, iaculis aliquam dui. Nunc dictum ipsum eu sapien lacinia, eu finibus nibh vestibulum. `; function BasicUsage() { return ( <List title="Switch rows"> <SwitchRow label="Module default state on iPad" asideOn="Turned on by default" asideOff="Turned off by default" desc={DESC} /> <SwitchRow disabled label="Disabled row" /> <SwitchRow readOnly label="Read-only row" /> <SwitchRow label="World peace" asideOn="There will be peace" asideOff="There will be war" status="error" errorMsg="Cannot declare a war." /> </List> ); } export default BasicUsage; ## Instruction: Remove *readonly* example for <SwitchRow> Because setting 'readonly' on a checkbox input does not prevent user from checking/unchecking on it. It does not make sense to include that in examples for now. ## Code After: import React from 'react'; import List from '@ichef/gypcrete/src/List'; import SwitchRow from '@ichef/gypcrete-form/src/SwitchRow'; const DESC = ` Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nunc risus risus, gravida in nisl ac, iaculis aliquam dui. Nunc dictum ipsum eu sapien lacinia, eu finibus nibh vestibulum. `; function BasicUsage() { return ( <List title="Switch rows"> <SwitchRow label="Module default state on iPad" asideOn="Turned on by default" asideOff="Turned off by default" desc={DESC} /> <SwitchRow disabled label="Disabled row" /> <SwitchRow checked label="World peace" asideOn="There will be peace" asideOff="There will be war" status="error" errorMsg="Cannot declare a war." /> </List> ); } export default BasicUsage;
fcbaf01f45cddffb0d770038d6838a6bef6073cb
devops/QUESTIONS.md
devops/QUESTIONS.md
- **What is HAProxy and what is an example of why you would use it?** - **What is BOSH and what is an example of why you would use it?** ## Answered - **What is virtualisation?** _Answered by @LewisCowper_ Virtualisation is (most often in our cases) a way of running one, or many, self contained instances of a full operating system + associated software, on one physical host. We often use it in web development because one physical computer has the power to run multiple web servers (as a for example), so you can pay for one whole computer, and use virtualisation to achieve self contained instances that can power many different web sites. They are also (about 99% of the time) portable to another environment. As an example, if you paid for the use of one server, from Amazon, or Digital Ocean, or whatever, and ran 10 websites on that one server, in 10 virtual machines. If one of those websites suddenly started getting a lot of traffic, you could create new virtual machines on other servers that just ran that website, and you could scale that way.
- **What is HAProxy and what is an example of why you would use it?** - **What is BOSH and what is an example of why you would use it?** - **What is (a) Cloud Foundry** ## Answered - **What is virtualisation?** _Answered by @LewisCowper_ Virtualisation is (most often in our cases) a way of running one, or many, self contained instances of a full operating system + associated software, on one physical host. We often use it in web development because one physical computer has the power to run multiple web servers (as a for example), so you can pay for one whole computer, and use virtualisation to achieve self contained instances that can power many different web sites. They are also (about 99% of the time) portable to another environment. As an example, if you paid for the use of one server, from Amazon, or Digital Ocean, or whatever, and ran 10 websites on that one server, in 10 virtual machines. If one of those websites suddenly started getting a lot of traffic, you could create new virtual machines on other servers that just ran that website, and you could scale that way.
Add cloud foundry Q to devops/qs
Add cloud foundry Q to devops/qs
Markdown
mit
Charlotteis/notes,Charlotteis/notes
markdown
## Code Before: - **What is HAProxy and what is an example of why you would use it?** - **What is BOSH and what is an example of why you would use it?** ## Answered - **What is virtualisation?** _Answered by @LewisCowper_ Virtualisation is (most often in our cases) a way of running one, or many, self contained instances of a full operating system + associated software, on one physical host. We often use it in web development because one physical computer has the power to run multiple web servers (as a for example), so you can pay for one whole computer, and use virtualisation to achieve self contained instances that can power many different web sites. They are also (about 99% of the time) portable to another environment. As an example, if you paid for the use of one server, from Amazon, or Digital Ocean, or whatever, and ran 10 websites on that one server, in 10 virtual machines. If one of those websites suddenly started getting a lot of traffic, you could create new virtual machines on other servers that just ran that website, and you could scale that way. ## Instruction: Add cloud foundry Q to devops/qs ## Code After: - **What is HAProxy and what is an example of why you would use it?** - **What is BOSH and what is an example of why you would use it?** - **What is (a) Cloud Foundry** ## Answered - **What is virtualisation?** _Answered by @LewisCowper_ Virtualisation is (most often in our cases) a way of running one, or many, self contained instances of a full operating system + associated software, on one physical host. We often use it in web development because one physical computer has the power to run multiple web servers (as a for example), so you can pay for one whole computer, and use virtualisation to achieve self contained instances that can power many different web sites. They are also (about 99% of the time) portable to another environment. As an example, if you paid for the use of one server, from Amazon, or Digital Ocean, or whatever, and ran 10 websites on that one server, in 10 virtual machines. If one of those websites suddenly started getting a lot of traffic, you could create new virtual machines on other servers that just ran that website, and you could scale that way.
eefc1afce4e6cdc56c72d0721d71e93ea118c1ae
vim/custom/ale.vim
vim/custom/ale.vim
let g:ale_sign_column_always = 1 let g:ale_fix_on_save = 0 let g:airline#extensions#ale#enabled = 1 nmap <silent> <C-k> <Plug>(ale_previous_wrap) nmap <silent> <C-j> <Plug>(ale_next_wrap) let g:ale_lint_on_text_changed = 'never' let g:ale_lint_on_insert_leave = 0 let g:ale_lint_on_enter = 0 let g:ale_set_loclist = 0 let g:ale_set_quickfix = 1 let g:ale_pattern_options_enabled = 1 let g:ale_open_list = 1 let g:ale_list_window_size = 5 let g:ale_fixers = { \ '*': ['remove_trailing_lines', 'trim_whitespace'], \ 'vim': ['vint'], \ 'javascript': ['eslint'], \ 'sh': ['shfmt'], \} let g:ale_linters = { \ 'go': ['gofmt'], \} " Enable eslint for ale let g:ale_pattern_options = { \ '\.min\.js$': {'ale_linters': [], 'ale_fixers': []}, \ '\.min\.css$': {'ale_linters': [], 'ale_fixers': []}, \}
let g:ale_sign_column_always = 1 let g:ale_fix_on_save = 1 let g:airline#extensions#ale#enabled = 1 nmap <silent> <C-k> <Plug>(ale_previous_wrap) nmap <silent> <C-j> <Plug>(ale_next_wrap) " Disable ALE on save: " let g:ale_lint_on_text_changed = 'never' " let g:ale_lint_on_insert_leave = 0 " let g:ale_lint_on_enter = 0 let g:ale_set_loclist = 0 let g:ale_set_quickfix = 1 let g:ale_pattern_options_enabled = 1 let g:ale_open_list = 1 let g:ale_list_window_size = 5 let g:ale_fixers = { \ '*': ['remove_trailing_lines', 'trim_whitespace'], \ 'vim': ['vint'], \ 'javascript': ['eslint'], \ 'sh': ['shfmt'], \} let g:ale_linters = { \ 'go': ['gofmt'], \} " Enable eslint for ale let g:ale_pattern_options = { \ '\.min\.js$': {'ale_linters': [], 'ale_fixers': []}, \ '\.min\.css$': {'ale_linters': [], 'ale_fixers': []}, \}
Enable ALE linting on save
Enable ALE linting on save
VimL
mit
faun/dotfiles,faun/dotfiles
viml
## Code Before: let g:ale_sign_column_always = 1 let g:ale_fix_on_save = 0 let g:airline#extensions#ale#enabled = 1 nmap <silent> <C-k> <Plug>(ale_previous_wrap) nmap <silent> <C-j> <Plug>(ale_next_wrap) let g:ale_lint_on_text_changed = 'never' let g:ale_lint_on_insert_leave = 0 let g:ale_lint_on_enter = 0 let g:ale_set_loclist = 0 let g:ale_set_quickfix = 1 let g:ale_pattern_options_enabled = 1 let g:ale_open_list = 1 let g:ale_list_window_size = 5 let g:ale_fixers = { \ '*': ['remove_trailing_lines', 'trim_whitespace'], \ 'vim': ['vint'], \ 'javascript': ['eslint'], \ 'sh': ['shfmt'], \} let g:ale_linters = { \ 'go': ['gofmt'], \} " Enable eslint for ale let g:ale_pattern_options = { \ '\.min\.js$': {'ale_linters': [], 'ale_fixers': []}, \ '\.min\.css$': {'ale_linters': [], 'ale_fixers': []}, \} ## Instruction: Enable ALE linting on save ## Code After: let g:ale_sign_column_always = 1 let g:ale_fix_on_save = 1 let g:airline#extensions#ale#enabled = 1 nmap <silent> <C-k> <Plug>(ale_previous_wrap) nmap <silent> <C-j> <Plug>(ale_next_wrap) " Disable ALE on save: " let g:ale_lint_on_text_changed = 'never' " let g:ale_lint_on_insert_leave = 0 " let g:ale_lint_on_enter = 0 let g:ale_set_loclist = 0 let g:ale_set_quickfix = 1 let g:ale_pattern_options_enabled = 1 let g:ale_open_list = 1 let g:ale_list_window_size = 5 let g:ale_fixers = { \ '*': ['remove_trailing_lines', 'trim_whitespace'], \ 'vim': ['vint'], \ 'javascript': ['eslint'], \ 'sh': ['shfmt'], \} let g:ale_linters = { \ 'go': ['gofmt'], \} " Enable eslint for ale let g:ale_pattern_options = { \ '\.min\.js$': {'ale_linters': [], 'ale_fixers': []}, \ '\.min\.css$': {'ale_linters': [], 'ale_fixers': []}, \}
d9a8be026f92beacda9d622336378b9a19145e83
lib/helpers/generate-svgo-config.js
lib/helpers/generate-svgo-config.js
const svgo = require('svgo'); const { omit, concat, uniqBy } = require('lodash'); const { merge } = require('webpack-merge'); module.exports = (options, pre = [], post = []) => { try { // The preset-default plugin is only available since SVGO 2.4.0 svgo.optimize('', { plugins: [{ name: 'preset-default' }] }); return merge({}, omit(options, ['plugins']), { plugins: [{ name: 'preset-default', params: { overrides: uniqBy(concat(pre, options.plugins, post).reverse(), 'name').reduce((overrides, plugin) => ({ ...overrides, [plugin.name]: plugin.active !== false ? plugin.params : false }), {}) } }, ...options.plugins] }); } catch (error) { // Fall back to extendDefaultPlugins which is deprecated since 2.4.0 return merge({}, omit(options, ['plugins']), { plugins: uniqBy(concat(pre, svgo.extendDefaultPlugins(options.plugins), post).reverse(), 'name') }); } }
const svgo = require('svgo'); const { omit, concat, uniqBy } = require('lodash'); const { merge } = require('webpack-merge'); module.exports = (options, pre = [], post = []) => { try { // The preset-default plugin is only available since SVGO 2.4.0 svgo.optimize('', { plugins: [{ name: 'preset-default' }] }); const names = concat(pre, post).map((plugin) => plugin.name); return merge({}, omit(options, ['plugins']), { plugins: [{ name: 'preset-default', params: { overrides: uniqBy(concat(pre, options.plugins, post).reverse(), 'name').reduce((overrides, plugin) => ({ ...overrides, [plugin.name]: plugin.active !== false ? plugin.params : false }), {}) } }, ...options.plugins.filter((plugin) => { return !names.includes(plugin.name); })] }); } catch (error) { // Fall back to extendDefaultPlugins which is deprecated since 2.4.0 return merge({}, omit(options, ['plugins']), { plugins: uniqBy(concat(pre, svgo.extendDefaultPlugins(options.plugins), post).reverse(), 'name') }); } }
Fix pre/post plugins for generateSVGOConfig helper not being overwritten properly
Fix pre/post plugins for generateSVGOConfig helper not being overwritten properly
JavaScript
mit
freshheads/svg-spritemap-webpack-plugin,cascornelissen/svg-spritemap-webpack-plugin
javascript
## Code Before: const svgo = require('svgo'); const { omit, concat, uniqBy } = require('lodash'); const { merge } = require('webpack-merge'); module.exports = (options, pre = [], post = []) => { try { // The preset-default plugin is only available since SVGO 2.4.0 svgo.optimize('', { plugins: [{ name: 'preset-default' }] }); return merge({}, omit(options, ['plugins']), { plugins: [{ name: 'preset-default', params: { overrides: uniqBy(concat(pre, options.plugins, post).reverse(), 'name').reduce((overrides, plugin) => ({ ...overrides, [plugin.name]: plugin.active !== false ? plugin.params : false }), {}) } }, ...options.plugins] }); } catch (error) { // Fall back to extendDefaultPlugins which is deprecated since 2.4.0 return merge({}, omit(options, ['plugins']), { plugins: uniqBy(concat(pre, svgo.extendDefaultPlugins(options.plugins), post).reverse(), 'name') }); } } ## Instruction: Fix pre/post plugins for generateSVGOConfig helper not being overwritten properly ## Code After: const svgo = require('svgo'); const { omit, concat, uniqBy } = require('lodash'); const { merge } = require('webpack-merge'); module.exports = (options, pre = [], post = []) => { try { // The preset-default plugin is only available since SVGO 2.4.0 svgo.optimize('', { plugins: [{ name: 'preset-default' }] }); const names = concat(pre, post).map((plugin) => plugin.name); return merge({}, omit(options, ['plugins']), { plugins: [{ name: 'preset-default', params: { overrides: uniqBy(concat(pre, options.plugins, post).reverse(), 'name').reduce((overrides, plugin) => ({ ...overrides, [plugin.name]: plugin.active !== false ? plugin.params : false }), {}) } }, ...options.plugins.filter((plugin) => { return !names.includes(plugin.name); })] }); } catch (error) { // Fall back to extendDefaultPlugins which is deprecated since 2.4.0 return merge({}, omit(options, ['plugins']), { plugins: uniqBy(concat(pre, svgo.extendDefaultPlugins(options.plugins), post).reverse(), 'name') }); } }
3d14f152f0ef0e2658fd8fc0cfe6fba0f5948147
.travis.yml
.travis.yml
language: python sudo: false env: matrix: - LUA="lua 5.1" - LUA="lua 5.2" - LUA="lua 5.3" - LUA="luajit 2.0" - LUA="luajit 2.1" before_install: - pip install hererocks - hererocks here -r^ --$LUA - export PATH=$PATH:$PWD/here/bin - eval `luarocks path --bin` install: - luarocks make script: - lua test.lua
language: python sudo: false env: matrix: - LUA=lua V=5.1 - LUA=lua V=5.2 - LUA=lua V=5.3 before_install: - pip install hererocks - hererocks here -r^ --$LUA $V - export PATH=$PATH:$PWD/here/bin - eval `luarocks path --bin` install: - luarocks make script: - $LUA test.lua
Use separate variables for $LUA and version
Travis-CI: Use separate variables for $LUA and version
YAML
lgpl-2.1
aperezdc/lua-shelve
yaml
## Code Before: language: python sudo: false env: matrix: - LUA="lua 5.1" - LUA="lua 5.2" - LUA="lua 5.3" - LUA="luajit 2.0" - LUA="luajit 2.1" before_install: - pip install hererocks - hererocks here -r^ --$LUA - export PATH=$PATH:$PWD/here/bin - eval `luarocks path --bin` install: - luarocks make script: - lua test.lua ## Instruction: Travis-CI: Use separate variables for $LUA and version ## Code After: language: python sudo: false env: matrix: - LUA=lua V=5.1 - LUA=lua V=5.2 - LUA=lua V=5.3 before_install: - pip install hererocks - hererocks here -r^ --$LUA $V - export PATH=$PATH:$PWD/here/bin - eval `luarocks path --bin` install: - luarocks make script: - $LUA test.lua
ce206a6dcff7b1f71facf9c5837bf48d54c23a99
install.sh
install.sh
repoName="dotfiles" repoDir="$HOME/$repoName" msg() { printf '%b\n' "$1" >&2 } result() { if [ "$1" -eq '0' ]; then [ -z "$2" ] || msg "\e[32m$2\e[0m" else [ -z "$3" ] || msg "\e[31m$3\e[0m" [ "$4" = ! ] && exit 1 fi } ProgramExists() { ret='0' type $1 >/dev/null 2>&1 || ret='1' result "$ret" "" "To continue you first need to install $1." ! } # Check path ProgramExists "vim" ProgramExists "git"
repoName="dotfiles" repoDir="$HOME/$repoName" msg() { printf '%b\n' "$1" >&2 } title() { msg "\n---------------------------------------------------" msg "> $1\n" } result() { if [ "$1" -eq '0' ]; then [ -z "$2" ] || msg "\e[32m$2\e[0m" else [ -z "$3" ] || msg "\e[31m$3\e[0m" [ "$4" = ! ] && exit 1 fi } ProgramExists() { ret='0' type $1 >/dev/null 2>&1 || ret='1' result "$ret" "" "To continue you first need to install $1." ! } UpdateRepo() { title "Trying to update $1..." cd "$2" && git pull origin master result "$?" "Successfully updated $1" "Failed to update $1" } CloneRepo() { if [ ! -e "$2/.git" ]; then title "Clonning ${1}..." git clone "$3" "$2" result "$?" "Successfully cloned $1" "Failed to clone $1" else UpdateRepo "$1" "$2" fi } # Check path ProgramExists "vim" ProgramExists "git" # Clone/update dotfiles repository CloneRepo "$repoName" "$repoDir" 'http://github.com/Tunous/dotfiles.git'
Add git repository cloning/updating functions
Add git repository cloning/updating functions
Shell
mit
Tunous/dotfiles
shell
## Code Before: repoName="dotfiles" repoDir="$HOME/$repoName" msg() { printf '%b\n' "$1" >&2 } result() { if [ "$1" -eq '0' ]; then [ -z "$2" ] || msg "\e[32m$2\e[0m" else [ -z "$3" ] || msg "\e[31m$3\e[0m" [ "$4" = ! ] && exit 1 fi } ProgramExists() { ret='0' type $1 >/dev/null 2>&1 || ret='1' result "$ret" "" "To continue you first need to install $1." ! } # Check path ProgramExists "vim" ProgramExists "git" ## Instruction: Add git repository cloning/updating functions ## Code After: repoName="dotfiles" repoDir="$HOME/$repoName" msg() { printf '%b\n' "$1" >&2 } title() { msg "\n---------------------------------------------------" msg "> $1\n" } result() { if [ "$1" -eq '0' ]; then [ -z "$2" ] || msg "\e[32m$2\e[0m" else [ -z "$3" ] || msg "\e[31m$3\e[0m" [ "$4" = ! ] && exit 1 fi } ProgramExists() { ret='0' type $1 >/dev/null 2>&1 || ret='1' result "$ret" "" "To continue you first need to install $1." ! } UpdateRepo() { title "Trying to update $1..." cd "$2" && git pull origin master result "$?" "Successfully updated $1" "Failed to update $1" } CloneRepo() { if [ ! -e "$2/.git" ]; then title "Clonning ${1}..." git clone "$3" "$2" result "$?" "Successfully cloned $1" "Failed to clone $1" else UpdateRepo "$1" "$2" fi } # Check path ProgramExists "vim" ProgramExists "git" # Clone/update dotfiles repository CloneRepo "$repoName" "$repoDir" 'http://github.com/Tunous/dotfiles.git'
102afb57bcf4798cd466001b9aaeec1bf1a58577
molecule/upgrade/side_effect.yml
molecule/upgrade/side_effect.yml
--- - name: Perform apt upgrades hosts: securedrop become: yes tasks: - name: Perform safe upgrade apt: update_cache: yes upgrade: yes
--- - name: Perform apt upgrades hosts: securedrop become: yes tasks: - name: Perform safe upgrade apt: update_cache: yes upgrade: yes - name: Lay out app testing deps hosts: securedrop_application_server max_fail_percentage: 0 any_errors_fatal: yes roles: - role: app-test tags: app-test tasks: - name: Reset database command: ./manage.py reset args: chdir: /var/www/securedrop - name: Slap in latest create-dev-data script copy: src: ../../securedrop/create-dev-data.py dest: /var/www/securedrop/create-dev-data.py mode: 0555 - name: Insert journalist test user command: /var/www/securedrop/create-dev-data.py --staging args: chdir: /var/www/securedrop become: yes
Add functionality to prepare boxes for functional testing
Add functionality to prepare boxes for functional testing Typically these actions were done manually but lets get our good old friend ansible to run them for us (at least under the upgrade env).
YAML
agpl-3.0
conorsch/securedrop,conorsch/securedrop,conorsch/securedrop,heartsucker/securedrop,heartsucker/securedrop,ehartsuyker/securedrop,ehartsuyker/securedrop,heartsucker/securedrop,heartsucker/securedrop,ehartsuyker/securedrop,conorsch/securedrop,ehartsuyker/securedrop,conorsch/securedrop,heartsucker/securedrop,ehartsuyker/securedrop,ehartsuyker/securedrop
yaml
## Code Before: --- - name: Perform apt upgrades hosts: securedrop become: yes tasks: - name: Perform safe upgrade apt: update_cache: yes upgrade: yes ## Instruction: Add functionality to prepare boxes for functional testing Typically these actions were done manually but lets get our good old friend ansible to run them for us (at least under the upgrade env). ## Code After: --- - name: Perform apt upgrades hosts: securedrop become: yes tasks: - name: Perform safe upgrade apt: update_cache: yes upgrade: yes - name: Lay out app testing deps hosts: securedrop_application_server max_fail_percentage: 0 any_errors_fatal: yes roles: - role: app-test tags: app-test tasks: - name: Reset database command: ./manage.py reset args: chdir: /var/www/securedrop - name: Slap in latest create-dev-data script copy: src: ../../securedrop/create-dev-data.py dest: /var/www/securedrop/create-dev-data.py mode: 0555 - name: Insert journalist test user command: /var/www/securedrop/create-dev-data.py --staging args: chdir: /var/www/securedrop become: yes
454c76e0f1181e21d93f9ff9614b0e27a846da88
index.js
index.js
'use strict'; // Electron doesn't support notifications in Windows yet. https://github.com/atom/electron/issues/262 // So we hijack the Notification API.' const ipc = require('electron').ipcRenderer; module.exports = () => { const OldNotification = Notification; Notification = function (title, options) { // Send this to main thread. // Catch it in your main 'app' instance with `ipc.on`. // Then send it back to the view, if you want, with `event.returnValue` or `event.sender.send()`. ipc.send('notification-shim', { title, options }); // Send the native Notification. // You can't catch it, that's why we're doing all of this. :) return new OldNotification(title, options); }; Notification.prototype = OldNotification.prototype; Notification.permission = OldNotification.permission; Notification.requestPermission = OldNotification.requestPermission; };
'use strict'; // Electron doesn't automatically show notifications in Windows yet, and it's not easy to polyfill. // So we have to hijack the Notification API. let ipc; try { // Using electron >=0.35 ipc = require('electron').ipcRenderer; } catch (e) { // Assume it's electron <0.35 ipc = require('ipc'); } module.exports = () => { const OldNotification = Notification; Notification = function (title, options) { // Send this to main thread. // Catch it in your main 'app' instance with `ipc.on`. // Then send it back to the view, if you want, with `event.returnValue` or `event.sender.send()`. ipc.send('notification-shim', { title, options }); // Send the native Notification. // You can't catch it, that's why we're doing all of this. :) return new OldNotification(title, options); }; Notification.prototype = OldNotification.prototype; Notification.permission = OldNotification.permission; Notification.requestPermission = OldNotification.requestPermission; };
Add fallback for electron <0.35
Add fallback for electron <0.35
JavaScript
mit
seriema/electron-notification-shim
javascript
## Code Before: 'use strict'; // Electron doesn't support notifications in Windows yet. https://github.com/atom/electron/issues/262 // So we hijack the Notification API.' const ipc = require('electron').ipcRenderer; module.exports = () => { const OldNotification = Notification; Notification = function (title, options) { // Send this to main thread. // Catch it in your main 'app' instance with `ipc.on`. // Then send it back to the view, if you want, with `event.returnValue` or `event.sender.send()`. ipc.send('notification-shim', { title, options }); // Send the native Notification. // You can't catch it, that's why we're doing all of this. :) return new OldNotification(title, options); }; Notification.prototype = OldNotification.prototype; Notification.permission = OldNotification.permission; Notification.requestPermission = OldNotification.requestPermission; }; ## Instruction: Add fallback for electron <0.35 ## Code After: 'use strict'; // Electron doesn't automatically show notifications in Windows yet, and it's not easy to polyfill. // So we have to hijack the Notification API. let ipc; try { // Using electron >=0.35 ipc = require('electron').ipcRenderer; } catch (e) { // Assume it's electron <0.35 ipc = require('ipc'); } module.exports = () => { const OldNotification = Notification; Notification = function (title, options) { // Send this to main thread. // Catch it in your main 'app' instance with `ipc.on`. // Then send it back to the view, if you want, with `event.returnValue` or `event.sender.send()`. ipc.send('notification-shim', { title, options }); // Send the native Notification. // You can't catch it, that's why we're doing all of this. :) return new OldNotification(title, options); }; Notification.prototype = OldNotification.prototype; Notification.permission = OldNotification.permission; Notification.requestPermission = OldNotification.requestPermission; };
1f8119ef311911c3bc4567a198abd0237db6710d
examples/rest_basic_auth/README.md
examples/rest_basic_auth/README.md
Basic authorization example using REST
======================================

To try this example, you need GNU `make`, `git` and
[relx](https://github.com/erlware/relx) in your PATH.

To build the example, run the following command:

``` bash
$ make
```

To start the release in the foreground:

``` bash
$ ./_rel/bin/hello_world_example console
```

Then point your browser at [http://localhost:8080](http://localhost:8080).

Example output
--------------

Request with no authentication:

``` bash
$ curl -i http://localhost:8080
HTTP/1.1 401 Unauthorized
connection: keep-alive
server: Cowboy
date: Sun, 20 Jan 2013 14:10:27 GMT
content-length: 0
www-authenticate: Basic realm="cowboy"
```

Request with authentication:

``` bash
$ curl -i -u "Alladin:open sesame" http://localhost:8080
HTTP/1.1 200 OK
connection: keep-alive
server: Cowboy
date: Sun, 20 Jan 2013 14:11:12 GMT
content-length: 16
content-type: text/plain

Hello, Alladin!
```
Basic authorization example using REST
======================================

To try this example, you need GNU `make`, `git` and
[relx](https://github.com/erlware/relx) in your PATH.

To build the example, run the following command:

``` bash
$ make
```

To start the release in the foreground:

``` bash
$ ./_rel/bin/rest_basic_auth_example console
```

Then point your browser at [http://localhost:8080](http://localhost:8080).

Example output
--------------

Request with no authentication:

``` bash
$ curl -i http://localhost:8080
HTTP/1.1 401 Unauthorized
connection: keep-alive
server: Cowboy
date: Sun, 20 Jan 2013 14:10:27 GMT
content-length: 0
www-authenticate: Basic realm="cowboy"
```

Request with authentication:

``` bash
$ curl -i -u "Alladin:open sesame" http://localhost:8080
HTTP/1.1 200 OK
connection: keep-alive
server: Cowboy
date: Sun, 20 Jan 2013 14:11:12 GMT
content-length: 16
content-type: text/plain

Hello, Alladin!
```
Fix the command to start the release in rest_basic_auth example
Fix the command to start the release in rest_basic_auth example
Markdown
isc
essen/cowboy,ninenines/cowboy,odo/cowboy,MSch/cowboy,bobos/ecweb,manastech/cowboy,lpgauth/cowboy,unisontech/cowboy,acrispin/cowboy,nviennot/cowboy,276361270/cowboy,cad627/cowboy,rabbitmq/cowboy,layerhq/cowboy,paulperegud/cowboy,K2InformaticsGmbH/cowboy,venkateshreddy/cowboy,CrankWheel/cowboy,sile/cowboy,yangchengjian/cowboy,grahamrhay/cowboy,yangchengjian/cowboy,hairyhum/cowboy,dangjun625/cowboy,linearregression/cowboyku,callsign-core/cowboy_hybi00,binarin/cowboy,callsign-core/cowboy_hybi00,omusico/cowboy,manastech/cowboy,turtleDeng/cowboy,unisontech/cowboy,noname007/cowboy,ferd/cowboyku,pap/cowboy,heroku/cowboyku,bsmr-erlang/cowboy,kivra/cowboy,Reimerei/cowboy,essen/cowboy,lpgauth/cowboy,binarin/cowboy,cad627/cowboy,imdeps/cowboy,layerhq/cowboy
markdown
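For readers reproducing the README's curl exchange by hand: curl's `-u` flag simply base64-encodes `user:password` into a Basic `Authorization` header. A minimal sketch using the example's credentials — this is generic HTTP (RFC 7617), not code from the Cowboy example:

```python
import base64

def basic_auth_header(user, password):
    """Build the value of an HTTP Basic `Authorization` header."""
    token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

# Same credentials as the curl example above.
print(basic_auth_header("Alladin", "open sesame"))
# → Basic QWxsYWRpbjpvcGVuIHNlc2FtZQ==
```

Sending that string as the `Authorization` request header is equivalent to what `curl -u` does before the server replies 200 instead of 401.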
0cffb8a79ec59796cf1c1d2780b033df696c9421
optional-dependencies.txt
optional-dependencies.txt
clarifai
duecredit
face_recognition
python-twitter
gensim
google-api-python-client
google-compute-engine
librosa>=0.6.3
matplotlib
opencv-python
pathos
pygraphviz
pysrt
pytesseract
python-twitter
scikit-learn
seaborn
soundfile
spacy
SpeechRecognition>=3.6.0
tensorflow>=1.0.0
torch
transformers
xlrd
rev_ai
clarifai
duecredit
face_recognition
python-twitter
gensim
google-api-python-client
google-compute-engine
librosa>=0.6.3
numba<=0.48
matplotlib
opencv-python
pathos
pygraphviz
pysrt
pytesseract
python-twitter
scikit-learn
seaborn
soundfile
spacy
SpeechRecognition>=3.6.0
tensorflow>=1.0.0
torch
transformers
xlrd
rev_ai
Set numba to 0.48 or less
Set numba to 0.48 or less
Text
bsd-3-clause
tyarkoni/featureX,tyarkoni/pliers
text
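The pin added by this commit, `numba<=0.48`, follows the usual `name` / `specifier` / `version` shape of requirements lines. Splitting such lines can be sketched as below — a simplification for illustration, not a full PEP 508 parser (real tooling should use the `packaging` library):

```python
import re

def split_requirement(line):
    """Split a simple requirement like 'numba<=0.48' into its parts."""
    m = re.match(r"^([A-Za-z0-9_.-]+)\s*(<=|>=|==|<|>)?\s*(\S+)?$", line.strip())
    if not m:
        raise ValueError(f"unparseable requirement: {line!r}")
    return m.groups()  # (name, operator, version); the last two may be None

print(split_requirement("numba<=0.48"))  # → ('numba', '<=', '0.48')
print(split_requirement("matplotlib"))   # → ('matplotlib', None, None)
```

Lines with no specifier (like most entries in the file above) come back with `None` for the operator and version.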
8cefbf836fe7a76fce9bff0174acabf5a9d5cd00
lib/mr_poole/cli.rb
lib/mr_poole/cli.rb
require 'getoptlong'

module MrPoole
  class CLI

    def initialize(args)
      @helper = Helper.new
      @helper.ensure_jekyll_dir
      @params = args
      @tasks = Tasks.new
    end

    def execute(action)
      case action
      when 'post' then handle_post
      when 'draft' then handle_draft
      when 'publish' then handle_publish
      when 'unpublish' then handle_unpublish
      else @helper.gen_usage
      end
    end

    def handle_post
      opts = GetoptLong.new(
        ['--slug', '-s', GetoptLong::REQUIRED_ARGUMENT],
        ['--title', '-t', GetoptLong::REQUIRED_ARGUMENT],
      )

      slug = nil
      title = nil
      opts.each do |opt, arg|
        case opt
        when '--slug' then slug = arg
        when '--title' then title = arg
        end
      end

      title ||= @params.first
      @helper.post_usage unless title
      @tasks.post(title, slug)
    end

    def handle_draft
    end

    def handle_publish
    end

    def handle_unpublish
    end

  end
end
require 'optparse'
require 'ostruct'

module MrPoole
  class CLI

    def initialize(args)
      @helper = Helper.new
      @helper.ensure_jekyll_dir
      @params = args
      @tasks = Tasks.new
    end

    def execute(action)
      case action
      when 'post' then handle_post
      when 'draft' then handle_draft
      when 'publish' then handle_publish
      when 'unpublish' then handle_unpublish
      else @helper.gen_usage
      end
    end

    def handle_post
      options = OpenStruct.new
      options.slug = nil
      options.title = nil

      opt_parser = OptionParser.new do |opts|
        opts.banner = 'Usage: poole post [options]'
        opts.separator ''
        opts.separator "Options: "

        opts.on('-s', '--slug [SLUG]', "Use custom slug") do |s|
          options.slug = s
        end

        opts.on('-t', '--title [TITLE]', "Specifiy title") do |t|
          options.title = t
        end
      end

      opt_parser.parse! @params

      options.title ||= @params.first
      @helper.post_usage unless options.title
      @tasks.post(options.title, options.slug)
    end

    def handle_draft
    end

    def handle_publish
    end

    def handle_unpublish
    end

  end
end
Switch to OptionParser for dealing with options.
Switch to OptionParser for dealing with options.
Ruby
mit
mmcclimon/mr_poole
ruby
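The Ruby change above swaps a getopt-style loop for a declarative option parser. Python's standard library offers the same trade via `argparse`; here is a hedged sketch of the equivalent `post` options — the flag names mirror the Ruby code, but nothing else here is from the mr_poole project:

```python
import argparse

def build_post_parser():
    """Declarative equivalent of the OptionParser block in handle_post."""
    parser = argparse.ArgumentParser(prog="poole post")
    parser.add_argument("-s", "--slug", default=None, help="Use custom slug")
    parser.add_argument("-t", "--title", default=None, help="Specify title")
    parser.add_argument("params", nargs="*")  # leftover positional arguments
    return parser

opts = build_post_parser().parse_args(["-s", "my-slug", "Some Title"])
# Fall back to the first positional argument, as the Ruby code does.
title = opts.title or (opts.params[0] if opts.params else None)
print(title, opts.slug)
```

The declarative form gives free `--help` output, the same benefit `OptionParser`'s banner/separator setup buys in the Ruby version.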
797a99b63ac16d182b00467f1b4d2ecfb93d1813
sale_isolated_quotation/data/ir_sequence_data.xml
sale_isolated_quotation/data/ir_sequence_data.xml
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <data noupdate="1">

        <!-- Sequences for sale.order -->
        <record id="seq_sale_quotation" model="ir.sequence">
            <field name="name">Quotation</field>
            <field name="code">sale.quotation</field>
            <field name="prefix">SQ</field>
            <field name="padding">8</field>
            <field name="company_id" eval="False"/>
        </record>

        <!-- Change padding -->
        <record id="sale.seq_sale_order" model="ir.sequence">
            <field name="padding">8</field>
        </record>

    </data>
</odoo>
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <data noupdate="1">

        <!-- Sequences for sale.order -->
        <record id="seq_sale_quotation" model="ir.sequence">
            <field name="name">Quotation</field>
            <field name="code">sale.quotation</field>
            <field name="prefix">SQ</field>
            <field name="padding">3</field>
            <field name="company_id" eval="False"/>
        </record>

    </data>
</odoo>
Change padding of quotation from 8 to 3 (inline with existing order)
Change padding of quotation from 8 to 3 (inline with existing order)
XML
agpl-3.0
kittiu/sale-workflow,kittiu/sale-workflow
xml
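For context on what the `padding` field changes: an Odoo `ir.sequence` with prefix `SQ` and padding 3 renders references like `SQ001`, `SQ002`, and so on. The formatting itself is plain zero-padding, which a minimal sketch (not Odoo code) can show:

```python
def next_reference(prefix, number, padding):
    """Render a sequence reference the way ir.sequence zero-pads numbers."""
    return f"{prefix}{number:0{padding}d}"

print(next_reference("SQ", 7, 3))     # → SQ007
print(next_reference("SQ", 1234, 3))  # numbers wider than the padding: SQ1234
```

Dropping the padding from 8 to 3 therefore shortens `SQ00000007` to `SQ007`, matching the existing order sequence as the commit subject says.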