commit stringlengths 40 40 | old_file stringlengths 4 184 | new_file stringlengths 4 184 | old_contents stringlengths 1 3.6k | new_contents stringlengths 5 3.38k | subject stringlengths 15 778 | message stringlengths 16 6.74k | lang stringclasses 201 values | license stringclasses 13 values | repos stringlengths 6 116k | config stringclasses 201 values | content stringlengths 137 7.24k | diff stringlengths 26 5.55k | diff_length int64 1 123 | relative_diff_length float64 0.01 89 | n_lines_added int64 0 108 | n_lines_deleted int64 0 106 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
3a1e81a0ad434ff5e2202095916cbbf0b9a1478b | plan.txt | plan.txt | Hardest parts:
- Connections
- Interface to the data structures (map, struct, array)
- Multithreading - need to talk to Thiago
- Well-defined names, special names: might be really easy, might be surprisingly hard / ugly
- Server
- Dispatching messages to/from interfaces, (interface registration?)
Other stuff:
- Build system
- Serialization, probably not hard
- Authentication
Some ideas for classes:
- Message
- Connection
- Interface?
- Address?
- Path?
- Authenticator or AuthenticationMethod
Scope (aka for now: features not planned):
- probably not in scope: code generator for the XML interface description format
because it needs to work against the consumer API (Qt, GLib, ?)
| Hardest parts:
- Connections
- Interface to the data structures (map, struct, array)
- Do it like QDBusArgument: doesn't need any non-primitive data types, very binding-friendly
- Multithreading - need to talk to Thiago
- Well-defined names, special names: might be really easy, might be surprisingly hard / ugly
- Server
- Dispatching messages to/from interfaces, (interface registration?)
Other stuff:
- Build system
- Serialization, probably not hard
- Authentication
Some ideas for classes:
- Message
- Connection
- Interface?
- Address?
- Path?
- Authenticator or AuthenticationMethod
Scope (aka for now: features not planned):
- probably not in scope: code generator for the XML interface description format
because it needs to work against the consumer API (Qt, GLib, ?)
| Add note about the interface to non-primitive data types. | Add note about the interface to non-primitive data types.
| Text | lgpl-2.1 | KDE/dferry,KDE/dferry | text | ## Code Before:
Hardest parts:
- Connections
- Interface to the data structures (map, struct, array)
- Multithreading - need to talk to Thiago
- Well-defined names, special names: might be really easy, might be surprisingly hard / ugly
- Server
- Dispatching messages to/from interfaces, (interface registration?)
Other stuff:
- Build system
- Serialization, probably not hard
- Authentication
Some ideas for classes:
- Message
- Connection
- Interface?
- Address?
- Path?
- Authenticator or AuthenticationMethod
Scope (aka for now: features not planned):
- probably not in scope: code generator for the XML interface description format
because it needs to work against the consumer API (Qt, GLib, ?)
## Instruction:
Add note about the interface to non-primitive data types.
## Code After:
Hardest parts:
- Connections
- Interface to the data structures (map, struct, array)
- Do it like QDBusArgument: doesn't need any non-primitive data types, very binding-friendly
- Multithreading - need to talk to Thiago
- Well-defined names, special names: might be really easy, might be surprisingly hard / ugly
- Server
- Dispatching messages to/from interfaces, (interface registration?)
Other stuff:
- Build system
- Serialization, probably not hard
- Authentication
Some ideas for classes:
- Message
- Connection
- Interface?
- Address?
- Path?
- Authenticator or AuthenticationMethod
Scope (aka for now: features not planned):
- probably not in scope: code generator for the XML interface description format
because it needs to work against the consumer API (Qt, GLib, ?)
| Hardest parts:
- Connections
- Interface to the data structures (map, struct, array)
+ - Do it like QDBusArgument: doesn't need any non-primitive data types, very binding-friendly
- Multithreading - need to talk to Thiago
- Well-defined names, special names: might be really easy, might be surprisingly hard / ugly
- Server
- Dispatching messages to/from interfaces, (interface registration?)
Other stuff:
- Build system
- Serialization, probably not hard
- Authentication
Some ideas for classes:
- Message
- Connection
- Interface?
- Address?
- Path?
- Authenticator or AuthenticationMethod
Scope (aka for now: features not planned):
- probably not in scope: code generator for the XML interface description format
because it needs to work against the consumer API (Qt, GLib, ?) | 1 | 0.041667 | 1 | 0 |
9fe2e8f757b02ec1bb50f776614f0cb467a67025 | src/requisition-product-grid/product-grid-cell-input-numeric.html | src/requisition-product-grid/product-grid-cell-input-numeric.html | <input positive-integer
ng-model="lineItem[column.name]"
ng-disabled="lineItem.skipped"
ng-change="validate()"
ng-blur="validate()"
type="number"/>
| <input positive-integer
ng-model="lineItem[column.name]"
ng-disabled="lineItem.skipped"
ng-change="validate()"
ng-blur="validate()"/>
| Revert "OLMIS-2408: added type number attribute to numeric cell on product grid" | Revert "OLMIS-2408: added type number attribute to numeric cell on product grid"
This reverts commit e6bad704c3810040bb959e6d84a4fadee1fb203a.
| HTML | agpl-3.0 | OpenLMIS/openlmis-requisition-ui,OpenLMIS/openlmis-requisition-ui,OpenLMIS/openlmis-requisition-ui | html | ## Code Before:
<input positive-integer
ng-model="lineItem[column.name]"
ng-disabled="lineItem.skipped"
ng-change="validate()"
ng-blur="validate()"
type="number"/>
## Instruction:
Revert "OLMIS-2408: added type number attribute to numeric cell on product grid"
This reverts commit e6bad704c3810040bb959e6d84a4fadee1fb203a.
## Code After:
<input positive-integer
ng-model="lineItem[column.name]"
ng-disabled="lineItem.skipped"
ng-change="validate()"
ng-blur="validate()"/>
| <input positive-integer
ng-model="lineItem[column.name]"
ng-disabled="lineItem.skipped"
ng-change="validate()"
- ng-blur="validate()"
+ ng-blur="validate()"/>
? ++
- type="number"/> | 3 | 0.5 | 1 | 2 |
971cf5dc6435fede6d3e4516254a74b2a8afca06 | roles/node/tasks/main.yml | roles/node/tasks/main.yml | ---
- name: Ensure apt-transport-https is installed
apt:
pkg: "{{item}}"
state: latest
with_items:
- apt-transport-https
- sqlite # required for node / ghost generally
- name: Ensure the key is present
apt_key:
url: https://deb.nodesource.com/gpgkey/nodesource.gpg.key
- name: Add repository
apt_repository:
repo: 'deb https://deb.nodesource.com/node {{ ansible_distribution_release }} main'
state: present
notify:
- Update apt
- name: Add repository
apt_repository:
repo: 'deb-src https://deb.nodesource.com/node {{ ansible_distribution_release }} main'
state: present
notify:
- Update apt
- name: Ensure nodejs is installed as root
sudo: yes
apt:
pkg: "{{item}}"
state: latest
with_items:
- nodejs
| ---
- name: Ensure apt-transport-https is installed
apt:
pkg: "{{item}}"
state: latest
with_items:
- apt-transport-https
- sqlite # required for node / ghost generally
- name: Ensure the key is present
apt_key:
url: https://deb.nodesource.com/gpgkey/nodesource.gpg.key
- name: Add repository
apt_repository:
repo: 'deb https://deb.nodesource.com/node_4.x {{ ansible_distribution_release }} main'
state: present
filename: "nodesource"
notify:
- Update apt
- name: Add repository (SRC)
apt_repository:
repo: 'deb-src https://deb.nodesource.com/node_4.x {{ ansible_distribution_release }} main'
state: present
filename: "nodesource"
notify:
- Update apt
- name: Ensure nodejs is installed as root
sudo: yes
apt:
pkg: "{{item}}"
state: latest
with_items:
- nodejs
| Switch to node v4.4.3 LTS | Switch to node v4.4.3 LTS
| YAML | mit | tchapi/ansible-playbooks | yaml | ## Code Before:
---
- name: Ensure apt-transport-https is installed
apt:
pkg: "{{item}}"
state: latest
with_items:
- apt-transport-https
- sqlite # required for node / ghost generally
- name: Ensure the key is present
apt_key:
url: https://deb.nodesource.com/gpgkey/nodesource.gpg.key
- name: Add repository
apt_repository:
repo: 'deb https://deb.nodesource.com/node {{ ansible_distribution_release }} main'
state: present
notify:
- Update apt
- name: Add repository
apt_repository:
repo: 'deb-src https://deb.nodesource.com/node {{ ansible_distribution_release }} main'
state: present
notify:
- Update apt
- name: Ensure nodejs is installed as root
sudo: yes
apt:
pkg: "{{item}}"
state: latest
with_items:
- nodejs
## Instruction:
Switch to node v4.4.3 LTS
## Code After:
---
- name: Ensure apt-transport-https is installed
apt:
pkg: "{{item}}"
state: latest
with_items:
- apt-transport-https
- sqlite # required for node / ghost generally
- name: Ensure the key is present
apt_key:
url: https://deb.nodesource.com/gpgkey/nodesource.gpg.key
- name: Add repository
apt_repository:
repo: 'deb https://deb.nodesource.com/node_4.x {{ ansible_distribution_release }} main'
state: present
filename: "nodesource"
notify:
- Update apt
- name: Add repository (SRC)
apt_repository:
repo: 'deb-src https://deb.nodesource.com/node_4.x {{ ansible_distribution_release }} main'
state: present
filename: "nodesource"
notify:
- Update apt
- name: Ensure nodejs is installed as root
sudo: yes
apt:
pkg: "{{item}}"
state: latest
with_items:
- nodejs
| ---
- name: Ensure apt-transport-https is installed
apt:
pkg: "{{item}}"
state: latest
with_items:
- apt-transport-https
- sqlite # required for node / ghost generally
- name: Ensure the key is present
apt_key:
url: https://deb.nodesource.com/gpgkey/nodesource.gpg.key
- name: Add repository
apt_repository:
- repo: 'deb https://deb.nodesource.com/node {{ ansible_distribution_release }} main'
+ repo: 'deb https://deb.nodesource.com/node_4.x {{ ansible_distribution_release }} main'
? ++++
state: present
+ filename: "nodesource"
notify:
- Update apt
- - name: Add repository
+ - name: Add repository (SRC)
? ++++++
apt_repository:
- repo: 'deb-src https://deb.nodesource.com/node {{ ansible_distribution_release }} main'
+ repo: 'deb-src https://deb.nodesource.com/node_4.x {{ ansible_distribution_release }} main'
? ++++
state: present
+ filename: "nodesource"
notify:
- Update apt
- name: Ensure nodejs is installed as root
sudo: yes
apt:
pkg: "{{item}}"
state: latest
with_items:
- nodejs | 8 | 0.266667 | 5 | 3 |
88b7aaf88db907343dcc2d9e065b26536b9d3f09 | composer.json | composer.json | {
"name": "badcow/dns",
"description": "A PHP library for creating DNS zone files based on RFC1035",
"license": "MIT",
"authors": [
{
"name": "Sam Williams",
"email": "sam@badcow.co"
}
],
"minimum-stability": "stable",
"require": {
"php": "~7.2",
"rlanvin/php-ip": "~2.0",
"badcow/ademarre-binary-to-text-php": ">=2.1.1"
},
"require-dev": {
"phpunit/phpunit": "~8.4",
"friendsofphp/php-cs-fixer": "~2.1",
"phpstan/phpstan": "^0.11.6"
},
"autoload": {
"psr-4": {
"Badcow\\DNS\\": "lib/"
}
},
"autoload-dev": {
"psr-4": {
"Badcow\\DNS\\Tests\\": "tests/"
}
}
}
| {
"name": "badcow/dns",
"description": "A PHP library for creating DNS zone files based on RFC1035",
"license": "MIT",
"authors": [
{
"name": "Sam Williams",
"email": "sam@badcow.co"
}
],
"minimum-stability": "stable",
"require": {
"php": "~7.2",
"rlanvin/php-ip": "~2.0",
"ademarre/binary-to-text-php": "~3.0.0"
},
"require-dev": {
"phpunit/phpunit": "~8.4",
"friendsofphp/php-cs-fixer": "~2.1",
"phpstan/phpstan": "^0.11.6"
},
"autoload": {
"psr-4": {
"Badcow\\DNS\\": "lib/"
}
},
"autoload-dev": {
"psr-4": {
"Badcow\\DNS\\Tests\\": "tests/"
}
}
}
| Use actual binary-to-text library repository. | Use actual binary-to-text library repository.
| JSON | mit | Badcow/DNS | json | ## Code Before:
{
"name": "badcow/dns",
"description": "A PHP library for creating DNS zone files based on RFC1035",
"license": "MIT",
"authors": [
{
"name": "Sam Williams",
"email": "sam@badcow.co"
}
],
"minimum-stability": "stable",
"require": {
"php": "~7.2",
"rlanvin/php-ip": "~2.0",
"badcow/ademarre-binary-to-text-php": ">=2.1.1"
},
"require-dev": {
"phpunit/phpunit": "~8.4",
"friendsofphp/php-cs-fixer": "~2.1",
"phpstan/phpstan": "^0.11.6"
},
"autoload": {
"psr-4": {
"Badcow\\DNS\\": "lib/"
}
},
"autoload-dev": {
"psr-4": {
"Badcow\\DNS\\Tests\\": "tests/"
}
}
}
## Instruction:
Use actual binary-to-text library repository.
## Code After:
{
"name": "badcow/dns",
"description": "A PHP library for creating DNS zone files based on RFC1035",
"license": "MIT",
"authors": [
{
"name": "Sam Williams",
"email": "sam@badcow.co"
}
],
"minimum-stability": "stable",
"require": {
"php": "~7.2",
"rlanvin/php-ip": "~2.0",
"ademarre/binary-to-text-php": "~3.0.0"
},
"require-dev": {
"phpunit/phpunit": "~8.4",
"friendsofphp/php-cs-fixer": "~2.1",
"phpstan/phpstan": "^0.11.6"
},
"autoload": {
"psr-4": {
"Badcow\\DNS\\": "lib/"
}
},
"autoload-dev": {
"psr-4": {
"Badcow\\DNS\\Tests\\": "tests/"
}
}
}
| {
"name": "badcow/dns",
"description": "A PHP library for creating DNS zone files based on RFC1035",
"license": "MIT",
"authors": [
{
"name": "Sam Williams",
"email": "sam@badcow.co"
}
],
"minimum-stability": "stable",
"require": {
"php": "~7.2",
"rlanvin/php-ip": "~2.0",
- "badcow/ademarre-binary-to-text-php": ">=2.1.1"
? ------- ^ ^^^ ^ ^
+ "ademarre/binary-to-text-php": "~3.0.0"
? ^ ^^ ^ ^
},
"require-dev": {
"phpunit/phpunit": "~8.4",
"friendsofphp/php-cs-fixer": "~2.1",
"phpstan/phpstan": "^0.11.6"
},
"autoload": {
"psr-4": {
"Badcow\\DNS\\": "lib/"
}
},
"autoload-dev": {
"psr-4": {
"Badcow\\DNS\\Tests\\": "tests/"
}
}
} | 2 | 0.0625 | 1 | 1 |
74d292621fbb8e5d4a096ca7fcc757f660ee49fd | README.md | README.md | Automated Blogging software
| Automated Blogging software.
"As a blog owner, you know that keeping fresh content on your blog daily can be a tedious and time-consuming task."---LWOF Spokesperson for BlogDominator.com
There are numerous unethical automated blogging software out there that aims to "produce" content by stealing them from
other people's websites and repackaging them by changing a few words. Friend-Computer aims to be an ethical blogging
software program, behaving very similar to Automated Insights and Narrative Science. Friend-Computer will find datasets, analyze them, and then produce readable blog posts. As a result, real content is being produced, not copy-pasted
nonsense.
If this thing actually moves beyond an "elevator pitch" and actually works, then it might be extended to produce more useful reports than just blogging.
The current proposed MVP is to analyze data from the GitHub API and produce a narrative based on this API.
| Add description of end goal of Friend-Computer | Add description of end goal of Friend-Computer | Markdown | apache-2.0 | tra38/Friend-Computer,tra38/FriendComputer,tra38/FriendComputer,tra38/FriendComputer,tra38/Friend-Computer,tra38/Friend-Computer | markdown | ## Code Before:
Automated Blogging software
## Instruction:
Add description of end goal of Friend-Computer
## Code After:
Automated Blogging software.
"As a blog owner, you know that keeping fresh content on your blog daily can be a tedious and time-consuming task."---LWOF Spokesperson for BlogDominator.com
There are numerous unethical automated blogging software out there that aims to "produce" content by stealing them from
other people's websites and repackaging them by changing a few words. Friend-Computer aims to be an ethical blogging
software program, behaving very similar to Automated Insights and Narrative Science. Friend-Computer will find datasets, analyze them, and then produce readable blog posts. As a result, real content is being produced, not copy-pasted
nonsense.
If this thing actually moves beyond an "elevator pitch" and actually works, then it might be extended to produce more useful reports than just blogging.
The current proposed MVP is to analyze data from the GitHub API and produce a narrative based on this API.
| - Automated Blogging software
+ Automated Blogging software.
? +
+
+ "As a blog owner, you know that keeping fresh content on your blog daily can be a tedious and time-consuming task."---LWOF Spokesperson for BlogDominator.com
+
+ There are numerous unethical automated blogging software out there that aims to "produce" content by stealing them from
+ other people's websites and repackaging them by changing a few words. Friend-Computer aims to be an ethical blogging
+ software program, behaving very similar to Automated Insights and Narrative Science. Friend-Computer will find datasets, analyze them, and then produce readable blog posts. As a result, real content is being produced, not copy-pasted
+ nonsense.
+
+ If this thing actually moves beyond an "elevator pitch" and actually works, then it might be extended to produce more useful reports than just blogging.
+
+ The current proposed MVP is to analyze data from the GitHub API and produce a narrative based on this API. | 13 | 13 | 12 | 1 |
e95319e1fee36579bd3d0c53b6bafd4ea585101c | lib/filestack-rails.rb | lib/filestack-rails.rb | require "filestack_rails/configuration"
require "filestack_rails/transform"
require "filestack_rails/engine"
module FilestackRails
# Your code goes here...
end
| require "filestack_rails/configuration"
require "filestack_rails/transform"
require "filestack_rails/engine"
require "filestack_rails/version"
module FilestackRails
# Your code goes here...
end
| Fix issue with missing module | Fix issue with missing module
| Ruby | mit | Ink/filepicker-rails,Ink/filepicker-rails,Ink/filepicker-rails | ruby | ## Code Before:
require "filestack_rails/configuration"
require "filestack_rails/transform"
require "filestack_rails/engine"
module FilestackRails
# Your code goes here...
end
## Instruction:
Fix issue with missing module
## Code After:
require "filestack_rails/configuration"
require "filestack_rails/transform"
require "filestack_rails/engine"
require "filestack_rails/version"
module FilestackRails
# Your code goes here...
end
| require "filestack_rails/configuration"
require "filestack_rails/transform"
require "filestack_rails/engine"
+ require "filestack_rails/version"
module FilestackRails
# Your code goes here...
end | 1 | 0.142857 | 1 | 0 |
5bc92164346cfc0baf6840066410805e3d11a597 | README.md | README.md | Ionic Contrib: Tinder Cards 2
===================
Swipeable card based layout for Ionic and Angular. As seen in apps like [Tinder](http://www.gotinder.com/)
Note: There is also a similar ion library here: https://github.com/driftyco/ionic-ion-swipe-cards where you swipe the cards down instead of left or right.
[Demo](http://codepen.io/loringdodge/pen/BNmRrK)
## Install
Coming Soon
## Usage
Include `ionic.tdcards.js`, `collide.js` and `ionic.tdcards.css` after the rest of your Ionic and Angular includes. Add `ionic.contrib.ui.tinderCards` as a module dependency of your app. Then use the following AngularJS directives:
```html
```
To add new cards dynamically, just add them to the cards array:
```javascript
```
| Ionic Contrib: Tinder Cards 2
===================
This is an extension of [Tinder Cards](https://github.com/driftyco/ionic-ion-tinder-cards)
which itself was an extension of [Swipe Cards](https://github.com/driftyco/ionic-ion-swipe-cards),
both created by the driftyco team.

[Demo](http://codepen.io/loringdodge/pen/BNmRrK)
## Install
Coming Soon
## Usage
Include `ionic.tdcards.js`, `collide.js` and `ionic.tdcards.css` after the rest of your Ionic and Angular includes. Add `ionic.contrib.ui.tinderCards` as a module dependency of your app. Then use the following AngularJS directives:
```html
```
To add new cards dynamically, just add them to the cards array:
```javascript
```
| Add some comments for readme | Add some comments for readme
| Markdown | mit | loringdodge/ionic-ion-tinder-cards-2,loringdodge/ionic-ion-tinder-cards-2,loringdodge/ionic-ion-tinder-cards-2,loringdodge/ionic-ion-tinder-cards-2,loringdodge/ionic-ion-tinder-cards-2 | markdown | ## Code Before:
Ionic Contrib: Tinder Cards 2
===================
Swipeable card based layout for Ionic and Angular. As seen in apps like [Tinder](http://www.gotinder.com/)
Note: There is also a similar ion library here: https://github.com/driftyco/ionic-ion-swipe-cards where you swipe the cards down instead of left or right.
[Demo](http://codepen.io/loringdodge/pen/BNmRrK)
## Install
Coming Soon
## Usage
Include `ionic.tdcards.js`, `collide.js` and `ionic.tdcards.css` after the rest of your Ionic and Angular includes. Add `ionic.contrib.ui.tinderCards` as a module dependency of your app. Then use the following AngularJS directives:
```html
```
To add new cards dynamically, just add them to the cards array:
```javascript
```
## Instruction:
Add some comments for readme
## Code After:
Ionic Contrib: Tinder Cards 2
===================
This is an extension of [Tinder Cards](https://github.com/driftyco/ionic-ion-tinder-cards)
which itself was an extension of [Swipe Cards](https://github.com/driftyco/ionic-ion-swipe-cards),
both created by the driftyco team.

[Demo](http://codepen.io/loringdodge/pen/BNmRrK)
## Install
Coming Soon
## Usage
Include `ionic.tdcards.js`, `collide.js` and `ionic.tdcards.css` after the rest of your Ionic and Angular includes. Add `ionic.contrib.ui.tinderCards` as a module dependency of your app. Then use the following AngularJS directives:
```html
```
To add new cards dynamically, just add them to the cards array:
```javascript
```
| Ionic Contrib: Tinder Cards 2
===================
- Swipeable card based layout for Ionic and Angular. As seen in apps like [Tinder](http://www.gotinder.com/)
- Note: There is also a similar ion library here: https://github.com/driftyco/ionic-ion-swipe-cards where you swipe the cards down instead of left or right.
+ This is an extension of [Tinder Cards](https://github.com/driftyco/ionic-ion-tinder-cards)
+ which itself was an extension of [Swipe Cards](https://github.com/driftyco/ionic-ion-swipe-cards),
+ both created by the driftyco team.
+
+ 
[Demo](http://codepen.io/loringdodge/pen/BNmRrK)
## Install
Coming Soon
## Usage
Include `ionic.tdcards.js`, `collide.js` and `ionic.tdcards.css` after the rest of your Ionic and Angular includes. Add `ionic.contrib.ui.tinderCards` as a module dependency of your app. Then use the following AngularJS directives:
```html
```
To add new cards dynamically, just add them to the cards array:
```javascript
```
| 7 | 0.25 | 5 | 2 |
781273b57262cc055da9928d0271206bfc5ce0ca | Gruntfile.js | Gruntfile.js | module.exports = function(grunt) {
// config
grunt.initConfig({
less: {
build: {
expand: true,
cwd: 'src/less/',
src: ['**/*.less'],
dest: 'build/css/',
ext: '.css'
}
},
csslint: {
check: {
src: '<%= less.build.dest %>**/*.css'
}
},
watch: {
less2css: {
files: 'src/css/*.less',
tasks: ['less', 'csslint']
}
},
typescript: {
build: {
expand: true,
cwd: 'src/ts/',
src: ['**/*.ts', '!node_modules/**/*.ts'],
dest: 'build/js/',
ext: '.js',
options: {
noImplicitAny: true
}
}
}
});
// plugin
grunt.loadNpmTasks('grunt-contrib-less');
grunt.loadNpmTasks('grunt-contrib-csslint');
grunt.loadNpmTasks('grunt-contrib-watch');
grunt.loadNpmTasks('grunt-typescript');
// tasks
grunt.registerTask('default', ['less', 'csslint', 'typescript']);
};
| module.exports = function(grunt) {
// config
grunt.initConfig({
less: {
build: {
expand: true,
cwd: 'src/less/',
src: ['**/*.less'],
dest: 'build/css/',
ext: '.css'
}
},
csslint: {
check: {
src: '<%= less.build.dest %>**/*.css'
}
},
watch: {
less: {
files: 'src/less/**/*.less',
tasks: ['less', 'csslint']
}
},
typescript: {
build: {
expand: true,
cwd: 'src/ts/',
src: ['**/*.ts', '!node_modules/**/*.ts'],
dest: 'build/js/',
ext: '.js',
options: {
noImplicitAny: true
}
}
}
});
// plugin
grunt.loadNpmTasks('grunt-contrib-less');
grunt.loadNpmTasks('grunt-contrib-csslint');
grunt.loadNpmTasks('grunt-contrib-watch');
grunt.loadNpmTasks('grunt-typescript');
// tasks
grunt.registerTask('less2css', ['less', 'csslint']);
grunt.registerTask('default', ['less2css', 'typescript']);
};
| Edit gruntfile: create task less2css | Edit gruntfile: create task less2css
| JavaScript | mit | yajamon/OedoBlackSmithRecipe | javascript | ## Code Before:
module.exports = function(grunt) {
// config
grunt.initConfig({
less: {
build: {
expand: true,
cwd: 'src/less/',
src: ['**/*.less'],
dest: 'build/css/',
ext: '.css'
}
},
csslint: {
check: {
src: '<%= less.build.dest %>**/*.css'
}
},
watch: {
less2css: {
files: 'src/css/*.less',
tasks: ['less', 'csslint']
}
},
typescript: {
build: {
expand: true,
cwd: 'src/ts/',
src: ['**/*.ts', '!node_modules/**/*.ts'],
dest: 'build/js/',
ext: '.js',
options: {
noImplicitAny: true
}
}
}
});
// plugin
grunt.loadNpmTasks('grunt-contrib-less');
grunt.loadNpmTasks('grunt-contrib-csslint');
grunt.loadNpmTasks('grunt-contrib-watch');
grunt.loadNpmTasks('grunt-typescript');
// tasks
grunt.registerTask('default', ['less', 'csslint', 'typescript']);
};
## Instruction:
Edit gruntfile: create task less2css
## Code After:
module.exports = function(grunt) {
// config
grunt.initConfig({
less: {
build: {
expand: true,
cwd: 'src/less/',
src: ['**/*.less'],
dest: 'build/css/',
ext: '.css'
}
},
csslint: {
check: {
src: '<%= less.build.dest %>**/*.css'
}
},
watch: {
less: {
files: 'src/less/**/*.less',
tasks: ['less', 'csslint']
}
},
typescript: {
build: {
expand: true,
cwd: 'src/ts/',
src: ['**/*.ts', '!node_modules/**/*.ts'],
dest: 'build/js/',
ext: '.js',
options: {
noImplicitAny: true
}
}
}
});
// plugin
grunt.loadNpmTasks('grunt-contrib-less');
grunt.loadNpmTasks('grunt-contrib-csslint');
grunt.loadNpmTasks('grunt-contrib-watch');
grunt.loadNpmTasks('grunt-typescript');
// tasks
grunt.registerTask('less2css', ['less', 'csslint']);
grunt.registerTask('default', ['less2css', 'typescript']);
};
| module.exports = function(grunt) {
// config
grunt.initConfig({
less: {
build: {
expand: true,
cwd: 'src/less/',
src: ['**/*.less'],
dest: 'build/css/',
ext: '.css'
}
},
csslint: {
check: {
src: '<%= less.build.dest %>**/*.css'
}
},
watch: {
- less2css: {
? ----
+ less: {
- files: 'src/css/*.less',
? ^
+ files: 'src/less/**/*.less',
? ^^ +++
tasks: ['less', 'csslint']
}
},
typescript: {
build: {
expand: true,
cwd: 'src/ts/',
src: ['**/*.ts', '!node_modules/**/*.ts'],
dest: 'build/js/',
ext: '.js',
options: {
noImplicitAny: true
}
}
}
});
// plugin
grunt.loadNpmTasks('grunt-contrib-less');
grunt.loadNpmTasks('grunt-contrib-csslint');
grunt.loadNpmTasks('grunt-contrib-watch');
grunt.loadNpmTasks('grunt-typescript');
// tasks
+ grunt.registerTask('less2css', ['less', 'csslint']);
- grunt.registerTask('default', ['less', 'csslint', 'typescript']);
? ^^^^ ----
+ grunt.registerTask('default', ['less2css', 'typescript']);
? ^
}; | 7 | 0.132075 | 4 | 3 |
f65625bc6759a1c880a7063a07a10301600bb451 | README.md | README.md |
A resource-oriented wrapper over the window.fetch API. Defaults to JSON content type.
## Install
```
npm install --save piggyback
```
## Usage
```js
import { getFetch, postFetch, putFetch, deleteFetch } from 'piggyback';
export default function getPosts() {
return getFetch('/posts').then(function(response) {
return response.json();
});
}
export default function createPost(body) {
return postFetch('/posts', body).then(function(response) {
return response.json();
});
}
export default function updatePost(id, body) {
return putFetch('/posts/' + id.toString(), body).then(function(response) {
return response.json();
});
}
export default function deletePost(id) {
return deleteFetch('/posts/' + id.toString());
}
```
|
A resource-oriented wrapper over the window.fetch API. Defaults to JSON content type.
## Install
```
npm install --save piggyback
```
## Usage
```js
import { getFetch, postFetch, putFetch, deleteFetch } from 'piggyback';
export default function getPosts() {
return getFetch('/posts').then(function(response) {
return response.json();
});
}
export default function createPost(body) {
return postFetch('/posts', body).then(function(response) {
return response.json();
});
}
export default function updatePost(id, body) {
return putFetch('/posts/' + id.toString(), body).then(function(response) {
return response.json();
});
}
export default function deletePost(id) {
return deleteFetch('/posts/' + id.toString());
}
```
## Notes
- Depends on `es6-promise` and `isomorphic-fetch` polyfills (this is handled automatically).
- Browser auth credentials are always sent with the request.
- HTTP error codes result in a JavaScript `Error` being thrown by default.
- How you handle resolving and rejecting is up to you.
If these defaults aren’t suitable for your use-case, consider using the `isomorphic-fetch` library directly.
| Add minor notes to the documentation | Add minor notes to the documentation
| Markdown | mit | maetl/piggyback | markdown | ## Code Before:
A resource-oriented wrapper over the window.fetch API. Defaults to JSON content type.
## Install
```
npm install --save piggyback
```
## Usage
```js
import { getFetch, postFetch, putFetch, deleteFetch } from 'piggyback';
export default function getPosts() {
return getFetch('/posts').then(function(response) {
return response.json();
});
}
export default function createPost(body) {
return postFetch('/posts', body).then(function(response) {
return response.json();
});
}
export default function updatePost(id, body) {
return putFetch('/posts/' + id.toString(), body).then(function(response) {
return response.json();
});
}
export default function deletePost(id) {
return deleteFetch('/posts/' + id.toString());
}
```
## Instruction:
Add minor notes to the documentation
## Code After:
A resource-oriented wrapper over the window.fetch API. Defaults to JSON content type.
## Install
```
npm install --save piggyback
```
## Usage
```js
import { getFetch, postFetch, putFetch, deleteFetch } from 'piggyback';
export default function getPosts() {
return getFetch('/posts').then(function(response) {
return response.json();
});
}
export default function createPost(body) {
return postFetch('/posts', body).then(function(response) {
return response.json();
});
}
export default function updatePost(id, body) {
return putFetch('/posts/' + id.toString(), body).then(function(response) {
return response.json();
});
}
export default function deletePost(id) {
return deleteFetch('/posts/' + id.toString());
}
```
## Notes
- Depends on `es6-promise` and `isomorphic-fetch` polyfills (this is handled automatically).
- Browser auth credentials are always sent with the request.
- HTTP error codes result in a JavaScript `Error` being thrown by default.
- How you handle resolving and rejecting is up to you.
If these defaults aren’t suitable for your use-case, consider using the `isomorphic-fetch` library directly.
|
A resource-oriented wrapper over the window.fetch API. Defaults to JSON content type.
## Install
```
npm install --save piggyback
```
## Usage
```js
import { getFetch, postFetch, putFetch, deleteFetch } from 'piggyback';
export default function getPosts() {
return getFetch('/posts').then(function(response) {
return response.json();
});
}
export default function createPost(body) {
return postFetch('/posts', body).then(function(response) {
return response.json();
});
}
export default function updatePost(id, body) {
return putFetch('/posts/' + id.toString(), body).then(function(response) {
return response.json();
});
}
export default function deletePost(id) {
return deleteFetch('/posts/' + id.toString());
}
```
+
+ ## Notes
+
+ - Depends on `es6-promise` and `isomorphic-fetch` polyfills (this is handled automatically).
+ - Browser auth credentials are always sent with the request.
+ - HTTP error codes result in a JavaScript `Error` being thrown by default.
+ - How you handle resolving and rejecting is up to you.
+
+ If these defaults aren’t suitable for your use-case, consider using the `isomorphic-fetch` library directly. | 9 | 0.25 | 9 | 0 |
ead1b97ea80634bc4df621b7c8dc56eb27e32a9d | test/test_helper.rb | test/test_helper.rb | require "rubygems"
require "contest"
require "hpricot"
ROOT = File.expand_path(File.join(File.dirname(__FILE__), ".."))
$:.unshift ROOT
require "test/commands"
class Test::Unit::TestCase
include Test::Commands
def root(*args)
File.join(ROOT, *args)
end
def monk(args = nil)
sh("env MONK_HOME=#{File.join(ROOT, "test", "tmp")} ruby -rubygems #{root "bin/monk"} #{args}")
end
end
| require "rubygems"
require "contest"
require "hpricot"
ROOT = File.expand_path(File.join(File.dirname(__FILE__), ".."))
$:.unshift ROOT
require "test/commands"
class Test::Unit::TestCase
include Test::Commands
def root(*args)
File.join(ROOT, *args)
end
def setup
FileUtils.rm(File.join(ROOT, "test", "tmp", ".monk"))
end
def monk(args = nil)
sh("env MONK_HOME=#{File.join(ROOT, "test", "tmp")} ruby -rubygems #{root "bin/monk"} #{args}")
end
end
| Reset the .monk config after each test. | Reset the .monk config after each test.
| Ruby | mit | rkh/priest,monk/monk,rstacruz/monk,monkrb/monk | ruby | ## Code Before:
require "rubygems"
require "contest"
require "hpricot"
ROOT = File.expand_path(File.join(File.dirname(__FILE__), ".."))
$:.unshift ROOT
require "test/commands"
class Test::Unit::TestCase
include Test::Commands
def root(*args)
File.join(ROOT, *args)
end
def monk(args = nil)
sh("env MONK_HOME=#{File.join(ROOT, "test", "tmp")} ruby -rubygems #{root "bin/monk"} #{args}")
end
end
## Instruction:
Reset the .monk config after each test.
## Code After:
require "rubygems"
require "contest"
require "hpricot"
ROOT = File.expand_path(File.join(File.dirname(__FILE__), ".."))
$:.unshift ROOT
require "test/commands"
class Test::Unit::TestCase
include Test::Commands
def root(*args)
File.join(ROOT, *args)
end
def setup
FileUtils.rm(File.join(ROOT, "test", "tmp", ".monk"))
end
def monk(args = nil)
sh("env MONK_HOME=#{File.join(ROOT, "test", "tmp")} ruby -rubygems #{root "bin/monk"} #{args}")
end
end
| require "rubygems"
require "contest"
require "hpricot"
ROOT = File.expand_path(File.join(File.dirname(__FILE__), ".."))
$:.unshift ROOT
require "test/commands"
class Test::Unit::TestCase
include Test::Commands
def root(*args)
File.join(ROOT, *args)
end
+ def setup
+ FileUtils.rm(File.join(ROOT, "test", "tmp", ".monk"))
+ end
+
def monk(args = nil)
sh("env MONK_HOME=#{File.join(ROOT, "test", "tmp")} ruby -rubygems #{root "bin/monk"} #{args}")
end
end | 4 | 0.190476 | 4 | 0 |
5728d1b6b2f18bc7c26104ae71de7fd4985ee1dc | config/jest/jestConfig.js | config/jest/jestConfig.js | module.exports = {
testPathIgnorePatterns: ['<rootDir>[/\\\\](dist|node_modules)[/\\\\]'],
moduleNameMapper: {
'(seek-style-guide/react|.(css|less)$)': require.resolve(
'identity-obj-proxy'
),
'\\.(jpg|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga|svg)$': require.resolve(
'./fileMock'
)
},
transform: {
'^.+\\.js$': require.resolve('./babelTransform.js')
}
};
| module.exports = {
prettierPath: require.resolve('prettier'),
testPathIgnorePatterns: ['<rootDir>[/\\\\](dist|node_modules)[/\\\\]'],
moduleNameMapper: {
'(seek-style-guide/react|.(css|less)$)': require.resolve(
'identity-obj-proxy'
),
'\\.(jpg|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga|svg)$': require.resolve(
'./fileMock'
)
},
transform: {
'^.+\\.js$': require.resolve('./babelTransform.js')
}
};
| Add prettierPath to Jest config | fix(test): Add prettierPath to Jest config
| JavaScript | mit | seek-oss/sku,seek-oss/sku | javascript | ## Code Before:
module.exports = {
testPathIgnorePatterns: ['<rootDir>[/\\\\](dist|node_modules)[/\\\\]'],
moduleNameMapper: {
'(seek-style-guide/react|.(css|less)$)': require.resolve(
'identity-obj-proxy'
),
'\\.(jpg|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga|svg)$': require.resolve(
'./fileMock'
)
},
transform: {
'^.+\\.js$': require.resolve('./babelTransform.js')
}
};
## Instruction:
fix(test): Add prettierPath to Jest config
## Code After:
module.exports = {
prettierPath: require.resolve('prettier'),
testPathIgnorePatterns: ['<rootDir>[/\\\\](dist|node_modules)[/\\\\]'],
moduleNameMapper: {
'(seek-style-guide/react|.(css|less)$)': require.resolve(
'identity-obj-proxy'
),
'\\.(jpg|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga|svg)$': require.resolve(
'./fileMock'
)
},
transform: {
'^.+\\.js$': require.resolve('./babelTransform.js')
}
};
| module.exports = {
+ prettierPath: require.resolve('prettier'),
testPathIgnorePatterns: ['<rootDir>[/\\\\](dist|node_modules)[/\\\\]'],
moduleNameMapper: {
'(seek-style-guide/react|.(css|less)$)': require.resolve(
'identity-obj-proxy'
),
'\\.(jpg|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga|svg)$': require.resolve(
'./fileMock'
)
},
transform: {
'^.+\\.js$': require.resolve('./babelTransform.js')
}
}; | 1 | 0.071429 | 1 | 0 |
8cc4640490b7a1e28d322800b0df64005ab0e0ad | app/components/widgets/search-bar.scss | app/components/widgets/search-bar.scss |
.search-bar-input {
width: calc(100% - 50px);
padding: 9px;
}
.filter-button {
position: absolute;
right: 17px;
.arrow {
&::before,
&::after {
left: 16px;
}
}
.search-filter {
z-index: 3;
padding: 4px 1em;
border: none;
width: 50px;
height: 39px;
.mdi-filter {
line-height: 30px;
}
type-icon {
margin-left: -6px;
img {
width: 20px;
height: 20px;
}
}
}
}
}
.popover-body {
.filtermenu {
min-width: 235px;
.list-group {
max-height: calc(100vh - 350px) !important;
}
}
}
|
.search-bar-input {
width: calc(100% - 50px);
padding: 9px;
}
.filter-button {
position: absolute;
right: 17px;
.arrow {
&::before,
&::after {
left: 16px;
}
&::after {
border-bottom-color: #eee;
}
}
.search-filter {
z-index: 3;
padding: 4px 1em;
border: none;
width: 50px;
height: 39px;
.mdi-filter {
line-height: 30px;
}
type-icon {
margin-left: -6px;
img {
width: 20px;
height: 20px;
}
}
}
}
}
.popover-body {
.filtermenu {
min-width: 235px;
.list-group {
max-height: calc(100vh - 350px) !important;
}
}
}
| Adjust filter menu arrow color | Adjust filter menu arrow color
| SCSS | apache-2.0 | codarchlab/idai-field-client,codarchlab/idai-field-client,codarchlab/idai-field-client,codarchlab/idai-field-client | scss | ## Code Before:
.search-bar-input {
width: calc(100% - 50px);
padding: 9px;
}
.filter-button {
position: absolute;
right: 17px;
.arrow {
&::before,
&::after {
left: 16px;
}
}
.search-filter {
z-index: 3;
padding: 4px 1em;
border: none;
width: 50px;
height: 39px;
.mdi-filter {
line-height: 30px;
}
type-icon {
margin-left: -6px;
img {
width: 20px;
height: 20px;
}
}
}
}
}
.popover-body {
.filtermenu {
min-width: 235px;
.list-group {
max-height: calc(100vh - 350px) !important;
}
}
}
## Instruction:
Adjust filter menu arrow color
## Code After:
.search-bar-input {
width: calc(100% - 50px);
padding: 9px;
}
.filter-button {
position: absolute;
right: 17px;
.arrow {
&::before,
&::after {
left: 16px;
}
&::after {
border-bottom-color: #eee;
}
}
.search-filter {
z-index: 3;
padding: 4px 1em;
border: none;
width: 50px;
height: 39px;
.mdi-filter {
line-height: 30px;
}
type-icon {
margin-left: -6px;
img {
width: 20px;
height: 20px;
}
}
}
}
}
.popover-body {
.filtermenu {
min-width: 235px;
.list-group {
max-height: calc(100vh - 350px) !important;
}
}
}
|
.search-bar-input {
width: calc(100% - 50px);
padding: 9px;
}
.filter-button {
position: absolute;
right: 17px;
.arrow {
&::before,
&::after {
left: 16px;
+ }
+ &::after {
+ border-bottom-color: #eee;
}
}
.search-filter {
z-index: 3;
padding: 4px 1em;
border: none;
width: 50px;
height: 39px;
.mdi-filter {
line-height: 30px;
}
type-icon {
margin-left: -6px;
img {
width: 20px;
height: 20px;
}
}
}
}
}
.popover-body {
.filtermenu {
min-width: 235px;
.list-group {
max-height: calc(100vh - 350px) !important;
}
}
} | 3 | 0.06 | 3 | 0 |
095b1c009330bb9c0008ba0d5ed8648dc964c6f4 | autowiring-config.cmake.in | autowiring-config.cmake.in |
include(${CMAKE_CURRENT_LIST_DIR}/AutowiringTargets.cmake)
set(autowiring_FOUND TRUE)
set(autowiring_INCLUDE_DIR "@INSTALL_INCLUDE_DIR@")
|
include(${CMAKE_CURRENT_LIST_DIR}/AutowiringTargets.cmake)
set(autowiring_FOUND TRUE)
set(autowiring_INCLUDE_DIR "@CMAKE_INSTALL_PREFIX@/include")
| Fix the autowiring install include path | Fix the autowiring install include path
| unknown | apache-2.0 | leapmotion/autowiring,codemercenary/autowiring,codemercenary/autowiring,leapmotion/autowiring,leapmotion/autowiring,codemercenary/autowiring,codemercenary/autowiring,leapmotion/autowiring,leapmotion/autowiring,codemercenary/autowiring,leapmotion/autowiring,codemercenary/autowiring | unknown | ## Code Before:
include(${CMAKE_CURRENT_LIST_DIR}/AutowiringTargets.cmake)
set(autowiring_FOUND TRUE)
set(autowiring_INCLUDE_DIR "@INSTALL_INCLUDE_DIR@")
## Instruction:
Fix the autowiring install include path
## Code After:
include(${CMAKE_CURRENT_LIST_DIR}/AutowiringTargets.cmake)
set(autowiring_FOUND TRUE)
set(autowiring_INCLUDE_DIR "@CMAKE_INSTALL_PREFIX@/include")
|
include(${CMAKE_CURRENT_LIST_DIR}/AutowiringTargets.cmake)
set(autowiring_FOUND TRUE)
- set(autowiring_INCLUDE_DIR "@INSTALL_INCLUDE_DIR@")
+ set(autowiring_INCLUDE_DIR "@CMAKE_INSTALL_PREFIX@/include") | 2 | 0.5 | 1 | 1 |
bd326a00f0ee64f0e2eb8b14b4cb160b9ed4cc0d | app/services/ynab/transaction_creator.rb | app/services/ynab/transaction_creator.rb | class YNAB::TransactionCreator
def initialize(id: nil, date: nil, amount: nil, payee_name: nil, description: true, flag: nil, cleared: true, budget_id: nil, account_id: nil)
@id = id
@date = date
@amount = amount
@payee_name = payee_name
@description = description
@cleared = cleared
@flag = flag
@client = YNAB::Client.new(ENV['YNAB_ACCESS_TOKEN'], budget_id, account_id)
end
def create
create = @client.create_transaction(
id: @id,
payee_name: @payee_name.to_s.truncate(50),
amount: @amount,
cleared: @cleared,
date: @date.to_date,
memo: @description,
flag: @flag
)
# If the transaction has a category, then lets notify
if create.category_id.present?
ynab_category = @client.category(create.category_id)
CategoryBalanceNotifier.new.notify(ynab_category)
end
create.try(:id).present? ? create : { error: :failed, data: create }
end
end
| class YNAB::TransactionCreator
def initialize(id: nil, date: nil, amount: nil, payee_name: nil, description: true, flag: nil, cleared: true, budget_id: nil, account_id: nil)
@id = id
@date = date
@amount = amount
@payee_name = payee_name
@description = description
@cleared = cleared
@flag = flag
@client = YNAB::Client.new(ENV['YNAB_ACCESS_TOKEN'], budget_id, account_id)
end
def create
create = @client.create_transaction(
id: @id.to_s.truncate(36),
payee_name: @payee_name.to_s.truncate(50),
amount: @amount,
cleared: @cleared,
date: @date.to_date,
memo: @description,
flag: @flag
)
# If the transaction has a category, then lets notify
if create.category_id.present?
ynab_category = @client.category(create.category_id)
CategoryBalanceNotifier.new.notify(ynab_category)
end
create.try(:id).present? ? create : { error: :failed, data: create }
end
end
| Truncate the ID here too | Truncate the ID here too
| Ruby | mit | scottrobertson/fintech-to-ynab,scottrobertson/fintech-to-ynab | ruby | ## Code Before:
class YNAB::TransactionCreator
def initialize(id: nil, date: nil, amount: nil, payee_name: nil, description: true, flag: nil, cleared: true, budget_id: nil, account_id: nil)
@id = id
@date = date
@amount = amount
@payee_name = payee_name
@description = description
@cleared = cleared
@flag = flag
@client = YNAB::Client.new(ENV['YNAB_ACCESS_TOKEN'], budget_id, account_id)
end
def create
create = @client.create_transaction(
id: @id,
payee_name: @payee_name.to_s.truncate(50),
amount: @amount,
cleared: @cleared,
date: @date.to_date,
memo: @description,
flag: @flag
)
# If the transaction has a category, then lets notify
if create.category_id.present?
ynab_category = @client.category(create.category_id)
CategoryBalanceNotifier.new.notify(ynab_category)
end
create.try(:id).present? ? create : { error: :failed, data: create }
end
end
## Instruction:
Truncate the ID here too
## Code After:
class YNAB::TransactionCreator
def initialize(id: nil, date: nil, amount: nil, payee_name: nil, description: true, flag: nil, cleared: true, budget_id: nil, account_id: nil)
@id = id
@date = date
@amount = amount
@payee_name = payee_name
@description = description
@cleared = cleared
@flag = flag
@client = YNAB::Client.new(ENV['YNAB_ACCESS_TOKEN'], budget_id, account_id)
end
def create
create = @client.create_transaction(
id: @id.to_s.truncate(36),
payee_name: @payee_name.to_s.truncate(50),
amount: @amount,
cleared: @cleared,
date: @date.to_date,
memo: @description,
flag: @flag
)
# If the transaction has a category, then lets notify
if create.category_id.present?
ynab_category = @client.category(create.category_id)
CategoryBalanceNotifier.new.notify(ynab_category)
end
create.try(:id).present? ? create : { error: :failed, data: create }
end
end
| class YNAB::TransactionCreator
def initialize(id: nil, date: nil, amount: nil, payee_name: nil, description: true, flag: nil, cleared: true, budget_id: nil, account_id: nil)
@id = id
@date = date
@amount = amount
@payee_name = payee_name
@description = description
@cleared = cleared
@flag = flag
@client = YNAB::Client.new(ENV['YNAB_ACCESS_TOKEN'], budget_id, account_id)
end
def create
create = @client.create_transaction(
- id: @id,
+ id: @id.to_s.truncate(36),
payee_name: @payee_name.to_s.truncate(50),
amount: @amount,
cleared: @cleared,
date: @date.to_date,
memo: @description,
flag: @flag
)
# If the transaction has a category, then lets notify
if create.category_id.present?
ynab_category = @client.category(create.category_id)
CategoryBalanceNotifier.new.notify(ynab_category)
end
create.try(:id).present? ? create : { error: :failed, data: create }
end
end | 2 | 0.0625 | 1 | 1 |
bafe66c554f63e89ac461c900d2febc08e8eddbe | test/node_test.rb | test/node_test.rb |
require './tet'
require '../node'
group Node do
empty_node = Node.new
content = 'here is some text'
node = Node.new(content)
group '#to_s' do
assert 'returns the content' do
node.to_s == content
end
assert 'returns an empty string if initialized with nothing' do
empty_node.to_s == ''
end
end
group '#size' do
assert 'returns the size of the content' do
node.size == content.size
end
assert 'returns 0 if initialized with nothing' do
empty_node.size.zero?
end
end
end
|
require './tet'
require '../node'
group Node do
empty_node = Node.new
content = 'here is some text'
node = Node.new(content)
group '#to_s' do
assert 'returns the content' do
node.to_s == content
end
assert 'returns an empty string if initialized with nothing' do
empty_node.to_s == ''
end
end
group '#size' do
assert 'returns the size of the content' do
node.size == content.size
end
assert 'returns 0 if initialized with nothing' do
empty_node.size.zero?
end
end
group '.new' do
assert 'given a block, evaluates it in the context of the object' do
node_with_method = Node.new do
def returns_puppies
:puppies
end
end
node_with_method.returns_puppies == :puppies
end
end
end
| Test initializing Node with block | Test initializing Node with block
| Ruby | bsd-3-clause | justcolin/lextacular | ruby | ## Code Before:
require './tet'
require '../node'
group Node do
empty_node = Node.new
content = 'here is some text'
node = Node.new(content)
group '#to_s' do
assert 'returns the content' do
node.to_s == content
end
assert 'returns an empty string if initialized with nothing' do
empty_node.to_s == ''
end
end
group '#size' do
assert 'returns the size of the content' do
node.size == content.size
end
assert 'returns 0 if initialized with nothing' do
empty_node.size.zero?
end
end
end
## Instruction:
Test initializing Node with block
## Code After:
require './tet'
require '../node'
group Node do
empty_node = Node.new
content = 'here is some text'
node = Node.new(content)
group '#to_s' do
assert 'returns the content' do
node.to_s == content
end
assert 'returns an empty string if initialized with nothing' do
empty_node.to_s == ''
end
end
group '#size' do
assert 'returns the size of the content' do
node.size == content.size
end
assert 'returns 0 if initialized with nothing' do
empty_node.size.zero?
end
end
group '.new' do
assert 'given a block, evaluates it in the context of the object' do
node_with_method = Node.new do
def returns_puppies
:puppies
end
end
node_with_method.returns_puppies == :puppies
end
end
end
|
require './tet'
require '../node'
group Node do
empty_node = Node.new
content = 'here is some text'
node = Node.new(content)
group '#to_s' do
assert 'returns the content' do
node.to_s == content
end
assert 'returns an empty string if initialized with nothing' do
empty_node.to_s == ''
end
end
group '#size' do
assert 'returns the size of the content' do
node.size == content.size
end
assert 'returns 0 if initialized with nothing' do
empty_node.size.zero?
end
end
+
+ group '.new' do
+ assert 'given a block, evaluates it in the context of the object' do
+ node_with_method = Node.new do
+ def returns_puppies
+ :puppies
+ end
+ end
+
+ node_with_method.returns_puppies == :puppies
+ end
+ end
end | 12 | 0.413793 | 12 | 0 |
e0a401534971e22510abcc9bbc411d011bbf9236 | nextjs-react-strapi-deliveroo-clone-tutorial/README.md | nextjs-react-strapi-deliveroo-clone-tutorial/README.md |
This example is the result of the tutorial series "[Cooking a Deliveroo clone with Next.js (React), GraphQL, Strapi and Stripe](https://blog.strapi.io/strapi-next-setup/)
## Setup
To see it live:
- Clone the repository.
- Start the Strapi server: `cd nextjs-strapi-deliveroo-clone-tutorial/server && yarn && yarn dev`.
- In an other tab, start the nextjs server: `cd nextjs-strapi-deliveroo-clone-tutorial/frontend && yarn && yarn dev`.
|
This example is the result of the tutorial series "[Cooking a Deliveroo clone with Next.js (React), GraphQL, Strapi and Stripe](https://blog.strapi.io/strapi-next-setup/)
## Setup
To see it live:
- Clone the repository.
- Start the Strapi server: `cd nextjs-react-strapi-deliveroo-clone-tutorial/backend && yarn && yarn develop`.
- In an other tab, start the nextjs server: `cd nextjs-react-strapi-deliveroo-clone-tutorial/frontend && yarn && yarn dev`.
| Rename project path nextjs in command | Rename project path nextjs in command
| Markdown | mit | strapi/strapi-examples,strapi/strapi-examples,strapi/strapi-examples | markdown | ## Code Before:
This example is the result of the tutorial series "[Cooking a Deliveroo clone with Next.js (React), GraphQL, Strapi and Stripe](https://blog.strapi.io/strapi-next-setup/)
## Setup
To see it live:
- Clone the repository.
- Start the Strapi server: `cd nextjs-strapi-deliveroo-clone-tutorial/server && yarn && yarn dev`.
- In an other tab, start the nextjs server: `cd nextjs-strapi-deliveroo-clone-tutorial/frontend && yarn && yarn dev`.
## Instruction:
Rename project path nextjs in command
## Code After:
This example is the result of the tutorial series "[Cooking a Deliveroo clone with Next.js (React), GraphQL, Strapi and Stripe](https://blog.strapi.io/strapi-next-setup/)
## Setup
To see it live:
- Clone the repository.
- Start the Strapi server: `cd nextjs-react-strapi-deliveroo-clone-tutorial/backend && yarn && yarn develop`.
- In an other tab, start the nextjs server: `cd nextjs-react-strapi-deliveroo-clone-tutorial/frontend && yarn && yarn dev`.
|
This example is the result of the tutorial series "[Cooking a Deliveroo clone with Next.js (React), GraphQL, Strapi and Stripe](https://blog.strapi.io/strapi-next-setup/)
## Setup
To see it live:
- Clone the repository.
- - Start the Strapi server: `cd nextjs-strapi-deliveroo-clone-tutorial/server && yarn && yarn dev`.
? ^ ^^^^
+ - Start the Strapi server: `cd nextjs-react-strapi-deliveroo-clone-tutorial/backend && yarn && yarn develop`.
? ++++++ ^^^^ ^^ ++++
- - In an other tab, start the nextjs server: `cd nextjs-strapi-deliveroo-clone-tutorial/frontend && yarn && yarn dev`.
+ - In an other tab, start the nextjs server: `cd nextjs-react-strapi-deliveroo-clone-tutorial/frontend && yarn && yarn dev`.
? ++++++
| 4 | 0.4 | 2 | 2 |
37333506e6866e7d0859c5068f115a3e1b9dec3a | test/test_coordinate.py | test/test_coordinate.py | import unittest
from src import coordinate
class TestRules(unittest.TestCase):
""" Tests for the coordinate module """
def test_get_x_board(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 4
actual_result = board_location.get_x_board()
self.assertEqual(actual_result, expected_result)
def test_get_y_board(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 6
actual_result = board_location.get_y_board()
self.assertEqual(actual_result, expected_result)
def test_get_x_array(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 3
actual_result = board_location.get_x_array()
self.assertEqual(actual_result, expected_result)
def test_get_y_array(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 5
actual_result = board_location.get_y_array()
self.assertEqual(actual_result, expected_result) | import unittest
from src import coordinate
class TestRules(unittest.TestCase):
""" Tests for the coordinate module """
def test_get_x_board(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 4
actual_result = board_location.get_x_board()
self.assertEqual(actual_result, expected_result)
def test_get_y_board(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 6
actual_result = board_location.get_y_board()
self.assertEqual(actual_result, expected_result)
def test_get_x_array(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 3
actual_result = board_location.get_x_array()
self.assertEqual(actual_result, expected_result)
def test_get_y_array(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 5
actual_result = board_location.get_y_array()
self.assertEqual(actual_result, expected_result)
def test_coordinate_bad_x(self):
self.assertRaises(TypeError, coordinate.Coordinate, "4", 6)
def test_coordinate_bad_y(self):
self.assertRaises(TypeError, coordinate.Coordinate, 4, "6")
def test_coordinate_bad_location(self):
self.assertRaises(ValueError, coordinate.Coordinate, 50, 100)
| Add unit tests for fail fast logic in convertCharToInt() | Add unit tests for fail fast logic in convertCharToInt()
| Python | mit | blairck/jaeger | python | ## Code Before:
import unittest
from src import coordinate
class TestRules(unittest.TestCase):
""" Tests for the coordinate module """
def test_get_x_board(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 4
actual_result = board_location.get_x_board()
self.assertEqual(actual_result, expected_result)
def test_get_y_board(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 6
actual_result = board_location.get_y_board()
self.assertEqual(actual_result, expected_result)
def test_get_x_array(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 3
actual_result = board_location.get_x_array()
self.assertEqual(actual_result, expected_result)
def test_get_y_array(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 5
actual_result = board_location.get_y_array()
self.assertEqual(actual_result, expected_result)
## Instruction:
Add unit tests for fail fast logic in convertCharToInt()
## Code After:
import unittest
from src import coordinate
class TestRules(unittest.TestCase):
""" Tests for the coordinate module """
def test_get_x_board(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 4
actual_result = board_location.get_x_board()
self.assertEqual(actual_result, expected_result)
def test_get_y_board(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 6
actual_result = board_location.get_y_board()
self.assertEqual(actual_result, expected_result)
def test_get_x_array(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 3
actual_result = board_location.get_x_array()
self.assertEqual(actual_result, expected_result)
def test_get_y_array(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 5
actual_result = board_location.get_y_array()
self.assertEqual(actual_result, expected_result)
def test_coordinate_bad_x(self):
self.assertRaises(TypeError, coordinate.Coordinate, "4", 6)
def test_coordinate_bad_y(self):
self.assertRaises(TypeError, coordinate.Coordinate, 4, "6")
def test_coordinate_bad_location(self):
self.assertRaises(ValueError, coordinate.Coordinate, 50, 100)
| import unittest
from src import coordinate
class TestRules(unittest.TestCase):
""" Tests for the coordinate module """
def test_get_x_board(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 4
actual_result = board_location.get_x_board()
self.assertEqual(actual_result, expected_result)
def test_get_y_board(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 6
actual_result = board_location.get_y_board()
self.assertEqual(actual_result, expected_result)
def test_get_x_array(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 3
actual_result = board_location.get_x_array()
self.assertEqual(actual_result, expected_result)
def test_get_y_array(self):
board_location = coordinate.Coordinate(4, 6)
expected_result = 5
actual_result = board_location.get_y_array()
self.assertEqual(actual_result, expected_result)
+
+ def test_coordinate_bad_x(self):
+ self.assertRaises(TypeError, coordinate.Coordinate, "4", 6)
+
+ def test_coordinate_bad_y(self):
+ self.assertRaises(TypeError, coordinate.Coordinate, 4, "6")
+
+ def test_coordinate_bad_location(self):
+ self.assertRaises(ValueError, coordinate.Coordinate, 50, 100) | 9 | 0.310345 | 9 | 0 |
e86e9e5d55a7119a35bb6ac2349467fca623f072 | byceps/blueprints/core/templates/layout/_notifications.html | byceps/blueprints/core/templates/layout/_notifications.html | {%- from 'macros/misc.html' import render_notification %}
{%- with flashed_messages = get_flashed_messages() %}
{%- if flashed_messages %}
<div class="notifications">
{%- for text, text_is_safe, category, icon in flashed_messages %}
{{ render_notification(text|safe if text_is_safe else text, category=category, icon=icon if (icon is not none) else category) }}
{%- endfor %}
</div>
{%- endif %}
{%- endwith %}
| {%- from 'macros/misc.html' import render_notification %}
{%- with flashed_messages = get_flashed_messages() %}
{%- if flashed_messages %}
<div class="notifications">
{%- for text, text_is_safe, category, icon in flashed_messages %}
{{ render_notification(text|safe if text_is_safe else text, category=category, icon=icon) }}
{%- endfor %}
</div>
{%- endif %}
{%- endwith %}
| Remove icon fallback to category | Remove icon fallback to category
| HTML | bsd-3-clause | m-ober/byceps,homeworkprod/byceps,homeworkprod/byceps,m-ober/byceps,homeworkprod/byceps,m-ober/byceps | html | ## Code Before:
{%- from 'macros/misc.html' import render_notification %}
{%- with flashed_messages = get_flashed_messages() %}
{%- if flashed_messages %}
<div class="notifications">
{%- for text, text_is_safe, category, icon in flashed_messages %}
{{ render_notification(text|safe if text_is_safe else text, category=category, icon=icon if (icon is not none) else category) }}
{%- endfor %}
</div>
{%- endif %}
{%- endwith %}
## Instruction:
Remove icon fallback to category
## Code After:
{%- from 'macros/misc.html' import render_notification %}
{%- with flashed_messages = get_flashed_messages() %}
{%- if flashed_messages %}
<div class="notifications">
{%- for text, text_is_safe, category, icon in flashed_messages %}
{{ render_notification(text|safe if text_is_safe else text, category=category, icon=icon) }}
{%- endfor %}
</div>
{%- endif %}
{%- endwith %}
| {%- from 'macros/misc.html' import render_notification %}
{%- with flashed_messages = get_flashed_messages() %}
{%- if flashed_messages %}
<div class="notifications">
{%- for text, text_is_safe, category, icon in flashed_messages %}
- {{ render_notification(text|safe if text_is_safe else text, category=category, icon=icon if (icon is not none) else category) }}
? ------------------------------------
+ {{ render_notification(text|safe if text_is_safe else text, category=category, icon=icon) }}
{%- endfor %}
</div>
{%- endif %}
{%- endwith %} | 2 | 0.166667 | 1 | 1 |
50d447a546cd939594aeb8fda84167cef27f0d5e | msmbuilder/scripts/msmb.py | msmbuilder/scripts/msmb.py | """Statistical models for biomolecular dynamics"""
from __future__ import print_function, absolute_import, division
import sys
from ..cmdline import App
from ..commands import *
from ..version import version
# the commands register themselves when they're imported
class MSMBuilderApp(App):
def _subcommands(self):
cmds = super(MSMBuilderApp, self)._subcommands()
# sort the commands in some arbitrary order.
return sorted(cmds, key=lambda e: ''.join(x.__name__ for x in e.mro()))
def main():
try:
app = MSMBuilderApp(name='MSMBuilder', description=__doc__)
app.start()
except RuntimeError as e:
sys.exit("Error: %s" % e)
except Exception as e:
message = """\
An unexpected error has occurred with MSMBuilder (version %s), please
consider sending the following traceback to MSMBuilder GitHub issue tracker at:
https://github.com/msmbuilder/msmbuilder/issues
"""
print(message % version, file=sys.stderr)
raise # as if we did not catch it
if __name__ == '__main__':
main()
| """Statistical models for biomolecular dynamics"""
from __future__ import print_function, absolute_import, division
import sys
from ..cmdline import App
from ..commands import *
from ..version import version
# the commands register themselves when they're imported
# Load external commands which register themselves
# with entry point msmbuilder.commands
from pkg_resources import iter_entry_points
for ep in iter_entry_points("msmbuilder.commands"):
external_command = ep.load()
# Some groups start with numbers for ordering
# Some start with descriptions e.g. "MSM"
# Let's set the group to start with ZZZ to put plugins last.
external_command._group = "ZZZ-External_" + external_command._group
class MSMBuilderApp(App):
pass
def main():
try:
app = MSMBuilderApp(name='MSMBuilder', description=__doc__)
app.start()
except RuntimeError as e:
sys.exit("Error: %s" % e)
except Exception as e:
message = """\
An unexpected error has occurred with MSMBuilder (version %s), please
consider sending the following traceback to MSMBuilder GitHub issue tracker at:
https://github.com/msmbuilder/msmbuilder/issues
"""
print(message % version, file=sys.stderr)
raise # as if we did not catch it
if __name__ == '__main__':
main()
| Load plugins from entry point | Load plugins from entry point
| Python | lgpl-2.1 | brookehus/msmbuilder,stephenliu1989/msmbuilder,peastman/msmbuilder,brookehus/msmbuilder,dr-nate/msmbuilder,dotsdl/msmbuilder,peastman/msmbuilder,msultan/msmbuilder,mpharrigan/mixtape,stephenliu1989/msmbuilder,cxhernandez/msmbuilder,rmcgibbo/msmbuilder,cxhernandez/msmbuilder,msultan/msmbuilder,brookehus/msmbuilder,stephenliu1989/msmbuilder,msmbuilder/msmbuilder,msultan/msmbuilder,peastman/msmbuilder,dr-nate/msmbuilder,Eigenstate/msmbuilder,brookehus/msmbuilder,dr-nate/msmbuilder,dotsdl/msmbuilder,brookehus/msmbuilder,peastman/msmbuilder,mpharrigan/mixtape,Eigenstate/msmbuilder,msmbuilder/msmbuilder,dotsdl/msmbuilder,rmcgibbo/msmbuilder,rafwiewiora/msmbuilder,mpharrigan/mixtape,rmcgibbo/msmbuilder,dotsdl/msmbuilder,stephenliu1989/msmbuilder,rmcgibbo/msmbuilder,mpharrigan/mixtape,msmbuilder/msmbuilder,msultan/msmbuilder,dr-nate/msmbuilder,dr-nate/msmbuilder,Eigenstate/msmbuilder,cxhernandez/msmbuilder,rafwiewiora/msmbuilder,rafwiewiora/msmbuilder,peastman/msmbuilder,msmbuilder/msmbuilder,cxhernandez/msmbuilder,rafwiewiora/msmbuilder,rafwiewiora/msmbuilder,Eigenstate/msmbuilder,msmbuilder/msmbuilder,msultan/msmbuilder,cxhernandez/msmbuilder,mpharrigan/mixtape,Eigenstate/msmbuilder | python | ## Code Before:
"""Statistical models for biomolecular dynamics"""
from __future__ import print_function, absolute_import, division
import sys
from ..cmdline import App
from ..commands import *
from ..version import version
# the commands register themselves when they're imported
class MSMBuilderApp(App):
def _subcommands(self):
cmds = super(MSMBuilderApp, self)._subcommands()
# sort the commands in some arbitrary order.
return sorted(cmds, key=lambda e: ''.join(x.__name__ for x in e.mro()))
def main():
try:
app = MSMBuilderApp(name='MSMBuilder', description=__doc__)
app.start()
except RuntimeError as e:
sys.exit("Error: %s" % e)
except Exception as e:
message = """\
An unexpected error has occurred with MSMBuilder (version %s), please
consider sending the following traceback to MSMBuilder GitHub issue tracker at:
https://github.com/msmbuilder/msmbuilder/issues
"""
print(message % version, file=sys.stderr)
raise # as if we did not catch it
if __name__ == '__main__':
main()
## Instruction:
Load plugins from entry point
## Code After:
"""Statistical models for biomolecular dynamics"""
from __future__ import print_function, absolute_import, division
import sys
from ..cmdline import App
from ..commands import *
from ..version import version
# the commands register themselves when they're imported
# Load external commands which register themselves
# with entry point msmbuilder.commands
from pkg_resources import iter_entry_points
for ep in iter_entry_points("msmbuilder.commands"):
external_command = ep.load()
# Some groups start with numbers for ordering
# Some start with descriptions e.g. "MSM"
# Let's set the group to start with ZZZ to put plugins last.
external_command._group = "ZZZ-External_" + external_command._group
class MSMBuilderApp(App):
pass
def main():
try:
app = MSMBuilderApp(name='MSMBuilder', description=__doc__)
app.start()
except RuntimeError as e:
sys.exit("Error: %s" % e)
except Exception as e:
message = """\
An unexpected error has occurred with MSMBuilder (version %s), please
consider sending the following traceback to MSMBuilder GitHub issue tracker at:
https://github.com/msmbuilder/msmbuilder/issues
"""
print(message % version, file=sys.stderr)
raise # as if we did not catch it
if __name__ == '__main__':
main()
| """Statistical models for biomolecular dynamics"""
from __future__ import print_function, absolute_import, division
import sys
from ..cmdline import App
from ..commands import *
from ..version import version
# the commands register themselves when they're imported
+ # Load external commands which register themselves
+ # with entry point msmbuilder.commands
+ from pkg_resources import iter_entry_points
+
+ for ep in iter_entry_points("msmbuilder.commands"):
+ external_command = ep.load()
+ # Some groups start with numbers for ordering
+ # Some start with descriptions e.g. "MSM"
+ # Let's set the group to start with ZZZ to put plugins last.
+ external_command._group = "ZZZ-External_" + external_command._group
+
class MSMBuilderApp(App):
+ pass
- def _subcommands(self):
- cmds = super(MSMBuilderApp, self)._subcommands()
-
- # sort the commands in some arbitrary order.
- return sorted(cmds, key=lambda e: ''.join(x.__name__ for x in e.mro()))
def main():
try:
app = MSMBuilderApp(name='MSMBuilder', description=__doc__)
app.start()
except RuntimeError as e:
sys.exit("Error: %s" % e)
except Exception as e:
message = """\
An unexpected error has occurred with MSMBuilder (version %s), please
consider sending the following traceback to MSMBuilder GitHub issue tracker at:
https://github.com/msmbuilder/msmbuilder/issues
"""
print(message % version, file=sys.stderr)
raise # as if we did not catch it
+
if __name__ == '__main__':
main() | 18 | 0.529412 | 13 | 5 |
dde3ee5113fd0ae40cfb6788e4bd0a2a8c60eb72 | t/version_t.c | t/version_t.c |
int main(void)
{
const char *version = MMDB_lib_version();
ok(version != NULL, "MMDB_lib_version exists");
ok(strcmp(version, PACKAGE_VERSION) == 0, "version is " PACKAGE_VERSION);
done_testing();
}
|
int main(void)
{
const char *version = MMDB_lib_version();
ok(version != NULL, "MMDB_lib_version exists");
if ( version )
ok(strcmp(version, PACKAGE_VERSION) == 0, "version is " PACKAGE_VERSION);
done_testing();
}
| Check version string only if MMDB_lib_version != NULL | Check version string only if MMDB_lib_version != NULL
| C | apache-2.0 | maxmind/libmaxminddb,maxmind/libmaxminddb,maxmind/libmaxminddb | c | ## Code Before:
int main(void)
{
const char *version = MMDB_lib_version();
ok(version != NULL, "MMDB_lib_version exists");
ok(strcmp(version, PACKAGE_VERSION) == 0, "version is " PACKAGE_VERSION);
done_testing();
}
## Instruction:
Check version string only if MMDB_lib_version != NULL
## Code After:
int main(void)
{
const char *version = MMDB_lib_version();
ok(version != NULL, "MMDB_lib_version exists");
if ( version )
ok(strcmp(version, PACKAGE_VERSION) == 0, "version is " PACKAGE_VERSION);
done_testing();
}
|
int main(void)
{
const char *version = MMDB_lib_version();
ok(version != NULL, "MMDB_lib_version exists");
+ if ( version )
- ok(strcmp(version, PACKAGE_VERSION) == 0, "version is " PACKAGE_VERSION);
+ ok(strcmp(version, PACKAGE_VERSION) == 0, "version is " PACKAGE_VERSION);
? ++++
done_testing();
} | 3 | 0.375 | 2 | 1 |
f4f9e78f93766c09649c4ed532e73cb01a487bd2 | lib/souschef/template.rb | lib/souschef/template.rb | require 'souschef/template/base'
require 'souschef/template/metadata'
require 'souschef/template/license'
require 'souschef/template/readme'
require 'souschef/template/rubocop'
require 'souschef/template/spec_helper'
require 'souschef/template/serverspec'
module Souschef
# Creates various files from predefined templates
class Template
# Public - Create needed standardised files
#
# Returns nil
def self.run(cookbook)
Souschef::Template::Rubocop.new.create(cookbook)
Souschef::Template::Spechelper.new.create(cookbook)
Souschef::Template::Serverspec.new.create(cookbook)
Souschef::Template::Metadata.new.create(cookbook)
Souschef::Template::License.new.create(cookbook)
Souschef::Template::Readme.new.create(cookbook)
end
end
end
| require 'souschef/template/base'
require 'souschef/template/metadata'
require 'souschef/template/license'
require 'souschef/template/readme'
require 'souschef/template/rubocop'
require 'souschef/template/spec_helper'
require 'souschef/template/serverspec'
require 'souschef/template/rakefile'
module Souschef
# Creates various files from predefined templates
class Template
# Public - Create needed standardised files
#
# Returns nil
def self.run(cookbook)
Souschef::Template::Rubocop.new.create(cookbook)
Souschef::Template::Spechelper.new.create(cookbook)
Souschef::Template::Serverspec.new.create(cookbook)
Souschef::Template::Metadata.new.create(cookbook)
Souschef::Template::License.new.create(cookbook)
Souschef::Template::Readme.new.create(cookbook)
Souschef::Template::Rakefile.new.create(cookbook)
end
end
end
| Add Rakefile creation to the list | Add Rakefile creation to the list
| Ruby | mit | akrasic/souschef,akrasic/souschef | ruby | ## Code Before:
require 'souschef/template/base'
require 'souschef/template/metadata'
require 'souschef/template/license'
require 'souschef/template/readme'
require 'souschef/template/rubocop'
require 'souschef/template/spec_helper'
require 'souschef/template/serverspec'
module Souschef
# Creates various files from predefined templates
class Template
# Public - Create needed standardised files
#
# Returns nil
def self.run(cookbook)
Souschef::Template::Rubocop.new.create(cookbook)
Souschef::Template::Spechelper.new.create(cookbook)
Souschef::Template::Serverspec.new.create(cookbook)
Souschef::Template::Metadata.new.create(cookbook)
Souschef::Template::License.new.create(cookbook)
Souschef::Template::Readme.new.create(cookbook)
end
end
end
## Instruction:
Add Rakefile creation to the list
## Code After:
require 'souschef/template/base'
require 'souschef/template/metadata'
require 'souschef/template/license'
require 'souschef/template/readme'
require 'souschef/template/rubocop'
require 'souschef/template/spec_helper'
require 'souschef/template/serverspec'
require 'souschef/template/rakefile'
module Souschef
# Creates various files from predefined templates
class Template
# Public - Create needed standardised files
#
# Returns nil
def self.run(cookbook)
Souschef::Template::Rubocop.new.create(cookbook)
Souschef::Template::Spechelper.new.create(cookbook)
Souschef::Template::Serverspec.new.create(cookbook)
Souschef::Template::Metadata.new.create(cookbook)
Souschef::Template::License.new.create(cookbook)
Souschef::Template::Readme.new.create(cookbook)
Souschef::Template::Rakefile.new.create(cookbook)
end
end
end
| require 'souschef/template/base'
require 'souschef/template/metadata'
require 'souschef/template/license'
require 'souschef/template/readme'
require 'souschef/template/rubocop'
require 'souschef/template/spec_helper'
require 'souschef/template/serverspec'
+ require 'souschef/template/rakefile'
module Souschef
# Creates various files from predefined templates
class Template
# Public - Create needed standardised files
#
# Returns nil
def self.run(cookbook)
Souschef::Template::Rubocop.new.create(cookbook)
Souschef::Template::Spechelper.new.create(cookbook)
Souschef::Template::Serverspec.new.create(cookbook)
Souschef::Template::Metadata.new.create(cookbook)
Souschef::Template::License.new.create(cookbook)
Souschef::Template::Readme.new.create(cookbook)
+ Souschef::Template::Rakefile.new.create(cookbook)
end
end
end | 2 | 0.083333 | 2 | 0 |
c9b452ac3d36296c48425cc33256b7246bf4c469 | Resources/config/grids/payment_method.yml | Resources/config/grids/payment_method.yml | sylius_grid:
grids:
sylius_admin_payment_method:
driver:
name: doctrine/orm
options:
class: "%sylius.model.payment_method.class%"
repository:
method: createListQueryBuilder
arguments: ["%locale%"]
sorting:
position: asc
fields:
position:
type: twig
label: sylius.ui.position
sortable: ~
options:
template: "@SyliusUi/Grid/Field/position.html.twig"
code:
type: string
label: sylius.ui.code
sortable: ~
name:
type: string
label: sylius.ui.name
sortable: translation.name
gateway:
type: string
path: gatewayConfig.gatewayName
label: sylius.ui.gateway
sortable: ~
enabled:
type: twig
label: sylius.ui.enabled
sortable: ~
options:
template: "@SyliusUi/Grid/Field/enabled.html.twig"
filters:
search:
type: string
label: sylius.ui.search
options:
fields: [code, translation.name]
enabled:
type: boolean
label: sylius.ui.enabled
actions:
main:
create:
type: create_payment_method
item:
update:
type: update
delete:
type: delete
| sylius_grid:
grids:
sylius_admin_payment_method:
driver:
name: doctrine/orm
options:
class: "%sylius.model.payment_method.class%"
repository:
method: createListQueryBuilder
arguments: ["%locale%"]
sorting:
position: asc
fields:
position:
type: twig
label: sylius.ui.position
sortable: ~
options:
template: "@SyliusUi/Grid/Field/position.html.twig"
code:
type: string
label: sylius.ui.code
sortable: ~
name:
type: string
label: sylius.ui.name
sortable: translation.name
gateway:
type: string
path: gatewayConfig.gatewayName
label: sylius.ui.gateway
sortable: gatewayConfig.gatewayName
enabled:
type: twig
label: sylius.ui.enabled
sortable: ~
options:
template: "@SyliusUi/Grid/Field/enabled.html.twig"
filters:
search:
type: string
label: sylius.ui.search
options:
fields: [code, translation.name]
enabled:
type: boolean
label: sylius.ui.enabled
actions:
main:
create:
type: create_payment_method
item:
update:
type: update
delete:
type: delete
| Fix issue-7765. Allow ordering payment methods by gateway config name | Fix issue-7765. Allow ordering payment methods by gateway config name
| YAML | mit | Sylius/SyliusAdminBundle,Sylius/SyliusAdminBundle | yaml | ## Code Before:
sylius_grid:
grids:
sylius_admin_payment_method:
driver:
name: doctrine/orm
options:
class: "%sylius.model.payment_method.class%"
repository:
method: createListQueryBuilder
arguments: ["%locale%"]
sorting:
position: asc
fields:
position:
type: twig
label: sylius.ui.position
sortable: ~
options:
template: "@SyliusUi/Grid/Field/position.html.twig"
code:
type: string
label: sylius.ui.code
sortable: ~
name:
type: string
label: sylius.ui.name
sortable: translation.name
gateway:
type: string
path: gatewayConfig.gatewayName
label: sylius.ui.gateway
sortable: ~
enabled:
type: twig
label: sylius.ui.enabled
sortable: ~
options:
template: "@SyliusUi/Grid/Field/enabled.html.twig"
filters:
search:
type: string
label: sylius.ui.search
options:
fields: [code, translation.name]
enabled:
type: boolean
label: sylius.ui.enabled
actions:
main:
create:
type: create_payment_method
item:
update:
type: update
delete:
type: delete
## Instruction:
Fix issue-7765. Allow ordering payment methods by gateway config name
## Code After:
sylius_grid:
grids:
sylius_admin_payment_method:
driver:
name: doctrine/orm
options:
class: "%sylius.model.payment_method.class%"
repository:
method: createListQueryBuilder
arguments: ["%locale%"]
sorting:
position: asc
fields:
position:
type: twig
label: sylius.ui.position
sortable: ~
options:
template: "@SyliusUi/Grid/Field/position.html.twig"
code:
type: string
label: sylius.ui.code
sortable: ~
name:
type: string
label: sylius.ui.name
sortable: translation.name
gateway:
type: string
path: gatewayConfig.gatewayName
label: sylius.ui.gateway
sortable: gatewayConfig.gatewayName
enabled:
type: twig
label: sylius.ui.enabled
sortable: ~
options:
template: "@SyliusUi/Grid/Field/enabled.html.twig"
filters:
search:
type: string
label: sylius.ui.search
options:
fields: [code, translation.name]
enabled:
type: boolean
label: sylius.ui.enabled
actions:
main:
create:
type: create_payment_method
item:
update:
type: update
delete:
type: delete
| sylius_grid:
grids:
sylius_admin_payment_method:
driver:
name: doctrine/orm
options:
class: "%sylius.model.payment_method.class%"
repository:
method: createListQueryBuilder
arguments: ["%locale%"]
sorting:
position: asc
fields:
position:
type: twig
label: sylius.ui.position
sortable: ~
options:
template: "@SyliusUi/Grid/Field/position.html.twig"
code:
type: string
label: sylius.ui.code
sortable: ~
name:
type: string
label: sylius.ui.name
sortable: translation.name
gateway:
type: string
path: gatewayConfig.gatewayName
label: sylius.ui.gateway
- sortable: ~
+ sortable: gatewayConfig.gatewayName
enabled:
type: twig
label: sylius.ui.enabled
sortable: ~
options:
template: "@SyliusUi/Grid/Field/enabled.html.twig"
filters:
search:
type: string
label: sylius.ui.search
options:
fields: [code, translation.name]
enabled:
type: boolean
label: sylius.ui.enabled
actions:
main:
create:
type: create_payment_method
item:
update:
type: update
delete:
type: delete | 2 | 0.035714 | 1 | 1 |
ea004041683329cedbd81f39934cc06473959c7d | lib/jekyll/assets/compressors/uglify.rb | lib/jekyll/assets/compressors/uglify.rb |
module Jekyll
module Assets
module Compressors
class Uglify < Sprockets::UglifierCompressor
def call(input)
out = super(input)
Hook.trigger :asset, :after_compression do |h|
h.call(input, out, "application/javascript")
end
out
end
end
Hook.register :env, :before_init, priority: 1 do |e|
uglifier_config = e.asset_config[:compressors][:uglifier].symbolize_keys
Sprockets.register_compressor "application/javascript",
:assets_uglify,
Uglify.new(uglifier_config)
end
Hook.register :env, :after_init, priority: 3 do |e|
e.js_compressor = nil
next unless e.asset_config[:compression]
if Utils.javascript?
e.js_compressor = :assets_uglify
end
end
end
end
end
|
module Jekyll
module Assets
module Compressors
class Uglify < Sprockets::UglifierCompressor
def call(input)
out = super(input)
Hook.trigger :asset, :after_compression do |h|
h.call(input, out, "application/javascript")
end
out
end
end
# rubocop:disable Metrics/LineLength
Sprockets.register_compressor "application/javascript", :assets_uglify, Uglify
Hook.register :env, :after_init, priority: 3 do |e|
next unless e.asset_config[:compression]
e.js_compressor = nil unless Utils.javascript?
config = e.asset_config[:compressors][:uglifier].symbolize_keys
e.js_compressor = Uglify.new(config) if Utils.javascript?
end
end
end
end
 | Make sure our compressor is run, the hooks are necessary. | Make sure our compressor is run, the hooks are necessary.
| Ruby | mit | mattbeedle/jekyll-assets,mattbeedle/jekyll-assets,jekyll-assets/jekyll-assets,mattbeedle/jekyll-assets,jekyll/jekyll-assets,jekyll-assets/jekyll-assets,jekyll/jekyll-assets,jeremy/jekyll-assets,mattbeedle/jekyll-assets,jekyll-assets/jekyll-assets,jeremy/jekyll-assets,jeremy/jekyll-assets,jekyll/jekyll-assets,jekyll/jekyll-assets,jeremy/jekyll-assets | ruby | ## Code Before:
module Jekyll
module Assets
module Compressors
class Uglify < Sprockets::UglifierCompressor
def call(input)
out = super(input)
Hook.trigger :asset, :after_compression do |h|
h.call(input, out, "application/javascript")
end
out
end
end
Hook.register :env, :before_init, priority: 1 do |e|
uglifier_config = e.asset_config[:compressors][:uglifier].symbolize_keys
Sprockets.register_compressor "application/javascript",
:assets_uglify,
Uglify.new(uglifier_config)
end
Hook.register :env, :after_init, priority: 3 do |e|
e.js_compressor = nil
next unless e.asset_config[:compression]
if Utils.javascript?
e.js_compressor = :assets_uglify
end
end
end
end
end
## Instruction:
Make sure our compressor is run, the hooks are necessary.
## Code After:
module Jekyll
module Assets
module Compressors
class Uglify < Sprockets::UglifierCompressor
def call(input)
out = super(input)
Hook.trigger :asset, :after_compression do |h|
h.call(input, out, "application/javascript")
end
out
end
end
# rubocop:disable Metrics/LineLength
Sprockets.register_compressor "application/javascript", :assets_uglify, Uglify
Hook.register :env, :after_init, priority: 3 do |e|
next unless e.asset_config[:compression]
e.js_compressor = nil unless Utils.javascript?
config = e.asset_config[:compressors][:uglifier].symbolize_keys
e.js_compressor = Uglify.new(config) if Utils.javascript?
end
end
end
end
|
module Jekyll
module Assets
module Compressors
class Uglify < Sprockets::UglifierCompressor
def call(input)
out = super(input)
Hook.trigger :asset, :after_compression do |h|
h.call(input, out, "application/javascript")
end
out
end
end
+ # rubocop:disable Metrics/LineLength
+ Sprockets.register_compressor "application/javascript", :assets_uglify, Uglify
- Hook.register :env, :before_init, priority: 1 do |e|
? ^ -- - ^
+ Hook.register :env, :after_init, priority: 3 do |e|
? ^^^ ^
- uglifier_config = e.asset_config[:compressors][:uglifier].symbolize_keys
+ next unless e.asset_config[:compression]
+ e.js_compressor = nil unless Utils.javascript?
+ config = e.asset_config[:compressors][:uglifier].symbolize_keys
+ e.js_compressor = Uglify.new(config) if Utils.javascript?
- Sprockets.register_compressor "application/javascript",
- :assets_uglify,
- Uglify.new(uglifier_config)
- end
-
- Hook.register :env, :after_init, priority: 3 do |e|
- e.js_compressor = nil
- next unless e.asset_config[:compression]
- if Utils.javascript?
- e.js_compressor = :assets_uglify
- end
end
end
end
end | 20 | 0.625 | 7 | 13 |
9b6e9bb18c5fbce33685c64f58c9bf199b97804a | app/views/settings/types/_image.html.erb | app/views/settings/types/_image.html.erb | <div class="row">
<div class="col-sm-6">
<%= f.input :value, as: :image, label: false, input_html: { field_name: 'setting_value', img_src: @setting.value, path_only: true } %>
</div>
<div class="col-sm-6">
<%= f.input :value, label: 'Image URL', readonly: true %>
</div>
</div>
| <div class="image-to-path-wrapper">
<div class="row">
<div class="col-sm-6">
<%= f.input :value, as: :image, label: false, input_html: { field_name: 'setting_value', img_src: @setting.value, path_only: true } %>
</div>
<div class="col-sm-6">
<%= f.input :value, label: 'Image URL', readonly: true %>
</div>
</div>
</div>
| Fix image-to-path image setting issue | Fix image-to-path image setting issue
| HTML+ERB | mit | dylanfisher/forest,dylanfisher/forest,dylanfisher/forest | html+erb | ## Code Before:
<div class="row">
<div class="col-sm-6">
<%= f.input :value, as: :image, label: false, input_html: { field_name: 'setting_value', img_src: @setting.value, path_only: true } %>
</div>
<div class="col-sm-6">
<%= f.input :value, label: 'Image URL', readonly: true %>
</div>
</div>
## Instruction:
Fix image-to-path image setting issue
## Code After:
<div class="image-to-path-wrapper">
<div class="row">
<div class="col-sm-6">
<%= f.input :value, as: :image, label: false, input_html: { field_name: 'setting_value', img_src: @setting.value, path_only: true } %>
</div>
<div class="col-sm-6">
<%= f.input :value, label: 'Image URL', readonly: true %>
</div>
</div>
</div>
| + <div class="image-to-path-wrapper">
- <div class="row">
+ <div class="row">
? ++
- <div class="col-sm-6">
+ <div class="col-sm-6">
? ++
- <%= f.input :value, as: :image, label: false, input_html: { field_name: 'setting_value', img_src: @setting.value, path_only: true } %>
+ <%= f.input :value, as: :image, label: false, input_html: { field_name: 'setting_value', img_src: @setting.value, path_only: true } %>
? ++
- </div>
+ </div>
? ++
- <div class="col-sm-6">
+ <div class="col-sm-6">
? ++
- <%= f.input :value, label: 'Image URL', readonly: true %>
+ <%= f.input :value, label: 'Image URL', readonly: true %>
? ++
+ </div>
</div>
</div> | 14 | 1.75 | 8 | 6 |
0699d48d4a5958f3ec4812fea3d29dba1265f3bd | app/views/articles/new.html.erb | app/views/articles/new.html.erb | <div class = "container">
<div class = "jumbotron center">
<%= form_tag("/articles") do %>
<div>
<p class = "header">Article Title:</p>
<%= text_field_tag('title',"", class: 'textAreaForm') %>
<p class = "header">Article Content:</p>
<%= text_area_tag('content', "", class: 'textAreaForm') %>
<p class = "header">Picure URL:</p>
<%= text_field_tag('pic_url',"", class: 'picURLfield') %>
<p class = "header">Article Type:</p>
<div class="col-md-4">
<%= label_tag 'Article', 'Create Blog Post', class: "radioButtons" %> <%= radio_button_tag 'blog_post', true %>
</div>
<div class="col-md-4">
<%= label_tag 'Article', 'Create Project', class: "radioButtons" %> <%= radio_button_tag 'project', true %>
</div>
<div class="col-md-4">
<%= label_tag 'Article', 'Create About Me', class: "radioButtons" %> <%= radio_button_tag 'about_me', true %>
</div>
</div>
<% if @errors %>
<div class = "center error">
<% @errors.each do |error| %>
<%= error %>
<% end %>
</div>
<% end %>
<div class = "center"><%= submit_tag "Create Article", :class=>"button", :id=> "newArticleButton" %></div>
<% end %>
</div>
</div> | <div class = "container">
<div class = "jumbotron center">
<%= form_tag("/articles") do %>
<div>
<p class = "header">Article Title:</p>
<%= text_field_tag('title',"", class: 'textAreaForm') %>
<p class = "header">Picure URL:</p>
<%= text_field_tag('pic_url',"", class: 'picURLfield') %>
<p class = "header">Article Content:</p>
<%= text_area_tag('content', "", class: 'textAreaForm') %>
<p class = "header">Article Type:</p>
<div class="col-md-4">
<%= label_tag 'Article', 'Create Blog Post', class: "radioButtons" %> <%= radio_button_tag 'blog_post', true %>
</div>
<div class="col-md-4">
<%= label_tag 'Article', 'Create Project', class: "radioButtons" %> <%= radio_button_tag 'project', true %>
</div>
<div class="col-md-4">
<%= label_tag 'Article', 'Create About Me', class: "radioButtons" %> <%= radio_button_tag 'about_me', true %>
</div>
</div>
<% if @errors %>
<div class = "center error">
<% @errors.each do |error| %>
<%= error %>
<% end %>
</div>
<% end %>
<div class = "center"><%= submit_tag "Create Article", :class=>"button", :id=> "newArticleButton" %></div>
<% end %>
</div>
</div> | Switch pic_url and content on create / edit pages | Switch pic_url and content on create / edit pages
| HTML+ERB | mit | bburnett86/BretBlog,bburnett86/BretBlog,bburnett86/BretBlog | html+erb | ## Code Before:
<div class = "container">
<div class = "jumbotron center">
<%= form_tag("/articles") do %>
<div>
<p class = "header">Article Title:</p>
<%= text_field_tag('title',"", class: 'textAreaForm') %>
<p class = "header">Article Content:</p>
<%= text_area_tag('content', "", class: 'textAreaForm') %>
<p class = "header">Picure URL:</p>
<%= text_field_tag('pic_url',"", class: 'picURLfield') %>
<p class = "header">Article Type:</p>
<div class="col-md-4">
<%= label_tag 'Article', 'Create Blog Post', class: "radioButtons" %> <%= radio_button_tag 'blog_post', true %>
</div>
<div class="col-md-4">
<%= label_tag 'Article', 'Create Project', class: "radioButtons" %> <%= radio_button_tag 'project', true %>
</div>
<div class="col-md-4">
<%= label_tag 'Article', 'Create About Me', class: "radioButtons" %> <%= radio_button_tag 'about_me', true %>
</div>
</div>
<% if @errors %>
<div class = "center error">
<% @errors.each do |error| %>
<%= error %>
<% end %>
</div>
<% end %>
<div class = "center"><%= submit_tag "Create Article", :class=>"button", :id=> "newArticleButton" %></div>
<% end %>
</div>
</div>
## Instruction:
Switch pic_url and content on create / edit pages
## Code After:
<div class = "container">
<div class = "jumbotron center">
<%= form_tag("/articles") do %>
<div>
<p class = "header">Article Title:</p>
<%= text_field_tag('title',"", class: 'textAreaForm') %>
<p class = "header">Picure URL:</p>
<%= text_field_tag('pic_url',"", class: 'picURLfield') %>
<p class = "header">Article Content:</p>
<%= text_area_tag('content', "", class: 'textAreaForm') %>
<p class = "header">Article Type:</p>
<div class="col-md-4">
<%= label_tag 'Article', 'Create Blog Post', class: "radioButtons" %> <%= radio_button_tag 'blog_post', true %>
</div>
<div class="col-md-4">
<%= label_tag 'Article', 'Create Project', class: "radioButtons" %> <%= radio_button_tag 'project', true %>
</div>
<div class="col-md-4">
<%= label_tag 'Article', 'Create About Me', class: "radioButtons" %> <%= radio_button_tag 'about_me', true %>
</div>
</div>
<% if @errors %>
<div class = "center error">
<% @errors.each do |error| %>
<%= error %>
<% end %>
</div>
<% end %>
<div class = "center"><%= submit_tag "Create Article", :class=>"button", :id=> "newArticleButton" %></div>
<% end %>
</div>
</div> | <div class = "container">
<div class = "jumbotron center">
<%= form_tag("/articles") do %>
<div>
<p class = "header">Article Title:</p>
<%= text_field_tag('title',"", class: 'textAreaForm') %>
+ <p class = "header">Picure URL:</p>
+ <%= text_field_tag('pic_url',"", class: 'picURLfield') %>
<p class = "header">Article Content:</p>
<%= text_area_tag('content', "", class: 'textAreaForm') %>
- <p class = "header">Picure URL:</p>
- <%= text_field_tag('pic_url',"", class: 'picURLfield') %>
<p class = "header">Article Type:</p>
<div class="col-md-4">
<%= label_tag 'Article', 'Create Blog Post', class: "radioButtons" %> <%= radio_button_tag 'blog_post', true %>
</div>
<div class="col-md-4">
<%= label_tag 'Article', 'Create Project', class: "radioButtons" %> <%= radio_button_tag 'project', true %>
</div>
<div class="col-md-4">
<%= label_tag 'Article', 'Create About Me', class: "radioButtons" %> <%= radio_button_tag 'about_me', true %>
</div>
</div>
<% if @errors %>
<div class = "center error">
<% @errors.each do |error| %>
<%= error %>
<% end %>
</div>
<% end %>
<div class = "center"><%= submit_tag "Create Article", :class=>"button", :id=> "newArticleButton" %></div>
<% end %>
</div>
</div> | 4 | 0.125 | 2 | 2 |
cd6ef5875ba877d10bf929f07821746fe66258d6 | S11-compunit/rt126904.t | S11-compunit/rt126904.t | use v6;
use Test;
plan 1;
my $proc = run 'perl6', '-I', 't/spec/packages/RT126904/lib', '-e',
'use Woohoo::Foo::Bar; use Woohoo::Foo::Baz; my Woohoo::Foo::Bar $bar;',
:err;
todo('because RT #126904', 1);
is($proc.err.slurp-rest, '', "compilation errors");
# vim: ft=perl6
| use v6;
use Test;
plan 1;
my $proc = run 'perl6', '-I', 't/spec/packages/RT126904/lib', '-e',
'use Woohoo::Foo::Bar; use Woohoo::Foo::Baz; my Woohoo::Foo::Bar $bar;',
:err;
is($proc.err.slurp-rest, '', "compilation errors");
# vim: ft=perl6
| Remove obsolete TODO marker - bug is fixed | Remove obsolete TODO marker - bug is fixed
| Perl | artistic-2.0 | zostay/roast,dankogai/roast,skids/roast,skids/roast,dankogai/roast,zostay/roast,b2gills/roast,perl6/roast,bitrauser/roast,bitrauser/roast,dankogai/roast,skids/roast,zostay/roast,dogbert17/roast,b2gills/roast,dogbert17/roast,b2gills/roast | perl | ## Code Before:
use v6;
use Test;
plan 1;
my $proc = run 'perl6', '-I', 't/spec/packages/RT126904/lib', '-e',
'use Woohoo::Foo::Bar; use Woohoo::Foo::Baz; my Woohoo::Foo::Bar $bar;',
:err;
todo('because RT #126904', 1);
is($proc.err.slurp-rest, '', "compilation errors");
# vim: ft=perl6
## Instruction:
Remove obsolete TODO marker - bug is fixed
## Code After:
use v6;
use Test;
plan 1;
my $proc = run 'perl6', '-I', 't/spec/packages/RT126904/lib', '-e',
'use Woohoo::Foo::Bar; use Woohoo::Foo::Baz; my Woohoo::Foo::Bar $bar;',
:err;
is($proc.err.slurp-rest, '', "compilation errors");
# vim: ft=perl6
| use v6;
use Test;
plan 1;
my $proc = run 'perl6', '-I', 't/spec/packages/RT126904/lib', '-e',
'use Woohoo::Foo::Bar; use Woohoo::Foo::Baz; my Woohoo::Foo::Bar $bar;',
:err;
- todo('because RT #126904', 1);
is($proc.err.slurp-rest, '', "compilation errors");
# vim: ft=perl6 | 1 | 0.076923 | 0 | 1 |
bfdf343bd760bac8b2b1aebf6a49889c9786e0cc | tslint.json | tslint.json | {
"extends": "tslint:latest",
"rules": {
"quotemark": [true, "single"],
"indent": [true, "tabs"],
"trailing-comma": false,
"variable-name": [true, "allow-leading-underscore"],
"interface-name": false,
"member-ordering": false,
"object-literal-sort-keys": false,
"array-type": [true, "array"],
"no-bitwise": false,
"no-namespace": false,
"no-string-literal": false
},
"rulesDirectory": []
}
| {
"extends": "tslint:latest",
"rules": {
"quotemark": [true, "single"],
"indent": [true, "tabs"],
"trailing-comma": false,
"variable-name": [true, "allow-leading-underscore"],
"interface-name": false,
"member-ordering": false,
"object-literal-sort-keys": false,
"array-type": [true, "array"],
"no-bitwise": false,
"no-namespace": false,
"no-string-literal": false,
"no-empty": false
},
"rulesDirectory": []
}
 | Allow empty block in linter for better testing | Allow empty block in linter for better testing
| JSON | mit | sygic-travel/js-sdk,sygic-travel/js-sdk,sygic-travel/js-sdk | json | ## Code Before:
{
"extends": "tslint:latest",
"rules": {
"quotemark": [true, "single"],
"indent": [true, "tabs"],
"trailing-comma": false,
"variable-name": [true, "allow-leading-underscore"],
"interface-name": false,
"member-ordering": false,
"object-literal-sort-keys": false,
"array-type": [true, "array"],
"no-bitwise": false,
"no-namespace": false,
"no-string-literal": false
},
"rulesDirectory": []
}
## Instruction:
Allow empty block in linter for better testing
## Code After:
{
"extends": "tslint:latest",
"rules": {
"quotemark": [true, "single"],
"indent": [true, "tabs"],
"trailing-comma": false,
"variable-name": [true, "allow-leading-underscore"],
"interface-name": false,
"member-ordering": false,
"object-literal-sort-keys": false,
"array-type": [true, "array"],
"no-bitwise": false,
"no-namespace": false,
"no-string-literal": false,
"no-empty": false
},
"rulesDirectory": []
}
| {
"extends": "tslint:latest",
"rules": {
"quotemark": [true, "single"],
"indent": [true, "tabs"],
"trailing-comma": false,
"variable-name": [true, "allow-leading-underscore"],
"interface-name": false,
"member-ordering": false,
"object-literal-sort-keys": false,
"array-type": [true, "array"],
"no-bitwise": false,
"no-namespace": false,
- "no-string-literal": false
+ "no-string-literal": false,
? +
+ "no-empty": false
},
"rulesDirectory": []
} | 3 | 0.176471 | 2 | 1 |
8ed8b0dbe4ddc430036c4754df284465e909e098 | source/includes/features/gitlab-ee-add-product.html.haml | source/includes/features/gitlab-ee-add-product.html.haml | .add-options-buttons.text-center
.container
.row
.customer-block.col-xs-12.col-md-4
%h4 Already a GitLab Enterprise Edition customer?
%a.btn.featured-item-button.btn-margin-top{ href: data.pricing.payment_url } Add #{feature_name}
.customer-block.col-xs-12.col-md-4
%h4 Not a GitLab Enterprise Edition customer?
%a.btn.featured-item-button.btn-margin-top{ href: "/free-trial/" } Try it risk-free for 30 days
.customer-block.col-xs-12.col-md-4
%h4 Have more questions?
%a.btn.featured-item-button.btn-margin-top{href: "https://about.gitlab.com/sales/"} Contact sales
| .add-options-buttons.text-center
.container
.row
.customer-block.col-xs-12.col-md-4
%h4 Already a GitLab EE Starter customer?
%a.btn.featured-item-button.btn-margin-top{ href: data.pricing.payment_url } Upgrade to GitLab EE Premium
.customer-block.col-xs-12.col-md-4
%h4 Not a GitLab Enterprise Edition customer?
%a.btn.featured-item-button.btn-margin-top{ href: "/free-trial/" } Try it risk-free for 30 days
.customer-block.col-xs-12.col-md-4
%h4 Have more questions?
%a.btn.featured-item-button.btn-margin-top{href: "https://about.gitlab.com/sales/"} Contact sales
| Update to promote an upgrade to EE Premium | Update to promote an upgrade to EE Premium | Haml | mit | damianhakert/damianhakert.github.io | haml | ## Code Before:
.add-options-buttons.text-center
.container
.row
.customer-block.col-xs-12.col-md-4
%h4 Already a GitLab Enterprise Edition customer?
%a.btn.featured-item-button.btn-margin-top{ href: data.pricing.payment_url } Add #{feature_name}
.customer-block.col-xs-12.col-md-4
%h4 Not a GitLab Enterprise Edition customer?
%a.btn.featured-item-button.btn-margin-top{ href: "/free-trial/" } Try it risk-free for 30 days
.customer-block.col-xs-12.col-md-4
%h4 Have more questions?
%a.btn.featured-item-button.btn-margin-top{href: "https://about.gitlab.com/sales/"} Contact sales
## Instruction:
Update to promote an upgrade to EE Premium
## Code After:
.add-options-buttons.text-center
.container
.row
.customer-block.col-xs-12.col-md-4
%h4 Already a GitLab EE Starter customer?
%a.btn.featured-item-button.btn-margin-top{ href: data.pricing.payment_url } Upgrade to GitLab EE Premium
.customer-block.col-xs-12.col-md-4
%h4 Not a GitLab Enterprise Edition customer?
%a.btn.featured-item-button.btn-margin-top{ href: "/free-trial/" } Try it risk-free for 30 days
.customer-block.col-xs-12.col-md-4
%h4 Have more questions?
%a.btn.featured-item-button.btn-margin-top{href: "https://about.gitlab.com/sales/"} Contact sales
| .add-options-buttons.text-center
.container
.row
.customer-block.col-xs-12.col-md-4
- %h4 Already a GitLab Enterprise Edition customer?
? ^ -------------
+ %h4 Already a GitLab EE Starter customer?
? ^^^^^^
- %a.btn.featured-item-button.btn-margin-top{ href: data.pricing.payment_url } Add #{feature_name}
? ^ ^ ^^^^ ^^ --- ^^
+ %a.btn.featured-item-button.btn-margin-top{ href: data.pricing.payment_url } Upgrade to GitLab EE Premium
? ^^^^^ ^ ^^^^^^^ ^^^^^^ ^^^
.customer-block.col-xs-12.col-md-4
%h4 Not a GitLab Enterprise Edition customer?
%a.btn.featured-item-button.btn-margin-top{ href: "/free-trial/" } Try it risk-free for 30 days
.customer-block.col-xs-12.col-md-4
%h4 Have more questions?
%a.btn.featured-item-button.btn-margin-top{href: "https://about.gitlab.com/sales/"} Contact sales | 4 | 0.333333 | 2 | 2 |
69fc2eccaa88189fd0de86d11206fa24d1508819 | tools/np_suppressions.py | tools/np_suppressions.py | suppressions = [
[ ".*/multiarray/mapping\.", "PyArray_MapIterReset" ],
# PyArray_Std trivially forwards to and appears to be superceded by
# __New_PyArray_Std, which is exercised by the test framework.
[ ".*/multiarray/calculation\.", "PyArray_Std" ],
# PyCapsule_Check is declared in a header, and used in
# multiarray/ctors.c. So it isn't really untested.
[ ".*/multiarray/common\.", "PyCapsule_Check" ],
]
| suppressions = [
# This one cannot be covered by any Python language test because there is
# no code pathway to it. But it is part of the C API, so must not be
# excised from the code.
[ r".*/multiarray/mapping\.", "PyArray_MapIterReset" ],
# PyArray_Std trivially forwards to and appears to be superceded by
# __New_PyArray_Std, which is exercised by the test framework.
[ r".*/multiarray/calculation\.", "PyArray_Std" ],
# PyCapsule_Check is declared in a header, and used in
# multiarray/ctors.c. So it isn't really untested.
[ r".*/multiarray/common\.", "PyCapsule_Check" ],
]
| Add documentation on one assertion, convert RE's to raw strings. | Add documentation on one assertion, convert RE's to raw strings.
| Python | bsd-3-clause | teoliphant/numpy-refactor,teoliphant/numpy-refactor,jasonmccampbell/numpy-refactor-sprint,jasonmccampbell/numpy-refactor-sprint,teoliphant/numpy-refactor,jasonmccampbell/numpy-refactor-sprint,jasonmccampbell/numpy-refactor-sprint,teoliphant/numpy-refactor,teoliphant/numpy-refactor | python | ## Code Before:
suppressions = [
[ ".*/multiarray/mapping\.", "PyArray_MapIterReset" ],
# PyArray_Std trivially forwards to and appears to be superceded by
# __New_PyArray_Std, which is exercised by the test framework.
[ ".*/multiarray/calculation\.", "PyArray_Std" ],
# PyCapsule_Check is declared in a header, and used in
# multiarray/ctors.c. So it isn't really untested.
[ ".*/multiarray/common\.", "PyCapsule_Check" ],
]
## Instruction:
Add documentation on one assertion, convert RE's to raw strings.
## Code After:
suppressions = [
# This one cannot be covered by any Python language test because there is
# no code pathway to it. But it is part of the C API, so must not be
# excised from the code.
[ r".*/multiarray/mapping\.", "PyArray_MapIterReset" ],
# PyArray_Std trivially forwards to and appears to be superceded by
# __New_PyArray_Std, which is exercised by the test framework.
[ r".*/multiarray/calculation\.", "PyArray_Std" ],
# PyCapsule_Check is declared in a header, and used in
# multiarray/ctors.c. So it isn't really untested.
[ r".*/multiarray/common\.", "PyCapsule_Check" ],
]
| suppressions = [
+ # This one cannot be covered by any Python language test because there is
+ # no code pathway to it. But it is part of the C API, so must not be
+ # excised from the code.
- [ ".*/multiarray/mapping\.", "PyArray_MapIterReset" ],
+ [ r".*/multiarray/mapping\.", "PyArray_MapIterReset" ],
? +
# PyArray_Std trivially forwards to and appears to be superceded by
# __New_PyArray_Std, which is exercised by the test framework.
- [ ".*/multiarray/calculation\.", "PyArray_Std" ],
+ [ r".*/multiarray/calculation\.", "PyArray_Std" ],
? +
# PyCapsule_Check is declared in a header, and used in
# multiarray/ctors.c. So it isn't really untested.
- [ ".*/multiarray/common\.", "PyCapsule_Check" ],
+ [ r".*/multiarray/common\.", "PyCapsule_Check" ],
? +
] | 9 | 0.818182 | 6 | 3 |
1aa1b1f767a94a583d43d34c8a78b2ce076b0b6a | docs/COMPATIBILITY.md | docs/COMPATIBILITY.md | Changes from the Ruby MAL API
=============================
The external API interface is based on and the output tries to match that
provided by the Unofficial MAL API coded by Chu Yeow. Because of differences
between implementations, some differences exist. They are listed below grouped
by the type of change.
Please note that this only applies to the "API 1.0" interface. Future revisions,
while based on the format of the Unofficial MAL API, do not offer output
compatibility.
Changes
-------
* The Ruby API encodes text with HTML entities. This code doesn't encode extended
characters and instead returns non-encoded UTF-8 text.
* The API only outputs JSON. (XML output may be added at a later time.)
* The verify_credentials endpoint now returns content in the response body. You
are free to ignore it and use the HTTP response codes, which are not changed.
New Features
------------
* The /friends/ API endpoint has been added.
* The search endpoints support more than 20 results, just use the page parameter
in the URL like in anime and manga upcoming.
Bugfixes
--------
* For series with one date, the Ruby API sets this as the end date. This now
sets it as the start date.
* For many errors and messages, the output wasn't well-formed. We attempt, to
the extent it is possible, to always return a well-formed message.
| Changes from the Ruby MAL API
=============================
*Note:* This document only applies to version 1.0 of the API interface. Later
versions, while similar in spirit, do not guarantee output compatibility.
The external API interface is based on and the output tries to match that
provided by the Unofficial MAL API coded by Chu Yeow. Because of differences
between implementations, some differences exist. They are listed below grouped
by the type of change.
Changes
-------
* The Ruby API encodes text with HTML entities. This API does not perform
encoding and returns the code as UTF-8 text.
* Only JSON is supported as an output format.
* The verify_credentials endpoint now returns content in the response body. You
are free to ignore it and use the HTTP response codes, which are not changed.
New Features
------------
* The /friends/ API endpoint has been added.
* The search endpoints support more than 20 results, just use the page parameter
in the URL similar to anime and manga upcoming.
Bugfixes
--------
* For series with only one date, the Ruby API sets this as the end date. This
now sets it as the start date.
* For many errors and messages, the output wasn't well-formed. We attempt, to
the extent it is possible, to always return a well-formed message.
| Update Compatibility Doc for Wording | Update Compatibility Doc for Wording
Adjust some of the wording in the compatibility doc. Add note to make it more clear this only applies to compat between the Ruby API and version 1.0 of the API in this project.
| Markdown | apache-2.0 | AnimeNeko/atarashii-mal-api,AnimeNeko/atarashii-mal-api,AnimeNeko/atarashii-mal-api,AnimeNeko/atarashii-mal-api | markdown | ## Code Before:
Changes from the Ruby MAL API
=============================
The external API interface is based on and the output tries to match that
provided by the Unofficial MAL API coded by Chu Yeow. Because of differences
between implementations, some differences exist. They are listed below grouped
by the type of change.
Please note that this only applies to the "API 1.0" interface. Future revisions,
while based on the format of the Unofficial MAL API, do not offer output
compatibility.
Changes
-------
* The Ruby API encodes text with HTML entities. This code doesn't encode extended
characters and instead returns non-encoded UTF-8 text.
* The API only outputs JSON. (XML output may be added at a later time.)
* The verify_credentials endpoint now returns content in the response body. You
are free to ignore it and use the HTTP response codes, which are not changed.
New Features
------------
* The /friends/ API endpoint has been added.
* The search endpoints support more than 20 results, just use the page parameter
in the URL like in anime and manga upcoming.
Bugfixes
--------
* For series with one date, the Ruby API sets this as the end date. This now
sets it as the start date.
* For many errors and messages, the output wasn't well-formed. We attempt, to
the extent it is possible, to always return a well-formed message.
## Instruction:
Update Compatibility Doc for Wording
Adjust some of the wording in the compatibility doc. Add note to make it more clear this only applies to compat between the Ruby API and version 1.0 of the API in this project.
## Code After:
Changes from the Ruby MAL API
=============================
*Note:* This document only applies to version 1.0 of the API interface. Later
versions, while similar in spirit, do not guarantee output compatibility.
The external API interface is based on and the output tries to match that
provided by the Unofficial MAL API coded by Chu Yeow. Because of differences
between implementations, some differences exist. They are listed below grouped
by the type of change.
Changes
-------
* The Ruby API encodes text with HTML entities. This API does not perform
encoding and returns the code as UTF-8 text.
* Only JSON is supported as an output format.
* The verify_credentials endpoint now returns content in the response body. You
are free to ignore it and use the HTTP response codes, which are not changed.
New Features
------------
* The /friends/ API endpoint has been added.
* The search endpoints support more than 20 results, just use the page parameter
in the URL similar to anime and manga upcoming.
Bugfixes
--------
* For series with only one date, the Ruby API sets this as the end date. This
now sets it as the start date.
* For many errors and messages, the output wasn't well-formed. We attempt, to
the extent it is possible, to always return a well-formed message.
| Changes from the Ruby MAL API
=============================
+
+ *Note:* This document only applies to version 1.0 of the API interface. Later
+ versions, while similar in spirit, do not guarantee output compatibility.
The external API interface is based on and the output tries to match that
provided by the Unofficial MAL API coded by Chu Yeow. Because of differences
between implementations, some differences exist. They are listed below grouped
by the type of change.
- Please note that this only applies to the "API 1.0" interface. Future revisions,
- while based on the format of the Unofficial MAL API, do not offer output
- compatibility.
-
Changes
-------
- * The Ruby API encodes text with HTML entities. This code doesn't encode extended
? ^^^^ ^ ^^ ^^^^^^^^^^^
+ * The Ruby API encodes text with HTML entities. This API does not perform
? ^^^ + ^ + ^^ ^^
- characters and instead returns non-encoded UTF-8 text.
- * The API only outputs JSON. (XML output may be added at a later time.)
+ encoding and returns the code as UTF-8 text.
+ * Only JSON is supported as an output format.
* The verify_credentials endpoint now returns content in the response body. You
are free to ignore it and use the HTTP response codes, which are not changed.
New Features
------------
* The /friends/ API endpoint has been added.
* The search endpoints support more than 20 results, just use the page parameter
- in the URL like in anime and manga upcoming.
? ^^^ ^^
+ in the URL similar to anime and manga upcoming.
? ++++ ^^ ^^
Bugfixes
--------
- * For series with one date, the Ruby API sets this as the end date. This now
? ----
+ * For series with only one date, the Ruby API sets this as the end date. This
? +++++
- sets it as the start date.
+ now sets it as the start date.
? ++++
* For many errors and messages, the output wasn't well-formed. We attempt, to
the extent it is possible, to always return a well-formed message. | 19 | 0.59375 | 9 | 10 |
fde0f4623511455826005e1dda31649c3d41488a | matrix.go | matrix.go | /*
Package "matrix" provides types which represent matrix, scalar and iterator of elements.
*/
package matrix
import (
"github.com/mitsuse/matrix-go/internal/types"
)
type Matrix interface {
types.Matrix
}
type Cursor interface {
types.Cursor
}
| /*
Package "matrix" is an experimental library for matrix manipulation implemented in Golang.
*/
package matrix
import (
"github.com/mitsuse/matrix-go/internal/types"
)
type Matrix interface {
types.Matrix
}
type Cursor interface {
types.Cursor
}
| Replace the short description with a sentence to describe the whole of "matirx-go" library. | Replace the short description with a sentence to describe the whole of "matirx-go" library.
| Go | mit | mitsuse/matrix-go,jmptrader/matrix-go | go | ## Code Before:
/*
Package "matrix" provides types which represent matrix, scalar and iterator of elements.
*/
package matrix
import (
"github.com/mitsuse/matrix-go/internal/types"
)
type Matrix interface {
types.Matrix
}
type Cursor interface {
types.Cursor
}
## Instruction:
Replace the short description with a sentence to describe the whole of "matrix-go" library.
## Code After:
/*
Package "matrix" is an experimental library for matrix manipulation implemented in Golang.
*/
package matrix
import (
"github.com/mitsuse/matrix-go/internal/types"
)
type Matrix interface {
types.Matrix
}
type Cursor interface {
types.Cursor
}
| /*
- Package "matrix" provides types which represent matrix, scalar and iterator of elements.
+ Package "matrix" is an experimental library for matrix manipulation implemented in Golang.
*/
package matrix
import (
"github.com/mitsuse/matrix-go/internal/types"
)
type Matrix interface {
types.Matrix
}
type Cursor interface {
types.Cursor
} | 2 | 0.125 | 1 | 1 |
613a287714e1bb5c0dead2b3267253a0f7fc68fa | app/assets/javascripts/widgets/frame.js | app/assets/javascripts/widgets/frame.js | (function () {
if (!window.postMessage || window.self === window.top) {
return;
}
function reportWidgetSize() {
var D, height, sizeMessage;
D = document;
height = Math.max(
Math.max(D.body.scrollHeight, D.documentElement.scrollHeight),
Math.max(D.body.offsetHeight, D.documentElement.offsetHeight),
Math.max(D.body.clientHeight, D.documentElement.clientHeight)
);
sizeMessage = 'hdo-widget-size:' + height + 'px';
// no sensitive data is being sent here, so * is fine.
window.top.postMessage(sizeMessage, '*');
}
if (window.addEventListener) { // W3C DOM
window.addEventListener('DOMContentLoaded', reportWidgetSize, false);
} else if (window.attachEvent) { // IE DOM
window.attachEvent("onreadystatechange", reportWidgetSize);
}
}());
| (function () {
if (!window.postMessage || window.self === window.top) {
return;
}
function reportWidgetSize() {
var D, height, sizeMessage;
D = document;
height = Math.max(
Math.max(D.body.scrollHeight, D.documentElement.scrollHeight),
Math.max(D.body.offsetHeight, D.documentElement.offsetHeight),
Math.max(D.body.clientHeight, D.documentElement.clientHeight)
);
sizeMessage = 'hdo-widget-size:' + height + 'px';
// no sensitive data is being sent here, so * is fine.
window.top.postMessage(sizeMessage, '*');
}
if (window.addEventListener) { // W3C DOM
window.addEventListener('load', reportWidgetSize, false);
} else if (window.attachEvent) { // IE DOM
window.attachEvent("onload", reportWidgetSize);
}
}());
| Use 'load' over 'DOMContentLoaded' to wait for stylesheets. | Widgets: Use 'load' over 'DOMContentLoaded' to wait for stylesheets.
| JavaScript | bsd-3-clause | holderdeord/hdo-site,holderdeord/hdo-site,holderdeord/hdo-site,holderdeord/hdo-site | javascript | ## Code Before:
(function () {
if (!window.postMessage || window.self === window.top) {
return;
}
function reportWidgetSize() {
var D, height, sizeMessage;
D = document;
height = Math.max(
Math.max(D.body.scrollHeight, D.documentElement.scrollHeight),
Math.max(D.body.offsetHeight, D.documentElement.offsetHeight),
Math.max(D.body.clientHeight, D.documentElement.clientHeight)
);
sizeMessage = 'hdo-widget-size:' + height + 'px';
// no sensitive data is being sent here, so * is fine.
window.top.postMessage(sizeMessage, '*');
}
if (window.addEventListener) { // W3C DOM
window.addEventListener('DOMContentLoaded', reportWidgetSize, false);
} else if (window.attachEvent) { // IE DOM
window.attachEvent("onreadystatechange", reportWidgetSize);
}
}());
## Instruction:
Widgets: Use 'load' over 'DOMContentLoaded' to wait for stylesheets.
## Code After:
(function () {
if (!window.postMessage || window.self === window.top) {
return;
}
function reportWidgetSize() {
var D, height, sizeMessage;
D = document;
height = Math.max(
Math.max(D.body.scrollHeight, D.documentElement.scrollHeight),
Math.max(D.body.offsetHeight, D.documentElement.offsetHeight),
Math.max(D.body.clientHeight, D.documentElement.clientHeight)
);
sizeMessage = 'hdo-widget-size:' + height + 'px';
// no sensitive data is being sent here, so * is fine.
window.top.postMessage(sizeMessage, '*');
}
if (window.addEventListener) { // W3C DOM
window.addEventListener('load', reportWidgetSize, false);
} else if (window.attachEvent) { // IE DOM
window.attachEvent("onload", reportWidgetSize);
}
}());
| (function () {
if (!window.postMessage || window.self === window.top) {
return;
}
function reportWidgetSize() {
var D, height, sizeMessage;
D = document;
height = Math.max(
Math.max(D.body.scrollHeight, D.documentElement.scrollHeight),
Math.max(D.body.offsetHeight, D.documentElement.offsetHeight),
Math.max(D.body.clientHeight, D.documentElement.clientHeight)
);
sizeMessage = 'hdo-widget-size:' + height + 'px';
// no sensitive data is being sent here, so * is fine.
window.top.postMessage(sizeMessage, '*');
}
if (window.addEventListener) { // W3C DOM
- window.addEventListener('DOMContentLoaded', reportWidgetSize, false);
? ^^^^^^^^^^^ --
+ window.addEventListener('load', reportWidgetSize, false);
? ^
} else if (window.attachEvent) { // IE DOM
- window.attachEvent("onreadystatechange", reportWidgetSize);
? ^^ ------------
+ window.attachEvent("onload", reportWidgetSize);
? ^^
}
}()); | 4 | 0.153846 | 2 | 2 |
9ea7e49e11c3e05b86b9eeaffd416285c9a2551a | pushhub/models.py | pushhub/models.py | from persistent.mapping import PersistentMapping
class Root(PersistentMapping):
__parent__ = __name__ = None
def appmaker(zodb_root):
if not 'app_root' in zodb_root:
app_root = Root()
zodb_root['app_root'] = app_root
import transaction
transaction.commit()
return zodb_root['app_root']
| from persistent.mapping import PersistentMapping
from .subsciber import Subscribers
from .topic import Topics
class Root(PersistentMapping):
__parent__ = __name__ = None
def appmaker(zodb_root):
if not 'app_root' in zodb_root:
app_root = Root()
zodb_root['app_root'] = app_root
import transaction
transaction.commit()
subscribers = Subscribers()
app_root['subscribers'] = subscribers
subscribers.__name__ = 'subscribers'
subscribers.__parent__ = app_root
transaction.commit()
topics = Topics()
app_root['topics'] = topics
topics.__name__ = 'topics'
topics.__parent__ = app_root
transaction.commit()
return zodb_root['app_root']
| Add folder set up to the ZODB on app creation. | Add folder set up to the ZODB on app creation.
| Python | bsd-3-clause | ucla/PushHubCore | python | ## Code Before:
from persistent.mapping import PersistentMapping
class Root(PersistentMapping):
__parent__ = __name__ = None
def appmaker(zodb_root):
if not 'app_root' in zodb_root:
app_root = Root()
zodb_root['app_root'] = app_root
import transaction
transaction.commit()
return zodb_root['app_root']
## Instruction:
Add folder setup to the ZODB on app creation.
## Code After:
from persistent.mapping import PersistentMapping
from .subsciber import Subscribers
from .topic import Topics
class Root(PersistentMapping):
__parent__ = __name__ = None
def appmaker(zodb_root):
if not 'app_root' in zodb_root:
app_root = Root()
zodb_root['app_root'] = app_root
import transaction
transaction.commit()
subscribers = Subscribers()
app_root['subscribers'] = subscribers
subscribers.__name__ = 'subscribers'
subscribers.__parent__ = app_root
transaction.commit()
topics = Topics()
app_root['topics'] = topics
topics.__name__ = 'topics'
topics.__parent__ = app_root
transaction.commit()
return zodb_root['app_root']
| from persistent.mapping import PersistentMapping
+
+ from .subsciber import Subscribers
+ from .topic import Topics
class Root(PersistentMapping):
__parent__ = __name__ = None
def appmaker(zodb_root):
if not 'app_root' in zodb_root:
app_root = Root()
zodb_root['app_root'] = app_root
import transaction
transaction.commit()
+
+ subscribers = Subscribers()
+ app_root['subscribers'] = subscribers
+ subscribers.__name__ = 'subscribers'
+ subscribers.__parent__ = app_root
+ transaction.commit()
+
+ topics = Topics()
+ app_root['topics'] = topics
+ topics.__name__ = 'topics'
+ topics.__parent__ = app_root
+ transaction.commit()
+
return zodb_root['app_root'] | 16 | 1.142857 | 16 | 0 |
f94a02817284b32f342ac27de1c5b22dff96b246 | recipes/linux/linux-powerpc-fsl_git.bb | recipes/linux/linux-powerpc-fsl_git.bb |
require linux.inc
FILESPATHPKG =. "linux-powerpc-fsl-git/${MACHINE}:"
SRCREV = "1406de8e11eb043681297adf86d6892ff8efc27a"
PV = "2.6.30"
PR = "r4"
SRC_URI = "git://git.kernel.org/pub/scm/linux/kernel/git/galak/powerpc.git;protocol=git \
file://defconfig"
SRC_URI_append_mpc8315e-rdb = " file://mpc8315erdb-add-msi-to-dts.patch;patch=1"
COMPATIBLE_MACHINE = "mpc8315e-rdb"
S = "${WORKDIR}/git"
|
require linux.inc
FILESPATHPKG =. "linux-powerpc-fsl-git/${MACHINE}:"
SRCREV = "1406de8e11eb043681297adf86d6892ff8efc27a"
PV = "2.6.30"
SRCREV_calamari = "7c0a57d5c47bcfc492b3139e77400f888a935c44"
PR = "r4"
SRC_URI = "git://git.kernel.org/pub/scm/linux/kernel/git/galak/powerpc.git;protocol=git \
file://defconfig"
SRC_URI_append_mpc8315e-rdb = " file://mpc8315erdb-add-msi-to-dts.patch;patch=1"
COMPATIBLE_MACHINE = "(mpc8315e-rdb|calamari)"
S = "${WORKDIR}/git"
| Update GIT revision for Freescale Calamari board kernel. | linux-powerpc-fsl: Update GIT revision for Freescale Calamari board kernel.
Signed-off-by: Leon Woestenberg <be92910ae7bb6f58bb9b7c6f845bc30640119592@sidebranch.com>
| BitBake | mit | scottellis/overo-oe,JamesAng/oe,thebohemian/openembedded,hulifox008/openembedded,sutajiokousagi/openembedded,JamesAng/oe,nzjrs/overo-openembedded,openembedded/openembedded,John-NY/overo-oe,Martix/Eonos,rascalmicro/openembedded-rascal,openembedded/openembedded,nzjrs/overo-openembedded,openembedded/openembedded,SIFTeam/openembedded,sutajiokousagi/openembedded,SIFTeam/openembedded,sutajiokousagi/openembedded,Martix/Eonos,anguslees/openembedded-android,BlackPole/bp-openembedded,sentient-energy/emsw-oe-mirror,bticino/openembedded,JamesAng/goe,libo/openembedded,John-NY/overo-oe,yyli/overo-oe,scottellis/overo-oe,giobauermeister/openembedded,mrchapp/arago-oe-dev,scottellis/overo-oe,JamesAng/oe,sampov2/audio-openembedded,philb/pbcl-oe-2010,JamesAng/oe,rascalmicro/openembedded-rascal,sampov2/audio-openembedded,libo/openembedded,scottellis/overo-oe,dellysunnymtech/sakoman-oe,dave-billin/overo-ui-moos-auv,xifengchuo/openembedded,mrchapp/arago-oe-dev,sentient-energy/emsw-oe-mirror,openembedded/openembedded,giobauermeister/openembedded,mrchapp/arago-oe-dev,nzjrs/overo-openembedded,giobauermeister/openembedded,libo/openembedded,hulifox008/openembedded,trini/openembedded,bticino/openembedded,crystalfontz/openembedded,BlackPole/bp-openembedded,JamesAng/goe,nzjrs/overo-openembedded,BlackPole/bp-openembedded,buglabs/oe-buglabs,yyli/overo-oe,sentient-energy/emsw-oe-mirror,xifengchuo/openembedded,trini/openembedded,JamesAng/goe,sentient-energy/emsw-oe-mirror,hulifox008/openembedded,libo/openembedded,dellysunnymtech/sakoman-oe,yyli/overo-oe,mrchapp/arago-oe-dev,JamesAng/goe,giobauermeister/openembedded,sentient-energy/emsw-oe-mirror,anguslees/openembedded-android,dellysunnymtech/sakoman-oe,thebohemian/openembedded,JamesAng/goe,SIFTeam/openembedded,hulifox008/openembedded,openpli-arm/openembedded,sledz/oe,JamesAng/oe,John-NY/overo-oe,xifengchuo/openembedded,trini/openembedded,nx111/openembeded_openpli2.1_nx111,trini/openembedded,dave-billin/overo-ui-moos-auv,buglabs/oe-b
uglabs,crystalfontz/openembedded,John-NY/overo-oe,sutajiokousagi/openembedded,openpli-arm/openembedded,xifengchuo/openembedded,sampov2/audio-openembedded,dellysunnymtech/sakoman-oe,nzjrs/overo-openembedded,crystalfontz/openembedded,Martix/Eonos,nx111/openembeded_openpli2.1_nx111,SIFTeam/openembedded,sampov2/audio-openembedded,sledz/oe,buglabs/oe-buglabs,yyli/overo-oe,dave-billin/overo-ui-moos-auv,dellysunnymtech/sakoman-oe,trini/openembedded,BlackPole/bp-openembedded,nzjrs/overo-openembedded,libo/openembedded,Martix/Eonos,openembedded/openembedded,dellysunnymtech/sakoman-oe,philb/pbcl-oe-2010,buglabs/oe-buglabs,dave-billin/overo-ui-moos-auv,anguslees/openembedded-android,sledz/oe,sledz/oe,thebohemian/openembedded,hulifox008/openembedded,scottellis/overo-oe,trini/openembedded,nzjrs/overo-openembedded,yyli/overo-oe,giobauermeister/openembedded,scottellis/overo-oe,philb/pbcl-oe-2010,anguslees/openembedded-android,philb/pbcl-oe-2010,hulifox008/openembedded,SIFTeam/openembedded,bticino/openembedded,yyli/overo-oe,dellysunnymtech/sakoman-oe,thebohemian/openembedded,openpli-arm/openembedded,sampov2/audio-openembedded,crystalfontz/openembedded,crystalfontz/openembedded,yyli/overo-oe,anguslees/openembedded-android,giobauermeister/openembedded,openpli-arm/openembedded,bticino/openembedded,scottellis/overo-oe,rascalmicro/openembedded-rascal,mrchapp/arago-oe-dev,openpli-arm/openembedded,sledz/oe,sledz/oe,nx111/openembeded_openpli2.1_nx111,xifengchuo/openembedded,thebohemian/openembedded,bticino/openembedded,openembedded/openembedded,sentient-energy/emsw-oe-mirror,dave-billin/overo-ui-moos-auv,trini/openembedded,libo/openembedded,philb/pbcl-oe-2010,anguslees/openembedded-android,sledz/oe,sutajiokousagi/openembedded,JamesAng/oe,dave-billin/overo-ui-moos-auv,nx111/openembeded_openpli2.1_nx111,anguslees/openembedded-android,sentient-energy/emsw-oe-mirror,John-NY/overo-oe,openpli-arm/openembedded,yyli/overo-oe,thebohemian/openembedded,dellysunnymtech/sakoman-oe,Martix/Eonos,mrchapp/a
rago-oe-dev,openembedded/openembedded,nx111/openembeded_openpli2.1_nx111,openembedded/openembedded,buglabs/oe-buglabs,nx111/openembeded_openpli2.1_nx111,xifengchuo/openembedded,openembedded/openembedded,sutajiokousagi/openembedded,JamesAng/oe,openpli-arm/openembedded,buglabs/oe-buglabs,buglabs/oe-buglabs,buglabs/oe-buglabs,bticino/openembedded,philb/pbcl-oe-2010,dellysunnymtech/sakoman-oe,BlackPole/bp-openembedded,dave-billin/overo-ui-moos-auv,giobauermeister/openembedded,sutajiokousagi/openembedded,rascalmicro/openembedded-rascal,sampov2/audio-openembedded,rascalmicro/openembedded-rascal,John-NY/overo-oe,nx111/openembeded_openpli2.1_nx111,crystalfontz/openembedded,BlackPole/bp-openembedded,John-NY/overo-oe,mrchapp/arago-oe-dev,rascalmicro/openembedded-rascal,rascalmicro/openembedded-rascal,thebohemian/openembedded,JamesAng/goe,libo/openembedded,SIFTeam/openembedded,bticino/openembedded,Martix/Eonos,rascalmicro/openembedded-rascal,Martix/Eonos,philb/pbcl-oe-2010,openembedded/openembedded,openembedded/openembedded,BlackPole/bp-openembedded,hulifox008/openembedded,crystalfontz/openembedded,JamesAng/goe,giobauermeister/openembedded,xifengchuo/openembedded,sampov2/audio-openembedded,giobauermeister/openembedded,SIFTeam/openembedded,xifengchuo/openembedded,xifengchuo/openembedded,nx111/openembeded_openpli2.1_nx111 | bitbake | ## Code Before:
require linux.inc
FILESPATHPKG =. "linux-powerpc-fsl-git/${MACHINE}:"
SRCREV = "1406de8e11eb043681297adf86d6892ff8efc27a"
PV = "2.6.30"
PR = "r4"
SRC_URI = "git://git.kernel.org/pub/scm/linux/kernel/git/galak/powerpc.git;protocol=git \
file://defconfig"
SRC_URI_append_mpc8315e-rdb = " file://mpc8315erdb-add-msi-to-dts.patch;patch=1"
COMPATIBLE_MACHINE = "mpc8315e-rdb"
S = "${WORKDIR}/git"
## Instruction:
linux-powerpc-fsl: Update GIT revision for Freescale Calamari board kernel.
Signed-off-by: Leon Woestenberg <be92910ae7bb6f58bb9b7c6f845bc30640119592@sidebranch.com>
## Code After:
require linux.inc
FILESPATHPKG =. "linux-powerpc-fsl-git/${MACHINE}:"
SRCREV = "1406de8e11eb043681297adf86d6892ff8efc27a"
PV = "2.6.30"
SRCREV_calamari = "7c0a57d5c47bcfc492b3139e77400f888a935c44"
PR = "r4"
SRC_URI = "git://git.kernel.org/pub/scm/linux/kernel/git/galak/powerpc.git;protocol=git \
file://defconfig"
SRC_URI_append_mpc8315e-rdb = " file://mpc8315erdb-add-msi-to-dts.patch;patch=1"
COMPATIBLE_MACHINE = "(mpc8315e-rdb|calamari)"
S = "${WORKDIR}/git"
|
require linux.inc
FILESPATHPKG =. "linux-powerpc-fsl-git/${MACHINE}:"
SRCREV = "1406de8e11eb043681297adf86d6892ff8efc27a"
+ PV = "2.6.30"
- PV = "2.6.30"
+ SRCREV_calamari = "7c0a57d5c47bcfc492b3139e77400f888a935c44"
+
PR = "r4"
SRC_URI = "git://git.kernel.org/pub/scm/linux/kernel/git/galak/powerpc.git;protocol=git \
file://defconfig"
SRC_URI_append_mpc8315e-rdb = " file://mpc8315erdb-add-msi-to-dts.patch;patch=1"
- COMPATIBLE_MACHINE = "mpc8315e-rdb"
+ COMPATIBLE_MACHINE = "(mpc8315e-rdb|calamari)"
? + ++++++++++
S = "${WORKDIR}/git" | 6 | 0.333333 | 4 | 2 |
6c348722943298a011e84b354a6865e1d866777e | src/main/java/com/github/aureliano/achmed/os/pkg/DpkgPackageManager.java | src/main/java/com/github/aureliano/achmed/os/pkg/DpkgPackageManager.java | package com.github.aureliano.achmed.os.pkg;
import com.github.aureliano.achmed.command.CommandResponse;
import com.github.aureliano.achmed.resources.properties.PackageProperties;
public class DpkgPackageManager implements IPackageManager {
private PackageProperties properties;
public DpkgPackageManager() {
super();
}
public CommandResponse install() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public CommandResponse uninstall() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public String latest() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public void setPackageProperties(PackageProperties properties) {
this.properties = properties;
}
public PackageProperties getPackageProperties() {
return this.properties;
}
} | package com.github.aureliano.achmed.os.pkg;
import com.github.aureliano.achmed.command.CommandFacade;
import com.github.aureliano.achmed.command.CommandResponse;
import com.github.aureliano.achmed.exception.PackageResourceException;
import com.github.aureliano.achmed.helper.StringHelper;
import com.github.aureliano.achmed.resources.properties.PackageProperties;
public class DpkgPackageManager implements IPackageManager {
private static final String DPKG_QUERY = "/usr/bin/dpkg-query";
private static final String QUERY_FORMAT = "'${status} ${version}\\n'";
private PackageProperties properties;
public DpkgPackageManager() {
super();
}
public CommandResponse install() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public CommandResponse uninstall() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public String latest() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public boolean isInstalled() {
final String desiredStatus = "installed";
CommandResponse res = CommandFacade.executeCommand(
DPKG_QUERY, "-W", "-f", QUERY_FORMAT, this.properties.getName()
);
if (!res.isOK()) {
throw new PackageResourceException(res);
}
String[] match = StringHelper.match("^(\\S+)\\s+(\\S+)\\s+(\\S+)\\s+(\\S*)$", res.getOutput());
if (match == null) {
throw new PackageResourceException("Failed to match dpkg-query " + res.getCommand());
}
return desiredStatus.equalsIgnoreCase(match[3]);
}
public void setPackageProperties(PackageProperties properties) {
this.properties = properties;
}
public PackageProperties getPackageProperties() {
return this.properties;
}
} | Implement method to check whether package is installed. | Implement method to check whether package is installed.
| Java | mit | aureliano/achmed | java | ## Code Before:
package com.github.aureliano.achmed.os.pkg;
import com.github.aureliano.achmed.command.CommandResponse;
import com.github.aureliano.achmed.resources.properties.PackageProperties;
public class DpkgPackageManager implements IPackageManager {
private PackageProperties properties;
public DpkgPackageManager() {
super();
}
public CommandResponse install() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public CommandResponse uninstall() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public String latest() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public void setPackageProperties(PackageProperties properties) {
this.properties = properties;
}
public PackageProperties getPackageProperties() {
return this.properties;
}
}
## Instruction:
Implement method to check whether package is installed.
## Code After:
package com.github.aureliano.achmed.os.pkg;
import com.github.aureliano.achmed.command.CommandFacade;
import com.github.aureliano.achmed.command.CommandResponse;
import com.github.aureliano.achmed.exception.PackageResourceException;
import com.github.aureliano.achmed.helper.StringHelper;
import com.github.aureliano.achmed.resources.properties.PackageProperties;
public class DpkgPackageManager implements IPackageManager {
private static final String DPKG_QUERY = "/usr/bin/dpkg-query";
private static final String QUERY_FORMAT = "'${status} ${version}\\n'";
private PackageProperties properties;
public DpkgPackageManager() {
super();
}
public CommandResponse install() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public CommandResponse uninstall() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public String latest() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public boolean isInstalled() {
final String desiredStatus = "installed";
CommandResponse res = CommandFacade.executeCommand(
DPKG_QUERY, "-W", "-f", QUERY_FORMAT, this.properties.getName()
);
if (!res.isOK()) {
throw new PackageResourceException(res);
}
String[] match = StringHelper.match("^(\\S+)\\s+(\\S+)\\s+(\\S+)\\s+(\\S*)$", res.getOutput());
if (match == null) {
throw new PackageResourceException("Failed to match dpkg-query " + res.getCommand());
}
return desiredStatus.equalsIgnoreCase(match[3]);
}
public void setPackageProperties(PackageProperties properties) {
this.properties = properties;
}
public PackageProperties getPackageProperties() {
return this.properties;
}
} | package com.github.aureliano.achmed.os.pkg;
+ import com.github.aureliano.achmed.command.CommandFacade;
import com.github.aureliano.achmed.command.CommandResponse;
+ import com.github.aureliano.achmed.exception.PackageResourceException;
+ import com.github.aureliano.achmed.helper.StringHelper;
import com.github.aureliano.achmed.resources.properties.PackageProperties;
public class DpkgPackageManager implements IPackageManager {
+
+ private static final String DPKG_QUERY = "/usr/bin/dpkg-query";
+ private static final String QUERY_FORMAT = "'${status} ${version}\\n'";
private PackageProperties properties;
public DpkgPackageManager() {
super();
}
public CommandResponse install() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public CommandResponse uninstall() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
public String latest() {
throw new UnsupportedOperationException("Method not implemented yet.");
}
+
+ public boolean isInstalled() {
+ final String desiredStatus = "installed";
+ CommandResponse res = CommandFacade.executeCommand(
+ DPKG_QUERY, "-W", "-f", QUERY_FORMAT, this.properties.getName()
+ );
+
+ if (!res.isOK()) {
+ throw new PackageResourceException(res);
+ }
+
+ String[] match = StringHelper.match("^(\\S+)\\s+(\\S+)\\s+(\\S+)\\s+(\\S*)$", res.getOutput());
+ if (match == null) {
+ throw new PackageResourceException("Failed to match dpkg-query " + res.getCommand());
+ }
+
+ return desiredStatus.equalsIgnoreCase(match[3]);
+ }
public void setPackageProperties(PackageProperties properties) {
this.properties = properties;
}
public PackageProperties getPackageProperties() {
return this.properties;
}
} | 24 | 0.727273 | 24 | 0 |
8b51c9904fd09354ff5385fc1740d9270da8287c | should-I-boot-this.py | should-I-boot-this.py |
import os
import sys
import configparser
"""
To test the script, just export those variables and play with their values
export LAB=lab-free-electrons
export TREE=mainline
"""
config = configparser.ConfigParser()
config.read('labs.ini')
# Check if we need to stop here
if os.environ['TREE'] in config[os.environ['LAB']]['tree_blacklist'].split():
print("Tree '%s' is blacklisted for lab '%s'" % (os.environ['TREE'], os.environ['LAB']))
sys.exit(1)
print("Booting tree '%s' is allowed for lab '%s'" % (os.environ['TREE'], os.environ['LAB']))
sys.exit(0)
|
import os
import sys
import configparser
"""
To test the script, just export those variables and play with their values
export LAB=lab-free-electrons
export TREE=mainline
"""
config = configparser.ConfigParser()
config.read('labs.ini')
# Is the lab existing?
if os.environ['LAB'] not in config.sections():
print("Unknown lab (%s). Allowing boot of %s." % (os.environ['LAB'], os.environ['TREE']))
sys.exit(0)
# Is the tree blacklisted for this lab?
if os.environ['TREE'] in config[os.environ['LAB']]['tree_blacklist'].split():
print("Tree '%s' is blacklisted for lab '%s'" % (os.environ['TREE'], os.environ['LAB']))
sys.exit(1)
print("Booting tree '%s' is allowed for lab '%s'" % (os.environ['TREE'], os.environ['LAB']))
sys.exit(0)
| Allow boots for unknown labs | jenkins: Allow boots for unknown labs
Signed-off-by: Florent Jacquet <692930aa2e4df70616939784b5b6c25eb1f2335c@free-electrons.com>
| Python | lgpl-2.1 | kernelci/lava-ci-staging,kernelci/lava-ci-staging,kernelci/lava-ci-staging | python | ## Code Before:
import os
import sys
import configparser
"""
To test the script, just export those variables and play with their values
export LAB=lab-free-electrons
export TREE=mainline
"""
config = configparser.ConfigParser()
config.read('labs.ini')
# Check if we need to stop here
if os.environ['TREE'] in config[os.environ['LAB']]['tree_blacklist'].split():
print("Tree '%s' is blacklisted for lab '%s'" % (os.environ['TREE'], os.environ['LAB']))
sys.exit(1)
print("Booting tree '%s' is allowed for lab '%s'" % (os.environ['TREE'], os.environ['LAB']))
sys.exit(0)
## Instruction:
jenkins: Allow boots for unknown labs
Signed-off-by: Florent Jacquet <692930aa2e4df70616939784b5b6c25eb1f2335c@free-electrons.com>
## Code After:
import os
import sys
import configparser
"""
To test the script, just export those variables and play with their values
export LAB=lab-free-electrons
export TREE=mainline
"""
config = configparser.ConfigParser()
config.read('labs.ini')
# Is the lab existing?
if os.environ['LAB'] not in config.sections():
print("Unknown lab (%s). Allowing boot of %s." % (os.environ['LAB'], os.environ['TREE']))
sys.exit(0)
# Is the tree blacklisted for this lab?
if os.environ['TREE'] in config[os.environ['LAB']]['tree_blacklist'].split():
print("Tree '%s' is blacklisted for lab '%s'" % (os.environ['TREE'], os.environ['LAB']))
sys.exit(1)
print("Booting tree '%s' is allowed for lab '%s'" % (os.environ['TREE'], os.environ['LAB']))
sys.exit(0)
|
import os
import sys
import configparser
"""
To test the script, just export those variables and play with their values
export LAB=lab-free-electrons
export TREE=mainline
"""
config = configparser.ConfigParser()
config.read('labs.ini')
- # Check if we need to stop here
+ # Is the lab existing?
+ if os.environ['LAB'] not in config.sections():
+ print("Unknown lab (%s). Allowing boot of %s." % (os.environ['LAB'], os.environ['TREE']))
+ sys.exit(0)
+
+ # Is the tree blacklisted for this lab?
if os.environ['TREE'] in config[os.environ['LAB']]['tree_blacklist'].split():
print("Tree '%s' is blacklisted for lab '%s'" % (os.environ['TREE'], os.environ['LAB']))
sys.exit(1)
print("Booting tree '%s' is allowed for lab '%s'" % (os.environ['TREE'], os.environ['LAB']))
sys.exit(0)
| 7 | 0.304348 | 6 | 1 |
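The boot-gating logic in the `should-I-boot-this.py` record above is easy to unit-test once it is factored out of the environment-variable plumbing. The sketch below is illustrative only, not part of the original script; the `labs.ini` layout is inferred from the code:

```python
import configparser

def boot_allowed(lab, tree, config):
    # Unknown labs allow every tree; known labs deny any tree
    # listed in their space-separated tree_blacklist option.
    if lab not in config.sections():
        return True
    return tree not in config[lab].get("tree_blacklist", "").split()

config = configparser.ConfigParser()
config.read_string("[lab-free-electrons]\ntree_blacklist = next linux-next\n")

print(boot_allowed("lab-free-electrons", "mainline", config))  # True
print(boot_allowed("lab-free-electrons", "next", config))      # False
print(boot_allowed("some-new-lab", "next", config))            # True
```

Keeping the decision in a pure function like this makes both commit behaviors (blacklist check, unknown-lab fallthrough) checkable without exporting `LAB`/`TREE`.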
4f196b758e62cb6ebfdc2b5d009014b2825dd931 | scripts/roles/cikit-solr/defaults/main.yml | scripts/roles/cikit-solr/defaults/main.yml | ---
solr_port: 8389
solr_version: 5.5.3
solr_cores:
- default
- build
- demo
| ---
solr_port: 8389
solr_version: "{{ cikit_solr.versions | last }}"
solr_cores:
- default
- build
- demo
| Set default version to latest available | [CIKit][Solr] Set default version to latest available
| YAML | apache-2.0 | BR0kEN-/cikit,BR0kEN-/cikit,BR0kEN-/cikit,BR0kEN-/cikit,BR0kEN-/cikit | yaml | ## Code Before:
---
solr_port: 8389
solr_version: 5.5.3
solr_cores:
- default
- build
- demo
## Instruction:
[CIKit][Solr] Set default version to latest available
## Code After:
---
solr_port: 8389
solr_version: "{{ cikit_solr.versions | last }}"
solr_cores:
- default
- build
- demo
| ---
solr_port: 8389
- solr_version: 5.5.3
+ solr_version: "{{ cikit_solr.versions | last }}"
solr_cores:
- default
- build
- demo | 2 | 0.285714 | 1 | 1 |
8f436daeffd7a2e79e78c1920630fd8ffada57ff | update_repo.sh | update_repo.sh | last_commit_date=`git log -1 --format=%cd | head -c 10`
current_date=`date | head -c 10`
if [ "$last_commit_date" != "$current_date" ]; then
git config --global user.email "weecologydeploy@weecology.org"
git config --global user.name "Weecology Deploy Bot"
git checkout master
git add predictions/* docs/*
git commit -m "Update forecasts: Travis Build $TRAVIS_BUILD_NUMBER"
git remote add deploy https://${GITHUB_TOKEN}@github.com/weecology/portalPredictions.git > /dev/null 2>&1
git push --quiet deploy master > /dev/null 2>&1
fi
| last_commit_date=`git log --author='Weecology Deploy Bot' -1 --format=%cd | head -c 10`
current_date=`date | head -c 10`
if [ "$last_commit_date" != "$current_date" ]; then
git config --global user.email "weecologydeploy@weecology.org"
git config --global user.name "Weecology Deploy Bot"
git checkout master
git add predictions/* docs/*
git commit -m "Update forecasts: Travis Build $TRAVIS_BUILD_NUMBER"
git remote add deploy https://${GITHUB_TOKEN}@github.com/weecology/portalPredictions.git > /dev/null 2>&1
git push --quiet deploy master > /dev/null 2>&1
fi
| Fix last deploy date check | Fix last deploy date check
The check to prevent multiple updates of forecasts in a single day was
failing because on new changes the last commit is by definition on the
same day as the build. This checks for the last commit by the deploy bot
instead.
| Shell | mit | weecology/portalPredictions,gmyenni/portalPredictions,sdtaylor/portalPredictions,sdtaylor/portalPredictions,gmyenni/portalPredictions,gmyenni/portalPredictions,weecology/portalPredictions,weecology/portalPredictions,sdtaylor/portalPredictions | shell | ## Code Before:
last_commit_date=`git log -1 --format=%cd | head -c 10`
current_date=`date | head -c 10`
if [ "$last_commit_date" != "$current_date" ]; then
git config --global user.email "weecologydeploy@weecology.org"
git config --global user.name "Weecology Deploy Bot"
git checkout master
git add predictions/* docs/*
git commit -m "Update forecasts: Travis Build $TRAVIS_BUILD_NUMBER"
git remote add deploy https://${GITHUB_TOKEN}@github.com/weecology/portalPredictions.git > /dev/null 2>&1
git push --quiet deploy master > /dev/null 2>&1
fi
## Instruction:
Fix last deploy date check
The check to prevent multiple updates of forecasts in a single day was
failing because on new changes the last commit is by definition on the
same day as the build. This checks for the last commit by the deploy bot
instead.
## Code After:
last_commit_date=`git log --author='Weecology Deploy Bot' -1 --format=%cd | head -c 10`
current_date=`date | head -c 10`
if [ "$last_commit_date" != "$current_date" ]; then
git config --global user.email "weecologydeploy@weecology.org"
git config --global user.name "Weecology Deploy Bot"
git checkout master
git add predictions/* docs/*
git commit -m "Update forecasts: Travis Build $TRAVIS_BUILD_NUMBER"
git remote add deploy https://${GITHUB_TOKEN}@github.com/weecology/portalPredictions.git > /dev/null 2>&1
git push --quiet deploy master > /dev/null 2>&1
fi
| - last_commit_date=`git log -1 --format=%cd | head -c 10`
+ last_commit_date=`git log --author='Weecology Deploy Bot' -1 --format=%cd | head -c 10`
? ++++++++++++++++++++++++++++++++
current_date=`date | head -c 10`
if [ "$last_commit_date" != "$current_date" ]; then
git config --global user.email "weecologydeploy@weecology.org"
git config --global user.name "Weecology Deploy Bot"
git checkout master
git add predictions/* docs/*
git commit -m "Update forecasts: Travis Build $TRAVIS_BUILD_NUMBER"
git remote add deploy https://${GITHUB_TOKEN}@github.com/weecology/portalPredictions.git > /dev/null 2>&1
git push --quiet deploy master > /dev/null 2>&1
fi | 2 | 0.153846 | 1 | 1 |
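The fix in the record above works because `git log --author='Weecology Deploy Bot' -1 --format=%cd` isolates the bot's last commit before the first-10-characters date comparison. That comparison itself (`head -c 10` on two `date`-style strings) can be mirrored in isolation; this Python helper is only an illustration of the shell logic, not part of the repository:

```python
def same_day(last_commit_date, current_date):
    # Mirrors `head -c 10` on two date(1)-style strings,
    # e.g. "Mon Oct  7 12:00:00 UTC 2024" -> "Mon Oct  7".
    return last_commit_date[:10] == current_date[:10]

print(same_day("Mon Oct  7 12:00:00 UTC 2024", "Mon Oct  7 18:30:00 UTC 2024"))  # True
print(same_day("Mon Oct  7 12:00:00 UTC 2024", "Tue Oct  8 09:00:00 UTC 2024"))  # False
```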
b8228584b99c3212b1f4d959f2cc2a6dc9c43774 | src/android/CommunicationHelperPlugin.java | src/android/CommunicationHelperPlugin.java | package edu.berkeley.eecs.emission.cordova.comm;
import org.apache.cordova.*;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
import android.content.Context;
import edu.berkeley.eecs.emission.cordova.comm.CommunicationHelper;
import edu.berkeley.eecs.emission.cordova.connectionsettings.ConnectionSettings;
public class CommunicationHelperPlugin extends CordovaPlugin {
@Override
public boolean execute(String action, JSONArray data, CallbackContext callbackContext) throws JSONException {
if (action.equals("pushGetJSON")) {
try {
Context ctxt = cordova.getActivity();
String relativeURL = data.getString(0);
JSONObject filledMessage = data.getJSONObject(1);
String commuteTrackerHost = ConnectionSettings.getConnectURL(ctxt);
String fullURL = commuteTrackerHost + relativeURL;
String resultString = CommunicationHelper.pushGetJSON(ctxt, fullURL, filledMessage);
callbackContext.success(new JSONObject(resultString));
} catch (Exception e) {
callbackContext.error("While pushing/getting from server "+e.getMessage());
}
return true;
} else {
return false;
}
}
}
| package edu.berkeley.eecs.emission.cordova.comm;
import org.apache.cordova.*;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
import android.content.Context;
import edu.berkeley.eecs.emission.cordova.comm.CommunicationHelper;
import edu.berkeley.eecs.emission.cordova.connectionsettings.ConnectionSettings;
public class CommunicationHelperPlugin extends CordovaPlugin {
@Override
public boolean execute(String action, JSONArray data, final CallbackContext callbackContext) throws JSONException {
if (action.equals("pushGetJSON")) {
try {
final Context ctxt = cordova.getActivity();
String relativeURL = data.getString(0);
final JSONObject filledMessage = data.getJSONObject(1);
String commuteTrackerHost = ConnectionSettings.getConnectURL(ctxt);
final String fullURL = commuteTrackerHost + relativeURL;
cordova.getThreadPool().execute(new Runnable() {
public void run() {
try {
String resultString = CommunicationHelper.pushGetJSON(ctxt, fullURL, filledMessage);
callbackContext.success(new JSONObject(resultString));
} catch (Exception e) {
callbackContext.error("While pushing/getting from server "+e.getMessage());
}
}
});
} catch (Exception e) {
callbackContext.error("While pushing/getting from server "+e.getMessage());
}
return true;
} else {
return false;
}
}
}
| Move the call to the server into a separate thread | Move the call to the server into a separate thread
So that it does not block the webcore thread
| Java | bsd-3-clause | e-mission/cordova-server-communication,e-mission/cordova-server-communication | java | ## Code Before:
package edu.berkeley.eecs.emission.cordova.comm;
import org.apache.cordova.*;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
import android.content.Context;
import edu.berkeley.eecs.emission.cordova.comm.CommunicationHelper;
import edu.berkeley.eecs.emission.cordova.connectionsettings.ConnectionSettings;
public class CommunicationHelperPlugin extends CordovaPlugin {
@Override
public boolean execute(String action, JSONArray data, CallbackContext callbackContext) throws JSONException {
if (action.equals("pushGetJSON")) {
try {
Context ctxt = cordova.getActivity();
String relativeURL = data.getString(0);
JSONObject filledMessage = data.getJSONObject(1);
String commuteTrackerHost = ConnectionSettings.getConnectURL(ctxt);
String fullURL = commuteTrackerHost + relativeURL;
String resultString = CommunicationHelper.pushGetJSON(ctxt, fullURL, filledMessage);
callbackContext.success(new JSONObject(resultString));
} catch (Exception e) {
callbackContext.error("While pushing/getting from server "+e.getMessage());
}
return true;
} else {
return false;
}
}
}
## Instruction:
Move the call to the server into a separate thread
So that it does not block the webcore thread
## Code After:
package edu.berkeley.eecs.emission.cordova.comm;
import org.apache.cordova.*;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
import android.content.Context;
import edu.berkeley.eecs.emission.cordova.comm.CommunicationHelper;
import edu.berkeley.eecs.emission.cordova.connectionsettings.ConnectionSettings;
public class CommunicationHelperPlugin extends CordovaPlugin {
@Override
public boolean execute(String action, JSONArray data, final CallbackContext callbackContext) throws JSONException {
if (action.equals("pushGetJSON")) {
try {
final Context ctxt = cordova.getActivity();
String relativeURL = data.getString(0);
final JSONObject filledMessage = data.getJSONObject(1);
String commuteTrackerHost = ConnectionSettings.getConnectURL(ctxt);
final String fullURL = commuteTrackerHost + relativeURL;
cordova.getThreadPool().execute(new Runnable() {
public void run() {
try {
String resultString = CommunicationHelper.pushGetJSON(ctxt, fullURL, filledMessage);
callbackContext.success(new JSONObject(resultString));
} catch (Exception e) {
callbackContext.error("While pushing/getting from server "+e.getMessage());
}
}
});
} catch (Exception e) {
callbackContext.error("While pushing/getting from server "+e.getMessage());
}
return true;
} else {
return false;
}
}
}
| package edu.berkeley.eecs.emission.cordova.comm;
import org.apache.cordova.*;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
import android.content.Context;
import edu.berkeley.eecs.emission.cordova.comm.CommunicationHelper;
import edu.berkeley.eecs.emission.cordova.connectionsettings.ConnectionSettings;
public class CommunicationHelperPlugin extends CordovaPlugin {
@Override
- public boolean execute(String action, JSONArray data, CallbackContext callbackContext) throws JSONException {
+ public boolean execute(String action, JSONArray data, final CallbackContext callbackContext) throws JSONException {
? ++++++
if (action.equals("pushGetJSON")) {
try {
- Context ctxt = cordova.getActivity();
+ final Context ctxt = cordova.getActivity();
? ++++++
String relativeURL = data.getString(0);
- JSONObject filledMessage = data.getJSONObject(1);
+ final JSONObject filledMessage = data.getJSONObject(1);
? ++++++
String commuteTrackerHost = ConnectionSettings.getConnectURL(ctxt);
- String fullURL = commuteTrackerHost + relativeURL;
+ final String fullURL = commuteTrackerHost + relativeURL;
? ++++++
+ cordova.getThreadPool().execute(new Runnable() {
+ public void run() {
+ try {
String resultString = CommunicationHelper.pushGetJSON(ctxt, fullURL, filledMessage);
callbackContext.success(new JSONObject(resultString));
+ } catch (Exception e) {
+ callbackContext.error("While pushing/getting from server "+e.getMessage());
+ }
+ }
+ });
} catch (Exception e) {
callbackContext.error("While pushing/getting from server "+e.getMessage());
}
return true;
} else {
return false;
}
}
}
| 16 | 0.457143 | 12 | 4 |
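The pattern in the record above (hand the blocking network call to a worker pool and resolve the success/error callback from the worker) is not Cordova-specific. A language-neutral sketch of the same shape in Python, with all names purely illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def push_get_json(url, payload):
    # Stand-in for the blocking HTTP round-trip.
    return {"url": url, "echo": payload}

def execute_async(url, payload, on_success, on_error, pool):
    # Submit the blocking work to the pool so the caller's
    # (UI/event) thread returns immediately, as in the plugin.
    def task():
        try:
            on_success(push_get_json(url, payload))
        except Exception as exc:
            on_error(str(exc))
    return pool.submit(task)

results = []
with ThreadPoolExecutor(max_workers=1) as pool:
    fut = execute_async("https://example.org/api", {"k": 1},
                        results.append, results.append, pool)
    fut.result()  # wait only for the demo; real callers would not block

print(results)  # [{'url': 'https://example.org/api', 'echo': {'k': 1}}]
```

Note the try/except lives inside the worker task, exactly as the commit moves the `callbackContext.error(...)` handling into the `Runnable`.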
179ab5ad2af50e2f21015a920b05e6842f3d64ee | tests/ZendTest/Form/TestAsset/Model.php | tests/ZendTest/Form/TestAsset/Model.php | <?php
/**
* Zend Framework (http://framework.zend.com/)
*
* @link http://github.com/zendframework/zf2 for the canonical source repository
* @copyright Copyright (c) 2005-2013 Zend Technologies USA Inc. (http://www.zend.com)
* @license http://framework.zend.com/license/new-bsd New BSD License
* @package Zend_Form
*/
namespace ZendTest\Form\TestAsset;
use DomainException;
use Zend\Stdlib\ArraySerializableInterface;
/**
* @category Zend
* @package Zend_Form
* @subpackage UnitTest
*/
class Model implements ArraySerializableInterface
{
protected $foo;
protected $bar;
protected $foobar;
public function __set($name, $value)
{
throw new DomainException('Overloading to set values is not allowed');
}
public function __get($name)
{
if (property_exists($this, $name)) {
return $this->$name;
}
throw new DomainException('Unknown attribute');
}
public function exchangeArray(array $array)
{
foreach ($array as $key => $value) {
if (!property_exists($this, $key)) {
continue;
}
$this->$key = $value;
}
}
public function getArrayCopy()
{
return array(
'foo' => $this->foo,
'bar' => $this->bar,
'foobar' => $this->foobar,
);
}
}
| <?php
/**
* Zend Framework (http://framework.zend.com/)
*
* @link http://github.com/zendframework/zf2 for the canonical source repository
* @copyright Copyright (c) 2005-2013 Zend Technologies USA Inc. (http://www.zend.com)
* @license http://framework.zend.com/license/new-bsd New BSD License
*/
namespace ZendTest\Form\TestAsset;
use DomainException;
use Zend\Stdlib\ArraySerializableInterface;
/**
* @category Zend
* @package Zend_Form
* @subpackage UnitTest
*/
class Model implements ArraySerializableInterface
{
protected $foo;
protected $bar;
protected $foobar;
public function __set($name, $value)
{
throw new DomainException('Overloading to set values is not allowed');
}
public function __get($name)
{
if (property_exists($this, $name)) {
return $this->$name;
}
throw new DomainException('Unknown attribute');
}
public function exchangeArray(array $array)
{
foreach ($array as $key => $value) {
if (!property_exists($this, $key)) {
continue;
}
$this->$key = $value;
}
}
public function getArrayCopy()
{
return array(
'foo' => $this->foo,
'bar' => $this->bar,
'foobar' => $this->foobar,
);
}
}
| Remove @package & restart travis | Remove @package & restart travis
| PHP | bsd-3-clause | nandadotexe/zf2,inwebo/zf2,samsonasik/zf2,keradus/zf2,echampet/zf2,bullandrock/zf2,sasezaki/zf2,samsonasik/zf2,mmetayer/zf2,huangchaolei55/firstSite,stefanotorresi/zf2,pembo210/zf2,roko79/zf2,campersau/zf2,vrkansagara/zf2,TheBestSoftInTheWorld/zf2,0livier/zf2,KarimBea/test,dntmartins/zf2,KarimBea/test,desenvuno/zf2,levelfivehub/zf2,keradus/zf2,DASPRiD/zf2,olearys/zf2,ChrisSchreiber/zf2,bullandrock/zf2,AICIDNN/zf2,stefanotorresi/zf2,DASPRiD/zf2,mpalourdio/zf2,dvsavenkov/zf2,pepeam/zend,campersau/zf2,sasezaki/zf2,prolic/zf2,0livier/zf2,noopable/zf2,coolmic/zf2,sntsikdar/zf2,shitikovkirill/zf2,roheim/zf2,awd-git/zf2,dvsavenkov/zf2,david-sann/zf2,bourkhimehaytam/zf2,guilherme-santos/zf2,freax/zf2,Heshan-Sandamal/zf2,taogb/zf2,mmauri04/zf2,dkemper/zf2,samsonasik/zf2,felipemaksy/zf2,MadCat34/zf2,dkarvounaris/zf2,0livier/zf2,zf2timo/zf2,echampet/zf2,artpoplsh/zf2,dntmartins/zf2,svycka/zf2,dntmartins/zf2,bullandrock/zf2,kanellov/zf2,david-sann/zf2,walberjefferson/zf2,mikicaivosevic/zf2,campersau/zf2,echampet/zf2,bourkhimehaytam/zf2,Brucewyh/zf2,taro-asahi/zf2,stefanotorresi/zf2,enjoy2000/zf2,sasezaki/zf2,pepeam/zend,zendframework/zf2,Sineth-N/zf2,svycka/zf2 | php | ## Code Before:
<?php
/**
* Zend Framework (http://framework.zend.com/)
*
* @link http://github.com/zendframework/zf2 for the canonical source repository
* @copyright Copyright (c) 2005-2013 Zend Technologies USA Inc. (http://www.zend.com)
* @license http://framework.zend.com/license/new-bsd New BSD License
* @package Zend_Form
*/
namespace ZendTest\Form\TestAsset;
use DomainException;
use Zend\Stdlib\ArraySerializableInterface;
/**
* @category Zend
* @package Zend_Form
* @subpackage UnitTest
*/
class Model implements ArraySerializableInterface
{
protected $foo;
protected $bar;
protected $foobar;
public function __set($name, $value)
{
throw new DomainException('Overloading to set values is not allowed');
}
public function __get($name)
{
if (property_exists($this, $name)) {
return $this->$name;
}
throw new DomainException('Unknown attribute');
}
public function exchangeArray(array $array)
{
foreach ($array as $key => $value) {
if (!property_exists($this, $key)) {
continue;
}
$this->$key = $value;
}
}
public function getArrayCopy()
{
return array(
'foo' => $this->foo,
'bar' => $this->bar,
'foobar' => $this->foobar,
);
}
}
## Instruction:
Remove @package & restart travis
## Code After:
<?php
/**
* Zend Framework (http://framework.zend.com/)
*
* @link http://github.com/zendframework/zf2 for the canonical source repository
* @copyright Copyright (c) 2005-2013 Zend Technologies USA Inc. (http://www.zend.com)
* @license http://framework.zend.com/license/new-bsd New BSD License
*/
namespace ZendTest\Form\TestAsset;
use DomainException;
use Zend\Stdlib\ArraySerializableInterface;
/**
* @category Zend
* @package Zend_Form
* @subpackage UnitTest
*/
class Model implements ArraySerializableInterface
{
protected $foo;
protected $bar;
protected $foobar;
public function __set($name, $value)
{
throw new DomainException('Overloading to set values is not allowed');
}
public function __get($name)
{
if (property_exists($this, $name)) {
return $this->$name;
}
throw new DomainException('Unknown attribute');
}
public function exchangeArray(array $array)
{
foreach ($array as $key => $value) {
if (!property_exists($this, $key)) {
continue;
}
$this->$key = $value;
}
}
public function getArrayCopy()
{
return array(
'foo' => $this->foo,
'bar' => $this->bar,
'foobar' => $this->foobar,
);
}
}
| <?php
/**
* Zend Framework (http://framework.zend.com/)
*
* @link http://github.com/zendframework/zf2 for the canonical source repository
* @copyright Copyright (c) 2005-2013 Zend Technologies USA Inc. (http://www.zend.com)
* @license http://framework.zend.com/license/new-bsd New BSD License
- * @package Zend_Form
*/
namespace ZendTest\Form\TestAsset;
use DomainException;
use Zend\Stdlib\ArraySerializableInterface;
/**
* @category Zend
* @package Zend_Form
* @subpackage UnitTest
*/
class Model implements ArraySerializableInterface
{
protected $foo;
protected $bar;
protected $foobar;
public function __set($name, $value)
{
throw new DomainException('Overloading to set values is not allowed');
}
public function __get($name)
{
if (property_exists($this, $name)) {
return $this->$name;
}
throw new DomainException('Unknown attribute');
}
public function exchangeArray(array $array)
{
foreach ($array as $key => $value) {
if (!property_exists($this, $key)) {
continue;
}
$this->$key = $value;
}
}
public function getArrayCopy()
{
return array(
'foo' => $this->foo,
'bar' => $this->bar,
'foobar' => $this->foobar,
);
}
} | 1 | 0.017241 | 0 | 1 |
29ad2af79be665d15a22a155086580d47987d3c8 | docker/startup.sh | docker/startup.sh | set -e
while ! ping -c1 redis &>/dev/null; do :; done && echo "REDIS is UP"
while ! ping -c1 mq &>/dev/null; do :; done && echo "MQ is UP"
cd /app
./initialize.sh
| set -e
while ! ping -c1 redis &>/dev/null
do
echo "REDIS is DOWN"
sleep 1
done
echo "REDIS is UP"
while ! ping -c1 mq &>/dev/null
do
echo "MQ is DOWN"
sleep 1
done
echo "MQ is UP"
cd /app
./initialize.sh
| Print debug msg when waiting for Redis and MQ to start | Print debug msg when waiting for Redis and MQ to start
Otherwise you won't know why your docker container isn't starting: it's waiting
for Redis and MQ.
| Shell | mit | openspending/os-data-importers | shell | ## Code Before:
set -e
while ! ping -c1 redis &>/dev/null; do :; done && echo "REDIS is UP"
while ! ping -c1 mq &>/dev/null; do :; done && echo "MQ is UP"
cd /app
./initialize.sh
## Instruction:
Print debug msg when waiting for Redis and MQ to start
Otherwise you won't know why your docker container isn't starting: it's waiting
for Redis and MQ.
## Code After:
set -e
while ! ping -c1 redis &>/dev/null
do
echo "REDIS is DOWN"
sleep 1
done
echo "REDIS is UP"
while ! ping -c1 mq &>/dev/null
do
echo "MQ is DOWN"
sleep 1
done
echo "MQ is UP"
cd /app
./initialize.sh
| set -e
- while ! ping -c1 redis &>/dev/null; do :; done && echo "REDIS is UP"
- while ! ping -c1 mq &>/dev/null; do :; done && echo "MQ is UP"
+ while ! ping -c1 redis &>/dev/null
+ do
+ echo "REDIS is DOWN"
+ sleep 1
+ done
+ echo "REDIS is UP"
+
+ while ! ping -c1 mq &>/dev/null
+ do
+ echo "MQ is DOWN"
+ sleep 1
+ done
+ echo "MQ is UP"
cd /app
./initialize.sh | 15 | 1.875 | 13 | 2 |
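The entrypoint change above turns a silent one-liner into a loop that logs every failed probe. The same retry-with-logging shape, with the probe injected so it can be exercised without a network (purely illustrative, not part of the image):

```python
import time

def wait_until_up(name, probe, delay=1.0, sleep=time.sleep):
    # Poll probe() until it succeeds, logging each miss --
    # the shape of the `while ! ping -c1 ...` loops above.
    misses = 0
    while not probe():
        misses += 1
        print(f"{name} is DOWN")
        sleep(delay)
    print(f"{name} is UP")
    return misses

calls = {"n": 0}
def fake_probe():
    calls["n"] += 1
    return calls["n"] >= 3  # succeeds on the third attempt

misses = wait_until_up("REDIS", fake_probe, sleep=lambda _: None)
print(misses)  # 2
```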
e47acd07c307852b27936710469f8c97d0cd655f | lib/asciidoctor-diagram/util/svg.rb | lib/asciidoctor-diagram/util/svg.rb | require_relative 'binaryio'
module Asciidoctor
module Diagram
module SVG
def self.get_image_size(data)
if m = START_TAG_REGEX.match(data)
start_tag = m[0]
if (w = WIDTH_REGEX.match(start_tag)) && (h = HEIGHT_REGEX.match(start_tag))
width = w[:value].to_i * to_px_factor(w[:unit])
height = h[:value].to_i * to_px_factor(h[:unit])
return [width.to_i, height.to_i]
end
end
nil
end
private
START_TAG_REGEX = /<svg[^>]*>/
WIDTH_REGEX = /width="(?<value>\d+)(?<unit>[a-zA-Z]+)"/
HEIGHT_REGEX = /height="(?<value>\d+)(?<unit>[a-zA-Z]+)"/
def self.to_px_factor(unit)
case unit
when 'pt'
1.33
else
1
end
end
end
end
end | require_relative 'binaryio'
module Asciidoctor
module Diagram
module SVG
def self.get_image_size(data)
if m = START_TAG_REGEX.match(data)
start_tag = m[0]
if (w = WIDTH_REGEX.match(start_tag)) && (h = HEIGHT_REGEX.match(start_tag))
width = w[:value].to_i * to_px_factor(w[:unit])
height = h[:value].to_i * to_px_factor(h[:unit])
return [width.to_i, height.to_i]
end
if v = VIEWBOX_REGEX.match(start_tag)
width = v[:width]
height = v[:height]
return [width.to_i, height.to_i]
end
end
nil
end
private
START_TAG_REGEX = /<svg[^>]*>/
WIDTH_REGEX = /width="(?<value>\d+)(?<unit>[a-zA-Z]+)"/
HEIGHT_REGEX = /height="(?<value>\d+)(?<unit>[a-zA-Z]+)"/
VIEWBOX_REGEX = /viewBox="\d+ \d+ (?<width>\d+) (?<height>\d+)"/
def self.to_px_factor(unit)
case unit
when 'pt'
1.33
else
1
end
end
end
end
end | Add support for SVG viewbox based width/height extraction | Add support for SVG viewbox based width/height extraction
| Ruby | mit | asciidoctor/asciidoctor-diagram,chanibal/asciidoctor-diagram | ruby | ## Code Before:
require_relative 'binaryio'
module Asciidoctor
module Diagram
module SVG
def self.get_image_size(data)
if m = START_TAG_REGEX.match(data)
start_tag = m[0]
if (w = WIDTH_REGEX.match(start_tag)) && (h = HEIGHT_REGEX.match(start_tag))
width = w[:value].to_i * to_px_factor(w[:unit])
height = h[:value].to_i * to_px_factor(h[:unit])
return [width.to_i, height.to_i]
end
end
nil
end
private
START_TAG_REGEX = /<svg[^>]*>/
WIDTH_REGEX = /width="(?<value>\d+)(?<unit>[a-zA-Z]+)"/
HEIGHT_REGEX = /height="(?<value>\d+)(?<unit>[a-zA-Z]+)"/
def self.to_px_factor(unit)
case unit
when 'pt'
1.33
else
1
end
end
end
end
end
## Instruction:
Add support for SVG viewbox based width/height extraction
## Code After:
require_relative 'binaryio'
module Asciidoctor
module Diagram
module SVG
def self.get_image_size(data)
if m = START_TAG_REGEX.match(data)
start_tag = m[0]
if (w = WIDTH_REGEX.match(start_tag)) && (h = HEIGHT_REGEX.match(start_tag))
width = w[:value].to_i * to_px_factor(w[:unit])
height = h[:value].to_i * to_px_factor(h[:unit])
return [width.to_i, height.to_i]
end
if v = VIEWBOX_REGEX.match(start_tag)
width = v[:width]
height = v[:height]
return [width.to_i, height.to_i]
end
end
nil
end
private
START_TAG_REGEX = /<svg[^>]*>/
WIDTH_REGEX = /width="(?<value>\d+)(?<unit>[a-zA-Z]+)"/
HEIGHT_REGEX = /height="(?<value>\d+)(?<unit>[a-zA-Z]+)"/
VIEWBOX_REGEX = /viewBox="\d+ \d+ (?<width>\d+) (?<height>\d+)"/
def self.to_px_factor(unit)
case unit
when 'pt'
1.33
else
1
end
end
end
end
end | require_relative 'binaryio'
module Asciidoctor
module Diagram
module SVG
def self.get_image_size(data)
if m = START_TAG_REGEX.match(data)
start_tag = m[0]
if (w = WIDTH_REGEX.match(start_tag)) && (h = HEIGHT_REGEX.match(start_tag))
width = w[:value].to_i * to_px_factor(w[:unit])
height = h[:value].to_i * to_px_factor(h[:unit])
return [width.to_i, height.to_i]
end
+
+ if v = VIEWBOX_REGEX.match(start_tag)
+ width = v[:width]
+ height = v[:height]
+ return [width.to_i, height.to_i]
+ end
end
nil
end
private
START_TAG_REGEX = /<svg[^>]*>/
WIDTH_REGEX = /width="(?<value>\d+)(?<unit>[a-zA-Z]+)"/
HEIGHT_REGEX = /height="(?<value>\d+)(?<unit>[a-zA-Z]+)"/
+ VIEWBOX_REGEX = /viewBox="\d+ \d+ (?<width>\d+) (?<height>\d+)"/
def self.to_px_factor(unit)
case unit
when 'pt'
1.33
else
1
end
end
end
end
 end
| 7 | 0.2 | 7 | 0 |
d9b32ab691c097762b526c8f4c0cac6d154fd87c | src/config/colors.js | src/config/colors.js |
export default {
primary: '#9E9E9E',
primary1: '#4d86f7',
primary2: '#6296f9',
secondary: '#8F0CE8',
secondary2: '#00B233',
secondary3: '#00FF48',
grey0: '#393e42',
grey1: '#43484d',
grey2: '#5e6977',
grey3: '#86939e',
grey4: '#bdc6cf',
grey5: '#e1e8ee',
dkGreyBg: '#232323',
greyOutline: '#cbd2d9',
searchBg: '#303337'
}
| export default {
primary: '#9E9E9E',
primary1: '#4d86f7',
primary2: '#6296f9',
secondary: '#8F0CE8',
secondary2: '#00B233',
secondary3: '#00FF48',
grey0: '#393e42',
grey1: '#43484d',
grey2: '#5e6977',
grey3: '#86939e',
grey4: '#bdc6cf',
grey5: '#e1e8ee',
dkGreyBg: '#232323',
greyOutline: '#cbd2d9',
searchBg: '#303337',
disabled: '#dadee0',
}
| Add a light grey color for disabled buttons | Add a light grey color for disabled buttons | JavaScript | mit | kosiakMD/react-native-elements,react-native-community/React-Native-Elements,kosiakMD/react-native-elements,martinezguillaume/react-native-elements | javascript | ## Code Before:
export default {
primary: '#9E9E9E',
primary1: '#4d86f7',
primary2: '#6296f9',
secondary: '#8F0CE8',
secondary2: '#00B233',
secondary3: '#00FF48',
grey0: '#393e42',
grey1: '#43484d',
grey2: '#5e6977',
grey3: '#86939e',
grey4: '#bdc6cf',
grey5: '#e1e8ee',
dkGreyBg: '#232323',
greyOutline: '#cbd2d9',
searchBg: '#303337'
}
## Instruction:
Add a light grey color for disabled buttons
## Code After:
export default {
primary: '#9E9E9E',
primary1: '#4d86f7',
primary2: '#6296f9',
secondary: '#8F0CE8',
secondary2: '#00B233',
secondary3: '#00FF48',
grey0: '#393e42',
grey1: '#43484d',
grey2: '#5e6977',
grey3: '#86939e',
grey4: '#bdc6cf',
grey5: '#e1e8ee',
dkGreyBg: '#232323',
greyOutline: '#cbd2d9',
searchBg: '#303337',
disabled: '#dadee0',
}
| export default {
primary: '#9E9E9E',
primary1: '#4d86f7',
primary2: '#6296f9',
secondary: '#8F0CE8',
secondary2: '#00B233',
secondary3: '#00FF48',
grey0: '#393e42',
grey1: '#43484d',
grey2: '#5e6977',
grey3: '#86939e',
grey4: '#bdc6cf',
grey5: '#e1e8ee',
dkGreyBg: '#232323',
greyOutline: '#cbd2d9',
- searchBg: '#303337'
+ searchBg: '#303337',
? +
+ disabled: '#dadee0',
 }
| 3 | 0.176471 | 2 | 1 |
d90358d861e2436fc468b199943bb7e53b31290b | test/jobs/clean_abandoned_site_channels_job_test.rb | test/jobs/clean_abandoned_site_channels_job_test.rb |
require 'test_helper'
class CleanAbandonedSiteChannelsJobTest < ActiveJob::TestCase
test "cleans abandoned site channels older than one day" do
publisher = publishers(:medium_media_group)
assert SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
assert_difference("Channel.count", -1) do
assert_difference("publisher.channels.count", -1) do
CleanAbandonedSiteChannelsJob.perform_now
end
end
refute SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
end
test "cleans abandoned site channel details older than one day" do
assert SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
assert_difference("SiteChannelDetails.count", -1) do
CleanAbandonedSiteChannelsJob.perform_now
end
refute SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
end
end | require 'test_helper'
class CleanAbandonedSiteChannelsJobTest < ActiveJob::TestCase
test "cleans non-visible (abandoned) site channels older than one day" do
publisher = publishers(:medium_media_group)
assert SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
assert_difference("Channel.count", -1 * Channel.not_visible_site_channels.count) do
assert_difference("publisher.channels.count", -1 ) do
CleanAbandonedSiteChannelsJob.perform_now
end
end
refute SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
end
test "cleans non-visible (abandoned) site channel details older than one day" do
assert SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
assert_difference("SiteChannelDetails.count", -1 * Channel.not_visible_site_channels.count) do
CleanAbandonedSiteChannelsJob.perform_now
end
refute SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
end
end
| Change requirements for abandoned channels | Change requirements for abandoned channels
| Ruby | mpl-2.0 | brave/publishers,brave/publishers,brave/publishers | ruby | ## Code Before:
require 'test_helper'
class CleanAbandonedSiteChannelsJobTest < ActiveJob::TestCase
test "cleans abandoned site channels older than one day" do
publisher = publishers(:medium_media_group)
assert SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
assert_difference("Channel.count", -1) do
assert_difference("publisher.channels.count", -1) do
CleanAbandonedSiteChannelsJob.perform_now
end
end
refute SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
end
test "cleans abandoned site channel details older than one day" do
assert SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
assert_difference("SiteChannelDetails.count", -1) do
CleanAbandonedSiteChannelsJob.perform_now
end
refute SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
end
end
## Instruction:
Change requirements for abandoned channels
## Code After:
require 'test_helper'
class CleanAbandonedSiteChannelsJobTest < ActiveJob::TestCase
test "cleans non-visible (abandoned) site channels older than one day" do
publisher = publishers(:medium_media_group)
assert SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
assert_difference("Channel.count", -1 * Channel.not_visible_site_channels.count) do
assert_difference("publisher.channels.count", -1 ) do
CleanAbandonedSiteChannelsJob.perform_now
end
end
refute SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
end
test "cleans non-visible (abandoned) site channel details older than one day" do
assert SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
assert_difference("SiteChannelDetails.count", -1 * Channel.not_visible_site_channels.count) do
CleanAbandonedSiteChannelsJob.perform_now
end
refute SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
end
end
| require 'test_helper'
class CleanAbandonedSiteChannelsJobTest < ActiveJob::TestCase
- test "cleans abandoned site channels older than one day" do
+ test "cleans non-visible (abandoned) site channels older than one day" do
? +++++++++++++ +
publisher = publishers(:medium_media_group)
assert SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
- assert_difference("Channel.count", -1) do
+ assert_difference("Channel.count", -1 * Channel.not_visible_site_channels.count) do
- assert_difference("publisher.channels.count", -1) do
+ assert_difference("publisher.channels.count", -1 ) do
? +
CleanAbandonedSiteChannelsJob.perform_now
end
end
refute SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
end
- test "cleans abandoned site channel details older than one day" do
+ test "cleans non-visible (abandoned) site channel details older than one day" do
? +++++++++++++ +
assert SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
- assert_difference("SiteChannelDetails.count", -1) do
+ assert_difference("SiteChannelDetails.count", -1 * Channel.not_visible_site_channels.count) do
CleanAbandonedSiteChannelsJob.perform_now
end
refute SiteChannelDetails.find_by(brave_publisher_id: "medium_2.org")
end
 end
| 10 | 0.37037 | 5 | 5 |
aaaecf4a4d503fa509ab74b2c77589c3579bf030 | .travis.yml | .travis.yml |
language: ruby
sudo: false
cache: bundler
rvm:
- 2.2.6
- 2.3.1
- 2.4.1
- ruby-head
gemfile:
- gemfiles/rails_4_0.gemfile
- gemfiles/rails_4_1.gemfile
- gemfiles/rails_4_2.gemfile
- gemfiles/rails_5_0.gemfile
- gemfiles/rails_master.gemfile
- Gemfile
matrix:
exclude:
- rvm: 2.4.1
gemfile: gemfiles/rails_4_0.gemfile
- rvm: 2.4.1
gemfile: gemfiles/rails_4_1.gemfile
allow_failures:
- rvm: ruby-head
- rvm: 2.4.1
gemfile: gemfiles/rails_4_2.gemfile
- gemfile: gemfiles/rails_master.gemfile
fast_finish: true
notifications:
email: false
irc:
on_success: change
on_failure: always
channels:
- "irc.freenode.org#rails-contrib"
| language: ruby
sudo: false
cache: bundler
rvm:
- 2.2.6
- 2.3.1
- 2.4.1
- ruby-head
gemfile:
- gemfiles/rails_4_2.gemfile
- gemfiles/rails_5_0.gemfile
- gemfiles/rails_master.gemfile
- Gemfile
matrix:
allow_failures:
- rvm: ruby-head
- gemfile: gemfiles/rails_master.gemfile
fast_finish: true
notifications:
email: false
irc:
on_success: change
on_failure: always
channels:
- "irc.freenode.org#rails-contrib"
| Remove old gemfiles from Travis. | Remove old gemfiles from Travis.
Require Rails 4.2 to pass with Ruby 2.4.1 since it works
in the last couple of builds.
| YAML | mit | rails/rails-observers | yaml | ## Code Before:
language: ruby
sudo: false
cache: bundler
rvm:
- 2.2.6
- 2.3.1
- 2.4.1
- ruby-head
gemfile:
- gemfiles/rails_4_0.gemfile
- gemfiles/rails_4_1.gemfile
- gemfiles/rails_4_2.gemfile
- gemfiles/rails_5_0.gemfile
- gemfiles/rails_master.gemfile
- Gemfile
matrix:
exclude:
- rvm: 2.4.1
gemfile: gemfiles/rails_4_0.gemfile
- rvm: 2.4.1
gemfile: gemfiles/rails_4_1.gemfile
allow_failures:
- rvm: ruby-head
- rvm: 2.4.1
gemfile: gemfiles/rails_4_2.gemfile
- gemfile: gemfiles/rails_master.gemfile
fast_finish: true
notifications:
email: false
irc:
on_success: change
on_failure: always
channels:
- "irc.freenode.org#rails-contrib"
## Instruction:
Remove old gemfiles from Travis.
Require Rails 4.2 to pass with Ruby 2.4.1 since it works
in the last couple of builds.
## Code After:
language: ruby
sudo: false
cache: bundler
rvm:
- 2.2.6
- 2.3.1
- 2.4.1
- ruby-head
gemfile:
- gemfiles/rails_4_2.gemfile
- gemfiles/rails_5_0.gemfile
- gemfiles/rails_master.gemfile
- Gemfile
matrix:
allow_failures:
- rvm: ruby-head
- gemfile: gemfiles/rails_master.gemfile
fast_finish: true
notifications:
email: false
irc:
on_success: change
on_failure: always
channels:
- "irc.freenode.org#rails-contrib"
| language: ruby
sudo: false
cache: bundler
rvm:
- 2.2.6
- 2.3.1
- 2.4.1
- ruby-head
gemfile:
- - gemfiles/rails_4_0.gemfile
- - gemfiles/rails_4_1.gemfile
- gemfiles/rails_4_2.gemfile
- gemfiles/rails_5_0.gemfile
- gemfiles/rails_master.gemfile
- Gemfile
matrix:
- exclude:
- - rvm: 2.4.1
- gemfile: gemfiles/rails_4_0.gemfile
- - rvm: 2.4.1
- gemfile: gemfiles/rails_4_1.gemfile
allow_failures:
- rvm: ruby-head
- - rvm: 2.4.1
- gemfile: gemfiles/rails_4_2.gemfile
- gemfile: gemfiles/rails_master.gemfile
fast_finish: true
notifications:
email: false
irc:
on_success: change
on_failure: always
channels:
- "irc.freenode.org#rails-contrib" | 9 | 0.264706 | 0 | 9 |
1c153fdbbe75c3a74a1fd7e2cd6d03dd5cd48c9f | Windows_Setup.txt | Windows_Setup.txt | Windows Setup
------------------------
Google Chrome
Octave
Rapid Environment Editor
Sublime Text 3
ConEmu
Git for Windows
Scilab
Nano
Miniconda
tortoisehg
Ubuntu 12.04
vcredist 2010 x86
Virtualbox
Visual Studio 2008
Environment variables:
PATH: Anaconda, Anaconda Scipts, Git path, Octave path, Octave msys (for make), sublime path
set HOME
Run the anaconda bash program
| Windows Setup
------------------------
Google Chrome
Octave
Rapid Environment Editor
Sublime Text 3
ConEmu
Git for Windows
Scilab
Nano
Miniconda
tortoisehg
Ubuntu 12.04
Ubuntu 14.04
vcredist 2010 x86
Virtualbox
Visual Studio 2008
Environment variables:
PATH: Anaconda, Anaconda Scipts, Git path, Octave path, Octave msys (for make), sublime path
set HOME
Run the anaconda bash program
| Add Ubuntu 14.04 to setup notes | Add Ubuntu 14.04 to setup notes
| Text | mit | blink1073/dotfiles,blink1073/dotfiles | text | ## Code Before:
Windows Setup
------------------------
Google Chrome
Octave
Rapid Environment Editor
Sublime Text 3
ConEmu
Git for Windows
Scilab
Nano
Miniconda
tortoisehg
Ubuntu 12.04
vcredist 2010 x86
Virtualbox
Visual Studio 2008
Environment variables:
PATH: Anaconda, Anaconda Scipts, Git path, Octave path, Octave msys (for make), sublime path
set HOME
Run the anaconda bash program
## Instruction:
Add Ubuntu 14.04 to setup notes
## Code After:
Windows Setup
------------------------
Google Chrome
Octave
Rapid Environment Editor
Sublime Text 3
ConEmu
Git for Windows
Scilab
Nano
Miniconda
tortoisehg
Ubuntu 12.04
Ubuntu 14.04
vcredist 2010 x86
Virtualbox
Visual Studio 2008
Environment variables:
PATH: Anaconda, Anaconda Scipts, Git path, Octave path, Octave msys (for make), sublime path
set HOME
Run the anaconda bash program
| Windows Setup
------------------------
Google Chrome
Octave
Rapid Environment Editor
Sublime Text 3
ConEmu
Git for Windows
Scilab
Nano
Miniconda
tortoisehg
Ubuntu 12.04
+ Ubuntu 14.04
vcredist 2010 x86
Virtualbox
Visual Studio 2008
Environment variables:
PATH: Anaconda, Anaconda Scipts, Git path, Octave path, Octave msys (for make), sublime path
set HOME
 Run the anaconda bash program
| 1 | 0.043478 | 1 | 0 |
f0ce462a22746e04031d2de18674653e864c172a | 7ml7w/README.md | 7ml7w/README.md |
Code generated from meetings of [the London Computation
Club](http://london.computation.club) as we study [Seven more languages
in seven weeks](https://pragprog.com/book/7lang/seven-more-languages-in-seven-weeks)
* Lua
* [Day 1](./lua/day_1) - just the exercises
* [Day 2](./lua/day_2) - scratch pad files from working through the text,
exercies, and some [notes](./lua/day_2/notes.md) on how I got on
|
Code generated from meetings of [the London Computation
Club](http://london.computation.club) as we study [Seven more languages
in seven weeks](https://pragprog.com/book/7lang/seven-more-languages-in-seven-weeks)
* Lua
* [Day 1](./lua/day_1) - just the exercises
* [Day 2](./lua/day_2) - scratch pad files from working through the text,
exercies, and some [notes](./lua/day_2/notes.md) on how I got on
* [Day 3](./lua/day_3) - scratch pad files from working through the text,
exercies, and some [notes](./lua/day_3/notes.md) on how I got on
| Add Day 3 link to main 7ml7w readme | Add Day 3 link to main 7ml7w readme
| Markdown | cc0-1.0 | h-lame/computationclub-mandelbrot,h-lame/computationclub-mandelbrot | markdown | ## Code Before:
Code generated from meetings of [the London Computation
Club](http://london.computation.club) as we study [Seven more languages
in seven weeks](https://pragprog.com/book/7lang/seven-more-languages-in-seven-weeks)
* Lua
* [Day 1](./lua/day_1) - just the exercises
* [Day 2](./lua/day_2) - scratch pad files from working through the text,
exercies, and some [notes](./lua/day_2/notes.md) on how I got on
## Instruction:
Add Day 3 link to main 7ml7w readme
## Code After:
Code generated from meetings of [the London Computation
Club](http://london.computation.club) as we study [Seven more languages
in seven weeks](https://pragprog.com/book/7lang/seven-more-languages-in-seven-weeks)
* Lua
* [Day 1](./lua/day_1) - just the exercises
* [Day 2](./lua/day_2) - scratch pad files from working through the text,
exercies, and some [notes](./lua/day_2/notes.md) on how I got on
* [Day 3](./lua/day_3) - scratch pad files from working through the text,
exercies, and some [notes](./lua/day_3/notes.md) on how I got on
|
Code generated from meetings of [the London Computation
Club](http://london.computation.club) as we study [Seven more languages
in seven weeks](https://pragprog.com/book/7lang/seven-more-languages-in-seven-weeks)
* Lua
* [Day 1](./lua/day_1) - just the exercises
* [Day 2](./lua/day_2) - scratch pad files from working through the text,
exercies, and some [notes](./lua/day_2/notes.md) on how I got on
+ * [Day 3](./lua/day_3) - scratch pad files from working through the text,
+   exercies, and some [notes](./lua/day_3/notes.md) on how I got on
| 2 | 0.222222 | 2 | 0 |
aedf2c4d55f165d6f837096e9b244d74ffe159af | scripts/run_unit_tests.sh | scripts/run_unit_tests.sh |
set -e
docker rm -f qgis || true
docker run -d --name qgis -v /tmp/.X11-unix:/tmp/.X11-unix \
-v `pwd`/../.:/tests_directory \
-e DISPLAY=:99 \
-e GEM_QGIS_TEST=y \
qgis/qgis:final-3_8_3
docker exec -it qgis sh -c "apt update; DEBIAN_FRONTEND=noninteractive apt install -y python3-scipy python3-matplotlib python3-pyqt5.qtwebkit python3-pytest"
# OGR_SQLITE_JOURNAL=delete prevents QGIS from using WAL, which modifies geopackages even if they are just read
docker exec -it qgis sh -c "export PYTHONPATH=/usr/share/qgis/python/plugins/:$PYTHONPATH; OGR_SQLITE_JOURNAL=delete python3 -m pytest -v /tests_directory/svir/test/unit/"
|
set -e
docker rm -f qgis || true
docker run -d --name qgis -v /tmp/.X11-unix:/tmp/.X11-unix \
-v `pwd`/../.:/tests_directory \
-e DISPLAY=:99 \
-e GEM_QGIS_TEST=y \
qgis/qgis:final-3_8_3
docker exec -it qgis bash -c "apt update; DEBIAN_FRONTEND=noninteractive apt install -y python3-scipy python3-matplotlib python3-pyqt5.qtwebkit"
docker exec -it qgis bash -c "python3 -m pip install pytest"
# OGR_SQLITE_JOURNAL=delete prevents QGIS from using WAL, which modifies geopackages even if they are just read
docker exec -it qgis bash -c "export PYTHONPATH=/usr/share/qgis/python/plugins/:$PYTHONPATH; OGR_SQLITE_JOURNAL=delete pytest -v /tests_directory/svir/test/unit/"
| Use pytest instead of nose in unit tests | Use pytest instead of nose in unit tests
| Shell | agpl-3.0 | gem/oq-svir-qgis,gem/oq-svir-qgis,gem/oq-svir-qgis,gem/oq-svir-qgis | shell | ## Code Before:
set -e
docker rm -f qgis || true
docker run -d --name qgis -v /tmp/.X11-unix:/tmp/.X11-unix \
-v `pwd`/../.:/tests_directory \
-e DISPLAY=:99 \
-e GEM_QGIS_TEST=y \
qgis/qgis:final-3_8_3
docker exec -it qgis sh -c "apt update; DEBIAN_FRONTEND=noninteractive apt install -y python3-scipy python3-matplotlib python3-pyqt5.qtwebkit python3-pytest"
# OGR_SQLITE_JOURNAL=delete prevents QGIS from using WAL, which modifies geopackages even if they are just read
docker exec -it qgis sh -c "export PYTHONPATH=/usr/share/qgis/python/plugins/:$PYTHONPATH; OGR_SQLITE_JOURNAL=delete python3 -m pytest -v /tests_directory/svir/test/unit/"
## Instruction:
Use pytest instead of nose in unit tests
## Code After:
set -e
docker rm -f qgis || true
docker run -d --name qgis -v /tmp/.X11-unix:/tmp/.X11-unix \
-v `pwd`/../.:/tests_directory \
-e DISPLAY=:99 \
-e GEM_QGIS_TEST=y \
qgis/qgis:final-3_8_3
docker exec -it qgis bash -c "apt update; DEBIAN_FRONTEND=noninteractive apt install -y python3-scipy python3-matplotlib python3-pyqt5.qtwebkit"
docker exec -it qgis bash -c "python3 -m pip install pytest"
# OGR_SQLITE_JOURNAL=delete prevents QGIS from using WAL, which modifies geopackages even if they are just read
docker exec -it qgis bash -c "export PYTHONPATH=/usr/share/qgis/python/plugins/:$PYTHONPATH; OGR_SQLITE_JOURNAL=delete pytest -v /tests_directory/svir/test/unit/"
|
set -e
docker rm -f qgis || true
docker run -d --name qgis -v /tmp/.X11-unix:/tmp/.X11-unix \
-v `pwd`/../.:/tests_directory \
-e DISPLAY=:99 \
-e GEM_QGIS_TEST=y \
qgis/qgis:final-3_8_3
- docker exec -it qgis sh -c "apt update; DEBIAN_FRONTEND=noninteractive apt install -y python3-scipy python3-matplotlib python3-pyqt5.qtwebkit python3-pytest"
? ---------------
+ docker exec -it qgis bash -c "apt update; DEBIAN_FRONTEND=noninteractive apt install -y python3-scipy python3-matplotlib python3-pyqt5.qtwebkit"
? ++
+ docker exec -it qgis bash -c "python3 -m pip install pytest"
# OGR_SQLITE_JOURNAL=delete prevents QGIS from using WAL, which modifies geopackages even if they are just read
- docker exec -it qgis sh -c "export PYTHONPATH=/usr/share/qgis/python/plugins/:$PYTHONPATH; OGR_SQLITE_JOURNAL=delete python3 -m pytest -v /tests_directory/svir/test/unit/"
? -----------
+ docker exec -it qgis bash -c "export PYTHONPATH=/usr/share/qgis/python/plugins/:$PYTHONPATH; OGR_SQLITE_JOURNAL=delete pytest -v /tests_directory/svir/test/unit/"
? ++
| 5 | 0.357143 | 3 | 2 |
ecd9e8045a985d6df05bb201728b1687342c1759 | demo/clusters/cluster.cluster-us-west.yaml | demo/clusters/cluster.cluster-us-west.yaml |
name: cluster-us-west
type: kubernetes
metadata:
kubecontext: cluster-us-west
namespace: demo
| name: cluster-us-west
type: kubernetes
#labels:
# compromised: true
metadata:
kubecontext: cluster-us-west
namespace: demo
| Add commented label for us-west cluster | Add commented label for us-west cluster
| YAML | apache-2.0 | Aptomi/aptomi,Aptomi/aptomi,Aptomi/aptomi,Aptomi/aptomi | yaml | ## Code Before:
name: cluster-us-west
type: kubernetes
metadata:
kubecontext: cluster-us-west
namespace: demo
## Instruction:
Add commented label for us-west cluster
## Code After:
name: cluster-us-west
type: kubernetes
#labels:
# compromised: true
metadata:
kubecontext: cluster-us-west
namespace: demo
| name: cluster-us-west
type: kubernetes
+ #labels:
+ # compromised: true
+
metadata:
kubecontext: cluster-us-west
   namespace: demo
| 3 | 0.428571 | 3 | 0 |
12a9057cb95d5477e55f76369afa8893ed0cfebd | lib/issue_submit_information_patch.rb | lib/issue_submit_information_patch.rb |
module IssueSubmitInformationPatch
def self.included(base)
base.send(:include, InstanceMethods)
base.class_eval do
alias_method_chain :create, :issue_link
end
end
module InstanceMethods
def create_with_issue_link
without = create_without_issue_link
flash[:notice] = l(:notice_successful_issue_create, link_to_issue(@issue, :project => @project))
return without
end
end
end
| module IssueSubmitInformationPatch
def self.included(base)
base.send(:include, InstanceMethods)
base.class_eval do
alias_method_chain :create, :issue_link
end
end
module InstanceMethods
def create_with_issue_link
without = create_without_issue_link
flash[:notice] = l(:notice_successful_issue_create, self.class.helpers.link_to_issue(@issue, :project => @project))
return without
end
end
end
| Resolve reference to link_to within controller. | Resolve reference to link_to within controller.
There seems to be some indications that using "link_to" within a
controller within a controller is bad practice. Most indications seem to
be that you could hand-roll the HTML yourself (a thousand times worse,
in my opinion), render a partial (which I don't know how to do), or just
leave the link_to for views.
However, outside of extending this to include a new view which is then
rendered and passed in to the flash notice, I think this is the quickest
and sanest option.
If it actually works!
Ref: http://blog.lrdesign.com/2009/05/using-link_to-or-other-helper-methods-in-a-controller/
| Ruby | mit | jmcb/issue-submit-information | ruby | ## Code Before:
module IssueSubmitInformationPatch
def self.included(base)
base.send(:include, InstanceMethods)
base.class_eval do
alias_method_chain :create, :issue_link
end
end
module InstanceMethods
def create_with_issue_link
without = create_without_issue_link
flash[:notice] = l(:notice_successful_issue_create, link_to_issue(@issue, :project => @project))
return without
end
end
end
## Instruction:
Resolve reference to link_to within controller.
There seems to be some indications that using "link_to" within a
controller within a controller is bad practice. Most indications seem to
be that you could hand-roll the HTML yourself (a thousand times worse,
in my opinion), render a partial (which I don't know how to do), or just
leave the link_to for views.
However, outside of extending this to include a new view which is then
rendered and passed in to the flash notice, I think this is the quickest
and sanest option.
If it actually works!
Ref: http://blog.lrdesign.com/2009/05/using-link_to-or-other-helper-methods-in-a-controller/
## Code After:
module IssueSubmitInformationPatch
def self.included(base)
base.send(:include, InstanceMethods)
base.class_eval do
alias_method_chain :create, :issue_link
end
end
module InstanceMethods
def create_with_issue_link
without = create_without_issue_link
flash[:notice] = l(:notice_successful_issue_create, self.class.helpers.link_to_issue(@issue, :project => @project))
return without
end
end
end
| module IssueSubmitInformationPatch
def self.included(base)
base.send(:include, InstanceMethods)
base.class_eval do
alias_method_chain :create, :issue_link
end
end
module InstanceMethods
def create_with_issue_link
without = create_without_issue_link
- flash[:notice] = l(:notice_successful_issue_create, link_to_issue(@issue, :project => @project))
+ flash[:notice] = l(:notice_successful_issue_create, self.class.helpers.link_to_issue(@issue, :project => @project))
? +++++++++++++++++++
return without
end
end
 end
| 2 | 0.111111 | 1 | 1 |
ca9490f558e6fba0980e47d4e2cda88e76f700ac | app/javascript/app/styles/themes/sdg-card/sdg-card.scss | app/javascript/app/styles/themes/sdg-card/sdg-card.scss |
@import '~styles/layout.scss';
.card {
position: relative;
margin-bottom: -1px;
border: solid 1px $theme-border;
}
| @import '~styles/layout.scss';
.card {
position: relative;
margin-bottom: -1px;
border: solid 1px $theme-border;
&:last-child {
border-right: 1px solid transparent;
box-shadow: 1px 0 0 $theme-border;
}
&:not(:last-child) {
border-right: none;
}
}
| Delete double border to sdg cards | Delete double border to sdg cards
| SCSS | mit | Vizzuality/climate-watch,Vizzuality/climate-watch,Vizzuality/climate-watch,Vizzuality/climate-watch | scss | ## Code Before:
@import '~styles/layout.scss';
.card {
position: relative;
margin-bottom: -1px;
border: solid 1px $theme-border;
}
## Instruction:
Delete double border to sdg cards
## Code After:
@import '~styles/layout.scss';
.card {
position: relative;
margin-bottom: -1px;
border: solid 1px $theme-border;
&:last-child {
border-right: 1px solid transparent;
box-shadow: 1px 0 0 $theme-border;
}
&:not(:last-child) {
border-right: none;
}
}
| @import '~styles/layout.scss';
.card {
position: relative;
margin-bottom: -1px;
border: solid 1px $theme-border;
+
+ &:last-child {
+ border-right: 1px solid transparent;
+ box-shadow: 1px 0 0 $theme-border;
+ }
+
+ &:not(:last-child) {
+ border-right: none;
+ }
 }
| 9 | 1.285714 | 9 | 0 |
59b81f2734f0158ab61a25d7d3e18608b3d5a3cc | README.md | README.md |
[](https://travis-ci.org/andrewferrier/imbk)
iPhone Media Backup - an experimental project to provide easy/regular backups
of iPhone photos/videos to a remote SFTP server.
Usage documentation is not-existent at the moment, but it should be
self-explanatory if you understand
[SSH](https://en.wikipedia.org/wiki/SSH_File_Transfer_Protocol)/SFTP. It looks
a bit like this:

|
***This project is now unmaintained. For my own photo backup needs, I'm using [PhotoSync](https://photosync-app.com/iOS.html) instead.***
---
[](https://travis-ci.org/andrewferrier/imbk)
iPhone Media Backup - an experimental project to provide easy/regular backups
of iPhone photos/videos to a remote SFTP server.
Usage documentation is not-existent at the moment, but it should be
self-explanatory if you understand
[SSH](https://en.wikipedia.org/wiki/SSH_File_Transfer_Protocol)/SFTP. It looks
a bit like this:

| Add note regarding going unmaintained. | Add note regarding going unmaintained. | Markdown | mit | andrewferrier/imbk,andrewferrier/imbk,andrewferrier/imbk | markdown | ## Code Before:
[](https://travis-ci.org/andrewferrier/imbk)
iPhone Media Backup - an experimental project to provide easy/regular backups
of iPhone photos/videos to a remote SFTP server.
Usage documentation is not-existent at the moment, but it should be
self-explanatory if you understand
[SSH](https://en.wikipedia.org/wiki/SSH_File_Transfer_Protocol)/SFTP. It looks
a bit like this:

## Instruction:
Add note regarding going unmaintained.
## Code After:
***This project is now unmaintained. For my own photo backup needs, I'm using [PhotoSync](https://photosync-app.com/iOS.html) instead.***
---
[](https://travis-ci.org/andrewferrier/imbk)
iPhone Media Backup - an experimental project to provide easy/regular backups
of iPhone photos/videos to a remote SFTP server.
Usage documentation is not-existent at the moment, but it should be
self-explanatory if you understand
[SSH](https://en.wikipedia.org/wiki/SSH_File_Transfer_Protocol)/SFTP. It looks
a bit like this:

| +
+ ***This project is now unmaintained. For my own photo backup needs, I'm using [PhotoSync](https://photosync-app.com/iOS.html) instead.***
+
+ ---
[](https://travis-ci.org/andrewferrier/imbk)
iPhone Media Backup - an experimental project to provide easy/regular backups
of iPhone photos/videos to a remote SFTP server.
Usage documentation is not-existent at the moment, but it should be
self-explanatory if you understand
[SSH](https://en.wikipedia.org/wiki/SSH_File_Transfer_Protocol)/SFTP. It looks
a bit like this:
 | 4 | 0.333333 | 4 | 0 |
0f05299ed7cf79d7b30aed3371e84319c6c3c055 | xtab/org.eclipse.birt.report.item.crosstab.ui/src/org/eclipse/birt/report/item/crosstab/internal/ui/dialogs/CrosstabComputedMeasureExpressionProvider.java | xtab/org.eclipse.birt.report.item.crosstab.ui/src/org/eclipse/birt/report/item/crosstab/internal/ui/dialogs/CrosstabComputedMeasureExpressionProvider.java | /*******************************************************************************
* Copyright (c) 2004 Actuate Corporation.
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* Actuate Corporation - initial API and implementation
*******************************************************************************/
package org.eclipse.birt.report.item.crosstab.internal.ui.dialogs;
import org.eclipse.birt.report.designer.ui.dialogs.ExpressionProvider;
import org.eclipse.birt.report.designer.ui.expressions.ExpressionFilter;
import org.eclipse.birt.report.model.api.DesignElementHandle;
/**
*
*/
public class CrosstabComputedMeasureExpressionProvider extends
CrosstabExpressionProvider
{
public CrosstabComputedMeasureExpressionProvider( DesignElementHandle handle )
{
super( handle, null );
}
protected void addFilterToProvider( )
{
this.addFilter( new ExpressionFilter( ) {
public boolean select( Object parentElement, Object element )
{
if ( ExpressionFilter.CATEGORY.equals( parentElement )
&& ExpressionProvider.CURRENT_CUBE.equals( element ) )
{
return false;
}
return true;
}
} );
}
}
| /*******************************************************************************
* Copyright (c) 2004 Actuate Corporation.
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* Actuate Corporation - initial API and implementation
*******************************************************************************/
package org.eclipse.birt.report.item.crosstab.internal.ui.dialogs;
import org.eclipse.birt.report.designer.ui.dialogs.ExpressionProvider;
import org.eclipse.birt.report.designer.ui.expressions.ExpressionFilter;
import org.eclipse.birt.report.model.api.DesignElementHandle;
/**
*
*/
public class CrosstabComputedMeasureExpressionProvider extends
CrosstabExpressionProvider
{
public CrosstabComputedMeasureExpressionProvider( DesignElementHandle handle )
{
super( handle, null );
}
// protected void addFilterToProvider( )
// {
// this.addFilter( new ExpressionFilter( ) {
//
// public boolean select( Object parentElement, Object element )
// {
// if ( ExpressionFilter.CATEGORY.equals( parentElement )
// && ExpressionProvider.CURRENT_CUBE.equals( element ) )
// {
// return false;
// }
// return true;
// }
// } );
// }
}
| Remove the restriction on cube element on the expression builder when creating a derived measure. | Remove the restriction on cube element on the expression builder when
creating a derived measure. | Java | epl-1.0 | sguan-actuate/birt,rrimmana/birt-1,sguan-actuate/birt,Charling-Huang/birt,Charling-Huang/birt,Charling-Huang/birt,sguan-actuate/birt,rrimmana/birt-1,sguan-actuate/birt,rrimmana/birt-1,rrimmana/birt-1,rrimmana/birt-1,Charling-Huang/birt,Charling-Huang/birt,sguan-actuate/birt | java | ## Code Before:
/*******************************************************************************
* Copyright (c) 2004 Actuate Corporation.
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* Actuate Corporation - initial API and implementation
*******************************************************************************/
package org.eclipse.birt.report.item.crosstab.internal.ui.dialogs;
import org.eclipse.birt.report.designer.ui.dialogs.ExpressionProvider;
import org.eclipse.birt.report.designer.ui.expressions.ExpressionFilter;
import org.eclipse.birt.report.model.api.DesignElementHandle;
/**
*
*/
public class CrosstabComputedMeasureExpressionProvider extends
CrosstabExpressionProvider
{
public CrosstabComputedMeasureExpressionProvider( DesignElementHandle handle )
{
super( handle, null );
}
protected void addFilterToProvider( )
{
this.addFilter( new ExpressionFilter( ) {
public boolean select( Object parentElement, Object element )
{
if ( ExpressionFilter.CATEGORY.equals( parentElement )
&& ExpressionProvider.CURRENT_CUBE.equals( element ) )
{
return false;
}
return true;
}
} );
}
}
## Instruction:
Remove the restriction on cube element on the expression builder when
creating a derived measure.
## Code After:
/*******************************************************************************
* Copyright (c) 2004 Actuate Corporation.
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* Actuate Corporation - initial API and implementation
*******************************************************************************/
package org.eclipse.birt.report.item.crosstab.internal.ui.dialogs;
import org.eclipse.birt.report.designer.ui.dialogs.ExpressionProvider;
import org.eclipse.birt.report.designer.ui.expressions.ExpressionFilter;
import org.eclipse.birt.report.model.api.DesignElementHandle;
/**
*
*/
public class CrosstabComputedMeasureExpressionProvider extends
CrosstabExpressionProvider
{
public CrosstabComputedMeasureExpressionProvider( DesignElementHandle handle )
{
super( handle, null );
}
// protected void addFilterToProvider( )
// {
// this.addFilter( new ExpressionFilter( ) {
//
// public boolean select( Object parentElement, Object element )
// {
// if ( ExpressionFilter.CATEGORY.equals( parentElement )
// && ExpressionProvider.CURRENT_CUBE.equals( element ) )
// {
// return false;
// }
// return true;
// }
// } );
// }
}
| /*******************************************************************************
* Copyright (c) 2004 Actuate Corporation.
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* Actuate Corporation - initial API and implementation
*******************************************************************************/
package org.eclipse.birt.report.item.crosstab.internal.ui.dialogs;
import org.eclipse.birt.report.designer.ui.dialogs.ExpressionProvider;
import org.eclipse.birt.report.designer.ui.expressions.ExpressionFilter;
import org.eclipse.birt.report.model.api.DesignElementHandle;
/**
*
*/
public class CrosstabComputedMeasureExpressionProvider extends
CrosstabExpressionProvider
{
public CrosstabComputedMeasureExpressionProvider( DesignElementHandle handle )
{
super( handle, null );
}
- protected void addFilterToProvider( )
+ // protected void addFilterToProvider( )
? ++
- {
+ // {
- this.addFilter( new ExpressionFilter( ) {
+ // this.addFilter( new ExpressionFilter( ) {
? ++
-
+ //
- public boolean select( Object parentElement, Object element )
+ // public boolean select( Object parentElement, Object element )
? ++
- {
+ // {
? ++
- if ( ExpressionFilter.CATEGORY.equals( parentElement )
+ // if ( ExpressionFilter.CATEGORY.equals( parentElement )
? ++
- && ExpressionProvider.CURRENT_CUBE.equals( element ) )
+ // && ExpressionProvider.CURRENT_CUBE.equals( element ) )
? ++
- {
+ // {
? ++
- return false;
+ // return false;
? ++
- }
+ // }
? ++
- return true;
+ // return true;
? ++
- }
+ // }
? ++
- } );
+ // } );
? ++
- }
+ // }
} | 30 | 0.638298 | 15 | 15 |
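The `?` guide lines in the diff above (`? ++`) are the signature of Python's `difflib.ndiff`, which prints a guide row under a changed line with `+`/`-` characters pointing at the intraline edits. A minimal sketch reproducing that diff shape (the sample lines are illustrative, not taken from this row):

```python
import difflib

def ndiff_lines(old_text, new_text):
    # Line-level diff in the same shape as the 'diff' column here:
    # unchanged lines, '- '/'+ ' changed lines, and '? ' guide lines
    # whose '+'/'-' characters mark the intraline insertions/deletions.
    return list(difflib.ndiff(old_text.splitlines(keepends=True),
                              new_text.splitlines(keepends=True)))

diff = ndiff_lines("    return trans;\n", "//    return trans;\n")
for line in diff:
    print(line, end="")
# Prints the '- ' line, the '+ ' line, then a '? ++' guide under it.
```

The commented-out Java body above follows exactly this pattern: each `+ //` line is trailed by a `? ++` guide marking the two inserted slashes.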
d66a789b5a4ba519c7388acd626c1793d2c75638 | src/stemcell_builder/stages/system_open_vm_tools/apply.sh | src/stemcell_builder/stages/system_open_vm_tools/apply.sh |
set -e
base_dir=$(readlink -nf $(dirname $0)/../..)
source $base_dir/lib/prelude_apply.bash
source $base_dir/lib/prelude_bosh.bash
# Installation on CentOS requires v7
pkg_mgr install open-vm-tools
# The above installation adds a PAM configuration with 'nullok' values in it.
# We need to get rid of those as per stig V-38497.
sed -i -r 's/\bnullok[^ ]*//g' $chroot/etc/pam.d/vmtoolsd
|
set -e
base_dir=$(readlink -nf $(dirname $0)/../..)
source $base_dir/lib/prelude_apply.bash
source $base_dir/lib/prelude_bosh.bash
# Installation on CentOS requires v7
pkg_mgr install open-vm-tools
# open-vm-tools installs unwanted fusermount binary
run_in_chroot $chroot "rm -f /usr/bin/fusermount"
# The above installation adds a PAM configuration with 'nullok' values in it.
# We need to get rid of those as per stig V-38497.
sed -i -r 's/\bnullok[^ ]*//g' $chroot/etc/pam.d/vmtoolsd
| Remove unwanted fusermount binary when installing open-vm-tools package | Remove unwanted fusermount binary when installing open-vm-tools package
[#137042791](https://www.pivotaltracker.com/story/show/137042791)
Signed-off-by: Rob Day-Reynolds <904086f049d807ae99640dbdfde98536a5822f56@pivotal.io>
| Shell | apache-2.0 | barthy1/bosh,barthy1/bosh,barthy1/bosh,barthy1/bosh | shell | ## Code Before:
set -e
base_dir=$(readlink -nf $(dirname $0)/../..)
source $base_dir/lib/prelude_apply.bash
source $base_dir/lib/prelude_bosh.bash
# Installation on CentOS requires v7
pkg_mgr install open-vm-tools
# The above installation adds a PAM configuration with 'nullok' values in it.
# We need to get rid of those as per stig V-38497.
sed -i -r 's/\bnullok[^ ]*//g' $chroot/etc/pam.d/vmtoolsd
## Instruction:
Remove unwanted fusermount binary when installing open-vm-tools package
[#137042791](https://www.pivotaltracker.com/story/show/137042791)
Signed-off-by: Rob Day-Reynolds <904086f049d807ae99640dbdfde98536a5822f56@pivotal.io>
## Code After:
set -e
base_dir=$(readlink -nf $(dirname $0)/../..)
source $base_dir/lib/prelude_apply.bash
source $base_dir/lib/prelude_bosh.bash
# Installation on CentOS requires v7
pkg_mgr install open-vm-tools
# open-vm-tools installs unwanted fusermount binary
run_in_chroot $chroot "rm -f /usr/bin/fusermount"
# The above installation adds a PAM configuration with 'nullok' values in it.
# We need to get rid of those as per stig V-38497.
sed -i -r 's/\bnullok[^ ]*//g' $chroot/etc/pam.d/vmtoolsd
|
set -e
base_dir=$(readlink -nf $(dirname $0)/../..)
source $base_dir/lib/prelude_apply.bash
source $base_dir/lib/prelude_bosh.bash
# Installation on CentOS requires v7
pkg_mgr install open-vm-tools
+ # open-vm-tools installs unwanted fusermount binary
+ run_in_chroot $chroot "rm -f /usr/bin/fusermount"
+
# The above installation adds a PAM configuration with 'nullok' values in it.
# We need to get rid of those as per stig V-38497.
sed -i -r 's/\bnullok[^ ]*//g' $chroot/etc/pam.d/vmtoolsd | 3 | 0.230769 | 3 | 0 |
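The `sed -i -r 's/\bnullok[^ ]*//g'` call kept in both versions drops every token that begins with `nullok` (including variants such as `nullok_secure`) from the PAM file, per STIG V-38497. The same substitution expressed with Python's `re`, as an illustrative cross-check of the pattern (not part of the stemcell builder):

```python
import re

def strip_nullok(pam_line):
    # Mirror of the (GNU) sed expression: '\bnullok' anchors at a word
    # boundary, then '[^ ]*' consumes the rest of the token.
    return re.sub(r'\bnullok[^ ]*', '', pam_line)

print(strip_nullok("password sufficient pam_unix.so nullok_secure sha512"))
# password sufficient pam_unix.so  sha512   (token removed, double space left)
```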
583e3bc4ba82191e34b715485650248398afc2b6 | src/endpoints/base.py | src/endpoints/base.py | class Base:
def __init__(self, client):
self.client = client
| class Base:
def __init__(self, client):
self.client = client
def build_query(self, query):
if query is None:
query_string = ''
else:
query_string = '?'
for key, value in query.items():
if not query_string.endswith('?'):
query_string = query_string + '&'
query_string = query_string + key + '=' + value
return query_string
| Add method to build a query string to every class | Add method to build a query string to every class
| Python | mit | Vaelor/python-mattermost-driver | python | ## Code Before:
class Base:
def __init__(self, client):
self.client = client
## Instruction:
Add method to build a query string to every class
## Code After:
class Base:
def __init__(self, client):
self.client = client
def build_query(self, query):
if query is None:
query_string = ''
else:
query_string = '?'
for key, value in query.items():
if not query_string.endswith('?'):
query_string = query_string + '&'
query_string = query_string + key + '=' + value
return query_string
| class Base:
def __init__(self, client):
self.client = client
+
+ def build_query(self, query):
+ if query is None:
+ query_string = ''
+ else:
+ query_string = '?'
+ for key, value in query.items():
+ if not query_string.endswith('?'):
+ query_string = query_string + '&'
+ query_string = query_string + key + '=' + value
+ return query_string | 11 | 3.666667 | 11 | 0 |
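The new `build_query` concatenates raw `key=value` pairs without URL-escaping. The standard library's `urllib.parse.urlencode` produces the same `&`-joined string with percent-encoding included; a hedged standalone sketch of that alternative, keeping the `None -> ''` (and empty dict -> `'?'`) contract of the method above:

```python
from urllib.parse import urlencode

def build_query(query):
    # Same contract as the hand-rolled loop above, but urlencode also
    # percent-encodes keys and values, which the loop does not do.
    if query is None:
        return ''
    return '?' + urlencode(query)

print(build_query({'page': '3', 'per_page': '50'}))  # ?page=3&per_page=50
print(build_query({'terms': 'a b'}))                 # ?terms=a+b (space escaped)
```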
c3a59dfe23eaca36e21fe0fae5d71d909d0f5eec | src/Post/PostAuthorizer.php | src/Post/PostAuthorizer.php | <?php namespace Anomaly\PostsModule\Post;
use Anomaly\PostsModule\Post\Contract\PostInterface;
use Anomaly\Streams\Platform\Support\Authorizer;
use Illuminate\Auth\Guard;
/**
* Class PostAuthorizer
*
* @link http://pyrocms.com/
* @author PyroCMS, Inc. <support@pyrocms.com>
* @author Ryan Thompson <ryan@pyrocms.com>
* @package Anomaly\PostsModule\Post
*/
class PostAuthorizer
{
/**
* The authorization utility.
*
* @var Guard
*/
protected $guard;
/**
* The authorizer utility.
*
* @var Authorizer
*/
protected $authorizer;
/**
* Create a new PostAuthorizer instance.
*
* @param Guard $guard
* @param Authorizer $authorizer
*/
public function __construct(Guard $guard, Authorizer $authorizer)
{
$this->guard = $guard;
$this->authorizer = $authorizer;
}
/**
* Authorize the post.
*
* @param PostInterface $post
*/
public function authorize(PostInterface $post)
{
if (!$post->isEnabled() && !$this->guard->user()) {
abort(404);
}
if (!$this->authorizer->authorize('anomaly.module.posts::view_drafts')) {
abort(404);
}
}
}
| <?php namespace Anomaly\PostsModule\Post;
use Anomaly\PostsModule\Post\Contract\PostInterface;
use Anomaly\Streams\Platform\Support\Authorizer;
use Illuminate\Auth\Guard;
/**
* Class PostAuthorizer
*
* @link http://pyrocms.com/
* @author PyroCMS, Inc. <support@pyrocms.com>
* @author Ryan Thompson <ryan@pyrocms.com>
* @package Anomaly\PostsModule\Post
*/
class PostAuthorizer
{
/**
* The authorization utility.
*
* @var Guard
*/
protected $guard;
/**
* The authorizer utility.
*
* @var Authorizer
*/
protected $authorizer;
/**
* Create a new PostAuthorizer instance.
*
* @param Guard $guard
* @param Authorizer $authorizer
*/
public function __construct(Guard $guard, Authorizer $authorizer)
{
$this->guard = $guard;
$this->authorizer = $authorizer;
}
/**
* Authorize the post.
*
* @param PostInterface $post
*/
public function authorize(PostInterface $post)
{
if (!$post->isEnabled() && !$this->authorizer->authorize('anomaly.module.posts::view_drafts')) {
abort(404);
}
}
}
| Fix issue where posts were being authorized when not needed. | Fix issue where posts were being authorized when not needed.
| PHP | mit | anomalylabs/posts-module | php | ## Code Before:
<?php namespace Anomaly\PostsModule\Post;
use Anomaly\PostsModule\Post\Contract\PostInterface;
use Anomaly\Streams\Platform\Support\Authorizer;
use Illuminate\Auth\Guard;
/**
* Class PostAuthorizer
*
* @link http://pyrocms.com/
* @author PyroCMS, Inc. <support@pyrocms.com>
* @author Ryan Thompson <ryan@pyrocms.com>
* @package Anomaly\PostsModule\Post
*/
class PostAuthorizer
{
/**
* The authorization utility.
*
* @var Guard
*/
protected $guard;
/**
* The authorizer utility.
*
* @var Authorizer
*/
protected $authorizer;
/**
* Create a new PostAuthorizer instance.
*
* @param Guard $guard
* @param Authorizer $authorizer
*/
public function __construct(Guard $guard, Authorizer $authorizer)
{
$this->guard = $guard;
$this->authorizer = $authorizer;
}
/**
* Authorize the post.
*
* @param PostInterface $post
*/
public function authorize(PostInterface $post)
{
if (!$post->isEnabled() && !$this->guard->user()) {
abort(404);
}
if (!$this->authorizer->authorize('anomaly.module.posts::view_drafts')) {
abort(404);
}
}
}
## Instruction:
Fix issue where posts were being authorized when not needed.
## Code After:
<?php namespace Anomaly\PostsModule\Post;
use Anomaly\PostsModule\Post\Contract\PostInterface;
use Anomaly\Streams\Platform\Support\Authorizer;
use Illuminate\Auth\Guard;
/**
* Class PostAuthorizer
*
* @link http://pyrocms.com/
* @author PyroCMS, Inc. <support@pyrocms.com>
* @author Ryan Thompson <ryan@pyrocms.com>
* @package Anomaly\PostsModule\Post
*/
class PostAuthorizer
{
/**
* The authorization utility.
*
* @var Guard
*/
protected $guard;
/**
* The authorizer utility.
*
* @var Authorizer
*/
protected $authorizer;
/**
* Create a new PostAuthorizer instance.
*
* @param Guard $guard
* @param Authorizer $authorizer
*/
public function __construct(Guard $guard, Authorizer $authorizer)
{
$this->guard = $guard;
$this->authorizer = $authorizer;
}
/**
* Authorize the post.
*
* @param PostInterface $post
*/
public function authorize(PostInterface $post)
{
if (!$post->isEnabled() && !$this->authorizer->authorize('anomaly.module.posts::view_drafts')) {
abort(404);
}
}
}
| <?php namespace Anomaly\PostsModule\Post;
use Anomaly\PostsModule\Post\Contract\PostInterface;
use Anomaly\Streams\Platform\Support\Authorizer;
use Illuminate\Auth\Guard;
/**
* Class PostAuthorizer
*
* @link http://pyrocms.com/
* @author PyroCMS, Inc. <support@pyrocms.com>
* @author Ryan Thompson <ryan@pyrocms.com>
* @package Anomaly\PostsModule\Post
*/
class PostAuthorizer
{
/**
* The authorization utility.
*
* @var Guard
*/
protected $guard;
/**
* The authorizer utility.
*
* @var Authorizer
*/
protected $authorizer;
/**
* Create a new PostAuthorizer instance.
*
* @param Guard $guard
* @param Authorizer $authorizer
*/
public function __construct(Guard $guard, Authorizer $authorizer)
{
$this->guard = $guard;
$this->authorizer = $authorizer;
}
/**
* Authorize the post.
*
* @param PostInterface $post
*/
public function authorize(PostInterface $post)
{
- if (!$post->isEnabled() && !$this->guard->user()) {
- abort(404);
- }
-
- if (!$this->authorizer->authorize('anomaly.module.posts::view_drafts')) {
+ if (!$post->isEnabled() && !$this->authorizer->authorize('anomaly.module.posts::view_drafts')) {
? +++++++++++++++++++++++
abort(404);
}
}
} | 6 | 0.101695 | 1 | 5 |
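The fix above changes behavior, not just shape: previously the `view_drafts` check ran for every post, so even a published post returned 404 to a viewer without that permission. A small truth-table sketch of the two guards in Python (names are illustrative, not the module's API):

```python
def denies_before(enabled, logged_in, can_view_drafts):
    # Old guard pair: the second check fires even for enabled posts.
    if not enabled and not logged_in:
        return True
    if not can_view_drafts:
        return True
    return False

def denies_after(enabled, can_view_drafts):
    # New guard: the drafts permission only matters for disabled posts.
    return not enabled and not can_view_drafts

# Published post, viewer without the drafts permission:
print(denies_before(True, False, False))  # True  -> the reported bug
print(denies_after(True, False))          # False -> now served
```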
2fabe2fab5bdf028a79309b35362223f02e3d4a4 | src/resources/assets/js/trans.js | src/resources/assets/js/trans.js | /**
* Translate keys like laravel helper method trans
*
* @param {String} key the translation key to translate
* @param {Object} [params={}] the params to include in the translation
* @return {String} the translated string
*/
export function trans(key, params = {}) {
if (typeof Polyglot === 'undefined') {
throw new Error('Polyglot is missing.');
}
let trans = Polyglot;
key.split('.').forEach(item => trans = ((typeof trans[item] !== 'undefined') ? trans[item] : key));
for (let param in params) {
let pattern = `:${param}`;
let regex = new RegExp(pattern, "g");
trans = trans.replace(regex, params[param]);
}
return trans;
}
window.trans = trans;
| /**
* Translate keys like laravel helper method trans
*
* @param {String} key the translation key to translate
* @param {Object} [params={}] the params to include in the translation
* @return {String} the translated string
*/
var trans = function(key, params = {}) {
if (typeof Polyglot === 'undefined') {
throw new Error('Polyglot is missing.');
}
let trans = Polyglot;
key.split('.').forEach(item => trans = ((typeof trans[item] !== 'undefined') ? trans[item] : key));
for (let param in params) {
let pattern = `:${param}`;
let regex = new RegExp(pattern, "g");
trans = trans.replace(regex, params[param]);
}
return trans;
}
export { trans };
| Fix exporting issue, removed the glorious window ref. | Fix exporting issue, removed the glorious window ref.
| JavaScript | mit | hadefication/polyglot,hadefication/polyglot | javascript | ## Code Before:
/**
* Translate keys like laravel helper method trans
*
* @param {String} key the translation key to translate
* @param {Object} [params={}] the params to include in the translation
* @return {String} the translated string
*/
export function trans(key, params = {}) {
if (typeof Polyglot === 'undefined') {
throw new Error('Polyglot is missing.');
}
let trans = Polyglot;
key.split('.').forEach(item => trans = ((typeof trans[item] !== 'undefined') ? trans[item] : key));
for (let param in params) {
let pattern = `:${param}`;
let regex = new RegExp(pattern, "g");
trans = trans.replace(regex, params[param]);
}
return trans;
}
window.trans = trans;
## Instruction:
Fix exporting issue, removed the glorious window ref.
## Code After:
/**
* Translate keys like laravel helper method trans
*
* @param {String} key the translation key to translate
* @param {Object} [params={}] the params to include in the translation
* @return {String} the translated string
*/
var trans = function(key, params = {}) {
if (typeof Polyglot === 'undefined') {
throw new Error('Polyglot is missing.');
}
let trans = Polyglot;
key.split('.').forEach(item => trans = ((typeof trans[item] !== 'undefined') ? trans[item] : key));
for (let param in params) {
let pattern = `:${param}`;
let regex = new RegExp(pattern, "g");
trans = trans.replace(regex, params[param]);
}
return trans;
}
export { trans };
| /**
* Translate keys like laravel helper method trans
*
* @param {String} key the translation key to translate
* @param {Object} [params={}] the params to include in the translation
* @return {String} the translated string
*/
- export function trans(key, params = {}) {
? ^^^^ ------
+ var trans = function(key, params = {}) {
? ^^ + ++++++
if (typeof Polyglot === 'undefined') {
throw new Error('Polyglot is missing.');
}
let trans = Polyglot;
key.split('.').forEach(item => trans = ((typeof trans[item] !== 'undefined') ? trans[item] : key));
for (let param in params) {
let pattern = `:${param}`;
let regex = new RegExp(pattern, "g");
trans = trans.replace(regex, params[param]);
}
return trans;
}
- window.trans = trans;
+ export { trans }; | 4 | 0.153846 | 2 | 2 |
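Separately from the export fix, `trans` interpolates `:param` placeholders by looping one regex per key, which can misfire when one key is a prefix of another (`:name` vs `:names`). A single-pass equivalent in Python using one `re.sub` with a callback lookup (function name is illustrative):

```python
import re

def interpolate(message, params=None):
    # One pass over ':placeholder' tokens; unknown keys are left as-is,
    # and a key that is a prefix of another key cannot clobber it.
    params = params or {}
    return re.sub(r':(\w+)',
                  lambda m: str(params.get(m.group(1), m.group(0))),
                  message)

print(interpolate("Hello :name, you have :count messages",
                  {"name": "Ann", "count": 3}))
# Hello Ann, you have 3 messages
```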
1775782f100f9db9ad101a19887ba95fbc36a6e9 | backend/project_name/celerybeat_schedule.py | backend/project_name/celerybeat_schedule.py | from celery.schedules import crontab
CELERYBEAT_SCHEDULE = {
# Internal tasks
"clearsessions": {"schedule": crontab(hour=3, minute=0), "task": "users.tasks.clearsessions"},
}
| from celery.schedules import crontab # pylint:disable=import-error,no-name-in-module
CELERYBEAT_SCHEDULE = {
# Internal tasks
"clearsessions": {"schedule": crontab(hour=3, minute=0), "task": "users.tasks.clearsessions"},
}
| Disable prospector on celery.schedules import | Disable prospector on celery.schedules import
| Python | mit | vintasoftware/django-react-boilerplate,vintasoftware/django-react-boilerplate,vintasoftware/django-react-boilerplate,vintasoftware/django-react-boilerplate | python | ## Code Before:
from celery.schedules import crontab
CELERYBEAT_SCHEDULE = {
# Internal tasks
"clearsessions": {"schedule": crontab(hour=3, minute=0), "task": "users.tasks.clearsessions"},
}
## Instruction:
Disable prospector on celery.schedules import
## Code After:
from celery.schedules import crontab # pylint:disable=import-error,no-name-in-module
CELERYBEAT_SCHEDULE = {
# Internal tasks
"clearsessions": {"schedule": crontab(hour=3, minute=0), "task": "users.tasks.clearsessions"},
}
| - from celery.schedules import crontab
+ from celery.schedules import crontab # pylint:disable=import-error,no-name-in-module
CELERYBEAT_SCHEDULE = {
# Internal tasks
"clearsessions": {"schedule": crontab(hour=3, minute=0), "task": "users.tasks.clearsessions"},
} | 2 | 0.285714 | 1 | 1 |
17b2723d589572b9717577a2306a629f750f357b | bin/import_all_country_data.sh | bin/import_all_country_data.sh |
echo 'Started. See import_all_country.log for progres...'
truncate -s 0 data/logs/all.log
time find data/cache/country_data -type f -name '*_12.xlsm' ! -name '~$*' -print -exec date --iso-8601=seconds \; -exec php htdocs/index.php import jmp {} \; &> import_all_country.log
|
echo 'Started. See data/logs/import_all_country.log for progress...'
truncate -s 0 data/logs/all.log
time find data/cache/country_data -type f -name '*_12.xlsm' ! -name '~$*' -print -exec date --iso-8601=seconds \; -exec php htdocs/index.php import jmp {} \; &> data/logs/import_all_country.log
| Save log in appropriate folder | Save log in appropriate folder
| Shell | mit | Ecodev/gims,Ecodev/gims,Ecodev/gims,Ecodev/gims,Ecodev/gims | shell | ## Code Before:
echo 'Started. See import_all_country.log for progres...'
truncate -s 0 data/logs/all.log
time find data/cache/country_data -type f -name '*_12.xlsm' ! -name '~$*' -print -exec date --iso-8601=seconds \; -exec php htdocs/index.php import jmp {} \; &> import_all_country.log
## Instruction:
Save log in appropriate folder
## Code After:
echo 'Started. See data/logs/import_all_country.log for progress...'
truncate -s 0 data/logs/all.log
time find data/cache/country_data -type f -name '*_12.xlsm' ! -name '~$*' -print -exec date --iso-8601=seconds \; -exec php htdocs/index.php import jmp {} \; &> data/logs/import_all_country.log
|
- echo 'Started. See import_all_country.log for progres...'
+ echo 'Started. See data/logs/import_all_country.log for progress...'
? ++++++++++ +
truncate -s 0 data/logs/all.log
- time find data/cache/country_data -type f -name '*_12.xlsm' ! -name '~$*' -print -exec date --iso-8601=seconds \; -exec php htdocs/index.php import jmp {} \; &> import_all_country.log
+ time find data/cache/country_data -type f -name '*_12.xlsm' ! -name '~$*' -print -exec date --iso-8601=seconds \; -exec php htdocs/index.php import jmp {} \; &> data/logs/import_all_country.log
? ++++++++++
| 4 | 1 | 2 | 2 |
b34f14b4ecf41e131b3da3b7044a027be4a23603 | main.go | main.go | package main
import (
"fmt"
"log"
"time"
"autotcpdump/parser"
"autotcpdump/executer"
"autotcpdump/checker"
)
func main() {
config := parser.ConfigParser{}
if err := config.Parse("config/config.json"); err != nil {
log.Fatal(err)
}
if err := checker.CheckIfPathWritable(config.PcapLocation); err != nil {
log.Fatal(err)
}
filename := fmt.Sprintf("tcpdump_%v.pcap", time.Now().Format("20060102_150405"))
fmt.Println("directory:", config.PcapLocation, "filename:", filename)
tcpdump := executer.TcpdumpExecuter{}
if err := tcpdump.RunTcpdump(config.PcapLocation, filename, config.CommandOptions); err != nil {
log.Fatal(err)
}
if err := tcpdump.TerminateTcpdump(); err != nil {
log.Fatal(err)
}
if err := tcpdump.AdbPullPcapFile(config.PcapLocation, filename); err != nil {
log.Fatal(err)
}
if err := tcpdump.OpenWithWireshark(config.WiresharkLocation, filename); err != nil {
log.Fatal(err)
}
}
| package main
import (
"fmt"
"log"
"os"
"time"
"strings"
"autotcpdump/parser"
"autotcpdump/executer"
"autotcpdump/checker"
)
func main() {
cmdlineArgs := os.Args[1:]
config := parser.ConfigParser{}
if err := config.Parse("config/config.json"); err != nil {
log.Fatal(err)
}
if err := checker.CheckIfPathWritable(config.PcapLocation); err != nil {
log.Fatal(err)
}
filename := fmt.Sprintf("tcpdump_%v.pcap", time.Now().Format("20060102_150405"))
fmt.Println("directory:", config.PcapLocation, "filename:", filename)
commandOptions := config.CommandOptions + " " + strings.Join(cmdlineArgs, " ")
tcpdump := executer.TcpdumpExecuter{}
if err := tcpdump.RunTcpdump(config.PcapLocation, filename, commandOptions); err != nil {
log.Fatal(err)
}
if err := tcpdump.TerminateTcpdump(); err != nil {
log.Fatal(err)
}
if err := tcpdump.AdbPullPcapFile(config.PcapLocation, filename); err != nil {
log.Fatal(err)
}
if err := tcpdump.OpenWithWireshark(config.WiresharkLocation, filename); err != nil {
log.Fatal(err)
}
}
| Allow passing of command line arguments to tcpdump | Allow passing of command line arguments to tcpdump
| Go | mit | zulhilmizainuddin/autotcpdump | go | ## Code Before:
package main
import (
"fmt"
"log"
"time"
"autotcpdump/parser"
"autotcpdump/executer"
"autotcpdump/checker"
)
func main() {
config := parser.ConfigParser{}
if err := config.Parse("config/config.json"); err != nil {
log.Fatal(err)
}
if err := checker.CheckIfPathWritable(config.PcapLocation); err != nil {
log.Fatal(err)
}
filename := fmt.Sprintf("tcpdump_%v.pcap", time.Now().Format("20060102_150405"))
fmt.Println("directory:", config.PcapLocation, "filename:", filename)
tcpdump := executer.TcpdumpExecuter{}
if err := tcpdump.RunTcpdump(config.PcapLocation, filename, config.CommandOptions); err != nil {
log.Fatal(err)
}
if err := tcpdump.TerminateTcpdump(); err != nil {
log.Fatal(err)
}
if err := tcpdump.AdbPullPcapFile(config.PcapLocation, filename); err != nil {
log.Fatal(err)
}
if err := tcpdump.OpenWithWireshark(config.WiresharkLocation, filename); err != nil {
log.Fatal(err)
}
}
## Instruction:
Allow passing of command line arguments to tcpdump
## Code After:
package main
import (
"fmt"
"log"
"os"
"time"
"strings"
"autotcpdump/parser"
"autotcpdump/executer"
"autotcpdump/checker"
)
func main() {
cmdlineArgs := os.Args[1:]
config := parser.ConfigParser{}
if err := config.Parse("config/config.json"); err != nil {
log.Fatal(err)
}
if err := checker.CheckIfPathWritable(config.PcapLocation); err != nil {
log.Fatal(err)
}
filename := fmt.Sprintf("tcpdump_%v.pcap", time.Now().Format("20060102_150405"))
fmt.Println("directory:", config.PcapLocation, "filename:", filename)
commandOptions := config.CommandOptions + " " + strings.Join(cmdlineArgs, " ")
tcpdump := executer.TcpdumpExecuter{}
if err := tcpdump.RunTcpdump(config.PcapLocation, filename, commandOptions); err != nil {
log.Fatal(err)
}
if err := tcpdump.TerminateTcpdump(); err != nil {
log.Fatal(err)
}
if err := tcpdump.AdbPullPcapFile(config.PcapLocation, filename); err != nil {
log.Fatal(err)
}
if err := tcpdump.OpenWithWireshark(config.WiresharkLocation, filename); err != nil {
log.Fatal(err)
}
}
| package main
import (
"fmt"
"log"
+ "os"
"time"
+ "strings"
"autotcpdump/parser"
"autotcpdump/executer"
"autotcpdump/checker"
)
func main() {
+ cmdlineArgs := os.Args[1:]
+
config := parser.ConfigParser{}
if err := config.Parse("config/config.json"); err != nil {
log.Fatal(err)
}
if err := checker.CheckIfPathWritable(config.PcapLocation); err != nil {
log.Fatal(err)
}
filename := fmt.Sprintf("tcpdump_%v.pcap", time.Now().Format("20060102_150405"))
fmt.Println("directory:", config.PcapLocation, "filename:", filename)
+ commandOptions := config.CommandOptions + " " + strings.Join(cmdlineArgs, " ")
+
tcpdump := executer.TcpdumpExecuter{}
- if err := tcpdump.RunTcpdump(config.PcapLocation, filename, config.CommandOptions); err != nil {
? -------
+ if err := tcpdump.RunTcpdump(config.PcapLocation, filename, commandOptions); err != nil {
log.Fatal(err)
}
if err := tcpdump.TerminateTcpdump(); err != nil {
log.Fatal(err)
}
if err := tcpdump.AdbPullPcapFile(config.PcapLocation, filename); err != nil {
log.Fatal(err)
}
if err := tcpdump.OpenWithWireshark(config.WiresharkLocation, filename); err != nil {
log.Fatal(err)
}
} | 8 | 0.195122 | 7 | 1 |
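One detail of the Go change: `config.CommandOptions + " " + strings.Join(cmdlineArgs, " ")` leaves a trailing space when no extra arguments are passed. A Python sketch of the same merge that drops empty pieces (purely illustrative, not the project's code):

```python
def merge_options(config_options, extra_args):
    # Same idea as: CommandOptions + " " + strings.Join(cmdlineArgs, " "),
    # but empty pieces are filtered so a no-arg run has no trailing space.
    parts = [config_options.strip(), *extra_args]
    return " ".join(p for p in parts if p)

print(merge_options("-i any -s 0", ["port", "80"]))  # -i any -s 0 port 80
print(repr(merge_options("-i any -s 0", [])))        # '-i any -s 0'
```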
2a7d9250f69ee2477cd4ccb27ef58287ae1f9796 | lib/deploy/ansible/roles/common/tasks/main.yml | lib/deploy/ansible/roles/common/tasks/main.yml | ---
- include: aptitude.yml
- include: dotfiles.yml
when: dotfiles is defined
- include: node.yml
- include: bower.yml
- include: chruby.yml
- include: nginx.yml
- include: postgres.yml
when: db is defined
| ---
- include: aptitude.yml
- include: dotfiles.yml
when: dotfiles is defined
- include: node.yml
when: node is defined
- include: bower.yml
when: node is defined
- include: chruby.yml
when: ruby is defined
- include: nginx.yml
- include: postgres.yml
when: db is defined
| Remove non-essential tasks from default execution | Remove non-essential tasks from default execution
| YAML | mit | jrhorn424/recruiter,jrhorn424/recruiter | yaml | ## Code Before:
---
- include: aptitude.yml
- include: dotfiles.yml
when: dotfiles is defined
- include: node.yml
- include: bower.yml
- include: chruby.yml
- include: nginx.yml
- include: postgres.yml
when: db is defined
## Instruction:
Remove non-essential tasks from default execution
## Code After:
---
- include: aptitude.yml
- include: dotfiles.yml
when: dotfiles is defined
- include: node.yml
when: node is defined
- include: bower.yml
when: node is defined
- include: chruby.yml
when: ruby is defined
- include: nginx.yml
- include: postgres.yml
when: db is defined
| ---
- include: aptitude.yml
- include: dotfiles.yml
when: dotfiles is defined
- include: node.yml
+ when: node is defined
- include: bower.yml
+ when: node is defined
- include: chruby.yml
+ when: ruby is defined
- include: nginx.yml
- include: postgres.yml
when: db is defined | 3 | 0.272727 | 3 | 0 |
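The YAML change gates three includes on `when: <var> is defined`. A toy Python model of which includes a given variable set selects — the `when_defined` key here is a stand-in for Ansible's real conditional evaluation, not its actual data model:

```python
def selected_includes(tasks, defined_vars):
    # A task with no condition always runs; a conditional task runs
    # only when its variable is defined -- a toy 'is defined' check.
    return [t["include"] for t in tasks
            if "when_defined" not in t or t["when_defined"] in defined_vars]

tasks = [
    {"include": "aptitude.yml"},
    {"include": "node.yml", "when_defined": "node"},
    {"include": "bower.yml", "when_defined": "node"},
    {"include": "chruby.yml", "when_defined": "ruby"},
    {"include": "nginx.yml"},
    {"include": "postgres.yml", "when_defined": "db"},
]
print(selected_includes(tasks, {"db"}))
# ['aptitude.yml', 'nginx.yml', 'postgres.yml']
```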
abd7b983e34929bceb37005f8ab9f92e2df20cb1 | src/code/stores/nodes-store.coffee | src/code/stores/nodes-store.coffee | PaletteStore = require './palette-store'
GraphStore = require './graph-store'
nodeActions = Reflux.createActions(
[
"nodesChanged"
]
)
nodeStore = Reflux.createStore
listenables: [nodeActions]
init: ->
@nodes = []
@paletteItemHasNodes = false
@selectedPaletteItem = null
PaletteStore.store.listen @paletteChanged
GraphStore.store.listen @graphChanged
onNodesChanged: (nodes) ->
@nodes = nodes
@internalUpdate()
graphChanged: (status) ->
@nodes = status.nodes
@internalUpdate()
paletteChanged: ->
@selectedPaletteItem = PaletteStore.store.selectedPaletteItem
@internalUpdate()
internalUpdate: ->
@paletteItemHasNodes = false
return unless @selectedPaletteItem
_.each @nodes, (node) =>
if node.paletteItemIs @selectedPaletteItem
@paletteItemHasNodes = true
@notifyChange()
notifyChange: ->
data =
nodes: @nodes
paletteItemHasNodes: @paletteItemHasNodes
@trigger(data)
mixin =
getInitialState: ->
nodes: nodeStore.nodes
paletteItemHasNodes: nodeStore.paletteItemHasNodes
componentDidMount: ->
nodeStore.listen @onNodesChange
onNodesChange: (status) ->
@setState
# nodes: status.nodes
paletteItemHasNodes: status.paletteItemHasNodes
module.exports =
actions: nodeActions
store: nodeStore
mixin: mixin
| PaletteStore = require './palette-store'
GraphActions = require '../actions/graph-actions'
nodeActions = Reflux.createActions(
[
"nodesChanged"
]
)
nodeStore = Reflux.createStore
listenables: [nodeActions]
init: ->
@nodes = []
@paletteItemHasNodes = false
@selectedPaletteItem = null
PaletteStore.store.listen @paletteChanged
GraphActions.graphChanged.listen @graphChanged
onNodesChanged: (nodes) ->
@nodes = nodes
@internalUpdate()
graphChanged: (status) ->
@nodes = status.nodes
@internalUpdate()
paletteChanged: ->
@selectedPaletteItem = PaletteStore.store.selectedPaletteItem
@internalUpdate()
internalUpdate: ->
@paletteItemHasNodes = false
return unless @selectedPaletteItem
_.each @nodes, (node) =>
if node.paletteItemIs @selectedPaletteItem
@paletteItemHasNodes = true
@notifyChange()
notifyChange: ->
data =
nodes: @nodes
paletteItemHasNodes: @paletteItemHasNodes
@trigger(data)
mixin =
getInitialState: ->
nodes: nodeStore.nodes
paletteItemHasNodes: nodeStore.paletteItemHasNodes
componentDidMount: ->
nodeStore.listen @onNodesChange
onNodesChange: (status) ->
@setState
# nodes: status.nodes
paletteItemHasNodes: status.paletteItemHasNodes
module.exports =
actions: nodeActions
store: nodeStore
mixin: mixin
 | Fix regression: Deleting items from the palette doesn't replace them. | Fix regression: Deleting items from the palette doesn't replace them.
[#105213716]
https://www.pivotaltracker.com/story/show/105213716
| CoffeeScript | mit | concord-consortium/building-models,concord-consortium/building-models,concord-consortium/building-models,concord-consortium/building-models,concord-consortium/building-models | coffeescript | ## Code Before:
PaletteStore = require './palette-store'
GraphStore = require './graph-store'
nodeActions = Reflux.createActions(
[
"nodesChanged"
]
)
nodeStore = Reflux.createStore
listenables: [nodeActions]
init: ->
@nodes = []
@paletteItemHasNodes = false
@selectedPaletteItem = null
PaletteStore.store.listen @paletteChanged
GraphStore.store.listen @graphChanged
onNodesChanged: (nodes) ->
@nodes = nodes
@internalUpdate()
graphChanged: (status) ->
@nodes = status.nodes
@internalUpdate()
paletteChanged: ->
@selectedPaletteItem = PaletteStore.store.selectedPaletteItem
@internalUpdate()
internalUpdate: ->
@paletteItemHasNodes = false
return unless @selectedPaletteItem
_.each @nodes, (node) =>
if node.paletteItemIs @selectedPaletteItem
@paletteItemHasNodes = true
@notifyChange()
notifyChange: ->
data =
nodes: @nodes
paletteItemHasNodes: @paletteItemHasNodes
@trigger(data)
mixin =
getInitialState: ->
nodes: nodeStore.nodes
paletteItemHasNodes: nodeStore.paletteItemHasNodes
componentDidMount: ->
nodeStore.listen @onNodesChange
onNodesChange: (status) ->
@setState
# nodes: status.nodes
paletteItemHasNodes: status.paletteItemHasNodes
module.exports =
actions: nodeActions
store: nodeStore
mixin: mixin
## Instruction:
Fix regression: Deleting items from the palette doesn't replace them.
[#105213716]
https://www.pivotaltracker.com/story/show/105213716
## Code After:
PaletteStore = require './palette-store'
GraphActions = require '../actions/graph-actions'
nodeActions = Reflux.createActions(
[
"nodesChanged"
]
)
nodeStore = Reflux.createStore
listenables: [nodeActions]
init: ->
@nodes = []
@paletteItemHasNodes = false
@selectedPaletteItem = null
PaletteStore.store.listen @paletteChanged
GraphActions.graphChanged.listen @graphChanged
onNodesChanged: (nodes) ->
@nodes = nodes
@internalUpdate()
graphChanged: (status) ->
@nodes = status.nodes
@internalUpdate()
paletteChanged: ->
@selectedPaletteItem = PaletteStore.store.selectedPaletteItem
@internalUpdate()
internalUpdate: ->
@paletteItemHasNodes = false
return unless @selectedPaletteItem
_.each @nodes, (node) =>
if node.paletteItemIs @selectedPaletteItem
@paletteItemHasNodes = true
@notifyChange()
notifyChange: ->
data =
nodes: @nodes
paletteItemHasNodes: @paletteItemHasNodes
@trigger(data)
mixin =
getInitialState: ->
nodes: nodeStore.nodes
paletteItemHasNodes: nodeStore.paletteItemHasNodes
componentDidMount: ->
nodeStore.listen @onNodesChange
onNodesChange: (status) ->
@setState
# nodes: status.nodes
paletteItemHasNodes: status.paletteItemHasNodes
module.exports =
actions: nodeActions
store: nodeStore
mixin: mixin
| PaletteStore = require './palette-store'
- GraphStore = require './graph-store'
+ GraphActions = require '../actions/graph-actions'
nodeActions = Reflux.createActions(
[
"nodesChanged"
]
)
nodeStore = Reflux.createStore
listenables: [nodeActions]
init: ->
@nodes = []
@paletteItemHasNodes = false
@selectedPaletteItem = null
PaletteStore.store.listen @paletteChanged
- GraphStore.store.listen @graphChanged
? ^ ^^^^^^
+ GraphActions.graphChanged.listen @graphChanged
? ^^ + ++++ ++++++++ ^
onNodesChanged: (nodes) ->
@nodes = nodes
@internalUpdate()
graphChanged: (status) ->
@nodes = status.nodes
@internalUpdate()
paletteChanged: ->
@selectedPaletteItem = PaletteStore.store.selectedPaletteItem
@internalUpdate()
internalUpdate: ->
@paletteItemHasNodes = false
return unless @selectedPaletteItem
_.each @nodes, (node) =>
if node.paletteItemIs @selectedPaletteItem
@paletteItemHasNodes = true
@notifyChange()
notifyChange: ->
data =
nodes: @nodes
paletteItemHasNodes: @paletteItemHasNodes
@trigger(data)
mixin =
getInitialState: ->
nodes: nodeStore.nodes
paletteItemHasNodes: nodeStore.paletteItemHasNodes
componentDidMount: ->
nodeStore.listen @onNodesChange
onNodesChange: (status) ->
@setState
# nodes: status.nodes
paletteItemHasNodes: status.paletteItemHasNodes
module.exports =
actions: nodeActions
store: nodeStore
mixin: mixin | 4 | 0.063492 | 2 | 2 |
2337eb7d236ec4933a126b5b0e1c020902040630 | lib/stash/wrapper/stash_wrapper.rb | lib/stash/wrapper/stash_wrapper.rb | require 'xml/mapping'
require_relative 'descriptive_node'
require_relative 'st_identifier'
require_relative 'stash_administrative'
module Stash
module Wrapper
class StashWrapper
include ::XML::Mapping
root_element_name 'stash_wrapper'
object_node :identifier, 'identifier', class: Identifier
object_node :stash_administrative, 'stash_administrative', class: StashAdministrative
descriptive_node :stash_descriptive, 'stash_descriptive'
def initialize(identifier:, version:, license:, embargo:, inventory:, descriptive_elements:)
self.identifier = identifier
self.stash_administrative = StashAdministrative.new(
version: version,
license: license,
embargo: embargo,
inventory: inventory
)
self.stash_descriptive = descriptive_elements
end
def pre_save(options = { mapping: :_default })
xml = super(options)
xml.add_namespace('http://dash.cdlib.org/stash_wrapper/')
xml.add_namespace('xsi', 'http://www.w3.org/2001/XMLSchema-instance')
xml.add_attribute('xsi:schemaLocation', 'http://dash.cdlib.org/stash_wrapper/ http://dash.cdlib.org/stash_wrapper/stash_wrapper.xsd')
xml
end
end
end
end
| require 'xml/mapping'
require_relative 'descriptive_node'
require_relative 'st_identifier'
require_relative 'stash_administrative'
module Stash
module Wrapper
class StashWrapper
include ::XML::Mapping
root_element_name 'stash_wrapper'
object_node :identifier, 'identifier', class: Identifier
object_node :stash_administrative, 'stash_administrative', class: StashAdministrative
descriptive_node :stash_descriptive, 'stash_descriptive'
def initialize(identifier:, version:, license:, embargo:, inventory:, descriptive_elements:) # rubocop:disable Metrics/ParameterLists
self.identifier = identifier
self.stash_administrative = StashAdministrative.new(
version: version,
license: license,
embargo: embargo,
inventory: inventory
)
self.stash_descriptive = descriptive_elements
end
def pre_save(options = { mapping: :_default })
xml = super(options)
xml.add_namespace('http://dash.cdlib.org/stash_wrapper/')
xml.add_namespace('xsi', 'http://www.w3.org/2001/XMLSchema-instance')
xml.add_attribute('xsi:schemaLocation', 'http://dash.cdlib.org/stash_wrapper/ http://dash.cdlib.org/stash_wrapper/stash_wrapper.xsd')
xml
end
end
end
end
| Disable Rubocop warning re: long parameter lists | Disable Rubocop warning re: long parameter lists
| Ruby | mit | CDLUC3/stash,CDLUC3/stash,CDLUC3/stash,CDLUC3/stash | ruby | ## Code Before:
require 'xml/mapping'
require_relative 'descriptive_node'
require_relative 'st_identifier'
require_relative 'stash_administrative'
module Stash
module Wrapper
class StashWrapper
include ::XML::Mapping
root_element_name 'stash_wrapper'
object_node :identifier, 'identifier', class: Identifier
object_node :stash_administrative, 'stash_administrative', class: StashAdministrative
descriptive_node :stash_descriptive, 'stash_descriptive'
def initialize(identifier:, version:, license:, embargo:, inventory:, descriptive_elements:)
self.identifier = identifier
self.stash_administrative = StashAdministrative.new(
version: version,
license: license,
embargo: embargo,
inventory: inventory
)
self.stash_descriptive = descriptive_elements
end
def pre_save(options = { mapping: :_default })
xml = super(options)
xml.add_namespace('http://dash.cdlib.org/stash_wrapper/')
xml.add_namespace('xsi', 'http://www.w3.org/2001/XMLSchema-instance')
xml.add_attribute('xsi:schemaLocation', 'http://dash.cdlib.org/stash_wrapper/ http://dash.cdlib.org/stash_wrapper/stash_wrapper.xsd')
xml
end
end
end
end
## Instruction:
Disable Rubocop warning re: long parameter lists
## Code After:
require 'xml/mapping'
require_relative 'descriptive_node'
require_relative 'st_identifier'
require_relative 'stash_administrative'
module Stash
module Wrapper
class StashWrapper
include ::XML::Mapping
root_element_name 'stash_wrapper'
object_node :identifier, 'identifier', class: Identifier
object_node :stash_administrative, 'stash_administrative', class: StashAdministrative
descriptive_node :stash_descriptive, 'stash_descriptive'
def initialize(identifier:, version:, license:, embargo:, inventory:, descriptive_elements:) # rubocop:disable Metrics/ParameterLists
self.identifier = identifier
self.stash_administrative = StashAdministrative.new(
version: version,
license: license,
embargo: embargo,
inventory: inventory
)
self.stash_descriptive = descriptive_elements
end
def pre_save(options = { mapping: :_default })
xml = super(options)
xml.add_namespace('http://dash.cdlib.org/stash_wrapper/')
xml.add_namespace('xsi', 'http://www.w3.org/2001/XMLSchema-instance')
xml.add_attribute('xsi:schemaLocation', 'http://dash.cdlib.org/stash_wrapper/ http://dash.cdlib.org/stash_wrapper/stash_wrapper.xsd')
xml
end
end
end
end
| require 'xml/mapping'
require_relative 'descriptive_node'
require_relative 'st_identifier'
require_relative 'stash_administrative'
module Stash
module Wrapper
class StashWrapper
include ::XML::Mapping
root_element_name 'stash_wrapper'
object_node :identifier, 'identifier', class: Identifier
object_node :stash_administrative, 'stash_administrative', class: StashAdministrative
descriptive_node :stash_descriptive, 'stash_descriptive'
- def initialize(identifier:, version:, license:, embargo:, inventory:, descriptive_elements:)
+ def initialize(identifier:, version:, license:, embargo:, inventory:, descriptive_elements:) # rubocop:disable Metrics/ParameterLists
? +++++++++++++++++++++++++++++++++++++++++
self.identifier = identifier
self.stash_administrative = StashAdministrative.new(
version: version,
license: license,
embargo: embargo,
inventory: inventory
)
self.stash_descriptive = descriptive_elements
end
def pre_save(options = { mapping: :_default })
xml = super(options)
xml.add_namespace('http://dash.cdlib.org/stash_wrapper/')
xml.add_namespace('xsi', 'http://www.w3.org/2001/XMLSchema-instance')
xml.add_attribute('xsi:schemaLocation', 'http://dash.cdlib.org/stash_wrapper/ http://dash.cdlib.org/stash_wrapper/stash_wrapper.xsd')
xml
end
end
end
end | 2 | 0.05 | 1 | 1 |
1dce75a07db9b9c4b9cc374e4df6cf0a36f03204 | gce/ubuntu-15-10-slave.sh | gce/ubuntu-15-10-slave.sh |
apt-get update
apt-get install -y zip g++ zlib1g-dev wget git unzip python python3 curl \
openjdk-8-jdk openjdk-8-source ca-certificates-java xvfb
# Android SDK requires 32-bits libraries
dpkg --add-architecture i386
apt-get -qqy update
apt-get -qqy install libncurses5:i386 libstdc++6:i386 zlib1g:i386
apt-get -y install expect # Needed to 'yes' the SDK licenses.
# Dependencies for TensorFlow
apt-get -y install python-numpy swig python-dev python-pip libcurl3-dev
pip install mock
| dpkg --add-architecture i386
apt-get -y update
apt-get -y dist-upgrade
# Explicitly install the OpenJDK 8 before anything else to prevent
# Ubuntu from pulling in OpenJDK 9.
apt-get -y install \
openjdk-8-jdk \
openjdk-8-source
packages=(
# Bazel dependencies.
build-essential
curl
git
python
python3
unzip
wget
xvfb
zip
zlib1g-dev
# Dependencies for Android SDK.
# https://developer.android.com/studio/troubleshoot.html#linux-libraries
# https://code.google.com/p/android/issues/detail?id=207212
expect
libbz2-1.0:i386
libncurses5:i386
libstdc++6:i386
libz1:i386
# Dependencies for TensorFlow.
libcurl3-dev
python-dev
python-numpy
python-pip
python-wheel
swig
)
apt-get -y install "${packages[@]}"
pip install mock
| Install openjdk-8 before anything else on Ubuntu 15.10 slaves to prevent openjdk-7 from being pulled in and becoming the default JDK. | Install openjdk-8 before anything else on Ubuntu 15.10 slaves to prevent openjdk-7 from being pulled in and becoming the default JDK.
Change-Id: Iff22a0916c67f6860229664d6e2e7248aeb22bba
| Shell | apache-2.0 | bazelbuild/continuous-integration,bazelbuild/continuous-integration,bazelbuild/continuous-integration,bazelbuild/continuous-integration,bazelbuild/continuous-integration,bazelbuild/continuous-integration,bazelbuild/continuous-integration,bazelbuild/continuous-integration | shell | ## Code Before:
apt-get update
apt-get install -y zip g++ zlib1g-dev wget git unzip python python3 curl \
openjdk-8-jdk openjdk-8-source ca-certificates-java xvfb
# Android SDK requires 32-bits libraries
dpkg --add-architecture i386
apt-get -qqy update
apt-get -qqy install libncurses5:i386 libstdc++6:i386 zlib1g:i386
apt-get -y install expect # Needed to 'yes' the SDK licenses.
# Dependencies for TensorFlow
apt-get -y install python-numpy swig python-dev python-pip libcurl3-dev
pip install mock
## Instruction:
Install openjdk-8 before anything else on Ubuntu 15.10 slaves to prevent openjdk-7 from being pulled in and becoming the default JDK.
Change-Id: Iff22a0916c67f6860229664d6e2e7248aeb22bba
## Code After:
dpkg --add-architecture i386
apt-get -y update
apt-get -y dist-upgrade
# Explicitly install the OpenJDK 8 before anything else to prevent
# Ubuntu from pulling in OpenJDK 9.
apt-get -y install \
openjdk-8-jdk \
openjdk-8-source
packages=(
# Bazel dependencies.
build-essential
curl
git
python
python3
unzip
wget
xvfb
zip
zlib1g-dev
# Dependencies for Android SDK.
# https://developer.android.com/studio/troubleshoot.html#linux-libraries
# https://code.google.com/p/android/issues/detail?id=207212
expect
libbz2-1.0:i386
libncurses5:i386
libstdc++6:i386
libz1:i386
# Dependencies for TensorFlow.
libcurl3-dev
python-dev
python-numpy
python-pip
python-wheel
swig
)
apt-get -y install "${packages[@]}"
pip install mock
| + dpkg --add-architecture i386
+ apt-get -y update
+ apt-get -y dist-upgrade
- apt-get update
- apt-get install -y zip g++ zlib1g-dev wget git unzip python python3 curl \
- openjdk-8-jdk openjdk-8-source ca-certificates-java xvfb
+ # Explicitly install the OpenJDK 8 before anything else to prevent
+ # Ubuntu from pulling in OpenJDK 9.
+ apt-get -y install \
+ openjdk-8-jdk \
+ openjdk-8-source
- # Android SDK requires 32-bits libraries
- dpkg --add-architecture i386
- apt-get -qqy update
- apt-get -qqy install libncurses5:i386 libstdc++6:i386 zlib1g:i386
- apt-get -y install expect # Needed to 'yes' the SDK licenses.
+ packages=(
+ # Bazel dependencies.
+ build-essential
+ curl
+ git
+ python
+ python3
+ unzip
+ wget
+ xvfb
+ zip
+ zlib1g-dev
+ # Dependencies for Android SDK.
+ # https://developer.android.com/studio/troubleshoot.html#linux-libraries
+ # https://code.google.com/p/android/issues/detail?id=207212
+ expect
+ libbz2-1.0:i386
+ libncurses5:i386
+ libstdc++6:i386
+ libz1:i386
+
- # Dependencies for TensorFlow
+ # Dependencies for TensorFlow.
? ++ +
- apt-get -y install python-numpy swig python-dev python-pip libcurl3-dev
+ libcurl3-dev
+ python-dev
+ python-numpy
+ python-pip
+ python-wheel
+ swig
+ )
+ apt-get -y install "${packages[@]}"
+
pip install mock | 49 | 3.5 | 39 | 10 |
3b07bab65a344330d08df527533b6a7d26cb8858 | src/Button.php | src/Button.php | <?php namespace GeneaLabs\LaravelCasts;
class Button extends Component
{
public function __construct(string $value, array $options = [])
{
parent::__construct('', $value, $options);
if (in_array($this->framework, ['bootstrap3', 'bootstrap4'])) {
$this->classes = 'btn';
}
$this->excludedClasses = $this->excludedKeys->merge(collect([
'form-control' => '',
]));
$this->excludedKeys = $this->excludedKeys->merge(collect([
'placeholder' => '',
]));
}
protected function renderBaseControl() : string
{
return app('form')->callParentMethod(
$this->type,
$this->value,
$this->options
);
}
public function getTypeAttribute() : string
{
return 'button';
}
}
| <?php namespace GeneaLabs\LaravelCasts;
class Button extends Component
{
public function __construct(string $value, array $options = [])
{
parent::__construct('', $value, $options);
if (in_array($this->framework, ['bootstrap3', 'bootstrap4'])) {
$this->classes = 'btn';
}
if ($this->framework === 'bootstrap3') {
$options['offsetClass'] = ($options['label'] ?? '') === ''
? ' col-sm-offset-' . $this->labelWidth
: '';
}
if ($this->framework === 'bootstrap4') {
$options['offsetClass'] = ($options['label'] ?? '') === ''
? ' offset-sm-' . $this->labelWidth
: '';
}
$this->attributes['options'] = $options;
$this->excludedClasses = $this->excludedKeys->merge(collect([
'form-control' => '',
]));
$this->excludedKeys = $this->excludedKeys->merge(collect([
'placeholder' => '',
]));
}
protected function renderBaseControl() : string
{
return app('form')->callParentMethod(
$this->type,
$this->value,
$this->options
);
}
public function getTypeAttribute() : string
{
return 'button';
}
}
| Fix offset calculations for buttons without labels | Fix offset calculations for buttons without labels
| PHP | mit | GeneaLabs/laravel-casts,GeneaLabs/laravel-casts | php | ## Code Before:
<?php namespace GeneaLabs\LaravelCasts;
class Button extends Component
{
public function __construct(string $value, array $options = [])
{
parent::__construct('', $value, $options);
if (in_array($this->framework, ['bootstrap3', 'bootstrap4'])) {
$this->classes = 'btn';
}
$this->excludedClasses = $this->excludedKeys->merge(collect([
'form-control' => '',
]));
$this->excludedKeys = $this->excludedKeys->merge(collect([
'placeholder' => '',
]));
}
protected function renderBaseControl() : string
{
return app('form')->callParentMethod(
$this->type,
$this->value,
$this->options
);
}
public function getTypeAttribute() : string
{
return 'button';
}
}
## Instruction:
Fix offset calculations for buttons without labels
## Code After:
<?php namespace GeneaLabs\LaravelCasts;
class Button extends Component
{
public function __construct(string $value, array $options = [])
{
parent::__construct('', $value, $options);
if (in_array($this->framework, ['bootstrap3', 'bootstrap4'])) {
$this->classes = 'btn';
}
if ($this->framework === 'bootstrap3') {
$options['offsetClass'] = ($options['label'] ?? '') === ''
? ' col-sm-offset-' . $this->labelWidth
: '';
}
if ($this->framework === 'bootstrap4') {
$options['offsetClass'] = ($options['label'] ?? '') === ''
? ' offset-sm-' . $this->labelWidth
: '';
}
$this->attributes['options'] = $options;
$this->excludedClasses = $this->excludedKeys->merge(collect([
'form-control' => '',
]));
$this->excludedKeys = $this->excludedKeys->merge(collect([
'placeholder' => '',
]));
}
protected function renderBaseControl() : string
{
return app('form')->callParentMethod(
$this->type,
$this->value,
$this->options
);
}
public function getTypeAttribute() : string
{
return 'button';
}
}
| <?php namespace GeneaLabs\LaravelCasts;
class Button extends Component
{
public function __construct(string $value, array $options = [])
{
parent::__construct('', $value, $options);
if (in_array($this->framework, ['bootstrap3', 'bootstrap4'])) {
$this->classes = 'btn';
}
+ if ($this->framework === 'bootstrap3') {
+ $options['offsetClass'] = ($options['label'] ?? '') === ''
+ ? ' col-sm-offset-' . $this->labelWidth
+ : '';
+ }
+
+ if ($this->framework === 'bootstrap4') {
+ $options['offsetClass'] = ($options['label'] ?? '') === ''
+ ? ' offset-sm-' . $this->labelWidth
+ : '';
+ }
+
+ $this->attributes['options'] = $options;
$this->excludedClasses = $this->excludedKeys->merge(collect([
'form-control' => '',
]));
$this->excludedKeys = $this->excludedKeys->merge(collect([
'placeholder' => '',
]));
}
protected function renderBaseControl() : string
{
return app('form')->callParentMethod(
$this->type,
$this->value,
$this->options
);
}
public function getTypeAttribute() : string
{
return 'button';
}
} | 13 | 0.382353 | 13 | 0 |
290d873c4954a8bf0e174c9b500917dc5ea54a8c | roles/master/tasks/main.yml | roles/master/tasks/main.yml | ---
- name: Install Kubernetes packages (RHEL/CentOS)
when: ansible_os_family == "RedHat"
yum:
name: "{{ item }}"
update_cache: yes
with_items:
- kubectl
- kubelet
- kubernetes-cni
register: masterpackage
- name: Install Kubernetes packages (Debian/Ubuntu)
when: ansible_os_family == "Debian"
apt:
name: "{{ item }}"
update_cache: yes
with_items:
- kubectl
- kubelet
- kubernetes-cni
register: masterpackage
- name: Create and copy master config file
include: config.yml
- name: Reload systemd daemon
command: systemctl daemon-reload
- name: Enable and restart kubelet
service: name=kubelet enabled=yes state=restarted
register: kubelet_start
| ---
- name: Install Kubernetes packages (RHEL/CentOS)
when: ansible_os_family == "RedHat"
yum:
name: "{{ item }}"
update_cache: yes
with_items:
- kubectl
- kubelet
- kubernetes-cni
register: masterpackage
- name: Install Kubernetes packages (Debian/Ubuntu)
when: ansible_os_family == "Debian"
apt:
name: "{{ item }}"
update_cache: yes
with_items:
- kubectl
- kubelet
- kubernetes-cni
register: masterpackage
- name: Create and copy master config file
include: config.yml
- name: Reload systemd daemon
command: systemctl daemon-reload
- name: Enable and restart kubelet
service: name=kubelet enabled=yes state=restarted
register: kubelet_start
- name: Wait for API server start
when: kubelet_start
wait_for:
host: "127.0.0.1"
port: "8080"
connect_timeout: 5
state: started
| Add wait for apiserver start step | master: Add wait for apiserver start step
| YAML | apache-2.0 | kairen/vagrants,kairen/kube-ansible,kairen/vagrants,kairen/kube-ansible,kairen/kube-ansible,kairen/kubernetes-ceph-ansible,kairen/kubernetes-ceph-ansible | yaml | ## Code Before:
---
- name: Install Kubernetes packages (RHEL/CentOS)
when: ansible_os_family == "RedHat"
yum:
name: "{{ item }}"
update_cache: yes
with_items:
- kubectl
- kubelet
- kubernetes-cni
register: masterpackage
- name: Install Kubernetes packages (Debian/Ubuntu)
when: ansible_os_family == "Debian"
apt:
name: "{{ item }}"
update_cache: yes
with_items:
- kubectl
- kubelet
- kubernetes-cni
register: masterpackage
- name: Create and copy master config file
include: config.yml
- name: Reload systemd daemon
command: systemctl daemon-reload
- name: Enable and restart kubelet
service: name=kubelet enabled=yes state=restarted
register: kubelet_start
## Instruction:
master: Add wait for apiserver start step
## Code After:
---
- name: Install Kubernetes packages (RHEL/CentOS)
when: ansible_os_family == "RedHat"
yum:
name: "{{ item }}"
update_cache: yes
with_items:
- kubectl
- kubelet
- kubernetes-cni
register: masterpackage
- name: Install Kubernetes packages (Debian/Ubuntu)
when: ansible_os_family == "Debian"
apt:
name: "{{ item }}"
update_cache: yes
with_items:
- kubectl
- kubelet
- kubernetes-cni
register: masterpackage
- name: Create and copy master config file
include: config.yml
- name: Reload systemd daemon
command: systemctl daemon-reload
- name: Enable and restart kubelet
service: name=kubelet enabled=yes state=restarted
register: kubelet_start
- name: Wait for API server start
when: kubelet_start
wait_for:
host: "127.0.0.1"
port: "8080"
connect_timeout: 5
state: started
| ---
- name: Install Kubernetes packages (RHEL/CentOS)
when: ansible_os_family == "RedHat"
yum:
name: "{{ item }}"
update_cache: yes
with_items:
- kubectl
- kubelet
- kubernetes-cni
register: masterpackage
- name: Install Kubernetes packages (Debian/Ubuntu)
when: ansible_os_family == "Debian"
apt:
name: "{{ item }}"
update_cache: yes
with_items:
- kubectl
- kubelet
- kubernetes-cni
register: masterpackage
- name: Create and copy master config file
include: config.yml
- name: Reload systemd daemon
command: systemctl daemon-reload
- name: Enable and restart kubelet
service: name=kubelet enabled=yes state=restarted
register: kubelet_start
+
+ - name: Wait for API server start
+ when: kubelet_start
+ wait_for:
+ host: "127.0.0.1"
+ port: "8080"
+ connect_timeout: 5
+ state: started | 8 | 0.25 | 8 | 0 |
885816c694f559b5229c77bd272a7bf11210b973 | routes/requests/index.js | routes/requests/index.js | const requestsController = require('../../controllers/requests.js');
const requestHelper = require('../../helpers/requestHelper.js');
module.exports = {
'/:request/top/:size': {
get: function get(req, res, cb) {
const size = parseInt(req.params.size);
res.setHeader('Access-Control-Allow-Origin', '*');
if (typeof size == 'number' && size > 5) {
requestsController.getRowRequest(req.params.request, size, req, res);
} else {
res.json({ success: false, message: 'Invalid size parameter'});
}
},
},
'/game/:type': {
get: function get (req, res, cb) {
res.setHeader('Access-Control-Allow-Origin', '*');
requestsController.getGameRequest(req.params.type, req, res);
},
},
};
| const requestsController = require('../../controllers/requests.js');
const requestHelper = require('../../helpers/requestHelper.js');
module.exports = {
'/:request/top/:size': {
get: function get(req, res, cb) {
const size = parseInt(req.params.size);
res.setHeader('Access-Control-Allow-Origin', '*');
if (typeof size == 'number' && size > 5) {
requestsController.getRowRequest(req.params.request, size, req, res);
} else {
res.json({ success: false, message: 'Invalid size parameter'});
}
},
},
'/game/:type': {
get: function get (req, res, cb) {
res.setHeader('Access-Control-Allow-Origin', '*');
requestsController.getGameRequest(req.params.type, req, res);
},
},
'/getTodayHits': {
get: function get(req, res, cb) {
res.setHeader('Access-Control-Allow-Origin', '*');
requestsController.getTodaysRequests(req, res);
},
}
};
 | Add route for today's hits | Add route for today's hits
| JavaScript | apache-2.0 | seestats/api | javascript | ## Code Before:
const requestsController = require('../../controllers/requests.js');
const requestHelper = require('../../helpers/requestHelper.js');
module.exports = {
'/:request/top/:size': {
get: function get(req, res, cb) {
const size = parseInt(req.params.size);
res.setHeader('Access-Control-Allow-Origin', '*');
if (typeof size == 'number' && size > 5) {
requestsController.getRowRequest(req.params.request, size, req, res);
} else {
res.json({ success: false, message: 'Invalid size parameter'});
}
},
},
'/game/:type': {
get: function get (req, res, cb) {
res.setHeader('Access-Control-Allow-Origin', '*');
requestsController.getGameRequest(req.params.type, req, res);
},
},
};
## Instruction:
Add route for todays hits
## Code After:
const requestsController = require('../../controllers/requests.js');
const requestHelper = require('../../helpers/requestHelper.js');
module.exports = {
'/:request/top/:size': {
get: function get(req, res, cb) {
const size = parseInt(req.params.size);
res.setHeader('Access-Control-Allow-Origin', '*');
if (typeof size == 'number' && size > 5) {
requestsController.getRowRequest(req.params.request, size, req, res);
} else {
res.json({ success: false, message: 'Invalid size parameter'});
}
},
},
'/game/:type': {
get: function get (req, res, cb) {
res.setHeader('Access-Control-Allow-Origin', '*');
requestsController.getGameRequest(req.params.type, req, res);
},
},
'/getTodayHits': {
get: function get(req, res, cb) {
res.setHeader('Access-Control-Allow-Origin', '*');
requestsController.getTodaysRequests(req, res);
},
}
};
| const requestsController = require('../../controllers/requests.js');
const requestHelper = require('../../helpers/requestHelper.js');
module.exports = {
'/:request/top/:size': {
get: function get(req, res, cb) {
const size = parseInt(req.params.size);
res.setHeader('Access-Control-Allow-Origin', '*');
if (typeof size == 'number' && size > 5) {
requestsController.getRowRequest(req.params.request, size, req, res);
} else {
res.json({ success: false, message: 'Invalid size parameter'});
}
},
},
'/game/:type': {
get: function get (req, res, cb) {
res.setHeader('Access-Control-Allow-Origin', '*');
requestsController.getGameRequest(req.params.type, req, res);
},
},
+ '/getTodayHits': {
+ get: function get(req, res, cb) {
+ res.setHeader('Access-Control-Allow-Origin', '*');
+
+ requestsController.getTodaysRequests(req, res);
+ },
+ }
}; | 7 | 0.291667 | 7 | 0 |
0ed8255690d62eeb08700c5410149c7f2d2a58c4 | src/index.js | src/index.js | import "core-js/fn/array/from";
import "core-js/fn/object/get-own-property-symbols";
import "core-js/fn/object/keys";
import "core-js/fn/set";
import Helmet from "./Helmet";
export default Helmet;
| import "core-js/fn/array/from";
import "core-js/fn/object/get-own-property-symbols";
import "core-js/fn/set";
import Helmet from "./Helmet";
export default Helmet;
| Remove Object.keys polyfill (supported since IE9) | Remove Object.keys polyfill (supported since IE9)
| JavaScript | mit | magus/react-helmet,bountylabs/react-helmet,nfl/react-helmet | javascript | ## Code Before:
import "core-js/fn/array/from";
import "core-js/fn/object/get-own-property-symbols";
import "core-js/fn/object/keys";
import "core-js/fn/set";
import Helmet from "./Helmet";
export default Helmet;
## Instruction:
Remove Object.keys polyfill (supported since IE9)
## Code After:
import "core-js/fn/array/from";
import "core-js/fn/object/get-own-property-symbols";
import "core-js/fn/set";
import Helmet from "./Helmet";
export default Helmet;
| import "core-js/fn/array/from";
import "core-js/fn/object/get-own-property-symbols";
- import "core-js/fn/object/keys";
import "core-js/fn/set";
import Helmet from "./Helmet";
export default Helmet; | 1 | 0.142857 | 0 | 1 |
b9701cfb65c4c641231bef385dde74b8d940f901 | gimlet/backends/sql.py | gimlet/backends/sql.py | from sqlalchemy import MetaData, Table, Column, types, create_engine, select
from .base import BaseBackend
class SQLBackend(BaseBackend):
def __init__(self, url, table_name='gimlet_channels', **engine_kwargs):
meta = MetaData(bind=create_engine(url, **engine_kwargs))
self.table = Table(table_name, meta,
Column('id', types.Integer, primary_key=True),
Column('key', types.CHAR(32), nullable=False,
unique=True),
Column('data', types.LargeBinary, nullable=False))
self.table.create(checkfirst=True)
def __setitem__(self, key, value):
raw = self.serialize(value)
# Check if this key exists with a SELECT FOR UPDATE, to protect
# against a race with other concurrent writers of this key.
r = self.table.select('1', for_update=True).\
where(self.table.c.key == key).execute().fetchone()
if r:
# If it exists, use an UPDATE.
self.table.update().values(data=raw).\
where(self.table.c.key == key).execute()
else:
# Otherwise INSERT.
self.table.insert().values(key=key, data=raw).execute()
def __getitem__(self, key):
r = select([self.table.c.data], self.table.c.key == key).\
execute().fetchone()
if r:
raw = r[0]
return self.deserialize(raw)
else:
raise KeyError('key %r not found' % key)
| from sqlalchemy import MetaData, Table, Column, types, create_engine, select
from .base import BaseBackend
class SQLBackend(BaseBackend):
def __init__(self, url, table_name='gimlet_channels', **engine_kwargs):
meta = MetaData(bind=create_engine(url, **engine_kwargs))
self.table = Table(table_name, meta,
Column('id', types.Integer, primary_key=True),
Column('key', types.CHAR(32), nullable=False,
unique=True),
Column('data', types.LargeBinary, nullable=False))
self.table.create(checkfirst=True)
def __setitem__(self, key, value):
table = self.table
key_col = table.c.key
raw = self.serialize(value)
# Check if this key exists with a SELECT FOR UPDATE, to protect
# against a race with other concurrent writers of this key.
r = table.count(key_col == key, for_update=True).scalar()
if r:
# If it exists, use an UPDATE.
table.update().values(data=raw).where(key_col == key).execute()
else:
# Otherwise INSERT.
table.insert().values(key=key, data=raw).execute()
def __getitem__(self, key):
r = select([self.table.c.data], self.table.c.key == key).\
execute().fetchone()
if r:
raw = r[0]
return self.deserialize(raw)
else:
raise KeyError('key %r not found' % key)
| Use count/scalar to test if key is present in SQL back end | Use count/scalar to test if key is present in SQL back end
This is simpler than using select/execute/fetchone. Also, scalar automatically
closes the result set whereas fetchone does not. This may fix some performance
issues.
| Python | mit | storborg/gimlet | python | ## Code Before:
from sqlalchemy import MetaData, Table, Column, types, create_engine, select
from .base import BaseBackend
class SQLBackend(BaseBackend):
def __init__(self, url, table_name='gimlet_channels', **engine_kwargs):
meta = MetaData(bind=create_engine(url, **engine_kwargs))
self.table = Table(table_name, meta,
Column('id', types.Integer, primary_key=True),
Column('key', types.CHAR(32), nullable=False,
unique=True),
Column('data', types.LargeBinary, nullable=False))
self.table.create(checkfirst=True)
def __setitem__(self, key, value):
raw = self.serialize(value)
# Check if this key exists with a SELECT FOR UPDATE, to protect
# against a race with other concurrent writers of this key.
r = self.table.select('1', for_update=True).\
where(self.table.c.key == key).execute().fetchone()
if r:
# If it exists, use an UPDATE.
self.table.update().values(data=raw).\
where(self.table.c.key == key).execute()
else:
# Otherwise INSERT.
self.table.insert().values(key=key, data=raw).execute()
def __getitem__(self, key):
r = select([self.table.c.data], self.table.c.key == key).\
execute().fetchone()
if r:
raw = r[0]
return self.deserialize(raw)
else:
raise KeyError('key %r not found' % key)
## Instruction:
Use count/scalar to test if key is present in SQL back end
This is simpler than using select/execute/fetchone. Also, scalar automatically
closes the result set whereas fetchone does not. This may fix some performance
issues.
## Code After:
from sqlalchemy import MetaData, Table, Column, types, create_engine, select
from .base import BaseBackend
class SQLBackend(BaseBackend):
def __init__(self, url, table_name='gimlet_channels', **engine_kwargs):
meta = MetaData(bind=create_engine(url, **engine_kwargs))
self.table = Table(table_name, meta,
Column('id', types.Integer, primary_key=True),
Column('key', types.CHAR(32), nullable=False,
unique=True),
Column('data', types.LargeBinary, nullable=False))
self.table.create(checkfirst=True)
def __setitem__(self, key, value):
table = self.table
key_col = table.c.key
raw = self.serialize(value)
# Check if this key exists with a SELECT FOR UPDATE, to protect
# against a race with other concurrent writers of this key.
r = table.count(key_col == key, for_update=True).scalar()
if r:
# If it exists, use an UPDATE.
table.update().values(data=raw).where(key_col == key).execute()
else:
# Otherwise INSERT.
table.insert().values(key=key, data=raw).execute()
def __getitem__(self, key):
r = select([self.table.c.data], self.table.c.key == key).\
execute().fetchone()
if r:
raw = r[0]
return self.deserialize(raw)
else:
raise KeyError('key %r not found' % key)
| from sqlalchemy import MetaData, Table, Column, types, create_engine, select
from .base import BaseBackend
class SQLBackend(BaseBackend):
def __init__(self, url, table_name='gimlet_channels', **engine_kwargs):
meta = MetaData(bind=create_engine(url, **engine_kwargs))
self.table = Table(table_name, meta,
Column('id', types.Integer, primary_key=True),
Column('key', types.CHAR(32), nullable=False,
unique=True),
Column('data', types.LargeBinary, nullable=False))
self.table.create(checkfirst=True)
def __setitem__(self, key, value):
+ table = self.table
+ key_col = table.c.key
raw = self.serialize(value)
-
# Check if this key exists with a SELECT FOR UPDATE, to protect
# against a race with other concurrent writers of this key.
+ r = table.count(key_col == key, for_update=True).scalar()
- r = self.table.select('1', for_update=True).\
- where(self.table.c.key == key).execute().fetchone()
-
if r:
# If it exists, use an UPDATE.
+ table.update().values(data=raw).where(key_col == key).execute()
- self.table.update().values(data=raw).\
- where(self.table.c.key == key).execute()
else:
# Otherwise INSERT.
- self.table.insert().values(key=key, data=raw).execute()
? -----
+ table.insert().values(key=key, data=raw).execute()
def __getitem__(self, key):
r = select([self.table.c.data], self.table.c.key == key).\
execute().fetchone()
if r:
raw = r[0]
return self.deserialize(raw)
else:
raise KeyError('key %r not found' % key) | 12 | 0.3 | 5 | 7 |
2f9114896250805a1ed3da055715d8db3453b660 | app/views/home/index.html.erb | app/views/home/index.html.erb | <h1>Welcome to Something Else 2!</h1>
<p>Find me in app/views/home/index.html.erb</p>
| <h1>Welcome to ChefBuddy!</h1>
<p>Find me in app/views/home/index.html.erb</p>
| Change something else to ChefBuddy | Change something else to ChefBuddy
| HTML+ERB | mit | chef-buddy/chef-buddy-rails,chef-buddy/chef-buddy-rails,chef-buddy/chef-buddy-rails | html+erb | ## Code Before:
<h1>Welcome to Something Else 2!</h1>
<p>Find me in app/views/home/index.html.erb</p>
## Instruction:
Change something else to ChefBuddy
## Code After:
<h1>Welcome to ChefBuddy!</h1>
<p>Find me in app/views/home/index.html.erb</p>
| - <h1>Welcome to Something Else 2!</h1>
+ <h1>Welcome to ChefBuddy!</h1>
<p>Find me in app/views/home/index.html.erb</p> | 2 | 1 | 1 | 1 |
c967fdf7e18782af183b38eddb589dfce3ee761e | requirements.txt | requirements.txt | git+https://github.com/adsabs/ADSMicroserviceUtils.git@v1.1.6
requests==2.20.0
mock
httpretty
coveralls
pytest
coverage
pytest-cov
| git+https://github.com/adsabs/ADSMicroserviceUtils.git@v1.1.6
requests==2.20.0
mock
httpretty
coveralls
pytest
coverage
pytest-cov
six==1.14.0
| Update six version to be python3 compatible | Update six version to be python3 compatible
| Text | mit | adsabs/adsrex | text | ## Code Before:
git+https://github.com/adsabs/ADSMicroserviceUtils.git@v1.1.6
requests==2.20.0
mock
httpretty
coveralls
pytest
coverage
pytest-cov
## Instruction:
Update six version to be python3 compatible
## Code After:
git+https://github.com/adsabs/ADSMicroserviceUtils.git@v1.1.6
requests==2.20.0
mock
httpretty
coveralls
pytest
coverage
pytest-cov
six==1.14.0
| git+https://github.com/adsabs/ADSMicroserviceUtils.git@v1.1.6
requests==2.20.0
mock
httpretty
coveralls
pytest
coverage
pytest-cov
+ six==1.14.0 | 1 | 0.125 | 1 | 0 |
e07fa8f45e723042b967e9edd780a5d582fb761a | doc/source/install-undercloud.rst | doc/source/install-undercloud.rst | Installing the Undercloud
=========================
Make sure you are logged in as a non-root user (such as the stack user) on the
node on which you want to install the undercloud.
If you used the virt setup this node will be a VM called *instack* and you can
use the stack user.
For a baremetal setup this will be the host you selected for the Undercloud
while preparing the environment.
#. Download and execute the instack-undercloud setup script::
curl https://raw.githubusercontent.com/rdo-management/instack-undercloud/master/scripts/instack-setup-host-rhel7 | bash -x
#. Install instack-undercloud::
sudo yum install -y instack-undercloud
#. If installing on baremetal, copy in the sample answers file and edit it
to reflect your environment::
cp /usr/share/instack-undercloud/instack.answers.sample ~/instack.answers
#. Run script to install the undercloud::
instack-install-undercloud
Once the install script has run to completion, you should take note of the
files ``/root/stackrc`` and ``/root/tripleo-undercloud-passwords``. Both these
files will be needed to interact with the installed undercloud. Copy them to
the home directory for easier use later.::
sudo cp /root/tripleo-undercloud-passwords .
sudo cp /root/stackrc .
| Installing the Undercloud
=========================
Make sure you are logged in as a non-root user (such as the stack user) on the
node on which you want to install the undercloud.
If you used the virt setup this node will be a VM called *instack* and you can
use the stack user.
For a baremetal setup this will be the host you selected for the Undercloud
while preparing the environment.
#. Download and execute the instack-undercloud setup script::
curl https://raw.githubusercontent.com/rdo-management/instack-undercloud/master/scripts/instack-setup-host-rhel7 | bash -x
#. Install instack-undercloud::
sudo yum install -y instack-undercloud
#. If installing on baremetal, copy in the sample answers file and edit it
to reflect your environment::
cp /usr/share/instack-undercloud/instack.answers.sample ~/instack.answers
#. Run script to install the undercloud::
instack-install-undercloud
Once the install script has run to completion, you should take note of the
files ``/root/stackrc`` and ``/root/tripleo-undercloud-passwords``. Both these
files will be needed to interact with the installed undercloud. Copy them to
the home directory for easier use later.::
sudo cp /root/tripleo-undercloud-passwords .
sudo cp /root/stackrc .
Updating the Undercloud
-----------------------
The installed packages can be upgraded on the Undercloud.
#. Rerun the setup script to update the list of defined yum repositories::
instack-setup-host-rhel7
#. Use yum to update the installed packages. No services should need to be
restarted after updating::
sudo yum update
| Add docs for updating the undercloud | Add docs for updating the undercloud
Change-Id: I358b52ec9476bbdbd7627e2fd1b9c6c2fb9f08d2
| reStructuredText | apache-2.0 | dpkshetty/instack-undercloud,eprasad/instack-undercloud,eprasad/instack-undercloud,bcrochet/instack-undercloud,dpkshetty/instack-undercloud,dmsimard/instack-undercloud,rdo-management/instack-undercloud,dmsimard/instack-undercloud,rdo-management/instack-undercloud,bcrochet/instack-undercloud | restructuredtext | ## Code Before:
Installing the Undercloud
=========================
Make sure you are logged in as a non-root user (such as the stack user) on the
node on which you want to install the undercloud.
If you used the virt setup this node will be a VM called *instack* and you can
use the stack user.
For a baremetal setup this will be the host you selected for the Undercloud
while preparing the environment.
#. Download and execute the instack-undercloud setup script::
curl https://raw.githubusercontent.com/rdo-management/instack-undercloud/master/scripts/instack-setup-host-rhel7 | bash -x
#. Install instack-undercloud::
sudo yum install -y instack-undercloud
#. If installing on baremetal, copy in the sample answers file and edit it
to reflect your environment::
cp /usr/share/instack-undercloud/instack.answers.sample ~/instack.answers
#. Run script to install the undercloud::
instack-install-undercloud
Once the install script has run to completion, you should take note of the
files ``/root/stackrc`` and ``/root/tripleo-undercloud-passwords``. Both these
files will be needed to interact with the installed undercloud. Copy them to
the home directory for easier use later.::
sudo cp /root/tripleo-undercloud-passwords .
sudo cp /root/stackrc .
## Instruction:
Add docs for updating the undercloud
Change-Id: I358b52ec9476bbdbd7627e2fd1b9c6c2fb9f08d2
## Code After:
Installing the Undercloud
=========================
Make sure you are logged in as a non-root user (such as the stack user) on the
node on which you want to install the undercloud.
If you used the virt setup this node will be a VM called *instack* and you can
use the stack user.
For a baremetal setup this will be the host you selected for the Undercloud
while preparing the environment.
#. Download and execute the instack-undercloud setup script::
curl https://raw.githubusercontent.com/rdo-management/instack-undercloud/master/scripts/instack-setup-host-rhel7 | bash -x
#. Install instack-undercloud::
sudo yum install -y instack-undercloud
#. If installing on baremetal, copy in the sample answers file and edit it
to reflect your environment::
cp /usr/share/instack-undercloud/instack.answers.sample ~/instack.answers
#. Run script to install the undercloud::
instack-install-undercloud
Once the install script has run to completion, you should take note of the
files ``/root/stackrc`` and ``/root/tripleo-undercloud-passwords``. Both these
files will be needed to interact with the installed undercloud. Copy them to
the home directory for easier use later.::
sudo cp /root/tripleo-undercloud-passwords .
sudo cp /root/stackrc .
Updating the Undercloud
-----------------------
The installed packages can be upgraded on the Undercloud.
#. Rerun the setup script to update the list of defined yum repositories::
instack-setup-host-rhel7
#. Use yum to update the installed packages. No services should need to be
restarted after updating::
sudo yum update
| Installing the Undercloud
=========================
Make sure you are logged in as a non-root user (such as the stack user) on the
node on which you want to install the undercloud.
If you used the virt setup this node will be a VM called *instack* and you can
use the stack user.
For a baremetal setup this will be the host you selected for the Undercloud
while preparing the environment.
#. Download and execute the instack-undercloud setup script::
curl https://raw.githubusercontent.com/rdo-management/instack-undercloud/master/scripts/instack-setup-host-rhel7 | bash -x
#. Install instack-undercloud::
sudo yum install -y instack-undercloud
#. If installing on baremetal, copy in the sample answers file and edit it
to reflect your environment::
cp /usr/share/instack-undercloud/instack.answers.sample ~/instack.answers
#. Run script to install the undercloud::
instack-install-undercloud
Once the install script has run to completion, you should take note of the
files ``/root/stackrc`` and ``/root/tripleo-undercloud-passwords``. Both these
files will be needed to interact with the installed undercloud. Copy them to
the home directory for easier use later.::
sudo cp /root/tripleo-undercloud-passwords .
sudo cp /root/stackrc .
+
+
+ Updating the Undercloud
+ -----------------------
+
+ The installed packages can be upgraded on the Undercloud.
+
+ #. Rerun the setup script to update the list of defined yum repositories::
+
+ instack-setup-host-rhel7
+
+ #. Use yum to update the installed packages. No services should need to be
+ restarted after updating::
+
+ sudo yum update | 15 | 0.416667 | 15 | 0 |
ca869eb746a84d71e9190dd9246a3d61dc40e66a | fund/src/test/java/cl/fatman/capital/fund/ControllerTest.java | fund/src/test/java/cl/fatman/capital/fund/ControllerTest.java | package cl.fatman.capital.fund;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;
import java.util.ArrayList;
import java.util.List;
public class ControllerTest {
private static Controller control;
private static PersistenceData persistence;
@BeforeClass
public static void setUp() {
control = Controller.getInstance();
control.setUp();
List<FundType> ftList = new ArrayList<FundType>();
FundType type = new FundType(1, "Deuda < 90 días");
ftList.add(type);
persistence = PersistenceData.getInstance();
persistence.insertObjectList(ftList);
}
@AfterClass
public static void tearDown() {
control.tearDown();
}
@Test
public void storeFundDataTest() {
control.storeFundData();
List<?> fundList = persistence.selectAllObjects("from Fund", Fund.class);
List<?> fundRateList = persistence.selectAllObjects("from FundRate", FundRate.class);
assertThat("Fund rate list should be greater than fund list.", fundRateList.size(), greaterThan(fundList.size()));
}
} | package cl.fatman.capital.fund;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;
import java.util.ArrayList;
import java.util.List;
public class ControllerTest {
private static Controller control;
private static PersistenceData persistence;
@BeforeClass
public static void setUp() {
control = Controller.getInstance();
control.setUp();
persistence = PersistenceData.getInstance();
//Insert a initial fund type.
List<FundType> ftList = new ArrayList<FundType>();
FundType type = new FundType(1, "Deuda < 90 días");
ftList.add(type);
persistence.insertObjectList(ftList);
}
@AfterClass
public static void tearDown() {
control.tearDown();
}
@Test
public void storeFundDataTest() {
control.storeFundData();
control.storeFundData();
List<?> fundList = persistence.selectAllObjects("from Fund", Fund.class);
List<?> fundRateList = persistence.selectAllObjects("from FundRate", FundRate.class);
assertThat("Fund rate list should be greater than fund list.", fundRateList.size(), greaterThan(fundList.size()));
}
} | Update and improve the successful cases for the Controller. | Update and improve the successful cases for the Controller. | Java | mit | mparra-mpz/Capital | java | ## Code Before:
package cl.fatman.capital.fund;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;
import java.util.ArrayList;
import java.util.List;
public class ControllerTest {
private static Controller control;
private static PersistenceData persistence;
@BeforeClass
public static void setUp() {
control = Controller.getInstance();
control.setUp();
List<FundType> ftList = new ArrayList<FundType>();
FundType type = new FundType(1, "Deuda < 90 días");
ftList.add(type);
persistence = PersistenceData.getInstance();
persistence.insertObjectList(ftList);
}
@AfterClass
public static void tearDown() {
control.tearDown();
}
@Test
public void storeFundDataTest() {
control.storeFundData();
List<?> fundList = persistence.selectAllObjects("from Fund", Fund.class);
List<?> fundRateList = persistence.selectAllObjects("from FundRate", FundRate.class);
assertThat("Fund rate list should be greater than fund list.", fundRateList.size(), greaterThan(fundList.size()));
}
}
## Instruction:
Update and improve the successful cases for the Controller.
## Code After:
package cl.fatman.capital.fund;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;
import java.util.ArrayList;
import java.util.List;
public class ControllerTest {
private static Controller control;
private static PersistenceData persistence;
@BeforeClass
public static void setUp() {
control = Controller.getInstance();
control.setUp();
persistence = PersistenceData.getInstance();
//Insert a initial fund type.
List<FundType> ftList = new ArrayList<FundType>();
FundType type = new FundType(1, "Deuda < 90 días");
ftList.add(type);
persistence.insertObjectList(ftList);
}
@AfterClass
public static void tearDown() {
control.tearDown();
}
@Test
public void storeFundDataTest() {
control.storeFundData();
control.storeFundData();
List<?> fundList = persistence.selectAllObjects("from Fund", Fund.class);
List<?> fundRateList = persistence.selectAllObjects("from FundRate", FundRate.class);
assertThat("Fund rate list should be greater than fund list.", fundRateList.size(), greaterThan(fundList.size()));
}
} | package cl.fatman.capital.fund;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;
import java.util.ArrayList;
import java.util.List;
public class ControllerTest {
private static Controller control;
private static PersistenceData persistence;
@BeforeClass
public static void setUp() {
control = Controller.getInstance();
control.setUp();
+ persistence = PersistenceData.getInstance();
+ //Insert a initial fund type.
List<FundType> ftList = new ArrayList<FundType>();
FundType type = new FundType(1, "Deuda < 90 días");
ftList.add(type);
- persistence = PersistenceData.getInstance();
persistence.insertObjectList(ftList);
}
@AfterClass
public static void tearDown() {
control.tearDown();
}
@Test
public void storeFundDataTest() {
control.storeFundData();
+ control.storeFundData();
List<?> fundList = persistence.selectAllObjects("from Fund", Fund.class);
List<?> fundRateList = persistence.selectAllObjects("from FundRate", FundRate.class);
assertThat("Fund rate list should be greater than fund list.", fundRateList.size(), greaterThan(fundList.size()));
}
} | 4 | 0.097561 | 3 | 1 |
17db1c8fde489a7f1139cccf3997a928a08dee3c | circle.yml | circle.yml | dependencies:
pre:
- sudo -u postgres createuser user_testing
- createdb test
test:
override:
- eval "$(curl -sL https://swift.vapor.sh/ci-3.1)"
| dependencies:
override:
- eval "$(curl -sL https://apt.vapor.sh)"
- sudo apt-get install vapor
- sudo chmod -R a+rx /usr/
pre:
- sudo -u postgres createdb test
test:
override:
- swift build
- swift build -c release
- swift test
| Update CircleCI configuration for the new vapor scripts | Update CircleCI configuration for the new vapor scripts
| YAML | mit | vapor-community/postgresql | yaml | ## Code Before:
dependencies:
pre:
- sudo -u postgres createuser user_testing
- createdb test
test:
override:
- eval "$(curl -sL https://swift.vapor.sh/ci-3.1)"
## Instruction:
Update CircleCI configuration for the new vapor scripts
## Code After:
dependencies:
override:
- eval "$(curl -sL https://apt.vapor.sh)"
- sudo apt-get install vapor
- sudo chmod -R a+rx /usr/
pre:
- sudo -u postgres createdb test
test:
override:
- swift build
- swift build -c release
- swift test
| dependencies:
+ override:
+ - eval "$(curl -sL https://apt.vapor.sh)"
+ - sudo apt-get install vapor
+ - sudo chmod -R a+rx /usr/
+
pre:
- - sudo -u postgres createuser user_testing
? ^^^^ ----- ---
+ - sudo -u postgres createdb test
? ^^
- - createdb test
test:
override:
- - eval "$(curl -sL https://swift.vapor.sh/ci-3.1)"
+ - swift build
+ - swift build -c release
+ - swift test | 12 | 1.5 | 9 | 3 |
852ab5e476ac6e1b9a7eda578ffe77a358a84e09 | src/PhpPact/Standalone/Installer/Service/InstallerPosixPreinstalled.php | src/PhpPact/Standalone/Installer/Service/InstallerPosixPreinstalled.php | <?php
namespace PhpPact\Standalone\Installer\Service;
use PhpPact\Standalone\Installer\Model\Scripts;
class InstallerPosixPreinstalled implements InstallerInterface
{
/**
* {@inheritdoc}
*/
public function isEligible(): bool
{
return in_array(PHP_OS, ['Linux', 'Darwin']) && !empty($this->getBinaryPath('pact-provider-verifier'));
}
/**
* {@inheritdoc}
*/
public function install(string $destinationDir): Scripts
{
$scripts = new Scripts(
'pact-mock-service',
'pact-stub-service',
'pact-provider-verifier',
'pact-message',
'pact-broker'
);
return $scripts;
}
private function getBinaryPath($binary)
{
return trim(shell_exec('command -v ' . escapeshellarg($binary)));
}
}
| <?php
namespace PhpPact\Standalone\Installer\Service;
use PhpPact\Standalone\Installer\Model\Scripts;
class InstallerPosixPreinstalled implements InstallerInterface
{
/**
* {@inheritdoc}
*/
public function isEligible(): bool
{
return in_array(PHP_OS, ['Linux', 'Darwin']) && !empty($this->getBinaryPath('pact-provider-verifier'));
}
/**
* {@inheritdoc}
*/
public function install(string $destinationDir): Scripts
{
$scripts = new Scripts(
'pact-mock-service',
'pact-stub-service',
'pact-provider-verifier',
'pact-message',
'pact-broker'
);
return $scripts;
}
private function getBinaryPath(string $binary): string
{
return trim(shell_exec('command -v ' . escapeshellarg($binary)));
}
}
| Add typehints to private function | chore: Add typehints to private function
| PHP | apache-2.0 | pact-foundation/pact-php,pact-foundation/pact-php | php | ## Code Before:
<?php
namespace PhpPact\Standalone\Installer\Service;
use PhpPact\Standalone\Installer\Model\Scripts;
class InstallerPosixPreinstalled implements InstallerInterface
{
/**
* {@inheritdoc}
*/
public function isEligible(): bool
{
return in_array(PHP_OS, ['Linux', 'Darwin']) && !empty($this->getBinaryPath('pact-provider-verifier'));
}
/**
* {@inheritdoc}
*/
public function install(string $destinationDir): Scripts
{
$scripts = new Scripts(
'pact-mock-service',
'pact-stub-service',
'pact-provider-verifier',
'pact-message',
'pact-broker'
);
return $scripts;
}
private function getBinaryPath($binary)
{
return trim(shell_exec('command -v ' . escapeshellarg($binary)));
}
}
## Instruction:
chore: Add typehints to private function
## Code After:
<?php
namespace PhpPact\Standalone\Installer\Service;
use PhpPact\Standalone\Installer\Model\Scripts;
class InstallerPosixPreinstalled implements InstallerInterface
{
/**
* {@inheritdoc}
*/
public function isEligible(): bool
{
return in_array(PHP_OS, ['Linux', 'Darwin']) && !empty($this->getBinaryPath('pact-provider-verifier'));
}
/**
* {@inheritdoc}
*/
public function install(string $destinationDir): Scripts
{
$scripts = new Scripts(
'pact-mock-service',
'pact-stub-service',
'pact-provider-verifier',
'pact-message',
'pact-broker'
);
return $scripts;
}
private function getBinaryPath(string $binary): string
{
return trim(shell_exec('command -v ' . escapeshellarg($binary)));
}
}
| <?php
namespace PhpPact\Standalone\Installer\Service;
use PhpPact\Standalone\Installer\Model\Scripts;
class InstallerPosixPreinstalled implements InstallerInterface
{
/**
* {@inheritdoc}
*/
public function isEligible(): bool
{
return in_array(PHP_OS, ['Linux', 'Darwin']) && !empty($this->getBinaryPath('pact-provider-verifier'));
}
/**
* {@inheritdoc}
*/
public function install(string $destinationDir): Scripts
{
$scripts = new Scripts(
'pact-mock-service',
'pact-stub-service',
'pact-provider-verifier',
'pact-message',
'pact-broker'
);
return $scripts;
}
- private function getBinaryPath($binary)
+ private function getBinaryPath(string $binary): string
? +++++++ ++++++++
{
return trim(shell_exec('command -v ' . escapeshellarg($binary)));
}
} | 2 | 0.054054 | 1 | 1 |
96f37057d35a328b7b680e8bc1e424cd0569b5c3 | README.md | README.md | Ansible module used by OSAS to manage a set of server using ansible.
Architecture
------------
This role permit to install a trusted server that will deploy configurations
and run playbooks safely without having to distribute ssh keys to every admin.
It use for that 2 git repositories where a authorized user can push,
triggering a automated deployment using a script to deploy only where it is
needed ( ie, modify a role, only systems where the role is used will be
targeted ).
The role use 2 git repositories, one named "public" and the other named
"private". As the name imply, the public repository is meant to be public and
usually will contain everything that is not deemed private sur as password. The
private repository is used mostly for passwords, and should usually only
contains groups_vars/ or vars/ repository. A post commit hook will extract
the 2 repositories in /etc/ansible, and run ansible if needed.
For increased security, ansible is run as a non privileged user, to avoid
a attack from the managed system ( see CVE-2014-3498 ). Others ideas are
planned to be implemented.
Variables
---------
A few variables have been added to configure the role, please look at
defaults/main.yml
| Ansible module used by OSAS to manage a set of server using ansible.
[](https://travis-ci.org/OSAS/ansible-role-ansible_bastion)
Architecture
------------
This role permit to install a trusted server that will deploy configurations
and run playbooks safely without having to distribute ssh keys to every admin.
It use for that 2 git repositories where a authorized user can push,
triggering a automated deployment using a script to deploy only where it is
needed ( ie, modify a role, only systems where the role is used will be
targeted ).
The role use 2 git repositories, one named "public" and the other named
"private". As the name imply, the public repository is meant to be public and
usually will contain everything that is not deemed private sur as password. The
private repository is used mostly for passwords, and should usually only
contains groups_vars/ or vars/ repository. A post commit hook will extract
the 2 repositories in /etc/ansible, and run ansible if needed.
For increased security, ansible is run as a non privileged user, to avoid
a attack from the managed system ( see CVE-2014-3498 ). Others ideas are
planned to be implemented.
Variables
---------
A few variables have been added to configure the role, please look at
defaults/main.yml
| Add a image to signal build status | Add a image to signal build status
| Markdown | mit | mscherer/ansible-role-ansible_bastion,OSAS/ansible-role-ansible_bastion,mscherer/ansible-role-ansible_bastion,OSAS/ansible-role-ansible_bastion | markdown | ## Code Before:
Ansible module used by OSAS to manage a set of server using ansible.
Architecture
------------
This role permit to install a trusted server that will deploy configurations
and run playbooks safely without having to distribute ssh keys to every admin.
It use for that 2 git repositories where a authorized user can push,
triggering a automated deployment using a script to deploy only where it is
needed ( ie, modify a role, only systems where the role is used will be
targeted ).
The role use 2 git repositories, one named "public" and the other named
"private". As the name imply, the public repository is meant to be public and
usually will contain everything that is not deemed private sur as password. The
private repository is used mostly for passwords, and should usually only
contains groups_vars/ or vars/ repository. A post commit hook will extract
the 2 repositories in /etc/ansible, and run ansible if needed.
For increased security, ansible is run as a non privileged user, to avoid
a attack from the managed system ( see CVE-2014-3498 ). Others ideas are
planned to be implemented.
Variables
---------
A few variables have been added to configure the role, please look at
defaults/main.yml
## Instruction:
Add a image to signal build status
## Code After:
Ansible module used by OSAS to manage a set of server using ansible.
[](https://travis-ci.org/OSAS/ansible-role-ansible_bastion)
Architecture
------------
This role permit to install a trusted server that will deploy configurations
and run playbooks safely without having to distribute ssh keys to every admin.
It use for that 2 git repositories where a authorized user can push,
triggering a automated deployment using a script to deploy only where it is
needed ( ie, modify a role, only systems where the role is used will be
targeted ).
The role use 2 git repositories, one named "public" and the other named
"private". As the name imply, the public repository is meant to be public and
usually will contain everything that is not deemed private sur as password. The
private repository is used mostly for passwords, and should usually only
contains groups_vars/ or vars/ repository. A post commit hook will extract
the 2 repositories in /etc/ansible, and run ansible if needed.
For increased security, ansible is run as a non privileged user, to avoid
a attack from the managed system ( see CVE-2014-3498 ). Others ideas are
planned to be implemented.
Variables
---------
A few variables have been added to configure the role, please look at
defaults/main.yml
| Ansible module used by OSAS to manage a set of server using ansible.
+
+ [](https://travis-ci.org/OSAS/ansible-role-ansible_bastion)
Architecture
------------
This role permit to install a trusted server that will deploy configurations
and run playbooks safely without having to distribute ssh keys to every admin.
It use for that 2 git repositories where a authorized user can push,
triggering a automated deployment using a script to deploy only where it is
needed ( ie, modify a role, only systems where the role is used will be
targeted ).
The role use 2 git repositories, one named "public" and the other named
"private". As the name imply, the public repository is meant to be public and
usually will contain everything that is not deemed private sur as password. The
private repository is used mostly for passwords, and should usually only
contains groups_vars/ or vars/ repository. A post commit hook will extract
the 2 repositories in /etc/ansible, and run ansible if needed.
For increased security, ansible is run as a non privileged user, to avoid
a attack from the managed system ( see CVE-2014-3498 ). Others ideas are
planned to be implemented.
Variables
---------
A few variables have been added to configure the role, please look at
defaults/main.yml
| 2 | 0.068966 | 2 | 0 |
2e99893065abef2f751e3fb5f19a59bfee79a756 | language_model_transcription.py | language_model_transcription.py | import metasentence
import language_model
import standard_kaldi
import diff_align
import json
import os
import sys
vocab = metasentence.load_vocabulary('PROTO_LANGDIR/graphdir/words.txt')
def lm_transcribe(audio_f, text_f):
ms = metasentence.MetaSentence(open(text_f).read(), vocab)
model_dir = language_model.getLanguageModel(ms.get_kaldi_sequence())
print 'generated model', model_dir
k = standard_kaldi.Kaldi(os.path.join(model_dir, 'graphdir', 'HCLG.fst'))
trans = standard_kaldi.transcribe(k, audio_f)
ret = diff_align.align(trans["words"], ms)
return ret
if __name__=='__main__':
AUDIO_FILE = sys.argv[1]
TEXT_FILE = sys.argv[2]
OUTPUT_FILE = sys.argv[3]
ret = lm_transcribe(AUDIO_FILE, TEXT_FILE)
json.dump(ret, open(OUTPUT_FILE, 'w'), indent=2)
| import metasentence
import language_model
import standard_kaldi
import diff_align
import json
import os
import sys
vocab = metasentence.load_vocabulary('PROTO_LANGDIR/graphdir/words.txt')
def lm_transcribe(audio_f, text_f):
ms = metasentence.MetaSentence(open(text_f).read(), vocab)
model_dir = language_model.getLanguageModel(ms.get_kaldi_sequence())
print 'generated model', model_dir
k = standard_kaldi.Kaldi(os.path.join(model_dir, 'graphdir', 'HCLG.fst'))
trans = standard_kaldi.transcribe(k, audio_f)
ret = diff_align.align(trans["words"], ms)
return ret
if __name__=='__main__':
import argparse
parser = argparse.ArgumentParser(
description='Align a transcript to audio by generating a new language model.')
parser.add_argument('audio_file', help='input audio file in any format supported by FFMPEG')
parser.add_argument('text_file', help='input transcript as plain text')
parser.add_argument('output_file', type=argparse.FileType('w'),
help='output json file for aligned transcript')
parser.add_argument('--proto_langdir', default="PROTO_LANGDIR",
help='path to the prototype language directory')
args = parser.parse_args()
ret = lm_transcribe(args.audio_file, args.text_file)
json.dump(ret, args.output_file, indent=2)
| Use argparse for main python entrypoint args. | Use argparse for main python entrypoint args.
Will make it easier to add proto_langdir as a flag argument in a future commit.
| Python | mit | lowerquality/gentle,lowerquality/gentle,lowerquality/gentle,lowerquality/gentle | python | ## Code Before:
import metasentence
import language_model
import standard_kaldi
import diff_align
import json
import os
import sys
vocab = metasentence.load_vocabulary('PROTO_LANGDIR/graphdir/words.txt')
def lm_transcribe(audio_f, text_f):
ms = metasentence.MetaSentence(open(text_f).read(), vocab)
model_dir = language_model.getLanguageModel(ms.get_kaldi_sequence())
print 'generated model', model_dir
k = standard_kaldi.Kaldi(os.path.join(model_dir, 'graphdir', 'HCLG.fst'))
trans = standard_kaldi.transcribe(k, audio_f)
ret = diff_align.align(trans["words"], ms)
return ret
if __name__=='__main__':
AUDIO_FILE = sys.argv[1]
TEXT_FILE = sys.argv[2]
OUTPUT_FILE = sys.argv[3]
ret = lm_transcribe(AUDIO_FILE, TEXT_FILE)
json.dump(ret, open(OUTPUT_FILE, 'w'), indent=2)
## Instruction:
Use argparse for main python entrypoint args.
Will make it easier to add proto_langdir as a flag argument in a future commit.
## Code After:
import metasentence
import language_model
import standard_kaldi
import diff_align
import json
import os
import sys
vocab = metasentence.load_vocabulary('PROTO_LANGDIR/graphdir/words.txt')
def lm_transcribe(audio_f, text_f):
ms = metasentence.MetaSentence(open(text_f).read(), vocab)
model_dir = language_model.getLanguageModel(ms.get_kaldi_sequence())
print 'generated model', model_dir
k = standard_kaldi.Kaldi(os.path.join(model_dir, 'graphdir', 'HCLG.fst'))
trans = standard_kaldi.transcribe(k, audio_f)
ret = diff_align.align(trans["words"], ms)
return ret
if __name__=='__main__':
import argparse
parser = argparse.ArgumentParser(
description='Align a transcript to audio by generating a new language model.')
parser.add_argument('audio_file', help='input audio file in any format supported by FFMPEG')
parser.add_argument('text_file', help='input transcript as plain text')
parser.add_argument('output_file', type=argparse.FileType('w'),
help='output json file for aligned transcript')
parser.add_argument('--proto_langdir', default="PROTO_LANGDIR",
help='path to the prototype language directory')
args = parser.parse_args()
ret = lm_transcribe(args.audio_file, args.text_file)
json.dump(ret, args.output_file, indent=2)
| import metasentence
import language_model
import standard_kaldi
import diff_align
import json
import os
import sys
vocab = metasentence.load_vocabulary('PROTO_LANGDIR/graphdir/words.txt')
def lm_transcribe(audio_f, text_f):
ms = metasentence.MetaSentence(open(text_f).read(), vocab)
model_dir = language_model.getLanguageModel(ms.get_kaldi_sequence())
print 'generated model', model_dir
k = standard_kaldi.Kaldi(os.path.join(model_dir, 'graphdir', 'HCLG.fst'))
trans = standard_kaldi.transcribe(k, audio_f)
ret = diff_align.align(trans["words"], ms)
return ret
if __name__=='__main__':
+ import argparse
- AUDIO_FILE = sys.argv[1]
- TEXT_FILE = sys.argv[2]
- OUTPUT_FILE = sys.argv[3]
- ret = lm_transcribe(AUDIO_FILE, TEXT_FILE)
- json.dump(ret, open(OUTPUT_FILE, 'w'), indent=2)
+ parser = argparse.ArgumentParser(
+ description='Align a transcript to audio by generating a new language model.')
+ parser.add_argument('audio_file', help='input audio file in any format supported by FFMPEG')
+ parser.add_argument('text_file', help='input transcript as plain text')
+ parser.add_argument('output_file', type=argparse.FileType('w'),
+ help='output json file for aligned transcript')
+ parser.add_argument('--proto_langdir', default="PROTO_LANGDIR",
+ help='path to the prototype language directory')
+
+ args = parser.parse_args()
+
+ ret = lm_transcribe(args.audio_file, args.text_file)
+ json.dump(ret, args.output_file, indent=2)
| 19 | 0.527778 | 14 | 5 |
c55f3b8cc4933a4e342d165206b6b43f169def56 | .gitlab-ci.yml | .gitlab-ci.yml | image: centos
pages:
stage: deploy
script:
- yum install python34
- yum install python-pip
- pip install sphinx recommonmark
- make html
- mkdir -p public
- mv _build/html public
artifacts:
paths:
- public
only:
- pages
| image: centos
pages:
stage: deploy
script:
- yum install epel-release
- yum install python34 python34-devel
- yum install python-pip
- pip install sphinx recommonmark
- make html
- mkdir -p public
- mv _build/html public
artifacts:
paths:
- public
only:
- pages
- master
| Add epel-release package to install python34 | Add epel-release package to install python34
| YAML | mit | HadrienG/taxadb | yaml | ## Code Before:
image: centos
pages:
stage: deploy
script:
- yum install python34
- yum install python-pip
- pip install sphinx recommonmark
- make html
- mkdir -p public
- mv _build/html public
artifacts:
paths:
- public
only:
- pages
## Instruction:
Add epel-release package to install python34
## Code After:
image: centos
pages:
stage: deploy
script:
- yum install epel-release
- yum install python34 python34-devel
- yum install python-pip
- pip install sphinx recommonmark
- make html
- mkdir -p public
- mv _build/html public
artifacts:
paths:
- public
only:
- pages
- master
| image: centos
pages:
stage: deploy
script:
+ - yum install epel-release
- - yum install python34
+ - yum install python34 python34-devel
? +++++++++++++++
- yum install python-pip
- pip install sphinx recommonmark
- make html
- mkdir -p public
- mv _build/html public
artifacts:
paths:
- public
only:
- pages
+ - master | 4 | 0.25 | 3 | 1 |
0501b825efc1642a066c26819668a6e83774b776 | web/public/services/analytics.js | web/public/services/analytics.js | // analytics.js
CircleBlvd.Services.analytics = function ($window, $location) {
// Reference:
// https://developers.google.com/analytics/devguides/collection/analyticsjs/pages
var trackPage = function () {
if ($window.ga) {
var hash = $window.location.hash;
var path;
if (hash.length > 0) {
path = $location.path();
}
else {
path = $window.location.pathname;
}
$window.ga('send', 'pageview', path);
}
};
var trackEvent = function (category, action) {
if ($window.ga) {
$window.ga('send', 'event', category, action);
}
};
var trackError = function (status) {
if ($window.ga) {
$window.ga('send', 'event', 'error', status);
}
};
return {
trackPage: trackPage,
trackEvent: trackEvent,
trackError: trackError
};
};
CircleBlvd.Services.analytics.$inject = ['$window', '$location']; | // analytics.js
CircleBlvd.Services.analytics = function ($window, $location) {
// Reference:
// https://developers.google.com/analytics/devguides/collection/analyticsjs/pages
var trackPage = function () {
if ($window.ga) {
var hash = $window.location.hash;
var path;
if (hash.length > 0) {
path = $location.path();
}
else {
path = $window.location.pathname;
}
var search = $window.location.search;
if (search) {
path = path + search;
}
$window.ga('send', 'pageview', path);
}
};
var trackEvent = function (category, action) {
if ($window.ga) {
$window.ga('send', 'event', category, action);
}
};
var trackError = function (status) {
if ($window.ga) {
$window.ga('send', 'event', 'error', status);
}
};
return {
trackPage: trackPage,
trackEvent: trackEvent,
trackError: trackError
};
};
CircleBlvd.Services.analytics.$inject = ['$window', '$location']; | Add search string to page view | Analytics: Add search string to page view
As we want to see which advertising channels
are most effective, we want to record the query
parameters.
| JavaScript | bsd-2-clause | holmwell/circle-blvd,holmwell/circle-blvd,holmwell/circle-blvd,secret-project/circle-blvd,holmwell/circle-blvd,secret-project/circle-blvd,secret-project/circle-blvd,secret-project/circle-blvd | javascript | ## Code Before:
// analytics.js
CircleBlvd.Services.analytics = function ($window, $location) {
// Reference:
// https://developers.google.com/analytics/devguides/collection/analyticsjs/pages
var trackPage = function () {
if ($window.ga) {
var hash = $window.location.hash;
var path;
if (hash.length > 0) {
path = $location.path();
}
else {
path = $window.location.pathname;
}
$window.ga('send', 'pageview', path);
}
};
var trackEvent = function (category, action) {
if ($window.ga) {
$window.ga('send', 'event', category, action);
}
};
var trackError = function (status) {
if ($window.ga) {
$window.ga('send', 'event', 'error', status);
}
};
return {
trackPage: trackPage,
trackEvent: trackEvent,
trackError: trackError
};
};
CircleBlvd.Services.analytics.$inject = ['$window', '$location'];
## Instruction:
Analytics: Add search string to page view
As we want to see which advertising channels
are most effective, we want to record the query
parameters.
## Code After:
// analytics.js
CircleBlvd.Services.analytics = function ($window, $location) {
// Reference:
// https://developers.google.com/analytics/devguides/collection/analyticsjs/pages
var trackPage = function () {
if ($window.ga) {
var hash = $window.location.hash;
var path;
if (hash.length > 0) {
path = $location.path();
}
else {
path = $window.location.pathname;
}
var search = $window.location.search;
if (search) {
path = path + search;
}
$window.ga('send', 'pageview', path);
}
};
var trackEvent = function (category, action) {
if ($window.ga) {
$window.ga('send', 'event', category, action);
}
};
var trackError = function (status) {
if ($window.ga) {
$window.ga('send', 'event', 'error', status);
}
};
return {
trackPage: trackPage,
trackEvent: trackEvent,
trackError: trackError
};
};
CircleBlvd.Services.analytics.$inject = ['$window', '$location']; | // analytics.js
CircleBlvd.Services.analytics = function ($window, $location) {
// Reference:
// https://developers.google.com/analytics/devguides/collection/analyticsjs/pages
var trackPage = function () {
if ($window.ga) {
var hash = $window.location.hash;
var path;
if (hash.length > 0) {
path = $location.path();
}
else {
path = $window.location.pathname;
+ }
+
+ var search = $window.location.search;
+ if (search) {
+ path = path + search;
}
$window.ga('send', 'pageview', path);
}
};
var trackEvent = function (category, action) {
if ($window.ga) {
$window.ga('send', 'event', category, action);
}
};
var trackError = function (status) {
if ($window.ga) {
$window.ga('send', 'event', 'error', status);
}
};
return {
trackPage: trackPage,
trackEvent: trackEvent,
trackError: trackError
};
};
CircleBlvd.Services.analytics.$inject = ['$window', '$location']; | 5 | 0.125 | 5 | 0 |
1ab4424effd5f5ca1362e042d26d06911993511f | packages/landing-scripts/scripts/init.js | packages/landing-scripts/scripts/init.js | const fs = require('fs-extra');
const path = require('path');
const chalk = require('chalk');
function copyTemplate(appDir, templatePath) {
fs.copySync(templatePath, appDir);
fs.moveSync(
path.join(appDir, 'gitignore'),
path.join(appDir, '.gitignore'),
);
}
function addPackageScripts(appDir) {
const scripts = {
start: 'landing-scripts start',
build: 'landing-scripts build',
};
const packageJson = require(`${appDir}/package.json`);
packageJson.scripts = scripts;
fs.writeFileSync(
path.join(appDir, 'package.json'),
JSON.stringify(packageJson, null, 2),
);
}
function init(appDir, appName, templatePath) {
const isUsingYarn = fs.existsSync(path.join(appDir, 'yarn.lock'));
const command = isUsingYarn ? 'yarn' : 'npm';
copyTemplate(appDir, templatePath);
addPackageScripts(appDir);
console.log(chalk.green('Project ready!'));
console.log();
console.log('To start the project typing:');
console.log();
console.log(chalk.cyan('cd'), appName);
console.log(chalk.cyan(command), 'start');
}
module.exports = init;
| const fs = require('fs-extra');
const path = require('path');
const chalk = require('chalk');
function copyTemplate(appDir, templatePath) {
fs.copySync(templatePath, appDir);
fs.removeSync('.gitignore');
fs.moveSync(
path.join(appDir, 'gitignore'),
path.join(appDir, '.gitignore'),
);
}
function addPackageScripts(appDir) {
const scripts = {
start: 'landing-scripts start',
build: 'landing-scripts build',
};
const packageJson = require(`${appDir}/package.json`);
packageJson.scripts = scripts;
fs.writeFileSync(
path.join(appDir, 'package.json'),
JSON.stringify(packageJson, null, 2),
);
}
function init(appDir, appName, templatePath) {
const isUsingYarn = fs.existsSync(path.join(appDir, 'yarn.lock'));
const command = isUsingYarn ? 'yarn' : 'npm';
copyTemplate(appDir, templatePath);
addPackageScripts(appDir);
console.log(chalk.green('Project ready!'));
console.log();
console.log('To start the project typing:');
console.log();
console.log(chalk.cyan('cd'), appName);
console.log(chalk.cyan(command), 'start');
}
module.exports = init;
| Remove .gitignore on bootstrap the project | Remove .gitignore on bootstrap the project
| JavaScript | mit | kevindantas/create-landing-page,kevindantas/create-landing-page,kevindantas/create-landing-page | javascript | ## Code Before:
const fs = require('fs-extra');
const path = require('path');
const chalk = require('chalk');
function copyTemplate(appDir, templatePath) {
fs.copySync(templatePath, appDir);
fs.moveSync(
path.join(appDir, 'gitignore'),
path.join(appDir, '.gitignore'),
);
}
function addPackageScripts(appDir) {
const scripts = {
start: 'landing-scripts start',
build: 'landing-scripts build',
};
const packageJson = require(`${appDir}/package.json`);
packageJson.scripts = scripts;
fs.writeFileSync(
path.join(appDir, 'package.json'),
JSON.stringify(packageJson, null, 2),
);
}
function init(appDir, appName, templatePath) {
const isUsingYarn = fs.existsSync(path.join(appDir, 'yarn.lock'));
const command = isUsingYarn ? 'yarn' : 'npm';
copyTemplate(appDir, templatePath);
addPackageScripts(appDir);
console.log(chalk.green('Project ready!'));
console.log();
console.log('To start the project typing:');
console.log();
console.log(chalk.cyan('cd'), appName);
console.log(chalk.cyan(command), 'start');
}
module.exports = init;
## Instruction:
Remove .gitignore on bootstrap the project
## Code After:
const fs = require('fs-extra');
const path = require('path');
const chalk = require('chalk');
function copyTemplate(appDir, templatePath) {
fs.copySync(templatePath, appDir);
fs.removeSync('.gitignore');
fs.moveSync(
path.join(appDir, 'gitignore'),
path.join(appDir, '.gitignore'),
);
}
function addPackageScripts(appDir) {
const scripts = {
start: 'landing-scripts start',
build: 'landing-scripts build',
};
const packageJson = require(`${appDir}/package.json`);
packageJson.scripts = scripts;
fs.writeFileSync(
path.join(appDir, 'package.json'),
JSON.stringify(packageJson, null, 2),
);
}
function init(appDir, appName, templatePath) {
const isUsingYarn = fs.existsSync(path.join(appDir, 'yarn.lock'));
const command = isUsingYarn ? 'yarn' : 'npm';
copyTemplate(appDir, templatePath);
addPackageScripts(appDir);
console.log(chalk.green('Project ready!'));
console.log();
console.log('To start the project typing:');
console.log();
console.log(chalk.cyan('cd'), appName);
console.log(chalk.cyan(command), 'start');
}
module.exports = init;
| const fs = require('fs-extra');
const path = require('path');
const chalk = require('chalk');
function copyTemplate(appDir, templatePath) {
fs.copySync(templatePath, appDir);
+ fs.removeSync('.gitignore');
fs.moveSync(
path.join(appDir, 'gitignore'),
path.join(appDir, '.gitignore'),
);
}
function addPackageScripts(appDir) {
const scripts = {
start: 'landing-scripts start',
build: 'landing-scripts build',
};
const packageJson = require(`${appDir}/package.json`);
packageJson.scripts = scripts;
fs.writeFileSync(
path.join(appDir, 'package.json'),
JSON.stringify(packageJson, null, 2),
);
}
function init(appDir, appName, templatePath) {
const isUsingYarn = fs.existsSync(path.join(appDir, 'yarn.lock'));
const command = isUsingYarn ? 'yarn' : 'npm';
copyTemplate(appDir, templatePath);
addPackageScripts(appDir);
console.log(chalk.green('Project ready!'));
console.log();
console.log('To start the project typing:');
console.log();
console.log(chalk.cyan('cd'), appName);
console.log(chalk.cyan(command), 'start');
}
module.exports = init; | 1 | 0.02439 | 1 | 0 |
b482b9668f75149da9990d06d7713e99b3cda5f1 | r/cloudwatch_log_group.html.markdown | r/cloudwatch_log_group.html.markdown | ---
layout: "aws"
page_title: "AWS: aws_cloudwatch_log_group"
sidebar_current: "docs-aws-resource-cloudwatch-log-group"
description: |-
Provides a CloudWatch Log Group resource.
---
# aws\_cloudwatch\_log\_group
Provides a CloudWatch Log Group resource.
## Example Usage
```
resource "aws_cloudwatch_log_group" "yada" {
name = "Yada"
}
```
## Argument Reference
The following arguments are supported:
* `name` - (Required) The name of the log group
* `retention_in_days` - (Optional) Specifies the number of days
you want to retain log events in the specified log group.
## Attributes Reference
The following attributes are exported:
* `arn` - The Amazon Resource Name (ARN) specifying the log group.
## Import
Cloudwatch Log Groups can be imported using the `name`, e.g.
```
$ terraform import aws_cloudwatch_log_group.test_group yada
``` | ---
layout: "aws"
page_title: "AWS: aws_cloudwatch_log_group"
sidebar_current: "docs-aws-resource-cloudwatch-log-group"
description: |-
Provides a CloudWatch Log Group resource.
---
# aws\_cloudwatch\_log\_group
Provides a CloudWatch Log Group resource.
## Example Usage
```
resource "aws_cloudwatch_log_group" "yada" {
name = "Yada"
tags {
Environment = "production"
Application = "serviceA"
}
}
```
## Argument Reference
The following arguments are supported:
* `name` - (Required) The name of the log group
* `retention_in_days` - (Optional) Specifies the number of days
you want to retain log events in the specified log group.
* `tags` - (Optional) A mapping of tags to assign to the resource.
## Attributes Reference
The following attributes are exported:
* `arn` - The Amazon Resource Name (ARN) specifying the log group.
## Import
Cloudwatch Log Groups can be imported using the `name`, e.g.
```
$ terraform import aws_cloudwatch_log_group.test_group yada
``` | Add Support for Tags to aws_cloudwatch_log_group | provider/aws: Add Support for Tags to aws_cloudwatch_log_group
``````
make testacc TEST=./builtin/providers/aws TESTARGS='-run=TestAccAWSCloudWatchLogGroup_' 2 ↵ ✭
==> Checking that code complies with gofmt requirements...
go generate $(go list ./... | grep -v /terraform/vendor/)
2017/01/12 16:22:07 Generated command/internal_plugin_list.go
TF_ACC=1 go test ./builtin/providers/aws -v -run=TestAccAWSCloudWatchLogGroup_ -timeout 120m
=== RUN TestAccAWSCloudWatchLogGroup_importBasic
--- PASS: TestAccAWSCloudWatchLogGroup_importBasic (44.20s)
=== RUN TestAccAWSCloudWatchLogGroup_basic
--- PASS: TestAccAWSCloudWatchLogGroup_basic (38.08s)
=== RUN TestAccAWSCloudWatchLogGroup_retentionPolicy
--- PASS: TestAccAWSCloudWatchLogGroup_retentionPolicy (55.85s)
=== RUN TestAccAWSCloudWatchLogGroup_multiple
--- PASS: TestAccAWSCloudWatchLogGroup_multiple (20.68s)
=== RUN TestAccAWSCloudWatchLogGroup_disappears
--- PASS: TestAccAWSCloudWatchLogGroup_disappears (21.48s)
=== RUN TestAccAWSCloudWatchLogGroup_tagging
--- PASS: TestAccAWSCloudWatchLogGroup_tagging (39.09s)
ok
PASS github.com/hashicorp/terraform/builtin/providers/aws 219.411s
```
| Markdown | mpl-2.0 | charles-at-geospock/terraform-provider-aws,ashinohara/terraform-provider-aws,ClementCunin/terraform-provider-aws,Ninir/terraform-provider-aws,Ninir/terraform-provider-aws,ClementCunin/terraform-provider-aws,kjmkznr/terraform-provider-aws,terraform-providers/terraform-provider-aws,ashinohara/terraform-provider-aws,charles-at-geospock/terraform-provider-aws,Ninir/terraform-provider-aws,kjmkznr/terraform-provider-aws,terraform-providers/terraform-provider-aws,charles-at-geospock/terraform-provider-aws,terraform-providers/terraform-provider-aws,ashinohara/terraform-provider-aws,terraform-providers/terraform-provider-aws,nbaztec/terraform-provider-aws,ashinohara/terraform-provider-aws,kjmkznr/terraform-provider-aws,nbaztec/terraform-provider-aws,charles-at-geospock/terraform-provider-aws,ClementCunin/terraform-provider-aws,ClementCunin/terraform-provider-aws,nbaztec/terraform-provider-aws,Ninir/terraform-provider-aws,kjmkznr/terraform-provider-aws,nbaztec/terraform-provider-aws | markdown | ## Code Before:
---
layout: "aws"
page_title: "AWS: aws_cloudwatch_log_group"
sidebar_current: "docs-aws-resource-cloudwatch-log-group"
description: |-
Provides a CloudWatch Log Group resource.
---
# aws\_cloudwatch\_log\_group
Provides a CloudWatch Log Group resource.
## Example Usage
```
resource "aws_cloudwatch_log_group" "yada" {
name = "Yada"
}
```
## Argument Reference
The following arguments are supported:
* `name` - (Required) The name of the log group
* `retention_in_days` - (Optional) Specifies the number of days
you want to retain log events in the specified log group.
## Attributes Reference
The following attributes are exported:
* `arn` - The Amazon Resource Name (ARN) specifying the log group.
## Import
Cloudwatch Log Groups can be imported using the `name`, e.g.
```
$ terraform import aws_cloudwatch_log_group.test_group yada
```
## Instruction:
provider/aws: Add Support for Tags to aws_cloudwatch_log_group
``````
make testacc TEST=./builtin/providers/aws TESTARGS='-run=TestAccAWSCloudWatchLogGroup_' 2 ↵ ✭
==> Checking that code complies with gofmt requirements...
go generate $(go list ./... | grep -v /terraform/vendor/)
2017/01/12 16:22:07 Generated command/internal_plugin_list.go
TF_ACC=1 go test ./builtin/providers/aws -v -run=TestAccAWSCloudWatchLogGroup_ -timeout 120m
=== RUN TestAccAWSCloudWatchLogGroup_importBasic
--- PASS: TestAccAWSCloudWatchLogGroup_importBasic (44.20s)
=== RUN TestAccAWSCloudWatchLogGroup_basic
--- PASS: TestAccAWSCloudWatchLogGroup_basic (38.08s)
=== RUN TestAccAWSCloudWatchLogGroup_retentionPolicy
--- PASS: TestAccAWSCloudWatchLogGroup_retentionPolicy (55.85s)
=== RUN TestAccAWSCloudWatchLogGroup_multiple
--- PASS: TestAccAWSCloudWatchLogGroup_multiple (20.68s)
=== RUN TestAccAWSCloudWatchLogGroup_disappears
--- PASS: TestAccAWSCloudWatchLogGroup_disappears (21.48s)
=== RUN TestAccAWSCloudWatchLogGroup_tagging
--- PASS: TestAccAWSCloudWatchLogGroup_tagging (39.09s)
ok
PASS github.com/hashicorp/terraform/builtin/providers/aws 219.411s
```
## Code After:
---
layout: "aws"
page_title: "AWS: aws_cloudwatch_log_group"
sidebar_current: "docs-aws-resource-cloudwatch-log-group"
description: |-
Provides a CloudWatch Log Group resource.
---
# aws\_cloudwatch\_log\_group
Provides a CloudWatch Log Group resource.
## Example Usage
```
resource "aws_cloudwatch_log_group" "yada" {
name = "Yada"
tags {
Environment = "production"
Application = "serviceA"
}
}
```
## Argument Reference
The following arguments are supported:
* `name` - (Required) The name of the log group
* `retention_in_days` - (Optional) Specifies the number of days
you want to retain log events in the specified log group.
* `tags` - (Optional) A mapping of tags to assign to the resource.
## Attributes Reference
The following attributes are exported:
* `arn` - The Amazon Resource Name (ARN) specifying the log group.
## Import
Cloudwatch Log Groups can be imported using the `name`, e.g.
```
$ terraform import aws_cloudwatch_log_group.test_group yada
``` | ---
layout: "aws"
page_title: "AWS: aws_cloudwatch_log_group"
sidebar_current: "docs-aws-resource-cloudwatch-log-group"
description: |-
Provides a CloudWatch Log Group resource.
---
# aws\_cloudwatch\_log\_group
Provides a CloudWatch Log Group resource.
## Example Usage
```
resource "aws_cloudwatch_log_group" "yada" {
name = "Yada"
+
+ tags {
+ Environment = "production"
+ Application = "serviceA"
+ }
}
```
## Argument Reference
The following arguments are supported:
* `name` - (Required) The name of the log group
* `retention_in_days` - (Optional) Specifies the number of days
you want to retain log events in the specified log group.
+ * `tags` - (Optional) A mapping of tags to assign to the resource.
## Attributes Reference
The following attributes are exported:
* `arn` - The Amazon Resource Name (ARN) specifying the log group.
## Import
Cloudwatch Log Groups can be imported using the `name`, e.g.
```
$ terraform import aws_cloudwatch_log_group.test_group yada
``` | 6 | 0.142857 | 6 | 0 |
88cd76cee49bb4e43c033a260fc004ff6bb23810 | _config.yml | _config.yml | permalink: /projects/:title
# My variables
name: sgelb
description: projects and stuff
github: https://sgelb.github.com
include: ["_pages", "demo"]
# Sass
sass:
style: :compressed
sass_dir: _sass
# Markdown
markdown: kramdown
kramdown:
input: GFM
hard_wrap: false
highlighter: rouge
| permalink: /projects/:title
# My variables
name: sgelb
description: projects and stuff
github: https://github.com/sgelb
include: ["_pages", "demo"]
# Sass
sass:
style: :compressed
sass_dir: _sass
# Markdown
markdown: kramdown
kramdown:
input: GFM
hard_wrap: false
highlighter: rouge
| Fix typo in github base url | Fix typo in github base url
| YAML | mit | sgelb/sgelb.github.io,sgelb/sgelb.github.io | yaml | ## Code Before:
permalink: /projects/:title
# My variables
name: sgelb
description: projects and stuff
github: https://sgelb.github.com
include: ["_pages", "demo"]
# Sass
sass:
style: :compressed
sass_dir: _sass
# Markdown
markdown: kramdown
kramdown:
input: GFM
hard_wrap: false
highlighter: rouge
## Instruction:
Fix typo in github base url
## Code After:
permalink: /projects/:title
# My variables
name: sgelb
description: projects and stuff
github: https://github.com/sgelb
include: ["_pages", "demo"]
# Sass
sass:
style: :compressed
sass_dir: _sass
# Markdown
markdown: kramdown
kramdown:
input: GFM
hard_wrap: false
highlighter: rouge
| permalink: /projects/:title
# My variables
name: sgelb
description: projects and stuff
- github: https://sgelb.github.com
? ------
+ github: https://github.com/sgelb
? ++++++
include: ["_pages", "demo"]
# Sass
sass:
style: :compressed
sass_dir: _sass
# Markdown
markdown: kramdown
kramdown:
input: GFM
hard_wrap: false
highlighter: rouge | 2 | 0.083333 | 1 | 1 |
dbe00c57f6a88f8cbbf25829a542c130446db1b2 | Cargo.toml | Cargo.toml | [package]
name = "hubcaps"
version = "0.2.8"
authors = ["softprops <d.tangren@gmail.com>"]
description = "Rust interface for Github"
documentation = "https://softprops.github.io/hubcaps"
homepage = "https://github.com/softprops/hubcaps"
repository = "https://github.com/softprops/hubcaps"
keywords = ["hyper", "github"]
license = "MIT"
build = "build.rs"
[dev-dependencies]
env_logger = "0.3"
[build-dependencies]
serde_codegen = "0.8"
glob = "0.2.11"
[dependencies]
log = "0.3"
hyper = "0.9"
url = "1.2"
serde = "0.8"
serde_json = "0.8"
serializable_enum = "0.3"
| [package]
name = "hubcaps"
version = "0.2.8"
authors = ["softprops <d.tangren@gmail.com>"]
description = "Rust interface for Github"
documentation = "https://softprops.github.io/hubcaps"
homepage = "https://github.com/softprops/hubcaps"
repository = "https://github.com/softprops/hubcaps"
keywords = ["hyper", "github"]
license = "MIT"
build = "build.rs"
[dev-dependencies]
env_logger = "0.3"
[build-dependencies]
serde_codegen = "0.8"
glob = "0.2.11"
[dependencies]
log = "0.3"
hyper = ">= 0.9.0, <0.11.0"
url = "1.2"
serde = "0.8"
serde_json = "0.8"
serializable_enum = "0.3"
| Bump Hyper to support 0.10 as well. | Bump Hyper to support 0.10 as well.
| TOML | mit | softprops/hubcaps,softprops/gust | toml | ## Code Before:
[package]
name = "hubcaps"
version = "0.2.8"
authors = ["softprops <d.tangren@gmail.com>"]
description = "Rust interface for Github"
documentation = "https://softprops.github.io/hubcaps"
homepage = "https://github.com/softprops/hubcaps"
repository = "https://github.com/softprops/hubcaps"
keywords = ["hyper", "github"]
license = "MIT"
build = "build.rs"
[dev-dependencies]
env_logger = "0.3"
[build-dependencies]
serde_codegen = "0.8"
glob = "0.2.11"
[dependencies]
log = "0.3"
hyper = "0.9"
url = "1.2"
serde = "0.8"
serde_json = "0.8"
serializable_enum = "0.3"
## Instruction:
Bump Hyper to support 0.10 as well.
## Code After:
[package]
name = "hubcaps"
version = "0.2.8"
authors = ["softprops <d.tangren@gmail.com>"]
description = "Rust interface for Github"
documentation = "https://softprops.github.io/hubcaps"
homepage = "https://github.com/softprops/hubcaps"
repository = "https://github.com/softprops/hubcaps"
keywords = ["hyper", "github"]
license = "MIT"
build = "build.rs"
[dev-dependencies]
env_logger = "0.3"
[build-dependencies]
serde_codegen = "0.8"
glob = "0.2.11"
[dependencies]
log = "0.3"
hyper = ">= 0.9.0, <0.11.0"
url = "1.2"
serde = "0.8"
serde_json = "0.8"
serializable_enum = "0.3"
| [package]
name = "hubcaps"
version = "0.2.8"
authors = ["softprops <d.tangren@gmail.com>"]
description = "Rust interface for Github"
documentation = "https://softprops.github.io/hubcaps"
homepage = "https://github.com/softprops/hubcaps"
repository = "https://github.com/softprops/hubcaps"
keywords = ["hyper", "github"]
license = "MIT"
build = "build.rs"
[dev-dependencies]
env_logger = "0.3"
[build-dependencies]
serde_codegen = "0.8"
glob = "0.2.11"
[dependencies]
log = "0.3"
- hyper = "0.9"
+ hyper = ">= 0.9.0, <0.11.0"
url = "1.2"
serde = "0.8"
serde_json = "0.8"
serializable_enum = "0.3" | 2 | 0.076923 | 1 | 1 |
30aadd15379de8b0718e25c7b989aa368387e70a | appveyor.yml | appveyor.yml | version: 0.9.1.{build}
configuration: Release
skip_tags: true
nuget:
disable_publish_on_pr: true
clone_depth: 1
test: off
build_script:
- ps: .\build.ps1 Package All
artifacts:
- path: build\packages\**\*.nupkg
name: NuGet
| version: 0.9.1.{build}
configuration: Release
image: Visual Studio 2019
skip_tags: true
nuget:
disable_publish_on_pr: true
clone_depth: 1
test: off
build_script:
- ps: .\build.ps1 Package All
artifacts:
- path: build\packages\**\*.nupkg
name: NuGet
| Use VS 2019 image in CI | Use VS 2019 image in CI
| YAML | mit | NRules/NRules | yaml | ## Code Before:
version: 0.9.1.{build}
configuration: Release
skip_tags: true
nuget:
disable_publish_on_pr: true
clone_depth: 1
test: off
build_script:
- ps: .\build.ps1 Package All
artifacts:
- path: build\packages\**\*.nupkg
name: NuGet
## Instruction:
Use VS 2019 image in CI
## Code After:
version: 0.9.1.{build}
configuration: Release
image: Visual Studio 2019
skip_tags: true
nuget:
disable_publish_on_pr: true
clone_depth: 1
test: off
build_script:
- ps: .\build.ps1 Package All
artifacts:
- path: build\packages\**\*.nupkg
name: NuGet
| version: 0.9.1.{build}
configuration: Release
+
+ image: Visual Studio 2019
skip_tags: true
nuget:
disable_publish_on_pr: true
clone_depth: 1
test: off
build_script:
- ps: .\build.ps1 Package All
artifacts:
- path: build\packages\**\*.nupkg
name: NuGet | 2 | 0.111111 | 2 | 0 |
7b95bdc48f3a2787f02f13e0cb015593ece4b2af | src/Chromabits/Illuminated/Database/Articulate/ValidatingObserver.php | src/Chromabits/Illuminated/Database/Articulate/ValidatingObserver.php | <?php
namespace Chromabits\Illuminated\Database\Articulate;
use Chromabits\Nucleus\Foundation\BaseObject;
use Chromabits\Nucleus\Meditation\Exceptions\FailedCheckException;
/**
* Class ValidatingObserver
*
* Originally from: https://github.com/AltThree/Validator/
*
* @author Eduardo Trujillo <ed@chromabits.com>
* @package Chromabits\Illuminated\Database\Articulate
*/
class ValidatingObserver extends BaseObject
{
/**
* Validate the model on saving.
*
* @param Model $model
*/
public function saving(Model $model)
{
$this->validate($model);
}
/**
* Validate the model on saving.
*
* @param Model $model
*/
public function restoring(Model $model)
{
$this->validate($model);
}
protected function validate(Model $model)
{
$attributes = $model->getAttributes();
$checkable = $model->getCheckable();
if ($checkable === null) {
return;
}
$result = $checkable->check($attributes);
if ($result->failed()) {
throw new FailedCheckException($checkable, $result);
}
}
} | <?php
namespace Chromabits\Illuminated\Database\Articulate;
use Chromabits\Nucleus\Foundation\BaseObject;
use Chromabits\Nucleus\Meditation\Exceptions\FailedCheckException;
/**
* Class ValidatingObserver
*
* Originally from: https://github.com/AltThree/Validator/
*
* @author Eduardo Trujillo <ed@chromabits.com>
* @package Chromabits\Illuminated\Database\Articulate
*/
class ValidatingObserver extends BaseObject
{
/**
* Validate the model on saving.
*
* @param Model $model
*/
public function saving(Model $model)
{
$this->validate($model);
}
/**
* Validate the model on saving.
*
* @param Model $model
*/
public function restoring(Model $model)
{
$this->validate($model);
}
protected function validate(Model $model)
{
$attributes = $model->attributesToArray();
$checkable = $model->getCheckable();
if ($checkable === null) {
return;
}
$result = $checkable->check($attributes);
if ($result->failed()) {
throw new FailedCheckException($checkable, $result);
}
}
} | Use attributesToArray instead of getAttributes | Use attributesToArray instead of getAttributes
| PHP | mit | etcinit/laravel-helpers,chromabits/illuminated,etcinit/illuminated,chromabits/illuminated,chromabits/illuminated,etcinit/laravel-helpers,etcinit/illuminated | php | ## Code Before:
<?php
namespace Chromabits\Illuminated\Database\Articulate;
use Chromabits\Nucleus\Foundation\BaseObject;
use Chromabits\Nucleus\Meditation\Exceptions\FailedCheckException;
/**
* Class ValidatingObserver
*
* Originally from: https://github.com/AltThree/Validator/
*
* @author Eduardo Trujillo <ed@chromabits.com>
* @package Chromabits\Illuminated\Database\Articulate
*/
class ValidatingObserver extends BaseObject
{
/**
* Validate the model on saving.
*
* @param Model $model
*/
public function saving(Model $model)
{
$this->validate($model);
}
/**
* Validate the model on saving.
*
* @param Model $model
*/
public function restoring(Model $model)
{
$this->validate($model);
}
protected function validate(Model $model)
{
$attributes = $model->getAttributes();
$checkable = $model->getCheckable();
if ($checkable === null) {
return;
}
$result = $checkable->check($attributes);
if ($result->failed()) {
throw new FailedCheckException($checkable, $result);
}
}
}
## Instruction:
Use attributesToArray instead of getAttributes
## Code After:
<?php
namespace Chromabits\Illuminated\Database\Articulate;
use Chromabits\Nucleus\Foundation\BaseObject;
use Chromabits\Nucleus\Meditation\Exceptions\FailedCheckException;
/**
* Class ValidatingObserver
*
* Originally from: https://github.com/AltThree/Validator/
*
* @author Eduardo Trujillo <ed@chromabits.com>
* @package Chromabits\Illuminated\Database\Articulate
*/
class ValidatingObserver extends BaseObject
{
/**
* Validate the model on saving.
*
* @param Model $model
*/
public function saving(Model $model)
{
$this->validate($model);
}
/**
* Validate the model on saving.
*
* @param Model $model
*/
public function restoring(Model $model)
{
$this->validate($model);
}
protected function validate(Model $model)
{
$attributes = $model->attributesToArray();
$checkable = $model->getCheckable();
if ($checkable === null) {
return;
}
$result = $checkable->check($attributes);
if ($result->failed()) {
throw new FailedCheckException($checkable, $result);
}
}
} | <?php
namespace Chromabits\Illuminated\Database\Articulate;
use Chromabits\Nucleus\Foundation\BaseObject;
use Chromabits\Nucleus\Meditation\Exceptions\FailedCheckException;
/**
* Class ValidatingObserver
*
* Originally from: https://github.com/AltThree/Validator/
*
* @author Eduardo Trujillo <ed@chromabits.com>
* @package Chromabits\Illuminated\Database\Articulate
*/
class ValidatingObserver extends BaseObject
{
/**
* Validate the model on saving.
*
* @param Model $model
*/
public function saving(Model $model)
{
$this->validate($model);
}
/**
* Validate the model on saving.
*
* @param Model $model
*/
public function restoring(Model $model)
{
$this->validate($model);
}
protected function validate(Model $model)
{
- $attributes = $model->getAttributes();
? ^^^^
+ $attributes = $model->attributesToArray();
? ^ +++++++
$checkable = $model->getCheckable();
if ($checkable === null) {
return;
}
$result = $checkable->check($attributes);
if ($result->failed()) {
throw new FailedCheckException($checkable, $result);
}
}
} | 2 | 0.037736 | 1 | 1 |
68748591b9235c548606d0880567174e16a67922 | xmlToJson/function.json | xmlToJson/function.json | {
"bindings": [
{
"name": "xmlZipBlob",
"type": "blobTrigger",
"direction": "in",
"path": "balancedataxml",
"connection": "sc2iq_STORAGE"
}
],
"disabled": false
} | {
"bindings": [
{
"name": "xmlZipBlob",
"type": "blobTrigger",
"dataType": "binary",
"direction": "in",
"path": "balancedataxml",
"connection": "sc2iq_STORAGE"
}
],
"disabled": false
} | Change blobInput type to binary | Change blobInput type to binary
| JSON | mit | mattmazzola/sc2iq-azure-functions | json | ## Code Before:
{
"bindings": [
{
"name": "xmlZipBlob",
"type": "blobTrigger",
"direction": "in",
"path": "balancedataxml",
"connection": "sc2iq_STORAGE"
}
],
"disabled": false
}
## Instruction:
Change blobInput type to binary
## Code After:
{
"bindings": [
{
"name": "xmlZipBlob",
"type": "blobTrigger",
"dataType": "binary",
"direction": "in",
"path": "balancedataxml",
"connection": "sc2iq_STORAGE"
}
],
"disabled": false
} | {
"bindings": [
{
"name": "xmlZipBlob",
"type": "blobTrigger",
+ "dataType": "binary",
"direction": "in",
"path": "balancedataxml",
"connection": "sc2iq_STORAGE"
}
],
"disabled": false
} | 1 | 0.083333 | 1 | 0 |
16eafc60b205442d6f1880bdc67b56de007c43d9 | vagrant-aws/stage1.sh | vagrant-aws/stage1.sh |
export DEBIAN_FRONTEND=noninteractive
apt-get update
echo "Installing ZFS from latest git HEAD"
apt-get -y install build-essential gawk alien fakeroot linux-headers-$(uname -r) zlib1g-dev uuid-dev libblkid-dev libselinux-dev parted lsscsi dh-autoreconf linux-crashdump git
# As of Feb 18 2015, recommended ZoL revisions from Richard Yao:
good_zfs_version="d958324f97f4668a2a6e4a6ce3e5ca09b71b31d9"
good_spl_version="47af4b76ffe72457166e4abfcfe23848ac51811a"
# Compile and install spl
cd ~/
git clone https://github.com/zfsonlinux/spl
cd spl
git checkout $good_spl_version
./autogen.sh
./configure
make
make deb
sudo dpkg -i *.deb
# Compile and install zfs
cd ~/
git clone https://github.com/zfsonlinux/zfs
cd zfs
git checkout $good_zfs_version
./autogen.sh
./configure
make
make deb
sudo dpkg -i *.deb
|
export DEBIAN_FRONTEND=noninteractive
apt-get update
echo "Installing ZFS from latest git HEAD"
apt-get -y install build-essential gawk alien fakeroot linux-headers-$(uname -r) zlib1g-dev uuid-dev libblkid-dev libselinux-dev parted lsscsi dh-autoreconf linux-crashdump git
# As of Apr 2 2015, recommended ZoL revisions from Richard Yao:
good_zfs_version="7f3e466"
good_spl_version="6ab0866"
# Compile and install spl
cd ~/
git clone https://github.com/zfsonlinux/spl
cd spl
git checkout $good_spl_version
./autogen.sh
./configure
make
make deb
sudo dpkg -i *.deb
# Compile and install zfs
cd ~/
git clone https://github.com/zfsonlinux/zfs
cd zfs
git checkout $good_zfs_version
./autogen.sh
./configure
make
make deb
sudo dpkg -i *.deb
| Switch to better versions for newer kernels. | Switch to better versions for newer kernels.
| Shell | apache-2.0 | mbrukman/flocker-docker-plugin,wallnerryan/flocker-docker-plugin,ClusterHQ/powerstrip-flocker,hackday-profilers/flocker-docker-plugin,hackday-profilers/flocker-docker-plugin,hackday-profilers/flocker-docker-plugin,mbrukman/flocker-docker-plugin,wallnerryan/flocker-docker-plugin,moypray/flocker-docker-plugin,wallnerryan/flocker-docker-plugin,ClusterHQ/powerstrip-flocker,mbrukman/flocker-docker-plugin,moypray/flocker-docker-plugin,ClusterHQ/powerstrip-flocker,moypray/flocker-docker-plugin | shell | ## Code Before:
export DEBIAN_FRONTEND=noninteractive
apt-get update
echo "Installing ZFS from latest git HEAD"
apt-get -y install build-essential gawk alien fakeroot linux-headers-$(uname -r) zlib1g-dev uuid-dev libblkid-dev libselinux-dev parted lsscsi dh-autoreconf linux-crashdump git
# As of Feb 18 2015, recommended ZoL revisions from Richard Yao:
good_zfs_version="d958324f97f4668a2a6e4a6ce3e5ca09b71b31d9"
good_spl_version="47af4b76ffe72457166e4abfcfe23848ac51811a"
# Compile and install spl
cd ~/
git clone https://github.com/zfsonlinux/spl
cd spl
git checkout $good_spl_version
./autogen.sh
./configure
make
make deb
sudo dpkg -i *.deb
# Compile and install zfs
cd ~/
git clone https://github.com/zfsonlinux/zfs
cd zfs
git checkout $good_zfs_version
./autogen.sh
./configure
make
make deb
sudo dpkg -i *.deb
## Instruction:
Switch to better versions for newer kernels.
## Code After:
export DEBIAN_FRONTEND=noninteractive
apt-get update
echo "Installing ZFS from latest git HEAD"
apt-get -y install build-essential gawk alien fakeroot linux-headers-$(uname -r) zlib1g-dev uuid-dev libblkid-dev libselinux-dev parted lsscsi dh-autoreconf linux-crashdump git
# As of Apr 2 2015, recommended ZoL revisions from Richard Yao:
good_zfs_version="7f3e466"
good_spl_version="6ab0866"
# Compile and install spl
cd ~/
git clone https://github.com/zfsonlinux/spl
cd spl
git checkout $good_spl_version
./autogen.sh
./configure
make
make deb
sudo dpkg -i *.deb
# Compile and install zfs
cd ~/
git clone https://github.com/zfsonlinux/zfs
cd zfs
git checkout $good_zfs_version
./autogen.sh
./configure
make
make deb
sudo dpkg -i *.deb
|
export DEBIAN_FRONTEND=noninteractive
apt-get update
echo "Installing ZFS from latest git HEAD"
apt-get -y install build-essential gawk alien fakeroot linux-headers-$(uname -r) zlib1g-dev uuid-dev libblkid-dev libselinux-dev parted lsscsi dh-autoreconf linux-crashdump git
- # As of Feb 18 2015, recommended ZoL revisions from Richard Yao:
? ^^^ ^^
+ # As of Apr 2 2015, recommended ZoL revisions from Richard Yao:
? ^^^ ^
- good_zfs_version="d958324f97f4668a2a6e4a6ce3e5ca09b71b31d9"
- good_spl_version="47af4b76ffe72457166e4abfcfe23848ac51811a"
+ good_zfs_version="7f3e466"
+ good_spl_version="6ab0866"
# Compile and install spl
cd ~/
git clone https://github.com/zfsonlinux/spl
cd spl
git checkout $good_spl_version
./autogen.sh
./configure
make
make deb
sudo dpkg -i *.deb
# Compile and install zfs
cd ~/
git clone https://github.com/zfsonlinux/zfs
cd zfs
git checkout $good_zfs_version
./autogen.sh
./configure
make
make deb
sudo dpkg -i *.deb
| 6 | 0.176471 | 3 | 3 |
bb3e55c5a910a6d447a9b60bab0de975b6017f2b | .travis.yml | .travis.yml | language: java
jdk:
- oraclejdk8
- oraclejdk7
- openjdk6
branches:
only:
- master
- javadoc
script:
mvn test site
# End .travis.yml
| language: java
jdk:
- oraclejdk8
- oraclejdk7
- openjdk6
branches:
only:
- master
- javadoc
script:
mvn test site
git:
depth: 10000
# End .travis.yml
| Fix git-commit-id-plugin error when running in Travis-CI. | Fix git-commit-id-plugin error when running in Travis-CI.
 | YAML | apache-2.0 | mehant/incubator-calcite,dindin5258/calcite,devth/calcite,amoghmargoor/incubator-calcite,datametica/calcite,dindin5258/calcite,arina-ielchiieva/calcite,looker-open-source/calcite-avatica,glimpseio/incubator-calcite,looker-open-source/calcite-avatica,jinfengni/incubator-optiq,minji-kim/calcite,apache/calcite,b-slim/calcite,adeshr/incubator-calcite,apache/calcite-avatica,apache/calcite-avatica,looker-open-source/calcite,julianhyde/calcite,xhoong/incubator-calcite,sreev/incubator-calcite,apache/calcite,amoghmargoor/incubator-calcite,YrAuYong/incubator-calcite,b-slim/calcite,looker-open-source/calcite,googleinterns/calcite,b-slim/calcite,googleinterns/calcite,wanglan/calcite,jcamachor/calcite,yeongwei/incubator-calcite,vlsi/incubator-calcite,YrAuYong/incubator-calcite,apache/calcite,looker-open-source/calcite,mapr/incubator-calcite,minji-kim/calcite,julianhyde/calcite,wanglan/calcite,joshelser/calcite-avatica,xhoong/incubator-calcite,datametica/calcite,sudheeshkatkam/incubator-calcite,glimpseio/incubator-calcite,apache/calcite,b-slim/calcite,looker-open-source/calcite,dindin5258/calcite,mapr/incubator-calcite,yeongwei/incubator-calcite,datametica/calcite,hsuanyi/incubator-calcite,vlsi/calcite,hsuanyi/incubator-calcite,jcamachor/calcite,wanglan/calcite,arina-ielchiieva/calcite,joshelser/incubator-calcite,minji-kim/calcite,sreev/incubator-calcite,amoghmargoor/incubator-calcite,datametica/calcite,arina-ielchiieva/calcite,julianhyde/calcite,adeshr/incubator-calcite,glimpseio/incubator-calcite,dindin5258/calcite,looker-open-source/calcite,jinfengni/optiq,YrAuYong/incubator-calcite,jcamachor/calcite,vlsi/calcite,hsuanyi/incubator-calcite,jcamachor/calcite,wanglan/calcite,arina-ielchiieva/calcite,joshelser/calcite-avatica,looker-open-source/calcite-avatica,googleinterns/calcite,googleinterns/calcite,apache/calcite-avatica,sreev/incubator-calcite,apache/calcite-avatica,joshelser/calcite-avatica,sudheeshkatkam/incubator-calcite,jcamachor/calcite,xhoong/incubator-calcite,YrAuYong/incubator-calcite,julianhyde/calcite,yeongwei/incubator-calcite,datametica/calcite,sudheeshkatkam/incubator-calcite,sudheeshkatkam/incubator-calcite,apache/calcite-avatica,amoghmargoor/incubator-calcite,minji-kim/calcite,adeshr/incubator-calcite,apache/calcite,wanglan/calcite,joshelser/incubator-calcite,vlsi/calcite,xhoong/incubator-calcite,julianhyde/calcite,googleinterns/calcite,jinfengni/incubator-optiq,joshelser/incubator-calcite,vlsi/calcite,vlsi/calcite,dindin5258/calcite,adeshr/incubator-calcite,devth/calcite,vlsi/calcite,looker-open-source/calcite,vlsi/incubator-calcite,julianhyde/calcite,b-slim/calcite,jcamachor/calcite,looker-open-source/calcite-avatica,datametica/calcite,googleinterns/calcite,mehant/incubator-calcite,minji-kim/calcite,arina-ielchiieva/calcite,jcamachor/calcite,dindin5258/calcite,glimpseio/incubator-calcite,joshelser/incubator-calcite,hsuanyi/incubator-calcite | yaml | ## Code Before:
language: java
jdk:
- oraclejdk8
- oraclejdk7
- openjdk6
branches:
only:
- master
- javadoc
script:
mvn test site
# End .travis.yml
## Instruction:
Fix git-commit-id-plugin error when running in Travis-CI.
## Code After:
language: java
jdk:
- oraclejdk8
- oraclejdk7
- openjdk6
branches:
only:
- master
- javadoc
script:
mvn test site
git:
depth: 10000
# End .travis.yml
| language: java
jdk:
- oraclejdk8
- oraclejdk7
- openjdk6
branches:
only:
- master
- javadoc
script:
mvn test site
+ git:
+ depth: 10000
# End .travis.yml | 2 | 0.166667 | 2 | 0 |
a4be5a8caaf6a4d6451302b1400ae3ddf25bceb2 | README.md | README.md |
Gearworks is a template for new Shopify apps using the "HNTR" stack — Hapi + Node + Typescript + React. Gearworks is fully equipped for handling Shopify's OAuth process, subscription billing and webhooks.
**This app template is a work in progress. Please check back soon!**
## Documentation
Note: docs are incomplete and only serve as notes for the real documentation once the template reaches v1.
**Q**: My users can't log in!
**A**: If your users can't log in (even after confirming that their username and password is correct), the most likely problem is that the auth cookie is not being set because their connection is not secure. Gearworks uses [yar](https://github.com/hapijs/yar) to handle auth cookies, and it's configured to only set those cookies when the connection is secure (i.e. http**s**://mydomain.com). You can fix this in two different ways:
1. Force all login requests to a secure domain (i.e. http**s**://mydomain.com).
2. Disable secure cookies by setting the yar plugin's `isSecure` option to `false` in `server.ts`. |
Gearworks is a template for new Shopify apps using the "HNTR" stack — Hapi + Node + Typescript + React. Gearworks is fully equipped for handling Shopify's OAuth process, subscription billing and webhooks.
**This app template is a work in progress. Please check back soon!**
## Documentation
Note: docs are incomplete and only serve as notes for the real documentation once the template reaches v1.
**Q**: My users can't log in!
**A**: If your users can't log in (even after confirming that their username and password is correct), the most likely problem is that the auth cookie is not being set because their connection is not secure. Gearworks uses [yar](https://github.com/hapijs/yar) to handle auth cookies, and it's configured to only set those cookies when the connection is secure (i.e. http**s**://mydomain.com). You can fix this in two different ways:
1. Force all login requests to a secure domain (i.e. http**s**://mydomain.com).
2. Disable secure cookies by setting the yar plugin's `isSecure` option to `false` in `server.ts`.
## Building Gearworks
- Gearworks requires a PouchDB-compatible database. For development, Gearworks assumes that you have [pouchdb-server](https://github.com/pouchdb/pouchdb-server) installed globally via NPM. | Update docs with pouchdb-server info. | Update docs with pouchdb-server info.
| Markdown | mit | nozzlegear/Gearworks,nozzlegear/geodirect,nozzlegear/geodirect,nozzlegear/Gearworks,nozzlegear/geodirect | markdown | ## Code Before:
Gearworks is a template for new Shopify apps using the "HNTR" stack — Hapi + Node + Typescript + React. Gearworks is fully equipped for handling Shopify's OAuth process, subscription billing and webhooks.
**This app template is a work in progress. Please check back soon!**
## Documentation
Note: docs are incomplete and only serve as notes for the real documentation once the template reaches v1.
**Q**: My users can't log in!
**A**: If your users can't log in (even after confirming that their username and password is correct), the most likely problem is that the auth cookie is not being set because their connection is not secure. Gearworks uses [yar](https://github.com/hapijs/yar) to handle auth cookies, and it's configured to only set those cookies when the connection is secure (i.e. http**s**://mydomain.com). You can fix this in two different ways:
1. Force all login requests to a secure domain (i.e. http**s**://mydomain.com).
2. Disable secure cookies by setting the yar plugin's `isSecure` option to `false` in `server.ts`.
## Instruction:
Update docs with pouchdb-server info.
## Code After:
Gearworks is a template for new Shopify apps using the "HNTR" stack — Hapi + Node + Typescript + React. Gearworks is fully equipped for handling Shopify's OAuth process, subscription billing and webhooks.
**This app template is a work in progress. Please check back soon!**
## Documentation
Note: docs are incomplete and only serve as notes for the real documentation once the template reaches v1.
**Q**: My users can't log in!
**A**: If your users can't log in (even after confirming that their username and password is correct), the most likely problem is that the auth cookie is not being set because their connection is not secure. Gearworks uses [yar](https://github.com/hapijs/yar) to handle auth cookies, and it's configured to only set those cookies when the connection is secure (i.e. http**s**://mydomain.com). You can fix this in two different ways:
1. Force all login requests to a secure domain (i.e. http**s**://mydomain.com).
2. Disable secure cookies by setting the yar plugin's `isSecure` option to `false` in `server.ts`.
## Building Gearworks
- Gearworks requires a PouchDB-compatible database. For development, Gearworks assumes that you have [pouchdb-server](https://github.com/pouchdb/pouchdb-server) installed globally via NPM. |
Gearworks is a template for new Shopify apps using the "HNTR" stack — Hapi + Node + Typescript + React. Gearworks is fully equipped for handling Shopify's OAuth process, subscription billing and webhooks.
**This app template is a work in progress. Please check back soon!**
## Documentation
Note: docs are incomplete and only serve as notes for the real documentation once the template reaches v1.
**Q**: My users can't log in!
**A**: If your users can't log in (even after confirming that their username and password is correct), the most likely problem is that the auth cookie is not being set because their connection is not secure. Gearworks uses [yar](https://github.com/hapijs/yar) to handle auth cookies, and it's configured to only set those cookies when the connection is secure (i.e. http**s**://mydomain.com). You can fix this in two different ways:
1. Force all login requests to a secure domain (i.e. http**s**://mydomain.com).
2. Disable secure cookies by setting the yar plugin's `isSecure` option to `false` in `server.ts`.
+
+ ## Building Gearworks
+
+ - Gearworks requires a PouchDB-compatible database. For development, Gearworks assumes that you have [pouchdb-server](https://github.com/pouchdb/pouchdb-server) installed globally via NPM. | 4 | 0.266667 | 4 | 0 |
ccff07fdeaa8b7f5c075cee38995146123ef6db5 | Editor/Services/Parts/Preview.php | Editor/Services/Parts/Preview.php | <?php
/**
* @package OnlinePublisher
* @subpackage Services.Parts
*/
require_once '../../Include/Private.php';
$type = Request::getString('type');
$id = Request::getInt('id');
$pageId = Request::getInt('pageId');
if ($controller = PartService::getController($type)) {
if ($part = $controller->getFromRequest($id)) {
header("Content-Type: text/html; charset=UTF-8");
$context = new PartContext();
$context->setLanguage(PageService::getLanguage($pageId));
echo $controller->render($part,$context);
}
}
?> | <?php
/**
* @package OnlinePublisher
* @subpackage Services.Parts
*/
require_once '../../Include/Private.php';
$type = Request::getString('type');
$id = Request::getInt('id');
$pageId = Request::getInt('pageId');
if ($controller = PartService::getController($type)) {
if ($part = $controller->getFromRequest($id)) {
header("Content-Type: text/html; charset=UTF-8");
$context = DocumentTemplateController::buildPartContext($pageId);
echo $controller->render($part,$context);
}
}
?> | Use correct context when previewing widget | Use correct context when previewing widget
| PHP | unlicense | Humanise/hui,Humanise/hui,Humanise/hui,Humanise/hui,Humanise/hui,Humanise/hui,Humanise/hui | php | ## Code Before:
<?php
/**
* @package OnlinePublisher
* @subpackage Services.Parts
*/
require_once '../../Include/Private.php';
$type = Request::getString('type');
$id = Request::getInt('id');
$pageId = Request::getInt('pageId');
if ($controller = PartService::getController($type)) {
if ($part = $controller->getFromRequest($id)) {
header("Content-Type: text/html; charset=UTF-8");
$context = new PartContext();
$context->setLanguage(PageService::getLanguage($pageId));
echo $controller->render($part,$context);
}
}
?>
## Instruction:
Use correct context when previewing widget
## Code After:
<?php
/**
* @package OnlinePublisher
* @subpackage Services.Parts
*/
require_once '../../Include/Private.php';
$type = Request::getString('type');
$id = Request::getInt('id');
$pageId = Request::getInt('pageId');
if ($controller = PartService::getController($type)) {
if ($part = $controller->getFromRequest($id)) {
header("Content-Type: text/html; charset=UTF-8");
$context = DocumentTemplateController::buildPartContext($pageId);
echo $controller->render($part,$context);
}
}
?> | <?php
/**
* @package OnlinePublisher
* @subpackage Services.Parts
*/
require_once '../../Include/Private.php';
$type = Request::getString('type');
$id = Request::getInt('id');
$pageId = Request::getInt('pageId');
if ($controller = PartService::getController($type)) {
if ($part = $controller->getFromRequest($id)) {
header("Content-Type: text/html; charset=UTF-8");
+ $context = DocumentTemplateController::buildPartContext($pageId);
- $context = new PartContext();
- $context->setLanguage(PageService::getLanguage($pageId));
echo $controller->render($part,$context);
}
}
?> | 3 | 0.15 | 1 | 2 |
686999f6f65a6fbd6e45250469f70994f0af26d7 | README.md | README.md |
This repository holds source for Debian packages that prepare a host
to boot cuttlefish, an Android device that is designed to run on Google
Compute Engine.
|
This repository holds source for Debian packages that prepare a host
to boot cuttlefish, an Android device that is designed to run on Google
Compute Engine.
This package can be built directly with dpkg-buildpackage, but it is
designed to be built with:
device/google/cuttlefish_common/tools/create_base_image.sh
[Check out the AOSP tree](https://source.android.com/setup/build/downloading)
to obtain the script.
| Update instructions to point back to the AOSP tree | Update instructions to point back to the AOSP tree
Change-Id: I614572429c01fc8a10219f5950f239c8c7a8b479
| Markdown | apache-2.0 | google/android-cuttlefish,google/android-cuttlefish,google/android-cuttlefish,google/android-cuttlefish,google/android-cuttlefish,google/android-cuttlefish,google/android-cuttlefish | markdown | ## Code Before:
This repository holds source for Debian packages that prepare a host
to boot cuttlefish, an Android device that is designed to run on Google
Compute Engine.
## Instruction:
Update instructions to point back to the AOSP tree
Change-Id: I614572429c01fc8a10219f5950f239c8c7a8b479
## Code After:
This repository holds source for Debian packages that prepare a host
to boot cuttlefish, an Android device that is designed to run on Google
Compute Engine.
This package can be built directly with dpkg-buildpackage, but it is
designed to be built with:
device/google/cuttlefish_common/tools/create_base_image.sh
[Check out the AOSP tree](https://source.android.com/setup/build/downloading)
to obtain the script.
|
This repository holds source for Debian packages that prepare a host
to boot cuttlefish, an Android device that is designed to run on Google
Compute Engine.
+
+ This package can be built directly with dpkg-buildpackage, but it is
+ designed to be built with:
+
+ device/google/cuttlefish_common/tools/create_base_image.sh
+
+ [Check out the AOSP tree](https://source.android.com/setup/build/downloading)
+ to obtain the script. | 8 | 2 | 8 | 0 |
a36cfe1b08ea04918e8be770f083018bcc693582 | app/styles/components/program-details.scss | app/styles/components/program-details.scss | .program-details {
.backtolink {
margin: .5rem;
}
}
| .program-details {
margin-bottom: 1rem;
.backtolink {
margin: .5rem;
}
}
| Add some space between the program and the list | Add some space between the program and the list
| SCSS | mit | djvoa12/frontend,ilios/frontend,jrjohnson/frontend,dartajax/frontend,thecoolestguy/frontend,jrjohnson/frontend,dartajax/frontend,djvoa12/frontend,thecoolestguy/frontend,ilios/frontend | scss | ## Code Before:
.program-details {
.backtolink {
margin: .5rem;
}
}
## Instruction:
Add some space between the program and the list
## Code After:
.program-details {
margin-bottom: 1rem;
.backtolink {
margin: .5rem;
}
}
| .program-details {
+ margin-bottom: 1rem;
+
.backtolink {
margin: .5rem;
}
} | 2 | 0.4 | 2 | 0 |
58b4a325f06e178c398d96513f5ac7bd69ad0b1e | src/export.js | src/export.js | 'use strict'
/**
* @class Elastic
*/
var Elastic = {
klass : klass,
domain: {
Layout : LayoutDomain,
View : ViewDomain,
Compontnt: ComponentDomain
},
exception: {
Runtime : RuntimeException,
Logic : LogicException
},
util: {
History : History,
Router : Router,
Storage : Storage,
Validator: Validator
},
trait: {
AsyncCallback: AsyncCallbackTrait,
Observable : ObservableTrait
}
};
// for RequireJS
if (typeof define == 'function' && typeof define.amd == 'object' && define.amd) {
window.define(function() {
return Elastic;
});
}
// for Node.js & browserify
else if (typeof module == 'object' && module &&
typeof exports == 'object' && exports &&
module.exports == exports
) {
module.exports = Elastic;
}
// for Browser
else {
window.Elastic = Elastic;
}
| 'use strict'
/**
* @class Elastic
*/
var Elastic = {
// like OOP
klass : klass,
// shorthand
Layout : LayoutDomain,
View : ViewDomain,
Component: ComponentDomain,
// classes
domain: {
Layout : LayoutDomain,
View : ViewDomain,
Compontnt: ComponentDomain
},
exception: {
Runtime : RuntimeException,
Logic : LogicException
},
util: {
History : History,
Router : Router,
Storage : Storage,
Validator: Validator
},
trait: {
AsyncCallback: AsyncCallbackTrait,
Observable : ObservableTrait
}
};
// for RequireJS
if (typeof define == 'function' && typeof define.amd == 'object' && define.amd) {
window.define(function() {
return Elastic;
});
}
// for Node.js & browserify
else if (typeof module == 'object' && module &&
typeof exports == 'object' && exports &&
module.exports == exports
) {
module.exports = Elastic;
}
// for Browser
else {
window.Elastic = Elastic;
}
| Add Domain class's shorthand name | Add Domain class's shorthand name
| JavaScript | mit | ahomu/Elastic | javascript | ## Code Before:
'use strict'
/**
* @class Elastic
*/
var Elastic = {
klass : klass,
domain: {
Layout : LayoutDomain,
View : ViewDomain,
Compontnt: ComponentDomain
},
exception: {
Runtime : RuntimeException,
Logic : LogicException
},
util: {
History : History,
Router : Router,
Storage : Storage,
Validator: Validator
},
trait: {
AsyncCallback: AsyncCallbackTrait,
Observable : ObservableTrait
}
};
// for RequireJS
if (typeof define == 'function' && typeof define.amd == 'object' && define.amd) {
window.define(function() {
return Elastic;
});
}
// for Node.js & browserify
else if (typeof module == 'object' && module &&
typeof exports == 'object' && exports &&
module.exports == exports
) {
module.exports = Elastic;
}
// for Browser
else {
window.Elastic = Elastic;
}
## Instruction:
Add Domain class's shorthand name
## Code After:
'use strict'
/**
* @class Elastic
*/
var Elastic = {
// like OOP
klass : klass,
// shorthand
Layout : LayoutDomain,
View : ViewDomain,
Component: ComponentDomain,
// classes
domain: {
Layout : LayoutDomain,
View : ViewDomain,
Compontnt: ComponentDomain
},
exception: {
Runtime : RuntimeException,
Logic : LogicException
},
util: {
History : History,
Router : Router,
Storage : Storage,
Validator: Validator
},
trait: {
AsyncCallback: AsyncCallbackTrait,
Observable : ObservableTrait
}
};
// for RequireJS
if (typeof define == 'function' && typeof define.amd == 'object' && define.amd) {
window.define(function() {
return Elastic;
});
}
// for Node.js & browserify
else if (typeof module == 'object' && module &&
typeof exports == 'object' && exports &&
module.exports == exports
) {
module.exports = Elastic;
}
// for Browser
else {
window.Elastic = Elastic;
}
| 'use strict'
/**
* @class Elastic
*/
var Elastic = {
+ // like OOP
klass : klass,
+
+ // shorthand
+ Layout : LayoutDomain,
+ View : ViewDomain,
+ Component: ComponentDomain,
+
+ // classes
domain: {
Layout : LayoutDomain,
View : ViewDomain,
Compontnt: ComponentDomain
},
exception: {
Runtime : RuntimeException,
Logic : LogicException
},
util: {
History : History,
Router : Router,
Storage : Storage,
Validator: Validator
},
trait: {
AsyncCallback: AsyncCallbackTrait,
Observable : ObservableTrait
}
};
// for RequireJS
if (typeof define == 'function' && typeof define.amd == 'object' && define.amd) {
window.define(function() {
return Elastic;
});
}
// for Node.js & browserify
else if (typeof module == 'object' && module &&
typeof exports == 'object' && exports &&
module.exports == exports
) {
module.exports = Elastic;
}
// for Browser
else {
window.Elastic = Elastic;
} | 8 | 0.177778 | 8 | 0 |
9b043b0bd31f35e140831f61a4484513922f8712 | stop_words/__init__.py | stop_words/__init__.py | import os
__VERSION__ = (2014, 5, 26)
CURRENT_DIR = os.path.dirname(os.path.realpath(__file__))
STOP_WORDS_DIR = os.path.join(CURRENT_DIR, 'stop-words/')
def get_version():
"""
:rtype: basestring
"""
return ".".join(str(v) for v in __VERSION__)
def get_stop_words(language):
"""
:type language: basestring
:rtype: list
"""
with open('{0}{1}.txt'.format(STOP_WORDS_DIR, language)) as lang_file:
lines = lang_file.readlines()
return [str(line.strip()).decode('utf-8') for line in lines]
| import os
__VERSION__ = (2014, 5, 26)
CURRENT_DIR = os.path.dirname(os.path.realpath(__file__))
STOP_WORDS_DIR = os.path.join(CURRENT_DIR, 'stop-words/')
LANGUAGE_MAPPING = {
'ar': 'arabic',
'da': 'danish',
'nl': 'dutch',
'en': 'english',
'fi': 'finnish',
'fr': 'french',
'de': 'german',
'hu': 'hungarian',
'it': 'italian',
'nb': 'norwegian',
'pt': 'portuguese',
'ro': 'romanian',
'ru': 'russian',
'es': 'spanish',
'sv': 'swedish',
'tr': 'turkish',
}
AVAILABLE_LANGUAGES = LANGUAGE_MAPPING.values()
def get_version():
"""
:rtype: basestring
"""
return ".".join(str(v) for v in __VERSION__)
class StopWordError(Exception):
pass
def get_stop_words(language):
"""
:type language: basestring
:rtype: list
"""
try:
language = LANGUAGE_MAPPING[language]
except KeyError:
pass
if language not in AVAILABLE_LANGUAGES:
raise StopWordError('%s language is unavailable')
with open('{0}{1}.txt'.format(STOP_WORDS_DIR, language)) as lang_file:
lines = lang_file.readlines()
return [str(line.strip()).decode('utf-8') for line in lines]
| Implement language code mapping and check availability of the language | Implement language code mapping and check availability of the language
| Python | bsd-3-clause | Alir3z4/python-stop-words | python | ## Code Before:
import os
__VERSION__ = (2014, 5, 26)
CURRENT_DIR = os.path.dirname(os.path.realpath(__file__))
STOP_WORDS_DIR = os.path.join(CURRENT_DIR, 'stop-words/')
def get_version():
"""
:rtype: basestring
"""
return ".".join(str(v) for v in __VERSION__)
def get_stop_words(language):
"""
:type language: basestring
:rtype: list
"""
with open('{0}{1}.txt'.format(STOP_WORDS_DIR, language)) as lang_file:
lines = lang_file.readlines()
return [str(line.strip()).decode('utf-8') for line in lines]
## Instruction:
Implement language code mapping and check availability of the language
## Code After:
import os
__VERSION__ = (2014, 5, 26)
CURRENT_DIR = os.path.dirname(os.path.realpath(__file__))
STOP_WORDS_DIR = os.path.join(CURRENT_DIR, 'stop-words/')
LANGUAGE_MAPPING = {
'ar': 'arabic',
'da': 'danish',
'nl': 'dutch',
'en': 'english',
'fi': 'finnish',
'fr': 'french',
'de': 'german',
'hu': 'hungarian',
'it': 'italian',
'nb': 'norwegian',
'pt': 'portuguese',
'ro': 'romanian',
'ru': 'russian',
'es': 'spanish',
'sv': 'swedish',
'tr': 'turkish',
}
AVAILABLE_LANGUAGES = LANGUAGE_MAPPING.values()
def get_version():
"""
:rtype: basestring
"""
return ".".join(str(v) for v in __VERSION__)
class StopWordError(Exception):
pass
def get_stop_words(language):
"""
:type language: basestring
:rtype: list
"""
try:
language = LANGUAGE_MAPPING[language]
except KeyError:
pass
if language not in AVAILABLE_LANGUAGES:
raise StopWordError('%s language is unavailable')
with open('{0}{1}.txt'.format(STOP_WORDS_DIR, language)) as lang_file:
lines = lang_file.readlines()
return [str(line.strip()).decode('utf-8') for line in lines]
| import os
__VERSION__ = (2014, 5, 26)
CURRENT_DIR = os.path.dirname(os.path.realpath(__file__))
STOP_WORDS_DIR = os.path.join(CURRENT_DIR, 'stop-words/')
+ LANGUAGE_MAPPING = {
+ 'ar': 'arabic',
+ 'da': 'danish',
+ 'nl': 'dutch',
+ 'en': 'english',
+ 'fi': 'finnish',
+ 'fr': 'french',
+ 'de': 'german',
+ 'hu': 'hungarian',
+ 'it': 'italian',
+ 'nb': 'norwegian',
+ 'pt': 'portuguese',
+ 'ro': 'romanian',
+ 'ru': 'russian',
+ 'es': 'spanish',
+ 'sv': 'swedish',
+ 'tr': 'turkish',
+ }
+
+ AVAILABLE_LANGUAGES = LANGUAGE_MAPPING.values()
def get_version():
"""
:rtype: basestring
"""
return ".".join(str(v) for v in __VERSION__)
+ class StopWordError(Exception):
+ pass
+
+
def get_stop_words(language):
"""
:type language: basestring
:rtype: list
"""
+ try:
+ language = LANGUAGE_MAPPING[language]
+ except KeyError:
+ pass
+
+ if language not in AVAILABLE_LANGUAGES:
+ raise StopWordError('%s language is unavailable')
+
with open('{0}{1}.txt'.format(STOP_WORDS_DIR, language)) as lang_file:
lines = lang_file.readlines()
return [str(line.strip()).decode('utf-8') for line in lines] | 32 | 1.391304 | 32 | 0 |
5fea8764ed28371e0e8b16a3ef3a21e9fe3504f4 | Gruntfile.js | Gruntfile.js | /*
** lite-proxyserver
** Design and Development by msg Applied Technology Research
** Copyright (c) 2016 msg systems ag (http://www.msg.group/)
*/
module.exports = function (grunt) {
grunt.loadNpmTasks('grunt-contrib-connect')
grunt.loadNpmTasks('grunt-connect-proxy')
grunt.loadNpmTasks('grunt-extend-config')
grunt.loadTasks("grunt-config")
}; | /*
** lite-proxyserver
** Design and Development by msg Applied Technology Research
** Copyright (c) 2016 msg systems ag (http://www.msg.group/)
*/
var path = require("path");
module.exports = function (grunt) {
grunt.loadNpmTasks('grunt-contrib-connect')
grunt.loadNpmTasks('grunt-connect-proxy')
grunt.loadNpmTasks('grunt-extend-config')
grunt.loadTasks(path.join(__dirname, "grunt-config"))
};
| Load config relative to current dir. | Load config relative to current dir. | JavaScript | mit | msg-systems/lite-proxyserver | javascript | ## Code Before:
/*
** lite-proxyserver
** Design and Development by msg Applied Technology Research
** Copyright (c) 2016 msg systems ag (http://www.msg.group/)
*/
module.exports = function (grunt) {
grunt.loadNpmTasks('grunt-contrib-connect')
grunt.loadNpmTasks('grunt-connect-proxy')
grunt.loadNpmTasks('grunt-extend-config')
grunt.loadTasks("grunt-config")
};
## Instruction:
Load config relative to current dir.
## Code After:
/*
** lite-proxyserver
** Design and Development by msg Applied Technology Research
** Copyright (c) 2016 msg systems ag (http://www.msg.group/)
*/
var path = require("path");
module.exports = function (grunt) {
grunt.loadNpmTasks('grunt-contrib-connect')
grunt.loadNpmTasks('grunt-connect-proxy')
grunt.loadNpmTasks('grunt-extend-config')
grunt.loadTasks(path.join(__dirname, "grunt-config"))
};
| /*
** lite-proxyserver
** Design and Development by msg Applied Technology Research
** Copyright (c) 2016 msg systems ag (http://www.msg.group/)
*/
+
+ var path = require("path");
module.exports = function (grunt) {
grunt.loadNpmTasks('grunt-contrib-connect')
grunt.loadNpmTasks('grunt-connect-proxy')
grunt.loadNpmTasks('grunt-extend-config')
- grunt.loadTasks("grunt-config")
+ grunt.loadTasks(path.join(__dirname, "grunt-config"))
? +++++++++++++++++++++ +
-
}; | 5 | 0.333333 | 3 | 2 |
fb35d2b169266a1c11e2a853ca6eccadf9ce9da2 | scripts/install-network-bridge.sh | scripts/install-network-bridge.sh | sudo apt-get install -y bridge-utils iptables dnsmasq
sudo update-rc.d dnsmasq disable
sudo update-rc.d dnsmasq remove
# deploy QEMU bridge start script
cp /vagrant/files/etc/qemu-ifup-NAT /etc/qemu-ifup
chmod 755 /etc/qemu-ifup
cp /vagrant/files/etc/qemu-ifdown-NAT /etc/qemu-ifdown
chmod 755 /etc/qemu-ifdown | sudo apt-get install -y bridge-utils iptables dnsmasq
# disable dnsmasq, because we need it for QEMU NAT networking
sudo systemctl stop dnsmasq.service
sudo systemctl disable dnsmasq.service
# deploy QEMU bridge start script
cp /vagrant/files/etc/qemu-ifup-NAT /etc/qemu-ifup
chmod 755 /etc/qemu-ifup
cp /vagrant/files/etc/qemu-ifdown-NAT /etc/qemu-ifdown
chmod 755 /etc/qemu-ifdown
| Stop and disable systemd service dnsmasq | Stop and disable systemd service dnsmasq
| Shell | mit | DieterReuter/qemu-arm-box | shell | ## Code Before:
sudo apt-get install -y bridge-utils iptables dnsmasq
sudo update-rc.d dnsmasq disable
sudo update-rc.d dnsmasq remove
# deploy QEMU bridge start script
cp /vagrant/files/etc/qemu-ifup-NAT /etc/qemu-ifup
chmod 755 /etc/qemu-ifup
cp /vagrant/files/etc/qemu-ifdown-NAT /etc/qemu-ifdown
chmod 755 /etc/qemu-ifdown
## Instruction:
Stop and disable systemd service dnsmasq
## Code After:
sudo apt-get install -y bridge-utils iptables dnsmasq
# disable dnsmasq, because we need it for QEMU NAT networking
sudo systemctl stop dnsmasq.service
sudo systemctl disable dnsmasq.service
# deploy QEMU bridge start script
cp /vagrant/files/etc/qemu-ifup-NAT /etc/qemu-ifup
chmod 755 /etc/qemu-ifup
cp /vagrant/files/etc/qemu-ifdown-NAT /etc/qemu-ifdown
chmod 755 /etc/qemu-ifdown
| sudo apt-get install -y bridge-utils iptables dnsmasq
- sudo update-rc.d dnsmasq disable
- sudo update-rc.d dnsmasq remove
+
+ # disable dnsmasq, because we need it for QEMU NAT networking
+ sudo systemctl stop dnsmasq.service
+ sudo systemctl disable dnsmasq.service
# deploy QEMU bridge start script
cp /vagrant/files/etc/qemu-ifup-NAT /etc/qemu-ifup
chmod 755 /etc/qemu-ifup
cp /vagrant/files/etc/qemu-ifdown-NAT /etc/qemu-ifdown
chmod 755 /etc/qemu-ifdown | 6 | 0.666667 | 4 | 2 |
2d50d5eb33264279c522ff5bbc9021c039819698 | OGDL/Parser.swift | OGDL/Parser.swift | //
// Parser.swift
// OGDL
//
// Created by Justin Spahr-Summers on 2015-01-07.
// Copyright (c) 2015 Carthage. All rights reserved.
//
import Foundation
import Madness
/// Returns a parser which parses one character from the given set.
internal prefix func % (characterSet: NSCharacterSet) -> Parser<String>.Function {
return { string in
let scalars = string.unicodeScalars
if let scalar = first(scalars) {
if characterSet.longCharacterIsMember(scalar.value) {
return (String(scalar), String(dropFirst(scalars)))
}
}
return nil
}
}
| //
// Parser.swift
// OGDL
//
// Created by Justin Spahr-Summers on 2015-01-07.
// Copyright (c) 2015 Carthage. All rights reserved.
//
import Foundation
import Madness
/// Returns a parser which parses one character from the given set.
internal prefix func % (characterSet: NSCharacterSet) -> Parser<String>.Function {
return { string in
let scalars = string.unicodeScalars
if let scalar = first(scalars) {
if characterSet.longCharacterIsMember(scalar.value) {
return (String(scalar), String(dropFirst(scalars)))
}
}
return nil
}
}
/// Removes the characters in the given string from the character set.
internal func - (characterSet: NSCharacterSet, characters: String) -> NSCharacterSet {
let mutableSet = characterSet.mutableCopy() as NSMutableCharacterSet
mutableSet.removeCharactersInString(characters)
return mutableSet
}
/// Removes characters in the latter set from the former.
internal func - (characterSet: NSCharacterSet, subtrahend: NSCharacterSet) -> NSCharacterSet {
let mutableSet = characterSet.mutableCopy() as NSMutableCharacterSet
mutableSet.formIntersectionWithCharacterSet(subtrahend.invertedSet)
return mutableSet
}
| Add - operator for manipulating NSCharacterSet | Add - operator for manipulating NSCharacterSet
| Swift | mit | Carthage/ogdl-swift,Carthage/ogdl-swift,Carthage/ogdl-swift | swift | ## Code Before:
//
// Parser.swift
// OGDL
//
// Created by Justin Spahr-Summers on 2015-01-07.
// Copyright (c) 2015 Carthage. All rights reserved.
//
import Foundation
import Madness
/// Returns a parser which parses one character from the given set.
internal prefix func % (characterSet: NSCharacterSet) -> Parser<String>.Function {
return { string in
let scalars = string.unicodeScalars
if let scalar = first(scalars) {
if characterSet.longCharacterIsMember(scalar.value) {
return (String(scalar), String(dropFirst(scalars)))
}
}
return nil
}
}
## Instruction:
Add - operator for manipulating NSCharacterSet
## Code After:
//
// Parser.swift
// OGDL
//
// Created by Justin Spahr-Summers on 2015-01-07.
// Copyright (c) 2015 Carthage. All rights reserved.
//
import Foundation
import Madness
/// Returns a parser which parses one character from the given set.
internal prefix func % (characterSet: NSCharacterSet) -> Parser<String>.Function {
return { string in
let scalars = string.unicodeScalars
if let scalar = first(scalars) {
if characterSet.longCharacterIsMember(scalar.value) {
return (String(scalar), String(dropFirst(scalars)))
}
}
return nil
}
}
/// Removes the characters in the given string from the character set.
internal func - (characterSet: NSCharacterSet, characters: String) -> NSCharacterSet {
let mutableSet = characterSet.mutableCopy() as NSMutableCharacterSet
mutableSet.removeCharactersInString(characters)
return mutableSet
}
/// Removes characters in the latter set from the former.
internal func - (characterSet: NSCharacterSet, subtrahend: NSCharacterSet) -> NSCharacterSet {
let mutableSet = characterSet.mutableCopy() as NSMutableCharacterSet
mutableSet.formIntersectionWithCharacterSet(subtrahend.invertedSet)
return mutableSet
}
| //
// Parser.swift
// OGDL
//
// Created by Justin Spahr-Summers on 2015-01-07.
// Copyright (c) 2015 Carthage. All rights reserved.
//
import Foundation
import Madness
/// Returns a parser which parses one character from the given set.
internal prefix func % (characterSet: NSCharacterSet) -> Parser<String>.Function {
return { string in
let scalars = string.unicodeScalars
if let scalar = first(scalars) {
if characterSet.longCharacterIsMember(scalar.value) {
return (String(scalar), String(dropFirst(scalars)))
}
}
return nil
}
}
+
+ /// Removes the characters in the given string from the character set.
+ internal func - (characterSet: NSCharacterSet, characters: String) -> NSCharacterSet {
+ let mutableSet = characterSet.mutableCopy() as NSMutableCharacterSet
+ mutableSet.removeCharactersInString(characters)
+ return mutableSet
+ }
+
+ /// Removes characters in the latter set from the former.
+ internal func - (characterSet: NSCharacterSet, subtrahend: NSCharacterSet) -> NSCharacterSet {
+ let mutableSet = characterSet.mutableCopy() as NSMutableCharacterSet
+ mutableSet.formIntersectionWithCharacterSet(subtrahend.invertedSet)
+ return mutableSet
+ } | 14 | 0.56 | 14 | 0 |
d308e8a091cd0a2c3ba361989061da4056c1e62c | docs/source/toy/index.rst | docs/source/toy/index.rst | ************
Toy problems
************
The `toy` module provides toy :class:`models<pints.ForwardModel>`,
:class:`distributions<pints.LogPDF>` and
:class:`error measures<pints.ErrorMeasure>` that can be used for tests and in
examples.
Some toy classes provide extra functionality defined in the
:class:`pints.toy.ToyModel` and :class:`pints.toy.ToyLogPDF` classes.
.. toctree::
annulus_logpdf
beeler_reuter_ap_model
cone_logpdf
constant_model
fitzhugh_nagumo_model
gaussian_logpdf
goodwin_oscillator_model
hes1_michaelis_menten_model
high_dimensional_gaussian_logpdf
hodgkin_huxley_ik_model
logistic_model
lotka_volterra_model
multimodal_gaussian_logpdf
neals_funnel
parabolic_error
repressilator_model
rosenbrock
simple_egg_box_logpdf
sir_model
stochastic_degradation_model
toy_classes
twisted_gaussian_logpdf
| ************
Toy problems
************
The `toy` module provides toy :class:`models<pints.ForwardModel>`,
:class:`distributions<pints.LogPDF>` and
:class:`error measures<pints.ErrorMeasure>` that can be used for tests and in
examples.
Some toy classes provide extra functionality defined in the
:class:`pints.toy.ToyModel` and :class:`pints.toy.ToyLogPDF` classes.
.. toctree::
toy_classes
annulus_logpdf
beeler_reuter_ap_model
cone_logpdf
constant_model
fitzhugh_nagumo_model
gaussian_logpdf
goodwin_oscillator_model
hes1_michaelis_menten_model
high_dimensional_gaussian_logpdf
hodgkin_huxley_ik_model
logistic_model
lotka_volterra_model
multimodal_gaussian_logpdf
neals_funnel
parabolic_error
repressilator_model
rosenbrock
simple_egg_box_logpdf
sir_model
stochastic_degradation_model
twisted_gaussian_logpdf
| Tweak to ordering of toy classes | Tweak to ordering of toy classes
| reStructuredText | bsd-3-clause | martinjrobins/hobo,martinjrobins/hobo,martinjrobins/hobo,martinjrobins/hobo | restructuredtext | ## Code Before:
************
Toy problems
************
The `toy` module provides toy :class:`models<pints.ForwardModel>`,
:class:`distributions<pints.LogPDF>` and
:class:`error measures<pints.ErrorMeasure>` that can be used for tests and in
examples.
Some toy classes provide extra functionality defined in the
:class:`pints.toy.ToyModel` and :class:`pints.toy.ToyLogPDF` classes.
.. toctree::
annulus_logpdf
beeler_reuter_ap_model
cone_logpdf
constant_model
fitzhugh_nagumo_model
gaussian_logpdf
goodwin_oscillator_model
hes1_michaelis_menten_model
high_dimensional_gaussian_logpdf
hodgkin_huxley_ik_model
logistic_model
lotka_volterra_model
multimodal_gaussian_logpdf
neals_funnel
parabolic_error
repressilator_model
rosenbrock
simple_egg_box_logpdf
sir_model
stochastic_degradation_model
toy_classes
twisted_gaussian_logpdf
## Instruction:
Tweak to ordering of toy classes
## Code After:
************
Toy problems
************
The `toy` module provides toy :class:`models<pints.ForwardModel>`,
:class:`distributions<pints.LogPDF>` and
:class:`error measures<pints.ErrorMeasure>` that can be used for tests and in
examples.
Some toy classes provide extra functionality defined in the
:class:`pints.toy.ToyModel` and :class:`pints.toy.ToyLogPDF` classes.
.. toctree::
toy_classes
annulus_logpdf
beeler_reuter_ap_model
cone_logpdf
constant_model
fitzhugh_nagumo_model
gaussian_logpdf
goodwin_oscillator_model
hes1_michaelis_menten_model
high_dimensional_gaussian_logpdf
hodgkin_huxley_ik_model
logistic_model
lotka_volterra_model
multimodal_gaussian_logpdf
neals_funnel
parabolic_error
repressilator_model
rosenbrock
simple_egg_box_logpdf
sir_model
stochastic_degradation_model
twisted_gaussian_logpdf
| ************
Toy problems
************
The `toy` module provides toy :class:`models<pints.ForwardModel>`,
:class:`distributions<pints.LogPDF>` and
:class:`error measures<pints.ErrorMeasure>` that can be used for tests and in
examples.
Some toy classes provide extra functionality defined in the
:class:`pints.toy.ToyModel` and :class:`pints.toy.ToyLogPDF` classes.
.. toctree::
+ toy_classes
annulus_logpdf
beeler_reuter_ap_model
cone_logpdf
constant_model
fitzhugh_nagumo_model
gaussian_logpdf
goodwin_oscillator_model
hes1_michaelis_menten_model
high_dimensional_gaussian_logpdf
hodgkin_huxley_ik_model
logistic_model
lotka_volterra_model
multimodal_gaussian_logpdf
neals_funnel
parabolic_error
repressilator_model
rosenbrock
simple_egg_box_logpdf
sir_model
stochastic_degradation_model
- toy_classes
twisted_gaussian_logpdf | 2 | 0.054054 | 1 | 1 |
777b2be613a95efa05f46f5b41ded6ed3063157d | app/templates/emails/clarification_question_submitted.html | app/templates/emails/clarification_question_submitted.html | <!DOCTYPE html>
<html>
<head lang="en">
<meta charset="UTF-8">
</head>
<body>
Hello {{ user_name }},
<br /><br />
Thanks for sending your G-Cloud 7 clarification question. All clarification questions and answers will be published regularly on the Digital Marketplace G-Cloud 7 updates page.
<br /><br />
  You’ll receive an email when new clarification answers are posted. The Crown Commercial Service will not respond to you individually.
<br /><br />
Here is a copy of the question that you sent:<br />
"{{ message }}"
<br /><br />
This is an automated message. Email <a href="mailto:enquiries@digitalmarketplace.service.gov.uk">enquiries@digitalmarketplace.service.gov.uk</a> if you need help.
<br /><br />
Thanks,
<br /><br />
The Digital Marketplace team
</body>
</html>
| <!DOCTYPE html>
<html>
<head lang="en">
<meta charset="UTF-8">
</head>
<body>
Hello {{ user_name }},
<br /><br />
Thanks for sending your G-Cloud 7 clarification question. All clarification questions and answers will be published regularly on the Digital Marketplace G-Cloud 7 updates page.
<br /><br />
You’ll receive an email when new clarification answers are posted. The Crown Commercial Service will not respond to you individually.
<br /><br />
Here is a copy of the question that you sent:<br />
"{{ message }}"
<br /><br />
This is an automated message. Please do not reply to this email.
<br /><br />
Thanks,
<br /><br />
The Digital Marketplace team
</body>
</html>
| Update email copy - do not reply | Update email copy - do not reply
| HTML | mit | alphagov/digitalmarketplace-supplier-frontend,alphagov/digitalmarketplace-supplier-frontend,alphagov/digitalmarketplace-supplier-frontend,alphagov/digitalmarketplace-supplier-frontend | html | ## Code Before:
<!DOCTYPE html>
<html>
<head lang="en">
<meta charset="UTF-8">
</head>
<body>
Hello {{ user_name }},
<br /><br />
Thanks for sending your G-Cloud 7 clarification question. All clarification questions and answers will be published regularly on the Digital Marketplace G-Cloud 7 updates page.
<br /><br />
  You’ll receive an email when new clarification answers are posted. The Crown Commercial Service will not respond to you individually.
<br /><br />
Here is a copy of the question that you sent:<br />
"{{ message }}"
<br /><br />
This is an automated message. Email <a href="mailto:enquiries@digitalmarketplace.service.gov.uk">enquiries@digitalmarketplace.service.gov.uk</a> if you need help.
<br /><br />
Thanks,
<br /><br />
The Digital Marketplace team
</body>
</html>
## Instruction:
Update email copy - do not reply
## Code After:
<!DOCTYPE html>
<html>
<head lang="en">
<meta charset="UTF-8">
</head>
<body>
Hello {{ user_name }},
<br /><br />
Thanks for sending your G-Cloud 7 clarification question. All clarification questions and answers will be published regularly on the Digital Marketplace G-Cloud 7 updates page.
<br /><br />
You’ll receive an email when new clarification answers are posted. The Crown Commercial Service will not respond to you individually.
<br /><br />
Here is a copy of the question that you sent:<br />
"{{ message }}"
<br /><br />
This is an automated message. Please do not reply to this email.
<br /><br />
Thanks,
<br /><br />
The Digital Marketplace team
</body>
</html>
| <!DOCTYPE html>
<html>
<head lang="en">
<meta charset="UTF-8">
</head>
<body>
Hello {{ user_name }},
<br /><br />
Thanks for sending your G-Cloud 7 clarification question. All clarification questions and answers will be published regularly on the Digital Marketplace G-Cloud 7 updates page.
<br /><br />
You’ll receive an email when new clarification answers are posted. The Crown Commercial Service will not respond to you individually.
<br /><br />
Here is a copy of the question that you sent:<br />
"{{ message }}"
<br /><br />
- This is an automated message. Email <a href="mailto:enquiries@digitalmarketplace.service.gov.uk">enquiries@digitalmarketplace.service.gov.uk</a> if you need help.
+ This is an automated message. Please do not reply to this email.
<br /><br />
Thanks,
<br /><br />
The Digital Marketplace team
</body>
</html> | 2 | 0.090909 | 1 | 1 |
814a4e537e8fa5eb0325d9f8ac924f2fd6d2b1ab | AWS/ECS/cleanup-docker-images.sh | AWS/ECS/cleanup-docker-images.sh |
if [ $# -lt 2 ]; then
echo "Usage: cleanup-docker-images.sh <region> <auto-scaling-group> <organization> <repository> <revision>"
echo "The first two arguments are required: region, auto-scaling-group"
exit 0
fi
REGION=$1
AUTO_SCALING_GROUP_NAME=$2
ORGANIZATION=$3
REPOSITORY=$4
REVISION=$5
for i in `aws autoscaling describe-auto-scaling-groups --region=${REGION} --auto-scaling-group-name ${AUTO_SCALING_GROUP_NAME} | grep -i instanceid | awk '{ print $2}' | cut -d',' -f1 | sed -e 's/"//g'`
do
# Remove all unused Docker images.
aws ssm send-command --region=${REGION} --instance-ids $i --document-name="AWS-RunShellScript" --comment="${ORGANIZATION}/${REPOSITORY}:${REVISION}" --parameters commands="docker images -q | xargs --no-run-if-empty docker rmi"
done
|
if [ $# -lt 2 ]; then
echo "Usage: cleanup-docker-images.sh <region> <auto-scaling-group-name> <username> <organization> <repository> <branch> <revision>"
echo "The first two arguments are required: region, auto-scaling-group"
exit 0
fi
REGION=$1
AUTO_SCALING_GROUP_NAME=$2
USERNAME=$3
ORGANIZATION=$4
REPOSITORY=$5
BRANCH=$6
REVISION=$7
INSTANCES=($(aws autoscaling describe-auto-scaling-groups --region=${REGION} --auto-scaling-group-name "${AUTO_SCALING_GROUP_NAME}" | grep InstanceId | cut -d'"' -f4 | tr '\n' ' '))
for INSTANCE_ID in ${INSTANCES[*]}
do
aws ssm send-command --region=${REGION} --instance-ids ${INSTANCE_ID} --document-name="AWS-RunShellScript" --comment="${USERNAME} triggered ${ORGANIZATION}/${REPOSITORY} > ${BRANCH} > build ${REVISION}" --parameters commands="docker run --rm -v /var/run/docker.sock:/var/run/docker.sock spotify/docker-gc"
done
| Fix the for loop and a little help from Spotify | Fix the for loop and a little help from Spotify
| Shell | mit | wpsinc/omnificent | shell | ## Code Before:
if [ $# -lt 2 ]; then
echo "Usage: cleanup-docker-images.sh <region> <auto-scaling-group> <organization> <repository> <revision>"
echo "The first two arguments are required: region, auto-scaling-group"
exit 0
fi
REGION=$1
AUTO_SCALING_GROUP_NAME=$2
ORGANIZATION=$3
REPOSITORY=$4
REVISION=$5
for i in `aws autoscaling describe-auto-scaling-groups --region=${REGION} --auto-scaling-group-name ${AUTO_SCALING_GROUP_NAME} | grep -i instanceid | awk '{ print $2}' | cut -d',' -f1 | sed -e 's/"//g'`
do
# Remove all unused Docker images.
aws ssm send-command --region=${REGION} --instance-ids $i --document-name="AWS-RunShellScript" --comment="${ORGANIZATION}/${REPOSITORY}:${REVISION}" --parameters commands="docker images -q | xargs --no-run-if-empty docker rmi"
done
## Instruction:
Fix the for loop and a little help from Spotify
## Code After:
if [ $# -lt 2 ]; then
echo "Usage: cleanup-docker-images.sh <region> <auto-scaling-group-name> <username> <organization> <repository> <branch> <revision>"
echo "The first two arguments are required: region, auto-scaling-group"
exit 0
fi
REGION=$1
AUTO_SCALING_GROUP_NAME=$2
USERNAME=$3
ORGANIZATION=$4
REPOSITORY=$5
BRANCH=$6
REVISION=$7
INSTANCES=($(aws autoscaling describe-auto-scaling-groups --region=${REGION} --auto-scaling-group-name "${AUTO_SCALING_GROUP_NAME}" | grep InstanceId | cut -d'"' -f4 | tr '\n' ' '))
for INSTANCE_ID in ${INSTANCES[*]}
do
aws ssm send-command --region=${REGION} --instance-ids ${INSTANCE_ID} --document-name="AWS-RunShellScript" --comment="${USERNAME} triggered ${ORGANIZATION}/${REPOSITORY} > ${BRANCH} > build ${REVISION}" --parameters commands="docker run --rm -v /var/run/docker.sock:/var/run/docker.sock spotify/docker-gc"
done
|
if [ $# -lt 2 ]; then
- echo "Usage: cleanup-docker-images.sh <region> <auto-scaling-group> <organization> <repository> <revision>"
+ echo "Usage: cleanup-docker-images.sh <region> <auto-scaling-group-name> <username> <organization> <repository> <branch> <revision>"
? ++++++++++++++++ +++++++++
echo "The first two arguments are required: region, auto-scaling-group"
exit 0
fi
REGION=$1
AUTO_SCALING_GROUP_NAME=$2
+ USERNAME=$3
- ORGANIZATION=$3
? ^
+ ORGANIZATION=$4
? ^
- REPOSITORY=$4
? ^
+ REPOSITORY=$5
? ^
+ BRANCH=$6
- REVISION=$5
? ^
+ REVISION=$7
? ^
- for i in `aws autoscaling describe-auto-scaling-groups --region=${REGION} --auto-scaling-group-name ${AUTO_SCALING_GROUP_NAME} | grep -i instanceid | awk '{ print $2}' | cut -d',' -f1 | sed -e 's/"//g'`
? ^^^^^^^^^^ ^^^^ ^ --------------------- ^ ^ ^^^^^^ ^^^^^^ ^
+ INSTANCES=($(aws autoscaling describe-auto-scaling-groups --region=${REGION} --auto-scaling-group-name "${AUTO_SCALING_GROUP_NAME}" | grep InstanceId | cut -d'"' -f4 | tr '\n' ' '))
? ^^^^^^^^^^^^^ + + ^ ^ ^ ^ ^^ ^^ ^^^^^^
+
+ for INSTANCE_ID in ${INSTANCES[*]}
do
+ aws ssm send-command --region=${REGION} --instance-ids ${INSTANCE_ID} --document-name="AWS-RunShellScript" --comment="${USERNAME} triggered ${ORGANIZATION}/${REPOSITORY} > ${BRANCH} > build ${REVISION}" --parameters commands="docker run --rm -v /var/run/docker.sock:/var/run/docker.sock spotify/docker-gc"
- # Remove all unused Docker images.
- aws ssm send-command --region=${REGION} --instance-ids $i --document-name="AWS-RunShellScript" --comment="${ORGANIZATION}/${REPOSITORY}:${REVISION}" --parameters commands="docker images -q | xargs --no-run-if-empty docker rmi"
done | 17 | 0.944444 | 10 | 7 |