commit (string, 40-40) | old_file (string, 4-184) | new_file (string, 4-184) | old_contents (string, 1-3.6k) | new_contents (string, 5-3.38k) | subject (string, 15-778) | message (string, 16-6.74k) | lang (201 classes) | license (13 classes) | repos (string, 6-116k) | config (201 classes) | content (string, 137-7.24k) | diff (string, 26-5.55k) | diff_length (int64, 1-123) | relative_diff_length (float64, 0.01-89) | n_lines_added (int64, 0-108) | n_lines_deleted (int64, 0-106) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
6452b05315ef7aff70e039bca7bf12ffc489190d | metadata/com.sovworks.edslite.yml | metadata/com.sovworks.edslite.yml | Categories:
- Security
License: GPL-2.0-or-later
AuthorName: sovworks
AuthorEmail: eds@sovworks.com
WebSite: https://sovworks.com/eds/
SourceCode: https://github.com/sovworks/edslite
IssueTracker: https://github.com/sovworks/edslite/issues
Donate: https://sovworks.com/eds/donations.php
Description: |
EDS allows you to store your files in an encrypted container.
* Supports VeraCrypt(R), TrueCrypt(R), LUKS, EncFs container formats.
* Choose among different secure ciphers.
* Encrypt/decrypt any kind of file.
* All the standard file operations supported.
* You can quickly open a folder (or file) inside a container from the Home screen using the shortcut widget.
RepoType: git
Repo: https://github.com/sovworks/edslite
Builds:
- versionName: 2.0.0.224
versionCode: 224
commit: release-lite-224
subdir: app
gradle:
- liteLicCheckNoneNoinetNofsml
AutoUpdateMode: Version release-lite-%c
UpdateCheckMode: Tags
CurrentVersion: 2.0.0.224
CurrentVersionCode: 224
| Categories:
- Security
License: GPL-2.0-or-later
AuthorName: sovworks
AuthorEmail: eds@sovworks.com
WebSite: https://sovworks.com/eds/
SourceCode: https://github.com/sovworks/edslite
IssueTracker: https://github.com/sovworks/edslite/issues
Donate: https://sovworks.com/eds/donations.php
Description: |
EDS allows you to store your files in an encrypted container.
* Supports VeraCrypt(R), TrueCrypt(R), LUKS, EncFs container formats.
* Choose among different secure ciphers.
* Encrypt/decrypt any kind of file.
* All the standard file operations supported.
* You can quickly open a folder (or file) inside a container from the Home screen using the shortcut widget.
RepoType: git
Repo: https://github.com/sovworks/edslite
Builds:
- versionName: 2.0.0.224
versionCode: 224
commit: release-lite-224
subdir: app
gradle:
- liteLicCheckNoneNoinetNofsml
- versionName: 2.0.0.237
versionCode: 237
commit: 8a0c0d3e81804d37ff593a26414b7470440b2665
subdir: app
gradle:
- liteLicCheckNoneNoinetNofsml
ndk: r13b
AutoUpdateMode: Version release-lite-%c
UpdateCheckMode: Tags
CurrentVersion: 2.0.0.237
CurrentVersionCode: 237
| Update EDS Lite to 2.0.0.237 (237) | Update EDS Lite to 2.0.0.237 (237)
| YAML | agpl-3.0 | f-droid/fdroiddata,f-droid/fdroiddata | yaml | ## Code Before:
Categories:
- Security
License: GPL-2.0-or-later
AuthorName: sovworks
AuthorEmail: eds@sovworks.com
WebSite: https://sovworks.com/eds/
SourceCode: https://github.com/sovworks/edslite
IssueTracker: https://github.com/sovworks/edslite/issues
Donate: https://sovworks.com/eds/donations.php
Description: |
EDS allows you to store your files in an encrypted container.
* Supports VeraCrypt(R), TrueCrypt(R), LUKS, EncFs container formats.
* Choose among different secure ciphers.
* Encrypt/decrypt any kind of file.
* All the standard file operations supported.
* You can quickly open a folder (or file) inside a container from the Home screen using the shortcut widget.
RepoType: git
Repo: https://github.com/sovworks/edslite
Builds:
- versionName: 2.0.0.224
versionCode: 224
commit: release-lite-224
subdir: app
gradle:
- liteLicCheckNoneNoinetNofsml
AutoUpdateMode: Version release-lite-%c
UpdateCheckMode: Tags
CurrentVersion: 2.0.0.224
CurrentVersionCode: 224
## Instruction:
Update EDS Lite to 2.0.0.237 (237)
## Code After:
Categories:
- Security
License: GPL-2.0-or-later
AuthorName: sovworks
AuthorEmail: eds@sovworks.com
WebSite: https://sovworks.com/eds/
SourceCode: https://github.com/sovworks/edslite
IssueTracker: https://github.com/sovworks/edslite/issues
Donate: https://sovworks.com/eds/donations.php
Description: |
EDS allows you to store your files in an encrypted container.
* Supports VeraCrypt(R), TrueCrypt(R), LUKS, EncFs container formats.
* Choose among different secure ciphers.
* Encrypt/decrypt any kind of file.
* All the standard file operations supported.
* You can quickly open a folder (or file) inside a container from the Home screen using the shortcut widget.
RepoType: git
Repo: https://github.com/sovworks/edslite
Builds:
- versionName: 2.0.0.224
versionCode: 224
commit: release-lite-224
subdir: app
gradle:
- liteLicCheckNoneNoinetNofsml
- versionName: 2.0.0.237
versionCode: 237
commit: 8a0c0d3e81804d37ff593a26414b7470440b2665
subdir: app
gradle:
- liteLicCheckNoneNoinetNofsml
ndk: r13b
AutoUpdateMode: Version release-lite-%c
UpdateCheckMode: Tags
CurrentVersion: 2.0.0.237
CurrentVersionCode: 237
| Categories:
- Security
License: GPL-2.0-or-later
AuthorName: sovworks
AuthorEmail: eds@sovworks.com
WebSite: https://sovworks.com/eds/
SourceCode: https://github.com/sovworks/edslite
IssueTracker: https://github.com/sovworks/edslite/issues
Donate: https://sovworks.com/eds/donations.php
Description: |
EDS allows you to store your files in an encrypted container.
* Supports VeraCrypt(R), TrueCrypt(R), LUKS, EncFs container formats.
* Choose among different secure ciphers.
* Encrypt/decrypt any kind of file.
* All the standard file operations supported.
* You can quickly open a folder (or file) inside a container from the Home screen using the shortcut widget.
RepoType: git
Repo: https://github.com/sovworks/edslite
Builds:
- versionName: 2.0.0.224
versionCode: 224
commit: release-lite-224
subdir: app
gradle:
- liteLicCheckNoneNoinetNofsml
+ - versionName: 2.0.0.237
+ versionCode: 237
+ commit: 8a0c0d3e81804d37ff593a26414b7470440b2665
+ subdir: app
+ gradle:
+ - liteLicCheckNoneNoinetNofsml
+ ndk: r13b
+
AutoUpdateMode: Version release-lite-%c
UpdateCheckMode: Tags
- CurrentVersion: 2.0.0.224
? ^^
+ CurrentVersion: 2.0.0.237
? ^^
- CurrentVersionCode: 224
? ^^
+ CurrentVersionCode: 237
? ^^
| 12 | 0.352941 | 10 | 2 |
2e28cf549bd7de29143c317871008b3115e44975 | tests/vstb-example-html5/tests/rotate.py | tests/vstb-example-html5/tests/rotate.py | from stbt import press, wait_for_match
def wait_for_vstb_startup():
wait_for_match('stb-tester-350px.png')
def test_that_image_is_rotated_by_arrows():
press("KEY_LEFT")
wait_for_match('stb-tester-left.png')
press("KEY_RIGHT")
wait_for_match('stb-tester-right.png')
press("KEY_UP")
wait_for_match('stb-tester-up.png')
press("KEY_DOWN")
wait_for_match('stb-tester-down.png')
def test_that_image_returns_to_normal_on_OK():
press("KEY_OK")
wait_for_match('stb-tester-350px.png')
def test_that_custom_key_is_recognised():
press("KEY_CUSTOM")
wait_for_match('stb-tester-up.png', timeout_secs=1)
| from stbt import press, wait_for_match
def wait_for_vstb_startup():
wait_for_match('stb-tester-350px.png', timeout_secs=20)
def test_that_image_is_rotated_by_arrows():
press("KEY_LEFT")
wait_for_match('stb-tester-left.png')
press("KEY_RIGHT")
wait_for_match('stb-tester-right.png')
press("KEY_UP")
wait_for_match('stb-tester-up.png')
press("KEY_DOWN")
wait_for_match('stb-tester-down.png')
def test_that_image_returns_to_normal_on_OK():
press("KEY_OK")
wait_for_match('stb-tester-350px.png')
def test_that_custom_key_is_recognised():
press("KEY_CUSTOM")
wait_for_match('stb-tester-up.png', timeout_secs=1)
| Fix virtual-stb intermittant test-failure on Travis | Fix virtual-stb intermittant test-failure on Travis
test_that_virtual_stb_configures_stb_tester_for_testing_virtual_stbs fails
intermittently on Travis because sometimes chrome takes longer than 10s to
start-up. This causes the test to fail with:
> MatchTimeout: Didn't find match for '.../stb-tester-350px.png' within 10
> seconds
This commit should fix that issue.
| Python | lgpl-2.1 | LewisHaley/stb-tester,LewisHaley/stb-tester,martynjarvis/stb-tester,martynjarvis/stb-tester,LewisHaley/stb-tester,LewisHaley/stb-tester,LewisHaley/stb-tester,stb-tester/stb-tester,stb-tester/stb-tester,LewisHaley/stb-tester,LewisHaley/stb-tester,stb-tester/stb-tester,martynjarvis/stb-tester,stb-tester/stb-tester,martynjarvis/stb-tester,martynjarvis/stb-tester,martynjarvis/stb-tester,martynjarvis/stb-tester | python | ## Code Before:
from stbt import press, wait_for_match
def wait_for_vstb_startup():
wait_for_match('stb-tester-350px.png')
def test_that_image_is_rotated_by_arrows():
press("KEY_LEFT")
wait_for_match('stb-tester-left.png')
press("KEY_RIGHT")
wait_for_match('stb-tester-right.png')
press("KEY_UP")
wait_for_match('stb-tester-up.png')
press("KEY_DOWN")
wait_for_match('stb-tester-down.png')
def test_that_image_returns_to_normal_on_OK():
press("KEY_OK")
wait_for_match('stb-tester-350px.png')
def test_that_custom_key_is_recognised():
press("KEY_CUSTOM")
wait_for_match('stb-tester-up.png', timeout_secs=1)
## Instruction:
Fix virtual-stb intermittant test-failure on Travis
test_that_virtual_stb_configures_stb_tester_for_testing_virtual_stbs fails
intermittently on Travis because sometimes chrome takes longer than 10s to
start-up. This causes the test to fail with:
> MatchTimeout: Didn't find match for '.../stb-tester-350px.png' within 10
> seconds
This commit should fix that issue.
## Code After:
from stbt import press, wait_for_match
def wait_for_vstb_startup():
wait_for_match('stb-tester-350px.png', timeout_secs=20)
def test_that_image_is_rotated_by_arrows():
press("KEY_LEFT")
wait_for_match('stb-tester-left.png')
press("KEY_RIGHT")
wait_for_match('stb-tester-right.png')
press("KEY_UP")
wait_for_match('stb-tester-up.png')
press("KEY_DOWN")
wait_for_match('stb-tester-down.png')
def test_that_image_returns_to_normal_on_OK():
press("KEY_OK")
wait_for_match('stb-tester-350px.png')
def test_that_custom_key_is_recognised():
press("KEY_CUSTOM")
wait_for_match('stb-tester-up.png', timeout_secs=1)
| from stbt import press, wait_for_match
def wait_for_vstb_startup():
- wait_for_match('stb-tester-350px.png')
+ wait_for_match('stb-tester-350px.png', timeout_secs=20)
? +++++++++++++++++
def test_that_image_is_rotated_by_arrows():
press("KEY_LEFT")
wait_for_match('stb-tester-left.png')
press("KEY_RIGHT")
wait_for_match('stb-tester-right.png')
press("KEY_UP")
wait_for_match('stb-tester-up.png')
press("KEY_DOWN")
wait_for_match('stb-tester-down.png')
def test_that_image_returns_to_normal_on_OK():
press("KEY_OK")
wait_for_match('stb-tester-350px.png')
def test_that_custom_key_is_recognised():
press("KEY_CUSTOM")
wait_for_match('stb-tester-up.png', timeout_secs=1) | 2 | 0.076923 | 1 | 1 |
626a301449dc771d4445a673b9f4dfe93228e9a8 | app/assets/stylesheets/reports_kit/reports.css.sass | app/assets/stylesheets/reports_kit/reports.css.sass | .reports_kit_report
width: 100%
padding: 1px
canvas
width: 100%
.date_range_picker
width: 180px
| .reports_kit_report
width: 100%
padding: 1px
canvas
width: 100%
max-height: 500px
.date_range_picker
width: 180px
| Fix issue with canvas height with `maintainAspectRatio: false` | Fix issue with canvas height with `maintainAspectRatio: false`
| Sass | mit | tombenner/reports_kit,tombenner/reports_kit | sass | ## Code Before:
.reports_kit_report
width: 100%
padding: 1px
canvas
width: 100%
.date_range_picker
width: 180px
## Instruction:
Fix issue with canvas height with `maintainAspectRatio: false`
## Code After:
.reports_kit_report
width: 100%
padding: 1px
canvas
width: 100%
max-height: 500px
.date_range_picker
width: 180px
| .reports_kit_report
width: 100%
padding: 1px
canvas
width: 100%
+ max-height: 500px
.date_range_picker
width: 180px | 1 | 0.142857 | 1 | 0 |
010ea4c7526f6207bcae4f25351a3cc7b65d35c6 | rollup.config.js | rollup.config.js | import { terser } from 'rollup-plugin-terser';
import typescript from 'rollup-plugin-typescript2';
export default [
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'cjs',
file: 'dist/keycode.cjs.js'
}
},
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'iife',
file: 'dist/keycode.min.js',
sourcemap: true
},
plugins: [typescript(), terser()]
},
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'esm',
file: 'dist/keycode.es6.js'
}
},
{
input: 'test-definitions.ts',
output: {
format: 'cjs',
file: 'dist/test-definitions.cjs.js'
},
plugins: [typescript()]
}
];
| import { terser } from 'rollup-plugin-terser';
import typescript from 'rollup-plugin-typescript2';
export default [
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'cjs',
file: 'dist/keycode.cjs.js'
}
},
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'iife',
file: 'dist/keycode.min.js',
sourcemap: true
},
plugins: [typescript(), terser()]
},
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'esm',
file: 'dist/keycode.es6.js'
}
},
{
input: 'test-definitions.ts',
output: {
name: 'KeyCodeJSTestDefinitions',
format: 'umd',
file: 'dist/test-definitions.umd.js'
},
plugins: [typescript()]
}
];
| Build test definitions as umd | Build test definitions as umd
| JavaScript | mit | kabirbaidhya/keycode-js,kabirbaidhya/keycode-js,kabirbaidhya/keycode-js,kabirbaidhya/keycode-js | javascript | ## Code Before:
import { terser } from 'rollup-plugin-terser';
import typescript from 'rollup-plugin-typescript2';
export default [
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'cjs',
file: 'dist/keycode.cjs.js'
}
},
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'iife',
file: 'dist/keycode.min.js',
sourcemap: true
},
plugins: [typescript(), terser()]
},
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'esm',
file: 'dist/keycode.es6.js'
}
},
{
input: 'test-definitions.ts',
output: {
format: 'cjs',
file: 'dist/test-definitions.cjs.js'
},
plugins: [typescript()]
}
];
## Instruction:
Build test definitions as umd
## Code After:
import { terser } from 'rollup-plugin-terser';
import typescript from 'rollup-plugin-typescript2';
export default [
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'cjs',
file: 'dist/keycode.cjs.js'
}
},
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'iife',
file: 'dist/keycode.min.js',
sourcemap: true
},
plugins: [typescript(), terser()]
},
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'esm',
file: 'dist/keycode.es6.js'
}
},
{
input: 'test-definitions.ts',
output: {
name: 'KeyCodeJSTestDefinitions',
format: 'umd',
file: 'dist/test-definitions.umd.js'
},
plugins: [typescript()]
}
];
| import { terser } from 'rollup-plugin-terser';
import typescript from 'rollup-plugin-typescript2';
export default [
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'cjs',
file: 'dist/keycode.cjs.js'
}
},
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'iife',
file: 'dist/keycode.min.js',
sourcemap: true
},
plugins: [typescript(), terser()]
},
{
input: 'mod.ts',
output: {
name: 'KeyCode',
format: 'esm',
file: 'dist/keycode.es6.js'
}
},
{
input: 'test-definitions.ts',
output: {
+ name: 'KeyCodeJSTestDefinitions',
- format: 'cjs',
? ^^^
+ format: 'umd',
? ^^^
- file: 'dist/test-definitions.cjs.js'
? ^^^
+ file: 'dist/test-definitions.umd.js'
? ^^^
},
plugins: [typescript()]
}
]; | 5 | 0.128205 | 3 | 2 |
03b4739226cc198eaf262363b14e0d4ee477e686 | interpol.gemspec | interpol.gemspec | require File.expand_path('../lib/interpol/version', __FILE__)
Gem::Specification.new do |gem|
gem.authors = ["Myron Marston"]
gem.email = ["myron.marston@gmail.com"]
gem.description = %q{Interpol is a toolkit for working with API endpoint definition files, giving you a stub app, a schema validation middleware, and browsable documentation.}
gem.summary = %q{Police your HTTP JSON interface with interpol.}
gem.homepage = ""
gem.files = %w(README.md) + Dir.glob("lib/**/*.rb")
gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
gem.name = "interpol"
gem.require_paths = ["lib"]
gem.version = Interpol::VERSION
gem.add_dependency 'sinatra', '>= 1.3.2', '< 2.0.0'
gem.add_dependency 'json-schema', '~> 1.0.5'
gem.add_development_dependency 'rspec', '~> 2.9'
gem.add_development_dependency 'rspec-fire', '~> 0.4'
gem.add_development_dependency 'simplecov', '~> 0.6'
gem.add_development_dependency 'cane', '~> 1.2'
gem.add_development_dependency 'rake', '~> 0.9.2.2'
end
| require File.expand_path('../lib/interpol/version', __FILE__)
Gem::Specification.new do |gem|
gem.authors = ["Myron Marston"]
gem.email = ["myron.marston@gmail.com"]
gem.description = %q{Interpol is a toolkit for working with API endpoint definition files, giving you a stub app, a schema validation middleware, and browsable documentation.}
gem.summary = %q{Police your HTTP JSON interface with interpol.}
gem.homepage = ""
gem.files = %w(README.md License Gemfile Rakefile) + Dir.glob("lib/**/*.rb")
gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
gem.name = "interpol"
gem.require_paths = ["lib"]
gem.version = Interpol::VERSION
gem.add_dependency 'sinatra', '>= 1.3.2', '< 2.0.0'
gem.add_dependency 'json-schema', '~> 1.0.5'
gem.add_development_dependency 'rspec', '~> 2.9'
gem.add_development_dependency 'rspec-fire', '~> 0.4'
gem.add_development_dependency 'simplecov', '~> 0.6'
gem.add_development_dependency 'cane', '~> 1.2'
gem.add_development_dependency 'rake', '~> 0.9.2.2'
end
| Add additional files to gemspec. | Add additional files to gemspec.
[ci skip] | Ruby | mit | Health123/interpol,Health123/interpol,seomoz/interpol,seomoz/interpol,Health123/interpol,seomoz/interpol | ruby | ## Code Before:
require File.expand_path('../lib/interpol/version', __FILE__)
Gem::Specification.new do |gem|
gem.authors = ["Myron Marston"]
gem.email = ["myron.marston@gmail.com"]
gem.description = %q{Interpol is a toolkit for working with API endpoint definition files, giving you a stub app, a schema validation middleware, and browsable documentation.}
gem.summary = %q{Police your HTTP JSON interface with interpol.}
gem.homepage = ""
gem.files = %w(README.md) + Dir.glob("lib/**/*.rb")
gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
gem.name = "interpol"
gem.require_paths = ["lib"]
gem.version = Interpol::VERSION
gem.add_dependency 'sinatra', '>= 1.3.2', '< 2.0.0'
gem.add_dependency 'json-schema', '~> 1.0.5'
gem.add_development_dependency 'rspec', '~> 2.9'
gem.add_development_dependency 'rspec-fire', '~> 0.4'
gem.add_development_dependency 'simplecov', '~> 0.6'
gem.add_development_dependency 'cane', '~> 1.2'
gem.add_development_dependency 'rake', '~> 0.9.2.2'
end
## Instruction:
Add additional files to gemspec.
[ci skip]
## Code After:
require File.expand_path('../lib/interpol/version', __FILE__)
Gem::Specification.new do |gem|
gem.authors = ["Myron Marston"]
gem.email = ["myron.marston@gmail.com"]
gem.description = %q{Interpol is a toolkit for working with API endpoint definition files, giving you a stub app, a schema validation middleware, and browsable documentation.}
gem.summary = %q{Police your HTTP JSON interface with interpol.}
gem.homepage = ""
gem.files = %w(README.md License Gemfile Rakefile) + Dir.glob("lib/**/*.rb")
gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
gem.name = "interpol"
gem.require_paths = ["lib"]
gem.version = Interpol::VERSION
gem.add_dependency 'sinatra', '>= 1.3.2', '< 2.0.0'
gem.add_dependency 'json-schema', '~> 1.0.5'
gem.add_development_dependency 'rspec', '~> 2.9'
gem.add_development_dependency 'rspec-fire', '~> 0.4'
gem.add_development_dependency 'simplecov', '~> 0.6'
gem.add_development_dependency 'cane', '~> 1.2'
gem.add_development_dependency 'rake', '~> 0.9.2.2'
end
| require File.expand_path('../lib/interpol/version', __FILE__)
Gem::Specification.new do |gem|
gem.authors = ["Myron Marston"]
gem.email = ["myron.marston@gmail.com"]
gem.description = %q{Interpol is a toolkit for working with API endpoint definition files, giving you a stub app, a schema validation middleware, and browsable documentation.}
gem.summary = %q{Police your HTTP JSON interface with interpol.}
gem.homepage = ""
- gem.files = %w(README.md) + Dir.glob("lib/**/*.rb")
+ gem.files = %w(README.md License Gemfile Rakefile) + Dir.glob("lib/**/*.rb")
? +++++++++++++++++++++++++
gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
gem.name = "interpol"
gem.require_paths = ["lib"]
gem.version = Interpol::VERSION
gem.add_dependency 'sinatra', '>= 1.3.2', '< 2.0.0'
gem.add_dependency 'json-schema', '~> 1.0.5'
gem.add_development_dependency 'rspec', '~> 2.9'
gem.add_development_dependency 'rspec-fire', '~> 0.4'
gem.add_development_dependency 'simplecov', '~> 0.6'
gem.add_development_dependency 'cane', '~> 1.2'
gem.add_development_dependency 'rake', '~> 0.9.2.2'
end | 2 | 0.08 | 1 | 1 |
100f297b02149079f3b09bfff267d32be1acb514 | src/backends/createTestBackend.js | src/backends/createTestBackend.js | class TestBackend {
constructor(manager) {
this.actions = manager.getActions();
}
setup() {
this.didCallSetup = true;
}
teardown() {
this.didCallTeardown = true;
}
simulateBeginDrag(sourceIds, options) {
this.actions.beginDrag(sourceIds, options);
}
simulatePublishDragSource() {
this.actions.publishDragSource();
}
simulateHover(targetIds, options) {
this.actions.hover(targetIds, options);
}
simulateDrop() {
this.actions.drop();
}
simulateEndDrag() {
this.actions.endDrag();
}
}
export default function createBackend(manager) {
return new TestBackend(manager);
} | import noop from 'lodash/utility/noop';
class TestBackend {
constructor(manager) {
this.actions = manager.getActions();
}
setup() {
this.didCallSetup = true;
}
teardown() {
this.didCallTeardown = true;
}
connectDragSource() {
return noop;
}
connectDragPreview() {
return noop;
}
connectDropTarget() {
return noop;
}
simulateBeginDrag(sourceIds, options) {
this.actions.beginDrag(sourceIds, options);
}
simulatePublishDragSource() {
this.actions.publishDragSource();
}
simulateHover(targetIds, options) {
this.actions.hover(targetIds, options);
}
simulateDrop() {
this.actions.drop();
}
simulateEndDrag() {
this.actions.endDrag();
}
}
export default function createBackend(manager) {
return new TestBackend(manager);
} | Add noop connect* methods to the test backend | Add noop connect* methods to the test backend
| JavaScript | mit | zetkin/dnd-core,gaearon/dnd-core,randrianov/dnd-core,zetkin/dnd-core,mattiasgronlund/dnd-core | javascript | ## Code Before:
class TestBackend {
constructor(manager) {
this.actions = manager.getActions();
}
setup() {
this.didCallSetup = true;
}
teardown() {
this.didCallTeardown = true;
}
simulateBeginDrag(sourceIds, options) {
this.actions.beginDrag(sourceIds, options);
}
simulatePublishDragSource() {
this.actions.publishDragSource();
}
simulateHover(targetIds, options) {
this.actions.hover(targetIds, options);
}
simulateDrop() {
this.actions.drop();
}
simulateEndDrag() {
this.actions.endDrag();
}
}
export default function createBackend(manager) {
return new TestBackend(manager);
}
## Instruction:
Add noop connect* methods to the test backend
## Code After:
import noop from 'lodash/utility/noop';
class TestBackend {
constructor(manager) {
this.actions = manager.getActions();
}
setup() {
this.didCallSetup = true;
}
teardown() {
this.didCallTeardown = true;
}
connectDragSource() {
return noop;
}
connectDragPreview() {
return noop;
}
connectDropTarget() {
return noop;
}
simulateBeginDrag(sourceIds, options) {
this.actions.beginDrag(sourceIds, options);
}
simulatePublishDragSource() {
this.actions.publishDragSource();
}
simulateHover(targetIds, options) {
this.actions.hover(targetIds, options);
}
simulateDrop() {
this.actions.drop();
}
simulateEndDrag() {
this.actions.endDrag();
}
}
export default function createBackend(manager) {
return new TestBackend(manager);
} | + import noop from 'lodash/utility/noop';
+
class TestBackend {
constructor(manager) {
this.actions = manager.getActions();
}
setup() {
this.didCallSetup = true;
}
teardown() {
this.didCallTeardown = true;
+ }
+
+ connectDragSource() {
+ return noop;
+ }
+
+ connectDragPreview() {
+ return noop;
+ }
+
+ connectDropTarget() {
+ return noop;
}
simulateBeginDrag(sourceIds, options) {
this.actions.beginDrag(sourceIds, options);
}
simulatePublishDragSource() {
this.actions.publishDragSource();
}
simulateHover(targetIds, options) {
this.actions.hover(targetIds, options);
}
simulateDrop() {
this.actions.drop();
}
simulateEndDrag() {
this.actions.endDrag();
}
}
export default function createBackend(manager) {
return new TestBackend(manager);
} | 14 | 0.378378 | 14 | 0 |
9a81879bd4eb01be5ed74acfdaf22acb635a9817 | pikalang/__init__.py | pikalang/__init__.py |
import sys
import os
from pikalang.interpreter import PikalangProgram
def load_source(file):
if os.path.isfile(file):
if os.path.splitext(file)[1] == ".pokeball":
with open(file, "r") as pikalang_file:
pikalang_data = pikalang_file.read()
return pikalang_data
else:
print("pikalang: file is not a pokeball", file=sys.stderr)
return False
else:
print("pikalang: file does not exist", file=sys.stderr)
return False
def evaluate(source):
"""Run Pikalang system."""
program = PikalangProgram(source)
program.run()
|
from __future__ import print_function
import sys
import os
from pikalang.interpreter import PikalangProgram
def load_source(file):
if os.path.isfile(file):
if os.path.splitext(file)[1] == ".pokeball":
with open(file, "r") as pikalang_file:
pikalang_data = pikalang_file.read()
return pikalang_data
else:
print("pikalang: file is not a pokeball", file=sys.stderr)
return False
else:
print("pikalang: file does not exist", file=sys.stderr)
return False
def evaluate(source):
"""Run Pikalang system."""
program = PikalangProgram(source)
program.run()
| Add proper printing for py2 | Add proper printing for py2
| Python | mit | groteworld/pikalang,grotewold/pikalang | python | ## Code Before:
import sys
import os
from pikalang.interpreter import PikalangProgram
def load_source(file):
if os.path.isfile(file):
if os.path.splitext(file)[1] == ".pokeball":
with open(file, "r") as pikalang_file:
pikalang_data = pikalang_file.read()
return pikalang_data
else:
print("pikalang: file is not a pokeball", file=sys.stderr)
return False
else:
print("pikalang: file does not exist", file=sys.stderr)
return False
def evaluate(source):
"""Run Pikalang system."""
program = PikalangProgram(source)
program.run()
## Instruction:
Add proper printing for py2
## Code After:
from __future__ import print_function
import sys
import os
from pikalang.interpreter import PikalangProgram
def load_source(file):
if os.path.isfile(file):
if os.path.splitext(file)[1] == ".pokeball":
with open(file, "r") as pikalang_file:
pikalang_data = pikalang_file.read()
return pikalang_data
else:
print("pikalang: file is not a pokeball", file=sys.stderr)
return False
else:
print("pikalang: file does not exist", file=sys.stderr)
return False
def evaluate(source):
"""Run Pikalang system."""
program = PikalangProgram(source)
program.run()
| +
+ from __future__ import print_function
import sys
import os
from pikalang.interpreter import PikalangProgram
def load_source(file):
if os.path.isfile(file):
if os.path.splitext(file)[1] == ".pokeball":
with open(file, "r") as pikalang_file:
pikalang_data = pikalang_file.read()
return pikalang_data
else:
print("pikalang: file is not a pokeball", file=sys.stderr)
return False
else:
print("pikalang: file does not exist", file=sys.stderr)
return False
def evaluate(source):
"""Run Pikalang system."""
program = PikalangProgram(source)
program.run() | 2 | 0.071429 | 2 | 0 |
075bd187f35f50c72f888550c27ea6e7ab270f57 | src/Oro/Bundle/EmailBundle/Resources/views/Email/myEmails.html.twig | src/Oro/Bundle/EmailBundle/Resources/views/Email/myEmails.html.twig | {% extends 'OroUIBundle:actions:index.html.twig' %}
{% import 'OroUIBundle::macros.html.twig' as UI %}
{% set pageTitle = 'oro.email.entity_plural_label'|trans %}
{% set gridName = 'users-email-grid' %}
{# grid params #}
{% set params = {'userId': app.user.id} %}
| {% extends 'OroUIBundle:actions:index.html.twig' %}
{% set pageTitle = 'oro.email.entity_plural_label'|trans %}
{% set gridName = 'users-email-grid' %}
{# grid params #}
{% set params = {'userId': app.user.id} %}
| Create email index page - remove not used macros | BAP-4225: Create email index page
- remove not used macros
| Twig | mit | Djamy/platform,geoffroycochard/platform,hugeval/platform,ramunasd/platform,trustify/oroplatform,Djamy/platform,orocrm/platform,2ndkauboy/platform,morontt/platform,northdakota/platform,geoffroycochard/platform,hugeval/platform,orocrm/platform,ramunasd/platform,hugeval/platform,trustify/oroplatform,ramunasd/platform,geoffroycochard/platform,orocrm/platform,2ndkauboy/platform,morontt/platform,trustify/oroplatform,morontt/platform,northdakota/platform,2ndkauboy/platform,Djamy/platform,northdakota/platform | twig | ## Code Before:
{% extends 'OroUIBundle:actions:index.html.twig' %}
{% import 'OroUIBundle::macros.html.twig' as UI %}
{% set pageTitle = 'oro.email.entity_plural_label'|trans %}
{% set gridName = 'users-email-grid' %}
{# grid params #}
{% set params = {'userId': app.user.id} %}
## Instruction:
BAP-4225: Create email index page
- remove not used macros
## Code After:
{% extends 'OroUIBundle:actions:index.html.twig' %}
{% set pageTitle = 'oro.email.entity_plural_label'|trans %}
{% set gridName = 'users-email-grid' %}
{# grid params #}
{% set params = {'userId': app.user.id} %}
| {% extends 'OroUIBundle:actions:index.html.twig' %}
- {% import 'OroUIBundle::macros.html.twig' as UI %}
{% set pageTitle = 'oro.email.entity_plural_label'|trans %}
{% set gridName = 'users-email-grid' %}
{# grid params #}
{% set params = {'userId': app.user.id} %} | 1 | 0.142857 | 0 | 1 |
6f0cfb0e09a602175c0b5e6b90b8f8a2bd90f96f | index.js | index.js | import {default as color, Color} from "./src/color";
import {default as rgb, Rgb} from "./src/rgb";
import {default as hsl, Hsl} from "./src/hsl";
import {default as lab, Lab} from "./src/lab";
import {default as hcl, Hcl} from "./src/hcl";
import {default as cubehelix, Cubehelix} from "./src/cubehelix";
import interpolateRgb from "./src/interpolateRgb";
import interpolateHsl from "./src/interpolateHsl";
import interpolateHslLong from "./src/interpolateHslLong";
import interpolateLab from "./src/interpolateLab";
import interpolateHcl from "./src/interpolateHcl";
import interpolateHclLong from "./src/interpolateHclLong";
import interpolateCubehelixGamma from "./src/interpolateCubehelixGamma";
import interpolateCubehelixGammaLong from "./src/interpolateCubehelixGammaLong";
export var interpolateCubehelix = interpolateCubehelixGamma(1);
export var interpolateCubehelixLong = interpolateCubehelixGammaLong(1);
export {
color,
rgb,
hsl,
lab,
hcl,
cubehelix,
interpolateRgb,
interpolateHsl,
interpolateHslLong,
interpolateLab,
interpolateHcl,
interpolateHclLong,
interpolateCubehelixGamma,
interpolateCubehelixGammaLong
};
| import color from "./src/color";
import rgb from "./src/rgb";
import hsl from "./src/hsl";
import lab from "./src/lab";
import hcl from "./src/hcl";
import cubehelix from "./src/cubehelix";
import interpolateRgb from "./src/interpolateRgb";
import interpolateHsl from "./src/interpolateHsl";
import interpolateHslLong from "./src/interpolateHslLong";
import interpolateLab from "./src/interpolateLab";
import interpolateHcl from "./src/interpolateHcl";
import interpolateHclLong from "./src/interpolateHclLong";
import interpolateCubehelixGamma from "./src/interpolateCubehelixGamma";
import interpolateCubehelixGammaLong from "./src/interpolateCubehelixGammaLong";
export var interpolateCubehelix = interpolateCubehelixGamma(1);
export var interpolateCubehelixLong = interpolateCubehelixGammaLong(1);
export {
color,
rgb,
hsl,
lab,
hcl,
cubehelix,
interpolateRgb,
interpolateHsl,
interpolateHslLong,
interpolateLab,
interpolateHcl,
interpolateHclLong,
interpolateCubehelixGamma,
interpolateCubehelixGammaLong
};
| Use just the default exports | Use just the default exports
| JavaScript | isc | d3/d3-color | javascript | ## Code Before:
import {default as color, Color} from "./src/color";
import {default as rgb, Rgb} from "./src/rgb";
import {default as hsl, Hsl} from "./src/hsl";
import {default as lab, Lab} from "./src/lab";
import {default as hcl, Hcl} from "./src/hcl";
import {default as cubehelix, Cubehelix} from "./src/cubehelix";
import interpolateRgb from "./src/interpolateRgb";
import interpolateHsl from "./src/interpolateHsl";
import interpolateHslLong from "./src/interpolateHslLong";
import interpolateLab from "./src/interpolateLab";
import interpolateHcl from "./src/interpolateHcl";
import interpolateHclLong from "./src/interpolateHclLong";
import interpolateCubehelixGamma from "./src/interpolateCubehelixGamma";
import interpolateCubehelixGammaLong from "./src/interpolateCubehelixGammaLong";
export var interpolateCubehelix = interpolateCubehelixGamma(1);
export var interpolateCubehelixLong = interpolateCubehelixGammaLong(1);
export {
color,
rgb,
hsl,
lab,
hcl,
cubehelix,
interpolateRgb,
interpolateHsl,
interpolateHslLong,
interpolateLab,
interpolateHcl,
interpolateHclLong,
interpolateCubehelixGamma,
interpolateCubehelixGammaLong
};
## Instruction:
Use just the default exports
## Code After:
import color from "./src/color";
import rgb from "./src/rgb";
import hsl from "./src/hsl";
import lab from "./src/lab";
import hcl from "./src/hcl";
import cubehelix from "./src/cubehelix";
import interpolateRgb from "./src/interpolateRgb";
import interpolateHsl from "./src/interpolateHsl";
import interpolateHslLong from "./src/interpolateHslLong";
import interpolateLab from "./src/interpolateLab";
import interpolateHcl from "./src/interpolateHcl";
import interpolateHclLong from "./src/interpolateHclLong";
import interpolateCubehelixGamma from "./src/interpolateCubehelixGamma";
import interpolateCubehelixGammaLong from "./src/interpolateCubehelixGammaLong";
export var interpolateCubehelix = interpolateCubehelixGamma(1);
export var interpolateCubehelixLong = interpolateCubehelixGammaLong(1);
export {
color,
rgb,
hsl,
lab,
hcl,
cubehelix,
interpolateRgb,
interpolateHsl,
interpolateHslLong,
interpolateLab,
interpolateHcl,
interpolateHclLong,
interpolateCubehelixGamma,
interpolateCubehelixGammaLong
};
| - import {default as color, Color} from "./src/color";
? ------------ --------
+ import color from "./src/color";
- import {default as rgb, Rgb} from "./src/rgb";
? ------------ ------
+ import rgb from "./src/rgb";
- import {default as hsl, Hsl} from "./src/hsl";
? ------------ ------
+ import hsl from "./src/hsl";
- import {default as lab, Lab} from "./src/lab";
? ------------ ------
+ import lab from "./src/lab";
- import {default as hcl, Hcl} from "./src/hcl";
? ------------ ------
+ import hcl from "./src/hcl";
- import {default as cubehelix, Cubehelix} from "./src/cubehelix";
? ------------ ------------
+ import cubehelix from "./src/cubehelix";
import interpolateRgb from "./src/interpolateRgb";
import interpolateHsl from "./src/interpolateHsl";
import interpolateHslLong from "./src/interpolateHslLong";
import interpolateLab from "./src/interpolateLab";
import interpolateHcl from "./src/interpolateHcl";
import interpolateHclLong from "./src/interpolateHclLong";
import interpolateCubehelixGamma from "./src/interpolateCubehelixGamma";
import interpolateCubehelixGammaLong from "./src/interpolateCubehelixGammaLong";
export var interpolateCubehelix = interpolateCubehelixGamma(1);
export var interpolateCubehelixLong = interpolateCubehelixGammaLong(1);
export {
color,
rgb,
hsl,
lab,
hcl,
cubehelix,
interpolateRgb,
interpolateHsl,
interpolateHslLong,
interpolateLab,
interpolateHcl,
interpolateHclLong,
interpolateCubehelixGamma,
interpolateCubehelixGammaLong
}; | 12 | 0.352941 | 6 | 6 |
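The diff column of these records (interleaved `- `/`+ ` rows with `? ` intraline hint rows, as in the d3-color entry above) matches the output style of Python's `difflib.ndiff`. Assuming that is how the diffs were generated, a minimal sketch reproduces the format; the line pair is taken from the record above:

```python
import difflib

old = ['import {default as color, Color} from "./src/color";\n']
new = ['import color from "./src/color";\n']

# ndiff emits the removed line ("- "), an optional "? " hint row marking
# the changed character positions, then the added line ("+ ") -- the same
# marker style seen in the diff column of each record in this dump.
diff = list(difflib.ndiff(old, new))
print("".join(diff))
```

The `? ` rows are hints only; they are not part of either file's content.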
9ef134cb756009260e05e219bcecbe45d2729e9f | exp/wsj/configs/wsj_reward1.yaml | exp/wsj/configs/wsj_reward1.yaml | parent: $LVSR/exp/wsj/configs/wsj_paper1.yaml
net:
criterion:
name: mse_gain
min_reward: -5
training:
scale: 0.01
initialization:
/recognizer:
weights_init:
!!python/object/apply:blocks.initialization.Uniform [0.0, 0.1]
/recognizer/generator/readout/post_merge/mlp:
biases_init:
!!python/object/apply:blocks.initialization.Constant [-1.0]
monitoring:
search:
round_to_inf: 4.5
stages:
pretraining:
training:
num_epochs: 4
annealing2: null
| parent: $LVSR/exp/wsj/configs/wsj_paper1.yaml
net:
criterion:
name: mse_gain
min_reward: -5
training:
scale: 0.01
initialization:
/recognizer:
weights_init:
!!python/object/apply:blocks.initialization.Uniform [0.0, 0.1]
/recognizer/generator/readout/post_merge/mlp:
biases_init:
!!python/object/apply:blocks.initialization.Constant [-1.0]
monitoring:
search:
round_to_inf: 4.5
stop_on: patience
stages:
pretraining:
training:
num_epochs: 4
annealing2: null
| Fix stop_on for wsj_reward model | Fix stop_on for wsj_reward model
| YAML | mit | rizar/attention-lvcsr,rizar/attention-lvcsr,rizar/attention-lvcsr,rizar/attention-lvcsr,rizar/attention-lvcsr | yaml | ## Code Before:
parent: $LVSR/exp/wsj/configs/wsj_paper1.yaml
net:
criterion:
name: mse_gain
min_reward: -5
training:
scale: 0.01
initialization:
/recognizer:
weights_init:
!!python/object/apply:blocks.initialization.Uniform [0.0, 0.1]
/recognizer/generator/readout/post_merge/mlp:
biases_init:
!!python/object/apply:blocks.initialization.Constant [-1.0]
monitoring:
search:
round_to_inf: 4.5
stages:
pretraining:
training:
num_epochs: 4
annealing2: null
## Instruction:
Fix stop_on for wsj_reward model
## Code After:
parent: $LVSR/exp/wsj/configs/wsj_paper1.yaml
net:
criterion:
name: mse_gain
min_reward: -5
training:
scale: 0.01
initialization:
/recognizer:
weights_init:
!!python/object/apply:blocks.initialization.Uniform [0.0, 0.1]
/recognizer/generator/readout/post_merge/mlp:
biases_init:
!!python/object/apply:blocks.initialization.Constant [-1.0]
monitoring:
search:
round_to_inf: 4.5
stop_on: patience
stages:
pretraining:
training:
num_epochs: 4
annealing2: null
| parent: $LVSR/exp/wsj/configs/wsj_paper1.yaml
net:
criterion:
name: mse_gain
min_reward: -5
training:
scale: 0.01
initialization:
/recognizer:
weights_init:
!!python/object/apply:blocks.initialization.Uniform [0.0, 0.1]
/recognizer/generator/readout/post_merge/mlp:
biases_init:
!!python/object/apply:blocks.initialization.Constant [-1.0]
monitoring:
search:
round_to_inf: 4.5
+ stop_on: patience
stages:
pretraining:
training:
num_epochs: 4
annealing2: null | 1 | 0.045455 | 1 | 0 |
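The wsj_reward1.yaml record above is a child config that names a `parent:` file and lists only overrides (here, adding `stop_on: patience` under `monitoring.search`). The record does not show how LVSR resolves that inheritance; the usual pattern is a recursive overlay of child keys onto the parent, sketched below as a hypothetical illustration (the merge rule is an assumption, not LVSR's actual loader):

```python
def deep_merge(parent, child):
    # Overlay child keys onto the parent, recursing into nested dicts so
    # that sibling keys from the parent (round_to_inf) survive next to the
    # child's addition (stop_on). This mirrors the YAML override pattern
    # above; LVSR's real resolution logic is assumed, not quoted.
    merged = dict(parent)
    for key, value in child.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

parent = {"monitoring": {"search": {"round_to_inf": 4.5}}}
child = {"monitoring": {"search": {"stop_on": "patience"}}}
print(deep_merge(parent, child))
```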
7852b90cfe1cea1e0cdaa19d490c83f0d8684b50 | doc/ReleaseNotes4.13.1.md | doc/ReleaseNotes4.13.1.md |
Prior to this change, custom runners could make `FrameworkMethod` instances, but not `FrameworkField` instances. This small change allows for both now, because `FrameworkField`'s constructor has been promoted from package-private to public.
|
A local information disclosure vulnerability in `TemporaryFolder` has been fixed. See the published [security advisory](https://github.com/junit-team/junit4/security/advisories/GHSA-269g-pwp5-87pp) for details.
# Test Runners
### [Pull request #1669:](https://github.com/junit-team/junit/pull/1669) Make `FrameworkField` constructor public
Prior to this change, custom runners could make `FrameworkMethod` instances, but not `FrameworkField` instances. This small change allows for both now, because `FrameworkField`'s constructor has been promoted from package-private to public.
| Document security fix in release notes | Document security fix in release notes
| Markdown | epl-1.0 | junit-team/junit,kcooney/junit,junit-team/junit4,stefanbirkner/junit,junit-team/junit4,stefanbirkner/junit,junit-team/junit,stefanbirkner/junit,junit-team/junit4,junit-team/junit,kcooney/junit,kcooney/junit | markdown | ## Code Before:
Prior to this change, custom runners could make `FrameworkMethod` instances, but not `FrameworkField` instances. This small change allows for both now, because `FrameworkField`'s constructor has been promoted from package-private to public.
## Instruction:
Document security fix in release notes
## Code After:
A local information disclosure vulnerability in `TemporaryFolder` has been fixed. See the published [security advisory](https://github.com/junit-team/junit4/security/advisories/GHSA-269g-pwp5-87pp) for details.
# Test Runners
### [Pull request #1669:](https://github.com/junit-team/junit/pull/1669) Make `FrameworkField` constructor public
Prior to this change, custom runners could make `FrameworkMethod` instances, but not `FrameworkField` instances. This small change allows for both now, because `FrameworkField`'s constructor has been promoted from package-private to public.
| +
+ A local information disclosure vulnerability in `TemporaryFolder` has been fixed. See the published [security advisory](https://github.com/junit-team/junit4/security/advisories/GHSA-269g-pwp5-87pp) for details.
+
+ # Test Runners
+
+ ### [Pull request #1669:](https://github.com/junit-team/junit/pull/1669) Make `FrameworkField` constructor public
Prior to this change, custom runners could make `FrameworkMethod` instances, but not `FrameworkField` instances. This small change allows for both now, because `FrameworkField`'s constructor has been promoted from package-private to public. | 6 | 3 | 6 | 0 |
e09d60380f626502532e78494314f9ed97eca7c8 | build-cutline-map.py | build-cutline-map.py | from osgeo import ogr
from osgeo import osr
from glob import glob
import os.path
driver = ogr.GetDriverByName("ESRI Shapefile")
ds = driver.CreateDataSource("summary-maps/cutline-map.shp")
srs = osr.SpatialReference()
srs.ImportFromEPSG(4326)
layer = ds.CreateLayer("tiles", srs, ogr.wkbPolygon)
field_name = ogr.FieldDefn("Name", ogr.OFTString)
field_name.SetWidth(16)
layer.CreateField(field_name)
for fn in glob("cutlines/*.json"):
tile_id = os.path.splitext(os.path.basename(fn))[0]
cutline_ds = ogr.Open(fn)
cutline_layer = cutline_ds.GetLayerByIndex(0)
cutline_feature = cutline_layer.GetNextFeature()
while cutline_feature:
poly = cutline_feature.GetGeometryRef().Clone()
feature = ogr.Feature(layer.GetLayerDefn())
feature.SetField("Name", tile_id)
feature.SetGeometry(poly)
layer.CreateFeature(feature)
feature.Destroy()
cutline_feature = cutline_layer.GetNextFeature()
ds.Destroy()
| from osgeo import ogr
from osgeo import osr
from glob import glob
import os.path
driver = ogr.GetDriverByName("ESRI Shapefile")
ds = driver.CreateDataSource("summary-maps/cutline-map.shp")
srs = osr.SpatialReference()
srs.ImportFromEPSG(26915)
cutline_srs = osr.SpatialReference()
cutline_srs.ImportFromEPSG(4326)
coord_trans = osr.CoordinateTransformation(cutline_srs, srs)
layer = ds.CreateLayer("tiles", srs, ogr.wkbPolygon)
field_name = ogr.FieldDefn("Name", ogr.OFTString)
field_name.SetWidth(16)
layer.CreateField(field_name)
for fn in glob("cutlines/*.json"):
tile_id = os.path.splitext(os.path.basename(fn))[0]
cutline_ds = ogr.Open(fn)
cutline_layer = cutline_ds.GetLayerByIndex(0)
cutline_feature = cutline_layer.GetNextFeature()
while cutline_feature:
poly = cutline_feature.GetGeometryRef().Clone()
poly.Transform(coord_trans)
feature = ogr.Feature(layer.GetLayerDefn())
feature.SetField("Name", tile_id)
feature.SetGeometry(poly)
layer.CreateFeature(feature)
feature.Destroy()
cutline_feature = cutline_layer.GetNextFeature()
ds.Destroy()
| Put cutline map in 26915 | Put cutline map in 26915
| Python | mit | simonsonc/mn-glo-mosaic,simonsonc/mn-glo-mosaic,simonsonc/mn-glo-mosaic | python | ## Code Before:
from osgeo import ogr
from osgeo import osr
from glob import glob
import os.path
driver = ogr.GetDriverByName("ESRI Shapefile")
ds = driver.CreateDataSource("summary-maps/cutline-map.shp")
srs = osr.SpatialReference()
srs.ImportFromEPSG(4326)
layer = ds.CreateLayer("tiles", srs, ogr.wkbPolygon)
field_name = ogr.FieldDefn("Name", ogr.OFTString)
field_name.SetWidth(16)
layer.CreateField(field_name)
for fn in glob("cutlines/*.json"):
tile_id = os.path.splitext(os.path.basename(fn))[0]
cutline_ds = ogr.Open(fn)
cutline_layer = cutline_ds.GetLayerByIndex(0)
cutline_feature = cutline_layer.GetNextFeature()
while cutline_feature:
poly = cutline_feature.GetGeometryRef().Clone()
feature = ogr.Feature(layer.GetLayerDefn())
feature.SetField("Name", tile_id)
feature.SetGeometry(poly)
layer.CreateFeature(feature)
feature.Destroy()
cutline_feature = cutline_layer.GetNextFeature()
ds.Destroy()
## Instruction:
Put cutline map in 26915
## Code After:
from osgeo import ogr
from osgeo import osr
from glob import glob
import os.path
driver = ogr.GetDriverByName("ESRI Shapefile")
ds = driver.CreateDataSource("summary-maps/cutline-map.shp")
srs = osr.SpatialReference()
srs.ImportFromEPSG(26915)
cutline_srs = osr.SpatialReference()
cutline_srs.ImportFromEPSG(4326)
coord_trans = osr.CoordinateTransformation(cutline_srs, srs)
layer = ds.CreateLayer("tiles", srs, ogr.wkbPolygon)
field_name = ogr.FieldDefn("Name", ogr.OFTString)
field_name.SetWidth(16)
layer.CreateField(field_name)
for fn in glob("cutlines/*.json"):
tile_id = os.path.splitext(os.path.basename(fn))[0]
cutline_ds = ogr.Open(fn)
cutline_layer = cutline_ds.GetLayerByIndex(0)
cutline_feature = cutline_layer.GetNextFeature()
while cutline_feature:
poly = cutline_feature.GetGeometryRef().Clone()
poly.Transform(coord_trans)
feature = ogr.Feature(layer.GetLayerDefn())
feature.SetField("Name", tile_id)
feature.SetGeometry(poly)
layer.CreateFeature(feature)
feature.Destroy()
cutline_feature = cutline_layer.GetNextFeature()
ds.Destroy()
| from osgeo import ogr
from osgeo import osr
from glob import glob
import os.path
driver = ogr.GetDriverByName("ESRI Shapefile")
ds = driver.CreateDataSource("summary-maps/cutline-map.shp")
srs = osr.SpatialReference()
- srs.ImportFromEPSG(4326)
? --
+ srs.ImportFromEPSG(26915)
? +++
+
+ cutline_srs = osr.SpatialReference()
+ cutline_srs.ImportFromEPSG(4326)
+
+ coord_trans = osr.CoordinateTransformation(cutline_srs, srs)
layer = ds.CreateLayer("tiles", srs, ogr.wkbPolygon)
field_name = ogr.FieldDefn("Name", ogr.OFTString)
field_name.SetWidth(16)
layer.CreateField(field_name)
for fn in glob("cutlines/*.json"):
tile_id = os.path.splitext(os.path.basename(fn))[0]
cutline_ds = ogr.Open(fn)
cutline_layer = cutline_ds.GetLayerByIndex(0)
cutline_feature = cutline_layer.GetNextFeature()
while cutline_feature:
poly = cutline_feature.GetGeometryRef().Clone()
+ poly.Transform(coord_trans)
feature = ogr.Feature(layer.GetLayerDefn())
feature.SetField("Name", tile_id)
feature.SetGeometry(poly)
layer.CreateFeature(feature)
feature.Destroy()
cutline_feature = cutline_layer.GetNextFeature()
ds.Destroy() | 8 | 0.216216 | 7 | 1 |
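Each record stores the change three times over (old contents, new contents, and the diff), so the format is self-checking: assuming the diff column is ndiff output, Python's `difflib.restore` rebuilds either side from the diff alone, with `which=1` yielding the old lines and `which=2` the new ones. A small sketch, echoing the deploy.sh record that follows:

```python
import difflib

before = ["cd src\n", "php artisan down\n"]
after = ["cd src\n", "php artisan up\n"]

delta = list(difflib.ndiff(before, after))

# restore() filters one side back out of an ndiff delta: which=1 keeps the
# "- " and context rows (the old file), which=2 the "+ " and context rows
# (the new file). Old contents plus the diff is therefore enough to
# reconstruct the new contents in these records.
rebuilt_old = list(difflib.restore(delta, 1))
rebuilt_new = list(difflib.restore(delta, 2))
print("".join(rebuilt_new), end="")
```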
7955230dbf029df59ac13b5a7d1ca81805bbb6b1 | deploy.sh | deploy.sh |
cd src
php artisan down
cd ../
git reset --hard
git checkout master
git pull
cd src
composer install --no-interaction --optimize-autoloader --no-dev
npm install --production
php artisan cache:clear
php artisan config:clear
php artisan config:cache
php artisan route:cache
# php artisan -v queue:restart
php artisan migrate --force
cd ../
chown -R www-data:www-data src
chmod -R 755 src
cd src
php artisan up |
cd src
php artisan down
cd ../
git reset --hard
git checkout master
git pull
cd src
composer install --no-interaction --optimize-autoloader --no-dev
npm install --production
npm run production
php artisan cache:clear
php artisan config:clear
php artisan config:cache
php artisan route:cache
# php artisan -v queue:restart
php artisan migrate --force
cd ../
chown -R www-data:www-data src
chmod -R 755 src
cd src
php artisan up | Add missing asset build command | Add missing asset build command
| Shell | mpl-2.0 | itsmyfirstday/ProjectCityBuild,itsmyfirstday/ProjectCityBuild,itsmyfirstday/ProjectCityBuild,itsmyfirstday/ProjectCityBuild | shell | ## Code Before:
cd src
php artisan down
cd ../
git reset --hard
git checkout master
git pull
cd src
composer install --no-interaction --optimize-autoloader --no-dev
npm install --production
php artisan cache:clear
php artisan config:clear
php artisan config:cache
php artisan route:cache
# php artisan -v queue:restart
php artisan migrate --force
cd ../
chown -R www-data:www-data src
chmod -R 755 src
cd src
php artisan up
## Instruction:
Add missing asset build command
## Code After:
cd src
php artisan down
cd ../
git reset --hard
git checkout master
git pull
cd src
composer install --no-interaction --optimize-autoloader --no-dev
npm install --production
npm run production
php artisan cache:clear
php artisan config:clear
php artisan config:cache
php artisan route:cache
# php artisan -v queue:restart
php artisan migrate --force
cd ../
chown -R www-data:www-data src
chmod -R 755 src
cd src
php artisan up |
cd src
php artisan down
cd ../
git reset --hard
git checkout master
git pull
cd src
composer install --no-interaction --optimize-autoloader --no-dev
npm install --production
+
+ npm run production
php artisan cache:clear
php artisan config:clear
php artisan config:cache
php artisan route:cache
# php artisan -v queue:restart
php artisan migrate --force
cd ../
chown -R www-data:www-data src
chmod -R 755 src
cd src
php artisan up | 2 | 0.071429 | 2 | 0 |
48d5cccd2907efda9ff18d187e175ed5ba34e93c | tasks/build.js | tasks/build.js | 'use strict';
let gulp = require('gulp'),
rename = require('gulp-rename'),
rollup = require('rollup').rollup;
module.exports = () => {
gulp.task('copy-dependencies', () => {
return gulp
.src([
'./node_modules/three/build/three.js'
])
.pipe(gulp.dest('./dist/'))
});
gulp.task('build', ['copy-dependencies'], (callback) => {
return rollup({
entry: 'src/js/main.js',
sourceMap: true,
external: ['three']
}).then(function(bundle) {
return bundle.write({
dest: 'dist/main.js',
format: 'iife',
globals: {
three: 'THREE'
}
});
});
});
};
| 'use strict';
let gulp = require('gulp'),
rollup = require('rollup').rollup;
let cache;
module.exports = () => {
gulp.task('copy-dependencies', () => {
return gulp
.src([
'./node_modules/three/build/three.js'
])
.pipe(gulp.dest('./dist/'))
});
gulp.task('build', ['copy-dependencies'], (callback) => {
return rollup({
entry: 'src/js/main.js',
cache: cache,
sourceMap: true,
external: ['three']
}).then(function(bundle) {
return bundle.write({
dest: 'dist/main.js',
sourceMap: true,
format: 'iife',
globals: {
three: 'THREE'
}
});
});
});
};
| Create a source map file | Create a source map file
| JavaScript | mit | Thunraz/LOWREZJAM2017,Thunraz/LOWREZJAM2017 | javascript | ## Code Before:
'use strict';
let gulp = require('gulp'),
rename = require('gulp-rename'),
rollup = require('rollup').rollup;
module.exports = () => {
gulp.task('copy-dependencies', () => {
return gulp
.src([
'./node_modules/three/build/three.js'
])
.pipe(gulp.dest('./dist/'))
});
gulp.task('build', ['copy-dependencies'], (callback) => {
return rollup({
entry: 'src/js/main.js',
sourceMap: true,
external: ['three']
}).then(function(bundle) {
return bundle.write({
dest: 'dist/main.js',
format: 'iife',
globals: {
three: 'THREE'
}
});
});
});
};
## Instruction:
Create a source map file
## Code After:
'use strict';
let gulp = require('gulp'),
rollup = require('rollup').rollup;
let cache;
module.exports = () => {
gulp.task('copy-dependencies', () => {
return gulp
.src([
'./node_modules/three/build/three.js'
])
.pipe(gulp.dest('./dist/'))
});
gulp.task('build', ['copy-dependencies'], (callback) => {
return rollup({
entry: 'src/js/main.js',
cache: cache,
sourceMap: true,
external: ['three']
}).then(function(bundle) {
return bundle.write({
dest: 'dist/main.js',
sourceMap: true,
format: 'iife',
globals: {
three: 'THREE'
}
});
});
});
};
| 'use strict';
let gulp = require('gulp'),
- rename = require('gulp-rename'),
rollup = require('rollup').rollup;
+
+ let cache;
module.exports = () => {
gulp.task('copy-dependencies', () => {
return gulp
.src([
'./node_modules/three/build/three.js'
])
.pipe(gulp.dest('./dist/'))
});
gulp.task('build', ['copy-dependencies'], (callback) => {
return rollup({
entry: 'src/js/main.js',
+ cache: cache,
sourceMap: true,
external: ['three']
}).then(function(bundle) {
return bundle.write({
dest: 'dist/main.js',
+ sourceMap: true,
format: 'iife',
globals: {
three: 'THREE'
}
});
});
});
}; | 5 | 0.16129 | 4 | 1 |
72bcff83f71a0fd9cb04a05386e43898c6fbf75b | hooks/lib/android-helper.js | hooks/lib/android-helper.js |
var fs = require("fs");
var path = require("path");
var utilities = require("./utilities");
module.exports = {
addFabricBuildToolsGradle: function() {
var buildGradle = utilities.readBuildGradle();
buildGradle += [
"// Fabric Cordova Plugin - Start Fabric Build Tools ",
"buildscript {",
" repositories {",
" maven { url 'https://maven.fabric.io/public' }",
" }",
" dependencies {",
" classpath 'io.fabric.tools:gradle:1.+'",
" }",
"}",
"",
"apply plugin: 'io.fabric'",
"// Fabric Cordova Plugin - End Fabric Build Tools",
].join("\n");
utilities.writeBuildGradle(buildGradle);
},
removeFabricBuildToolsFromGradle: function() {
var buildGradle = utilities.readBuildGradle();
buildGradle = buildGradle.replace(/\n\/\/ Fabric Cordova Plugin - Start Fabric Build Tools[\s\S]*\/\/ Fabric Cordova Plugin - End Fabric Build Tools\n/, "");
utilities.writeBuildGradle(buildGradle);
}
};
|
var fs = require("fs");
var path = require("path");
var utilities = require("./utilities");
module.exports = {
addFabricBuildToolsGradle: function() {
var buildGradle = utilities.readBuildGradle();
buildGradle += [
"",
"// Fabric Cordova Plugin - Start Fabric Build Tools ",
"buildscript {",
" repositories {",
" maven { url 'https://maven.fabric.io/public' }",
" }",
" dependencies {",
" classpath 'io.fabric.tools:gradle:1.+'",
" }",
"}",
"",
"apply plugin: 'io.fabric'",
"// Fabric Cordova Plugin - End Fabric Build Tools",
].join("\n");
utilities.writeBuildGradle(buildGradle);
},
removeFabricBuildToolsFromGradle: function() {
var buildGradle = utilities.readBuildGradle();
buildGradle = buildGradle.replace(/\n\/\/ Fabric Cordova Plugin - Start Fabric Build Tools[\s\S]*\/\/ Fabric Cordova Plugin - End Fabric Build Tools/, "");
utilities.writeBuildGradle(buildGradle);
}
};
| Fix hook to remove build.gradle config on uninstall. | Fix hook to remove build.gradle config on uninstall.
| JavaScript | mit | andrestraumann/FabricPlugin,sarriaroman/FabricPlugin,andrestraumann/FabricPlugin,sarriaroman/FabricPlugin,sarriaroman/FabricPlugin,andrestraumann/FabricPlugin,sarriaroman/FabricPlugin | javascript | ## Code Before:
var fs = require("fs");
var path = require("path");
var utilities = require("./utilities");
module.exports = {
addFabricBuildToolsGradle: function() {
var buildGradle = utilities.readBuildGradle();
buildGradle += [
"// Fabric Cordova Plugin - Start Fabric Build Tools ",
"buildscript {",
" repositories {",
" maven { url 'https://maven.fabric.io/public' }",
" }",
" dependencies {",
" classpath 'io.fabric.tools:gradle:1.+'",
" }",
"}",
"",
"apply plugin: 'io.fabric'",
"// Fabric Cordova Plugin - End Fabric Build Tools",
].join("\n");
utilities.writeBuildGradle(buildGradle);
},
removeFabricBuildToolsFromGradle: function() {
var buildGradle = utilities.readBuildGradle();
buildGradle = buildGradle.replace(/\n\/\/ Fabric Cordova Plugin - Start Fabric Build Tools[\s\S]*\/\/ Fabric Cordova Plugin - End Fabric Build Tools\n/, "");
utilities.writeBuildGradle(buildGradle);
}
};
## Instruction:
Fix hook to remove build.gradle config on uninstall.
## Code After:
var fs = require("fs");
var path = require("path");
var utilities = require("./utilities");
module.exports = {
addFabricBuildToolsGradle: function() {
var buildGradle = utilities.readBuildGradle();
buildGradle += [
"",
"// Fabric Cordova Plugin - Start Fabric Build Tools ",
"buildscript {",
" repositories {",
" maven { url 'https://maven.fabric.io/public' }",
" }",
" dependencies {",
" classpath 'io.fabric.tools:gradle:1.+'",
" }",
"}",
"",
"apply plugin: 'io.fabric'",
"// Fabric Cordova Plugin - End Fabric Build Tools",
].join("\n");
utilities.writeBuildGradle(buildGradle);
},
removeFabricBuildToolsFromGradle: function() {
var buildGradle = utilities.readBuildGradle();
buildGradle = buildGradle.replace(/\n\/\/ Fabric Cordova Plugin - Start Fabric Build Tools[\s\S]*\/\/ Fabric Cordova Plugin - End Fabric Build Tools/, "");
utilities.writeBuildGradle(buildGradle);
}
};
|
var fs = require("fs");
var path = require("path");
var utilities = require("./utilities");
module.exports = {
addFabricBuildToolsGradle: function() {
var buildGradle = utilities.readBuildGradle();
buildGradle += [
+ "",
"// Fabric Cordova Plugin - Start Fabric Build Tools ",
"buildscript {",
" repositories {",
" maven { url 'https://maven.fabric.io/public' }",
" }",
" dependencies {",
" classpath 'io.fabric.tools:gradle:1.+'",
" }",
"}",
"",
"apply plugin: 'io.fabric'",
"// Fabric Cordova Plugin - End Fabric Build Tools",
].join("\n");
utilities.writeBuildGradle(buildGradle);
},
removeFabricBuildToolsFromGradle: function() {
var buildGradle = utilities.readBuildGradle();
- buildGradle = buildGradle.replace(/\n\/\/ Fabric Cordova Plugin - Start Fabric Build Tools[\s\S]*\/\/ Fabric Cordova Plugin - End Fabric Build Tools\n/, "");
? --
+ buildGradle = buildGradle.replace(/\n\/\/ Fabric Cordova Plugin - Start Fabric Build Tools[\s\S]*\/\/ Fabric Cordova Plugin - End Fabric Build Tools/, "");
utilities.writeBuildGradle(buildGradle);
}
}; | 3 | 0.078947 | 2 | 1 |
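The android-helper fix above is a one-character regex change: the uninstall pattern required a trailing `\n` after the `End Fabric Build Tools` marker, but the injected block is appended with `join("\n")` and so sits at the end of build.gradle with no newline after it, meaning the old pattern never matched and the block was never removed. The behaviour can be demonstrated with Python's `re` (the record itself uses a JavaScript regex, but the semantics are the same here):

```python
import re

block = "\n// Start Marker\nclasspath 'io.fabric.tools:gradle:1.+'\n// End Marker"
gradle = "apply plugin: 'android'" + block  # block appended at end-of-file

# Old pattern: demands "\n" after the end marker -> never matches at EOF,
# so the substitution is a no-op and the marker block survives uninstall.
assert re.sub(r"\n// Start Marker[\s\S]*// End Marker\n", "", gradle) == gradle

# Fixed pattern (trailing "\n" dropped, as in the record) removes the block.
assert re.sub(r"\n// Start Marker[\s\S]*// End Marker", "", gradle) == "apply plugin: 'android'"
print("ok")
```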
cdb64f61b1b5dbef28ce978f49452242d3fc5d1e | css/inject/inject.css | css/inject/inject.css | .preview a.label.blocked {
background-color: hsl(349, 69%, 46%);
border-radius: 20px;
color: white;
margin: 0 2px;
padding: 1px 6px 1px 6px;
white-space: nowrap;
}
| .preview a.label.blocked {
background-color: hsl(349, 69%, 46%);
border-radius: 20px;
color: white;
margin: 0 2px;
padding: 1px 6px 1px 6px;
white-space: nowrap;
}
.labels_maker .selected.labels li a.label.blocked,
.labels_maker .selected.labels li a.label.blocked + a.remove {
background-color: hsl(349, 69%, 46%);
}
.labels_maker .selected.labels li a.label.blocked:hover,
.labels_maker .selected.labels li a.label.blocked + a.remove:hover {
background-color: hsl(349, 69%, 32%);
} | Add red and :hover styles to details view | Add red and :hover styles to details view
Previous logic changes were responsible for adding .blocked
to those labels. Now let's style them!
| CSS | isc | mkenyon/red-labels-for-pivotal-tracker,oliverswitzer/wwltw-for-pivotal-tracker,oliverswitzer/wwltw-for-pivotal-tracker | css | ## Code Before:
.preview a.label.blocked {
background-color: hsl(349, 69%, 46%);
border-radius: 20px;
color: white;
margin: 0 2px;
padding: 1px 6px 1px 6px;
white-space: nowrap;
}
## Instruction:
Add red and :hover styles to details view
Previous logic changes were responsible for adding .blocked
to those labels. Now let's style them!
## Code After:
.preview a.label.blocked {
background-color: hsl(349, 69%, 46%);
border-radius: 20px;
color: white;
margin: 0 2px;
padding: 1px 6px 1px 6px;
white-space: nowrap;
}
.labels_maker .selected.labels li a.label.blocked,
.labels_maker .selected.labels li a.label.blocked + a.remove {
background-color: hsl(349, 69%, 46%);
}
.labels_maker .selected.labels li a.label.blocked:hover,
.labels_maker .selected.labels li a.label.blocked + a.remove:hover {
background-color: hsl(349, 69%, 32%);
} | .preview a.label.blocked {
background-color: hsl(349, 69%, 46%);
border-radius: 20px;
color: white;
margin: 0 2px;
padding: 1px 6px 1px 6px;
white-space: nowrap;
}
+
+ .labels_maker .selected.labels li a.label.blocked,
+ .labels_maker .selected.labels li a.label.blocked + a.remove {
+ background-color: hsl(349, 69%, 46%);
+ }
+
+ .labels_maker .selected.labels li a.label.blocked:hover,
+ .labels_maker .selected.labels li a.label.blocked + a.remove:hover {
+ background-color: hsl(349, 69%, 32%);
+ } | 10 | 1.25 | 10 | 0 |
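The four numbers trailing each record (`10 | 1.25 | 10 | 0` for the CSS entry above) line up with the schema in the header: `diff_length`, `relative_diff_length`, `n_lines_added`, `n_lines_deleted`. Across the records here they are consistent with counting the `+ ` and `- ` rows of the diff and dividing their total by the old file's line count; for the CSS record, 10 added plus 0 deleted over an 8-line original gives exactly 1.25. That normaliser is inferred from the numbers, not documented, so treat it as an assumption. A sketch of the apparent rule:

```python
def record_stats(old_lines, diff_lines):
    # n_lines_added / n_lines_deleted count the "+ " and "- " rows of the
    # diff ("? " hint rows and "  " context rows are ignored). diff_length
    # is their sum; relative_diff_length divides by the old file's line
    # count -- an inferred rule that reproduces the CSS record's stats.
    added = sum(1 for line in diff_lines if line.startswith("+ "))
    deleted = sum(1 for line in diff_lines if line.startswith("- "))
    diff_length = added + deleted
    return diff_length, diff_length / len(old_lines), added, deleted

old_css = ["line"] * 8                         # stand-in for the 8-line stylesheet
diff = ["  line"] * 8 + ["+ new rule"] * 10    # 8 context rows, 10 additions
print(record_stats(old_css, diff))
```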
cc1c28f454f6c41ca4765101946865560568a837 | lib/docs/scrapers/vue.rb | lib/docs/scrapers/vue.rb | module Docs
class Vue < UrlScraper
self.name = 'Vue.js'
self.slug = 'vue'
self.type = 'vue'
self.release = '1.0.24'
self.base_url = 'https://vuejs.org'
self.root_path = '/guide/index.html'
self.initial_paths = %w(/api/index.html)
self.links = {
home: 'https://vuejs.org/',
code: 'https://github.com/vuejs/vue'
}
html_filters.push 'vue/clean_html', 'vue/entries'
options[:only_patterns] = [/\/guide\//, /\/api\//]
options[:attribution] = <<-HTML
      © 2013–2016 Evan You, Vue.js contributors<br>
Licensed under the MIT License.
HTML
end
end
| module Docs
class Vue < UrlScraper
self.name = 'Vue.js'
self.slug = 'vue'
self.type = 'vue'
self.root_path = '/guide/index.html'
self.initial_paths = %w(/api/index.html)
self.links = {
home: 'https://vuejs.org/',
code: 'https://github.com/vuejs/vue'
}
html_filters.push 'vue/clean_html', 'vue/entries'
options[:only_patterns] = [/\/guide\//, /\/api\//]
options[:attribution] = <<-HTML
© 2013–2016 Evan You, Vue.js contributors<br>
Licensed under the MIT License.
HTML
version '2' do
self.release = '2.0.1'
self.base_url = 'https://vuejs.org'
end
version '1' do
self.release = '1.0.27'
self.base_url = 'https://v1.vuejs.org'
end
end
end
| Update Vue.js documentation (2.0.1, 1.0.27) | Update Vue.js documentation (2.0.1, 1.0.27)
| Ruby | mpl-2.0 | webcoding/devdocs,yosiat/devdocs,webcoding/devdocs,yosiat/devdocs,rlugojr/devdocs,rlugojr/devdocs,webcoding/devdocs,rlugojr/devdocs,yosiat/devdocs | ruby | ## Code Before:
module Docs
class Vue < UrlScraper
self.name = 'Vue.js'
self.slug = 'vue'
self.type = 'vue'
self.release = '1.0.24'
self.base_url = 'https://vuejs.org'
self.root_path = '/guide/index.html'
self.initial_paths = %w(/api/index.html)
self.links = {
home: 'https://vuejs.org/',
code: 'https://github.com/vuejs/vue'
}
html_filters.push 'vue/clean_html', 'vue/entries'
options[:only_patterns] = [/\/guide\//, /\/api\//]
options[:attribution] = <<-HTML
      © 2013–2016 Evan You, Vue.js contributors<br>
Licensed under the MIT License.
HTML
end
end
## Instruction:
Update Vue.js documentation (2.0.1, 1.0.27)
## Code After:
module Docs
class Vue < UrlScraper
self.name = 'Vue.js'
self.slug = 'vue'
self.type = 'vue'
self.root_path = '/guide/index.html'
self.initial_paths = %w(/api/index.html)
self.links = {
home: 'https://vuejs.org/',
code: 'https://github.com/vuejs/vue'
}
html_filters.push 'vue/clean_html', 'vue/entries'
options[:only_patterns] = [/\/guide\//, /\/api\//]
options[:attribution] = <<-HTML
      © 2013–2016 Evan You, Vue.js contributors<br>
Licensed under the MIT License.
HTML
version '2' do
self.release = '2.0.1'
self.base_url = 'https://vuejs.org'
end
version '1' do
self.release = '1.0.27'
self.base_url = 'https://v1.vuejs.org'
end
end
end
| module Docs
class Vue < UrlScraper
self.name = 'Vue.js'
self.slug = 'vue'
self.type = 'vue'
- self.release = '1.0.24'
- self.base_url = 'https://vuejs.org'
self.root_path = '/guide/index.html'
self.initial_paths = %w(/api/index.html)
self.links = {
home: 'https://vuejs.org/',
code: 'https://github.com/vuejs/vue'
}
html_filters.push 'vue/clean_html', 'vue/entries'
options[:only_patterns] = [/\/guide\//, /\/api\//]
options[:attribution] = <<-HTML
        © 2013–2016 Evan You, Vue.js contributors<br>
Licensed under the MIT License.
HTML
+
+ version '2' do
+ self.release = '2.0.1'
+ self.base_url = 'https://vuejs.org'
+ end
+
+ version '1' do
+ self.release = '1.0.27'
+ self.base_url = 'https://v1.vuejs.org'
+ end
end
end | 12 | 0.5 | 10 | 2 |
81783d4828335cbc25266e7fe8faa85aa89a35d0 | cotracker/templates/checkouts/base_list.html | cotracker/templates/checkouts/base_list.html | {% extends "base.html" %}
{% block title %}Bases{% endblock title %}
{% block content %}
{% if base_list %}
<table border="1">
<tr>
<th rowspan="2">Bases</th>
<th colspan="2">Airstrips</th>
</tr>
<tr><th>Attached</th><th>Unattached</th></tr>
{% for base, attached_count, unattached_count in base_list %}
<tr>
<td>{{ base.name }}</td>
<td><a href="{% url 'base_attached_detail' base.ident %}">{{ attached_count }}</a></td>
<td><a href="{% url 'base_unattached_detail' base.ident %}">{{ unattached_count }}</a></td>
</tr>
{% endfor %}
</table>
{% endif %}
{% endblock content %}
| {% extends "base.html" %}
{% block title %}Bases{% endblock title %}
{% block content %}
{% if base_list %}
<table border="1">
<tr>
<th rowspan="2">Bases</th>
<th colspan="2">Airstrips</th>
{% if user.is_superuser %}
<th rowspan="2"></th>
{% endif %}
</tr>
<tr><th>Attached</th><th>Unattached</th></tr>
{% for base, attached_count, unattached_count in base_list %}
<tr>
<td>{{ base.name }}</td>
<td><a href="{% url 'base_attached_detail' base.ident %}">{{ attached_count }}</a></td>
<td><a href="{% url 'base_unattached_detail' base.ident %}">{{ unattached_count }}</a></td>
{% if user.is_superuser %}
<td><a href="{% url 'base_edit' base.ident %}">Edit</a></td>
{% endif %}
</tr>
{% endfor %}
</table>
{% endif %}
{% endblock content %}
| Add links to base editing urls for superusers | Add links to base editing urls for superusers
| HTML | mit | eallrich/checkniner,eallrich/checkniner,eallrich/checkniner | html | ## Code Before:
{% extends "base.html" %}
{% block title %}Bases{% endblock title %}
{% block content %}
{% if base_list %}
<table border="1">
<tr>
<th rowspan="2">Bases</th>
<th colspan="2">Airstrips</th>
</tr>
<tr><th>Attached</th><th>Unattached</th></tr>
{% for base, attached_count, unattached_count in base_list %}
<tr>
<td>{{ base.name }}</td>
<td><a href="{% url 'base_attached_detail' base.ident %}">{{ attached_count }}</a></td>
<td><a href="{% url 'base_unattached_detail' base.ident %}">{{ unattached_count }}</a></td>
</tr>
{% endfor %}
</table>
{% endif %}
{% endblock content %}
## Instruction:
Add links to base editing urls for superusers
## Code After:
{% extends "base.html" %}
{% block title %}Bases{% endblock title %}
{% block content %}
{% if base_list %}
<table border="1">
<tr>
<th rowspan="2">Bases</th>
<th colspan="2">Airstrips</th>
{% if user.is_superuser %}
<th rowspan="2"></th>
{% endif %}
</tr>
<tr><th>Attached</th><th>Unattached</th></tr>
{% for base, attached_count, unattached_count in base_list %}
<tr>
<td>{{ base.name }}</td>
<td><a href="{% url 'base_attached_detail' base.ident %}">{{ attached_count }}</a></td>
<td><a href="{% url 'base_unattached_detail' base.ident %}">{{ unattached_count }}</a></td>
{% if user.is_superuser %}
<td><a href="{% url 'base_edit' base.ident %}">Edit</a></td>
{% endif %}
</tr>
{% endfor %}
</table>
{% endif %}
{% endblock content %}
| {% extends "base.html" %}
{% block title %}Bases{% endblock title %}
{% block content %}
{% if base_list %}
<table border="1">
<tr>
<th rowspan="2">Bases</th>
<th colspan="2">Airstrips</th>
+ {% if user.is_superuser %}
+ <th rowspan="2"></th>
+ {% endif %}
</tr>
<tr><th>Attached</th><th>Unattached</th></tr>
{% for base, attached_count, unattached_count in base_list %}
<tr>
<td>{{ base.name }}</td>
<td><a href="{% url 'base_attached_detail' base.ident %}">{{ attached_count }}</a></td>
<td><a href="{% url 'base_unattached_detail' base.ident %}">{{ unattached_count }}</a></td>
+ {% if user.is_superuser %}
+ <td><a href="{% url 'base_edit' base.ident %}">Edit</a></td>
+ {% endif %}
</tr>
{% endfor %}
</table>
{% endif %}
{% endblock content %} | 6 | 0.272727 | 6 | 0 |
d4d4532c638663f34ea6a6588e018e29e65620fb | spec/xml/encoding_spec.rb | spec/xml/encoding_spec.rb | require 'spec/spec_helper'
describe ROXML, "encoding" do
class TestResult
include ROXML
xml_accessor :message
end
context "when provided non-latin characters" do
it "should output those characters as input via methods" do
res = TestResult.new
res.message = "sadfk одловыа jjklsd " #random russian and english charecters
res.to_xml.at('message').inner_text.should == "sadfk одловыа jjklsd "
end
it "should output those characters as input via xml" do
res = TestResult.from_xml("<test_result><message>sadfk одловыа jjklsd </message></test_result>")
res.to_xml.at('message').inner_text.should == "sadfk одловыа jjklsd "
end
end
end
| require 'spec/spec_helper'
describe ROXML, "encoding" do
class TestResult
include ROXML
xml_accessor :message
end
context "when provided non-latin characters" do
it "should output those characters as input via methods" do
res = TestResult.new
res.message = "sadfk одловыа jjklsd " #random russian and english charecters
doc = ROXML::XML::Document.new
doc.root = res.to_xml
if defined?(Nokogiri)
doc.at('message').inner_text
else
doc.find_first('message').inner_xml
end.should == "sadfk одловыа jjklsd "
end
it "should output those characters as input via xml" do
res = TestResult.from_xml("<test_result><message>sadfk одловыа jjklsd </message></test_result>")
doc = ROXML::XML::Document.new
doc.root = res.to_xml
if defined?(Nokogiri)
doc.at('message').inner_text
else
doc.find_first('message').inner_xml
end.should == "sadfk одловыа jjklsd "
end
end
end
| Expand on encoding spec a bit to accommodate libxml idiosyncrasies | Expand on encoding spec a bit to accommodate libxml idiosyncrasies
| Ruby | mit | tastyworks/representable,Casecommons/representable,boutil/roxml,jarib/roxml,boutil/roxml,parndt/representable,irfanah/representable,simonoff/representable,Empact/roxml,apotonick/representable,yob/roxml,jajuki13/representable,Empact/roxml,rusikf/representable,tastyworks/representable | ruby | ## Code Before:
require 'spec/spec_helper'
describe ROXML, "encoding" do
class TestResult
include ROXML
xml_accessor :message
end
context "when provided non-latin characters" do
it "should output those characters as input via methods" do
res = TestResult.new
res.message = "sadfk одловыа jjklsd " #random russian and english charecters
res.to_xml.at('message').inner_text.should == "sadfk одловыа jjklsd "
end
it "should output those characters as input via xml" do
res = TestResult.from_xml("<test_result><message>sadfk одловыа jjklsd </message></test_result>")
res.to_xml.at('message').inner_text.should == "sadfk одловыа jjklsd "
end
end
end
## Instruction:
Expand on encoding spec a bit to accommodate libxml idiosyncrasies
## Code After:
require 'spec/spec_helper'
describe ROXML, "encoding" do
class TestResult
include ROXML
xml_accessor :message
end
context "when provided non-latin characters" do
it "should output those characters as input via methods" do
res = TestResult.new
res.message = "sadfk одловыа jjklsd " #random russian and english charecters
doc = ROXML::XML::Document.new
doc.root = res.to_xml
if defined?(Nokogiri)
doc.at('message').inner_text
else
doc.find_first('message').inner_xml
end.should == "sadfk одловыа jjklsd "
end
it "should output those characters as input via xml" do
res = TestResult.from_xml("<test_result><message>sadfk одловыа jjklsd </message></test_result>")
doc = ROXML::XML::Document.new
doc.root = res.to_xml
if defined?(Nokogiri)
doc.at('message').inner_text
else
doc.find_first('message').inner_xml
end.should == "sadfk одловыа jjklsd "
end
end
end
| require 'spec/spec_helper'
describe ROXML, "encoding" do
class TestResult
include ROXML
xml_accessor :message
end
context "when provided non-latin characters" do
it "should output those characters as input via methods" do
res = TestResult.new
res.message = "sadfk одловыа jjklsd " #random russian and english charecters
- res.to_xml.at('message').inner_text.should == "sadfk одловыа jjklsd "
+ doc = ROXML::XML::Document.new
+ doc.root = res.to_xml
+ if defined?(Nokogiri)
+ doc.at('message').inner_text
+ else
+ doc.find_first('message').inner_xml
+ end.should == "sadfk одловыа jjklsd "
end
it "should output those characters as input via xml" do
res = TestResult.from_xml("<test_result><message>sadfk одловыа jjklsd </message></test_result>")
- res.to_xml.at('message').inner_text.should == "sadfk одловыа jjklsd "
+ doc = ROXML::XML::Document.new
+ doc.root = res.to_xml
+ if defined?(Nokogiri)
+ doc.at('message').inner_text
+ else
+ doc.find_first('message').inner_xml
+ end.should == "sadfk одловыа jjklsd "
end
end
end | 16 | 0.761905 | 14 | 2 |
ee326f20babdb783071337d5ed097d91fb791a89 | lib/facter/nginx_version.rb | lib/facter/nginx_version.rb | Facter.add(:nginx_version) do
setcode do
if Facter.value('kernel') != 'windows' && Facter::Util::Resolution.which('nginx')
nginx_version = Facter::Util::Resolution.exec('nginx -v 2>&1')
%r{nginx version: (nginx|openresty)\/([\w\.]+)}.match(nginx_version)[2]
end
end
end
| Facter.add(:nginx_version) do
setcode do
if Facter.value('kernel') != 'windows' && Facter::Util::Resolution.which('nginx')
nginx_version = Facter::Util::Resolution.exec('nginx -v 2>&1')
%r{nginx version: nginx\/([\w\.]+)}.match(nginx_version)[1]
end
end
end
| Revert "Prevent custom fact from complaining when openresty is installed" | Revert "Prevent custom fact from complaining when openresty is installed"
| Ruby | mit | raphink/puppet-nginx,bastelfreak/puppet-nginx,voxpupuli/puppet-nginx,jfryman/puppet-nginx,bulletproofnetworks/puppet-nginx,xaque208/puppet-nginx,bulletproofnetworks/puppet-nginx,jfryman/puppet-nginx,bastelfreak/puppet-nginx,hbog/puppet-nginx,hbog/puppet-nginx,voxpupuli/puppet-nginx,icann-dns/puppet-nginx,xaque208/puppet-nginx,icann-dns/puppet-nginx | ruby | ## Code Before:
Facter.add(:nginx_version) do
setcode do
if Facter.value('kernel') != 'windows' && Facter::Util::Resolution.which('nginx')
nginx_version = Facter::Util::Resolution.exec('nginx -v 2>&1')
%r{nginx version: (nginx|openresty)\/([\w\.]+)}.match(nginx_version)[2]
end
end
end
## Instruction:
Revert "Prevent custom fact from complaining when openresty is installed"
## Code After:
Facter.add(:nginx_version) do
setcode do
if Facter.value('kernel') != 'windows' && Facter::Util::Resolution.which('nginx')
nginx_version = Facter::Util::Resolution.exec('nginx -v 2>&1')
%r{nginx version: nginx\/([\w\.]+)}.match(nginx_version)[1]
end
end
end
| Facter.add(:nginx_version) do
setcode do
if Facter.value('kernel') != 'windows' && Facter::Util::Resolution.which('nginx')
nginx_version = Facter::Util::Resolution.exec('nginx -v 2>&1')
- %r{nginx version: (nginx|openresty)\/([\w\.]+)}.match(nginx_version)[2]
? - ----------- ^
+ %r{nginx version: nginx\/([\w\.]+)}.match(nginx_version)[1]
? ^
end
end
end | 2 | 0.25 | 1 | 1 |
13ab0dee237c8f3919aaffdf696289b70a4a0304 | README.md | README.md |
Install gems via Bundler:
```
bundle
```
## Run
This module is intended to be run as part of the [CASA Engine reference implementation](https://github.com/AppSharing/casa-engine); however, it may also be run directly standalone to query publishers as detailed herein.
To query manually via the command line:
```
thor receiver:get_payloads
```
This will request a URL for the server and an optional secret, then issue a query and return the result/error; it will then ask the user whether to issue another query or not.
|
[](https://travis-ci.org/AppSharing/casa-receiver) [](https://gemnasium.com/AppSharing/casa-receiver) [](https://codeclimate.com/github/AppSharing/casa-receiver)
## Setup
Install gems via Bundler:
```
bundle
```
## Run
This module is intended to be run as part of the [CASA Engine reference implementation](https://github.com/AppSharing/casa-engine); however, it may also be run directly standalone to query publishers as detailed herein.
To query manually via the command line:
```
thor receiver:get_payloads
```
This will request a URL for the server and an optional secret, then issue a query and return the result/error; it will then ask the user whether to issue another query or not.
| Add Code Climate and Gemnasium | Add Code Climate and Gemnasium
| Markdown | bsd-3-clause | AppSharing/casa-receiver | markdown | ## Code Before:
Install gems via Bundler:
```
bundle
```
## Run
This module is intended to be run as part of the [CASA Engine reference implementation](https://github.com/AppSharing/casa-engine); however, it may also be run directly standalone to query publishers as detailed herein.
To query manually via the command line:
```
thor receiver:get_payloads
```
This will request a URL for the server and an optional secret, then issue a query and return the result/error; it will then ask the user whether to issue another query or not.
## Instruction:
Add Code Climate and Gemnasium
## Code After:
[](https://travis-ci.org/AppSharing/casa-receiver) [](https://gemnasium.com/AppSharing/casa-receiver) [](https://codeclimate.com/github/AppSharing/casa-receiver)
## Setup
Install gems via Bundler:
```
bundle
```
## Run
This module is intended to be run as part of the [CASA Engine reference implementation](https://github.com/AppSharing/casa-engine); however, it may also be run directly standalone to query publishers as detailed herein.
To query manually via the command line:
```
thor receiver:get_payloads
```
This will request a URL for the server and an optional secret, then issue a query and return the result/error; it will then ask the user whether to issue another query or not.
| +
+ [](https://travis-ci.org/AppSharing/casa-receiver) [](https://gemnasium.com/AppSharing/casa-receiver) [](https://codeclimate.com/github/AppSharing/casa-receiver)
+
+ ## Setup
Install gems via Bundler:
```
bundle
```
## Run
This module is intended to be run as part of the [CASA Engine reference implementation](https://github.com/AppSharing/casa-engine); however, it may also be run directly standalone to query publishers as detailed herein.
To query manually via the command line:
```
thor receiver:get_payloads
```
This will request a URL for the server and an optional secret, then issue a query and return the result/error; it will then ask the user whether to issue another query or not. | 4 | 0.222222 | 4 | 0
04d087c105430a6fa81af1c5adbe31caae10a01b | README.adoc | README.adoc | = asciidoctor-chrome-extension, AsciiDoc.js Chrome Extension
Guillaume Grossetie
:sources: https://github.com/Mogztter/asciidoctor-chrome-extension
:license: https://github.com/Mogztter/asciidoctor-chrome-extension/blob/master/LICENSE
This project uses https://github.com/asciidoctor/asciidoctor.js[Asciidoctor.js] to render AsciiDoc as HTML inside Chrome!
== Copyright
Copyright (C) 2013 Guillaume Grossetie.
Free use of this software is granted under the terms of the MIT License.
See the {license}[LICENSE] file for details. | = asciidoctor-chrome-extension, Asciidoctor Chrome Extension
Guillaume Grossetie
:sources: https://github.com/asciidoctor/asciidoctor-chrome-extension
:license: https://github.com/asciidoctor/asciidoctor-chrome-extension/blob/master/LICENSE
This project uses https://github.com/asciidoctor/asciidoctor.js[Asciidoctor.js] to render AsciiDoc as HTML inside Chrome!
== Copyright
Copyright (C) 2013 Guillaume Grossetie.
Free use of this software is granted under the terms of the MIT License.
See the {license}[LICENSE] file for details. | Update AsciiDoc.js to Asciidoctor in the title and the repository URL (asciidoctor org) | Update AsciiDoc.js to Asciidoctor in the title and the repository URL (asciidoctor org)
| AsciiDoc | mit | asciidoctor/asciidoctor-chrome-extension,asciidoctor/asciidoctor-chrome-extension,rotty3000/asciidoctor-chrome-extension,Mogztter/asciidoctor-chrome-extension,bbucko/asciidoctor-chrome-extension,bbucko/asciidoctor-chrome-extension,rotty3000/asciidoctor-chrome-extension,rotty3000/asciidoctor-chrome-extension,Mogztter/asciidoctor-chrome-extension,bbucko/asciidoctor-chrome-extension,asciidoctor/asciidoctor-chrome-extension | asciidoc | ## Code Before:
= asciidoctor-chrome-extension, AsciiDoc.js Chrome Extension
Guillaume Grossetie
:sources: https://github.com/Mogztter/asciidoctor-chrome-extension
:license: https://github.com/Mogztter/asciidoctor-chrome-extension/blob/master/LICENSE
This project uses https://github.com/asciidoctor/asciidoctor.js[Asciidoctor.js] to render AsciiDoc as HTML inside Chrome!
== Copyright
Copyright (C) 2013 Guillaume Grossetie.
Free use of this software is granted under the terms of the MIT License.
See the {license}[LICENSE] file for details.
## Instruction:
Update AsciiDoc.js to Asciidoctor in the title and the repository URL (asciidoctor org)
## Code After:
= asciidoctor-chrome-extension, Asciidoctor Chrome Extension
Guillaume Grossetie
:sources: https://github.com/asciidoctor/asciidoctor-chrome-extension
:license: https://github.com/asciidoctor/asciidoctor-chrome-extension/blob/master/LICENSE
This project uses https://github.com/asciidoctor/asciidoctor.js[Asciidoctor.js] to render AsciiDoc as HTML inside Chrome!
== Copyright
Copyright (C) 2013 Guillaume Grossetie.
Free use of this software is granted under the terms of the MIT License.
See the {license}[LICENSE] file for details. | - = asciidoctor-chrome-extension, AsciiDoc.js Chrome Extension
? ^ ^^^
+ = asciidoctor-chrome-extension, Asciidoctor Chrome Extension
? ^ ^^^
Guillaume Grossetie
- :sources: https://github.com/Mogztter/asciidoctor-chrome-extension
? ^ ^^ ^^
+ :sources: https://github.com/asciidoctor/asciidoctor-chrome-extension
? ^^^^^^ ^ ^
- :license: https://github.com/Mogztter/asciidoctor-chrome-extension/blob/master/LICENSE
? ^ ^^ ^^
+ :license: https://github.com/asciidoctor/asciidoctor-chrome-extension/blob/master/LICENSE
? ^^^^^^ ^ ^
This project uses https://github.com/asciidoctor/asciidoctor.js[Asciidoctor.js] to render AsciiDoc as HTML inside Chrome!
== Copyright
Copyright (C) 2013 Guillaume Grossetie.
Free use of this software is granted under the terms of the MIT License.
See the {license}[LICENSE] file for details. | 6 | 0.461538 | 3 | 3 |
f8f580319d85f65b78c04fa44594947aef945c03 | awa/bundles/commands.properties | awa/bundles/commands.properties | command_user_list_name=Name:20
command_user_list_id=Id:6
command_user_list_email=Email:25
command_user_list_acl_count=ACL:6
command_user_list_auth_count=Login Count:15
command_user_list_last_login=Last login:20
command_session_list_id=Id:6
command_session_list_name=Name:20
command_session_list_session_id=Session:8
command_session_list_start_date=Start date:20
command_session_list_end_date=End date:20
command_session_list_auth_date=Auth date:20
command_job_list_id=Id:8
command_job_list_name=Name:10
command_job_list_status=Status:6
command_job_list_create_date=Created:20
command_job_list_start_date=Started:20
command_job_list_finish_date=Finished:20
| command_user_list_name=Name:20
command_user_list_id=Id:6
command_user_list_email=Email:25
command_user_list_acl_count=ACL:6
command_user_list_auth_count=Login Count:15
command_user_list_last_login=Last login:20
command_session_list_id=Id:6
command_session_list_name=Name:20
command_session_list_session_id=Session:8
command_session_list_start_date=Start date:20
command_session_list_end_date=End date:20
command_session_list_auth_date=Auth date:20
command_job_list_id=Id:8
command_job_list_name=Name:10
command_job_list_status=Status:7
command_job_list_create_date=Created:25
command_job_list_start_date=Started:10
command_job_list_finish_date=Duration:10
command_job_list_user_id=User:8
command_job_list_uname=User:15
| Update the awa command presentation | Update the awa command presentation
| INI | apache-2.0 | stcarrez/ada-awa,stcarrez/ada-awa,stcarrez/ada-awa,stcarrez/ada-awa | ini | ## Code Before:
command_user_list_name=Name:20
command_user_list_id=Id:6
command_user_list_email=Email:25
command_user_list_acl_count=ACL:6
command_user_list_auth_count=Login Count:15
command_user_list_last_login=Last login:20
command_session_list_id=Id:6
command_session_list_name=Name:20
command_session_list_session_id=Session:8
command_session_list_start_date=Start date:20
command_session_list_end_date=End date:20
command_session_list_auth_date=Auth date:20
command_job_list_id=Id:8
command_job_list_name=Name:10
command_job_list_status=Status:6
command_job_list_create_date=Created:20
command_job_list_start_date=Started:20
command_job_list_finish_date=Finished:20
## Instruction:
Update the awa command presentation
## Code After:
command_user_list_name=Name:20
command_user_list_id=Id:6
command_user_list_email=Email:25
command_user_list_acl_count=ACL:6
command_user_list_auth_count=Login Count:15
command_user_list_last_login=Last login:20
command_session_list_id=Id:6
command_session_list_name=Name:20
command_session_list_session_id=Session:8
command_session_list_start_date=Start date:20
command_session_list_end_date=End date:20
command_session_list_auth_date=Auth date:20
command_job_list_id=Id:8
command_job_list_name=Name:10
command_job_list_status=Status:7
command_job_list_create_date=Created:25
command_job_list_start_date=Started:10
command_job_list_finish_date=Duration:10
command_job_list_user_id=User:8
command_job_list_uname=User:15
| command_user_list_name=Name:20
command_user_list_id=Id:6
command_user_list_email=Email:25
command_user_list_acl_count=ACL:6
command_user_list_auth_count=Login Count:15
command_user_list_last_login=Last login:20
command_session_list_id=Id:6
command_session_list_name=Name:20
command_session_list_session_id=Session:8
command_session_list_start_date=Start date:20
command_session_list_end_date=End date:20
command_session_list_auth_date=Auth date:20
command_job_list_id=Id:8
command_job_list_name=Name:10
- command_job_list_status=Status:6
? ^
+ command_job_list_status=Status:7
? ^
- command_job_list_create_date=Created:20
? ^
+ command_job_list_create_date=Created:25
? ^
- command_job_list_start_date=Started:20
? ^
+ command_job_list_start_date=Started:10
? ^
- command_job_list_finish_date=Finished:20
? ^ ----- ^
+ command_job_list_finish_date=Duration:10
? ^^^^^ + ^
+ command_job_list_user_id=User:8
+ command_job_list_uname=User:15 | 10 | 0.555556 | 6 | 4 |
f8071a26c8e1ef00f4aaf0dd98221cefbabbd0e0 | deployer/deploy/task/deploy_check_composer_install.php | deployer/deploy/task/deploy_check_composer_install.php | <?php
namespace Deployer;
// Read more on https://github.com/sourcebroker/deployer-extended#deploy-check-composer-install
task('deploy:check_composer_install', function () {
if (file_exists(get('current_dir') . '/composer.lock')) {
$output = runLocally('{{local/bin/composer}} --ignore-platform-reqs install --dry-run 2>&1');
if (strpos($output, 'Nothing to install') === false) {
throw new \Exception('A composer.lock change has been detected but you did not run "composer install". Please run composer install, then check if everything is working on your instance and do deploy after.');
}
}
})->desc('Check if composer install is needed before making deployment');
| <?php
namespace Deployer;
// Read more on https://github.com/sourcebroker/deployer-extended#deploy-check-composer-install
task('deploy:check_composer_install', function () {
if (file_exists(get('current_dir') . '/composer.lock')) {
$output = runLocally('{{local/bin/composer}} --ignore-platform-reqs install --dry-run 2>&1');
if (strpos($output, 'Nothing to install') === false && strpos($output, 'Package operations: 0 installs, 0 updates, 0 removals') === false) {
throw new \Exception('A composer.lock change has been detected but you did not run "composer install". Please run composer install, then check if everything is working on your instance and do deploy after.');
}
}
})->desc('Check if composer install is needed before making deployment');
| Fix check if composer install must be run because of composer.lock changes. | [BUGFIX] Fix check if composer install must be run because of composer.lock changes.
| PHP | mit | sourcebroker/deployer-extended | php | ## Code Before:
<?php
namespace Deployer;
// Read more on https://github.com/sourcebroker/deployer-extended#deploy-check-composer-install
task('deploy:check_composer_install', function () {
if (file_exists(get('current_dir') . '/composer.lock')) {
$output = runLocally('{{local/bin/composer}} --ignore-platform-reqs install --dry-run 2>&1');
if (strpos($output, 'Nothing to install') === false) {
throw new \Exception('A composer.lock change has been detected but you did not run "composer install". Please run composer install, then check if everything is working on your instance and do deploy after.');
}
}
})->desc('Check if composer install is needed before making deployment');
## Instruction:
[BUGFIX] Fix check if composer install must be run because of composer.lock changes.
## Code After:
<?php
namespace Deployer;
// Read more on https://github.com/sourcebroker/deployer-extended#deploy-check-composer-install
task('deploy:check_composer_install', function () {
if (file_exists(get('current_dir') . '/composer.lock')) {
$output = runLocally('{{local/bin/composer}} --ignore-platform-reqs install --dry-run 2>&1');
if (strpos($output, 'Nothing to install') === false && strpos($output, 'Package operations: 0 installs, 0 updates, 0 removals') === false) {
throw new \Exception('A composer.lock change has been detected but you did not run "composer install". Please run composer install, then check if everything is working on your instance and do deploy after.');
}
}
})->desc('Check if composer install is needed before making deployment');
| <?php
namespace Deployer;
// Read more on https://github.com/sourcebroker/deployer-extended#deploy-check-composer-install
task('deploy:check_composer_install', function () {
if (file_exists(get('current_dir') . '/composer.lock')) {
$output = runLocally('{{local/bin/composer}} --ignore-platform-reqs install --dry-run 2>&1');
- if (strpos($output, 'Nothing to install') === false) {
+ if (strpos($output, 'Nothing to install') === false && strpos($output, 'Package operations: 0 installs, 0 updates, 0 removals') === false) {
throw new \Exception('A composer.lock change has been detected but you did not run "composer install". Please run composer install, then check if everything is working on your instance and do deploy after.');
}
}
})->desc('Check if composer install is needed before making deployment'); | 2 | 0.153846 | 1 | 1 |
407159a54f53a385493fd0b4d150265d15ed9e49 | README.md | README.md | Sunrise for WordPress
---------------------
#### Options Pages Framework for WordPress Plugins and Themes
Sunrise is an open-source and OOP-based plugin framework. It can help you to make an unlimited number of fast and powerful options pages with a native look in your WordPress plugins or themes. It was designed to speed up plugin deployment and development, together with sufficient
functionality.
##### Quick start
Open file _plugin-example.php_ and modify the code as you want. Detailed documentation will be added in closest future. Use native WordPress functions to get and update created options - get_option( $id ), update_option( $id, 'new value' )
##### Useful links
* [Documentation](https://gndev.info/kb/) - __not available yet__ - read the code for more info
* [My Twitter](http://twitter.com/gndevinfo) and [homepage](http://gndev.info/)
* [Page at wordpress.org](http://wordpress.org/plugins/sunrise/)
#### Screenshots
,  | Sunrise for WordPress
---------------------
### New Maintainer
As of March 1, 2015, [Tyler Longren](https://longrendev.io/) has taken over maintenance of this plugin. If you have any feature requests or know of something broken, please create an issue. I am **especially interested in removing any deprecated WordPress functions** and replacing them with their proper alternatives.
#### Options Pages Framework for WordPress Plugins and Themes
Sunrise is an open-source and OOP-based plugin framework. It can help you to make an unlimited number of fast and powerful options pages with a native look in your WordPress plugins or themes. It was designed to speed up plugin deployment and development, together with sufficient
functionality.
##### Quick start
Open file _plugin-example.php_ and modify the code as you want. Detailed documentation will be added in closest future. Use native WordPress functions to get and update created options - get_option( $id ), update_option( $id, 'new value' )
##### Useful links
* [Documentation](https://gndev.info/kb/) - __not available yet__ - read the code for more info
* [My Twitter](http://twitter.com/gndevinfo) and [homepage](http://gndev.info/)
* [Page at wordpress.org](http://wordpress.org/plugins/sunrise/)
#### Screenshots
,  | Update readme with new mainainer anbd requests. | Update readme with new mainainer anbd requests.
| Markdown | mit | gndev/sunrise,gndev/sunrise,SecureCloud-biz/sunrise,SecureCloud-biz/sunrise | markdown | ## Code Before:
Sunrise for WordPress
---------------------
#### Options Pages Framework for WordPress Plugins and Themes
Sunrise is an open-source and OOP-based plugin framework. It can help you to make an unlimited number of fast and powerful options pages with a native look in your WordPress plugins or themes. It was designed to speed up plugin deployment and development, together with sufficient
functionality.
##### Quick start
Open file _plugin-example.php_ and modify the code as you want. Detailed documentation will be added in closest future. Use native WordPress functions to get and update created options - get_option( $id ), update_option( $id, 'new value' )
##### Useful links
* [Documentation](https://gndev.info/kb/) - __not available yet__ - read the code for more info
* [My Twitter](http://twitter.com/gndevinfo) and [homepage](http://gndev.info/)
* [Page at wordpress.org](http://wordpress.org/plugins/sunrise/)
#### Screenshots
, 
## Instruction:
Update readme with new maintainer and requests.
## Code After:
Sunrise for WordPress
---------------------
### New Maintainer
As of March 1, 2015, [Tyler Longren](https://longrendev.io/) has taken over maintenance of this plugin. If you have any feature requests or know of something broken, please create an issue. I am **especially interested in removing any deprecated WordPress functions** and replacing them with their proper alternatives.
#### Options Pages Framework for WordPress Plugins and Themes
Sunrise is an open-source and OOP-based plugin framework. It can help you to make an unlimited number of fast and powerful options pages with a native look in your WordPress plugins or themes. It was designed to speed up plugin deployment and development, together with sufficient
functionality.
##### Quick start
Open file _plugin-example.php_ and modify the code as you want. Detailed documentation will be added in closest future. Use native WordPress functions to get and update created options - get_option( $id ), update_option( $id, 'new value' )
##### Useful links
* [Documentation](https://gndev.info/kb/) - __not available yet__ - read the code for more info
* [My Twitter](http://twitter.com/gndevinfo) and [homepage](http://gndev.info/)
* [Page at wordpress.org](http://wordpress.org/plugins/sunrise/)
#### Screenshots
,  | Sunrise for WordPress
---------------------
+
+ ### New Maintainer
+
+ As of March 1, 2015, [Tyler Longren](https://longrendev.io/) has taken over maintenance of this plugin. If you have any feature requests or know of something broken, please create an issue. I am **especially interested in removing any deprecated WordPress functions** and replacing them with their proper alternatives.
#### Options Pages Framework for WordPress Plugins and Themes
Sunrise is an open-source and OOP-based plugin framework. It can help you to make unlimited number of fast and powerful options pages with native look in your WordPress plugins or themes. It was designed to speed up plugin deployment and development, together with sufficient
- functionality.
+ functionality.
? +
##### Quick start
Open file _plugin-example.php_ and modify the code as you want. Detailed documentation will be added in closest future. Use native WordPress functions to get and update created options - get_option( $id ), update_option( $id, 'new value' )
##### Useful links
* [Documentation](https://gndev.info/kb/) - __not available yet__ - read the code for more info
* [My Twitter](http://twitter.com/gndevinfo) and [homepage](http://gndev.info/)
* [Page at wordpress.org](http://wordpress.org/plugins/sunrise/)
#### Screenshots
,  | 6 | 0.285714 | 5 | 1 |
fb3779e628294cf8fa4754d9ae19cdbf461a1b6e | src/sequoia-tree-actions.html | src/sequoia-tree-actions.html | <ul>
<li data-ng-if="(model.length || onlySelected) && !isEditing">
<a class="sequoia-button sequoia-button-info" href="" data-ng-click="toggleSelected()" data-ng-bind-html="model.length && !onlySelected ? buttons.showSelected : !model.length && onlySelected ? buttons.backToList : buttons.hideSelected"></a>
</li>
<li data-ng-if="isEditing || !tree.tree.length">
<a class="sequoia-button sequoia-button-success" href="" data-ng-click="addNode()" data-ng-bind-html="buttons.addNode"></a>
</li>
<li data-ng-if="canEdit && tree.tree.length">
<a class="sequoia-button sequoia-button-info" href="" data-ng-click="toggleEditing(sequoiaEditForm)" data-ng-bind-html="isEditing ? buttons.done : buttons.edit"></a>
</li>
</ul>
| <ul>
<li data-ng-if="(model.length || onlySelected) && !isEditing">
<a class="sequoia-button sequoia-button-info" href="" data-ng-click="toggleSelected()" data-ng-bind-html="model.length && !onlySelected ? buttons.showSelected : !model.length && onlySelected ? buttons.backToList : buttons.hideSelected"></a>
</li>
<li data-ng-if="isEditing">
<a class="sequoia-button sequoia-button-success" href="" data-ng-click="addNode()" data-ng-bind-html="buttons.addNode"></a>
</li>
<li data-ng-if="canEdit">
<a class="sequoia-button sequoia-button-info" href="" data-ng-click="toggleEditing(sequoiaEditForm)" data-ng-bind-html="isEditing ? buttons.done : buttons.edit"></a>
</li>
</ul>
| Fix bug that was hiding the Done button when the user deleted all the tree nodes in edit mode | Fix bug that was hiding the Done button when the user deleted all the tree nodes in edit mode
| HTML | mit | tricinel/angular-sequoia,tricinel/angular-sequoia,tricinel/angular-sequoia | html | ## Code Before:
<ul>
<li data-ng-if="(model.length || onlySelected) && !isEditing">
<a class="sequoia-button sequoia-button-info" href="" data-ng-click="toggleSelected()" data-ng-bind-html="model.length && !onlySelected ? buttons.showSelected : !model.length && onlySelected ? buttons.backToList : buttons.hideSelected"></a>
</li>
<li data-ng-if="isEditing || !tree.tree.length">
<a class="sequoia-button sequoia-button-success" href="" data-ng-click="addNode()" data-ng-bind-html="buttons.addNode"></a>
</li>
<li data-ng-if="canEdit && tree.tree.length">
<a class="sequoia-button sequoia-button-info" href="" data-ng-click="toggleEditing(sequoiaEditForm)" data-ng-bind-html="isEditing ? buttons.done : buttons.edit"></a>
</li>
</ul>
## Instruction:
Fix bug that was hiding the Done button when the user deleted all the tree nodes in edit mode
## Code After:
<ul>
<li data-ng-if="(model.length || onlySelected) && !isEditing">
<a class="sequoia-button sequoia-button-info" href="" data-ng-click="toggleSelected()" data-ng-bind-html="model.length && !onlySelected ? buttons.showSelected : !model.length && onlySelected ? buttons.backToList : buttons.hideSelected"></a>
</li>
<li data-ng-if="isEditing">
<a class="sequoia-button sequoia-button-success" href="" data-ng-click="addNode()" data-ng-bind-html="buttons.addNode"></a>
</li>
<li data-ng-if="canEdit">
<a class="sequoia-button sequoia-button-info" href="" data-ng-click="toggleEditing(sequoiaEditForm)" data-ng-bind-html="isEditing ? buttons.done : buttons.edit"></a>
</li>
</ul>
| <ul>
<li data-ng-if="(model.length || onlySelected) && !isEditing">
<a class="sequoia-button sequoia-button-info" href="" data-ng-click="toggleSelected()" data-ng-bind-html="model.length && !onlySelected ? buttons.showSelected : !model.length && onlySelected ? buttons.backToList : buttons.hideSelected"></a>
</li>
- <li data-ng-if="isEditing || !tree.tree.length">
+ <li data-ng-if="isEditing">
<a class="sequoia-button sequoia-button-success" href="" data-ng-click="addNode()" data-ng-bind-html="buttons.addNode"></a>
</li>
- <li data-ng-if="canEdit && tree.tree.length">
+ <li data-ng-if="canEdit">
<a class="sequoia-button sequoia-button-info" href="" data-ng-click="toggleEditing(sequoiaEditForm)" data-ng-bind-html="isEditing ? buttons.done : buttons.edit"></a>
</li>
</ul> | 4 | 0.285714 | 2 | 2 |
dde1e8a05d93b83abbd3821cb4bd352ef8263b58 | mkdocs.yml | mkdocs.yml | site_name: LARA Interactive API
theme: readthedocs
site_dir: dist/docs
repo_name: Github
repo_url: https://github.com/concord-consortium/lara-interactive-api
sute_url: http://concord-consortium.github.io/lara-interactive-api/
pages:
- ['index.md','LARA API']
- ['about-mkdocs.md','About the documentation generator':W
]
| site_name: LARA Interactive API
theme: readthedocs
site_dir: dist/docs
repo_name: Github
repo_url: https://github.com/concord-consortium/lara-interactive-api
site_url: http://concord-consortium.github.io/lara-interactive-api/
pages:
- LARA API: 'index.md'
- About the documentation generator: 'about-mkdocs.md'
| Fix some docs config typos/deprecation | Fix some docs config typos/deprecation
| YAML | mit | concord-consortium/lara-interactive-api,concord-consortium/lara-interactive-api,concord-consortium/lara-interactive-api | yaml | ## Code Before:
site_name: LARA Interactive API
theme: readthedocs
site_dir: dist/docs
repo_name: Github
repo_url: https://github.com/concord-consortium/lara-interactive-api
sute_url: http://concord-consortium.github.io/lara-interactive-api/
pages:
- ['index.md','LARA API']
- ['about-mkdocs.md','About the documentation generator':W
]
## Instruction:
Fix some docs config typos/deprecation
## Code After:
site_name: LARA Interactive API
theme: readthedocs
site_dir: dist/docs
repo_name: Github
repo_url: https://github.com/concord-consortium/lara-interactive-api
site_url: http://concord-consortium.github.io/lara-interactive-api/
pages:
- LARA API: 'index.md'
- About the documentation generator: 'about-mkdocs.md'
| site_name: LARA Interactive API
theme: readthedocs
site_dir: dist/docs
repo_name: Github
repo_url: https://github.com/concord-consortium/lara-interactive-api
- sute_url: http://concord-consortium.github.io/lara-interactive-api/
? ^
+ site_url: http://concord-consortium.github.io/lara-interactive-api/
? ^
pages:
+ - LARA API: 'index.md'
+ - About the documentation generator: 'about-mkdocs.md'
- - ['index.md','LARA API']
- - ['about-mkdocs.md','About the documentation generator':W
- ] | 7 | 0.7 | 3 | 4 |
8f05ab27ea38ee8770226824e069b6325b286edd | Haxe/Externs/UE4.22/unreal/animationbudgetallocator/USkeletalMeshComponentBudgeted_Extra.hx | Haxe/Externs/UE4.22/unreal/animationbudgetallocator/USkeletalMeshComponentBudgeted_Extra.hx | package unreal.animationbudgetallocator;
extern class USkeletalMeshComponentBudgeted_Extra
{
/** Updates significance budget if this component has been registered with a AnimationBudgetAllocator */
public function SetComponentSignificance(Significance:Float32, bNeverSkip:Bool = false, bTickEvenIfNotRendered:Bool = false, bAllowReducedWork:Bool = true, bForceInterpolate:Bool = false) : Void;
}
| package unreal.animationbudgetallocator;
extern class USkeletalMeshComponentBudgeted_Extra
{
@:thisConst
public function GetShouldUseActorRenderedFlag() : Bool;
public function SetShouldUseActorRenderedFlag(value:Bool) : Void;
/** Updates significance budget if this component has been registered with a AnimationBudgetAllocator */
public function SetComponentSignificance(Significance:Float32, bNeverSkip:Bool = false, bTickEvenIfNotRendered:Bool = false, bAllowReducedWork:Bool = true, bForceInterpolate:Bool = false) : Void;
}
| Update to latest perforce change | [CL-62799] Update to latest perforce change
| Haxe | mit | proletariatgames/unreal.hx,proletariatgames/unreal.hx,proletariatgames/unreal.hx | haxe | ## Code Before:
package unreal.animationbudgetallocator;
extern class USkeletalMeshComponentBudgeted_Extra
{
/** Updates significance budget if this component has been registered with a AnimationBudgetAllocator */
public function SetComponentSignificance(Significance:Float32, bNeverSkip:Bool = false, bTickEvenIfNotRendered:Bool = false, bAllowReducedWork:Bool = true, bForceInterpolate:Bool = false) : Void;
}
## Instruction:
[CL-62799] Update to latest perforce change
## Code After:
package unreal.animationbudgetallocator;
extern class USkeletalMeshComponentBudgeted_Extra
{
@:thisConst
public function GetShouldUseActorRenderedFlag() : Bool;
public function SetShouldUseActorRenderedFlag(value:Bool) : Void;
/** Updates significance budget if this component has been registered with a AnimationBudgetAllocator */
public function SetComponentSignificance(Significance:Float32, bNeverSkip:Bool = false, bTickEvenIfNotRendered:Bool = false, bAllowReducedWork:Bool = true, bForceInterpolate:Bool = false) : Void;
}
| package unreal.animationbudgetallocator;
extern class USkeletalMeshComponentBudgeted_Extra
{
+ @:thisConst
+ public function GetShouldUseActorRenderedFlag() : Bool;
+
+ public function SetShouldUseActorRenderedFlag(value:Bool) : Void;
+
/** Updates significance budget if this component has been registered with a AnimationBudgetAllocator */
public function SetComponentSignificance(Significance:Float32, bNeverSkip:Bool = false, bTickEvenIfNotRendered:Bool = false, bAllowReducedWork:Bool = true, bForceInterpolate:Bool = false) : Void;
} | 5 | 0.714286 | 5 | 0 |
40c8d682b6949f0932eb04fb7d6ddd505399b187 | site/index.md | site/index.md | ---
title: Happy Shader Programming with Static Staging
---
An introduction goes here.
## Let's Draw Something
An example here.
var model = mat4.create();
# Load buffers and parameters for the model.
var mesh = bunny;
var position = mesh_positions(mesh);
var normal = mesh_normals(mesh);
var indices = mesh_indices(mesh);
var size = mesh_size(mesh);
# ---
render js<
vertex glsl<
gl_Position = projection * view *
vec4(position, 1.0);
fragment glsl<
gl_FragColor =
vec4(abs(normal), 1.0);
>
>;
draw_mesh(indices, size);
>
More text goes here.
## Another Example
We can use as many examples as we want!
var model = mat4.create();
# Load buffers and parameters for the model.
var mesh = bunny;
var position = mesh_positions(mesh);
var normal = mesh_normals(mesh);
var indices = mesh_indices(mesh);
var size = mesh_size(mesh);
# ---
render js<
vertex glsl<
gl_Position = projection * view *
vec4(position, 1.0);
fragment glsl<
gl_FragColor =
vec4(0.3, 0.1, 0.9, 1.0);
>
>;
draw_mesh(indices, size);
>
| ---
title: Happy Shader Programming with Static Staging
---
An introduction goes here.
## Let's Draw Something
An example here.
# Position the model.
var model = mat4.create();
mat4.scale(model, model, vec3(2.0, 2.0, 2.0));
mat4.translate(model, model, vec3(0.0, -2.0, 0.0));
# Load buffers and parameters for the model.
var mesh = bunny;
var position = mesh_positions(mesh);
var normal = mesh_normals(mesh);
var indices = mesh_indices(mesh);
var size = mesh_size(mesh);
# ---
render js<
vertex glsl<
gl_Position = projection * view * model *
vec4(position, 1.0);
fragment glsl<
gl_FragColor =
vec4(abs(normal), 1.0);
>
>;
draw_mesh(indices, size);
>
More text goes here.
## Another Example
We can use as many examples as we want!
var model = mat4.create();
# Load buffers and parameters for the model.
var mesh = bunny;
var position = mesh_positions(mesh);
var normal = mesh_normals(mesh);
var indices = mesh_indices(mesh);
var size = mesh_size(mesh);
# ---
render js<
vertex glsl<
gl_Position = projection * view *
vec4(position, 1.0);
fragment glsl<
gl_FragColor =
vec4(0.3, 0.1, 0.9, 1.0);
>
>;
draw_mesh(indices, size);
>
| Use a model matrix for better appearance | Use a model matrix for better appearance
| Markdown | mit | guoyiteng/braid,cucapra/braid,cucapra/braid,cucapra/braid,guoyiteng/braid,cucapra/braid,guoyiteng/braid,cucapra/braid,guoyiteng/braid,guoyiteng/braid,guoyiteng/braid,cucapra/braid | markdown | ## Code Before:
---
title: Happy Shader Programming with Static Staging
---
An introduction goes here.
## Let's Draw Something
An example here.
var model = mat4.create();
# Load buffers and parameters for the model.
var mesh = bunny;
var position = mesh_positions(mesh);
var normal = mesh_normals(mesh);
var indices = mesh_indices(mesh);
var size = mesh_size(mesh);
# ---
render js<
vertex glsl<
gl_Position = projection * view *
vec4(position, 1.0);
fragment glsl<
gl_FragColor =
vec4(abs(normal), 1.0);
>
>;
draw_mesh(indices, size);
>
More text goes here.
## Another Example
We can use as many examples as we want!
var model = mat4.create();
# Load buffers and parameters for the model.
var mesh = bunny;
var position = mesh_positions(mesh);
var normal = mesh_normals(mesh);
var indices = mesh_indices(mesh);
var size = mesh_size(mesh);
# ---
render js<
vertex glsl<
gl_Position = projection * view *
vec4(position, 1.0);
fragment glsl<
gl_FragColor =
vec4(0.3, 0.1, 0.9, 1.0);
>
>;
draw_mesh(indices, size);
>
## Instruction:
Use a model matrix for better appearance
## Code After:
---
title: Happy Shader Programming with Static Staging
---
An introduction goes here.
## Let's Draw Something
An example here.
# Position the model.
var model = mat4.create();
mat4.scale(model, model, vec3(2.0, 2.0, 2.0));
mat4.translate(model, model, vec3(0.0, -2.0, 0.0));
# Load buffers and parameters for the model.
var mesh = bunny;
var position = mesh_positions(mesh);
var normal = mesh_normals(mesh);
var indices = mesh_indices(mesh);
var size = mesh_size(mesh);
# ---
render js<
vertex glsl<
gl_Position = projection * view * model *
vec4(position, 1.0);
fragment glsl<
gl_FragColor =
vec4(abs(normal), 1.0);
>
>;
draw_mesh(indices, size);
>
More text goes here.
## Another Example
We can use as many examples as we want!
var model = mat4.create();
# Load buffers and parameters for the model.
var mesh = bunny;
var position = mesh_positions(mesh);
var normal = mesh_normals(mesh);
var indices = mesh_indices(mesh);
var size = mesh_size(mesh);
# ---
render js<
vertex glsl<
gl_Position = projection * view *
vec4(position, 1.0);
fragment glsl<
gl_FragColor =
vec4(0.3, 0.1, 0.9, 1.0);
>
>;
draw_mesh(indices, size);
>
| ---
title: Happy Shader Programming with Static Staging
---
An introduction goes here.
## Let's Draw Something
An example here.
+ # Position the model.
var model = mat4.create();
+ mat4.scale(model, model, vec3(2.0, 2.0, 2.0));
+ mat4.translate(model, model, vec3(0.0, -2.0, 0.0));
# Load buffers and parameters for the model.
var mesh = bunny;
var position = mesh_positions(mesh);
var normal = mesh_normals(mesh);
var indices = mesh_indices(mesh);
var size = mesh_size(mesh);
# ---
render js<
vertex glsl<
- gl_Position = projection * view *
+ gl_Position = projection * view * model *
? ++++++++
vec4(position, 1.0);
fragment glsl<
gl_FragColor =
vec4(abs(normal), 1.0);
>
>;
draw_mesh(indices, size);
>
More text goes here.
## Another Example
We can use as many examples as we want!
var model = mat4.create();
# Load buffers and parameters for the model.
var mesh = bunny;
var position = mesh_positions(mesh);
var normal = mesh_normals(mesh);
var indices = mesh_indices(mesh);
var size = mesh_size(mesh);
# ---
render js<
vertex glsl<
gl_Position = projection * view *
vec4(position, 1.0);
fragment glsl<
gl_FragColor =
vec4(0.3, 0.1, 0.9, 1.0);
>
>;
draw_mesh(indices, size);
> | 5 | 0.083333 | 4 | 1 |
f262402168981746e431e71eac5661c2726b5023 | package.json | package.json | {
"name": "restjs",
"company": "Azuqua, inc.",
"version": "0.1.0",
"description": "Lightweight REST client",
"author": "Kevin McTigue",
"private": true,
"main": "index",
"dependencies": {
"async": "0.2.10",
"qs": "^4.0.0",
"xml2js": "*"
},
"repository": {
"type": "git",
"url": "https://github.com/azuqua/restjs"
},
"keywords": [
"rest",
"http"
],
"engines": {
"node": "~0.12.2"
}
}
| {
"name": "restjs",
"company": "Azuqua, inc.",
"version": "0.1.0",
"description": "Lightweight REST client",
"author": "Kevin McTigue",
"private": true,
"main": "index",
"dependencies": {
"async": "0.2.10",
"qs": "^6.0.0",
"xml2js": "*"
},
"repository": {
"type": "git",
"url": "https://github.com/azuqua/restjs"
},
"keywords": [
"rest",
"http"
]
}
| Upgrade qs to version 6, remove engines declaration | chore: Upgrade qs to version 6, remove engines declaration
| JSON | mit | azuqua/restjs | json | ## Code Before:
{
"name": "restjs",
"company": "Azuqua, inc.",
"version": "0.1.0",
"description": "Lightweight REST client",
"author": "Kevin McTigue",
"private": true,
"main": "index",
"dependencies": {
"async": "0.2.10",
"qs": "^4.0.0",
"xml2js": "*"
},
"repository": {
"type": "git",
"url": "https://github.com/azuqua/restjs"
},
"keywords": [
"rest",
"http"
],
"engines": {
"node": "~0.12.2"
}
}
## Instruction:
chore: Upgrade qs to version 6, remove engines declaration
## Code After:
{
"name": "restjs",
"company": "Azuqua, inc.",
"version": "0.1.0",
"description": "Lightweight REST client",
"author": "Kevin McTigue",
"private": true,
"main": "index",
"dependencies": {
"async": "0.2.10",
"qs": "^6.0.0",
"xml2js": "*"
},
"repository": {
"type": "git",
"url": "https://github.com/azuqua/restjs"
},
"keywords": [
"rest",
"http"
]
}
| {
"name": "restjs",
"company": "Azuqua, inc.",
"version": "0.1.0",
"description": "Lightweight REST client",
"author": "Kevin McTigue",
"private": true,
"main": "index",
"dependencies": {
"async": "0.2.10",
- "qs": "^4.0.0",
? ^
+ "qs": "^6.0.0",
? ^
"xml2js": "*"
},
"repository": {
"type": "git",
"url": "https://github.com/azuqua/restjs"
},
"keywords": [
"rest",
"http"
- ],
? -
+ ]
- "engines": {
- "node": "~0.12.2"
- }
} | 7 | 0.28 | 2 | 5 |
27165dd4fa3b207562717d2f76a5fabd4588cb19 | lib/gittake.rb | lib/gittake.rb | libdir = File.dirname(__FILE__)
$LOAD_PATH.unshift(libdir) unless $LOAD_PATH.include?(libdir)
require "gittake/version"
require "grit"
require "colormath"
require "rainbow"
module Gittake
class Blame
def initialize(repopath, blamepath)
@repo = ::Grit::Repo.new(repopath)
@blame = ::Grit::Blame.new(@repo, blamepath, @repo.commits.first)
@dates = @blame.lines.map(&:commit).map(&:committed_date)
@most_recent = @dates.max
@age = @dates.max - @dates.min
end
# Percent -- higher is newer
def age_rating(date)
since(date).to_f / @age.to_f
end
def since(date)
# Invert, so closest gives highest number
@most_recent - date
end
def rg_scale(rating)
::ColorMath::HSL.new(120 * rating, 1, 0.5)
end
def graph_blame
pry = true
@blame.lines.each do |line|
date = line.commit.date
cm = rg_scale(age_rating(date))
puts line.line.color(cm.red * 255, cm.green * 255, cm.blue * 255)
end
end
end
end
app = Gittake::Blame.new ARGV[0], ARGV[1]
app.graph_blame
| libdir = File.dirname(__FILE__)
$LOAD_PATH.unshift(libdir) unless $LOAD_PATH.include?(libdir)
require "gittake/version"
require "grit"
require "colormath"
require "rainbow"
module Gittake
class Blame
def initialize(blamepath)
@repo = ::Grit::Repo.new(Dir.pwd)
@blame = ::Grit::Blame.new(@repo, blamepath, @repo.commits.first)
@dates = @blame.lines.map(&:commit).map(&:committed_date)
@most_recent = @dates.max
@age = @dates.max - @dates.min
end
# Percent -- higher is newer
def age_rating(date)
since(date).to_f / @age.to_f
end
def since(date)
# Invert, so closest gives highest number
@most_recent - date
end
def rg_scale(rating)
::ColorMath::HSL.new(120 * rating, 1, 0.5)
end
def graph_blame
pry = true
@blame.lines.each do |line|
date = line.commit.date
cm = rg_scale(age_rating(date))
puts line.line.color(cm.red * 255, cm.green * 255, cm.blue * 255)
end
end
end
end
app = Gittake::Blame.new ARGV[0]
app.graph_blame
| Use execute dir for the repo | Use execute dir for the repo
| Ruby | mit | jessicalevine/gittake | ruby | ## Code Before:
libdir = File.dirname(__FILE__)
$LOAD_PATH.unshift(libdir) unless $LOAD_PATH.include?(libdir)
require "gittake/version"
require "grit"
require "colormath"
require "rainbow"
module Gittake
class Blame
def initialize(repopath, blamepath)
@repo = ::Grit::Repo.new(repopath)
@blame = ::Grit::Blame.new(@repo, blamepath, @repo.commits.first)
@dates = @blame.lines.map(&:commit).map(&:committed_date)
@most_recent = @dates.max
@age = @dates.max - @dates.min
end
# Percent -- higher is newer
def age_rating(date)
since(date).to_f / @age.to_f
end
def since(date)
# Invert, so closest gives highest number
@most_recent - date
end
def rg_scale(rating)
::ColorMath::HSL.new(120 * rating, 1, 0.5)
end
def graph_blame
pry = true
@blame.lines.each do |line|
date = line.commit.date
cm = rg_scale(age_rating(date))
puts line.line.color(cm.red * 255, cm.green * 255, cm.blue * 255)
end
end
end
end
app = Gittake::Blame.new ARGV[0], ARGV[1]
app.graph_blame
## Instruction:
Use execute dir for the repo
## Code After:
libdir = File.dirname(__FILE__)
$LOAD_PATH.unshift(libdir) unless $LOAD_PATH.include?(libdir)
require "gittake/version"
require "grit"
require "colormath"
require "rainbow"
module Gittake
class Blame
def initialize(blamepath)
@repo = ::Grit::Repo.new(Dir.pwd)
@blame = ::Grit::Blame.new(@repo, blamepath, @repo.commits.first)
@dates = @blame.lines.map(&:commit).map(&:committed_date)
@most_recent = @dates.max
@age = @dates.max - @dates.min
end
# Percent -- higher is newer
def age_rating(date)
since(date).to_f / @age.to_f
end
def since(date)
# Invert, so closest gives highest number
@most_recent - date
end
def rg_scale(rating)
::ColorMath::HSL.new(120 * rating, 1, 0.5)
end
def graph_blame
pry = true
@blame.lines.each do |line|
date = line.commit.date
cm = rg_scale(age_rating(date))
puts line.line.color(cm.red * 255, cm.green * 255, cm.blue * 255)
end
end
end
end
app = Gittake::Blame.new ARGV[0]
app.graph_blame
| libdir = File.dirname(__FILE__)
$LOAD_PATH.unshift(libdir) unless $LOAD_PATH.include?(libdir)
require "gittake/version"
require "grit"
require "colormath"
require "rainbow"
module Gittake
class Blame
- def initialize(repopath, blamepath)
? ----------
+ def initialize(blamepath)
- @repo = ::Grit::Repo.new(repopath)
? ^ ^^^^^
+ @repo = ::Grit::Repo.new(Dir.pwd)
? ++ ^ ^^
@blame = ::Grit::Blame.new(@repo, blamepath, @repo.commits.first)
@dates = @blame.lines.map(&:commit).map(&:committed_date)
@most_recent = @dates.max
@age = @dates.max - @dates.min
end
# Percent -- higher is newer
def age_rating(date)
since(date).to_f / @age.to_f
end
def since(date)
# Invert, so closest gives highest number
@most_recent - date
end
def rg_scale(rating)
::ColorMath::HSL.new(120 * rating, 1, 0.5)
end
def graph_blame
pry = true
@blame.lines.each do |line|
date = line.commit.date
cm = rg_scale(age_rating(date))
puts line.line.color(cm.red * 255, cm.green * 255, cm.blue * 255)
end
end
end
end
- app = Gittake::Blame.new ARGV[0], ARGV[1]
? ---------
+ app = Gittake::Blame.new ARGV[0]
app.graph_blame | 6 | 0.130435 | 3 | 3 |
36632d14acbe9bf65e69aed0bc6ca69c20033d98 | lib/shell/plugins/python/env.zsh | lib/shell/plugins/python/env.zsh | for dir in "$HOME"/Library/Python/*; do
if [ -d "$dir/bin" ]; then
export PATH="$PATH:$dir/bin"
fi
done
| if [ -d "$HOME"/Library/Python ]; then
find "$HOME"/Library/Python -mindepth 2 -maxdepth 2 -type d -name bin -print0 |
while IFS= read -r -d '' dir; do
export PATH="$PATH:$dir"
done
fi
| Fix looping over dirs that don’t exist | shell: Fix looping over dirs that don’t exist
Fails with glob error when the directory doesn’t exist, or it has no
children.
| Shell | mit | amarshall/dotfiles,amarshall/dotfiles,amarshall/dotfiles,amarshall/dotfiles | shell | ## Code Before:
for dir in "$HOME"/Library/Python/*; do
if [ -d "$dir/bin" ]; then
export PATH="$PATH:$dir/bin"
fi
done
## Instruction:
shell: Fix looping over dirs that don’t exist
Fails with glob error when the directory doesn’t exist, or it has no
children.
## Code After:
if [ -d "$HOME"/Library/Python ]; then
find "$HOME"/Library/Python -mindepth 2 -maxdepth 2 -type d -name bin -print0 |
while IFS= read -r -d '' dir; do
export PATH="$PATH:$dir"
done
fi
| - for dir in "$HOME"/Library/Python/*; do
- if [ -d "$dir/bin" ]; then
+ if [ -d "$HOME"/Library/Python ]; then
+ find "$HOME"/Library/Python -mindepth 2 -maxdepth 2 -type d -name bin -print0 |
+ while IFS= read -r -d '' dir; do
- export PATH="$PATH:$dir/bin"
? ----
+ export PATH="$PATH:$dir"
? ++
- fi
- done
+ done
+ fi | 11 | 2.2 | 6 | 5 |
06ed4b04c11297aebe05ccde74dd632d1e42532e | README.md | README.md | Source code of my personal site that publish on https://pongkiat.surge.sh
## Design
- Responsive
- ...
| Source code of my personal site that publish on [https://pongkiat.surge.sh](https://pongkiat.surge.sh)
## Design
- Responsive
- ...
| Add web link to my web site | Add web link to my web site
| Markdown | mit | pongkiat/myPersonalSite,pongkiat/myPersonalSite | markdown | ## Code Before:
Source code of my personal site that publish on https://pongkiat.surge.sh
## Design
- Responsive
- ...
## Instruction:
Add web link to my web site
## Code After:
Source code of my personal site that publish on [https://pongkiat.surge.sh](https://pongkiat.surge.sh)
## Design
- Responsive
- ...
| - Source code of my personal site that publish on https://pongkiat.surge.sh
+ Source code of my personal site that publish on [https://pongkiat.surge.sh](https://pongkiat.surge.sh)
? + ++++++++++++++++++++++++++++
## Design
- Responsive
- ... | 2 | 0.4 | 1 | 1 |
d7073e0565ceb8ef674156395bfbdc30799a3eb9 | assets/main.scss | assets/main.scss | ---
---
@import "variables";
@import "bootstrap/bootstrap";
@import "syntax-highlighting";
@import "bootstrap-4-jekyll/bootstrap-4-jekyll";
@import "bootstrap_customization";
| ---
---
@import "variables";
@import "bootstrap/bootstrap";
@import "syntax-highlighting";
@import "bootstrap-4-jekyll/bootstrap-4-jekyll";
@import "bootstrap_customization";
body { padding-top: 70px; } | Add styles to compensate for fixed-top overlap | Add styles to compensate for fixed-top overlap
| SCSS | mit | leonardreidy/leonardreidy.github.io,leonardreidy/leonardreidy.github.io,leonardreidy/leonardreidy.github.io | scss | ## Code Before:
---
---
@import "variables";
@import "bootstrap/bootstrap";
@import "syntax-highlighting";
@import "bootstrap-4-jekyll/bootstrap-4-jekyll";
@import "bootstrap_customization";
## Instruction:
Add styles to compensate for fixed-top overlap
## Code After:
---
---
@import "variables";
@import "bootstrap/bootstrap";
@import "syntax-highlighting";
@import "bootstrap-4-jekyll/bootstrap-4-jekyll";
@import "bootstrap_customization";
body { padding-top: 70px; } | ---
---
@import "variables";
@import "bootstrap/bootstrap";
@import "syntax-highlighting";
@import "bootstrap-4-jekyll/bootstrap-4-jekyll";
@import "bootstrap_customization";
+
+ body { padding-top: 70px; } | 2 | 0.285714 | 2 | 0 |
ce4e6d0e9026af484478cbfb5a365b91a0870728 | resources/views/blocks/ninjas.blade.php | resources/views/blocks/ninjas.blade.php | <div class="block">
<div class="card-heading">Ninja Miners</div>
@foreach ($ninjas as $ninja)
@include('common.card', [
'size' => 'small',
'avatar' => $ninja->avatar,
'name' => $ninja->name . ' (' . $ninja->corporation->name . ')',
'amount' => $ninja->amount_owed
])
@endforeach
</div>
| <div class="block">
<div class="card-heading">Ninja Miners</div>
@foreach ($ninjas as $ninja)
@include('common.card', [
'size' => 'small',
'avatar' => $ninja->avatar,
'name' => $ninja->name . ' (' . (isset($ninja->corporation->name) ? $ninja->corporation->name : 'UNKNOWN') . ')',
'amount' => $ninja->amount_owed
])
@endforeach
</div>
| Stop unknown ninja miner corps from breaking the homepage. | Stop unknown ninja miner corps from breaking the homepage.
| PHP | mit | matthewpennell/moon-mining-manager,matthewpennell/moon-mining-manager | php | ## Code Before:
<div class="block">
<div class="card-heading">Ninja Miners</div>
@foreach ($ninjas as $ninja)
@include('common.card', [
'size' => 'small',
'avatar' => $ninja->avatar,
'name' => $ninja->name . ' (' . $ninja->corporation->name . ')',
'amount' => $ninja->amount_owed
])
@endforeach
</div>
## Instruction:
Stop unknown ninja miner corps from breaking the homepage.
## Code After:
<div class="block">
<div class="card-heading">Ninja Miners</div>
@foreach ($ninjas as $ninja)
@include('common.card', [
'size' => 'small',
'avatar' => $ninja->avatar,
'name' => $ninja->name . ' (' . (isset($ninja->corporation->name) ? $ninja->corporation->name : 'UNKNOWN') . ')',
'amount' => $ninja->amount_owed
])
@endforeach
</div>
| <div class="block">
<div class="card-heading">Ninja Miners</div>
@foreach ($ninjas as $ninja)
@include('common.card', [
'size' => 'small',
'avatar' => $ninja->avatar,
- 'name' => $ninja->name . ' (' . $ninja->corporation->name . ')',
+ 'name' => $ninja->name . ' (' . (isset($ninja->corporation->name) ? $ninja->corporation->name : 'UNKNOWN') . ')',
? ++++++++++++++++++++++++++++++++++++ +++++++++++++
'amount' => $ninja->amount_owed
])
@endforeach
</div> | 2 | 0.142857 | 1 | 1 |
743ac57c0d62e01b9a8ad3481792271d91a0dcce | .travis.yml | .travis.yml | language: elixir
env:
- NODE_VERSION="6.10.2"
before_install:
- mix local.rebar --force
- mix local.hex --force
- nvm install $NODE_VERSION
install:
- mix deps.get
- npm install
elixir:
- 1.6.3
matrix:
include:
- elixir: 1.6
- elixir: 1.5
- elixir: 1.4
- elixir: 1.3
- elixir: 1.2
| language: elixir
env:
- NODE_VERSION="6.10.2"
before_install:
- mix local.rebar --force
- mix local.hex --force
- nvm install $NODE_VERSION
install:
- mix deps.get
- npm install
elixir:
- 1.6.3
matrix:
include:
- elixir: 1.6
- elixir: 1.5
- elixir: 1.4
- elixir: 1.3
| Remove Elixir v1.2 from tests | Remove Elixir v1.2 from tests
| YAML | mit | geolessel/react-phoenix | yaml | ## Code Before:
language: elixir
env:
- NODE_VERSION="6.10.2"
before_install:
- mix local.rebar --force
- mix local.hex --force
- nvm install $NODE_VERSION
install:
- mix deps.get
- npm install
elixir:
- 1.6.3
matrix:
include:
- elixir: 1.6
- elixir: 1.5
- elixir: 1.4
- elixir: 1.3
- elixir: 1.2
## Instruction:
Remove Elixir v1.2 from tests
## Code After:
language: elixir
env:
- NODE_VERSION="6.10.2"
before_install:
- mix local.rebar --force
- mix local.hex --force
- nvm install $NODE_VERSION
install:
- mix deps.get
- npm install
elixir:
- 1.6.3
matrix:
include:
- elixir: 1.6
- elixir: 1.5
- elixir: 1.4
- elixir: 1.3
| language: elixir
env:
- NODE_VERSION="6.10.2"
before_install:
- mix local.rebar --force
- mix local.hex --force
- nvm install $NODE_VERSION
install:
- mix deps.get
- npm install
elixir:
- 1.6.3
matrix:
include:
- elixir: 1.6
- elixir: 1.5
- elixir: 1.4
- elixir: 1.3
- - elixir: 1.2 | 1 | 0.05 | 0 | 1 |
81b3c21661d2f9530912d4213a4804d6c925befd | README.md | README.md |
Memorable keyboard shortcuts for Imgur.
Because no one really remembers the defaults (why does `0` = star?).
## What are the shortcuts?
All highlighted shortcuts are available. [Imgur's default shortcuts](http://imgur.com/IO1WONW) are shown in parentheses for reference.
- **previous image** - `j` or `,` or (`left` by default)
- **next image** - `k` or `.` or (`right` by default)
- **favorite image** - `l` or `f` (`0` by default)
- **view comments on reddit** - `r` or `v` or `c`
|
Memorable keyboard shortcuts for Imgur.
Because no one really remembers the defaults (why does `0` = star?).
## Current Status
MVP works, but only minimally tested.
## What are the shortcuts?
All highlighted shortcuts are available. [Imgur's default shortcuts](http://imgur.com/IO1WONW) are shown in parentheses for reference.
- **previous image** - `j` or `,` or (`left` by default)
- **next image** - `k` or `.` or (`right` by default)
- **favorite image** - `l` or `f` (`0` by default)
- **view comments on reddit** - `r` or `v` or `c`
| Add current status to readme. | Add current status to readme.
| Markdown | mit | tedmiston/imgur-shortcuts | markdown | ## Code Before:
Memorable keyboard shortcuts for Imgur.
Because no one really remembers the defaults (why does `0` = star?).
## What are the shortcuts?
All highlighted shortcuts are available. [Imgur's default shortcuts](http://imgur.com/IO1WONW) are shown in parentheses for reference.
- **previous image** - `j` or `,` or (`left` by default)
- **next image** - `k` or `.` or (`right` by default)
- **favorite image** - `l` or `f` (`0` by default)
- **view comments on reddit** - `r` or `v` or `c`
## Instruction:
Add current status to readme.
## Code After:
Memorable keyboard shortcuts for Imgur.
Because no one really remembers the defaults (why does `0` = star?).
## Current Status
MVP works, but only minimally tested.
## What are the shortcuts?
All highlighted shortcuts are available. [Imgur's default shortcuts](http://imgur.com/IO1WONW) are shown in parentheses for reference.
- **previous image** - `j` or `,` or (`left` by default)
- **next image** - `k` or `.` or (`right` by default)
- **favorite image** - `l` or `f` (`0` by default)
- **view comments on reddit** - `r` or `v` or `c`
|
Memorable keyboard shortcuts for Imgur.
Because no one really remembers the defaults (why does `0` = star?).
+
+ ## Current Status
+
+ MVP works, but only minimally tested.
## What are the shortcuts?
All highlighted shortcuts are available. [Imgur's default shortcuts](http://imgur.com/IO1WONW) are shown in parentheses for reference.
- **previous image** - `j` or `,` or (`left` by default)
- **next image** - `k` or `.` or (`right` by default)
- **favorite image** - `l` or `f` (`0` by default)
- **view comments on reddit** - `r` or `v` or `c` | 4 | 0.307692 | 4 | 0 |
3213d4fdf5852aaa3dc2cde4c5cd563abbd011e1 | ethereumj-core/src/test/java/org/ethereum/solidity/CompilerTest.java | ethereumj-core/src/test/java/org/ethereum/solidity/CompilerTest.java | package org.ethereum.solidity;
import org.ethereum.solidity.compiler.CompilationResult;
import org.ethereum.solidity.compiler.SolidityCompiler;
import org.junit.Ignore;
import org.junit.Test;
import java.io.IOException;
/**
* Created by Anton Nashatyrev on 03.03.2016.
*/
public class CompilerTest {
@Test
public void simpleTest() throws IOException {
String contract =
"contract a {" +
" int i1;" +
" function i() returns (int) {" +
" return i1;" +
" }" +
"}";
SolidityCompiler.Result res = SolidityCompiler.compile(
contract.getBytes(), true, SolidityCompiler.Options.ABI, SolidityCompiler.Options.BIN, SolidityCompiler.Options.INTERFACE);
System.out.println(res.output);
System.out.println(res.errors);
CompilationResult result = CompilationResult.parse(res.output);
System.out.println(result.contracts.get("a").bin);
}
public static void main(String[] args) throws Exception {
new CompilerTest().simpleTest();
}
}
| package org.ethereum.solidity;
import org.ethereum.solidity.compiler.CompilationResult;
import org.ethereum.solidity.compiler.SolidityCompiler;
import org.junit.Ignore;
import org.junit.Test;
import java.io.IOException;
/**
* Created by Anton Nashatyrev on 03.03.2016.
*/
public class CompilerTest {
@Test
public void simpleTest() throws IOException {
String contract =
"contract a {" +
" int i1;" +
" function i() returns (int) {" +
" return i1;" +
" }" +
"}";
SolidityCompiler.Result res = SolidityCompiler.compile(
contract.getBytes(), true, SolidityCompiler.Options.ABI, SolidityCompiler.Options.BIN, SolidityCompiler.Options.INTERFACE);
System.out.println("Out: '" + res.output + "'");
System.out.println("Err: '" + res.errors + "'");
CompilationResult result = CompilationResult.parse(res.output);
System.out.println(result.contracts.get("a").bin);
}
public static void main(String[] args) throws Exception {
new CompilerTest().simpleTest();
}
}
| Add bit more out in test | Add bit more out in test
| Java | mit | chengtalent/ethereumj,loxal/FreeEthereum,loxal/FreeEthereum,loxal/ethereumj,loxal/FreeEthereum,caxqueiroz/ethereumj | java | ## Code Before:
package org.ethereum.solidity;
import org.ethereum.solidity.compiler.CompilationResult;
import org.ethereum.solidity.compiler.SolidityCompiler;
import org.junit.Ignore;
import org.junit.Test;
import java.io.IOException;
/**
* Created by Anton Nashatyrev on 03.03.2016.
*/
public class CompilerTest {
@Test
public void simpleTest() throws IOException {
String contract =
"contract a {" +
" int i1;" +
" function i() returns (int) {" +
" return i1;" +
" }" +
"}";
SolidityCompiler.Result res = SolidityCompiler.compile(
contract.getBytes(), true, SolidityCompiler.Options.ABI, SolidityCompiler.Options.BIN, SolidityCompiler.Options.INTERFACE);
System.out.println(res.output);
System.out.println(res.errors);
CompilationResult result = CompilationResult.parse(res.output);
System.out.println(result.contracts.get("a").bin);
}
public static void main(String[] args) throws Exception {
new CompilerTest().simpleTest();
}
}
## Instruction:
Add bit more out in test
## Code After:
package org.ethereum.solidity;
import org.ethereum.solidity.compiler.CompilationResult;
import org.ethereum.solidity.compiler.SolidityCompiler;
import org.junit.Ignore;
import org.junit.Test;
import java.io.IOException;
/**
* Created by Anton Nashatyrev on 03.03.2016.
*/
public class CompilerTest {
@Test
public void simpleTest() throws IOException {
String contract =
"contract a {" +
" int i1;" +
" function i() returns (int) {" +
" return i1;" +
" }" +
"}";
SolidityCompiler.Result res = SolidityCompiler.compile(
contract.getBytes(), true, SolidityCompiler.Options.ABI, SolidityCompiler.Options.BIN, SolidityCompiler.Options.INTERFACE);
System.out.println("Out: '" + res.output + "'");
System.out.println("Err: '" + res.errors + "'");
CompilationResult result = CompilationResult.parse(res.output);
System.out.println(result.contracts.get("a").bin);
}
public static void main(String[] args) throws Exception {
new CompilerTest().simpleTest();
}
}
| package org.ethereum.solidity;
import org.ethereum.solidity.compiler.CompilationResult;
import org.ethereum.solidity.compiler.SolidityCompiler;
import org.junit.Ignore;
import org.junit.Test;
import java.io.IOException;
/**
* Created by Anton Nashatyrev on 03.03.2016.
*/
public class CompilerTest {
@Test
public void simpleTest() throws IOException {
String contract =
"contract a {" +
" int i1;" +
" function i() returns (int) {" +
" return i1;" +
" }" +
"}";
SolidityCompiler.Result res = SolidityCompiler.compile(
contract.getBytes(), true, SolidityCompiler.Options.ABI, SolidityCompiler.Options.BIN, SolidityCompiler.Options.INTERFACE);
- System.out.println(res.output);
+ System.out.println("Out: '" + res.output + "'");
? +++++++++++ ++++++
- System.out.println(res.errors);
+ System.out.println("Err: '" + res.errors + "'");
? +++++++++++ ++++++
CompilationResult result = CompilationResult.parse(res.output);
System.out.println(result.contracts.get("a").bin);
}
public static void main(String[] args) throws Exception {
new CompilerTest().simpleTest();
}
} | 4 | 0.114286 | 2 | 2 |
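The commit above wraps the compiler's captured output in quotes so that empty or whitespace-only streams become visible in the test log. The same debugging idea, sketched in Python (function and variable names here are illustrative, not taken from the ethereumj code):

```python
def describe_output(out, err):
    """Quote captured streams so empty or whitespace-only output is obvious.

    A bare print of an empty string shows nothing; "Out: ''" makes the
    emptiness explicit, which is the point of the change above.
    """
    return ["Out: '%s'" % out, "Err: '%s'" % err]

lines = describe_output("", "  \n")
```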
b78f6e0e9f8a9ca3e7c8c096400dd32428550397 | lib/fs/readdir-directories.js | lib/fs/readdir-directories.js | // Read all filenames from directory and it's subdirectories
'use strict';
var fs = require('fs')
, aritize = require('es5-ext/lib/Function/aritize').call
, curry = require('es5-ext/lib/Function/curry').call
, invoke = require('es5-ext/lib/Function/invoke')
, k = require('es5-ext/lib/Function/k')
, a2p = require('deferred/lib/async-to-promise').call
, ba2p = require('deferred/lib/async-to-promise').bind
, all = require('deferred/lib/join/all')
, concat = aritize(String.prototype.concat, 1)
, trim = require('../path/trim')
, readdir = ba2p(fs.readdir), stat = ba2p(fs.lstat);
require('deferred/lib/ext/cb');
module.exports = function self (path, callback) {
readdir(path = trim(path))
(function (files) {
return all(files, function (file) {
var npath = path + '/' + file;
return stat(npath)
(function (stats) {
return stats.isDirectory() ? file : null;
}, k(null));
})
(invoke('filter', Boolean));
}).cb(callback);
};
| // Read all filenames from directory and it's subdirectories
'use strict';
var fs = require('fs')
, aritize = require('es5-ext/lib/Function/prototype/aritize')
, invoke = require('es5-ext/lib/Function/invoke')
, k = require('es5-ext/lib/Function/k')
, a2p = require('deferred/lib/async-to-promise').call
, ba2p = require('deferred/lib/async-to-promise').bind
, all = require('deferred/lib/join/all')
, concat = aritize.call(String.prototype.concat, 1)
, trim = require('../path/trim')
, readdir = ba2p(fs.readdir), stat = ba2p(fs.lstat);
require('deferred/lib/ext/cb');
module.exports = function self (path, callback) {
readdir(path = trim(path))
(function (files) {
return all(files, function (file) {
var npath = path + '/' + file;
return stat(npath)
(function (stats) {
return stats.isDirectory() ? file : null;
}, k(null));
})
(invoke('filter', Boolean));
}).cb(callback);
};
| Update up to changes in es5-ext | Update up to changes in es5-ext
| JavaScript | mit | medikoo/node-ext | javascript | ## Code Before:
// Read all filenames from directory and it's subdirectories
'use strict';
var fs = require('fs')
, aritize = require('es5-ext/lib/Function/aritize').call
, curry = require('es5-ext/lib/Function/curry').call
, invoke = require('es5-ext/lib/Function/invoke')
, k = require('es5-ext/lib/Function/k')
, a2p = require('deferred/lib/async-to-promise').call
, ba2p = require('deferred/lib/async-to-promise').bind
, all = require('deferred/lib/join/all')
, concat = aritize(String.prototype.concat, 1)
, trim = require('../path/trim')
, readdir = ba2p(fs.readdir), stat = ba2p(fs.lstat);
require('deferred/lib/ext/cb');
module.exports = function self (path, callback) {
readdir(path = trim(path))
(function (files) {
return all(files, function (file) {
var npath = path + '/' + file;
return stat(npath)
(function (stats) {
return stats.isDirectory() ? file : null;
}, k(null));
})
(invoke('filter', Boolean));
}).cb(callback);
};
## Instruction:
Update up to changes in es5-ext
## Code After:
// Read all filenames from directory and it's subdirectories
'use strict';
var fs = require('fs')
, aritize = require('es5-ext/lib/Function/prototype/aritize')
, invoke = require('es5-ext/lib/Function/invoke')
, k = require('es5-ext/lib/Function/k')
, a2p = require('deferred/lib/async-to-promise').call
, ba2p = require('deferred/lib/async-to-promise').bind
, all = require('deferred/lib/join/all')
, concat = aritize.call(String.prototype.concat, 1)
, trim = require('../path/trim')
, readdir = ba2p(fs.readdir), stat = ba2p(fs.lstat);
require('deferred/lib/ext/cb');
module.exports = function self (path, callback) {
readdir(path = trim(path))
(function (files) {
return all(files, function (file) {
var npath = path + '/' + file;
return stat(npath)
(function (stats) {
return stats.isDirectory() ? file : null;
}, k(null));
})
(invoke('filter', Boolean));
}).cb(callback);
};
| // Read all filenames from directory and it's subdirectories
'use strict';
var fs = require('fs')
- , aritize = require('es5-ext/lib/Function/aritize').call
? -----
+ , aritize = require('es5-ext/lib/Function/prototype/aritize')
? ++++++++++
- , curry = require('es5-ext/lib/Function/curry').call
, invoke = require('es5-ext/lib/Function/invoke')
, k = require('es5-ext/lib/Function/k')
, a2p = require('deferred/lib/async-to-promise').call
, ba2p = require('deferred/lib/async-to-promise').bind
, all = require('deferred/lib/join/all')
- , concat = aritize(String.prototype.concat, 1)
+ , concat = aritize.call(String.prototype.concat, 1)
? +++++
, trim = require('../path/trim')
, readdir = ba2p(fs.readdir), stat = ba2p(fs.lstat);
require('deferred/lib/ext/cb');
module.exports = function self (path, callback) {
readdir(path = trim(path))
(function (files) {
return all(files, function (file) {
var npath = path + '/' + file;
return stat(npath)
(function (stats) {
return stats.isDirectory() ? file : null;
}, k(null));
})
(invoke('filter', Boolean));
}).cb(callback);
}; | 5 | 0.151515 | 2 | 3 |
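The JavaScript above resolves, for a given path, the names of entries that are directories (stat each entry, keep it only if `isDirectory()`). The same filtering can be sketched synchronously in Python with only the standard library:

```python
import os
import tempfile

def subdirectories(path):
    """Return the names of entries in `path` that are directories."""
    return [name for name in os.listdir(path)
            if os.path.isdir(os.path.join(path, name))]

# Tiny demonstration in a throwaway directory: one subdirectory, one file.
root = tempfile.mkdtemp()
os.mkdir(os.path.join(root, "child"))
open(os.path.join(root, "file.txt"), "w").close()
dirs = subdirectories(root)
```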
c10f8e6ac6c79b95c131658f8fe4c376acccc163 | anna_tests.sh | anna_tests.sh |
error_count=0
test_count=0
for i in tests/*.anna; do
# echo "Run test $i"
./anna tests/$(basename $i .anna) >anna_tests.out 2>/dev/null
status=$?
out_correct=tests/$(basename $i .anna).output
status_correct_file=tests/$(basename $i .anna).status
status_correct=0
error=0
if test -f $out_correct; then
if diff anna_tests.out $out_correct; then
true
else
error=1
echo "Error in output for test $i!!"
fi
fi
if test -f $status_correct_file; then
status_correct=$(cat $status_correct_file)
fi
if test "$status" != "$status_correct"; then
echo "Error in exit status for test $i." $status_correct != $status
error=1
fi
error_count=$(echo $error_count + $error|bc)
test_count=$(echo $test_count + 1|bc)
done
echo "Found $error_count errors while running $test_count tests" |
error_count=0
test_count=0
for i in tests/*.anna; do
# echo "Run test $i"
./anna tests/$(basename $i .anna) >anna_tests.out 2>/dev/null
status=$?
out_correct=tests/$(basename $i .anna).output
status_correct_file=tests/$(basename $i .anna).status
status_correct=0
error=0
if test -f $out_correct; then
if diff anna_tests.out $out_correct; then
true
else
error=1
echo "Error in output for test $i!!"
fi
fi
if test -f $status_correct_file; then
status_correct=$(cat $status_correct_file)
fi
if test "$status" != "$status_correct"; then
echo "Error in exit status for test $i." $status_correct != $status
error=1
fi
error_count=$(echo $error_count + $error|bc)
test_count=$(echo $test_count + 1|bc)
done
echo "Found $error_count errors while running $test_count tests"
test $error_count = 0 | Correct exit status on regression check tests | Correct exit status on regression check tests
| Shell | bsd-2-clause | liljencrantz/anna,liljencrantz/anna,liljencrantz/anna,liljencrantz/anna,liljencrantz/anna | shell | ## Code Before:
error_count=0
test_count=0
for i in tests/*.anna; do
# echo "Run test $i"
./anna tests/$(basename $i .anna) >anna_tests.out 2>/dev/null
status=$?
out_correct=tests/$(basename $i .anna).output
status_correct_file=tests/$(basename $i .anna).status
status_correct=0
error=0
if test -f $out_correct; then
if diff anna_tests.out $out_correct; then
true
else
error=1
echo "Error in output for test $i!!"
fi
fi
if test -f $status_correct_file; then
status_correct=$(cat $status_correct_file)
fi
if test "$status" != "$status_correct"; then
echo "Error in exit status for test $i." $status_correct != $status
error=1
fi
error_count=$(echo $error_count + $error|bc)
test_count=$(echo $test_count + 1|bc)
done
echo "Found $error_count errors while running $test_count tests"
## Instruction:
Correct exit status on regression check tests
## Code After:
error_count=0
test_count=0
for i in tests/*.anna; do
# echo "Run test $i"
./anna tests/$(basename $i .anna) >anna_tests.out 2>/dev/null
status=$?
out_correct=tests/$(basename $i .anna).output
status_correct_file=tests/$(basename $i .anna).status
status_correct=0
error=0
if test -f $out_correct; then
if diff anna_tests.out $out_correct; then
true
else
error=1
echo "Error in output for test $i!!"
fi
fi
if test -f $status_correct_file; then
status_correct=$(cat $status_correct_file)
fi
if test "$status" != "$status_correct"; then
echo "Error in exit status for test $i." $status_correct != $status
error=1
fi
error_count=$(echo $error_count + $error|bc)
test_count=$(echo $test_count + 1|bc)
done
echo "Found $error_count errors while running $test_count tests"
test $error_count = 0 |
error_count=0
test_count=0
for i in tests/*.anna; do
# echo "Run test $i"
./anna tests/$(basename $i .anna) >anna_tests.out 2>/dev/null
status=$?
out_correct=tests/$(basename $i .anna).output
status_correct_file=tests/$(basename $i .anna).status
status_correct=0
error=0
if test -f $out_correct; then
if diff anna_tests.out $out_correct; then
true
else
error=1
echo "Error in output for test $i!!"
fi
fi
+
if test -f $status_correct_file; then
status_correct=$(cat $status_correct_file)
fi
if test "$status" != "$status_correct"; then
echo "Error in exit status for test $i." $status_correct != $status
error=1
fi
error_count=$(echo $error_count + $error|bc)
test_count=$(echo $test_count + 1|bc)
done
echo "Found $error_count errors while running $test_count tests"
+ test $error_count = 0 | 2 | 0.058824 | 2 | 0 |
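The harness above compares each test's stdout against a `.output` file and its exit status against a `.status` file, and the added final line makes the script itself exit non-zero when any case failed. A compact Python equivalent of the per-case comparison (a sketch, not the anna project's actual tooling):

```python
import subprocess
import sys

def run_case(cmd, expected_output, expected_status=0):
    """Run `cmd`; report whether stdout and exit status match expectations,
    mirroring the .output / .status comparison in the shell harness."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return proc.stdout == expected_output, proc.returncode == expected_status

out_ok, status_ok = run_case(
    [sys.executable, "-c", "print('hi')"],
    expected_output="hi\n",
    expected_status=0,
)
```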
9149ee0d3d32d77a70d2111365004d455c4e6ca8 | core/tests/org.openhealthtools.mdht.uml.hl7.datatypes.test/src/org/openhealthtools/mdht/uml/hl7/datatypes/operations/ALLDatatypeOperationsTests.java | core/tests/org.openhealthtools.mdht.uml.hl7.datatypes.test/src/org/openhealthtools/mdht/uml/hl7/datatypes/operations/ALLDatatypeOperationsTests.java | /**
* Copyright (c) 2010 IBM Corporation
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* IBM Corporation - initial API and implementation
*
* $Id$
*/
package org.openhealthtools.mdht.uml.hl7.datatypes.operations;
import org.junit.runner.RunWith;
import org.junit.runners.Suite;
/**
* This class represents a suite of Junit 4 test cases for HL7 Datatypes.
*/
@RunWith(Suite.class)
@Suite.SuiteClasses( { ADOperationsTest.class, BNOperationsTest.class,
EDOperationsTest.class, ENOperationsTest.class, ONOperationsTest.class,
PNOperationsTest.class, TNOperationsTest.class })
public class ALLDatatypeOperationsTests {
// Nothing
} // ALLDatatypeOperationsTests | /**
* Copyright (c) 2010 IBM Corporation
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* IBM Corporation - initial API and implementation
*
* $Id$
*/
package org.openhealthtools.mdht.uml.hl7.datatypes.operations;
import junit.framework.JUnit4TestAdapter;
import org.junit.runner.RunWith;
import org.junit.runners.Suite;
/**
* This class represents a suite of Junit 4 test cases for HL7 Datatypes.
*/
@RunWith(Suite.class)
@Suite.SuiteClasses( { ADOperationsTest.class, BNOperationsTest.class,
EDOperationsTest.class, ENOperationsTest.class, ONOperationsTest.class,
PNOperationsTest.class, TNOperationsTest.class })
public class ALLDatatypeOperationsTests {
public static junit.framework.Test suite() {
return new JUnit4TestAdapter(ALLDatatypeOperationsTests.class);
}
// Nothing
} // ALLDatatypeOperationsTests | Add junit ant task support | Add junit ant task support | Java | epl-1.0 | vadimnehta/mdht,sarpkayanehta/mdht,sarpkayanehta/mdht,mdht/mdht,vadimnehta/mdht,mdht/mdht,drbgfc/mdht,mdht/mdht,drbgfc/mdht,drbgfc/mdht,drbgfc/mdht,mdht/mdht,drbgfc/mdht,vadimnehta/mdht,sarpkayanehta/mdht,sarpkayanehta/mdht,vadimnehta/mdht,drbgfc/mdht,sarpkayanehta/mdht,mdht/mdht,vadimnehta/mdht,vadimnehta/mdht,sarpkayanehta/mdht | java | ## Code Before:
/**
* Copyright (c) 2010 IBM Corporation
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* IBM Corporation - initial API and implementation
*
* $Id$
*/
package org.openhealthtools.mdht.uml.hl7.datatypes.operations;
import org.junit.runner.RunWith;
import org.junit.runners.Suite;
/**
* This class represents a suite of Junit 4 test cases for HL7 Datatypes.
*/
@RunWith(Suite.class)
@Suite.SuiteClasses( { ADOperationsTest.class, BNOperationsTest.class,
EDOperationsTest.class, ENOperationsTest.class, ONOperationsTest.class,
PNOperationsTest.class, TNOperationsTest.class })
public class ALLDatatypeOperationsTests {
// Nothing
} // ALLDatatypeOperationsTests
## Instruction:
Add junit ant task support
## Code After:
/**
* Copyright (c) 2010 IBM Corporation
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* IBM Corporation - initial API and implementation
*
* $Id$
*/
package org.openhealthtools.mdht.uml.hl7.datatypes.operations;
import junit.framework.JUnit4TestAdapter;
import org.junit.runner.RunWith;
import org.junit.runners.Suite;
/**
* This class represents a suite of Junit 4 test cases for HL7 Datatypes.
*/
@RunWith(Suite.class)
@Suite.SuiteClasses( { ADOperationsTest.class, BNOperationsTest.class,
EDOperationsTest.class, ENOperationsTest.class, ONOperationsTest.class,
PNOperationsTest.class, TNOperationsTest.class })
public class ALLDatatypeOperationsTests {
public static junit.framework.Test suite() {
return new JUnit4TestAdapter(ALLDatatypeOperationsTests.class);
}
// Nothing
} // ALLDatatypeOperationsTests | /**
* Copyright (c) 2010 IBM Corporation
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* IBM Corporation - initial API and implementation
*
* $Id$
*/
package org.openhealthtools.mdht.uml.hl7.datatypes.operations;
+ import junit.framework.JUnit4TestAdapter;
+
import org.junit.runner.RunWith;
import org.junit.runners.Suite;
-
/**
* This class represents a suite of Junit 4 test cases for HL7 Datatypes.
*/
@RunWith(Suite.class)
@Suite.SuiteClasses( { ADOperationsTest.class, BNOperationsTest.class,
EDOperationsTest.class, ENOperationsTest.class, ONOperationsTest.class,
PNOperationsTest.class, TNOperationsTest.class })
public class ALLDatatypeOperationsTests {
+
+ public static junit.framework.Test suite() {
+ return new JUnit4TestAdapter(ALLDatatypeOperationsTests.class);
+ }
// Nothing
} // ALLDatatypeOperationsTests | 7 | 0.259259 | 6 | 1 |
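`JUnit4TestAdapter` exposes a JUnit 4 suite through the old static `suite()` entry point so that JUnit 3-era runners (such as Ant's `junit` task) can discover and run it. Python's `unittest` has the same convention: a module-level suite function that aggregates cases for runners that predate automatic discovery. A minimal sketch (the class names below are stand-ins echoing the record, not real MDHT tests):

```python
import io
import unittest

class ADOperationsTest(unittest.TestCase):
    def test_nothing(self):
        self.assertTrue(True)

class BNOperationsTest(unittest.TestCase):
    def test_nothing(self):
        self.assertTrue(True)

def suite():
    """Aggregate the cases so a runner that only knows about suite()
    entry points can still execute them (the JUnit4TestAdapter idea)."""
    loader = unittest.TestLoader()
    s = unittest.TestSuite()
    for case in (ADOperationsTest, BNOperationsTest):
        s.addTests(loader.loadTestsFromTestCase(case))
    return s

# Run quietly into an in-memory stream instead of stderr.
result = unittest.TextTestRunner(stream=io.StringIO(), verbosity=0).run(suite())
```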
075e66b35e268ef4d79e87e3fac703f598770d24 | test-inkwell/Cargo.toml | test-inkwell/Cargo.toml | [package]
name = "test-inkwell"
version = "0.1.0"
authors = ["Shaked Flur <fshaked@gmail.com>"]
edition = "2018"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
clap = "2.33"
# inkwell = { git = "https://github.com/TheDan64/inkwell", branch = "master", features = ["llvm10-0"] }
# inkwell = { features = ["llvm10-0"] }
inkwell = { git = "https://github.com/dylanede/inkwell", branch = "section_null", features = ["llvm10-0"] }
| [package]
name = "test-inkwell"
version = "0.1.0"
authors = ["Shaked Flur <fshaked@gmail.com>"]
edition = "2018"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
clap = "2.33"
# inkwell = { git = "https://github.com/TheDan64/inkwell", branch = "master", features = ["llvm10-0"] }
# inkwell = { features = ["llvm10-0"] }
# inkwell = { git = "https://github.com/dylanede/inkwell", branch = "section_null", features = ["llvm10-0"] }
inkwell = { git = "https://github.com/alastairreid/inkwell", features = ["llvm10-0"] }
| Switch to our local copy of inkwell | Switch to our local copy of inkwell
| TOML | apache-2.0 | project-oak/rust-verification-tools,project-oak/rust-verification-tools,project-oak/rust-verification-tools,project-oak/rust-verification-tools | toml | ## Code Before:
[package]
name = "test-inkwell"
version = "0.1.0"
authors = ["Shaked Flur <fshaked@gmail.com>"]
edition = "2018"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
clap = "2.33"
# inkwell = { git = "https://github.com/TheDan64/inkwell", branch = "master", features = ["llvm10-0"] }
# inkwell = { features = ["llvm10-0"] }
inkwell = { git = "https://github.com/dylanede/inkwell", branch = "section_null", features = ["llvm10-0"] }
## Instruction:
Switch to our local copy of inkwell
## Code After:
[package]
name = "test-inkwell"
version = "0.1.0"
authors = ["Shaked Flur <fshaked@gmail.com>"]
edition = "2018"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
clap = "2.33"
# inkwell = { git = "https://github.com/TheDan64/inkwell", branch = "master", features = ["llvm10-0"] }
# inkwell = { features = ["llvm10-0"] }
# inkwell = { git = "https://github.com/dylanede/inkwell", branch = "section_null", features = ["llvm10-0"] }
inkwell = { git = "https://github.com/alastairreid/inkwell", features = ["llvm10-0"] }
| [package]
name = "test-inkwell"
version = "0.1.0"
authors = ["Shaked Flur <fshaked@gmail.com>"]
edition = "2018"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
clap = "2.33"
# inkwell = { git = "https://github.com/TheDan64/inkwell", branch = "master", features = ["llvm10-0"] }
# inkwell = { features = ["llvm10-0"] }
- inkwell = { git = "https://github.com/dylanede/inkwell", branch = "section_null", features = ["llvm10-0"] }
+ # inkwell = { git = "https://github.com/dylanede/inkwell", branch = "section_null", features = ["llvm10-0"] }
? ++
+ inkwell = { git = "https://github.com/alastairreid/inkwell", features = ["llvm10-0"] } | 3 | 0.230769 | 2 | 1 |
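The change points `inkwell` at a different fork by replacing the `git` URL and dropping the `branch` key from the inline table. A quick way to sanity-check which source such a manifest line points at is to extract the `git` field; the sketch below does it with a deliberately naive regex (real tooling should use a proper TOML parser):

```python
import re

line = 'inkwell = { git = "https://github.com/alastairreid/inkwell", features = ["llvm10-0"] }'

def dep_source(toml_line):
    """Extract the `git` URL from an inline-table dependency line.

    Naive by design: good enough for a one-line check, not for parsing
    arbitrary TOML.
    """
    m = re.search(r'git\s*=\s*"([^"]+)"', toml_line)
    return m.group(1) if m else None

url = dep_source(line)
```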
d516cc28ca767dda67179798711494d26ba1c197 | lib/setup.js | lib/setup.js | var $ = require('jquery');
module.exports = function(namespace) {
var toggleSelector = '[data-' + namespace + '-toggle]';
var openClass = namespace + '--open';
var body = $(document.body).addClass(namespace);
$(document)
.on('click.' + namespace, toggleSelector, function(e) {
e.preventDefault();
var expanded = body
.toggleClass(openClass)
.hasClass(openClass)
;
$(toggleSelector)
.attr('aria-expanded', (expanded ? 'true' : 'false'))
.trigger(namespace + ':' + (expanded ? 'open' : 'close'), [this]);
})
;
};
| var $ = require('jquery');
module.exports = function(namespace) {
var toggleSelector = '[data-' + namespace + '-toggle]';
var openClass = namespace + '--open';
var root = $('html').addClass(namespace);
$(document)
.on('click.' + namespace, toggleSelector, function(e) {
e.preventDefault();
var expanded = root
.toggleClass(openClass)
.hasClass(openClass)
;
$(toggleSelector)
.attr('aria-expanded', (expanded ? 'true' : 'false'))
.trigger(namespace + ':' + (expanded ? 'open' : 'close'), [this]);
})
;
};
| Fix adding root class to html, not body | Fix adding root class to html, not body
| JavaScript | mit | dotsunited/off-canvas-navigation,dotsunited/off-canvas-navigation | javascript | ## Code Before:
var $ = require('jquery');
module.exports = function(namespace) {
var toggleSelector = '[data-' + namespace + '-toggle]';
var openClass = namespace + '--open';
var body = $(document.body).addClass(namespace);
$(document)
.on('click.' + namespace, toggleSelector, function(e) {
e.preventDefault();
var expanded = body
.toggleClass(openClass)
.hasClass(openClass)
;
$(toggleSelector)
.attr('aria-expanded', (expanded ? 'true' : 'false'))
.trigger(namespace + ':' + (expanded ? 'open' : 'close'), [this]);
})
;
};
## Instruction:
Fix adding root class to html, not body
## Code After:
var $ = require('jquery');
module.exports = function(namespace) {
var toggleSelector = '[data-' + namespace + '-toggle]';
var openClass = namespace + '--open';
var root = $('html').addClass(namespace);
$(document)
.on('click.' + namespace, toggleSelector, function(e) {
e.preventDefault();
var expanded = root
.toggleClass(openClass)
.hasClass(openClass)
;
$(toggleSelector)
.attr('aria-expanded', (expanded ? 'true' : 'false'))
.trigger(namespace + ':' + (expanded ? 'open' : 'close'), [this]);
})
;
};
| var $ = require('jquery');
module.exports = function(namespace) {
var toggleSelector = '[data-' + namespace + '-toggle]';
var openClass = namespace + '--open';
- var body = $(document.body).addClass(namespace);
? ^ ^^ ^^^^ ^^^^^^^^
+ var root = $('html').addClass(namespace);
? ^ ^^ ^^^ ^^
$(document)
.on('click.' + namespace, toggleSelector, function(e) {
e.preventDefault();
- var expanded = body
? ^ ^^
+ var expanded = root
? ^ ^^
.toggleClass(openClass)
.hasClass(openClass)
;
$(toggleSelector)
.attr('aria-expanded', (expanded ? 'true' : 'false'))
.trigger(namespace + ':' + (expanded ? 'open' : 'close'), [this]);
})
;
}; | 4 | 0.173913 | 2 | 2 |
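The fix moves the namespace class from `<body>` to the root `<html>` element so the `namespace--open` modifier sits at the top of the cascade. The toggle logic itself — flip an "open" flag, mirror it into the `aria-expanded` string, and pick the `:open`/`:close` event name — can be modeled independently of the DOM. A Python sketch of that state machine (not the plugin's real API):

```python
class OffCanvasState:
    """One boolean drives the root class, the aria-expanded string,
    and the open/close event name, mirroring the jQuery handler above."""

    def __init__(self, namespace):
        self.namespace = namespace
        self.expanded = False

    def toggle(self):
        self.expanded = not self.expanded
        return {
            "root_class": self.namespace + "--open" if self.expanded else None,
            "aria_expanded": "true" if self.expanded else "false",
            "event": self.namespace + (":open" if self.expanded else ":close"),
        }

nav = OffCanvasState("nav")
opened = nav.toggle()   # first click opens
closed = nav.toggle()   # second click closes
```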
fe98a627943c235ba24fc6de781deec69e7fd02e | relayer/__init__.py | relayer/__init__.py | from kafka import KafkaProducer
from .event_emitter import EventEmitter
from .exceptions import ConfigurationError
__version__ = '0.1.3'
class Relayer(object):
def __init__(self, logging_topic, context_handler_class, kafka_hosts=None, topic_prefix='', topic_suffix='', source=''):
self.logging_topic = logging_topic
if not kafka_hosts:
raise ConfigurationError()
if source == '':
self.source = '{0}{1}{2}'.format(topic_prefix, logging_topic, topic_suffix)
else:
self.source = source
producer = KafkaProducer(bootstrap_servers=kafka_hosts)
emitter = EventEmitter(producer, topic_prefix=topic_prefix, topic_suffix=topic_suffix)
self.context = context_handler_class(emitter)
def emit(self, event_type, event_subtype, payload, partition_key=None):
payload = {
'source': self.source,
'event_type': event_type,
'event_subtype': event_subtype,
'payload': payload
}
self.context.emit(event_type, payload, partition_key)
def emit_raw(self, topic, message, partition_key=None):
self.context.emit(topic, message, partition_key)
def log(self, log_level, payload):
message = {
'log_level': log_level,
'payload': payload
}
self.context.log(message)
def flush(self):
self.emitter.flush()
| from kafka import KafkaProducer
from .event_emitter import EventEmitter
from .exceptions import ConfigurationError
__version__ = '0.1.3'
class Relayer(object):
def __init__(self, logging_topic, context_handler_class, kafka_hosts=None, topic_prefix='', topic_suffix='', source=''):
self.logging_topic = logging_topic
if not kafka_hosts:
raise ConfigurationError()
if source == '':
self.source = '{0}{1}{2}'.format(topic_prefix, logging_topic, topic_suffix)
else:
self.source = source
self._producer = KafkaProducer(bootstrap_servers=kafka_hosts)
self._emitter = EventEmitter(self._producer, topic_prefix=topic_prefix, topic_suffix=topic_suffix)
self.context = context_handler_class(self._emitter)
def emit(self, event_type, event_subtype, payload, partition_key=None):
payload = {
'source': self.source,
'event_type': event_type,
'event_subtype': event_subtype,
'payload': payload
}
self.context.emit(event_type, payload, partition_key)
def emit_raw(self, topic, message, partition_key=None):
self.context.emit(topic, message, partition_key)
def log(self, log_level, payload):
message = {
'log_level': log_level,
'payload': payload
}
self.context.log(message)
def flush(self):
self._emitter.flush()
 | Save event emitter and producer reference in relayer instance | Save event emitter and producer reference in relayer instance
| Python | mit | wizeline/relayer | python | ## Code Before:
from kafka import KafkaProducer
from .event_emitter import EventEmitter
from .exceptions import ConfigurationError
__version__ = '0.1.3'
class Relayer(object):
def __init__(self, logging_topic, context_handler_class, kafka_hosts=None, topic_prefix='', topic_suffix='', source=''):
self.logging_topic = logging_topic
if not kafka_hosts:
raise ConfigurationError()
if source == '':
self.source = '{0}{1}{2}'.format(topic_prefix, logging_topic, topic_suffix)
else:
self.source = source
producer = KafkaProducer(bootstrap_servers=kafka_hosts)
emitter = EventEmitter(producer, topic_prefix=topic_prefix, topic_suffix=topic_suffix)
self.context = context_handler_class(emitter)
def emit(self, event_type, event_subtype, payload, partition_key=None):
payload = {
'source': self.source,
'event_type': event_type,
'event_subtype': event_subtype,
'payload': payload
}
self.context.emit(event_type, payload, partition_key)
def emit_raw(self, topic, message, partition_key=None):
self.context.emit(topic, message, partition_key)
def log(self, log_level, payload):
message = {
'log_level': log_level,
'payload': payload
}
self.context.log(message)
def flush(self):
self.emitter.flush()
## Instruction:
Save event emitter and producer reference in relayer instance
## Code After:
from kafka import KafkaProducer
from .event_emitter import EventEmitter
from .exceptions import ConfigurationError
__version__ = '0.1.3'
class Relayer(object):
def __init__(self, logging_topic, context_handler_class, kafka_hosts=None, topic_prefix='', topic_suffix='', source=''):
self.logging_topic = logging_topic
if not kafka_hosts:
raise ConfigurationError()
if source == '':
self.source = '{0}{1}{2}'.format(topic_prefix, logging_topic, topic_suffix)
else:
self.source = source
self._producer = KafkaProducer(bootstrap_servers=kafka_hosts)
self._emitter = EventEmitter(self._producer, topic_prefix=topic_prefix, topic_suffix=topic_suffix)
self.context = context_handler_class(self._emitter)
def emit(self, event_type, event_subtype, payload, partition_key=None):
payload = {
'source': self.source,
'event_type': event_type,
'event_subtype': event_subtype,
'payload': payload
}
self.context.emit(event_type, payload, partition_key)
def emit_raw(self, topic, message, partition_key=None):
self.context.emit(topic, message, partition_key)
def log(self, log_level, payload):
message = {
'log_level': log_level,
'payload': payload
}
self.context.log(message)
def flush(self):
self._emitter.flush()
| from kafka import KafkaProducer
from .event_emitter import EventEmitter
from .exceptions import ConfigurationError
__version__ = '0.1.3'
class Relayer(object):
def __init__(self, logging_topic, context_handler_class, kafka_hosts=None, topic_prefix='', topic_suffix='', source=''):
self.logging_topic = logging_topic
if not kafka_hosts:
raise ConfigurationError()
if source == '':
self.source = '{0}{1}{2}'.format(topic_prefix, logging_topic, topic_suffix)
else:
self.source = source
- producer = KafkaProducer(bootstrap_servers=kafka_hosts)
+ self._producer = KafkaProducer(bootstrap_servers=kafka_hosts)
? ++++++
- emitter = EventEmitter(producer, topic_prefix=topic_prefix, topic_suffix=topic_suffix)
+ self._emitter = EventEmitter(self._producer, topic_prefix=topic_prefix, topic_suffix=topic_suffix)
? ++++++ ++++++
- self.context = context_handler_class(emitter)
+ self.context = context_handler_class(self._emitter)
? ++++++
def emit(self, event_type, event_subtype, payload, partition_key=None):
payload = {
'source': self.source,
'event_type': event_type,
'event_subtype': event_subtype,
'payload': payload
}
self.context.emit(event_type, payload, partition_key)
def emit_raw(self, topic, message, partition_key=None):
self.context.emit(topic, message, partition_key)
def log(self, log_level, payload):
message = {
'log_level': log_level,
'payload': payload
}
self.context.log(message)
def flush(self):
- self.emitter.flush()
+ self._emitter.flush()
? +
| 8 | 0.186047 | 4 | 4 |
7d59389da37688f815bf53a30a3e3018844fb971 | javascript.html | javascript.html | <section name="javascript" class="javascript">
<p class="ioDesc">In</p>
<pre class="incoming brush:javascript">
var xhr = new XMLHttpRequest();
xhr.open('<%= @method.toUpperCase() %>', '<%= @apiUrl %><%= @url %>');
<% for header, value of @headers: %>
xhr.setRequestHeader(<%= @helpers.escape header %>, <%= @helpers.escape value %>);
<% end %>
xhr.onreadystatechange = function () {
if (this.readyState == 4) {
alert('Status: '+this.status+\nHeaders: '+JSON.stringify(this.getAllResponseHeaders())+'\nBody: '+this.responseText);
}
};
xhr.send(<%= @helpers.escape @body.join('') %>);
</pre>
</section>
| <section name="javascript" class="javascript">
<p class="ioDesc">In</p>
<pre class="incoming brush:javascript">
var xhr = new XMLHttpRequest();
xhr.open('<%= @method.toUpperCase() %>', '<%= @apiUrl %><%= @url %>');
<% for header, value of @headers: %>
xhr.setRequestHeader(<%= @helpers.escape header %>, <%= @helpers.escape value %>);
<% end %>
xhr.onreadystatechange = function () {
if (this.readyState == 4) {
alert('Status: '+this.status+'\nHeaders: '+JSON.stringify(this.getAllResponseHeaders())+'\nBody: '+this.responseText);
}
};
xhr.send(<%= @helpers.escape @body.join('') %>);
</pre>
<pre class="jsRunCode" style="display: none">
var xhr = new XMLHttpRequest();
xhr.open('<%= @method.toUpperCase() %>', '<%= @apiUrl %><%= @url %>');
<% for header, value of @headers: %>
xhr.setRequestHeader(<%= @helpers.escape header %>, <%= @helpers.escape value %>);
<% end %>
xhr.onreadystatechange = function () {
if (this.readyState == 4) {
alert('Status: '+this.status+'\nHeaders: '+JSON.stringify(this.getAllResponseHeaders())+'\nBody: '+this.responseText);
}
};
xhr.send(<%= @helpers.escape @body.join('') %>);
</pre>
</section>
| Fix alert typo & add ugly hack for in-browser code execution | Fix alert typo & add ugly hack for in-browser code execution
| HTML | mit | apiaryio/language-templates | html | ## Code Before:
<section name="javascript" class="javascript">
<p class="ioDesc">In</p>
<pre class="incoming brush:javascript">
var xhr = new XMLHttpRequest();
xhr.open('<%= @method.toUpperCase() %>', '<%= @apiUrl %><%= @url %>');
<% for header, value of @headers: %>
xhr.setRequestHeader(<%= @helpers.escape header %>, <%= @helpers.escape value %>);
<% end %>
xhr.onreadystatechange = function () {
if (this.readyState == 4) {
alert('Status: '+this.status+\nHeaders: '+JSON.stringify(this.getAllResponseHeaders())+'\nBody: '+this.responseText);
}
};
xhr.send(<%= @helpers.escape @body.join('') %>);
</pre>
</section>
## Instruction:
Fix alert typo & add ugly hack for in-browser code execution
## Code After:
<section name="javascript" class="javascript">
<p class="ioDesc">In</p>
<pre class="incoming brush:javascript">
var xhr = new XMLHttpRequest();
xhr.open('<%= @method.toUpperCase() %>', '<%= @apiUrl %><%= @url %>');
<% for header, value of @headers: %>
xhr.setRequestHeader(<%= @helpers.escape header %>, <%= @helpers.escape value %>);
<% end %>
xhr.onreadystatechange = function () {
if (this.readyState == 4) {
alert('Status: '+this.status+'\nHeaders: '+JSON.stringify(this.getAllResponseHeaders())+'\nBody: '+this.responseText);
}
};
xhr.send(<%= @helpers.escape @body.join('') %>);
</pre>
<pre class="jsRunCode" style="display: none">
var xhr = new XMLHttpRequest();
xhr.open('<%= @method.toUpperCase() %>', '<%= @apiUrl %><%= @url %>');
<% for header, value of @headers: %>
xhr.setRequestHeader(<%= @helpers.escape header %>, <%= @helpers.escape value %>);
<% end %>
xhr.onreadystatechange = function () {
if (this.readyState == 4) {
alert('Status: '+this.status+'\nHeaders: '+JSON.stringify(this.getAllResponseHeaders())+'\nBody: '+this.responseText);
}
};
xhr.send(<%= @helpers.escape @body.join('') %>);
</pre>
</section>
| <section name="javascript" class="javascript">
<p class="ioDesc">In</p>
<pre class="incoming brush:javascript">
var xhr = new XMLHttpRequest();
xhr.open('<%= @method.toUpperCase() %>', '<%= @apiUrl %><%= @url %>');
<% for header, value of @headers: %>
xhr.setRequestHeader(<%= @helpers.escape header %>, <%= @helpers.escape value %>);
<% end %>
xhr.onreadystatechange = function () {
if (this.readyState == 4) {
- alert('Status: '+this.status+\nHeaders: '+JSON.stringify(this.getAllResponseHeaders())+'\nBody: '+this.responseText);
+ alert('Status: '+this.status+'\nHeaders: '+JSON.stringify(this.getAllResponseHeaders())+'\nBody: '+this.responseText);
? +
+ }
+ };
+ xhr.send(<%= @helpers.escape @body.join('') %>);
+ </pre>
+ <pre class="jsRunCode" style="display: none">
+ var xhr = new XMLHttpRequest();
+ xhr.open('<%= @method.toUpperCase() %>', '<%= @apiUrl %><%= @url %>');
+ <% for header, value of @headers: %>
+ xhr.setRequestHeader(<%= @helpers.escape header %>, <%= @helpers.escape value %>);
+ <% end %>
+ xhr.onreadystatechange = function () {
+ if (this.readyState == 4) {
+ alert('Status: '+this.status+'\nHeaders: '+JSON.stringify(this.getAllResponseHeaders())+'\nBody: '+this.responseText);
}
};
xhr.send(<%= @helpers.escape @body.join('') %>);
</pre>
</section> | 15 | 0.9375 | 14 | 1 |
7f1ca228982dd24a5eade2585877d6fd18d1bb77 | css/lat_long_field.css | css/lat_long_field.css | margin: 0px;
}
#LatitudeLongitudeZoom .fieldholder-small-label {
margin: 0px;
} | margin: 0px;
}
#LatLonZoomLevel .fieldholder-small-label {
margin: 0px;
} | Fix width of Latitude/Longitude fields in order to deal with the number of decimal points | UPGRADE/BUGFIX: Fix width of Latitude/Longitude fields in order to deal with the number of decimal points
| CSS | bsd-3-clause | gordonbanderson/Mappable,gordonbanderson/Mappable | css | ## Code Before:
margin: 0px;
}
#LatitudeLongitudeZoom .fieldholder-small-label {
margin: 0px;
}
## Instruction:
UPGRADE/BUGFIX: Fix width of Latitude/Longitude fields in order to deal with the number of decimal points
## Code After:
margin: 0px;
}
#LatLonZoomLevel .fieldholder-small-label {
margin: 0px;
} | margin: 0px;
}
- #LatitudeLongitudeZoom .fieldholder-small-label {
? ----- ------
+ #LatLonZoomLevel .fieldholder-small-label {
? +++++
margin: 0px;
} | 2 | 0.333333 | 1 | 1 |
14efcc79edfce6e97e7865df24c0b10f84a6ef0b | src/Korobi/WebBundle/Resources/views/controller/log/channel.html.twig | src/Korobi/WebBundle/Resources/views/controller/log/channel.html.twig | {% extends 'KorobiWebBundle::layout.html.twig' %}
{% set page_title = "Log" %}
{% block body %}
<div class="logs">
{% for message in logs %}
<span class="logs--line">{{ message|raw }}</span>
{% endfor %}
</div>
{% endblock %}
| {% extends 'KorobiWebBundle::layout.html.twig' %}
{% set page_title = "Log" %}
{% block body %}
<div class="logs">
{% for message in logs %}
<span class="logs--line" data-line-num="{{ loop.index }}">{{ message|raw }}</span>
{% endfor %}
</div>
{% endblock %}
| Add line numbers to the HTML | Add line numbers to the HTML
| Twig | mit | korobi/Web,korobi/Web,korobi/Web,korobi/Web,korobi/Web | twig | ## Code Before:
{% extends 'KorobiWebBundle::layout.html.twig' %}
{% set page_title = "Log" %}
{% block body %}
<div class="logs">
{% for message in logs %}
<span class="logs--line">{{ message|raw }}</span>
{% endfor %}
</div>
{% endblock %}
## Instruction:
Add line numbers to the HTML
## Code After:
{% extends 'KorobiWebBundle::layout.html.twig' %}
{% set page_title = "Log" %}
{% block body %}
<div class="logs">
{% for message in logs %}
<span class="logs--line" data-line-num="{{ loop.index }}">{{ message|raw }}</span>
{% endfor %}
</div>
{% endblock %}
| {% extends 'KorobiWebBundle::layout.html.twig' %}
{% set page_title = "Log" %}
{% block body %}
<div class="logs">
{% for message in logs %}
- <span class="logs--line">{{ message|raw }}</span>
+ <span class="logs--line" data-line-num="{{ loop.index }}">{{ message|raw }}</span>
? +++++++++++++++++++++++++++++++++
{% endfor %}
</div>
{% endblock %} | 2 | 0.181818 | 1 | 1 |
9fa9229ce668e0fa5bded410e23df473e962d00d | src/pipeline/Paginate.ts | src/pipeline/Paginate.ts | import { PipelineAbstract, option, description } from '../serafin/pipeline/Abstract'
import * as Promise from 'bluebird'
@description("Provides pagination over the read results")
export class Paginate<T,
ReadQuery = {},
ReadOptions = { offset?: number, count?: number },
ReadWrapper = { count: number, results: {}[] }>
extends PipelineAbstract<T, ReadQuery, ReadOptions, ReadWrapper> {
@description("Reads a limited count of results")
@option('offset', { type: "integer" }, false)
@option('count', { type: "integer" }, false)
@option('page', { type: "integer" }, false)
read(query?: ReadQuery): Promise<ReadWrapper> {
return this.parent.read(query).then((resources) => {
let count = resources.results.length;
return Promise.resolve({ ...resources, count: count });
});
}
} | import { PipelineAbstract, option, description } from '../serafin/pipeline/Abstract'
import * as Promise from 'bluebird'
@description("Provides pagination over the read results")
export class Paginate<T,
ReadQuery = {},
ReadOptions = { offset?: number, count?: number },
ReadWrapper = { count: number, results: {}[] }>
extends PipelineAbstract<T, ReadQuery, ReadOptions, ReadWrapper> {
@description("Reads a limited count of results")
@option('offset', { type: "integer" }, false)
@option('count', { type: "integer" }, false)
@option('page', { type: "integer" }, false)
read(query?: ReadQuery, options?: ReadOptions): Promise<ReadWrapper> {
return this.parent.read(query, options).then((resources) => {
let offset = 0;
if (options) {
if (options['offset']) {
offset = options['offset'];
} else if (options['page'] && options['count']) {
offset = (resources.length / options['count']) * options['page'];
}
if (options['count']) {
if (offset > resources.length) {
throw new Error("Offset higher than the number of resources");
}
resources = resources.slice(offset, offset + options['count']);
}
}
return Promise.resolve({ ...resources, count: resources.length });
});
}
} | Add operations to the Pager Pipeline | Add operations to the Pager Pipeline
| TypeScript | mit | serafin-framework/serafin,serafin-framework/serafin | typescript | ## Code Before:
import { PipelineAbstract, option, description } from '../serafin/pipeline/Abstract'
import * as Promise from 'bluebird'
@description("Provides pagination over the read results")
export class Paginate<T,
ReadQuery = {},
ReadOptions = { offset?: number, count?: number },
ReadWrapper = { count: number, results: {}[] }>
extends PipelineAbstract<T, ReadQuery, ReadOptions, ReadWrapper> {
@description("Reads a limited count of results")
@option('offset', { type: "integer" }, false)
@option('count', { type: "integer" }, false)
@option('page', { type: "integer" }, false)
read(query?: ReadQuery): Promise<ReadWrapper> {
return this.parent.read(query).then((resources) => {
let count = resources.results.length;
return Promise.resolve({ ...resources, count: count });
});
}
}
## Instruction:
Add operations to the Pager Pipeline
## Code After:
import { PipelineAbstract, option, description } from '../serafin/pipeline/Abstract'
import * as Promise from 'bluebird'
@description("Provides pagination over the read results")
export class Paginate<T,
ReadQuery = {},
ReadOptions = { offset?: number, count?: number },
ReadWrapper = { count: number, results: {}[] }>
extends PipelineAbstract<T, ReadQuery, ReadOptions, ReadWrapper> {
@description("Reads a limited count of results")
@option('offset', { type: "integer" }, false)
@option('count', { type: "integer" }, false)
@option('page', { type: "integer" }, false)
read(query?: ReadQuery, options?: ReadOptions): Promise<ReadWrapper> {
return this.parent.read(query, options).then((resources) => {
let offset = 0;
if (options) {
if (options['offset']) {
offset = options['offset'];
} else if (options['page'] && options['count']) {
offset = (resources.length / options['count']) * options['page'];
}
if (options['count']) {
if (offset > resources.length) {
throw new Error("Offset higher than the number of resources");
}
resources = resources.slice(offset, offset + options['count']);
}
}
return Promise.resolve({ ...resources, count: resources.length });
});
}
} | import { PipelineAbstract, option, description } from '../serafin/pipeline/Abstract'
import * as Promise from 'bluebird'
@description("Provides pagination over the read results")
export class Paginate<T,
ReadQuery = {},
ReadOptions = { offset?: number, count?: number },
ReadWrapper = { count: number, results: {}[] }>
extends PipelineAbstract<T, ReadQuery, ReadOptions, ReadWrapper> {
@description("Reads a limited count of results")
@option('offset', { type: "integer" }, false)
@option('count', { type: "integer" }, false)
@option('page', { type: "integer" }, false)
- read(query?: ReadQuery): Promise<ReadWrapper> {
+ read(query?: ReadQuery, options?: ReadOptions): Promise<ReadWrapper> {
? +++++++++++++++++++++++
- return this.parent.read(query).then((resources) => {
+ return this.parent.read(query, options).then((resources) => {
? +++++++++
- let count = resources.results.length;
+ let offset = 0;
+
+ if (options) {
+ if (options['offset']) {
+ offset = options['offset'];
+ } else if (options['page'] && options['count']) {
+ offset = (resources.length / options['count']) * options['page'];
+ }
+
+ if (options['count']) {
+ if (offset > resources.length) {
+ throw new Error("Offset higher than the number of resources");
+ }
+
+ resources = resources.slice(offset, offset + options['count']);
+ }
+ }
+
- return Promise.resolve({ ...resources, count: count });
? ^
+ return Promise.resolve({ ...resources, count: resources.length });
? ^^^ +++++++ + +
});
}
} | 25 | 1.190476 | 21 | 4 |
56d9504b21ec60e773adb4429dbf021ef92c2f3e | setup.cfg | setup.cfg | [metadata]
name = venvs
url = https://github.com/Julian/venvs
description = "A simpler tool for creating venvs in a central location"
long_description = file: README.rst
author = "Julian Berman"
author_email = "Julian@GrayVines.com"
classifiers =
Development Status :: 4 - Beta
Operating System :: OS Independent
License :: OSI Approved :: MIT License
Programming Language :: Python
Programming Language :: Python :: 2.7
Programming Language :: Python :: 3.6
Programming Language :: Python :: 3.7
Programming Language :: Python :: 2
Programming Language :: Python :: 3
Programming Language :: Python :: Implementation :: CPython
Programming Language :: Python :: Implementation :: PyPy
[options]
packages = find:
install_requires =
appdirs
attrs>=19.2.0
click
filesystems>=0.23.0
functools32; python_version == '2.7'
packaging
pyrsistent
toml>=0.10.0
tomlkit
tqdm
virtualenv
[options.entry_points]
console_scripts =
venvs = venvs._cli:main
[flake8]
exclude = venvs/__init__.py
[doc8]
ignore-path =
version.txt,
.*/,
_*/
[bdist_wheel]
universal = 1
| [metadata]
name = venvs
url = https://github.com/Julian/venvs
description = "A simpler tool for creating venvs in a central location"
long_description = file: README.rst
long_description_content_type = text/x-rst
author = "Julian Berman"
author_email = "Julian@GrayVines.com"
classifiers =
Development Status :: 4 - Beta
Operating System :: OS Independent
License :: OSI Approved :: MIT License
Programming Language :: Python
Programming Language :: Python :: 2.7
Programming Language :: Python :: 3.6
Programming Language :: Python :: 3.7
Programming Language :: Python :: 2
Programming Language :: Python :: 3
Programming Language :: Python :: Implementation :: CPython
Programming Language :: Python :: Implementation :: PyPy
[options]
packages = find:
install_requires =
appdirs
attrs>=19.2.0
click
filesystems>=0.23.0
functools32; python_version == '2.7'
packaging
pyrsistent
toml>=0.10.0
tomlkit
tqdm
virtualenv
[options.entry_points]
console_scripts =
venvs = venvs._cli:main
[flake8]
exclude = venvs/__init__.py
[doc8]
ignore-path =
version.txt,
.*/,
_*/
[bdist_wheel]
universal = 1
| Fix the warnings emitted by twine check. | Fix the warnings emitted by twine check.
See pypa/twine#454 for a way to catch this via CI.
| INI | mit | Julian/mkenv | ini | ## Code Before:
[metadata]
name = venvs
url = https://github.com/Julian/venvs
description = "A simpler tool for creating venvs in a central location"
long_description = file: README.rst
author = "Julian Berman"
author_email = "Julian@GrayVines.com"
classifiers =
Development Status :: 4 - Beta
Operating System :: OS Independent
License :: OSI Approved :: MIT License
Programming Language :: Python
Programming Language :: Python :: 2.7
Programming Language :: Python :: 3.6
Programming Language :: Python :: 3.7
Programming Language :: Python :: 2
Programming Language :: Python :: 3
Programming Language :: Python :: Implementation :: CPython
Programming Language :: Python :: Implementation :: PyPy
[options]
packages = find:
install_requires =
appdirs
attrs>=19.2.0
click
filesystems>=0.23.0
functools32; python_version == '2.7'
packaging
pyrsistent
toml>=0.10.0
tomlkit
tqdm
virtualenv
[options.entry_points]
console_scripts =
venvs = venvs._cli:main
[flake8]
exclude = venvs/__init__.py
[doc8]
ignore-path =
version.txt,
.*/,
_*/
[bdist_wheel]
universal = 1
## Instruction:
Fix the warnings emitted by twine check.
See pypa/twine#454 for a way to catch this via CI.
## Code After:
[metadata]
name = venvs
url = https://github.com/Julian/venvs
description = "A simpler tool for creating venvs in a central location"
long_description = file: README.rst
long_description_content_type = text/x-rst
author = "Julian Berman"
author_email = "Julian@GrayVines.com"
classifiers =
Development Status :: 4 - Beta
Operating System :: OS Independent
License :: OSI Approved :: MIT License
Programming Language :: Python
Programming Language :: Python :: 2.7
Programming Language :: Python :: 3.6
Programming Language :: Python :: 3.7
Programming Language :: Python :: 2
Programming Language :: Python :: 3
Programming Language :: Python :: Implementation :: CPython
Programming Language :: Python :: Implementation :: PyPy
[options]
packages = find:
install_requires =
appdirs
attrs>=19.2.0
click
filesystems>=0.23.0
functools32; python_version == '2.7'
packaging
pyrsistent
toml>=0.10.0
tomlkit
tqdm
virtualenv
[options.entry_points]
console_scripts =
venvs = venvs._cli:main
[flake8]
exclude = venvs/__init__.py
[doc8]
ignore-path =
version.txt,
.*/,
_*/
[bdist_wheel]
universal = 1
| [metadata]
name = venvs
url = https://github.com/Julian/venvs
description = "A simpler tool for creating venvs in a central location"
long_description = file: README.rst
+ long_description_content_type = text/x-rst
author = "Julian Berman"
author_email = "Julian@GrayVines.com"
classifiers =
Development Status :: 4 - Beta
Operating System :: OS Independent
License :: OSI Approved :: MIT License
Programming Language :: Python
Programming Language :: Python :: 2.7
Programming Language :: Python :: 3.6
Programming Language :: Python :: 3.7
Programming Language :: Python :: 2
Programming Language :: Python :: 3
Programming Language :: Python :: Implementation :: CPython
Programming Language :: Python :: Implementation :: PyPy
[options]
packages = find:
install_requires =
appdirs
attrs>=19.2.0
click
filesystems>=0.23.0
functools32; python_version == '2.7'
packaging
pyrsistent
toml>=0.10.0
tomlkit
tqdm
virtualenv
[options.entry_points]
console_scripts =
venvs = venvs._cli:main
[flake8]
exclude = venvs/__init__.py
[doc8]
ignore-path =
version.txt,
.*/,
_*/
[bdist_wheel]
universal = 1 | 1 | 0.02 | 1 | 0 |
2cd8bda1ba1ff57c6cd6f25759eeff9d8e258dd6 | circle.yml | circle.yml | machine:
node:
version: 6.9
test:
override:
- nvm install 6.8 && npm rebuild && npm test
- nvm install 7.1 && npm rebuild && npm test
- nvm install 7.2 && npm rebuild && npm test
- nvm install 4 && npm rebuild && npm test
| machine:
node:
version: 6
test:
override:
- npm test
- nvm install 7 && npm rebuild && npm test
- nvm install 4 && npm rebuild && npm test
| Test on 4.x, 6.x and 7.x | Test on 4.x, 6.x and 7.x | YAML | mit | brainsiq/hapi-boom-decorators | yaml | ## Code Before:
machine:
node:
version: 6.9
test:
override:
- nvm install 6.8 && npm rebuild && npm test
- nvm install 7.1 && npm rebuild && npm test
- nvm install 7.2 && npm rebuild && npm test
- nvm install 4 && npm rebuild && npm test
## Instruction:
Test on 4.x, 6.x and 7.x
## Code After:
machine:
node:
version: 6
test:
override:
- npm test
- nvm install 7 && npm rebuild && npm test
- nvm install 4 && npm rebuild && npm test
| machine:
node:
- version: 6.9
? --
+ version: 6
test:
override:
- - nvm install 6.8 && npm rebuild && npm test
+ - npm test
- - nvm install 7.1 && npm rebuild && npm test
? --
+ - nvm install 7 && npm rebuild && npm test
- - nvm install 7.2 && npm rebuild && npm test
- nvm install 4 && npm rebuild && npm test | 7 | 0.7 | 3 | 4 |
6e74e3b50e927e847acb87fc59dc9b5ebc6de926 | src/Microsoft.DotNet.Archive/LZMA/README.md | src/Microsoft.DotNet.Archive/LZMA/README.md | This source came from the C# implementation of LZMA from the LZMA SDK, version 16.02, from http://www.7-zip.org/sdk.html.
## License
The LZMA SDK is public domain. Thanks goes to Igor Pavlov for making this available.
| This source came from the C# implementation of LZMA from the LZMA SDK, version 16.02, from http://www.7-zip.org/sdk.html.
## License
LZMA SDK is placed in the public domain.
Anyone is free to copy, modify, publish, use, compile, sell, or distribute the original LZMA SDK code, either in source code form or as a compiled binary, for any purpose, commercial or non-commercial, and by any means.
## Thanks!
Thanks goes to Igor Pavlov for making this available.
| Update LZMA license with correct text | Update LZMA license with correct text
| Markdown | mit | harshjain2/cli,Faizan2304/cli,ravimeda/cli,dasMulli/cli,EdwardBlair/cli,johnbeisner/cli,Faizan2304/cli,harshjain2/cli,Faizan2304/cli,ravimeda/cli,johnbeisner/cli,blackdwarf/cli,svick/cli,livarcocc/cli-1,ravimeda/cli,jonsequitur/cli,jonsequitur/cli,dasMulli/cli,livarcocc/cli-1,EdwardBlair/cli,harshjain2/cli,jonsequitur/cli,svick/cli,blackdwarf/cli,EdwardBlair/cli,blackdwarf/cli,livarcocc/cli-1,dasMulli/cli,svick/cli,blackdwarf/cli,johnbeisner/cli,jonsequitur/cli | markdown | ## Code Before:
This source came from the C# implementation of LZMA from the LZMA SDK, version 16.02, from http://www.7-zip.org/sdk.html.
## License
The LZMA SDK is public domain. Thanks goes to Igor Pavlov for making this available.
## Instruction:
Update LZMA license with correct text
## Code After:
This source came from the C# implementation of LZMA from the LZMA SDK, version 16.02, from http://www.7-zip.org/sdk.html.
## License
LZMA SDK is placed in the public domain.
Anyone is free to copy, modify, publish, use, compile, sell, or distribute the original LZMA SDK code, either in source code form or as a compiled binary, for any purpose, commercial or non-commercial, and by any means.
## Thanks!
Thanks goes to Igor Pavlov for making this available.
| This source came from the C# implementation of LZMA from the LZMA SDK, version 16.02, from http://www.7-zip.org/sdk.html.
## License
+ LZMA SDK is placed in the public domain.
+
+ Anyone is free to copy, modify, publish, use, compile, sell, or distribute the original LZMA SDK code, either in source code form or as a compiled binary, for any purpose, commercial or non-commercial, and by any means.
+
+ ## Thanks!
- The LZMA SDK is public domain. Thanks goes to Igor Pavlov for making this available.
? --------------------------------
+ Thanks goes to Igor Pavlov for making this available. | 7 | 1.75 | 6 | 1 |
2831655d7b56cea5380a7f9ce7062ad054b5a207 | spec/runtime/background.js | spec/runtime/background.js | /**
* Listens for the app launching then creates the window.
* Ignores the provided window size.
*
* @see http://developer.chrome.com/trunk/apps/app.window.html
*/
function initPage() {
addActionButton('Attach onSuspend', function() {
var buttonTime = new Date();
chrome.runtime.onSuspend.addListener(function() {
var callbackTime = new Date();
log('onSuspend fired: ' + (callbackTime - buttonTime) + 'ms after button');
});
});
}
| /**
* Listens for the app launching then creates the window.
* Ignores the provided window size.
*
* @see http://developer.chrome.com/trunk/apps/app.window.html
*/
function initPage() {
addActionButton('Attach onSuspend', function() {
var buttonTime = new Date();
chrome.runtime.onSuspend.addListener(function() {
var callbackTime = new Date();
log('onSuspend fired: ' + (callbackTime - buttonTime) + 'ms after button');
});
});
addActionButton('chrome.runtime.reload()', function() {
chrome.runtime.reload();
});
}
| Add a button that calls chrome.runtime.reload() | Add a button that calls chrome.runtime.reload()
| JavaScript | bsd-3-clause | hgl888/mobile-chrome-apps,MobileChromeApps/mobile-chrome-apps,chirilo/mobile-chrome-apps,guozanhua/mobile-chrome-apps,guozanhua/mobile-chrome-apps,chirilo/mobile-chrome-apps,liqingzhu/mobile-chrome-apps,xiaoyanit/mobile-chrome-apps,hgl888/mobile-chrome-apps,wudkj/mobile-chrome-apps,hgl888/mobile-chrome-apps,liqingzhu/mobile-chrome-apps,wudkj/mobile-chrome-apps,MobileChromeApps/mobile-chrome-apps,liqingzhu/mobile-chrome-apps,xiaoyanit/mobile-chrome-apps,MobileChromeApps/mobile-chrome-apps,chirilo/mobile-chrome-apps,MobileChromeApps/mobile-chrome-apps,hgl888/mobile-chrome-apps,liqingzhu/mobile-chrome-apps,chirilo/mobile-chrome-apps,xiaoyanit/mobile-chrome-apps,hgl888/mobile-chrome-apps,wudkj/mobile-chrome-apps,hgl888/mobile-chrome-apps,guozanhua/mobile-chrome-apps,xiaoyanit/mobile-chrome-apps,guozanhua/mobile-chrome-apps,hgl888/mobile-chrome-apps,wudkj/mobile-chrome-apps | javascript | ## Code Before:
/**
* Listens for the app launching then creates the window.
* Ignores the provided window size.
*
* @see http://developer.chrome.com/trunk/apps/app.window.html
*/
function initPage() {
addActionButton('Attach onSuspend', function() {
var buttonTime = new Date();
chrome.runtime.onSuspend.addListener(function() {
var callbackTime = new Date();
log('onSuspend fired: ' + (callbackTime - buttonTime) + 'ms after button');
});
});
}
## Instruction:
Add a button that calls chrome.runtime.reload()
## Code After:
/**
* Listens for the app launching then creates the window.
* Ignores the provided window size.
*
* @see http://developer.chrome.com/trunk/apps/app.window.html
*/
function initPage() {
addActionButton('Attach onSuspend', function() {
var buttonTime = new Date();
chrome.runtime.onSuspend.addListener(function() {
var callbackTime = new Date();
log('onSuspend fired: ' + (callbackTime - buttonTime) + 'ms after button');
});
});
addActionButton('chrome.runtime.reload()', function() {
chrome.runtime.reload();
});
}
| /**
* Listens for the app launching then creates the window.
* Ignores the provided window size.
*
* @see http://developer.chrome.com/trunk/apps/app.window.html
*/
function initPage() {
addActionButton('Attach onSuspend', function() {
var buttonTime = new Date();
chrome.runtime.onSuspend.addListener(function() {
var callbackTime = new Date();
log('onSuspend fired: ' + (callbackTime - buttonTime) + 'ms after button');
});
});
+ addActionButton('chrome.runtime.reload()', function() {
+ chrome.runtime.reload();
+ });
}
| 3 | 0.176471 | 3 | 0 |
80e4e2a943d9a156ed2a674a76a1da42b891dd57 | Josh_Zane_Sebastian/src/main.h | Josh_Zane_Sebastian/src/main.h |
struct stack_node {
struct stack_node *cdr;
union node_data data;
char type;
}
union node_data {
struct routine routine;
int numval;
}
struct routine {
struct routine *parent;
struct iq_node *nodes;
int num_nodes;
}
struct iq_node {
struct iq_node *next;
union node_data instr;
char type;
}
|
struct stack_node {
struct stack_node *cdr;
union node_data data;
char type;
}
union node_data {
struct routine routine;
int numval;
}
struct routine {
struct routine *parent;
struct iq_node *nodes;
}
struct iq_node {
struct iq_node *next;
union node_data instr;
char type;
}
| Remove unused node-counter in routines | Remove unused node-counter in routines
| C | mit | aacoppa/final,aacoppa/final | c | ## Code Before:
struct stack_node {
struct stack_node *cdr;
union node_data data;
char type;
}
union node_data {
struct routine routine;
int numval;
}
struct routine {
struct routine *parent;
struct iq_node *nodes;
int num_nodes;
}
struct iq_node {
struct iq_node *next;
union node_data instr;
char type;
}
## Instruction:
Remove unused node-counter in routines
## Code After:
struct stack_node {
struct stack_node *cdr;
union node_data data;
char type;
}
union node_data {
struct routine routine;
int numval;
}
struct routine {
struct routine *parent;
struct iq_node *nodes;
}
struct iq_node {
struct iq_node *next;
union node_data instr;
char type;
}
|
struct stack_node {
struct stack_node *cdr;
union node_data data;
char type;
}
union node_data {
struct routine routine;
int numval;
}
struct routine {
struct routine *parent;
struct iq_node *nodes;
- int num_nodes;
}
struct iq_node {
struct iq_node *next;
union node_data instr;
char type;
} | 1 | 0.043478 | 0 | 1 |
0e88d04592f493d037d14a8d79ab27d60a1b32f6 | _layouts/gig.html | _layouts/gig.html | ---
layout: default
---
<div class='row'>
<div class='col-md-1'></div>
<div class='col-md-5'>
<div class='gig'>
<dl class='dl-horizontal'>
<dt>Date<dt>
<dd>{{ page.date | date_to_long_string }}</dd>
<dt>Time</dt>
<dd>{{ page.time }}</dd>
<dt>Location</dt>
<dd>{{ page.location }}</dd>
{% if page.price %}
<dt>Price</dt>
<dd>{{ page.price }}</dd>
{% endif %}
{% if page.facebook_id %}
<dt>More info</dt>
<dd>
<a href='https://facebook.com/events/{{ page.facebook_id }}/' title='Facebook event'>
Facebook
</a>
</dd>
{% endif %}
</dl>
</div>
</div>
<div class='col-md-6'>
{% include map.html latitude = page.latitude longitude = page.longitude %}
</div>
</div>
{% include json-ld.html title = page.title date = page.date time = page.time location = page.location %}
| ---
layout: default
---
<div class='row'>
<div class='col-md-6'>
<div class='gig'>
<dl class='dl-horizontal'>
<dt>Date<dt>
<dd>{{ page.date | date_to_long_string }}</dd>
<dt>Time</dt>
<dd>{{ page.time }}</dd>
<dt>Location</dt>
<dd>{{ page.location }}</dd>
{% if page.price %}
<dt>Price</dt>
<dd>{{ page.price }}</dd>
{% endif %}
{% if page.facebook_id %}
<dt>More info</dt>
<dd>
<a href='https://facebook.com/events/{{ page.facebook_id }}/' title='Facebook event'>
Facebook
</a>
</dd>
{% endif %}
</dl>
</div>
</div>
<div class='col-md-6'>
{% include map.html latitude = page.latitude longitude = page.longitude %}
</div>
</div>
{% include json-ld.html title = page.title date = page.date time = page.time location = page.location %}
| Address fits on one line now | Address fits on one line now
| HTML | mit | rawfunkmaharishi/rawfunkmaharishi.github.io,rawfunkmaharishi/rawfunkmaharishi.github.io | html | ## Code Before:
---
layout: default
---
<div class='row'>
<div class='col-md-1'></div>
<div class='col-md-5'>
<div class='gig'>
<dl class='dl-horizontal'>
<dt>Date<dt>
<dd>{{ page.date | date_to_long_string }}</dd>
<dt>Time</dt>
<dd>{{ page.time }}</dd>
<dt>Location</dt>
<dd>{{ page.location }}</dd>
{% if page.price %}
<dt>Price</dt>
<dd>{{ page.price }}</dd>
{% endif %}
{% if page.facebook_id %}
<dt>More info</dt>
<dd>
<a href='https://facebook.com/events/{{ page.facebook_id }}/' title='Facebook event'>
Facebook
</a>
</dd>
{% endif %}
</dl>
</div>
</div>
<div class='col-md-6'>
{% include map.html latitude = page.latitude longitude = page.longitude %}
</div>
</div>
{% include json-ld.html title = page.title date = page.date time = page.time location = page.location %}
## Instruction:
Address fits on one line now
## Code After:
---
layout: default
---
<div class='row'>
<div class='col-md-6'>
<div class='gig'>
<dl class='dl-horizontal'>
<dt>Date<dt>
<dd>{{ page.date | date_to_long_string }}</dd>
<dt>Time</dt>
<dd>{{ page.time }}</dd>
<dt>Location</dt>
<dd>{{ page.location }}</dd>
{% if page.price %}
<dt>Price</dt>
<dd>{{ page.price }}</dd>
{% endif %}
{% if page.facebook_id %}
<dt>More info</dt>
<dd>
<a href='https://facebook.com/events/{{ page.facebook_id }}/' title='Facebook event'>
Facebook
</a>
</dd>
{% endif %}
</dl>
</div>
</div>
<div class='col-md-6'>
{% include map.html latitude = page.latitude longitude = page.longitude %}
</div>
</div>
{% include json-ld.html title = page.title date = page.date time = page.time location = page.location %}
| ---
layout: default
---
<div class='row'>
- <div class='col-md-1'></div>
- <div class='col-md-5'>
? ^
+ <div class='col-md-6'>
? ^
<div class='gig'>
<dl class='dl-horizontal'>
<dt>Date<dt>
<dd>{{ page.date | date_to_long_string }}</dd>
<dt>Time</dt>
<dd>{{ page.time }}</dd>
<dt>Location</dt>
<dd>{{ page.location }}</dd>
{% if page.price %}
<dt>Price</dt>
<dd>{{ page.price }}</dd>
{% endif %}
{% if page.facebook_id %}
<dt>More info</dt>
<dd>
<a href='https://facebook.com/events/{{ page.facebook_id }}/' title='Facebook event'>
Facebook
</a>
</dd>
{% endif %}
</dl>
</div>
</div>
<div class='col-md-6'>
{% include map.html latitude = page.latitude longitude = page.longitude %}
</div>
</div>
{% include json-ld.html title = page.title date = page.date time = page.time location = page.location %} | 3 | 0.071429 | 1 | 2 |
bab4f346cef626f29c67cc214b03db2475ef6b64 | scriptcore/process/popen.py | scriptcore/process/popen.py |
from subprocess import Popen as BasePopen
class Popen(BasePopen):
def communicate(self, input=None, timeout=None):
"""
Communicate
:param input: Optional input
:param timeout: Optional timeout
:return: Out, err, exitcode
"""
out, err = super(Popen, self).communicate(input=input, timeout=timeout)
out = out.strip().split('\n')
err = err.strip().split('\n')
return out, err, self.returncode
def is_running(self):
"""
Running
:return: Boolean
"""
return True if self.poll() is None else False
|
from subprocess import Popen as BasePopen
class Popen(BasePopen):
def communicate(self, input=None):
"""
Communicate
:param input: Optional input
:return: Out, err, exitcode
"""
out, err = super(Popen, self).communicate(input=input)
out = out.strip().split('\n')
err = err.strip().split('\n')
return out, err, self.returncode
def is_running(self):
"""
Running
:return: Boolean
"""
return True if self.poll() is None else False
| Fix error in communicate function. | Fix error in communicate function.
| Python | apache-2.0 | LowieHuyghe/script-core | python | ## Code Before:
from subprocess import Popen as BasePopen
class Popen(BasePopen):
def communicate(self, input=None, timeout=None):
"""
Communicate
:param input: Optional input
:param timeout: Optional timeout
:return: Out, err, exitcode
"""
out, err = super(Popen, self).communicate(input=input, timeout=timeout)
out = out.strip().split('\n')
err = err.strip().split('\n')
return out, err, self.returncode
def is_running(self):
"""
Running
:return: Boolean
"""
return True if self.poll() is None else False
## Instruction:
Fix error in communicate function.
## Code After:
from subprocess import Popen as BasePopen
class Popen(BasePopen):
def communicate(self, input=None):
"""
Communicate
:param input: Optional input
:return: Out, err, exitcode
"""
out, err = super(Popen, self).communicate(input=input)
out = out.strip().split('\n')
err = err.strip().split('\n')
return out, err, self.returncode
def is_running(self):
"""
Running
:return: Boolean
"""
return True if self.poll() is None else False
|
from subprocess import Popen as BasePopen
class Popen(BasePopen):
- def communicate(self, input=None, timeout=None):
? --------------
+ def communicate(self, input=None):
"""
Communicate
:param input: Optional input
- :param timeout: Optional timeout
:return: Out, err, exitcode
"""
- out, err = super(Popen, self).communicate(input=input, timeout=timeout)
? -----------------
+ out, err = super(Popen, self).communicate(input=input)
out = out.strip().split('\n')
err = err.strip().split('\n')
return out, err, self.returncode
def is_running(self):
"""
Running
:return: Boolean
"""
return True if self.poll() is None else False | 5 | 0.178571 | 2 | 3 |
6982cd66dcfba89bff6daa2f0e70751e3deac335 | composer.json | composer.json | {
"name": "irazasyed/telegram-bot-sdk",
"description": "The Unofficial Telegram Bot API PHP SDK",
"keywords": ["telegram", "telegram bot", "telegram bot api", "telegram sdk", "telegram php", "laravel telegram", "laravel"],
"type": "library",
"homepage": "https://github.com/irazasyed/telegram-bot-sdk",
"license": "BSD-3-Clause",
"authors": [
{
"name": "Syed Irfaq R.",
"email": "syed+gh@lukonet.com",
"homepage": "https://github.com/irazasyed"
}
],
"require": {
"php": ">=5.5.0",
"guzzlehttp/guzzle": "~6.0",
"illuminate/support": "5.0.*|5.1.*"
},
"autoload": {
"psr-4": {
"Irazasyed\\Telegram\\": "src/"
}
},
"suggest": {
"irazasyed/larasupport": "Allows you to use any Laravel Package in Lumen by adding support!"
}
} | {
"name": "irazasyed/telegram-bot-sdk",
"description": "The Unofficial Telegram Bot API PHP SDK",
"keywords": ["telegram", "telegram bot", "telegram bot api", "telegram sdk", "telegram php", "laravel telegram", "laravel"],
"type": "library",
"homepage": "https://github.com/irazasyed/telegram-bot-sdk",
"license": "BSD-3-Clause",
"authors": [
{
"name": "Syed Irfaq R.",
"email": "syed+gh@lukonet.com",
"homepage": "https://github.com/irazasyed"
}
],
"require": {
"php": ">=5.5.0",
"guzzlehttp/guzzle": "~6.0",
"illuminate/support": "5.0.*|5.1.*"
},
"autoload": {
"psr-4": {
"Irazasyed\\Telegram\\": "src/"
}
},
"suggest": {
"irazasyed/larasupport": "Allows you to use any Laravel Package in Lumen by adding support!"
},
"extra": {
"branch-alias": {
"dev-master": "1.0-dev"
}
}
}
| Add Branch Alias to Composer. | Add Branch Alias to Composer. | JSON | bsd-3-clause | irazasyed/telegram-bot-sdk,antoniomadonna/telegram-bot-sdk,masterweb121/telegram-bot-sdk,mohsenHa/telegram-bot-sdk,Mirocow/telegram-bot-sdk,m4h4n/telegram-bot-sdk,ihoru/telegram-bot-sdk,halaei/telegram-bot,nandadotexe/telegram-bot-sdk,jonnywilliamson/telegram-bot-sdk | json | ## Code Before:
{
"name": "irazasyed/telegram-bot-sdk",
"description": "The Unofficial Telegram Bot API PHP SDK",
"keywords": ["telegram", "telegram bot", "telegram bot api", "telegram sdk", "telegram php", "laravel telegram", "laravel"],
"type": "library",
"homepage": "https://github.com/irazasyed/telegram-bot-sdk",
"license": "BSD-3-Clause",
"authors": [
{
"name": "Syed Irfaq R.",
"email": "syed+gh@lukonet.com",
"homepage": "https://github.com/irazasyed"
}
],
"require": {
"php": ">=5.5.0",
"guzzlehttp/guzzle": "~6.0",
"illuminate/support": "5.0.*|5.1.*"
},
"autoload": {
"psr-4": {
"Irazasyed\\Telegram\\": "src/"
}
},
"suggest": {
"irazasyed/larasupport": "Allows you to use any Laravel Package in Lumen by adding support!"
}
}
## Instruction:
Add Branch Alias to Composer.
## Code After:
{
"name": "irazasyed/telegram-bot-sdk",
"description": "The Unofficial Telegram Bot API PHP SDK",
"keywords": ["telegram", "telegram bot", "telegram bot api", "telegram sdk", "telegram php", "laravel telegram", "laravel"],
"type": "library",
"homepage": "https://github.com/irazasyed/telegram-bot-sdk",
"license": "BSD-3-Clause",
"authors": [
{
"name": "Syed Irfaq R.",
"email": "syed+gh@lukonet.com",
"homepage": "https://github.com/irazasyed"
}
],
"require": {
"php": ">=5.5.0",
"guzzlehttp/guzzle": "~6.0",
"illuminate/support": "5.0.*|5.1.*"
},
"autoload": {
"psr-4": {
"Irazasyed\\Telegram\\": "src/"
}
},
"suggest": {
"irazasyed/larasupport": "Allows you to use any Laravel Package in Lumen by adding support!"
},
"extra": {
"branch-alias": {
"dev-master": "1.0-dev"
}
}
}
| {
"name": "irazasyed/telegram-bot-sdk",
"description": "The Unofficial Telegram Bot API PHP SDK",
"keywords": ["telegram", "telegram bot", "telegram bot api", "telegram sdk", "telegram php", "laravel telegram", "laravel"],
"type": "library",
"homepage": "https://github.com/irazasyed/telegram-bot-sdk",
"license": "BSD-3-Clause",
"authors": [
{
"name": "Syed Irfaq R.",
"email": "syed+gh@lukonet.com",
"homepage": "https://github.com/irazasyed"
}
],
"require": {
"php": ">=5.5.0",
"guzzlehttp/guzzle": "~6.0",
"illuminate/support": "5.0.*|5.1.*"
},
"autoload": {
"psr-4": {
"Irazasyed\\Telegram\\": "src/"
}
},
"suggest": {
"irazasyed/larasupport": "Allows you to use any Laravel Package in Lumen by adding support!"
+ },
+ "extra": {
+ "branch-alias": {
+ "dev-master": "1.0-dev"
+ }
}
} | 5 | 0.178571 | 5 | 0 |
0addd8c91c2f85086d2c5cdb08f09a73c5139da1 | web/src/stores/scrolls.js | web/src/stores/scrolls.js | export default new class StoreScrolls {
positions = {};
constructor() {
// Save scroll position to store.
window.addEventListener('scroll', (e) => {
const id = e.target.getAttribute('scroller');
if (id) this.positions[id] = e.target.scrollTop;
}, { passive: true, capture: true });
// Apply scroll position from store.
(new MutationObserver((mutations) => {
for (const mutation of mutations) for (const node of mutation.addedNodes) for (const el of node.querySelectorAll('[scroller]')) el.scrollTop = this.positions[el.getAttribute('scroller')];
})).observe(document.getElementById('root'), {
childList: true,
subtree: true
});
}
};
| export default new class StoreScrolls {
positions = {};
constructor() {
// Save scroll position to store.
window.addEventListener('scroll', (e) => {
const id = e.target.getAttribute('scroller');
if (id) this.positions[id] = e.target.scrollTop;
}, { passive: true, capture: true });
// Apply scroll position from store.
(new MutationObserver((mutations) => {
for (const mutation of mutations) for (const node of mutation.addedNodes) if (node.querySelectorAll) for (const el of node.querySelectorAll('[scroller]')) el.scrollTop = this.positions[el.getAttribute('scroller')];
})).observe(document.getElementById('root'), {
childList: true,
subtree: true
});
}
};
| Fix temporary scroll store (web) | Fix temporary scroll store (web)
| JavaScript | agpl-3.0 | karlkoorna/Bussiaeg,karlkoorna/Bussiaeg | javascript | ## Code Before:
export default new class StoreScrolls {
positions = {};
constructor() {
// Save scroll position to store.
window.addEventListener('scroll', (e) => {
const id = e.target.getAttribute('scroller');
if (id) this.positions[id] = e.target.scrollTop;
}, { passive: true, capture: true });
// Apply scroll position from store.
(new MutationObserver((mutations) => {
for (const mutation of mutations) for (const node of mutation.addedNodes) for (const el of node.querySelectorAll('[scroller]')) el.scrollTop = this.positions[el.getAttribute('scroller')];
})).observe(document.getElementById('root'), {
childList: true,
subtree: true
});
}
};
## Instruction:
Fix temporary scroll store (web)
## Code After:
export default new class StoreScrolls {
positions = {};
constructor() {
// Save scroll position to store.
window.addEventListener('scroll', (e) => {
const id = e.target.getAttribute('scroller');
if (id) this.positions[id] = e.target.scrollTop;
}, { passive: true, capture: true });
// Apply scroll position from store.
(new MutationObserver((mutations) => {
for (const mutation of mutations) for (const node of mutation.addedNodes) if (node.querySelectorAll) for (const el of node.querySelectorAll('[scroller]')) el.scrollTop = this.positions[el.getAttribute('scroller')];
})).observe(document.getElementById('root'), {
childList: true,
subtree: true
});
}
};
| export default new class StoreScrolls {
positions = {};
constructor() {
// Save scroll position to store.
window.addEventListener('scroll', (e) => {
const id = e.target.getAttribute('scroller');
if (id) this.positions[id] = e.target.scrollTop;
}, { passive: true, capture: true });
// Apply scroll position from store.
(new MutationObserver((mutations) => {
- for (const mutation of mutations) for (const node of mutation.addedNodes) for (const el of node.querySelectorAll('[scroller]')) el.scrollTop = this.positions[el.getAttribute('scroller')];
? ^^^^^^^^^^^^^^^^^
+ for (const mutation of mutations) for (const node of mutation.addedNodes) if (node.querySelectorAll) for (const el of node.querySelectorAll('[scroller]')) el.scrollTop = this.positions[el.getAttribute('scroller')];
? ^^^^ ++++++++++++++++++++++++++++++++++++++++
})).observe(document.getElementById('root'), {
childList: true,
subtree: true
});
}
}; | 2 | 0.086957 | 1 | 1 |
f665baeeca6a29893b42b2da3aeb49f6e185b947 | README.md | README.md | [][travis]
# Typed Binary lib
Standard `Binary` serializes to `ByteString`, which is an untyped format;
deserialization of unexpected input usually results in unusable data.
This module is built around a `Typed` type, which allows serializing both a
value and the type of that value; deserialization can then check whether the
received data was sent assuming the right type, and error messages may provide
insight into the type mismatch.
More information can be found on the [binary-typed Hackage page][hackage], or
you can generate the documentation yourself (via `cabal haddock`).
[travis]: https://travis-ci.org/quchen/binary-typed
[hackage]: http://hackage.haskell.org/package/binary-typed | [][travis]
# Typed Binary lib
Standard `Binary` serializes to `ByteString`, which is an untyped format;
deserialization of unexpected input usually results in unusable data.
This module is built around a `Typed` type, which allows serializing both a
value and the type of that value; deserialization can then check whether the
received data was sent assuming the right type, and error messages may provide
insight into the type mismatch.
This package serves the same purpose as [tagged-binary][tagged-binary], with
a couple of key differences:
- Support of different kinds of serialized type annotations, each with
specific strengths and weaknesses.
- Error messages can provide details on type errors at the cost of
longer message lengths to include the necessary information.
- Serialization computationally almost as efficient as "Data.Binary" when
precaching type representations; decoding however is slower.
These values obviously depend a lot on the involved data and its type;
an example benchmark is shown in the picture below.
- No depencency on @Internal@ modules of other libraries, and a very small
dependency footprint in general.
For information about usage, see the `Tutorial` submodule.
Performance-wise, here is a value `Left (Left <100 chars lipsum>)` of
type `Either (Char, Int) (Either String (Maybe Integer))` benchmarked
using the `Hashed` type representation:

More information can be found on the [binary-typed Hackage page][hackage], or
you can generate the documentation yourself (via `cabal haddock`).
[travis]: https://travis-ci.org/quchen/binary-typed
[hackage]: http://hackage.haskell.org/package/binary-typed
[tagged-binary]: http://hackage.haskell.org/package/tagged-binary | Bring readme up to date with .cabal file | Bring readme up to date with .cabal file
[skip ci]
| Markdown | bsd-2-clause | quchen/binary-typed | markdown | ## Code Before:
[][travis]
# Typed Binary lib
Standard `Binary` serializes to `ByteString`, which is an untyped format;
deserialization of unexpected input usually results in unusable data.
This module is built around a `Typed` type, which allows serializing both a
value and the type of that value; deserialization can then check whether the
received data was sent assuming the right type, and error messages may provide
insight into the type mismatch.
More information can be found on the [binary-typed Hackage page][hackage], or
you can generate the documentation yourself (via `cabal haddock`).
[travis]: https://travis-ci.org/quchen/binary-typed
[hackage]: http://hackage.haskell.org/package/binary-typed
## Instruction:
Bring readme up to date with .cabal file
[skip ci]
## Code After:
[][travis]
# Typed Binary lib
Standard `Binary` serializes to `ByteString`, which is an untyped format;
deserialization of unexpected input usually results in unusable data.
This module is built around a `Typed` type, which allows serializing both a
value and the type of that value; deserialization can then check whether the
received data was sent assuming the right type, and error messages may provide
insight into the type mismatch.
This package serves the same purpose as [tagged-binary][tagged-binary], with
a couple of key differences:
- Support of different kinds of serialized type annotations, each with
specific strengths and weaknesses.
- Error messages can provide details on type errors at the cost of
longer message lengths to include the necessary information.
- Serialization computationally almost as efficient as "Data.Binary" when
precaching type representations; decoding however is slower.
These values obviously depend a lot on the involved data and its type;
an example benchmark is shown in the picture below.
- No depencency on @Internal@ modules of other libraries, and a very small
dependency footprint in general.
For information about usage, see the `Tutorial` submodule.
Performance-wise, here is a value `Left (Left <100 chars lipsum>)` of
type `Either (Char, Int) (Either String (Maybe Integer))` benchmarked
using the `Hashed` type representation:

More information can be found on the [binary-typed Hackage page][hackage], or
you can generate the documentation yourself (via `cabal haddock`).
[travis]: https://travis-ci.org/quchen/binary-typed
[hackage]: http://hackage.haskell.org/package/binary-typed
[tagged-binary]: http://hackage.haskell.org/package/tagged-binary | - [][travis]
? ---------------------
+ [][travis]
# Typed Binary lib
Standard `Binary` serializes to `ByteString`, which is an untyped format;
deserialization of unexpected input usually results in unusable data.
This module is built around a `Typed` type, which allows serializing both a
value and the type of that value; deserialization can then check whether the
received data was sent assuming the right type, and error messages may provide
insight into the type mismatch.
+ This package serves the same purpose as [tagged-binary][tagged-binary], with
+ a couple of key differences:
+
+ - Support of different kinds of serialized type annotations, each with
+ specific strengths and weaknesses.
+
+ - Error messages can provide details on type errors at the cost of
+ longer message lengths to include the necessary information.
+
+ - Serialization computationally almost as efficient as "Data.Binary" when
+ precaching type representations; decoding however is slower.
+ These values obviously depend a lot on the involved data and its type;
+ an example benchmark is shown in the picture below.
+
+ - No depencency on @Internal@ modules of other libraries, and a very small
+ dependency footprint in general.
+
+ For information about usage, see the `Tutorial` submodule.
+
+ Performance-wise, here is a value `Left (Left <100 chars lipsum>)` of
+ type `Either (Char, Int) (Either String (Maybe Integer))` benchmarked
+ using the `Hashed` type representation:
+
+ 
+
More information can be found on the [binary-typed Hackage page][hackage], or
you can generate the documentation yourself (via `cabal haddock`).
[travis]: https://travis-ci.org/quchen/binary-typed
[hackage]: http://hackage.haskell.org/package/binary-typed
+ [tagged-binary]: http://hackage.haskell.org/package/tagged-binary | 28 | 1.555556 | 27 | 1 |
06a6e5bf89d5cfac5330a538395de230489e2d43 | sack/helpers.go | sack/helpers.go | package sack
import (
"fmt"
"io/ioutil"
"path"
"strings"
)
func checkState() {}
func splitLine(s string) []string {
arr := strings.SplitN(s, ":", 3)
return arr
}
func check(e error) {
if e != nil {
fmt.Printf("\n----\nError: %#v\n----\n", e)
panic(e)
}
}
func content() []string {
filePath := path.Join(home, shortcutFilename)
dat, err := ioutil.ReadFile(filePath)
check(err)
lines := strings.Split(string(dat), "\n")
return lines[0 : len(lines)-1]
}
| package sack
import (
"fmt"
"io/ioutil"
"path"
"strings"
)
func checkState() {}
func splitLine(s string) []string {
arr := strings.SplitN(s, ":", 3)
return arr
}
func check(e error) {
if e != nil {
fmt.Printf("\n----\nError: %#v\n----\n", e)
panic(e)
}
}
func content() []string {
filePath := path.Join(home, shortcutFilename)
dat, err := ioutil.ReadFile(filePath)
if err != nil {
fmt.Println("Unable to open shortcut file. Try doing a search.")
panic(1)
}
lines := strings.Split(string(dat), "\n")
return lines[0 : len(lines)-1]
}
| Improve error message for no shortcut file | Improve error message for no shortcut file
| Go | mit | zph/go-sack,zph/go-sack | go | ## Code Before:
package sack
import (
"fmt"
"io/ioutil"
"path"
"strings"
)
func checkState() {}
func splitLine(s string) []string {
arr := strings.SplitN(s, ":", 3)
return arr
}
func check(e error) {
if e != nil {
fmt.Printf("\n----\nError: %#v\n----\n", e)
panic(e)
}
}
func content() []string {
filePath := path.Join(home, shortcutFilename)
dat, err := ioutil.ReadFile(filePath)
check(err)
lines := strings.Split(string(dat), "\n")
return lines[0 : len(lines)-1]
}
## Instruction:
Improve error message for no shortcut file
## Code After:
package sack
import (
"fmt"
"io/ioutil"
"path"
"strings"
)
func checkState() {}
func splitLine(s string) []string {
arr := strings.SplitN(s, ":", 3)
return arr
}
func check(e error) {
if e != nil {
fmt.Printf("\n----\nError: %#v\n----\n", e)
panic(e)
}
}
func content() []string {
filePath := path.Join(home, shortcutFilename)
dat, err := ioutil.ReadFile(filePath)
if err != nil {
fmt.Println("Unable to open shortcut file. Try doing a search.")
panic(1)
}
lines := strings.Split(string(dat), "\n")
return lines[0 : len(lines)-1]
}
| package sack
import (
"fmt"
"io/ioutil"
"path"
"strings"
)
func checkState() {}
func splitLine(s string) []string {
arr := strings.SplitN(s, ":", 3)
return arr
}
func check(e error) {
if e != nil {
fmt.Printf("\n----\nError: %#v\n----\n", e)
panic(e)
}
}
func content() []string {
filePath := path.Join(home, shortcutFilename)
dat, err := ioutil.ReadFile(filePath)
- check(err)
+ if err != nil {
+ fmt.Println("Unable to open shortcut file. Try doing a search.")
+ panic(1)
+ }
lines := strings.Split(string(dat), "\n")
return lines[0 : len(lines)-1]
} | 5 | 0.166667 | 4 | 1 |
02f441806e86aa581d76cf2793b9ec54ff5f7eb8 | assets/stylesheets/common.css.sass | assets/stylesheets/common.css.sass | *
@include box-sizing(border-box)
body
margin: 0
main
font: 1em "Helvetica Neue", sans-serif
margin: 0 auto
| *, *:before, *:after
@include box-sizing(border-box)
body
margin: 0
main
font: 1em "Helvetica Neue", sans-serif
margin: 0 auto
| Cover :before and :after with box sizing. | Cover :before and :after with box sizing.
| Sass | mit | danbee/mpd-client,danbee/mpd-client | sass | ## Code Before:
*
@include box-sizing(border-box)
body
margin: 0
main
font: 1em "Helvetica Neue", sans-serif
margin: 0 auto
## Instruction:
Cover :before and :after with box sizing.
## Code After:
*, *:before, *:after
@include box-sizing(border-box)
body
margin: 0
main
font: 1em "Helvetica Neue", sans-serif
margin: 0 auto
| - *
+ *, *:before, *:after
@include box-sizing(border-box)
body
margin: 0
main
font: 1em "Helvetica Neue", sans-serif
margin: 0 auto | 2 | 0.222222 | 1 | 1 |
7c86a23a3ab77a39ded479b80c6194cd98331775 | scraper/scraper.rb | scraper/scraper.rb | require 'nokogiri'
require 'open-uri'
require 'uri'
# Get the URL
puts 'Who you gonna scrape?'
target = gets.chomp
# Open the page
doc = Nokogiri::HTML(open("#{target}"))
# Gets dates and spits them out
doc.xpath('//div[@id="main-content"]//h2').each do |header|
h = header.content
# Removes "Agenda for" - need to remove any locations next
date = h.sub(/^Agenda\ for\ /, '')
puts date
end
| require 'nokogiri'
require 'open-uri'
require 'uri'
# Get the URL
puts 'Who you gonna scrape?'
target = gets.chomp
# Open the page
doc = Nokogiri::HTML(open("#{target}"))
# Gets dates and spits them out
doc.xpath('//div[@id="main-content"]//h2').each do |header|
h = header.content
# Removes "Agenda for" and stuff in parentheses at the end
date = h.sub(/^Agenda\ for\ /, '').sub(/\ *\([a-zA-Z0-9 ]*\)$/, '').chomp
puts date
end
| Cut the extra locations from h2s | Cut the extra locations from h2s
Gross regex.
| Ruby | apache-2.0 | tomnatt/show-tell,tomnatt/show-tell,tomnatt/show-tell | ruby | ## Code Before:
require 'nokogiri'
require 'open-uri'
require 'uri'
# Get the URL
puts 'Who you gonna scrape?'
target = gets.chomp
# Open the page
doc = Nokogiri::HTML(open("#{target}"))
# Gets dates and spits them out
doc.xpath('//div[@id="main-content"]//h2').each do |header|
h = header.content
# Removes "Agenda for" - need to remove any locations next
date = h.sub(/^Agenda\ for\ /, '')
puts date
end
## Instruction:
Cut the extra locations from h2s
Gross regex.
## Code After:
require 'nokogiri'
require 'open-uri'
require 'uri'
# Get the URL
puts 'Who you gonna scrape?'
target = gets.chomp
# Open the page
doc = Nokogiri::HTML(open("#{target}"))
# Gets dates and spits them out
doc.xpath('//div[@id="main-content"]//h2').each do |header|
h = header.content
# Removes "Agenda for" and stuff in parentheses at the end
date = h.sub(/^Agenda\ for\ /, '').sub(/\ *\([a-zA-Z0-9 ]*\)$/, '').chomp
puts date
end
| require 'nokogiri'
require 'open-uri'
require 'uri'
# Get the URL
puts 'Who you gonna scrape?'
target = gets.chomp
# Open the page
doc = Nokogiri::HTML(open("#{target}"))
# Gets dates and spits them out
doc.xpath('//div[@id="main-content"]//h2').each do |header|
h = header.content
- # Removes "Agenda for" - need to remove any locations next
- date = h.sub(/^Agenda\ for\ /, '')
+ # Removes "Agenda for" and stuff in parentheses at the end
+ date = h.sub(/^Agenda\ for\ /, '').sub(/\ *\([a-zA-Z0-9 ]*\)$/, '').chomp
puts date
end | 4 | 0.210526 | 2 | 2 |
aea6d9835c7b6e7824f431892f747cfe184beb1e | .travis.yml | .travis.yml | ---
sudo: false
language: ruby
bundler_args: --without system_tests
script: "bundle exec rake validate && bundle exec rake lint && bundle exec rake spec SPEC_OPTS='--format documentation'"
matrix:
fast_finish: true
include:
- rvm: 1.8.7
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 1.9.3
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.5
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.5
env: PUPPET_GEM_VERSION="~> 3.0" FUTURE_PARSER="yes"
- rvm: 2.1.6
env: PUPPET_GEM_VERSION="~> 4.0" STRICT_VARIABLES="yes"
notifications:
email: false
| ---
sudo: false
language: ruby
bundler_args: --without system_tests
script: "bundle exec rake validate && bundle exec rake lint && bundle exec rake spec SPEC_OPTS='--format documentation'"
matrix:
fast_finish: true
include:
- rvm: 1.9.3
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.5
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.5
env: PUPPET_GEM_VERSION="~> 3.0" FUTURE_PARSER="yes"
- rvm: 2.1.6
env: PUPPET_GEM_VERSION="~> 4.0" STRICT_VARIABLES="yes"
notifications:
email: false
| Drop support for Ruby 1.8.7 | Drop support for Ruby 1.8.7
| YAML | apache-2.0 | cegeka/puppet-rabbitmq,cegeka/puppet-rabbitmq | yaml | ## Code Before:
---
sudo: false
language: ruby
bundler_args: --without system_tests
script: "bundle exec rake validate && bundle exec rake lint && bundle exec rake spec SPEC_OPTS='--format documentation'"
matrix:
fast_finish: true
include:
- rvm: 1.8.7
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 1.9.3
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.5
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.5
env: PUPPET_GEM_VERSION="~> 3.0" FUTURE_PARSER="yes"
- rvm: 2.1.6
env: PUPPET_GEM_VERSION="~> 4.0" STRICT_VARIABLES="yes"
notifications:
email: false
## Instruction:
Drop support for Ruby 1.8.7
## Code After:
---
sudo: false
language: ruby
bundler_args: --without system_tests
script: "bundle exec rake validate && bundle exec rake lint && bundle exec rake spec SPEC_OPTS='--format documentation'"
matrix:
fast_finish: true
include:
- rvm: 1.9.3
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.5
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.5
env: PUPPET_GEM_VERSION="~> 3.0" FUTURE_PARSER="yes"
- rvm: 2.1.6
env: PUPPET_GEM_VERSION="~> 4.0" STRICT_VARIABLES="yes"
notifications:
email: false
| ---
sudo: false
language: ruby
bundler_args: --without system_tests
script: "bundle exec rake validate && bundle exec rake lint && bundle exec rake spec SPEC_OPTS='--format documentation'"
matrix:
fast_finish: true
include:
- - rvm: 1.8.7
- env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 1.9.3
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.5
env: PUPPET_GEM_VERSION="~> 3.0"
- rvm: 2.1.5
env: PUPPET_GEM_VERSION="~> 3.0" FUTURE_PARSER="yes"
- rvm: 2.1.6
env: PUPPET_GEM_VERSION="~> 4.0" STRICT_VARIABLES="yes"
notifications:
email: false | 2 | 0.1 | 0 | 2 |
30b4623fdb896378ba468b8289a1d64388434d03 | app/views/shared/_dojo.html.haml | app/views/shared/_dojo.html.haml | %li.dojo
%header
%a{:href => "#{dojo.url}"}
%span.dojo-picture{style: "background-image: url(#{dojo.logo});"}
%a{:href => "#{dojo.url}"}
%span.dojo-name
= "#{dojo.name} (#{dojo.prefecture.name}) "
- if not dojo.counter == 1
%span.dojo-counter{'data-toggle' => 'tooltip', 'data-placement' => 'bottom', 'data-original-title' => '道場数'}= "#{dojo.counter}"
%ul.tags
- dojo.tags.each do |tag|
%li= tag
%p.dojo-desciption
= dojo.description
- if dojo.is_private
%a.dojo-private{href: '/docs/private-dojo'} Private
| %li.dojo
%header
%a{href: "#{dojo.url}", target: "_blank", rel: "external noopener"}
%span.dojo-picture{style: "background-image: url(#{dojo.logo});"}
%a{href: "#{dojo.url}", target: "_blank", rel: "external noopener"}
%span.dojo-name
= "#{dojo.name} (#{dojo.prefecture.name}) "
- if not dojo.counter == 1
%span.dojo-counter{'data-toggle' => 'tooltip', 'data-placement' => 'bottom', 'data-original-title' => '道場数'}= "#{dojo.counter}"
%ul.tags
- dojo.tags.each do |tag|
%li= tag
%p.dojo-desciption
= dojo.description
- if dojo.is_private
%a.dojo-private{href: '/docs/private-dojo'} Private
| Add 'rel=noopener' attr to links in top page | Add 'rel=noopener' attr to links in top page
| Haml | mit | yasslab/coderdojo.jp,coderdojo-japan/coderdojo.jp,coderdojo-japan/coderdojo.jp,coderdojo-japan/coderdojo.jp,yasslab/coderdojo.jp,coderdojo-japan/coderdojo.jp,yasslab/coderdojo.jp | haml | ## Code Before:
%li.dojo
%header
%a{:href => "#{dojo.url}"}
%span.dojo-picture{style: "background-image: url(#{dojo.logo});"}
%a{:href => "#{dojo.url}"}
%span.dojo-name
= "#{dojo.name} (#{dojo.prefecture.name}) "
- if not dojo.counter == 1
%span.dojo-counter{'data-toggle' => 'tooltip', 'data-placement' => 'bottom', 'data-original-title' => '道場数'}= "#{dojo.counter}"
%ul.tags
- dojo.tags.each do |tag|
%li= tag
%p.dojo-desciption
= dojo.description
- if dojo.is_private
%a.dojo-private{href: '/docs/private-dojo'} Private
## Instruction:
Add 'rel=noopener' attr to links in top page
## Code After:
%li.dojo
%header
%a{href: "#{dojo.url}", target: "_blank", rel: "external noopener"}
%span.dojo-picture{style: "background-image: url(#{dojo.logo});"}
%a{href: "#{dojo.url}", target: "_blank", rel: "external noopener"}
%span.dojo-name
= "#{dojo.name} (#{dojo.prefecture.name}) "
- if not dojo.counter == 1
%span.dojo-counter{'data-toggle' => 'tooltip', 'data-placement' => 'bottom', 'data-original-title' => '道場数'}= "#{dojo.counter}"
%ul.tags
- dojo.tags.each do |tag|
%li= tag
%p.dojo-desciption
= dojo.description
- if dojo.is_private
%a.dojo-private{href: '/docs/private-dojo'} Private
| %li.dojo
%header
- %a{:href => "#{dojo.url}"}
+ %a{href: "#{dojo.url}", target: "_blank", rel: "external noopener"}
%span.dojo-picture{style: "background-image: url(#{dojo.logo});"}
- %a{:href => "#{dojo.url}"}
+ %a{href: "#{dojo.url}", target: "_blank", rel: "external noopener"}
%span.dojo-name
= "#{dojo.name} (#{dojo.prefecture.name}) "
- if not dojo.counter == 1
%span.dojo-counter{'data-toggle' => 'tooltip', 'data-placement' => 'bottom', 'data-original-title' => '道場数'}= "#{dojo.counter}"
%ul.tags
- dojo.tags.each do |tag|
%li= tag
%p.dojo-desciption
= dojo.description
- if dojo.is_private
%a.dojo-private{href: '/docs/private-dojo'} Private | 4 | 0.25 | 2 | 2 |
667c0519b457e912b4b4e8429032399081ae43e6 | README.md | README.md | Flight simulator device for drosophila
| Flight simulator device for drosophila
## BaseSimulationDemo
This module is a demo of plot accumulative point.
| Add the markdown description about the basesinulationdemo | Add the markdown description about the basesinulationdemo
| Markdown | mit | Taaccoo-beta/Flight-Simulator-In-Drosophila,Taaccoo-beta/Flight-Simulator-In-Drosophila,Taaccoo-beta/Flight-Simulator-In-Drosophila | markdown | ## Code Before:
Flight simulator device for drosophila
## Instruction:
Add the markdown description about the basesinulationdemo
## Code After:
Flight simulator device for drosophila
## BaseSimulationDemo
This module is a demo of plot accumulative point.
| Flight simulator device for drosophila
+
+ ## BaseSimulationDemo
+ This module is a demo of plot accumulative point. | 3 | 3 | 3 | 0 |
dba6caa3b2ad42c567b4767a08a27c869df296f8 | lib/vagrant-ec2/actions/check_state.rb | lib/vagrant-ec2/actions/check_state.rb | require 'aws-sdk'
module VagrantPlugins
module Ec2
module Actions
class CheckState
def initialize(app, env)
@app = app
end
def call(env)
if ! env[:machine].id
env[:machine_state] = :not_created
elsif ! env[:machine_state]
ec2 = Aws::EC2::Resource.new(env[:connection_options])
instance = ec2.instance(env[:machine].id)
if ! instance.exists?
env[:machine].id = nil
env[:machine_state] = :not_created
else
env[:machine_state] = instance.state.name.to_sym
end
Provider.set_instance_state(env[:machine], env[:machine_state])
end
@app.call(env)
end
end
end
end
end
| require 'aws-sdk'
module VagrantPlugins
module Ec2
module Actions
class CheckState
def initialize(app, env)
@app = app
end
def call(env)
if ! env[:machine].id
env[:machine_state] = :not_created
elsif ! env[:machine_state]
ec2 = Aws::EC2::Resource.new(env[:connection_options])
instance = ec2.instance(env[:machine].id)
if ! instance.exists? || instance.state.name == "shutting-down" || instance.state.name == "terminated"
env[:machine_state] = :not_created
else
env[:machine_state] = instance.state.name.to_sym
end
Provider.set_instance_state(env[:machine], env[:machine_state])
env[:machine].id = nil if env[:machine_state] === :not_created
end
@app.call(env)
end
end
end
end
end
| Fix the case where the instance exists but is being shutdown | Fix the case where the instance exists but is being shutdown
| Ruby | mit | ColinHebert/vagrant-ec2,ColinHebert/vagrant-ec2 | ruby | ## Code Before:
require 'aws-sdk'
module VagrantPlugins
module Ec2
module Actions
class CheckState
def initialize(app, env)
@app = app
end
def call(env)
if ! env[:machine].id
env[:machine_state] = :not_created
elsif ! env[:machine_state]
ec2 = Aws::EC2::Resource.new(env[:connection_options])
instance = ec2.instance(env[:machine].id)
if ! instance.exists?
env[:machine].id = nil
env[:machine_state] = :not_created
else
env[:machine_state] = instance.state.name.to_sym
end
Provider.set_instance_state(env[:machine], env[:machine_state])
end
@app.call(env)
end
end
end
end
end
## Instruction:
Fix the case where the instance exists but is being shutdown
## Code After:
require 'aws-sdk'
module VagrantPlugins
module Ec2
module Actions
class CheckState
def initialize(app, env)
@app = app
end
def call(env)
if ! env[:machine].id
env[:machine_state] = :not_created
elsif ! env[:machine_state]
ec2 = Aws::EC2::Resource.new(env[:connection_options])
instance = ec2.instance(env[:machine].id)
if ! instance.exists? || instance.state.name == "shutting-down" || instance.state.name == "terminated"
env[:machine_state] = :not_created
else
env[:machine_state] = instance.state.name.to_sym
end
Provider.set_instance_state(env[:machine], env[:machine_state])
env[:machine].id = nil if env[:machine_state] === :not_created
end
@app.call(env)
end
end
end
end
end
| require 'aws-sdk'
module VagrantPlugins
module Ec2
module Actions
class CheckState
def initialize(app, env)
@app = app
end
def call(env)
if ! env[:machine].id
env[:machine_state] = :not_created
elsif ! env[:machine_state]
ec2 = Aws::EC2::Resource.new(env[:connection_options])
instance = ec2.instance(env[:machine].id)
+ if ! instance.exists? || instance.state.name == "shutting-down" || instance.state.name == "terminated"
- if ! instance.exists?
- env[:machine].id = nil
env[:machine_state] = :not_created
else
env[:machine_state] = instance.state.name.to_sym
end
Provider.set_instance_state(env[:machine], env[:machine_state])
+ env[:machine].id = nil if env[:machine_state] === :not_created
end
@app.call(env)
end
end
end
end
end | 4 | 0.129032 | 2 | 2 |
45ad682d9e511fe34328e454ad30c57d90fc489d | sql/wf_ac_group_roles.sql | sql/wf_ac_group_roles.sql | DROP TABLE IF EXISTS wf_access_contexts;
--
CREATE TABLE wf_access_contexts (
id INT NOT NULL AUTO_INCREMENT,
ns_id INT NOT NULL,
group_id INT NOT NULL,
role_id INT NOT NULL,
PRIMARY KEY (id),
FOREIGN KEY (group_id) REFERENCES wf_groups_master(id),
FOREIGN KEY (role_id) REFERENCES wf_roles_master(id)
);
| DROP TABLE IF EXISTS wf_ac_group_roles;
--
CREATE TABLE wf_ac_group_roles (
id INT NOT NULL AUTO_INCREMENT,
ns_id INT NOT NULL,
group_id INT NOT NULL,
role_id INT NOT NULL,
PRIMARY KEY (id),
FOREIGN KEY (group_id) REFERENCES wf_groups_master(id),
FOREIGN KEY (role_id) REFERENCES wf_roles_master(id)
);
| Update SQL DDL to match renamed table | Update SQL DDL to match renamed table
| SQL | apache-2.0 | js-ojus/flow,js-ojus/flow | sql | ## Code Before:
DROP TABLE IF EXISTS wf_access_contexts;
--
CREATE TABLE wf_access_contexts (
id INT NOT NULL AUTO_INCREMENT,
ns_id INT NOT NULL,
group_id INT NOT NULL,
role_id INT NOT NULL,
PRIMARY KEY (id),
FOREIGN KEY (group_id) REFERENCES wf_groups_master(id),
FOREIGN KEY (role_id) REFERENCES wf_roles_master(id)
);
## Instruction:
Update SQL DDL to match renamed table
## Code After:
DROP TABLE IF EXISTS wf_ac_group_roles;
--
CREATE TABLE wf_ac_group_roles (
id INT NOT NULL AUTO_INCREMENT,
ns_id INT NOT NULL,
group_id INT NOT NULL,
role_id INT NOT NULL,
PRIMARY KEY (id),
FOREIGN KEY (group_id) REFERENCES wf_groups_master(id),
FOREIGN KEY (role_id) REFERENCES wf_roles_master(id)
);
| - DROP TABLE IF EXISTS wf_access_contexts;
+ DROP TABLE IF EXISTS wf_ac_group_roles;
--
- CREATE TABLE wf_access_contexts (
+ CREATE TABLE wf_ac_group_roles (
id INT NOT NULL AUTO_INCREMENT,
ns_id INT NOT NULL,
group_id INT NOT NULL,
role_id INT NOT NULL,
PRIMARY KEY (id),
FOREIGN KEY (group_id) REFERENCES wf_groups_master(id),
FOREIGN KEY (role_id) REFERENCES wf_roles_master(id)
); | 4 | 0.307692 | 2 | 2 |
3f5c396820812f8a80e9ef9da33f7005af3393d7 | Cargo.toml | Cargo.toml | [package]
name = "ppapi"
version = "0.0.1"
authors = ["Richard Diamond"]
links = "helper"
build = "src/build.rs"
[lib]
name = "ppapi"
path = "src/lib/lib.rs"
[dependencies.http]
git = "https://github.com/chris-morgan/rust-http.git"
[dependencies.openssl]
git = "https://github.com/sfackler/rust-openssl.git"
[build-dependencies.pnacl-build-helper]
git = "https://github.com/DiamondLovesYou/cargo-pnacl-helper.git"
| [package]
name = "ppapi"
version = "0.0.1"
authors = ["Richard Diamond"]
links = "helper"
build = "src/build.rs"
[lib]
name = "ppapi"
path = "src/lib/lib.rs"
[dependencies.http]
git = "https://github.com/DiamondLovesYou/rust-http.git"
[dependencies.openssl]
git = "https://github.com/DiamondLovesYou/rust-openssl.git"
[build-dependencies.pnacl-build-helper]
git = "https://github.com/DiamondLovesYou/cargo-pnacl-helper.git"
| Revert "Revert to upstream repos of rust-http && rust-openssl now that" | Revert "Revert to upstream repos of rust-http && rust-openssl now that"
This reverts commit c7ea05f74f9fd033133cd093187c98d084ea464b.
| TOML | mpl-2.0 | DiamondLovesYou/rust-ppapi,DiamondLovesYou/rust-ppapi,DiamondLovesYou/rust-ppapi,DiamondLovesYou/rust-ppapi | toml | ## Code Before:
[package]
name = "ppapi"
version = "0.0.1"
authors = ["Richard Diamond"]
links = "helper"
build = "src/build.rs"
[lib]
name = "ppapi"
path = "src/lib/lib.rs"
[dependencies.http]
git = "https://github.com/chris-morgan/rust-http.git"
[dependencies.openssl]
git = "https://github.com/sfackler/rust-openssl.git"
[build-dependencies.pnacl-build-helper]
git = "https://github.com/DiamondLovesYou/cargo-pnacl-helper.git"
## Instruction:
Revert "Revert to upstream repos of rust-http && rust-openssl now that"
This reverts commit c7ea05f74f9fd033133cd093187c98d084ea464b.
## Code After:
[package]
name = "ppapi"
version = "0.0.1"
authors = ["Richard Diamond"]
links = "helper"
build = "src/build.rs"
[lib]
name = "ppapi"
path = "src/lib/lib.rs"
[dependencies.http]
git = "https://github.com/DiamondLovesYou/rust-http.git"
[dependencies.openssl]
git = "https://github.com/DiamondLovesYou/rust-openssl.git"
[build-dependencies.pnacl-build-helper]
git = "https://github.com/DiamondLovesYou/cargo-pnacl-helper.git"
| [package]
name = "ppapi"
version = "0.0.1"
authors = ["Richard Diamond"]
links = "helper"
build = "src/build.rs"
[lib]
name = "ppapi"
path = "src/lib/lib.rs"
[dependencies.http]
- git = "https://github.com/chris-morgan/rust-http.git"
? ^^^ ^^ ---
+ git = "https://github.com/DiamondLovesYou/rust-http.git"
? ^ ^ +++++++++
[dependencies.openssl]
- git = "https://github.com/sfackler/rust-openssl.git"
? ^^^^^^^
+ git = "https://github.com/DiamondLovesYou/rust-openssl.git"
? +++++++++++ ^^^
[build-dependencies.pnacl-build-helper]
git = "https://github.com/DiamondLovesYou/cargo-pnacl-helper.git" | 4 | 0.210526 | 2 | 2 |
bb8e4d4d0e2d0c46d7d0df5779b8a6623f28eb96 | .travis.yml | .travis.yml | language: python
# Run the test runner using the same version of Python we use.
python: 3.6
# Install the test runner.
install: pip install tox
# Run each environment separately so we get errors back from all of them.
env:
- TOX_ENV=py36
- TOX_ENV=py35
- TOX_ENV=py34
- TOX_ENV=pep8
- TOX_ENV=docs
script:
- tox -e $TOX_ENV
# Control the branches that get built.
branches:
only:
- master
| language: python
matrix:
include:
- python: 3.4
env: TOX_ENV=py34
- python: 3.5
env: TOX_ENV=py35
- python: 3.6
env: TOX_ENV=py36
- python: 3.6
env: TOX_ENV=docs
- python: 3.6
env: TOX_ENV=pep8
install: pip install tox
script: tox -e $TOX_ENV
# Control the branches that get built.
branches:
only:
- master
| Use a Travis CI build matrix | Use a Travis CI build matrix
In order to use both Python 3.5 and 3.6 on Travis CI, both need to be
included in the list of Python versions. This would result in the entire
build matrix being run using both versions. 3.5 jobs would fail on the
3.6 image and 3.6 jobs would fail on the 3.5 image.
While more verbose than what we had before, this build matrix allows us
to run each job with the correct version of Python.
| YAML | apache-2.0 | iheartradio/Henson | yaml | ## Code Before:
language: python
# Run the test runner using the same version of Python we use.
python: 3.6
# Install the test runner.
install: pip install tox
# Run each environment separately so we get errors back from all of them.
env:
- TOX_ENV=py36
- TOX_ENV=py35
- TOX_ENV=py34
- TOX_ENV=pep8
- TOX_ENV=docs
script:
- tox -e $TOX_ENV
# Control the branches that get built.
branches:
only:
- master
## Instruction:
Use a Travis CI build matrix
In order to use both Python 3.5 and 3.6 on Travis CI, both need to be
included in the list of Python versions. This would result in the entire
build matrix being run using both versions. 3.5 jobs would fail on the
3.6 image and 3.6 jobs would fail on the 3.5 image.
While more verbose than what we had before, this build matrix allows us
to run each job with the correct version of Python.
## Code After:
language: python
matrix:
include:
- python: 3.4
env: TOX_ENV=py34
- python: 3.5
env: TOX_ENV=py35
- python: 3.6
env: TOX_ENV=py36
- python: 3.6
env: TOX_ENV=docs
- python: 3.6
env: TOX_ENV=pep8
install: pip install tox
script: tox -e $TOX_ENV
# Control the branches that get built.
branches:
only:
- master
| language: python
- # Run the test runner using the same version of Python we use.
+ matrix:
+ include:
+ - python: 3.4
+ env: TOX_ENV=py34
+ - python: 3.5
+ env: TOX_ENV=py35
- python: 3.6
+ - python: 3.6
? ++++++
+ env: TOX_ENV=py36
+ - python: 3.6
+ env: TOX_ENV=docs
+ - python: 3.6
+ env: TOX_ENV=pep8
- # Install the test runner.
install: pip install tox
-
- # Run each environment separately so we get errors back from all of them.
- env:
- - TOX_ENV=py36
- - TOX_ENV=py35
- - TOX_ENV=py34
- - TOX_ENV=pep8
- - TOX_ENV=docs
- script:
- - tox -e $TOX_ENV
? ^^^
+ script: tox -e $TOX_ENV
? ^^^^^^^
# Control the branches that get built.
branches:
only:
- master | 26 | 1.181818 | 13 | 13 |
f2fc7f1015fc24fdbb69069ac74a21437e94657b | xmantissa/plugins/sineoff.py | xmantissa/plugins/sineoff.py | from axiom import iaxiom, userbase
from xmantissa import website, offering, provisioning
from sine import sipserver, sinetheme
sineproxy = provisioning.BenefactorFactory(
name = u'sineproxy',
description = u'Sine SIP Proxy',
benefactorClass = sipserver.SineBenefactor)
plugin = offering.Offering(
name = u"Sine",
description = u"""
The Sine SIP proxy and registrar.
""",
siteRequirements = (
(userbase.IRealm, userbase.LoginSystem),
(None, website.WebSite),
(None, sipserver.SIPServer)),
appPowerups = (sipserver.SinePublicPage,
),
benefactorFactories = (sineproxy,),
loginInterfaces=(),
themes = (sinetheme.XHTMLDirectoryTheme('base'),)
)
| from axiom import iaxiom, userbase
from xmantissa import website, offering, provisioning
from sine import sipserver, sinetheme
sineproxy = provisioning.BenefactorFactory(
name = u'sineproxy',
description = u'Sine SIP Proxy',
benefactorClass = sipserver.SineBenefactor)
plugin = offering.Offering(
name = u"Sine",
description = u"""
The Sine SIP proxy and registrar.
""",
siteRequirements = (
(userbase.IRealm, userbase.LoginSystem),
(None, website.WebSite),
(None, sipserver.SIPServer)),
appPowerups = (sipserver.SinePublicPage,
),
benefactorFactories = (sineproxy,),
themes = (sinetheme.XHTMLDirectoryTheme('base'),)
)
| Revert 5505 - introduced numerous regressions into the test suite | Revert 5505 - introduced numerous regressions into the test suite | Python | mit | habnabit/divmod-sine,twisted/sine | python | ## Code Before:
from axiom import iaxiom, userbase
from xmantissa import website, offering, provisioning
from sine import sipserver, sinetheme
sineproxy = provisioning.BenefactorFactory(
name = u'sineproxy',
description = u'Sine SIP Proxy',
benefactorClass = sipserver.SineBenefactor)
plugin = offering.Offering(
name = u"Sine",
description = u"""
The Sine SIP proxy and registrar.
""",
siteRequirements = (
(userbase.IRealm, userbase.LoginSystem),
(None, website.WebSite),
(None, sipserver.SIPServer)),
appPowerups = (sipserver.SinePublicPage,
),
benefactorFactories = (sineproxy,),
loginInterfaces=(),
themes = (sinetheme.XHTMLDirectoryTheme('base'),)
)
## Instruction:
Revert 5505 - introduced numerous regressions into the test suite
## Code After:
from axiom import iaxiom, userbase
from xmantissa import website, offering, provisioning
from sine import sipserver, sinetheme
sineproxy = provisioning.BenefactorFactory(
name = u'sineproxy',
description = u'Sine SIP Proxy',
benefactorClass = sipserver.SineBenefactor)
plugin = offering.Offering(
name = u"Sine",
description = u"""
The Sine SIP proxy and registrar.
""",
siteRequirements = (
(userbase.IRealm, userbase.LoginSystem),
(None, website.WebSite),
(None, sipserver.SIPServer)),
appPowerups = (sipserver.SinePublicPage,
),
benefactorFactories = (sineproxy,),
themes = (sinetheme.XHTMLDirectoryTheme('base'),)
)
| from axiom import iaxiom, userbase
from xmantissa import website, offering, provisioning
from sine import sipserver, sinetheme
sineproxy = provisioning.BenefactorFactory(
name = u'sineproxy',
description = u'Sine SIP Proxy',
benefactorClass = sipserver.SineBenefactor)
plugin = offering.Offering(
name = u"Sine",
description = u"""
The Sine SIP proxy and registrar.
""",
siteRequirements = (
(userbase.IRealm, userbase.LoginSystem),
(None, website.WebSite),
(None, sipserver.SIPServer)),
appPowerups = (sipserver.SinePublicPage,
),
benefactorFactories = (sineproxy,),
- loginInterfaces=(),
+
themes = (sinetheme.XHTMLDirectoryTheme('base'),)
)
| 2 | 0.064516 | 1 | 1 |
d34c47708fa2324d60f1ceb141e6173476896187 | scripts/virtualbox.sh | scripts/virtualbox.sh | yum -y install gcc kernel-devel-`uname -r`
VBOX_VERSION=$(cat /tmp/.vbox_version)
cd /tmp
mount -o loop /tmp/VBoxGuestAdditions_$VBOX_VERSION.iso /media
sh /media/VBoxLinuxAdditions.run
umount /media
rm -rf /tmp/VBoxGuestAdditions_*.iso
| yum -y install gcc make kernel-devel-`uname -r`
KERN_DIR=/usr/src/kernels/`uname -r`
export KERN_DIR
VBOX_VERSION=$(cat /tmp/.vbox_version)
cd /tmp
mount -o loop /tmp/VBoxGuestAdditions_$VBOX_VERSION.iso /media
sh /media/VBoxLinuxAdditions.run
umount /media
rm -rf /tmp/VBoxGuestAdditions_*.iso
| Create ENV containing path to current kernel source | Create ENV containing path to current kernel source
| Shell | mit | northerngit/packer-vagrant-centos | shell | ## Code Before:
yum -y install gcc kernel-devel-`uname -r`
VBOX_VERSION=$(cat /tmp/.vbox_version)
cd /tmp
mount -o loop /tmp/VBoxGuestAdditions_$VBOX_VERSION.iso /media
sh /media/VBoxLinuxAdditions.run
umount /media
rm -rf /tmp/VBoxGuestAdditions_*.iso
## Instruction:
Create ENV containing path to current kernel source
## Code After:
yum -y install gcc make kernel-devel-`uname -r`
KERN_DIR=/usr/src/kernels/`uname -r`
export KERN_DIR
VBOX_VERSION=$(cat /tmp/.vbox_version)
cd /tmp
mount -o loop /tmp/VBoxGuestAdditions_$VBOX_VERSION.iso /media
sh /media/VBoxLinuxAdditions.run
umount /media
rm -rf /tmp/VBoxGuestAdditions_*.iso
| - yum -y install gcc kernel-devel-`uname -r`
+ yum -y install gcc make kernel-devel-`uname -r`
? +++++
+
+ KERN_DIR=/usr/src/kernels/`uname -r`
+ export KERN_DIR
VBOX_VERSION=$(cat /tmp/.vbox_version)
cd /tmp
mount -o loop /tmp/VBoxGuestAdditions_$VBOX_VERSION.iso /media
sh /media/VBoxLinuxAdditions.run
umount /media
rm -rf /tmp/VBoxGuestAdditions_*.iso | 5 | 0.625 | 4 | 1 |
7587e36716db1230c6920a473fd6fa84ed41605a | README.md | README.md |
Http4s is a minimal, idiomatic Scala interface for HTTP services. Http4s is
Scala's answer to Ruby's Rack, Python's WSGI, Haskell's WAI, and Java's
Servlets.
Learn more at [http4s.org](http://http4s.org/).
|
Http4s is a minimal, idiomatic Scala interface for HTTP services. Http4s is
Scala's answer to Ruby's Rack, Python's WSGI, Haskell's WAI, and Java's
Servlets.
```scala
val service: HttpService = {
case Get -> Root / "hello" =>
Ok("Hello, better world.")
}
```
Learn more at [http4s.org](http://http4s.org/).
| Add hello world example back to readme. | Add hello world example back to readme.
| Markdown | apache-2.0 | m4dc4p/http4s,hvesalai/http4s,m4dc4p/http4s,http4s/http4s,hvesalai/http4s,aeons/http4s,rossabaker/http4s,m4dc4p/http4s,ZizhengTai/http4s,aeons/http4s,ChristopherDavenport/http4s,hvesalai/http4s,reactormonk/http4s,rossabaker/http4s,reactormonk/http4s,aeons/http4s,ChristopherDavenport/http4s,ZizhengTai/http4s,reactormonk/http4s,ChristopherDavenport/http4s,ZizhengTai/http4s | markdown | ## Code Before:
Http4s is a minimal, idiomatic Scala interface for HTTP services. Http4s is
Scala's answer to Ruby's Rack, Python's WSGI, Haskell's WAI, and Java's
Servlets.
Learn more at [http4s.org](http://http4s.org/).
## Instruction:
Add hello world example back to readme.
## Code After:
Http4s is a minimal, idiomatic Scala interface for HTTP services. Http4s is
Scala's answer to Ruby's Rack, Python's WSGI, Haskell's WAI, and Java's
Servlets.
```scala
val service: HttpService = {
case Get -> Root / "hello" =>
Ok("Hello, better world.")
}
```
Learn more at [http4s.org](http://http4s.org/).
|
Http4s is a minimal, idiomatic Scala interface for HTTP services. Http4s is
Scala's answer to Ruby's Rack, Python's WSGI, Haskell's WAI, and Java's
Servlets.
+ ```scala
+ val service: HttpService = {
+ case Get -> Root / "hello" =>
+ Ok("Hello, better world.")
+ }
+ ```
+
Learn more at [http4s.org](http://http4s.org/). | 7 | 1.166667 | 7 | 0 |
07b8af5da46c70342342f4eea922e36913ed995f | .gitlab-ci.yml | .gitlab-ci.yml | before_script:
- docker info
- env
- dig security.debian.org
- ping -c2 security.debian.org
- cat /etc/resolv.conf
build_image:
script:
- docker build -t bldr-docker-registry.lab.lineratesystems.com:5000/talley/f5-service-router:build$CI_BUILD_ID .
- docker push bldr-docker-registry.lab.lineratesystems.com:5000/talley/f5-service-router:build$CI_BUILD_ID
tags:
- docker-build
| before_script:
- docker info
- env
- dig security.debian.org
- ping -c2 security.debian.org
- cat /etc/resolv.conf
build_image:
script:
- docker build -t bldr-docker-registry.lab.lineratesystems.com:5000/andrewj/f5-service-router:build$CI_BUILD_ID .
- docker push bldr-docker-registry.lab.lineratesystems.com:5000/andrewj/f5-service-router:build$CI_BUILD_ID
tags:
- docker-build
| Put my username in gitlab CI instead of Brian Talley's | Put my username in gitlab CI instead of Brian Talley's | YAML | apache-2.0 | michaeldayreads/marathon-bigip-ctlr,michaeldayreads/marathon-bigip-ctlr,michaeldayreads/marathon-bigip-ctlr | yaml | ## Code Before:
before_script:
- docker info
- env
- dig security.debian.org
- ping -c2 security.debian.org
- cat /etc/resolv.conf
build_image:
script:
- docker build -t bldr-docker-registry.lab.lineratesystems.com:5000/talley/f5-service-router:build$CI_BUILD_ID .
- docker push bldr-docker-registry.lab.lineratesystems.com:5000/talley/f5-service-router:build$CI_BUILD_ID
tags:
- docker-build
## Instruction:
Put my username in gitlab CI instead of Brian Talley's
## Code After:
before_script:
- docker info
- env
- dig security.debian.org
- ping -c2 security.debian.org
- cat /etc/resolv.conf
build_image:
script:
- docker build -t bldr-docker-registry.lab.lineratesystems.com:5000/andrewj/f5-service-router:build$CI_BUILD_ID .
- docker push bldr-docker-registry.lab.lineratesystems.com:5000/andrewj/f5-service-router:build$CI_BUILD_ID
tags:
- docker-build
| before_script:
- docker info
- env
- dig security.debian.org
- ping -c2 security.debian.org
- cat /etc/resolv.conf
build_image:
script:
- - docker build -t bldr-docker-registry.lab.lineratesystems.com:5000/talley/f5-service-router:build$CI_BUILD_ID .
? - ^^ ^
+ - docker build -t bldr-docker-registry.lab.lineratesystems.com:5000/andrewj/f5-service-router:build$CI_BUILD_ID .
? ^^^ ^^
- - docker push bldr-docker-registry.lab.lineratesystems.com:5000/talley/f5-service-router:build$CI_BUILD_ID
? - ^^ ^
+ - docker push bldr-docker-registry.lab.lineratesystems.com:5000/andrewj/f5-service-router:build$CI_BUILD_ID
? ^^^ ^^
tags:
- docker-build | 4 | 0.307692 | 2 | 2 |
31cf067f3e4da104551baf0e02332e22a75bb80a | tests/commit/field/test__field_math.py | tests/commit/field/test__field_math.py | from unittest import TestCase
from phi import math
from phi.geom import Box
from phi import field
from phi.physics import Domain
class TestFieldMath(TestCase):
def test_gradient(self):
domain = Domain(x=4, y=3)
phi = domain.grid() * (1, 2)
grad = field.gradient(phi, stack_dim='gradient')
self.assertEqual(('spatial', 'spatial', 'channel', 'channel'), grad.shape.types)
def test_divergence_centered(self):
v = field.CenteredGrid(math.ones(x=3, y=3), Box[0:1, 0:1], math.extrapolation.ZERO) * (1, 0) # flow to the right
div = field.divergence(v).values
math.assert_close(div.y[0], (1.5, 0, -1.5))
| from unittest import TestCase
from phi import math
from phi.field import StaggeredGrid, CenteredGrid
from phi.geom import Box
from phi import field
from phi.physics import Domain
class TestFieldMath(TestCase):
def test_gradient(self):
domain = Domain(x=4, y=3)
phi = domain.grid() * (1, 2)
grad = field.gradient(phi, stack_dim='gradient')
self.assertEqual(('spatial', 'spatial', 'channel', 'channel'), grad.shape.types)
def test_divergence_centered(self):
v = field.CenteredGrid(math.ones(x=3, y=3), Box[0:1, 0:1], math.extrapolation.ZERO) * (1, 0) # flow to the right
div = field.divergence(v).values
math.assert_close(div.y[0], (1.5, 0, -1.5))
def test_trace_function(self):
def f(x: StaggeredGrid, y: CenteredGrid):
return x + (y >> x)
ft = field.trace_function(f)
domain = Domain(x=4, y=3)
x = domain.staggered_grid(1)
y = domain.vector_grid(1)
res_f = f(x, y)
res_ft = ft(x, y)
self.assertEqual(res_f.shape, res_ft.shape)
field.assert_close(res_f, res_ft)
| Add unit test, update documentation | Add unit test, update documentation
| Python | mit | tum-pbs/PhiFlow,tum-pbs/PhiFlow | python | ## Code Before:
from unittest import TestCase
from phi import math
from phi.geom import Box
from phi import field
from phi.physics import Domain
class TestFieldMath(TestCase):
def test_gradient(self):
domain = Domain(x=4, y=3)
phi = domain.grid() * (1, 2)
grad = field.gradient(phi, stack_dim='gradient')
self.assertEqual(('spatial', 'spatial', 'channel', 'channel'), grad.shape.types)
def test_divergence_centered(self):
v = field.CenteredGrid(math.ones(x=3, y=3), Box[0:1, 0:1], math.extrapolation.ZERO) * (1, 0) # flow to the right
div = field.divergence(v).values
math.assert_close(div.y[0], (1.5, 0, -1.5))
## Instruction:
Add unit test, update documentation
## Code After:
from unittest import TestCase
from phi import math
from phi.field import StaggeredGrid, CenteredGrid
from phi.geom import Box
from phi import field
from phi.physics import Domain
class TestFieldMath(TestCase):
def test_gradient(self):
domain = Domain(x=4, y=3)
phi = domain.grid() * (1, 2)
grad = field.gradient(phi, stack_dim='gradient')
self.assertEqual(('spatial', 'spatial', 'channel', 'channel'), grad.shape.types)
def test_divergence_centered(self):
v = field.CenteredGrid(math.ones(x=3, y=3), Box[0:1, 0:1], math.extrapolation.ZERO) * (1, 0) # flow to the right
div = field.divergence(v).values
math.assert_close(div.y[0], (1.5, 0, -1.5))
def test_trace_function(self):
def f(x: StaggeredGrid, y: CenteredGrid):
return x + (y >> x)
ft = field.trace_function(f)
domain = Domain(x=4, y=3)
x = domain.staggered_grid(1)
y = domain.vector_grid(1)
res_f = f(x, y)
res_ft = ft(x, y)
self.assertEqual(res_f.shape, res_ft.shape)
field.assert_close(res_f, res_ft)
| from unittest import TestCase
from phi import math
+ from phi.field import StaggeredGrid, CenteredGrid
from phi.geom import Box
from phi import field
from phi.physics import Domain
class TestFieldMath(TestCase):
def test_gradient(self):
domain = Domain(x=4, y=3)
phi = domain.grid() * (1, 2)
grad = field.gradient(phi, stack_dim='gradient')
self.assertEqual(('spatial', 'spatial', 'channel', 'channel'), grad.shape.types)
def test_divergence_centered(self):
v = field.CenteredGrid(math.ones(x=3, y=3), Box[0:1, 0:1], math.extrapolation.ZERO) * (1, 0) # flow to the right
div = field.divergence(v).values
math.assert_close(div.y[0], (1.5, 0, -1.5))
+
+ def test_trace_function(self):
+ def f(x: StaggeredGrid, y: CenteredGrid):
+ return x + (y >> x)
+
+ ft = field.trace_function(f)
+ domain = Domain(x=4, y=3)
+ x = domain.staggered_grid(1)
+ y = domain.vector_grid(1)
+
+ res_f = f(x, y)
+ res_ft = ft(x, y)
+ self.assertEqual(res_f.shape, res_ft.shape)
+ field.assert_close(res_f, res_ft) | 15 | 0.75 | 15 | 0 |
07661839be5c791061cf166ae0d04b066544cf05 | _sass/_notif.scss | _sass/_notif.scss | /* Notification banner and Fallback for certain css features -- See script.js */
#notifBanner {
display: none;
background: #FF5F5F;
padding: 10px;
height: 45px;
margin: 0px auto;
z-index: 9999;
color: $white;
font-weight: bold;
text-align: center;
position: fixed;
top: 0;
right: 0;
left: 0;
.close-button {
float: right;
margin-right: 69px;
margin-top: 0px;
cursor: pointer;
font-weight: normal;
}
}
.notif_vis {
display: block !important;
}
.default-body .adjustSiteContent {
padding-bottom: 316px !important;
}
.secondary-body .adjustSiteContent {
margin-top: 95px !important;
}
body.category-page .adjustSiteContent {
padding-bottom: 238px !important;
}
.adjustedMargin {
margin-top: 45px !important;
}
.adjustedTop {
top: 45px !important;
}
| /* Notification banner and Fallback for certain css features -- See script.js */
#notifBanner {
display: none;
background: #FF5F5F;
padding: 10px;
height: 45px;
margin: 0px auto;
z-index: 9999;
color: $white;
font-weight: bold;
text-align: center;
position: fixed;
top: 0;
right: 0;
left: 0;
.close-button {
float: right;
margin-right: 69px;
margin-top: 0px;
cursor: pointer;
font-weight: normal;
}
}
.notif_vis {
display: block !important;
}
.default-body .adjustSiteContent {
padding-bottom: 316px !important;
}
.secondary-body {
.adjustSiteContent {
margin-top: 135px !important;
}
&.single-post .adjustSiteContent {
margin-top: 95px !important;
}
}
body.category-page .adjustSiteContent {
padding-bottom: 238px !important;
}
.adjustedMargin {
margin-top: 45px !important;
}
.adjustedTop {
top: 45px !important;
}
| Fix bug with secondary-site-content adjustedSiteContent | Fix bug with secondary-site-content adjustedSiteContent
| SCSS | mit | webtwic/webtwic.github.io,webtwic/webtwic.github.io | scss | ## Code Before:
/* Notification banner and Fallback for certain css features -- See script.js */
#notifBanner {
display: none;
background: #FF5F5F;
padding: 10px;
height: 45px;
margin: 0px auto;
z-index: 9999;
color: $white;
font-weight: bold;
text-align: center;
position: fixed;
top: 0;
right: 0;
left: 0;
.close-button {
float: right;
margin-right: 69px;
margin-top: 0px;
cursor: pointer;
font-weight: normal;
}
}
.notif_vis {
display: block !important;
}
.default-body .adjustSiteContent {
padding-bottom: 316px !important;
}
.secondary-body .adjustSiteContent {
margin-top: 95px !important;
}
body.category-page .adjustSiteContent {
padding-bottom: 238px !important;
}
.adjustedMargin {
margin-top: 45px !important;
}
.adjustedTop {
top: 45px !important;
}
## Instruction:
Fix bug with secondary-site-content adjustedSiteContent
## Code After:
/* Notification banner and Fallback for certain css features -- See script.js */
#notifBanner {
display: none;
background: #FF5F5F;
padding: 10px;
height: 45px;
margin: 0px auto;
z-index: 9999;
color: $white;
font-weight: bold;
text-align: center;
position: fixed;
top: 0;
right: 0;
left: 0;
.close-button {
float: right;
margin-right: 69px;
margin-top: 0px;
cursor: pointer;
font-weight: normal;
}
}
.notif_vis {
display: block !important;
}
.default-body .adjustSiteContent {
padding-bottom: 316px !important;
}
.secondary-body {
.adjustSiteContent {
margin-top: 135px !important;
}
&.single-post .adjustSiteContent {
margin-top: 95px !important;
}
}
body.category-page .adjustSiteContent {
padding-bottom: 238px !important;
}
.adjustedMargin {
margin-top: 45px !important;
}
.adjustedTop {
top: 45px !important;
}
| /* Notification banner and Fallback for certain css features -- See script.js */
#notifBanner {
display: none;
background: #FF5F5F;
padding: 10px;
height: 45px;
margin: 0px auto;
z-index: 9999;
color: $white;
font-weight: bold;
text-align: center;
position: fixed;
top: 0;
right: 0;
left: 0;
.close-button {
float: right;
margin-right: 69px;
margin-top: 0px;
cursor: pointer;
font-weight: normal;
}
}
.notif_vis {
display: block !important;
}
.default-body .adjustSiteContent {
padding-bottom: 316px !important;
}
- .secondary-body .adjustSiteContent {
+ .secondary-body {
+ .adjustSiteContent {
+ margin-top: 135px !important;
+ }
+ &.single-post .adjustSiteContent {
- margin-top: 95px !important;
+ margin-top: 95px !important;
? +
+ }
}
body.category-page .adjustSiteContent {
padding-bottom: 238px !important;
}
.adjustedMargin {
margin-top: 45px !important;
}
.adjustedTop {
top: 45px !important;
} | 9 | 0.1875 | 7 | 2 |
9c2db5c1e4cfa1a547c30c8ab826da3cf477851f | app/templates/auth/accept-updated-terms.html | app/templates/auth/accept-updated-terms.html | {% extends "_base_page.html" %}
{% block page_title %}Terms of Use Updated – Digital Marketplace{% endblock %}
{% block phase_banner %}{% endblock %}
{% block body_classes %} registration-pages form-page {% endblock %}
{% block main_content %}
{%
with
smaller = true,
heading = "Our Terms of Use have been updated"
%}
{% include "toolkit/page-heading.html" %}
{% endwith %}
<p>
To continue using the Marketplace please accept the latest <a target="_blank" class="expand-terms" href="{{ url_for('main.terms_of_use') }}">version</a>.
</p>
{% include "_form_errors.html" %}
<form action="{{ url_for('main.accept_updated_terms') }}" method="POST">
{{ form.csrf_token }}
<div>
{% if form.accept_terms.errors %}
{{ form.accept_terms(class="invalid") }}
{% else %}
{{ form.accept_terms }}
{% endif %}
{{ form.accept_terms.label }}
</div>
<div>
{%
with
type = "save",
role = "button",
label = "Accept"
%}
{% include "toolkit/button.html" %}
{% endwith %}
</div>
</form>
{% endblock %}
| {% extends "_base_page.html" %}
{% block page_title %}Terms of Use Updated – Digital Marketplace{% endblock %}
{% block phase_banner %}{% endblock %}
{% block body_classes %} registration-pages form-page {% endblock %}
{% block main_content %}
{%
with
smaller = true,
heading = "Our Terms of Use have been updated"
%}
{% include "toolkit/page-heading.html" %}
{% endwith %}
<p>
To continue using the Marketplace please accept the latest <a href="{{ url_for('main.terms_of_use') }}" target="_blank">version</a>.
</p>
{% include "_form_errors.html" %}
<form action="{{ url_for('main.accept_updated_terms') }}" method="POST">
{{ form.csrf_token }}
<div>
{% if form.accept_terms.errors %}
{{ form.accept_terms(class="invalid") }}
{% else %}
{{ form.accept_terms }}
{% endif %}
{{ form.accept_terms.label }}
</div>
<div>
{%
with
type = "save",
role = "button",
label = "Accept"
%}
{% include "toolkit/button.html" %}
{% endwith %}
</div>
</form>
{% endblock %}
| Move target attrib and remove class | Move target attrib and remove class
| HTML | mit | AusDTO/dto-digitalmarketplace-buyer-frontend,AusDTO/dto-digitalmarketplace-buyer-frontend,AusDTO/dto-digitalmarketplace-buyer-frontend,AusDTO/dto-digitalmarketplace-buyer-frontend | html | ## Code Before:
{% extends "_base_page.html" %}
{% block page_title %}Terms of Use Updated – Digital Marketplace{% endblock %}
{% block phase_banner %}{% endblock %}
{% block body_classes %} registration-pages form-page {% endblock %}
{% block main_content %}
{%
with
smaller = true,
heading = "Our Terms of Use have been updated"
%}
{% include "toolkit/page-heading.html" %}
{% endwith %}
<p>
To continue using the Marketplace please accept the latest <a target="_blank" class="expand-terms" href="{{ url_for('main.terms_of_use') }}">version</a>.
</p>
{% include "_form_errors.html" %}
<form action="{{ url_for('main.accept_updated_terms') }}" method="POST">
{{ form.csrf_token }}
<div>
{% if form.accept_terms.errors %}
{{ form.accept_terms(class="invalid") }}
{% else %}
{{ form.accept_terms }}
{% endif %}
{{ form.accept_terms.label }}
</div>
<div>
{%
with
type = "save",
role = "button",
label = "Accept"
%}
{% include "toolkit/button.html" %}
{% endwith %}
</div>
</form>
{% endblock %}
## Instruction:
Move target attrib and remove class
## Code After:
{% extends "_base_page.html" %}
{% block page_title %}Terms of Use Updated – Digital Marketplace{% endblock %}
{% block phase_banner %}{% endblock %}
{% block body_classes %} registration-pages form-page {% endblock %}
{% block main_content %}
{%
with
smaller = true,
heading = "Our Terms of Use have been updated"
%}
{% include "toolkit/page-heading.html" %}
{% endwith %}
<p>
To continue using the Marketplace please accept the latest <a href="{{ url_for('main.terms_of_use') }}" target="_blank">version</a>.
</p>
{% include "_form_errors.html" %}
<form action="{{ url_for('main.accept_updated_terms') }}" method="POST">
{{ form.csrf_token }}
<div>
{% if form.accept_terms.errors %}
{{ form.accept_terms(class="invalid") }}
{% else %}
{{ form.accept_terms }}
{% endif %}
{{ form.accept_terms.label }}
</div>
<div>
{%
with
type = "save",
role = "button",
label = "Accept"
%}
{% include "toolkit/button.html" %}
{% endwith %}
</div>
</form>
{% endblock %}
| {% extends "_base_page.html" %}
{% block page_title %}Terms of Use Updated – Digital Marketplace{% endblock %}
{% block phase_banner %}{% endblock %}
{% block body_classes %} registration-pages form-page {% endblock %}
{% block main_content %}
{%
with
smaller = true,
heading = "Our Terms of Use have been updated"
%}
{% include "toolkit/page-heading.html" %}
{% endwith %}
<p>
- To continue using the Marketplace please accept the latest <a target="_blank" class="expand-terms" href="{{ url_for('main.terms_of_use') }}">version</a>.
? -------------------------------------
+ To continue using the Marketplace please accept the latest <a href="{{ url_for('main.terms_of_use') }}" target="_blank">version</a>.
? ++++++++++++++++
</p>
{% include "_form_errors.html" %}
<form action="{{ url_for('main.accept_updated_terms') }}" method="POST">
{{ form.csrf_token }}
<div>
{% if form.accept_terms.errors %}
{{ form.accept_terms(class="invalid") }}
{% else %}
{{ form.accept_terms }}
{% endif %}
{{ form.accept_terms.label }}
</div>
<div>
{%
with
type = "save",
role = "button",
label = "Accept"
%}
{% include "toolkit/button.html" %}
{% endwith %}
</div>
</form>
{% endblock %} | 2 | 0.041667 | 1 | 1 |
afb788a68bf08840747c85eed189885bc1951d10 | src/sequoia-item-actions.html | src/sequoia-item-actions.html | <span data-ng-if="isSelected(node)">
<a class="sequoia-button sequoia-button-danger" href="" title="Deselect" data-ng-click="deselect(node)" data-ng-bind-html="buttons.deselect"></a>
</span>
<span data-ng-if="!isSelected(node)">
<a class="sequoia-button sequoia-button-primary" href="" title="Select" data-ng-click="select(node)" data-ng-bind-html="buttons.select"></a>
</span>
<span data-ng-if="node[tree.template.nodes] && node[tree.template.nodes].length">
<a class="sequoia-button sequoia-button-info" href="" title="Go to subitems" data-ng-click="load(node)" data-ng-bind-html="buttons.goToSubitems"></a>
</span>
| <span data-ng-if="allowSelect && isSelected(node)">
<a class="sequoia-button sequoia-button-danger" href="" title="Deselect" data-ng-click="deselect(node)" data-ng-bind-html="buttons.deselect"></a>
</span>
<span data-ng-if="allowSelect && !isSelected(node)">
<a class="sequoia-button sequoia-button-primary" href="" title="Select" data-ng-click="select(node)" data-ng-bind-html="buttons.select"></a>
</span>
<span data-ng-if="node[tree.template.nodes] && node[tree.template.nodes].length">
<a class="sequoia-button sequoia-button-info" href="" title="Go to subitems" data-ng-click="load(node)" data-ng-bind-html="buttons.goToSubitems"></a>
</span>
| Fix allowSelect when no model is provided | Fix allowSelect when no model is provided
| HTML | mit | tricinel/angular-sequoia,tricinel/angular-sequoia,tricinel/angular-sequoia | html | ## Code Before:
<span data-ng-if="isSelected(node)">
<a class="sequoia-button sequoia-button-danger" href="" title="Deselect" data-ng-click="deselect(node)" data-ng-bind-html="buttons.deselect"></a>
</span>
<span data-ng-if="!isSelected(node)">
<a class="sequoia-button sequoia-button-primary" href="" title="Select" data-ng-click="select(node)" data-ng-bind-html="buttons.select"></a>
</span>
<span data-ng-if="node[tree.template.nodes] && node[tree.template.nodes].length">
<a class="sequoia-button sequoia-button-info" href="" title="Go to subitems" data-ng-click="load(node)" data-ng-bind-html="buttons.goToSubitems"></a>
</span>
## Instruction:
Fix allowSelect when no model is provided
## Code After:
<span data-ng-if="allowSelect && isSelected(node)">
<a class="sequoia-button sequoia-button-danger" href="" title="Deselect" data-ng-click="deselect(node)" data-ng-bind-html="buttons.deselect"></a>
</span>
<span data-ng-if="allowSelect && !isSelected(node)">
<a class="sequoia-button sequoia-button-primary" href="" title="Select" data-ng-click="select(node)" data-ng-bind-html="buttons.select"></a>
</span>
<span data-ng-if="node[tree.template.nodes] && node[tree.template.nodes].length">
<a class="sequoia-button sequoia-button-info" href="" title="Go to subitems" data-ng-click="load(node)" data-ng-bind-html="buttons.goToSubitems"></a>
</span>
| - <span data-ng-if="isSelected(node)">
+ <span data-ng-if="allowSelect && isSelected(node)">
? +++++++++++++++
<a class="sequoia-button sequoia-button-danger" href="" title="Deselect" data-ng-click="deselect(node)" data-ng-bind-html="buttons.deselect"></a>
</span>
- <span data-ng-if="!isSelected(node)">
+ <span data-ng-if="allowSelect && !isSelected(node)">
? +++++++++++++++
<a class="sequoia-button sequoia-button-primary" href="" title="Select" data-ng-click="select(node)" data-ng-bind-html="buttons.select"></a>
</span>
<span data-ng-if="node[tree.template.nodes] && node[tree.template.nodes].length">
<a class="sequoia-button sequoia-button-info" href="" title="Go to subitems" data-ng-click="load(node)" data-ng-bind-html="buttons.goToSubitems"></a>
</span> | 4 | 0.363636 | 2 | 2 |
49850872cf2c3362760eb9c0dd9e3e35c9d0bade | static_content/_sass/components/_stat.scss | static_content/_sass/components/_stat.scss | /* Statistics */
.stat {
border-left: 1px dashed $grapefruit;
&:first-of-type {
border-left: 0;
}
a {
display: block;
padding-right: $spacing-05;
text-decoration: none;
}
}
.stat__number {
display: block;
font-weight: 600;
@include font-serif;
@include typescale-02;
}
.stat__description {
display: block;
text-transform: uppercase;
color: $dark-grey;
line-height: 1.35;
@include typescale-06;
}
| /* Statistics */
.stat {
border-left: 1px dashed $grapefruit;
&:first-of-type {
border-left: 0;
}
a {
display: block;
text-decoration: none;
}
}
.stat__number {
display: block;
font-weight: 600;
@include font-serif;
@include typescale-02;
}
.stat__description {
display: block;
text-transform: uppercase;
color: $dark-grey;
line-height: 1.35;
@include typescale-06;
}
| Remove padding to display full numbers | Remove padding to display full numbers
See DOAJ/doajPM#2481
| SCSS | apache-2.0 | DOAJ/doaj,DOAJ/doaj,DOAJ/doaj,DOAJ/doaj | scss | ## Code Before:
/* Statistics */
.stat {
border-left: 1px dashed $grapefruit;
&:first-of-type {
border-left: 0;
}
a {
display: block;
padding-right: $spacing-05;
text-decoration: none;
}
}
.stat__number {
display: block;
font-weight: 600;
@include font-serif;
@include typescale-02;
}
.stat__description {
display: block;
text-transform: uppercase;
color: $dark-grey;
line-height: 1.35;
@include typescale-06;
}
## Instruction:
Remove padding to display full numbers
See DOAJ/doajPM#2481
## Code After:
/* Statistics */
.stat {
border-left: 1px dashed $grapefruit;
&:first-of-type {
border-left: 0;
}
a {
display: block;
text-decoration: none;
}
}
.stat__number {
display: block;
font-weight: 600;
@include font-serif;
@include typescale-02;
}
.stat__description {
display: block;
text-transform: uppercase;
color: $dark-grey;
line-height: 1.35;
@include typescale-06;
}
| /* Statistics */
.stat {
border-left: 1px dashed $grapefruit;
&:first-of-type {
border-left: 0;
}
a {
display: block;
- padding-right: $spacing-05;
text-decoration: none;
}
}
.stat__number {
display: block;
font-weight: 600;
@include font-serif;
@include typescale-02;
}
.stat__description {
display: block;
text-transform: uppercase;
color: $dark-grey;
line-height: 1.35;
@include typescale-06;
} | 1 | 0.033333 | 0 | 1 |
eda19e3dfd23e8302da5c7e684fa3c5701380d42 | etc/core-site.xml | etc/core-site.xml |
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://__HOSTNAME__:9000</value>
</property>
<property>
<name>dfs.client.read.shortcircuit</name>
<value>true</value>
</property>
</configuration>
| <configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://127.0.0.1:9000</value>
</property>
<property>
<name>dfs.client.read.shortcircuit</name>
<value>true</value>
</property>
</configuration>
| Change hadoop command to use localhost as standalone config by default | Change hadoop command to use localhost as standalone config by default
| XML | apache-2.0 | mikefaille/ubuntu-impala | xml | ## Code Before:
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://__HOSTNAME__:9000</value>
</property>
<property>
<name>dfs.client.read.shortcircuit</name>
<value>true</value>
</property>
</configuration>
## Instruction:
Change hadoop command to use localhost as standalone config by default
## Code After:
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://127.0.0.1:9000</value>
</property>
<property>
<name>dfs.client.read.shortcircuit</name>
<value>true</value>
</property>
</configuration>
| -
<configuration>
<property>
<name>fs.default.name</name>
- <value>hdfs://__HOSTNAME__:9000</value>
? ^^^^^^^^^^^^
+ <value>hdfs://127.0.0.1:9000</value>
? ^^^^^^^^^
</property>
<property>
<name>dfs.client.read.shortcircuit</name>
<value>true</value>
- </property>
? -------
+ </property>
</configuration> | 5 | 0.454545 | 2 | 3 |
3b0c89a045b82779858b1758b4cc04a7e3e1b435 | lib/snowreports.rb | lib/snowreports.rb | require_relative "snowreports/version"
require_relative "snowreports/builder"
require_relative "snowreports/endpoint"
require_relative "snowreports/fetcher"
module Snowreports
def self.fetch(ski_field)
endpoint = Endpoint.all.fetch(ski_field)
response = Fetcher.fetch(path: endpoint.path)
Builder.build(xml: response.body, field_id: endpoint.id)
end
def self.all
Endpoint.all.first(2).each_with_object([]) do |(_field_name, endpoint), acc|
response = Fetcher.fetch(path: endpoint.path)
acc << Builder.build(xml: response.body, field_id: endpoint.id)
end
end
end
| require_relative "snowreports/version"
require_relative "snowreports/builder"
require_relative "snowreports/endpoint"
require_relative "snowreports/fetcher"
module Snowreports
def self.fetch(ski_field)
endpoint = Endpoint.all.fetch(ski_field)
response = Fetcher.fetch(path: endpoint.path)
Builder.build(xml: response.body, field_id: endpoint.id)
end
def self.all
Endpoint.all.each_with_object([]) do |(_field_name, endpoint), acc|
response = Fetcher.fetch(path: endpoint.path)
acc << Builder.build(xml: response.body, field_id: endpoint.id)
acc
end
end
end
| Return all the reports properly. | Return all the reports properly.
| Ruby | mit | snowpool/snowreports,snowpool/snowreports | ruby | ## Code Before:
require_relative "snowreports/version"
require_relative "snowreports/builder"
require_relative "snowreports/endpoint"
require_relative "snowreports/fetcher"
module Snowreports
def self.fetch(ski_field)
endpoint = Endpoint.all.fetch(ski_field)
response = Fetcher.fetch(path: endpoint.path)
Builder.build(xml: response.body, field_id: endpoint.id)
end
def self.all
Endpoint.all.first(2).each_with_object([]) do |(_field_name, endpoint), acc|
response = Fetcher.fetch(path: endpoint.path)
acc << Builder.build(xml: response.body, field_id: endpoint.id)
end
end
end
## Instruction:
Return all the reports properly.
## Code After:
require_relative "snowreports/version"
require_relative "snowreports/builder"
require_relative "snowreports/endpoint"
require_relative "snowreports/fetcher"
module Snowreports
def self.fetch(ski_field)
endpoint = Endpoint.all.fetch(ski_field)
response = Fetcher.fetch(path: endpoint.path)
Builder.build(xml: response.body, field_id: endpoint.id)
end
def self.all
Endpoint.all.each_with_object([]) do |(_field_name, endpoint), acc|
response = Fetcher.fetch(path: endpoint.path)
acc << Builder.build(xml: response.body, field_id: endpoint.id)
acc
end
end
end
| require_relative "snowreports/version"
require_relative "snowreports/builder"
require_relative "snowreports/endpoint"
require_relative "snowreports/fetcher"
module Snowreports
def self.fetch(ski_field)
endpoint = Endpoint.all.fetch(ski_field)
response = Fetcher.fetch(path: endpoint.path)
Builder.build(xml: response.body, field_id: endpoint.id)
end
def self.all
- Endpoint.all.first(2).each_with_object([]) do |(_field_name, endpoint), acc|
? ---------
+ Endpoint.all.each_with_object([]) do |(_field_name, endpoint), acc|
response = Fetcher.fetch(path: endpoint.path)
acc << Builder.build(xml: response.body, field_id: endpoint.id)
+ acc
end
end
end | 3 | 0.157895 | 2 | 1 |
8a0c17f39fd63a90b24ed79bd5bde4d52622e41d | irc/message.py | irc/message.py |
class Tag(object):
"""
An IRC message tag ircv3.net/specs/core/message-tags-3.2.html
"""
@staticmethod
def parse(item):
key, sep, value = item.partition('=')
value = value.replace('\\:', ';')
value = value.replace('\\s', ' ')
value = value.replace('\\n', '\n')
value = value.replace('\\r', '\r')
value = value.replace('\\\\', '\\')
value = value or None
return {
'key': key,
'value': value,
}
| from __future__ import print_function
class Tag(object):
"""
An IRC message tag ircv3.net/specs/core/message-tags-3.2.html
"""
@staticmethod
def parse(item):
r"""
>>> Tag.parse('x') == {'key': 'x', 'value': None}
True
>>> Tag.parse('x=yes') == {'key': 'x', 'value': 'yes'}
True
>>> Tag.parse('x=3')['value']
'3'
>>> Tag.parse('x=red fox\\:green eggs')['value']
'red fox;green eggs'
>>> Tag.parse('x=red fox:green eggs')['value']
'red fox:green eggs'
>>> print(Tag.parse('x=a\\nb\\nc')['value'])
a
b
c
"""
key, sep, value = item.partition('=')
value = value.replace('\\:', ';')
value = value.replace('\\s', ' ')
value = value.replace('\\n', '\n')
value = value.replace('\\r', '\r')
value = value.replace('\\\\', '\\')
value = value or None
return {
'key': key,
'value': value,
}
| Add tests for tag parsing | Add tests for tag parsing
| Python | mit | jaraco/irc | python | ## Code Before:
class Tag(object):
"""
An IRC message tag ircv3.net/specs/core/message-tags-3.2.html
"""
@staticmethod
def parse(item):
key, sep, value = item.partition('=')
value = value.replace('\\:', ';')
value = value.replace('\\s', ' ')
value = value.replace('\\n', '\n')
value = value.replace('\\r', '\r')
value = value.replace('\\\\', '\\')
value = value or None
return {
'key': key,
'value': value,
}
## Instruction:
Add tests for tag parsing
## Code After:
from __future__ import print_function
class Tag(object):
"""
An IRC message tag ircv3.net/specs/core/message-tags-3.2.html
"""
@staticmethod
def parse(item):
r"""
>>> Tag.parse('x') == {'key': 'x', 'value': None}
True
>>> Tag.parse('x=yes') == {'key': 'x', 'value': 'yes'}
True
>>> Tag.parse('x=3')['value']
'3'
>>> Tag.parse('x=red fox\\:green eggs')['value']
'red fox;green eggs'
>>> Tag.parse('x=red fox:green eggs')['value']
'red fox:green eggs'
>>> print(Tag.parse('x=a\\nb\\nc')['value'])
a
b
c
"""
key, sep, value = item.partition('=')
value = value.replace('\\:', ';')
value = value.replace('\\s', ' ')
value = value.replace('\\n', '\n')
value = value.replace('\\r', '\r')
value = value.replace('\\\\', '\\')
value = value or None
return {
'key': key,
'value': value,
}
| + from __future__ import print_function
class Tag(object):
"""
An IRC message tag ircv3.net/specs/core/message-tags-3.2.html
"""
@staticmethod
def parse(item):
+ r"""
+ >>> Tag.parse('x') == {'key': 'x', 'value': None}
+ True
+
+ >>> Tag.parse('x=yes') == {'key': 'x', 'value': 'yes'}
+ True
+
+ >>> Tag.parse('x=3')['value']
+ '3'
+
+ >>> Tag.parse('x=red fox\\:green eggs')['value']
+ 'red fox;green eggs'
+
+ >>> Tag.parse('x=red fox:green eggs')['value']
+ 'red fox:green eggs'
+
+ >>> print(Tag.parse('x=a\\nb\\nc')['value'])
+ a
+ b
+ c
+ """
key, sep, value = item.partition('=')
value = value.replace('\\:', ';')
value = value.replace('\\s', ' ')
value = value.replace('\\n', '\n')
value = value.replace('\\r', '\r')
value = value.replace('\\\\', '\\')
value = value or None
return {
'key': key,
'value': value,
} | 22 | 1.222222 | 22 | 0 |
519c857ad11241f07287cd3ce70d7bbd33347bec | project.qbs | project.qbs | import qbs
Project {
qbsSearchPaths: ['project/qbs']
references: [
"emulator/emulator.qbs",
"libtee/libtee.qbs",
"tests/tests.qbs",
"CAs/ClientApplications.qbs",
"TAs/TrustedApplications.qbs",
"libtee_pkcs11/libtee_pkcs11.qbs"
]
}
| import qbs
import qbs.File
Project {
qbsSearchPaths: ['project/qbs']
references: {
var binaries = [
"emulator/emulator.qbs",
"libtee/libtee.qbs",
"tests/tests.qbs",
"CAs/ClientApplications.qbs",
"TAs/TrustedApplications.qbs",
"libtee_pkcs11/libtee_pkcs11.qbs"
]
if (File.exists(sourceDirectory + "/libomnishare/libomnishare.qbs")) {
binaries.push("libomnishare/libomnishare.qbs")
}
return binaries
}
}
| Add conditional include of omnishare | Add conditional include of omnishare
Change-Id: I4370239784271de2b5113b40684cad2ddb6e08a8
Signed-off-by: TanelDettenborn <tanel.dettenborn@intel.com>
Conflicts:
project.qbs
| QML | apache-2.0 | Open-TEE/project,Open-TEE/project,Open-TEE/project | qml | ## Code Before:
import qbs
Project {
qbsSearchPaths: ['project/qbs']
references: [
"emulator/emulator.qbs",
"libtee/libtee.qbs",
"tests/tests.qbs",
"CAs/ClientApplications.qbs",
"TAs/TrustedApplications.qbs",
"libtee_pkcs11/libtee_pkcs11.qbs"
]
}
## Instruction:
Add conditional include of omnishare
Change-Id: I4370239784271de2b5113b40684cad2ddb6e08a8
Signed-off-by: TanelDettenborn <tanel.dettenborn@intel.com>
Conflicts:
project.qbs
## Code After:
import qbs
import qbs.File
Project {
qbsSearchPaths: ['project/qbs']
references: {
var binaries = [
"emulator/emulator.qbs",
"libtee/libtee.qbs",
"tests/tests.qbs",
"CAs/ClientApplications.qbs",
"TAs/TrustedApplications.qbs",
"libtee_pkcs11/libtee_pkcs11.qbs"
]
if (File.exists(sourceDirectory + "/libomnishare/libomnishare.qbs")) {
binaries.push("libomnishare/libomnishare.qbs")
}
return binaries
}
}
| import qbs
+ import qbs.File
+
Project {
- qbsSearchPaths: ['project/qbs']
? ^
+ qbsSearchPaths: ['project/qbs']
? ^^^^
- references: [
? ^ ^
+ references: {
? ^^^ ^
+ var binaries = [
- "emulator/emulator.qbs",
? ^
+ "emulator/emulator.qbs",
? ^^^^^^^^^^^^^^^^
- "libtee/libtee.qbs",
? ^
+ "libtee/libtee.qbs",
? ^^^^^^^^^^^^^^^
- "tests/tests.qbs",
+ "tests/tests.qbs",
? ++++++++
- "CAs/ClientApplications.qbs",
+ "CAs/ClientApplications.qbs",
? ++++++++
- "TAs/TrustedApplications.qbs",
+ "TAs/TrustedApplications.qbs",
? ++++++++
- "libtee_pkcs11/libtee_pkcs11.qbs"
? ^
+ "libtee_pkcs11/libtee_pkcs11.qbs"
? ^^^^^^^^^^^^^^^^
- ]
+ ]
+
+ if (File.exists(sourceDirectory + "/libomnishare/libomnishare.qbs")) {
+ binaries.push("libomnishare/libomnishare.qbs")
+ }
+ return binaries
+ }
} | 27 | 1.928571 | 18 | 9 |
29c8ba20773018354828e36b94f150b994854a46 | src/Entity/Fixture/RolePermission.php | src/Entity/Fixture/RolePermission.php | <?php
namespace App\Entity\Fixture;
use Doctrine\Common\DataFixtures\AbstractFixture;
use Doctrine\Common\DataFixtures\DependentFixtureInterface;
use Doctrine\Common\Persistence\ObjectManager;
use App\Entity;
class RolePermission extends AbstractFixture implements DependentFixtureInterface
{
public function load(ObjectManager $em)
{
$permissions = [
'admin_role' => [
'administer all',
],
'demo_role' => [
'administer api keys',
'administer stations',
'view administration',
],
];
foreach($permissions as $role_reference => $perm_names) {
/** @var Entity\Role $role */
$role = $this->getReference($role_reference);
foreach($perm_names as $perm_name) {
$rp = new Entity\RolePermission($role);
$rp->setActionName($perm_name);
$em->persist($rp);
}
}
$em->flush();
}
public function getDependencies()
{
return [
Role::class
];
}
}
| <?php
namespace App\Entity\Fixture;
use App\Acl;
use Doctrine\Common\DataFixtures\AbstractFixture;
use Doctrine\Common\DataFixtures\DependentFixtureInterface;
use Doctrine\Common\Persistence\ObjectManager;
use App\Entity;
class RolePermission extends AbstractFixture implements DependentFixtureInterface
{
public function load(ObjectManager $em)
{
/** @var Entity\Station $station */
$station = $this->getReference('station');
$permissions = [
'admin_role' => [
[Acl::GLOBAL_ALL, null],
],
'demo_role' => [
[Acl::STATION_ALL, $station],
[Acl::STATION_VIEW, $station],
],
];
foreach($permissions as $role_reference => $perm_names) {
/** @var Entity\Role $role */
$role = $this->getReference($role_reference);
foreach($perm_names as $perm_name) {
$rp = new Entity\RolePermission($role, $perm_name[1], $perm_name[0]);
$em->persist($rp);
}
}
$em->flush();
}
public function getDependencies()
{
return [
Role::class
];
}
}
| Disable station administrator role for demo account. | Disable station administrator role for demo account.
| PHP | agpl-3.0 | SlvrEagle23/AzuraCast,SlvrEagle23/AzuraCast,AzuraCast/AzuraCast,SlvrEagle23/AzuraCast,AzuraCast/AzuraCast,SlvrEagle23/AzuraCast,AzuraCast/AzuraCast,AzuraCast/AzuraCast,SlvrEagle23/AzuraCast | php | ## Code Before:
<?php
namespace App\Entity\Fixture;
use Doctrine\Common\DataFixtures\AbstractFixture;
use Doctrine\Common\DataFixtures\DependentFixtureInterface;
use Doctrine\Common\Persistence\ObjectManager;
use App\Entity;
class RolePermission extends AbstractFixture implements DependentFixtureInterface
{
public function load(ObjectManager $em)
{
$permissions = [
'admin_role' => [
'administer all',
],
'demo_role' => [
'administer api keys',
'administer stations',
'view administration',
],
];
foreach($permissions as $role_reference => $perm_names) {
/** @var Entity\Role $role */
$role = $this->getReference($role_reference);
foreach($perm_names as $perm_name) {
$rp = new Entity\RolePermission($role);
$rp->setActionName($perm_name);
$em->persist($rp);
}
}
$em->flush();
}
public function getDependencies()
{
return [
Role::class
];
}
}
## Instruction:
Disable station administrator role for demo account.
## Code After:
<?php
namespace App\Entity\Fixture;
use App\Acl;
use Doctrine\Common\DataFixtures\AbstractFixture;
use Doctrine\Common\DataFixtures\DependentFixtureInterface;
use Doctrine\Common\Persistence\ObjectManager;
use App\Entity;
class RolePermission extends AbstractFixture implements DependentFixtureInterface
{
public function load(ObjectManager $em)
{
/** @var Entity\Station $station */
$station = $this->getReference('station');
$permissions = [
'admin_role' => [
[Acl::GLOBAL_ALL, null],
],
'demo_role' => [
[Acl::STATION_ALL, $station],
[Acl::STATION_VIEW, $station],
],
];
foreach($permissions as $role_reference => $perm_names) {
/** @var Entity\Role $role */
$role = $this->getReference($role_reference);
foreach($perm_names as $perm_name) {
$rp = new Entity\RolePermission($role, $perm_name[1], $perm_name[0]);
$em->persist($rp);
}
}
$em->flush();
}
public function getDependencies()
{
return [
Role::class
];
}
}
| <?php
namespace App\Entity\Fixture;
+ use App\Acl;
use Doctrine\Common\DataFixtures\AbstractFixture;
use Doctrine\Common\DataFixtures\DependentFixtureInterface;
use Doctrine\Common\Persistence\ObjectManager;
use App\Entity;
class RolePermission extends AbstractFixture implements DependentFixtureInterface
{
public function load(ObjectManager $em)
{
+ /** @var Entity\Station $station */
+ $station = $this->getReference('station');
+
$permissions = [
'admin_role' => [
- 'administer all',
+ [Acl::GLOBAL_ALL, null],
],
'demo_role' => [
+ [Acl::STATION_ALL, $station],
+ [Acl::STATION_VIEW, $station],
- 'administer api keys',
- 'administer stations',
- 'view administration',
],
];
foreach($permissions as $role_reference => $perm_names) {
/** @var Entity\Role $role */
$role = $this->getReference($role_reference);
foreach($perm_names as $perm_name) {
- $rp = new Entity\RolePermission($role);
+ $rp = new Entity\RolePermission($role, $perm_name[1], $perm_name[0]);
? ++++++++++++++++++++++++++++++
- $rp->setActionName($perm_name);
$em->persist($rp);
}
}
$em->flush();
}
public function getDependencies()
{
return [
Role::class
];
}
} | 14 | 0.318182 | 8 | 6 |
0ebb2fdf651ca3b142772fe07dcb332ac0f2ea6f | app/templates/email_templates/email_response_denial.html | app/templates/email_templates/email_response_denial.html | {% if default_content %}
<span class="mceNonEditable">
The {{ agency_name }} has <strong>denied</strong> your FOIL request <a
href='{{ page }}'>{{ request.id }}</a>
for the following reason{% if reasons|length > 1 %}s{% endif %}:
</span>
{{ reasons | safe }}
<span class="mceNonEditable">
Please visit <a href='{{ page }}'>{{ request.id }}</a> to view additional information and take any necessary
action.
</span>
<span class="mceNonEditable">
You may appeal the decision to deny access to material that was redacted in part or withheld in entirety
by contacting the agency's FOIL Appeals Officer: <a
href="mailto:{{ agency_appeals_email }}">{{ agency_appeals_email }}</a>
within 30 days.
</span>
{% else %}
{{ content | safe }}
<p>
<strong>Request Information:</strong><br/>
Request Title: {{ request.title }}
</p>
<p>
Request Description: {{ request.description }}
</p>
{% include "email_templates/email_footer.html" %}
{% endif %} | {% if default_content %}
<span class="mceNonEditable">
The {{ agency_name }} has <strong>denied</strong> your FOIL request <a href='{{ page }}'>{{ request.id }}</a>
for the following reason{% if reasons|length > 1 %}s{% endif %}:
</span>
{{ reasons | safe }}
<span class="mceNonEditable">
Please visit <a href='{{ page }}'>{{ request.id }}</a> to view additional information and take any necessary
action. You may appeal the decision to deny access to material that was redacted in part or withheld in entirety
by contacting the agency's FOIL Appeals Officer: <a
href="mailto:{{ agency_appeals_email }}">{{ agency_appeals_email }}</a>
within 30 days.
</span>
{% else %}
{{ content | safe }}
<p>
<strong>Request Information:</strong><br/>
Request Title: {{ request.title }}
</p>
<p>
Request Description: {{ request.description }}
</p>
{% include "email_templates/email_footer.html" %}
{% endif %}
| Remove additional span and reformat HTML tags | Remove additional span and reformat HTML tags | HTML | apache-2.0 | CityOfNewYork/NYCOpenRecords,CityOfNewYork/NYCOpenRecords,CityOfNewYork/NYCOpenRecords,CityOfNewYork/NYCOpenRecords,CityOfNewYork/NYCOpenRecords | html | ## Code Before:
{% if default_content %}
<span class="mceNonEditable">
The {{ agency_name }} has <strong>denied</strong> your FOIL request <a
href='{{ page }}'>{{ request.id }}</a>
for the following reason{% if reasons|length > 1 %}s{% endif %}:
</span>
{{ reasons | safe }}
<span class="mceNonEditable">
Please visit <a href='{{ page }}'>{{ request.id }}</a> to view additional information and take any necessary
action.
</span>
<span class="mceNonEditable">
You may appeal the decision to deny access to material that was redacted in part or withheld in entirety
by contacting the agency's FOIL Appeals Officer: <a
href="mailto:{{ agency_appeals_email }}">{{ agency_appeals_email }}</a>
within 30 days.
</span>
{% else %}
{{ content | safe }}
<p>
<strong>Request Information:</strong><br/>
Request Title: {{ request.title }}
</p>
<p>
Request Description: {{ request.description }}
</p>
{% include "email_templates/email_footer.html" %}
{% endif %}
## Instruction:
Remove additional span and reformat HTML tags
## Code After:
{% if default_content %}
<span class="mceNonEditable">
The {{ agency_name }} has <strong>denied</strong> your FOIL request <a href='{{ page }}'>{{ request.id }}</a>
for the following reason{% if reasons|length > 1 %}s{% endif %}:
</span>
{{ reasons | safe }}
<span class="mceNonEditable">
Please visit <a href='{{ page }}'>{{ request.id }}</a> to view additional information and take any necessary
action. You may appeal the decision to deny access to material that was redacted in part or withheld in entirety
by contacting the agency's FOIL Appeals Officer: <a
href="mailto:{{ agency_appeals_email }}">{{ agency_appeals_email }}</a>
within 30 days.
</span>
{% else %}
{{ content | safe }}
<p>
<strong>Request Information:</strong><br/>
Request Title: {{ request.title }}
</p>
<p>
Request Description: {{ request.description }}
</p>
{% include "email_templates/email_footer.html" %}
{% endif %}
| {% if default_content %}
<span class="mceNonEditable">
- The {{ agency_name }} has <strong>denied</strong> your FOIL request <a
+ The {{ agency_name }} has <strong>denied</strong> your FOIL request <a href='{{ page }}'>{{ request.id }}</a>
? +++++++++++++++++++++++++++++++++++++++
- href='{{ page }}'>{{ request.id }}</a>
for the following reason{% if reasons|length > 1 %}s{% endif %}:
</span>
{{ reasons | safe }}
<span class="mceNonEditable">
Please visit <a href='{{ page }}'>{{ request.id }}</a> to view additional information and take any necessary
- action.
- </span>
- <span class="mceNonEditable">
- You may appeal the decision to deny access to material that was redacted in part or withheld in entirety
+ action. You may appeal the decision to deny access to material that was redacted in part or withheld in entirety
? ++++++++
by contacting the agency's FOIL Appeals Officer: <a
href="mailto:{{ agency_appeals_email }}">{{ agency_appeals_email }}</a>
within 30 days.
</span>
{% else %}
{{ content | safe }}
<p>
<strong>Request Information:</strong><br/>
Request Title: {{ request.title }}
</p>
<p>
Request Description: {{ request.description }}
</p>
{% include "email_templates/email_footer.html" %}
{% endif %} | 8 | 0.285714 | 2 | 6 |
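The diff column closing each record uses Python's `difflib.ndiff` format: removed lines prefixed `- `, added lines `+ `, unchanged context `  `, and `? ` guide lines marking intraline changes. A minimal sketch reproducing that format, using a one-line pair like the alarmclock record's `latest` bump:

```python
import difflib

# One-line "files" mirroring the alarmclock record's version bump.
before = ["latest: '0.2.0.8'\n"]
after = ["latest: '0.2.0.9'\n"]

diff = list(difflib.ndiff(before, after))
# Yields a "- " line, a "? " guide with a caret under the changed digit,
# a "+ " line, and a second "? " guide -- the same shape seen in the
# diff columns above.
print("".join(diff))
```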
f80b080f62b450531007f58849019fd18c75c25f | stacker/blueprints/rds/postgres.py | stacker/blueprints/rds/postgres.py | from stacker.blueprints.rds import base
class PostgresMixin(object):
def engine(self):
return "postgres"
def get_engine_versions(self):
return ['9.3.1', '9.3.2', '9.3.3', '9.3.5', '9.3.6', '9.4.1']
def get_db_families(self):
return ["postgres9.3", "postgres9.4"]
class MasterInstance(PostgresMixin, base.MasterInstance):
pass
class ReadReplica(PostgresMixin, base.ReadReplica):
pass
| from stacker.blueprints.rds import base
class PostgresMixin(object):
def engine(self):
return "postgres"
def get_engine_versions(self):
return ['9.3.1', '9.3.2', '9.3.3', '9.3.5', '9.3.6', '9.3.9',
'9.3.10', '9.4.1', '9.4.4', '9.4.5']
def get_db_families(self):
return ["postgres9.3", "postgres9.4"]
class MasterInstance(PostgresMixin, base.MasterInstance):
pass
class ReadReplica(PostgresMixin, base.ReadReplica):
pass
| Add new versions of Postgres | Add new versions of Postgres
| Python | bsd-2-clause | mhahn/stacker,remind101/stacker,mhahn/stacker,remind101/stacker | python | ## Code Before:
from stacker.blueprints.rds import base
class PostgresMixin(object):
def engine(self):
return "postgres"
def get_engine_versions(self):
return ['9.3.1', '9.3.2', '9.3.3', '9.3.5', '9.3.6', '9.4.1']
def get_db_families(self):
return ["postgres9.3", "postgres9.4"]
class MasterInstance(PostgresMixin, base.MasterInstance):
pass
class ReadReplica(PostgresMixin, base.ReadReplica):
pass
## Instruction:
Add new versions of Postgres
## Code After:
from stacker.blueprints.rds import base
class PostgresMixin(object):
def engine(self):
return "postgres"
def get_engine_versions(self):
return ['9.3.1', '9.3.2', '9.3.3', '9.3.5', '9.3.6', '9.3.9',
'9.3.10', '9.4.1', '9.4.4', '9.4.5']
def get_db_families(self):
return ["postgres9.3", "postgres9.4"]
class MasterInstance(PostgresMixin, base.MasterInstance):
pass
class ReadReplica(PostgresMixin, base.ReadReplica):
pass
| from stacker.blueprints.rds import base
class PostgresMixin(object):
def engine(self):
return "postgres"
def get_engine_versions(self):
- return ['9.3.1', '9.3.2', '9.3.3', '9.3.5', '9.3.6', '9.4.1']
? ^ ^ ^
+ return ['9.3.1', '9.3.2', '9.3.3', '9.3.5', '9.3.6', '9.3.9',
? ^ ^ ^
+ '9.3.10', '9.4.1', '9.4.4', '9.4.5']
def get_db_families(self):
return ["postgres9.3", "postgres9.4"]
class MasterInstance(PostgresMixin, base.MasterInstance):
pass
class ReadReplica(PostgresMixin, base.ReadReplica):
pass | 3 | 0.15 | 2 | 1 |
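The four trailing numeric columns on each record (per the schema header: diff_length, relative_diff_length, n_lines_added, n_lines_deleted) look consistent with counting `+ ` and `- ` lines in the ndiff; for the stacker record above, `| 3 | 0.15 | 2 | 1 |` would be 2 added, 1 deleted, length 3. A sketch assuming that derivation (relative_diff_length would then be diff_length divided by the old file's line count):

```python
import difflib

def diff_stats(old_text, new_text):
    """Recompute a record's trailing columns from its two code versions:
    returns (n_lines_added, n_lines_deleted, diff_length)."""
    delta = difflib.ndiff(old_text.splitlines(keepends=True),
                          new_text.splitlines(keepends=True))
    added = deleted = 0
    for line in delta:
        if line.startswith('+ '):
            added += 1
        elif line.startswith('- '):
            deleted += 1
    return added, deleted, added + deleted

print(diff_stats("a\nb\n", "a\nc\nd\n"))  # → (2, 1, 3)
```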
c743b0ba03174d07553b5be3edfe7f49ae8a0e29 | packages/al/alarmclock.yaml | packages/al/alarmclock.yaml | homepage: https://bitbucket.org/davecturner/alarmclock
changelog-type: ''
hash: 63dc10e71593e2797138e79fd238a4ba9375622be77bd6f35f02d4587fd171a6
test-bench-deps:
base: -any
time: -any
alarmclock: -any
maintainer: dave.c.turner@gmail.com
synopsis: Wake up and perform an action at a certain time.
changelog: ''
basic-deps:
stm: -any
base: ! '>=4.7 && <4.9'
time: -any
unbounded-delays: -any
all-versions:
- '0.1.0.0'
- '0.1.0.1'
- '0.1.0.2'
- '0.2.0.1'
- '0.2.0.2'
- '0.2.0.3'
- '0.2.0.4'
- '0.2.0.5'
- '0.2.0.6'
- '0.2.0.7'
- '0.2.0.8'
author: David Turner
latest: '0.2.0.8'
description-type: haddock
description: Wake up and perform an action at a certain time.
license-name: BSD3
| homepage: https://bitbucket.org/davecturner/alarmclock
changelog-type: ''
hash: bde3f96aaa41e5f0cd96e1de2d5e5cc5ddbce59ed94367ddf0144b0d7386c7a9
test-bench-deps: {}
maintainer: dave.c.turner@gmail.com
synopsis: Wake up and perform an action at a certain time.
changelog: ''
basic-deps:
stm: -any
base: ! '>=4.7 && <4.9'
time: -any
unbounded-delays: -any
alarmclock: -any
all-versions:
- '0.1.0.1'
- '0.1.0.2'
- '0.2.0.1'
- '0.2.0.2'
- '0.2.0.3'
- '0.2.0.4'
- '0.2.0.5'
- '0.2.0.6'
- '0.2.0.7'
- '0.2.0.8'
- '0.2.0.9'
author: David Turner
latest: '0.2.0.9'
description-type: haddock
description: Wake up and perform an action at a certain time.
license-name: BSD3
| Update from Hackage at 2016-03-31T11:40:27+0000 | Update from Hackage at 2016-03-31T11:40:27+0000
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: https://bitbucket.org/davecturner/alarmclock
changelog-type: ''
hash: 63dc10e71593e2797138e79fd238a4ba9375622be77bd6f35f02d4587fd171a6
test-bench-deps:
base: -any
time: -any
alarmclock: -any
maintainer: dave.c.turner@gmail.com
synopsis: Wake up and perform an action at a certain time.
changelog: ''
basic-deps:
stm: -any
base: ! '>=4.7 && <4.9'
time: -any
unbounded-delays: -any
all-versions:
- '0.1.0.0'
- '0.1.0.1'
- '0.1.0.2'
- '0.2.0.1'
- '0.2.0.2'
- '0.2.0.3'
- '0.2.0.4'
- '0.2.0.5'
- '0.2.0.6'
- '0.2.0.7'
- '0.2.0.8'
author: David Turner
latest: '0.2.0.8'
description-type: haddock
description: Wake up and perform an action at a certain time.
license-name: BSD3
## Instruction:
Update from Hackage at 2016-03-31T11:40:27+0000
## Code After:
homepage: https://bitbucket.org/davecturner/alarmclock
changelog-type: ''
hash: bde3f96aaa41e5f0cd96e1de2d5e5cc5ddbce59ed94367ddf0144b0d7386c7a9
test-bench-deps: {}
maintainer: dave.c.turner@gmail.com
synopsis: Wake up and perform an action at a certain time.
changelog: ''
basic-deps:
stm: -any
base: ! '>=4.7 && <4.9'
time: -any
unbounded-delays: -any
alarmclock: -any
all-versions:
- '0.1.0.1'
- '0.1.0.2'
- '0.2.0.1'
- '0.2.0.2'
- '0.2.0.3'
- '0.2.0.4'
- '0.2.0.5'
- '0.2.0.6'
- '0.2.0.7'
- '0.2.0.8'
- '0.2.0.9'
author: David Turner
latest: '0.2.0.9'
description-type: haddock
description: Wake up and perform an action at a certain time.
license-name: BSD3
| homepage: https://bitbucket.org/davecturner/alarmclock
changelog-type: ''
- hash: 63dc10e71593e2797138e79fd238a4ba9375622be77bd6f35f02d4587fd171a6
+ hash: bde3f96aaa41e5f0cd96e1de2d5e5cc5ddbce59ed94367ddf0144b0d7386c7a9
- test-bench-deps:
+ test-bench-deps: {}
? +++
- base: -any
- time: -any
- alarmclock: -any
maintainer: dave.c.turner@gmail.com
synopsis: Wake up and perform an action at a certain time.
changelog: ''
basic-deps:
stm: -any
base: ! '>=4.7 && <4.9'
time: -any
unbounded-delays: -any
+ alarmclock: -any
all-versions:
- - '0.1.0.0'
- '0.1.0.1'
- '0.1.0.2'
- '0.2.0.1'
- '0.2.0.2'
- '0.2.0.3'
- '0.2.0.4'
- '0.2.0.5'
- '0.2.0.6'
- '0.2.0.7'
- '0.2.0.8'
+ - '0.2.0.9'
author: David Turner
- latest: '0.2.0.8'
? ^
+ latest: '0.2.0.9'
? ^
description-type: haddock
description: Wake up and perform an action at a certain time.
license-name: BSD3 | 12 | 0.375 | 5 | 7 |
516a81459a87e7d582a4a33e1daeb12908bf6abb | .travis.yml | .travis.yml | language: python
# Which versions of Python to test
python:
- "2.7"
# Have the virtualenv use system-wide site packages
virtualenv:
system_site_packages: true
# command to install dependencies
before_install:
- sudo apt-get update -qq
- sudo apt-get install -qq git libopencv-dev python-opencv python-numpy python-matplotlib sqlite3
# Install PlantCV
install:
- python setup.py install
# command to run tests
script: py.test -v tests/tests.py
# Which branches to run build tests on
branches:
only:
- dev
| language: python
# Which versions of Python to test
python:
- "2.7"
# Have the virtualenv use system-wide site packages
virtualenv:
system_site_packages: true
# command to install dependencies
before_install:
- sudo apt-get update -qq
- sudo apt-get install -qq git libopencv-dev python-opencv python-numpy python-matplotlib sqlite3
- sudo pip install --upgrade numpy
# Install PlantCV
install:
- python setup.py install
# command to run tests
script: py.test -v tests/tests.py
# Which branches to run build tests on
branches:
only:
- dev
 | Update numpy in Travis-CI VM | Update numpy in Travis-CI VM
| YAML | mit | danforthcenter/plantcv,stiphyMT/plantcv,stiphyMT/plantcv,stiphyMT/plantcv,jshoyer/plantcv,AntonSax/plantcv,danforthcenter/plantcv,danforthcenter/plantcv,jshoyer/plantcv,AntonSax/plantcv | yaml | ## Code Before:
language: python
# Which versions of Python to test
python:
- "2.7"
# Have the virtualenv use system-wide site packages
virtualenv:
system_site_packages: true
# command to install dependencies
before_install:
- sudo apt-get update -qq
- sudo apt-get install -qq git libopencv-dev python-opencv python-numpy python-matplotlib sqlite3
# Install PlantCV
install:
- python setup.py install
# command to run tests
script: py.test -v tests/tests.py
# Which branches to run build tests on
branches:
only:
- dev
## Instruction:
Update numpy in Travis-CI VM
## Code After:
language: python
# Which versions of Python to test
python:
- "2.7"
# Have the virtualenv use system-wide site packages
virtualenv:
system_site_packages: true
# command to install dependencies
before_install:
- sudo apt-get update -qq
- sudo apt-get install -qq git libopencv-dev python-opencv python-numpy python-matplotlib sqlite3
- sudo pip install --upgrade numpy
# Install PlantCV
install:
- python setup.py install
# command to run tests
script: py.test -v tests/tests.py
# Which branches to run build tests on
branches:
only:
- dev
| language: python
# Which versions of Python to test
python:
- "2.7"
# Have the virtualenv use system-wide site packages
virtualenv:
system_site_packages: true
# command to install dependencies
before_install:
- sudo apt-get update -qq
- sudo apt-get install -qq git libopencv-dev python-opencv python-numpy python-matplotlib sqlite3
+ - sudo pip install --upgrade numpy
# Install PlantCV
install:
- python setup.py install
# command to run tests
script: py.test -v tests/tests.py
# Which branches to run build tests on
branches:
only:
- dev | 1 | 0.05 | 1 | 0 |
27c2a6966ca6e57fe51c0a0d4b6d604707c0af6f | scripts/aws/setupdb.sh | scripts/aws/setupdb.sh |
set -e
set -x
# Export settings required to run psql non-interactively
export PGHOST=$(cat /etc/mmw.d/env/MMW_DB_HOST)
export PGDATABASE=$(cat /etc/mmw.d/env/MMW_DB_NAME)
export PGUSER=$(cat /etc/mmw.d/env/MMW_DB_USER)
export PGPASSWORD=$(cat /etc/mmw.d/env/MMW_DB_PASSWORD)
# Ensure that the PostGIS extension exists
psql -c "CREATE EXTENSION IF NOT EXISTS postgis;"
# Run migrations
envdir /etc/mmw.d/env /opt/app/manage.py migrate
|
set -e
set -x
usage="$(basename "$0") [-h] [-d] \n
--Sets up a postgresql database for MMW \n
\n
where: \n
-h show this help text\n
-d load/reload base data\n
"
load_data=false
while getopts ":hd" opt; do
case $opt in
h)
echo -e $usage
exit
;;
d)
load_data=true
;;
\?)
echo "invalid option: -$OPTARG"
exit
;;
esac
done
# Export settings required to run psql non-interactively
export PGHOST=$(cat /etc/mmw.d/env/MMW_DB_HOST)
export PGDATABASE=$(cat /etc/mmw.d/env/MMW_DB_NAME)
export PGUSER=$(cat /etc/mmw.d/env/MMW_DB_USER)
export PGPASSWORD=$(cat /etc/mmw.d/env/MMW_DB_PASSWORD)
# Ensure that the PostGIS extension exists
psql -c "CREATE EXTENSION IF NOT EXISTS postgis;"
# Run migrations
envdir /etc/mmw.d/env /opt/app/manage.py migrate
if [ "$load_data" = "true" ] ; then
# Fetch boundary layer sql files
FILE_HOST="https://s3.amazonaws.com/data.mmw.azavea.com"
FILES=("boundary_huc12.sql.gz" "boundary_huc10.sql.gz" "boundary_huc08.sql.gz")
for f in "${FILES[@]}"; do
curl -s $FILE_HOST/$f | gunzip -q | psql --single-transaction
done
fi
| Update script to load HUC data to database | Update script to load HUC data to database
This script is intended to be run from an `app` vm in either development
or production.
* gzipped SQL files are hosted on external source (S3)
* `setupdb.sh` can be run idempotently, it will download/create/insert
HUC boundary data safely and repeatedly.
Original shp data can be found on the Azavea project fileshare for
Stroud/downloads. They were transformed to pgsql via `shp2pgsql` using
the following pattern:
```bash
shp2pgsql -W "LATIN1" /path/to/shape.shp \
    public.boundary_{layer} | psql -h localhost -d mmw -U mmw
```
The schema was modified via `psql` to remove unnecessary attributes, and
in the case of HUC12, simplified the polygon with a .00008 threshold
```sql
update boundary_huc12 set geom =
ST_Multi(ST_SimplifyPreserveTopology(geom_detailed, 0.00008));
```
I then exported the tables via `pg_dump`:
```bash
pg_dump -c --if-exists -O -t boundary_huc12 -h localhost -U mmw mmw >
boundary_huc12.sql
```
The resulting files are gzipped for distribution.
| Shell | apache-2.0 | lliss/model-my-watershed,project-icp/bee-pollinator-app,WikiWatershed/model-my-watershed,project-icp/bee-pollinator-app,kdeloach/model-my-watershed,lliss/model-my-watershed,WikiWatershed/model-my-watershed,kdeloach/model-my-watershed,kdeloach/model-my-watershed,project-icp/bee-pollinator-app,lliss/model-my-watershed,WikiWatershed/model-my-watershed,kdeloach/model-my-watershed,project-icp/bee-pollinator-app,kdeloach/model-my-watershed,WikiWatershed/model-my-watershed,WikiWatershed/model-my-watershed,lliss/model-my-watershed,lliss/model-my-watershed | shell | ## Code Before:
set -e
set -x
# Export settings required to run psql non-interactively
export PGHOST=$(cat /etc/mmw.d/env/MMW_DB_HOST)
export PGDATABASE=$(cat /etc/mmw.d/env/MMW_DB_NAME)
export PGUSER=$(cat /etc/mmw.d/env/MMW_DB_USER)
export PGPASSWORD=$(cat /etc/mmw.d/env/MMW_DB_PASSWORD)
# Ensure that the PostGIS extension exists
psql -c "CREATE EXTENSION IF NOT EXISTS postgis;"
# Run migrations
envdir /etc/mmw.d/env /opt/app/manage.py migrate
## Instruction:
Update script to load HUC data to database
This script is intended to be run from an `app` vm in either development
or production.
* gzipped SQL files are hosted on external source (S3)
* `setupdb.sh` can be run idempotently, it will download/create/insert
HUC boundary data safely and repeatedly.
Original shp data can be found on the Azavea project fileshare for
Stroud/downloads. They were transformed to pgsql via `shp2pgsql` using
the following pattern:
```bash
shp2pgsql -W "LATIN1" /path/to/shape.shp \
    public.boundary_{layer} | psql -h localhost -d mmw -U mmw
```
The schema was modified via `psql` to remove unnecessary attributes, and
in the case of HUC12, simplified the polygon with a .00008 threshold
```sql
update boundary_huc12 set geom =
ST_Multi(ST_SimplifyPreserveTopology(geom_detailed, 0.00008));
```
I then exported the tables via `pg_dump`:
```bash
pg_dump -c --if-exists -O -t boundary_huc12 -h localhost -U mmw mmw >
boundary_huc12.sql
```
The resulting files are gzipped for distribution.
## Code After:
set -e
set -x
usage="$(basename "$0") [-h] [-d] \n
--Sets up a postgresql database for MMW \n
\n
where: \n
-h show this help text\n
-d load/reload base data\n
"
load_data=false
while getopts ":hd" opt; do
case $opt in
h)
echo -e $usage
exit
;;
d)
load_data=true
;;
\?)
echo "invalid option: -$OPTARG"
exit
;;
esac
done
# Export settings required to run psql non-interactively
export PGHOST=$(cat /etc/mmw.d/env/MMW_DB_HOST)
export PGDATABASE=$(cat /etc/mmw.d/env/MMW_DB_NAME)
export PGUSER=$(cat /etc/mmw.d/env/MMW_DB_USER)
export PGPASSWORD=$(cat /etc/mmw.d/env/MMW_DB_PASSWORD)
# Ensure that the PostGIS extension exists
psql -c "CREATE EXTENSION IF NOT EXISTS postgis;"
# Run migrations
envdir /etc/mmw.d/env /opt/app/manage.py migrate
if [ "$load_data" = "true" ] ; then
# Fetch boundary layer sql files
FILE_HOST="https://s3.amazonaws.com/data.mmw.azavea.com"
FILES=("boundary_huc12.sql.gz" "boundary_huc10.sql.gz" "boundary_huc08.sql.gz")
for f in "${FILES[@]}"; do
curl -s $FILE_HOST/$f | gunzip -q | psql --single-transaction
done
fi
|
set -e
set -x
+ usage="$(basename "$0") [-h] [-d] \n
+ --Sets up a postgresql database for MMW \n
+ \n
+ where: \n
+ -h show this help text\n
+ -d load/reload base data\n
+ "
+
+ load_data=false
+
+ while getopts ":hd" opt; do
+ case $opt in
+ h)
+ echo -e $usage
+ exit
+ ;;
+ d)
+ load_data=true
+ ;;
+ \?)
+ echo "invalid option: -$OPTARG"
+ exit
+ ;;
+ esac
+ done
# Export settings required to run psql non-interactively
export PGHOST=$(cat /etc/mmw.d/env/MMW_DB_HOST)
export PGDATABASE=$(cat /etc/mmw.d/env/MMW_DB_NAME)
export PGUSER=$(cat /etc/mmw.d/env/MMW_DB_USER)
export PGPASSWORD=$(cat /etc/mmw.d/env/MMW_DB_PASSWORD)
# Ensure that the PostGIS extension exists
psql -c "CREATE EXTENSION IF NOT EXISTS postgis;"
# Run migrations
envdir /etc/mmw.d/env /opt/app/manage.py migrate
+
+ if [ "$load_data" = "true" ] ; then
+ # Fetch boundary layer sql files
+ FILE_HOST="https://s3.amazonaws.com/data.mmw.azavea.com"
+ FILES=("boundary_huc12.sql.gz" "boundary_huc10.sql.gz" "boundary_huc08.sql.gz")
+
+ for f in "${FILES[@]}"; do
+ curl -s $FILE_HOST/$f | gunzip -q | psql --single-transaction
+ done
+ fi | 35 | 2.333333 | 35 | 0 |
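Because each record stores Code Before, Code After, and the ndiff between them, either side is recoverable from the diff column alone; `difflib.restore` does exactly this. A minimal sketch with a hypothetical one-line pair:

```python
import difflib

# Hypothetical one-line "files"; any before/after pair behaves the same way.
before = ["load_data=false\n"]
after = ["load_data=true\n"]

delta = list(difflib.ndiff(before, after))
# restore(delta, 1) recovers the "before" side, restore(delta, 2) the
# "after" side, so a record's two code columns are derivable from its diff.
assert list(difflib.restore(delta, 1)) == before
assert list(difflib.restore(delta, 2)) == after
```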
cd5764812337d2435b9a207422ffeb1e216f0504 | .travis.yml | .travis.yml | sudo: false
language: node_js
node_js:
- "0.10"
env:
- TEST_DIR=/
- TEST_DIR=packages/dispatchr
- TEST_DIR=packages/fluxible
- TEST_DIR=packages/fluxible-plugin-fetchr
- TEST_DIR=packages/fluxible-reducer-store
- TEST_DIR=packages/fluxible-addons-react
- TEST_DIR=packages/fluxible-router
- TEST_DIR=packages/generator-fluxible
- TEST_DIR=site
script: cd $TEST_DIR && npm test
after_success:
- "npm run cover"
- "cat artifacts/lcov.info | ./node_modules/coveralls/bin/coveralls.js"
| sudo: false
language: node_js
node_js:
- "0.10"
env:
- TEST_DIR=packages/dispatchr
- TEST_DIR=packages/fluxible
- TEST_DIR=packages/fluxible-plugin-fetchr
- TEST_DIR=packages/fluxible-reducer-store
- TEST_DIR=packages/fluxible-addons-react
- TEST_DIR=packages/fluxible-router
- TEST_DIR=packages/generator-fluxible
- TEST_DIR=site
script: cd $TEST_DIR && npm test
after_success:
- "npm run cover"
- "cat artifacts/lcov.info | ./node_modules/coveralls/bin/coveralls.js"
| Remove root from test_dir dimensions | Remove root from test_dir dimensions
| YAML | bsd-3-clause | pablolmiranda/fluxible | yaml | ## Code Before:
sudo: false
language: node_js
node_js:
- "0.10"
env:
- TEST_DIR=/
- TEST_DIR=packages/dispatchr
- TEST_DIR=packages/fluxible
- TEST_DIR=packages/fluxible-plugin-fetchr
- TEST_DIR=packages/fluxible-reducer-store
- TEST_DIR=packages/fluxible-addons-react
- TEST_DIR=packages/fluxible-router
- TEST_DIR=packages/generator-fluxible
- TEST_DIR=site
script: cd $TEST_DIR && npm test
after_success:
- "npm run cover"
- "cat artifacts/lcov.info | ./node_modules/coveralls/bin/coveralls.js"
## Instruction:
Remove root from test_dir dimensions
## Code After:
sudo: false
language: node_js
node_js:
- "0.10"
env:
- TEST_DIR=packages/dispatchr
- TEST_DIR=packages/fluxible
- TEST_DIR=packages/fluxible-plugin-fetchr
- TEST_DIR=packages/fluxible-reducer-store
- TEST_DIR=packages/fluxible-addons-react
- TEST_DIR=packages/fluxible-router
- TEST_DIR=packages/generator-fluxible
- TEST_DIR=site
script: cd $TEST_DIR && npm test
after_success:
- "npm run cover"
- "cat artifacts/lcov.info | ./node_modules/coveralls/bin/coveralls.js"
| sudo: false
language: node_js
node_js:
- "0.10"
env:
- - TEST_DIR=/
- TEST_DIR=packages/dispatchr
- TEST_DIR=packages/fluxible
- TEST_DIR=packages/fluxible-plugin-fetchr
- TEST_DIR=packages/fluxible-reducer-store
- TEST_DIR=packages/fluxible-addons-react
- TEST_DIR=packages/fluxible-router
- TEST_DIR=packages/generator-fluxible
- TEST_DIR=site
script: cd $TEST_DIR && npm test
after_success:
- "npm run cover"
- "cat artifacts/lcov.info | ./node_modules/coveralls/bin/coveralls.js" | 1 | 0.055556 | 0 | 1 |
588f3f3b91a70ae44752043f9fd8d3d39280fa0b | README.md | README.md | Dot Files
=========
[](https://drone.albertyw.com/albertyw/dotfiles)
[](https://codeclimate.com/github/albertyw/dotfiles)
This repository includes my personal Unix config files that are synchronized
across my various computers.
It is based off of [hrs's dotfiles](https://github.com/hrs/dotfiles).
Contents
--------
There are a few particular files/directories to note in here:
- [bashrc](https://github.com/albertyw/dotfiles/blob/master/files/bashrc)
- [gitconfig](https://github.com/albertyw/dotfiles/blob/master/files/gitconfig)
- [git browse](https://github.com/albertyw/git-browse)
- [git reviewers](https://github.com/albertyw/git-reviewers)
- [vim configs](https://github.com/albertyw/dotfiles/tree/master/files/vim/)
Setup
-----
```shell
git clone git@github.com:albertyw/dotfiles.git .dotfiles
cd .dotfiles
scripts/link.sh
scripts/install.sh
```
| Dot Files
=========
[](https://drone.albertyw.com/albertyw/dotfiles)
This repository includes my personal Unix config files that are synchronized
across my various computers.
It is based off of [hrs's dotfiles](https://github.com/hrs/dotfiles).
Contents
--------
There are a few particular files/directories to note in here:
- [bashrc](https://github.com/albertyw/dotfiles/blob/master/files/bashrc)
- [gitconfig](https://github.com/albertyw/dotfiles/blob/master/files/gitconfig)
- [git browse](https://github.com/albertyw/git-browse)
- [git reviewers](https://github.com/albertyw/git-reviewers)
- [vim configs](https://github.com/albertyw/dotfiles/tree/master/files/vim/)
Setup
-----
```shell
git clone git@github.com:albertyw/dotfiles.git .dotfiles
cd .dotfiles
scripts/link.sh
scripts/install.sh
```
| Delete broken codeclimate badge from readme | Delete broken codeclimate badge from readme
| Markdown | mit | albertyw/dotfiles,albertyw/dotfiles | markdown | ## Code Before:
Dot Files
=========
[](https://drone.albertyw.com/albertyw/dotfiles)
[](https://codeclimate.com/github/albertyw/dotfiles)
This repository includes my personal Unix config files that are synchronized
across my various computers.
It is based off of [hrs's dotfiles](https://github.com/hrs/dotfiles).
Contents
--------
There are a few particular files/directories to note in here:
- [bashrc](https://github.com/albertyw/dotfiles/blob/master/files/bashrc)
- [gitconfig](https://github.com/albertyw/dotfiles/blob/master/files/gitconfig)
- [git browse](https://github.com/albertyw/git-browse)
- [git reviewers](https://github.com/albertyw/git-reviewers)
- [vim configs](https://github.com/albertyw/dotfiles/tree/master/files/vim/)
Setup
-----
```shell
git clone git@github.com:albertyw/dotfiles.git .dotfiles
cd .dotfiles
scripts/link.sh
scripts/install.sh
```
## Instruction:
Delete broken codeclimate badge from readme
## Code After:
Dot Files
=========
[](https://drone.albertyw.com/albertyw/dotfiles)
This repository includes my personal Unix config files that are synchronized
across my various computers.
It is based off of [hrs's dotfiles](https://github.com/hrs/dotfiles).
Contents
--------
There are a few particular files/directories to note in here:
- [bashrc](https://github.com/albertyw/dotfiles/blob/master/files/bashrc)
- [gitconfig](https://github.com/albertyw/dotfiles/blob/master/files/gitconfig)
- [git browse](https://github.com/albertyw/git-browse)
- [git reviewers](https://github.com/albertyw/git-reviewers)
- [vim configs](https://github.com/albertyw/dotfiles/tree/master/files/vim/)
Setup
-----
```shell
git clone git@github.com:albertyw/dotfiles.git .dotfiles
cd .dotfiles
scripts/link.sh
scripts/install.sh
```
| Dot Files
=========
[](https://drone.albertyw.com/albertyw/dotfiles)
- [](https://codeclimate.com/github/albertyw/dotfiles)
This repository includes my personal Unix config files that are synchronized
across my various computers.
It is based off of [hrs's dotfiles](https://github.com/hrs/dotfiles).
Contents
--------
There are a few particular files/directories to note in here:
- [bashrc](https://github.com/albertyw/dotfiles/blob/master/files/bashrc)
- [gitconfig](https://github.com/albertyw/dotfiles/blob/master/files/gitconfig)
- [git browse](https://github.com/albertyw/git-browse)
- [git reviewers](https://github.com/albertyw/git-reviewers)
- [vim configs](https://github.com/albertyw/dotfiles/tree/master/files/vim/)
Setup
-----
```shell
git clone git@github.com:albertyw/dotfiles.git .dotfiles
cd .dotfiles
scripts/link.sh
scripts/install.sh
``` | 1 | 0.032258 | 0 | 1 |
ed5827566626e3e1e2fd9ccbec83829bbfeba929 | lib/tugboat/middleware/snapshot_droplet.rb | lib/tugboat/middleware/snapshot_droplet.rb | module Tugboat
module Middleware
class SnapshotDroplet < Base
def call(env)
ocean = env["ocean"]
say "Queuing snapshot '#{env["user_snapshot_name"]}' for #{env["droplet_id"]} #{env["droplet_name"]}...", nil, false
# Temporary
req = ocean.droplets.snapshot env["droplet_id"],
:name => env["user_snapshot_name"]
if req.status == "ERROR"
say "#{req.status}: #{req.error_message}", :red
return
end
say "done", :green
@app.call(env)
end
end
end
end
| module Tugboat
module Middleware
class SnapshotDroplet < Base
def call(env)
ocean = env["ocean"]
# Right now, the digital ocean API doesn't return an error
# when your droplet is not powered off and you try to snapshot.
# This is a temporary measure to let the user know.
say "Warning: Droplet must be in a powered off state for snapshot to be successful", :yellow
say "Queuing snapshot '#{env["user_snapshot_name"]}' for #{env["droplet_id"]} #{env["droplet_name"]}...", nil, false
req = ocean.droplets.snapshot env["droplet_id"],
:name => env["user_snapshot_name"]
if req.status == "ERROR"
say "#{req.status}: #{req.error_message}", :red
return
end
say "done", :green
@app.call(env)
end
end
end
end
| Add a warning for snapshots in a non-powered off state. | Add a warning for snapshots in a non-powered off state.
This is temporary until DigitalOcean returns an error from the API.
| Ruby | mit | mtbottle/tugboat,beni55/tugboat,haihappen/tugboat,noma4i/tugboat,noma4i/tugboat,conorsch/tugboat,Ferada/tugboat,Ferada/tugboat,conorsch/tugboat,pearkes/tugboat,beni55/tugboat,haihappen/tugboat,petems/tugboat | ruby | ## Code Before:
module Tugboat
module Middleware
class SnapshotDroplet < Base
def call(env)
ocean = env["ocean"]
say "Queuing snapshot '#{env["user_snapshot_name"]}' for #{env["droplet_id"]} #{env["droplet_name"]}...", nil, false
# Temporary
req = ocean.droplets.snapshot env["droplet_id"],
:name => env["user_snapshot_name"]
if req.status == "ERROR"
say "#{req.status}: #{req.error_message}", :red
return
end
say "done", :green
@app.call(env)
end
end
end
end
## Instruction:
Add a warning for snapshots in a non-powered off state.
This is temporary until DigitalOcean returns an error from the API.
## Code After:
module Tugboat
module Middleware
class SnapshotDroplet < Base
def call(env)
ocean = env["ocean"]
# Right now, the digital ocean API doesn't return an error
# when your droplet is not powered off and you try to snapshot.
# This is a temporary measure to let the user know.
say "Warning: Droplet must be in a powered off state for snapshot to be successful", :yellow
say "Queuing snapshot '#{env["user_snapshot_name"]}' for #{env["droplet_id"]} #{env["droplet_name"]}...", nil, false
req = ocean.droplets.snapshot env["droplet_id"],
:name => env["user_snapshot_name"]
if req.status == "ERROR"
say "#{req.status}: #{req.error_message}", :red
return
end
say "done", :green
@app.call(env)
end
end
end
end
| module Tugboat
module Middleware
class SnapshotDroplet < Base
def call(env)
ocean = env["ocean"]
+ # Right now, the digital ocean API doesn't return an error
+ # when your droplet is not powered off and you try to snapshot.
+ # This is a temporary measure to let the user know.
+ say "Warning: Droplet must be in a powered off state for snapshot to be successful", :yellow
say "Queuing snapshot '#{env["user_snapshot_name"]}' for #{env["droplet_id"]} #{env["droplet_name"]}...", nil, false
- # Temporary
req = ocean.droplets.snapshot env["droplet_id"],
:name => env["user_snapshot_name"]
if req.status == "ERROR"
say "#{req.status}: #{req.error_message}", :red
return
end
say "done", :green
@app.call(env)
end
end
end
end
| 5 | 0.2 | 4 | 1 |
657a59583bf1c3b131a08eafb8c1b80150f94e8a | packages/ve/vector-th-unbox.yaml | packages/ve/vector-th-unbox.yaml | homepage: ''
changelog-type: ''
hash: bbbaaa67662ac866911ed63077c38f763c0b4f5e10ad3a73d0b3cac3236dfdae
test-bench-deps:
base: -any
data-default: -any
vector-th-unbox: -any
vector: -any
maintainer: Liyang HU <vector-th-unbox@liyang.hu>
synopsis: Deriver for Data.Vector.Unboxed using Template Haskell
changelog: ''
basic-deps:
base: ! '>=4.5 && <4.13'
template-haskell: ! '>=2.5 && <2.15'
vector: ! '>=0.7'
all-versions:
- 0.2.1.3
- 0.2.1.4
- 0.2.1.5
- 0.2.1.6
author: Liyang HU <vector-th-unbox@liyang.hu>
latest: 0.2.1.6
description-type: haddock
description: |-
A Template Haskell deriver for unboxed vectors, given a pair of coercion
functions to and from some existing type with an Unbox instance.
Refer to "Data.Vector.Unboxed.Deriving" for documentation and examples.
license-name: BSD-3-Clause
| homepage: ''
changelog-type: ''
hash: f7054cf1bd32c042a7980ec22eba4c9e230f88519d5511b7094b35beab9f9352
test-bench-deps:
base: -any
data-default: -any
vector-th-unbox: -any
vector: -any
maintainer: Liyang HU <vector-th-unbox@liyang.hu>
synopsis: Deriver for Data.Vector.Unboxed using Template Haskell
changelog: ''
basic-deps:
base: ! '>=4.5 && <4.14'
template-haskell: ! '>=2.5 && <2.16'
vector: ! '>=0.7.1 && <0.13'
all-versions:
- 0.2.1.3
- 0.2.1.4
- 0.2.1.5
- 0.2.1.6
- 0.2.1.7
author: Liyang HU <vector-th-unbox@liyang.hu>
latest: 0.2.1.7
description-type: haddock
description: |-
A Template Haskell deriver for unboxed vectors, given a pair of coercion
functions to and from some existing type with an Unbox instance.
Refer to "Data.Vector.Unboxed.Deriving" for documentation and examples.
license-name: BSD-3-Clause
| Update from Hackage at 2019-09-24T13:02:10Z | Update from Hackage at 2019-09-24T13:02:10Z
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: ''
changelog-type: ''
hash: bbbaaa67662ac866911ed63077c38f763c0b4f5e10ad3a73d0b3cac3236dfdae
test-bench-deps:
base: -any
data-default: -any
vector-th-unbox: -any
vector: -any
maintainer: Liyang HU <vector-th-unbox@liyang.hu>
synopsis: Deriver for Data.Vector.Unboxed using Template Haskell
changelog: ''
basic-deps:
base: ! '>=4.5 && <4.13'
template-haskell: ! '>=2.5 && <2.15'
vector: ! '>=0.7'
all-versions:
- 0.2.1.3
- 0.2.1.4
- 0.2.1.5
- 0.2.1.6
author: Liyang HU <vector-th-unbox@liyang.hu>
latest: 0.2.1.6
description-type: haddock
description: |-
A Template Haskell deriver for unboxed vectors, given a pair of coercion
functions to and from some existing type with an Unbox instance.
Refer to "Data.Vector.Unboxed.Deriving" for documentation and examples.
license-name: BSD-3-Clause
## Instruction:
Update from Hackage at 2019-09-24T13:02:10Z
## Code After:
homepage: ''
changelog-type: ''
hash: f7054cf1bd32c042a7980ec22eba4c9e230f88519d5511b7094b35beab9f9352
test-bench-deps:
base: -any
data-default: -any
vector-th-unbox: -any
vector: -any
maintainer: Liyang HU <vector-th-unbox@liyang.hu>
synopsis: Deriver for Data.Vector.Unboxed using Template Haskell
changelog: ''
basic-deps:
base: ! '>=4.5 && <4.14'
template-haskell: ! '>=2.5 && <2.16'
vector: ! '>=0.7.1 && <0.13'
all-versions:
- 0.2.1.3
- 0.2.1.4
- 0.2.1.5
- 0.2.1.6
- 0.2.1.7
author: Liyang HU <vector-th-unbox@liyang.hu>
latest: 0.2.1.7
description-type: haddock
description: |-
A Template Haskell deriver for unboxed vectors, given a pair of coercion
functions to and from some existing type with an Unbox instance.
Refer to "Data.Vector.Unboxed.Deriving" for documentation and examples.
license-name: BSD-3-Clause
| homepage: ''
changelog-type: ''
- hash: bbbaaa67662ac866911ed63077c38f763c0b4f5e10ad3a73d0b3cac3236dfdae
+ hash: f7054cf1bd32c042a7980ec22eba4c9e230f88519d5511b7094b35beab9f9352
test-bench-deps:
base: -any
data-default: -any
vector-th-unbox: -any
vector: -any
maintainer: Liyang HU <vector-th-unbox@liyang.hu>
synopsis: Deriver for Data.Vector.Unboxed using Template Haskell
changelog: ''
basic-deps:
- base: ! '>=4.5 && <4.13'
? ^
+ base: ! '>=4.5 && <4.14'
? ^
- template-haskell: ! '>=2.5 && <2.15'
? ^
+ template-haskell: ! '>=2.5 && <2.16'
? ^
- vector: ! '>=0.7'
+ vector: ! '>=0.7.1 && <0.13'
? +++++++++++
all-versions:
- 0.2.1.3
- 0.2.1.4
- 0.2.1.5
- 0.2.1.6
+ - 0.2.1.7
author: Liyang HU <vector-th-unbox@liyang.hu>
- latest: 0.2.1.6
? ^
+ latest: 0.2.1.7
? ^
description-type: haddock
description: |-
A Template Haskell deriver for unboxed vectors, given a pair of coercion
functions to and from some existing type with an Unbox instance.
Refer to "Data.Vector.Unboxed.Deriving" for documentation and examples.
license-name: BSD-3-Clause | 11 | 0.37931 | 6 | 5 |
02dd6eedc63d7a14eed31ea65d41e42fe9395144 | src/components/Admin.vue | src/components/Admin.vue | <template>
<div id="admin">
<game></game>
<div id="admin-page">
<button v-on:click="clearAll">Clear All !</button>
</div>
</div>
</template>
<script>
import Game from './game.vue'
export default {
name: 'admin',
components: {
Game
},
methods: {
clearAll: function () {
var request = require('superagent')
request.post('/api/clearAll', function (result) {
if (result === true) {
window.alert('Cleared!')
} else {
window.alert('Error')
}
})
}
}
}
</script> | <template>
<div id="admin">
<game :players="players"></game>
<div id="admin-page">
<button v-on:click="clearAll">Clear All !</button>
</div>
</div>
</template>
<script>
import Game from './game.vue'
export default {
name: 'admin',
components: {
Game
},
props: ['players'],
methods: {
clearAll: function () {
var request = require('superagent')
request.post('/api/clearAll', function (result) {
if (result === true) {
window.alert('Cleared!')
} else {
window.alert('Error')
}
})
}
}
}
</script> | Fix player vue on admin page | Fix player vue on admin page
| Vue | mit | groupwrite-io/groupwrite.io,groupwrite-io/groupwrite.io | vue | ## Code Before:
<template>
<div id="admin">
<game></game>
<div id="admin-page">
<button v-on:click="clearAll">Clear All !</button>
</div>
</div>
</template>
<script>
import Game from './game.vue'
export default {
name: 'admin',
components: {
Game
},
methods: {
clearAll: function () {
var request = require('superagent')
request.post('/api/clearAll', function (result) {
if (result === true) {
window.alert('Cleared!')
} else {
window.alert('Error')
}
})
}
}
}
</script>
## Instruction:
Fix player vue on admin page
## Code After:
<template>
<div id="admin">
<game :players="players"></game>
<div id="admin-page">
<button v-on:click="clearAll">Clear All !</button>
</div>
</div>
</template>
<script>
import Game from './game.vue'
export default {
name: 'admin',
components: {
Game
},
props: ['players'],
methods: {
clearAll: function () {
var request = require('superagent')
request.post('/api/clearAll', function (result) {
if (result === true) {
window.alert('Cleared!')
} else {
window.alert('Error')
}
})
}
}
}
</script> | <template>
<div id="admin">
- <game></game>
+ <game :players="players"></game>
<div id="admin-page">
<button v-on:click="clearAll">Clear All !</button>
</div>
</div>
</template>
<script>
import Game from './game.vue'
export default {
name: 'admin',
components: {
Game
},
+ props: ['players'],
methods: {
clearAll: function () {
var request = require('superagent')
request.post('/api/clearAll', function (result) {
if (result === true) {
window.alert('Cleared!')
} else {
window.alert('Error')
}
})
}
}
}
</script> | 3 | 0.090909 | 2 | 1 |
dac61da2cba32ee681fbbe17d933f6b16227929d | locales/ko/notes.properties | locales/ko/notes.properties | welcomeTitle2=안녕하세요!
forgetEmail=이 이메일 없애기
# LOCALIZATION NOTE (disableSync): Sync is intended as a generic
# synchronization, not Firefox Sync.
# LOCALIZATION NOTE (syncComplete): {date} is the date of last sync. If this
# structure doesn't work for your locale, you can translate this as "Last sync:
# {date}".
# Tooltips for toolbar buttons
boldTitle=굵게
italicTitle=기울임
# Settings page labels
| welcomeTitle2=안녕하세요!
forgetEmail=이 이메일 없애기
# LOCALIZATION NOTE (disableSync): Sync is intended as a generic
# synchronization, not Firefox Sync.
# LOCALIZATION NOTE (syncComplete): {date} is the date of last sync. If this
# structure doesn't work for your locale, you can translate this as "Last sync:
# {date}".
# Tooltips for toolbar buttons
fontSizeTitle=글자 크기
boldTitle=굵게
italicTitle=기울임
strikethroughTitle=취소선
numberedListTitle=번호 목록
bulletedListTitle=번호없는 목록
textDirectionTitle=텍스트 방향
# Settings page labels
themeLegend=테마
defaultThemeTitle=기본
darkThemeTitle=어두움
| Update Korean (ko) localization of Test Pilot: Notes | Pontoon: Update Korean (ko) localization of Test Pilot: Notes
Localization authors:
- Hyeonseok Shin <hyeonseok@gmail.com>
| INI | mpl-2.0 | cedricium/notes,cedricium/notes,cedricium/notes,cedricium/notes,cedricium/notes | ini | ## Code Before:
welcomeTitle2=안녕하세요!
forgetEmail=이 이메일 없애기
# LOCALIZATION NOTE (disableSync): Sync is intended as a generic
# synchronization, not Firefox Sync.
# LOCALIZATION NOTE (syncComplete): {date} is the date of last sync. If this
# structure doesn't work for your locale, you can translate this as "Last sync:
# {date}".
# Tooltips for toolbar buttons
boldTitle=굵게
italicTitle=기울임
# Settings page labels
## Instruction:
Pontoon: Update Korean (ko) localization of Test Pilot: Notes
Localization authors:
- Hyeonseok Shin <hyeonseok@gmail.com>
## Code After:
welcomeTitle2=안녕하세요!
forgetEmail=이 이메일 없애기
# LOCALIZATION NOTE (disableSync): Sync is intended as a generic
# synchronization, not Firefox Sync.
# LOCALIZATION NOTE (syncComplete): {date} is the date of last sync. If this
# structure doesn't work for your locale, you can translate this as "Last sync:
# {date}".
# Tooltips for toolbar buttons
fontSizeTitle=글자 크기
boldTitle=굵게
italicTitle=기울임
strikethroughTitle=취소선
numberedListTitle=번호 목록
bulletedListTitle=번호없는 목록
textDirectionTitle=텍스트 방향
# Settings page labels
themeLegend=테마
defaultThemeTitle=기본
darkThemeTitle=어두움
| welcomeTitle2=안녕하세요!
forgetEmail=이 이메일 없애기
# LOCALIZATION NOTE (disableSync): Sync is intended as a generic
# synchronization, not Firefox Sync.
# LOCALIZATION NOTE (syncComplete): {date} is the date of last sync. If this
# structure doesn't work for your locale, you can translate this as "Last sync:
# {date}".
# Tooltips for toolbar buttons
+ fontSizeTitle=글자 크기
boldTitle=굵게
italicTitle=기울임
+ strikethroughTitle=취소선
+ numberedListTitle=번호 목록
+ bulletedListTitle=번호없는 목록
+ textDirectionTitle=텍스트 방향
# Settings page labels
+ themeLegend=테마
+ defaultThemeTitle=기본
+ darkThemeTitle=어두움 | 8 | 0.470588 | 8 | 0 |
a0c09f48b2bf70411a06e070760a396f492cf1cd | README.md | README.md |
a starter project using react + mobx + react-router-dom
```bash
yarn
yarn run serve
```
## RoadMap
- 国际化
- 主题选择
- 模板生成
- 异步action规范
|
a starter project using react + mobx + react-router-dom
```bash
yarn
yarn run serve
```
## RoadMap
- 国际化
- 主题选择
- 异步action规范
- manifest.json (for nodejs)
## 单元测试
- [Jest & Enzyme](https://semaphoreci.com/community/tutorials/how-to-test-react-and-mobx-with-jest)
| Add test link in readme | Add test link in readme
| Markdown | mit | simongfxu/react-starter-project,simongfxu/react-starter-project | markdown | ## Code Before:
a starter project using react + mobx + react-router-dom
```bash
yarn
yarn run serve
```
## RoadMap
- 国际化
- 主题选择
- 模板生成
- 异步action规范
## Instruction:
Add test link in readme
## Code After:
a starter project using react + mobx + react-router-dom
```bash
yarn
yarn run serve
```
## RoadMap
- 国际化
- 主题选择
- 异步action规范
- manifest.json (for nodejs)
## 单元测试
- [Jest & Enzyme](https://semaphoreci.com/community/tutorials/how-to-test-react-and-mobx-with-jest)
|
a starter project using react + mobx + react-router-dom
```bash
yarn
yarn run serve
```
## RoadMap
- 国际化
- 主题选择
- - 模板生成
- 异步action规范
+ - manifest.json (for nodejs)
+
+ ## 单元测试
+
+ - [Jest & Enzyme](https://semaphoreci.com/community/tutorials/how-to-test-react-and-mobx-with-jest) | 6 | 0.428571 | 5 | 1 |
2e09de9ba9528ac1d1cc37a3bb30550c3751dcfa | gnu-smalltalk-install-dependencies.sh | gnu-smalltalk-install-dependencies.sh | sudo apt-get install libzip-dev libsigsegv-dev libffi-dev libltdl-dev \
libreadline-dev libgmp-dev libgnutls-dev libcairo2-dev libsdl2-dev \
texi2html automake autoconf m4 perl autotools-dev libtool bison flex
# in order to compile, type the following commands:
# $ autoreconf -vi
# $ ./configure
# $ make
| sudo apt-get install libzip-dev libsigsegv-dev libffi-dev libltdl-dev \
libreadline-dev libgmp-dev libgnutls-dev libcairo2-dev libsdl2-dev \
texi2html automake autoconf m4 perl autotools-dev libtool bison flex \
gawk
# in order to compile, type the following commands:
# $ autoreconf -vi
# $ ./configure
# $ make
# after the compilation, I usually leave the directory without removing it,
# in order to consult and read library source code with good Smalltalk.
| Add a comment for `GST`. | Add a comment for `GST`.
| Shell | mit | massimo-nocentini/install-dependencies-scripts | shell | ## Code Before:
sudo apt-get install libzip-dev libsigsegv-dev libffi-dev libltdl-dev \
libreadline-dev libgmp-dev libgnutls-dev libcairo2-dev libsdl2-dev \
texi2html automake autoconf m4 perl autotools-dev libtool bison flex
# in order to compile, type the following commands:
# $ autoreconf -vi
# $ ./configure
# $ make
## Instruction:
Add a comment for `GST`.
## Code After:
sudo apt-get install libzip-dev libsigsegv-dev libffi-dev libltdl-dev \
libreadline-dev libgmp-dev libgnutls-dev libcairo2-dev libsdl2-dev \
texi2html automake autoconf m4 perl autotools-dev libtool bison flex \
gawk
# in order to compile, type the following commands:
# $ autoreconf -vi
# $ ./configure
# $ make
# after the compilation, I usually leave the directory without removing it,
# in order to consult and read library source code with good Smalltalk.
| sudo apt-get install libzip-dev libsigsegv-dev libffi-dev libltdl-dev \
libreadline-dev libgmp-dev libgnutls-dev libcairo2-dev libsdl2-dev \
- texi2html automake autoconf m4 perl autotools-dev libtool bison flex
+ texi2html automake autoconf m4 perl autotools-dev libtool bison flex \
? ++
+ gawk
# in order to compile, type the following commands:
# $ autoreconf -vi
# $ ./configure
# $ make
+
+ # after the compilation, I usually leave the directory without removing it,
+ # in order to consult and read library source code with good Smalltalk. | 6 | 0.75 | 5 | 1 |
7a3a35c09d281a091f7457397bb2e3cce20ffaa0 | java/backend/src/main/resources/application.yml | java/backend/src/main/resources/application.yml | spring.application.name: backend
# Set the port
server.port: ${SERVER_PORT:8081}
# Enable all the actuator endpoints for HTTP (keep them under the base path) and JMX
management.endpoints:
web:
base-path: /
exposure.include: "*"
jmx.exposure.include: "*"
spring.jpa.hibernate.ddl-auto: create
spring.datasource.url: jdbc:mysql://${DATABASE_SERVER:localhost}:${DATABASE_PORT:3306}/${DATABASE_NAME:person}
spring.datasource.username: ${DATABASE_USERNAME:root}
spring.datasource.password: ${DATABASE_PASSWORD:}
| spring.application.name: backend
# Set the port
server.port: ${SERVER_PORT:8081}
# Enable all the actuator endpoints for HTTP (keep them under the base path) and JMX
management.endpoints:
web:
base-path: /
exposure.include: "*"
jmx.exposure.include: "*"
spring.jpa.hibernate.ddl-auto: create
spring.datasource.url: jdbc:mysql://${DATABASE_SERVER:localhost}:${DATABASE_PORT:3306}/${DATABASE_NAME:person}?serverTimezone=UTC
spring.datasource.username: ${DATABASE_USERNAME:root}
spring.datasource.password: ${DATABASE_PASSWORD:}
| Fix MySQL JDBC connection (we need a Timezone) | Fix MySQL JDBC connection (we need a Timezone)
| YAML | mit | xeraa/microservice-monitoring,xeraa/microservice-monitoring,xeraa/microservice-monitoring,xeraa/microservice-monitoring | yaml | ## Code Before:
spring.application.name: backend
# Set the port
server.port: ${SERVER_PORT:8081}
# Enable all the actuator endpoints for HTTP (keep them under the base path) and JMX
management.endpoints:
web:
base-path: /
exposure.include: "*"
jmx.exposure.include: "*"
spring.jpa.hibernate.ddl-auto: create
spring.datasource.url: jdbc:mysql://${DATABASE_SERVER:localhost}:${DATABASE_PORT:3306}/${DATABASE_NAME:person}
spring.datasource.username: ${DATABASE_USERNAME:root}
spring.datasource.password: ${DATABASE_PASSWORD:}
## Instruction:
Fix MySQL JDBC connection (we need a Timezone)
## Code After:
spring.application.name: backend
# Set the port
server.port: ${SERVER_PORT:8081}
# Enable all the actuator endpoints for HTTP (keep them under the base path) and JMX
management.endpoints:
web:
base-path: /
exposure.include: "*"
jmx.exposure.include: "*"
spring.jpa.hibernate.ddl-auto: create
spring.datasource.url: jdbc:mysql://${DATABASE_SERVER:localhost}:${DATABASE_PORT:3306}/${DATABASE_NAME:person}?serverTimezone=UTC
spring.datasource.username: ${DATABASE_USERNAME:root}
spring.datasource.password: ${DATABASE_PASSWORD:}
| spring.application.name: backend
# Set the port
server.port: ${SERVER_PORT:8081}
# Enable all the actuator endpoints for HTTP (keep them under the base path) and JMX
management.endpoints:
web:
base-path: /
exposure.include: "*"
jmx.exposure.include: "*"
spring.jpa.hibernate.ddl-auto: create
- spring.datasource.url: jdbc:mysql://${DATABASE_SERVER:localhost}:${DATABASE_PORT:3306}/${DATABASE_NAME:person}
+ spring.datasource.url: jdbc:mysql://${DATABASE_SERVER:localhost}:${DATABASE_PORT:3306}/${DATABASE_NAME:person}?serverTimezone=UTC
? +++++++++++++++++++
spring.datasource.username: ${DATABASE_USERNAME:root}
spring.datasource.password: ${DATABASE_PASSWORD:} | 2 | 0.125 | 1 | 1 |
f9cc37e5f48acf2bf1cc051be3ed1ef8648d4153 | package.json | package.json | {
"name": "fancyClick",
"version": "0.0.1",
"description": "A lightweight pjax library focused on animated transitions and seamless background loading",
"main": "dist/fancyClick.min.js",
"directories": {
"test": "tests"
},
"dependencies": {
"jquery": "^2.1.1"
},
"devDependencies": {
"karma": "^0.12.23",
"karma-chrome-launcher": "^0.1.4",
"karma-cli": "^0.0.4",
"karma-jasmine": "^0.1.5",
"karma-jasmine-async": "^0.0.1",
"karma-jasmine-jquery": "^0.1.0"
},
"scripts": {
"test": "./node_modules/karma/bin/karma start"
},
"keywords": [
"pjax",
"animation"
],
"author": "Maayan Glikser",
"license": "MIT"
}
| {
"name": "fancyClick",
"version": "0.0.1",
"description": "A lightweight pjax library focused on animated transitions and seamless background loading",
"main": "dist/fancyClick.min.js",
"directories": {
"test": "tests"
},
"dependencies": {
"jquery": "^2.1.1"
},
"devDependencies": {
"karma": "^0.12.23",
"karma-chrome-launcher": "^0.1.4",
"karma-cli": "^0.0.4",
"karma-jasmine": "^0.1.5",
"karma-jasmine-async": "^0.0.1",
"karma-jasmine-jquery": "^0.1.0"
},
"scripts": {
"test": "./node_modules/.bin/karma start --single-run --browsers PhantomJS"
},
"keywords": [
"pjax",
"animation"
],
"author": "Maayan Glikser",
"license": "MIT"
}
| Change npm test to run phantomjs | Change npm test to run phantomjs | JSON | mit | morsdyce/fancyClick | json | ## Code Before:
{
"name": "fancyClick",
"version": "0.0.1",
"description": "A lightweight pjax library focused on animated transitions and seamless background loading",
"main": "dist/fancyClick.min.js",
"directories": {
"test": "tests"
},
"dependencies": {
"jquery": "^2.1.1"
},
"devDependencies": {
"karma": "^0.12.23",
"karma-chrome-launcher": "^0.1.4",
"karma-cli": "^0.0.4",
"karma-jasmine": "^0.1.5",
"karma-jasmine-async": "^0.0.1",
"karma-jasmine-jquery": "^0.1.0"
},
"scripts": {
"test": "./node_modules/karma/bin/karma start"
},
"keywords": [
"pjax",
"animation"
],
"author": "Maayan Glikser",
"license": "MIT"
}
## Instruction:
Change npm test to run phantomjs
## Code After:
{
"name": "fancyClick",
"version": "0.0.1",
"description": "A lightweight pjax library focused on animated transitions and seamless background loading",
"main": "dist/fancyClick.min.js",
"directories": {
"test": "tests"
},
"dependencies": {
"jquery": "^2.1.1"
},
"devDependencies": {
"karma": "^0.12.23",
"karma-chrome-launcher": "^0.1.4",
"karma-cli": "^0.0.4",
"karma-jasmine": "^0.1.5",
"karma-jasmine-async": "^0.0.1",
"karma-jasmine-jquery": "^0.1.0"
},
"scripts": {
"test": "./node_modules/.bin/karma start --single-run --browsers PhantomJS"
},
"keywords": [
"pjax",
"animation"
],
"author": "Maayan Glikser",
"license": "MIT"
}
| {
"name": "fancyClick",
"version": "0.0.1",
"description": "A lightweight pjax library focused on animated transitions and seamless background loading",
"main": "dist/fancyClick.min.js",
"directories": {
"test": "tests"
},
"dependencies": {
"jquery": "^2.1.1"
},
"devDependencies": {
"karma": "^0.12.23",
"karma-chrome-launcher": "^0.1.4",
"karma-cli": "^0.0.4",
"karma-jasmine": "^0.1.5",
"karma-jasmine-async": "^0.0.1",
"karma-jasmine-jquery": "^0.1.0"
},
"scripts": {
- "test": "./node_modules/karma/bin/karma start"
+ "test": "./node_modules/.bin/karma start --single-run --browsers PhantomJS"
},
"keywords": [
"pjax",
"animation"
],
"author": "Maayan Glikser",
"license": "MIT"
} | 2 | 0.068966 | 1 | 1 |
bb9f83722ca678129ffafeacac9d8efbd238fcf8 | app/views/georgia/pages/_new.html.erb | app/views/georgia/pages/_new.html.erb | <div id='new_page' class="modal hide fade">
<%= simple_form_for @page, as: :page, url: url_for( controller: controller_name, action: nil), remote: true, html: {class: 'form-vertical'} do |f| %>
<div class="modal-header">
<button type="button" class="close" data-dismiss="modal" aria-hidden="true">×</button>
<h3>New <%= instance_name.titleize %></h3>
</div>
<div class="modal-body">
<%= f.simple_fields_for :contents do |c| %>
<%= c.input :locale, as: :hidden %>
<%= c.input :title, autofocus: true %>
<% end -%>
</div>
<div class="modal-footer">
<%= link_to 'Cancel', '#', class: 'btn', data: {dismiss: 'modal'} %>
<%= f.submit 'Create', class: 'btn btn-primary' %>
</div>
<% end -%>
</div>
| <div id='new_page' class="modal hide fade">
<%= simple_form_for @page, as: :page, remote: true, html: {class: 'form-vertical'} do |f| %>
<div class="modal-header">
<button type="button" class="close" data-dismiss="modal" aria-hidden="true">×</button>
<h3>New <%= instance_name.titleize %></h3>
</div>
<div class="modal-body">
<%= f.simple_fields_for :contents do |c| %>
<%= c.input :locale, as: :hidden %>
<%= c.input :title, autofocus: true %>
<% end -%>
</div>
<div class="modal-footer">
<%= link_to 'Cancel', '#', class: 'btn', data: {dismiss: 'modal'} %>
<%= f.submit 'Create', class: 'btn btn-primary' %>
</div>
<% end -%>
</div>
| Fix new page form url issue | Fix new page form url issue
| HTML+ERB | mit | georgia-cms/georgia,georgia-cms/georgia,georgia-cms/georgia | html+erb | ## Code Before:
<div id='new_page' class="modal hide fade">
<%= simple_form_for @page, as: :page, url: url_for( controller: controller_name, action: nil), remote: true, html: {class: 'form-vertical'} do |f| %>
<div class="modal-header">
<button type="button" class="close" data-dismiss="modal" aria-hidden="true">×</button>
<h3>New <%= instance_name.titleize %></h3>
</div>
<div class="modal-body">
<%= f.simple_fields_for :contents do |c| %>
<%= c.input :locale, as: :hidden %>
<%= c.input :title, autofocus: true %>
<% end -%>
</div>
<div class="modal-footer">
<%= link_to 'Cancel', '#', class: 'btn', data: {dismiss: 'modal'} %>
<%= f.submit 'Create', class: 'btn btn-primary' %>
</div>
<% end -%>
</div>
## Instruction:
Fix new page form url issue
## Code After:
<div id='new_page' class="modal hide fade">
<%= simple_form_for @page, as: :page, remote: true, html: {class: 'form-vertical'} do |f| %>
<div class="modal-header">
<button type="button" class="close" data-dismiss="modal" aria-hidden="true">×</button>
<h3>New <%= instance_name.titleize %></h3>
</div>
<div class="modal-body">
<%= f.simple_fields_for :contents do |c| %>
<%= c.input :locale, as: :hidden %>
<%= c.input :title, autofocus: true %>
<% end -%>
</div>
<div class="modal-footer">
<%= link_to 'Cancel', '#', class: 'btn', data: {dismiss: 'modal'} %>
<%= f.submit 'Create', class: 'btn btn-primary' %>
</div>
<% end -%>
</div>
| <div id='new_page' class="modal hide fade">
- <%= simple_form_for @page, as: :page, url: url_for( controller: controller_name, action: nil), remote: true, html: {class: 'form-vertical'} do |f| %>
? ---------------------------------------------------------
+ <%= simple_form_for @page, as: :page, remote: true, html: {class: 'form-vertical'} do |f| %>
<div class="modal-header">
<button type="button" class="close" data-dismiss="modal" aria-hidden="true">×</button>
<h3>New <%= instance_name.titleize %></h3>
</div>
<div class="modal-body">
<%= f.simple_fields_for :contents do |c| %>
<%= c.input :locale, as: :hidden %>
<%= c.input :title, autofocus: true %>
<% end -%>
</div>
<div class="modal-footer">
<%= link_to 'Cancel', '#', class: 'btn', data: {dismiss: 'modal'} %>
<%= f.submit 'Create', class: 'btn btn-primary' %>
</div>
<% end -%>
</div> | 2 | 0.095238 | 1 | 1 |
68c07cf7115085a5d44a737c319e2adc566f33d2 | requirements-dev.txt | requirements-dev.txt | cov-core==1.14.0
coverage==3.7.1
ipdb==0.8
ipython==2.2.0
py==1.4.23
pytest==2.6.1
pytest-cov==1.8.0
six==1.7.3 | cov-core==1.14.0
coverage==3.7.1
py==1.4.23
pytest==2.6.1
pytest-cov==1.8.0
six==1.7.3 | Drop ipython and ipdb from requirements | Drop ipython and ipdb from requirements
| Text | mit | santiagobasulto/smartcsv | text | ## Code Before:
cov-core==1.14.0
coverage==3.7.1
ipdb==0.8
ipython==2.2.0
py==1.4.23
pytest==2.6.1
pytest-cov==1.8.0
six==1.7.3
## Instruction:
Drop ipython and ipdb from requirements
## Code After:
cov-core==1.14.0
coverage==3.7.1
py==1.4.23
pytest==2.6.1
pytest-cov==1.8.0
six==1.7.3 | cov-core==1.14.0
coverage==3.7.1
- ipdb==0.8
- ipython==2.2.0
py==1.4.23
pytest==2.6.1
pytest-cov==1.8.0
six==1.7.3 | 2 | 0.25 | 0 | 2 |
6fe1f2300716777ab9b0e2d65de4cd48aaa69302 | mktarball.sh | mktarball.sh |
set -e
rev=$(git rev-parse --short --default HEAD $rev)
build_id=${BUILD_NUMBER:-$rev}
subdir="locker-$build_id"
top="$PWD"
out="$PWD/locker-$build_id.tar.gz"
builddir="$top/build"
buildlog="$(tempfile build)"
rm -rf "$builddir"
mkdir -p "$builddir/$subdir"
trap "rm -rf \"$builddir\"" EXIT
# fetch a clean copy of the code from git
echo "Fetching code..."
git archive $rev | tar -x -C "$builddir/$subdir"
echo "Building..."
cd "$builddir/$subdir"
npm install 2>&1 | tee -a "$buildlog"
echo "Compressing..."
(cd "$builddir"; tar czf "$out" "$subdir")
# The test suite doesn't clean up after itself, so do this last
#echo "Testing..."
#cd "$builddir/$subdir/tests"
#if ! node runTests.js; then
# echo "Tests failed!"
# rm -f "$out"
# exit 1
#fi
echo "Done."
echo "$out"
|
set -e
rev=$(git rev-parse --short --default HEAD $rev)
build_id=${BUILD_NUMBER:-$rev}
subdir="locker-$build_id"
top="$PWD"
out="$PWD/locker-$build_id.tar.gz"
builddir="$top/build"
buildlog="$(tempfile build)"
rm -rf "$builddir"
mkdir -p "$builddir/$subdir"
trap "rm -rf \"$builddir\"" EXIT
# fetch a clean copy of the code from git
echo "Fetching code..."
git archive $rev | tar -x -C "$builddir/$subdir"
echo "Building..."
cd "$builddir/$subdir"
npm install 2>&1 | tee -a "$buildlog"
test -d Me || mkdir Me
echo "Compressing..."
(cd "$builddir"; tar czf "$out" "$subdir")
# The test suite doesn't clean up after itself, so do this last
#echo "Testing..."
#cd "$builddir/$subdir/tests"
#if ! node runTests.js; then
# echo "Tests failed!"
# rm -f "$out"
# exit 1
#fi
echo "Done."
echo "$out"
| Create Me/ in the build tarball | Create Me/ in the build tarball
| Shell | bsd-3-clause | quartzjer/Locker,quartzjer/Locker,LockerProject/Locker,LockerProject/Locker,Singly/hallway,quartzjer/Locker,othiym23/locker,othiym23/locker,othiym23/locker,quartzjer/Locker,Singly/hallway,othiym23/locker,quartzjer/Locker,LockerProject/Locker,LockerProject/Locker,LockerProject/Locker,othiym23/locker | shell | ## Code Before:
set -e
rev=$(git rev-parse --short --default HEAD $rev)
build_id=${BUILD_NUMBER:-$rev}
subdir="locker-$build_id"
top="$PWD"
out="$PWD/locker-$build_id.tar.gz"
builddir="$top/build"
buildlog="$(tempfile build)"
rm -rf "$builddir"
mkdir -p "$builddir/$subdir"
trap "rm -rf \"$builddir\"" EXIT
# fetch a clean copy of the code from git
echo "Fetching code..."
git archive $rev | tar -x -C "$builddir/$subdir"
echo "Building..."
cd "$builddir/$subdir"
npm install 2>&1 | tee -a "$buildlog"
echo "Compressing..."
(cd "$builddir"; tar czf "$out" "$subdir")
# The test suite doesn't clean up after itself, so do this last
#echo "Testing..."
#cd "$builddir/$subdir/tests"
#if ! node runTests.js; then
# echo "Tests failed!"
# rm -f "$out"
# exit 1
#fi
echo "Done."
echo "$out"
## Instruction:
Create Me/ in the build tarball
## Code After:
set -e
rev=$(git rev-parse --short --default HEAD $rev)
build_id=${BUILD_NUMBER:-$rev}
subdir="locker-$build_id"
top="$PWD"
out="$PWD/locker-$build_id.tar.gz"
builddir="$top/build"
buildlog="$(tempfile build)"
rm -rf "$builddir"
mkdir -p "$builddir/$subdir"
trap "rm -rf \"$builddir\"" EXIT
# fetch a clean copy of the code from git
echo "Fetching code..."
git archive $rev | tar -x -C "$builddir/$subdir"
echo "Building..."
cd "$builddir/$subdir"
npm install 2>&1 | tee -a "$buildlog"
test -d Me || mkdir Me
echo "Compressing..."
(cd "$builddir"; tar czf "$out" "$subdir")
# The test suite doesn't clean up after itself, so do this last
#echo "Testing..."
#cd "$builddir/$subdir/tests"
#if ! node runTests.js; then
# echo "Tests failed!"
# rm -f "$out"
# exit 1
#fi
echo "Done."
echo "$out"
|
set -e
rev=$(git rev-parse --short --default HEAD $rev)
build_id=${BUILD_NUMBER:-$rev}
subdir="locker-$build_id"
top="$PWD"
out="$PWD/locker-$build_id.tar.gz"
builddir="$top/build"
buildlog="$(tempfile build)"
rm -rf "$builddir"
mkdir -p "$builddir/$subdir"
trap "rm -rf \"$builddir\"" EXIT
# fetch a clean copy of the code from git
echo "Fetching code..."
git archive $rev | tar -x -C "$builddir/$subdir"
echo "Building..."
cd "$builddir/$subdir"
npm install 2>&1 | tee -a "$buildlog"
+ test -d Me || mkdir Me
echo "Compressing..."
(cd "$builddir"; tar czf "$out" "$subdir")
# The test suite doesn't clean up after itself, so do this last
#echo "Testing..."
#cd "$builddir/$subdir/tests"
#if ! node runTests.js; then
# echo "Tests failed!"
# rm -f "$out"
# exit 1
#fi
echo "Done."
echo "$out" | 1 | 0.026316 | 1 | 0 |