commit stringlengths 40 40 | old_file stringlengths 4 184 | new_file stringlengths 4 184 | old_contents stringlengths 1 3.6k | new_contents stringlengths 5 3.38k | subject stringlengths 15 778 | message stringlengths 16 6.74k | lang stringclasses 201 values | license stringclasses 13 values | repos stringlengths 6 116k | config stringclasses 201 values | content stringlengths 137 7.24k | diff stringlengths 26 5.55k | diff_length int64 1 123 | relative_diff_length float64 0.01 89 | n_lines_added int64 0 108 | n_lines_deleted int64 0 106 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0e2b4b7f393b18229809833f9d11e36004f8caea | package.json | package.json | {
"name": "github-extended-newsfeed",
"version": "0.0.1",
"description": "Chrome extension that displays descriptions and numbers of stars for repositories in the GitHub News Feed",
"main": "dist/index.js",
"scripts": {
"start": "webpack"
},
"author": "Philipp Alferov <philipp.alferov@gmail.com>",
"license": "MIT",
"devDependencies": {
"webpack": "^1.13.1"
}
}
| {
"name": "github-extended-newsfeed",
"version": "0.0.1",
"description": "Chrome extension that displays descriptions and numbers of stars for repositories in the GitHub News Feed",
"main": "dist/index.js",
"scripts": {
"start": "webpack",
"dev": "npm start -- --watch"
},
"author": "Philipp Alferov <philipp.alferov@gmail.com>",
"license": "MIT",
"devDependencies": {
"webpack": "^1.13.1"
}
}
| Add dev task to npm scripts | Add dev task to npm scripts
| JSON | mit | alferov/explorify | json | ## Code Before:
{
"name": "github-extended-newsfeed",
"version": "0.0.1",
"description": "Chrome extension that displays descriptions and numbers of stars for repositories in the GitHub News Feed",
"main": "dist/index.js",
"scripts": {
"start": "webpack"
},
"author": "Philipp Alferov <philipp.alferov@gmail.com>",
"license": "MIT",
"devDependencies": {
"webpack": "^1.13.1"
}
}
## Instruction:
Add dev task to npm scripts
## Code After:
{
"name": "github-extended-newsfeed",
"version": "0.0.1",
"description": "Chrome extension that displays descriptions and numbers of stars for repositories in the GitHub News Feed",
"main": "dist/index.js",
"scripts": {
"start": "webpack",
"dev": "npm start -- --watch"
},
"author": "Philipp Alferov <philipp.alferov@gmail.com>",
"license": "MIT",
"devDependencies": {
"webpack": "^1.13.1"
}
}
| {
"name": "github-extended-newsfeed",
"version": "0.0.1",
"description": "Chrome extension that displays descriptions and numbers of stars for repositories in the GitHub News Feed",
"main": "dist/index.js",
"scripts": {
- "start": "webpack"
+ "start": "webpack",
? +
+ "dev": "npm start -- --watch"
},
"author": "Philipp Alferov <philipp.alferov@gmail.com>",
"license": "MIT",
"devDependencies": {
"webpack": "^1.13.1"
}
} | 3 | 0.214286 | 2 | 1 |
8435ed758c5cff926aef1f9ed434492457d62cbb | src/Microsoft.PowerShell.PSReadLine/project.json | src/Microsoft.PowerShell.PSReadLine/project.json | {
"name": "Microsoft.PowerShell.PSReadLine",
"version": "1.0.0-*",
"authors": [ "andschwa" ],
"compilationOptions": {
"warningsAsErrors": true
},
"dependencies": {
"System.Management.Automation": "1.0.0-*"
},
"frameworks": {
"netstandard1.5": {
"compilationOptions": {
"define": [ "CORECLR" ]
},
"imports": [ "dnxcore50", "portable-net45+win8" ]
},
"net451": {
}
}
}
| {
"name": "Microsoft.PowerShell.PSReadLine",
"version": "1.0.0-*",
"authors": [ "andschwa" ],
"compilationOptions": {
"warningsAsErrors": true
},
"dependencies": {
"System.Management.Automation": "1.0.0-*"
},
"frameworks": {
"netstandard1.5": {
"compilationOptions": {
"define": [ "CORECLR" ]
},
"imports": [ "dnxcore50", "portable-net45+win8" ]
},
"net451": {
"frameworkAssemblies": {
"System.Windows.Forms": {
"type": "build"
}
}
}
}
}
| Fix PSReadLine for net451 build | Fix PSReadLine for net451 build
Needed System.Windows.Forms
| JSON | mit | bmanikm/PowerShell,kmosher/PowerShell,KarolKaczmarek/PowerShell,KarolKaczmarek/PowerShell,bingbing8/PowerShell,bmanikm/PowerShell,bingbing8/PowerShell,PaulHigin/PowerShell,jsoref/PowerShell,KarolKaczmarek/PowerShell,JamesWTruher/PowerShell-1,daxian-dbw/PowerShell,JamesWTruher/PowerShell-1,bingbing8/PowerShell,PaulHigin/PowerShell,kmosher/PowerShell,jsoref/PowerShell,daxian-dbw/PowerShell,daxian-dbw/PowerShell,KarolKaczmarek/PowerShell,kmosher/PowerShell,TravisEz13/PowerShell,kmosher/PowerShell,jsoref/PowerShell,bmanikm/PowerShell,bingbing8/PowerShell,KarolKaczmarek/PowerShell,PaulHigin/PowerShell,TravisEz13/PowerShell,JamesWTruher/PowerShell-1,JamesWTruher/PowerShell-1,TravisEz13/PowerShell,bmanikm/PowerShell,kmosher/PowerShell,bmanikm/PowerShell,TravisEz13/PowerShell,daxian-dbw/PowerShell,jsoref/PowerShell,jsoref/PowerShell,PaulHigin/PowerShell,bingbing8/PowerShell | json | ## Code Before:
{
"name": "Microsoft.PowerShell.PSReadLine",
"version": "1.0.0-*",
"authors": [ "andschwa" ],
"compilationOptions": {
"warningsAsErrors": true
},
"dependencies": {
"System.Management.Automation": "1.0.0-*"
},
"frameworks": {
"netstandard1.5": {
"compilationOptions": {
"define": [ "CORECLR" ]
},
"imports": [ "dnxcore50", "portable-net45+win8" ]
},
"net451": {
}
}
}
## Instruction:
Fix PSReadLine for net451 build
Needed System.Windows.Forms
## Code After:
{
"name": "Microsoft.PowerShell.PSReadLine",
"version": "1.0.0-*",
"authors": [ "andschwa" ],
"compilationOptions": {
"warningsAsErrors": true
},
"dependencies": {
"System.Management.Automation": "1.0.0-*"
},
"frameworks": {
"netstandard1.5": {
"compilationOptions": {
"define": [ "CORECLR" ]
},
"imports": [ "dnxcore50", "portable-net45+win8" ]
},
"net451": {
"frameworkAssemblies": {
"System.Windows.Forms": {
"type": "build"
}
}
}
}
}
| {
"name": "Microsoft.PowerShell.PSReadLine",
"version": "1.0.0-*",
"authors": [ "andschwa" ],
"compilationOptions": {
"warningsAsErrors": true
},
"dependencies": {
"System.Management.Automation": "1.0.0-*"
},
"frameworks": {
"netstandard1.5": {
"compilationOptions": {
"define": [ "CORECLR" ]
},
"imports": [ "dnxcore50", "portable-net45+win8" ]
},
"net451": {
+ "frameworkAssemblies": {
+ "System.Windows.Forms": {
+ "type": "build"
+ }
+ }
}
}
} | 5 | 0.208333 | 5 | 0 |
0fd8ca0cff75e9ebead413300aff82823a37d983 | README.md | README.md |
Requires the following boxen modules:
* `boxen`
* `homebrew`
* `wget`
* `autoconf`
* `libtool`
Currently 5.3.x versions of PHP are not building due to a dependence on a lower version of autoconf which is not provided by brew. This is being worked on.
## Usage
```puppet
# Install PHP and set as the global default php
class { 'php::global': version => '5.4.10' }
# ensure a certain php version is used within a dir
php::local { '/path/to/my/awesome/project':
version => '5.4.9'
}
# install a php version or two
php::version { '5.3.20': }
php::version { '5.4.10': }
# we provide a ton of predefined ones for you though
require php::5-4-10
require php::5-3-17
# Set up PHP-FPM to run a specific version of PHP
php::fpm { '5.4.10': }
# Spin up an FPM pool for a project
# Ensures:
# * the version of PHP is installed
# * a PHP-FPM service is configured for this PHP version
# * a FPM pool is listening on a per project nginx socket
$name = "project-name"
php::fpm::pool { "${name}-5.4.10":
version => 5.4.10,
socket => "${boxen::config::socketdir}/${name}",
require => File["${nginx::config::sitesdir}/${name}.conf"],
}
```
|
Requires the following boxen modules:
* `boxen`
* `homebrew`
* `wget`
* `autoconf`
* `libtool`
## Usage
```puppet
# Install PHP and set as the global default php
class { 'php::global': version => '5.4.10' }
# ensure a certain php version is used within a dir
php::local { '/path/to/my/awesome/project':
version => '5.4.9'
}
# install a php version or two
php::version { '5.3.20': }
php::version { '5.4.10': }
# we provide a ton of predefined ones for you though
require php::5-4-10
require php::5-3-17
# Set up PHP-FPM to run a specific version of PHP
php::fpm { '5.4.10': }
# Spin up an FPM pool for a project
# Ensures:
# * the version of PHP is installed
# * a PHP-FPM service is configured for this PHP version
# * a FPM pool is listening on a per project nginx socket
$name = "project-name"
php::fpm::pool { "${name}-5.4.10":
version => 5.4.10,
socket => "${boxen::config::socketdir}/${name}",
require => File["${nginx::config::sitesdir}/${name}.conf"],
}
```
| Remove 5.3 caveat from readme as we can compile it again - yay :D | Remove 5.3 caveat from readme as we can compile it again - yay :D
| Markdown | mit | toby-griffiths/puppet-php,hussfelt/puppet-php,mattheath/puppet-php,castiron/puppet-php,typhonius/puppet-php,theand-boxen/puppet-php,castiron/puppet-php,rolfvandekrol/puppet-php,namesco/puppet-php,boxen/puppet-php,typhonius/puppet-php,castiron/puppet-php,toby-griffiths/puppet-php,boxen/puppet-php,oddhill/puppet-php,webflo/puppet-php,theand-boxen/puppet-php,mattheath/puppet-php,hussfelt/puppet-php,namesco/puppet-php,rolfvandekrol/puppet-php,oddhill/puppet-php,oddhill/puppet-php,typhonius/puppet-php,boxen/puppet-php,theand-boxen/puppet-php,hussfelt/puppet-php,namesco/puppet-php,toby-griffiths/puppet-php,rolfvandekrol/puppet-php,webflo/puppet-php | markdown | ## Code Before:
Requires the following boxen modules:
* `boxen`
* `homebrew`
* `wget`
* `autoconf`
* `libtool`
Currently 5.3.x versions of PHP are not building due to a dependence on a lower version of autoconf which is not provided by brew. This is being worked on.
## Usage
```puppet
# Install PHP and set as the global default php
class { 'php::global': version => '5.4.10' }
# ensure a certain php version is used within a dir
php::local { '/path/to/my/awesome/project':
version => '5.4.9'
}
# install a php version or two
php::version { '5.3.20': }
php::version { '5.4.10': }
# we provide a ton of predefined ones for you though
require php::5-4-10
require php::5-3-17
# Set up PHP-FPM to run a specific version of PHP
php::fpm { '5.4.10': }
# Spin up an FPM pool for a project
# Ensures:
# * the version of PHP is installed
# * a PHP-FPM service is configured for this PHP version
# * a FPM pool is listening on a per project nginx socket
$name = "project-name"
php::fpm::pool { "${name}-5.4.10":
version => 5.4.10,
socket => "${boxen::config::socketdir}/${name}",
require => File["${nginx::config::sitesdir}/${name}.conf"],
}
```
## Instruction:
Remove 5.3 caveat from readme as we can compile it again - yay :D
## Code After:
Requires the following boxen modules:
* `boxen`
* `homebrew`
* `wget`
* `autoconf`
* `libtool`
## Usage
```puppet
# Install PHP and set as the global default php
class { 'php::global': version => '5.4.10' }
# ensure a certain php version is used within a dir
php::local { '/path/to/my/awesome/project':
version => '5.4.9'
}
# install a php version or two
php::version { '5.3.20': }
php::version { '5.4.10': }
# we provide a ton of predefined ones for you though
require php::5-4-10
require php::5-3-17
# Set up PHP-FPM to run a specific version of PHP
php::fpm { '5.4.10': }
# Spin up an FPM pool for a project
# Ensures:
# * the version of PHP is installed
# * a PHP-FPM service is configured for this PHP version
# * a FPM pool is listening on a per project nginx socket
$name = "project-name"
php::fpm::pool { "${name}-5.4.10":
version => 5.4.10,
socket => "${boxen::config::socketdir}/${name}",
require => File["${nginx::config::sitesdir}/${name}.conf"],
}
```
|
Requires the following boxen modules:
* `boxen`
* `homebrew`
* `wget`
* `autoconf`
* `libtool`
-
- Currently 5.3.x versions of PHP are not building due to a dependence on a lower version of autoconf which is not provided by brew. This is being worked on.
## Usage
```puppet
# Install PHP and set as the global default php
class { 'php::global': version => '5.4.10' }
# ensure a certain php version is used within a dir
php::local { '/path/to/my/awesome/project':
version => '5.4.9'
}
# install a php version or two
php::version { '5.3.20': }
php::version { '5.4.10': }
# we provide a ton of predefined ones for you though
require php::5-4-10
require php::5-3-17
# Set up PHP-FPM to run a specific version of PHP
php::fpm { '5.4.10': }
# Spin up an FPM pool for a project
# Ensures:
# * the version of PHP is installed
# * a PHP-FPM service is configured for this PHP version
# * a FPM pool is listening on a per project nginx socket
$name = "project-name"
php::fpm::pool { "${name}-5.4.10":
version => 5.4.10,
socket => "${boxen::config::socketdir}/${name}",
require => File["${nginx::config::sitesdir}/${name}.conf"],
}
``` | 2 | 0.043478 | 0 | 2 |
39247451a55c16ea0f4a0e021981618a12cc53c9 | src/converter/r2t/FnGroupConverter.js | src/converter/r2t/FnGroupConverter.js | export default class FnGroupConverter {
import(dom) {
let fnGroups = dom.findAll('fn-group')
if(fnGroups.length === 0) {
let back = dom.find('back')
back.append(
dom.createElement('fn-group')
)
}
}
export(dom) {
dom.findAll('fn-group').forEach(fnGroup => {
if(fnGroup.children.length === 0) {
fnGroup.getParent().removeChild(fnGroup)
}
})
}
} | export default class FnGroupConverter {
import(dom) {
let fnGroups = dom.findAll('fn-group')
if(fnGroups.length === 0) {
let back = dom.find('back')
back.append(
dom.createElement('fn-group')
)
} else {
fnGroups.forEach(fnGroup => {
let xTags = fnGroup.findAll('x')
xTags.forEach(x => {
fnGroup.removeChild(x)
})
})
}
}
export(dom) {
dom.findAll('fn-group').forEach(fnGroup => {
if(fnGroup.children.length === 0) {
fnGroup.getParent().removeChild(fnGroup)
}
})
}
} | Remove x tags during fn-group conversion. | Remove x tags during fn-group conversion.
| JavaScript | mit | substance/texture,substance/texture | javascript | ## Code Before:
export default class FnGroupConverter {
import(dom) {
let fnGroups = dom.findAll('fn-group')
if(fnGroups.length === 0) {
let back = dom.find('back')
back.append(
dom.createElement('fn-group')
)
}
}
export(dom) {
dom.findAll('fn-group').forEach(fnGroup => {
if(fnGroup.children.length === 0) {
fnGroup.getParent().removeChild(fnGroup)
}
})
}
}
## Instruction:
Remove x tags during fn-group conversion.
## Code After:
export default class FnGroupConverter {
import(dom) {
let fnGroups = dom.findAll('fn-group')
if(fnGroups.length === 0) {
let back = dom.find('back')
back.append(
dom.createElement('fn-group')
)
} else {
fnGroups.forEach(fnGroup => {
let xTags = fnGroup.findAll('x')
xTags.forEach(x => {
fnGroup.removeChild(x)
})
})
}
}
export(dom) {
dom.findAll('fn-group').forEach(fnGroup => {
if(fnGroup.children.length === 0) {
fnGroup.getParent().removeChild(fnGroup)
}
})
}
} | export default class FnGroupConverter {
import(dom) {
let fnGroups = dom.findAll('fn-group')
if(fnGroups.length === 0) {
let back = dom.find('back')
back.append(
dom.createElement('fn-group')
)
+ } else {
+ fnGroups.forEach(fnGroup => {
+ let xTags = fnGroup.findAll('x')
+ xTags.forEach(x => {
+ fnGroup.removeChild(x)
+ })
+ })
}
}
export(dom) {
dom.findAll('fn-group').forEach(fnGroup => {
if(fnGroup.children.length === 0) {
fnGroup.getParent().removeChild(fnGroup)
}
})
}
} | 7 | 0.368421 | 7 | 0 |
901e038d1354ac2eff479ec61e87db2ec15966b6 | spec/unit/ratio_spec.rb | spec/unit/ratio_spec.rb |
RSpec.describe TTY::ProgressBar, 'ratio=' do
let(:output) { StringIO.new('', 'w+') }
it "allows to set ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 10)
progress.ratio = 0.7
expect(progress.current).to eq(7)
output.rewind
expect(output.read).to eq([
"\e[1G[======= ]"
].join)
end
it "finds closest available step from the ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 3)
progress.ratio = 0.5
expect(progress.current).to eq(1)
end
it "doesn't allow to set wrong ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 3)
progress.ratio = 3.2
expect(progress.current).to eq(3)
expect(progress.complete?).to eq(true)
end
end
|
RSpec.describe TTY::ProgressBar, '.ratio=' do
let(:output) { StringIO.new('', 'w+') }
it "allows to set ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 10)
progress.ratio = 0.7
expect(progress.current).to eq(7)
output.rewind
expect(output.read).to eq([
"\e[1G[======= ]"
].join)
end
it "finds closest available step from the ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 3)
progress.ratio = 0.5
expect(progress.current).to eq(1)
end
it "doesn't allow to set wrong ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 3)
progress.ratio = 3.2
expect(progress.current).to eq(3)
expect(progress.complete?).to eq(true)
end
it "avoids division by zero" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 0)
expect(progress.ratio).to eq(0)
end
end
| Add spec for ratio and total division. | Add spec for ratio and total division.
| Ruby | mit | peter-murach/tty-progressbar | ruby | ## Code Before:
RSpec.describe TTY::ProgressBar, 'ratio=' do
let(:output) { StringIO.new('', 'w+') }
it "allows to set ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 10)
progress.ratio = 0.7
expect(progress.current).to eq(7)
output.rewind
expect(output.read).to eq([
"\e[1G[======= ]"
].join)
end
it "finds closest available step from the ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 3)
progress.ratio = 0.5
expect(progress.current).to eq(1)
end
it "doesn't allow to set wrong ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 3)
progress.ratio = 3.2
expect(progress.current).to eq(3)
expect(progress.complete?).to eq(true)
end
end
## Instruction:
Add spec for ratio and total division.
## Code After:
RSpec.describe TTY::ProgressBar, '.ratio=' do
let(:output) { StringIO.new('', 'w+') }
it "allows to set ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 10)
progress.ratio = 0.7
expect(progress.current).to eq(7)
output.rewind
expect(output.read).to eq([
"\e[1G[======= ]"
].join)
end
it "finds closest available step from the ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 3)
progress.ratio = 0.5
expect(progress.current).to eq(1)
end
it "doesn't allow to set wrong ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 3)
progress.ratio = 3.2
expect(progress.current).to eq(3)
expect(progress.complete?).to eq(true)
end
it "avoids division by zero" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 0)
expect(progress.ratio).to eq(0)
end
end
|
- RSpec.describe TTY::ProgressBar, 'ratio=' do
+ RSpec.describe TTY::ProgressBar, '.ratio=' do
? +
let(:output) { StringIO.new('', 'w+') }
it "allows to set ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 10)
progress.ratio = 0.7
expect(progress.current).to eq(7)
output.rewind
expect(output.read).to eq([
"\e[1G[======= ]"
].join)
end
it "finds closest available step from the ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 3)
progress.ratio = 0.5
expect(progress.current).to eq(1)
end
it "doesn't allow to set wrong ratio" do
progress = TTY::ProgressBar.new("[:bar]", output: output, total: 3)
progress.ratio = 3.2
expect(progress.current).to eq(3)
expect(progress.complete?).to eq(true)
end
+
+ it "avoids division by zero" do
+ progress = TTY::ProgressBar.new("[:bar]", output: output, total: 0)
+ expect(progress.ratio).to eq(0)
+ end
end | 7 | 0.259259 | 6 | 1 |
b41349440fdadf4ec27ee0ca30414de29f93d61a | course/editsection.html | course/editsection.html | <form id="theform" method="post" action="editsection.php">
<table summary="Summary of week" cellpadding="5" class="boxaligncenter">
<tr valign="top">
<td align="right"><p><b><?php print_string("summary") ?>:</b></p>
<br />
<?php helpbutton("summaries", get_string("helpsummaries"), "moodle", true, true);
echo "<br />";
if ($usehtmleditor) {
helpbutton("richtext2", get_string("helprichtext"), "moodle", true, true);
} else {
helpbutton("text2", get_string("helptext"), "moodle", true, true);
}
?>
<br />
</td>
<td>
<?php
print_textarea($usehtmleditor, 25, 60, 660, 200, 'summary', $form->summary, 0, false, 'summary');
print($htmlEditorObject->activateEditor('summary', 'summary'));
?>
</td>
</tr>
</table>
<div class="singlebutton">
<input type="hidden" name="id" value="<?php echo $form->id ?>" />
<input type="hidden" name="sesskey" value="<?php echo sesskey() ?>" />
<input type="submit" value="<?php print_string("savechanges") ?>" />
</div>
</form>
| <form id="theform" method="post" action="editsection.php">
<table summary="Summary of week" cellpadding="5" class="boxaligncenter">
<tr valign="top">
<td align="right"><p><b><?php print_string("summary") ?>:</b></p>
<?php helpbutton("summaries", get_string("helpsummaries"), "moodle", true, true);
echo "<br />";
if ($usehtmleditor) {
helpbutton("richtext2", get_string("helprichtext"), "moodle", true, true);
} else {
helpbutton("text2", get_string("helptext"), "moodle", true, true);
}
?>
</td>
<td>
<?php print_textarea($usehtmleditor, 25, 60, 0, 0, 'summary', $form->summary, 0, false, 'summary'); ?>
</td>
</tr>
</table>
<div class="singlebutton">
<input type="hidden" name="id" value="<?php echo $form->id ?>" />
<input type="hidden" name="sesskey" value="<?php echo sesskey() ?>" />
<input type="submit" value="<?php print_string("savechanges") ?>" />
</div>
</form>
| Make the editor work for the frontpage summary field | MDL-11113: Make the editor work for the frontpage summary field
| HTML | bsd-3-clause | dilawar/moodle | html | ## Code Before:
<form id="theform" method="post" action="editsection.php">
<table summary="Summary of week" cellpadding="5" class="boxaligncenter">
<tr valign="top">
<td align="right"><p><b><?php print_string("summary") ?>:</b></p>
<br />
<?php helpbutton("summaries", get_string("helpsummaries"), "moodle", true, true);
echo "<br />";
if ($usehtmleditor) {
helpbutton("richtext2", get_string("helprichtext"), "moodle", true, true);
} else {
helpbutton("text2", get_string("helptext"), "moodle", true, true);
}
?>
<br />
</td>
<td>
<?php
print_textarea($usehtmleditor, 25, 60, 660, 200, 'summary', $form->summary, 0, false, 'summary');
print($htmlEditorObject->activateEditor('summary', 'summary'));
?>
</td>
</tr>
</table>
<div class="singlebutton">
<input type="hidden" name="id" value="<?php echo $form->id ?>" />
<input type="hidden" name="sesskey" value="<?php echo sesskey() ?>" />
<input type="submit" value="<?php print_string("savechanges") ?>" />
</div>
</form>
## Instruction:
MDL-11113: Make the editor work for the frontpage summary field
## Code After:
<form id="theform" method="post" action="editsection.php">
<table summary="Summary of week" cellpadding="5" class="boxaligncenter">
<tr valign="top">
<td align="right"><p><b><?php print_string("summary") ?>:</b></p>
<?php helpbutton("summaries", get_string("helpsummaries"), "moodle", true, true);
echo "<br />";
if ($usehtmleditor) {
helpbutton("richtext2", get_string("helprichtext"), "moodle", true, true);
} else {
helpbutton("text2", get_string("helptext"), "moodle", true, true);
}
?>
</td>
<td>
<?php print_textarea($usehtmleditor, 25, 60, 0, 0, 'summary', $form->summary, 0, false, 'summary'); ?>
</td>
</tr>
</table>
<div class="singlebutton">
<input type="hidden" name="id" value="<?php echo $form->id ?>" />
<input type="hidden" name="sesskey" value="<?php echo sesskey() ?>" />
<input type="submit" value="<?php print_string("savechanges") ?>" />
</div>
</form>
| <form id="theform" method="post" action="editsection.php">
<table summary="Summary of week" cellpadding="5" class="boxaligncenter">
<tr valign="top">
<td align="right"><p><b><?php print_string("summary") ?>:</b></p>
- <br />
-
<?php helpbutton("summaries", get_string("helpsummaries"), "moodle", true, true);
echo "<br />";
if ($usehtmleditor) {
helpbutton("richtext2", get_string("helprichtext"), "moodle", true, true);
} else {
helpbutton("text2", get_string("helptext"), "moodle", true, true);
}
?>
- <br />
-
</td>
<td>
- <?php
- print_textarea($usehtmleditor, 25, 60, 660, 200, 'summary', $form->summary, 0, false, 'summary');
? -- --
+ <?php print_textarea($usehtmleditor, 25, 60, 0, 0, 'summary', $form->summary, 0, false, 'summary'); ?>
? ++++++ +++
- print($htmlEditorObject->activateEditor('summary', 'summary'));
- ?>
-
</td>
</tr>
</table>
<div class="singlebutton">
<input type="hidden" name="id" value="<?php echo $form->id ?>" />
<input type="hidden" name="sesskey" value="<?php echo sesskey() ?>" />
<input type="submit" value="<?php print_string("savechanges") ?>" />
</div>
</form> | 10 | 0.3125 | 1 | 9 |
c8508781ca24a4b8ce3bf48f79aaaa9358e82c1b | lib/formalist/rich_text/rendering/embedded_form_renderer.rb | lib/formalist/rich_text/rendering/embedded_form_renderer.rb | module Formalist
module RichText
module Rendering
class EmbeddedFormRenderer
attr_reader :container
attr_reader :render_options
def initialize(container: {}, render_options: {})
@container = container
@render_options = render_options
end
def call(form_data)
type, data = form_data.values_at(:name, :data)
if container.key?(type)
container[type].(data, render_options)
else
""
end
end
end
end
end
end
| module Formalist
module RichText
module Rendering
class EmbeddedFormRenderer
attr_reader :container
attr_reader :options
def initialize(container = {}, **options)
@container = container
@options = options
end
def call(form_data)
type, data = form_data.values_at(:name, :data)
if container.key?(type)
container[type].(data, options)
else
""
end
end
end
end
end
end
| Tweak EmbeddedFormRenderer API a little | Tweak EmbeddedFormRenderer API a little
Make the container a single, positional parameter, followed by a splat of options.
| Ruby | mit | icelab/formalist | ruby | ## Code Before:
module Formalist
module RichText
module Rendering
class EmbeddedFormRenderer
attr_reader :container
attr_reader :render_options
def initialize(container: {}, render_options: {})
@container = container
@render_options = render_options
end
def call(form_data)
type, data = form_data.values_at(:name, :data)
if container.key?(type)
container[type].(data, render_options)
else
""
end
end
end
end
end
end
## Instruction:
Tweak EmbeddedFormRenderer API a little
Make the container a single, positional parameter, followed by a splat of options.
## Code After:
module Formalist
module RichText
module Rendering
class EmbeddedFormRenderer
attr_reader :container
attr_reader :options
def initialize(container = {}, **options)
@container = container
@options = options
end
def call(form_data)
type, data = form_data.values_at(:name, :data)
if container.key?(type)
container[type].(data, options)
else
""
end
end
end
end
end
end
| module Formalist
module RichText
module Rendering
class EmbeddedFormRenderer
attr_reader :container
- attr_reader :render_options
? -------
+ attr_reader :options
- def initialize(container: {}, render_options: {})
? ^ ^^^^^^^ ----
+ def initialize(container = {}, **options)
? ^^ ^^
@container = container
- @render_options = render_options
? ------- -------
+ @options = options
end
def call(form_data)
type, data = form_data.values_at(:name, :data)
if container.key?(type)
- container[type].(data, render_options)
? -------
+ container[type].(data, options)
else
""
end
end
end
end
end
end | 8 | 0.32 | 4 | 4 |
fc09c7dfcb2b14aff785bb84c7017b26b2805089 | README.md | README.md | MegaApiClient
=============
C# library to access http://mega.co.nz API
This library is based on highly valuable articles from http://julien-marchand.fr
Usage example:
---
```
MegaApiClient client = new MegaApiClient();
client.Login("megaclient@yopmail.com", "megaclient");
var nodes = client.GetNodes();
Node root = nodes.Single(n => n.Type == NodeType.Root);
Node myFolder = client.CreateFolder("Upload", root);
Node myFile = client.Upload("MyFile.ext", myFolder);
Uri downloadUrl = client.GetDownloadLink(myFile);
Console.WriteLine(downloadUrl);
```
API functions:
---
```
void Login(string email, string password)
void LoginAnonymous()
void Logout()
IEnumerable<Node> GetNodes()
Node CreateFolder(string name, Node parent)
void Delete(Node node, bool moveToTrash = true)
Node Move(Node node, Node destinationParentNode)
Uri GetDownloadLink(Node node)
void DownloadFile(Node node, string outputFile)
Stream Download(Node node)
Node Upload(string filename, Node parent)
Node Upload(Stream stream, string name, Node parent)
``` | MegaApiClient
=============
C# library to access http://mega.co.nz API
This library is based on highly valuable articles from http://julien-marchand.fr
Usage example:
---
```
MegaApiClient client = new MegaApiClient();
client.Login("megaclient@yopmail.com", "megaclient");
var nodes = client.GetNodes();
Node root = nodes.Single(n => n.Type == NodeType.Root);
Node myFolder = client.CreateFolder("Upload", root);
Node myFile = client.Upload("MyFile.ext", myFolder);
Uri downloadUrl = client.GetDownloadLink(myFile);
Console.WriteLine(downloadUrl);
```
API functions:
---
```
void Login(string email, string password)
void Login(AuthInfos authInfos)
void LoginAnonymous()
void Logout()
IEnumerable<Node> GetNodes()
Node CreateFolder(string name, Node parent)
void Delete(Node node, bool moveToTrash = true)
Node Move(Node node, Node destinationParentNode)
Uri GetDownloadLink(Node node)
void DownloadFile(Node node, string outputFile)
Stream Download(Node node)
Node Upload(string filename, Node parent)
Node Upload(Stream stream, string name, Node parent)
static AuthInfos GenerateAuthInfos(string email, string password)
``` | Add new login method in API list | Add new login method in API list
| Markdown | mit | PNSolutions/MegaApiClient,gpailler/MegaApiClient,wojciech-urbanowski/MegaApiClient | markdown | ## Code Before:
MegaApiClient
=============
C# library to access http://mega.co.nz API
This library is based on highly valuable articles from http://julien-marchand.fr
Usage example:
---
```
MegaApiClient client = new MegaApiClient();
client.Login("megaclient@yopmail.com", "megaclient");
var nodes = client.GetNodes();
Node root = nodes.Single(n => n.Type == NodeType.Root);
Node myFolder = client.CreateFolder("Upload", root);
Node myFile = client.Upload("MyFile.ext", myFolder);
Uri downloadUrl = client.GetDownloadLink(myFile);
Console.WriteLine(downloadUrl);
```
API functions:
---
```
void Login(string email, string password)
void LoginAnonymous()
void Logout()
IEnumerable<Node> GetNodes()
Node CreateFolder(string name, Node parent)
void Delete(Node node, bool moveToTrash = true)
Node Move(Node node, Node destinationParentNode)
Uri GetDownloadLink(Node node)
void DownloadFile(Node node, string outputFile)
Stream Download(Node node)
Node Upload(string filename, Node parent)
Node Upload(Stream stream, string name, Node parent)
```
## Instruction:
Add new login method in API list
## Code After:
MegaApiClient
=============
C# library to access http://mega.co.nz API
This library is based on highly valuable articles from http://julien-marchand.fr
Usage example:
---
```
MegaApiClient client = new MegaApiClient();
client.Login("megaclient@yopmail.com", "megaclient");
var nodes = client.GetNodes();
Node root = nodes.Single(n => n.Type == NodeType.Root);
Node myFolder = client.CreateFolder("Upload", root);
Node myFile = client.Upload("MyFile.ext", myFolder);
Uri downloadUrl = client.GetDownloadLink(myFile);
Console.WriteLine(downloadUrl);
```
API functions:
---
```
void Login(string email, string password)
void Login(AuthInfos authInfos)
void LoginAnonymous()
void Logout()
IEnumerable<Node> GetNodes()
Node CreateFolder(string name, Node parent)
void Delete(Node node, bool moveToTrash = true)
Node Move(Node node, Node destinationParentNode)
Uri GetDownloadLink(Node node)
void DownloadFile(Node node, string outputFile)
Stream Download(Node node)
Node Upload(string filename, Node parent)
Node Upload(Stream stream, string name, Node parent)
static AuthInfos GenerateAuthInfos(string email, string password)
``` | MegaApiClient
=============
C# library to access http://mega.co.nz API
This library is based on highly valuable articles from http://julien-marchand.fr
Usage example:
---
```
MegaApiClient client = new MegaApiClient();
client.Login("megaclient@yopmail.com", "megaclient");
var nodes = client.GetNodes();
Node root = nodes.Single(n => n.Type == NodeType.Root);
Node myFolder = client.CreateFolder("Upload", root);
Node myFile = client.Upload("MyFile.ext", myFolder);
Uri downloadUrl = client.GetDownloadLink(myFile);
Console.WriteLine(downloadUrl);
```
API functions:
---
```
void Login(string email, string password)
+ void Login(AuthInfos authInfos)
void LoginAnonymous()
void Logout()
IEnumerable<Node> GetNodes()
Node CreateFolder(string name, Node parent)
void Delete(Node node, bool moveToTrash = true)
Node Move(Node node, Node destinationParentNode)
Uri GetDownloadLink(Node node)
void DownloadFile(Node node, string outputFile)
Stream Download(Node node)
Node Upload(string filename, Node parent)
Node Upload(Stream stream, string name, Node parent)
+
+ static AuthInfos GenerateAuthInfos(string email, string password)
``` | 3 | 0.065217 | 3 | 0 |
775f31ff5323af91c1fffd5e45f75d58eb254b93 | package.json | package.json | {
"name": "arethusa",
"version": "0.0.1",
"dependencies": {},
"devDependencies": {
"grunt-cli": "0.1.13",
"grunt-contrib-jasmine": "0.6.3",
"grunt-contrib-watch": "0.6.1",
"grunt-contrib-jshint": "0.10.0",
"grunt-contrib-connect": "~0.7.1",
"grunt-coveralls": "0.3.0",
"grunt-karma": "0.8.0",
"karma-chrome-launcher": "0.1.3",
"karma-phantomjs-launcher": "0.1.4",
"karma-firefox-launcher": "0.1.3",
"karma-jasmine": "0.1.5",
"karma-coverage": "0.2.1",
"bower": "1.3.2",
"protractor": "~0.23.1",
"grunt-protractor-runner": "1.0.0",
"connect-livereload": "0.4.0",
"grunt-sauce-connect-launcher": "~0.3.0",
"grunt-contrib-uglify": "0.4.0",
"grunt-concurrent": "0.5.0",
"grunt-githooks": "^0.3.1",
"load-grunt-tasks": "^0.6.0"
},
"scripts": {
"test": "grunt karma:spec coveralls jshint && protractor protractor-config-travis.js"
}
}
| {
"name": "arethusa",
"version": "0.0.1",
"dependencies": {},
"devDependencies": {
"grunt-cli": "0.1.13",
"grunt-contrib-jasmine": "0.6.3",
"grunt-contrib-watch": "0.6.1",
"grunt-contrib-jshint": "0.10.0",
"grunt-contrib-connect": "~0.7.1",
"grunt-coveralls": "0.3.0",
"grunt-karma": "0.8.0",
"karma-chrome-launcher": "0.1.3",
"karma-phantomjs-launcher": "0.1.4",
"karma-firefox-launcher": "0.1.3",
"karma-jasmine": "0.1.5",
"karma-coverage": "0.2.1",
"bower": "1.3.2",
"protractor": "~0.23.1",
"grunt-protractor-runner": "1.0.0",
"connect-livereload": "0.4.0",
"grunt-sauce-connect-launcher": "~0.3.0",
"grunt-contrib-uglify": "0.4.0",
"grunt-concurrent": "0.5.0",
"grunt-githooks": "^0.3.1",
"load-grunt-tasks": "^0.6.0"
},
"scripts": {
"test": "grunt karma:spec coveralls jshint && protractor protractor-config-travis.js",
"postinstall": "grunt githooks"
}
}
| Install hooks after each npm install | Install hooks after each npm install
| JSON | mit | fbaumgardt/arethusa,PonteIneptique/arethusa,PonteIneptique/arethusa,fbaumgardt/arethusa,fbaumgardt/arethusa,latin-language-toolkit/arethusa,alpheios-project/arethusa,Masoumeh/arethusa,latin-language-toolkit/arethusa,Masoumeh/arethusa,alpheios-project/arethusa,alpheios-project/arethusa | json | ## Code Before:
{
"name": "arethusa",
"version": "0.0.1",
"dependencies": {},
"devDependencies": {
"grunt-cli": "0.1.13",
"grunt-contrib-jasmine": "0.6.3",
"grunt-contrib-watch": "0.6.1",
"grunt-contrib-jshint": "0.10.0",
"grunt-contrib-connect": "~0.7.1",
"grunt-coveralls": "0.3.0",
"grunt-karma": "0.8.0",
"karma-chrome-launcher": "0.1.3",
"karma-phantomjs-launcher": "0.1.4",
"karma-firefox-launcher": "0.1.3",
"karma-jasmine": "0.1.5",
"karma-coverage": "0.2.1",
"bower": "1.3.2",
"protractor": "~0.23.1",
"grunt-protractor-runner": "1.0.0",
"connect-livereload": "0.4.0",
"grunt-sauce-connect-launcher": "~0.3.0",
"grunt-contrib-uglify": "0.4.0",
"grunt-concurrent": "0.5.0",
"grunt-githooks": "^0.3.1",
"load-grunt-tasks": "^0.6.0"
},
"scripts": {
"test": "grunt karma:spec coveralls jshint && protractor protractor-config-travis.js"
}
}
## Instruction:
Install hooks after each npm install
## Code After:
{
"name": "arethusa",
"version": "0.0.1",
"dependencies": {},
"devDependencies": {
"grunt-cli": "0.1.13",
"grunt-contrib-jasmine": "0.6.3",
"grunt-contrib-watch": "0.6.1",
"grunt-contrib-jshint": "0.10.0",
"grunt-contrib-connect": "~0.7.1",
"grunt-coveralls": "0.3.0",
"grunt-karma": "0.8.0",
"karma-chrome-launcher": "0.1.3",
"karma-phantomjs-launcher": "0.1.4",
"karma-firefox-launcher": "0.1.3",
"karma-jasmine": "0.1.5",
"karma-coverage": "0.2.1",
"bower": "1.3.2",
"protractor": "~0.23.1",
"grunt-protractor-runner": "1.0.0",
"connect-livereload": "0.4.0",
"grunt-sauce-connect-launcher": "~0.3.0",
"grunt-contrib-uglify": "0.4.0",
"grunt-concurrent": "0.5.0",
"grunt-githooks": "^0.3.1",
"load-grunt-tasks": "^0.6.0"
},
"scripts": {
"test": "grunt karma:spec coveralls jshint && protractor protractor-config-travis.js",
"postinstall": "grunt githooks"
}
}
| {
"name": "arethusa",
"version": "0.0.1",
"dependencies": {},
"devDependencies": {
"grunt-cli": "0.1.13",
"grunt-contrib-jasmine": "0.6.3",
"grunt-contrib-watch": "0.6.1",
"grunt-contrib-jshint": "0.10.0",
"grunt-contrib-connect": "~0.7.1",
"grunt-coveralls": "0.3.0",
"grunt-karma": "0.8.0",
"karma-chrome-launcher": "0.1.3",
"karma-phantomjs-launcher": "0.1.4",
"karma-firefox-launcher": "0.1.3",
"karma-jasmine": "0.1.5",
"karma-coverage": "0.2.1",
"bower": "1.3.2",
"protractor": "~0.23.1",
"grunt-protractor-runner": "1.0.0",
"connect-livereload": "0.4.0",
"grunt-sauce-connect-launcher": "~0.3.0",
"grunt-contrib-uglify": "0.4.0",
"grunt-concurrent": "0.5.0",
"grunt-githooks": "^0.3.1",
"load-grunt-tasks": "^0.6.0"
},
"scripts": {
- "test": "grunt karma:spec coveralls jshint && protractor protractor-config-travis.js"
+ "test": "grunt karma:spec coveralls jshint && protractor protractor-config-travis.js",
? +
+ "postinstall": "grunt githooks"
}
} | 3 | 0.096774 | 2 | 1 |
6d3d57e6c5956dcbbbb11d31c850b4775875da04 | share/spice/github/github.css | share/spice/github/github.css | .zci--github .tile__update__info {
color: #999;
position: absolute;
bottom: 0.5em;
}
.zci--github .star.starIcon {
margin-top: -5px;
}
.zci--github .tile__title__sub {
text-transform: none;
}
.zci--github .tile__body {
height: 14em;
} | .zci--github .tile__update__info {
color: #999;
position: absolute;
bottom: 0.5em;
}
.zci--github .star.starIcon {
margin-top: -5px;
}
.zci--github .tile__title__sub {
text-transform: none;
color: #999;
}
.zci--github .tile__body {
height: 14em;
}
| Adjust the color of the subtitle. | GitHub: Adjust the color of the subtitle.
 | CSS | apache-2.0 | GrandpaCardigan/zeroclickinfo-spice,hshackathons/zeroclickinfo-spice,rubinovitz/zeroclickinfo-spice,deserted/zeroclickinfo-spice,whalenrp/zeroclickinfo-spice,iambibhas/zeroclickinfo-spice,TomBebbington/zeroclickinfo-spice,cylgom/zeroclickinfo-spice,ScreapDK/zeroclickinfo-spice,Kr1tya3/zeroclickinfo-spice,gautamkrishnar/zeroclickinfo-spice,brianrisk/zeroclickinfo-spice,evejweinberg/zeroclickinfo-spice,digit4lfa1l/zeroclickinfo-spice,alexandernext/zeroclickinfo-spice,rubinovitz/zeroclickinfo-spice,hshackathons/zeroclickinfo-spice,deserted/zeroclickinfo-spice,GrandpaCardigan/zeroclickinfo-spice,lernae/zeroclickinfo-spice,cylgom/zeroclickinfo-spice,iambibhas/zeroclickinfo-spice,samskeller/zeroclickinfo-spice,loganom/zeroclickinfo-spice,sagarhani/zeroclickinfo-spice,tagawa/zeroclickinfo-spice,dheeraj143/zeroclickinfo-spice,kevintab95/zeroclickinfo-spice,toenu23/zeroclickinfo-spice,bibliotechy/zeroclickinfo-spice,dachinzo/zeroclickinfo-spice,bdjnk/zeroclickinfo-spice,dogomedia/zeroclickinfo-spice,marianosimone/zeroclickinfo-spice,xaviervalarino/zeroclickinfo-spice,echosa/zeroclickinfo-spice,MoriTanosuke/zeroclickinfo-spice,evejweinberg/zeroclickinfo-spice,jyounker/zeroclickinfo-spice,imwally/zeroclickinfo-spice,TomBebbington/zeroclickinfo-spice,bigcurl/zeroclickinfo-spice,mayo/zeroclickinfo-spice,mr-karan/zeroclickinfo-spice,MoriTanosuke/zeroclickinfo-spice,marianosimone/zeroclickinfo-spice,AcriCAA/zeroclickinfo-spice,ScreapDK/zeroclickinfo-spice,mohan08p/zeroclickinfo-spice,kevintab95/zeroclickinfo-spice,rubinovitz/zeroclickinfo-spice,Kr1tya3/zeroclickinfo-spice,bibliotechy/zeroclickinfo-spice,jyounker/zeroclickinfo-spice,levaly/zeroclickinfo-spice,alexandernext/zeroclickinfo-spice,xaviervalarino/zeroclickinfo-spice,mr-karan/zeroclickinfo-spice,lw7360/zeroclickinfo-spice,lw7360/zeroclickinfo-spice,dachinzo/zeroclickinfo-spice,AcriCAA/zeroclickinfo-spice,levaly/zeroclickinfo-spice,GrandpaCardigan/zeroclickinfo-spice,mohan08p/zeroclickinfo-spice,claytonspinner/zeroclickinfo-spice,ColasBroux/zeroclickinfo-spice,imwally/zeroclickinfo-spice,sevki/zeroclickinfo-spice,Kakkoroid/zeroclickinfo-spice,levaly/zeroclickinfo-spice,claytonspinner/zeroclickinfo-spice,soleo/zeroclickinfo-spice,imwally/zeroclickinfo-spice,Queeniebee/zeroclickinfo-spice,dheeraj143/zeroclickinfo-spice,stevenmg/zeroclickinfo-spice,lw7360/zeroclickinfo-spice,soleo/zeroclickinfo-spice,Queeniebee/zeroclickinfo-spice,loganom/zeroclickinfo-spice,bdjnk/zeroclickinfo-spice,evejweinberg/zeroclickinfo-spice,sevki/zeroclickinfo-spice,Kr1tya3/zeroclickinfo-spice,cylgom/zeroclickinfo-spice,tagawa/zeroclickinfo-spice,mayo/zeroclickinfo-spice,levaly/zeroclickinfo-spice,echosa/zeroclickinfo-spice,Kakkoroid/zeroclickinfo-spice,TomBebbington/zeroclickinfo-spice,GrandpaCardigan/zeroclickinfo-spice,rmad17/zeroclickinfo-spice,navjotahuja92/zeroclickinfo-spice,sagarhani/zeroclickinfo-spice,evejweinberg/zeroclickinfo-spice,andrey-p/zeroclickinfo-spice,stennie/zeroclickinfo-spice,lernae/zeroclickinfo-spice,digit4lfa1l/zeroclickinfo-spice,Dwaligon/zeroclickinfo-spice,andrey-p/zeroclickinfo-spice,MoriTanosuke/zeroclickinfo-spice,ppant/zeroclickinfo-spice,hshackathons/zeroclickinfo-spice,tagawa/zeroclickinfo-spice,timeanddate/zeroclickinfo-spice,Queeniebee/zeroclickinfo-spice,samskeller/zeroclickinfo-spice,soleo/zeroclickinfo-spice,MoriTanosuke/zeroclickinfo-spice,ppant/zeroclickinfo-spice,rmad17/zeroclickinfo-spice,Faiz7412/zeroclickinfo-spice,imwally/zeroclickinfo-spice,marianosimone/zeroclickinfo-spice,toenu23/zeroclickinfo-spice,imwally/zeroclickinfo-spice,deserted/zeroclickinfo-spice,gautamkrishnar/zeroclickinfo-spice,Kakkoroid/zeroclickinfo-spice,kevintab95/zeroclickinfo-spice,P71/zeroclickinfo-spice,iambibhas/zeroclickinfo-spice,levaly/zeroclickinfo-spice,dheeraj143/zeroclickinfo-spice,stennie/zeroclickinfo-spice,cylgom/zeroclickinfo-spice,P71/zeroclickinfo-spice,claytonspinner/zeroclickinfo-spice,tagawa/zeroclickinfo-spice,navjotahuja92/zeroclickinfo-spice,andrey-p/zeroclickinfo-spice,bibliotechy/zeroclickinfo-spice,P71/zeroclickinfo-spice,lw7360/zeroclickinfo-spice,echosa/zeroclickinfo-spice,bibliotechy/zeroclickinfo-spice,P71/zeroclickinfo-spice,xaviervalarino/zeroclickinfo-spice,lerna/zeroclickinfo-spice,bigcurl/zeroclickinfo-spice,bigcurl/zeroclickinfo-spice,soleo/zeroclickinfo-spice,kevintab95/zeroclickinfo-spice,lernae/zeroclickinfo-spice,gautamkrishnar/zeroclickinfo-spice,ScreapDK/zeroclickinfo-spice,whalenrp/zeroclickinfo-spice,marianosimone/zeroclickinfo-spice,sevki/zeroclickinfo-spice,Retrobottega/zeroclickinfo-spice,ColasBroux/zeroclickinfo-spice,loganom/zeroclickinfo-spice,timeanddate/zeroclickinfo-spice,Kr1tya3/zeroclickinfo-spice,AcriCAA/zeroclickinfo-spice,mohan08p/zeroclickinfo-spice,Dwaligon/zeroclickinfo-spice,alexandernext/zeroclickinfo-spice,GrandpaCardigan/zeroclickinfo-spice,rmad17/zeroclickinfo-spice,navjotahuja92/zeroclickinfo-spice,sagarhani/zeroclickinfo-spice,evejweinberg/zeroclickinfo-spice,andrey-p/zeroclickinfo-spice,stennie/zeroclickinfo-spice,lernae/zeroclickinfo-spice,digit4lfa1l/zeroclickinfo-spice,Dwaligon/zeroclickinfo-spice,andrey-p/zeroclickinfo-spice,MoriTanosuke/zeroclickinfo-spice,ppant/zeroclickinfo-spice,hshackathons/zeroclickinfo-spice,tagawa/zeroclickinfo-spice,timeanddate/zeroclickinfo-spice,Queeniebee/zeroclickinfo-spice,samskeller/zeroclickinfo-spice,soleo/zeroclickinfo-spice,MoriTanosuke/zeroclickinfo-spice,Faiz7412/zeroclickinfo-spice,sagarhani/zeroclickinfo-spice,Retrobottega/zeroclickinfo-spice,Queeniebee/zeroclickinfo-spice,whalenrp/zeroclickinfo-spice,bdjnk/zeroclickinfo-spice,deserted/zeroclickinfo-spice,sagarhani/zeroclickinfo-spice,digit4lfa1l/zeroclickinfo-spice,brianrisk/zeroclickinfo-spice,shyamalschandra/zeroclickinfo-spice,shyamalschandra/zeroclickinfo-spice,digit4lfa1l/zeroclickinfo-spice,AcriCAA/zeroclickinfo-spice,iambibhas/zeroclickinfo-spice,tagawa/zeroclickinfo-spice,mr-karan/zeroclickinfo-spice,dogomedia/zeroclickinfo-spice,mayo/zeroclickinfo-spice,TomBebbington/zeroclickinfo-spice,shyamalschandra/zeroclickinfo-spice,bigcurl/zeroclickinfo-spice,Kakkoroid/zeroclickinfo-spice,mr-karan/zeroclickinfo-spice,toenu23/zeroclickinfo-spice,levaly/zeroclickinfo-spice,hshackathons/zeroclickinfo-spice,marianosimone/zeroclickinfo-spice,cylgom/zeroclickinfo-spice,kevintab95/zeroclickinfo-spice,shyamalschandra/zeroclickinfo-spice,ColasBroux/zeroclickinfo-spice,shyamalschandra/zeroclickinfo-spice,soleo/zeroclickinfo-spice,lernae/zeroclickinfo-spice,Dwaligon/zeroclickinfo-spice,jyounker/zeroclickinfo-spice,dheeraj143/zeroclickinfo-spice,Dwaligon/zeroclickinfo-spice,dogomedia/zeroclickinfo-spice,stennie/zeroclickinfo-spice,deserted/zeroclickinfo-spice,Retrobottega/zeroclickinfo-spice,navjotahuja92/zeroclickinfo-spice,brianrisk/zeroclickinfo-spice,AcriCAA/zeroclickinfo-spice,xaviervalarino/zeroclickinfo-spice,lerna/zeroclickinfo-spice,rubinovitz/zeroclickinfo-spice,navjotahuja92/zeroclickinfo-spice,andrey-p/zeroclickinfo-spice,deserted/zeroclickinfo-spice,sagarhani/zeroclickinfo-spice,ScreapDK/zeroclickinfo-spice,claytonspinner/zeroclickinfo-spice,dogomedia/zeroclickinfo-spice,dogomedia/zeroclickinfo-spice,rmad17/zeroclickinfo-spice,lerna/zeroclickinfo-spice,brianrisk/zeroclickinfo-spice,andrey-p/zeroclickinfo-spice,rmad17/zeroclickinfo-spice,rmad17/zeroclickinfo-spice,MoriTanosuke/zeroclickinfo-spice,whalenrp/zeroclickinfo-spice,mohan08p/zeroclickinfo-spice,iambibhas/zeroclickinfo-spice,Retrobottega/zeroclickinfo-spice,timeanddate/zeroclickinfo-spice,ppant/zeroclickinfo-spice,mr-karan/zeroclickinfo-spice,stevenmg/zeroclickinfo-spice,lerna/zeroclickinfo-spice,brianrisk/zeroclickinfo-spice | css | ## Code Before:
.zci--github .tile__update__info {
color: #999;
position: absolute;
bottom: 0.5em;
}
.zci--github .star.starIcon {
margin-top: -5px;
}
.zci--github .tile__title__sub {
text-transform: none;
}
.zci--github .tile__body {
height: 14em;
}
## Instruction:
GitHub: Adjust the color of the subtitle.
## Code After:
.zci--github .tile__update__info {
color: #999;
position: absolute;
bottom: 0.5em;
}
.zci--github .star.starIcon {
margin-top: -5px;
}
.zci--github .tile__title__sub {
text-transform: none;
color: #999;
}
.zci--github .tile__body {
height: 14em;
}
| .zci--github .tile__update__info {
color: #999;
position: absolute;
bottom: 0.5em;
}
.zci--github .star.starIcon {
margin-top: -5px;
}
.zci--github .tile__title__sub {
text-transform: none;
+ color: #999;
}
.zci--github .tile__body {
height: 14em;
} | 1 | 0.058824 | 1 | 0 |
01cbba4bdb09711e65fa99bd182e12eedbe53d13 | README.md | README.md |
Spigot plugin for 1.8+ allowing the claiming of chunks.
*(Should work with Spigot 1.11+ with titles enabled, if they're disabled, should work back a few versions)*
Page on SpigotMC can be found [HERE](https://www.spigotmc.org/resources/claimchunk.44458/).
Usage and more information can be found [HERE](https://github.com/cjburkey01/ClaimChunk/wiki), on the Wiki.
Current version: **0.0.5** for Minecraft **1.12**.
Optional:
* [Vault](https://www.spigotmc.org/resources/vault.41918/).
* [Dynmap](https://www.spigotmc.org/resources/dynmap.274/) (Not implemented, yet). |
Spigot plugin for 1.8+ allowing the claiming of chunks.
*(Should work with Spigot 1.8+ with titles enabled. If they're disabled, may work back to 1.6.4)*
Page on SpigotMC can be found [HERE](https://www.spigotmc.org/resources/claimchunk.44458/).
Usage and more information can be found [HERE](https://github.com/cjburkey01/ClaimChunk/wiki), on the Wiki.
Current version: **0.0.5** for Minecraft **1.12**.
Optional:
* [Vault](https://www.spigotmc.org/resources/vault.41918/).
* [Dynmap](https://www.spigotmc.org/resources/dynmap.274/) (Not implemented, yet). | Fix the readme a bit | Fix the readme a bit | Markdown | mit | cjburkey01/ClaimChunk,cjburkey01/ClaimChunk | markdown | ## Code Before:
Spigot plugin for 1.8+ allowing the claiming of chunks.
*(Should work with Spigot 1.11+ with titles enabled, if they're disabled, should work back a few versions)*
Page on SpigotMC can be found [HERE](https://www.spigotmc.org/resources/claimchunk.44458/).
Usage and more information can be found [HERE](https://github.com/cjburkey01/ClaimChunk/wiki), on the Wiki.
Current version: **0.0.5** for Minecraft **1.12**.
Optional:
* [Vault](https://www.spigotmc.org/resources/vault.41918/).
* [Dynmap](https://www.spigotmc.org/resources/dynmap.274/) (Not implemented, yet).
## Instruction:
Fix the readme a bit
## Code After:
Spigot plugin for 1.8+ allowing the claiming of chunks.
*(Should work with Spigot 1.8+ with titles enabled. If they're disabled, may work back to 1.6.4)*
Page on SpigotMC can be found [HERE](https://www.spigotmc.org/resources/claimchunk.44458/).
Usage and more information can be found [HERE](https://github.com/cjburkey01/ClaimChunk/wiki), on the Wiki.
Current version: **0.0.5** for Minecraft **1.12**.
Optional:
* [Vault](https://www.spigotmc.org/resources/vault.41918/).
* [Dynmap](https://www.spigotmc.org/resources/dynmap.274/) (Not implemented, yet). |
Spigot plugin for 1.8+ allowing the claiming of chunks.
- *(Should work with Spigot 1.11+ with titles enabled, if they're disabled, should work back a few versions)*
? ^^ ^ ^ ^^^^^^ ^ ^^^^^^^^^^^^
+ *(Should work with Spigot 1.8+ with titles enabled. If they're disabled, may work back to 1.6.4)*
? ^ ^ ^ ^^^ ^^ ^^^^^
Page on SpigotMC can be found [HERE](https://www.spigotmc.org/resources/claimchunk.44458/).
Usage and more information can be found [HERE](https://github.com/cjburkey01/ClaimChunk/wiki), on the Wiki.
Current version: **0.0.5** for Minecraft **1.12**.
Optional:
* [Vault](https://www.spigotmc.org/resources/vault.41918/).
* [Dynmap](https://www.spigotmc.org/resources/dynmap.274/) (Not implemented, yet). | 2 | 0.142857 | 1 | 1 |
e37c7cace441e837120b820936c6f4ae8de78996 | sts/controller_manager.py | sts/controller_manager.py | from sts.util.console import msg
class ControllerManager(object):
''' Encapsulate a list of controllers objects '''
def __init__(self, controllers):
self.uuid2controller = {
controller.uuid : controller
for controller in controllers
}
@property
def controllers(self):
return self.uuid2controller.values()
@property
def live_controllers(self):
alive = [controller for controller in self.controllers if controller.alive]
return set(alive)
@property
def down_controllers(self):
down = [controller for controller in self.controllers if not controller.alive]
return set(down)
def get_controller(self, uuid):
if uuid not in self.uuid2controller:
raise ValueError("unknown uuid %s" % str(uuid))
return self.uuid2controller[uuid]
def kill_all(self):
for c in self.live_controllers:
c.kill()
self.uuid2controller = {}
@staticmethod
def kill_controller(controller):
msg.event("Killing controller %s" % str(controller))
controller.kill()
@staticmethod
def reboot_controller(controller):
msg.event("Restarting controller %s" % str(controller))
controller.start()
def check_controller_processes_alive(self):
controllers_with_problems = []
for c in self.live_controllers:
(rc, msg) = c.check_process_status()
if not rc:
c.alive = False
controllers_with_problems.append ( (c, msg) )
return controllers_with_problems
| from sts.util.console import msg
class ControllerManager(object):
''' Encapsulate a list of controllers objects '''
def __init__(self, controllers):
self.uuid2controller = {
controller.uuid : controller
for controller in controllers
}
@property
def controllers(self):
cs = self.uuid2controller.values()
cs.sort(key=lambda c: c.uuid)
return cs
@property
def live_controllers(self):
alive = [controller for controller in self.controllers if controller.alive]
return set(alive)
@property
def down_controllers(self):
down = [controller for controller in self.controllers if not controller.alive]
return set(down)
def get_controller(self, uuid):
if uuid not in self.uuid2controller:
raise ValueError("unknown uuid %s" % str(uuid))
return self.uuid2controller[uuid]
def kill_all(self):
for c in self.live_controllers:
c.kill()
self.uuid2controller = {}
@staticmethod
def kill_controller(controller):
msg.event("Killing controller %s" % str(controller))
controller.kill()
@staticmethod
def reboot_controller(controller):
msg.event("Restarting controller %s" % str(controller))
controller.start()
def check_controller_processes_alive(self):
controllers_with_problems = []
live = list(self.live_controllers)
live.sort(key=lambda c: c.uuid)
for c in live:
(rc, msg) = c.check_process_status()
if not rc:
c.alive = False
controllers_with_problems.append ( (c, msg) )
return controllers_with_problems
| Make .contollers() deterministic (was using hash.values()) | Make .contollers() deterministic (was using hash.values())
| Python | apache-2.0 | ucb-sts/sts,jmiserez/sts,ucb-sts/sts,jmiserez/sts | python | ## Code Before:
from sts.util.console import msg
class ControllerManager(object):
''' Encapsulate a list of controllers objects '''
def __init__(self, controllers):
self.uuid2controller = {
controller.uuid : controller
for controller in controllers
}
@property
def controllers(self):
return self.uuid2controller.values()
@property
def live_controllers(self):
alive = [controller for controller in self.controllers if controller.alive]
return set(alive)
@property
def down_controllers(self):
down = [controller for controller in self.controllers if not controller.alive]
return set(down)
def get_controller(self, uuid):
if uuid not in self.uuid2controller:
raise ValueError("unknown uuid %s" % str(uuid))
return self.uuid2controller[uuid]
def kill_all(self):
for c in self.live_controllers:
c.kill()
self.uuid2controller = {}
@staticmethod
def kill_controller(controller):
msg.event("Killing controller %s" % str(controller))
controller.kill()
@staticmethod
def reboot_controller(controller):
msg.event("Restarting controller %s" % str(controller))
controller.start()
def check_controller_processes_alive(self):
controllers_with_problems = []
for c in self.live_controllers:
(rc, msg) = c.check_process_status()
if not rc:
c.alive = False
controllers_with_problems.append ( (c, msg) )
return controllers_with_problems
## Instruction:
Make .contollers() deterministic (was using hash.values())
## Code After:
from sts.util.console import msg
class ControllerManager(object):
''' Encapsulate a list of controllers objects '''
def __init__(self, controllers):
self.uuid2controller = {
controller.uuid : controller
for controller in controllers
}
@property
def controllers(self):
cs = self.uuid2controller.values()
cs.sort(key=lambda c: c.uuid)
return cs
@property
def live_controllers(self):
alive = [controller for controller in self.controllers if controller.alive]
return set(alive)
@property
def down_controllers(self):
down = [controller for controller in self.controllers if not controller.alive]
return set(down)
def get_controller(self, uuid):
if uuid not in self.uuid2controller:
raise ValueError("unknown uuid %s" % str(uuid))
return self.uuid2controller[uuid]
def kill_all(self):
for c in self.live_controllers:
c.kill()
self.uuid2controller = {}
@staticmethod
def kill_controller(controller):
msg.event("Killing controller %s" % str(controller))
controller.kill()
@staticmethod
def reboot_controller(controller):
msg.event("Restarting controller %s" % str(controller))
controller.start()
def check_controller_processes_alive(self):
controllers_with_problems = []
live = list(self.live_controllers)
live.sort(key=lambda c: c.uuid)
for c in live:
(rc, msg) = c.check_process_status()
if not rc:
c.alive = False
controllers_with_problems.append ( (c, msg) )
return controllers_with_problems
| from sts.util.console import msg
class ControllerManager(object):
''' Encapsulate a list of controllers objects '''
def __init__(self, controllers):
self.uuid2controller = {
controller.uuid : controller
for controller in controllers
}
@property
def controllers(self):
- return self.uuid2controller.values()
? ^^^^^^
+ cs = self.uuid2controller.values()
? ^^^^^
+ cs.sort(key=lambda c: c.uuid)
+ return cs
@property
def live_controllers(self):
alive = [controller for controller in self.controllers if controller.alive]
return set(alive)
@property
def down_controllers(self):
down = [controller for controller in self.controllers if not controller.alive]
return set(down)
def get_controller(self, uuid):
if uuid not in self.uuid2controller:
raise ValueError("unknown uuid %s" % str(uuid))
return self.uuid2controller[uuid]
def kill_all(self):
for c in self.live_controllers:
c.kill()
self.uuid2controller = {}
@staticmethod
def kill_controller(controller):
msg.event("Killing controller %s" % str(controller))
controller.kill()
@staticmethod
def reboot_controller(controller):
msg.event("Restarting controller %s" % str(controller))
controller.start()
def check_controller_processes_alive(self):
controllers_with_problems = []
- for c in self.live_controllers:
? ^^^ ^ ^^ ^
+ live = list(self.live_controllers)
? ^^^^ ^ + ^^^ ^
+ live.sort(key=lambda c: c.uuid)
+ for c in live:
(rc, msg) = c.check_process_status()
if not rc:
c.alive = False
controllers_with_problems.append ( (c, msg) )
return controllers_with_problems | 8 | 0.153846 | 6 | 2 |
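
The determinism fix in the row above hinges on one Python behavior: `dict.values()` yields items in hash/insertion order, not a sorted order, so any tool that needs a reproducible traversal must sort by a stable key. A minimal standalone sketch of that pattern (hypothetical `Controller` stand-in, not code from the sts repository):

```python
class Controller:
    """Hypothetical stand-in for sts's controller objects."""
    def __init__(self, uuid):
        self.uuid = uuid

# Dict order is not a sorted order (and on the Python 2 of this commit it was
# effectively arbitrary hash order), so a reproducible traversal sorts first.
uuid2controller = {u: Controller(u) for u in ("c3", "c1", "c2")}

controllers = list(uuid2controller.values())
controllers.sort(key=lambda c: c.uuid)

print([c.uuid for c in controllers])  # deterministic: ['c1', 'c2', 'c3']
```

The same `key=lambda c: c.uuid` idiom appears twice in the diff, once for `controllers` and once for the `live` snapshot, so every run visits controllers in the same order.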
b59eeaf2d1365e2c42ea1bdb9521a5e0302469b8 | src/cli/main.c | src/cli/main.c |
int main(int argc, const char* argv[])
{
if (argc == 2 && strcmp(argv[1], "--help") == 0)
{
printf("Usage: wren [file] [arguments...]\n");
printf(" --help Show command line usage\n");
return 0;
}
osSetArguments(argc, argv);
if (argc == 1)
{
runRepl();
}
else
{
runFile(argv[1]);
}
return 0;
}
|
int main(int argc, const char* argv[])
{
if (argc == 2 && strcmp(argv[1], "--help") == 0)
{
printf("Usage: wren [file] [arguments...]\n");
printf(" --help Show command line usage\n");
return 0;
}
if (argc == 2 && strcmp(argv[1], "--version") == 0)
{
printf("wren %s\n", WREN_VERSION_STRING);
return 0;
}
osSetArguments(argc, argv);
if (argc == 1)
{
runRepl();
}
else
{
runFile(argv[1]);
}
return 0;
}
| Support "--version" in the CLI to print the version. | Support "--version" in the CLI to print the version.
| C | mit | Nelarius/wren,minirop/wren,minirop/wren,munificent/wren,foresterre/wren,minirop/wren,foresterre/wren,munificent/wren,minirop/wren,foresterre/wren,munificent/wren,Nelarius/wren,munificent/wren,minirop/wren,munificent/wren,foresterre/wren,Nelarius/wren,munificent/wren,Nelarius/wren,Nelarius/wren,foresterre/wren | c | ## Code Before:
int main(int argc, const char* argv[])
{
if (argc == 2 && strcmp(argv[1], "--help") == 0)
{
printf("Usage: wren [file] [arguments...]\n");
printf(" --help Show command line usage\n");
return 0;
}
osSetArguments(argc, argv);
if (argc == 1)
{
runRepl();
}
else
{
runFile(argv[1]);
}
return 0;
}
## Instruction:
Support "--version" in the CLI to print the version.
## Code After:
int main(int argc, const char* argv[])
{
if (argc == 2 && strcmp(argv[1], "--help") == 0)
{
printf("Usage: wren [file] [arguments...]\n");
printf(" --help Show command line usage\n");
return 0;
}
if (argc == 2 && strcmp(argv[1], "--version") == 0)
{
printf("wren %s\n", WREN_VERSION_STRING);
return 0;
}
osSetArguments(argc, argv);
if (argc == 1)
{
runRepl();
}
else
{
runFile(argv[1]);
}
return 0;
}
|
int main(int argc, const char* argv[])
{
if (argc == 2 && strcmp(argv[1], "--help") == 0)
{
printf("Usage: wren [file] [arguments...]\n");
printf(" --help Show command line usage\n");
+ return 0;
+ }
+
+ if (argc == 2 && strcmp(argv[1], "--version") == 0)
+ {
+ printf("wren %s\n", WREN_VERSION_STRING);
return 0;
}
osSetArguments(argc, argv);
if (argc == 1)
{
runRepl();
}
else
{
runFile(argv[1]);
}
return 0;
} | 6 | 0.26087 | 6 | 0 |
8923137c551fd8ab14798e0fe35ff2018f6b966f | app/constants/Images.js | app/constants/Images.js | /*
* Copyright 2017-present, Hippothesis, Inc.
* All rights reserved.
*
* This source code is licensed under the BSD-style license found in the
* LICENSE file in the root directory of this source tree.
*
* @flow
*/
'use strict';
export const Images = {
logo: null
};
export default Images;
| /*
* Copyright 2017-present, Hippothesis, Inc.
* All rights reserved.
*
* This source code is licensed under the BSD-style license found in the
* LICENSE file in the root directory of this source tree.
*
* @flow
*/
'use strict';
export const Images = {
icons: {
home: require('../images/home-icon.png'),
search: require('../images/search-icon.png'),
profile: require('../images/profile-icon.png')
}
};
export default Images;
| Add file paths to icons | Add file paths to icons
| JavaScript | bsd-3-clause | hippothesis/Recipezy,hippothesis/Recipezy,hippothesis/Recipezy | javascript | ## Code Before:
/*
* Copyright 2017-present, Hippothesis, Inc.
* All rights reserved.
*
* This source code is licensed under the BSD-style license found in the
* LICENSE file in the root directory of this source tree.
*
* @flow
*/
'use strict';
export const Images = {
logo: null
};
export default Images;
## Instruction:
Add file paths to icons
## Code After:
/*
* Copyright 2017-present, Hippothesis, Inc.
* All rights reserved.
*
* This source code is licensed under the BSD-style license found in the
* LICENSE file in the root directory of this source tree.
*
* @flow
*/
'use strict';
export const Images = {
icons: {
home: require('../images/home-icon.png'),
search: require('../images/search-icon.png'),
profile: require('../images/profile-icon.png')
}
};
export default Images;
| /*
* Copyright 2017-present, Hippothesis, Inc.
* All rights reserved.
*
* This source code is licensed under the BSD-style license found in the
* LICENSE file in the root directory of this source tree.
*
* @flow
*/
'use strict';
export const Images = {
- logo: null
+ icons: {
+ home: require('../images/home-icon.png'),
+ search: require('../images/search-icon.png'),
+ profile: require('../images/profile-icon.png')
+ }
};
export default Images;
| 6 | 0.333333 | 5 | 1 |
7cb68f6a749a38fef9820004a857e301c12a3044 | morenines/util.py | morenines/util.py | import os
import hashlib
from fnmatch import fnmatchcase
def get_files(index):
paths = []
for dirpath, dirnames, filenames in os.walk(index.headers['root_path']):
# Remove ignored directories
dirnames = [d for d in dirnames if not index.ignores.match(d)]
for filename in (f for f in filenames if not index.ignores.match(f)):
# We want the path of the file, not its name
path = os.path.join(dirpath, filename)
# That path must be relative to the root, not absolute
path = os.path.relpath(path, index.headers['root_path'])
paths.append(path)
return paths
def get_ignores(ignores_path):
with open(ignores_path, 'r') as ignores_file:
ignores = [line.strip() for line in ignores_file]
return ignores
def get_hash(path):
h = hashlib.sha1()
with open(path, 'rb') as f:
h.update(f.read())
return h.hexdigest()
def get_new_and_missing(index):
current_files = get_files(index)
new_files = [path for path in current_files if path not in index.files]
missing_files = [path for path in index.files.iterkeys() if path not in current_files]
return new_files, missing_files
| import os
import hashlib
def get_files(index):
paths = []
for dirpath, dirnames, filenames in os.walk(index.headers['root_path']):
# Remove ignored directories
dirnames[:] = [d for d in dirnames if not index.ignores.match(d)]
for filename in (f for f in filenames if not index.ignores.match(f)):
# We want the path of the file, not its name
path = os.path.join(dirpath, filename)
# That path must be relative to the root, not absolute
path = os.path.relpath(path, index.headers['root_path'])
paths.append(path)
return paths
def get_ignores(ignores_path):
with open(ignores_path, 'r') as ignores_file:
ignores = [line.strip() for line in ignores_file]
return ignores
def get_hash(path):
h = hashlib.sha1()
with open(path, 'rb') as f:
h.update(f.read())
return h.hexdigest()
def get_new_and_missing(index):
current_files = get_files(index)
new_files = [path for path in current_files if path not in index.files]
missing_files = [path for path in index.files.iterkeys() if path not in current_files]
return new_files, missing_files
| Fix in-place os.walk() dirs modification | Fix in-place os.walk() dirs modification
| Python | mit | mcgid/morenines,mcgid/morenines | python | ## Code Before:
import os
import hashlib
from fnmatch import fnmatchcase
def get_files(index):
paths = []
for dirpath, dirnames, filenames in os.walk(index.headers['root_path']):
# Remove ignored directories
dirnames = [d for d in dirnames if not index.ignores.match(d)]
for filename in (f for f in filenames if not index.ignores.match(f)):
# We want the path of the file, not its name
path = os.path.join(dirpath, filename)
# That path must be relative to the root, not absolute
path = os.path.relpath(path, index.headers['root_path'])
paths.append(path)
return paths
def get_ignores(ignores_path):
with open(ignores_path, 'r') as ignores_file:
ignores = [line.strip() for line in ignores_file]
return ignores
def get_hash(path):
h = hashlib.sha1()
with open(path, 'rb') as f:
h.update(f.read())
return h.hexdigest()
def get_new_and_missing(index):
current_files = get_files(index)
new_files = [path for path in current_files if path not in index.files]
missing_files = [path for path in index.files.iterkeys() if path not in current_files]
return new_files, missing_files
## Instruction:
Fix in-place os.walk() dirs modification
## Code After:
import os
import hashlib
def get_files(index):
paths = []
for dirpath, dirnames, filenames in os.walk(index.headers['root_path']):
# Remove ignored directories
dirnames[:] = [d for d in dirnames if not index.ignores.match(d)]
for filename in (f for f in filenames if not index.ignores.match(f)):
# We want the path of the file, not its name
path = os.path.join(dirpath, filename)
# That path must be relative to the root, not absolute
path = os.path.relpath(path, index.headers['root_path'])
paths.append(path)
return paths
def get_ignores(ignores_path):
with open(ignores_path, 'r') as ignores_file:
ignores = [line.strip() for line in ignores_file]
return ignores
def get_hash(path):
h = hashlib.sha1()
with open(path, 'rb') as f:
h.update(f.read())
return h.hexdigest()
def get_new_and_missing(index):
current_files = get_files(index)
new_files = [path for path in current_files if path not in index.files]
missing_files = [path for path in index.files.iterkeys() if path not in current_files]
return new_files, missing_files
| import os
import hashlib
-
- from fnmatch import fnmatchcase
def get_files(index):
paths = []
for dirpath, dirnames, filenames in os.walk(index.headers['root_path']):
# Remove ignored directories
- dirnames = [d for d in dirnames if not index.ignores.match(d)]
+ dirnames[:] = [d for d in dirnames if not index.ignores.match(d)]
? +++
for filename in (f for f in filenames if not index.ignores.match(f)):
# We want the path of the file, not its name
path = os.path.join(dirpath, filename)
# That path must be relative to the root, not absolute
path = os.path.relpath(path, index.headers['root_path'])
paths.append(path)
return paths
def get_ignores(ignores_path):
with open(ignores_path, 'r') as ignores_file:
ignores = [line.strip() for line in ignores_file]
return ignores
def get_hash(path):
h = hashlib.sha1()
with open(path, 'rb') as f:
h.update(f.read())
return h.hexdigest()
def get_new_and_missing(index):
current_files = get_files(index)
new_files = [path for path in current_files if path not in index.files]
missing_files = [path for path in index.files.iterkeys() if path not in current_files]
return new_files, missing_files | 4 | 0.081633 | 1 | 3 |
da6bffed7086e935346c5be2fb9748f67d491386 | app/views/key_pairs/_form.html.erb | app/views/key_pairs/_form.html.erb | <%= form_for(@key_pair) do |f| %>
<% if @key_pair.errors.any? %>
<div id="error_explanation">
<h2><%= pluralize(@key_pair.errors.count, "error") %> prohibited this key_pair from being saved:</h2>
<ul>
<% @key_pair.errors.full_messages.each do |msg| %>
<li><%= msg %></li>
<% end %>
</ul>
</div>
<% end %>
<div class="field">
<%= f.label t('key_pairs.file') %><br />
<%= f.file_field :file %>
<p><strong><%= t('key_pairs.or') %></strong></p>
<%= f.label t('key_pairs.copy_paste_key') %>
<br>
<textarea cols="55" rows="14" id="key_pair_key_string" name="key_pair[key_string]"></textarea>
</div>
<div class="actions">
<%= f.submit(t('key_pairs.create_key_pair')) %>
</div>
<% end %>
| <%= form_for(@key_pair) do |f| %>
<% if flash[:error] %>
<div class="error"><%= flash[:error] %></div>
<% end %>
<% if @key_pair.errors.any? %>
<div id="error_explanation">
<h2><%= pluralize(@key_pair.errors.count, "error") %> prohibited this key_pair from being saved:</h2>
<ul>
<% @key_pair.errors.full_messages.each do |msg| %>
<li><%= msg %></li>
<% end %>
</ul>
</div>
<% end %>
<div class="field">
<%= f.label t('key_pairs.file') %><br />
<%= f.file_field :file %>
<p><strong><%= t('key_pairs.or') %></strong></p>
<%= f.label t('key_pairs.copy_paste_key') %>
<br>
<textarea cols="55" rows="14" id="key_pair_key_string" name="key_pair[key_string]"></textarea>
</div>
<div class="actions">
<%= f.submit(t('key_pairs.create_key_pair')) %>
</div>
<% end %>
| Allow new key_pair form to display flash messages | Allow new key_pair form to display flash messages
| HTML+ERB | mit | ealonas/Markus,arkon/Markus,arkon/Markus,ealonas/Markus,ealonas/Markus,arkon/Markus,ealonas/Markus,reidka/Markus,ealonas/Markus,ealonas/Markus,benjaminvialle/Markus,reidka/Markus,benjaminvialle/Markus,MarkUsProject/Markus,benjaminvialle/Markus,ealonas/Markus,benjaminvialle/Markus,MarkUsProject/Markus,arkon/Markus,MarkUsProject/Markus,arkon/Markus,reidka/Markus,MarkUsProject/Markus,MarkUsProject/Markus,benjaminvialle/Markus,arkon/Markus,benjaminvialle/Markus,reidka/Markus,MarkUsProject/Markus,ealonas/Markus,reidka/Markus,arkon/Markus,arkon/Markus,reidka/Markus,arkon/Markus,MarkUsProject/Markus,ealonas/Markus,reidka/Markus,reidka/Markus,reidka/Markus,MarkUsProject/Markus,benjaminvialle/Markus | html+erb | ## Code Before:
<%= form_for(@key_pair) do |f| %>
<% if @key_pair.errors.any? %>
<div id="error_explanation">
<h2><%= pluralize(@key_pair.errors.count, "error") %> prohibited this key_pair from being saved:</h2>
<ul>
<% @key_pair.errors.full_messages.each do |msg| %>
<li><%= msg %></li>
<% end %>
</ul>
</div>
<% end %>
<div class="field">
<%= f.label t('key_pairs.file') %><br />
<%= f.file_field :file %>
<p><strong><%= t('key_pairs.or') %></strong></p>
<%= f.label t('key_pairs.copy_paste_key') %>
<br>
<textarea cols="55" rows="14" id="key_pair_key_string" name="key_pair[key_string]"></textarea>
</div>
<div class="actions">
<%= f.submit(t('key_pairs.create_key_pair')) %>
</div>
<% end %>
## Instruction:
Allow new key_pair form to display flash messages
## Code After:
<%= form_for(@key_pair) do |f| %>
<% if flash[:error] %>
<div class="error"><%= flash[:error] %></div>
<% end %>
<% if @key_pair.errors.any? %>
<div id="error_explanation">
<h2><%= pluralize(@key_pair.errors.count, "error") %> prohibited this key_pair from being saved:</h2>
<ul>
<% @key_pair.errors.full_messages.each do |msg| %>
<li><%= msg %></li>
<% end %>
</ul>
</div>
<% end %>
<div class="field">
<%= f.label t('key_pairs.file') %><br />
<%= f.file_field :file %>
<p><strong><%= t('key_pairs.or') %></strong></p>
<%= f.label t('key_pairs.copy_paste_key') %>
<br>
<textarea cols="55" rows="14" id="key_pair_key_string" name="key_pair[key_string]"></textarea>
</div>
<div class="actions">
<%= f.submit(t('key_pairs.create_key_pair')) %>
</div>
<% end %>
| <%= form_for(@key_pair) do |f| %>
+ <% if flash[:error] %>
+ <div class="error"><%= flash[:error] %></div>
+ <% end %>
<% if @key_pair.errors.any? %>
<div id="error_explanation">
<h2><%= pluralize(@key_pair.errors.count, "error") %> prohibited this key_pair from being saved:</h2>
<ul>
<% @key_pair.errors.full_messages.each do |msg| %>
<li><%= msg %></li>
<% end %>
</ul>
</div>
<% end %>
<div class="field">
<%= f.label t('key_pairs.file') %><br />
<%= f.file_field :file %>
<p><strong><%= t('key_pairs.or') %></strong></p>
<%= f.label t('key_pairs.copy_paste_key') %>
<br>
<textarea cols="55" rows="14" id="key_pair_key_string" name="key_pair[key_string]"></textarea>
</div>
<div class="actions">
<%= f.submit(t('key_pairs.create_key_pair')) %>
</div>
<% end %> | 3 | 0.115385 | 3 | 0 |
a9cfed00cf57d8de6fc9ae88e547cbbbada24a6a | app/src/main/java/com/example/android/miwok/NumbersActivity.java | app/src/main/java/com/example/android/miwok/NumbersActivity.java | package com.example.android.miwok;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.support.v7.widget.Toolbar;
public class NumbersActivity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_numbers);
Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
/**
* Create a String array with 10 indices containing the words for the numbers 1-10
*/
String words[] = new String[10];
words[0] = "one";
words[1] = "two";
words[2] = "three";
words[3] = "four";
words[4] = "five";
words[5] = "six";
words[6] = "seven";
words[7] = "eight";
words[8] = "nine";
words[9] = "ten";
}
}
| package com.example.android.miwok;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.support.v7.widget.Toolbar;
import android.util.Log;
public class NumbersActivity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_numbers);
Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
//Create a String array with 10 indices containing the words for the numbers 1-10
String words[] = new String[10];
words[0] = "one";
words[1] = "two";
words[2] = "three";
words[3] = "four";
words[4] = "five";
words[5] = "six";
words[6] = "seven";
words[7] = "eight";
words[8] = "nine";
words[9] = "ten";
// Log messages to check that the array has been initialised correctly
Log.v("NumbersActivity", "Word at index 0: " + words[0]);
Log.v("NumbersActivity", "Word at index 1: " + words[1]);
Log.v("NumbersActivity", "Word at index 1: " + words[2]);
Log.v("NumbersActivity", "Word at index 1: " + words[3]);
Log.v("NumbersActivity", "Word at index 1: " + words[4]);
Log.v("NumbersActivity", "Word at index 1: " + words[5]);
Log.v("NumbersActivity", "Word at index 1: " + words[6]);
Log.v("NumbersActivity", "Word at index 1: " + words[7]);
Log.v("NumbersActivity", "Word at index 1: " + words[8]);
Log.v("NumbersActivity", "Word at index 1: " + words[9]);
}
}
| Create verbose log to check that the words array is working | Create verbose log to check that the words array is working
| Java | apache-2.0 | MuirDH/Miwok | java | ## Code Before:
package com.example.android.miwok;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.support.v7.widget.Toolbar;
public class NumbersActivity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_numbers);
Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
/**
* Create a String array with 10 indices containing the words for the numbers 1-10
*/
String words[] = new String[10];
words[0] = "one";
words[1] = "two";
words[2] = "three";
words[3] = "four";
words[4] = "five";
words[5] = "six";
words[6] = "seven";
words[7] = "eight";
words[8] = "nine";
words[9] = "ten";
}
}
## Instruction:
Create verbose log to check that the words array is working
## Code After:
package com.example.android.miwok;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.support.v7.widget.Toolbar;
import android.util.Log;
public class NumbersActivity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_numbers);
Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
//Create a String array with 10 indices containing the words for the numbers 1-10
String words[] = new String[10];
words[0] = "one";
words[1] = "two";
words[2] = "three";
words[3] = "four";
words[4] = "five";
words[5] = "six";
words[6] = "seven";
words[7] = "eight";
words[8] = "nine";
words[9] = "ten";
// Log messages to check that the array has been initialised correctly
Log.v("NumbersActivity", "Word at index 0: " + words[0]);
Log.v("NumbersActivity", "Word at index 1: " + words[1]);
Log.v("NumbersActivity", "Word at index 1: " + words[2]);
Log.v("NumbersActivity", "Word at index 1: " + words[3]);
Log.v("NumbersActivity", "Word at index 1: " + words[4]);
Log.v("NumbersActivity", "Word at index 1: " + words[5]);
Log.v("NumbersActivity", "Word at index 1: " + words[6]);
Log.v("NumbersActivity", "Word at index 1: " + words[7]);
Log.v("NumbersActivity", "Word at index 1: " + words[8]);
Log.v("NumbersActivity", "Word at index 1: " + words[9]);
}
}
| package com.example.android.miwok;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.support.v7.widget.Toolbar;
+ import android.util.Log;
public class NumbersActivity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_numbers);
Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
- /**
- * Create a String array with 10 indices containing the words for the numbers 1-10
? ^^^
+ //Create a String array with 10 indices containing the words for the numbers 1-10
? ^^
- */
String words[] = new String[10];
words[0] = "one";
words[1] = "two";
words[2] = "three";
words[3] = "four";
words[4] = "five";
words[5] = "six";
words[6] = "seven";
words[7] = "eight";
words[8] = "nine";
words[9] = "ten";
+ // Log messages to check that the array has been initialised correctly
+ Log.v("NumbersActivity", "Word at index 0: " + words[0]);
+ Log.v("NumbersActivity", "Word at index 1: " + words[1]);
+ Log.v("NumbersActivity", "Word at index 1: " + words[2]);
+ Log.v("NumbersActivity", "Word at index 1: " + words[3]);
+ Log.v("NumbersActivity", "Word at index 1: " + words[4]);
+ Log.v("NumbersActivity", "Word at index 1: " + words[5]);
+ Log.v("NumbersActivity", "Word at index 1: " + words[6]);
+ Log.v("NumbersActivity", "Word at index 1: " + words[7]);
+ Log.v("NumbersActivity", "Word at index 1: " + words[8]);
+ Log.v("NumbersActivity", "Word at index 1: " + words[9]);
+
}
} | 17 | 0.515152 | 14 | 3 |
eab40f8c29a81300a2793b3718b77b6709eba316 | lib/sass/tree/mixin_node.rb | lib/sass/tree/mixin_node.rb | require 'sass/tree/node'
module Sass::Tree
class MixinNode < Node
def initialize(name, args, options)
@name = name
@args = args
super(options)
end
protected
def _perform(environment)
raise Sass::SyntaxError.new("Undefined mixin '#{@name}'.", @line) unless mixin = environment.mixin(@name)
raise Sass::SyntaxError.new(<<END.gsub("\n", "")) if mixin.args.size < @args.size
Mixin #{@name} takes #{mixin.args.size} argument#{'s' if mixin.args.size != 1}
but #{@args.size} #{@args.size == 1 ? 'was' : 'were'} passed.
END
environment = mixin.args.zip(@args).
inject(Sass::Environment.new(mixin.environment)) do |env, ((name, default), value)|
env.set_local_var(name,
if value
value.perform(environment)
elsif default
default.perform(env)
end)
raise Sass::SyntaxError.new("Mixin #{@name} is missing parameter !#{name}.") unless env.var(name)
env
end
mixin.tree.map {|c| c.perform(environment)}.flatten
end
end
end
| require 'sass/tree/node'
module Sass::Tree
# A dynamic node representing a mixin include.
#
# @see Sass::Tree
class MixinNode < Node
# @param name [String] The name of the mixin
# @param args [Array<Script::Node>] The arguments to the mixin
# @param options [Hash<Symbol, Object>] An options hash;
# see [the Sass options documentation](../../Sass.html#sass_options)
def initialize(name, args, options)
@name = name
@args = args
super(options)
end
protected
# Runs the mixin.
#
# @param environment [Sass::Environment] The lexical environment containing
# variable and mixin values
# @return [Array<Tree::Node>] The resulting static nodes
# @raise [Sass::SyntaxError] if there is no mixin with the given name
# @raise [Sass::SyntaxError] if an incorrect number of arguments was passed
# @see Sass::Tree
def _perform(environment)
raise Sass::SyntaxError.new("Undefined mixin '#{@name}'.", @line) unless mixin = environment.mixin(@name)
raise Sass::SyntaxError.new(<<END.gsub("\n", "")) if mixin.args.size < @args.size
Mixin #{@name} takes #{mixin.args.size} argument#{'s' if mixin.args.size != 1}
but #{@args.size} #{@args.size == 1 ? 'was' : 'were'} passed.
END
environment = mixin.args.zip(@args).
inject(Sass::Environment.new(mixin.environment)) do |env, ((name, default), value)|
env.set_local_var(name,
if value
value.perform(environment)
elsif default
default.perform(env)
end)
raise Sass::SyntaxError.new("Mixin #{@name} is missing parameter !#{name}.") unless env.var(name)
env
end
mixin.tree.map {|c| c.perform(environment)}.flatten
end
end
end
| Convert Sass::Tree::MixinNode docs to YARD. | [Sass] Convert Sass::Tree::MixinNode docs to YARD.
| Ruby | mit | saper/sass,tangposmarvin/haml,haml/haml,schneems/sass,zaru/sass,ridixcr/sass,keithpitty/haml,nazar-pc/haml,amitsuroliya/sass,chriseppstein/haml,askl56/sass,14113/sass,Galactix/sass,saper/sass,nicklima/sass,stanhu/haml,keithpitty/haml,srinivashappy/sass,srawlins/sass,Industrys/sass,Anshdesire/sass,HugoRLopes/sass,ridixcr/sass,Nzaga/sass,OrenBochman/sass,nicklima/sass,Anshdesire/sass,dylannnn/sass,dg1dd1ngs/haml,tangposmarvin/haml,concord-consortium/haml,tangposmarvin/sass,cmpereirasi/sass,hcatlin/sass,Andrey-Pavlov/sass,askl56/sass,purcell/haml,antonifs/sass,Iamronan/sass,macressler/haml,OrenBochman/sass,briangonzalez/sass,uniba/hamldoc_ja,amitsuroliya/sass,stewx/sass,DanteKaris/sass,lmetla/sass,chriseppstein/haml,Galactix/sass,hcatlin/sass,Ahnita/sass,PMArtz92/sass,chriseppstein/sass,PMArtz92/sass,StephanieMak/sass,amatsuda/haml,StephanieMak/sass,Industrys/sass,danielfein/sass,haml/haml,stanhu/haml,JuanitoFatas/sass,oss-practice/sass,dg1dd1ngs/haml,Andrey-Pavlov/sass,macressler/haml,rammy7219/sass,zaru/sass,nazar-pc/haml,baiyanghese/sass,tenderlove/sass,modulexcite/sass,lmetla/sass,liquidmetal/sass,MetaAshley/sass,DanteKaris/sass,schneems/sass,xzyfer/sass,cmpereirasi/sass,dylannnn/sass,haml/haml,sideci-sample/sideci-sample-haml,stewx/sass,jaroot32/sass,Iamronan/sass,srinivashappy/sass,haml/haml,nickg33/sass,nickg33/sass,danielfein/sass,xzyfer/sass,JuanitoFatas/sass,jaroot32/sass,Nzaga/sass,srawlins/sass,HugoRLopes/sass,modulexcite/sass,rammy7219/sass,MetaAshley/sass,antonifs/sass,Ahnita/sass,liquidmetal/sass,tangposmarvin/sass | ruby | ## Code Before:
require 'sass/tree/node'
module Sass::Tree
class MixinNode < Node
def initialize(name, args, options)
@name = name
@args = args
super(options)
end
protected
def _perform(environment)
raise Sass::SyntaxError.new("Undefined mixin '#{@name}'.", @line) unless mixin = environment.mixin(@name)
raise Sass::SyntaxError.new(<<END.gsub("\n", "")) if mixin.args.size < @args.size
Mixin #{@name} takes #{mixin.args.size} argument#{'s' if mixin.args.size != 1}
but #{@args.size} #{@args.size == 1 ? 'was' : 'were'} passed.
END
environment = mixin.args.zip(@args).
inject(Sass::Environment.new(mixin.environment)) do |env, ((name, default), value)|
env.set_local_var(name,
if value
value.perform(environment)
elsif default
default.perform(env)
end)
raise Sass::SyntaxError.new("Mixin #{@name} is missing parameter !#{name}.") unless env.var(name)
env
end
mixin.tree.map {|c| c.perform(environment)}.flatten
end
end
end
## Instruction:
[Sass] Convert Sass::Tree::MixinNode docs to YARD.
## Code After:
require 'sass/tree/node'
module Sass::Tree
# A dynamic node representing a mixin include.
#
# @see Sass::Tree
class MixinNode < Node
# @param name [String] The name of the mixin
# @param args [Array<Script::Node>] The arguments to the mixin
# @param options [Hash<Symbol, Object>] An options hash;
# see [the Sass options documentation](../../Sass.html#sass_options)
def initialize(name, args, options)
@name = name
@args = args
super(options)
end
protected
# Runs the mixin.
#
# @param environment [Sass::Environment] The lexical environment containing
# variable and mixin values
# @return [Array<Tree::Node>] The resulting static nodes
# @raise [Sass::SyntaxError] if there is no mixin with the given name
# @raise [Sass::SyntaxError] if an incorrect number of arguments was passed
# @see Sass::Tree
def _perform(environment)
raise Sass::SyntaxError.new("Undefined mixin '#{@name}'.", @line) unless mixin = environment.mixin(@name)
raise Sass::SyntaxError.new(<<END.gsub("\n", "")) if mixin.args.size < @args.size
Mixin #{@name} takes #{mixin.args.size} argument#{'s' if mixin.args.size != 1}
but #{@args.size} #{@args.size == 1 ? 'was' : 'were'} passed.
END
environment = mixin.args.zip(@args).
inject(Sass::Environment.new(mixin.environment)) do |env, ((name, default), value)|
env.set_local_var(name,
if value
value.perform(environment)
elsif default
default.perform(env)
end)
raise Sass::SyntaxError.new("Mixin #{@name} is missing parameter !#{name}.") unless env.var(name)
env
end
mixin.tree.map {|c| c.perform(environment)}.flatten
end
end
end
| require 'sass/tree/node'
module Sass::Tree
+ # A dynamic node representing a mixin include.
+ #
+ # @see Sass::Tree
class MixinNode < Node
+ # @param name [String] The name of the mixin
+ # @param args [Array<Script::Node>] The arguments to the mixin
+ # @param options [Hash<Symbol, Object>] An options hash;
+ # see [the Sass options documentation](../../Sass.html#sass_options)
def initialize(name, args, options)
@name = name
@args = args
super(options)
end
protected
+ # Runs the mixin.
+ #
+ # @param environment [Sass::Environment] The lexical environment containing
+ # variable and mixin values
+ # @return [Array<Tree::Node>] The resulting static nodes
+ # @raise [Sass::SyntaxError] if there is no mixin with the given name
+ # @raise [Sass::SyntaxError] if an incorrect number of arguments was passed
+ # @see Sass::Tree
def _perform(environment)
raise Sass::SyntaxError.new("Undefined mixin '#{@name}'.", @line) unless mixin = environment.mixin(@name)
raise Sass::SyntaxError.new(<<END.gsub("\n", "")) if mixin.args.size < @args.size
Mixin #{@name} takes #{mixin.args.size} argument#{'s' if mixin.args.size != 1}
but #{@args.size} #{@args.size == 1 ? 'was' : 'were'} passed.
END
environment = mixin.args.zip(@args).
inject(Sass::Environment.new(mixin.environment)) do |env, ((name, default), value)|
env.set_local_var(name,
if value
value.perform(environment)
elsif default
default.perform(env)
end)
raise Sass::SyntaxError.new("Mixin #{@name} is missing parameter !#{name}.") unless env.var(name)
env
end
mixin.tree.map {|c| c.perform(environment)}.flatten
end
end
end | 15 | 0.428571 | 15 | 0 |
f83babc944ce8a81fc24c72c5c490dd9d0f8713c | web/thesauruses/meta_info.json | web/thesauruses/meta_info.json | {
"languages": {
"ada": "Ada",
"bash": "Bash",
"cpp": "C++",
"csharp": "C#",
"clojure": "Clojure",
"go": "Go",
"haskell": "Haskell",
"java": "Java",
"javascript": "JavaScript",
"nim": "Nim",
"objectivec": "Objective-C",
"perl": "Perl",
"php": "PHP",
"python": "Python",
"ruby": "Ruby",
"rust": "Rust",
"scala": "Scala",
"swift": "Swift",
"typescript": "TypeScript",
"vbnet": "Visual Basic"
},
"structures": {
"classes": "Classes",
"control_structures": "Control Structures",
"data_types": "Data Types",
"exception_handling": "Exception Handling",
"functions": "Functions, Methods, and Subroutines",
"io": "Input and Output",
"lists": "Lists, Arrays, and Hashed Lists",
"operators": "Logical and Mathematical/Arithmetic Operators",
"strings": "Strings"
}
}
| {
"languages": {
"ada": "Ada",
"bash": "Bash",
"c": "C",
"cpp": "C++",
"csharp": "C#",
"clojure": "Clojure",
"go": "Go",
"haskell": "Haskell",
"java": "Java",
"javascript": "JavaScript",
"nim": "Nim",
"objectivec": "Objective-C",
"perl": "Perl",
"php": "PHP",
"python": "Python",
"ruby": "Ruby",
"rust": "Rust",
"scala": "Scala",
"swift": "Swift",
"typescript": "TypeScript",
"vbnet": "Visual Basic"
},
"structures": {
"classes": "Classes",
"control_structures": "Control Structures",
"data_types": "Data Types",
"exception_handling": "Exception Handling",
"functions": "Functions, Methods, and Subroutines",
"io": "Input and Output",
"lists": "Lists, Arrays, and Hashed Lists",
"operators": "Logical and Mathematical/Arithmetic Operators",
"strings": "Strings"
}
}
| Update : added C language | Update : added C language | JSON | agpl-3.0 | codethesaurus/codethesaur.us,codethesaurus/codethesaur.us | json | ## Code Before:
{
"languages": {
"ada": "Ada",
"bash": "Bash",
"cpp": "C++",
"csharp": "C#",
"clojure": "Clojure",
"go": "Go",
"haskell": "Haskell",
"java": "Java",
"javascript": "JavaScript",
"nim": "Nim",
"objectivec": "Objective-C",
"perl": "Perl",
"php": "PHP",
"python": "Python",
"ruby": "Ruby",
"rust": "Rust",
"scala": "Scala",
"swift": "Swift",
"typescript": "TypeScript",
"vbnet": "Visual Basic"
},
"structures": {
"classes": "Classes",
"control_structures": "Control Structures",
"data_types": "Data Types",
"exception_handling": "Exception Handling",
"functions": "Functions, Methods, and Subroutines",
"io": "Input and Output",
"lists": "Lists, Arrays, and Hashed Lists",
"operators": "Logical and Mathematical/Arithmetic Operators",
"strings": "Strings"
}
}
## Instruction:
Update : added C language
## Code After:
{
"languages": {
"ada": "Ada",
"bash": "Bash",
"c": "C",
"cpp": "C++",
"csharp": "C#",
"clojure": "Clojure",
"go": "Go",
"haskell": "Haskell",
"java": "Java",
"javascript": "JavaScript",
"nim": "Nim",
"objectivec": "Objective-C",
"perl": "Perl",
"php": "PHP",
"python": "Python",
"ruby": "Ruby",
"rust": "Rust",
"scala": "Scala",
"swift": "Swift",
"typescript": "TypeScript",
"vbnet": "Visual Basic"
},
"structures": {
"classes": "Classes",
"control_structures": "Control Structures",
"data_types": "Data Types",
"exception_handling": "Exception Handling",
"functions": "Functions, Methods, and Subroutines",
"io": "Input and Output",
"lists": "Lists, Arrays, and Hashed Lists",
"operators": "Logical and Mathematical/Arithmetic Operators",
"strings": "Strings"
}
}
| {
"languages": {
"ada": "Ada",
"bash": "Bash",
+ "c": "C",
"cpp": "C++",
"csharp": "C#",
"clojure": "Clojure",
"go": "Go",
"haskell": "Haskell",
"java": "Java",
"javascript": "JavaScript",
"nim": "Nim",
"objectivec": "Objective-C",
"perl": "Perl",
"php": "PHP",
"python": "Python",
"ruby": "Ruby",
"rust": "Rust",
"scala": "Scala",
"swift": "Swift",
"typescript": "TypeScript",
"vbnet": "Visual Basic"
},
"structures": {
"classes": "Classes",
"control_structures": "Control Structures",
"data_types": "Data Types",
"exception_handling": "Exception Handling",
"functions": "Functions, Methods, and Subroutines",
"io": "Input and Output",
"lists": "Lists, Arrays, and Hashed Lists",
"operators": "Logical and Mathematical/Arithmetic Operators",
"strings": "Strings"
}
} | 1 | 0.028571 | 1 | 0 |
6499c1f5e292f7445c0c9274a623c28c0eb7ce7b | setup.py | setup.py |
from os.path import join
from setuptools import setup, find_packages
# Change geokey_sapelli version here (and here alone!):
VERSION_PARTS = (0, 6, 7)
name = 'geokey-sapelli'
version = '.'.join(map(str, VERSION_PARTS))
repository = join('https://github.com/ExCiteS', name)
def get_install_requires():
"""
parse requirements.txt, ignore links, exclude comments
"""
requirements = list()
for line in open('requirements.txt').readlines():
# skip to next iteration if comment or empty line
if line.startswith('#') or line.startswith('git+https') or line == '':
continue
# add line to requirements
requirements.append(line.rstrip())
return requirements
setup(
name=name,
version=version,
description='Read Sapelli project and load data from CSVs to GeoKey',
url=repository,
download_url=join(repository, 'tarball', version),
author='ExCiteS',
author_email='excitesucl@gmail.com',
packages=find_packages(exclude=['*.tests', '*.tests.*', 'tests.*']),
install_requires=get_install_requires(),
include_package_data=True,
)
|
from os.path import join
from setuptools import setup, find_packages
# Change geokey_sapelli version here (and here alone!):
VERSION_PARTS = (0, 6, 7)
name = 'geokey-sapelli'
version = '.'.join(map(str, VERSION_PARTS))
repository = join('https://github.com/ExCiteS', name)
def get_install_requires():
"""
parse requirements.txt, ignore links, exclude comments
"""
requirements = list()
for line in open('requirements.txt').readlines():
# skip to next iteration if comment, Git repository or empty line
if line.startswith('#') or line.startswith('git+https') or line == '':
continue
# add line to requirements
requirements.append(line.rstrip())
return requirements
setup(
name=name,
version=version,
description='Read Sapelli project and load data from CSVs to GeoKey',
url=repository,
download_url=join(repository, 'tarball', version),
author='ExCiteS',
author_email='excitesucl@gmail.com',
packages=find_packages(exclude=['*.tests', '*.tests.*', 'tests.*']),
install_requires=get_install_requires(),
include_package_data=True,
)
| Comment about excluding Git repositories from requirements.txt | Comment about excluding Git repositories from requirements.txt
| Python | mit | ExCiteS/geokey-sapelli,ExCiteS/geokey-sapelli | python | ## Code Before:
from os.path import join
from setuptools import setup, find_packages
# Change geokey_sapelli version here (and here alone!):
VERSION_PARTS = (0, 6, 7)
name = 'geokey-sapelli'
version = '.'.join(map(str, VERSION_PARTS))
repository = join('https://github.com/ExCiteS', name)
def get_install_requires():
"""
parse requirements.txt, ignore links, exclude comments
"""
requirements = list()
for line in open('requirements.txt').readlines():
# skip to next iteration if comment or empty line
if line.startswith('#') or line.startswith('git+https') or line == '':
continue
# add line to requirements
requirements.append(line.rstrip())
return requirements
setup(
name=name,
version=version,
description='Read Sapelli project and load data from CSVs to GeoKey',
url=repository,
download_url=join(repository, 'tarball', version),
author='ExCiteS',
author_email='excitesucl@gmail.com',
packages=find_packages(exclude=['*.tests', '*.tests.*', 'tests.*']),
install_requires=get_install_requires(),
include_package_data=True,
)
## Instruction:
Comment about excluding Git repositories from requirements.txt
## Code After:
from os.path import join
from setuptools import setup, find_packages
# Change geokey_sapelli version here (and here alone!):
VERSION_PARTS = (0, 6, 7)
name = 'geokey-sapelli'
version = '.'.join(map(str, VERSION_PARTS))
repository = join('https://github.com/ExCiteS', name)
def get_install_requires():
"""
parse requirements.txt, ignore links, exclude comments
"""
requirements = list()
for line in open('requirements.txt').readlines():
# skip to next iteration if comment, Git repository or empty line
if line.startswith('#') or line.startswith('git+https') or line == '':
continue
# add line to requirements
requirements.append(line.rstrip())
return requirements
setup(
name=name,
version=version,
description='Read Sapelli project and load data from CSVs to GeoKey',
url=repository,
download_url=join(repository, 'tarball', version),
author='ExCiteS',
author_email='excitesucl@gmail.com',
packages=find_packages(exclude=['*.tests', '*.tests.*', 'tests.*']),
install_requires=get_install_requires(),
include_package_data=True,
)
|
from os.path import join
from setuptools import setup, find_packages
# Change geokey_sapelli version here (and here alone!):
VERSION_PARTS = (0, 6, 7)
name = 'geokey-sapelli'
version = '.'.join(map(str, VERSION_PARTS))
repository = join('https://github.com/ExCiteS', name)
def get_install_requires():
"""
parse requirements.txt, ignore links, exclude comments
"""
requirements = list()
for line in open('requirements.txt').readlines():
- # skip to next iteration if comment or empty line
+ # skip to next iteration if comment, Git repository or empty line
? ++++++++++++++++
if line.startswith('#') or line.startswith('git+https') or line == '':
continue
# add line to requirements
requirements.append(line.rstrip())
return requirements
setup(
name=name,
version=version,
description='Read Sapelli project and load data from CSVs to GeoKey',
url=repository,
download_url=join(repository, 'tarball', version),
author='ExCiteS',
author_email='excitesucl@gmail.com',
packages=find_packages(exclude=['*.tests', '*.tests.*', 'tests.*']),
install_requires=get_install_requires(),
include_package_data=True,
) | 2 | 0.052632 | 1 | 1 |
f507c227af574e57a774e2f87d115b02c8049986 | src/dependument.ts | src/dependument.ts | /// <reference path="../typings/node/node.d.ts" />
import * as fs from 'fs';
export class Dependument {
private source: string;
private output: string;
constructor(options: any) {
if (!options.source) {
throw new Error("No source path specified in options");
}
if (!options.output) {
throw new Error("No output path specified in options");
}
this.source = options.source;
this.output = options.output;
}
writeOutput() {
fs.writeFile(this.output, 'dependument test writeOutput', (err) => {
if (err) throw err;
console.log(`Output written to ${this.output}`);
});
}
}
| /// <reference path="../typings/node/node.d.ts" />
import * as fs from 'fs';
export class Dependument {
private source: string;
private output: string;
constructor(options: any) {
if (!options) {
throw new Error("No options provided");
}
if (!options.source) {
throw new Error("No source path specified in options");
}
if (!options.output) {
throw new Error("No output path specified in options");
}
this.source = options.source;
this.output = options.output;
}
writeOutput() {
fs.writeFile(this.output, 'dependument test writeOutput', (err) => {
if (err) throw err;
console.log(`Output written to ${this.output}`);
});
}
}
| Throw error if no options | Throw error if no options
| TypeScript | unlicense | dependument/dependument,Jameskmonger/dependument,Jameskmonger/dependument,dependument/dependument | typescript | ## Code Before:
/// <reference path="../typings/node/node.d.ts" />
import * as fs from 'fs';
export class Dependument {
private source: string;
private output: string;
constructor(options: any) {
if (!options.source) {
throw new Error("No source path specified in options");
}
if (!options.output) {
throw new Error("No output path specified in options");
}
this.source = options.source;
this.output = options.output;
}
writeOutput() {
fs.writeFile(this.output, 'dependument test writeOutput', (err) => {
if (err) throw err;
console.log(`Output written to ${this.output}`);
});
}
}
## Instruction:
Throw error if no options
## Code After:
/// <reference path="../typings/node/node.d.ts" />
import * as fs from 'fs';
export class Dependument {
private source: string;
private output: string;
constructor(options: any) {
if (!options) {
throw new Error("No options provided");
}
if (!options.source) {
throw new Error("No source path specified in options");
}
if (!options.output) {
throw new Error("No output path specified in options");
}
this.source = options.source;
this.output = options.output;
}
writeOutput() {
fs.writeFile(this.output, 'dependument test writeOutput', (err) => {
if (err) throw err;
console.log(`Output written to ${this.output}`);
});
}
}
| /// <reference path="../typings/node/node.d.ts" />
import * as fs from 'fs';
export class Dependument {
private source: string;
private output: string;
constructor(options: any) {
+ if (!options) {
+ throw new Error("No options provided");
+ }
+
if (!options.source) {
throw new Error("No source path specified in options");
}
if (!options.output) {
throw new Error("No output path specified in options");
}
this.source = options.source;
this.output = options.output;
}
writeOutput() {
fs.writeFile(this.output, 'dependument test writeOutput', (err) => {
if (err) throw err;
console.log(`Output written to ${this.output}`);
});
}
} | 4 | 0.142857 | 4 | 0 |
5ae42226e5b4b9262421b0252376cfc8b64e8532 | engine/graphics/RenderTarget.cpp | engine/graphics/RenderTarget.cpp | // Ouzel by Elviss Strazdins
#include "RenderTarget.hpp"
#include "Graphics.hpp"
#include "Texture.hpp"
namespace ouzel::graphics
{
RenderTarget::RenderTarget(Graphics& initGraphics,
const std::vector<Texture*>& initColorTextures,
Texture* initDepthTexture):
resource{*initGraphics.getDevice()},
colorTextures{initColorTextures},
depthTexture{initDepthTexture}
{
std::set<std::size_t> colorTextureIds;
for (const auto& colorTexture : colorTextures)
colorTextureIds.insert(colorTexture ? colorTexture->getResource() : 0);
initGraphics.addCommand(std::make_unique<InitRenderTargetCommand>(resource,
colorTextureIds,
depthTexture ? depthTexture->getResource() : RenderDevice::ResourceId(0)));
}
}
| // Ouzel by Elviss Strazdins
#include "RenderTarget.hpp"
#include "Graphics.hpp"
#include "Texture.hpp"
namespace ouzel::graphics
{
RenderTarget::RenderTarget(Graphics& initGraphics,
const std::vector<Texture*>& initColorTextures,
Texture* initDepthTexture):
resource{*initGraphics.getDevice()},
colorTextures{initColorTextures},
depthTexture{initDepthTexture}
{
std::set<RenderDevice::ResourceId> colorTextureIds;
for (const auto& colorTexture : colorTextures)
colorTextureIds.insert(colorTexture ? colorTexture->getResource() : 0);
initGraphics.addCommand(std::make_unique<InitRenderTargetCommand>(resource,
colorTextureIds,
depthTexture ? depthTexture->getResource() : RenderDevice::ResourceId(0)));
}
}
| Use ResourceId instead of size_t | Use ResourceId instead of size_t
| C++ | unlicense | elnormous/ouzel,elnormous/ouzel,elnormous/ouzel | c++ | ## Code Before:
// Ouzel by Elviss Strazdins
#include "RenderTarget.hpp"
#include "Graphics.hpp"
#include "Texture.hpp"
namespace ouzel::graphics
{
RenderTarget::RenderTarget(Graphics& initGraphics,
const std::vector<Texture*>& initColorTextures,
Texture* initDepthTexture):
resource{*initGraphics.getDevice()},
colorTextures{initColorTextures},
depthTexture{initDepthTexture}
{
std::set<std::size_t> colorTextureIds;
for (const auto& colorTexture : colorTextures)
colorTextureIds.insert(colorTexture ? colorTexture->getResource() : 0);
initGraphics.addCommand(std::make_unique<InitRenderTargetCommand>(resource,
colorTextureIds,
depthTexture ? depthTexture->getResource() : RenderDevice::ResourceId(0)));
}
}
## Instruction:
Use ResourceId instead of size_t
## Code After:
// Ouzel by Elviss Strazdins
#include "RenderTarget.hpp"
#include "Graphics.hpp"
#include "Texture.hpp"
namespace ouzel::graphics
{
RenderTarget::RenderTarget(Graphics& initGraphics,
const std::vector<Texture*>& initColorTextures,
Texture* initDepthTexture):
resource{*initGraphics.getDevice()},
colorTextures{initColorTextures},
depthTexture{initDepthTexture}
{
std::set<RenderDevice::ResourceId> colorTextureIds;
for (const auto& colorTexture : colorTextures)
colorTextureIds.insert(colorTexture ? colorTexture->getResource() : 0);
initGraphics.addCommand(std::make_unique<InitRenderTargetCommand>(resource,
colorTextureIds,
depthTexture ? depthTexture->getResource() : RenderDevice::ResourceId(0)));
}
}
| // Ouzel by Elviss Strazdins
#include "RenderTarget.hpp"
#include "Graphics.hpp"
#include "Texture.hpp"
namespace ouzel::graphics
{
RenderTarget::RenderTarget(Graphics& initGraphics,
const std::vector<Texture*>& initColorTextures,
Texture* initDepthTexture):
resource{*initGraphics.getDevice()},
colorTextures{initColorTextures},
depthTexture{initDepthTexture}
{
- std::set<std::size_t> colorTextureIds;
? ^^ ^^ ^^
+ std::set<RenderDevice::ResourceId> colorTextureIds;
? ^^^ ++++++++ ++ ^^^^ ^^
for (const auto& colorTexture : colorTextures)
colorTextureIds.insert(colorTexture ? colorTexture->getResource() : 0);
initGraphics.addCommand(std::make_unique<InitRenderTargetCommand>(resource,
colorTextureIds,
depthTexture ? depthTexture->getResource() : RenderDevice::ResourceId(0)));
}
} | 2 | 0.08 | 1 | 1 |
c2580d2abc863f4a4aaf8d544ce0d390754fbae4 | playbooks/neutron/tasks/ovs_agent.yml | playbooks/neutron/tasks/ovs_agent.yml | ---
- include: main.yml
- apt: pkg=openvswitch-switch
- apt: pkg=openvswitch-datapath-dkms
- name: ovs int bridge
ovs_bridge: name=br-int state=present
- template: |
src=etc/init/neutron-openvswitch-agent.conf
dest=/etc/init/neutron-openvswitch-agent.conf
- service: name=neutron-openvswitch-agent state=started
| ---
- include: main.yml
- apt: pkg=openvswitch-switch
- apt: pkg=openvswitch-datapath-dkms
- apt: pkg=vlan
- apt: pkg=bridge-utils
- name: ovs int bridge
ovs_bridge: name=br-int state=present
- template: |
src=etc/init/neutron-openvswitch-agent.conf
dest=/etc/init/neutron-openvswitch-agent.conf
- service: name=neutron-openvswitch-agent state=started
| Install vlan and bridge-utils packages. | Install vlan and bridge-utils packages.
 | YAML | mit | msambol/ursula,pgraziano/ursula,persistent-ursula/ursula,dlundquist/ursula,pbannister/ursula,masteinhauser/ursula,panxia6679/ursula,ddaskal/ursula,wupeiran/ursula,kennjason/ursula,blueboxgroup/ursula,j2sol/ursula,pgraziano/ursula,davidcusatis/ursula,andrewrothstein/ursula,narengan/ursula,ddaskal/ursula,masteinhauser/ursula,channus/ursula,sivakom/ursula,mjbrewer/ursula,lihkin213/ursula,pbannister/ursula,j2sol/ursula,paulczar/ursula,lihkin213/ursula,wupeiran/ursula,pgraziano/ursula,persistent-ursula/ursula,ryshah/ursula,allomov/ursula,aldevigi/ursula,blueboxjesse/ursula,edtubillara/ursula,dlundquist/ursula,fancyhe/ursula,MaheshIBM/ursula,sivakom/ursula,EricCrosson/ursula,paulczar/ursula,allomov/ursula,channus/ursula,narengan/ursula,jwaibel/ursula,fancyhe/ursula,ryshah/ursula,MaheshIBM/ursula,msambol/ursula,andrewrothstein/ursula,panxia6679/ursula,kennjason/ursula,rongzhus/ursula,aldevigi/ursula,edtubillara/ursula,pgraziano/ursula,channus/ursula,lihkin213/ursula,fancyhe/ursula,ryshah/ursula,nirajdp76/ursula,retr0h/ursula,twaldrop/ursula,masteinhauser/ursula,greghaynes/ursula,EricCrosson/ursula,knandya/ursula,nirajdp76/ursula,ddaskal/ursula,blueboxgroup/ursula,persistent-ursula/ursula,andrewrothstein/ursula,twaldrop/ursula,greghaynes/ursula,blueboxjesse/ursula,rongzhus/ursula,masteinhauser/ursula,aldevigi/ursula,narengan/ursula,greghaynes/ursula,ryshah/ursula,ryshah/ursula,nirajdp76/ursula,retr0h/ursula,knandya/ursula,mjbrewer/ursula,edtubillara/ursula,edtubillara/ursula,blueboxgroup/ursula,blueboxjesse/ursula,wupeiran/ursula,kennjason/ursula,allomov/ursula,blueboxgroup/ursula,sivakom/ursula,channus/ursula,j2sol/ursula,zrs233/ursula,nirajdp76/ursula,zrs233/ursula,persistent-ursula/ursula,fancyhe/ursula,panxia6679/ursula,jwaibel/ursula,msambol/ursula | yaml | ## Code Before:
---
- include: main.yml
- apt: pkg=openvswitch-switch
- apt: pkg=openvswitch-datapath-dkms
- name: ovs int bridge
ovs_bridge: name=br-int state=present
- template: |
src=etc/init/neutron-openvswitch-agent.conf
dest=/etc/init/neutron-openvswitch-agent.conf
- service: name=neutron-openvswitch-agent state=started
## Instruction:
Install vlan and bridge-utils packages.
## Code After:
---
- include: main.yml
- apt: pkg=openvswitch-switch
- apt: pkg=openvswitch-datapath-dkms
- apt: pkg=vlan
- apt: pkg=bridge-utils
- name: ovs int bridge
ovs_bridge: name=br-int state=present
- template: |
src=etc/init/neutron-openvswitch-agent.conf
dest=/etc/init/neutron-openvswitch-agent.conf
- service: name=neutron-openvswitch-agent state=started
| ---
- include: main.yml
- apt: pkg=openvswitch-switch
- apt: pkg=openvswitch-datapath-dkms
+ - apt: pkg=vlan
+ - apt: pkg=bridge-utils
- name: ovs int bridge
ovs_bridge: name=br-int state=present
- template: |
src=etc/init/neutron-openvswitch-agent.conf
dest=/etc/init/neutron-openvswitch-agent.conf
- service: name=neutron-openvswitch-agent state=started | 2 | 0.153846 | 2 | 0 |
951211d50b3b87f296572dcd984c77c031f3f627 | metadata/dummydomain.yetanothercallblocker.yml | metadata/dummydomain.yetanothercallblocker.yml | AntiFeatures:
- NonFreeNet
Categories:
- Phone & SMS
License: AGPL-3.0-only
WebSite: https://gitlab.com/xynngh/YetAnotherCallBlocker
SourceCode: https://gitlab.com/xynngh/YetAnotherCallBlocker/tree/HEAD
IssueTracker: https://gitlab.com/xynngh/YetAnotherCallBlocker/issues
AutoName: Yet Another Call Blocker
RepoType: git
Repo: https://gitlab.com/xynngh/YetAnotherCallBlocker.git
Builds:
- versionName: 0.3.4
versionCode: 3040
commit: v0.3.4
subdir: app
gradle:
- yes
- versionName: 0.4.1
versionCode: 4010
commit: v0.4.1
subdir: app
gradle:
- yes
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags .*[0-9]$
CurrentVersion: 0.4.1
CurrentVersionCode: 4010
| AntiFeatures:
- NonFreeNet
Categories:
- Phone & SMS
License: AGPL-3.0-only
WebSite: https://gitlab.com/xynngh/YetAnotherCallBlocker
SourceCode: https://gitlab.com/xynngh/YetAnotherCallBlocker/tree/HEAD
IssueTracker: https://gitlab.com/xynngh/YetAnotherCallBlocker/issues
AutoName: Yet Another Call Blocker
RepoType: git
Repo: https://gitlab.com/xynngh/YetAnotherCallBlocker.git
Builds:
- versionName: 0.3.4
versionCode: 3040
commit: v0.3.4
subdir: app
gradle:
- yes
- versionName: 0.4.1
versionCode: 4010
commit: v0.4.1
subdir: app
gradle:
- yes
- versionName: 0.4.3
versionCode: 4030
commit: v0.4.3
subdir: app
gradle:
- yes
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags .*[0-9]$
CurrentVersion: 0.4.3
CurrentVersionCode: 4030
| Update Yet Another Call Blocker to 0.4.3 (4030) | Update Yet Another Call Blocker to 0.4.3 (4030)
| YAML | agpl-3.0 | f-droid/fdroiddata,f-droid/fdroiddata | yaml | ## Code Before:
AntiFeatures:
- NonFreeNet
Categories:
- Phone & SMS
License: AGPL-3.0-only
WebSite: https://gitlab.com/xynngh/YetAnotherCallBlocker
SourceCode: https://gitlab.com/xynngh/YetAnotherCallBlocker/tree/HEAD
IssueTracker: https://gitlab.com/xynngh/YetAnotherCallBlocker/issues
AutoName: Yet Another Call Blocker
RepoType: git
Repo: https://gitlab.com/xynngh/YetAnotherCallBlocker.git
Builds:
- versionName: 0.3.4
versionCode: 3040
commit: v0.3.4
subdir: app
gradle:
- yes
- versionName: 0.4.1
versionCode: 4010
commit: v0.4.1
subdir: app
gradle:
- yes
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags .*[0-9]$
CurrentVersion: 0.4.1
CurrentVersionCode: 4010
## Instruction:
Update Yet Another Call Blocker to 0.4.3 (4030)
## Code After:
AntiFeatures:
- NonFreeNet
Categories:
- Phone & SMS
License: AGPL-3.0-only
WebSite: https://gitlab.com/xynngh/YetAnotherCallBlocker
SourceCode: https://gitlab.com/xynngh/YetAnotherCallBlocker/tree/HEAD
IssueTracker: https://gitlab.com/xynngh/YetAnotherCallBlocker/issues
AutoName: Yet Another Call Blocker
RepoType: git
Repo: https://gitlab.com/xynngh/YetAnotherCallBlocker.git
Builds:
- versionName: 0.3.4
versionCode: 3040
commit: v0.3.4
subdir: app
gradle:
- yes
- versionName: 0.4.1
versionCode: 4010
commit: v0.4.1
subdir: app
gradle:
- yes
- versionName: 0.4.3
versionCode: 4030
commit: v0.4.3
subdir: app
gradle:
- yes
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags .*[0-9]$
CurrentVersion: 0.4.3
CurrentVersionCode: 4030
| AntiFeatures:
- NonFreeNet
Categories:
- Phone & SMS
License: AGPL-3.0-only
WebSite: https://gitlab.com/xynngh/YetAnotherCallBlocker
SourceCode: https://gitlab.com/xynngh/YetAnotherCallBlocker/tree/HEAD
IssueTracker: https://gitlab.com/xynngh/YetAnotherCallBlocker/issues
AutoName: Yet Another Call Blocker
RepoType: git
Repo: https://gitlab.com/xynngh/YetAnotherCallBlocker.git
Builds:
- versionName: 0.3.4
versionCode: 3040
commit: v0.3.4
subdir: app
gradle:
- yes
- versionName: 0.4.1
versionCode: 4010
commit: v0.4.1
subdir: app
gradle:
- yes
+ - versionName: 0.4.3
+ versionCode: 4030
+ commit: v0.4.3
+ subdir: app
+ gradle:
+ - yes
+
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags .*[0-9]$
- CurrentVersion: 0.4.1
? ^
+ CurrentVersion: 0.4.3
? ^
- CurrentVersionCode: 4010
? ^
+ CurrentVersionCode: 4030
? ^
| 11 | 0.333333 | 9 | 2 |
927f0010cb2a211b2e1309e06bd175807e308973 | extension/options.js | extension/options.js | const DEFAULT_WHITELISTED_URL_REGEXPS = [
'abcnews.go.com\/.+',
'arstechnica.com\/.+',
'bbc.co.uk\/.+',
'bbc.com\/.+',
'business-standard.com\/.+',
'cnn.com\/.+',
'economist.com\/.+',
'guardian.co.uk\/.+',
'theguardian.com\/.+',
'hollywoodreporter.com\/.+',
'huffingtonpost.com\/.+',
'irishtimes.com\/.+',
'independent.co.uk\/.+',
'npr.org\/.+',
'newsweek.com\/.+',
'nytimes.com\/.+',
'politico.com\/.+',
'rollingstone.com\/.+',
'spiegel.de\/.+',
'time.com\/.+',
'theatlantic.com\/.+',
'variety.com\/.+',
'washingtonpost.com\/.+',
'wired.com\/.+',
'wsj.com\/.+',
];
function saveURLsWhitelist(whitelist) {
localStorage.setItem('urls-whitelist', JSON.stringify(whitelist));
}
function loadURLsWhitelist() {
let whitelist = localStorage.getItem('urls-whitelist');
if (whitelist) {
return JSON.parse(whitelist);
}
return DEFAULT_WHITELISTED_URL_REGEXPS;
}
| const DEFAULT_WHITELISTED_URL_REGEXPS = [
'abcnews.go.com\/.+',
'arstechnica.com\/.+',
'bbc.co.uk\/.+',
'bbc.com\/.+',
'business-standard.com\/.+',
'cnn.com\/.+',
'economist.com\/.+',
'forbes.com\/.+',
'guardian.co.uk\/.+',
'hollywoodreporter.com\/.+',
'huffingtonpost.com\/.+',
'independent.co.uk\/.+',
'irishtimes.com\/.+',
'newsweek.com\/.+',
'newyorker.com\/.+',
'npr.org\/.+',
'nytimes.com\/.+',
'politico.com\/.+',
'rollingstone.com\/.+',
'spiegel.de\/.+',
'theatlantic.com\/.+',
'theguardian.com\/.+',
'time.com\/.+',
'variety.com\/.+',
'washingtonpost.com\/.+',
'wired.com\/.+',
'wsj.com\/.+',
];
function saveURLsWhitelist(whitelist) {
localStorage.setItem('urls-whitelist', JSON.stringify(whitelist));
}
function loadURLsWhitelist() {
let whitelist = localStorage.getItem('urls-whitelist');
if (whitelist) {
return JSON.parse(whitelist);
}
return DEFAULT_WHITELISTED_URL_REGEXPS;
}
| Add a couple more items to the default whitelist. | Add a couple more items to the default whitelist.
| JavaScript | mit | eggpi/similarity,eggpi/similarity,eggpi/similarity | javascript | ## Code Before:
const DEFAULT_WHITELISTED_URL_REGEXPS = [
'abcnews.go.com\/.+',
'arstechnica.com\/.+',
'bbc.co.uk\/.+',
'bbc.com\/.+',
'business-standard.com\/.+',
'cnn.com\/.+',
'economist.com\/.+',
'guardian.co.uk\/.+',
'theguardian.com\/.+',
'hollywoodreporter.com\/.+',
'huffingtonpost.com\/.+',
'irishtimes.com\/.+',
'independent.co.uk\/.+',
'npr.org\/.+',
'newsweek.com\/.+',
'nytimes.com\/.+',
'politico.com\/.+',
'rollingstone.com\/.+',
'spiegel.de\/.+',
'time.com\/.+',
'theatlantic.com\/.+',
'variety.com\/.+',
'washingtonpost.com\/.+',
'wired.com\/.+',
'wsj.com\/.+',
];
function saveURLsWhitelist(whitelist) {
localStorage.setItem('urls-whitelist', JSON.stringify(whitelist));
}
function loadURLsWhitelist() {
let whitelist = localStorage.getItem('urls-whitelist');
if (whitelist) {
return JSON.parse(whitelist);
}
return DEFAULT_WHITELISTED_URL_REGEXPS;
}
## Instruction:
Add a couple more items to the default whitelist.
## Code After:
const DEFAULT_WHITELISTED_URL_REGEXPS = [
'abcnews.go.com\/.+',
'arstechnica.com\/.+',
'bbc.co.uk\/.+',
'bbc.com\/.+',
'business-standard.com\/.+',
'cnn.com\/.+',
'economist.com\/.+',
'forbes.com\/.+',
'guardian.co.uk\/.+',
'hollywoodreporter.com\/.+',
'huffingtonpost.com\/.+',
'independent.co.uk\/.+',
'irishtimes.com\/.+',
'newsweek.com\/.+',
'newyorker.com\/.+',
'npr.org\/.+',
'nytimes.com\/.+',
'politico.com\/.+',
'rollingstone.com\/.+',
'spiegel.de\/.+',
'theatlantic.com\/.+',
'theguardian.com\/.+',
'time.com\/.+',
'variety.com\/.+',
'washingtonpost.com\/.+',
'wired.com\/.+',
'wsj.com\/.+',
];
function saveURLsWhitelist(whitelist) {
localStorage.setItem('urls-whitelist', JSON.stringify(whitelist));
}
function loadURLsWhitelist() {
let whitelist = localStorage.getItem('urls-whitelist');
if (whitelist) {
return JSON.parse(whitelist);
}
return DEFAULT_WHITELISTED_URL_REGEXPS;
}
| const DEFAULT_WHITELISTED_URL_REGEXPS = [
'abcnews.go.com\/.+',
'arstechnica.com\/.+',
'bbc.co.uk\/.+',
'bbc.com\/.+',
'business-standard.com\/.+',
'cnn.com\/.+',
'economist.com\/.+',
+ 'forbes.com\/.+',
'guardian.co.uk\/.+',
- 'theguardian.com\/.+',
'hollywoodreporter.com\/.+',
'huffingtonpost.com\/.+',
+ 'independent.co.uk\/.+',
'irishtimes.com\/.+',
- 'independent.co.uk\/.+',
+ 'newsweek.com\/.+',
+ 'newyorker.com\/.+',
'npr.org\/.+',
- 'newsweek.com\/.+',
'nytimes.com\/.+',
'politico.com\/.+',
'rollingstone.com\/.+',
'spiegel.de\/.+',
+ 'theatlantic.com\/.+',
+ 'theguardian.com\/.+',
'time.com\/.+',
- 'theatlantic.com\/.+',
'variety.com\/.+',
'washingtonpost.com\/.+',
'wired.com\/.+',
'wsj.com\/.+',
];
function saveURLsWhitelist(whitelist) {
localStorage.setItem('urls-whitelist', JSON.stringify(whitelist));
}
function loadURLsWhitelist() {
let whitelist = localStorage.getItem('urls-whitelist');
if (whitelist) {
return JSON.parse(whitelist);
}
return DEFAULT_WHITELISTED_URL_REGEXPS;
} | 10 | 0.25641 | 6 | 4 |
bcaf4de1d61cc29fda93522c0ea42d64935a1ce9 | app/views/admin/fact_check_requests/edit.html.haml | app/views/admin/fact_check_requests/edit.html.haml | .policy
%h1.title= @edition.title
%div.body
= @edition.body.split("\n\n").collect{|paragraph| "<p>#{paragraph}</p>" }.join.html_safe
%p.written_by
Written by
%span.author= @edition.author.name
%div
= form_for([:admin, @edition, @fact_check_request]) do |fact_check_form|
= fact_check_form.label :comments
= fact_check_form.text_area :comments
= fact_check_form.submit "Submit" | .explanation
%p You've been asked to review this policy for factual accuracy. Please read it, and then provide any comments in the form below
.policy
%h1.title= @edition.title
%div.body
= @edition.body.split("\n\n").collect{|paragraph| "<p>#{paragraph}</p>" }.join.html_safe
%p.written_by
Written by
%span.author= @edition.author.name
%div
= form_for([:admin, @edition, @fact_check_request]) do |fact_check_form|
= fact_check_form.label :comments
= fact_check_form.text_area :comments
= fact_check_form.submit "Submit" | Add a bit of explanation. | Add a bit of explanation.
This can easily be expanded or styled to make it more obvious.
| Haml | mit | hotvulcan/whitehall,YOTOV-LIMITED/whitehall,YOTOV-LIMITED/whitehall,askl56/whitehall,alphagov/whitehall,YOTOV-LIMITED/whitehall,alphagov/whitehall,robinwhittleton/whitehall,robinwhittleton/whitehall,ggoral/whitehall,ggoral/whitehall,askl56/whitehall,hotvulcan/whitehall,hotvulcan/whitehall,ggoral/whitehall,hotvulcan/whitehall,YOTOV-LIMITED/whitehall,askl56/whitehall,askl56/whitehall,alphagov/whitehall,alphagov/whitehall,robinwhittleton/whitehall,ggoral/whitehall,robinwhittleton/whitehall | haml | ## Code Before:
.policy
%h1.title= @edition.title
%div.body
= @edition.body.split("\n\n").collect{|paragraph| "<p>#{paragraph}</p>" }.join.html_safe
%p.written_by
Written by
%span.author= @edition.author.name
%div
= form_for([:admin, @edition, @fact_check_request]) do |fact_check_form|
= fact_check_form.label :comments
= fact_check_form.text_area :comments
= fact_check_form.submit "Submit"
## Instruction:
Add a bit of explanation.
This can easily be expanded or styled to make it more obvious.
## Code After:
.explanation
%p You've been asked to review this policy for factual accuracy. Please read it, and then provide any comments in the form below
.policy
%h1.title= @edition.title
%div.body
= @edition.body.split("\n\n").collect{|paragraph| "<p>#{paragraph}</p>" }.join.html_safe
%p.written_by
Written by
%span.author= @edition.author.name
%div
= form_for([:admin, @edition, @fact_check_request]) do |fact_check_form|
= fact_check_form.label :comments
= fact_check_form.text_area :comments
= fact_check_form.submit "Submit" | + .explanation
+ %p You've been asked to review this policy for factual accuracy. Please read it, and then provide any comments in the form below
+
.policy
%h1.title= @edition.title
%div.body
= @edition.body.split("\n\n").collect{|paragraph| "<p>#{paragraph}</p>" }.join.html_safe
%p.written_by
Written by
%span.author= @edition.author.name
%div
= form_for([:admin, @edition, @fact_check_request]) do |fact_check_form|
= fact_check_form.label :comments
= fact_check_form.text_area :comments
= fact_check_form.submit "Submit" | 3 | 0.230769 | 3 | 0 |
c34039bbc9e403ed0b435e5c3b819c36bd77c801 | README.md | README.md | The OS2KITOS was programmed by IT Minds ApS (http://it-minds.dk)
for OS2 - Offentligt digitaliseringsfællesskab (http://os2web.dk).
Copyright (c) 2014, OS2 - Offentligt digitaliseringsfællesskab.
The OS2KITOS is free software; you may use, study, modify and
distribute it under the terms of version 2.0 of the Mozilla Public
License. See the LICENSE file for details. If a copy of the MPL was not
distributed with this file, You can obtain one at
http://mozilla.org/MPL/2.0/.
All source code in this and the underlying directories is subject to
the terms of the Mozilla Public License, v. 2.0.
[](https://ci.appveyor.com/project/Kitos/kitos/branch/master) [](https://codecov.io/github/os2kitos/kitos?branch=master)
This website uses [BrowserStack](https://www.browserstack.com/) for testing.

| The OS2KITOS was programmed by IT Minds ApS (http://it-minds.dk)
for OS2 - Offentligt digitaliseringsfællesskab (http://os2web.dk).
Copyright (c) 2014, OS2 - Offentligt digitaliseringsfællesskab.
The OS2KITOS is free software; you may use, study, modify and
distribute it under the terms of version 2.0 of the Mozilla Public
License. See the LICENSE file for details. If a copy of the MPL was not
distributed with this file, You can obtain one at
http://mozilla.org/MPL/2.0/.
All source code in this and the underlying directories is subject to
the terms of the Mozilla Public License, v. 2.0.
[](https://ci.appveyor.com/project/Kitos/kitos/branch/master) [](https://codecov.io/github/os2kitos/kitos?branch=master)
This website uses [BrowserStack](https://www.browserstack.com/) for testing.

| Change URL to BrowserStack logo. | Change URL to BrowserStack logo. | Markdown | mpl-2.0 | os2kitos/kitos,miracle-as/kitos,os2kitos/kitos,os2kitos/kitos,miracle-as/kitos,miracle-as/kitos,miracle-as/kitos,os2kitos/kitos | markdown | ## Code Before:
The OS2KITOS was programmed by IT Minds ApS (http://it-minds.dk)
for OS2 - Offentligt digitaliseringsfællesskab (http://os2web.dk).
Copyright (c) 2014, OS2 - Offentligt digitaliseringsfællesskab.
The OS2KITOS is free software; you may use, study, modify and
distribute it under the terms of version 2.0 of the Mozilla Public
License. See the LICENSE file for details. If a copy of the MPL was not
distributed with this file, You can obtain one at
http://mozilla.org/MPL/2.0/.
All source code in this and the underlying directories is subject to
the terms of the Mozilla Public License, v. 2.0.
[](https://ci.appveyor.com/project/Kitos/kitos/branch/master) [](https://codecov.io/github/os2kitos/kitos?branch=master)
This website uses [BrowserStack](https://www.browserstack.com/) for testing.

## Instruction:
Change URL to BrowserStack logo.
## Code After:
The OS2KITOS was programmed by IT Minds ApS (http://it-minds.dk)
for OS2 - Offentligt digitaliseringsfællesskab (http://os2web.dk).
Copyright (c) 2014, OS2 - Offentligt digitaliseringsfællesskab.
The OS2KITOS is free software; you may use, study, modify and
distribute it under the terms of version 2.0 of the Mozilla Public
License. See the LICENSE file for details. If a copy of the MPL was not
distributed with this file, You can obtain one at
http://mozilla.org/MPL/2.0/.
All source code in this and the underlying directories is subject to
the terms of the Mozilla Public License, v. 2.0.
[](https://ci.appveyor.com/project/Kitos/kitos/branch/master) [](https://codecov.io/github/os2kitos/kitos?branch=master)
This website uses [BrowserStack](https://www.browserstack.com/) for testing.

| The OS2KITOS was programmed by IT Minds ApS (http://it-minds.dk)
for OS2 - Offentligt digitaliseringsfællesskab (http://os2web.dk).
Copyright (c) 2014, OS2 - Offentligt digitaliseringsfællesskab.
The OS2KITOS is free software; you may use, study, modify and
distribute it under the terms of version 2.0 of the Mozilla Public
License. See the LICENSE file for details. If a copy of the MPL was not
distributed with this file, You can obtain one at
http://mozilla.org/MPL/2.0/.
All source code in this and the underlying directories is subject to
the terms of the Mozilla Public License, v. 2.0.
[](https://ci.appveyor.com/project/Kitos/kitos/branch/master) [](https://codecov.io/github/os2kitos/kitos?branch=master)
This website uses [BrowserStack](https://www.browserstack.com/) for testing.
- 
? ---- ----------
+ 
? ++++
| 2 | 0.105263 | 1 | 1 |
419a84031cb412bf94c9a4b6b10d85a6eab231ca | manifest.json | manifest.json | {
"manifest_version": 2,
"name": "HBO Now New Releases",
"description": "This extension helps you discover when new movies are released on HBO Now.",
"version": "1.0",
"background": {
"scripts": ["watch_for_hbo_now.js"],
"persistent": false
},
"page_action": {
"default_icon": "icon-clock.png",
"default_title" : "There's a 'G' in this URL!"
},
"permissions": [
"declarativeContent"
]
}
| {
"manifest_version": 2,
"name": "HBO Now New Releases",
"description": "This extension helps you discover when new movies are released on HBO Now.",
"version": "1.0",
"background": {
"scripts": ["watch_for_hbo_now.js"],
"persistent": false
},
"page_action": {
"default_icon": "icon-clock.png",
"default_title" : "See HBO Now New Releases"
},
"permissions": [
"declarativeContent"
]
}
| Correct title for the button | Correct title for the button
| JSON | mit | metavida/now-diff,metavida/now-diff | json | ## Code Before:
{
"manifest_version": 2,
"name": "HBO Now New Releases",
"description": "This extension helps you discover when new movies are released on HBO Now.",
"version": "1.0",
"background": {
"scripts": ["watch_for_hbo_now.js"],
"persistent": false
},
"page_action": {
"default_icon": "icon-clock.png",
"default_title" : "There's a 'G' in this URL!"
},
"permissions": [
"declarativeContent"
]
}
## Instruction:
Correct title for the button
## Code After:
{
"manifest_version": 2,
"name": "HBO Now New Releases",
"description": "This extension helps you discover when new movies are released on HBO Now.",
"version": "1.0",
"background": {
"scripts": ["watch_for_hbo_now.js"],
"persistent": false
},
"page_action": {
"default_icon": "icon-clock.png",
"default_title" : "See HBO Now New Releases"
},
"permissions": [
"declarativeContent"
]
}
| {
"manifest_version": 2,
"name": "HBO Now New Releases",
"description": "This extension helps you discover when new movies are released on HBO Now.",
"version": "1.0",
"background": {
"scripts": ["watch_for_hbo_now.js"],
"persistent": false
},
"page_action": {
"default_icon": "icon-clock.png",
- "default_title" : "There's a 'G' in this URL!"
+ "default_title" : "See HBO Now New Releases"
},
"permissions": [
"declarativeContent"
]
} | 2 | 0.111111 | 1 | 1 |
79722cca34dfaa002f4d16ccc15961df8be0a20a | cookbooks/cumulus/test/cookbooks/cumulus-test/recipes/default.rb | cookbooks/cumulus/test/cookbooks/cumulus-test/recipes/default.rb | directory '/usr/cumulus/bin' do
recursive true
end
directory '/etc/cumulus' do
end
file '/usr/cumulus/bin/cl-license' do
content '#!/bin/sh
echo "Rocket Turtle!\nexpires=$(date +%s)\n$0 $@" > /etc/cumulus/.license.txt'
mode '0755'
end
# Invoke the providers
cumulus_ports 'speeds' do
speed_10g ['swp1']
speed_40g ['swp3','swp5-10', 'swp12']
speed_40g_div_4 ['swp15','swp16']
speed_4_by_10g ['swp20-32']
end
cumulus_license 'test' do
source 'http://localhost/test.lic'
end
cumulus_license 'test-with-force' do
source 'http://localhost/test.lic'
force true
end
| directory '/usr/cumulus/bin' do
recursive true
end
directory '/etc/cumulus' do
end
file '/usr/cumulus/bin/cl-license' do
content '#!/bin/sh
echo "Rocket Turtle!\nexpires=$(date +%s)\n$0 $@" > /etc/cumulus/.license.txt'
mode '0755'
end
include_recipe "cumulus-test::ports"
include_recipe "cumulus-test::license"
| Split the license & ports provider invocations to separate recipes | Split the license & ports provider invocations to separate recipes
| Ruby | apache-2.0 | Vanders/ifupdown2,Vanders/ifupdown2 | ruby | ## Code Before:
directory '/usr/cumulus/bin' do
recursive true
end
directory '/etc/cumulus' do
end
file '/usr/cumulus/bin/cl-license' do
content '#!/bin/sh
echo "Rocket Turtle!\nexpires=$(date +%s)\n$0 $@" > /etc/cumulus/.license.txt'
mode '0755'
end
# Invoke the providers
cumulus_ports 'speeds' do
speed_10g ['swp1']
speed_40g ['swp3','swp5-10', 'swp12']
speed_40g_div_4 ['swp15','swp16']
speed_4_by_10g ['swp20-32']
end
cumulus_license 'test' do
source 'http://localhost/test.lic'
end
cumulus_license 'test-with-force' do
source 'http://localhost/test.lic'
force true
end
## Instruction:
Split the license & ports provider invocations to separate recipes
## Code After:
directory '/usr/cumulus/bin' do
recursive true
end
directory '/etc/cumulus' do
end
file '/usr/cumulus/bin/cl-license' do
content '#!/bin/sh
echo "Rocket Turtle!\nexpires=$(date +%s)\n$0 $@" > /etc/cumulus/.license.txt'
mode '0755'
end
include_recipe "cumulus-test::ports"
include_recipe "cumulus-test::license"
| directory '/usr/cumulus/bin' do
recursive true
end
directory '/etc/cumulus' do
end
file '/usr/cumulus/bin/cl-license' do
content '#!/bin/sh
echo "Rocket Turtle!\nexpires=$(date +%s)\n$0 $@" > /etc/cumulus/.license.txt'
mode '0755'
end
+ include_recipe "cumulus-test::ports"
+ include_recipe "cumulus-test::license"
- # Invoke the providers
- cumulus_ports 'speeds' do
- speed_10g ['swp1']
- speed_40g ['swp3','swp5-10', 'swp12']
- speed_40g_div_4 ['swp15','swp16']
- speed_4_by_10g ['swp20-32']
- end
-
- cumulus_license 'test' do
- source 'http://localhost/test.lic'
- end
-
- cumulus_license 'test-with-force' do
- source 'http://localhost/test.lic'
- force true
- end | 18 | 0.62069 | 2 | 16 |
eea495cb1727af073cbc385b3f0ebfbbff36f928 | src/js/store/migration.js | src/js/store/migration.js |
export default (state) => {
if (state.ui.shortkeys_enabled !== undefined) {
state.ui.hotkeys_enabled = state.ui.shortkeys_enabled;
}
return state;
};
| export default (state) => {
if (state.ui.shortkeys_enabled !== undefined) {
state.ui.hotkeys_enabled = state.ui.shortkeys_enabled;
}
// Change authorizations from 'null' to 'undefined' to fix destructuring issues like
// that of https://github.com/jaedb/Iris/issues/662
if (state.spotify && state.spotify.me === null) state.spotify.me = undefined;
if (state.lastfm && state.lastfm.me === null) state.lastfm.me = undefined;
if (state.genius && state.genius.me === null) state.genius.me = undefined;
return state;
};
| Migrate me from null to undefined, prevents corrupted state | Migrate me from null to undefined, prevents corrupted state
| JavaScript | apache-2.0 | jaedb/Iris,jaedb/Iris,jaedb/Iris,jaedb/Iris,jaedb/Iris,jaedb/Iris | javascript | ## Code Before:
export default (state) => {
if (state.ui.shortkeys_enabled !== undefined) {
state.ui.hotkeys_enabled = state.ui.shortkeys_enabled;
}
return state;
};
## Instruction:
Migrate me from null to undefined, prevents corrupted state
## Code After:
export default (state) => {
if (state.ui.shortkeys_enabled !== undefined) {
state.ui.hotkeys_enabled = state.ui.shortkeys_enabled;
}
// Change authorizations from 'null' to 'undefined' to fix destructuring issues like
// that of https://github.com/jaedb/Iris/issues/662
if (state.spotify && state.spotify.me === null) state.spotify.me = undefined;
if (state.lastfm && state.lastfm.me === null) state.lastfm.me = undefined;
if (state.genius && state.genius.me === null) state.genius.me = undefined;
return state;
};
| -
export default (state) => {
if (state.ui.shortkeys_enabled !== undefined) {
state.ui.hotkeys_enabled = state.ui.shortkeys_enabled;
}
+
+ // Change authorizations from 'null' to 'undefined' to fix destructuring issues like
+ // that of https://github.com/jaedb/Iris/issues/662
+ if (state.spotify && state.spotify.me === null) state.spotify.me = undefined;
+ if (state.lastfm && state.lastfm.me === null) state.lastfm.me = undefined;
+ if (state.genius && state.genius.me === null) state.genius.me = undefined;
+
return state;
}; | 8 | 1.142857 | 7 | 1 |
890e03aafbf2fefc6d27c2d7d0d54479b0285f70 | gulp/utils/command.js | gulp/utils/command.js | 'use strict';
const { spawn } = require('child_process');
const PluginError = require('plugin-error');
// Execute a shell command
const execCommand = function (command) {
const [commandA, ...args] = command.trim().split(/ +/);
const child = spawn(commandA, args, { stdio: 'inherit' });
// eslint-disable-next-line promise/avoid-new
return new Promise(execCommandPromise.bind(null, child));
};
const execCommandPromise = function (child, resolve, reject) {
child.on('exit', execCommandExit.bind(null, resolve, reject));
};
const execCommandExit = function (resolve, reject, exitCode) {
if (exitCode === 0) { return resolve(); }
const error = new PluginError('shell', 'Shell command failed');
reject(error);
};
module.exports = {
execCommand,
};
| 'use strict';
const { env } = require('process');
const { spawn } = require('child_process');
const PluginError = require('plugin-error');
// Execute a shell command
const execCommand = function (command) {
const [commandA, ...args] = command.trim().split(/ +/);
const envA = getEnv();
const child = spawn(commandA, args, { stdio: 'inherit', env: envA });
// eslint-disable-next-line promise/avoid-new
return new Promise(execCommandPromise.bind(null, child));
};
// Adds local Node modules binary to `$PATH`
const getEnv = function () {
const PATH = getPath({ env });
const envA = { ...env, PATH };
return envA;
};
const getPath = function ({ env: { PATH = '' } }) {
const hasLocalDir = PATH.split(':').includes(LOCAL_NODE_BIN_DIR);
if (hasLocalDir) { return PATH; }
return `${PATH}:${LOCAL_NODE_BIN_DIR}`;
};
const LOCAL_NODE_BIN_DIR = './node_modules/.bin/';
// Check command exit code
const execCommandPromise = function (child, resolve, reject) {
child.on('exit', execCommandExit.bind(null, resolve, reject));
};
const execCommandExit = function (resolve, reject, exitCode) {
if (exitCode === 0) { return resolve(); }
const error = new PluginError('shell', 'Shell command failed');
reject(error);
};
module.exports = {
execCommand,
};
| Fix ./node_modules/.bin not being in $PATH for Gulp | Fix ./node_modules/.bin not being in $PATH for Gulp
| JavaScript | apache-2.0 | autoserver-org/autoserver,autoserver-org/autoserver | javascript | ## Code Before:
'use strict';
const { spawn } = require('child_process');
const PluginError = require('plugin-error');
// Execute a shell command
const execCommand = function (command) {
const [commandA, ...args] = command.trim().split(/ +/);
const child = spawn(commandA, args, { stdio: 'inherit' });
// eslint-disable-next-line promise/avoid-new
return new Promise(execCommandPromise.bind(null, child));
};
const execCommandPromise = function (child, resolve, reject) {
child.on('exit', execCommandExit.bind(null, resolve, reject));
};
const execCommandExit = function (resolve, reject, exitCode) {
if (exitCode === 0) { return resolve(); }
const error = new PluginError('shell', 'Shell command failed');
reject(error);
};
module.exports = {
execCommand,
};
## Instruction:
Fix ./node_modules/.bin not being in $PATH for Gulp
## Code After:
'use strict';
const { env } = require('process');
const { spawn } = require('child_process');
const PluginError = require('plugin-error');
// Execute a shell command
const execCommand = function (command) {
const [commandA, ...args] = command.trim().split(/ +/);
const envA = getEnv();
const child = spawn(commandA, args, { stdio: 'inherit', env: envA });
// eslint-disable-next-line promise/avoid-new
return new Promise(execCommandPromise.bind(null, child));
};
// Adds local Node modules binary to `$PATH`
const getEnv = function () {
const PATH = getPath({ env });
const envA = { ...env, PATH };
return envA;
};
const getPath = function ({ env: { PATH = '' } }) {
const hasLocalDir = PATH.split(':').includes(LOCAL_NODE_BIN_DIR);
if (hasLocalDir) { return PATH; }
return `${PATH}:${LOCAL_NODE_BIN_DIR}`;
};
const LOCAL_NODE_BIN_DIR = './node_modules/.bin/';
// Check command exit code
const execCommandPromise = function (child, resolve, reject) {
child.on('exit', execCommandExit.bind(null, resolve, reject));
};
const execCommandExit = function (resolve, reject, exitCode) {
if (exitCode === 0) { return resolve(); }
const error = new PluginError('shell', 'Shell command failed');
reject(error);
};
module.exports = {
execCommand,
};
| 'use strict';
+ const { env } = require('process');
const { spawn } = require('child_process');
const PluginError = require('plugin-error');
// Execute a shell command
const execCommand = function (command) {
const [commandA, ...args] = command.trim().split(/ +/);
+ const envA = getEnv();
- const child = spawn(commandA, args, { stdio: 'inherit' });
+ const child = spawn(commandA, args, { stdio: 'inherit', env: envA });
? +++++++++++
// eslint-disable-next-line promise/avoid-new
return new Promise(execCommandPromise.bind(null, child));
};
+ // Adds local Node modules binary to `$PATH`
+ const getEnv = function () {
+ const PATH = getPath({ env });
+ const envA = { ...env, PATH };
+ return envA;
+ };
+
+ const getPath = function ({ env: { PATH = '' } }) {
+ const hasLocalDir = PATH.split(':').includes(LOCAL_NODE_BIN_DIR);
+ if (hasLocalDir) { return PATH; }
+
+ return `${PATH}:${LOCAL_NODE_BIN_DIR}`;
+ };
+
+ const LOCAL_NODE_BIN_DIR = './node_modules/.bin/';
+
+ // Check command exit code
const execCommandPromise = function (child, resolve, reject) {
child.on('exit', execCommandExit.bind(null, resolve, reject));
};
const execCommandExit = function (resolve, reject, exitCode) {
if (exitCode === 0) { return resolve(); }
const error = new PluginError('shell', 'Shell command failed');
reject(error);
};
module.exports = {
execCommand,
}; | 21 | 0.724138 | 20 | 1 |
b178b9ce2b76283efc0b9ad8a434e4373a23ac5e | lib/transfer_wise.rb | lib/transfer_wise.rb | require 'open-uri'
require 'oauth2'
require 'rest-client'
require 'json'
# Version
require "transfer_wise/version"
# Oauth2 Authentication
require "transfer_wise/oauth"
# Resources
require 'transfer_wise/transfer_wise_object'
require 'transfer_wise/api_resource'
require 'transfer_wise/profile'
require 'transfer_wise/quote'
require 'transfer_wise/account'
require 'transfer_wise/transfer'
require 'transfer_wise/util'
require 'transfer_wise/request'
require 'transfer_wise/borderless_account'
require 'transfer_wise/borderless_account/balance_currency'
require 'transfer_wise/borderless_account/statement'
require 'transfer_wise/borderless_account/transaction'
# Errors
require 'transfer_wise/transfer_wise_error'
module TransferWise
class << self
attr_accessor :mode
attr_accessor :access_token
def api_base
@api_base ||= "https://#{mode == 'live' ? 'api' : 'test-api'}.transferwise.com"
end
end
end
| require 'open-uri'
require 'oauth2'
require 'rest-client'
require 'json'
# Version
require "transfer_wise/version"
# Oauth2 Authentication
require "transfer_wise/oauth"
# Resources
require 'transfer_wise/transfer_wise_object'
require 'transfer_wise/api_resource'
require 'transfer_wise/profile'
require 'transfer_wise/quote'
require 'transfer_wise/account'
require 'transfer_wise/transfer'
require 'transfer_wise/util'
require 'transfer_wise/request'
require 'transfer_wise/borderless_account'
require 'transfer_wise/borderless_account/balance_currency'
require 'transfer_wise/borderless_account/statement'
require 'transfer_wise/borderless_account/transaction'
# Errors
require 'transfer_wise/transfer_wise_error'
module TransferWise
class << self
attr_accessor :mode
attr_accessor :access_token
def api_base
@api_base ||= mode == 'live' ? 'https://api.transferwise.com' : 'https://api.sandbox.transferwise.tech'
end
end
end
| Add the new Sandbox URL | Add the new Sandbox URL
There's a new API base for the sandbox application.
Note: the authorize action requires a different URL in sandbox (https://sandbox.transferwise.tech), but in production it still requires https://api.transferwise.com.
| Ruby | mit | Milaap/transferwise-rb,Milaap/transferwise-rb | ruby | ## Code Before:
require 'open-uri'
require 'oauth2'
require 'rest-client'
require 'json'
# Version
require "transfer_wise/version"
# Oauth2 Authentication
require "transfer_wise/oauth"
# Resources
require 'transfer_wise/transfer_wise_object'
require 'transfer_wise/api_resource'
require 'transfer_wise/profile'
require 'transfer_wise/quote'
require 'transfer_wise/account'
require 'transfer_wise/transfer'
require 'transfer_wise/util'
require 'transfer_wise/request'
require 'transfer_wise/borderless_account'
require 'transfer_wise/borderless_account/balance_currency'
require 'transfer_wise/borderless_account/statement'
require 'transfer_wise/borderless_account/transaction'
# Errors
require 'transfer_wise/transfer_wise_error'
module TransferWise
class << self
attr_accessor :mode
attr_accessor :access_token
def api_base
@api_base ||= "https://#{mode == 'live' ? 'api' : 'test-api'}.transferwise.com"
end
end
end
## Instruction:
Add the new Sandbox URL
There's a new API base for the sandbox application.
Note: the authorize action requires a different URL in sandbox (https://sandbox.transferwise.tech), but in production it still requires https://api.transferwise.com.
## Code After:
require 'open-uri'
require 'oauth2'
require 'rest-client'
require 'json'
# Version
require "transfer_wise/version"
# Oauth2 Authentication
require "transfer_wise/oauth"
# Resources
require 'transfer_wise/transfer_wise_object'
require 'transfer_wise/api_resource'
require 'transfer_wise/profile'
require 'transfer_wise/quote'
require 'transfer_wise/account'
require 'transfer_wise/transfer'
require 'transfer_wise/util'
require 'transfer_wise/request'
require 'transfer_wise/borderless_account'
require 'transfer_wise/borderless_account/balance_currency'
require 'transfer_wise/borderless_account/statement'
require 'transfer_wise/borderless_account/transaction'
# Errors
require 'transfer_wise/transfer_wise_error'
module TransferWise
class << self
attr_accessor :mode
attr_accessor :access_token
def api_base
@api_base ||= mode == 'live' ? 'https://api.transferwise.com' : 'https://api.sandbox.transferwise.tech'
end
end
end
| require 'open-uri'
require 'oauth2'
require 'rest-client'
require 'json'
# Version
require "transfer_wise/version"
# Oauth2 Authentication
require "transfer_wise/oauth"
# Resources
require 'transfer_wise/transfer_wise_object'
require 'transfer_wise/api_resource'
require 'transfer_wise/profile'
require 'transfer_wise/quote'
require 'transfer_wise/account'
require 'transfer_wise/transfer'
require 'transfer_wise/util'
require 'transfer_wise/request'
require 'transfer_wise/borderless_account'
require 'transfer_wise/borderless_account/balance_currency'
require 'transfer_wise/borderless_account/statement'
require 'transfer_wise/borderless_account/transaction'
# Errors
require 'transfer_wise/transfer_wise_error'
module TransferWise
class << self
attr_accessor :mode
attr_accessor :access_token
def api_base
- @api_base ||= "https://#{mode == 'live' ? 'api' : 'test-api'}.transferwise.com"
+ @api_base ||= mode == 'live' ? 'https://api.transferwise.com' : 'https://api.sandbox.transferwise.tech'
end
end
end | 2 | 0.051282 | 1 | 1 |
31184f9ae7d57b5e265b3521fd19a9269eb05d54 | README.md | README.md |
Azure WebJobs provide an easy way to run background tasks on Azure App Service. The WebJobs SDK makes it easy to start jobs based on events like new Queue messages or new BLOBs.
Learn more about WebJobs:
- [WebJobs Docs](http://aka.ms/webjobs-docs)
- [WebJobs SDK GitHub](https://github.com/azure/azure-webjobs-sdk)
- [WebJobs SDK Extensions GitHub](https://github.com/azure/azure-webjobs-sdk-extensions)
## Getting Started
1. Clone or download the repository
2. Open the solution with Visual Studio (2013/2015). Be sure to have the latest Azure SDK installed.
3. In the App.config, paste in your Azure Storage connection string for the following settings:
<add name="AzureWebJobsDashboard" connectionString="" />
<add name="AzureWebJobsStorage" connectionString="" />
4. Hit run to test locally - you'll see new Queue Messages created on a regular basis.
5. Right click on the solution and select "Publish as Azure WebJob" to publish to your App Service App (Web, Mobile, API, etc.)
## Contribute
Please follow the [Azure Contributor Guidelines](http://azure.github.io/guidelines.html). PR requests generally require you accept a CLA. Any issues with the SDK itself should be created in the [WebJobs SDK GitHub](https://github.com/azure/azure-webjobs-sdk).
## License
[MIT](./LICENSE.txt)
|
Azure WebJobs provide an easy way to run background tasks on Azure App Service. The WebJobs SDK makes it easy to start jobs based on events like new Queue messages or new BLOBs.
Learn more about WebJobs:
- [WebJobs Docs](http://aka.ms/webjobs-docs)
- [WebJobs SDK GitHub](https://github.com/azure/azure-webjobs-sdk)
- [WebJobs SDK Extensions GitHub](https://github.com/azure/azure-webjobs-sdk-extensions)
## Getting Started
1. Clone or download the repository
2. Open the solution with Visual Studio (2013/2015). Be sure to have the latest Azure SDK installed.
3. In the App.config, paste in your Azure Storage connection string for the following settings:
```
<add name="AzureWebJobsDashboard" connectionString="" />
<add name="AzureWebJobsStorage" connectionString="" />
```
4. Hit run to test locally - you'll see new Queue Messages created on a regular basis.
5. Right click on the solution and select "Publish as Azure WebJob" to publish to your App Service App (Web, Mobile, API, etc.)
## Contribute
Please follow the [Azure Contributor Guidelines](http://azure.github.io/guidelines.html). PR requests generally require you accept a CLA. Any issues with the SDK itself should be created in the [WebJobs SDK GitHub](https://github.com/azure/azure-webjobs-sdk).
## License
[MIT](./LICENSE.txt)
| Fix syntax for connection string snippet | Fix syntax for connection string snippet
| Markdown | mit | Azure/azure-webjobs-quickstart | markdown | ## Code Before:
Azure WebJobs provide an easy way to run background tasks on Azure App Service. The WebJobs SDK makes it easy to start jobs based on events like new Queue messages or new BLOBs.
Learn more about WebJobs:
- [WebJobs Docs](http://aka.ms/webjobs-docs)
- [WebJobs SDK GitHub](https://github.com/azure/azure-webjobs-sdk)
- [WebJobs SDK Extensions GitHub](https://github.com/azure/azure-webjobs-sdk-extensions)
## Getting Started
1. Clone or download the repository
2. Open the solution with Visual Studio (2013/2015). Be sure to have the latest Azure SDK installed.
3. In the App.config, paste in your Azure Storage connection string for the following settings:
<add name="AzureWebJobsDashboard" connectionString="" />
<add name="AzureWebJobsStorage" connectionString="" />
4. Hit run to test locally - you'll see new Queue Messages created on a regular basis.
5. Right click on the solution and select "Publish as Azure WebJob" to publish to your App Service App (Web, Mobile, API, etc.)
## Contribute
Please follow the [Azure Contributor Guidelines](http://azure.github.io/guidelines.html). PR requests generally require you accept a CLA. Any issues with the SDK itself should be created in the [WebJobs SDK GitHub](https://github.com/azure/azure-webjobs-sdk).
## License
[MIT](./LICENSE.txt)
## Instruction:
Fix syntax for connection string snippet
## Code After:
Azure WebJobs provide an easy way to run background tasks on Azure App Service. The WebJobs SDK makes it easy to start jobs based on events like new Queue messages or new BLOBs.
Learn more about WebJobs:
- [WebJobs Docs](http://aka.ms/webjobs-docs)
- [WebJobs SDK GitHub](https://github.com/azure/azure-webjobs-sdk)
- [WebJobs SDK Extensions GitHub](https://github.com/azure/azure-webjobs-sdk-extensions)
## Getting Started
1. Clone or download the repository
2. Open the solution with Visual Studio (2013/2015). Be sure to have the latest Azure SDK installed.
3. In the App.config, paste in your Azure Storage connection string for the following settings:
```
<add name="AzureWebJobsDashboard" connectionString="" />
<add name="AzureWebJobsStorage" connectionString="" />
```
4. Hit run to test locally - you'll see new Queue Messages created on a regular basis.
5. Right click on the solution and select "Publish as Azure WebJob" to publish to your App Service App (Web, Mobile, API, etc.)
## Contribute
Please follow the [Azure Contributor Guidelines](http://azure.github.io/guidelines.html). PR requests generally require you accept a CLA. Any issues with the SDK itself should be created in the [WebJobs SDK GitHub](https://github.com/azure/azure-webjobs-sdk).
## License
[MIT](./LICENSE.txt)
|
Azure WebJobs provide an easy way to run background tasks on Azure App Service. The WebJobs SDK makes it easy to start jobs based on events like new Queue messages or new BLOBs.
Learn more about WebJobs:
- [WebJobs Docs](http://aka.ms/webjobs-docs)
- [WebJobs SDK GitHub](https://github.com/azure/azure-webjobs-sdk)
- [WebJobs SDK Extensions GitHub](https://github.com/azure/azure-webjobs-sdk-extensions)
## Getting Started
1. Clone or download the repository
2. Open the solution with Visual Studio (2013/2015). Be sure to have the latest Azure SDK installed.
3. In the App.config, paste in your Azure Storage connection string for the following settings:
+ ```
- <add name="AzureWebJobsDashboard" connectionString="" />
? --------
+ <add name="AzureWebJobsDashboard" connectionString="" />
- <add name="AzureWebJobsStorage" connectionString="" />
? --------
+ <add name="AzureWebJobsStorage" connectionString="" />
+ ```
4. Hit run to test locally - you'll see new Queue Messages created on a regular basis.
5. Right click on the solution and select "Publish as Azure WebJob" to publish to your App Service App (Web, Mobile, API, etc.)
## Contribute
Please follow the [Azure Contributor Guidelines](http://azure.github.io/guidelines.html). PR requests generally require you accept a CLA. Any issues with the SDK itself should be created in the [WebJobs SDK GitHub](https://github.com/azure/azure-webjobs-sdk).
## License
[MIT](./LICENSE.txt) | 6 | 0.24 | 4 | 2 |
94313d62dfd2cf0e6157901f629e501225246d62 | .travis.yml | .travis.yml | language: go
go:
- 1.7.1
# AF: Ugly, but the dependencies here are a bit of a clusterfuck.
install:
- go get github.com/masterzen/winrm
- cd $GOPATH/src/github.com/masterzen/winrm
- git checkout 54ea5d01478cfc2afccec1504bd0dfcd8c260cfa
- mkdir -p $GOPATH/src/github.com/packer-community
- git clone https://github.com/packer-community/winrmcp $GOPATH/src/github.com/packer-community/winrmcp
- cd $GOPATH/src/github.com/packer-community/winrmcp
- git checkout f1bcf36a69fa2945e65dd099eee11b560fbd3346
- go get github.com/mitchellh/packer
- cd $GOPATH/src/github.com/mitchellh/packer
- git checkout v0.8.6
- go get -v -d ./... || true
| language: go
go:
- 1.7.1
# AF: Ugly, but the dependencies here are a bit of a clusterfuck.
install:
- go get github.com/masterzen/winrm
- cd $GOPATH/src/github.com/masterzen/winrm
- git checkout 54ea5d01478cfc2afccec1504bd0dfcd8c260cfa
- mkdir -p $GOPATH/src/github.com/packer-community
- git clone https://github.com/packer-community/winrmcp $GOPATH/src/github.com/packer-community/winrmcp
- cd $GOPATH/src/github.com/packer-community/winrmcp
- git checkout f1bcf36a69fa2945e65dd099eee11b560fbd3346
- go get github.com/mitchellh/packer
- cd $GOPATH/src/github.com/mitchellh/packer
- git checkout v0.8.6
- go get -v -d ./... || true
- cd $GOPATH/src/github.com/DimensionDataResearch/packer-plugins-ddcloud
| Move to repo root before CI build. | Move to repo root before CI build.
| YAML | apache-2.0 | DimensionDataResearch/packer-plugins-ddcloud,DimensionDataResearch/packer-plugins-ddcloud,DimensionDataResearch/packer-plugins-ddcloud | yaml | ## Code Before:
language: go
go:
- 1.7.1
# AF: Ugly, but the dependencies here are a bit of a clusterfuck.
install:
- go get github.com/masterzen/winrm
- cd $GOPATH/src/github.com/masterzen/winrm
- git checkout 54ea5d01478cfc2afccec1504bd0dfcd8c260cfa
- mkdir -p $GOPATH/src/github.com/packer-community
- git clone https://github.com/packer-community/winrmcp $GOPATH/src/github.com/packer-community/winrmcp
- cd $GOPATH/src/github.com/packer-community/winrmcp
- git checkout f1bcf36a69fa2945e65dd099eee11b560fbd3346
- go get github.com/mitchellh/packer
- cd $GOPATH/src/github.com/mitchellh/packer
- git checkout v0.8.6
- go get -v -d ./... || true
## Instruction:
Move to repo root before CI build.
## Code After:
language: go
go:
- 1.7.1
# AF: Ugly, but the dependencies here are a bit of a clusterfuck.
install:
- go get github.com/masterzen/winrm
- cd $GOPATH/src/github.com/masterzen/winrm
- git checkout 54ea5d01478cfc2afccec1504bd0dfcd8c260cfa
- mkdir -p $GOPATH/src/github.com/packer-community
- git clone https://github.com/packer-community/winrmcp $GOPATH/src/github.com/packer-community/winrmcp
- cd $GOPATH/src/github.com/packer-community/winrmcp
- git checkout f1bcf36a69fa2945e65dd099eee11b560fbd3346
- go get github.com/mitchellh/packer
- cd $GOPATH/src/github.com/mitchellh/packer
- git checkout v0.8.6
- go get -v -d ./... || true
- cd $GOPATH/src/github.com/DimensionDataResearch/packer-plugins-ddcloud
| language: go
go:
- 1.7.1
# AF: Ugly, but the dependencies here are a bit of a clusterfuck.
install:
- go get github.com/masterzen/winrm
- cd $GOPATH/src/github.com/masterzen/winrm
- git checkout 54ea5d01478cfc2afccec1504bd0dfcd8c260cfa
- mkdir -p $GOPATH/src/github.com/packer-community
- git clone https://github.com/packer-community/winrmcp $GOPATH/src/github.com/packer-community/winrmcp
- cd $GOPATH/src/github.com/packer-community/winrmcp
- git checkout f1bcf36a69fa2945e65dd099eee11b560fbd3346
- go get github.com/mitchellh/packer
- cd $GOPATH/src/github.com/mitchellh/packer
- git checkout v0.8.6
- go get -v -d ./... || true
+ - cd $GOPATH/src/github.com/DimensionDataResearch/packer-plugins-ddcloud
+ | 2 | 0.111111 | 2 | 0 |
913c957d490503d189314d07e1ddc9fc41653baa | src/main/java/br/com/tdsis/lambda/forest/json/JsonResponseBodySerializerStrategy.java | src/main/java/br/com/tdsis/lambda/forest/json/JsonResponseBodySerializerStrategy.java | package br.com.tdsis.lambda.forest.json;
import com.fasterxml.jackson.annotation.JsonInclude.Include;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import br.com.tdsis.lambda.forest.http.ResponseBodySerializerStrategy;
import br.com.tdsis.lambda.forest.http.exception.HttpException;
import br.com.tdsis.lambda.forest.http.exception.InternalServerErrorException;
/**
* The JSON response body serializer strategy
*
* @author nmelo
* @version 1.0.0
* @since 1.0.0
*/
public class JsonResponseBodySerializerStrategy implements ResponseBodySerializerStrategy {
@Override
public String serialize(Object entity) throws HttpException {
String json = null;
try {
ObjectMapper mapper = new ObjectMapper();
mapper.setSerializationInclusion(Include.NON_NULL);
json = mapper.writeValueAsString(entity);
} catch (JsonProcessingException e) {
throw new InternalServerErrorException(e.getMessage(), e);
}
return json;
}
}
| package br.com.tdsis.lambda.forest.json;
import com.fasterxml.jackson.annotation.JsonInclude.Include;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import br.com.tdsis.lambda.forest.http.ResponseBodySerializerStrategy;
import br.com.tdsis.lambda.forest.http.exception.HttpException;
import br.com.tdsis.lambda.forest.http.exception.InternalServerErrorException;
/**
* The JSON response body serializer strategy
*
* @author nmelo
* @version 1.0.0
* @since 1.0.0
*/
public class JsonResponseBodySerializerStrategy implements ResponseBodySerializerStrategy {
@Override
public String serialize(Object entity) throws HttpException {
String json = null;
try {
ObjectMapper mapper = new ObjectMapper();
mapper.setSerializationInclusion(Include.NON_NULL);
mapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false);
json = mapper.writeValueAsString(entity);
} catch (JsonProcessingException e) {
throw new InternalServerErrorException(e.getMessage(), e);
}
return json;
}
}
| Set FAIL_ON_EMPTY_BEANS to false when serializing responses. Fixes gh-6 | Set FAIL_ON_EMPTY_BEANS to false when serializing responses. Fixes gh-6
| Java | mit | tdsis/lambda-forest | java | ## Code Before:
package br.com.tdsis.lambda.forest.json;
import com.fasterxml.jackson.annotation.JsonInclude.Include;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import br.com.tdsis.lambda.forest.http.ResponseBodySerializerStrategy;
import br.com.tdsis.lambda.forest.http.exception.HttpException;
import br.com.tdsis.lambda.forest.http.exception.InternalServerErrorException;
/**
* The JSON response body serializer strategy
*
* @author nmelo
* @version 1.0.0
* @since 1.0.0
*/
public class JsonResponseBodySerializerStrategy implements ResponseBodySerializerStrategy {
@Override
public String serialize(Object entity) throws HttpException {
String json = null;
try {
ObjectMapper mapper = new ObjectMapper();
mapper.setSerializationInclusion(Include.NON_NULL);
json = mapper.writeValueAsString(entity);
} catch (JsonProcessingException e) {
throw new InternalServerErrorException(e.getMessage(), e);
}
return json;
}
}
## Instruction:
Set FAIL_ON_EMPTY_BEANS to false when serializing responses. Fixes gh-6
## Code After:
package br.com.tdsis.lambda.forest.json;
import com.fasterxml.jackson.annotation.JsonInclude.Include;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import br.com.tdsis.lambda.forest.http.ResponseBodySerializerStrategy;
import br.com.tdsis.lambda.forest.http.exception.HttpException;
import br.com.tdsis.lambda.forest.http.exception.InternalServerErrorException;
/**
* The JSON response body serializer strategy
*
* @author nmelo
* @version 1.0.0
* @since 1.0.0
*/
public class JsonResponseBodySerializerStrategy implements ResponseBodySerializerStrategy {
@Override
public String serialize(Object entity) throws HttpException {
String json = null;
try {
ObjectMapper mapper = new ObjectMapper();
mapper.setSerializationInclusion(Include.NON_NULL);
mapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false);
json = mapper.writeValueAsString(entity);
} catch (JsonProcessingException e) {
throw new InternalServerErrorException(e.getMessage(), e);
}
return json;
}
}
| package br.com.tdsis.lambda.forest.json;
import com.fasterxml.jackson.annotation.JsonInclude.Include;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
+ import com.fasterxml.jackson.databind.SerializationFeature;
import br.com.tdsis.lambda.forest.http.ResponseBodySerializerStrategy;
import br.com.tdsis.lambda.forest.http.exception.HttpException;
import br.com.tdsis.lambda.forest.http.exception.InternalServerErrorException;
/**
* The JSON response body serializer strategy
*
* @author nmelo
* @version 1.0.0
* @since 1.0.0
*/
public class JsonResponseBodySerializerStrategy implements ResponseBodySerializerStrategy {
@Override
public String serialize(Object entity) throws HttpException {
String json = null;
try {
- ObjectMapper mapper = new ObjectMapper();
+ ObjectMapper mapper = new ObjectMapper();
? ++++++++++++
mapper.setSerializationInclusion(Include.NON_NULL);
+ mapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false);
+
json = mapper.writeValueAsString(entity);
} catch (JsonProcessingException e) {
throw new InternalServerErrorException(e.getMessage(), e);
}
return json;
}
} | 5 | 0.135135 | 4 | 1 |
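The Jackson change above disables `FAIL_ON_EMPTY_BEANS`, so a bean with no serializable properties renders as `{}` instead of raising. For readers outside the JVM, a rough Python analogue of the same "don't fail on empty objects" policy — the `Empty` class and `permissive` hook are illustrative, not part of the project:

```python
import json

class Empty:
    """An object with no serializable attributes -- the analogue of a
    bean with no properties, which json.dumps cannot handle by default."""
    pass

def permissive(obj):
    # Instead of raising TypeError on an unserializable object (the
    # default, like FAIL_ON_EMPTY_BEANS=true), fall back to the
    # object's attribute dict, which may simply be empty.
    return vars(obj)

print(json.dumps({"payload": Empty()}, default=permissive))
# {"payload": {}}
```

As with the Jackson feature flag, the trade-off is silently emitting `{}` where a stricter serializer would have surfaced a modeling mistake.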
c3ce738972002459086ebcad268b9c9941b92683 | examples/README.md | examples/README.md |
This directory includes self-contained sample projects demonstrating techniques
to include and extend the [Blockly](http://github.com/google/blockly) library.
## Prerequisites
Install [node](https://nodejs.org/) and [npm](https://www.npmjs.com/get-npm).
## Running
```
cd <any sample folder>
npm install
npm run start
```
Browse to http://localhost:3000
You may need to refer to a sample's README for further setup and running instructions.
## Development
### Bootstrap
```
npm run boot
```
This will run ``npm install`` on every example.
### Maintenance
```
npm run update
```
This will run ``npm update`` on every example.
```
npm run audit
```
This will run ``npm audit fix`` on every example.
|
This directory includes self-contained sample projects demonstrating techniques
to include and extend the [Blockly](http://github.com/google/blockly) library.
## Samples
### Integrating Blockly
- [``blockly-requirejs-sample``](examples/blockly-requirejs/): Loads RequireJS from a CDN and loads Blockly using ``AMD``.
- [``blockly-umd-sample``](examples/blockly-umd/): Loads the UMD build of Blockly (``blockly.min.js``), both from node_modules and from Unpkg.
- [``blockly-webpack-sample``](blockly-webpack/): Using Blockly in Webpack.
- [``blockly-node-sample``](examples/blockly-node/): Using Blockly in Node.js, loaded using require (``CommonJS``).
- [``blockly-angular-sample``](examples/blockly-angular/): Blockly in an Angular project, defines an Angular Blockly Component.
- [``blockly-react-sample``](examples/blockly-react/): Blockly in a React project, defines a React Blockly Component.
- [``blockly-svelte-sample``](examples/blockly-svelte/): Blockly in a Svelte project, defines a Svelte Blockly Component.
- [``blockly-vue-sample``](examples/blockly-vue/): Blockly in a Vue project, defines a Vue Blockly Component.
### Real-time Collaboration
- [``blockly-rtc``](examples/blockly-rtc/): Real-time collaboration environment on top of the Blockly framework.
## Prerequisites
## Prerequisites
Install [node](https://nodejs.org/) and [npm](https://www.npmjs.com/get-npm).
## Running
```
cd <any sample folder>
npm install
npm run start
```
Browse to http://localhost:3000
You may need to refer to a sample's README for further setup and running instructions.
## Development
### Bootstrap
```
npm run boot
```
This will run ``npm install`` on every example.
### Maintenance
```
npm run update
```
This will run ``npm update`` on every example.
```
npm run audit
```
This will run ``npm audit fix`` on every example.
| Update readme for examples directory | Update readme for examples directory | Markdown | apache-2.0 | google/blockly-samples,google/blockly-samples,google/blockly-samples,google/blockly-samples | markdown | ## Code Before:
This directory includes self-contained sample projects demonstrating techniques
to include and extend the [Blockly](http://github.com/google/blockly) library.
## Prerequisites
Install [node](https://nodejs.org/) and [npm](https://www.npmjs.com/get-npm).
## Running
```
cd <any sample folder>
npm install
npm run start
```
Browse to http://localhost:3000
You may need to refer to a sample's README for further setup and running instructions.
## Development
### Bootstrap
```
npm run boot
```
This will run ``npm install`` on every example.
### Maintenance
```
npm run update
```
This will run ``npm update`` on every example.
```
npm run audit
```
This will run ``npm audit fix`` on every example.
## Instruction:
Update readme for examples directory
## Code After:
This directory includes self-contained sample projects demonstrating techniques
to include and extend the [Blockly](http://github.com/google/blockly) library.
## Samples
### Integrating Blockly
- [``blockly-requirejs-sample``](examples/blockly-requirejs/): Loads RequireJS from a CDN and loads Blockly using ``AMD``.
- [``blockly-umd-sample``](examples/blockly-umd/): Loads the UMD build of Blockly (``blockly.min.js``), both from node_modules and from Unpkg.
- [``blockly-webpack-sample``](blockly-webpack/): Using Blockly in Webpack.
- [``blockly-node-sample``](examples/blockly-node/): Using Blockly in Node.js, loaded using require (``CommonJS``).
- [``blockly-angular-sample``](examples/blockly-angular/): Blockly in an Angular project, defines an Angular Blockly Component.
- [``blockly-react-sample``](examples/blockly-react/): Blockly in a React project, defines a React Blockly Component.
- [``blockly-svelte-sample``](examples/blockly-svelte/): Blockly in a Svelte project, defines a Svelte Blockly Component.
- [``blockly-vue-sample``](examples/blockly-vue/): Blockly in a Vue project, defines a Vue Blockly Component.
### Real-time Collaboration
- [``blockly-rtc``](examples/blockly-rtc/): Real-time collaboration environment on top of the Blockly framework.
## Prerequisites
## Prerequisites
Install [node](https://nodejs.org/) and [npm](https://www.npmjs.com/get-npm).
## Running
```
cd <any sample folder>
npm install
npm run start
```
Browse to http://localhost:3000
You may need to refer to a sample's README for further setup and running instructions.
## Development
### Bootstrap
```
npm run boot
```
This will run ``npm install`` on every example.
### Maintenance
```
npm run update
```
This will run ``npm update`` on every example.
```
npm run audit
```
This will run ``npm audit fix`` on every example.
|
This directory includes self-contained sample projects demonstrating techniques
to include and extend the [Blockly](http://github.com/google/blockly) library.
+
+ ## Samples
+
+ ### Integrating Blockly
+ - [``blockly-requirejs-sample``](examples/blockly-requirejs/): Loads RequireJS from a CDN and loads Blockly using ``AMD``.
+ - [``blockly-umd-sample``](examples/blockly-umd/): Loads the UMD build of Blockly (``blockly.min.js``), both from node_modules and from Unpkg.
+ - [``blockly-webpack-sample``](blockly-webpack/): Using Blockly in Webpack.
+ - [``blockly-node-sample``](examples/blockly-node/): Using Blockly in Node.js, loaded using require (``CommonJS``).
+ - [``blockly-angular-sample``](examples/blockly-angular/): Blockly in an Angular project, defines an Angular Blockly Component.
+ - [``blockly-react-sample``](examples/blockly-react/): Blockly in a React project, defines a React Blockly Component.
+ - [``blockly-svelte-sample``](examples/blockly-svelte/): Blockly in a Svelte project, defines a Svelte Blockly Component.
+ - [``blockly-vue-sample``](examples/blockly-vue/): Blockly in a Vue project, defines a Vue Blockly Component.
+
+ ### Real-time Collaboration
+
+ - [``blockly-rtc``](examples/blockly-rtc/): Real-time collaboration environment on top of the Blockly framework.
+
+ ## Prerequisites
## Prerequisites
Install [node](https://nodejs.org/) and [npm](https://www.npmjs.com/get-npm).
## Running
```
cd <any sample folder>
npm install
npm run start
```
Browse to http://localhost:3000
You may need to refer to a sample's README for further setup and running instructions.
## Development
### Bootstrap
```
npm run boot
```
This will run ``npm install`` on every example.
### Maintenance
```
npm run update
```
This will run ``npm update`` on every example.
```
npm run audit
```
This will run ``npm audit fix`` on every example. | 18 | 0.45 | 18 | 0 |
91c8c2d70fecba7e4f38bbee6b2e129341f3074a | .expeditor/config.yml | .expeditor/config.yml |
github:
# The file where the MAJOR.MINOR.PATCH version is kept. The version in this file
# is bumped automatically via the `built_in:bump_version` merge_action.
version_file: "VERSION"
# The file where our CHANGELOG is kept. This file is updated automatically with
# details from the Pull Request via the `built_in:update_changelog` merge_action.
changelog_file: "CHANGELOG.md"
# These actions are taken, in order they are specified, anytime a Pull Request is merged.
merge_actions:
- built_in:update_changelog:
ignore_labels:
- "Expeditor: Exclude from Changelog"
changelog:
categories:
- "X-change": "Behavioral Changes"
- "X-feature": "New Features & Enhancements"
- "X-fix": "Bug Fixes"
|
github:
# The file where the MAJOR.MINOR.PATCH version is kept. The version in this file
# is bumped automatically via the `built_in:bump_version` merge_action.
version_file: "VERSION"
# The file where our CHANGELOG is kept. This file is updated automatically with
# details from the Pull Request via the `built_in:update_changelog` merge_action.
changelog_file: "CHANGELOG.md"
# Slack channel in Chef Software slack to send notifications about Expeditor actions
slack:
notify_channel: habitat-notify
# These actions are taken, in order they are specified, anytime a Pull Request is merged.
merge_actions:
- built_in:update_changelog:
ignore_labels:
- "Expeditor: Exclude from Changelog"
changelog:
categories:
- "X-change": "Behavioral Changes"
- "X-feature": "New Features & Enhancements"
- "X-fix": "Bug Fixes"
| Send Expeditor notifications to sa habitat specific channel in Chef Slack | Send Expeditor notifications to sa habitat specific channel in Chef Slack
Signed-off-by: Scott Hain <54f99c3933fb11a028e0e31efcf2f4c9707ec4bf@chef.io>
| YAML | apache-2.0 | habitat-sh/habitat,habitat-sh/habitat,rsertelon/habitat,habitat-sh/habitat,rsertelon/habitat,rsertelon/habitat,habitat-sh/habitat,rsertelon/habitat,rsertelon/habitat,rsertelon/habitat,habitat-sh/habitat,rsertelon/habitat,habitat-sh/habitat,habitat-sh/habitat,habitat-sh/habitat | yaml | ## Code Before:
github:
# The file where the MAJOR.MINOR.PATCH version is kept. The version in this file
# is bumped automatically via the `built_in:bump_version` merge_action.
version_file: "VERSION"
# The file where our CHANGELOG is kept. This file is updated automatically with
# details from the Pull Request via the `built_in:update_changelog` merge_action.
changelog_file: "CHANGELOG.md"
# These actions are taken, in order they are specified, anytime a Pull Request is merged.
merge_actions:
- built_in:update_changelog:
ignore_labels:
- "Expeditor: Exclude from Changelog"
changelog:
categories:
- "X-change": "Behavioral Changes"
- "X-feature": "New Features & Enhancements"
- "X-fix": "Bug Fixes"
## Instruction:
Send Expeditor notifications to a habitat specific channel in Chef Slack
Signed-off-by: Scott Hain <54f99c3933fb11a028e0e31efcf2f4c9707ec4bf@chef.io>
## Code After:
github:
# The file where the MAJOR.MINOR.PATCH version is kept. The version in this file
# is bumped automatically via the `built_in:bump_version` merge_action.
version_file: "VERSION"
# The file where our CHANGELOG is kept. This file is updated automatically with
# details from the Pull Request via the `built_in:update_changelog` merge_action.
changelog_file: "CHANGELOG.md"
# Slack channel in Chef Software slack to send notifications about Expeditor actions
slack:
notify_channel: habitat-notify
# These actions are taken, in order they are specified, anytime a Pull Request is merged.
merge_actions:
- built_in:update_changelog:
ignore_labels:
- "Expeditor: Exclude from Changelog"
changelog:
categories:
- "X-change": "Behavioral Changes"
- "X-feature": "New Features & Enhancements"
- "X-fix": "Bug Fixes"
|
github:
# The file where the MAJOR.MINOR.PATCH version is kept. The version in this file
# is bumped automatically via the `built_in:bump_version` merge_action.
version_file: "VERSION"
# The file where our CHANGELOG is kept. This file is updated automatically with
# details from the Pull Request via the `built_in:update_changelog` merge_action.
changelog_file: "CHANGELOG.md"
+
+ # Slack channel in Chef Software slack to send notifications about Expeditor actions
+ slack:
+ notify_channel: habitat-notify
# These actions are taken, in order they are specified, anytime a Pull Request is merged.
merge_actions:
- built_in:update_changelog:
ignore_labels:
- "Expeditor: Exclude from Changelog"
changelog:
categories:
- "X-change": "Behavioral Changes"
- "X-feature": "New Features & Enhancements"
- "X-fix": "Bug Fixes" | 4 | 0.2 | 4 | 0 |
e768ea5223b1a2c990b0b2bc37abcdd588babf5f | src/config.js | src/config.js | /** @namespace */
var here = window.here || {};
here.builder = {
config: {
shareURL: 'https://share.here.com/',
PBAPI: "https://places.api.here.com/places/v1/",
appId: '45vC3JHWu5fUqqUm9Ik2',
appCode: 'U8WhuCuhmYAHgttfjOEdfg'
}
};
| /** @namespace */
var here = window.here || {};
here.builder = {
config: {
shareURL: 'https://share.here.com/',
PBAPI: "https://places.api.here.com/places/v1/",
appId: 'g2UCyIKyOr3psUbVeSHD',
appCode: 'QDirXlCoI43I_Sm4vNeGcg'
}
};
| Use a new valid appId and appCode | Use a new valid appId and appCode
| JavaScript | mit | heremaps/map-linkbuilder-app,heremaps/map-linkbuilder-app,heremaps/map-linkbuilder-app | javascript | ## Code Before:
/** @namespace */
var here = window.here || {};
here.builder = {
config: {
shareURL: 'https://share.here.com/',
PBAPI: "https://places.api.here.com/places/v1/",
appId: '45vC3JHWu5fUqqUm9Ik2',
appCode: 'U8WhuCuhmYAHgttfjOEdfg'
}
};
## Instruction:
Use a new valid appId and appCode
## Code After:
/** @namespace */
var here = window.here || {};
here.builder = {
config: {
shareURL: 'https://share.here.com/',
PBAPI: "https://places.api.here.com/places/v1/",
appId: 'g2UCyIKyOr3psUbVeSHD',
appCode: 'QDirXlCoI43I_Sm4vNeGcg'
}
};
| /** @namespace */
var here = window.here || {};
here.builder = {
config: {
shareURL: 'https://share.here.com/',
PBAPI: "https://places.api.here.com/places/v1/",
- appId: '45vC3JHWu5fUqqUm9Ik2',
- appCode: 'U8WhuCuhmYAHgttfjOEdfg'
+ appId: 'g2UCyIKyOr3psUbVeSHD',
+ appCode: 'QDirXlCoI43I_Sm4vNeGcg'
}
}; | 4 | 0.363636 | 2 | 2 |
383b972bd9450e28f1b1c01241b1259852da29d4 | metadata/vocabletrainer.heinecke.aron.vocabletrainer.txt | metadata/vocabletrainer.heinecke.aron.vocabletrainer.txt | Categories:Science & Education
License:Apache-2.0
Web Site:
Source Code:https://github.com/0xpr03/VocableTrainer-Android
Issue Tracker:https://github.com/0xpr03/VocableTrainer-Android/issues
Auto Name:VocableTrainer
Summary:Learn vocables of foreign languages
Description:
Based on a Java [https://github.com/0xpr03/VocableTrainer VocableTrainer], this
allows you to create multiple lists and start a training session over a subset
of these.
.
Repo Type:git
Repo:https://github.com/0xpr03/VocableTrainer-Android
Build:1.0,1
commit=140dd8c4631825dece9732a8fe457b844aa6176c
subdir=app
gradle=yes
prebuild=sed -i -e '/constraint-layout/s/1.0.0-alpha7/1.0.2/' build.gradle
Build:0.1,2
commit=bb67656c5a255a9c6075a261a224b3bc9ce3429a
subdir=app
gradle=yes
Auto Update Mode:Version %v
Update Check Mode:Tags
Current Version:0.2
Current Version Code:3
| Categories:Science & Education
License:Apache-2.0
Web Site:
Source Code:https://github.com/0xpr03/VocableTrainer-Android
Issue Tracker:https://github.com/0xpr03/VocableTrainer-Android/issues
Auto Name:VocableTrainer Alpha
Summary:Learn vocables of foreign languages
Description:
Based on a Java [https://github.com/0xpr03/VocableTrainer VocableTrainer], this
allows you to create multiple lists and start a training session over a subset
of these.
.
Repo Type:git
Repo:https://github.com/0xpr03/VocableTrainer-Android
Build:1.0,1
commit=140dd8c4631825dece9732a8fe457b844aa6176c
subdir=app
gradle=yes
prebuild=sed -i -e '/constraint-layout/s/1.0.0-alpha7/1.0.2/' build.gradle
Build:0.1,2
commit=bb67656c5a255a9c6075a261a224b3bc9ce3429a
subdir=app
gradle=yes
Build:0.2,3
commit=0.2
subdir=app
gradle=yes
Auto Update Mode:Version %v
Update Check Mode:Tags
Current Version:0.2
Current Version Code:3
| Update VocableTrainer Alpha to 0.2 (3) | Update VocableTrainer Alpha to 0.2 (3)
| Text | agpl-3.0 | f-droid/fdroiddata,f-droid/fdroid-data,f-droid/fdroiddata | text | ## Code Before:
Categories:Science & Education
License:Apache-2.0
Web Site:
Source Code:https://github.com/0xpr03/VocableTrainer-Android
Issue Tracker:https://github.com/0xpr03/VocableTrainer-Android/issues
Auto Name:VocableTrainer
Summary:Learn vocables of foreign languages
Description:
Based on a Java [https://github.com/0xpr03/VocableTrainer VocableTrainer], this
allows you to create multiple lists and start a training session over a subset
of these.
.
Repo Type:git
Repo:https://github.com/0xpr03/VocableTrainer-Android
Build:1.0,1
commit=140dd8c4631825dece9732a8fe457b844aa6176c
subdir=app
gradle=yes
prebuild=sed -i -e '/constraint-layout/s/1.0.0-alpha7/1.0.2/' build.gradle
Build:0.1,2
commit=bb67656c5a255a9c6075a261a224b3bc9ce3429a
subdir=app
gradle=yes
Auto Update Mode:Version %v
Update Check Mode:Tags
Current Version:0.2
Current Version Code:3
## Instruction:
Update VocableTrainer Alpha to 0.2 (3)
## Code After:
Categories:Science & Education
License:Apache-2.0
Web Site:
Source Code:https://github.com/0xpr03/VocableTrainer-Android
Issue Tracker:https://github.com/0xpr03/VocableTrainer-Android/issues
Auto Name:VocableTrainer Alpha
Summary:Learn vocables of foreign languages
Description:
Based on a Java [https://github.com/0xpr03/VocableTrainer VocableTrainer], this
allows you to create multiple lists and start a training session over a subset
of these.
.
Repo Type:git
Repo:https://github.com/0xpr03/VocableTrainer-Android
Build:1.0,1
commit=140dd8c4631825dece9732a8fe457b844aa6176c
subdir=app
gradle=yes
prebuild=sed -i -e '/constraint-layout/s/1.0.0-alpha7/1.0.2/' build.gradle
Build:0.1,2
commit=bb67656c5a255a9c6075a261a224b3bc9ce3429a
subdir=app
gradle=yes
Build:0.2,3
commit=0.2
subdir=app
gradle=yes
Auto Update Mode:Version %v
Update Check Mode:Tags
Current Version:0.2
Current Version Code:3
| Categories:Science & Education
License:Apache-2.0
Web Site:
Source Code:https://github.com/0xpr03/VocableTrainer-Android
Issue Tracker:https://github.com/0xpr03/VocableTrainer-Android/issues
- Auto Name:VocableTrainer
+ Auto Name:VocableTrainer Alpha
? ++++++
Summary:Learn vocables of foreign languages
Description:
Based on a Java [https://github.com/0xpr03/VocableTrainer VocableTrainer], this
allows you to create multiple lists and start a training session over a subset
of these.
.
Repo Type:git
Repo:https://github.com/0xpr03/VocableTrainer-Android
Build:1.0,1
commit=140dd8c4631825dece9732a8fe457b844aa6176c
subdir=app
gradle=yes
prebuild=sed -i -e '/constraint-layout/s/1.0.0-alpha7/1.0.2/' build.gradle
Build:0.1,2
commit=bb67656c5a255a9c6075a261a224b3bc9ce3429a
subdir=app
gradle=yes
+ Build:0.2,3
+ commit=0.2
+ subdir=app
+ gradle=yes
+
Auto Update Mode:Version %v
Update Check Mode:Tags
Current Version:0.2
Current Version Code:3 | 7 | 0.21875 | 6 | 1 |
41f272498df15bce6bf8c0e5857d5b4ecc8673db | util/sync.go | util/sync.go | package util
import (
"net/http"
"sync"
)
type StatusGroup struct {
sync.WaitGroup
sync.Mutex
Status int
Err error
}
func NewStatusGroup() *StatusGroup {
return &StatusGroup{Status: http.StatusOK}
}
func (sg *StatusGroup) Done(status int, err error) {
sg.Lock()
if sg.Err == nil && err != nil {
if status == 0 {
// Usually caused by an early exit.
status = http.StatusInternalServerError
}
sg.Status = status
sg.Err = err
}
sg.Unlock()
sg.WaitGroup.Done()
}
func (sg *StatusGroup) Wait() (int, error) {
sg.WaitGroup.Wait()
return sg.Status, sg.Err
}
| package util
import (
"fmt"
"net/http"
"sync"
)
type StatusGroup struct {
sync.WaitGroup
sync.Mutex
Status int
Err error
}
func NewStatusGroup() *StatusGroup {
return &StatusGroup{Status: http.StatusOK}
}
func (sg *StatusGroup) Done(status int, err error) {
if status == 0 {
// An early exit.
status = http.StatusInternalServerError
		err = fmt.Errorf("Unknown errors occur in a goroutine.")
}
sg.Lock()
if sg.Err == nil && err != nil {
sg.Status = status
sg.Err = err
}
sg.Unlock()
sg.WaitGroup.Done()
}
func (sg *StatusGroup) Wait() (int, error) {
sg.WaitGroup.Wait()
return sg.Status, sg.Err
}
| Fix a minor bug in StatusGroup. | Fix a minor bug in StatusGroup.
err was not set when status == 0
| Go | mit | tsinghua-io/api-server | go | ## Code Before:
package util
import (
"net/http"
"sync"
)
type StatusGroup struct {
sync.WaitGroup
sync.Mutex
Status int
Err error
}
func NewStatusGroup() *StatusGroup {
return &StatusGroup{Status: http.StatusOK}
}
func (sg *StatusGroup) Done(status int, err error) {
sg.Lock()
if sg.Err == nil && err != nil {
if status == 0 {
// Usually caused by an early exit.
status = http.StatusInternalServerError
}
sg.Status = status
sg.Err = err
}
sg.Unlock()
sg.WaitGroup.Done()
}
func (sg *StatusGroup) Wait() (int, error) {
sg.WaitGroup.Wait()
return sg.Status, sg.Err
}
## Instruction:
Fix a minor bug in StatusGroup.
err was not set when status == 0
## Code After:
package util
import (
"fmt"
"net/http"
"sync"
)
type StatusGroup struct {
sync.WaitGroup
sync.Mutex
Status int
Err error
}
func NewStatusGroup() *StatusGroup {
return &StatusGroup{Status: http.StatusOK}
}
func (sg *StatusGroup) Done(status int, err error) {
if status == 0 {
// An early exit.
status = http.StatusInternalServerError
		err = fmt.Errorf("Unknown errors occur in a goroutine.")
}
sg.Lock()
if sg.Err == nil && err != nil {
sg.Status = status
sg.Err = err
}
sg.Unlock()
sg.WaitGroup.Done()
}
func (sg *StatusGroup) Wait() (int, error) {
sg.WaitGroup.Wait()
return sg.Status, sg.Err
}
| package util
import (
+ "fmt"
"net/http"
"sync"
)
type StatusGroup struct {
sync.WaitGroup
sync.Mutex
Status int
Err error
}
func NewStatusGroup() *StatusGroup {
return &StatusGroup{Status: http.StatusOK}
}
func (sg *StatusGroup) Done(status int, err error) {
+ if status == 0 {
+ // An early exit.
+ status = http.StatusInternalServerError
+ 		err = fmt.Errorf("Unknown errors occur in a goroutine.")
+ }
+
sg.Lock()
if sg.Err == nil && err != nil {
- if status == 0 {
- // Usually caused by an early exit.
- status = http.StatusInternalServerError
- }
sg.Status = status
sg.Err = err
}
sg.Unlock()
sg.WaitGroup.Done()
}
func (sg *StatusGroup) Wait() (int, error) {
sg.WaitGroup.Wait()
return sg.Status, sg.Err
} | 11 | 0.305556 | 7 | 4 |
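The Go `StatusGroup` above aggregates the first error/status reported by a set of goroutines; the fix normalizes a zero status before the first-error check runs. A minimal Python sketch of the same first-error-wins pattern using threads — the class name and the literal 200/500 codes are illustrative stand-ins for `http.StatusOK` / `http.StatusInternalServerError`:

```python
import threading

class StatusGroup:
    """Run callables concurrently and keep only the first failure,
    mirroring the Go StatusGroup's Lock-guarded Done()."""
    def __init__(self):
        self._lock = threading.Lock()
        self._threads = []
        self.status, self.err = 200, None

    def spawn(self, fn):
        t = threading.Thread(target=self._run, args=(fn,))
        self._threads.append(t)
        t.start()

    def _run(self, fn):
        try:
            fn()
        except Exception as exc:
            with self._lock:          # first error wins, like the sg.Err == nil check
                if self.err is None:
                    self.status, self.err = 500, exc

    def wait(self):
        for t in self._threads:
            t.join()
        return self.status, self.err

def boom():
    raise ValueError("boom")

sg = StatusGroup()
sg.spawn(lambda: None)
sg.spawn(boom)
status, err = sg.wait()
print(status, err)  # 500 boom
```

As in the Go version, later failures are dropped on purpose: callers only need one representative status and error for the whole group.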
b07c9ae13f80cb0afbe787543b28e15d546763e6 | elevator/db.py | elevator/db.py | import os
import md5
import leveldb
class DatabasesHandler(dict):
def __init__(self, dest, *args, **kwargs):
self['index'] = {}
self.dest = dest
self._init_default_db()
def _init_default_db(self):
self.add('default')
def load(self):
# Retrieving every databases from database store on fs,
# and adding them to backend databases handler.
for db_name in os.listdir(self.store_path):
if db_name != 'default':
db_path = os.path.join(self.store_path, db_name)
db_uid = md5.new(db_name).digest()
self['index'].update({db_name: db_uid})
self.update({db_uid: leveldb.LevelDB(db_path)})
def add(self, db_name):
new_db_name = db_name
new_db_uid = md5.new(new_db_name).digest()
new_db_dest = os.path.join(self.dest, new_db_name)
self['index'].update({new_db_name: new_db_uid})
self.update({new_db_uid: leveldb.LevelDB(new_db_dest)})
| import os
import md5
import leveldb
class DatabasesHandler(dict):
def __init__(self, dest, *args, **kwargs):
self['index'] = {}
self.dest = dest
self._init_default_db()
def _init_default_db(self):
self.add('default')
def load(self):
# Retrieving every databases from database store on fs,
# and adding them to backend databases handler.
for db_name in os.listdir(self.store_path):
if db_name != 'default':
db_path = os.path.join(self.store_path, db_name)
db_uid = md5.new(db_name).digest()
self['index'].update({db_name: db_uid})
self.update({db_uid: leveldb.LevelDB(db_path)})
def add(self, db_name):
new_db_name = db_name
new_db_uid = md5.new(new_db_name).digest()
new_db_dest = os.path.join(self.dest, new_db_name)
self['index'].update({new_db_name: new_db_uid})
self.update({new_db_uid: leveldb.LevelDB(new_db_dest)})
def drop(self, db_name):
db_uid = self['index'].pop(db_name)
del self['db_uid']
os.remove(os.path.join(self.dest, db_name))
self.pop(db_uid)
def list(self):
return [db_name for db_name in self['index'].itervalues()]
 | Add: database handler list and drop methods | Add: database handler list and drop methods
| Python | mit | oleiade/Elevator | python | ## Code Before:
import os
import md5
import leveldb
class DatabasesHandler(dict):
def __init__(self, dest, *args, **kwargs):
self['index'] = {}
self.dest = dest
self._init_default_db()
def _init_default_db(self):
self.add('default')
def load(self):
# Retrieving every databases from database store on fs,
# and adding them to backend databases handler.
for db_name in os.listdir(self.store_path):
if db_name != 'default':
db_path = os.path.join(self.store_path, db_name)
db_uid = md5.new(db_name).digest()
self['index'].update({db_name: db_uid})
self.update({db_uid: leveldb.LevelDB(db_path)})
def add(self, db_name):
new_db_name = db_name
new_db_uid = md5.new(new_db_name).digest()
new_db_dest = os.path.join(self.dest, new_db_name)
self['index'].update({new_db_name: new_db_uid})
self.update({new_db_uid: leveldb.LevelDB(new_db_dest)})
## Instruction:
Add: database handler list and drop methods
## Code After:
import os
import md5
import leveldb
class DatabasesHandler(dict):
def __init__(self, dest, *args, **kwargs):
self['index'] = {}
self.dest = dest
self._init_default_db()
def _init_default_db(self):
self.add('default')
def load(self):
# Retrieving every databases from database store on fs,
# and adding them to backend databases handler.
for db_name in os.listdir(self.store_path):
if db_name != 'default':
db_path = os.path.join(self.store_path, db_name)
db_uid = md5.new(db_name).digest()
self['index'].update({db_name: db_uid})
self.update({db_uid: leveldb.LevelDB(db_path)})
def add(self, db_name):
new_db_name = db_name
new_db_uid = md5.new(new_db_name).digest()
new_db_dest = os.path.join(self.dest, new_db_name)
self['index'].update({new_db_name: new_db_uid})
self.update({new_db_uid: leveldb.LevelDB(new_db_dest)})
def drop(self, db_name):
db_uid = self['index'].pop(db_name)
del self['db_uid']
os.remove(os.path.join(self.dest, db_name))
self.pop(db_uid)
def list(self):
return [db_name for db_name in self['index'].itervalues()]
| import os
import md5
import leveldb
class DatabasesHandler(dict):
def __init__(self, dest, *args, **kwargs):
self['index'] = {}
self.dest = dest
self._init_default_db()
def _init_default_db(self):
self.add('default')
def load(self):
# Retrieving every databases from database store on fs,
# and adding them to backend databases handler.
for db_name in os.listdir(self.store_path):
if db_name != 'default':
db_path = os.path.join(self.store_path, db_name)
db_uid = md5.new(db_name).digest()
self['index'].update({db_name: db_uid})
self.update({db_uid: leveldb.LevelDB(db_path)})
def add(self, db_name):
new_db_name = db_name
new_db_uid = md5.new(new_db_name).digest()
new_db_dest = os.path.join(self.dest, new_db_name)
self['index'].update({new_db_name: new_db_uid})
self.update({new_db_uid: leveldb.LevelDB(new_db_dest)})
+
+ def drop(self, db_name):
+ db_uid = self['index'].pop(db_name)
+ del self['db_uid']
+ os.remove(os.path.join(self.dest, db_name))
+ self.pop(db_uid)
+
+
+ def list(self):
+ return [db_name for db_name in self['index'].itervalues()] | 10 | 0.322581 | 10 | 0 |
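The methods added above follow a name→uid index pattern, though `drop` deletes the string literal `'db_uid'` and `list` iterates `itervalues()` (uids, not names) — both look like slips in the original. A corrected, self-contained Python 3 sketch of the same pattern, with an in-memory dict standing in for `leveldb.LevelDB` so no files are touched:

```python
import hashlib

class DatabasesHandler(dict):
    """Sketch of the name -> md5-uid index pattern used above."""
    def __init__(self):
        super().__init__()
        self["index"] = {}

    def add(self, db_name):
        db_uid = hashlib.md5(db_name.encode()).digest()
        self["index"][db_name] = db_uid
        self[db_uid] = {}              # stand-in for leveldb.LevelDB(path)

    def drop(self, db_name):
        db_uid = self["index"].pop(db_name)
        del self[db_uid]               # the uid variable, not the string 'db_uid'

    def list(self):
        return sorted(self["index"])   # database *names* are the index keys

h = DatabasesHandler()
h.add("default")
h.add("users")
h.drop("users")
print(h.list())  # ['default']
```

A real port would also need to close and remove the LevelDB directory on `drop` (a directory, so `shutil.rmtree` rather than `os.remove`).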
b60bdb557e476782091a32b4f75b1917d89afa5f | lib/config.js | lib/config.js | 'use babel'
export default {
moieExec: {
type: 'string',
default: '/home/user/moie.jar',
description: 'Full path to moie-server'
},
javaExec: {
type: 'string',
default: '/usr/bin/java',
description: 'Full path to java; `whereis java`'
},
startServer: {
type: 'boolean',
default: false,
description: 'Start a new instance of moie-server if none running'
},
interface: {
type: "string",
default: "localhost"
},
port: {
type: "integer",
default: 9001
},
compileMode: {
type: 'string',
default: 'on demand',
enum: [
'on demand',
'while typing'
]
}
}
| 'use babel'
export default {
moieExec: {
type: 'string',
default: '/home/user/moie.jar',
description: 'Full path to moie-server'
},
javaExec: {
type: 'string',
default: '/usr/bin/java',
description: 'Full path to java; `whereis java`'
},
startServer: {
type: 'boolean',
default: false,
description: 'Start a new instance of moie-server if none running'
},
interface: {
type: "string",
default: "localhost"
},
port: {
type: "integer",
default: 9001
},
compileMode: {
type: 'string',
default: 'while typing',
enum: [
'on demand',
'while typing'
]
}
}
| Change default compileMode to while typing | Change default compileMode to while typing
| JavaScript | mit | THM-MoTE/mope-atom-plugin | javascript | ## Code Before:
'use babel'
export default {
moieExec: {
type: 'string',
default: '/home/user/moie.jar',
description: 'Full path to moie-server'
},
javaExec: {
type: 'string',
default: '/usr/bin/java',
description: 'Full path to java; `whereis java`'
},
startServer: {
type: 'boolean',
default: false,
description: 'Start a new instance of moie-server if none running'
},
interface: {
type: "string",
default: "localhost"
},
port: {
type: "integer",
default: 9001
},
compileMode: {
type: 'string',
default: 'on demand',
enum: [
'on demand',
'while typing'
]
}
}
## Instruction:
Change default compileMode to while typing
## Code After:
'use babel'
export default {
moieExec: {
type: 'string',
default: '/home/user/moie.jar',
description: 'Full path to moie-server'
},
javaExec: {
type: 'string',
default: '/usr/bin/java',
description: 'Full path to java; `whereis java`'
},
startServer: {
type: 'boolean',
default: false,
description: 'Start a new instance of moie-server if none running'
},
interface: {
type: "string",
default: "localhost"
},
port: {
type: "integer",
default: 9001
},
compileMode: {
type: 'string',
default: 'while typing',
enum: [
'on demand',
'while typing'
]
}
}
| 'use babel'
export default {
moieExec: {
type: 'string',
default: '/home/user/moie.jar',
description: 'Full path to moie-server'
},
javaExec: {
type: 'string',
default: '/usr/bin/java',
description: 'Full path to java; `whereis java`'
},
startServer: {
type: 'boolean',
default: false,
description: 'Start a new instance of moie-server if none running'
},
interface: {
type: "string",
default: "localhost"
},
port: {
type: "integer",
default: 9001
},
compileMode: {
type: 'string',
- default: 'on demand',
+ default: 'while typing',
enum: [
'on demand',
'while typing'
]
}
} | 2 | 0.057143 | 1 | 1 |
dc5302ca0896394c4020b46f82102d6465f4ca54 | db/migrate/20181020103501_revoke_variant_overrideswithout_permissions.rb | db/migrate/20181020103501_revoke_variant_overrideswithout_permissions.rb | class RevokeVariantOverrideswithoutPermissions < ActiveRecord::Migration
def up
# This process was executed when the permission_revoked_at colum was created (see AddPermissionRevokedAtToVariantOverrides)
# It needs to be repeated due to #2739
variant_override_hubs = Enterprise.where(id: VariantOverride.all.map(&:hub_id).uniq)
variant_override_hubs.each do |hub|
permitting_producer_ids = hub.relationships_as_child
.with_permission(:create_variant_overrides).map(&:parent_id)
variant_overrides_with_revoked_permissions = VariantOverride.for_hubs(hub)
.joins(variant: :product).where("spree_products.supplier_id NOT IN (?)", permitting_producer_ids)
variant_overrides_with_revoked_permissions.update_all(permission_revoked_at: Time.now)
end
end
end
| class RevokeVariantOverrideswithoutPermissions < ActiveRecord::Migration
def up
# This process was executed when the permission_revoked_at colum was created (see AddPermissionRevokedAtToVariantOverrides)
# It needs to be repeated due to #2739
variant_override_hubs = Enterprise.where(id: VariantOverride.select(:hub_id).uniq)
variant_override_hubs.find_each do |hub|
permitting_producer_ids = hub.relationships_as_child
.with_permission(:create_variant_overrides).pluck(:parent_id)
variant_overrides_with_revoked_permissions = VariantOverride.for_hubs(hub)
.joins(variant: :product).where("spree_products.supplier_id NOT IN (?)", permitting_producer_ids)
variant_overrides_with_revoked_permissions.update_all(permission_revoked_at: Time.now)
end
end
end
| Speed up database queries and make them scale | Speed up database queries and make them scale
This commit makes use of three ActiveRecord features:
1. Using `select` instead of `all.map` enables ActiveRecord to nest one
select into the other, resulting in one more efficient query instead of
two.
2. Using `find_each` saves memory by loading records in batches.
https://api.rubyonrails.org/classes/ActiveRecord/Batches.html#method-i-find_each
3. Using `pluck` creates only an array, avoiding loading all the other
columns of the records into objects.
Running this on the current Canadian database, fixes the following
variant overrides:
```
[]
[]
[]
[]
[]
[]
[925, 924, 966, 965]
[]
[]
[]
[]
[462,
863,
464,
822,
949,
947,
944,
939,
942,
946,
945,
943,
438,
937,
938,
941,
940,
467,
952,
875,
453,
953,
454,
951,
487,
460,
457,
528,
527,
486,
459,
458,
461,
529,
530,
950,
642,
384,
380,
643,
385,
381,
644,
386,
382,
960,
959,
379,
640,
377,
375,
532,
639,
376,
374,
646,
390,
389,
637,
406,
408,
647,
391,
393,
633,
396,
400,
398,
645,
388,
387,
648,
394,
392,
536,
632,
399,
397,
395,
634,
403,
401,
635,
404,
402,
636,
407,
405,
535,
534,
638,
410,
409,
948,
533,
537,
531,
877,
880,
894,
893,
672,
671,
673,
674,
703,
714,
715,
716,
717,
862,
864,
879,
876,
865,
881,
878,
463,
954,
866,
823,
957,
958,
955,
956,
899,
897]
[]
[969]
```
| Ruby | agpl-3.0 | lin-d-hop/openfoodnetwork,openfoodfoundation/openfoodnetwork,lin-d-hop/openfoodnetwork,mkllnk/openfoodnetwork,openfoodfoundation/openfoodnetwork,Matt-Yorkley/openfoodnetwork,mkllnk/openfoodnetwork,Matt-Yorkley/openfoodnetwork,openfoodfoundation/openfoodnetwork,openfoodfoundation/openfoodnetwork,mkllnk/openfoodnetwork,Matt-Yorkley/openfoodnetwork,lin-d-hop/openfoodnetwork,mkllnk/openfoodnetwork,lin-d-hop/openfoodnetwork,Matt-Yorkley/openfoodnetwork | ruby | ## Code Before:
class RevokeVariantOverrideswithoutPermissions < ActiveRecord::Migration
def up
# This process was executed when the permission_revoked_at colum was created (see AddPermissionRevokedAtToVariantOverrides)
# It needs to be repeated due to #2739
variant_override_hubs = Enterprise.where(id: VariantOverride.all.map(&:hub_id).uniq)
variant_override_hubs.each do |hub|
permitting_producer_ids = hub.relationships_as_child
.with_permission(:create_variant_overrides).map(&:parent_id)
variant_overrides_with_revoked_permissions = VariantOverride.for_hubs(hub)
.joins(variant: :product).where("spree_products.supplier_id NOT IN (?)", permitting_producer_ids)
variant_overrides_with_revoked_permissions.update_all(permission_revoked_at: Time.now)
end
end
end
## Instruction:
Speed up database queries and make them scale
This commit makes use of three ActiveRecord features:
1. Using `select` instead of `all.map` enables ActiveRecord to nest one
select into the other, resulting in one more efficient query instead of
two.
2. Using `find_each` saves memory by loading records in batches.
https://api.rubyonrails.org/classes/ActiveRecord/Batches.html#method-i-find_each
3. Using `pluck` creates only an array, avoiding loading all the other
columns of the records into objects.
Running this on the current Canadian database, fixes the following
variant overrides:
```
[]
[]
[]
[]
[]
[]
[925, 924, 966, 965]
[]
[]
[]
[]
[462,
863,
464,
822,
949,
947,
944,
939,
942,
946,
945,
943,
438,
937,
938,
941,
940,
467,
952,
875,
453,
953,
454,
951,
487,
460,
457,
528,
527,
486,
459,
458,
461,
529,
530,
950,
642,
384,
380,
643,
385,
381,
644,
386,
382,
960,
959,
379,
640,
377,
375,
532,
639,
376,
374,
646,
390,
389,
637,
406,
408,
647,
391,
393,
633,
396,
400,
398,
645,
388,
387,
648,
394,
392,
536,
632,
399,
397,
395,
634,
403,
401,
635,
404,
402,
636,
407,
405,
535,
534,
638,
410,
409,
948,
533,
537,
531,
877,
880,
894,
893,
672,
671,
673,
674,
703,
714,
715,
716,
717,
862,
864,
879,
876,
865,
881,
878,
463,
954,
866,
823,
957,
958,
955,
956,
899,
897]
[]
[969]
```
## Code After:
class RevokeVariantOverrideswithoutPermissions < ActiveRecord::Migration
def up
# This process was executed when the permission_revoked_at colum was created (see AddPermissionRevokedAtToVariantOverrides)
# It needs to be repeated due to #2739
variant_override_hubs = Enterprise.where(id: VariantOverride.select(:hub_id).uniq)
variant_override_hubs.find_each do |hub|
permitting_producer_ids = hub.relationships_as_child
.with_permission(:create_variant_overrides).pluck(:parent_id)
variant_overrides_with_revoked_permissions = VariantOverride.for_hubs(hub)
.joins(variant: :product).where("spree_products.supplier_id NOT IN (?)", permitting_producer_ids)
variant_overrides_with_revoked_permissions.update_all(permission_revoked_at: Time.now)
end
end
end
| class RevokeVariantOverrideswithoutPermissions < ActiveRecord::Migration
def up
# This process was executed when the permission_revoked_at colum was created (see AddPermissionRevokedAtToVariantOverrides)
# It needs to be repeated due to #2739
- variant_override_hubs = Enterprise.where(id: VariantOverride.all.map(&:hub_id).uniq)
? ^ ^^^^^ -
+ variant_override_hubs = Enterprise.where(id: VariantOverride.select(:hub_id).uniq)
? ^^ ^^^
- variant_override_hubs.each do |hub|
+ variant_override_hubs.find_each do |hub|
? +++++
permitting_producer_ids = hub.relationships_as_child
- .with_permission(:create_variant_overrides).map(&:parent_id)
? -- -
+ .with_permission(:create_variant_overrides).pluck(:parent_id)
? ++++
variant_overrides_with_revoked_permissions = VariantOverride.for_hubs(hub)
.joins(variant: :product).where("spree_products.supplier_id NOT IN (?)", permitting_producer_ids)
variant_overrides_with_revoked_permissions.update_all(permission_revoked_at: Time.now)
end
end
end | 6 | 0.352941 | 3 | 3 |
8e7a92bce03ca472bc78bb9df5e2c9cf063c29b7 | temba/campaigns/tasks.py | temba/campaigns/tasks.py | from __future__ import unicode_literals
from datetime import datetime
from django.utils import timezone
from djcelery_transactions import task
from redis_cache import get_redis_connection
from .models import Campaign, EventFire
from django.conf import settings
import redis
from temba.msgs.models import HANDLER_QUEUE, HANDLE_EVENT_TASK, FIRE_EVENT
from temba.utils.queues import push_task
@task(track_started=True, name='check_campaigns_task') # pragma: no cover
def check_campaigns_task(sched_id=None):
"""
See if any event fires need to be triggered
"""
logger = check_campaigns_task.get_logger()
# get a lock
r = get_redis_connection()
key = 'check_campaigns'
# only do this if we aren't already checking campaigns
if not r.get(key):
with r.lock(key, timeout=3600):
# for each that needs to be fired
for fire in EventFire.objects.filter(fired=None, scheduled__lte=timezone.now()).select_related('event', 'event.org'):
try:
push_task(fire.event.org, HANDLER_QUEUE, HANDLE_EVENT_TASK, dict(type=FIRE_EVENT, id=fire.id))
except: # pragma: no cover
logger.error("Error running campaign event: %s" % fire.pk, exc_info=True)
| from __future__ import unicode_literals
from datetime import datetime
from django.utils import timezone
from djcelery_transactions import task
from redis_cache import get_redis_connection
from .models import Campaign, EventFire
from django.conf import settings
import redis
from temba.msgs.models import HANDLER_QUEUE, HANDLE_EVENT_TASK, FIRE_EVENT
from temba.utils.queues import push_task
@task(track_started=True, name='check_campaigns_task') # pragma: no cover
def check_campaigns_task(sched_id=None):
"""
See if any event fires need to be triggered
"""
logger = check_campaigns_task.get_logger()
# get a lock
r = get_redis_connection()
key = 'check_campaigns'
# only do this if we aren't already checking campaigns
if not r.get(key):
with r.lock(key, timeout=3600):
# for each that needs to be fired
for fire in EventFire.objects.filter(fired=None, scheduled__lte=timezone.now()).select_related('contact', 'contact.org'):
try:
push_task(fire.contact.org, HANDLER_QUEUE, HANDLE_EVENT_TASK, dict(type=FIRE_EVENT, id=fire.id))
except: # pragma: no cover
logger.error("Error running campaign event: %s" % fire.pk, exc_info=True)
| Use correct field to get org from | Use correct field to get org from
| Python | agpl-3.0 | harrissoerja/rapidpro,pulilab/rapidpro,pulilab/rapidpro,reyrodrigues/EU-SMS,tsotetsi/textily-web,harrissoerja/rapidpro,tsotetsi/textily-web,pulilab/rapidpro,tsotetsi/textily-web,Thapelo-Tsotetsi/rapidpro,Thapelo-Tsotetsi/rapidpro,ewheeler/rapidpro,praekelt/rapidpro,harrissoerja/rapidpro,praekelt/rapidpro,reyrodrigues/EU-SMS,Thapelo-Tsotetsi/rapidpro,ewheeler/rapidpro,tsotetsi/textily-web,reyrodrigues/EU-SMS,ewheeler/rapidpro,pulilab/rapidpro,tsotetsi/textily-web,praekelt/rapidpro,ewheeler/rapidpro,pulilab/rapidpro,praekelt/rapidpro | python | ## Code Before:
from __future__ import unicode_literals
from datetime import datetime
from django.utils import timezone
from djcelery_transactions import task
from redis_cache import get_redis_connection
from .models import Campaign, EventFire
from django.conf import settings
import redis
from temba.msgs.models import HANDLER_QUEUE, HANDLE_EVENT_TASK, FIRE_EVENT
from temba.utils.queues import push_task
@task(track_started=True, name='check_campaigns_task') # pragma: no cover
def check_campaigns_task(sched_id=None):
"""
See if any event fires need to be triggered
"""
logger = check_campaigns_task.get_logger()
# get a lock
r = get_redis_connection()
key = 'check_campaigns'
# only do this if we aren't already checking campaigns
if not r.get(key):
with r.lock(key, timeout=3600):
# for each that needs to be fired
for fire in EventFire.objects.filter(fired=None, scheduled__lte=timezone.now()).select_related('event', 'event.org'):
try:
push_task(fire.event.org, HANDLER_QUEUE, HANDLE_EVENT_TASK, dict(type=FIRE_EVENT, id=fire.id))
except: # pragma: no cover
logger.error("Error running campaign event: %s" % fire.pk, exc_info=True)
## Instruction:
Use correct field to get org from
## Code After:
from __future__ import unicode_literals
from datetime import datetime
from django.utils import timezone
from djcelery_transactions import task
from redis_cache import get_redis_connection
from .models import Campaign, EventFire
from django.conf import settings
import redis
from temba.msgs.models import HANDLER_QUEUE, HANDLE_EVENT_TASK, FIRE_EVENT
from temba.utils.queues import push_task
@task(track_started=True, name='check_campaigns_task') # pragma: no cover
def check_campaigns_task(sched_id=None):
"""
See if any event fires need to be triggered
"""
logger = check_campaigns_task.get_logger()
# get a lock
r = get_redis_connection()
key = 'check_campaigns'
# only do this if we aren't already checking campaigns
if not r.get(key):
with r.lock(key, timeout=3600):
# for each that needs to be fired
for fire in EventFire.objects.filter(fired=None, scheduled__lte=timezone.now()).select_related('contact', 'contact.org'):
try:
push_task(fire.contact.org, HANDLER_QUEUE, HANDLE_EVENT_TASK, dict(type=FIRE_EVENT, id=fire.id))
except: # pragma: no cover
logger.error("Error running campaign event: %s" % fire.pk, exc_info=True)
| from __future__ import unicode_literals
from datetime import datetime
from django.utils import timezone
from djcelery_transactions import task
from redis_cache import get_redis_connection
from .models import Campaign, EventFire
from django.conf import settings
import redis
from temba.msgs.models import HANDLER_QUEUE, HANDLE_EVENT_TASK, FIRE_EVENT
from temba.utils.queues import push_task
@task(track_started=True, name='check_campaigns_task') # pragma: no cover
def check_campaigns_task(sched_id=None):
"""
See if any event fires need to be triggered
"""
logger = check_campaigns_task.get_logger()
# get a lock
r = get_redis_connection()
key = 'check_campaigns'
# only do this if we aren't already checking campaigns
if not r.get(key):
with r.lock(key, timeout=3600):
# for each that needs to be fired
- for fire in EventFire.objects.filter(fired=None, scheduled__lte=timezone.now()).select_related('event', 'event.org'):
? ^^^ ^^^
+ for fire in EventFire.objects.filter(fired=None, scheduled__lte=timezone.now()).select_related('contact', 'contact.org'):
? ^^ +++ ^^ +++
try:
- push_task(fire.event.org, HANDLER_QUEUE, HANDLE_EVENT_TASK, dict(type=FIRE_EVENT, id=fire.id))
? ^^^
+ push_task(fire.contact.org, HANDLER_QUEUE, HANDLE_EVENT_TASK, dict(type=FIRE_EVENT, id=fire.id))
? ^^ +++
except: # pragma: no cover
logger.error("Error running campaign event: %s" % fire.pk, exc_info=True) | 4 | 0.117647 | 2 | 2 |
354713f04c9619fb32cc8dd5fe3aefa64657d49f | .vscode/settings.json | .vscode/settings.json | {
"files.exclude": {
"out": false // set this to true to hide the "out" folder with the compiled JS files
},
"search.exclude": {
"out": true // set this to false to include "out" folder in search results
},
"typescript.tsdk": "./node_modules/typescript/lib",
"typescript.tsc.autoDetect": "off",
"files.trimTrailingWhitespace": true,
"editor.insertSpaces": false,
"editor.tabSize": 4,
"cSpell.words": [
"Composer",
"Drupal",
"Squiz",
"WordPress",
"Zend",
"codesniffer",
"comspec",
"gitignore",
"languageclient",
"languageserver",
"lockfile",
"phpcs",
"pkg",
"pkgs",
"quickstart",
"ruleset",
"testrunner"
]
}
| {
"files.exclude": {
"out": false // set this to true to hide the "out" folder with the compiled JS files
},
"search.exclude": {
"out": true // set this to false to include "out" folder in search results
},
"typescript.tsdk": "./node_modules/typescript/lib",
"typescript.tsc.autoDetect": "off",
"files.trimTrailingWhitespace": true,
"editor.insertSpaces": false,
"editor.tabSize": 4,
"cSpell.words": [
"charcode",
"codesniffer",
"Composer",
"comspec",
"Drupal",
"gitignore",
"ikappas",
"Ioannis",
"Kappas",
"languageclient",
"languageserver",
"lockfile",
"phpcs",
"pkg",
"pkgs",
"quickstart",
"ruleset",
"Squiz",
"squizlabs",
"testrunner",
"WordPress",
"Zend"
]
}
| Add words to spell check dictionary | Add words to spell check dictionary
| JSON | mit | ikappas/vscode-phpcs | json | ## Code Before:
{
"files.exclude": {
"out": false // set this to true to hide the "out" folder with the compiled JS files
},
"search.exclude": {
"out": true // set this to false to include "out" folder in search results
},
"typescript.tsdk": "./node_modules/typescript/lib",
"typescript.tsc.autoDetect": "off",
"files.trimTrailingWhitespace": true,
"editor.insertSpaces": false,
"editor.tabSize": 4,
"cSpell.words": [
"Composer",
"Drupal",
"Squiz",
"WordPress",
"Zend",
"codesniffer",
"comspec",
"gitignore",
"languageclient",
"languageserver",
"lockfile",
"phpcs",
"pkg",
"pkgs",
"quickstart",
"ruleset",
"testrunner"
]
}
## Instruction:
Add words to spell check dictionary
## Code After:
{
"files.exclude": {
"out": false // set this to true to hide the "out" folder with the compiled JS files
},
"search.exclude": {
"out": true // set this to false to include "out" folder in search results
},
"typescript.tsdk": "./node_modules/typescript/lib",
"typescript.tsc.autoDetect": "off",
"files.trimTrailingWhitespace": true,
"editor.insertSpaces": false,
"editor.tabSize": 4,
"cSpell.words": [
"charcode",
"codesniffer",
"Composer",
"comspec",
"Drupal",
"gitignore",
"ikappas",
"Ioannis",
"Kappas",
"languageclient",
"languageserver",
"lockfile",
"phpcs",
"pkg",
"pkgs",
"quickstart",
"ruleset",
"Squiz",
"squizlabs",
"testrunner",
"WordPress",
"Zend"
]
}
| {
"files.exclude": {
"out": false // set this to true to hide the "out" folder with the compiled JS files
},
"search.exclude": {
"out": true // set this to false to include "out" folder in search results
},
"typescript.tsdk": "./node_modules/typescript/lib",
"typescript.tsc.autoDetect": "off",
"files.trimTrailingWhitespace": true,
"editor.insertSpaces": false,
"editor.tabSize": 4,
"cSpell.words": [
+ "charcode",
+ "codesniffer",
"Composer",
+ "comspec",
"Drupal",
- "Squiz",
- "WordPress",
- "Zend",
- "codesniffer",
- "comspec",
"gitignore",
+ "ikappas",
+ "Ioannis",
+ "Kappas",
"languageclient",
"languageserver",
"lockfile",
"phpcs",
"pkg",
"pkgs",
"quickstart",
"ruleset",
+ "Squiz",
+ "squizlabs",
- "testrunner"
+ "testrunner",
? +
+ "WordPress",
+ "Zend"
]
} | 17 | 0.53125 | 11 | 6 |
b94affe6771602b82293080a3be59570fde8e965 | ci/swiftpm.sh | ci/swiftpm.sh |
mv Package.swift .Package.swift && cp .Package.test.swift Package.swift
swift build --clean && swift build && swift test
RETVAL=$?
mv .Package.swift Package.swift
exit $RETVAL
|
mv Package.swift .Package.swift && cp .Package.test.swift Package.swift
swift build && swift test
RETVAL=$?
mv .Package.swift Package.swift
exit $RETVAL
| Fix Swift Package Manager build | Fix Swift Package Manager build
| Shell | mit | cbguder/CBGPromise,cbguder/CBGPromise,cbguder/CBGPromise | shell | ## Code Before:
mv Package.swift .Package.swift && cp .Package.test.swift Package.swift
swift build --clean && swift build && swift test
RETVAL=$?
mv .Package.swift Package.swift
exit $RETVAL
## Instruction:
Fix Swift Package Manager build
## Code After:
mv Package.swift .Package.swift && cp .Package.test.swift Package.swift
swift build && swift test
RETVAL=$?
mv .Package.swift Package.swift
exit $RETVAL
|
mv Package.swift .Package.swift && cp .Package.test.swift Package.swift
- swift build --clean && swift build && swift test
+ swift build && swift test
RETVAL=$?
mv .Package.swift Package.swift
exit $RETVAL | 2 | 0.333333 | 1 | 1 |
3a6f1947043274961648f742a1a743ee276ba8e4 | grails-app/domain/be/cytomine/security/User.groovy | grails-app/domain/be/cytomine/security/User.groovy | package be.cytomine.security
class User extends SecUser {
String firstname
String lastname
String email
Date dateCreated
static constraints = {
firstname blank : false
lastname blank : false
email blank : false
dateCreated blank : false
}
}
| package be.cytomine.security
class User extends SecUser {
String firstname
String lastname
String email
Date dateCreated
static constraints = {
firstname blank : false
lastname blank : false
email (blank : false, email:true)
dateCreated blank : false
}
}
| Add email constraint for user email | Add email constraint for user email
| Groovy | apache-2.0 | cytomine/Cytomine-core,charybdeBE/Cytomine-core,charybdeBE/Cytomine-core,charybdeBE/Cytomine-core,cytomine/Cytomine-core,cytomine/Cytomine-core,cytomine/Cytomine-core,charybdeBE/Cytomine-core,charybdeBE/Cytomine-core,charybdeBE/Cytomine-core,cytomine/Cytomine-core,charybdeBE/Cytomine-core,charybdeBE/Cytomine-core,charybdeBE/Cytomine-core,cytomine/Cytomine-core,charybdeBE/Cytomine-core | groovy | ## Code Before:
package be.cytomine.security
class User extends SecUser {
String firstname
String lastname
String email
Date dateCreated
static constraints = {
firstname blank : false
lastname blank : false
email blank : false
dateCreated blank : false
}
}
## Instruction:
Add email constraint for user email
## Code After:
package be.cytomine.security
class User extends SecUser {
String firstname
String lastname
String email
Date dateCreated
static constraints = {
firstname blank : false
lastname blank : false
email (blank : false, email:true)
dateCreated blank : false
}
}
| package be.cytomine.security
class User extends SecUser {
String firstname
String lastname
String email
Date dateCreated
static constraints = {
firstname blank : false
lastname blank : false
- email blank : false
+ email (blank : false, email:true)
? + +++++++++++++
dateCreated blank : false
}
} | 2 | 0.111111 | 1 | 1 |
e64d13486fe20c44dde0dea6a6fed5a95eddbbd1 | awx/main/notifications/email_backend.py | awx/main/notifications/email_backend.py |
import json
from django.utils.encoding import smart_text
from django.core.mail.backends.smtp import EmailBackend
from django.utils.translation import ugettext_lazy as _
class CustomEmailBackend(EmailBackend):
init_parameters = {"host": {"label": "Host", "type": "string"},
"port": {"label": "Port", "type": "int"},
"username": {"label": "Username", "type": "string"},
"password": {"label": "Password", "type": "password"},
"use_tls": {"label": "Use TLS", "type": "bool"},
"use_ssl": {"label": "Use SSL", "type": "bool"},
"sender": {"label": "Sender Email", "type": "string"},
"recipients": {"label": "Recipient List", "type": "list"}}
recipient_parameter = "recipients"
sender_parameter = "sender"
def format_body(self, body):
if "body" in body:
body_actual = body['body']
else:
body_actual = smart_text(_("{} #{} had status {} on Ansible Tower, view details at {}\n\n").format(
body['friendly_name'], body['id'], body['status'], body['url'])
)
body_actual += json.dumps(body, indent=4)
return body_actual
|
import json
from django.utils.encoding import smart_text
from django.core.mail.backends.smtp import EmailBackend
from django.utils.translation import ugettext_lazy as _
class CustomEmailBackend(EmailBackend):
init_parameters = {"host": {"label": "Host", "type": "string"},
"port": {"label": "Port", "type": "int"},
"username": {"label": "Username", "type": "string"},
"password": {"label": "Password", "type": "password"},
"use_tls": {"label": "Use TLS", "type": "bool"},
"use_ssl": {"label": "Use SSL", "type": "bool"},
"sender": {"label": "Sender Email", "type": "string"},
"recipients": {"label": "Recipient List", "type": "list"}}
recipient_parameter = "recipients"
sender_parameter = "sender"
def format_body(self, body):
if "body" in body:
body_actual = body['body']
else:
body_actual = smart_text(_("{} #{} had status {}, view details at {}\n\n").format(
body['friendly_name'], body['id'], body['status'], body['url'])
)
body_actual += json.dumps(body, indent=4)
return body_actual
| Remove Tower reference from email backend | Remove Tower reference from email backend
| Python | apache-2.0 | snahelou/awx,wwitzel3/awx,snahelou/awx,wwitzel3/awx,wwitzel3/awx,wwitzel3/awx,snahelou/awx,snahelou/awx | python | ## Code Before:
import json
from django.utils.encoding import smart_text
from django.core.mail.backends.smtp import EmailBackend
from django.utils.translation import ugettext_lazy as _
class CustomEmailBackend(EmailBackend):
init_parameters = {"host": {"label": "Host", "type": "string"},
"port": {"label": "Port", "type": "int"},
"username": {"label": "Username", "type": "string"},
"password": {"label": "Password", "type": "password"},
"use_tls": {"label": "Use TLS", "type": "bool"},
"use_ssl": {"label": "Use SSL", "type": "bool"},
"sender": {"label": "Sender Email", "type": "string"},
"recipients": {"label": "Recipient List", "type": "list"}}
recipient_parameter = "recipients"
sender_parameter = "sender"
def format_body(self, body):
if "body" in body:
body_actual = body['body']
else:
body_actual = smart_text(_("{} #{} had status {} on Ansible Tower, view details at {}\n\n").format(
body['friendly_name'], body['id'], body['status'], body['url'])
)
body_actual += json.dumps(body, indent=4)
return body_actual
## Instruction:
Remove Tower reference from email backend
## Code After:
import json
from django.utils.encoding import smart_text
from django.core.mail.backends.smtp import EmailBackend
from django.utils.translation import ugettext_lazy as _
class CustomEmailBackend(EmailBackend):
init_parameters = {"host": {"label": "Host", "type": "string"},
"port": {"label": "Port", "type": "int"},
"username": {"label": "Username", "type": "string"},
"password": {"label": "Password", "type": "password"},
"use_tls": {"label": "Use TLS", "type": "bool"},
"use_ssl": {"label": "Use SSL", "type": "bool"},
"sender": {"label": "Sender Email", "type": "string"},
"recipients": {"label": "Recipient List", "type": "list"}}
recipient_parameter = "recipients"
sender_parameter = "sender"
def format_body(self, body):
if "body" in body:
body_actual = body['body']
else:
body_actual = smart_text(_("{} #{} had status {}, view details at {}\n\n").format(
body['friendly_name'], body['id'], body['status'], body['url'])
)
body_actual += json.dumps(body, indent=4)
return body_actual
|
import json
from django.utils.encoding import smart_text
from django.core.mail.backends.smtp import EmailBackend
from django.utils.translation import ugettext_lazy as _
class CustomEmailBackend(EmailBackend):
init_parameters = {"host": {"label": "Host", "type": "string"},
"port": {"label": "Port", "type": "int"},
"username": {"label": "Username", "type": "string"},
"password": {"label": "Password", "type": "password"},
"use_tls": {"label": "Use TLS", "type": "bool"},
"use_ssl": {"label": "Use SSL", "type": "bool"},
"sender": {"label": "Sender Email", "type": "string"},
"recipients": {"label": "Recipient List", "type": "list"}}
recipient_parameter = "recipients"
sender_parameter = "sender"
def format_body(self, body):
if "body" in body:
body_actual = body['body']
else:
- body_actual = smart_text(_("{} #{} had status {} on Ansible Tower, view details at {}\n\n").format(
? -----------------
+ body_actual = smart_text(_("{} #{} had status {}, view details at {}\n\n").format(
body['friendly_name'], body['id'], body['status'], body['url'])
)
body_actual += json.dumps(body, indent=4)
return body_actual | 2 | 0.066667 | 1 | 1 |
f2505017a70ee80664ad5a1f12ae583db308ebb8 | .github/CONTRIBUTING.md | .github/CONTRIBUTING.md |
* Pull requests must conform to [JSLINT](http://jslint.com/) [ES5](https://es5.github.io/) and [JSDOC](http://usejsdoc.org/).
* Declared exceptions to JSLINT are ok.
* [ES6](http://www.ecma-international.org/ecma-262/6.0/index.html) is OK if enclosed in an eval string. Code must compile on ES5 browsers. For now...
* Pull requests must be [attached to an issue](https://github.com/TonyGermaneri/canvas-datagrid/issues).
* Pull requests that effect code must pass [tests](https://canvas-datagrid.js.org/canvas-datagrid/test/tests.html).
|
- Pull requests must conform to [JSLINT](http://jslint.com/) [ES5](https://es5.github.io/) and [JSDOC](http://usejsdoc.org/).
- Declared exceptions to JSLINT are ok.
- Pull requests must be [attached to an issue](https://github.com/TonyGermaneri/canvas-datagrid/issues).
- Pull requests that effect code must pass [tests](https://canvas-datagrid.js.org/canvas-datagrid/test/tests.html).
| Drop note about code needing to be ES5 compatible | Drop note about code needing to be ES5 compatible
| Markdown | bsd-3-clause | TonyGermaneri/canvas-datagrid,TonyGermaneri/canvas-datagrid | markdown | ## Code Before:
* Pull requests must conform to [JSLINT](http://jslint.com/) [ES5](https://es5.github.io/) and [JSDOC](http://usejsdoc.org/).
* Declared exceptions to JSLINT are ok.
* [ES6](http://www.ecma-international.org/ecma-262/6.0/index.html) is OK if enclosed in an eval string. Code must compile on ES5 browsers. For now...
* Pull requests must be [attached to an issue](https://github.com/TonyGermaneri/canvas-datagrid/issues).
* Pull requests that effect code must pass [tests](https://canvas-datagrid.js.org/canvas-datagrid/test/tests.html).
## Instruction:
Drop note about code needing to be ES5 compatible
## Code After:
- Pull requests must conform to [JSLINT](http://jslint.com/) [ES5](https://es5.github.io/) and [JSDOC](http://usejsdoc.org/).
- Declared exceptions to JSLINT are ok.
- Pull requests must be [attached to an issue](https://github.com/TonyGermaneri/canvas-datagrid/issues).
- Pull requests that effect code must pass [tests](https://canvas-datagrid.js.org/canvas-datagrid/test/tests.html).
|
- * Pull requests must conform to [JSLINT](http://jslint.com/) [ES5](https://es5.github.io/) and [JSDOC](http://usejsdoc.org/).
? ^
+ - Pull requests must conform to [JSLINT](http://jslint.com/) [ES5](https://es5.github.io/) and [JSDOC](http://usejsdoc.org/).
? ^
- * Declared exceptions to JSLINT are ok.
? ^
+ - Declared exceptions to JSLINT are ok.
? ^
- * [ES6](http://www.ecma-international.org/ecma-262/6.0/index.html) is OK if enclosed in an eval string. Code must compile on ES5 browsers. For now...
- * Pull requests must be [attached to an issue](https://github.com/TonyGermaneri/canvas-datagrid/issues).
? ^
+ - Pull requests must be [attached to an issue](https://github.com/TonyGermaneri/canvas-datagrid/issues).
? ^
- * Pull requests that effect code must pass [tests](https://canvas-datagrid.js.org/canvas-datagrid/test/tests.html).
? ^
+ - Pull requests that effect code must pass [tests](https://canvas-datagrid.js.org/canvas-datagrid/test/tests.html).
? ^
| 9 | 1.5 | 4 | 5 |
cf7d0719b32b7d761528ef481a82a1d72ef9032f | install/kubernetes/cilium/templates/hubble-generate-certs-ca-secret.yaml | install/kubernetes/cilium/templates/hubble-generate-certs-ca-secret.yaml | {{- if and .Values.hubble.tls.auto.enabled (eq .Values.hubble.tls.auto.method "cronJob") .Values.hubble.tls.ca.cert .Values.hubble.tls.ca.key }}
---
apiVersion: v1
kind: Secret
metadata:
name: hubble-ca-secret
namespace: {{ .Release.Namespace }}
type: kubernetes.io/tls
data:
ca.crt: {{ .Values.hubble.tls.ca.cert }}
ca.key: {{ .Values.hubble.tls.ca.key }}
{{- end }}
| {{- if and .Values.hubble.tls.auto.enabled (eq .Values.hubble.tls.auto.method "cronJob") .Values.hubble.tls.ca.cert .Values.hubble.tls.ca.key }}
---
apiVersion: v1
kind: Secret
metadata:
name: hubble-ca-secret
namespace: {{ .Release.Namespace }}
data:
ca.crt: {{ .Values.hubble.tls.ca.cert }}
ca.key: {{ .Values.hubble.tls.ca.key }}
{{- end }}
| Remove `type: kubernetes.io/tls` from hubble-ca-secret | helm: Remove `type: kubernetes.io/tls` from hubble-ca-secret
While `hubble-ca-secret` is technically a TLS secret, it does not
conform to the naming scheme outlined in
https://kubernetes.io/docs/concepts/services-networking/ingress/#tls
There is currently no technical reason for it to be a secret of type
`TLS`. Therefore we simply remove the type for now. Renaming the files
to conform to `type: kubernetes.io/tls` would require more in-depth
changes, as certgen currently requires the files to be named
`ca.{crt,key}`. certgen itself creates the CA secrets with type `Opaque`
if they are absent, so this change is consistent with what certgen already
does.
This commit does not contain any functional changes.
Signed-off-by: Sebastian Wicki <db043b2055cb3a47b2eb0b5aebf4e114a8c24a5a@isovalent.com>
| YAML | apache-2.0 | tgraf/cilium,michi-covalent/cilium,michi-covalent/cilium,tklauser/cilium,tklauser/cilium,michi-covalent/cilium,cilium/cilium,michi-covalent/cilium,cilium/cilium,michi-covalent/cilium,tklauser/cilium,tklauser/cilium,tgraf/cilium,cilium/cilium,tgraf/cilium,cilium/cilium,tklauser/cilium,cilium/cilium,tgraf/cilium,tgraf/cilium,tgraf/cilium | yaml | ## Code Before:
{{- if and .Values.hubble.tls.auto.enabled (eq .Values.hubble.tls.auto.method "cronJob") .Values.hubble.tls.ca.cert .Values.hubble.tls.ca.key }}
---
apiVersion: v1
kind: Secret
metadata:
name: hubble-ca-secret
namespace: {{ .Release.Namespace }}
type: kubernetes.io/tls
data:
ca.crt: {{ .Values.hubble.tls.ca.cert }}
ca.key: {{ .Values.hubble.tls.ca.key }}
{{- end }}
## Instruction:
helm: Remove `type: kubernetes.io/tls` from hubble-ca-secret
While `hubble-ca-secret` is technically a TLS secret, it does not
conform to the naming scheme outlined in
https://kubernetes.io/docs/concepts/services-networking/ingress/#tls
There is currently no technical reason for it to be a secret of type
`TLS`. Therefore we simply remove the type for now. Renaming the files
to conform to `type: kubernetes.io/tls` would require more in-depth
changes, as certgen currently requires the files to be named
`ca.{crt,key}`. certgen itself creates the CA secrets with type `Opaque`
if they are absent, so this change is consistent with what certgen already
does.
This commit does not contain any functional changes.
Signed-off-by: Sebastian Wicki <db043b2055cb3a47b2eb0b5aebf4e114a8c24a5a@isovalent.com>
## Code After:
{{- if and .Values.hubble.tls.auto.enabled (eq .Values.hubble.tls.auto.method "cronJob") .Values.hubble.tls.ca.cert .Values.hubble.tls.ca.key }}
---
apiVersion: v1
kind: Secret
metadata:
name: hubble-ca-secret
namespace: {{ .Release.Namespace }}
data:
ca.crt: {{ .Values.hubble.tls.ca.cert }}
ca.key: {{ .Values.hubble.tls.ca.key }}
{{- end }}
| {{- if and .Values.hubble.tls.auto.enabled (eq .Values.hubble.tls.auto.method "cronJob") .Values.hubble.tls.ca.cert .Values.hubble.tls.ca.key }}
---
apiVersion: v1
kind: Secret
metadata:
name: hubble-ca-secret
namespace: {{ .Release.Namespace }}
- type: kubernetes.io/tls
data:
ca.crt: {{ .Values.hubble.tls.ca.cert }}
ca.key: {{ .Values.hubble.tls.ca.key }}
{{- end }} | 1 | 0.083333 | 0 | 1 |
3704af22d611d2a0411768bc8bcf5b77ba66b699 | app/templates/components/frost-bunsen-input-select.hbs | app/templates/components/frost-bunsen-input-select.hbs | <div>
<div class={{labelWrapperClassName}}>
<label class="alias">{{renderLabel}}</label>
{{#if required}}
<div class='required'>Required</div>
{{/if}}
</div>
<div class={{inputWrapperClassName}}>
{{frost-select
disabled=cellConfig.disabled
onInput=(action "onInput")
onChange=(action "onChange")
data=options
selectedValue=value
}}
</div>
{{#if errorMessage}}
<div>
<div class={{labelWrapperClassName}}></div>
<div class="error">
{{errorMessage}}
</div>
</div>
{{/if}}
</div>
| <div>
<div class={{labelWrapperClassName}}>
<label class="alias">{{renderLabel}}</label>
{{#if required}}
<div class='required'>Required</div>
{{/if}}
</div>
<div class={{inputWrapperClassName}}>
{{frost-select
disabled=cellConfig.disabled
onInput=(action "onInput")
onChange=(action "onChange")
data=options
selectedValue=value
}}
</div>
</div>
{{#if errorMessage}}
<div>
<div class={{labelWrapperClassName}}></div>
<div class="error">
{{errorMessage}}
</div>
</div>
{{/if}}
| Fix select error message alignment | Fix select error message alignment
| Handlebars | mit | ciena-frost/ember-frost-bunsen,sophypal/ember-frost-bunsen,ciena-frost/ember-frost-bunsen,sandersky/ember-frost-bunsen,sophypal/ember-frost-bunsen,sandersky/ember-frost-bunsen,sophypal/ember-frost-bunsen,ciena-frost/ember-frost-bunsen,sandersky/ember-frost-bunsen | handlebars | ## Code Before:
<div>
<div class={{labelWrapperClassName}}>
<label class="alias">{{renderLabel}}</label>
{{#if required}}
<div class='required'>Required</div>
{{/if}}
</div>
<div class={{inputWrapperClassName}}>
{{frost-select
disabled=cellConfig.disabled
onInput=(action "onInput")
onChange=(action "onChange")
data=options
selectedValue=value
}}
</div>
{{#if errorMessage}}
<div>
<div class={{labelWrapperClassName}}></div>
<div class="error">
{{errorMessage}}
</div>
</div>
{{/if}}
</div>
## Instruction:
Fix select error message alignment
## Code After:
<div>
<div class={{labelWrapperClassName}}>
<label class="alias">{{renderLabel}}</label>
{{#if required}}
<div class='required'>Required</div>
{{/if}}
</div>
<div class={{inputWrapperClassName}}>
{{frost-select
disabled=cellConfig.disabled
onInput=(action "onInput")
onChange=(action "onChange")
data=options
selectedValue=value
}}
</div>
</div>
{{#if errorMessage}}
<div>
<div class={{labelWrapperClassName}}></div>
<div class="error">
{{errorMessage}}
</div>
</div>
{{/if}}
| <div>
<div class={{labelWrapperClassName}}>
<label class="alias">{{renderLabel}}</label>
{{#if required}}
<div class='required'>Required</div>
{{/if}}
</div>
<div class={{inputWrapperClassName}}>
{{frost-select
disabled=cellConfig.disabled
onInput=(action "onInput")
onChange=(action "onChange")
data=options
selectedValue=value
}}
</div>
+ </div>
- {{#if errorMessage}}
? --
+ {{#if errorMessage}}
- <div>
? --
+ <div>
- <div class={{labelWrapperClassName}}></div>
? --
+ <div class={{labelWrapperClassName}}></div>
- <div class="error">
? --
+ <div class="error">
- {{errorMessage}}
? --
+ {{errorMessage}}
- </div>
</div>
+ </div>
- {{/if}}
? --
+ {{/if}}
- </div> | 16 | 0.64 | 8 | 8 |
c3f8e52ff859b5853fcbd57b389256a18aa64a0a | scripts/chrome.rb | scripts/chrome.rb | require 'tmpdir'
def run_chrome(language)
user_data_dir = Dir.mktmpdir('marinara')
extension_dir = File.join(Dir.pwd, 'package')
orig = `defaults read com.google.Chrome AppleLanguages 2>&1`
if $?.success?
languages = orig.lines.slice(1, orig.lines.length - 2).map { |s| s.strip.tr(',', ' ') }
restore = -> { `defaults write com.google.Chrome AppleLanguages '(#{languages.join(',')})'` }
else
restore = -> { `defaults delete com.google.Chrome AppleLanguages` }
end
`defaults write com.google.Chrome AppleLanguages '("#{language}")'`
begin
args = [
'--new',
'-a',
'Google Chrome',
'--args',
'--no-first-run',
"--user-data-dir=#{user_data_dir}",
"--load-extension=#{extension_dir}",
'about:blank'
]
puts "Running Chrome with locale #{language}."
system('open', *args)
ensure
restore.call
end
end
| require 'tmpdir'
require 'os'
def run_chrome(language)
if OS.mac?
run_chrome_macos(language)
elsif OS.windows?
run_chrome_windows(language)
else
$stderr.puts "Launching Chrome on this OS (#{RUBY_PLATFORM}) is not implemented."
end
end
def run_chrome_windows(language)
user_data_dir = Dir.mktmpdir('marinara')
extension_dir = File.join(Dir.pwd, 'package')
args = [
'--no-first-run',
"--lang=#{language}",
"--user-data-dir=#{user_data_dir}",
"--load-extension=#{extension_dir}",
'about:blank'
]
puts "Running Chrome with locale #{language}."
system('\Program Files (x86)\Google\Chrome\Application\chrome.exe', *args)
end
def run_chrome_macos(language)
user_data_dir = Dir.mktmpdir('marinara')
extension_dir = File.join(Dir.pwd, 'package')
orig = `defaults read com.google.Chrome AppleLanguages 2>&1`
if $?.success?
languages = orig.lines.slice(1, orig.lines.length - 2).map { |s| s.strip.tr(',', ' ') }
restore = -> { `defaults write com.google.Chrome AppleLanguages '(#{languages.join(',')})'` }
else
restore = -> { `defaults delete com.google.Chrome AppleLanguages` }
end
`defaults write com.google.Chrome AppleLanguages '("#{language}")'`
begin
args = [
'--new',
'-a',
'Google Chrome',
'--args',
'--no-first-run',
"--user-data-dir=#{user_data_dir}",
"--load-extension=#{extension_dir}",
'about:blank'
]
puts "Running Chrome with locale #{language}."
system('open', *args)
ensure
restore.call
end
end
| Add ability to launch Chrome on Windows from scripts. | Add ability to launch Chrome on Windows from scripts.
| Ruby | mit | schmich/marinara,schmich/marinara,schmich/marinara | ruby | ## Code Before:
require 'tmpdir'
def run_chrome(language)
user_data_dir = Dir.mktmpdir('marinara')
extension_dir = File.join(Dir.pwd, 'package')
orig = `defaults read com.google.Chrome AppleLanguages 2>&1`
if $?.success?
languages = orig.lines.slice(1, orig.lines.length - 2).map { |s| s.strip.tr(',', ' ') }
restore = -> { `defaults write com.google.Chrome AppleLanguages '(#{languages.join(',')})'` }
else
restore = -> { `defaults delete com.google.Chrome AppleLanguages` }
end
`defaults write com.google.Chrome AppleLanguages '("#{language}")'`
begin
args = [
'--new',
'-a',
'Google Chrome',
'--args',
'--no-first-run',
"--user-data-dir=#{user_data_dir}",
"--load-extension=#{extension_dir}",
'about:blank'
]
puts "Running Chrome with locale #{language}."
system('open', *args)
ensure
restore.call
end
end
## Instruction:
Add ability to launch Chrome on Windows from scripts.
## Code After:
require 'tmpdir'
require 'os'
def run_chrome(language)
if OS.mac?
run_chrome_macos(language)
elsif OS.windows?
run_chrome_windows(language)
else
$stderr.puts "Launching Chrome on this OS (#{RUBY_PLATFORM}) is not implemented."
end
end
def run_chrome_windows(language)
user_data_dir = Dir.mktmpdir('marinara')
extension_dir = File.join(Dir.pwd, 'package')
args = [
'--no-first-run',
"--lang=#{language}",
"--user-data-dir=#{user_data_dir}",
"--load-extension=#{extension_dir}",
'about:blank'
]
puts "Running Chrome with locale #{language}."
system('\Program Files (x86)\Google\Chrome\Application\chrome.exe', *args)
end
def run_chrome_macos(language)
user_data_dir = Dir.mktmpdir('marinara')
extension_dir = File.join(Dir.pwd, 'package')
orig = `defaults read com.google.Chrome AppleLanguages 2>&1`
if $?.success?
languages = orig.lines.slice(1, orig.lines.length - 2).map { |s| s.strip.tr(',', ' ') }
restore = -> { `defaults write com.google.Chrome AppleLanguages '(#{languages.join(',')})'` }
else
restore = -> { `defaults delete com.google.Chrome AppleLanguages` }
end
`defaults write com.google.Chrome AppleLanguages '("#{language}")'`
begin
args = [
'--new',
'-a',
'Google Chrome',
'--args',
'--no-first-run',
"--user-data-dir=#{user_data_dir}",
"--load-extension=#{extension_dir}",
'about:blank'
]
puts "Running Chrome with locale #{language}."
system('open', *args)
ensure
restore.call
end
end
| require 'tmpdir'
+ require 'os'
def run_chrome(language)
+ if OS.mac?
+ run_chrome_macos(language)
+ elsif OS.windows?
+ run_chrome_windows(language)
+ else
+ $stderr.puts "Launching Chrome on this OS (#{RUBY_PLATFORM}) is not implemented."
+ end
+ end
+
+ def run_chrome_windows(language)
+ user_data_dir = Dir.mktmpdir('marinara')
+ extension_dir = File.join(Dir.pwd, 'package')
+
+ args = [
+ '--no-first-run',
+ "--lang=#{language}",
+ "--user-data-dir=#{user_data_dir}",
+ "--load-extension=#{extension_dir}",
+ 'about:blank'
+ ]
+
+ puts "Running Chrome with locale #{language}."
+ system('\Program Files (x86)\Google\Chrome\Application\chrome.exe', *args)
+ end
+
+ def run_chrome_macos(language)
user_data_dir = Dir.mktmpdir('marinara')
extension_dir = File.join(Dir.pwd, 'package')
orig = `defaults read com.google.Chrome AppleLanguages 2>&1`
if $?.success?
languages = orig.lines.slice(1, orig.lines.length - 2).map { |s| s.strip.tr(',', ' ') }
restore = -> { `defaults write com.google.Chrome AppleLanguages '(#{languages.join(',')})'` }
else
restore = -> { `defaults delete com.google.Chrome AppleLanguages` }
end
`defaults write com.google.Chrome AppleLanguages '("#{language}")'`
begin
args = [
'--new',
'-a',
'Google Chrome',
'--args',
'--no-first-run',
"--user-data-dir=#{user_data_dir}",
"--load-extension=#{extension_dir}",
'about:blank'
]
puts "Running Chrome with locale #{language}."
system('open', *args)
ensure
restore.call
end
end | 27 | 0.794118 | 27 | 0 |
49154c02a7cf518675a3b0d448bd79d62dfa4b00 | proto-actor/src/main/kotlin/actor/proto/DeferredProcess.kt | proto-actor/src/main/kotlin/actor/proto/DeferredProcess.kt | package actor.proto
import actor.proto.mailbox.SystemMessage
import kotlinx.coroutines.experimental.CompletableDeferred
import kotlinx.coroutines.experimental.withTimeout
import java.time.Duration
import java.util.concurrent.TimeUnit
class DeferredProcess<out T>(private val timeout: Duration = Duration.ofMillis(5000)) : Process() {
val pid = ProcessRegistry.put(ProcessRegistry.nextId(), this)
private val cd = CompletableDeferred<T>()
override fun sendUserMessage(pid: PID, message: Any) {
val m = when (message) {
is MessageEnvelope -> message.message
else -> message
}
@Suppress("UNCHECKED_CAST")
cd.complete(m as T)
}
override fun sendSystemMessage(pid: PID, message: SystemMessage) {}
suspend fun await(): T {
val result = withTimeout(timeout.toMillis(), TimeUnit.MILLISECONDS) { cd.await() }
ProcessRegistry.remove(pid)
return result
}
} | package actor.proto
import actor.proto.mailbox.SystemMessage
import kotlinx.coroutines.experimental.CompletableDeferred
import kotlinx.coroutines.experimental.withTimeout
import java.time.Duration
import java.util.concurrent.TimeUnit
class DeferredProcess<out T>(private val timeout: Duration = Duration.ofMillis(5000)) : Process() {
val pid = ProcessRegistry.put(ProcessRegistry.nextId(), this)
private val cd = CompletableDeferred<T>()
override fun sendUserMessage(pid: PID, message: Any) {
val m = when (message) {
is MessageEnvelope -> message.message
else -> message
}
@Suppress("UNCHECKED_CAST")
cd.complete(m as T)
}
override fun sendSystemMessage(pid: PID, message: SystemMessage) {}
suspend fun await(): T {
try {
val result = withTimeout(timeout.toMillis(), TimeUnit.MILLISECONDS) { cd.await() }
ProcessRegistry.remove(pid)
return result;
} catch (exception: Exception) {
ProcessRegistry.remove(pid)
throw exception;
}
}
} | Remove from registry when an exception (including timeout) occurs | Remove from registry when an exception (including timeout) occurs
| Kotlin | apache-2.0 | AsynkronIT/protoactor-kotlin,AsynkronIT/protoactor-kotlin,AsynkronIT/protoactor-kotlin | kotlin | ## Code Before:
package actor.proto
import actor.proto.mailbox.SystemMessage
import kotlinx.coroutines.experimental.CompletableDeferred
import kotlinx.coroutines.experimental.withTimeout
import java.time.Duration
import java.util.concurrent.TimeUnit
class DeferredProcess<out T>(private val timeout: Duration = Duration.ofMillis(5000)) : Process() {
val pid = ProcessRegistry.put(ProcessRegistry.nextId(), this)
private val cd = CompletableDeferred<T>()
override fun sendUserMessage(pid: PID, message: Any) {
val m = when (message) {
is MessageEnvelope -> message.message
else -> message
}
@Suppress("UNCHECKED_CAST")
cd.complete(m as T)
}
override fun sendSystemMessage(pid: PID, message: SystemMessage) {}
suspend fun await(): T {
val result = withTimeout(timeout.toMillis(), TimeUnit.MILLISECONDS) { cd.await() }
ProcessRegistry.remove(pid)
return result
}
}
## Instruction:
Remove from registry when an exception (including timeout) occurs
## Code After:
package actor.proto
import actor.proto.mailbox.SystemMessage
import kotlinx.coroutines.experimental.CompletableDeferred
import kotlinx.coroutines.experimental.withTimeout
import java.time.Duration
import java.util.concurrent.TimeUnit
class DeferredProcess<out T>(private val timeout: Duration = Duration.ofMillis(5000)) : Process() {
val pid = ProcessRegistry.put(ProcessRegistry.nextId(), this)
private val cd = CompletableDeferred<T>()
override fun sendUserMessage(pid: PID, message: Any) {
val m = when (message) {
is MessageEnvelope -> message.message
else -> message
}
@Suppress("UNCHECKED_CAST")
cd.complete(m as T)
}
override fun sendSystemMessage(pid: PID, message: SystemMessage) {}
suspend fun await(): T {
try {
val result = withTimeout(timeout.toMillis(), TimeUnit.MILLISECONDS) { cd.await() }
ProcessRegistry.remove(pid)
return result;
} catch (exception: Exception) {
ProcessRegistry.remove(pid)
throw exception;
}
}
} | package actor.proto
import actor.proto.mailbox.SystemMessage
import kotlinx.coroutines.experimental.CompletableDeferred
import kotlinx.coroutines.experimental.withTimeout
import java.time.Duration
import java.util.concurrent.TimeUnit
class DeferredProcess<out T>(private val timeout: Duration = Duration.ofMillis(5000)) : Process() {
val pid = ProcessRegistry.put(ProcessRegistry.nextId(), this)
private val cd = CompletableDeferred<T>()
override fun sendUserMessage(pid: PID, message: Any) {
val m = when (message) {
is MessageEnvelope -> message.message
else -> message
}
@Suppress("UNCHECKED_CAST")
cd.complete(m as T)
}
override fun sendSystemMessage(pid: PID, message: SystemMessage) {}
suspend fun await(): T {
+ try {
- val result = withTimeout(timeout.toMillis(), TimeUnit.MILLISECONDS) { cd.await() }
+ val result = withTimeout(timeout.toMillis(), TimeUnit.MILLISECONDS) { cd.await() }
? ++++
- ProcessRegistry.remove(pid)
+ ProcessRegistry.remove(pid)
? ++++
- return result
+ return result;
? ++++ +
+ } catch (exception: Exception) {
+ ProcessRegistry.remove(pid)
+ throw exception;
+ }
}
} | 11 | 0.392857 | 8 | 3 |
0d612ef54bb9bcf5b42ffcbb7e75f67d978b67c8 | app/models/ability.rb | app/models/ability.rb | class Ability
include CanCan::Ability
def initialize(user)
user ||= User.new
if user.has_role? :admin
can :manage, :all
else
can :read, :all
end
unless user.id.nil?
can :vote, Poll
can [:create, :update], Place
can [:nominate, :add_options], Poll
end
end
end
| class Ability
include CanCan::Ability
def initialize(user)
user ||= User.new
if user.has_role? :admin
can :manage, :all
else
can :read, :all
end
unless user.id.nil?
# Normal logged in users can...
can [:create, :update], Place # create and update Places
can [:create, :nominate, :add_options, :vote], Poll # Create, nominate and vote in Polls
end
end
end
| Allow normal users to create polls, clean up abilities model | Allow normal users to create polls, clean up abilities model
| Ruby | mit | localstatic/glf,localstatic/glf | ruby | ## Code Before:
class Ability
include CanCan::Ability
def initialize(user)
user ||= User.new
if user.has_role? :admin
can :manage, :all
else
can :read, :all
end
unless user.id.nil?
can :vote, Poll
can [:create, :update], Place
can [:nominate, :add_options], Poll
end
end
end
## Instruction:
Allow normal users to create polls, clean up abilities model
## Code After:
class Ability
include CanCan::Ability
def initialize(user)
user ||= User.new
if user.has_role? :admin
can :manage, :all
else
can :read, :all
end
unless user.id.nil?
# Normal logged in users can...
can [:create, :update], Place # create and update Places
can [:create, :nominate, :add_options, :vote], Poll # Create, nominate and vote in Polls
end
end
end
| class Ability
include CanCan::Ability
def initialize(user)
user ||= User.new
if user.has_role? :admin
can :manage, :all
else
can :read, :all
end
unless user.id.nil?
- can :vote, Poll
- can [:create, :update], Place
- can [:nominate, :add_options], Poll
+ # Normal logged in users can...
+ can [:create, :update], Place # create and update Places
+ can [:create, :nominate, :add_options, :vote], Poll # Create, nominate and vote in Polls
end
end
end | 6 | 0.3 | 3 | 3 |
56f52abb10c149ae856c1e2ba808a9adc3e4e7e2 | app/src/main/graphql/io/sweers/catchup/data/github/GitHubTrendingQuery.graphql | app/src/main/graphql/io/sweers/catchup/data/github/GitHubTrendingQuery.graphql | query GitHubSearch($queryString: String!, $firstCount: Int!, $order: LanguageOrder!) {
search(query: $queryString, type: REPOSITORY, first: $firstCount) {
repositoryCount
nodes {
... on Repository {
name
createdAt
description
id
languages(first: 1, orderBy: $order) {
nodes {
name
}
}
licenseInfo {
name
}
owner {
login
}
stargazers {
totalCount
}
url
}
}
}
}
| query GitHubSearch($queryString: String!, $firstCount: Int!, $order: LanguageOrder!) {
search(query: $queryString, type: REPOSITORY, first: $firstCount) {
repositoryCount
nodes {
... on Repository {
name
createdAt
description
homepageUrl
id
languages(first: 1, orderBy: $order) {
nodes {
name
}
}
licenseInfo {
name
}
owner {
login
}
stargazers {
totalCount
}
url
}
}
}
}
| Add homepageUrl for later investigation | Add homepageUrl for later investigation
This could be a good candidate for long clicks or something in github items. Not sure how best to handle it yet
| GraphQL | apache-2.0 | hzsweers/CatchUp,hzsweers/CatchUp,hzsweers/CatchUp,hzsweers/CatchUp | graphql | ## Code Before:
query GitHubSearch($queryString: String!, $firstCount: Int!, $order: LanguageOrder!) {
search(query: $queryString, type: REPOSITORY, first: $firstCount) {
repositoryCount
nodes {
... on Repository {
name
createdAt
description
id
languages(first: 1, orderBy: $order) {
nodes {
name
}
}
licenseInfo {
name
}
owner {
login
}
stargazers {
totalCount
}
url
}
}
}
}
## Instruction:
Add homepageUrl for later investigation
This could be a good candidate for long clicks or something in GitHub items. Not sure how best to handle it yet
## Code After:
query GitHubSearch($queryString: String!, $firstCount: Int!, $order: LanguageOrder!) {
search(query: $queryString, type: REPOSITORY, first: $firstCount) {
repositoryCount
nodes {
... on Repository {
name
createdAt
description
homepageUrl
id
languages(first: 1, orderBy: $order) {
nodes {
name
}
}
licenseInfo {
name
}
owner {
login
}
stargazers {
totalCount
}
url
}
}
}
}
| query GitHubSearch($queryString: String!, $firstCount: Int!, $order: LanguageOrder!) {
search(query: $queryString, type: REPOSITORY, first: $firstCount) {
repositoryCount
nodes {
... on Repository {
name
createdAt
description
+ homepageUrl
id
languages(first: 1, orderBy: $order) {
nodes {
name
}
}
licenseInfo {
name
}
owner {
login
}
stargazers {
totalCount
}
url
}
}
}
} | 1 | 0.035714 | 1 | 0 |
8ac0252be0f1c16d98c18fa9471be093f5d99704 | src/components/TransitionSwitch/TransitionSwitch.tsx | src/components/TransitionSwitch/TransitionSwitch.tsx | import * as React from "react";
import {Route} from "react-router-dom";
import * as TransitionGroup from "react-transition-group/TransitionGroup";
import * as CSSTransition from "react-transition-group/CSSTransition";
import {TransitionSwitchProps, TransitionSwitchPropTypes, TransitionSwitchDefaultProps} from "./TransitionSwitchProps";
export class TransitionSwitch extends React.Component<TransitionSwitchProps, undefined> {
public static propTypes = TransitionSwitchPropTypes;
public static defaultProps = TransitionSwitchDefaultProps;
protected get routeProps(): object {
return Object.keys(this.props.children)
.map((field) => this.props.children[field].props)
.find(({path}) => path === this.props.history.location.pathname);
}
public render(): JSX.Element {
const {history: {location}, ...props} = this.props;
const transitionProps: CSSTransition.CSSTransitionProps = {
...props,
...{
key: location.pathname.split("/")[1],
}
} as any;
const currentRouteProps = {
...this.routeProps,
...{
location,
}
};
return (
<TransitionGroup className={this.props.className}>
<CSSTransition {...transitionProps}>
<Route {...currentRouteProps}/>
</CSSTransition>
</TransitionGroup>
);
}
}
| import * as React from "react";
import {Route} from "react-router-dom";
import {TransitionGroup, CSSTransition} from "react-transition-group";
import {TransitionSwitchProps, TransitionSwitchPropTypes, TransitionSwitchDefaultProps} from "./TransitionSwitchProps";
export class TransitionSwitch extends React.Component<TransitionSwitchProps, undefined> {
public static propTypes = TransitionSwitchPropTypes;
public static defaultProps = TransitionSwitchDefaultProps;
protected get routeProps(): object {
return Object.keys(this.props.children)
.map((field) => this.props.children[field].props)
.find(({path}) => path === this.props.history.location.pathname);
}
public render(): JSX.Element {
const {history: {location}, ...props} = this.props;
const transitionProps: any = {
...props,
...{
key: location.pathname.split("/")[1],
}
} as any;
const currentRouteProps = {
...this.routeProps,
...{
location,
}
};
return (
<TransitionGroup className={this.props.className}>
<CSSTransition {...transitionProps}>
<Route {...currentRouteProps}/>
</CSSTransition>
</TransitionGroup>
);
}
}
| Remove type definition for react-transition-group | Remove type definition for react-transition-group
| TypeScript | mit | wearesho-team/wearesho-site,wearesho-team/wearesho-site,wearesho-team/wearesho-site | typescript | ## Code Before:
import * as React from "react";
import {Route} from "react-router-dom";
import * as TransitionGroup from "react-transition-group/TransitionGroup";
import * as CSSTransition from "react-transition-group/CSSTransition";
import {TransitionSwitchProps, TransitionSwitchPropTypes, TransitionSwitchDefaultProps} from "./TransitionSwitchProps";
export class TransitionSwitch extends React.Component<TransitionSwitchProps, undefined> {
public static propTypes = TransitionSwitchPropTypes;
public static defaultProps = TransitionSwitchDefaultProps;
protected get routeProps(): object {
return Object.keys(this.props.children)
.map((field) => this.props.children[field].props)
.find(({path}) => path === this.props.history.location.pathname);
}
public render(): JSX.Element {
const {history: {location}, ...props} = this.props;
const transitionProps: CSSTransition.CSSTransitionProps = {
...props,
...{
key: location.pathname.split("/")[1],
}
} as any;
const currentRouteProps = {
...this.routeProps,
...{
location,
}
};
return (
<TransitionGroup className={this.props.className}>
<CSSTransition {...transitionProps}>
<Route {...currentRouteProps}/>
</CSSTransition>
</TransitionGroup>
);
}
}
## Instruction:
Remove type definition for react-transition-group
## Code After:
import * as React from "react";
import {Route} from "react-router-dom";
import {TransitionGroup, CSSTransition} from "react-transition-group";
import {TransitionSwitchProps, TransitionSwitchPropTypes, TransitionSwitchDefaultProps} from "./TransitionSwitchProps";
export class TransitionSwitch extends React.Component<TransitionSwitchProps, undefined> {
public static propTypes = TransitionSwitchPropTypes;
public static defaultProps = TransitionSwitchDefaultProps;
protected get routeProps(): object {
return Object.keys(this.props.children)
.map((field) => this.props.children[field].props)
.find(({path}) => path === this.props.history.location.pathname);
}
public render(): JSX.Element {
const {history: {location}, ...props} = this.props;
const transitionProps: any = {
...props,
...{
key: location.pathname.split("/")[1],
}
} as any;
const currentRouteProps = {
...this.routeProps,
...{
location,
}
};
return (
<TransitionGroup className={this.props.className}>
<CSSTransition {...transitionProps}>
<Route {...currentRouteProps}/>
</CSSTransition>
</TransitionGroup>
);
}
}
| import * as React from "react";
import {Route} from "react-router-dom";
- import * as TransitionGroup from "react-transition-group/TransitionGroup";
- import * as CSSTransition from "react-transition-group/CSSTransition";
? ^^ --------------
+ import {TransitionGroup, CSSTransition} from "react-transition-group";
? ^^^ + +++++++++++ +
import {TransitionSwitchProps, TransitionSwitchPropTypes, TransitionSwitchDefaultProps} from "./TransitionSwitchProps";
export class TransitionSwitch extends React.Component<TransitionSwitchProps, undefined> {
public static propTypes = TransitionSwitchPropTypes;
public static defaultProps = TransitionSwitchDefaultProps;
protected get routeProps(): object {
return Object.keys(this.props.children)
.map((field) => this.props.children[field].props)
.find(({path}) => path === this.props.history.location.pathname);
}
public render(): JSX.Element {
const {history: {location}, ...props} = this.props;
- const transitionProps: CSSTransition.CSSTransitionProps = {
+ const transitionProps: any = {
...props,
...{
key: location.pathname.split("/")[1],
}
} as any;
const currentRouteProps = {
...this.routeProps,
...{
location,
}
};
return (
<TransitionGroup className={this.props.className}>
<CSSTransition {...transitionProps}>
<Route {...currentRouteProps}/>
</CSSTransition>
</TransitionGroup>
);
}
} | 5 | 0.111111 | 2 | 3 |
66731fa3a15b97519bb5a190f49e768bb2c93090 | verified_double.gemspec | verified_double.gemspec | lib = File.expand_path('../lib', __FILE__)
$LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
require 'verified_double/version'
Gem::Specification.new do |gem|
gem.name = "verified_double"
gem.version = VerifiedDouble::VERSION
gem.authors = ["George Mendoza"]
gem.email = ["gsmendoza@gmail.com"]
gem.description = %q{TODO: Write a gem description}
gem.summary = %q{TODO: Write a gem summary}
gem.homepage = ""
gem.files = `git ls-files`.split($/)
gem.executables = gem.files.grep(%r{^bin/}).map{ |f| File.basename(f) }
gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
gem.require_paths = ["lib"]
gem.add_runtime_dependency "rspec"
gem.add_runtime_dependency "rspec-fire"
gem.add_development_dependency "aruba"
gem.add_development_dependency "cucumber"
gem.add_development_dependency "pry"
gem.add_development_dependency "activesupport"
end
| lib = File.expand_path('../lib', __FILE__)
$LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
require 'verified_double/version'
Gem::Specification.new do |gem|
gem.name = "verified_double"
gem.version = VerifiedDouble::VERSION
gem.authors = ["George Mendoza"]
gem.email = ["gsmendoza@gmail.com"]
gem.description = %q{Contract tests for rspec}
gem.summary = %q{VerifiedDouble would record any mock made in the test suite. It would then verify if the mock is valid by checking if there is a test against it.}
gem.homepage = "https://www.relishapp.com/gsmendoza/verified-double"
gem.files = `git ls-files`.split($/)
gem.executables = gem.files.grep(%r{^bin/}).map{ |f| File.basename(f) }
gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
gem.require_paths = ["lib"]
gem.add_runtime_dependency "rspec"
gem.add_runtime_dependency "rspec-fire"
gem.add_development_dependency "aruba"
gem.add_development_dependency "cucumber"
gem.add_development_dependency "pry"
gem.add_development_dependency "activesupport"
end
| Update gem description, summary, and homepage for release. | Update gem description, summary, and homepage for release.
| Ruby | mit | gsmendoza/verified_double | ruby | ## Code Before:
lib = File.expand_path('../lib', __FILE__)
$LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
require 'verified_double/version'
Gem::Specification.new do |gem|
gem.name = "verified_double"
gem.version = VerifiedDouble::VERSION
gem.authors = ["George Mendoza"]
gem.email = ["gsmendoza@gmail.com"]
gem.description = %q{TODO: Write a gem description}
gem.summary = %q{TODO: Write a gem summary}
gem.homepage = ""
gem.files = `git ls-files`.split($/)
gem.executables = gem.files.grep(%r{^bin/}).map{ |f| File.basename(f) }
gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
gem.require_paths = ["lib"]
gem.add_runtime_dependency "rspec"
gem.add_runtime_dependency "rspec-fire"
gem.add_development_dependency "aruba"
gem.add_development_dependency "cucumber"
gem.add_development_dependency "pry"
gem.add_development_dependency "activesupport"
end
## Instruction:
Update gem description, summary, and homepage for release.
## Code After:
lib = File.expand_path('../lib', __FILE__)
$LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
require 'verified_double/version'
Gem::Specification.new do |gem|
gem.name = "verified_double"
gem.version = VerifiedDouble::VERSION
gem.authors = ["George Mendoza"]
gem.email = ["gsmendoza@gmail.com"]
gem.description = %q{Contract tests for rspec}
gem.summary = %q{VerifiedDouble would record any mock made in the test suite. It would then verify if the mock is valid by checking if there is a test against it.}
gem.homepage = "https://www.relishapp.com/gsmendoza/verified-double"
gem.files = `git ls-files`.split($/)
gem.executables = gem.files.grep(%r{^bin/}).map{ |f| File.basename(f) }
gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
gem.require_paths = ["lib"]
gem.add_runtime_dependency "rspec"
gem.add_runtime_dependency "rspec-fire"
gem.add_development_dependency "aruba"
gem.add_development_dependency "cucumber"
gem.add_development_dependency "pry"
gem.add_development_dependency "activesupport"
end
| lib = File.expand_path('../lib', __FILE__)
$LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
require 'verified_double/version'
Gem::Specification.new do |gem|
gem.name = "verified_double"
gem.version = VerifiedDouble::VERSION
gem.authors = ["George Mendoza"]
gem.email = ["gsmendoza@gmail.com"]
- gem.description = %q{TODO: Write a gem description}
- gem.summary = %q{TODO: Write a gem summary}
- gem.homepage = ""
+ gem.description = %q{Contract tests for rspec}
+ gem.summary = %q{VerifiedDouble would record any mock made in the test suite. It would then verify if the mock is valid by checking if there is a test against it.}
+ gem.homepage = "https://www.relishapp.com/gsmendoza/verified-double"
gem.files = `git ls-files`.split($/)
gem.executables = gem.files.grep(%r{^bin/}).map{ |f| File.basename(f) }
gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
gem.require_paths = ["lib"]
gem.add_runtime_dependency "rspec"
gem.add_runtime_dependency "rspec-fire"
gem.add_development_dependency "aruba"
gem.add_development_dependency "cucumber"
gem.add_development_dependency "pry"
gem.add_development_dependency "activesupport"
end | 6 | 0.222222 | 3 | 3 |
992b3302c4cb690e86436c54c43d0bb2aa406b0d | scrapi/harvesters/hacettepe_U_DIM.py | scrapi/harvesters/hacettepe_U_DIM.py | '''
Harvester for the DSpace on LibLiveCD for the SHARE project
Example API call: http://bbytezarsivi.hacettepe.edu.tr/oai/request?verb=ListRecords&metadataPrefix=oai_dc
'''
from __future__ import unicode_literals
from scrapi.base import OAIHarvester
class Hacettepe_u_dimHarvester(OAIHarvester):
short_name = 'hacettepe_U_DIM'
long_name = 'DSpace on LibLiveCD'
url = 'http://bbytezarsivi.hacettepe.edu.tr/oai/request'
base_url = 'http://bbytezarsivi.hacettepe.edu.tr/oai/request'
property_list = ['date', 'identifier', 'type', 'rights']
timezone_granularity = True
| '''
Harvester for the DSpace on LibLiveCD for the SHARE project
Example API call: http://bbytezarsivi.hacettepe.edu.tr/oai/request?verb=ListRecords&metadataPrefix=oai_dc
'''
from __future__ import unicode_literals
from scrapi.base import OAIHarvester
class HacettepeHarvester(OAIHarvester):
short_name = 'hacettepe'
long_name = 'DSpace on LibLiveCD'
url = 'http://bbytezarsivi.hacettepe.edu.tr/oai/request'
base_url = 'http://bbytezarsivi.hacettepe.edu.tr/oai/request'
property_list = ['date', 'identifier', 'type', 'rights']
timezone_granularity = True
| Change shortname and class name | Change shortname and class name
| Python | apache-2.0 | alexgarciac/scrapi,fabianvf/scrapi,CenterForOpenScience/scrapi,mehanig/scrapi,ostwald/scrapi,CenterForOpenScience/scrapi,erinspace/scrapi,jeffreyliu3230/scrapi,mehanig/scrapi,erinspace/scrapi,felliott/scrapi,felliott/scrapi,fabianvf/scrapi | python | ## Code Before:
'''
Harvester for the DSpace on LibLiveCD for the SHARE project
Example API call: http://bbytezarsivi.hacettepe.edu.tr/oai/request?verb=ListRecords&metadataPrefix=oai_dc
'''
from __future__ import unicode_literals
from scrapi.base import OAIHarvester
class Hacettepe_u_dimHarvester(OAIHarvester):
short_name = 'hacettepe_U_DIM'
long_name = 'DSpace on LibLiveCD'
url = 'http://bbytezarsivi.hacettepe.edu.tr/oai/request'
base_url = 'http://bbytezarsivi.hacettepe.edu.tr/oai/request'
property_list = ['date', 'identifier', 'type', 'rights']
timezone_granularity = True
## Instruction:
Change shortname and class name
## Code After:
'''
Harvester for the DSpace on LibLiveCD for the SHARE project
Example API call: http://bbytezarsivi.hacettepe.edu.tr/oai/request?verb=ListRecords&metadataPrefix=oai_dc
'''
from __future__ import unicode_literals
from scrapi.base import OAIHarvester
class HacettepeHarvester(OAIHarvester):
short_name = 'hacettepe'
long_name = 'DSpace on LibLiveCD'
url = 'http://bbytezarsivi.hacettepe.edu.tr/oai/request'
base_url = 'http://bbytezarsivi.hacettepe.edu.tr/oai/request'
property_list = ['date', 'identifier', 'type', 'rights']
timezone_granularity = True
| '''
Harvester for the DSpace on LibLiveCD for the SHARE project
Example API call: http://bbytezarsivi.hacettepe.edu.tr/oai/request?verb=ListRecords&metadataPrefix=oai_dc
'''
from __future__ import unicode_literals
from scrapi.base import OAIHarvester
- class Hacettepe_u_dimHarvester(OAIHarvester):
? ------
+ class HacettepeHarvester(OAIHarvester):
- short_name = 'hacettepe_U_DIM'
? ------
+ short_name = 'hacettepe'
long_name = 'DSpace on LibLiveCD'
url = 'http://bbytezarsivi.hacettepe.edu.tr/oai/request'
base_url = 'http://bbytezarsivi.hacettepe.edu.tr/oai/request'
property_list = ['date', 'identifier', 'type', 'rights']
timezone_granularity = True | 4 | 0.222222 | 2 | 2 |
28770adb8a68f18c6ba798d332ee7fddc9a65107 | src/frontend/fetchTutorials.js | src/frontend/fetchTutorials.js | /* global fetch */
module.exports = function(callback) {
console.log("Fetch tutorials.");
fetch('/getListOfTutorials')
.then(function(data) {
return data.json();
}).then(function(tutorialPaths) {
console.log("Obtaining list of tutorials successful: " + tutorialPaths);
callback(0, tutorialPaths);
}).catch(function(error) {
console.log("There was an error obtaining the list of tutorial files: " +
error);
});
};
| /* global fetch */
module.exports = function(callback) {
console.log("Fetch tutorials.");
fetch('/getListOfTutorials', {
credentials: 'same-origin'
})
.then(function(data) {
return data.json();
}).then(function(tutorialPaths) {
console.log("Obtaining list of tutorials successful: " + tutorialPaths);
callback(0, tutorialPaths);
}).catch(function(error) {
console.log("There was an error obtaining the list of " +
"tutorial files: " + error);
});
};
| Fix error about tutorials for pasword-restricted use | Fix error about tutorials for pasword-restricted use
| JavaScript | mit | fhinkel/InteractiveShell,antonleykin/InteractiveShell,fhinkel/InteractiveShell,fhinkel/InteractiveShell,antonleykin/InteractiveShell,antonleykin/InteractiveShell,fhinkel/InteractiveShell | javascript | ## Code Before:
/* global fetch */
module.exports = function(callback) {
console.log("Fetch tutorials.");
fetch('/getListOfTutorials')
.then(function(data) {
return data.json();
}).then(function(tutorialPaths) {
console.log("Obtaining list of tutorials successful: " + tutorialPaths);
callback(0, tutorialPaths);
}).catch(function(error) {
console.log("There was an error obtaining the list of tutorial files: " +
error);
});
};
## Instruction:
Fix error about tutorials for pasword-restricted use
## Code After:
/* global fetch */
module.exports = function(callback) {
console.log("Fetch tutorials.");
fetch('/getListOfTutorials', {
credentials: 'same-origin'
})
.then(function(data) {
return data.json();
}).then(function(tutorialPaths) {
console.log("Obtaining list of tutorials successful: " + tutorialPaths);
callback(0, tutorialPaths);
}).catch(function(error) {
console.log("There was an error obtaining the list of " +
"tutorial files: " + error);
});
};
| /* global fetch */
module.exports = function(callback) {
console.log("Fetch tutorials.");
- fetch('/getListOfTutorials')
? ^
+ fetch('/getListOfTutorials', {
? ^^^
+ credentials: 'same-origin'
+ })
.then(function(data) {
return data.json();
}).then(function(tutorialPaths) {
console.log("Obtaining list of tutorials successful: " + tutorialPaths);
callback(0, tutorialPaths);
}).catch(function(error) {
- console.log("There was an error obtaining the list of tutorial files: " +
? ---- ----------------
+ console.log("There was an error obtaining the list of " +
- error);
- });
+ "tutorial files: " + error);
+ });
}; | 10 | 0.714286 | 6 | 4 |
9bf6e709d5c1d900b11c103dadd89a8896a0ac1a | app/views/claims/thanks.slim | app/views/claims/thanks.slim | ruby:
claimed_pods = params[:successfully_claimed] != nil
pod_message = claimed_pods ? 'pod'.pluralize(params[:successfully_claimed].size) : nil
- if claimed_pods
p You have successfully claimed the following #{ pod_message } and are now registered as #{params[:successfully_claimed].size == 1 ? 'its' : 'their'} ‘owner’: #{params[:successfully_claimed].to_sentence}.
p If you have any co-maintainers, you can now add them as ‘owners’ as well. For information on how to do this see the <a href = "http://guides.cocoapods.org/making/getting-setup-with-trunk">getting started with Trunk</a> guide.
p Once we have finished the transition period, you will be able to push new versions of #{params[:successfully_claimed].size == 1 ? 'these pods' : 'this pod'} directly from the command-line. For more details see [TODO].
- else
p All of your choosen Pods are already claimed.
- unless params[:already_claimed] == ['']
p The following #{params[:already_claimed].size == 1 ? 'pod has' : 'pods have'} already been claimed: #{params[:already_claimed].to_sentence}. If you disagree with this please <a href="#{url("/disputes/new?#{{ :claimer_email => params[:claimer_email], :pods => params[:already_claimed] }.to_query}")}">file a dispute</a>.
| ruby:
claimed_pods = params[:successfully_claimed] != nil
pod_message = claimed_pods ? 'pod'.pluralize(params[:successfully_claimed].size) : nil
- if claimed_pods
p You have successfully claimed the following #{ pod_message } and are now registered as #{params[:successfully_claimed].size == 1 ? 'its' : 'their'} ‘owner’: #{params[:successfully_claimed].to_sentence}.
p If you have any co-maintainers, you can now add them as ‘owners’ as well. For information on how to do this see the <a href = "http://guides.cocoapods.org/making/getting-setup-with-trunk">getting started with Trunk</a> guide.
p Once we have finished the transition period, you will be able to push new versions of #{params[:successfully_claimed].size == 1 ? 'these pods' : 'this pod'} directly from the command-line. For more details see the <a href = "http://guides.cocoapods.org/making/getting-setup-with-trunk">getting started with Trunk</a> guide.
- else
p All of your choosen Pods are already claimed.
- unless params[:already_claimed] == ['']
p The following #{params[:already_claimed].size == 1 ? 'pod has' : 'pods have'} already been claimed: #{params[:already_claimed].to_sentence}. If you disagree with this please <a href="#{url("/disputes/new?#{{ :claimer_email => params[:claimer_email], :pods => params[:already_claimed] }.to_query}")}">file a dispute</a>.
| Replace another TODO -> guide link. | [claims] Replace another TODO -> guide link.
| Slim | mit | billinghamj/trunk.cocoapods.org,Shawn-WangDapeng/trunk.cocoapods.org,Shawn-WangDapeng/trunk.cocoapods.org,billinghamj/trunk.cocoapods.org,k0nserv/trunk.cocoapods.org,CocoaPods/trunk.cocoapods.org,CocoaPods/trunk.cocoapods.org,k0nserv/trunk.cocoapods.org,k0nserv/trunk.cocoapods.org | slim | ## Code Before:
ruby:
claimed_pods = params[:successfully_claimed] != nil
pod_message = claimed_pods ? 'pod'.pluralize(params[:successfully_claimed].size) : nil
- if claimed_pods
p You have successfully claimed the following #{ pod_message } and are now registered as #{params[:successfully_claimed].size == 1 ? 'its' : 'their'} ‘owner’: #{params[:successfully_claimed].to_sentence}.
p If you have any co-maintainers, you can now add them as ‘owners’ as well. For information on how to do this see the <a href = "http://guides.cocoapods.org/making/getting-setup-with-trunk">getting started with Trunk</a> guide.
p Once we have finished the transition period, you will be able to push new versions of #{params[:successfully_claimed].size == 1 ? 'these pods' : 'this pod'} directly from the command-line. For more details see [TODO].
- else
p All of your choosen Pods are already claimed.
- unless params[:already_claimed] == ['']
p The following #{params[:already_claimed].size == 1 ? 'pod has' : 'pods have'} already been claimed: #{params[:already_claimed].to_sentence}. If you disagree with this please <a href="#{url("/disputes/new?#{{ :claimer_email => params[:claimer_email], :pods => params[:already_claimed] }.to_query}")}">file a dispute</a>.
## Instruction:
[claims] Replace another TODO -> guide link.
## Code After:
ruby:
claimed_pods = params[:successfully_claimed] != nil
pod_message = claimed_pods ? 'pod'.pluralize(params[:successfully_claimed].size) : nil
- if claimed_pods
p You have successfully claimed the following #{ pod_message } and are now registered as #{params[:successfully_claimed].size == 1 ? 'its' : 'their'} ‘owner’: #{params[:successfully_claimed].to_sentence}.
p If you have any co-maintainers, you can now add them as ‘owners’ as well. For information on how to do this see the <a href = "http://guides.cocoapods.org/making/getting-setup-with-trunk">getting started with Trunk</a> guide.
p Once we have finished the transition period, you will be able to push new versions of #{params[:successfully_claimed].size == 1 ? 'these pods' : 'this pod'} directly from the command-line. For more details see the <a href = "http://guides.cocoapods.org/making/getting-setup-with-trunk">getting started with Trunk</a> guide.
- else
p All of your choosen Pods are already claimed.
- unless params[:already_claimed] == ['']
p The following #{params[:already_claimed].size == 1 ? 'pod has' : 'pods have'} already been claimed: #{params[:already_claimed].to_sentence}. If you disagree with this please <a href="#{url("/disputes/new?#{{ :claimer_email => params[:claimer_email], :pods => params[:already_claimed] }.to_query}")}">file a dispute</a>.
| ruby:
claimed_pods = params[:successfully_claimed] != nil
pod_message = claimed_pods ? 'pod'.pluralize(params[:successfully_claimed].size) : nil
- if claimed_pods
p You have successfully claimed the following #{ pod_message } and are now registered as #{params[:successfully_claimed].size == 1 ? 'its' : 'their'} ‘owner’: #{params[:successfully_claimed].to_sentence}.
p If you have any co-maintainers, you can now add them as ‘owners’ as well. For information on how to do this see the <a href = "http://guides.cocoapods.org/making/getting-setup-with-trunk">getting started with Trunk</a> guide.
- p Once we have finished the transition period, you will be able to push new versions of #{params[:successfully_claimed].size == 1 ? 'these pods' : 'this pod'} directly from the command-line. For more details see [TODO].
? ^ ^^^^^
+ p Once we have finished the transition period, you will be able to push new versions of #{params[:successfully_claimed].size == 1 ? 'these pods' : 'this pod'} directly from the command-line. For more details see the <a href = "http://guides.cocoapods.org/making/getting-setup-with-trunk">getting started with Trunk</a> guide.
? ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^
- else
p All of your choosen Pods are already claimed.
- unless params[:already_claimed] == ['']
p The following #{params[:already_claimed].size == 1 ? 'pod has' : 'pods have'} already been claimed: #{params[:already_claimed].to_sentence}. If you disagree with this please <a href="#{url("/disputes/new?#{{ :claimer_email => params[:claimer_email], :pods => params[:already_claimed] }.to_query}")}">file a dispute</a>.
| 2 | 0.125 | 1 | 1 |
5afa1a697da48fb473d0e19fe6e5dbfc6913ca75 | src/common/analytics/index.js | src/common/analytics/index.js | /**
* Dependencies.
*/
let NullAnalytics = require('./NullAnalytics');
let Tracker = require('./Tracker');
/**
* Constants.
*/
const ANALYTICS_KEY = process.env.ANALYTICS_KEY;
/**
* Locals.
*/
let enableAnalytics = window.ProductHuntAnalytics && ANALYTICS_KEY;
let ProductHuntAnalytics = enableAnalytics ? window.ProductHuntAnalytics : NullAnalytics;
let analytics = new ProductHuntAnalytics(process.env.ANALYTICS_KEY);
/**
* Export a new `Tracker`.
*/
module.exports = new Tracker(analytics);
| /**
* Dependencies.
*/
let NullAnalytics = require('./NullAnalytics');
let Tracker = require('./Tracker');
/**
* Constants.
*/
const ANALYTICS_KEY = process.env.ANALYTICS_KEY;
/**
* Locals.
*
* Note(andreasklinger): window.ProductHuntAnalytics gets set by a custom built of the analytics.js
* To recreate this use their make script - it offers a options to set the variable name.
*/
let enableAnalytics = window.ProductHuntAnalytics && ANALYTICS_KEY;
let ProductHuntAnalytics = enableAnalytics ? window.ProductHuntAnalytics : NullAnalytics;
let analytics = new ProductHuntAnalytics(process.env.ANALYTICS_KEY);
/**
* Export a new `Tracker`.
*/
module.exports = new Tracker(analytics);
| Add note to explain where the custom name comes from | Add note to explain where the custom name comes from | JavaScript | isc | producthunt/producthunt-chrome-extension,producthunt/producthunt-chrome-extension | javascript | ## Code Before:
/**
* Dependencies.
*/
let NullAnalytics = require('./NullAnalytics');
let Tracker = require('./Tracker');
/**
* Constants.
*/
const ANALYTICS_KEY = process.env.ANALYTICS_KEY;
/**
* Locals.
*/
let enableAnalytics = window.ProductHuntAnalytics && ANALYTICS_KEY;
let ProductHuntAnalytics = enableAnalytics ? window.ProductHuntAnalytics : NullAnalytics;
let analytics = new ProductHuntAnalytics(process.env.ANALYTICS_KEY);
/**
* Export a new `Tracker`.
*/
module.exports = new Tracker(analytics);
## Instruction:
Add note to explain where the custom name comes from
## Code After:
/**
* Dependencies.
*/
let NullAnalytics = require('./NullAnalytics');
let Tracker = require('./Tracker');
/**
* Constants.
*/
const ANALYTICS_KEY = process.env.ANALYTICS_KEY;
/**
* Locals.
*
* Note(andreasklinger): window.ProductHuntAnalytics gets set by a custom built of the analytics.js
* To recreate this use their make script - it offers a options to set the variable name.
*/
let enableAnalytics = window.ProductHuntAnalytics && ANALYTICS_KEY;
let ProductHuntAnalytics = enableAnalytics ? window.ProductHuntAnalytics : NullAnalytics;
let analytics = new ProductHuntAnalytics(process.env.ANALYTICS_KEY);
/**
* Export a new `Tracker`.
*/
module.exports = new Tracker(analytics);
| /**
* Dependencies.
*/
let NullAnalytics = require('./NullAnalytics');
let Tracker = require('./Tracker');
/**
* Constants.
*/
const ANALYTICS_KEY = process.env.ANALYTICS_KEY;
/**
* Locals.
+ *
+ * Note(andreasklinger): window.ProductHuntAnalytics gets set by a custom built of the analytics.js
+ * To recreate this use their make script - it offers a options to set the variable name.
*/
let enableAnalytics = window.ProductHuntAnalytics && ANALYTICS_KEY;
let ProductHuntAnalytics = enableAnalytics ? window.ProductHuntAnalytics : NullAnalytics;
let analytics = new ProductHuntAnalytics(process.env.ANALYTICS_KEY);
/**
* Export a new `Tracker`.
*/
module.exports = new Tracker(analytics); | 3 | 0.115385 | 3 | 0 |
56e55fa665bd2156f6f446e3179a0c3078da314c | circle.yml | circle.yml | machine:
java:
version: oraclejdk8
environment:
GRADLE_OPTS: '-Dorg.gradle.jvmargs="-Xmx3500m -XX:+HeapDumpOnOutOfMemoryError"'
dependencies:
pre:
- echo y | android update sdk --no-ui --all --filter "platform-tools,tools,android-24,extra-android-m2repository,extra-google-google_play_services,extra-google-m2repository,extra-android-support"
# Build tools should be installed after "tools", uh.
- echo y | android update sdk --no-ui --all --filter "build-tools-23.0.3"
# Generate gradle.properties with API keys
- source environmentSetup.sh && copyEnvVarsToGradleProperties
test:
override:
- sh ci.sh | machine:
java:
version: oraclejdk8
environment:
GRADLE_OPTS: '-Dorg.gradle.jvmargs="-Xmx3500m -XX:+HeapDumpOnOutOfMemoryError"'
dependencies:
pre:
- echo y | android update sdk --no-ui --all --filter "platform-tools,tools,android-24,android-25,extra-android-m2repository,extra-google-google_play_services,extra-google-m2repository,extra-android-support"
# Build tools should be installed after "tools", uh.
- echo y | android update sdk --no-ui --all --filter "build-tools-23.0.3"
# Generate gradle.properties with API keys
- source environmentSetup.sh && copyEnvVarsToGradleProperties
test:
override:
- sh ci.sh
| Add Android API 25 to Circle CI | Add Android API 25 to Circle CI
Fix the CI build. | YAML | mpl-2.0 | Plastix/Forage,Plastix/Forage | yaml | ## Code Before:
machine:
java:
version: oraclejdk8
environment:
GRADLE_OPTS: '-Dorg.gradle.jvmargs="-Xmx3500m -XX:+HeapDumpOnOutOfMemoryError"'
dependencies:
pre:
- echo y | android update sdk --no-ui --all --filter "platform-tools,tools,android-24,extra-android-m2repository,extra-google-google_play_services,extra-google-m2repository,extra-android-support"
# Build tools should be installed after "tools", uh.
- echo y | android update sdk --no-ui --all --filter "build-tools-23.0.3"
# Generate gradle.properties with API keys
- source environmentSetup.sh && copyEnvVarsToGradleProperties
test:
override:
- sh ci.sh
## Instruction:
Add Android API 25 to Circle CI
Fix the CI build.
## Code After:
machine:
java:
version: oraclejdk8
environment:
GRADLE_OPTS: '-Dorg.gradle.jvmargs="-Xmx3500m -XX:+HeapDumpOnOutOfMemoryError"'
dependencies:
pre:
- echo y | android update sdk --no-ui --all --filter "platform-tools,tools,android-24,android-25,extra-android-m2repository,extra-google-google_play_services,extra-google-m2repository,extra-android-support"
# Build tools should be installed after "tools", uh.
- echo y | android update sdk --no-ui --all --filter "build-tools-23.0.3"
# Generate gradle.properties with API keys
- source environmentSetup.sh && copyEnvVarsToGradleProperties
test:
override:
- sh ci.sh
| machine:
java:
version: oraclejdk8
environment:
GRADLE_OPTS: '-Dorg.gradle.jvmargs="-Xmx3500m -XX:+HeapDumpOnOutOfMemoryError"'
dependencies:
pre:
- - echo y | android update sdk --no-ui --all --filter "platform-tools,tools,android-24,extra-android-m2repository,extra-google-google_play_services,extra-google-m2repository,extra-android-support"
+ - echo y | android update sdk --no-ui --all --filter "platform-tools,tools,android-24,android-25,extra-android-m2repository,extra-google-google_play_services,extra-google-m2repository,extra-android-support"
? +++++++++++
# Build tools should be installed after "tools", uh.
- echo y | android update sdk --no-ui --all --filter "build-tools-23.0.3"
# Generate gradle.properties with API keys
- source environmentSetup.sh && copyEnvVarsToGradleProperties
test:
override:
- sh ci.sh | 2 | 0.117647 | 1 | 1 |
72d48d41403a2a04b71f1cfa8a3e226f52761d75 | plugins-compat-tester/src/main/java/org/jenkins/tools/test/hook/WorkflowCpsHook.java | plugins-compat-tester/src/main/java/org/jenkins/tools/test/hook/WorkflowCpsHook.java | package org.jenkins.tools.test.hook;
import hudson.model.UpdateSite;
import hudson.util.VersionNumber;
import java.util.Map;
public class WorkflowCpsHook extends AbstractMultiParentHook {
@Override
protected String getParentFolder() {
return "workflow-cps-plugin";
}
@Override
protected String getParentProjectName() {
return "workflow-cps-parent";
}
@Override
protected String getPluginFolderName(UpdateSite.Plugin currentPlugin){
return "plugin";
}
@Override
public boolean check(Map<String, Object> info) {
return isMultiModuleVersionOfWorkflowCps(info);
}
public static boolean isMultiModuleVersionOfWorkflowCps(Map<String, Object> info) {
UpdateSite.Plugin plugin = info.get("plugin") != null ? (UpdateSite.Plugin) info.get("plugin") : null;
if (plugin != null && plugin.name.equalsIgnoreCase("workflow-cps") && plugin.version != null) {
VersionNumber pluginVersion = new VersionNumber(plugin.version);
// 2803 was the final release before it became a multi-module project.
// The history of groovy-cps history was merged into the repo, so the first multi-module release will be a little over 3500.
VersionNumber multiModuleSince = new VersionNumber("3500");
return pluginVersion.isNewerThan(multiModuleSince);
}
return false;
}
}
| package org.jenkins.tools.test.hook;
import hudson.model.UpdateSite;
import hudson.util.VersionNumber;
import java.util.Map;
public class WorkflowCpsHook extends AbstractMultiParentHook {
@Override
protected String getParentFolder() {
return "workflow-cps-plugin";
}
@Override
protected String getParentProjectName() {
return "workflow-cps-parent";
}
@Override
protected String getPluginFolderName(UpdateSite.Plugin currentPlugin){
return "plugin";
}
@Override
public boolean check(Map<String, Object> info) {
return isMultiModuleVersionOfWorkflowCps(info);
}
public static boolean isMultiModuleVersionOfWorkflowCps(Map<String, Object> info) {
UpdateSite.Plugin plugin = info.get("plugin") != null ? (UpdateSite.Plugin) info.get("plugin") : null;
if (plugin != null && plugin.name.equals("workflow-cps") && plugin.version != null) {
VersionNumber pluginVersion = new VersionNumber(plugin.version);
// 2803 was the final release before it became a multi-module project.
// The history of groovy-cps history was merged into the repo, so the first multi-module release will be a little over 3500.
VersionNumber multiModuleSince = new VersionNumber("3500");
return pluginVersion.isNewerThan(multiModuleSince);
}
return false;
}
}
| Use equals instead of equalsIgnoreCase | Use equals instead of equalsIgnoreCase
Co-authored-by: Jesse Glick <f03f7b035106bba3da9b565be6f0115dc0d88f87@cloudbees.com> | Java | mit | jenkinsci/plugin-compat-tester,jenkinsci/plugin-compat-tester | java | ## Code Before:
package org.jenkins.tools.test.hook;
import hudson.model.UpdateSite;
import hudson.util.VersionNumber;
import java.util.Map;
public class WorkflowCpsHook extends AbstractMultiParentHook {
@Override
protected String getParentFolder() {
return "workflow-cps-plugin";
}
@Override
protected String getParentProjectName() {
return "workflow-cps-parent";
}
@Override
protected String getPluginFolderName(UpdateSite.Plugin currentPlugin){
return "plugin";
}
@Override
public boolean check(Map<String, Object> info) {
return isMultiModuleVersionOfWorkflowCps(info);
}
public static boolean isMultiModuleVersionOfWorkflowCps(Map<String, Object> info) {
UpdateSite.Plugin plugin = info.get("plugin") != null ? (UpdateSite.Plugin) info.get("plugin") : null;
if (plugin != null && plugin.name.equalsIgnoreCase("workflow-cps") && plugin.version != null) {
VersionNumber pluginVersion = new VersionNumber(plugin.version);
// 2803 was the final release before it became a multi-module project.
// The history of groovy-cps history was merged into the repo, so the first multi-module release will be a little over 3500.
VersionNumber multiModuleSince = new VersionNumber("3500");
return pluginVersion.isNewerThan(multiModuleSince);
}
return false;
}
}
## Instruction:
Use equals instead of equalsIgnoreCase
Co-authored-by: Jesse Glick <f03f7b035106bba3da9b565be6f0115dc0d88f87@cloudbees.com>
## Code After:
package org.jenkins.tools.test.hook;
import hudson.model.UpdateSite;
import hudson.util.VersionNumber;
import java.util.Map;
public class WorkflowCpsHook extends AbstractMultiParentHook {
@Override
protected String getParentFolder() {
return "workflow-cps-plugin";
}
@Override
protected String getParentProjectName() {
return "workflow-cps-parent";
}
@Override
protected String getPluginFolderName(UpdateSite.Plugin currentPlugin){
return "plugin";
}
@Override
public boolean check(Map<String, Object> info) {
return isMultiModuleVersionOfWorkflowCps(info);
}
public static boolean isMultiModuleVersionOfWorkflowCps(Map<String, Object> info) {
UpdateSite.Plugin plugin = info.get("plugin") != null ? (UpdateSite.Plugin) info.get("plugin") : null;
if (plugin != null && plugin.name.equals("workflow-cps") && plugin.version != null) {
VersionNumber pluginVersion = new VersionNumber(plugin.version);
// 2803 was the final release before it became a multi-module project.
// The history of groovy-cps history was merged into the repo, so the first multi-module release will be a little over 3500.
VersionNumber multiModuleSince = new VersionNumber("3500");
return pluginVersion.isNewerThan(multiModuleSince);
}
return false;
}
}
| package org.jenkins.tools.test.hook;
import hudson.model.UpdateSite;
import hudson.util.VersionNumber;
import java.util.Map;
public class WorkflowCpsHook extends AbstractMultiParentHook {
@Override
protected String getParentFolder() {
return "workflow-cps-plugin";
}
@Override
protected String getParentProjectName() {
return "workflow-cps-parent";
}
@Override
protected String getPluginFolderName(UpdateSite.Plugin currentPlugin){
return "plugin";
}
@Override
public boolean check(Map<String, Object> info) {
return isMultiModuleVersionOfWorkflowCps(info);
}
public static boolean isMultiModuleVersionOfWorkflowCps(Map<String, Object> info) {
UpdateSite.Plugin plugin = info.get("plugin") != null ? (UpdateSite.Plugin) info.get("plugin") : null;
- if (plugin != null && plugin.name.equalsIgnoreCase("workflow-cps") && plugin.version != null) {
? ----------
+ if (plugin != null && plugin.name.equals("workflow-cps") && plugin.version != null) {
VersionNumber pluginVersion = new VersionNumber(plugin.version);
// 2803 was the final release before it became a multi-module project.
// The history of groovy-cps history was merged into the repo, so the first multi-module release will be a little over 3500.
VersionNumber multiModuleSince = new VersionNumber("3500");
return pluginVersion.isNewerThan(multiModuleSince);
}
return false;
}
} | 2 | 0.05 | 1 | 1 |
b1b80a8b76f6dcae60d5a799e03c5269e2771e1c | lib/file.coffee | lib/file.coffee | if Meteor.isServer
storeType = 'GridFS'
if Meteor.settings?.public?.avatarStore?.type?
storeType = Meteor.settings.public.avatarStore.type
RocketStore = RocketFile[storeType]
if not RocketStore?
throw new Error "Invalid RocketStore type [#{storeType}]"
transformWrite = undefined
if Meteor.settings?.public?.avatarStore?.size?.height?
height = Meteor.settings.public.avatarStore.size.height
width = Meteor.settings.public.avatarStore.size.width
transformWrite = (file, readStream, writeStream) ->
RocketFile.gm(readStream, file.fileName).background('#ffffff').resize(width, height).gravity('Center').extent(width, height).stream('jpeg').pipe(writeStream)
path = "~/uploads"
if Meteor.settings?.public?.avatarStore?.path?
path = Meteor.settings.public.avatarStore.path
@RocketFileAvatarInstance = new RocketFile.GridFS
name: 'avatars'
absolutePath: path
transformWrite: transformWrite
HTTP.methods
'/avatar/:username':
'stream': true
'get': (data) ->
this.params.username
file = RocketFileAvatarInstance.getFileWithReadStream this.params.username
this.setContentType 'image/jpeg'
this.addHeader 'Content-Disposition', 'inline'
this.addHeader 'Content-Length', file.length
file.readStream.pipe this.createWriteStream()
return
| if Meteor.isServer
storeType = 'GridFS'
if Meteor.settings?.public?.avatarStore?.type?
storeType = Meteor.settings.public.avatarStore.type
RocketStore = RocketFile[storeType]
if not RocketStore?
throw new Error "Invalid RocketStore type [#{storeType}]"
console.log "Using #{storeType} for Avatar storage".green
transformWrite = undefined
if Meteor.settings?.public?.avatarStore?.size?.height?
height = Meteor.settings.public.avatarStore.size.height
width = Meteor.settings.public.avatarStore.size.width
transformWrite = (file, readStream, writeStream) ->
RocketFile.gm(readStream, file.fileName).background('#ffffff').resize(width, height).gravity('Center').extent(width, height).stream('jpeg').pipe(writeStream)
path = "~/uploads"
if Meteor.settings?.public?.avatarStore?.path?
path = Meteor.settings.public.avatarStore.path
@RocketFileAvatarInstance = new RocketStore
name: 'avatars'
absolutePath: path
transformWrite: transformWrite
HTTP.methods
'/avatar/:username':
'stream': true
'get': (data) ->
this.params.username
file = RocketFileAvatarInstance.getFileWithReadStream this.params.username
this.setContentType 'image/jpeg'
this.addHeader 'Content-Disposition', 'inline'
this.addHeader 'Content-Length', file.length
file.readStream.pipe this.createWriteStream()
return
| Fix store reference and log which store is using | Fix store reference and log which store is using
| CoffeeScript | mit | warcode/Rocket.Chat,mccambridge/Rocket.Chat,ZBoxApp/Rocket.Chat,katopz/Rocket.Chat,qnib/Rocket.Chat,MiHuevos/Rocket.Chat,mhurwi/Rocket.Chat,wolfika/Rocket.Chat,mrsimpson/Rocket.Chat,liuliming2008/Rocket.Chat,Ninotna/Rocket.Chat,parkmap/Rocket.Chat,Gudii/Rocket.Chat,abhishekshukla0302/trico,Kiran-Rao/Rocket.Chat,klatys/Rocket.Chat,acaronmd/Rocket.Chat,lonbaker/Rocket.Chat,acidsound/Rocket.Chat,inoxth/Rocket.Chat,andela-cnnadi/Rocket.Chat,berndsi/Rocket.Chat,madmanteam/Rocket.Chat,glnarayanan/Rocket.Chat,revspringjake/Rocket.Chat,timkinnane/Rocket.Chat,karlprieb/Rocket.Chat,ahmadassaf/Rocket.Chat,kkochubey1/Rocket.Chat,ndarilek/Rocket.Chat,jeanmatheussouto/Rocket.Chat,JamesHGreen/Rocket_API,JamesHGreen/Rocket.Chat,warcode/Rocket.Chat,acidsound/Rocket.Chat,JamesHGreen/Rocket.Chat,psadaic/Rocket.Chat,dmitrijs-balcers/Rocket.Chat,pachox/Rocket.Chat,HeapCity/Heap.City,slava-sh/Rocket.Chat,I-am-Gabi/Rocket.Chat,cnash/Rocket.Chat,acidicX/Rocket.Chat,tntobias/Rocket.Chat,hazio/Rocket.Chat,amaapp/ama,fonsich/Rocket.Chat,florinnichifiriuc/Rocket.Chat,linnovate/hi,bt/Rocket.Chat,williamfortunademoraes/Rocket.Chat,jyx140521/Rocket.Chat,freakynit/Rocket.Chat,haosdent/Rocket.Chat,AimenJoe/Rocket.Chat,inoio/Rocket.Chat,intelradoux/Rocket.Chat,steedos/chat,biomassives/Rocket.Chat,thunderrabbit/Rocket.Chat,klatys/Rocket.Chat,ndarilek/Rocket.Chat,pitamar/Rocket.Chat,ndarilek/Rocket.Chat,mohamedhagag/Rocket.Chat,berndsi/Rocket.Chat,dmitrijs-balcers/Rocket.Chat,HeapCity/Heap.City,haoyixin/Rocket.Chat,Dianoga/Rocket.Chat,bopjesvla/chatmafia,jessedhillon/Rocket.Chat,Kiran-Rao/Rocket.Chat,wangleihd/Rocket.Chat,webcoding/Rocket.Chat,pkgodara/Rocket.Chat,acidicX/Rocket.Chat,capensisma/Rocket.Chat,Ninotna/Rocket.Chat,sscpac/chat-locker,mrinaldhar/Rocket.Chat,k0nsl/Rocket.Chat,mitar/Rocket.Chat,gitaboard/Rocket.Chat,Abdelhamidhenni/Rocket.Chat,himeshp/Rocket.Chat,mitar/Rocket.Chat,erikmaarten/Rocket.Chat,lukaroski/traden,andela-cnnadi/Rocket.Chat,VoiSmart/Rocket.Chat,nat
hantreid/Rocket.Chat,ludiculous/Rocket.Chat,LearnersGuild/echo-chat,4thParty/Rocket.Chat,apnero/tactixteam,ziedmahdi/Rocket.Chat,mrinaldhar/Rocket.Chat,coreyaus/Rocket.Chat,pachox/Rocket.Chat,Codebrahma/Rocket.Chat,ludiculous/Rocket.Chat,mrinaldhar/Rocket.Chat,ealbers/Rocket.Chat,nishimaki10/Rocket.Chat,klatys/Rocket.Chat,anhld/Rocket.Chat,Kiran-Rao/Rocket.Chat,AlecTroemel/Rocket.Chat,fatihwk/Rocket.Chat,Jandersolutions/Rocket.Chat,alenodari/Rocket.Chat,liemqv/Rocket.Chat,greatdinosaur/Rocket.Chat,matthewshirley/Rocket.Chat,k0nsl/Rocket.Chat,revspringjake/Rocket.Chat,danielbressan/Rocket.Chat,Sing-Li/Rocket.Chat,thunderrabbit/Rocket.Chat,Gudii/Rocket.Chat,amaapp/ama,sunhaolin/Rocket.Chat,JamesHGreen/Rocket_API,igorstajic/Rocket.Chat,abhishekshukla0302/trico,nishimaki10/Rocket.Chat,umeshrs/rocket-chat-integration,thswave/Rocket.Chat,JisuPark/Rocket.Chat,wicked539/Rocket.Chat,andela-cnnadi/Rocket.Chat,flaviogrossi/Rocket.Chat,ludiculous/Rocket.Chat,pachox/Rocket.Chat,callmekatootie/Rocket.Chat,jbsavoy18/rocketchat-1,alexbrazier/Rocket.Chat,nrhubbar/Rocket.Chat,BHWD/noouchat,kkochubey1/Rocket.Chat,adamteece/Rocket.Chat,pitamar/Rocket.Chat,trt15-ssci-organization/Rocket.Chat,acidicX/Rocket.Chat,princesust/Rocket.Chat,xasx/Rocket.Chat,flaviogrossi/Rocket.Chat,OtkurBiz/Rocket.Chat,princesust/Rocket.Chat,nabiltntn/Rocket.Chat,xboston/Rocket.Chat,intelradoux/Rocket.Chat,OtkurBiz/Rocket.Chat,umeshrs/rocket-chat-integration,nishimaki10/Rocket.Chat,freakynit/Rocket.Chat,wicked539/Rocket.Chat,Maysora/Rocket.Chat,marzieh312/Rocket.Chat,jadeqwang/Rocket.Chat,adamteece/Rocket.Chat,mwharrison/Rocket.Chat,ZBoxApp/Rocket.Chat,jonathanhartman/Rocket.Chat,JamesHGreen/Rocket_API,PavelVanecek/Rocket.Chat,Achaikos/Rocket.Chat,fatihwk/Rocket.Chat,LearnersGuild/echo-chat,karlprieb/Rocket.Chat,OtkurBiz/Rocket.Chat,lihuanghai/Rocket.Chat,williamfortunademoraes/Rocket.Chat,AlecTroemel/Rocket.Chat,acaronmd/Rocket.Chat,mrinaldhar/Rocket.Chat,k0nsl/Rocket.Chat,ggazzo/Rocket.Chat,AlecTroemel/Rocke
t.Chat,igorstajic/Rocket.Chat,yuyixg/Rocket.Chat,KyawNaingTun/Rocket.Chat,princesust/Rocket.Chat,ziedmahdi/Rocket.Chat,xasx/Rocket.Chat,ahmadassaf/Rocket.Chat,timkinnane/Rocket.Chat,steedos/chat,uniteddiversity/Rocket.Chat,subesokun/Rocket.Chat,JamesHGreen/Rocket.Chat,mhurwi/Rocket.Chat,jyx140521/Rocket.Chat,christmo/Rocket.Chat,acaronmd/Rocket.Chat,ealbers/Rocket.Chat,galrotem1993/Rocket.Chat,wicked539/Rocket.Chat,j-ew-s/Rocket.Chat,atyenoria/Rocket.Chat,danielbressan/Rocket.Chat,Kiran-Rao/Rocket.Chat,biomassives/Rocket.Chat,arvi/Rocket.Chat,philosowaffle/rpi-Rocket.Chat,revspringjake/Rocket.Chat,Gudii/Rocket.Chat,Codebrahma/Rocket.Chat,wtsarchive/Rocket.Chat,mhurwi/Rocket.Chat,atyenoria/Rocket.Chat,pkgodara/Rocket.Chat,glnarayanan/Rocket.Chat,snaiperskaya96/Rocket.Chat,osxi/Rocket.Chat,himeshp/Rocket.Chat,Movile/Rocket.Chat,tntobias/Rocket.Chat,NMandapaty/Rocket.Chat,acaronmd/Rocket.Chat,jessedhillon/Rocket.Chat,jbsavoy18/rocketchat-1,AlecTroemel/Rocket.Chat,alenodari/Rocket.Chat,yuyixg/Rocket.Chat,Maysora/Rocket.Chat,ImpressiveSetOfIntelligentStudents/chat,LearnersGuild/Rocket.Chat,fduraibi/Rocket.Chat,ealbers/Rocket.Chat,hazio/Rocket.Chat,Sing-Li/Rocket.Chat,I-am-Gabi/Rocket.Chat,janmaghuyop/Rocket.Chat,Jandersoft/Rocket.Chat,alexbrazier/Rocket.Chat,Codebrahma/Rocket.Chat,ealbers/Rocket.Chat,wtsarchive/Rocket.Chat,ziedmahdi/Rocket.Chat,cnash/Rocket.Chat,philosowaffle/rpi-Rocket.Chat,lucasgolino/Rocket.Chat,xboston/Rocket.Chat,org100h1/Rocket.Panda,ziedmahdi/Rocket.Chat,ggazzo/Rocket.Chat,fduraibi/Rocket.Chat,subesokun/Rocket.Chat,greatdinosaur/Rocket.Chat,LearnersGuild/Rocket.Chat,ImpressiveSetOfIntelligentStudents/chat,AimenJoe/Rocket.Chat,jhou2/Rocket.Chat,Maysora/Rocket.Chat,cdwv/Rocket.Chat,BorntraegerMarc/Rocket.Chat,pachox/Rocket.Chat,tradetiger/Rocket.Chat,nrhubbar/Rocket.Chat,litewhatever/Rocket.Chat,jbsavoy18/rocketchat-1,jessedhillon/Rocket.Chat,sikofitt/Rocket.Chat,igorstajic/Rocket.Chat,callblueday/Rocket.Chat,bt/Rocket.Chat,Sing-Li/Rocket.Chat,danie
lbressan/Rocket.Chat,soonahn/Rocket.Chat,capensisma/Rocket.Chat,icaromh/Rocket.Chat,tzellman/Rocket.Chat,janmaghuyop/Rocket.Chat,ahmadassaf/Rocket.Chat,greatdinosaur/Rocket.Chat,williamfortunademoraes/Rocket.Chat,tlongren/Rocket.Chat,osxi/Rocket.Chat,liemqv/Rocket.Chat,jeann2013/Rocket.Chat,janmaghuyop/Rocket.Chat,atyenoria/Rocket.Chat,nathantreid/Rocket.Chat,NMandapaty/Rocket.Chat,sargentsurg/Rocket.Chat,subesokun/Rocket.Chat,pitamar/Rocket.Chat,jeanmatheussouto/Rocket.Chat,gitaboard/Rocket.Chat,nabiltntn/Rocket.Chat,Gyubin/Rocket.Chat,umeshrs/rocket-chat,galrotem1993/Rocket.Chat,jyx140521/Rocket.Chat,j-ew-s/Rocket.Chat,timkinnane/Rocket.Chat,lonbaker/Rocket.Chat,abduljanjua/TheHub,Jandersoft/Rocket.Chat,mhurwi/Rocket.Chat,karlprieb/Rocket.Chat,madmanteam/Rocket.Chat,wtsarchive/Rocket.Chat,florinnichifiriuc/Rocket.Chat,leohmoraes/Rocket.Chat,sscpac/chat-locker,abhishekshukla0302/trico,thebakeryio/Rocket.Chat,LearnersGuild/Rocket.Chat,haosdent/Rocket.Chat,dmitrijs-balcers/Rocket.Chat,ImpressiveSetOfIntelligentStudents/chat,gitaboard/Rocket.Chat,marzieh312/Rocket.Chat,ut7/Rocket.Chat,bt/Rocket.Chat,fatihwk/Rocket.Chat,Achaikos/Rocket.Chat,rasata/Rocket.Chat,sikofitt/Rocket.Chat,uniteddiversity/Rocket.Chat,ggazzo/Rocket.Chat,snaiperskaya96/Rocket.Chat,lonbaker/Rocket.Chat,mrsimpson/Rocket.Chat,wolfika/Rocket.Chat,LeonardOliveros/Rocket.Chat,xasx/Rocket.Chat,abduljanjua/TheHub,callblueday/Rocket.Chat,umeshrs/rocket-chat,katopz/Rocket.Chat,wolfika/Rocket.Chat,org100h1/Rocket.Panda,jeanmatheussouto/Rocket.Chat,tlongren/Rocket.Chat,pitamar/Rocket.Chat,tntobias/Rocket.Chat,cdwv/Rocket.Chat,4thParty/Rocket.Chat,abduljanjua/TheHub,sunhaolin/Rocket.Chat,jbsavoy18/rocketchat-1,freakynit/Rocket.Chat,abhishekshukla0302/trico,osxi/Rocket.Chat,kkochubey1/Rocket.Chat,warcode/Rocket.Chat,sscpac/chat-locker,MiHuevos/Rocket.Chat,PavelVanecek/Rocket.Chat,JisuPark/Rocket.Chat,mccambridge/Rocket.Chat,Gudii/Rocket.Chat,Dianoga/Rocket.Chat,ederribeiro/Rocket.Chat,icaromh/Rocket.Chat,Achaik
os/Rocket.Chat,KyawNaingTun/Rocket.Chat,cdwv/Rocket.Chat,Ninotna/Rocket.Chat,bopjesvla/chatmafia,uniteddiversity/Rocket.Chat,j-ew-s/Rocket.Chat,lukaroski/traden,Gyubin/Rocket.Chat,haoyixin/Rocket.Chat,florinnichifiriuc/Rocket.Chat,tlongren/Rocket.Chat,NMandapaty/Rocket.Chat,Sing-Li/Rocket.Chat,ZBoxApp/Rocket.Chat,leohmoraes/Rocket.Chat,himeshp/Rocket.Chat,marzieh312/Rocket.Chat,VoiSmart/Rocket.Chat,liemqv/Rocket.Chat,nishimaki10/Rocket.Chat,LearnersGuild/echo-chat,thebakeryio/Rocket.Chat,liuliming2008/Rocket.Chat,parkmap/Rocket.Chat,Gromby/Rocket.Chat,fduraibi/Rocket.Chat,Abdelhamidhenni/Rocket.Chat,AimenJoe/Rocket.Chat,snaiperskaya96/Rocket.Chat,sargentsurg/Rocket.Chat,litewhatever/Rocket.Chat,Jandersoft/Rocket.Chat,MiHuevos/Rocket.Chat,intelradoux/Rocket.Chat,galrotem1993/Rocket.Chat,LeonardOliveros/Rocket.Chat,apnero/tactixteam,lukaroski/traden,matthewshirley/Rocket.Chat,ut7/Rocket.Chat,acidsound/Rocket.Chat,ederribeiro/Rocket.Chat,KyawNaingTun/Rocket.Chat,nrhubbar/Rocket.Chat,trt15-ssci-organization/Rocket.Chat,phlkchan/Rocket.Chat,callmekatootie/Rocket.Chat,JisuPark/Rocket.Chat,wicked539/Rocket.Chat,parkmap/Rocket.Chat,4thParty/Rocket.Chat,fonsich/Rocket.Chat,philosowaffle/rpi-Rocket.Chat,lucasgolino/Rocket.Chat,slava-sh/Rocket.Chat,arvi/Rocket.Chat,erikmaarten/Rocket.Chat,yuyixg/Rocket.Chat,thswave/Rocket.Chat,Achaikos/Rocket.Chat,ggazzo/Rocket.Chat,icaromh/Rocket.Chat,qnib/Rocket.Chat,ederribeiro/Rocket.Chat,NMandapaty/Rocket.Chat,cnash/Rocket.Chat,thebakeryio/Rocket.Chat,alenodari/Rocket.Chat,nabiltntn/Rocket.Chat,snaiperskaya96/Rocket.Chat,bopjesvla/chatmafia,subesokun/Rocket.Chat,jonathanhartman/Rocket.Chat,liuliming2008/Rocket.Chat,trt15-ssci-organization/Rocket.Chat,leohmoraes/Rocket.Chat,litewhatever/Rocket.Chat,berndsi/Rocket.Chat,alexbrazier/Rocket.Chat,TribeMedia/Rocket.Chat,christmo/Rocket.Chat,haosdent/Rocket.Chat,mohamedhagag/Rocket.Chat,sargentsurg/Rocket.Chat,VoiSmart/Rocket.Chat,abduljanjua/TheHub,rasata/Rocket.Chat,AimenJoe/Rocket.Chat,Flitter
kill/Rocket.Chat,coreyaus/Rocket.Chat,LearnersGuild/echo-chat,hazio/Rocket.Chat,celloudiallo/Rocket.Chat,mrsimpson/Rocket.Chat,umeshrs/rocket-chat,4thParty/Rocket.Chat,amaapp/ama,timkinnane/Rocket.Chat,Movile/Rocket.Chat,soonahn/Rocket.Chat,yuyixg/Rocket.Chat,trt15-ssci-organization/Rocket.Chat,pkgodara/Rocket.Chat,inoxth/Rocket.Chat,christmo/Rocket.Chat,mwharrison/Rocket.Chat,k0nsl/Rocket.Chat,BHWD/noouchat,lihuanghai/Rocket.Chat,Jandersolutions/Rocket.Chat,mitar/Rocket.Chat,Flitterkill/Rocket.Chat,steedos/chat,amaapp/ama,PavelVanecek/Rocket.Chat,bt/Rocket.Chat,tzellman/Rocket.Chat,anhld/Rocket.Chat,linnovate/hi,xasx/Rocket.Chat,mwharrison/Rocket.Chat,karlprieb/Rocket.Chat,BHWD/noouchat,jhou2/Rocket.Chat,apnero/tactixteam,cdwv/Rocket.Chat,matthewshirley/Rocket.Chat,inoxth/Rocket.Chat,callmekatootie/Rocket.Chat,JamesHGreen/Rocket_API,I-am-Gabi/Rocket.Chat,wangleihd/Rocket.Chat,lukaroski/traden,haoyixin/Rocket.Chat,psadaic/Rocket.Chat,slava-sh/Rocket.Chat,xboston/Rocket.Chat,BorntraegerMarc/Rocket.Chat,Deepakkothandan/Rocket.Chat,Movile/Rocket.Chat,jonathanhartman/Rocket.Chat,Deepakkothandan/Rocket.Chat,TribeMedia/Rocket.Chat,mohamedhagag/Rocket.Chat,xboston/Rocket.Chat,jeann2013/Rocket.Chat,HeapCity/Heap.City,litewhatever/Rocket.Chat,tradetiger/Rocket.Chat,callblueday/Rocket.Chat,galrotem1993/Rocket.Chat,inoio/Rocket.Chat,capensisma/Rocket.Chat,Gyubin/Rocket.Chat,Dianoga/Rocket.Chat,Abdelhamidhenni/Rocket.Chat,klatys/Rocket.Chat,liuliming2008/Rocket.Chat,soonahn/Rocket.Chat,LeonardOliveros/Rocket.Chat,jadeqwang/Rocket.Chat,qnib/Rocket.Chat,tzellman/Rocket.Chat,qnib/Rocket.Chat,coreyaus/Rocket.Chat,lucasgolino/Rocket.Chat,celloudiallo/Rocket.Chat,Movile/Rocket.Chat,celloudiallo/Rocket.Chat,fduraibi/Rocket.Chat,lihuanghai/Rocket.Chat,Gyubin/Rocket.Chat,Deepakkothandan/Rocket.Chat,Dianoga/Rocket.Chat,org100h1/Rocket.Panda,mccambridge/Rocket.Chat,Deepakkothandan/Rocket.Chat,katopz/Rocket.Chat,steedos/chat,wtsarchive/Rocket.Chat,Gromby/Rocket.Chat,ndarilek/Rocket.Chat,Le
arnersGuild/Rocket.Chat,marzieh312/Rocket.Chat,tntobias/Rocket.Chat,matthewshirley/Rocket.Chat,jhou2/Rocket.Chat,flaviogrossi/Rocket.Chat,biomassives/Rocket.Chat,tradetiger/Rocket.Chat,Gromby/Rocket.Chat,ut7/Rocket.Chat,org100h1/Rocket.Panda,rasata/Rocket.Chat,jonathanhartman/Rocket.Chat,pkgodara/Rocket.Chat,mitar/Rocket.Chat,intelradoux/Rocket.Chat,JamesHGreen/Rocket.Chat,ImpressiveSetOfIntelligentStudents/chat,Flitterkill/Rocket.Chat,LeonardOliveros/Rocket.Chat,PavelVanecek/Rocket.Chat,psadaic/Rocket.Chat,webcoding/Rocket.Chat,OtkurBiz/Rocket.Chat,jadeqwang/Rocket.Chat,sikofitt/Rocket.Chat,Jandersolutions/Rocket.Chat,webcoding/Rocket.Chat,danielbressan/Rocket.Chat,soonahn/Rocket.Chat,sunhaolin/Rocket.Chat,nathantreid/Rocket.Chat,Flitterkill/Rocket.Chat,ut7/Rocket.Chat,haoyixin/Rocket.Chat,inoxth/Rocket.Chat,adamteece/Rocket.Chat,flaviogrossi/Rocket.Chat,inoio/Rocket.Chat,phlkchan/Rocket.Chat,mrsimpson/Rocket.Chat,igorstajic/Rocket.Chat,wangleihd/Rocket.Chat,jeann2013/Rocket.Chat,anhld/Rocket.Chat,tlongren/Rocket.Chat,erikmaarten/Rocket.Chat,fatihwk/Rocket.Chat,ahmadassaf/Rocket.Chat,BorntraegerMarc/Rocket.Chat,alexbrazier/Rocket.Chat,mwharrison/Rocket.Chat,mccambridge/Rocket.Chat,madmanteam/Rocket.Chat,TribeMedia/Rocket.Chat,thswave/Rocket.Chat,phlkchan/Rocket.Chat,cnash/Rocket.Chat,fonsich/Rocket.Chat,thunderrabbit/Rocket.Chat,TribeMedia/Rocket.Chat,glnarayanan/Rocket.Chat,umeshrs/rocket-chat-integration,BorntraegerMarc/Rocket.Chat,arvi/Rocket.Chat | coffeescript | ## Code Before:
if Meteor.isServer
storeType = 'GridFS'
if Meteor.settings?.public?.avatarStore?.type?
storeType = Meteor.settings.public.avatarStore.type
RocketStore = RocketFile[storeType]
if not RocketStore?
throw new Error "Invalid RocketStore type [#{storeType}]"
transformWrite = undefined
if Meteor.settings?.public?.avatarStore?.size?.height?
height = Meteor.settings.public.avatarStore.size.height
width = Meteor.settings.public.avatarStore.size.width
transformWrite = (file, readStream, writeStream) ->
RocketFile.gm(readStream, file.fileName).background('#ffffff').resize(width, height).gravity('Center').extent(width, height).stream('jpeg').pipe(writeStream)
path = "~/uploads"
if Meteor.settings?.public?.avatarStore?.path?
path = Meteor.settings.public.avatarStore.path
@RocketFileAvatarInstance = new RocketFile.GridFS
name: 'avatars'
absolutePath: path
transformWrite: transformWrite
HTTP.methods
'/avatar/:username':
'stream': true
'get': (data) ->
this.params.username
file = RocketFileAvatarInstance.getFileWithReadStream this.params.username
this.setContentType 'image/jpeg'
this.addHeader 'Content-Disposition', 'inline'
this.addHeader 'Content-Length', file.length
file.readStream.pipe this.createWriteStream()
return
## Instruction:
Fix store reference and log which store is being used
## Code After:
if Meteor.isServer
storeType = 'GridFS'
if Meteor.settings?.public?.avatarStore?.type?
storeType = Meteor.settings.public.avatarStore.type
RocketStore = RocketFile[storeType]
if not RocketStore?
throw new Error "Invalid RocketStore type [#{storeType}]"
console.log "Using #{storeType} for Avatar storage".green
transformWrite = undefined
if Meteor.settings?.public?.avatarStore?.size?.height?
height = Meteor.settings.public.avatarStore.size.height
width = Meteor.settings.public.avatarStore.size.width
transformWrite = (file, readStream, writeStream) ->
RocketFile.gm(readStream, file.fileName).background('#ffffff').resize(width, height).gravity('Center').extent(width, height).stream('jpeg').pipe(writeStream)
path = "~/uploads"
if Meteor.settings?.public?.avatarStore?.path?
path = Meteor.settings.public.avatarStore.path
@RocketFileAvatarInstance = new RocketStore
name: 'avatars'
absolutePath: path
transformWrite: transformWrite
HTTP.methods
'/avatar/:username':
'stream': true
'get': (data) ->
this.params.username
file = RocketFileAvatarInstance.getFileWithReadStream this.params.username
this.setContentType 'image/jpeg'
this.addHeader 'Content-Disposition', 'inline'
this.addHeader 'Content-Length', file.length
file.readStream.pipe this.createWriteStream()
return
| if Meteor.isServer
storeType = 'GridFS'
if Meteor.settings?.public?.avatarStore?.type?
storeType = Meteor.settings.public.avatarStore.type
RocketStore = RocketFile[storeType]
if not RocketStore?
throw new Error "Invalid RocketStore type [#{storeType}]"
+
+ console.log "Using #{storeType} for Avatar storage".green
transformWrite = undefined
if Meteor.settings?.public?.avatarStore?.size?.height?
height = Meteor.settings.public.avatarStore.size.height
width = Meteor.settings.public.avatarStore.size.width
transformWrite = (file, readStream, writeStream) ->
RocketFile.gm(readStream, file.fileName).background('#ffffff').resize(width, height).gravity('Center').extent(width, height).stream('jpeg').pipe(writeStream)
path = "~/uploads"
if Meteor.settings?.public?.avatarStore?.path?
path = Meteor.settings.public.avatarStore.path
- @RocketFileAvatarInstance = new RocketFile.GridFS
? ^^^ -------
+ @RocketFileAvatarInstance = new RocketStore
? ^^^^
name: 'avatars'
absolutePath: path
transformWrite: transformWrite
HTTP.methods
'/avatar/:username':
'stream': true
'get': (data) ->
this.params.username
file = RocketFileAvatarInstance.getFileWithReadStream this.params.username
this.setContentType 'image/jpeg'
this.addHeader 'Content-Disposition', 'inline'
this.addHeader 'Content-Length', file.length
file.readStream.pipe this.createWriteStream()
return
| 4 | 0.095238 | 3 | 1 |
a2e957317f7a4f57ef5b1d692f45ab842832f5d7 | cmake/Depthai/DepthaiBootloaderConfig.cmake | cmake/Depthai/DepthaiBootloaderConfig.cmake | set(DEPTHAI_BOOTLOADER_MATURITY "release")
#set(DEPTHAI_BOOTLOADER_MATURITY "snapshot")
# "version if applicable"
set(DEPTHAI_BOOTLOADER_VERSION "0.0.17")
#set(DEPTHAI_BOOTLOADER_VERSION "870bec4ca0ef026ddb9c490a38dfe9d097614554")
| set(DEPTHAI_BOOTLOADER_MATURITY "snapshot")
# "version if applicable"
#set(DEPTHAI_BOOTLOADER_VERSION "0.0.15")
set(DEPTHAI_BOOTLOADER_VERSION "bb9e4c86ce3c22da0aeb92dc18c8b28458bed4af")
| Fix CI build (not matching bootloader-shared commits) | Fix CI build (not matching bootloader-shared commits)
| CMake | mit | luxonis/depthai-core,luxonis/depthai-core,luxonis/depthai-core | cmake | ## Code Before:
set(DEPTHAI_BOOTLOADER_MATURITY "release")
#set(DEPTHAI_BOOTLOADER_MATURITY "snapshot")
# "version if applicable"
set(DEPTHAI_BOOTLOADER_VERSION "0.0.17")
#set(DEPTHAI_BOOTLOADER_VERSION "870bec4ca0ef026ddb9c490a38dfe9d097614554")
## Instruction:
Fix CI build (not matching bootloader-shared commits)
## Code After:
set(DEPTHAI_BOOTLOADER_MATURITY "snapshot")
# "version if applicable"
#set(DEPTHAI_BOOTLOADER_VERSION "0.0.15")
set(DEPTHAI_BOOTLOADER_VERSION "bb9e4c86ce3c22da0aeb92dc18c8b28458bed4af")
| - set(DEPTHAI_BOOTLOADER_MATURITY "release")
- #set(DEPTHAI_BOOTLOADER_MATURITY "snapshot")
? -
+ set(DEPTHAI_BOOTLOADER_MATURITY "snapshot")
# "version if applicable"
- set(DEPTHAI_BOOTLOADER_VERSION "0.0.17")
? ^
+ #set(DEPTHAI_BOOTLOADER_VERSION "0.0.15")
? + ^
- #set(DEPTHAI_BOOTLOADER_VERSION "870bec4ca0ef026ddb9c490a38dfe9d097614554")
+ set(DEPTHAI_BOOTLOADER_VERSION "bb9e4c86ce3c22da0aeb92dc18c8b28458bed4af") | 7 | 1.166667 | 3 | 4 |
3621e80a9606799384166cd7431b6f9e298b5271 | lib/mortar/templates/script/runpython.sh | lib/mortar/templates/script/runpython.sh |
set -e
# Setup python environment
source <%= @local_install_dir %>/pythonenv/bin/activate
# Run Python
<%= @local_install_dir %>/pythonenv/bin/python \
<%= @python_arugments %> \
<%= @python_script %> \
<%= @script_arguments %>
|
set -e
# Setup python environment
source <%= @local_install_dir %>/pythonenv/bin/activate
export LUIGI_CONFIG_PATH=`pwd`/`dirname <%= @python_script %>`/client.cfg
# Run Python
<%= @local_install_dir %>/pythonenv/bin/python \
<%= @python_arugments %> \
<%= @python_script %> \
<%= @script_arguments %>
| Use LUIGI_CONFIG_PATH to look for client.cfg in the directory of the script. | Use LUIGI_CONFIG_PATH to look for client.cfg in the directory of the script. | Shell | apache-2.0 | AsherBond/mortar,AsherBond/mortar,AsherBond/mortar,AsherBond/mortar | shell | ## Code Before:
set -e
# Setup python environment
source <%= @local_install_dir %>/pythonenv/bin/activate
# Run Python
<%= @local_install_dir %>/pythonenv/bin/python \
<%= @python_arugments %> \
<%= @python_script %> \
<%= @script_arguments %>
## Instruction:
Use LUIGI_CONFIG_PATH to look for client.cfg in the directory of the script.
## Code After:
set -e
# Setup python environment
source <%= @local_install_dir %>/pythonenv/bin/activate
export LUIGI_CONFIG_PATH=`pwd`/`dirname <%= @python_script %>`/client.cfg
# Run Python
<%= @local_install_dir %>/pythonenv/bin/python \
<%= @python_arugments %> \
<%= @python_script %> \
<%= @script_arguments %>
|
set -e
# Setup python environment
source <%= @local_install_dir %>/pythonenv/bin/activate
+ export LUIGI_CONFIG_PATH=`pwd`/`dirname <%= @python_script %>`/client.cfg
+
# Run Python
<%= @local_install_dir %>/pythonenv/bin/python \
<%= @python_arugments %> \
<%= @python_script %> \
<%= @script_arguments %> | 2 | 0.181818 | 2 | 0 |
fa1209db9af0166901f278452dc9b8c1d1d1fa5c | lib/autoload/hooks/responder/etag_304_revalidate.js | lib/autoload/hooks/responder/etag_304_revalidate.js | // Add Etag header to each http and rpc response
//
'use strict';
const crypto = require('crypto');
module.exports = function (N) {
N.wire.after([ 'responder:http', 'responder:rpc' ], { priority: 95 }, function etag_304_revalidate(env) {
if (env.status !== 200) return;
if (!env.body || typeof env.body !== 'string') return;
if (env.headers['ETag'] || env.headers['Cache-Control']) return;
let etag = '"' + crypto.createHash('sha1').update(env.body).digest('base64').substring(0, 27) + '"';
env.headers['ETag'] = etag;
env.headers['Cache-Control'] = 'must-revalidate';
if (etag === env.origin.req.headers['if-none-match']) {
env.status = N.io.NOT_MODIFIED;
env.body = null;
}
});
};
| // Add Etags, 304 responses, and force revalidate for each request
//
// - Should help with nasty cases, when quick page open use old
// assets and show errors until Ctrl+F5
// - Still good enough for user, because 304 responses supported
//
'use strict';
const crypto = require('crypto');
module.exports = function (N) {
N.wire.after([ 'responder:http', 'responder:rpc' ], { priority: 95 }, function etag_304_revalidate(env) {
// Quick check if we can intrude
if (env.status !== 200) return;
if (!env.body) return;
if (typeof env.body !== 'string' && !Buffer.isBuffer(env.body)) return;
if (env.headers['ETag'] || env.headers['Cache-Control']) return;
// Fill Etag/Cache-Control headers
let etag = '"' + crypto.createHash('sha1').update(env.body).digest('base64').substring(0, 27) + '"';
env.headers['ETag'] = etag;
env.headers['Cache-Control'] = 'max-age=0, must-revalidate';
    // Replace response status if possible
if (etag === env.origin.req.headers['if-none-match']) {
env.status = N.io.NOT_MODIFIED;
env.body = null;
}
});
};
| Improve cache headers and add comments | Improve cache headers and add comments
| JavaScript | mit | nodeca/nodeca.core,nodeca/nodeca.core | javascript | ## Code Before:
// Add Etag header to each http and rpc response
//
'use strict';
const crypto = require('crypto');
module.exports = function (N) {
N.wire.after([ 'responder:http', 'responder:rpc' ], { priority: 95 }, function etag_304_revalidate(env) {
if (env.status !== 200) return;
if (!env.body || typeof env.body !== 'string') return;
if (env.headers['ETag'] || env.headers['Cache-Control']) return;
let etag = '"' + crypto.createHash('sha1').update(env.body).digest('base64').substring(0, 27) + '"';
env.headers['ETag'] = etag;
env.headers['Cache-Control'] = 'must-revalidate';
if (etag === env.origin.req.headers['if-none-match']) {
env.status = N.io.NOT_MODIFIED;
env.body = null;
}
});
};
## Instruction:
Improve cache headers and add comments
## Code After:
// Add Etags, 304 responses, and force revalidate for each request
//
// - Should help with nasty cases, when quick page open use old
// assets and show errors until Ctrl+F5
// - Still good enough for user, because 304 responses supported
//
'use strict';
const crypto = require('crypto');
module.exports = function (N) {
N.wire.after([ 'responder:http', 'responder:rpc' ], { priority: 95 }, function etag_304_revalidate(env) {
// Quick check if we can intrude
if (env.status !== 200) return;
if (!env.body) return;
if (typeof env.body !== 'string' && !Buffer.isBuffer(env.body)) return;
if (env.headers['ETag'] || env.headers['Cache-Control']) return;
// Fill Etag/Cache-Control headers
let etag = '"' + crypto.createHash('sha1').update(env.body).digest('base64').substring(0, 27) + '"';
env.headers['ETag'] = etag;
env.headers['Cache-Control'] = 'max-age=0, must-revalidate';
    // Replace response status if possible
if (etag === env.origin.req.headers['if-none-match']) {
env.status = N.io.NOT_MODIFIED;
env.body = null;
}
});
};
| - // Add Etag header to each http and rpc response
+ // Add Etags, 304 responses, and force revalidate for each request
//
-
+ // - Should help with nasty cases, when quick page open use old
+ // assets and show errors until Ctrl+F5
+ // - Still good enough for user, because 304 responses supported
+ //
'use strict';
const crypto = require('crypto');
module.exports = function (N) {
N.wire.after([ 'responder:http', 'responder:rpc' ], { priority: 95 }, function etag_304_revalidate(env) {
+
+ // Quick check if we can intrude
if (env.status !== 200) return;
- if (!env.body || typeof env.body !== 'string') return;
+ if (!env.body) return;
+ if (typeof env.body !== 'string' && !Buffer.isBuffer(env.body)) return;
if (env.headers['ETag'] || env.headers['Cache-Control']) return;
+ // Fill Etag/Cache-Control headers
let etag = '"' + crypto.createHash('sha1').update(env.body).digest('base64').substring(0, 27) + '"';
env.headers['ETag'] = etag;
- env.headers['Cache-Control'] = 'must-revalidate';
+ env.headers['Cache-Control'] = 'max-age=0, must-revalidate';
? +++++++++++
+     // Replace response status if possible
if (etag === env.origin.req.headers['if-none-match']) {
env.status = N.io.NOT_MODIFIED;
env.body = null;
}
});
}; | 16 | 0.64 | 12 | 4 |
d05adc89faba2708c1dfb13af61e0449e2cc22f3 | tools/build_adafruit_bins.sh | tools/build_adafruit_bins.sh | rm -rf atmel-samd/build*
rm -rf esp8266/build*
ATMEL_BOARDS="cplay_m0_flash feather_m0_basic feather_m0_adalogger feather_m0_flash metro_m0_flash trinket_m0 gemma_m0"
for board in $ATMEL_BOARDS; do
make -C atmel-samd BOARD=$board
done
make -C esp8266 BOARD=feather_huzzah
version=`git describe --tags --exact-match`
if [ $? -ne 0 ]; then
version=`git rev-parse --short HEAD`
fi
for board in $ATMEL_BOARDS; do
cp atmel-samd/build-$board/firmware.bin bin/adafruit-micropython-$board-$version.bin
done
cp esp8266/build/firmware-combined.bin bin/adafruit-micropython-feather_huzzah-$version.bin
| rm -rf atmel-samd/build*
rm -rf esp8266/build*
ATMEL_BOARDS="arduino_zero cplay_m0_flash feather_m0_basic feather_m0_adalogger feather_m0_flash metro_m0_flash trinket_m0 gemma_m0"
for board in $ATMEL_BOARDS; do
make -C atmel-samd BOARD=$board
done
make -C esp8266 BOARD=feather_huzzah
version=`git describe --tags --exact-match`
if [ $? -ne 0 ]; then
version=`git rev-parse --short HEAD`
fi
for board in $ATMEL_BOARDS; do
cp atmel-samd/build-$board/firmware.bin bin/adafruit-circuitpython-$board-$version.bin
done
cp esp8266/build/firmware-combined.bin bin/adafruit-circuitpython-feather_huzzah-$version.bin
| Add arduino zero to default builder and change name to circuitpython. | Add arduino zero to default builder and change name to circuitpython.
| Shell | mit | adafruit/circuitpython,adafruit/micropython,adafruit/micropython,adafruit/circuitpython,adafruit/circuitpython,adafruit/micropython,adafruit/micropython,adafruit/micropython,adafruit/circuitpython,adafruit/circuitpython,adafruit/circuitpython | shell | ## Code Before:
rm -rf atmel-samd/build*
rm -rf esp8266/build*
ATMEL_BOARDS="cplay_m0_flash feather_m0_basic feather_m0_adalogger feather_m0_flash metro_m0_flash trinket_m0 gemma_m0"
for board in $ATMEL_BOARDS; do
make -C atmel-samd BOARD=$board
done
make -C esp8266 BOARD=feather_huzzah
version=`git describe --tags --exact-match`
if [ $? -ne 0 ]; then
version=`git rev-parse --short HEAD`
fi
for board in $ATMEL_BOARDS; do
cp atmel-samd/build-$board/firmware.bin bin/adafruit-micropython-$board-$version.bin
done
cp esp8266/build/firmware-combined.bin bin/adafruit-micropython-feather_huzzah-$version.bin
## Instruction:
Add arduino zero to default builder and change name to circuitpython.
## Code After:
rm -rf atmel-samd/build*
rm -rf esp8266/build*
ATMEL_BOARDS="arduino_zero cplay_m0_flash feather_m0_basic feather_m0_adalogger feather_m0_flash metro_m0_flash trinket_m0 gemma_m0"
for board in $ATMEL_BOARDS; do
make -C atmel-samd BOARD=$board
done
make -C esp8266 BOARD=feather_huzzah
version=`git describe --tags --exact-match`
if [ $? -ne 0 ]; then
version=`git rev-parse --short HEAD`
fi
for board in $ATMEL_BOARDS; do
cp atmel-samd/build-$board/firmware.bin bin/adafruit-circuitpython-$board-$version.bin
done
cp esp8266/build/firmware-combined.bin bin/adafruit-circuitpython-feather_huzzah-$version.bin
| rm -rf atmel-samd/build*
rm -rf esp8266/build*
- ATMEL_BOARDS="cplay_m0_flash feather_m0_basic feather_m0_adalogger feather_m0_flash metro_m0_flash trinket_m0 gemma_m0"
+ ATMEL_BOARDS="arduino_zero cplay_m0_flash feather_m0_basic feather_m0_adalogger feather_m0_flash metro_m0_flash trinket_m0 gemma_m0"
? +++++++++++++
for board in $ATMEL_BOARDS; do
make -C atmel-samd BOARD=$board
done
make -C esp8266 BOARD=feather_huzzah
version=`git describe --tags --exact-match`
if [ $? -ne 0 ]; then
version=`git rev-parse --short HEAD`
fi
for board in $ATMEL_BOARDS; do
- cp atmel-samd/build-$board/firmware.bin bin/adafruit-micropython-$board-$version.bin
? ^ ^^
+ cp atmel-samd/build-$board/firmware.bin bin/adafruit-circuitpython-$board-$version.bin
? ^ + ^^^
done
- cp esp8266/build/firmware-combined.bin bin/adafruit-micropython-feather_huzzah-$version.bin
? ^ ^^
+ cp esp8266/build/firmware-combined.bin bin/adafruit-circuitpython-feather_huzzah-$version.bin
? ^ + ^^^
| 6 | 0.315789 | 3 | 3 |
f9b46bcd43560a0662256a4bccd4c658f002aee5 | data/schedule.yml | data/schedule.yml | -
time : 9:00am
title: Registration and coffee
minor: true
-
time : 9:30am
title: Opening Remarks
minor: true
-
time : 9:45am
title : Investment Too is Spending: No Escaping the Trophics of Money
speaker: Brian Czech
-
time : 10:30am
title : TBD
speaker: Sonia Kowal
-
time : 11:15am
title: Break
minor: true
-
time : 11:30am
title: Breakout Sessions
-
time : 12:30pm
title: Lunch
minor: true
-
time : 1:45pm
title : Sustainability Policy: Beyond Economic Drivers
speaker: John Gowdy
-
time : 2:30pm
title : TBD
speaker: Marjorie Kelly
-
time : 3:15pm
title: Break
minor: true
-
time : 3:30pm
title: Panel discussion
-
time : 4:45pm
title: Closing remarks
minor: true
| -
time : 9:00am
title: Registration and coffee
minor: true
-
time : 9:30am
title: Opening Remarks
minor: true
-
time : 9:45am
title : Investment Too is Spending: No Escaping the Trophics of Money
speaker: Brian Czech
-
time : 10:30am
title : Socially Responsible Investing and Shareholder Advocacy
speaker: Sonia Kowal
-
time : 11:15am
title: Break
minor: true
-
time : 11:30am
title: Breakout Sessions
-
time : 12:30pm
title: Lunch
minor: true
-
time : 1:45pm
title : Sustainability Policy: Beyond Economic Drivers
speaker: John Gowdy
-
time : 2:30pm
title : Owning Our Future: The Emerging Ownership Revolution
speaker: Marjorie Kelly
-
time : 3:15pm
title: Break
minor: true
-
time : 3:30pm
title: Panel discussion
-
time : 4:45pm
title: Closing remarks
minor: true
| Add talk titles for Sonia and Marjorie | Add talk titles for Sonia and Marjorie
| YAML | mit | ericf/enviroecon.org,ericf/enviroecon.org | yaml | ## Code Before:
-
time : 9:00am
title: Registration and coffee
minor: true
-
time : 9:30am
title: Opening Remarks
minor: true
-
time : 9:45am
title : Investment Too is Spending: No Escaping the Trophics of Money
speaker: Brian Czech
-
time : 10:30am
title : TBD
speaker: Sonia Kowal
-
time : 11:15am
title: Break
minor: true
-
time : 11:30am
title: Breakout Sessions
-
time : 12:30pm
title: Lunch
minor: true
-
time : 1:45pm
title : Sustainability Policy: Beyond Economic Drivers
speaker: John Gowdy
-
time : 2:30pm
title : TBD
speaker: Marjorie Kelly
-
time : 3:15pm
title: Break
minor: true
-
time : 3:30pm
title: Panel discussion
-
time : 4:45pm
title: Closing remarks
minor: true
## Instruction:
Add talk titles for Sonia and Marjorie
## Code After:
-
time : 9:00am
title: Registration and coffee
minor: true
-
time : 9:30am
title: Opening Remarks
minor: true
-
time : 9:45am
title : Investment Too is Spending: No Escaping the Trophics of Money
speaker: Brian Czech
-
time : 10:30am
title : Socially Responsible Investing and Shareholder Advocacy
speaker: Sonia Kowal
-
time : 11:15am
title: Break
minor: true
-
time : 11:30am
title: Breakout Sessions
-
time : 12:30pm
title: Lunch
minor: true
-
time : 1:45pm
title : Sustainability Policy: Beyond Economic Drivers
speaker: John Gowdy
-
time : 2:30pm
title : Owning Our Future: The Emerging Ownership Revolution
speaker: Marjorie Kelly
-
time : 3:15pm
title: Break
minor: true
-
time : 3:30pm
title: Panel discussion
-
time : 4:45pm
title: Closing remarks
minor: true
| -
time : 9:00am
title: Registration and coffee
minor: true
-
time : 9:30am
title: Opening Remarks
minor: true
-
time : 9:45am
title : Investment Too is Spending: No Escaping the Trophics of Money
speaker: Brian Czech
-
time : 10:30am
- title : TBD
+ title : Socially Responsible Investing and Shareholder Advocacy
speaker: Sonia Kowal
-
time : 11:15am
title: Break
minor: true
-
time : 11:30am
title: Breakout Sessions
-
time : 12:30pm
title: Lunch
minor: true
-
time : 1:45pm
title : Sustainability Policy: Beyond Economic Drivers
speaker: John Gowdy
-
time : 2:30pm
- title : TBD
+ title : Owning Our Future: The Emerging Ownership Revolution
speaker: Marjorie Kelly
-
time : 3:15pm
title: Break
minor: true
-
time : 3:30pm
title: Panel discussion
-
time : 4:45pm
title: Closing remarks
minor: true | 4 | 0.086957 | 2 | 2 |
be99c7856693ce21fa3dd3a04837b673a0390ba6 | packages/zarith-freestanding/zarith-freestanding.1.7/files/mirage-build.sh | packages/zarith-freestanding/zarith-freestanding.1.7/files/mirage-build.sh |
PREFIX=`opam config var prefix`
PKG_CONFIG_PATH="$PREFIX/lib/pkgconfig"
export PKG_CONFIG_PATH
# WARNING: if you pass invalid cflags here, zarith will silently
# fall back to compiling with the default flags instead!
CFLAGS="$(pkg-config --cflags gmp-freestanding ocaml-freestanding)" \
LDFLAGS="$(pkg-config --libs gmp-freestanding)" \
./configure -gmp
if [ `uname -s` = "FreeBSD" ]; then
gmake
else
make
fi
|
PREFIX=`opam config var prefix`
PKG_CONFIG_PATH="$PREFIX/lib/pkgconfig"
export PKG_CONFIG_PATH
# WARNING: if you pass invalid cflags here, zarith will silently
# fall back to compiling with the default flags instead!
CFLAGS="$(pkg-config --cflags gmp-freestanding ocaml-freestanding)" \
LDFLAGS="$(pkg-config --libs gmp-freestanding)" \
./configure -gmp
if [ `uname -s` = "FreeBSD" ] || [ `uname -s` = "OpenBSD" ]; then
gmake
else
make
fi
| Add OpenBSD support to zarith-freestanding.1.7 | Add OpenBSD support to zarith-freestanding.1.7
| Shell | cc0-1.0 | fpottier/opam-repository,samoht/opam-repository,orbitz/opam-repository,Chris00/opam-repository,lefessan/opam-repository,camlspotter/opam-repository,jonludlam/opam-repository,djs55/opam-repository,hhugo/opam-repository,toots/opam-repository,abate/opam-repository,c-cube/opam-repository,pqwy/opam-repository,talex5/opam-repository,emillon/opam-repository,kkirstein/opam-repository,pqwy/opam-repository,dbuenzli/opam-repository,mjambon/opam-repository,mjambon/opam-repository,nberth/opam-repository,def-lkb/opam-repository,yallop/opam-repository,codinuum/opam-repository,c-cube/opam-repository,djs55/opam-repository,Chris00/opam-repository,dra27/opam-repository,AltGr/opam-repository,jhwoodyatt/opam-repository,camlspotter/opam-repository,ocaml/opam-repository,nberth/opam-repository,yallop/opam-repository,avsm/opam-repository,fpottier/opam-repository,hannesm/opam-repository,cakeplus/opam-repository,andersfugmann/opam-repository,talex5/opam-repository,ocaml/opam-repository,hannesm/opam-repository,codinuum/opam-repository,lefessan/opam-repository,emillon/opam-repository,seliopou/opam-repository,kkirstein/opam-repository,avsm/opam-repository,jhwoodyatt/opam-repository,astrada/opam-repository,dra27/opam-repository,planar/opam-repository,seliopou/opam-repository,gasche/opam-repository,andersfugmann/opam-repository,Leonidas-from-XIV/opam-repository,dbuenzli/opam-repository,pirbo/opam-repository,AltGr/opam-repository,Leonidas-from-XIV/opam-repository,hakuch/opam-repository,toots/opam-repository,arlencox/opam-repository,astrada/opam-repository,jonludlam/opam-repository,pirbo/opam-repository | shell | ## Code Before:
PREFIX=`opam config var prefix`
PKG_CONFIG_PATH="$PREFIX/lib/pkgconfig"
export PKG_CONFIG_PATH
# WARNING: if you pass invalid cflags here, zarith will silently
# fall back to compiling with the default flags instead!
CFLAGS="$(pkg-config --cflags gmp-freestanding ocaml-freestanding)" \
LDFLAGS="$(pkg-config --libs gmp-freestanding)" \
./configure -gmp
if [ `uname -s` = "FreeBSD" ]; then
gmake
else
make
fi
## Instruction:
Add OpenBSD support to zarith-freestanding.1.7
## Code After:
PREFIX=`opam config var prefix`
PKG_CONFIG_PATH="$PREFIX/lib/pkgconfig"
export PKG_CONFIG_PATH
# WARNING: if you pass invalid cflags here, zarith will silently
# fall back to compiling with the default flags instead!
CFLAGS="$(pkg-config --cflags gmp-freestanding ocaml-freestanding)" \
LDFLAGS="$(pkg-config --libs gmp-freestanding)" \
./configure -gmp
if [ `uname -s` = "FreeBSD" ] || [ `uname -s` = "OpenBSD" ]; then
gmake
else
make
fi
|
PREFIX=`opam config var prefix`
PKG_CONFIG_PATH="$PREFIX/lib/pkgconfig"
export PKG_CONFIG_PATH
# WARNING: if you pass invalid cflags here, zarith will silently
# fall back to compiling with the default flags instead!
CFLAGS="$(pkg-config --cflags gmp-freestanding ocaml-freestanding)" \
LDFLAGS="$(pkg-config --libs gmp-freestanding)" \
./configure -gmp
- if [ `uname -s` = "FreeBSD" ]; then
+ if [ `uname -s` = "FreeBSD" ] || [ `uname -s` = "OpenBSD" ]; then
gmake
else
make
fi | 2 | 0.125 | 1 | 1 |
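The one-line change in this record is a classic OS dispatch: on the BSDs, GNU make is packaged as `gmake`, while plain `make` is BSD make. A hypothetical Python rendering of the same dispatch (the function name is invented for illustration):

```python
def make_command(system):
    # `uname -s` reports e.g. "Linux", "FreeBSD" or "OpenBSD"; on the
    # BSDs, GNU make is installed as `gmake`, so the build script must
    # not fall back to the BSD `make`.
    return "gmake" if system in ("FreeBSD", "OpenBSD") else "make"
```

With the commit applied, OpenBSD takes the `gmake` branch just as FreeBSD already did.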
77a7314aeff2df95ceaf2f2f7a3999cb06d96c55 | packages/li/list-t-libcurl.yaml | packages/li/list-t-libcurl.yaml | homepage: https://github.com/nikita-volkov/list-t-libcurl
changelog-type: ''
hash: 17ceacbd70abbedf7d9eeeb09f4c2e31e51aa3169cc63994ab55ab727547359d
test-bench-deps: {}
maintainer: Nikita Volkov <nikita.y.volkov@mail.ru>
synopsis: A "libcurl"-based streaming HTTP client
changelog: ''
basic-deps:
either: ==4.*
bytestring: ! '>=0.10 && <0.11'
base-prelude: ! '>=0.1.19 && <0.2'
stm: ==2.*
base: <5
list-t: ! '>=0.4 && <0.5'
curlhs: ==0.1.*
resource-pool: ==0.2.*
mtl-prelude: ! '>=1 && <3'
all-versions:
- '0.2.0.0'
- '0.2.0.1'
- '0.2.0.2'
author: Nikita Volkov <nikita.y.volkov@mail.ru>
latest: '0.2.0.2'
description-type: haddock
description: ''
license-name: MIT
| homepage: https://github.com/nikita-volkov/list-t-libcurl
changelog-type: ''
hash: 1a92883b998262715586f9f65f5161044121644edb31f65aadafd18415d3be99
test-bench-deps: {}
maintainer: Nikita Volkov <nikita.y.volkov@mail.ru>
synopsis: A "libcurl"-based streaming HTTP client
changelog: ''
basic-deps:
either: ==4.*
bytestring: ! '>=0.10 && <0.11'
base-prelude: ! '>=0.1.19 && <0.2'
stm: ==2.*
base: <5
list-t: ! '>=0.4 && <0.5'
curlhs: ! '>=0.1.6 && <0.2'
resource-pool: ==0.2.*
mtl-prelude: ! '>=1 && <3'
all-versions:
- '0.2.0.0'
- '0.2.0.1'
- '0.2.0.2'
- '0.3.0.0'
author: Nikita Volkov <nikita.y.volkov@mail.ru>
latest: '0.3.0.0'
description-type: haddock
description: ''
license-name: MIT
| Update from Hackage at 2015-05-26T06:20:21+0000 | Update from Hackage at 2015-05-26T06:20:21+0000
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: https://github.com/nikita-volkov/list-t-libcurl
changelog-type: ''
hash: 17ceacbd70abbedf7d9eeeb09f4c2e31e51aa3169cc63994ab55ab727547359d
test-bench-deps: {}
maintainer: Nikita Volkov <nikita.y.volkov@mail.ru>
synopsis: A "libcurl"-based streaming HTTP client
changelog: ''
basic-deps:
either: ==4.*
bytestring: ! '>=0.10 && <0.11'
base-prelude: ! '>=0.1.19 && <0.2'
stm: ==2.*
base: <5
list-t: ! '>=0.4 && <0.5'
curlhs: ==0.1.*
resource-pool: ==0.2.*
mtl-prelude: ! '>=1 && <3'
all-versions:
- '0.2.0.0'
- '0.2.0.1'
- '0.2.0.2'
author: Nikita Volkov <nikita.y.volkov@mail.ru>
latest: '0.2.0.2'
description-type: haddock
description: ''
license-name: MIT
## Instruction:
Update from Hackage at 2015-05-26T06:20:21+0000
## Code After:
homepage: https://github.com/nikita-volkov/list-t-libcurl
changelog-type: ''
hash: 1a92883b998262715586f9f65f5161044121644edb31f65aadafd18415d3be99
test-bench-deps: {}
maintainer: Nikita Volkov <nikita.y.volkov@mail.ru>
synopsis: A "libcurl"-based streaming HTTP client
changelog: ''
basic-deps:
either: ==4.*
bytestring: ! '>=0.10 && <0.11'
base-prelude: ! '>=0.1.19 && <0.2'
stm: ==2.*
base: <5
list-t: ! '>=0.4 && <0.5'
curlhs: ! '>=0.1.6 && <0.2'
resource-pool: ==0.2.*
mtl-prelude: ! '>=1 && <3'
all-versions:
- '0.2.0.0'
- '0.2.0.1'
- '0.2.0.2'
- '0.3.0.0'
author: Nikita Volkov <nikita.y.volkov@mail.ru>
latest: '0.3.0.0'
description-type: haddock
description: ''
license-name: MIT
| homepage: https://github.com/nikita-volkov/list-t-libcurl
changelog-type: ''
- hash: 17ceacbd70abbedf7d9eeeb09f4c2e31e51aa3169cc63994ab55ab727547359d
+ hash: 1a92883b998262715586f9f65f5161044121644edb31f65aadafd18415d3be99
test-bench-deps: {}
maintainer: Nikita Volkov <nikita.y.volkov@mail.ru>
synopsis: A "libcurl"-based streaming HTTP client
changelog: ''
basic-deps:
either: ==4.*
bytestring: ! '>=0.10 && <0.11'
base-prelude: ! '>=0.1.19 && <0.2'
stm: ==2.*
base: <5
list-t: ! '>=0.4 && <0.5'
- curlhs: ==0.1.*
+ curlhs: ! '>=0.1.6 && <0.2'
resource-pool: ==0.2.*
mtl-prelude: ! '>=1 && <3'
all-versions:
- '0.2.0.0'
- '0.2.0.1'
- '0.2.0.2'
+ - '0.3.0.0'
author: Nikita Volkov <nikita.y.volkov@mail.ru>
- latest: '0.2.0.2'
? ^ ^
+ latest: '0.3.0.0'
? ^ ^
description-type: haddock
description: ''
license-name: MIT | 7 | 0.269231 | 4 | 3 |
e638aebb7ffe2ff4edc4059ff6a45c3c7a389493 | lib/foodcritic/rules/fc033.rb | lib/foodcritic/rules/fc033.rb | rule "FC033", "Missing template file" do
tags %w{correctness templates}
recipe do |ast, filename|
# find all template resources that don't fetch a template
# from either another cookbook or a local path
find_resources(ast, type: :template).reject do |resource|
resource_attributes(resource)["local"] ||
resource_attributes(resource)["cookbook"]
end.map do |resource|
# fetch the specified file to the template
file = template_file(resource_attributes(resource,
return_expressions: true))
{ resource: resource, file: file }
end.reject do |resource|
# skip the check if the file path is derived since
# we can't determine if that's here or not without converging the node
resource[:file].respond_to?(:xpath)
end.select do |resource|
template_paths(filename).none? do |path|
relative_path = []
Pathname.new(path).ascend do |template_path|
relative_path << template_path.basename
break if gem_version(chef_version) >= gem_version("12.0.0") &&
template_path.dirname.basename.to_s == "templates"
break if template_path.dirname.dirname.basename.to_s == "templates"
end
File.join(relative_path.reverse) == resource[:file]
end
end.map { |resource| resource[:resource] }
end
end
| rule "FC033", "Missing template file" do
tags %w{correctness templates}
recipe do |ast, filename|
# find all template resources that don't fetch a template
# from either another cookbook or a local path
find_resources(ast, type: :template).reject do |resource|
resource_attributes(resource)["local"] ||
resource_attributes(resource)["cookbook"]
end.map do |resource|
# fetch the specified file to the template
file = template_file(resource_attributes(resource,
return_expressions: true))
{ resource: resource, file: file }
end.reject do |resource|
# skip the check if the file path is derived since
# we can't determine if that's here or not without converging the node
resource[:file].respond_to?(:xpath)
end.select do |resource|
template_paths(filename).none? do |path|
relative_path = []
Pathname.new(path).ascend do |template_path|
relative_path << template_path.basename
# stop building relative path if we've hit template or 1 dir above
# NOTE: This is a totally flawed attempt to strip things like
# templates/ubuntu/something.erb down to something.erb, which breaks
# legit nested dirs in the templates dir like templates/something/something.erb
break if template_path.dirname.basename.to_s == "templates" ||
template_path.dirname.dirname.basename.to_s == "templates"
end
File.join(relative_path.reverse) == resource[:file]
end
end.map { |resource| resource[:resource] }
end
end
| Remove the Chef 11 support in the check | Remove the Chef 11 support in the check
Also add a comment explaining how flawed this logic is
Signed-off-by: Tim Smith <764ef62106582a09ed09dfa0b6bff7c05fd7d1e4@chef.io>
| Ruby | mit | acrmp/foodcritic,Foodcritic/foodcritic | ruby | ## Code Before:
rule "FC033", "Missing template file" do
tags %w{correctness templates}
recipe do |ast, filename|
# find all template resources that don't fetch a template
# from either another cookbook or a local path
find_resources(ast, type: :template).reject do |resource|
resource_attributes(resource)["local"] ||
resource_attributes(resource)["cookbook"]
end.map do |resource|
# fetch the specified file to the template
file = template_file(resource_attributes(resource,
return_expressions: true))
{ resource: resource, file: file }
end.reject do |resource|
# skip the check if the file path is derived since
# we can't determine if that's here or not without converging the node
resource[:file].respond_to?(:xpath)
end.select do |resource|
template_paths(filename).none? do |path|
relative_path = []
Pathname.new(path).ascend do |template_path|
relative_path << template_path.basename
break if gem_version(chef_version) >= gem_version("12.0.0") &&
template_path.dirname.basename.to_s == "templates"
break if template_path.dirname.dirname.basename.to_s == "templates"
end
File.join(relative_path.reverse) == resource[:file]
end
end.map { |resource| resource[:resource] }
end
end
## Instruction:
Remove the Chef 11 support in the check
Also add a comment explaining how flawed this logic is
Signed-off-by: Tim Smith <764ef62106582a09ed09dfa0b6bff7c05fd7d1e4@chef.io>
## Code After:
rule "FC033", "Missing template file" do
tags %w{correctness templates}
recipe do |ast, filename|
# find all template resources that don't fetch a template
# from either another cookbook or a local path
find_resources(ast, type: :template).reject do |resource|
resource_attributes(resource)["local"] ||
resource_attributes(resource)["cookbook"]
end.map do |resource|
# fetch the specified file to the template
file = template_file(resource_attributes(resource,
return_expressions: true))
{ resource: resource, file: file }
end.reject do |resource|
# skip the check if the file path is derived since
# we can't determine if that's here or not without converging the node
resource[:file].respond_to?(:xpath)
end.select do |resource|
template_paths(filename).none? do |path|
relative_path = []
Pathname.new(path).ascend do |template_path|
relative_path << template_path.basename
# stop building relative path if we've hit template or 1 dir above
# NOTE: This is a totally flawed attempt to strip things like
# templates/ubuntu/something.erb down to something.erb, which breaks
# legit nested dirs in the templates dir like templates/something/something.erb
break if template_path.dirname.basename.to_s == "templates" ||
template_path.dirname.dirname.basename.to_s == "templates"
end
File.join(relative_path.reverse) == resource[:file]
end
end.map { |resource| resource[:resource] }
end
end
| rule "FC033", "Missing template file" do
tags %w{correctness templates}
recipe do |ast, filename|
# find all template resources that don't fetch a template
# from either another cookbook or a local path
find_resources(ast, type: :template).reject do |resource|
resource_attributes(resource)["local"] ||
resource_attributes(resource)["cookbook"]
end.map do |resource|
# fetch the specified file to the template
file = template_file(resource_attributes(resource,
return_expressions: true))
{ resource: resource, file: file }
end.reject do |resource|
# skip the check if the file path is derived since
# we can't determine if that's here or not without converging the node
resource[:file].respond_to?(:xpath)
end.select do |resource|
template_paths(filename).none? do |path|
relative_path = []
Pathname.new(path).ascend do |template_path|
relative_path << template_path.basename
- break if gem_version(chef_version) >= gem_version("12.0.0") &&
+ # stop building relative path if we've hit template or 1 dir above
+ # NOTE: This is a totally flawed attempt to strip things like
+ # templates/ubuntu/something.erb down to something.erb, which breaks
+ # legit nested dirs in the templates dir like templates/something/something.erb
+ break if template_path.dirname.basename.to_s == "templates" ||
- template_path.dirname.basename.to_s == "templates"
+ template_path.dirname.dirname.basename.to_s == "templates"
? ++++++++
- break if template_path.dirname.dirname.basename.to_s == "templates"
end
File.join(relative_path.reverse) == resource[:file]
end
end.map { |resource| resource[:resource] }
end
end | 9 | 0.290323 | 6 | 3 |
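The NOTE added in this commit describes a path-stripping heuristic: walk up from the template file, collecting path components, and stop once the parent or grandparent directory is named `templates`. A rough Python sketch of that loop (an illustration of the heuristic, not foodcritic's actual code) makes the flaw visible:

```python
from pathlib import PurePosixPath

def strip_templates_prefix(path):
    # Walk upward from the file, collecting basenames, and stop once the
    # parent (or grandparent) directory is named "templates" -- the same
    # stopping rule as the Ruby Pathname#ascend loop in the record above.
    relative = []
    p = PurePosixPath(path)
    while p.name:
        relative.append(p.name)
        if p.parent.name == "templates" or p.parent.parent.name == "templates":
            break
        p = p.parent
    return str(PurePosixPath(*reversed(relative)))
```

`templates/ubuntu/foo.erb` collapses to `foo.erb` as intended, but so does a legitimate nested path like `templates/nested/foo.erb` — exactly the breakage the committed comment warns about.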
d357136075bce9d8582759a525536daf7489becb | unitypack/engine/object.py | unitypack/engine/object.py | def field(f, cast=None):
def _inner(self):
ret = self._obj[f]
if cast:
ret = cast(ret)
return ret
return property(_inner)
class Object:
def __init__(self, data=None):
if data is None:
data = {}
self._obj = data
def __repr__(self):
return "<%s %s>" % (self.__class__.__name__, self.name)
def __str__(self):
return self.name
name = field("m_Name")
class GameObject(Object):
active = field("m_IsActive")
component = field("m_Component")
layer = field("m_Layer")
tag = field("m_Tag")
| def field(f, cast=None, **kwargs):
def _inner(self):
if "default" in kwargs:
ret = self._obj.get(f, kwargs["default"])
else:
ret = self._obj[f]
if cast:
ret = cast(ret)
return ret
return property(_inner)
class Object:
def __init__(self, data=None):
if data is None:
data = {}
self._obj = data
def __repr__(self):
return "<%s %s>" % (self.__class__.__name__, self.name)
def __str__(self):
return self.name
name = field("m_Name", default="")
class GameObject(Object):
active = field("m_IsActive")
component = field("m_Component")
layer = field("m_Layer")
tag = field("m_Tag")
| Allow default values in field() | Allow default values in field()
| Python | mit | andburn/python-unitypack | python | ## Code Before:
def field(f, cast=None):
def _inner(self):
ret = self._obj[f]
if cast:
ret = cast(ret)
return ret
return property(_inner)
class Object:
def __init__(self, data=None):
if data is None:
data = {}
self._obj = data
def __repr__(self):
return "<%s %s>" % (self.__class__.__name__, self.name)
def __str__(self):
return self.name
name = field("m_Name")
class GameObject(Object):
active = field("m_IsActive")
component = field("m_Component")
layer = field("m_Layer")
tag = field("m_Tag")
## Instruction:
Allow default values in field()
## Code After:
def field(f, cast=None, **kwargs):
def _inner(self):
if "default" in kwargs:
ret = self._obj.get(f, kwargs["default"])
else:
ret = self._obj[f]
if cast:
ret = cast(ret)
return ret
return property(_inner)
class Object:
def __init__(self, data=None):
if data is None:
data = {}
self._obj = data
def __repr__(self):
return "<%s %s>" % (self.__class__.__name__, self.name)
def __str__(self):
return self.name
name = field("m_Name", default="")
class GameObject(Object):
active = field("m_IsActive")
component = field("m_Component")
layer = field("m_Layer")
tag = field("m_Tag")
| - def field(f, cast=None):
+ def field(f, cast=None, **kwargs):
? ++++++++++
def _inner(self):
+ if "default" in kwargs:
+ ret = self._obj.get(f, kwargs["default"])
+ else:
- ret = self._obj[f]
+ ret = self._obj[f]
? +
if cast:
ret = cast(ret)
return ret
return property(_inner)
class Object:
def __init__(self, data=None):
if data is None:
data = {}
self._obj = data
def __repr__(self):
return "<%s %s>" % (self.__class__.__name__, self.name)
def __str__(self):
return self.name
- name = field("m_Name")
+ name = field("m_Name", default="")
? ++++++++++++
class GameObject(Object):
active = field("m_IsActive")
component = field("m_Component")
layer = field("m_Layer")
tag = field("m_Tag") | 9 | 0.310345 | 6 | 3 |
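The `field()` helper in this record is a small property factory over a backing dict, and the commit teaches it an optional `default`. A quick usage sketch of the pattern (the `Thing` class here is made up for illustration):

```python
def field(f, cast=None, **kwargs):
    # Same property-factory pattern as the record above: read key `f`
    # from the wrapped dict, using `default` (when given) for missing keys.
    def _inner(self):
        if "default" in kwargs:
            ret = self._obj.get(f, kwargs["default"])
        else:
            ret = self._obj[f]
        if cast:
            ret = cast(ret)
        return ret
    return property(_inner)

class Thing:
    def __init__(self, data=None):
        self._obj = data or {}
    name = field("m_Name", default="")   # missing key -> ""
    layer = field("m_Layer", cast=int)   # missing key -> KeyError
```

Without `default`, a missing key still raises `KeyError`, so existing mandatory fields keep their old behavior.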
b7d5efd87913052b4b66258254c8bdaaa88fccb8 | lib/rubeuler/problem.rb | lib/rubeuler/problem.rb | require 'benchmark'
module Rubeuler
include Solution
class Problem < ::Rubeuler::Base
def initialize(options)
load_options(:number, :answer, options)
raise TypeError, ':answer should be a string' unless @answer.is_a?(String)
end
def execute!
time = Benchmark.measure { @data = answer }.real
true_or_false = @data == solution ? true : false
return Rubeuler::Result.new(success: true_or_false, problem: @number, data: @data, runtime: time)
end
private
def answer
instance_eval(@answer.gsub("\n",";"))
end
def solution
Rubeuler::Solution.for_problem(@number)
end
end
end
| require 'benchmark'
module Rubeuler
include Solution
class Problem < ::Rubeuler::Base
def initialize(options)
load_options(:number, :answer, options)
raise TypeError, ':answer should be a string' unless @answer.is_a?(String)
raise TypeError, ':number should be an integer' unless @number.is_a?(Fixnum)
end
def execute!
time = timed_answer
true_or_false = @data == solution ? true : false
return Rubeuler::Result.new(success: true_or_false, problem: @number, data: @data, runtime: time)
end
private
def timed_answer
timer_endpoints = ["Benchmark.measure { @data =", "}.real"]
instance_eval(timer_endpoints.join(@answer.gsub("\n",";")))
end
def solution
Rubeuler::Solution.for_problem(@number)
end
end
end
| Move Benchmark closer to the code being evaluated (inside the instance_eval) | Move Benchmark closer to the code being evaluated (inside the instance_eval)
| Ruby | mit | rthbound/rubeuler | ruby | ## Code Before:
require 'benchmark'
module Rubeuler
include Solution
class Problem < ::Rubeuler::Base
def initialize(options)
load_options(:number, :answer, options)
raise TypeError, ':answer should be a string' unless @answer.is_a?(String)
end
def execute!
time = Benchmark.measure { @data = answer }.real
true_or_false = @data == solution ? true : false
return Rubeuler::Result.new(success: true_or_false, problem: @number, data: @data, runtime: time)
end
private
def answer
instance_eval(@answer.gsub("\n",";"))
end
def solution
Rubeuler::Solution.for_problem(@number)
end
end
end
## Instruction:
Move Benchmark closer to the code being evaluated (inside the instance_eval)
## Code After:
require 'benchmark'
module Rubeuler
include Solution
class Problem < ::Rubeuler::Base
def initialize(options)
load_options(:number, :answer, options)
raise TypeError, ':answer should be a string' unless @answer.is_a?(String)
raise TypeError, ':number should be an integer' unless @number.is_a?(Fixnum)
end
def execute!
time = timed_answer
true_or_false = @data == solution ? true : false
return Rubeuler::Result.new(success: true_or_false, problem: @number, data: @data, runtime: time)
end
private
def timed_answer
timer_endpoints = ["Benchmark.measure { @data =", "}.real"]
instance_eval(timer_endpoints.join(@answer.gsub("\n",";")))
end
def solution
Rubeuler::Solution.for_problem(@number)
end
end
end
| require 'benchmark'
module Rubeuler
include Solution
class Problem < ::Rubeuler::Base
def initialize(options)
load_options(:number, :answer, options)
raise TypeError, ':answer should be a string' unless @answer.is_a?(String)
+ raise TypeError, ':number should be an integer' unless @number.is_a?(Fixnum)
end
def execute!
- time = Benchmark.measure { @data = answer }.real
+ time = timed_answer
true_or_false = @data == solution ? true : false
return Rubeuler::Result.new(success: true_or_false, problem: @number, data: @data, runtime: time)
end
private
- def answer
+ def timed_answer
? ++++++
+ timer_endpoints = ["Benchmark.measure { @data =", "}.real"]
- instance_eval(@answer.gsub("\n",";"))
+ instance_eval(timer_endpoints.join(@answer.gsub("\n",";")))
? +++++++++++++++++++++ +
end
def solution
Rubeuler::Solution.for_problem(@number)
end
end
end | 8 | 0.285714 | 5 | 3 |
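Moving the timer inside the `instance_eval` means the benchmark endpoints string-wrap the submitted answer source before it is evaluated. A loose Python analogue of the same evaluate-and-time idea (names invented; Python would time around `eval` rather than splice strings):

```python
import time

def execute(answer_src, solution):
    # Evaluate the answer source, measure how long it takes, and report
    # whether it matches the known solution -- mirroring Problem#execute!.
    start = time.perf_counter()
    data = eval(answer_src)
    runtime = time.perf_counter() - start
    return {"success": data == solution, "data": data, "runtime": runtime}
```

Timing the evaluation itself (rather than a wrapper that also parses the string) keeps setup cost out of the reported runtime, which is the point of the commit.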
d94c440c09974e00dafc3b2566578d2558f2cbe5 | sortable/src/ca/eandb/sortable/ProductTrieBuilder.java | sortable/src/ca/eandb/sortable/ProductTrieBuilder.java | /**
*
*/
package ca.eandb.sortable;
import java.util.LinkedList;
import java.util.List;
import ca.eandb.sortable.Product.Field;
/**
* @author brad
*
*/
public final class ProductTrieBuilder implements ProductVisitor {
private final TrieNode root = new TrieNode();
/*(non-Javadoc)
* @see ca.eandb.sortable.ProductVisitor#visit(ca.eandb.sortable.Product)
*/
@Override
public void visit(Product product) {
addProduct(product);
}
public void addProduct(Product product) {
processField(product, Field.MANUFACTURER, product.getManufacturer());
processField(product, Field.FAMILY, product.getFamily());
processField(product, Field.MODEL, product.getModel());
}
private void processField(Product product, Field field, String value) {
value = StringUtil.normalize(value);
String[] words = value.split(" ");
ProductMatch match = new ProductMatch(product, field);
for (String word : words) {
TrieNode node = root.insert(word);
List<ProductMatch> products = (List<ProductMatch>) node.getData();
if (products == null) {
products = new LinkedList<ProductMatch>();
node.setData(products);
}
products.add(match);
}
}
public TrieNode getRoot() {
return root;
}
}
| /**
*
*/
package ca.eandb.sortable;
import java.util.LinkedList;
import java.util.List;
import ca.eandb.sortable.Product.Field;
/**
* @author brad
*
*/
public final class ProductTrieBuilder implements ProductVisitor {
private final TrieNode root = new TrieNode();
/*(non-Javadoc)
* @see ca.eandb.sortable.ProductVisitor#visit(ca.eandb.sortable.Product)
*/
@Override
public void visit(Product product) {
addProduct(product);
}
public void addProduct(Product product) {
processField(product, Field.MANUFACTURER, product.getManufacturer());
processField(product, Field.MODEL, product.getModel());
if (product.getFamily() != null) {
processField(product, Field.FAMILY, product.getFamily());
}
}
private void processField(Product product, Field field, String value) {
value = StringUtil.normalize(value);
String[] words = value.split(" ");
ProductMatch match = new ProductMatch(product, field);
for (String word : words) {
TrieNode node = root.insert(word);
List<ProductMatch> products = (List<ProductMatch>) node.getData();
if (products == null) {
products = new LinkedList<ProductMatch>();
node.setData(products);
}
products.add(match);
}
}
public TrieNode getRoot() {
return root;
}
}
| Check if the optional family field is null before attempting to insert. | Check if the optional family field is null before attempting to insert. | Java | mit | bwkimmel/sortable | java | ## Code Before:
/**
*
*/
package ca.eandb.sortable;
import java.util.LinkedList;
import java.util.List;
import ca.eandb.sortable.Product.Field;
/**
* @author brad
*
*/
public final class ProductTrieBuilder implements ProductVisitor {
private final TrieNode root = new TrieNode();
/*(non-Javadoc)
* @see ca.eandb.sortable.ProductVisitor#visit(ca.eandb.sortable.Product)
*/
@Override
public void visit(Product product) {
addProduct(product);
}
public void addProduct(Product product) {
processField(product, Field.MANUFACTURER, product.getManufacturer());
processField(product, Field.FAMILY, product.getFamily());
processField(product, Field.MODEL, product.getModel());
}
private void processField(Product product, Field field, String value) {
value = StringUtil.normalize(value);
String[] words = value.split(" ");
ProductMatch match = new ProductMatch(product, field);
for (String word : words) {
TrieNode node = root.insert(word);
List<ProductMatch> products = (List<ProductMatch>) node.getData();
if (products == null) {
products = new LinkedList<ProductMatch>();
node.setData(products);
}
products.add(match);
}
}
public TrieNode getRoot() {
return root;
}
}
## Instruction:
Check if the optional family field is null before attempting to insert.
## Code After:
/**
*
*/
package ca.eandb.sortable;
import java.util.LinkedList;
import java.util.List;
import ca.eandb.sortable.Product.Field;
/**
* @author brad
*
*/
public final class ProductTrieBuilder implements ProductVisitor {
private final TrieNode root = new TrieNode();
/*(non-Javadoc)
* @see ca.eandb.sortable.ProductVisitor#visit(ca.eandb.sortable.Product)
*/
@Override
public void visit(Product product) {
addProduct(product);
}
public void addProduct(Product product) {
processField(product, Field.MANUFACTURER, product.getManufacturer());
processField(product, Field.MODEL, product.getModel());
if (product.getFamily() != null) {
processField(product, Field.FAMILY, product.getFamily());
}
}
private void processField(Product product, Field field, String value) {
value = StringUtil.normalize(value);
String[] words = value.split(" ");
ProductMatch match = new ProductMatch(product, field);
for (String word : words) {
TrieNode node = root.insert(word);
List<ProductMatch> products = (List<ProductMatch>) node.getData();
if (products == null) {
products = new LinkedList<ProductMatch>();
node.setData(products);
}
products.add(match);
}
}
public TrieNode getRoot() {
return root;
}
}
| /**
*
*/
package ca.eandb.sortable;
import java.util.LinkedList;
import java.util.List;
import ca.eandb.sortable.Product.Field;
/**
* @author brad
*
*/
public final class ProductTrieBuilder implements ProductVisitor {
private final TrieNode root = new TrieNode();
/*(non-Javadoc)
* @see ca.eandb.sortable.ProductVisitor#visit(ca.eandb.sortable.Product)
*/
@Override
public void visit(Product product) {
addProduct(product);
}
public void addProduct(Product product) {
processField(product, Field.MANUFACTURER, product.getManufacturer());
- processField(product, Field.FAMILY, product.getFamily());
processField(product, Field.MODEL, product.getModel());
+ if (product.getFamily() != null) {
+ processField(product, Field.FAMILY, product.getFamily());
+ }
}
private void processField(Product product, Field field, String value) {
value = StringUtil.normalize(value);
String[] words = value.split(" ");
ProductMatch match = new ProductMatch(product, field);
for (String word : words) {
TrieNode node = root.insert(word);
List<ProductMatch> products = (List<ProductMatch>) node.getData();
if (products == null) {
products = new LinkedList<ProductMatch>();
node.setData(products);
}
products.add(match);
}
}
public TrieNode getRoot() {
return root;
}
} | 4 | 0.072727 | 3 | 1 |
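The builder above indexes each word of a product field into a trie, attaching `(product, field)` matches at the word's terminal node; the commit simply skips the optional `family` field when it is null. A minimal Python sketch of that indexing idea (invented names, not the sortable project's code):

```python
class TrieNode:
    def __init__(self):
        self.children = {}
        self.matches = []                # plays the role of node.getData()

    def insert(self, word):
        node = self
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        return node

def process_field(root, product, field, value):
    # Skip optional fields (e.g. family) when absent, then attach the
    # match to the terminal node of every word in the normalized value.
    if value is None:
        return
    for word in value.lower().split():
        root.insert(word).matches.append((product, field))
```

Because `insert` walks existing nodes before creating new ones, looking a word up again returns the same terminal node and its accumulated match list.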
e63d3c90209330442df35cf02f4b08bbc68fbc9f | .travis.yml | .travis.yml | language:
- python
before_install:
- sudo add-apt-repository -y "deb http://archive.ubuntu.com/ubuntu/ trusty main universe"
- sudo apt-get update -qq
install:
- sudo apt-get install -qq build-essential pkg-config
- sudo apt-get install -qq python-dev libboost-python-dev
- sudo apt-get install -qq coinor-libosi-dev coinor-libcoinutils-dev coinor-libclp-dev libbz2-dev
python:
- 2.7
before_script:
- sudo python setup.py install
script:
- python scripts/yaposib-config
| language:
- python
before_install:
- sudo add-apt-repository -y "deb http://archive.ubuntu.com/ubuntu/ trusty main universe"
- sudo apt-get update -qq
install:
- sudo apt-get install -qq build-essential pkg-config
- sudo apt-get install -qq python-dev libboost-python-dev
- sudo apt-get install -qq coinor-libosi-dev coinor-libcoinutils-dev coinor-libclp-dev libbz2-dev
python:
- 2.7
before_script:
- sudo python setup.py install
script:
- python scripts/yaposib-solve examples/p0033.mps
| Change test for now if it works | Change test for now if it works | YAML | epl-1.0 | coin-or/yaposib,coin-or/yaposib | yaml | ## Code Before:
language:
- python
before_install:
- sudo add-apt-repository -y "deb http://archive.ubuntu.com/ubuntu/ trusty main universe"
- sudo apt-get update -qq
install:
- sudo apt-get install -qq build-essential pkg-config
- sudo apt-get install -qq python-dev libboost-python-dev
- sudo apt-get install -qq coinor-libosi-dev coinor-libcoinutils-dev coinor-libclp-dev libbz2-dev
python:
- 2.7
before_script:
- sudo python setup.py install
script:
- python scripts/yaposib-config
## Instruction:
Change test for now if it works
## Code After:
language:
- python
before_install:
- sudo add-apt-repository -y "deb http://archive.ubuntu.com/ubuntu/ trusty main universe"
- sudo apt-get update -qq
install:
- sudo apt-get install -qq build-essential pkg-config
- sudo apt-get install -qq python-dev libboost-python-dev
- sudo apt-get install -qq coinor-libosi-dev coinor-libcoinutils-dev coinor-libclp-dev libbz2-dev
python:
- 2.7
before_script:
- sudo python setup.py install
script:
- python scripts/yaposib-solve examples/p0033.mps
| language:
- python
before_install:
- sudo add-apt-repository -y "deb http://archive.ubuntu.com/ubuntu/ trusty main universe"
- sudo apt-get update -qq
install:
- sudo apt-get install -qq build-essential pkg-config
- sudo apt-get install -qq python-dev libboost-python-dev
- sudo apt-get install -qq coinor-libosi-dev coinor-libcoinutils-dev coinor-libclp-dev libbz2-dev
python:
- 2.7
before_script:
- sudo python setup.py install
script:
- - python scripts/yaposib-config
+ - python scripts/yaposib-solve examples/p0033.mps | 2 | 0.133333 | 1 | 1 |
fccc6379ae82e1a1252f9ab9498dee239a93c68f | tests/integration/features/bootstrap/ServerContext.php | tests/integration/features/bootstrap/ServerContext.php | <?php
use Behat\Behat\Context\Context;
use GuzzleHttp\Cookie\CookieJar;
require_once __DIR__ . '/../../vendor/autoload.php';
class ServerContext implements Context {
use WebDav;
/** @var string */
private $mappedUserId;
private $lastInsertIds = [];
/**
* @BeforeSuite
*/
public static function addFilesToSkeleton() {
}
/**
* @Given /^acting as user "([^"]*)"$/
*/
public function actingAsUser($user) {
$this->loggingInUsingWebAs($user);
$this->asAn($user);
}
public function getCookieJar(): CookieJar {
return $this->cookieJar;
}
public function getReqestToken(): string {
return $this->requestToken;
}
}
| <?php
use Behat\Behat\Context\Context;
use GuzzleHttp\Cookie\CookieJar;
require_once __DIR__ . '/../../vendor/autoload.php';
class ServerContext implements Context {
use WebDav;
/** @var string */
private $mappedUserId;
private $lastInsertIds = [];
/**
* @BeforeSuite
*/
public static function addFilesToSkeleton() {
}
/**
* @Given /^acting as user "([^"]*)"$/
*/
public function actingAsUser($user) {
$this->cookieJar = new CookieJar();
$this->loggingInUsingWebAs($user);
$this->asAn($user);
}
public function getCookieJar(): CookieJar {
return $this->cookieJar;
}
public function getReqestToken(): string {
return $this->requestToken;
}
}
| Refresh cookie jar when acting as a different user | Refresh cookie jar when acting as a different user
Signed-off-by: Julius Härtl <bf353fa4999f2f148afcc6d8ee6cb1ee74cc07c3@bitgrid.net>
| PHP | agpl-3.0 | nextcloud/deck,nextcloud/deck,juliushaertl/deck,juliushaertl/deck,juliushaertl/deck,nextcloud/deck | php | ## Code Before:
<?php
use Behat\Behat\Context\Context;
use GuzzleHttp\Cookie\CookieJar;
require_once __DIR__ . '/../../vendor/autoload.php';
class ServerContext implements Context {
use WebDav;
/** @var string */
private $mappedUserId;
private $lastInsertIds = [];
/**
* @BeforeSuite
*/
public static function addFilesToSkeleton() {
}
/**
* @Given /^acting as user "([^"]*)"$/
*/
public function actingAsUser($user) {
$this->loggingInUsingWebAs($user);
$this->asAn($user);
}
public function getCookieJar(): CookieJar {
return $this->cookieJar;
}
public function getReqestToken(): string {
return $this->requestToken;
}
}
## Instruction:
Refresh cookie jar when acting as a different user
Signed-off-by: Julius Härtl <bf353fa4999f2f148afcc6d8ee6cb1ee74cc07c3@bitgrid.net>
## Code After:
<?php
use Behat\Behat\Context\Context;
use GuzzleHttp\Cookie\CookieJar;
require_once __DIR__ . '/../../vendor/autoload.php';
class ServerContext implements Context {
use WebDav;
/** @var string */
private $mappedUserId;
private $lastInsertIds = [];
/**
* @BeforeSuite
*/
public static function addFilesToSkeleton() {
}
/**
* @Given /^acting as user "([^"]*)"$/
*/
public function actingAsUser($user) {
$this->cookieJar = new CookieJar();
$this->loggingInUsingWebAs($user);
$this->asAn($user);
}
public function getCookieJar(): CookieJar {
return $this->cookieJar;
}
public function getReqestToken(): string {
return $this->requestToken;
}
}
| <?php
use Behat\Behat\Context\Context;
use GuzzleHttp\Cookie\CookieJar;
require_once __DIR__ . '/../../vendor/autoload.php';
class ServerContext implements Context {
use WebDav;
/** @var string */
private $mappedUserId;
private $lastInsertIds = [];
/**
* @BeforeSuite
*/
public static function addFilesToSkeleton() {
}
/**
* @Given /^acting as user "([^"]*)"$/
*/
public function actingAsUser($user) {
+ $this->cookieJar = new CookieJar();
$this->loggingInUsingWebAs($user);
$this->asAn($user);
}
public function getCookieJar(): CookieJar {
return $this->cookieJar;
}
public function getReqestToken(): string {
return $this->requestToken;
}
} | 1 | 0.027027 | 1 | 0 |
411301b2916204c4a28e25882e5da948bc83f93f | gcsweb.k8s.io/deployment.yaml | gcsweb.k8s.io/deployment.yaml | apiVersion: extensions/v1beta1
kind: Deployment
metadata:
name: gcsweb
labels:
app: gcsweb
spec:
replicas: 2
# selector defaults from template labels
strategy:
rollingUpdate:
maxSurge: 1
maxUnavailable: 1
type: RollingUpdate
template:
metadata:
labels:
app: gcsweb
spec:
terminationGracePeriodSeconds: 30
containers:
- name: gcsweb
image: k8s.gcr.io/gcsweb-amd64:v1.0.6
args:
- -b=crreleases
- -b=istio-prow
- -b=kubernetes-jenkins
- -b=kubernetes-release
- -b=kubernetes-release-dev
- -b=android-ci
- -p=8080
ports:
- containerPort: 8080
protocol: TCP
resources:
limits:
cpu: 0.1
memory: 128Mi
livenessProbe:
httpGet:
path: /healthz
port: 8080
initialDelaySeconds: 3
timeoutSeconds: 2
failureThreshold: 2
| apiVersion: extensions/v1beta1
kind: Deployment
metadata:
name: gcsweb
labels:
app: gcsweb
spec:
replicas: 2
# selector defaults from template labels
strategy:
rollingUpdate:
maxSurge: 1
maxUnavailable: 1
type: RollingUpdate
template:
metadata:
labels:
app: gcsweb
spec:
terminationGracePeriodSeconds: 30
containers:
- name: gcsweb
image: k8s.gcr.io/gcsweb-amd64:v1.0.6
args:
- -b=android-ci
- -b=crreleases
- -b=istio-prow
- -b=kubernetes-jenkins
- -b=kubernetes-release
- -b=kubernetes-release-dev
- -p=8080
ports:
- containerPort: 8080
protocol: TCP
resources:
limits:
cpu: 0.1
memory: 128Mi
livenessProbe:
httpGet:
path: /healthz
port: 8080
initialDelaySeconds: 3
timeoutSeconds: 2
failureThreshold: 2
| Move android-ci to it's sorted position. | Move android-ci to it's sorted position.
| YAML | apache-2.0 | kubernetes/k8s.io,thockin/k8s.io,kubernetes/k8s.io,kubernetes/k8s.io,thockin/k8s.io,thockin/k8s.io,thockin/k8s.io,kubernetes/k8s.io | yaml | ## Code Before:
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
name: gcsweb
labels:
app: gcsweb
spec:
replicas: 2
# selector defaults from template labels
strategy:
rollingUpdate:
maxSurge: 1
maxUnavailable: 1
type: RollingUpdate
template:
metadata:
labels:
app: gcsweb
spec:
terminationGracePeriodSeconds: 30
containers:
- name: gcsweb
image: k8s.gcr.io/gcsweb-amd64:v1.0.6
args:
- -b=crreleases
- -b=istio-prow
- -b=kubernetes-jenkins
- -b=kubernetes-release
- -b=kubernetes-release-dev
- -b=android-ci
- -p=8080
ports:
- containerPort: 8080
protocol: TCP
resources:
limits:
cpu: 0.1
memory: 128Mi
livenessProbe:
httpGet:
path: /healthz
port: 8080
initialDelaySeconds: 3
timeoutSeconds: 2
failureThreshold: 2
## Instruction:
Move android-ci to it's sorted position.
## Code After:
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
name: gcsweb
labels:
app: gcsweb
spec:
replicas: 2
# selector defaults from template labels
strategy:
rollingUpdate:
maxSurge: 1
maxUnavailable: 1
type: RollingUpdate
template:
metadata:
labels:
app: gcsweb
spec:
terminationGracePeriodSeconds: 30
containers:
- name: gcsweb
image: k8s.gcr.io/gcsweb-amd64:v1.0.6
args:
- -b=android-ci
- -b=crreleases
- -b=istio-prow
- -b=kubernetes-jenkins
- -b=kubernetes-release
- -b=kubernetes-release-dev
- -p=8080
ports:
- containerPort: 8080
protocol: TCP
resources:
limits:
cpu: 0.1
memory: 128Mi
livenessProbe:
httpGet:
path: /healthz
port: 8080
initialDelaySeconds: 3
timeoutSeconds: 2
failureThreshold: 2
| apiVersion: extensions/v1beta1
kind: Deployment
metadata:
name: gcsweb
labels:
app: gcsweb
spec:
replicas: 2
# selector defaults from template labels
strategy:
rollingUpdate:
maxSurge: 1
maxUnavailable: 1
type: RollingUpdate
template:
metadata:
labels:
app: gcsweb
spec:
terminationGracePeriodSeconds: 30
containers:
- name: gcsweb
image: k8s.gcr.io/gcsweb-amd64:v1.0.6
args:
+ - -b=android-ci
- -b=crreleases
- -b=istio-prow
- -b=kubernetes-jenkins
- -b=kubernetes-release
- -b=kubernetes-release-dev
- - -b=android-ci
- -p=8080
ports:
- containerPort: 8080
protocol: TCP
resources:
limits:
cpu: 0.1
memory: 128Mi
livenessProbe:
httpGet:
path: /healthz
port: 8080
initialDelaySeconds: 3
timeoutSeconds: 2
failureThreshold: 2 | 2 | 0.044444 | 1 | 1 |
12e4db4c9c81f79174490a42b3ab307a832f0a3c | projects/scala_serialization/src/com/komanov/serialization/converters/ChillConverter.scala | projects/scala_serialization/src/com/komanov/serialization/converters/ChillConverter.scala | package com.komanov.serialization.converters
import com.komanov.serialization.converters.api.MyConverter
import com.komanov.serialization.domain.{Site, SiteEvent}
import com.twitter.chill.ScalaKryoInstantiator
/** https://github.com/twitter/chill */
object ChillConverter extends MyConverter {
private val pool = ScalaKryoInstantiator.defaultPool
override def toByteArray(site: Site): Array[Byte] = {
pool.toBytesWithoutClass(site)
}
// WOW Chill mutates input array! We need to clone it. https://github.com/twitter/chill/issues/181
override def fromByteArray(bytes: Array[Byte]): Site = {
pool.fromBytes(bytes.clone(), classOf[Site])
}
override def toByteArray(event: SiteEvent): Array[Byte] = {
pool.toBytesWithoutClass(event)
}
override def siteEventFromByteArray(clazz: Class[_], bytes: Array[Byte]): SiteEvent = {
pool.fromBytes(bytes.clone(), clazz).asInstanceOf[SiteEvent]
}
}
| package com.komanov.serialization.converters
import com.komanov.serialization.converters.api.MyConverter
import com.komanov.serialization.domain.{Site, SiteEvent}
import com.twitter.chill.ScalaKryoInstantiator
/** https://github.com/twitter/chill */
object ChillConverter extends MyConverter {
private val pool = ScalaKryoInstantiator.defaultPool
override def toByteArray(site: Site): Array[Byte] = {
pool.toBytesWithoutClass(site)
}
override def fromByteArray(bytes: Array[Byte]): Site = {
pool.fromBytes(bytes, classOf[Site])
}
override def toByteArray(event: SiteEvent): Array[Byte] = {
pool.toBytesWithoutClass(event)
}
override def siteEventFromByteArray(clazz: Class[_], bytes: Array[Byte]): SiteEvent = {
pool.fromBytes(bytes, clazz).asInstanceOf[SiteEvent]
}
}
| Remove cloning of an input array, because using State per Thread in jmh (resolve Chill threading issue) | Remove cloning of an input array, because using State per Thread in jmh (resolve Chill threading issue)
| Scala | mit | dkomanov/stuff,dkomanov/stuff,dkomanov/stuff | scala | ## Code Before:
package com.komanov.serialization.converters
import com.komanov.serialization.converters.api.MyConverter
import com.komanov.serialization.domain.{Site, SiteEvent}
import com.twitter.chill.ScalaKryoInstantiator
/** https://github.com/twitter/chill */
object ChillConverter extends MyConverter {
private val pool = ScalaKryoInstantiator.defaultPool
override def toByteArray(site: Site): Array[Byte] = {
pool.toBytesWithoutClass(site)
}
// WOW Chill mutates input array! We need to clone it. https://github.com/twitter/chill/issues/181
override def fromByteArray(bytes: Array[Byte]): Site = {
pool.fromBytes(bytes.clone(), classOf[Site])
}
override def toByteArray(event: SiteEvent): Array[Byte] = {
pool.toBytesWithoutClass(event)
}
override def siteEventFromByteArray(clazz: Class[_], bytes: Array[Byte]): SiteEvent = {
pool.fromBytes(bytes.clone(), clazz).asInstanceOf[SiteEvent]
}
}
## Instruction:
Remove cloning of an input array, because using State per Thread in jmh (resolve Chill threading issue)
## Code After:
package com.komanov.serialization.converters
import com.komanov.serialization.converters.api.MyConverter
import com.komanov.serialization.domain.{Site, SiteEvent}
import com.twitter.chill.ScalaKryoInstantiator
/** https://github.com/twitter/chill */
object ChillConverter extends MyConverter {
private val pool = ScalaKryoInstantiator.defaultPool
override def toByteArray(site: Site): Array[Byte] = {
pool.toBytesWithoutClass(site)
}
override def fromByteArray(bytes: Array[Byte]): Site = {
pool.fromBytes(bytes, classOf[Site])
}
override def toByteArray(event: SiteEvent): Array[Byte] = {
pool.toBytesWithoutClass(event)
}
override def siteEventFromByteArray(clazz: Class[_], bytes: Array[Byte]): SiteEvent = {
pool.fromBytes(bytes, clazz).asInstanceOf[SiteEvent]
}
}
| package com.komanov.serialization.converters
import com.komanov.serialization.converters.api.MyConverter
import com.komanov.serialization.domain.{Site, SiteEvent}
import com.twitter.chill.ScalaKryoInstantiator
/** https://github.com/twitter/chill */
object ChillConverter extends MyConverter {
private val pool = ScalaKryoInstantiator.defaultPool
override def toByteArray(site: Site): Array[Byte] = {
pool.toBytesWithoutClass(site)
}
- // WOW Chill mutates input array! We need to clone it. https://github.com/twitter/chill/issues/181
override def fromByteArray(bytes: Array[Byte]): Site = {
- pool.fromBytes(bytes.clone(), classOf[Site])
? --------
+ pool.fromBytes(bytes, classOf[Site])
}
override def toByteArray(event: SiteEvent): Array[Byte] = {
pool.toBytesWithoutClass(event)
}
override def siteEventFromByteArray(clazz: Class[_], bytes: Array[Byte]): SiteEvent = {
- pool.fromBytes(bytes.clone(), clazz).asInstanceOf[SiteEvent]
? --------
+ pool.fromBytes(bytes, clazz).asInstanceOf[SiteEvent]
}
} | 5 | 0.172414 | 2 | 3 |
934fa5c09e2bb24d63d478a950bf2f9884bdbc94 | lib/bouvier.rb | lib/bouvier.rb | require 'httparty'
require 'hashie'
module Bouvier
class Client
include HTTParty
format :xml
base_uri "http://www.eot.state.ma.us"
# Bouvier::Client.branches
def self.branches
response = get('/developers/downloads/qmaticXML.aspx')
Hashie::Mash.new(response['branches']).branch
end
# Bouvier::Client.branch("Danvers")
def self.branch(town_name)
self.branches.detect{|b| b.town == town_name }
end
end
end
| require 'httparty'
require 'hashie'
module Bouvier
class Client
include HTTParty
format :xml
base_uri "http://www.massdot.state.ma.us"
# Bouvier::Client.branches
def self.branches
response = get('/feeds/qmaticxml/qmaticXML.aspx')
Hashie::Mash.new(response['branches']).branch
end
# Bouvier::Client.branch("Danvers")
def self.branch(town_name)
self.branches.detect{|b| b.town == town_name }
end
end
end
| Update to use new URLs. | Update to use new URLs. | Ruby | mit | bkaney/bouvier | ruby | ## Code Before:
require 'httparty'
require 'hashie'
module Bouvier
class Client
include HTTParty
format :xml
base_uri "http://www.eot.state.ma.us"
# Bouvier::Client.branches
def self.branches
response = get('/developers/downloads/qmaticXML.aspx')
Hashie::Mash.new(response['branches']).branch
end
# Bouvier::Client.branch("Danvers")
def self.branch(town_name)
self.branches.detect{|b| b.town == town_name }
end
end
end
## Instruction:
Update to use new URLs.
## Code After:
require 'httparty'
require 'hashie'
module Bouvier
class Client
include HTTParty
format :xml
base_uri "http://www.massdot.state.ma.us"
# Bouvier::Client.branches
def self.branches
response = get('/feeds/qmaticxml/qmaticXML.aspx')
Hashie::Mash.new(response['branches']).branch
end
# Bouvier::Client.branch("Danvers")
def self.branch(town_name)
self.branches.detect{|b| b.town == town_name }
end
end
end
| require 'httparty'
require 'hashie'
module Bouvier
class Client
-
+
include HTTParty
format :xml
- base_uri "http://www.eot.state.ma.us"
? ^
+ base_uri "http://www.massdot.state.ma.us"
? ^^^^^
# Bouvier::Client.branches
def self.branches
- response = get('/developers/downloads/qmaticXML.aspx')
? -------- ^^^^ ----
+ response = get('/feeds/qmaticxml/qmaticXML.aspx')
? +++ ^^^^^^^^
Hashie::Mash.new(response['branches']).branch
end
# Bouvier::Client.branch("Danvers")
def self.branch(town_name)
self.branches.detect{|b| b.town == town_name }
end
end
end
| 6 | 0.230769 | 3 | 3 |
7ed19462f93bbf8d494fa49fb90b51ebcd53e2fc | templates/iot/operator/030-ClusterRoleBinding-iot-operator.yaml | templates/iot/operator/030-ClusterRoleBinding-iot-operator.yaml | apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
name: iot-operator
labels:
app: enmasse
roleRef:
apiGroup: rbac.authorization.k8s.io
kind: ClusterRole
name: enmasse.io:iot-operator
subjects:
- kind: ServiceAccount
name: iot-operator
namespace: ${NAMESPACE}
| apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
name: "enmasse.io:iot-operator-${NAMESPACE}"
labels:
app: enmasse
roleRef:
apiGroup: rbac.authorization.k8s.io
kind: ClusterRole
name: enmasse.io:iot-operator
subjects:
- kind: ServiceAccount
name: iot-operator
namespace: ${NAMESPACE}
| Align the names with the other binding | Align the names with the other binding | YAML | apache-2.0 | EnMasseProject/enmasse,jenmalloy/enmasse,EnMasseProject/enmasse,jenmalloy/enmasse,EnMasseProject/enmasse,EnMasseProject/enmasse,jenmalloy/enmasse,EnMasseProject/enmasse,jenmalloy/enmasse,jenmalloy/enmasse,jenmalloy/enmasse,jenmalloy/enmasse,EnMasseProject/enmasse,EnMasseProject/enmasse | yaml | ## Code Before:
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
name: iot-operator
labels:
app: enmasse
roleRef:
apiGroup: rbac.authorization.k8s.io
kind: ClusterRole
name: enmasse.io:iot-operator
subjects:
- kind: ServiceAccount
name: iot-operator
namespace: ${NAMESPACE}
## Instruction:
Align the names with the other binding
## Code After:
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
name: "enmasse.io:iot-operator-${NAMESPACE}"
labels:
app: enmasse
roleRef:
apiGroup: rbac.authorization.k8s.io
kind: ClusterRole
name: enmasse.io:iot-operator
subjects:
- kind: ServiceAccount
name: iot-operator
namespace: ${NAMESPACE}
| apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
- name: iot-operator
+ name: "enmasse.io:iot-operator-${NAMESPACE}"
labels:
app: enmasse
roleRef:
apiGroup: rbac.authorization.k8s.io
kind: ClusterRole
name: enmasse.io:iot-operator
subjects:
- kind: ServiceAccount
name: iot-operator
namespace: ${NAMESPACE} | 2 | 0.142857 | 1 | 1 |
6252cd18d4c9cb32cae0833863539a845585cf9c | lib/smart_api/controller.rb | lib/smart_api/controller.rb | module SmartApi
module Controller
extend ActiveSupport::Concern
include ActionController::ConditionalGet
include ActionController::Head
include ActionController::Instrumentation
include ActionController::MimeResponds
include ActionController::Redirecting
include ActionController::Rendering
include ActionController::Rescue
include ActionController::UrlFor
include Rails.application.routes.url_helpers
included do
self.responder = SmartApi::Responder
class_attribute :_endpoint_descriptors
self._endpoint_descriptors = {}
end
module ClassMethods
def endpoint_descriptor_for(action)
self._endpoint_descriptors[action]
end
def desc(action_name, *args)
self._endpoint_descriptors[action_name] = Dsl.desc(action_name, *args)
end
end
def params
return @_smart_api_params if @_smart_api_params
desc = self.class.endpoint_descriptor_for(action_name.to_sym)
@_smart_api_params = ParamsHandler.new(desc.params).handle(super)
end
end
end | module SmartApi
module Controller
extend ActiveSupport::Concern
include AbstractController::Callbacks
include ActionController::ConditionalGet
include ActionController::Head
include ActionController::Instrumentation
include ActionController::MimeResponds
include ActionController::Redirecting
include ActionController::Renderers::All
include ActionController::Rendering
include ActionController::Rescue
include ActionController::UrlFor
include Rails.application.routes.url_helpers
included do
append_view_path Rails.root.join("app", "views")
self.responder = SmartApi::Responder
class_attribute :_endpoint_descriptors
self._endpoint_descriptors = {}
end
module ClassMethods
def endpoint_descriptor_for(action)
self._endpoint_descriptors[action]
end
def desc(action_name, *args)
self._endpoint_descriptors[action_name] = Dsl.desc(action_name, *args)
end
end
def params
return @_smart_api_params if @_smart_api_params
desc = self.class.endpoint_descriptor_for(action_name.to_sym)
@_smart_api_params = ParamsHandler.new(desc.params).handle(super)
end
end
end | Add common modules and allow view rendering | Add common modules and allow view rendering
| Ruby | mit | divoxx/smart_api,divoxx/smart_api | ruby | ## Code Before:
module SmartApi
module Controller
extend ActiveSupport::Concern
include ActionController::ConditionalGet
include ActionController::Head
include ActionController::Instrumentation
include ActionController::MimeResponds
include ActionController::Redirecting
include ActionController::Rendering
include ActionController::Rescue
include ActionController::UrlFor
include Rails.application.routes.url_helpers
included do
self.responder = SmartApi::Responder
class_attribute :_endpoint_descriptors
self._endpoint_descriptors = {}
end
module ClassMethods
def endpoint_descriptor_for(action)
self._endpoint_descriptors[action]
end
def desc(action_name, *args)
self._endpoint_descriptors[action_name] = Dsl.desc(action_name, *args)
end
end
def params
return @_smart_api_params if @_smart_api_params
desc = self.class.endpoint_descriptor_for(action_name.to_sym)
@_smart_api_params = ParamsHandler.new(desc.params).handle(super)
end
end
end
## Instruction:
Add common modules and allow view rendering
## Code After:
module SmartApi
module Controller
extend ActiveSupport::Concern
include AbstractController::Callbacks
include ActionController::ConditionalGet
include ActionController::Head
include ActionController::Instrumentation
include ActionController::MimeResponds
include ActionController::Redirecting
include ActionController::Renderers::All
include ActionController::Rendering
include ActionController::Rescue
include ActionController::UrlFor
include Rails.application.routes.url_helpers
included do
append_view_path Rails.root.join("app", "views")
self.responder = SmartApi::Responder
class_attribute :_endpoint_descriptors
self._endpoint_descriptors = {}
end
module ClassMethods
def endpoint_descriptor_for(action)
self._endpoint_descriptors[action]
end
def desc(action_name, *args)
self._endpoint_descriptors[action_name] = Dsl.desc(action_name, *args)
end
end
def params
return @_smart_api_params if @_smart_api_params
desc = self.class.endpoint_descriptor_for(action_name.to_sym)
@_smart_api_params = ParamsHandler.new(desc.params).handle(super)
end
end
end | module SmartApi
module Controller
extend ActiveSupport::Concern
+ include AbstractController::Callbacks
include ActionController::ConditionalGet
include ActionController::Head
include ActionController::Instrumentation
include ActionController::MimeResponds
include ActionController::Redirecting
+ include ActionController::Renderers::All
include ActionController::Rendering
include ActionController::Rescue
include ActionController::UrlFor
include Rails.application.routes.url_helpers
included do
+ append_view_path Rails.root.join("app", "views")
+
self.responder = SmartApi::Responder
class_attribute :_endpoint_descriptors
self._endpoint_descriptors = {}
end
module ClassMethods
def endpoint_descriptor_for(action)
self._endpoint_descriptors[action]
end
def desc(action_name, *args)
self._endpoint_descriptors[action_name] = Dsl.desc(action_name, *args)
end
end
def params
return @_smart_api_params if @_smart_api_params
desc = self.class.endpoint_descriptor_for(action_name.to_sym)
@_smart_api_params = ParamsHandler.new(desc.params).handle(super)
end
end
end | 4 | 0.105263 | 4 | 0 |
15ff64d69480a9bce6adebae229c3c2c38c4101a | .vscode/settings.json | .vscode/settings.json | // Place your settings in this file to overwrite default and user settings.
{
"editor.tabSize": 2,
// When opening a file, `editor.tabSize` and `editor.insertSpaces` will be detected based on the file contents.
"editor.detectIndentation": false,
"editor.insertSpaces": true,
"editor.useTabStops": false
} | // Place your settings in this file to overwrite default and user settings.
{
"editor.tabSize": 2,
// When opening a file, `editor.tabSize` and `editor.insertSpaces` will be detected based on the file contents.
"editor.detectIndentation": false,
"editor.insertSpaces": true,
"editor.useTabStops": false,
"typescript.tsdk": "./node_modules/typescript/lib"
}
| Use TypeScript from node_modules to get same behaviour in webpack and VSC | VSC: Use TypeScript from node_modules to get same behaviour in webpack and VSC
| JSON | mit | DJCordhose/react-workshop,DJCordhose/react-workshop,DJCordhose/react-workshop,DJCordhose/react-workshop | json | ## Code Before:
// Place your settings in this file to overwrite default and user settings.
{
"editor.tabSize": 2,
// When opening a file, `editor.tabSize` and `editor.insertSpaces` will be detected based on the file contents.
"editor.detectIndentation": false,
"editor.insertSpaces": true,
"editor.useTabStops": false
}
## Instruction:
VSC: Use TypeScript from node_modules to get same behaviour in webpack and VSC
## Code After:
// Place your settings in this file to overwrite default and user settings.
{
"editor.tabSize": 2,
// When opening a file, `editor.tabSize` and `editor.insertSpaces` will be detected based on the file contents.
"editor.detectIndentation": false,
"editor.insertSpaces": true,
"editor.useTabStops": false,
"typescript.tsdk": "./node_modules/typescript/lib"
}
| // Place your settings in this file to overwrite default and user settings.
{
- "editor.tabSize": 2,
? ^^^^
+ "editor.tabSize": 2,
? ^
- // When opening a file, `editor.tabSize` and `editor.insertSpaces` will be detected based on the file contents.
? ^^^^
+ // When opening a file, `editor.tabSize` and `editor.insertSpaces` will be detected based on the file contents.
? ^
- "editor.detectIndentation": false,
? ^^^^
+ "editor.detectIndentation": false,
? ^
- "editor.insertSpaces": true,
? ^^^^
+ "editor.insertSpaces": true,
? ^
- "editor.useTabStops": false
? ^^^^^^^^
+ "editor.useTabStops": false,
? ^ +
-
+ "typescript.tsdk": "./node_modules/typescript/lib"
} | 12 | 1.333333 | 6 | 6 |
473c8748c4f8b33e51da2f4890cfe50a5aef3f29 | tests/test_variations.py | tests/test_variations.py | from nose import tools
import numpy as np
from trials.variations import *
class TestBernoulli:
def setup(self):
self.x = BernoulliVariation(1, 1)
def test_update(self):
self.x.update(100, 20)
def test_sample(self):
s1 = self.x.sample(10)
tools.assert_equals(len(s1), 10)
s2 = self.x.sample(10)
tools.assert_true(np.all(s1 == s2))
self.x.update(10, 30)
s3 = self.x.sample(10)
tools.assert_false(np.all(s2 == s3))
| from nose import tools
import numpy as np
from trials.variations import *
class TestBernoulli:
def setup(self):
self.x = BernoulliVariation(1, 1)
def test_update(self):
self.x.update(100, 20)
self.x.update(200, 10)
tools.assert_true(self.x.alpha == 301)
tools.assert_true(self.x.beta == 31)
def test_sample(self):
s1 = self.x.sample(10)
tools.assert_equals(len(s1), 10)
s2 = self.x.sample(10)
tools.assert_true(np.all(s1 == s2))
self.x.update(10, 30)
s3 = self.x.sample(10)
tools.assert_false(np.all(s2 == s3))
| Add a test for update correctness | Add a test for update correctness
| Python | mit | bogdan-kulynych/trials | python | ## Code Before:
from nose import tools
import numpy as np
from trials.variations import *
class TestBernoulli:
def setup(self):
self.x = BernoulliVariation(1, 1)
def test_update(self):
self.x.update(100, 20)
def test_sample(self):
s1 = self.x.sample(10)
tools.assert_equals(len(s1), 10)
s2 = self.x.sample(10)
tools.assert_true(np.all(s1 == s2))
self.x.update(10, 30)
s3 = self.x.sample(10)
tools.assert_false(np.all(s2 == s3))
## Instruction:
Add a test for update correctness
## Code After:
from nose import tools
import numpy as np
from trials.variations import *
class TestBernoulli:
def setup(self):
self.x = BernoulliVariation(1, 1)
def test_update(self):
self.x.update(100, 20)
self.x.update(200, 10)
tools.assert_true(self.x.alpha == 301)
tools.assert_true(self.x.beta == 31)
def test_sample(self):
s1 = self.x.sample(10)
tools.assert_equals(len(s1), 10)
s2 = self.x.sample(10)
tools.assert_true(np.all(s1 == s2))
self.x.update(10, 30)
s3 = self.x.sample(10)
tools.assert_false(np.all(s2 == s3))
| from nose import tools
import numpy as np
from trials.variations import *
class TestBernoulli:
def setup(self):
self.x = BernoulliVariation(1, 1)
def test_update(self):
self.x.update(100, 20)
+ self.x.update(200, 10)
+ tools.assert_true(self.x.alpha == 301)
+ tools.assert_true(self.x.beta == 31)
def test_sample(self):
s1 = self.x.sample(10)
tools.assert_equals(len(s1), 10)
s2 = self.x.sample(10)
tools.assert_true(np.all(s1 == s2))
self.x.update(10, 30)
s3 = self.x.sample(10)
tools.assert_false(np.all(s2 == s3)) | 3 | 0.136364 | 3 | 0 |
c1629417fac6ad44e57cc543b231804069a4b791 | about/index.html | about/index.html | ---
layout: default
title: about
pageId: about
permalink: /about/
weight: 2
sitemap: false
scripts:
- /assets/scripts/pages/about/formMail.js
---
<div class="contact">
<div class="social">
</div>
<div class="formMail">
<form id="contactForm" action="//formspree.io/1up@1upz.com" method="post">
<input type="text" name="name" placeholder="Your Name">
<input type="email" name="_replyto" placeholder="Your email">
<textarea name="message" rows="5" placeholder="Message"></textarea>
<input type="submit" value="Send">
</form>
</div>
</div>
| ---
layout: default
title: about
pageId: about
permalink: /about/
weight: 2
sitemap: false
scripts:
- /assets/scripts/pages/about/formMail.js
---
<div class="contact">
<div class="social">
</div>
<div class="formMail">
<form id="contactForm" action="{{ site.author.email | prepend: '//formspree.io/'}}" method="post">
<input type="text" name="name" placeholder="Your Name">
<input type="email" name="_replyto" placeholder="Your email">
<textarea name="message" rows="5" placeholder="Message"></textarea>
<input type="submit" value="Send">
</form>
</div>
</div>
| Replace hard-coded `action` value with site variables | Replace hard-coded `action` value with site variables
| HTML | mit | woneob/1upnote,woneob/1upnote,woneob/woneob.github.io,woneob/woneob.github.io,woneob/1upnote,woneob/woneob.github.io | html | ## Code Before:
---
layout: default
title: about
pageId: about
permalink: /about/
weight: 2
sitemap: false
scripts:
- /assets/scripts/pages/about/formMail.js
---
<div class="contact">
<div class="social">
</div>
<div class="formMail">
<form id="contactForm" action="//formspree.io/1up@1upz.com" method="post">
<input type="text" name="name" placeholder="Your Name">
<input type="email" name="_replyto" placeholder="Your email">
<textarea name="message" rows="5" placeholder="Message"></textarea>
<input type="submit" value="Send">
</form>
</div>
</div>
## Instruction:
Replace hard-coded `action` value with site variables
## Code After:
---
layout: default
title: about
pageId: about
permalink: /about/
weight: 2
sitemap: false
scripts:
- /assets/scripts/pages/about/formMail.js
---
<div class="contact">
<div class="social">
</div>
<div class="formMail">
<form id="contactForm" action="{{ site.author.email | prepend: '//formspree.io/'}}" method="post">
<input type="text" name="name" placeholder="Your Name">
<input type="email" name="_replyto" placeholder="Your email">
<textarea name="message" rows="5" placeholder="Message"></textarea>
<input type="submit" value="Send">
</form>
</div>
</div>
| ---
layout: default
title: about
pageId: about
permalink: /about/
weight: 2
sitemap: false
scripts:
- /assets/scripts/pages/about/formMail.js
---
<div class="contact">
<div class="social">
</div>
<div class="formMail">
- <form id="contactForm" action="//formspree.io/1up@1upz.com" method="post">
+ <form id="contactForm" action="{{ site.author.email | prepend: '//formspree.io/'}}" method="post">
<input type="text" name="name" placeholder="Your Name">
<input type="email" name="_replyto" placeholder="Your email">
<textarea name="message" rows="5" placeholder="Message"></textarea>
<input type="submit" value="Send">
</form>
</div>
</div> | 2 | 0.086957 | 1 | 1 |
819a05bfae94b284ad9428ea17f08bea287989c7 | pkgs/servers/sql/mysql55/default.nix | pkgs/servers/sql/mysql55/default.nix | {stdenv, fetchurl, cmake, bison, ncurses, openssl, readline, zlib, darwinInstallNameToolUtility, perl}:
# Note: zlib is not required; MySQL can use an internal zlib.
stdenv.mkDerivation {
name = "mysql-5.5.20";
src = fetchurl {
url = ftp://mirror.leaseweb.com/mysql/Downloads/MySQL-5.5/mysql-5.5.20.tar.gz;
sha256 = "03jl60mzrsd1jb8fvkz6c8j2239b37k8n1i07jk1q4yk58aq8ynh";
};
buildInputs = [ cmake bison ncurses openssl readline zlib ] ++ stdenv.lib.optionals stdenv.isDarwin [ darwinInstallNameToolUtility perl ];
cmakeFlags = "-DWITH_SSL=yes -DWITH_READLINE=yes -DWITH_EMBEDDED_SERVER=yes -DWITH_ZLIB=yes -DINSTALL_SCRIPTDIR=bin -DHAVE_IPV6=yes";
NIX_LDFLAGS = stdenv.lib.optionalString stdenv.isLinux "-lgcc_s";
postInstall = ''
sed -i -e "s|basedir=\"\"|basedir=\"$out\"|" $out/bin/mysql_install_db
rm -rf $out/mysql-test $out/sql-bench
'';
meta = {
homepage = http://www.mysql.com/;
description = "The world's most popular open source database";
};
}
| {stdenv, fetchurl, cmake, bison, ncurses, openssl, readline, zlib, darwinInstallNameToolUtility, perl}:
# Note: zlib is not required; MySQL can use an internal zlib.
stdenv.mkDerivation {
name = "mysql-5.5.23";
src = fetchurl {
url = ftp://ftp.inria.fr/pub/MySQL/Downloads/MySQL-5.5/mysql-5.5.23.tar.gz;
sha256 = "0sklcz6miff7nb6bi1pqncgjv819255y7if6jxcqgiqs50z319i0";
};
buildInputs = [ cmake bison ncurses openssl readline zlib ] ++ stdenv.lib.optionals stdenv.isDarwin [ darwinInstallNameToolUtility perl ];
cmakeFlags = "-DWITH_SSL=yes -DWITH_READLINE=yes -DWITH_EMBEDDED_SERVER=yes -DWITH_ZLIB=yes -DINSTALL_SCRIPTDIR=bin -DHAVE_IPV6=yes";
NIX_LDFLAGS = stdenv.lib.optionalString stdenv.isLinux "-lgcc_s";
postInstall = ''
sed -i -e "s|basedir=\"\"|basedir=\"$out\"|" $out/bin/mysql_install_db
rm -rf $out/mysql-test $out/sql-bench
'';
meta = {
homepage = http://www.mysql.com/;
description = "The world's most popular open source database";
};
}
| Update mysql 5.5 to 5.5.23 and fix no longer responding URL. | Update mysql 5.5 to 5.5.23 and fix no longer responding URL.
svn path=/nixpkgs/trunk/; revision=33872
| Nix | mit | triton/triton,NixOS/nixpkgs,NixOS/nixpkgs,triton/triton,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,triton/triton,NixOS/nixpkgs,triton/triton,NixOS/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,triton/triton,triton/triton,NixOS/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,triton/triton,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,triton/triton,NixOS/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs | nix | ## Code Before:
{stdenv, fetchurl, cmake, bison, ncurses, openssl, readline, zlib, darwinInstallNameToolUtility, perl}:
# Note: zlib is not required; MySQL can use an internal zlib.
stdenv.mkDerivation {
name = "mysql-5.5.20";
src = fetchurl {
url = ftp://mirror.leaseweb.com/mysql/Downloads/MySQL-5.5/mysql-5.5.20.tar.gz;
sha256 = "03jl60mzrsd1jb8fvkz6c8j2239b37k8n1i07jk1q4yk58aq8ynh";
};
buildInputs = [ cmake bison ncurses openssl readline zlib ] ++ stdenv.lib.optionals stdenv.isDarwin [ darwinInstallNameToolUtility perl ];
cmakeFlags = "-DWITH_SSL=yes -DWITH_READLINE=yes -DWITH_EMBEDDED_SERVER=yes -DWITH_ZLIB=yes -DINSTALL_SCRIPTDIR=bin -DHAVE_IPV6=yes";
NIX_LDFLAGS = stdenv.lib.optionalString stdenv.isLinux "-lgcc_s";
postInstall = ''
sed -i -e "s|basedir=\"\"|basedir=\"$out\"|" $out/bin/mysql_install_db
rm -rf $out/mysql-test $out/sql-bench
'';
meta = {
homepage = http://www.mysql.com/;
description = "The world's most popular open source database";
};
}
## Instruction:
Update mysql 5.5 to 5.5.23 and fix no longer responding URL.
svn path=/nixpkgs/trunk/; revision=33872
## Code After:
{stdenv, fetchurl, cmake, bison, ncurses, openssl, readline, zlib, darwinInstallNameToolUtility, perl}:
# Note: zlib is not required; MySQL can use an internal zlib.
stdenv.mkDerivation {
name = "mysql-5.5.23";
src = fetchurl {
url = ftp://ftp.inria.fr/pub/MySQL/Downloads/MySQL-5.5/mysql-5.5.23.tar.gz;
sha256 = "0sklcz6miff7nb6bi1pqncgjv819255y7if6jxcqgiqs50z319i0";
};
buildInputs = [ cmake bison ncurses openssl readline zlib ] ++ stdenv.lib.optionals stdenv.isDarwin [ darwinInstallNameToolUtility perl ];
cmakeFlags = "-DWITH_SSL=yes -DWITH_READLINE=yes -DWITH_EMBEDDED_SERVER=yes -DWITH_ZLIB=yes -DINSTALL_SCRIPTDIR=bin -DHAVE_IPV6=yes";
NIX_LDFLAGS = stdenv.lib.optionalString stdenv.isLinux "-lgcc_s";
postInstall = ''
sed -i -e "s|basedir=\"\"|basedir=\"$out\"|" $out/bin/mysql_install_db
rm -rf $out/mysql-test $out/sql-bench
'';
meta = {
homepage = http://www.mysql.com/;
description = "The world's most popular open source database";
};
}
| {stdenv, fetchurl, cmake, bison, ncurses, openssl, readline, zlib, darwinInstallNameToolUtility, perl}:
# Note: zlib is not required; MySQL can use an internal zlib.
stdenv.mkDerivation {
- name = "mysql-5.5.20";
? ^
+ name = "mysql-5.5.23";
? ^
src = fetchurl {
- url = ftp://mirror.leaseweb.com/mysql/Downloads/MySQL-5.5/mysql-5.5.20.tar.gz;
? ^ ^^^^^^^^^^ ---- ^ ^^^ ^
+ url = ftp://ftp.inria.fr/pub/MySQL/Downloads/MySQL-5.5/mysql-5.5.23.tar.gz;
? ^^^^ + ++++ ^^^ ^ ^^^ ^
- sha256 = "03jl60mzrsd1jb8fvkz6c8j2239b37k8n1i07jk1q4yk58aq8ynh";
+ sha256 = "0sklcz6miff7nb6bi1pqncgjv819255y7if6jxcqgiqs50z319i0";
};
buildInputs = [ cmake bison ncurses openssl readline zlib ] ++ stdenv.lib.optionals stdenv.isDarwin [ darwinInstallNameToolUtility perl ];
cmakeFlags = "-DWITH_SSL=yes -DWITH_READLINE=yes -DWITH_EMBEDDED_SERVER=yes -DWITH_ZLIB=yes -DINSTALL_SCRIPTDIR=bin -DHAVE_IPV6=yes";
NIX_LDFLAGS = stdenv.lib.optionalString stdenv.isLinux "-lgcc_s";
postInstall = ''
sed -i -e "s|basedir=\"\"|basedir=\"$out\"|" $out/bin/mysql_install_db
rm -rf $out/mysql-test $out/sql-bench
'';
meta = {
homepage = http://www.mysql.com/;
description = "The world's most popular open source database";
};
} | 6 | 0.214286 | 3 | 3 |
f553be7ff1e081ccc9b91aeb6c696e10af23721b | your-bot-here.js | your-bot-here.js | var Util = require("util");
var Bot = require("./lib/irc");
var YourBot = function(profile) {
Bot.call(this, profile);
this.set_log_level(this.LOG_ALL);
this.set_command_identifier("!"); // Exclamation
};
Util.inherits(YourBot, Bot);
YourBot.prototype.init = function() {
Bot.prototype.init.call(this);
this.register_command("ping", this.ping);
this.on('command_not_found', this.unrecognized);
};
YourBot.prototype.ping = function(cx, text) {
cx.channel.send_reply (cx.sender, "Pong!");
};
YourBot.prototype.unrecognized = function(cx, text) {
cx.channel.send_reply(cx.sender, "There is no command: "+text);
};
var profile = [{
host: "irc.freenode.net",
port: 6667,
nick: "mybot",
password: "password_to_authenticate",
user: "username",
real: "Real Name",
channels: ["#channels", "#to", "#join"]
}];
(new YourBot(profile)).init();
| var Util = require("util");
var Bot = require("./lib/irc");
var YourBot = function(profile) {
Bot.call(this, profile);
this.set_log_level(this.LOG_ALL);
this.set_trigger("!"); // Exclamation
};
Util.inherits(YourBot, Bot);
YourBot.prototype.init = function() {
Bot.prototype.init.call(this);
this.register_command("ping", this.ping);
this.on('command_not_found', this.unrecognized);
};
YourBot.prototype.ping = function(cx, text) {
cx.channel.send_reply (cx.sender, "Pong!");
};
YourBot.prototype.unrecognized = function(cx, text) {
cx.channel.send_reply(cx.sender, "There is no command: "+text);
};
var profile = [{
host: "irc.freenode.net",
port: 6667,
nick: "mybot",
password: "password_to_authenticate",
user: "username",
real: "Real Name",
channels: ["#channels", "#to", "#join"]
}];
(new YourBot(profile)).init();
| Remove deprecated call to set_command_identifier | Remove deprecated call to set_command_identifier
| JavaScript | mit | robertmaxrees/r0b0t,brigand/oftn-bot,brigand/oftn-bot,robotlolita/nanobot,robertmaxrees/r0b0t,oftn/oftn-bot,oftn/oftn-bot,brigand/oftn-bot,robertmaxrees/r0b0t,vnen/nanobot,oftn/oftn-bot,vnen/nanobot,robotlolita/nanobot,NinjaBanjo/vbot | javascript | ## Code Before:
var Util = require("util");
var Bot = require("./lib/irc");
var YourBot = function(profile) {
Bot.call(this, profile);
this.set_log_level(this.LOG_ALL);
this.set_command_identifier("!"); // Exclamation
};
Util.inherits(YourBot, Bot);
YourBot.prototype.init = function() {
Bot.prototype.init.call(this);
this.register_command("ping", this.ping);
this.on('command_not_found', this.unrecognized);
};
YourBot.prototype.ping = function(cx, text) {
cx.channel.send_reply (cx.sender, "Pong!");
};
YourBot.prototype.unrecognized = function(cx, text) {
cx.channel.send_reply(cx.sender, "There is no command: "+text);
};
var profile = [{
host: "irc.freenode.net",
port: 6667,
nick: "mybot",
password: "password_to_authenticate",
user: "username",
real: "Real Name",
channels: ["#channels", "#to", "#join"]
}];
(new YourBot(profile)).init();
## Instruction:
Remove deprecated call to set_command_identifier
## Code After:
var Util = require("util");
var Bot = require("./lib/irc");
var YourBot = function(profile) {
Bot.call(this, profile);
this.set_log_level(this.LOG_ALL);
this.set_trigger("!"); // Exclamation
};
Util.inherits(YourBot, Bot);
YourBot.prototype.init = function() {
Bot.prototype.init.call(this);
this.register_command("ping", this.ping);
this.on('command_not_found', this.unrecognized);
};
YourBot.prototype.ping = function(cx, text) {
cx.channel.send_reply (cx.sender, "Pong!");
};
YourBot.prototype.unrecognized = function(cx, text) {
cx.channel.send_reply(cx.sender, "There is no command: "+text);
};
var profile = [{
host: "irc.freenode.net",
port: 6667,
nick: "mybot",
password: "password_to_authenticate",
user: "username",
real: "Real Name",
channels: ["#channels", "#to", "#join"]
}];
(new YourBot(profile)).init();
| var Util = require("util");
var Bot = require("./lib/irc");
var YourBot = function(profile) {
Bot.call(this, profile);
this.set_log_level(this.LOG_ALL);
- this.set_command_identifier("!"); // Exclamation
? ^^^^^^^^ ^^^^^^^
+ this.set_trigger("!"); // Exclamation
? ^^ ^^
};
Util.inherits(YourBot, Bot);
YourBot.prototype.init = function() {
Bot.prototype.init.call(this);
this.register_command("ping", this.ping);
this.on('command_not_found', this.unrecognized);
};
YourBot.prototype.ping = function(cx, text) {
cx.channel.send_reply (cx.sender, "Pong!");
};
YourBot.prototype.unrecognized = function(cx, text) {
cx.channel.send_reply(cx.sender, "There is no command: "+text);
};
var profile = [{
host: "irc.freenode.net",
port: 6667,
nick: "mybot",
password: "password_to_authenticate",
user: "username",
real: "Real Name",
channels: ["#channels", "#to", "#join"]
}];
(new YourBot(profile)).init(); | 2 | 0.052632 | 1 | 1 |
ee683dc4fef68d73b7ed6200b54d0f7674610386 | memos/memo0.md | memos/memo0.md | Title
=====
Chapter 1
---------
This is a pen.
That is a pencil.
Chapter 2
---------
* foo
* bar
* baz
* zzz
| Title
=====
Chapter 1
---------
This is a pen.
That is a pencil.
Chapter 2
---------
* foo
* bar
* baz
* zzz
* [W3C](http://www.w3.org/)
* 
| Add link and image to memo | Add link and image to memo
| Markdown | mit | eqot/memo | markdown | ## Code Before:
Title
=====
Chapter 1
---------
This is a pen.
That is a pencil.
Chapter 2
---------
* foo
* bar
* baz
* zzz
## Instruction:
Add link and image to memo
## Code After:
Title
=====
Chapter 1
---------
This is a pen.
That is a pencil.
Chapter 2
---------
* foo
* bar
* baz
* zzz
* [W3C](http://www.w3.org/)
* 
| Title
=====
Chapter 1
---------
This is a pen.
That is a pencil.
Chapter 2
---------
* foo
* bar
* baz
* zzz
+
+ * [W3C](http://www.w3.org/)
+ *  | 3 | 0.176471 | 3 | 0 |
15d24f27fad5f6e37a393050049e33ac47dc27df | index.html | index.html | <!doctype html>
<meta charset="utf-8">
<title>Signed Service Workers</title>
<meta name="viewport" content="width=device-width, initial-scale=1">
<script>
(function () {
var jsStyle = document.createElement('style');
jsStyle.innerHTML = '#no-script{display:none}';
document.head.appendChild(jsStyle);
if (!navigator.serviceWorker) {
document.getElementById('output-message')
.innerHTML = 'Service Workers are not supported';
return;
}
navigator.serviceWorker.register('loader.js', { scope: '.' })
.then(function () {
document.getElementById('output-message')
.innerHTML = 'Service Worker is working!';
}, function (err) {
document.getElementById('output-message')
.innerHTML = 'An error ocurred, check developer console.';
// throw should work in native Promises:
throw err;
});
}());
</script>
<style>html{padding:1em;background:#f8f8f8;color:#333;text-align:center;font:2em sans-serif}</style>
<div id="no-script">Service Workers can only work if JavaScript is enabled.</div>
<div id="output-message"></div>
| <!doctype html>
<meta charset="utf-8">
<title>Signed Service Workers</title>
<meta name="viewport" content="width=device-width, initial-scale=1">
<script>
(function () {
var jsStyle = document.createElement('style');
jsStyle.innerHTML = '#no-script{display:none}';
document.head.appendChild(jsStyle);
}());
</script>
<style>html{padding:1em;background:#f8f8f8;color:#333;text-align:center;font:2em sans-serif}</style>
<div id="no-script">Service Workers can only work if JavaScript is enabled.</div>
<div id="output-message"></div>
<script>
(function () {
if (!navigator.serviceWorker) {
document.getElementById('output-message')
.innerHTML = 'Service Workers are not supported';
return;
}
navigator.serviceWorker.register('loader.js', { scope: '.' })
.then(function () {
document.getElementById('output-message')
.innerHTML = 'Service Worker is working!';
}, function (err) {
document.getElementById('output-message')
.innerHTML = 'An error ocurred, check developer console.';
// throw should work in native Promises:
throw err;
});
}());
</script>
| Move SW script to bottom | Move SW script to bottom
| HTML | mit | qgustavor/signed-service-workers,qgustavor/signed-service-workers | html | ## Code Before:
<!doctype html>
<meta charset="utf-8">
<title>Signed Service Workers</title>
<meta name="viewport" content="width=device-width, initial-scale=1">
<script>
(function () {
var jsStyle = document.createElement('style');
jsStyle.innerHTML = '#no-script{display:none}';
document.head.appendChild(jsStyle);
if (!navigator.serviceWorker) {
document.getElementById('output-message')
.innerHTML = 'Service Workers are not supported';
return;
}
navigator.serviceWorker.register('loader.js', { scope: '.' })
.then(function () {
document.getElementById('output-message')
.innerHTML = 'Service Worker is working!';
}, function (err) {
document.getElementById('output-message')
.innerHTML = 'An error ocurred, check developer console.';
// throw should work in native Promises:
throw err;
});
}());
</script>
<style>html{padding:1em;background:#f8f8f8;color:#333;text-align:center;font:2em sans-serif}</style>
<div id="no-script">Service Workers can only work if JavaScript is enabled.</div>
<div id="output-message"></div>
## Instruction:
Move SW script to bottom
## Code After:
<!doctype html>
<meta charset="utf-8">
<title>Signed Service Workers</title>
<meta name="viewport" content="width=device-width, initial-scale=1">
<script>
(function () {
var jsStyle = document.createElement('style');
jsStyle.innerHTML = '#no-script{display:none}';
document.head.appendChild(jsStyle);
}());
</script>
<style>html{padding:1em;background:#f8f8f8;color:#333;text-align:center;font:2em sans-serif}</style>
<div id="no-script">Service Workers can only work if JavaScript is enabled.</div>
<div id="output-message"></div>
<script>
(function () {
if (!navigator.serviceWorker) {
document.getElementById('output-message')
.innerHTML = 'Service Workers are not supported';
return;
}
navigator.serviceWorker.register('loader.js', { scope: '.' })
.then(function () {
document.getElementById('output-message')
.innerHTML = 'Service Worker is working!';
}, function (err) {
document.getElementById('output-message')
.innerHTML = 'An error ocurred, check developer console.';
// throw should work in native Promises:
throw err;
});
}());
</script>
| <!doctype html>
<meta charset="utf-8">
<title>Signed Service Workers</title>
<meta name="viewport" content="width=device-width, initial-scale=1">
<script>
(function () {
var jsStyle = document.createElement('style');
jsStyle.innerHTML = '#no-script{display:none}';
document.head.appendChild(jsStyle);
-
+ }());
+ </script>
+ <style>html{padding:1em;background:#f8f8f8;color:#333;text-align:center;font:2em sans-serif}</style>
+ <div id="no-script">Service Workers can only work if JavaScript is enabled.</div>
+ <div id="output-message"></div>
+ <script>
+ (function () {
if (!navigator.serviceWorker) {
document.getElementById('output-message')
.innerHTML = 'Service Workers are not supported';
return;
}
navigator.serviceWorker.register('loader.js', { scope: '.' })
.then(function () {
document.getElementById('output-message')
.innerHTML = 'Service Worker is working!';
}, function (err) {
document.getElementById('output-message')
.innerHTML = 'An error ocurred, check developer console.';
// throw should work in native Promises:
throw err;
});
}());
</script>
+
- <style>html{padding:1em;background:#f8f8f8;color:#333;text-align:center;font:2em sans-serif}</style>
- <div id="no-script">Service Workers can only work if JavaScript is enabled.</div>
- <div id="output-message"></div> | 12 | 0.375 | 8 | 4 |
0b25c365a468ecd9ed9b27c3cab840dad860736d | lib/app.jsx | lib/app.jsx | import React from 'react';
import { BrandLogo } from '../src';
// eslint-disable-next-line
import documentation from './loaders/jsdoc!../src';
export default function App() {
return <BrandLogo size={200} />;
}
| import React from 'react';
import { BrandLogo } from '../src';
import documentation from './loaders/jsdoc!../src'; // eslint-disable-line
export default function App() {
return <BrandLogo size={200} />;
}
| Disable eslint for a line inline | Disable eslint for a line inline
| JSX | mit | ahuth/mavenlink-ui-concept,mavenlink/mavenlink-ui-concept | jsx | ## Code Before:
import React from 'react';
import { BrandLogo } from '../src';
// eslint-disable-next-line
import documentation from './loaders/jsdoc!../src';
export default function App() {
return <BrandLogo size={200} />;
}
## Instruction:
Disable eslint for a line inline
## Code After:
import React from 'react';
import { BrandLogo } from '../src';
import documentation from './loaders/jsdoc!../src'; // eslint-disable-line
export default function App() {
return <BrandLogo size={200} />;
}
| import React from 'react';
import { BrandLogo } from '../src';
- // eslint-disable-next-line
- import documentation from './loaders/jsdoc!../src';
+ import documentation from './loaders/jsdoc!../src'; // eslint-disable-line
? +++++++++++++++++++++++
export default function App() {
return <BrandLogo size={200} />;
} | 3 | 0.375 | 1 | 2 |
8592652823cb90f2a21e0718718051aadedf96ae | lib/tic_tac_toe.rb | lib/tic_tac_toe.rb | require "tic_tac_toe/version"
module TicTacToe
class Game
def initialize display
end
def start
end
end
class Board
@@win_places = [[0, 1, 2],
[3, 4, 5],
[6, 7, 8],
[0, 3, 6],
[1, 4, 7],
[2, 5, 8],
[0, 4, 8],
[2, 4, 6]]
def self.win_places
@@win_places
end
def initialize
@_marks = Array.new(9, " ")
end
def pos index
@_marks[index]
end
def move index, mark
@_marks[index] = mark
end
def won
@@win_places.each do |places|
marks = places.map { |n| @_marks.at n }
return marks[0] if marks.all? { |m| m == marks[0] } and marks[0] != " "
end
nil
end
def draw?
true
end
end
end
| require "tic_tac_toe/version"
module TicTacToe
class Game
def initialize display
end
def start
end
end
class Board
@@win_places = [[0, 1, 2],
[3, 4, 5],
[6, 7, 8],
[0, 3, 6],
[1, 4, 7],
[2, 5, 8],
[0, 4, 8],
[2, 4, 6]]
def self.win_places
@@win_places
end
def initialize
@_marks = Array.new(9, " ")
end
def pos index
@_marks[index]
end
def move index, mark
@_marks[index] = mark
end
def won
@@win_places.each do |places|
marks = places.map { |n| @_marks.at n }
return marks[0] if marks.all? { |m| m == marks[0] } and marks[0] != " "
end
nil
end
def draw?
not won and @_marks.none? { |m| m == " " } ? true : false
end
end
end
| Implement a real draw? function | Implement a real draw? function
| Ruby | mit | RadicalZephyr/tic_tac_toe,RadicalZephyr/tic_tac_toe | ruby | ## Code Before:
require "tic_tac_toe/version"
module TicTacToe
class Game
def initialize display
end
def start
end
end
class Board
@@win_places = [[0, 1, 2],
[3, 4, 5],
[6, 7, 8],
[0, 3, 6],
[1, 4, 7],
[2, 5, 8],
[0, 4, 8],
[2, 4, 6]]
def self.win_places
@@win_places
end
def initialize
@_marks = Array.new(9, " ")
end
def pos index
@_marks[index]
end
def move index, mark
@_marks[index] = mark
end
def won
@@win_places.each do |places|
marks = places.map { |n| @_marks.at n }
return marks[0] if marks.all? { |m| m == marks[0] } and marks[0] != " "
end
nil
end
def draw?
true
end
end
end
## Instruction:
Implement a real draw? function
## Code After:
require "tic_tac_toe/version"
module TicTacToe
class Game
def initialize display
end
def start
end
end
class Board
@@win_places = [[0, 1, 2],
[3, 4, 5],
[6, 7, 8],
[0, 3, 6],
[1, 4, 7],
[2, 5, 8],
[0, 4, 8],
[2, 4, 6]]
def self.win_places
@@win_places
end
def initialize
@_marks = Array.new(9, " ")
end
def pos index
@_marks[index]
end
def move index, mark
@_marks[index] = mark
end
def won
@@win_places.each do |places|
marks = places.map { |n| @_marks.at n }
return marks[0] if marks.all? { |m| m == marks[0] } and marks[0] != " "
end
nil
end
def draw?
not won and @_marks.none? { |m| m == " " } ? true : false
end
end
end
| require "tic_tac_toe/version"
module TicTacToe
class Game
def initialize display
end
def start
end
end
class Board
@@win_places = [[0, 1, 2],
[3, 4, 5],
[6, 7, 8],
[0, 3, 6],
[1, 4, 7],
[2, 5, 8],
[0, 4, 8],
[2, 4, 6]]
def self.win_places
@@win_places
end
def initialize
@_marks = Array.new(9, " ")
end
def pos index
@_marks[index]
end
def move index, mark
@_marks[index] = mark
end
def won
@@win_places.each do |places|
marks = places.map { |n| @_marks.at n }
return marks[0] if marks.all? { |m| m == marks[0] } and marks[0] != " "
end
nil
end
def draw?
- true
+ not won and @_marks.none? { |m| m == " " } ? true : false
end
end
end | 2 | 0.035714 | 1 | 1 |
afbd1fdcaaba1bd82b38d30dbc93272122e2fc7c | util.rs | util.rs | use ll::stylesheet::css_fixed;
pub fn css_fixed_to_float(f: css_fixed) -> float {
const before: i32 = 10;
f as float * 1.0f / ((1i32 << before) as float)
}
| use ll::stylesheet::css_fixed;
use core::libc::c_void;
pub fn css_fixed_to_float(f: css_fixed) -> float {
const before: i32 = 10;
f as float * 1.0f / ((1i32 << before) as float)
}
pub trait VoidPtrLike {
static fn from_void_ptr(*c_void) -> self;
fn to_void_ptr(&self) -> *c_void;
} | Add VoidPtrLike trait for converting between *c_void | Add VoidPtrLike trait for converting between *c_void
| Rust | apache-2.0 | servo/rust-netsurfcss | rust | ## Code Before:
use ll::stylesheet::css_fixed;
pub fn css_fixed_to_float(f: css_fixed) -> float {
const before: i32 = 10;
f as float * 1.0f / ((1i32 << before) as float)
}
## Instruction:
Add VoidPtrLike trait for converting between *c_void
## Code After:
use ll::stylesheet::css_fixed;
use core::libc::c_void;
pub fn css_fixed_to_float(f: css_fixed) -> float {
const before: i32 = 10;
f as float * 1.0f / ((1i32 << before) as float)
}
pub trait VoidPtrLike {
static fn from_void_ptr(*c_void) -> self;
fn to_void_ptr(&self) -> *c_void;
} | use ll::stylesheet::css_fixed;
+ use core::libc::c_void;
pub fn css_fixed_to_float(f: css_fixed) -> float {
const before: i32 = 10;
f as float * 1.0f / ((1i32 << before) as float)
}
+
+ pub trait VoidPtrLike {
+ static fn from_void_ptr(*c_void) -> self;
+ fn to_void_ptr(&self) -> *c_void;
+ } | 6 | 1 | 6 | 0 |
31485cb31c82bd1170e1b5a104dcd04471b728dc | .concierge/config.json | .concierge/config.json | {
"contributorsUrl": "https://api.github.com/repos/AnalyticalGraphicsInc/cesium/contents/CONTRIBUTORS.md",
"maxDaysSinceUpdate": 30
}
| {
"contributorsUrl": "https://api.github.com/repos/AnalyticalGraphicsInc/cesium/contents/CONTRIBUTORS.md",
"unitTestPath" : "Specs/",
"maxDaysSinceUpdate": 30
}
| Add unitTestPath to concierge settings | Add unitTestPath to concierge settings | JSON | apache-2.0 | AnalyticalGraphicsInc/cesium,josh-bernstein/cesium,CesiumGS/cesium,YonatanKra/cesium,likangning93/cesium,hodbauer/cesium,progsung/cesium,soceur/cesium,CesiumGS/cesium,AnalyticalGraphicsInc/cesium,soceur/cesium,likangning93/cesium,hodbauer/cesium,progsung/cesium,hodbauer/cesium,geoscan/cesium,likangning93/cesium,kaktus40/cesium,YonatanKra/cesium,CesiumGS/cesium,soceur/cesium,josh-bernstein/cesium,likangning93/cesium,CesiumGS/cesium,likangning93/cesium,CesiumGS/cesium,YonatanKra/cesium,jasonbeverage/cesium,YonatanKra/cesium,kaktus40/cesium,geoscan/cesium,jasonbeverage/cesium | json | ## Code Before:
{
"contributorsUrl": "https://api.github.com/repos/AnalyticalGraphicsInc/cesium/contents/CONTRIBUTORS.md",
"maxDaysSinceUpdate": 30
}
## Instruction:
Add unitTestPath to concierge settings
## Code After:
{
"contributorsUrl": "https://api.github.com/repos/AnalyticalGraphicsInc/cesium/contents/CONTRIBUTORS.md",
"unitTestPath" : "Specs/",
"maxDaysSinceUpdate": 30
}
| {
"contributorsUrl": "https://api.github.com/repos/AnalyticalGraphicsInc/cesium/contents/CONTRIBUTORS.md",
+ "unitTestPath" : "Specs/",
"maxDaysSinceUpdate": 30
} | 1 | 0.25 | 1 | 0 |
cb5c3dd86ac28f44a6adceed78dc3759a72676ba | src/Bgy/OAuth2/ResourceOwner.php | src/Bgy/OAuth2/ResourceOwner.php | <?php
/**
* @author Boris Guéry <guery.b@gmail.com>
*/
namespace Bgy\OAuth2;
use Bgy\OAuth2\Utils\Ensure;
class ResourceOwner
{
private $resourceOwnerId, $resourceOwnerType;
public function __construct($resourceOwnerId, $resourceOwnerType)
{
Ensure::string($resourceOwnerId, $resourceOwnerType);
$this->resourceOwnerId = $resourceOwnerId;
$this->resourceOwnerType = $resourceOwnerType;
}
public function getResourceOwnerId()
{
return $this->resourceOwnerId;
}
public function getResourceOwnerType()
{
return $this->resourceOwnerType;
}
}
| <?php
/**
* @author Boris Guéry <guery.b@gmail.com>
*/
namespace Bgy\OAuth2;
use Bgy\OAuth2\Utils\Ensure;
class ResourceOwner
{
private $resourceOwnerId;
private $resourceOwnerType;
public function __construct($resourceOwnerId, $resourceOwnerType)
{
Ensure::string($resourceOwnerId, $resourceOwnerType);
$this->resourceOwnerId = $resourceOwnerId;
$this->resourceOwnerType = $resourceOwnerType;
}
public function getResourceOwnerId()
{
return $this->resourceOwnerId;
}
public function getResourceOwnerType()
{
return $this->resourceOwnerType;
}
}
| Declare each attributes on a separate statement | Declare each attributes on a separate statement
| PHP | mit | borisguery/oauth2-server | php | ## Code Before:
<?php
/**
* @author Boris Guéry <guery.b@gmail.com>
*/
namespace Bgy\OAuth2;
use Bgy\OAuth2\Utils\Ensure;
class ResourceOwner
{
private $resourceOwnerId, $resourceOwnerType;
public function __construct($resourceOwnerId, $resourceOwnerType)
{
Ensure::string($resourceOwnerId, $resourceOwnerType);
$this->resourceOwnerId = $resourceOwnerId;
$this->resourceOwnerType = $resourceOwnerType;
}
public function getResourceOwnerId()
{
return $this->resourceOwnerId;
}
public function getResourceOwnerType()
{
return $this->resourceOwnerType;
}
}
## Instruction:
Declare each attributes on a separate statement
## Code After:
<?php
/**
* @author Boris Guéry <guery.b@gmail.com>
*/
namespace Bgy\OAuth2;
use Bgy\OAuth2\Utils\Ensure;
class ResourceOwner
{
private $resourceOwnerId;
private $resourceOwnerType;
public function __construct($resourceOwnerId, $resourceOwnerType)
{
Ensure::string($resourceOwnerId, $resourceOwnerType);
$this->resourceOwnerId = $resourceOwnerId;
$this->resourceOwnerType = $resourceOwnerType;
}
public function getResourceOwnerId()
{
return $this->resourceOwnerId;
}
public function getResourceOwnerType()
{
return $this->resourceOwnerType;
}
}
| <?php
/**
* @author Boris Guéry <guery.b@gmail.com>
*/
namespace Bgy\OAuth2;
use Bgy\OAuth2\Utils\Ensure;
class ResourceOwner
{
+ private $resourceOwnerId;
- private $resourceOwnerId, $resourceOwnerType;
? ------------------
+ private $resourceOwnerType;
public function __construct($resourceOwnerId, $resourceOwnerType)
{
Ensure::string($resourceOwnerId, $resourceOwnerType);
$this->resourceOwnerId = $resourceOwnerId;
$this->resourceOwnerType = $resourceOwnerType;
}
public function getResourceOwnerId()
{
return $this->resourceOwnerId;
}
public function getResourceOwnerType()
{
return $this->resourceOwnerType;
}
} | 3 | 0.1 | 2 | 1 |
23b82eaf1c5e8f04ee0ff7fc4ce947989e42e070 | criticL/app/views/reviews/edit.html.erb | criticL/app/views/reviews/edit.html.erb | <h1>Editing Review</h1>
<%= render 'form' %>
<%= link_to 'Show', @review %> |
<%= link_to 'Back', reviews_path %>
| <h1>Editing Review</h1>
<%= render 'form' %>
<%= link_to 'Show', @review %> |
<%= link_to 'Back', movie_reviews_path(@review.movie) %>
| Fix other instances of bug | Fix other instances of bug
| HTML+ERB | mit | kmark1625/criticL,kmark1625/criticL,kmark1625/criticL | html+erb | ## Code Before:
<h1>Editing Review</h1>
<%= render 'form' %>
<%= link_to 'Show', @review %> |
<%= link_to 'Back', reviews_path %>
## Instruction:
Fix other instances of bug
## Code After:
<h1>Editing Review</h1>
<%= render 'form' %>
<%= link_to 'Show', @review %> |
<%= link_to 'Back', movie_reviews_path(@review.movie) %>
| <h1>Editing Review</h1>
<%= render 'form' %>
<%= link_to 'Show', @review %> |
- <%= link_to 'Back', reviews_path %>
+ <%= link_to 'Back', movie_reviews_path(@review.movie) %>
? ++++++ +++++++++++++++
| 2 | 0.333333 | 1 | 1 |
cc41dd94bc695e0eb4da3915efcc2559c195f6cf | test/jshint.js | test/jshint.js | "use strict";
var JSHINT = require("jshint").JSHINT,
assert = require("assert"),
glob = require("glob"),
path = require("path"),
fs = require("fs"),
_ = require('underscore');
var projectDir = path.normalize(path.join(__dirname, '..'));
var jsFiles = glob.sync(
"**/*.js",
{ cwd: projectDir }
);
var jshintrc = path.join(projectDir, '.jshintrc');
var jshintConfig = JSON.parse(fs.readFileSync(jshintrc).toString());
describe("Run jsHint on", function () {
jsFiles.forEach(function (file) {
if (/node_modules/.test(file)) {
return;
}
it(file, function () {
var content = fs.readFileSync(path.join(projectDir, file)).toString();
// Split the content into lines and replace whitespace only lines with
// empty strings so that this test behaviour mimics that of the command
// line tool.
var lines = _.map(
content.split(/[\n\r]/),
function (line) {
return (/^\s+$/).test(line) ? '' : line;
}
);
var success = JSHINT(
lines,
jshintConfig
);
var errorMessage = "";
if (!success) {
// console.log(JSHINT.errors[0]);
errorMessage = "line " + JSHINT.errors[0].line + ": " + JSHINT.errors[0].raw;
}
assert.ok(success, errorMessage);
});
});
});
| "use strict";
var JSHINT = require("jshint").JSHINT,
assert = require("assert"),
glob = require("glob"),
path = require("path"),
fs = require("fs"),
_ = require('underscore');
var projectDir = path.normalize(path.join(__dirname, '..'));
var jsFiles = glob.sync(
"**/*.js",
{ cwd: projectDir }
);
var jshintrc = path.join(projectDir, '.jshintrc');
var jshintConfig = JSON.parse(fs.readFileSync(jshintrc).toString());
describe("Run jsHint on", function () {
jsFiles.forEach(function (file) {
if (/node_modules/.test(file)) {
return;
}
it(file, function () {
var content = fs.readFileSync(path.join(projectDir, file)).toString();
// Split the content into lines and replace whitespace only lines with
// empty strings so that this test behaviour mimics that of the command
// line tool.
var lines = _.map(
content.split(/[\n\r]/),
function (line) {
return (/^\s+$/).test(line) ? '' : line;
}
);
var success = JSHINT(
lines,
jshintConfig
);
var errorMessage = "";
if (!success) {
var error = JSHINT.data().errors[0];
// console.log(error);
errorMessage = "line " + error.line + ": " + error.reason;
}
assert.ok(success, errorMessage);
});
});
});
| Return the correctly formatted error message | Return the correctly formatted error message
| JavaScript | agpl-3.0 | Sinar/popit-api,mysociety/popit-api,mysociety/popit-api,ciudadanointeligente/popit-api,Sinar/popit-api | javascript | ## Code Before:
"use strict";
var JSHINT = require("jshint").JSHINT,
assert = require("assert"),
glob = require("glob"),
path = require("path"),
fs = require("fs"),
_ = require('underscore');
var projectDir = path.normalize(path.join(__dirname, '..'));
var jsFiles = glob.sync(
"**/*.js",
{ cwd: projectDir }
);
var jshintrc = path.join(projectDir, '.jshintrc');
var jshintConfig = JSON.parse(fs.readFileSync(jshintrc).toString());
describe("Run jsHint on", function () {
jsFiles.forEach(function (file) {
if (/node_modules/.test(file)) {
return;
}
it(file, function () {
var content = fs.readFileSync(path.join(projectDir, file)).toString();
// Split the content into lines and replace whitespace only lines with
// empty strings so that this test behaviour mimics that of the command
// line tool.
var lines = _.map(
content.split(/[\n\r]/),
function (line) {
return (/^\s+$/).test(line) ? '' : line;
}
);
var success = JSHINT(
lines,
jshintConfig
);
var errorMessage = "";
if (!success) {
// console.log(JSHINT.errors[0]);
errorMessage = "line " + JSHINT.errors[0].line + ": " + JSHINT.errors[0].raw;
}
assert.ok(success, errorMessage);
});
});
});
## Instruction:
Return the correctly formatted error message
## Code After:
"use strict";
var JSHINT = require("jshint").JSHINT,
assert = require("assert"),
glob = require("glob"),
path = require("path"),
fs = require("fs"),
_ = require('underscore');
var projectDir = path.normalize(path.join(__dirname, '..'));
var jsFiles = glob.sync(
"**/*.js",
{ cwd: projectDir }
);
var jshintrc = path.join(projectDir, '.jshintrc');
var jshintConfig = JSON.parse(fs.readFileSync(jshintrc).toString());
describe("Run jsHint on", function () {
jsFiles.forEach(function (file) {
if (/node_modules/.test(file)) {
return;
}
it(file, function () {
var content = fs.readFileSync(path.join(projectDir, file)).toString();
// Split the content into lines and replace whitespace only lines with
// empty strings so that this test behaviour mimics that of the command
// line tool.
var lines = _.map(
content.split(/[\n\r]/),
function (line) {
return (/^\s+$/).test(line) ? '' : line;
}
);
var success = JSHINT(
lines,
jshintConfig
);
var errorMessage = "";
if (!success) {
var error = JSHINT.data().errors[0];
// console.log(error);
errorMessage = "line " + error.line + ": " + error.reason;
}
assert.ok(success, errorMessage);
});
});
});
| "use strict";
var JSHINT = require("jshint").JSHINT,
assert = require("assert"),
glob = require("glob"),
path = require("path"),
fs = require("fs"),
_ = require('underscore');
var projectDir = path.normalize(path.join(__dirname, '..'));
var jsFiles = glob.sync(
"**/*.js",
{ cwd: projectDir }
);
var jshintrc = path.join(projectDir, '.jshintrc');
var jshintConfig = JSON.parse(fs.readFileSync(jshintrc).toString());
describe("Run jsHint on", function () {
jsFiles.forEach(function (file) {
if (/node_modules/.test(file)) {
return;
}
it(file, function () {
var content = fs.readFileSync(path.join(projectDir, file)).toString();
// Split the content into lines and replace whitespace only lines with
// empty strings so that this test behaviour mimics that of the command
// line tool.
var lines = _.map(
content.split(/[\n\r]/),
function (line) {
return (/^\s+$/).test(line) ? '' : line;
}
);
var success = JSHINT(
lines,
jshintConfig
);
var errorMessage = "";
if (!success) {
+ var error = JSHINT.data().errors[0];
- // console.log(JSHINT.errors[0]);
? ------- ----
+ // console.log(error);
- errorMessage = "line " + JSHINT.errors[0].line + ": " + JSHINT.errors[0].raw;
? ------- ---- ------- ---- ^
+ errorMessage = "line " + error.line + ": " + error.reason;
? + ^^^
}
assert.ok(success, errorMessage);
});
});
}); | 5 | 0.087719 | 3 | 2 |
da9d52773d116bd481cec1ca74d43a1f6bdaf937 | spring-cloud-build-tools/src/checkstyle/checkstyle-suppressions.xml | spring-cloud-build-tools/src/checkstyle/checkstyle-suppressions.xml | <?xml version="1.0"?>
<!DOCTYPE suppressions PUBLIC
"-//Puppy Crawl//DTD Suppressions 1.1//EN"
"http://www.puppycrawl.com/dtds/suppressions_1_1.dtd">
<suppressions>
<suppress files="[\\/]src[\\/]test[\\/](java|groovy)[\\/]" checks="(Javadoc*|HideUtility*)" />
<suppress files=".*samples.*" checks="(Javadoc*|HideUtility*)" />
<suppress files="(.*Tests\.java|.*Tests\.groovy)" checks="Javadoc*" />
<suppress files="generated-sources" checks="[a-zA-Z0-9]*" />
</suppressions> | <?xml version="1.0"?>
<!DOCTYPE suppressions PUBLIC
"-//Puppy Crawl//DTD Suppressions 1.1//EN"
"http://www.puppycrawl.com/dtds/suppressions_1_1.dtd">
<suppressions>
<suppress files="[\\/]src[\\/]test[\\/](java|groovy)[\\/]" checks="(Javadoc*|HideUtility*)" />
<suppress files=".*samples.*" checks="(Javadoc*|HideUtility*)" />
<suppress files="(.*Tests\.java|.*Tests\.groovy)" checks="Javadoc*" />
<suppress files="generated-sources" checks="[a-zA-Z0-9]*" />
<suppress files=".*OkHttpRibbonResponse\.java" checks="IllegalImportCheck"/>
<suppress files=".*RibbonApacheHttpResponse\.java" checks="IllegalImportCheck"/>
</suppressions> | Add checkstyle suppressions for illegal imports necessary for some Netflix types. | Add checkstyle suppressions for illegal imports necessary for some Netflix types.
| XML | apache-2.0 | spring-cloud/spring-cloud-build,spring-cloud/spring-cloud-build | xml | ## Code Before:
<?xml version="1.0"?>
<!DOCTYPE suppressions PUBLIC
"-//Puppy Crawl//DTD Suppressions 1.1//EN"
"http://www.puppycrawl.com/dtds/suppressions_1_1.dtd">
<suppressions>
<suppress files="[\\/]src[\\/]test[\\/](java|groovy)[\\/]" checks="(Javadoc*|HideUtility*)" />
<suppress files=".*samples.*" checks="(Javadoc*|HideUtility*)" />
<suppress files="(.*Tests\.java|.*Tests\.groovy)" checks="Javadoc*" />
<suppress files="generated-sources" checks="[a-zA-Z0-9]*" />
</suppressions>
## Instruction:
Add checkstyle suppressions for illegal imports necessary for some Netflix types.
## Code After:
<?xml version="1.0"?>
<!DOCTYPE suppressions PUBLIC
"-//Puppy Crawl//DTD Suppressions 1.1//EN"
"http://www.puppycrawl.com/dtds/suppressions_1_1.dtd">
<suppressions>
<suppress files="[\\/]src[\\/]test[\\/](java|groovy)[\\/]" checks="(Javadoc*|HideUtility*)" />
<suppress files=".*samples.*" checks="(Javadoc*|HideUtility*)" />
<suppress files="(.*Tests\.java|.*Tests\.groovy)" checks="Javadoc*" />
<suppress files="generated-sources" checks="[a-zA-Z0-9]*" />
<suppress files=".*OkHttpRibbonResponse\.java" checks="IllegalImportCheck"/>
<suppress files=".*RibbonApacheHttpResponse\.java" checks="IllegalImportCheck"/>
</suppressions> | <?xml version="1.0"?>
<!DOCTYPE suppressions PUBLIC
"-//Puppy Crawl//DTD Suppressions 1.1//EN"
"http://www.puppycrawl.com/dtds/suppressions_1_1.dtd">
<suppressions>
<suppress files="[\\/]src[\\/]test[\\/](java|groovy)[\\/]" checks="(Javadoc*|HideUtility*)" />
<suppress files=".*samples.*" checks="(Javadoc*|HideUtility*)" />
<suppress files="(.*Tests\.java|.*Tests\.groovy)" checks="Javadoc*" />
<suppress files="generated-sources" checks="[a-zA-Z0-9]*" />
+ <suppress files=".*OkHttpRibbonResponse\.java" checks="IllegalImportCheck"/>
+ <suppress files=".*RibbonApacheHttpResponse\.java" checks="IllegalImportCheck"/>
</suppressions> | 2 | 0.2 | 2 | 0 |
498cbc2d6ec7fda7815ba15b52b254a1adb78516 | test/unit/rails-ranger.js | test/unit/rails-ranger.js | import RailsRanger from '../../src/rails-ranger'
describe('railsRanger', () => {
describe('.get', () => {
beforeEach(() => {
spy(RailsRanger.client, 'get')
})
it('triggers a get request through the client', () => {
let api = new RailsRanger()
api.get('/users')
expect(RailsRanger.client.get).to.have.been.calledOnce
})
it('interpolates parameters into the path', () => {
let api = new RailsRanger()
api.get('/users/:id', { id: 1 })
expect(RailsRanger.client.get).to.have.been.calledOnce
})
it('transforms remaining parameters into query params', () => {
expect(RailsRanger.greet).to.have.always.returned('hello')
})
})
})
| import RailsRanger from '../../src/rails-ranger'
describe('RailsRanger', () => {
let ranger = null
beforeEach(() => {
// Defines a new RailsRanger instance to be used in each test
ranger = new RailsRanger()
})
describe('.get', () => {
// Spy on client.get function
beforeEach(() => { spy(ranger.client, 'get') })
it('triggers a get request through the client', () => {
ranger.get('/users')
expect(ranger.client.get).to.have.been.calledOnce
})
it('interpolates parameters into the path', () => {
ranger.get('/users/:id', { id: 1 })
expect(ranger.client.get).to.have.been.calledWith('/users/1')
})
it('interpolates parameters into the query', () => {
ranger.get('/users/', { id: 1 })
expect(ranger.client.get).to.have.been.calledWith('/users/?id=1')
})
it('transforms remaining parameters from path into query', () => {
ranger.get('/users/:id', { id: 1, only: 'logged' })
expect(ranger.client.get).to.have.been.calledWith('/users/1?only=logged')
})
})
describe('.post', () => {
// Spy on client.post function
beforeEach(() => { spy(ranger.client, 'post') })
it('triggers a post request through the client', () => {
ranger.post('/users')
expect(ranger.client.post).to.have.been.calledOnce
})
it('interpolates parameters into the path', () => {
ranger.post('/users/:id', { id: 1 })
expect(ranger.client.post).to.have.been.calledWith('/users/1')
})
})
describe('.delete', () => {
// Spy on client.delete function
beforeEach(() => { spy(ranger.client, 'delete') })
it('triggers a delete request through the client', () => {
ranger.delete('/users')
expect(ranger.client.delete).to.have.been.calledOnce
})
it('interpolates parameters into the path', () => {
ranger.delete('/users/:id', { id: 1 })
expect(ranger.client.delete).to.have.been.calledWith('/users/1')
})
})
})
| Add tests for get, post and delete | Add tests for get, post and delete
| JavaScript | mit | victor-am/rails-ranger,victor-am/rails-ranger | javascript | ## Code Before:
import RailsRanger from '../../src/rails-ranger'
describe('railsRanger', () => {
describe('.get', () => {
beforeEach(() => {
spy(RailsRanger.client, 'get')
})
it('triggers a get request through the client', () => {
let api = new RailsRanger()
api.get('/users')
expect(RailsRanger.client.get).to.have.been.calledOnce
})
it('interpolates parameters into the path', () => {
let api = new RailsRanger()
api.get('/users/:id', { id: 1 })
expect(RailsRanger.client.get).to.have.been.calledOnce
})
it('transforms remaining parameters into query params', () => {
expect(RailsRanger.greet).to.have.always.returned('hello')
})
})
})
## Instruction:
Add tests for get, post and delete
## Code After:
import RailsRanger from '../../src/rails-ranger'
describe('RailsRanger', () => {
let ranger = null
beforeEach(() => {
// Defines a new RailsRanger instance to be used in each test
ranger = new RailsRanger()
})
describe('.get', () => {
// Spy on client.get function
beforeEach(() => { spy(ranger.client, 'get') })
it('triggers a get request through the client', () => {
ranger.get('/users')
expect(ranger.client.get).to.have.been.calledOnce
})
it('interpolates parameters into the path', () => {
ranger.get('/users/:id', { id: 1 })
expect(ranger.client.get).to.have.been.calledWith('/users/1')
})
it('interpolates parameters into the query', () => {
ranger.get('/users/', { id: 1 })
expect(ranger.client.get).to.have.been.calledWith('/users/?id=1')
})
it('transforms remaining parameters from path into query', () => {
ranger.get('/users/:id', { id: 1, only: 'logged' })
expect(ranger.client.get).to.have.been.calledWith('/users/1?only=logged')
})
})
describe('.post', () => {
// Spy on client.post function
beforeEach(() => { spy(ranger.client, 'post') })
it('triggers a post request through the client', () => {
ranger.post('/users')
expect(ranger.client.post).to.have.been.calledOnce
})
it('interpolates parameters into the path', () => {
ranger.post('/users/:id', { id: 1 })
expect(ranger.client.post).to.have.been.calledWith('/users/1')
})
})
describe('.delete', () => {
// Spy on client.delete function
beforeEach(() => { spy(ranger.client, 'delete') })
it('triggers a delete request through the client', () => {
ranger.delete('/users')
expect(ranger.client.delete).to.have.been.calledOnce
})
it('interpolates parameters into the path', () => {
ranger.delete('/users/:id', { id: 1 })
expect(ranger.client.delete).to.have.been.calledWith('/users/1')
})
})
})
| import RailsRanger from '../../src/rails-ranger'
- describe('railsRanger', () => {
? ^
+ describe('RailsRanger', () => {
? ^
+ let ranger = null
+
+ beforeEach(() => {
+ // Defines a new RailsRanger instance to be used in each test
+ ranger = new RailsRanger()
+ })
+
describe('.get', () => {
+ // Spy on client.get function
+ beforeEach(() => { spy(ranger.client, 'get') })
- beforeEach(() => {
- spy(RailsRanger.client, 'get')
- })
it('triggers a get request through the client', () => {
- let api = new RailsRanger()
- api.get('/users')
? ^^
+ ranger.get('/users')
? + ^^^^
- expect(RailsRanger.client.get).to.have.been.calledOnce
? ^^^^^^
+ expect(ranger.client.get).to.have.been.calledOnce
? ^
})
it('interpolates parameters into the path', () => {
- let api = new RailsRanger()
- api.get('/users/:id', { id: 1 })
? ^^
+ ranger.get('/users/:id', { id: 1 })
? + ^^^^
- expect(RailsRanger.client.get).to.have.been.calledOnce
? ^^^^^^ ^^^
+ expect(ranger.client.get).to.have.been.calledWith('/users/1')
? ^ ^^^^^^^^^ ++++++
})
+ it('interpolates parameters into the query', () => {
+ ranger.get('/users/', { id: 1 })
+ expect(ranger.client.get).to.have.been.calledWith('/users/?id=1')
+ })
+
- it('transforms remaining parameters into query params', () => {
? -------
+ it('transforms remaining parameters from path into query', () => {
? ++++++++++
- expect(RailsRanger.greet).to.have.always.returned('hello')
+ ranger.get('/users/:id', { id: 1, only: 'logged' })
+ expect(ranger.client.get).to.have.been.calledWith('/users/1?only=logged')
+ })
+ })
+
+ describe('.post', () => {
+ // Spy on client.post function
+ beforeEach(() => { spy(ranger.client, 'post') })
+
+ it('triggers a post request through the client', () => {
+ ranger.post('/users')
+ expect(ranger.client.post).to.have.been.calledOnce
+ })
+
+ it('interpolates parameters into the path', () => {
+ ranger.post('/users/:id', { id: 1 })
+ expect(ranger.client.post).to.have.been.calledWith('/users/1')
+ })
+ })
+
+ describe('.delete', () => {
+ // Spy on client.delete function
+ beforeEach(() => { spy(ranger.client, 'delete') })
+
+ it('triggers a delete request through the client', () => {
+ ranger.delete('/users')
+ expect(ranger.client.delete).to.have.been.calledOnce
+ })
+
+ it('interpolates parameters into the path', () => {
+ ranger.delete('/users/:id', { id: 1 })
+ expect(ranger.client.delete).to.have.been.calledWith('/users/1')
})
})
}) | 64 | 2.56 | 52 | 12 |
49eb666e700ef5e87e8122c2ac492dc9e0bdf2ba | lib/node_modules/@stdlib/utils/is-nan/examples/index.js | lib/node_modules/@stdlib/utils/is-nan/examples/index.js | 'use strict';
var isnan = require( './../lib' );
console.log( isnan( NaN ) );
// returns true
console.log( isnan( new Number( NaN ) ) );
// returns true
console.log( isnan( 5 ) );
// returns false
console.log( isnan( '5' ) );
// returns false
console.log( isnan( null ) );
// returns false
console.log( isnan( Symbol( 'NaN' ) ) );
// returns false
| 'use strict';
var hasSymbolSupport = require( '@stdlib/utils/detect-symbol-support' )();
var isnan = require( './../lib' );
console.log( isnan( NaN ) );
// returns true
console.log( isnan( new Number( NaN ) ) );
// returns true
console.log( isnan( 5 ) );
// returns false
console.log( isnan( '5' ) );
// returns false
console.log( isnan( null ) );
// returns false
if ( hasSymbolSupport ) {
console.log( isnan( Symbol( 'NaN' ) ) );
// returns false
}
| Use utility to detect Symbol support | Use utility to detect Symbol support
| JavaScript | apache-2.0 | stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib | javascript | ## Code Before:
'use strict';
var isnan = require( './../lib' );
console.log( isnan( NaN ) );
// returns true
console.log( isnan( new Number( NaN ) ) );
// returns true
console.log( isnan( 5 ) );
// returns false
console.log( isnan( '5' ) );
// returns false
console.log( isnan( null ) );
// returns false
console.log( isnan( Symbol( 'NaN' ) ) );
// returns false
## Instruction:
Use utility to detect Symbol support
## Code After:
'use strict';
var hasSymbolSupport = require( '@stdlib/utils/detect-symbol-support' )();
var isnan = require( './../lib' );
console.log( isnan( NaN ) );
// returns true
console.log( isnan( new Number( NaN ) ) );
// returns true
console.log( isnan( 5 ) );
// returns false
console.log( isnan( '5' ) );
// returns false
console.log( isnan( null ) );
// returns false
if ( hasSymbolSupport ) {
console.log( isnan( Symbol( 'NaN' ) ) );
// returns false
}
| 'use strict';
+ var hasSymbolSupport = require( '@stdlib/utils/detect-symbol-support' )();
var isnan = require( './../lib' );
console.log( isnan( NaN ) );
// returns true
console.log( isnan( new Number( NaN ) ) );
// returns true
console.log( isnan( 5 ) );
// returns false
console.log( isnan( '5' ) );
// returns false
console.log( isnan( null ) );
// returns false
+ if ( hasSymbolSupport ) {
- console.log( isnan( Symbol( 'NaN' ) ) );
+ console.log( isnan( Symbol( 'NaN' ) ) );
? +
- // returns false
+ // returns false
? +
+ } | 7 | 0.333333 | 5 | 2 |
76b7dad678b1be815a1df0ed8b920e06ae7916fd | css/parts/_code.scss | css/parts/_code.scss | // ------- Code display
code {
font-family: $font-family-code;
font-size: 0.9em;
}
pre {
border-left: 0.4em solid #DDD;
padding: 10px;
margin-left: 2.5em;
margin-right: 2.5em;
width: auto;
overflow: auto;
background-color: $color-gray-trans;
font-size: 0.93em;
font-family: $font-family-code;
text-shadow: 1px 1px 1px $color-black;
}
| // ------- Code display
code { font-family: $font-family-code; }
pre {
border-left: 0.3em solid #e2f89c;
padding: 0 0.5em;
margin-left: 2.5em;
margin-right: 2.5em;
width: auto;
overflow: auto;
background-color: $color-gray-trans;
font-family: $font-family-code;
}
| Tweak code and pre styling | Tweak code and pre styling
| SCSS | mit | le717/le717.github.io | scss | ## Code Before:
// ------- Code display
code {
font-family: $font-family-code;
font-size: 0.9em;
}
pre {
border-left: 0.4em solid #DDD;
padding: 10px;
margin-left: 2.5em;
margin-right: 2.5em;
width: auto;
overflow: auto;
background-color: $color-gray-trans;
font-size: 0.93em;
font-family: $font-family-code;
text-shadow: 1px 1px 1px $color-black;
}
## Instruction:
Tweak code and pre styling
## Code After:
// ------- Code display
code { font-family: $font-family-code; }
pre {
border-left: 0.3em solid #e2f89c;
padding: 0 0.5em;
margin-left: 2.5em;
margin-right: 2.5em;
width: auto;
overflow: auto;
background-color: $color-gray-trans;
font-family: $font-family-code;
}
| // ------- Code display
- code {
- font-family: $font-family-code;
+ code { font-family: $font-family-code; }
? ++++ + ++
- font-size: 0.9em;
- }
pre {
- border-left: 0.4em solid #DDD;
? ^ ^^^
+ border-left: 0.3em solid #e2f89c;
? ^ ^^^^^^
- padding: 10px;
+ padding: 0 0.5em;
margin-left: 2.5em;
margin-right: 2.5em;
width: auto;
overflow: auto;
background-color: $color-gray-trans;
- font-size: 0.93em;
font-family: $font-family-code;
- text-shadow: 1px 1px 1px $color-black;
} | 11 | 0.611111 | 3 | 8 |