| commit (string, length 40-40) | old_file (string, length 4-184) | new_file (string, length 4-184) | old_contents (string, length 1-3.6k) | new_contents (string, length 5-3.38k) | subject (string, length 15-778) | message (string, length 16-6.74k) | lang (string, 201 classes) | license (string, 13 classes) | repos (string, length 6-116k) | config (string, 201 classes) | content (string, length 137-7.24k) | diff (string, length 26-5.55k) | diff_length (int64, 1-123) | relative_diff_length (float64, 0.01-89) | n_lines_added (int64, 0-108) | n_lines_deleted (int64, 0-106) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
6b83ab382453ff622aba1b550982cdc11229e209 | lib/config/custom-routes.rb | lib/config/custom-routes.rb |
Rails.application.routes.draw do
# Additional help pages
match '/help/help_out' => 'help#help_out', :as => 'help_help_out'
match '/help/right_to_know' => 'help#right_to_know', :as => 'help_right_to_know'
# redirect the blog page to blog.asktheeu.org
match '/blog/' => redirect('http://blog.asktheeu.org/')
end
|
Rails.application.routes.draw do
# Additional help pages
match '/help/help_out' => 'help#help_out',
:as => 'help_help_out',
:via => :get
match '/help/right_to_know' => 'help#right_to_know',
:as => 'help_right_to_know',
:via => :get
# redirect the blog page to blog.asktheeu.org
match '/blog/' => redirect('http://blog.asktheeu.org/'),
:as => :external_blog,
:via => :get
end
| Fix broken routes so application can now boot | Fix broken routes so application can now boot
| Ruby | mit | mysociety/asktheeu-theme,mysociety/asktheeu-theme | ruby | ## Code Before:
Rails.application.routes.draw do
# Additional help pages
match '/help/help_out' => 'help#help_out', :as => 'help_help_out'
match '/help/right_to_know' => 'help#right_to_know', :as => 'help_right_to_know'
# redirect the blog page to blog.asktheeu.org
match '/blog/' => redirect('http://blog.asktheeu.org/')
end
## Instruction:
Fix broken routes so application can now boot
## Code After:
Rails.application.routes.draw do
# Additional help pages
match '/help/help_out' => 'help#help_out',
:as => 'help_help_out',
:via => :get
match '/help/right_to_know' => 'help#right_to_know',
:as => 'help_right_to_know',
:via => :get
# redirect the blog page to blog.asktheeu.org
match '/blog/' => redirect('http://blog.asktheeu.org/'),
:as => :external_blog,
:via => :get
end
|
Rails.application.routes.draw do
# Additional help pages
- match '/help/help_out' => 'help#help_out', :as => 'help_help_out'
? -----------------------
+ match '/help/help_out' => 'help#help_out',
+ :as => 'help_help_out',
+ :via => :get
- match '/help/right_to_know' => 'help#right_to_know', :as => 'help_right_to_know'
? ----------------------------
+ match '/help/right_to_know' => 'help#right_to_know',
+ :as => 'help_right_to_know',
+ :via => :get
# redirect the blog page to blog.asktheeu.org
- match '/blog/' => redirect('http://blog.asktheeu.org/')
+ match '/blog/' => redirect('http://blog.asktheeu.org/'),
? +
+ :as => :external_blog,
+ :via => :get
end | 12 | 1.333333 | 9 | 3 |
d5c5ac23a25503a28ba9eb442649ad8339833d66 | src/themes/default/empty-pane-menu.scss | src/themes/default/empty-pane-menu.scss | /**
* Copyright 2017 Simon Edwards <simon@simonzone.com>
*/
@import "bootstrap/variables";
:host(:focus) {
outline: 0px solid transparent;
}
#ID_EMPTY_PANE_MENU {
font-family: $font-family-base;
font-size: $input-font-size;
line-height: $line-height-base;
color: $text-color;
background-color: $dropdown-bg;
width: 100%;
height: 100%;
contain: strict;
}
#ID_CONTAINER {
--empty-pane-menu-container-width: 400px;
display: block;
width: var(--empty-pane-menu-container-width);
margin-left: calc((100% - var(--empty-pane-menu-container-width) )/2);
margin-right: calc((100% - var(--empty-pane-menu-container-width) )/2);
}
#ID_TITLE {
width: 100%;
margin-bottom: 5px;
}
| /**
* Copyright 2017 Simon Edwards <simon@simonzone.com>
*/
@import "bootstrap/variables";
:host(:focus) {
outline: 0px solid transparent;
}
#ID_EMPTY_PANE_MENU {
font-family: $font-family-base;
font-size: $input-font-size;
line-height: $line-height-base;
color: $text-color;
background-color: $dropdown-bg;
width: 100%;
height: 100%;
contain: strict;
}
#ID_CONTAINER {
--empty-pane-menu-container-width: 400px;
display: block;
width: 100%;
max-width: var(--empty-pane-menu-container-width);
margin-left: auto;
margin-right: auto;
}
#ID_TITLE {
width: 100%;
margin-bottom: 5px;
}
| Make the empty pane menu shrink when needed | Make the empty pane menu shrink when needed
| SCSS | mit | sedwards2009/extraterm,sedwards2009/extraterm,sedwards2009/extraterm,sedwards2009/extraterm,sedwards2009/extraterm | scss | ## Code Before:
/**
* Copyright 2017 Simon Edwards <simon@simonzone.com>
*/
@import "bootstrap/variables";
:host(:focus) {
outline: 0px solid transparent;
}
#ID_EMPTY_PANE_MENU {
font-family: $font-family-base;
font-size: $input-font-size;
line-height: $line-height-base;
color: $text-color;
background-color: $dropdown-bg;
width: 100%;
height: 100%;
contain: strict;
}
#ID_CONTAINER {
--empty-pane-menu-container-width: 400px;
display: block;
width: var(--empty-pane-menu-container-width);
margin-left: calc((100% - var(--empty-pane-menu-container-width) )/2);
margin-right: calc((100% - var(--empty-pane-menu-container-width) )/2);
}
#ID_TITLE {
width: 100%;
margin-bottom: 5px;
}
## Instruction:
Make the empty pane menu shrink when needed
## Code After:
/**
* Copyright 2017 Simon Edwards <simon@simonzone.com>
*/
@import "bootstrap/variables";
:host(:focus) {
outline: 0px solid transparent;
}
#ID_EMPTY_PANE_MENU {
font-family: $font-family-base;
font-size: $input-font-size;
line-height: $line-height-base;
color: $text-color;
background-color: $dropdown-bg;
width: 100%;
height: 100%;
contain: strict;
}
#ID_CONTAINER {
--empty-pane-menu-container-width: 400px;
display: block;
width: 100%;
max-width: var(--empty-pane-menu-container-width);
margin-left: auto;
margin-right: auto;
}
#ID_TITLE {
width: 100%;
margin-bottom: 5px;
}
| /**
* Copyright 2017 Simon Edwards <simon@simonzone.com>
*/
@import "bootstrap/variables";
:host(:focus) {
outline: 0px solid transparent;
}
#ID_EMPTY_PANE_MENU {
font-family: $font-family-base;
font-size: $input-font-size;
line-height: $line-height-base;
color: $text-color;
background-color: $dropdown-bg;
width: 100%;
height: 100%;
contain: strict;
}
#ID_CONTAINER {
--empty-pane-menu-container-width: 400px;
display: block;
- width: var(--empty-pane-menu-container-width);
- margin-left: calc((100% - var(--empty-pane-menu-container-width) )/2);
- margin-right: calc((100% - var(--empty-pane-menu-container-width) )/2);
+ width: 100%;
+ max-width: var(--empty-pane-menu-container-width);
+
+ margin-left: auto;
+ margin-right: auto;
}
#ID_TITLE {
width: 100%;
margin-bottom: 5px;
} | 8 | 0.210526 | 5 | 3 |
b22014db7a5b437de7def5461b3bd1f4d5090800 | test/cookbooks/test/recipes/source.rb | test/cookbooks/test/recipes/source.rb | apt_update
node.default['php']['install_method'] = 'source'
node.default['php']['pear'] = '/usr/local/bin/pear'
include_recipe 'php'
| apt_update
node.default['php']['install_method'] = 'source'
node.default['php']['pear'] = '/usr/local/bin/pear'
node.default['php']['url'] = 'https://ftp.osuosl.org/pub/php/' # the default site blocks github actions boxes
include_recipe 'php'
| Use a different mirror in testing since GH Actions is blocked | Use a different mirror in testing since GH Actions is blocked
Signed-off-by: Tim Smith <764ef62106582a09ed09dfa0b6bff7c05fd7d1e4@chef.io>
| Ruby | apache-2.0 | opscode-cookbooks/php,chef-cookbooks/php,chef-cookbooks/php,opscode-cookbooks/php | ruby | ## Code Before:
apt_update
node.default['php']['install_method'] = 'source'
node.default['php']['pear'] = '/usr/local/bin/pear'
include_recipe 'php'
## Instruction:
Use a different mirror in testing since GH Actions is blocked
Signed-off-by: Tim Smith <764ef62106582a09ed09dfa0b6bff7c05fd7d1e4@chef.io>
## Code After:
apt_update
node.default['php']['install_method'] = 'source'
node.default['php']['pear'] = '/usr/local/bin/pear'
node.default['php']['url'] = 'https://ftp.osuosl.org/pub/php/' # the default site blocks github actions boxes
include_recipe 'php'
| apt_update
node.default['php']['install_method'] = 'source'
node.default['php']['pear'] = '/usr/local/bin/pear'
+ node.default['php']['url'] = 'https://ftp.osuosl.org/pub/php/' # the default site blocks github actions boxes
include_recipe 'php' | 1 | 0.166667 | 1 | 0 |
8453ed1c076bda52161ddd792ee8dd81df5790e3 | src/core/Routine.pm | src/core/Routine.pm | my class Routine {
method of() { self.signature.returns }
method returns() { self.signature.returns }
method rw() { $!rw }
method assuming($r: *@curried_pos, *%curried_named) {
return sub CURRIED (*@pos, *%named) {
$r(|@curried_pos, |@pos, |%curried_named, |%named)
}
}
method candidates() {
self.is_dispatcher ??
pir::perl6ize_type__PP(nqp::getattr(self, Code, '$!dispatchees')) !!
(self,)
}
method multi() {
self.dispatcher.defined
}
}
| my class Routine {
method of() { self.signature.returns }
method returns() { self.signature.returns }
method rw() { $!rw }
method assuming($r: *@curried_pos, *%curried_named) {
return sub CURRIED (*@pos, *%named) {
$r(|@curried_pos, |@pos, |%curried_named, |%named)
}
}
method candidates() {
self.is_dispatcher ??
pir::perl6ize_type__PP(nqp::getattr(self, Code, '$!dispatchees')) !!
(self,)
}
method candidates_matching(|$c) {
sub checker(|$) {
my Mu $cap := pir::find_lex__Ps('call_sig');
self.is_dispatcher ??
pir::perl6ize_type__PP(pir::perl6_get_matching_multis__PPP(self, $cap)) !!
(self,)
}
checker(|$c);
}
method multi() {
self.dispatcher.defined
}
}
| Add a way to get a set of matching candidates for a multi. | Add a way to get a set of matching candidates for a multi.
Perl | artistic-2.0 | azawawi/rakudo,sjn/rakudo,MasterDuke17/rakudo,ungrim97/rakudo,sergot/rakudo,labster/rakudo,nbrown/rakudo,Leont/rakudo,labster/rakudo,MasterDuke17/rakudo,paultcochrane/rakudo,Gnouc/rakudo,tony-o/rakudo,tony-o/deb-rakudodaily,LLFourn/rakudo,dankogai/rakudo,skids/rakudo,raydiak/rakudo,b2gills/rakudo,zhuomingliang/rakudo,usev6/rakudo,cognominal/rakudo,teodozjan/rakudo,ugexe/rakudo,tbrowder/rakudo,dankogai/rakudo,rakudo/rakudo,zostay/rakudo,cognominal/rakudo,softmoth/rakudo,usev6/rakudo,lucasbuchala/rakudo,cygx/rakudo,LLFourn/rakudo,niner/rakudo,samcv/rakudo,awwaiid/rakudo,rjbs/rakudo,ungrim97/rakudo,tbrowder/rakudo,cygx/rakudo,b2gills/rakudo,sjn/rakudo,labster/rakudo,dwarring/rakudo,sergot/rakudo,nunorc/rakudo,teodozjan/rakudo,skids/rakudo,Leont/rakudo,usev6/rakudo,LLFourn/rakudo,salortiz/rakudo,salortiz/rakudo,dankogai/rakudo,ungrim97/rakudo,tony-o/deb-rakudodaily,laben/rakudo,nunorc/rakudo,ugexe/rakudo,pmurias/rakudo,zhuomingliang/rakudo,rakudo/rakudo,dankogai/rakudo,Gnouc/rakudo,MasterDuke17/rakudo,laben/rakudo,jonathanstowe/rakudo,azawawi/rakudo,ungrim97/rakudo,cygx/rakudo,b2gills/rakudo,awwaiid/rakudo,tbrowder/rakudo,nunorc/rakudo,sjn/rakudo,ungrim97/rakudo,Leont/rakudo,dwarring/rakudo,tony-o/deb-rakudodaily,ab5tract/rakudo,teodozjan/rakudo,MasterDuke17/rakudo,sergot/rakudo,salortiz/rakudo,usev6/rakudo,cygx/rakudo,niner/rakudo,rjbs/rakudo,lucasbuchala/rakudo,pmurias/rakudo,jonathanstowe/rakudo,samcv/rakudo,salortiz/rakudo,retupmoca/rakudo,nunorc/rakudo,cognominal/rakudo,usev6/rakudo,awwaiid/rakudo,paultcochrane/rakudo,softmoth/rakudo,tony-o/rakudo,samcv/rakudo,rakudo/rakudo,lucasbuchala/rakudo,Gnouc/rakudo,niner/rakudo,rakudo/rakudo,zostay/rakudo,b2gills/rakudo,tony-o/rakudo,raydiak/rakudo,sjn/rakudo,tony-o/deb-rakudodaily,jonathanstowe/rakudo,awwaiid/rakudo,samcv/rakudo,retupmoca/rakudo,azawawi/rakudo,laben/rakudo,Gnouc/rakudo,jonathanstowe/rakudo,LLFourn/rakudo,rakudo/rakudo,tony-o/rakudo,softmoth/rakudo,zostay/rakudo,tbrowder/rakudo,lucasbuchala/rakudo,cygx/rakudo,tony-o/deb-rakudodaily,teodozjan/rakudo,MasterDuke17/rakudo,skids/rakudo,laben/rakudo,Gnouc/rakudo,ugexe/rakudo,ab5tract/rakudo,azawawi/rakudo,salortiz/rakudo,softmoth/rakudo,tony-o/deb-rakudodaily,jonathanstowe/rakudo,rjbs/rakudo,paultcochrane/rakudo,pmurias/rakudo,ugexe/rakudo,dwarring/rakudo,b2gills/rakudo,skids/rakudo,tbrowder/rakudo,ab5tract/rakudo,cognominal/rakudo,retupmoca/rakudo,skids/rakudo,awwaiid/rakudo,Gnouc/rakudo,LLFourn/rakudo,niner/rakudo,paultcochrane/rakudo,MasterDuke17/rakudo,salortiz/rakudo,tony-o/rakudo | perl | ## Code Before:
my class Routine {
method of() { self.signature.returns }
method returns() { self.signature.returns }
method rw() { $!rw }
method assuming($r: *@curried_pos, *%curried_named) {
return sub CURRIED (*@pos, *%named) {
$r(|@curried_pos, |@pos, |%curried_named, |%named)
}
}
method candidates() {
self.is_dispatcher ??
pir::perl6ize_type__PP(nqp::getattr(self, Code, '$!dispatchees')) !!
(self,)
}
method multi() {
self.dispatcher.defined
}
}
## Instruction:
Add a way to get a set of matching candidates for a multi.
## Code After:
my class Routine {
method of() { self.signature.returns }
method returns() { self.signature.returns }
method rw() { $!rw }
method assuming($r: *@curried_pos, *%curried_named) {
return sub CURRIED (*@pos, *%named) {
$r(|@curried_pos, |@pos, |%curried_named, |%named)
}
}
method candidates() {
self.is_dispatcher ??
pir::perl6ize_type__PP(nqp::getattr(self, Code, '$!dispatchees')) !!
(self,)
}
method candidates_matching(|$c) {
sub checker(|$) {
my Mu $cap := pir::find_lex__Ps('call_sig');
self.is_dispatcher ??
pir::perl6ize_type__PP(pir::perl6_get_matching_multis__PPP(self, $cap)) !!
(self,)
}
checker(|$c);
}
method multi() {
self.dispatcher.defined
}
}
| my class Routine {
method of() { self.signature.returns }
method returns() { self.signature.returns }
method rw() { $!rw }
method assuming($r: *@curried_pos, *%curried_named) {
return sub CURRIED (*@pos, *%named) {
$r(|@curried_pos, |@pos, |%curried_named, |%named)
}
}
method candidates() {
self.is_dispatcher ??
pir::perl6ize_type__PP(nqp::getattr(self, Code, '$!dispatchees')) !!
(self,)
}
+ method candidates_matching(|$c) {
+ sub checker(|$) {
+ my Mu $cap := pir::find_lex__Ps('call_sig');
+ self.is_dispatcher ??
+ pir::perl6ize_type__PP(pir::perl6_get_matching_multis__PPP(self, $cap)) !!
+ (self,)
+ }
+ checker(|$c);
+ }
+
method multi() {
self.dispatcher.defined
}
} | 10 | 0.47619 | 10 | 0 |
21d3b22482e48e1c2654c256fec667788d2729fb | .expeditor/coverage.pipeline.yml | .expeditor/coverage.pipeline.yml | ---
steps:
- label: coverage
commands:
- bundle install --jobs=7 --retry=3 --without tools integration
- bundle exec rake test:default
expeditor:
executor:
docker:
image: ruby:2.6-stretch
| ---
steps:
- label: coverage
commands:
- /workdir/.expeditor/buildkite/coverage.sh
expeditor:
executor:
docker:
image: ruby:2.6-stretch
| Use our coverage.sh rather than embedded commands | Use our coverage.sh rather than embedded commands
Signed-off-by: Miah Johnson <722873f9fa963292e0213e095373c87c1070cd81@chia-pet.org>
| YAML | apache-2.0 | chef/train,chef/train | yaml | ## Code Before:
---
steps:
- label: coverage
commands:
- bundle install --jobs=7 --retry=3 --without tools integration
- bundle exec rake test:default
expeditor:
executor:
docker:
image: ruby:2.6-stretch
## Instruction:
Use our coverage.sh rather than embedded commands
Signed-off-by: Miah Johnson <722873f9fa963292e0213e095373c87c1070cd81@chia-pet.org>
## Code After:
---
steps:
- label: coverage
commands:
- /workdir/.expeditor/buildkite/coverage.sh
expeditor:
executor:
docker:
image: ruby:2.6-stretch
| ---
steps:
- label: coverage
commands:
+ - /workdir/.expeditor/buildkite/coverage.sh
- - bundle install --jobs=7 --retry=3 --without tools integration
- - bundle exec rake test:default
expeditor:
executor:
docker:
image: ruby:2.6-stretch | 3 | 0.272727 | 1 | 2 |
584f59f11680ae9528e897202aeb21cbab130157 | .travis.yml | .travis.yml | language: go
dist: trusty
go:
- 1.5
- 1.6.2
install:
- # WAV loading
- go get github.com/youpy/go-riff
- # SOX bindings
- sudo apt-get install -y libsox-dev
- go get github.com/krig/go-sox
- # Testing
- go get github.com/stretchr/testify/assert
- go get github.com/stretchr/testify/mock
- # Logging
- go get github.com/Sirupsen/logrus
| language: go
dist: trusty
go:
- 1.5
- 1.6.2
- 1.11
install:
- # WAV loading
- go get github.com/youpy/go-riff
- # SOX bindings
- sudo apt-get install -y libsox-dev
- go get github.com/krig/go-sox
- # Testing
- go get github.com/stretchr/testify/assert
- go get github.com/stretchr/testify/mock
- # Logging
- go get github.com/Sirupsen/logrus
script:
- go test ./...
| Add testing to Travis CI | Add testing to Travis CI
| YAML | mit | outrightmental/go-atomix,go-ontomix/ontomix,go-mix/mix | yaml | ## Code Before:
language: go
dist: trusty
go:
- 1.5
- 1.6.2
install:
- # WAV loading
- go get github.com/youpy/go-riff
- # SOX bindings
- sudo apt-get install -y libsox-dev
- go get github.com/krig/go-sox
- # Testing
- go get github.com/stretchr/testify/assert
- go get github.com/stretchr/testify/mock
- # Logging
- go get github.com/Sirupsen/logrus
## Instruction:
Add testing to Travis CI
## Code After:
language: go
dist: trusty
go:
- 1.5
- 1.6.2
- 1.11
install:
- # WAV loading
- go get github.com/youpy/go-riff
- # SOX bindings
- sudo apt-get install -y libsox-dev
- go get github.com/krig/go-sox
- # Testing
- go get github.com/stretchr/testify/assert
- go get github.com/stretchr/testify/mock
- # Logging
- go get github.com/Sirupsen/logrus
script:
- go test ./...
| language: go
dist: trusty
go:
- 1.5
- 1.6.2
+ - 1.11
install:
- - # WAV loading
+ - # WAV loading
? +
- go get github.com/youpy/go-riff
- - # SOX bindings
+ - # SOX bindings
? +
- sudo apt-get install -y libsox-dev
- go get github.com/krig/go-sox
- - # Testing
+ - # Testing
? +
- go get github.com/stretchr/testify/assert
- go get github.com/stretchr/testify/mock
- - # Logging
+ - # Logging
? +
- go get github.com/Sirupsen/logrus
+ script:
+ - go test ./... | 11 | 0.578947 | 7 | 4 |
a26869c4442ef6e9b2c7f82818666ccf8a718608 | spec/models/section_spec.rb | spec/models/section_spec.rb | require 'spec_helper'
describe Section do
describe 'associations' do
describe 'chapters' do
it 'does not include HiddenGoodsNomenclatures' do
pending
end
end
end
end
| require 'spec_helper'
describe Section do
describe 'associations' do
describe 'chapters' do
let!(:chapter) { create(:chapter, :with_section) }
it 'does not include HiddenGoodsNomenclatures' do
section = chapter.section
create(:hidden_goods_nomenclature, goods_nomenclature_item_id: chapter.goods_nomenclature_item_id)
expect(section.chapters).to eq []
end
end
end
end
| Fix pending spec for section.chapters with hidden codes | Fix pending spec for section.chapters with hidden codes
| Ruby | mit | bitzesty/trade-tariff-backend,alphagov/trade-tariff-backend,bitzesty/trade-tariff-backend,leftees/trade-tariff-backend,bitzesty/trade-tariff-backend,alphagov/trade-tariff-backend,alphagov/trade-tariff-backend,leftees/trade-tariff-backend,leftees/trade-tariff-backend | ruby | ## Code Before:
require 'spec_helper'
describe Section do
describe 'associations' do
describe 'chapters' do
it 'does not include HiddenGoodsNomenclatures' do
pending
end
end
end
end
## Instruction:
Fix pending spec for section.chapters with hidden codes
## Code After:
require 'spec_helper'
describe Section do
describe 'associations' do
describe 'chapters' do
let!(:chapter) { create(:chapter, :with_section) }
it 'does not include HiddenGoodsNomenclatures' do
section = chapter.section
create(:hidden_goods_nomenclature, goods_nomenclature_item_id: chapter.goods_nomenclature_item_id)
expect(section.chapters).to eq []
end
end
end
end
| require 'spec_helper'
describe Section do
describe 'associations' do
describe 'chapters' do
+ let!(:chapter) { create(:chapter, :with_section) }
+
it 'does not include HiddenGoodsNomenclatures' do
- pending
+ section = chapter.section
+ create(:hidden_goods_nomenclature, goods_nomenclature_item_id: chapter.goods_nomenclature_item_id)
+
+ expect(section.chapters).to eq []
end
end
end
end | 7 | 0.636364 | 6 | 1 |
f3efd7bf27100e5419242a8d69caa84d8250f099 | roles/base.json | roles/base.json | {
"name": "base",
"description": "Drop The Bass",
"run_list": [
"recipe[chef-client]",
"recipe[dns::client]",
"recipe[hostname]",
"recipe[filer]",
"recipe[ntp]",
"recipe[openssh]",
"recipe[sudo]",
"recipe[timezone-ii]",
"recipe[users::sysadmins]"
],
"default_attributes": {
"authorization": {
"sudo": {
"passwordless": "true"
}
},
"chef_client": {
"interval": "300",
"splay": "60",
"config": {
"ssl_verify_mode": ":verify_peer"
}
},
"dns": {
"master": "192.168.15.254"
},
"ntp": {
"servers": ["ns0.local.pvt"]
},
"openssh": {
"server": {
"permit_root_login": "without-password"
}
},
"tz": "America/New_York"
}
}
| {
"name": "base",
"description": "Drop The Bass",
"run_list": [
"recipe[chef-client]",
"recipe[chef-client::config]",
"recipe[dns::client]",
"recipe[hostname]",
"recipe[filer]",
"recipe[ntp]",
"recipe[openssh]",
"recipe[sudo]",
"recipe[timezone-ii]",
"recipe[users::sysadmins]"
],
"default_attributes": {
"authorization": {
"sudo": {
"passwordless": "true"
}
},
"chef_client": {
"interval": "300",
"splay": "60",
"config": {
"ssl_verify_mode": ":verify_peer"
}
},
"dns": {
"master": "192.168.15.254"
},
"ntp": {
"servers": ["ns0.local.pvt"]
},
"openssh": {
"server": {
"permit_root_login": "without-password"
}
},
"tz": "America/New_York"
}
}
| Verify peer ssl when chef converges | Verify peer ssl when chef converges
| JSON | apache-2.0 | skingry/chef,skingry/chef,skingry/media-server,skingry/chef,skingry/media-server,skingry/media-server | json | ## Code Before:
{
"name": "base",
"description": "Drop The Bass",
"run_list": [
"recipe[chef-client]",
"recipe[dns::client]",
"recipe[hostname]",
"recipe[filer]",
"recipe[ntp]",
"recipe[openssh]",
"recipe[sudo]",
"recipe[timezone-ii]",
"recipe[users::sysadmins]"
],
"default_attributes": {
"authorization": {
"sudo": {
"passwordless": "true"
}
},
"chef_client": {
"interval": "300",
"splay": "60",
"config": {
"ssl_verify_mode": ":verify_peer"
}
},
"dns": {
"master": "192.168.15.254"
},
"ntp": {
"servers": ["ns0.local.pvt"]
},
"openssh": {
"server": {
"permit_root_login": "without-password"
}
},
"tz": "America/New_York"
}
}
## Instruction:
Verify peer ssl when chef converges
## Code After:
{
"name": "base",
"description": "Drop The Bass",
"run_list": [
"recipe[chef-client]",
"recipe[chef-client::config]",
"recipe[dns::client]",
"recipe[hostname]",
"recipe[filer]",
"recipe[ntp]",
"recipe[openssh]",
"recipe[sudo]",
"recipe[timezone-ii]",
"recipe[users::sysadmins]"
],
"default_attributes": {
"authorization": {
"sudo": {
"passwordless": "true"
}
},
"chef_client": {
"interval": "300",
"splay": "60",
"config": {
"ssl_verify_mode": ":verify_peer"
}
},
"dns": {
"master": "192.168.15.254"
},
"ntp": {
"servers": ["ns0.local.pvt"]
},
"openssh": {
"server": {
"permit_root_login": "without-password"
}
},
"tz": "America/New_York"
}
}
| {
"name": "base",
"description": "Drop The Bass",
"run_list": [
"recipe[chef-client]",
+ "recipe[chef-client::config]",
"recipe[dns::client]",
"recipe[hostname]",
"recipe[filer]",
"recipe[ntp]",
"recipe[openssh]",
"recipe[sudo]",
"recipe[timezone-ii]",
"recipe[users::sysadmins]"
],
"default_attributes": {
"authorization": {
"sudo": {
"passwordless": "true"
}
},
"chef_client": {
"interval": "300",
"splay": "60",
"config": {
"ssl_verify_mode": ":verify_peer"
}
},
"dns": {
"master": "192.168.15.254"
},
"ntp": {
"servers": ["ns0.local.pvt"]
},
"openssh": {
"server": {
"permit_root_login": "without-password"
}
},
"tz": "America/New_York"
}
} | 1 | 0.02439 | 1 | 0 |
af07b626d44e5d46c428f710ac6431918f2a70e7 | lib/SwiftDemangle/CMakeLists.txt | lib/SwiftDemangle/CMakeLists.txt | add_swift_library(swiftDemangle SHARED
SwiftDemangle.cpp
MangleHack.cpp
LINK_LIBRARIES swiftBasic)
swift_install_in_component(compiler
TARGETS swiftDemangle
LIBRARY DESTINATION "lib${LLVM_LIBDIR_SUFFIX}"
ARCHIVE DESTINATION "lib${LLVM_LIBDIR_SUFFIX}")
# Create a compatibility symlink.
swift_create_post_build_symlink(sourcekitd
IS_DIRECTORY
SOURCE "${SWIFTLIB_DIR}"
DESTINATION "${SOURCEKIT_LIBRARY_OUTPUT_INTDIR}/swift"
COMMENT "Creating compatibility symlink for functionNameDemangle.dylib")
swift_install_in_component(compiler
FILES "${SWIFT_LIBRARY_OUTPUT_INTDIR}/libfunctionNameDemangle.dylib"
DESTINATION "lib")
| add_swift_library(swiftDemangle SHARED
SwiftDemangle.cpp
MangleHack.cpp
LINK_LIBRARIES swiftBasic)
swift_install_in_component(compiler
TARGETS swiftDemangle
LIBRARY DESTINATION "lib${LLVM_LIBDIR_SUFFIX}"
ARCHIVE DESTINATION "lib${LLVM_LIBDIR_SUFFIX}")
| Remove creating compatability symlink in swiftDemangle | Remove creating compatability symlink in swiftDemangle
Text | apache-2.0 | gribozavr/swift,gmilos/swift,ahoppen/swift,jopamer/swift,tjw/swift,gmilos/swift,gregomni/swift,uasys/swift,glessard/swift,austinzheng/swift,codestergit/swift,tinysun212/swift-windows,atrick/swift,gribozavr/swift,jmgc/swift,karwa/swift,practicalswift/swift,karwa/swift,airspeedswift/swift,austinzheng/swift,felix91gr/swift,aschwaighofer/swift,CodaFi/swift,frootloops/swift,allevato/swift,Jnosh/swift,xwu/swift,shajrawi/swift,nathawes/swift,parkera/swift,jmgc/swift,manavgabhawala/swift,Jnosh/swift,alblue/swift,bitjammer/swift,alblue/swift,shahmishal/swift,uasys/swift,amraboelela/swift,xedin/swift,amraboelela/swift,brentdax/swift,brentdax/swift,frootloops/swift,parkera/swift,atrick/swift,practicalswift/swift,OscarSwanros/swift,aschwaighofer/swift,hooman/swift,harlanhaskins/swift,airspeedswift/swift,zisko/swift,swiftix/swift,danielmartin/swift,glessard/swift,jckarter/swift,alblue/swift,jtbandes/swift,karwa/swift,tjw/swift,calebd/swift,airspeedswift/swift,milseman/swift,JGiola/swift,felix91gr/swift,xedin/swift,practicalswift/swift,felix91gr/swift,jckarter/swift,alblue/swift,aschwaighofer/swift,return/swift,ahoppen/swift,JGiola/swift,harlanhaskins/swift,gottesmm/swift,danielmartin/swift,calebd/swift,gmilos/swift,rudkx/swift,djwbrown/swift,shahmishal/swift,codestergit/swift,devincoughlin/swift,uasys/swift,tjw/swift,JaSpa/swift,devincoughlin/swift,roambotics/swift,bitjammer/swift,gottesmm/swift,codestergit/swift,rudkx/swift,aschwaighofer/swift,codestergit/swift,jtbandes/swift,amraboelela/swift,austinzheng/swift,alblue/swift,shajrawi/swift,ahoppen/swift,austinzheng/swift,swiftix/swift,bitjammer/swift,alblue/swift,sschiau/swift,benlangmuir/swift,gribozavr/swift,manavgabhawala/swift,deyton/swift,gribozavr/swift,nathawes/swift,parkera/swift,return/swift,practicalswift/swift,alblue/swift,airspeedswift/swift,gottesmm/swift,jopamer/swift,devincoughlin/swift,practicalswift/swift,sschiau/swift,parkera/swift,deyton/swift,jmgc/swift,tardieu/swift,bitjammer/swift,milseman/swift,tardieu/swift,tkremenek/swift,glessard/swift,deyton/swift,OscarSwanros/swift,jopamer/swift,jmgc/swift,arvedviehweger/swift,djwbrown/swift,xedin/swift,amraboelela/swift,huonw/swift,return/swift,karwa/swift,gregomni/swift,lorentey/swift,milseman/swift,jtbandes/swift,karwa/swift,harlanhaskins/swift,natecook1000/swift,bitjammer/swift,CodaFi/swift,frootloops/swift,nathawes/swift,OscarSwanros/swift,calebd/swift,austinzheng/swift,xedin/swift,xwu/swift,gregomni/swift,jtbandes/swift,natecook1000/swift,sschiau/swift,rudkx/swift,swiftix/swift,hughbe/swift,OscarSwanros/swift,JGiola/swift,allevato/swift,tkremenek/swift,atrick/swift,deyton/swift,uasys/swift,Jnosh/swift,jopamer/swift,practicalswift/swift,shahmishal/swift,hughbe/swift,tinysun212/swift-windows,jckarter/swift,jmgc/swift,manavgabhawala/swift,airspeedswift/swift,jtbandes/swift,atrick/swift,zisko/swift,benlangmuir/swift,danielmartin/swift,gottesmm/swift,brentdax/swift,tinysun212/swift-windows,xedin/swift,shajrawi/swift,danielmartin/swift,hooman/swift,zisko/swift,practicalswift/swift,tardieu/swift,tjw/swift,djwbrown/swift,uasys/swift,CodaFi/swift,aschwaighofer/swift,tardieu/swift,nathawes/swift,felix91gr/swift,glessard/swift,Jnosh/swift,jckarter/swift,rudkx/swift,gmilos/swift,roambotics/swift,swiftix/swift,tkremenek/swift,shajrawi/swift,JGiola/swift,gmilos/swift,calebd/swift,stephentyrone/swift,jckarter/swift,amraboelela/swift,shajrawi/swift,gregomni/swift,arvedviehweger/swift,CodaFi/swift,jopamer/swift,milseman/swift,hughbe/swift,stephentyrone/swift,atrick/swift,tjw/swift,tardieu/swift,amraboelela/swift,deyton/swift,CodaFi/swift,OscarSwanros/swift,parkera/swift,sschiau/swift,brentdax/swift,gribozavr/swift,allevato/swift,stephentyrone/swift,hooman/swift,shahmishal/swift,jmgc/swift,calebd/swift,huonw/swift,apple/swift,lorentey/swift,milseman/swift,brentdax/swift,OscarSwanros/swift,felix91gr/swift,allevato/swift,aschwaighofer/swift,jtbandes/swift,sschiau/swift,manavgabhawala/swift,hooman/swift,xwu/swift,JGiola/swift,karwa/swift,harlanhaskins/swift,sschiau/swift,CodaFi/swift,allevato/swift,calebd/swift,hughbe/swift,benlangmuir/swift,JaSpa/swift,jopamer/swift,lorentey/swift,devincoughlin/swift,huonw/swift,ahoppen/swift,benlangmuir/swift,gregomni/swift,jmgc/swift,JaSpa/swift,natecook1000/swift,OscarSwanros/swift,tinysun212/swift-windows,uasys/swift,manavgabhawala/swift,huonw/swift,xedin/swift,glessard/swift,stephentyrone/swift,gribozavr/swift,return/swift,huonw/swift,roambotics/swift,stephentyrone/swift,brentdax/swift,harlanhaskins/swift,hooman/swift,gmilos/swift,devincoughlin/swift,apple/swift,gottesmm/swift,parkera/swift,apple/swift,glessard/swift,zisko/swift,bitjammer/swift,JaSpa/swift,lorentey/swift,felix91gr/swift,gregomni/swift,manavgabhawala/swift,nathawes/swift,natecook1000/swift,gottesmm/swift,zisko/swift,gmilos/swift,huonw/swift,xwu/swift,gottesmm/swift,roambotics/swift,roambotics/swift,harlanhaskins/swift,swiftix/swift,frootloops/swift,shahmishal/swift,milseman/swift,sschiau/swift,parkera/swift,tkremenek/swift,CodaFi/swift,stephentyrone/swift,Jnosh/swift,devincoughlin/swift,jckarter/swift,xwu/swift,bitjammer/swift,lorentey/swift,hughbe/swift,arvedviehweger/swift,lorentey/swift,parkera/swift,danielmartin/swift,tjw/swift,devincoughlin/swift,karwa/swift,JGiola/swift,karwa/swift,frootloops/swift,swiftix/swift,djwbrown/swift,rudkx/swift,shahmishal/swift,codestergit/swift,shahmishal/swift,zisko/swift,benlangmuir/swift,tardieu/swift,calebd/swift,zisko/swift,hooman/swift,return/swift,allevato/swift,jopamer/swift,JaSpa/swift,amraboelela/swift,natecook1000/swift,nathawes/swift,sschiau/swift,Jnosh/swift,tinysun212/swift-windows,lorentey/swift,Jnosh/swift,benlangmuir/swift,uasys/swift,lorentey/swift,natecook1000/swift,brentdax/swift,jckarter/swift,shajrawi/swift,roambotics/swift,tkremenek/swift,shajrawi/swift,alblue/swift,xwu/swift,hughbe/swift,nathawes/swift,devincoughlin/swift,return/swift,JaSpa/swift,shajrawi/swift,gribozavr/swift,djwbrown/swift,stephentyrone/swift,deyton/swift,shahmishal/swift,apple/swift,tkremenek/swift,tinysun212/swift-windows,ahoppen/swift,frootloops/swift,djwbrown/swift,tinysun212/swift-windows,arvedviehweger/swift,arvedviehweger/swift,milseman/swift,JaSpa/swift,felix91gr/swift,codestergit/swift,return/swift,natecook1000/swift,arvedviehweger/swift,manavgabhawala/swift,ahoppen/swift,harlanhaskins/swift,tjw/swift,danielmartin/swift,danielmartin/swift,huonw/swift,jckarter/swift,rudkx/swift | text | ## Code Before:
add_swift_library(swiftDemangle SHARED
SwiftDemangle.cpp
MangleHack.cpp
LINK_LIBRARIES swiftBasic)
swift_install_in_component(compiler
TARGETS swiftDemangle
LIBRARY DESTINATION "lib${LLVM_LIBDIR_SUFFIX}"
ARCHIVE DESTINATION "lib${LLVM_LIBDIR_SUFFIX}")
# Create a compatibility symlink.
swift_create_post_build_symlink(sourcekitd
IS_DIRECTORY
SOURCE "${SWIFTLIB_DIR}"
DESTINATION "${SOURCEKIT_LIBRARY_OUTPUT_INTDIR}/swift"
COMMENT "Creating compatibility symlink for functionNameDemangle.dylib")
swift_install_in_component(compiler
FILES "${SWIFT_LIBRARY_OUTPUT_INTDIR}/libfunctionNameDemangle.dylib"
DESTINATION "lib")
## Instruction:
Remove creating compatability symlink in swiftDemangle
## Code After:
add_swift_library(swiftDemangle SHARED
SwiftDemangle.cpp
MangleHack.cpp
LINK_LIBRARIES swiftBasic)
swift_install_in_component(compiler
TARGETS swiftDemangle
LIBRARY DESTINATION "lib${LLVM_LIBDIR_SUFFIX}"
ARCHIVE DESTINATION "lib${LLVM_LIBDIR_SUFFIX}")
| add_swift_library(swiftDemangle SHARED
SwiftDemangle.cpp
MangleHack.cpp
LINK_LIBRARIES swiftBasic)
swift_install_in_component(compiler
TARGETS swiftDemangle
LIBRARY DESTINATION "lib${LLVM_LIBDIR_SUFFIX}"
ARCHIVE DESTINATION "lib${LLVM_LIBDIR_SUFFIX}")
-
- # Create a compatibility symlink.
-
- swift_create_post_build_symlink(sourcekitd
- IS_DIRECTORY
- SOURCE "${SWIFTLIB_DIR}"
- DESTINATION "${SOURCEKIT_LIBRARY_OUTPUT_INTDIR}/swift"
- COMMENT "Creating compatibility symlink for functionNameDemangle.dylib")
-
- swift_install_in_component(compiler
- FILES "${SWIFT_LIBRARY_OUTPUT_INTDIR}/libfunctionNameDemangle.dylib"
- DESTINATION "lib")
- | 13 | 0.590909 | 0 | 13 |
24d0c3e7822165e860132d5bcdc1e9c2ed9d9a2e | initializers/trailing-history.js | initializers/trailing-history.js | /*global Ember */
var trailingHistory = Ember.HistoryLocation.extend({
setURL: function (path) {
var state = this.getState();
path = this.formatURL(path);
path = path.replace(/\/?$/, '/');
if (state && state.path !== path) {
this.pushState(path);
}
}
});
var registerTrailingLocationHistory = {
name: 'registerTrailingLocationHistory',
initialize: function (container, application) {
application.register('location:trailing-history', trailingHistory);
}
};
export default registerTrailingLocationHistory; | /*global Ember */
var trailingHistory = Ember.HistoryLocation.extend({
formatURL: function () {
return this._super.apply(this, arguments).replace(/\/?$/, '/');
}
});
var registerTrailingLocationHistory = {
name: 'registerTrailingLocationHistory',
initialize: function (container, application) {
application.register('location:trailing-history', trailingHistory);
}
};
export default registerTrailingLocationHistory; | Fix trailing slashes output app-wide | Fix trailing slashes output app-wide
closes #2963, closes #2964
- override Ember's `HistoryLocation.formatURL`
- remove overridden `HistoryLocation.setURL`
| JavaScript | mit | dbalders/Ghost-Admin,kevinansfield/Ghost-Admin,acburdine/Ghost-Admin,acburdine/Ghost-Admin,JohnONolan/Ghost-Admin,airycanon/Ghost-Admin,JohnONolan/Ghost-Admin,dbalders/Ghost-Admin,TryGhost/Ghost-Admin,kevinansfield/Ghost-Admin,TryGhost/Ghost-Admin,airycanon/Ghost-Admin | javascript | ## Code Before:
/*global Ember */
var trailingHistory = Ember.HistoryLocation.extend({
setURL: function (path) {
var state = this.getState();
path = this.formatURL(path);
path = path.replace(/\/?$/, '/');
if (state && state.path !== path) {
this.pushState(path);
}
}
});
var registerTrailingLocationHistory = {
name: 'registerTrailingLocationHistory',
initialize: function (container, application) {
application.register('location:trailing-history', trailingHistory);
}
};
export default registerTrailingLocationHistory;
## Instruction:
Fix trailing slashes output app-wide
closes #2963, closes #2964
- override Ember's `HistoryLocation.formatURL`
- remove overridden `HistoryLocation.setURL`
## Code After:
/*global Ember */
var trailingHistory = Ember.HistoryLocation.extend({
formatURL: function () {
return this._super.apply(this, arguments).replace(/\/?$/, '/');
}
});
var registerTrailingLocationHistory = {
name: 'registerTrailingLocationHistory',
initialize: function (container, application) {
application.register('location:trailing-history', trailingHistory);
}
};
export default registerTrailingLocationHistory; | /*global Ember */
var trailingHistory = Ember.HistoryLocation.extend({
- setURL: function (path) {
? ^^ ----
+ formatURL: function () {
? ^^^^^
+ return this._super.apply(this, arguments).replace(/\/?$/, '/');
- var state = this.getState();
- path = this.formatURL(path);
- path = path.replace(/\/?$/, '/');
-
- if (state && state.path !== path) {
- this.pushState(path);
- }
}
});
var registerTrailingLocationHistory = {
name: 'registerTrailingLocationHistory',
initialize: function (container, application) {
application.register('location:trailing-history', trailingHistory);
}
};
export default registerTrailingLocationHistory; | 10 | 0.434783 | 2 | 8 |
bc629e6085c16c23968a1b3aad41f2bd19013209 | src/S203.NewRelic.NeuronEsb.Plugin/config/plugin.template.json | src/S203.NewRelic.NeuronEsb.Plugin/config/plugin.template.json | {
"agents": [
{
"_comment": "This is an example of how you would monitor a Neuron ESB instance. Duplicate this section if you have multiple servers.",
"name": "Neuron ESB",
"host": "localhost",
"port": 51002,
"instance": "DEFAULT"
}
]
} | {
"agents": [
{
"_comment": "This is an example of how you would monitor a Neuron ESB instance. Duplicate this section if you have multiple servers.",
"name": "localhost",
"host": "localhost",
"port": 51002,
"instance": "DEFAULT"
}
]
} | Update to make the plugin.json setup clearer | Update to make the plugin.json setup clearer
| JSON | mit | 203sol/newrelic-neuronesb-plugin | json | ## Code Before:
{
"agents": [
{
"_comment": "This is an example of how you would monitor a Neuron ESB instance. Duplicate this section if you have multiple servers.",
"name": "Neuron ESB",
"host": "localhost",
"port": 51002,
"instance": "DEFAULT"
}
]
}
## Instruction:
Update to make the plugin.json setup clearer
## Code After:
{
"agents": [
{
"_comment": "This is an example of how you would monitor a Neuron ESB instance. Duplicate this section if you have multiple servers.",
"name": "localhost",
"host": "localhost",
"port": 51002,
"instance": "DEFAULT"
}
]
} | {
"agents": [
{
"_comment": "This is an example of how you would monitor a Neuron ESB instance. Duplicate this section if you have multiple servers.",
- "name": "Neuron ESB",
+ "name": "localhost",
"host": "localhost",
"port": 51002,
"instance": "DEFAULT"
}
]
} | 2 | 0.181818 | 1 | 1 |
ca8f41dbc466ee7a98c27569a1de0c1ff776836c | .build_scripts/ecs-task-definition-fargate.json | .build_scripts/ecs-task-definition-fargate.json | {
"family": "openaq-fetch-fargate",
"taskRoleArn": "arn:aws:iam::470049585876:role/uploadsS3",
"containerDefinitions": [
{
"name": "openaq-fetch",
"image": "flasher/openaq-fetch",
"command": [
"npm",
"start"
],
"logConfiguration": {
"logDriver": "awslogs",
"options": {
"awslogs-group": "/ecs/openaq-fetch",
"awslogs-region": "us-east-1",
"awslogs-stream-prefix": "ecs"
}
}
}
],
"cpu": "512",
"memory": "1024",
"networkMode": "awsvpc",
"executionRoleArn": "arn:aws:iam::470049585876:role/ecsTaskExecutionRole",
"requiresCompatibilities": [
"EC2",
"FARGATE"
]
} | {
"family": "openaq-fetch-fargate",
"taskRoleArn": "arn:aws:iam::470049585876:role/uploadsS3",
"containerDefinitions": [
{
"name": "openaq-fetch",
"image": "flasher/openaq-fetch",
"command": [
"./index.sh"
],
"logConfiguration": {
"logDriver": "awslogs",
"options": {
"awslogs-group": "/ecs/openaq-fetch",
"awslogs-region": "us-east-1",
"awslogs-stream-prefix": "ecs"
}
}
}
],
"cpu": "2048",
"memory": "16384",
"networkMode": "awsvpc",
"executionRoleArn": "arn:aws:iam::470049585876:role/ecsTaskExecutionRole",
"requiresCompatibilities": [
"EC2",
"FARGATE"
]
} | Increase fargate size - develop | Increase fargate size - develop
| JSON | mit | openaq/openaq-fetch,openaq/openaq-fetch | json | ## Code Before:
{
"family": "openaq-fetch-fargate",
"taskRoleArn": "arn:aws:iam::470049585876:role/uploadsS3",
"containerDefinitions": [
{
"name": "openaq-fetch",
"image": "flasher/openaq-fetch",
"command": [
"npm",
"start"
],
"logConfiguration": {
"logDriver": "awslogs",
"options": {
"awslogs-group": "/ecs/openaq-fetch",
"awslogs-region": "us-east-1",
"awslogs-stream-prefix": "ecs"
}
}
}
],
"cpu": "512",
"memory": "1024",
"networkMode": "awsvpc",
"executionRoleArn": "arn:aws:iam::470049585876:role/ecsTaskExecutionRole",
"requiresCompatibilities": [
"EC2",
"FARGATE"
]
}
## Instruction:
Increase fargate size - develop
## Code After:
{
"family": "openaq-fetch-fargate",
"taskRoleArn": "arn:aws:iam::470049585876:role/uploadsS3",
"containerDefinitions": [
{
"name": "openaq-fetch",
"image": "flasher/openaq-fetch",
"command": [
"./index.sh"
],
"logConfiguration": {
"logDriver": "awslogs",
"options": {
"awslogs-group": "/ecs/openaq-fetch",
"awslogs-region": "us-east-1",
"awslogs-stream-prefix": "ecs"
}
}
}
],
"cpu": "2048",
"memory": "16384",
"networkMode": "awsvpc",
"executionRoleArn": "arn:aws:iam::470049585876:role/ecsTaskExecutionRole",
"requiresCompatibilities": [
"EC2",
"FARGATE"
]
} | {
"family": "openaq-fetch-fargate",
"taskRoleArn": "arn:aws:iam::470049585876:role/uploadsS3",
"containerDefinitions": [
{
"name": "openaq-fetch",
"image": "flasher/openaq-fetch",
"command": [
- "npm",
? ^^ -
+ "./index.sh"
? +++ ^^^^^^
- "start"
],
"logConfiguration": {
"logDriver": "awslogs",
"options": {
"awslogs-group": "/ecs/openaq-fetch",
"awslogs-region": "us-east-1",
"awslogs-stream-prefix": "ecs"
}
}
}
],
- "cpu": "512",
? --
+ "cpu": "2048",
? +++
- "memory": "1024",
? ^^
+ "memory": "16384",
? ^^^
"networkMode": "awsvpc",
"executionRoleArn": "arn:aws:iam::470049585876:role/ecsTaskExecutionRole",
"requiresCompatibilities": [
"EC2",
"FARGATE"
]
} | 7 | 0.233333 | 3 | 4 |
2acc0c6241c22e38b520494a76d9167933c71b17 | requirements.txt | requirements.txt | molo.core==6.10.2
molo.commenting==6.3.0
molo.yourwords==6.4.1
molo.yourtips==6.0.1
molo.servicedirectory==6.1.13
molo.surveys==6.9.9
molo.pwa==6.2.0
elasticsearch==1.7.0
django-modelcluster<4.0,>=3.1
psycopg2
gunicorn
django_compressor==2.2
celery<4.0
django-google-analytics-app==4.3.3
Unidecode==0.04.16
django-secretballot==1.0.0
django-likes==1.11
molo.globalsite==6.0.1
django-import-export==1.0.1
# Required by ImageHash, install a working version
scipy==0.19.1
python-cas==1.2.0
google-cloud-storage
| molo.core==6.10.2
molo.commenting==6.3.0
molo.yourwords==6.4.1
molo.yourtips==6.0.1
molo.servicedirectory==6.1.13
molo.surveys==6.9.9
molo.pwa==6.2.0
elasticsearch==1.7.0
django-modelcluster<4.0,>=3.1
psycopg2
gunicorn
django_compressor==2.2
celery<4.0
django-google-analytics-app==4.3.3
Unidecode==0.04.16
django-secretballot==1.0.0
django-likes==1.11
molo.globalsite==6.0.1
django-import-export==1.0.1
# Required by ImageHash, install a working version
scipy==0.19.1
python-cas==1.2.0
google-cloud-storage
django-storages==1.6.6
| Upgrade Django storage to 1.6.6 | Upgrade Django storage to 1.6.6
Upgrade Django storage to 1.6.6 in order to fix the issues with both S3Boto3Storage and GoogleCloudStorage | Text | bsd-2-clause | praekelt/molo-tuneme,praekelt/molo-tuneme,praekelt/molo-tuneme,praekelt/molo-tuneme | text | ## Code Before:
molo.core==6.10.2
molo.commenting==6.3.0
molo.yourwords==6.4.1
molo.yourtips==6.0.1
molo.servicedirectory==6.1.13
molo.surveys==6.9.9
molo.pwa==6.2.0
elasticsearch==1.7.0
django-modelcluster<4.0,>=3.1
psycopg2
gunicorn
django_compressor==2.2
celery<4.0
django-google-analytics-app==4.3.3
Unidecode==0.04.16
django-secretballot==1.0.0
django-likes==1.11
molo.globalsite==6.0.1
django-import-export==1.0.1
# Required by ImageHash, install a working version
scipy==0.19.1
python-cas==1.2.0
google-cloud-storage
## Instruction:
Upgrade Django storage to 1.6.6
Upgrade Django storage to 1.6.6 in order to fix the issues with both S3Boto3Storage and GoogleCloudStorage
## Code After:
molo.core==6.10.2
molo.commenting==6.3.0
molo.yourwords==6.4.1
molo.yourtips==6.0.1
molo.servicedirectory==6.1.13
molo.surveys==6.9.9
molo.pwa==6.2.0
elasticsearch==1.7.0
django-modelcluster<4.0,>=3.1
psycopg2
gunicorn
django_compressor==2.2
celery<4.0
django-google-analytics-app==4.3.3
Unidecode==0.04.16
django-secretballot==1.0.0
django-likes==1.11
molo.globalsite==6.0.1
django-import-export==1.0.1
# Required by ImageHash, install a working version
scipy==0.19.1
python-cas==1.2.0
google-cloud-storage
django-storages==1.6.6
| molo.core==6.10.2
molo.commenting==6.3.0
molo.yourwords==6.4.1
molo.yourtips==6.0.1
molo.servicedirectory==6.1.13
molo.surveys==6.9.9
molo.pwa==6.2.0
elasticsearch==1.7.0
django-modelcluster<4.0,>=3.1
psycopg2
gunicorn
django_compressor==2.2
celery<4.0
django-google-analytics-app==4.3.3
Unidecode==0.04.16
django-secretballot==1.0.0
django-likes==1.11
molo.globalsite==6.0.1
django-import-export==1.0.1
# Required by ImageHash, install a working version
scipy==0.19.1
python-cas==1.2.0
- google-cloud-storage
+ google-cloud-storage
? +
+ django-storages==1.6.6 | 3 | 0.130435 | 2 | 1 |
1320373c99541658f50bde97291fb849a18200a6 | README.md | README.md |
Virtlet is a Kubernetes runtime server which allows you to run VM workloads, based on QCOW2 images.
## Running local environment
To run local environment, please install [docker-compose](https://pypi.python.org/pypi/docker-compose) and then do:
```
cd contrib/docker-compose
docker-compose up
``` |
Virtlet is a Kubernetes runtime server which allows you to run VM workloads, based on QCOW2 images.
## Running local environment
To run local environment, please install [docker-compose](https://pypi.python.org/pypi/docker-compose)
at least in 1.8.0 version. If your Linux distribution is providing an older version, we suggest to
use Python virtualenv(wrapper):
```
apt-get install virtualenvwrapper
mkvirtualenv docker-compose
pip install docker-compose
```
If you have docker-compose ready to use, you can set up the virtlet dev environment by doing:
```
cd contrib/docker-compose
docker-compose up
``` | Add information about docker-compose min version | docs: Add information about docker-compose min version
| Markdown | apache-2.0 | Mirantis/virtlet,vefimova/virtlet,nhlfr/virtlet,nhlfr/virtlet,vefimova/virtlet,vefimova/virtlet,Mirantis/virtlet,nhlfr/virtlet,nhlfr/virtlet,vefimova/virtlet | markdown | ## Code Before:
Virtlet is a Kubernetes runtime server which allows you to run VM workloads, based on QCOW2 images.
## Running local environment
To run local environment, please install [docker-compose](https://pypi.python.org/pypi/docker-compose) and then do:
```
cd contrib/docker-compose
docker-compose up
```
## Instruction:
docs: Add information about docker-compose min version
## Code After:
Virtlet is a Kubernetes runtime server which allows you to run VM workloads, based on QCOW2 images.
## Running local environment
To run local environment, please install [docker-compose](https://pypi.python.org/pypi/docker-compose)
at least in 1.8.0 version. If your Linux distribution is providing an older version, we suggest to
use Python virtualenv(wrapper):
```
apt-get install virtualenvwrapper
mkvirtualenv docker-compose
pip install docker-compose
```
If you have docker-compose ready to use, you can set up the virtlet dev environment by doing:
```
cd contrib/docker-compose
docker-compose up
``` |
Virtlet is a Kubernetes runtime server which allows you to run VM workloads, based on QCOW2 images.
## Running local environment
- To run local environment, please install [docker-compose](https://pypi.python.org/pypi/docker-compose) and then do:
? -------------
+ To run local environment, please install [docker-compose](https://pypi.python.org/pypi/docker-compose)
+ at least in 1.8.0 version. If your Linux distribution is providing an older version, we suggest to
+ use Python virtualenv(wrapper):
+
+ ```
+ apt-get install virtualenvwrapper
+ mkvirtualenv docker-compose
+ pip install docker-compose
+ ```
+
+ If you have docker-compose ready to use, you can set up the virtlet dev environment by doing:
```
cd contrib/docker-compose
docker-compose up
``` | 12 | 1.090909 | 11 | 1 |
5e7b67e55f4b987753e5e3c632ae6fd4d4cf254f | messaging-client-databus/src/main/java/com/inmobi/messaging/consumer/databus/DatabusConsumerConfig.java | messaging-client-databus/src/main/java/com/inmobi/messaging/consumer/databus/DatabusConsumerConfig.java | package com.inmobi.messaging.consumer.databus;
import com.inmobi.databus.FSCheckpointProvider;
/**
* Configuration properties and their default values for {@link DatabusConsumer}
*/
public interface DatabusConsumerConfig {
public static final String queueSizeConfig = "databus.consumer.buffer.size";
public static final int DEFAULT_QUEUE_SIZE = 5000;
public static final String waitTimeForFlushConfig =
"databus.consumer.waittime.forcollectorflush";
public static final long DEFAULT_WAIT_TIME_FOR_FLUSH = 1000; // 1 second
public static final String databusConfigFileKey = "databus.conf";
public static final String DEFAULT_DATABUS_CONFIG_FILE = "databus.xml";
public static final String databusClustersConfig = "databus.consumer.clusters";
public static final String databusChkProviderConfig =
"databus.consumer.chkpoint.provider.classname";
public static final String DEFAULT_CHK_PROVIDER = FSCheckpointProvider.class
.getName();
public static final String checkpointDirConfig =
"databus.consumer.checkpoint.dir";
public static final String DEFAULT_CHECKPOINT_DIR = ".";
}
| package com.inmobi.messaging.consumer.databus;
import com.inmobi.databus.FSCheckpointProvider;
/**
* Configuration properties and their default values for {@link DatabusConsumer}
*/
public interface DatabusConsumerConfig {
public static final String queueSizeConfig = "databus.consumer.buffer.size";
public static final int DEFAULT_QUEUE_SIZE = 5000;
public static final String waitTimeForFlushConfig =
"databus.consumer.waittime.forcollectorflush";
public static final long DEFAULT_WAIT_TIME_FOR_FLUSH = 1000; // 1 second
public static final String databusConfigFileKey = "databus.conf";
public static final String DEFAULT_DATABUS_CONFIG_FILE = "databus.xml";
public static final String databusClustersConfig = "databus.consumer.clusters";
public static final String databusChkProviderConfig =
"databus.consumer.chkpoint.provider.classname";
public static final String DEFAULT_CHK_PROVIDER = FSCheckpointProvider.class
.getName();
public static final String checkpointDirConfig =
"databus.consumer.checkpoint.dir";
public static final String DEFAULT_CHECKPOINT_DIR = ".";
public static final String databusConsumerPrincipal =
"databus.consumer.principal.name";
public static final String databusConsumerKeytab =
"databus.consumer.keytab.path";
}
| Fix typo in Config name | Fix typo in Config name
| Java | apache-2.0 | rajubairishetti/pintail,InMobi/pintail,rajubairishetti/pintail,sreedishps/pintail,sreedishps/pintail,InMobi/pintail | java | ## Code Before:
package com.inmobi.messaging.consumer.databus;
import com.inmobi.databus.FSCheckpointProvider;
/**
* Configuration properties and their default values for {@link DatabusConsumer}
*/
public interface DatabusConsumerConfig {
public static final String queueSizeConfig = "databus.consumer.buffer.size";
public static final int DEFAULT_QUEUE_SIZE = 5000;
public static final String waitTimeForFlushConfig =
"databus.consumer.waittime.forcollectorflush";
public static final long DEFAULT_WAIT_TIME_FOR_FLUSH = 1000; // 1 second
public static final String databusConfigFileKey = "databus.conf";
public static final String DEFAULT_DATABUS_CONFIG_FILE = "databus.xml";
public static final String databusClustersConfig = "databus.consumer.clusters";
public static final String databusChkProviderConfig =
"databus.consumer.chkpoint.provider.classname";
public static final String DEFAULT_CHK_PROVIDER = FSCheckpointProvider.class
.getName();
public static final String checkpointDirConfig =
"databus.consumer.checkpoint.dir";
public static final String DEFAULT_CHECKPOINT_DIR = ".";
}
## Instruction:
Fix typo in Config name
## Code After:
package com.inmobi.messaging.consumer.databus;
import com.inmobi.databus.FSCheckpointProvider;
/**
* Configuration properties and their default values for {@link DatabusConsumer}
*/
public interface DatabusConsumerConfig {
public static final String queueSizeConfig = "databus.consumer.buffer.size";
public static final int DEFAULT_QUEUE_SIZE = 5000;
public static final String waitTimeForFlushConfig =
"databus.consumer.waittime.forcollectorflush";
public static final long DEFAULT_WAIT_TIME_FOR_FLUSH = 1000; // 1 second
public static final String databusConfigFileKey = "databus.conf";
public static final String DEFAULT_DATABUS_CONFIG_FILE = "databus.xml";
public static final String databusClustersConfig = "databus.consumer.clusters";
public static final String databusChkProviderConfig =
"databus.consumer.chkpoint.provider.classname";
public static final String DEFAULT_CHK_PROVIDER = FSCheckpointProvider.class
.getName();
public static final String checkpointDirConfig =
"databus.consumer.checkpoint.dir";
public static final String DEFAULT_CHECKPOINT_DIR = ".";
public static final String databusConsumerPrincipal =
"databus.consumer.principal.name";
public static final String databusConsumerKeytab =
"databus.consumer.keytab.path";
}
| package com.inmobi.messaging.consumer.databus;
import com.inmobi.databus.FSCheckpointProvider;
/**
* Configuration properties and their default values for {@link DatabusConsumer}
*/
public interface DatabusConsumerConfig {
public static final String queueSizeConfig = "databus.consumer.buffer.size";
public static final int DEFAULT_QUEUE_SIZE = 5000;
public static final String waitTimeForFlushConfig =
"databus.consumer.waittime.forcollectorflush";
public static final long DEFAULT_WAIT_TIME_FOR_FLUSH = 1000; // 1 second
public static final String databusConfigFileKey = "databus.conf";
public static final String DEFAULT_DATABUS_CONFIG_FILE = "databus.xml";
public static final String databusClustersConfig = "databus.consumer.clusters";
public static final String databusChkProviderConfig =
"databus.consumer.chkpoint.provider.classname";
public static final String DEFAULT_CHK_PROVIDER = FSCheckpointProvider.class
.getName();
public static final String checkpointDirConfig =
"databus.consumer.checkpoint.dir";
public static final String DEFAULT_CHECKPOINT_DIR = ".";
+ public static final String databusConsumerPrincipal =
+ "databus.consumer.principal.name";
+ public static final String databusConsumerKeytab =
+ "databus.consumer.keytab.path";
+
} | 5 | 0.16129 | 5 | 0 |
46726c4ffa45cfa6a42f46f28326afe2dd834ce9 | stylesheets/src/navbar.less | stylesheets/src/navbar.less | .above;
.bkg(#EEE);
.dock-top(fixed, 48px);
.pad(4px);
.user-select-none;
border-bottom: 2px solid @lightGrey;
.logo {
.dock-left(absolute, 48px);
.img-logo;
.no-outline;
margin: 4px;
&:hover, &:focus {
-webkit-filter: brightness(110%);
filter: brightness(110%);
}
}
h1 {
.fill(absolute);
.single-line(ellipsis);
color: #C4C4C4;
font: 600 28px/40px @baseFont;
margin: 6px 64px 6px 64px;
}
}
| .above;
.bkg(#EEE);
.dock-top(fixed, 48px);
.pad(4px);
.user-select-none;
border-bottom: 2px solid @lightGrey;
.logo {
.dock-left(absolute, 40px);
.img-logo-glyph;
.no-outline;
margin: 8px;
&:hover, &:focus {
-webkit-filter: brightness(110%);
filter: brightness(110%);
}
}
h1 {
.fill(absolute);
.single-line(ellipsis);
color: #C4C4C4;
font: 600 28px/40px @baseFont;
margin: 6px 64px 6px 64px;
}
}
#account {
.block;
.dock-right(absolute, 128px);
.no-underline;
.text-center;
border-left: 3px solid @lightGrey;
color: #AAA;
margin: 8px 0;
p, b { margin: 2px 4px; }
b {
.block;
.single-line;
color: #BBB;
}
}
| Use a more compact SVG glyph-style version of the logo. | Use a more compact SVG glyph-style version of the logo.
| Less | agpl-3.0 | CamiloMM/Noumena,CamiloMM/Noumena,CamiloMM/Noumena | less | ## Code Before:
.above;
.bkg(#EEE);
.dock-top(fixed, 48px);
.pad(4px);
.user-select-none;
border-bottom: 2px solid @lightGrey;
.logo {
.dock-left(absolute, 48px);
.img-logo;
.no-outline;
margin: 4px;
&:hover, &:focus {
-webkit-filter: brightness(110%);
filter: brightness(110%);
}
}
h1 {
.fill(absolute);
.single-line(ellipsis);
color: #C4C4C4;
font: 600 28px/40px @baseFont;
margin: 6px 64px 6px 64px;
}
}
## Instruction:
Use a more compact SVG glyph-style version of the logo.
## Code After:
.above;
.bkg(#EEE);
.dock-top(fixed, 48px);
.pad(4px);
.user-select-none;
border-bottom: 2px solid @lightGrey;
.logo {
.dock-left(absolute, 40px);
.img-logo-glyph;
.no-outline;
margin: 8px;
&:hover, &:focus {
-webkit-filter: brightness(110%);
filter: brightness(110%);
}
}
h1 {
.fill(absolute);
.single-line(ellipsis);
color: #C4C4C4;
font: 600 28px/40px @baseFont;
margin: 6px 64px 6px 64px;
}
}
#account {
.block;
.dock-right(absolute, 128px);
.no-underline;
.text-center;
border-left: 3px solid @lightGrey;
color: #AAA;
margin: 8px 0;
p, b { margin: 2px 4px; }
b {
.block;
.single-line;
color: #BBB;
}
}
| .above;
.bkg(#EEE);
.dock-top(fixed, 48px);
.pad(4px);
.user-select-none;
border-bottom: 2px solid @lightGrey;
.logo {
- .dock-left(absolute, 48px);
? ^
+ .dock-left(absolute, 40px);
? ^
- .img-logo;
+ .img-logo-glyph;
? ++++++
.no-outline;
- margin: 4px;
? ^
+ margin: 8px;
? ^
&:hover, &:focus {
-webkit-filter: brightness(110%);
filter: brightness(110%);
}
}
h1 {
.fill(absolute);
.single-line(ellipsis);
color: #C4C4C4;
font: 600 28px/40px @baseFont;
margin: 6px 64px 6px 64px;
}
}
+
+ #account {
+ .block;
+ .dock-right(absolute, 128px);
+ .no-underline;
+ .text-center;
+ border-left: 3px solid @lightGrey;
+ color: #AAA;
+ margin: 8px 0;
+
+ p, b { margin: 2px 4px; }
+
+ b {
+ .block;
+ .single-line;
+ color: #BBB;
+ }
+ } | 24 | 0.888889 | 21 | 3 |
58ea8a277bb765a744cc1e954fe1b76fc3f8036a | examples/responses.pl | examples/responses.pl | use Mojolicious::Lite;
get '/res1' => sub {
my $c = shift;
$c->render(data => 'Hello World!');
};
get '/res2' => sub {
my $c = shift;
$c->write('Hello ');
$c->write('World!');
$c->write('');
};
get '/res3' => sub {
my $c = shift;
$c->write_chunk('Hello ');
$c->write_chunk('World!');
$c->write_chunk('');
};
get '/res4' => sub {
my $c = shift;
$c->render(data => '', status => 204);
};
app->start;
| use Mojolicious::Lite;
get '/res1' => sub {
my $c = shift;
$c->render(data => 'Hello World!');
};
get '/res2' => sub {
my $c = shift;
$c->write('Hello ');
$c->write('World!');
$c->write('');
};
get '/res3' => sub {
my $c = shift;
$c->write_chunk('Hello ');
$c->write_chunk('World!');
$c->write_chunk('');
};
get '/res4' => sub {
my $c = shift;
$c->render(data => '', status => 204);
};
get '/res5' => sub {
die 'Hello World!';
};
app->start;
| Add another way to trigger rendering to the example app | Add another way to trigger rendering to the example app
| Perl | artistic-2.0 | kberov/mojo,polettix/mojo,kraih/mojo,kiwiroy/mojo,s1037989/mojo,s1037989/mojo,lindleyw/mojo | perl | ## Code Before:
use Mojolicious::Lite;
get '/res1' => sub {
my $c = shift;
$c->render(data => 'Hello World!');
};
get '/res2' => sub {
my $c = shift;
$c->write('Hello ');
$c->write('World!');
$c->write('');
};
get '/res3' => sub {
my $c = shift;
$c->write_chunk('Hello ');
$c->write_chunk('World!');
$c->write_chunk('');
};
get '/res4' => sub {
my $c = shift;
$c->render(data => '', status => 204);
};
app->start;
## Instruction:
Add another way to trigger rendering to the example app
## Code After:
use Mojolicious::Lite;
get '/res1' => sub {
my $c = shift;
$c->render(data => 'Hello World!');
};
get '/res2' => sub {
my $c = shift;
$c->write('Hello ');
$c->write('World!');
$c->write('');
};
get '/res3' => sub {
my $c = shift;
$c->write_chunk('Hello ');
$c->write_chunk('World!');
$c->write_chunk('');
};
get '/res4' => sub {
my $c = shift;
$c->render(data => '', status => 204);
};
get '/res5' => sub {
die 'Hello World!';
};
app->start;
| use Mojolicious::Lite;
get '/res1' => sub {
my $c = shift;
$c->render(data => 'Hello World!');
};
get '/res2' => sub {
my $c = shift;
$c->write('Hello ');
$c->write('World!');
$c->write('');
};
get '/res3' => sub {
my $c = shift;
$c->write_chunk('Hello ');
$c->write_chunk('World!');
$c->write_chunk('');
};
get '/res4' => sub {
my $c = shift;
$c->render(data => '', status => 204);
};
+ get '/res5' => sub {
+ die 'Hello World!';
+ };
+
app->start; | 4 | 0.148148 | 4 | 0 |
d540b376389b6e6c61c6fbe5912d182c2b9867b4 | app/assets/stylesheets/application.sass | app/assets/stylesheets/application.sass | // Library
@import normalize
@import foundation/foundation_and_overrides
@import jquery-ui/core
@import jquery-ui/menu
@import jquery-ui/theme
@import jquery-ui/autocomplete
@import font-awesome
@import mapbox
@import outdatedbrowser/outdatedBrowser
@import vex
@import vex-theme-plain
@import magnific-popup
@import fotorama
@import fullcalendar
@import mediaelement_rails.css
// Globals
@import globals/mixins
@import globals/settings
@import nprogress
// Base
@import base/base
@import base/forms
// Layout
@import layout/page
// Pages
@import pages/homepage
@import pages/contact
@import pages/blog
@import pages/search
// Modules
@import modules/header
@import modules/logo
@import modules/language
@import modules/menu
@import modules/adminbar
@import modules/ribbon
// @import modules/breadcrumb
@import modules/background
@import modules/comment
@import modules/social
@import modules/prev_next
@import modules/footer
@import modules/sticky_footer
@import modules/404
// custom_Plugins
// @import custom_plugins/devkit
@import custom_plugins/mapbox
@import custom_plugins/mapbox-popup
@import custom_plugins/alert
@import custom_plugins/autocomplete
@import custom_plugins/fotorama
@import custom_plugins/mediaelement
@import custom_plugins/cookie_cnil
@import custom_plugins/maintenance
| // Library
@import normalize
@import foundation/foundation_and_overrides
@import jquery-ui/core
@import jquery-ui/menu
@import jquery-ui/theme
@import jquery-ui/autocomplete
@import font-awesome
@import mapbox
@import outdatedbrowser/outdatedBrowser
@import vex
@import vex-theme-plain
@import magnific-popup
@import fotorama
@import fullcalendar
@import mediaelement_rails/mediaelementplayer
// Globals
@import globals/mixins
@import globals/settings
@import nprogress
// Base
@import base/base
@import base/forms
// Layout
@import layout/page
// Pages
@import pages/homepage
@import pages/contact
@import pages/blog
@import pages/search
// Modules
@import modules/header
@import modules/logo
@import modules/language
@import modules/menu
@import modules/adminbar
@import modules/ribbon
// @import modules/breadcrumb
@import modules/background
@import modules/comment
@import modules/social
@import modules/prev_next
@import modules/footer
@import modules/sticky_footer
@import modules/404
// custom_Plugins
// @import custom_plugins/devkit
@import custom_plugins/mapbox
@import custom_plugins/mapbox-popup
@import custom_plugins/alert
@import custom_plugins/autocomplete
@import custom_plugins/fotorama
@import custom_plugins/mediaelement
@import custom_plugins/cookie_cnil
@import custom_plugins/maintenance
| Fix bug with missing mediaelement style in production | Fix bug with missing mediaelement style in production
| Sass | mit | lr-agenceweb/rails-starter,lr-agenceweb/rails-starter,lr-agenceweb/rails-starter,lr-agenceweb/rails-starter | sass | ## Code Before:
// Library
@import normalize
@import foundation/foundation_and_overrides
@import jquery-ui/core
@import jquery-ui/menu
@import jquery-ui/theme
@import jquery-ui/autocomplete
@import font-awesome
@import mapbox
@import outdatedbrowser/outdatedBrowser
@import vex
@import vex-theme-plain
@import magnific-popup
@import fotorama
@import fullcalendar
@import mediaelement_rails.css
// Globals
@import globals/mixins
@import globals/settings
@import nprogress
// Base
@import base/base
@import base/forms
// Layout
@import layout/page
// Pages
@import pages/homepage
@import pages/contact
@import pages/blog
@import pages/search
// Modules
@import modules/header
@import modules/logo
@import modules/language
@import modules/menu
@import modules/adminbar
@import modules/ribbon
// @import modules/breadcrumb
@import modules/background
@import modules/comment
@import modules/social
@import modules/prev_next
@import modules/footer
@import modules/sticky_footer
@import modules/404
// custom_Plugins
// @import custom_plugins/devkit
@import custom_plugins/mapbox
@import custom_plugins/mapbox-popup
@import custom_plugins/alert
@import custom_plugins/autocomplete
@import custom_plugins/fotorama
@import custom_plugins/mediaelement
@import custom_plugins/cookie_cnil
@import custom_plugins/maintenance
## Instruction:
Fix bug with missing mediaelement style in production
## Code After:
// Library
@import normalize
@import foundation/foundation_and_overrides
@import jquery-ui/core
@import jquery-ui/menu
@import jquery-ui/theme
@import jquery-ui/autocomplete
@import font-awesome
@import mapbox
@import outdatedbrowser/outdatedBrowser
@import vex
@import vex-theme-plain
@import magnific-popup
@import fotorama
@import fullcalendar
@import mediaelement_rails/mediaelementplayer
// Globals
@import globals/mixins
@import globals/settings
@import nprogress
// Base
@import base/base
@import base/forms
// Layout
@import layout/page
// Pages
@import pages/homepage
@import pages/contact
@import pages/blog
@import pages/search
// Modules
@import modules/header
@import modules/logo
@import modules/language
@import modules/menu
@import modules/adminbar
@import modules/ribbon
// @import modules/breadcrumb
@import modules/background
@import modules/comment
@import modules/social
@import modules/prev_next
@import modules/footer
@import modules/sticky_footer
@import modules/404
// custom_Plugins
// @import custom_plugins/devkit
@import custom_plugins/mapbox
@import custom_plugins/mapbox-popup
@import custom_plugins/alert
@import custom_plugins/autocomplete
@import custom_plugins/fotorama
@import custom_plugins/mediaelement
@import custom_plugins/cookie_cnil
@import custom_plugins/maintenance
| // Library
@import normalize
@import foundation/foundation_and_overrides
@import jquery-ui/core
@import jquery-ui/menu
@import jquery-ui/theme
@import jquery-ui/autocomplete
@import font-awesome
@import mapbox
@import outdatedbrowser/outdatedBrowser
@import vex
@import vex-theme-plain
@import magnific-popup
@import fotorama
@import fullcalendar
- @import mediaelement_rails.css
+ @import mediaelement_rails/mediaelementplayer
// Globals
@import globals/mixins
@import globals/settings
@import nprogress
// Base
@import base/base
@import base/forms
// Layout
@import layout/page
// Pages
@import pages/homepage
@import pages/contact
@import pages/blog
@import pages/search
// Modules
@import modules/header
@import modules/logo
@import modules/language
@import modules/menu
@import modules/adminbar
@import modules/ribbon
// @import modules/breadcrumb
@import modules/background
@import modules/comment
@import modules/social
@import modules/prev_next
@import modules/footer
@import modules/sticky_footer
@import modules/404
// custom_Plugins
// @import custom_plugins/devkit
@import custom_plugins/mapbox
@import custom_plugins/mapbox-popup
@import custom_plugins/alert
@import custom_plugins/autocomplete
@import custom_plugins/fotorama
@import custom_plugins/mediaelement
@import custom_plugins/cookie_cnil
@import custom_plugins/maintenance | 2 | 0.031746 | 1 | 1 |
677239f3d9996d7f5ee701aa5d903a01ec0e2f6b | js/README.md | js/README.md |
The squares on the grid are numbered from 0 to 8, left to right, top to bottom:
012
345
678
This allows the squares to be accessed easily in an array.
The two files `x.json` and `o.json` contain the unminified DFA for the game's "impossible" mode for the machine playing as X and O, respectively. Each object is a DFA state object that contains a `move` field and a `status` field. The `move` field specifies the next move that the machine should take; its value is an integer from 0 to 8 representing one of the squares on the grid. The `status` field specifies the game's winning status: `1` for a win, `-1` for a loss, `0` for a draw, and `null` for an unfinished game (you will never see `-1` because this machine never loses _\*evil laughter\*_). If the value of `status` is `null`, then the object will also contain entries whose keys correspond to each of the empty squares on the grid available for the opponent to play and whose values are a DFA state object. `o.json` contains an array at the root because the machine is expected to play second; the elements in the array (with keys 0 to 8) are DFA state objects.
The data for `x.json` and `o.json` is courtesy of [XKCD comic #832](https://www.xkcd.com/832/).
|
The squares on the grid are numbered from 0 to 8, left to right, top to bottom:
012
345
678
This allows the squares to be accessed easily in an array.
The two files `x.json` and `o.json` contain the unminified DFA for the game's "impossible" mode for the machine playing as X and O, respectively. Each object is a DFA state object that contains a `move` field and a `status` field. The `move` field specifies the next move that the machine should take; its value is an integer from 0 to 8 representing one of the squares on the grid. The `status` field specifies the game's winning status: `1` for a win, `-1` for a loss, `0` for a draw, and `null` for an unfinished game (you will never see `-1` because this machine never loses _\*evil laughter\*_). If the value of `status` is `null`, then the object will also contain entries whose keys correspond to each of the empty squares on the grid available for the opponent to play and whose values are a DFA state object. `o.json` contains an array at the root because the machine is expected to play second; the elements in the array (with keys 0 to 8) are DFA state objects.
The data for `x.json` and `o.json` is based on [XKCD comic #832](https://www.xkcd.com/832/). Wikipedia has a [corrected version](https://en.wikipedia.org/wiki/Tic-tac-toe#Strategy).
| Add comment about corrected version | Add comment about corrected version
| Markdown | mpl-2.0 | tpenguinltg/tic-tac-toe,tpenguinltg/tic-tac-toe | markdown | ## Code Before:
The squares on the grid are numbered from 0 to 8, left to right, top to bottom:
012
345
678
This allows the squares to be accessed easily in an array.
The two files `x.json` and `o.json` contain the unminified DFA for the game's "impossible" mode for the machine playing as X and O, respectively. Each object is a DFA state object that contains a `move` field and a `status` field. The `move` field specifies the next move that the machine should take; its value is an integer from 0 to 8 representing one of the squares on the grid. The `status` field specifies the game's winning status: `1` for a win, `-1` for a loss, `0` for a draw, and `null` for an unfinished game (you will never see `-1` because this machine never loses _\*evil laughter\*_). If the value of `status` is `null`, then the object will also contain entries whose keys correspond to each of the empty squares on the grid available for the opponent to play and whose values are a DFA state object. `o.json` contains an array at the root because the machine is expected to play second; the elements in the array (with keys 0 to 8) are DFA state objects.
The data for `x.json` and `o.json` is courtesy of [XKCD comic #832](https://www.xkcd.com/832/).
## Instruction:
Add comment about corrected version
## Code After:
The squares on the grid are numbered from 0 to 8, left to right, top to bottom:
012
345
678
This allows the squares to be accessed easily in an array.
The two files `x.json` and `o.json` contain the unminified DFA for the game's "impossible" mode for the machine playing as X and O, respectively. Each object is a DFA state object that contains a `move` field and a `status` field. The `move` field specifies the next move that the machine should take; its value is an integer from 0 to 8 representing one of the squares on the grid. The `status` field specifies the game's winning status: `1` for a win, `-1` for a loss, `0` for a draw, and `null` for an unfinished game (you will never see `-1` because this machine never loses _\*evil laughter\*_). If the value of `status` is `null`, then the object will also contain entries whose keys correspond to each of the empty squares on the grid available for the opponent to play and whose values are a DFA state object. `o.json` contains an array at the root because the machine is expected to play second; the elements in the array (with keys 0 to 8) are DFA state objects.
The data for `x.json` and `o.json` is based on [XKCD comic #832](https://www.xkcd.com/832/). Wikipedia has a [corrected version](https://en.wikipedia.org/wiki/Tic-tac-toe#Strategy).
|
The squares on the grid are numbered from 0 to 8, left to right, top to bottom:
012
345
678
This allows the squares to be accessed easily in an array.
The two files `x.json` and `o.json` contain the unminified DFA for the game's "impossible" mode for the machine playing as X and O, respectively. Each object is a DFA state object that contains a `move` field and a `status` field. The `move` field specifies the next move that the machine should take; its value is an integer from 0 to 8 representing one of the squares on the grid. The `status` field specifies the game's winning status: `1` for a win, `-1` for a loss, `0` for a draw, and `null` for an unfinished game (you will never see `-1` because this machine never loses _\*evil laughter\*_). If the value of `status` is `null`, then the object will also contain entries whose keys correspond to each of the empty squares on the grid available for the opponent to play and whose values are a DFA state object. `o.json` contains an array at the root because the machine is expected to play second; the elements in the array (with keys 0 to 8) are DFA state objects.
- The data for `x.json` and `o.json` is courtesy of [XKCD comic #832](https://www.xkcd.com/832/).
+ The data for `x.json` and `o.json` is based on [XKCD comic #832](https://www.xkcd.com/832/). Wikipedia has a [corrected version](https://en.wikipedia.org/wiki/Tic-tac-toe#Strategy). | 2 | 0.166667 | 1 | 1 |
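The DFA state shape described in that README (a `move`, a `status`, and numeric keys mapping each open square the opponent might play to the next state) lends itself to a very small game loop. As a rough sketch — the two-ply state table below is invented for illustration and is **not** the real XKCD-derived `x.json` data:

```javascript
// Invented miniature DFA in the shape the README describes: `move` is the
// machine's square, `status` the result (1 win, 0 draw, null ongoing), and
// numeric keys map the opponent's reply square to the next state object.
const demoDfa = {
  move: 4, status: null,        // machine opens on the centre square
  0: { move: 8, status: 1 },    // opponent plays 0 -> machine plays 8, win
  1: { move: 7, status: 0 }     // opponent plays 1 -> machine plays 7, draw
};

// Apply the machine's move from the current state, record the opponent's
// reply, and follow the matching transition to the next state object.
function step(state, board, opponentSquare) {
  board[state.move] = 'X';                 // machine's move from the DFA
  if (state.status !== null) throw new Error('game already finished');
  board[opponentSquare] = 'O';             // opponent's reply
  const next = state[opponentSquare];
  if (!next) throw new Error('no transition for square ' + opponentSquare);
  return next;
}

const board = Array(9).fill(null);         // squares 0..8, as in the README
const next = step(demoDfa, board, 0);      // opponent answers with square 0
board[next.move] = 'X';                    // machine's final move from the DFA
```

After this run `next.status` is `1`, i.e. the machine reports a win — mirroring how a client would walk the real `x.json` tree one opponent move at a time.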
a8dc6903bbeb207b63fcde2f318501d27052ec07 | js/background.js | js/background.js | chrome.tabs.onCreated.addListener(function(tab) {
console.log("Tab Created", tab);
});
chrome.tabs.onUpdated.addListener(function(tabId, changeInfo, tab) {
console.log("Tab changed: #" + tabId, changeInfo, tab );
});
chrome.browserAction.onClicked.addListener(function() {
console.log("Browser action clicked!");
}); | chrome.tabs.onCreated.addListener(function(tab) {
console.log("Tab Created", tab);
});
chrome.tabs.onUpdated.addListener(function(tabId, changeInfo, tab) {
console.log("Tab changed: #" + tabId, changeInfo, tab );
});
chrome.storage.sync.get(null, function(items) {
console.log("All items in synced storage", items );
});
chrome.storage.sync.set({name : "Pierce Moore", _config : _config }, function() {
console.log("Set name to Pierce Moore");
});
chrome.storage.sync.get("name", function(items) {
console.log("Name in synced storage", items );
});
chrome.storage.sync.get(null, function(items) {
console.log("All items in synced storage", items );
});
chrome.storage.sync.getBytesInUse(null, function(bytes) {
console.log("Total bytes in use:" + bytes + " bytes");
});
var d = new Date();
console.log(d.toUTCString()); | Set of generic functions to test the chrome.* APIs | Set of generic functions to test the chrome.* APIs
| JavaScript | bsd-3-clause | rex/BANTP | javascript | ## Code Before:
chrome.tabs.onCreated.addListener(function(tab) {
console.log("Tab Created", tab);
});
chrome.tabs.onUpdated.addListener(function(tabId, changeInfo, tab) {
console.log("Tab changed: #" + tabId, changeInfo, tab );
});
chrome.browserAction.onClicked.addListener(function() {
console.log("Browser action clicked!");
});
## Instruction:
Set of generic functions to test the chrome.* APIs
## Code After:
chrome.tabs.onCreated.addListener(function(tab) {
console.log("Tab Created", tab);
});
chrome.tabs.onUpdated.addListener(function(tabId, changeInfo, tab) {
console.log("Tab changed: #" + tabId, changeInfo, tab );
});
chrome.storage.sync.get(null, function(items) {
console.log("All items in synced storage", items );
});
chrome.storage.sync.set({name : "Pierce Moore", _config : _config }, function() {
console.log("Set name to Pierce Moore");
});
chrome.storage.sync.get("name", function(items) {
console.log("Name in synced storage", items );
});
chrome.storage.sync.get(null, function(items) {
console.log("All items in synced storage", items );
});
chrome.storage.sync.getBytesInUse(null, function(bytes) {
console.log("Total bytes in use:" + bytes + " bytes");
});
var d = new Date();
console.log(d.toUTCString()); | chrome.tabs.onCreated.addListener(function(tab) {
console.log("Tab Created", tab);
});
chrome.tabs.onUpdated.addListener(function(tabId, changeInfo, tab) {
console.log("Tab changed: #" + tabId, changeInfo, tab );
});
- chrome.browserAction.onClicked.addListener(function() {
- console.log("Browser action clicked!");
+ chrome.storage.sync.get(null, function(items) {
+ console.log("All items in synced storage", items );
});
+
+ chrome.storage.sync.set({name : "Pierce Moore", _config : _config }, function() {
+ console.log("Set name to Pierce Moore");
+ });
+
+ chrome.storage.sync.get("name", function(items) {
+ console.log("Name in synced storage", items );
+ });
+
+ chrome.storage.sync.get(null, function(items) {
+ console.log("All items in synced storage", items );
+ });
+
+ chrome.storage.sync.getBytesInUse(null, function(bytes) {
+ console.log("Total bytes in use:" + bytes + " bytes");
+ });
+
+ var d = new Date();
+
+ console.log(d.toUTCString()); | 24 | 2.181818 | 22 | 2 |
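The `chrome.storage.sync` calls in the snippet above are callback-based, so sequencing several of them leads to nesting. One common refactor — sketched here in plain JavaScript with the storage area passed in as a parameter, since `chrome` only exists inside an extension — is a thin Promise wrapper:

```javascript
// Wrap a chrome.storage-style area (anything whose get/set take a
// trailing callback) so its operations can be awaited instead of nested.
function promisify(storageArea) {
  return {
    get: keys => new Promise(resolve => storageArea.get(keys, resolve)),
    set: items => new Promise(resolve => storageArea.set(items, resolve)),
  };
}

// Inside the extension this would be promisify(chrome.storage.sync).
async function demo(storageArea) {
  const sync = promisify(storageArea);
  await sync.set({ name: 'Pierce Moore' });
  const items = await sync.get('name');
  return items.name;
}
```

With the real `chrome.storage.sync` the behaviour is the same; note that the real API also reports failures via `chrome.runtime.lastError`, which a production wrapper would check inside the callback before resolving.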
edb093979ffd6a0fe8494b1bb6d8715f5f2d15cf | src/site/adminBundle/Resources/config/security.yml | src/site/adminBundle/Resources/config/security.yml | security:
providers:
fos_userbundle:
id: fos_user.user_provider.username
firewalls:
main:
pattern: .*
form_login:
provider: fos_userbundle
csrf_provider: form.csrf_provider
always_use_default_target_path: true
#default_target_path: /admin/
default_target_path: /login-check
check_path: fos_user_security_check
#login-path: fos_user_security_login
login-path: loginredirect
logout:
path: fos_user_security_logout
anonymous: true
access_control:
- { path: /admin/login, role: IS_AUTHENTICATED_ANONYMOUSLY }
- { path: /admin/login_check, role: IS_AUTHENTICATED_ANONYMOUSLY }
- { path: /admin/*, role: ROLE_ADMIN }
- { path: /reserver/*, role: ROLE_PASSAGER }
- { path: /admin/profile/change-password, role: ROLE_USER}
# - { path: admin/register, role: IS_AUTHENTICATED_ANONYMOUSLY }
# - { path: ^/resetting, role: IS_AUTHENTICATED_ANONYMOUSLY }
| security:
providers:
fos_userbundle:
id: fos_user.user_provider.username
firewalls:
main:
pattern: .*
form_login:
provider: fos_userbundle
csrf_provider: form.csrf_provider
always_use_default_target_path: true
#default_target_path: /admin/
default_target_path: /login-check
check_path: fos_user_security_check
#login-path: fos_user_security_login
login-path: loginredirect
logout:
path: fos_user_security_logout
anonymous: true
access_control:
- { path: /admin/login, role: IS_AUTHENTICATED_ANONYMOUSLY }
- { path: /admin/login_check, role: IS_AUTHENTICATED_ANONYMOUSLY }
- { path: /admin/*, role: ROLE_ADMIN }
- { path: /reserver/*, role: ROLE_PASSAGER }
- { path: /admin/profile/change-password, role: ROLE_USER}
- { path: /mesreservations, role: ROLE_PASSAGER}
- { path: /profile, role: ROLE_PASSAGER}
- { path: /logout, role: ROLE_PASSAGER}
- { path: /reserver, role: ROLE_PASSAGER}
# - { path: admin/register, role: IS_AUTHENTICATED_ANONYMOUSLY }
# - { path: ^/resetting, role: IS_AUTHENTICATED_ANONYMOUSLY }
| Update Firewall for reservation coté client | Update Firewall for reservation coté client
| YAML | mit | kh3dr0n/projetphp,kh3dr0n/projetphp | yaml | ## Code Before:
security:
providers:
fos_userbundle:
id: fos_user.user_provider.username
firewalls:
main:
pattern: .*
form_login:
provider: fos_userbundle
csrf_provider: form.csrf_provider
always_use_default_target_path: true
#default_target_path: /admin/
default_target_path: /login-check
check_path: fos_user_security_check
#login-path: fos_user_security_login
login-path: loginredirect
logout:
path: fos_user_security_logout
anonymous: true
access_control:
- { path: /admin/login, role: IS_AUTHENTICATED_ANONYMOUSLY }
- { path: /admin/login_check, role: IS_AUTHENTICATED_ANONYMOUSLY }
- { path: /admin/*, role: ROLE_ADMIN }
- { path: /reserver/*, role: ROLE_PASSAGER }
- { path: /admin/profile/change-password, role: ROLE_USER}
# - { path: admin/register, role: IS_AUTHENTICATED_ANONYMOUSLY }
# - { path: ^/resetting, role: IS_AUTHENTICATED_ANONYMOUSLY }
## Instruction:
Update Firewall for reservation coté client
## Code After:
security:
providers:
fos_userbundle:
id: fos_user.user_provider.username
firewalls:
main:
pattern: .*
form_login:
provider: fos_userbundle
csrf_provider: form.csrf_provider
always_use_default_target_path: true
#default_target_path: /admin/
default_target_path: /login-check
check_path: fos_user_security_check
#login-path: fos_user_security_login
login-path: loginredirect
logout:
path: fos_user_security_logout
anonymous: true
access_control:
- { path: /admin/login, role: IS_AUTHENTICATED_ANONYMOUSLY }
- { path: /admin/login_check, role: IS_AUTHENTICATED_ANONYMOUSLY }
- { path: /admin/*, role: ROLE_ADMIN }
- { path: /reserver/*, role: ROLE_PASSAGER }
- { path: /admin/profile/change-password, role: ROLE_USER}
- { path: /mesreservations, role: ROLE_PASSAGER}
- { path: /profile, role: ROLE_PASSAGER}
- { path: /logout, role: ROLE_PASSAGER}
- { path: /reserver, role: ROLE_PASSAGER}
# - { path: admin/register, role: IS_AUTHENTICATED_ANONYMOUSLY }
# - { path: ^/resetting, role: IS_AUTHENTICATED_ANONYMOUSLY }
| security:
providers:
fos_userbundle:
id: fos_user.user_provider.username
firewalls:
main:
pattern: .*
form_login:
provider: fos_userbundle
csrf_provider: form.csrf_provider
always_use_default_target_path: true
#default_target_path: /admin/
default_target_path: /login-check
check_path: fos_user_security_check
#login-path: fos_user_security_login
login-path: loginredirect
logout:
path: fos_user_security_logout
anonymous: true
access_control:
- { path: /admin/login, role: IS_AUTHENTICATED_ANONYMOUSLY }
- { path: /admin/login_check, role: IS_AUTHENTICATED_ANONYMOUSLY }
- { path: /admin/*, role: ROLE_ADMIN }
- { path: /reserver/*, role: ROLE_PASSAGER }
- { path: /admin/profile/change-password, role: ROLE_USER}
+ - { path: /mesreservations, role: ROLE_PASSAGER}
+ - { path: /profile, role: ROLE_PASSAGER}
+ - { path: /logout, role: ROLE_PASSAGER}
+ - { path: /reserver, role: ROLE_PASSAGER}
+
# - { path: admin/register, role: IS_AUTHENTICATED_ANONYMOUSLY }
# - { path: ^/resetting, role: IS_AUTHENTICATED_ANONYMOUSLY } | 5 | 0.185185 | 5 | 0 |
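One detail worth remembering about the `access_control` list above: Symfony evaluates it top to bottom and applies the first rule whose path matches, so ordering matters (e.g. `/admin/login` must appear before the broader `/admin/*` rule). Here is a rough first-match sketch in plain JavaScript — the prefix/`/*` matching is a simplification, since Symfony actually treats `path` as a regular expression:

```javascript
// First-match-wins lookup over an ordered rule list, mimicking how the
// firewall above lets /admin/login through anonymously while the rest
// of /admin/* requires ROLE_ADMIN.
const rules = [
  { path: '/admin/login', role: 'IS_AUTHENTICATED_ANONYMOUSLY' },
  { path: '/admin/*', role: 'ROLE_ADMIN' },
  { path: '/reserver/*', role: 'ROLE_PASSAGER' },
];

function requiredRole(requestPath, accessRules) {
  for (const rule of accessRules) {
    // Treat a trailing "/*" as "this prefix plus anything below it".
    const prefix = rule.path.endsWith('/*') ? rule.path.slice(0, -2) : rule.path;
    if (requestPath === rule.path || requestPath.startsWith(prefix + '/')) {
      return rule.role; // first matching rule decides
    }
  }
  return null; // no rule matched this path
}
```

So `requiredRole('/admin/login', rules)` yields the anonymous rule, while any other `/admin/...` path falls through to `ROLE_ADMIN` — swap the first two rules and the login page itself would demand `ROLE_ADMIN`, locking everyone out.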
7cdd6ffc4bfd9f86d28923b534e34cd942e0d93a | html-template-element/README.md | html-template-element/README.md |
The HTML `<template>` element is a mechanism for holding client-side content that is not to be rendered when a page is loaded but may subsequently be instantiated during runtime using JavaScript.
Think of a template as a content fragment that is being stored for subsequent use in the document. While the parser does process the contents of the <template> element while loading the page, it does so only to ensure that those contents are valid; the element's contents are not rendered, however.
[Read More on MDN](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/template)
|
The HTML `<template>` element is a mechanism for holding client-side content that is not to be rendered when a page is loaded but may subsequently be instantiated during runtime using JavaScript.
Think of a template as a content fragment that is being stored for subsequent use in the document. While the parser does process the contents of the <template> element while loading the page, it does so only to ensure that those contents are valid; the element's contents are not rendered, however.
[Read More on MDN](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/template)
## Apple Proposal
- https://github.com/w3c/webcomponents/blob/gh-pages/proposals/Template-Instantiation.md
| Add apple proposal to template | Add apple proposal to template
| Markdown | mit | devpunks/snuggsi,devpunks/snuggsi,snuggs/snuggsi,snuggs/snuggsi,snuggs/snuggsi,devpunks/snuggsi,snuggs/snuggsi,devpunks/snuggsi,devpunks/snuggsi | markdown | ## Code Before:
The HTML `<template>` element is a mechanism for holding client-side content that is not to be rendered when a page is loaded but may subsequently be instantiated during runtime using JavaScript.
Think of a template as a content fragment that is being stored for subsequent use in the document. While the parser does process the contents of the <template> element while loading the page, it does so only to ensure that those contents are valid; the element's contents are not rendered, however.
[Read More on MDN](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/template)
## Instruction:
Add apple proposal to template
## Code After:
The HTML `<template>` element is a mechanism for holding client-side content that is not to be rendered when a page is loaded but may subsequently be instantiated during runtime using JavaScript.
Think of a template as a content fragment that is being stored for subsequent use in the document. While the parser does process the contents of the <template> element while loading the page, it does so only to ensure that those contents are valid; the element's contents are not rendered, however.
[Read More on MDN](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/template)
## Apple Proposal
- https://github.com/w3c/webcomponents/blob/gh-pages/proposals/Template-Instantiation.md
|
The HTML `<template>` element is a mechanism for holding client-side content that is not to be rendered when a page is loaded but may subsequently be instantiated during runtime using JavaScript.
Think of a template as a content fragment that is being stored for subsequent use in the document. While the parser does process the contents of the <template> element while loading the page, it does so only to ensure that those contents are valid; the element's contents are not rendered, however.
[Read More on MDN](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/template)
+ ## Apple Proposal
+
+ - https://github.com/w3c/webcomponents/blob/gh-pages/proposals/Template-Instantiation.md
+
+ | 5 | 0.714286 | 5 | 0 |
897e10a81e3495c7749977a51cc4b82602f9ba03 | tests/cases/fourslash/jsxExpressionFollowedByIdentifier.ts | tests/cases/fourslash/jsxExpressionFollowedByIdentifier.ts | /// <reference path="fourslash.ts" />
//@Filename: jsxExpressionFollowedByIdentifier.tsx
////declare var React: any;
////declare var x: string;
////const a = <div>{<div />/*1*/x/*2*/}</div>
goTo.marker('1');
verify.getSyntacticDiagnostics([{
code: 1005,
message: "'}' expected.",
range: {
fileName: test.marker('1').fileName,
pos: test.marker('1').position,
end: test.marker('2').position,
}
}]);
verify.quickInfoIs('var x: string'); | /// <reference path="fourslash.ts" />
//@Filename: jsxExpressionFollowedByIdentifier.tsx
////declare var React: any;
////declare var x: string;
////const a = <div>{<div />[|x|]}</div>
const range = test.ranges()[0];
verify.getSyntacticDiagnostics([{
code: 1005,
message: "'}' expected.",
range,
}]);
verify.quickInfoAt(range, 'var x: string'); | Use range instead of two markers | Use range instead of two markers
| TypeScript | apache-2.0 | alexeagle/TypeScript,kpreisser/TypeScript,RyanCavanaugh/TypeScript,SaschaNaz/TypeScript,Microsoft/TypeScript,Microsoft/TypeScript,nojvek/TypeScript,weswigham/TypeScript,kpreisser/TypeScript,kpreisser/TypeScript,nojvek/TypeScript,alexeagle/TypeScript,weswigham/TypeScript,kitsonk/TypeScript,alexeagle/TypeScript,RyanCavanaugh/TypeScript,kitsonk/TypeScript,nojvek/TypeScript,SaschaNaz/TypeScript,minestarks/TypeScript,Microsoft/TypeScript,microsoft/TypeScript,minestarks/TypeScript,microsoft/TypeScript,RyanCavanaugh/TypeScript,nojvek/TypeScript,microsoft/TypeScript,minestarks/TypeScript,SaschaNaz/TypeScript,weswigham/TypeScript,SaschaNaz/TypeScript,kitsonk/TypeScript | typescript | ## Code Before:
/// <reference path="fourslash.ts" />
//@Filename: jsxExpressionFollowedByIdentifier.tsx
////declare var React: any;
////declare var x: string;
////const a = <div>{<div />/*1*/x/*2*/}</div>
goTo.marker('1');
verify.getSyntacticDiagnostics([{
code: 1005,
message: "'}' expected.",
range: {
fileName: test.marker('1').fileName,
pos: test.marker('1').position,
end: test.marker('2').position,
}
}]);
verify.quickInfoIs('var x: string');
## Instruction:
Use range instead of two markers
## Code After:
/// <reference path="fourslash.ts" />
//@Filename: jsxExpressionFollowedByIdentifier.tsx
////declare var React: any;
////declare var x: string;
////const a = <div>{<div />[|x|]}</div>
const range = test.ranges()[0];
verify.getSyntacticDiagnostics([{
code: 1005,
message: "'}' expected.",
range,
}]);
verify.quickInfoAt(range, 'var x: string'); | /// <reference path="fourslash.ts" />
//@Filename: jsxExpressionFollowedByIdentifier.tsx
////declare var React: any;
////declare var x: string;
- ////const a = <div>{<div />/*1*/x/*2*/}</div>
? ^^^^^ ^^^^^
+ ////const a = <div>{<div />[|x|]}</div>
? ^^ ^^
- goTo.marker('1');
+ const range = test.ranges()[0];
verify.getSyntacticDiagnostics([{
code: 1005,
message: "'}' expected.",
- range: {
? ^^^
+ range,
? ^
- fileName: test.marker('1').fileName,
- pos: test.marker('1').position,
- end: test.marker('2').position,
- }
}]);
- verify.quickInfoIs('var x: string');
? ^^
+ verify.quickInfoAt(range, 'var x: string');
? ^^ +++++++
| 12 | 0.666667 | 4 | 8 |
a0874d7b1cebd4f0e42de5658ddd599e4b50467d | beam/verbs.go | beam/verbs.go | package beam
type Verb uint32
const (
Ack Verb = iota
Attach
Connect
Error
File
Get
Log
Ls
Set
Spawn
Start
Stop
Watch
)
| package beam
type Verb uint32
const (
Ack Verb = iota
Attach
Connect
Error
File
Get
Log
Ls
Set
Spawn
Start
Stop
Watch
)
func (v Verb) String() string {
switch v {
case Ack:
return "Ack"
case Attach:
return "Attach"
case Connect:
return "Connect"
case Error:
return "Error"
case File:
return "File"
case Get:
return "Get"
case Log:
return "Log"
case Ls:
return "Ls"
case Set:
return "Set"
case Spawn:
return "Spawn"
case Start:
return "Start"
case Stop:
return "Stop"
case Watch:
return "Watch"
}
return ""
}
| Add string representation of verb enum | Add string representation of verb enum
Docker-DCO-1.1-Signed-off-by: Ben Firshman <ben@firshman.co.uk> (github: bfirsh)
| Go | apache-2.0 | BSWANG/denverdino.github.io,jimmyxian/swarm,kyhavlov/swarm,rancher/swarm,bdwill/docker.github.io,mrjana/swarm,gdevillele/docker.github.io,johnstep/docker.github.io,marsmensch/swarm,EderRoger/swarm,NunoEdgarGub1/swarm,docker/docker.github.io,denverdino/docker.github.io,aluzzardi/swarm,rancher/swarm,vieux/swarm,pdevine/swarm,johnstep/docker.github.io,iaintshine/swarm,denverdino/denverdino.github.io,bfirsh/swarm,beni55/swarm-1,gdevillele/docker.github.io,phiroict/docker,joeuo/docker.github.io,atlassian/swarm,kunalkushwaha/swarm,echupriyanov/swarm,chanwit/swarm,shubheksha/docker.github.io,thaJeztah/docker.github.io,beni55/swarm-1,docker-zh/docker.github.io,anweiss/docker.github.io,schmunk42/swarm,jzwlqx/denverdino.github.io,ejholmes/swarm,danix800/docker.github.io,jay-lau/swarm,pssahi/swarm,u2mejc/swarm,therealbill/swarm,JimGalasyn/docker.github.io,aduermael/docker.github.io,phiroict/docker,jimmycmh/swarm,CiaranCostello/swarm,londoncalling/docker.github.io,vlajos/swarm,qianzhangxa/swarm,krystism/swarm,shot/swarm,allencloud/swarm,thaJeztah/docker.github.io,JimGalasyn/docker.github.io,shin-/docker.github.io,Spritekin/swarm,rutulpatel/swarm,everett-toews/swarm,menglingwei/denverdino.github.io,SivagnanamCiena/swarm,errordeveloper/swarm,menglingwei/denverdino.github.io,denverdino/denverdino.github.io,bpradipt/swarm,danix800/docker.github.io,troy0820/docker.github.io,BSWANG/denverdino.github.io,atlassian/swarm,denverdino/denverdino.github.io,sanscontext/docker.github.io,barais/swarm,bdwill/docker.github.io,joeuo/docker.github.io,gdevillele/docker.github.io,abronan/swarm,schmunk42/swarm,therealbill/swarm,marsmensch/swarm,shakamunyi/swarm,CodeNow/swarm,cpuguy83/swarm,docker-zh/docker.github.io,rav121/swarm,FreedomWangLi/swarm,denverdino/swarm-1,auzias/swarm,shot/swarm,sanscontext/docker.github.io,shin-/docker.github.io,rillig/docker.github.io,pdevine/swarm,zkcrescent/swarm,parapsyche/swarm,JimGalasyn/docker.github.io,CodeNow/swarm,LuisBosquez/docker.github.io,tnachen/swarm,rchicoli/swarm,menglingwei/denverdino.github.io,LuisBosquez/docker.github.io,ezrasilvera/swarm,unodba/swarm,tnachen/swarm,qianzhangxa/swarm,docker/docker.github.io,londoncalling/docker.github.io,johnstep/docker.github.io,raoofm/swarm,EderRoger/swarm,christophelec/swarm,bfirsh/swarm,danix800/docker.github.io,joeuo/docker.github.io,unodba/swarm,shubheksha/docker.github.io,sanscontext/docker.github.io,gdevillele/docker.github.io,denverdino/docker.github.io,BSWANG/denverdino.github.io,joaofnfernandes/docker.github.io,gdevillele/docker.github.io,docker/docker.github.io,johnstep/docker.github.io,aduermael/docker.github.io,shubheksha/docker.github.io,sanscontext/docker.github.io,shin-/docker.github.io,rillig/docker.github.io,pdevine/swarm,zkcrescent/swarm,parapsyche/swarm,JimGalasyn/docker.github.io,CodeNow/swarm,ezrasilvera/swarm,unodba/swarm,tnachen/swarm,qianzhangxa/swarm,docker/docker.github.io,londoncalling/docker.github.io,johnstep/docker.github.io,raoofm/swarm,EderRoger/swarm,christophelec/swarm,bfirsh/swarm,danix800/docker.github.io,joeuo/docker.github.io,unodba/swarm,shubheksha/docker.github.io,sanscontext/docker.github.io,gdevillele/docker.github.io,denverdino/docker.github.io,BSWANG/denverdino.github.io,denverdino/docker.github.io,phiroict/docker,dnephin/swarm,chanwit/swarm,FreedomWangLi/swarm,moxiegirl/swarm,cgvarela/swarm,tangkun75/swarm,shubheksha/docker.github.io,francisbouvier/swarm,jimmycmh/swarm,swarm-hooks/swarm,dnephin/swarm,chanwit/swarm,FreedomWangLi/swarm,moxiegirl/swarm,cgvarela/swarm,tangkun75/swarm,shakamunyi/swarm,parapsyche/swarm,polyverse-security/swarm,swarm-hooks/swarm,bsmr-docker/swarm,rav121/swarm,aidora/pisces,cgvarela/swarm,bdwill/docker.github.io,srikalyan/swarm,rillig/docker.github.io,auzias/swarm,thaJeztah/docker.github.io,ricca742/swarm,NunoEdgarGub1/swarm,deis/swarm,ch3lo/swarm,allencloud/swarm,londoncalling/docker.github.io,aluzzardi/swarm,LuisBosquez/docker.github.io,tnachen/swarm,tranceitionalMynd/swarm,akerbis/armhf-swarm,jimenez/swarm,LukasBacigal/swarm,ricca742/swarm,aidora/pisces,anweiss/docker.github.io,bpradipt/swarm,alexisbellido/docker.github.io,jzwlqx/denverdino.github.io,dims/swarm,morpheyesh/swarm,jzwlqx/denverdino.github.io,riuvshin/swarm,affo/swarm,joeuo/docker.github.io,gonkulator/swarm,mountkin/swarm,megamsys/swarm,denverdino/docker.github.io,moxiegirl/swarm,jdamick/swarm,FollowMyDev/swarm,snrism/swarm,rillig/docker.github.io,JimGalasyn/docker.github.io,liubin/swarm,denverdino/swarm-1,BSWANG/swarm,Spritekin/swarm,LuisBosquez/docker.github.io,thaJeztah/docker.github.io,BSWANG/denverdino.github.io,resouer/swarm,CiaranCostello/swarm,danix800/docker.github.io,docker/docker.github.io,riuvshin/swarm,doronp/swarm,dims/swarm,joeuo/docker.github.io,mountkin/swarm,deis/swarm,PeoplePerHour/libswarm,morpheyesh/swarm,kunalkushwaha/swarm,linecheng/swarm,BSWANG/swarm,sanscontext/docker.github.io,xFragger/swarm,adouang/swarm,shin-/docker.github.io,anweiss/docker.github.io,tranceitionalMynd/swarm,adouang/swarm,polyverse-security/swarm | go | ## Code Before:
package beam
type Verb uint32
const (
Ack Verb = iota
Attach
Connect
Error
File
Get
Log
Ls
Set
Spawn
Start
Stop
Watch
)
## Instruction:
Add string representation of verb enum
Docker-DCO-1.1-Signed-off-by: Ben Firshman <ben@firshman.co.uk> (github: bfirsh)
## Code After:
package beam
type Verb uint32
const (
Ack Verb = iota
Attach
Connect
Error
File
Get
Log
Ls
Set
Spawn
Start
Stop
Watch
)
func (v Verb) String() string {
switch v {
case Ack:
return "Ack"
case Attach:
return "Attach"
case Connect:
return "Connect"
case Error:
return "Error"
case File:
return "File"
case Get:
return "Get"
case Log:
return "Log"
case Ls:
return "Ls"
case Set:
return "Set"
case Spawn:
return "Spawn"
case Start:
return "Start"
case Stop:
return "Stop"
case Watch:
return "Watch"
}
return ""
}
| package beam
type Verb uint32
const (
Ack Verb = iota
Attach
Connect
Error
File
Get
Log
Ls
Set
Spawn
Start
Stop
Watch
)
+
+ func (v Verb) String() string {
+ switch v {
+ case Ack:
+ return "Ack"
+ case Attach:
+ return "Attach"
+ case Connect:
+ return "Connect"
+ case Error:
+ return "Error"
+ case File:
+ return "File"
+ case Get:
+ return "Get"
+ case Log:
+ return "Log"
+ case Ls:
+ return "Ls"
+ case Set:
+ return "Set"
+ case Spawn:
+ return "Spawn"
+ case Start:
+ return "Start"
+ case Stop:
+ return "Stop"
+ case Watch:
+ return "Watch"
+ }
+ return ""
+ } | 32 | 1.684211 | 32 | 0 |
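The diff above hand-writes Go's "stringer" pattern: an exhaustive `switch` that maps each `Verb` constant to its name, with `""` as the fallback for values outside the enum. The same lookup-with-fallback can be sketched in Python (illustrative only, not part of the dataset row; only the first few constants are mirrored):

```python
from enum import IntEnum


class Verb(IntEnum):
    """A few of the beam verbs, numbered from 0 like the Go iota block."""
    ACK = 0
    ATTACH = 1
    CONNECT = 2
    ERROR = 3


def verb_name(value: int) -> str:
    """Mirror the Go String() method: known values map to their name,
    anything else falls through to the empty string."""
    try:
        return Verb(value).name.capitalize()
    except ValueError:
        return ""
```

In Go the `stringer` tool (via `go generate`) can produce this mapping automatically; the hand-written switch trades that tooling for zero extra build steps.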
6a0b3721254477ab987c6f80e684af1c21e7162a | raven-log4j/src/test/resources/log4j-test.properties | raven-log4j/src/test/resources/log4j-test.properties | log4j.rootLogger=ALL, ConsoleAppender, SentryAppender
log4j.appender.ConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.ConsoleAppender.layout=org.apache.log4j.SimpleLayout
| log4j.rootLogger=ALL, ConsoleAppender
log4j.appender.ConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.ConsoleAppender.layout=org.apache.log4j.SimpleLayout
| Disable the SentryAppender during log4j tests | Disable the SentryAppender during log4j tests
| INI | bsd-3-clause | reki2000/raven-java6,buckett/raven-java,littleyang/raven-java,reki2000/raven-java6,galmeida/raven-java,buckett/raven-java,galmeida/raven-java,littleyang/raven-java | ini | ## Code Before:
log4j.rootLogger=ALL, ConsoleAppender, SentryAppender
log4j.appender.ConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.ConsoleAppender.layout=org.apache.log4j.SimpleLayout
## Instruction:
Disable the SentryAppender during log4j tests
## Code After:
log4j.rootLogger=ALL, ConsoleAppender
log4j.appender.ConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.ConsoleAppender.layout=org.apache.log4j.SimpleLayout
| - log4j.rootLogger=ALL, ConsoleAppender, SentryAppender
? ----------------
+ log4j.rootLogger=ALL, ConsoleAppender
log4j.appender.ConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.ConsoleAppender.layout=org.apache.log4j.SimpleLayout | 2 | 0.4 | 1 | 1 |
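The fix removes `SentryAppender` from the comma-separated appender list wired up by `log4j.rootLogger`, so the test run logs to the console only. That kind of properties-line edit reduces to list surgery on the value (a sketch using the exact line from the row above):

```python
def remove_appender(line: str, appender: str) -> str:
    """Drop one appender from a 'log4j.rootLogger=LEVEL, A, B' line."""
    key, _, value = line.partition("=")
    parts = [part.strip() for part in value.split(",")]
    kept = [part for part in parts if part != appender]
    return key + "=" + ", ".join(kept)
```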
42a7731927e6bdf94b44bb8e65dc364e623712e2 | README.md | README.md | respoke-android
===============
| respoke-android
===============
Building the WebRTC libaries
============================
Unlike iOS, the WebRTC libraries for Android can ONLY be built on Linux. It WILL NOT work on Mac OS X. If, like me, you are only working with a Mac then one easy-ish approach is to create an Ubuntu virtual machine and perform the build there. Once the library has been built, it may be used inside of the Android Studio project on Mac again without issues.
The following steps are what was required for me to build the libraries using a trial version of VMWare and Ubuntu 14.04 LTS.
1) Download and install VMWare Fusion (I used v7.0)
https://www.vmware.com/products/fusion/features.html
2) Download the Ubuntu 14.04 LTS install image. Get the "64-bit Mac (AMD64)" image
http://www.ubuntu.com/download/desktop
3) Create a new virtual machine using the Ubuntu image. Give it at least 20GB of space to work with. I choose not to let it share home directories with my Mac user so it was easier to clean up afterwards.
4) Login to the new Ubuntu virtual machine desktop and open a terminal (Ctrl-Alt-T)
5) Install git
sudo apt-get install git
6) Configure Git for your Github credentials, and then clone the Respoke repository into your home directory
username@ubuntu:~$ git clone https://github.com/Ninjanetic/respoke-android.git
7) Install dependenices part 1
A series of scripts have been provided in the Respoke repository to make it easier to set up and run the build. It requires several steps and takes several hours to finish, so be prepared.
cd respoke-android
./build_webrtc_libs.sh install_dependencies
./build_webrtc_libs.sh install_jdk1_6
./build_webrtc_libs.sh pull_depot_tools
8) Close your terminal and reopen it so that you have the new JDK environment variables, or run:
source ~/.bashrc
9) Pull WebRTC source part 1
Due to a chicken-and-egg problem, we will attempt to pull the massive WebRTC source code and at some point it will fail. By that time, it will have grabbed another script with the commands necessary to install the dependencies that are missing.
./pull_webrtc_source.sh
This took me ~4 hours to finish. Eventually it will fail, complaining about missing Ubuntu packages.
10) Install dependencies part 2
A script now exists that will fortunately install the remaining dependencies for us, so run it:
trunk/build/install-build-deps-android.sh
11) Pull WebRTC source part 2. It should actually finish successfully this time!
./pull_webrtc_source.sh
12) Install last dependencies & build
The following command will install some last dependency packages and start the actual build in release mode.
./build_webrtc_libs.sh build_apprtc
| Add WebRTC library build instructions part 1. | Add WebRTC library build instructions part 1.
| Markdown | mit | respoke/sample-respoke-android,respoke/sample-respoke-android | markdown | ## Code Before:
respoke-android
===============
## Instruction:
Add WebRTC library build instructions part 1.
## Code After:
respoke-android
===============
Building the WebRTC libaries
============================
Unlike iOS, the WebRTC libraries for Android can ONLY be built on Linux. It WILL NOT work on Mac OS X. If, like me, you are only working with a Mac then one easy-ish approach is to create an Ubuntu virtual machine and perform the build there. Once the library has been built, it may be used inside of the Android Studio project on Mac again without issues.
The following steps are what was required for me to build the libraries using a trial version of VMWare and Ubuntu 14.04 LTS.
1) Download and install VMWare Fusion (I used v7.0)
https://www.vmware.com/products/fusion/features.html
2) Download the Ubuntu 14.04 LTS install image. Get the "64-bit Mac (AMD64)" image
http://www.ubuntu.com/download/desktop
3) Create a new virtual machine using the Ubuntu image. Give it at least 20GB of space to work with. I choose not to let it share home directories with my Mac user so it was easier to clean up afterwards.
4) Login to the new Ubuntu virtual machine desktop and open a terminal (Ctrl-Alt-T)
5) Install git
sudo apt-get install git
6) Configure Git for your Github credentials, and then clone the Respoke repository into your home directory
username@ubuntu:~$ git clone https://github.com/Ninjanetic/respoke-android.git
7) Install dependenices part 1
A series of scripts have been provided in the Respoke repository to make it easier to set up and run the build. It requires several steps and takes several hours to finish, so be prepared.
cd respoke-android
./build_webrtc_libs.sh install_dependencies
./build_webrtc_libs.sh install_jdk1_6
./build_webrtc_libs.sh pull_depot_tools
8) Close your terminal and reopen it so that you have the new JDK environment variables, or run:
source ~/.bashrc
9) Pull WebRTC source part 1
Due to a chicken-and-egg problem, we will attempt to pull the massive WebRTC source code and at some point it will fail. By that time, it will have grabbed another script with the commands necessary to install the dependencies that are missing.
./pull_webrtc_source.sh
This took me ~4 hours to finish. Eventually it will fail, complaining about missing Ubuntu packages.
10) Install dependencies part 2
A script now exists that will fortunately install the remaining dependencies for us, so run it:
trunk/build/install-build-deps-android.sh
11) Pull WebRTC source part 2. It should actually finish successfully this time!
./pull_webrtc_source.sh
12) Install last dependencies & build
The following command will install some last dependency packages and start the actual build in release mode.
./build_webrtc_libs.sh build_apprtc
| respoke-android
===============
+
+
+
+ Building the WebRTC libaries
+ ============================
+
+ Unlike iOS, the WebRTC libraries for Android can ONLY be built on Linux. It WILL NOT work on Mac OS X. If, like me, you are only working with a Mac then one easy-ish approach is to create an Ubuntu virtual machine and perform the build there. Once the library has been built, it may be used inside of the Android Studio project on Mac again without issues.
+
+ The following steps are what was required for me to build the libraries using a trial version of VMWare and Ubuntu 14.04 LTS.
+
+ 1) Download and install VMWare Fusion (I used v7.0)
+
+ https://www.vmware.com/products/fusion/features.html
+
+
+ 2) Download the Ubuntu 14.04 LTS install image. Get the "64-bit Mac (AMD64)" image
+
+ http://www.ubuntu.com/download/desktop
+
+
+ 3) Create a new virtual machine using the Ubuntu image. Give it at least 20GB of space to work with. I choose not to let it share home directories with my Mac user so it was easier to clean up afterwards.
+
+
+ 4) Login to the new Ubuntu virtual machine desktop and open a terminal (Ctrl-Alt-T)
+
+
+ 5) Install git
+
+ sudo apt-get install git
+
+
+ 6) Configure Git for your Github credentials, and then clone the Respoke repository into your home directory
+
+ username@ubuntu:~$ git clone https://github.com/Ninjanetic/respoke-android.git
+
+
+ 7) Install dependenices part 1
+
+ A series of scripts have been provided in the Respoke repository to make it easier to set up and run the build. It requires several steps and takes several hours to finish, so be prepared.
+
+ cd respoke-android
+ ./build_webrtc_libs.sh install_dependencies
+ ./build_webrtc_libs.sh install_jdk1_6
+ ./build_webrtc_libs.sh pull_depot_tools
+
+
+ 8) Close your terminal and reopen it so that you have the new JDK environment variables, or run:
+
+ source ~/.bashrc
+
+
+ 9) Pull WebRTC source part 1
+
+ Due to a chicken-and-egg problem, we will attempt to pull the massive WebRTC source code and at some point it will fail. By that time, it will have grabbed another script with the commands necessary to install the dependencies that are missing.
+
+ ./pull_webrtc_source.sh
+
+ This took me ~4 hours to finish. Eventually it will fail, complaining about missing Ubuntu packages.
+
+
+ 10) Install dependencies part 2
+
+ A script now exists that will fortunately install the remaining dependencies for us, so run it:
+
+ trunk/build/install-build-deps-android.sh
+
+
+ 11) Pull WebRTC source part 2. It should actually finish successfully this time!
+
+ ./pull_webrtc_source.sh
+
+
+ 12) Install last dependencies & build
+
+ The following command will install some last dependency packages and start the actual build in release mode.
+
+ ./build_webrtc_libs.sh build_apprtc
+ | 78 | 39 | 78 | 0 |
5ee63adc067196a31d510eea7efd6eb2a15bcc08 | src/Control/Monad/State/Trans.purs | src/Control/Monad/State/Trans.purs | module Control.Monad.State.Trans where
import Prelude
import Control.Monad.Trans
type StateData s a = { state :: s, value :: a }
data StateT s m a = StateT (s -> m (StateData s a))
instance monadStateT :: (Monad m) => Monad (StateT s m) where
return a = StateT \s -> return { state: s, value: a }
(>>=) (StateT x) f = StateT \s -> do
{ state = s', value = v } <- x s
runStateT (f v) s'
instance monadTransStateT :: MonadTrans (StateT s) where
lift m = StateT \s -> do
x <- m
return { state: s, value: x }
runStateT :: forall s m a. StateT s m a -> s -> m (StateData s a)
runStateT (StateT s) = s
withStateT :: forall s m a. (s -> s) -> StateT s m a -> StateT s m a
withStateT f s = StateT $ runStateT s <<< f
| module Control.Monad.State.Trans where
import Prelude
import Control.Monad.Trans
type StateData s a = { state :: s, value :: a }
data StateT s m a = StateT (s -> m (StateData s a))
instance monadStateT :: (Monad m) => Monad (StateT s m) where
return a = StateT \s -> return { state: s, value: a }
(>>=) (StateT x) f = StateT \s -> do
{ state = s', value = v } <- x s
runStateT (f v) s'
instance monadTransStateT :: MonadTrans (StateT s) where
lift m = StateT \s -> do
x <- m
return { state: s, value: x }
runStateT :: forall s m a. StateT s m a -> s -> m (StateData s a)
runStateT (StateT s) = s
evalStateT :: forall s m a. (Monad m) => StateT s m a -> s -> m a
evalStateT m s = runStateT m s >>= \x -> return x.value
execStateT :: forall s m a. (Monad m) => StateT s m a -> s -> m s
execStateT m s = runStateT m s >>= \x -> return x.state
mapStateT :: forall s m1 m2 a b. (m1 (StateData s a) -> m2 (StateData s b)) -> StateT s m1 a -> StateT s m2 b
mapStateT f m = StateT $ f <<< runStateT m
withStateT :: forall s m a. (s -> s) -> StateT s m a -> StateT s m a
withStateT f s = StateT $ runStateT s <<< f
| Add additional state running functions | Add additional state running functions
| PureScript | bsd-3-clause | purescript/purescript-transformers,lvicentesanchez/purescript-transformers | purescript | ## Code Before:
module Control.Monad.State.Trans where
import Prelude
import Control.Monad.Trans
type StateData s a = { state :: s, value :: a }
data StateT s m a = StateT (s -> m (StateData s a))
instance monadStateT :: (Monad m) => Monad (StateT s m) where
return a = StateT \s -> return { state: s, value: a }
(>>=) (StateT x) f = StateT \s -> do
{ state = s', value = v } <- x s
runStateT (f v) s'
instance monadTransStateT :: MonadTrans (StateT s) where
lift m = StateT \s -> do
x <- m
return { state: s, value: x }
runStateT :: forall s m a. StateT s m a -> s -> m (StateData s a)
runStateT (StateT s) = s
withStateT :: forall s m a. (s -> s) -> StateT s m a -> StateT s m a
withStateT f s = StateT $ runStateT s <<< f
## Instruction:
Add additional state running functions
## Code After:
module Control.Monad.State.Trans where
import Prelude
import Control.Monad.Trans
type StateData s a = { state :: s, value :: a }
data StateT s m a = StateT (s -> m (StateData s a))
instance monadStateT :: (Monad m) => Monad (StateT s m) where
return a = StateT \s -> return { state: s, value: a }
(>>=) (StateT x) f = StateT \s -> do
{ state = s', value = v } <- x s
runStateT (f v) s'
instance monadTransStateT :: MonadTrans (StateT s) where
lift m = StateT \s -> do
x <- m
return { state: s, value: x }
runStateT :: forall s m a. StateT s m a -> s -> m (StateData s a)
runStateT (StateT s) = s
evalStateT :: forall s m a. (Monad m) => StateT s m a -> s -> m a
evalStateT m s = runStateT m s >>= \x -> return x.value
execStateT :: forall s m a. (Monad m) => StateT s m a -> s -> m s
execStateT m s = runStateT m s >>= \x -> return x.state
mapStateT :: forall s m1 m2 a b. (m1 (StateData s a) -> m2 (StateData s b)) -> StateT s m1 a -> StateT s m2 b
mapStateT f m = StateT $ f <<< runStateT m
withStateT :: forall s m a. (s -> s) -> StateT s m a -> StateT s m a
withStateT f s = StateT $ runStateT s <<< f
| module Control.Monad.State.Trans where
import Prelude
import Control.Monad.Trans
type StateData s a = { state :: s, value :: a }
data StateT s m a = StateT (s -> m (StateData s a))
instance monadStateT :: (Monad m) => Monad (StateT s m) where
return a = StateT \s -> return { state: s, value: a }
(>>=) (StateT x) f = StateT \s -> do
{ state = s', value = v } <- x s
runStateT (f v) s'
-
+
instance monadTransStateT :: MonadTrans (StateT s) where
lift m = StateT \s -> do
x <- m
return { state: s, value: x }
runStateT :: forall s m a. StateT s m a -> s -> m (StateData s a)
runStateT (StateT s) = s
+ evalStateT :: forall s m a. (Monad m) => StateT s m a -> s -> m a
+ evalStateT m s = runStateT m s >>= \x -> return x.value
+
+ execStateT :: forall s m a. (Monad m) => StateT s m a -> s -> m s
+ execStateT m s = runStateT m s >>= \x -> return x.state
+
+ mapStateT :: forall s m1 m2 a b. (m1 (StateData s a) -> m2 (StateData s b)) -> StateT s m1 a -> StateT s m2 b
+ mapStateT f m = StateT $ f <<< runStateT m
+
withStateT :: forall s m a. (s -> s) -> StateT s m a -> StateT s m a
withStateT f s = StateT $ runStateT s <<< f | 11 | 0.44 | 10 | 1 |
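The new `evalStateT`/`execStateT` helpers just project one field out of the `{ state, value }` record that `runStateT` returns. The relationship can be sketched in plain Python, modelling a stateful computation as a function from a state to a `(value, state)` tuple (a simplification of the monadic version above, roughly `StateT` over the Identity monad):

```python
def run_state(computation, state):
    """Apply a stateful computation (state -> (value, state)) to a state."""
    return computation(state)


def eval_state(computation, state):
    """Like evalStateT: run the computation and keep only the value."""
    return run_state(computation, state)[0]


def exec_state(computation, state):
    """Like execStateT: run the computation and keep only the final state."""
    return run_state(computation, state)[1]


def tick(state):
    # Example computation: return the old counter, advance the state.
    return (state, state + 1)
```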
ed83b7945fd2fd30d5f969e05381b60878e499a5 | .github/workflows/main.yml | .github/workflows/main.yml | name: CI
on: [push]
jobs:
linux:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- name: Run a one-line script
run: echo Hello, world!
- name: Run a multi-line script
run: |
echo Add other actions to build,
echo test, and deploy your project.
shell: pwsh
| name: CI
on: [push]
jobs:
linux:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- name: Install needed software
run: ./github/scripts/Install.ps1
shell: pwsh
- name: Build PSFzf-Binary
run: ./github/scripts/Build.ps1
shell: pwsh
- name: Run tests
run: ./github/scripts/Tests.ps1
shell: pwsh
env:
LAST_TAG: v1.1.15
FZF_VERSION: 0.17.3
| Add install, build, and tests steps | Add install, build, and tests steps | YAML | mit | kelleyma49/PSFzf,kelleyma49/PSFzf | yaml | ## Code Before:
name: CI
on: [push]
jobs:
linux:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- name: Run a one-line script
run: echo Hello, world!
- name: Run a multi-line script
run: |
echo Add other actions to build,
echo test, and deploy your project.
shell: pwsh
## Instruction:
Add install, build, and tests steps
## Code After:
name: CI
on: [push]
jobs:
linux:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- name: Install needed software
run: ./github/scripts/Install.ps1
shell: pwsh
- name: Build PSFzf-Binary
run: ./github/scripts/Build.ps1
shell: pwsh
- name: Run tests
run: ./github/scripts/Tests.ps1
shell: pwsh
env:
LAST_TAG: v1.1.15
FZF_VERSION: 0.17.3
| name: CI
on: [push]
jobs:
linux:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
+ - name: Install needed software
+ run: ./github/scripts/Install.ps1
- - name: Run a one-line script
- run: echo Hello, world!
- - name: Run a multi-line script
- run: |
- echo Add other actions to build,
- echo test, and deploy your project.
shell: pwsh
+ - name: Build PSFzf-Binary
+ run: ./github/scripts/Build.ps1
+ shell: pwsh
+ - name: Run tests
+ run: ./github/scripts/Tests.ps1
+ shell: pwsh
+ env:
+ LAST_TAG: v1.1.15
+ FZF_VERSION: 0.17.3 | 17 | 0.944444 | 11 | 6 |
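In the rewritten workflow every `run` step delegates to a checked-in PowerShell script under `pwsh`, and the version pins move into an `env` block on the test step. Modelling the job as data (a sketch, not anything GitHub Actions executes) makes that invariant easy to check:

```python
# The three run steps from the workflow above, as plain dictionaries.
steps = [
    {"name": "Install needed software", "run": "./github/scripts/Install.ps1", "shell": "pwsh"},
    {"name": "Build PSFzf-Binary", "run": "./github/scripts/Build.ps1", "shell": "pwsh"},
    {"name": "Run tests", "run": "./github/scripts/Tests.ps1", "shell": "pwsh",
     "env": {"LAST_TAG": "v1.1.15", "FZF_VERSION": "0.17.3"}},
]


def every_run_step_uses_pwsh(step_list):
    """Check the invariant the new workflow establishes."""
    return all(step.get("shell") == "pwsh" for step in step_list if "run" in step)
```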
708c42dc096eea4bf6c7b5928913fb02ba67582c | package.sh | package.sh |
cd ..
echo "Compressing file..."
tar Jcf hackclock_2.1.0.orig.tar.xz hack-clock/
cd hack-clock
dpkg-buildpackage -rfakeroot -uc -us
|
rm home/pi/hack-clock/run_clock.py
rm home/pi/hack-clock/blocks_clock.xml
rm home/pi/hack-clock/backups/*
cd ..
echo "Compressing file..."
tar Jcf hackclock_2.1.0.orig.tar.xz hack-clock/
cd hack-clock
dpkg-buildpackage -rfakeroot -uc -us
| Remove old files before packaging | Remove old files before packaging
| Shell | apache-2.0 | deckerego/hack-clock,deckerego/hack-clock,deckerego/hack-clock | shell | ## Code Before:
cd ..
echo "Compressing file..."
tar Jcf hackclock_2.1.0.orig.tar.xz hack-clock/
cd hack-clock
dpkg-buildpackage -rfakeroot -uc -us
## Instruction:
Remove old files before packaging
## Code After:
rm home/pi/hack-clock/run_clock.py
rm home/pi/hack-clock/blocks_clock.xml
rm home/pi/hack-clock/backups/*
cd ..
echo "Compressing file..."
tar Jcf hackclock_2.1.0.orig.tar.xz hack-clock/
cd hack-clock
dpkg-buildpackage -rfakeroot -uc -us
| +
+ rm home/pi/hack-clock/run_clock.py
+ rm home/pi/hack-clock/blocks_clock.xml
+ rm home/pi/hack-clock/backups/*
cd ..
echo "Compressing file..."
tar Jcf hackclock_2.1.0.orig.tar.xz hack-clock/
cd hack-clock
dpkg-buildpackage -rfakeroot -uc -us | 4 | 0.571429 | 4 | 0 |
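The added `rm` lines clear user-generated state (the saved clock program, its Blockly XML, and any backups) so the `tar` archive built a few lines later is reproducible. The same clean-before-archive pass can be sketched with `pathlib` globbing (file names echo the script above; run here against a temporary tree):

```python
import pathlib


def clean_tree(root: pathlib.Path, patterns) -> list:
    """Delete files matching the glob patterns under root; report their names."""
    removed = []
    for pattern in patterns:
        for path in root.glob(pattern):
            if path.is_file():
                path.unlink()
                removed.append(path.name)
    return sorted(removed)
```

Like `rm backups/*`, this deletes only the files inside `backups/`, leaving the directory itself in place for the next run.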
b592437817d9a0dbd9697a06b4497d49f3854c5d | .travis.yml | .travis.yml | ---
language: node_js
env:
- CXX=g++-4.8
addons:
apt:
sources:
- ubuntu-toolchain-r-test
packages:
- g++-4.8
sudo: false
node_js:
- "stable"
- "7.1.0"
- "7.0.0"
- "6.9.1"
- "6.5"
- "6.4"
- "6.3"
- "5.12"
- "4.4"
- "4.3.2"
cache:
bundler: true
directories:
- node_modules # NPM packages
install:
- npm install
- npm install jscoverage coveralls
after_script:
- npm run coveralls
| ---
language: node_js
env:
- CXX=g++-4.8
addons:
apt:
sources:
- ubuntu-toolchain-r-test
packages:
- g++-4.8
sudo: false
node_js:
- "stable"
- "7.1.0"
- "7.0.0"
- "6.9.1"
- "6.5"
- "6.4"
- "6.3"
- "5.12"
cache:
bundler: true
directories:
- node_modules # NPM packages
install:
- npm install
- npm install jscoverage coveralls
after_script:
- npm run coveralls
| Remove old node versions from Travis test | Remove old node versions from Travis test
| YAML | apache-2.0 | pboyd04/CSDLParser | yaml | ## Code Before:
---
language: node_js
env:
- CXX=g++-4.8
addons:
apt:
sources:
- ubuntu-toolchain-r-test
packages:
- g++-4.8
sudo: false
node_js:
- "stable"
- "7.1.0"
- "7.0.0"
- "6.9.1"
- "6.5"
- "6.4"
- "6.3"
- "5.12"
- "4.4"
- "4.3.2"
cache:
bundler: true
directories:
- node_modules # NPM packages
install:
- npm install
- npm install jscoverage coveralls
after_script:
- npm run coveralls
## Instruction:
Remove old node versions from Travis test
## Code After:
---
language: node_js
env:
- CXX=g++-4.8
addons:
apt:
sources:
- ubuntu-toolchain-r-test
packages:
- g++-4.8
sudo: false
node_js:
- "stable"
- "7.1.0"
- "7.0.0"
- "6.9.1"
- "6.5"
- "6.4"
- "6.3"
- "5.12"
cache:
bundler: true
directories:
- node_modules # NPM packages
install:
- npm install
- npm install jscoverage coveralls
after_script:
- npm run coveralls
| ---
language: node_js
env:
- CXX=g++-4.8
addons:
apt:
sources:
- ubuntu-toolchain-r-test
packages:
- g++-4.8
sudo: false
node_js:
- "stable"
- "7.1.0"
- "7.0.0"
- "6.9.1"
- "6.5"
- "6.4"
- "6.3"
- "5.12"
- - "4.4"
- - "4.3.2"
cache:
bundler: true
directories:
- node_modules # NPM packages
install:
- npm install
- npm install jscoverage coveralls
after_script:
- npm run coveralls | 2 | 0.064516 | 0 | 2 |
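Dropping the Node 4.x entries is a pure list edit: remove every version below a chosen floor. With versions as dotted strings, the comparison is easiest on integer tuples (a sketch mirroring the numeric entries of the matrix above; the `"stable"` alias would need separate handling):

```python
def at_least(version: str, floor: str) -> bool:
    """Compare dotted version strings by their integer components."""
    def parse(v):
        return tuple(int(part) for part in v.split("."))
    return parse(version) >= parse(floor)


matrix = ["7.1.0", "7.0.0", "6.9.1", "6.5", "6.4", "6.3", "5.12", "4.4", "4.3.2"]
kept = [v for v in matrix if at_least(v, "5.0")]
```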
bf00464461b2cc6b77ad4050a1df1de2bb629e31 | src/js/lib/SvgUtil.js | src/js/lib/SvgUtil.js | /**
* Copyright: (c) 2016-2017 Max Klein
* License: MIT
*/
define(function () {
var NS = 'http://www.w3.org/2000/svg';
function createSvgElement(name) {
return document.createElementNS(NS, name);
}
return {
createElement: createSvgElement
};
});
| /**
* Copyright: (c) 2016-2017 Max Klein
* License: MIT
*/
define(function () {
var NS = 'http://www.w3.org/2000/svg';
function createSvg() {
return document.createElementNS(NS, 'svg');
}
function createSvgElement(name) {
return document.createElementNS(NS, name);
}
return {
createSvg: createSvg,
createElement: createSvgElement
};
});
| Add method for creating SVGs | Add method for creating SVGs
| JavaScript | mpl-2.0 | maxkl/LogicSimulator,maxkl/LogicSimulator | javascript | ## Code Before:
/**
* Copyright: (c) 2016-2017 Max Klein
* License: MIT
*/
define(function () {
var NS = 'http://www.w3.org/2000/svg';
function createSvgElement(name) {
return document.createElementNS(NS, name);
}
return {
createElement: createSvgElement
};
});
## Instruction:
Add method for creating SVGs
## Code After:
/**
* Copyright: (c) 2016-2017 Max Klein
* License: MIT
*/
define(function () {
var NS = 'http://www.w3.org/2000/svg';
function createSvg() {
return document.createElementNS(NS, 'svg');
}
function createSvgElement(name) {
return document.createElementNS(NS, name);
}
return {
createSvg: createSvg,
createElement: createSvgElement
};
});
| /**
* Copyright: (c) 2016-2017 Max Klein
* License: MIT
*/
define(function () {
var NS = 'http://www.w3.org/2000/svg';
+ function createSvg() {
+ return document.createElementNS(NS, 'svg');
+ }
+
function createSvgElement(name) {
return document.createElementNS(NS, name);
}
return {
+ createSvg: createSvg,
createElement: createSvgElement
};
}); | 5 | 0.3125 | 5 | 0 |
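`document.createElementNS(NS, name)` matters here because SVG elements live in the `http://www.w3.org/2000/svg` namespace; a namespace-less `createElement` would produce an inert HTML element. The same namespacing shows up in any XML tooling, e.g. Python's `xml.etree.ElementTree`, which spells namespaces in Clark notation (an analogy, not the browser DOM API):

```python
import xml.etree.ElementTree as ET

SVG_NS = "http://www.w3.org/2000/svg"


def create_svg():
    # Clark notation {namespace}localname is ElementTree's namespace syntax.
    return ET.Element("{%s}svg" % SVG_NS)


def create_svg_element(parent, name):
    """Create a namespaced child element, like createElementNS + appendChild."""
    return ET.SubElement(parent, "{%s}%s" % (SVG_NS, name))
```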
7ae29a6e4755769639a08057d166cabdd48c9426 | lib/generators/redactor/templates/base/carrierwave/uploaders/redactor_rails_document_uploader.rb | lib/generators/redactor/templates/base/carrierwave/uploaders/redactor_rails_document_uploader.rb | class RedactorRailsDocumentUploader < CarrierWave::Uploader::Base
include RedactorRails::Backend::CarrierWave
storage :fog
def store_dir
"system/redactor_assets/documents/#{model.id}"
end
def extension_white_list
RedactorRails.document_file_types
end
end
| class RedactorRailsDocumentUploader < CarrierWave::Uploader::Base
include RedactorRails::Backend::CarrierWave
# storage :fog
storage :file
def store_dir
"system/redactor_assets/documents/#{model.id}"
end
def extension_white_list
RedactorRails.document_file_types
end
end
| Make :file the default storage with reminder that :fog is available | Make :file the default storage with reminder that :fog is available
| Ruby | mit | rikkeisoft/redactor-rails,dmsilaev/redactor-rails,SammyLin/redactor-rails,lawrrn/redactor,StarWar/redactor-rails,sergio1990/redactor-rails,AF83/redactor-rails,rikkeisoft/redactor-rails,lawrrn/redactor-paperclip,meliborn/redactor-rails,hnq90/redactor-rails,catarse/redactor-rails,liqites/redactor-rails,kyan/redactor-rails,niketpuranik/redactor-rails,ArtisR/redactor-rails,catarse/redactor-rails,lawrrn/redactor-paperclip,niketpuranik/redactor-rails,SammyLin/redactor-rails,krnjn/redactor-rails,sergio1990/redactor-rails,lawrrn/redactor,andystu/redactor-rails,meliborn/redactor-rails,ArtisR/redactor-rails,StarWar/redactor-rails,AF83/redactor-rails,boost/redactor-rails,lufutu/redactor-rails,ratafire/redactor-rails,ratafire/redactor-rails,lufutu/redactor-rails,glyph-fr/redactor-rails,boost/redactor-rails,kyan/redactor-rails,andystu/redactor-rails,hnq90/redactor-rails,krnjn/redactor-rails,liqites/redactor-rails | ruby | ## Code Before:
class RedactorRailsDocumentUploader < CarrierWave::Uploader::Base
include RedactorRails::Backend::CarrierWave
storage :fog
def store_dir
"system/redactor_assets/documents/#{model.id}"
end
def extension_white_list
RedactorRails.document_file_types
end
end
## Instruction:
Make :file the default storage with reminder that :fog is available
## Code After:
class RedactorRailsDocumentUploader < CarrierWave::Uploader::Base
include RedactorRails::Backend::CarrierWave
# storage :fog
storage :file
def store_dir
"system/redactor_assets/documents/#{model.id}"
end
def extension_white_list
RedactorRails.document_file_types
end
end
| class RedactorRailsDocumentUploader < CarrierWave::Uploader::Base
include RedactorRails::Backend::CarrierWave
- storage :fog
+ # storage :fog
? ++
+ storage :file
def store_dir
"system/redactor_assets/documents/#{model.id}"
end
def extension_white_list
RedactorRails.document_file_types
end
end | 3 | 0.230769 | 2 | 1 |
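The change makes `:file` (local disk) the default storage while leaving `# storage :fog` behind as a reminder that the cloud backend is one uncomment away. The shape of the switch, one default plus validated alternatives, can be sketched as follows (hypothetical helper, not CarrierWave's API):

```python
KNOWN_STORAGES = {"file", "fog"}  # local disk vs. cloud storage via the fog gem


def uploader_config(storage: str = "file") -> dict:
    """Hypothetical: build an uploader config with a validated backend."""
    if storage not in KNOWN_STORAGES:
        raise ValueError("unknown storage backend: " + storage)
    return {
        "storage": storage,
        "store_dir": "system/redactor_assets/documents",
    }
```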
7f652c4e4441ef9e0c148948232075aed410db80 | README.md | README.md |
Web Extension for handling groups of tabs persisted as bookmarks.
[](https://travis-ci.org/hupf/tabmarks)
## Known issues
* Only tested with Firefox.
* No i18n support (currently english only)
* Changes to the persisted bookmarks will not yet be synchronized with the open tabs.
## Development (for Firefox)
Install dependencies:
yarn install
Open Firefox and load the extension temporarily in the browser:
yarn start
Linting:
yarn lint
Testing:
yarn test
Creating a ZIP file (will be put in `web-ext-artifacts/`):
yarn build
## Author
Mathis Hofer (thanks to [Puzzle ITC](https://puzzle.ch) for letting me starting this project)
## License
Distributed under the [MIT License](LICENSE).
|
Web Extension for handling groups of tabs persisted as bookmarks.
[](https://travis-ci.org/hupf/tabmarks)
## Known issues
* Switching to a group loads all tabs (which is slow and clutters browser history), see #9
* No ability to rename, move or delete groups without browser restart, see #6
* No i18n support (currently english only)
* Only tested with Firefox
## Development (for Firefox)
Install dependencies:
yarn install
Open Firefox and load the extension temporarily in the browser:
yarn start
Linting:
yarn lint
Testing:
yarn test
Creating a ZIP file (will be put in `web-ext-artifacts/`):
yarn build
## Author
Mathis Hofer (thanks to [Puzzle ITC](https://puzzle.ch) for letting me starting this project)
## License
Distributed under the [MIT License](LICENSE).
| Update know issues in readme | Update know issues in readme
| Markdown | mit | hupf/tabmarks,hupf/tabmarks | markdown | ## Code Before:
Web Extension for handling groups of tabs persisted as bookmarks.
[](https://travis-ci.org/hupf/tabmarks)
## Known issues
* Only tested with Firefox.
* No i18n support (currently english only)
* Changes to the persisted bookmarks will not yet be synchronized with the open tabs.
## Development (for Firefox)
Install dependencies:
yarn install
Open Firefox and load the extension temporarily in the browser:
yarn start
Linting:
yarn lint
Testing:
yarn test
Creating a ZIP file (will be put in `web-ext-artifacts/`):
yarn build
## Author
Mathis Hofer (thanks to [Puzzle ITC](https://puzzle.ch) for letting me starting this project)
## License
Distributed under the [MIT License](LICENSE).
## Instruction:
Update know issues in readme
## Code After:
Web Extension for handling groups of tabs persisted as bookmarks.
[](https://travis-ci.org/hupf/tabmarks)
## Known issues
* Switching to a group loads all tabs (which is slow and clutters browser history), see #9
* No ability to rename, move or delete groups without browser restart, see #6
* No i18n support (currently english only)
* Only tested with Firefox
## Development (for Firefox)
Install dependencies:
yarn install
Open Firefox and load the extension temporarily in the browser:
yarn start
Linting:
yarn lint
Testing:
yarn test
Creating a ZIP file (will be put in `web-ext-artifacts/`):
yarn build
## Author
Mathis Hofer (thanks to [Puzzle ITC](https://puzzle.ch) for letting me starting this project)
## License
Distributed under the [MIT License](LICENSE).
|
Web Extension for handling groups of tabs persisted as bookmarks.
[](https://travis-ci.org/hupf/tabmarks)
## Known issues
- * Only tested with Firefox.
+ * Switching to a group loads all tabs (which is slow and clutters browser history), see #9
+ * No ability to rename, move or delete groups without browser restart, see #6
* No i18n support (currently english only)
- * Changes to the persisted bookmarks will not yet be synchronized with the open tabs.
+ * Only tested with Firefox
## Development (for Firefox)
Install dependencies:
yarn install
Open Firefox and load the extension temporarily in the browser:
yarn start
Linting:
yarn lint
Testing:
yarn test
Creating a ZIP file (will be put in `web-ext-artifacts/`):
yarn build
## Author
Mathis Hofer (thanks to [Puzzle ITC](https://puzzle.ch) for letting me starting this project)
## License
Distributed under the [MIT License](LICENSE). | 5 | 0.113636 | 3 | 2 |
9b830eddf653b88247755d47fcf3811c3488236b | assets/stylesheets/grayscale-sass-ink.scss | assets/stylesheets/grayscale-sass-ink.scss | /* Equivalent to main.scss */
/* Vendor imports */
//@import "bootstrap-sprockets";
@import "bootstrap";
@import "hover";
/* App imports */
@import "variables";
@import "grayscale";
| /* Equivalent to main.scss */
$fa-font-path: "../grayscale-sass-ink/bower_components/font-awesome/fonts";
$icon-font-path: "../grayscale-sass-ink/bower_components/fonts/bootstrap";
/* Vendor imports */
//@import "bootstrap-sprockets";
@import "bootstrap";
@import "hover";
@import "font-awesome";
/* App imports */
@import "variables";
@import "grayscale";
| Set scss font paths import font-awesome | Set scss font paths
import font-awesome
| SCSS | mit | dougbeal/grayscale-sass-ink,dougbeal/grayscale-sass-ink | scss | ## Code Before:
/* Equivalent to main.scss */
/* Vendor imports */
//@import "bootstrap-sprockets";
@import "bootstrap";
@import "hover";
/* App imports */
@import "variables";
@import "grayscale";
## Instruction:
Set scss font paths
import font-awesome
## Code After:
/* Equivalent to main.scss */
$fa-font-path: "../grayscale-sass-ink/bower_components/font-awesome/fonts";
$icon-font-path: "../grayscale-sass-ink/bower_components/fonts/bootstrap";
/* Vendor imports */
//@import "bootstrap-sprockets";
@import "bootstrap";
@import "hover";
@import "font-awesome";
/* App imports */
@import "variables";
@import "grayscale";
| /* Equivalent to main.scss */
+ $fa-font-path: "../grayscale-sass-ink/bower_components/font-awesome/fonts";
+ $icon-font-path: "../grayscale-sass-ink/bower_components/fonts/bootstrap";
/* Vendor imports */
//@import "bootstrap-sprockets";
@import "bootstrap";
@import "hover";
+ @import "font-awesome";
/* App imports */
@import "variables";
@import "grayscale"; | 3 | 0.3 | 3 | 0 |
2b46b7eb34a2383c62f51b4cd6cb365f769f588d | app/src/main/java/com/ibm/mil/smartringer/RingerAdjusterService.java | app/src/main/java/com/ibm/mil/smartringer/RingerAdjusterService.java | package com.ibm.mil.smartringer;
import android.app.IntentService;
import android.content.Intent;
import android.media.AudioManager;
import android.util.Log;
public class RingerAdjusterService extends IntentService {
private static final String TAG = RingerAdjusterService.class.getName();
private static final String PREFS_NAME = "SmartRingerPrefs";
private static final String SENS_KEY = "sensitivityLevel";
public RingerAdjusterService() {
super(TAG);
}
@Override
protected void onHandleIntent(Intent intent) {
Log.i(TAG, "Service for adjusting ringer volume has been called");
// initialize volume adjuster and mute ringer to perform proper noise level detection
AudioManager audioManager = (AudioManager) getSystemService(AUDIO_SERVICE);
VolumeAdjuster volumeAdjuster = new VolumeAdjuster(audioManager, AudioManager.STREAM_RING);
volumeAdjuster.mute();
// detect noise level for specified sensitivity and adjust ringer volume accordingly
VolumeAdjuster.SensitivityLevel sensLevel = VolumeAdjuster.getUsersSensitivityLevel(this);
NoiseMeter noiseMeter = new NoiseMeter();
volumeAdjuster.adjustVolume(noiseMeter.getMaxAmplitude(), sensLevel);
stopSelf();
}
}
| package com.ibm.mil.smartringer;
import android.app.IntentService;
import android.content.Intent;
import android.media.AudioManager;
import android.util.Log;
public class RingerAdjusterService extends IntentService {
private static final String TAG = RingerAdjusterService.class.getName();
public RingerAdjusterService() {
super(TAG);
}
@Override
protected void onHandleIntent(Intent intent) {
Log.i(TAG, "Service for adjusting ringer volume has been called");
// initialize volume adjuster and mute ringer to perform proper noise level detection
AudioManager audioManager = (AudioManager) getSystemService(AUDIO_SERVICE);
VolumeAdjuster volumeAdjuster = new VolumeAdjuster(audioManager, AudioManager.STREAM_RING);
volumeAdjuster.mute();
// detect noise level for specified sensitivity and adjust ringer volume accordingly
VolumeAdjuster.SensitivityLevel sensLevel = VolumeAdjuster.getUsersSensitivityLevel(this);
NoiseMeter noiseMeter = new NoiseMeter();
volumeAdjuster.adjustVolume(noiseMeter.getMaxAmplitude(), sensLevel);
stopSelf();
}
}
| Remove unnecessary string constants from service | Remove unnecessary string constants from service
| Java | apache-2.0 | jpetitto/android-smart-ringer | java | ## Code Before:
package com.ibm.mil.smartringer;
import android.app.IntentService;
import android.content.Intent;
import android.media.AudioManager;
import android.util.Log;
public class RingerAdjusterService extends IntentService {
private static final String TAG = RingerAdjusterService.class.getName();
private static final String PREFS_NAME = "SmartRingerPrefs";
private static final String SENS_KEY = "sensitivityLevel";
public RingerAdjusterService() {
super(TAG);
}
@Override
protected void onHandleIntent(Intent intent) {
Log.i(TAG, "Service for adjusting ringer volume has been called");
// initialize volume adjuster and mute ringer to perform proper noise level detection
AudioManager audioManager = (AudioManager) getSystemService(AUDIO_SERVICE);
VolumeAdjuster volumeAdjuster = new VolumeAdjuster(audioManager, AudioManager.STREAM_RING);
volumeAdjuster.mute();
// detect noise level for specified sensitivity and adjust ringer volume accordingly
VolumeAdjuster.SensitivityLevel sensLevel = VolumeAdjuster.getUsersSensitivityLevel(this);
NoiseMeter noiseMeter = new NoiseMeter();
volumeAdjuster.adjustVolume(noiseMeter.getMaxAmplitude(), sensLevel);
stopSelf();
}
}
## Instruction:
Remove unnecessary string constants from service
## Code After:
package com.ibm.mil.smartringer;
import android.app.IntentService;
import android.content.Intent;
import android.media.AudioManager;
import android.util.Log;
public class RingerAdjusterService extends IntentService {
private static final String TAG = RingerAdjusterService.class.getName();
public RingerAdjusterService() {
super(TAG);
}
@Override
protected void onHandleIntent(Intent intent) {
Log.i(TAG, "Service for adjusting ringer volume has been called");
// initialize volume adjuster and mute ringer to perform proper noise level detection
AudioManager audioManager = (AudioManager) getSystemService(AUDIO_SERVICE);
VolumeAdjuster volumeAdjuster = new VolumeAdjuster(audioManager, AudioManager.STREAM_RING);
volumeAdjuster.mute();
// detect noise level for specified sensitivity and adjust ringer volume accordingly
VolumeAdjuster.SensitivityLevel sensLevel = VolumeAdjuster.getUsersSensitivityLevel(this);
NoiseMeter noiseMeter = new NoiseMeter();
volumeAdjuster.adjustVolume(noiseMeter.getMaxAmplitude(), sensLevel);
stopSelf();
}
}
| package com.ibm.mil.smartringer;
import android.app.IntentService;
import android.content.Intent;
import android.media.AudioManager;
import android.util.Log;
public class RingerAdjusterService extends IntentService {
private static final String TAG = RingerAdjusterService.class.getName();
- private static final String PREFS_NAME = "SmartRingerPrefs";
- private static final String SENS_KEY = "sensitivityLevel";
public RingerAdjusterService() {
super(TAG);
}
@Override
protected void onHandleIntent(Intent intent) {
Log.i(TAG, "Service for adjusting ringer volume has been called");
// initialize volume adjuster and mute ringer to perform proper noise level detection
AudioManager audioManager = (AudioManager) getSystemService(AUDIO_SERVICE);
VolumeAdjuster volumeAdjuster = new VolumeAdjuster(audioManager, AudioManager.STREAM_RING);
volumeAdjuster.mute();
// detect noise level for specified sensitivity and adjust ringer volume accordingly
VolumeAdjuster.SensitivityLevel sensLevel = VolumeAdjuster.getUsersSensitivityLevel(this);
NoiseMeter noiseMeter = new NoiseMeter();
volumeAdjuster.adjustVolume(noiseMeter.getMaxAmplitude(), sensLevel);
stopSelf();
}
} | 2 | 0.058824 | 0 | 2 |
c3a92d10ad1c1c9e50be895a53a0c5deaf061dc2 | src/config/constants.js | src/config/constants.js | import axios from 'axios';
export const liveRootUrl = 'https://bolg-app.herokuapp.com/posts/';
export const states = {
LOADING: 0,
EDITING: 1,
SAVED: 2,
ERROR: 3,
EDITING_OFFLINE: 4,
SAVED_OFFLINE: 5,
PUBLISHED: 6,
};
export const mapsAPIKey = 'AIzaSyBADvjevyMmDkHb_xjjh3FOltkO2Oa8iAQ';
export function reverseGeocode(lat, lng) {
return axios.get(`https://maps.googleapis.com/maps/api/geocode/json?latlng=${lat},${lng}&key=${mapsAPIKey}`);
}
export const sizes = [
{
width: 2560,
height: 1440,
},
{
width: 1920,
height: 1080,
},
{
width: 1024,
height: 576,
},
{
width: 640,
height: 360,
},
];
export const slugger = str => str
.toLowerCase()
.replace(/ä/g, 'ae')
.replace(/ö/g, 'oe')
.replace(/ü/g, 'ue')
.replace(/[^\w ]+/g, ' ')
.replace(/ +/g, '-');
| export const liveRootUrl = 'https://bolg-app.herokuapp.com/posts/';
export const states = {
LOADING: 0,
EDITING: 1,
SAVED: 2,
ERROR: 3,
EDITING_OFFLINE: 4,
SAVED_OFFLINE: 5,
PUBLISHED: 6,
};
export const mapsAPIKey = 'AIzaSyBADvjevyMmDkHb_xjjh3FOltkO2Oa8iAQ';
export const sizes = [
{
width: 2560,
height: 1440,
},
{
width: 1920,
height: 1080,
},
{
width: 1024,
height: 576,
},
{
width: 640,
height: 360,
},
];
export const slugger = str => str
.toLowerCase()
.replace(/ä/g, 'ae')
.replace(/ö/g, 'oe')
.replace(/ü/g, 'ue')
.replace(/[^\w ]+/g, ' ')
.replace(/ +/g, '-');
| Remove axios and unused function | Remove axios and unused function
| JavaScript | mit | tuelsch/bolg,tuelsch/bolg | javascript | ## Code Before:
import axios from 'axios';
export const liveRootUrl = 'https://bolg-app.herokuapp.com/posts/';
export const states = {
LOADING: 0,
EDITING: 1,
SAVED: 2,
ERROR: 3,
EDITING_OFFLINE: 4,
SAVED_OFFLINE: 5,
PUBLISHED: 6,
};
export const mapsAPIKey = 'AIzaSyBADvjevyMmDkHb_xjjh3FOltkO2Oa8iAQ';
export function reverseGeocode(lat, lng) {
return axios.get(`https://maps.googleapis.com/maps/api/geocode/json?latlng=${lat},${lng}&key=${mapsAPIKey}`);
}
export const sizes = [
{
width: 2560,
height: 1440,
},
{
width: 1920,
height: 1080,
},
{
width: 1024,
height: 576,
},
{
width: 640,
height: 360,
},
];
export const slugger = str => str
.toLowerCase()
.replace(/ä/g, 'ae')
.replace(/ö/g, 'oe')
.replace(/ü/g, 'ue')
.replace(/[^\w ]+/g, ' ')
.replace(/ +/g, '-');
## Instruction:
Remove axios and unused function
## Code After:
export const liveRootUrl = 'https://bolg-app.herokuapp.com/posts/';
export const states = {
LOADING: 0,
EDITING: 1,
SAVED: 2,
ERROR: 3,
EDITING_OFFLINE: 4,
SAVED_OFFLINE: 5,
PUBLISHED: 6,
};
export const mapsAPIKey = 'AIzaSyBADvjevyMmDkHb_xjjh3FOltkO2Oa8iAQ';
export const sizes = [
{
width: 2560,
height: 1440,
},
{
width: 1920,
height: 1080,
},
{
width: 1024,
height: 576,
},
{
width: 640,
height: 360,
},
];
export const slugger = str => str
.toLowerCase()
.replace(/ä/g, 'ae')
.replace(/ö/g, 'oe')
.replace(/ü/g, 'ue')
.replace(/[^\w ]+/g, ' ')
.replace(/ +/g, '-');
| - import axios from 'axios';
-
export const liveRootUrl = 'https://bolg-app.herokuapp.com/posts/';
export const states = {
LOADING: 0,
EDITING: 1,
SAVED: 2,
ERROR: 3,
EDITING_OFFLINE: 4,
SAVED_OFFLINE: 5,
PUBLISHED: 6,
};
export const mapsAPIKey = 'AIzaSyBADvjevyMmDkHb_xjjh3FOltkO2Oa8iAQ';
-
- export function reverseGeocode(lat, lng) {
- return axios.get(`https://maps.googleapis.com/maps/api/geocode/json?latlng=${lat},${lng}&key=${mapsAPIKey}`);
- }
export const sizes = [
{
width: 2560,
height: 1440,
},
{
width: 1920,
height: 1080,
},
{
width: 1024,
height: 576,
},
{
width: 640,
height: 360,
},
];
export const slugger = str => str
.toLowerCase()
.replace(/ä/g, 'ae')
.replace(/ö/g, 'oe')
.replace(/ü/g, 'ue')
.replace(/[^\w ]+/g, ' ')
.replace(/ +/g, '-'); | 6 | 0.130435 | 0 | 6 |
959a26521a1e19552854df47439703d3a73f2a1e | src/brreg.lisp | src/brreg.lisp | (in-package :brreg)
(push (cons "application" "json") drakma:*text-content-types*)
(defun get-jsonhash (orgnummer)
(when (and (every #'digit-char-p orgnummer)
(= (length orgnummer) 9))
(multiple-value-bind (response status)
(drakma:http-request
(format nil "http://data.brreg.no/enhetsregisteret/enhet/~A.json"
orgnummer))
(when (eql status 200)
(yason:parse response)))))
(defun get-orgfeature (feature orgnummer)
(ignore-errors
(let ((hash (get-jsonhash orgnummer)))
(if hash
(gethash feature hash)
(make-hash-table)))))
(defun get-org-name (orgnummer)
(let ((navn (get-orgfeature "navn" orgnummer)))
(when (stringp navn)
navn)))
(defun get-org-address (orgnummer)
(gethash "adresse" (get-orgfeature "forretningsadresse" orgnummer)))
(defun get-org-postnummer (orgnummer)
(gethash "postnummer" (get-orgfeature "forretningsadresse" orgnummer)))
(defun get-org-poststed (orgnummer)
(gethash "poststed" (get-orgfeature "forretningsadresse" orgnummer)))
| (in-package :brreg)
(push (cons "application" "json") drakma:*text-content-types*)
(defun get-jsonhash (orgnummer)
(when (and (every #'digit-char-p orgnummer)
(= (length orgnummer) 9))
(multiple-value-bind (response status)
(drakma:http-request
(format nil "http://data.brreg.no/enhetsregisteret/enhet/~A.json"
orgnummer))
(when (eql status 200)
(yason:parse response)))))
(defun get-orgfeature (feature orgnummer)
(ignore-errors
(let ((hash (get-jsonhash orgnummer)))
(if hash
(gethash feature hash)
(make-hash-table)))))
(defun get-org-name (orgnummer)
(let ((navn (get-orgfeature "navn" orgnummer)))
(when (stringp navn)
navn)))
(defun get-org-address (orgnummer)
(ignore-errors
(gethash "adresse" (get-orgfeature "forretningsadresse" orgnummer))))
(defun get-org-postnummer (orgnummer)
(ignore-errors
(gethash "postnummer" (get-orgfeature "forretningsadresse" orgnummer))))
(defun get-org-poststed (orgnummer)
(ignore-errors
(gethash "poststed" (get-orgfeature "forretningsadresse" orgnummer))))
| Return NIL for all error situations. | Return NIL for all error situations.
| Common Lisp | mit | torhenrik/cl-brreg | common-lisp | ## Code Before:
(in-package :brreg)
(push (cons "application" "json") drakma:*text-content-types*)
(defun get-jsonhash (orgnummer)
(when (and (every #'digit-char-p orgnummer)
(= (length orgnummer) 9))
(multiple-value-bind (response status)
(drakma:http-request
(format nil "http://data.brreg.no/enhetsregisteret/enhet/~A.json"
orgnummer))
(when (eql status 200)
(yason:parse response)))))
(defun get-orgfeature (feature orgnummer)
(ignore-errors
(let ((hash (get-jsonhash orgnummer)))
(if hash
(gethash feature hash)
(make-hash-table)))))
(defun get-org-name (orgnummer)
(let ((navn (get-orgfeature "navn" orgnummer)))
(when (stringp navn)
navn)))
(defun get-org-address (orgnummer)
(gethash "adresse" (get-orgfeature "forretningsadresse" orgnummer)))
(defun get-org-postnummer (orgnummer)
(gethash "postnummer" (get-orgfeature "forretningsadresse" orgnummer)))
(defun get-org-poststed (orgnummer)
(gethash "poststed" (get-orgfeature "forretningsadresse" orgnummer)))
## Instruction:
Return NIL for all error situations.
## Code After:
(in-package :brreg)
(push (cons "application" "json") drakma:*text-content-types*)
(defun get-jsonhash (orgnummer)
(when (and (every #'digit-char-p orgnummer)
(= (length orgnummer) 9))
(multiple-value-bind (response status)
(drakma:http-request
(format nil "http://data.brreg.no/enhetsregisteret/enhet/~A.json"
orgnummer))
(when (eql status 200)
(yason:parse response)))))
(defun get-orgfeature (feature orgnummer)
(ignore-errors
(let ((hash (get-jsonhash orgnummer)))
(if hash
(gethash feature hash)
(make-hash-table)))))
(defun get-org-name (orgnummer)
(let ((navn (get-orgfeature "navn" orgnummer)))
(when (stringp navn)
navn)))
(defun get-org-address (orgnummer)
(ignore-errors
(gethash "adresse" (get-orgfeature "forretningsadresse" orgnummer))))
(defun get-org-postnummer (orgnummer)
(ignore-errors
(gethash "postnummer" (get-orgfeature "forretningsadresse" orgnummer))))
(defun get-org-poststed (orgnummer)
(ignore-errors
(gethash "poststed" (get-orgfeature "forretningsadresse" orgnummer))))
| (in-package :brreg)
(push (cons "application" "json") drakma:*text-content-types*)
(defun get-jsonhash (orgnummer)
(when (and (every #'digit-char-p orgnummer)
(= (length orgnummer) 9))
(multiple-value-bind (response status)
(drakma:http-request
(format nil "http://data.brreg.no/enhetsregisteret/enhet/~A.json"
orgnummer))
(when (eql status 200)
(yason:parse response)))))
(defun get-orgfeature (feature orgnummer)
(ignore-errors
(let ((hash (get-jsonhash orgnummer)))
(if hash
(gethash feature hash)
(make-hash-table)))))
(defun get-org-name (orgnummer)
(let ((navn (get-orgfeature "navn" orgnummer)))
(when (stringp navn)
navn)))
(defun get-org-address (orgnummer)
+ (ignore-errors
- (gethash "adresse" (get-orgfeature "forretningsadresse" orgnummer)))
+ (gethash "adresse" (get-orgfeature "forretningsadresse" orgnummer))))
? + +
(defun get-org-postnummer (orgnummer)
+ (ignore-errors
- (gethash "postnummer" (get-orgfeature "forretningsadresse" orgnummer)))
+ (gethash "postnummer" (get-orgfeature "forretningsadresse" orgnummer))))
? + +
(defun get-org-poststed (orgnummer)
+ (ignore-errors
- (gethash "poststed" (get-orgfeature "forretningsadresse" orgnummer)))
+ (gethash "poststed" (get-orgfeature "forretningsadresse" orgnummer))))
? + +
| 9 | 0.25 | 6 | 3 |
5d301587d92c73edfd8ded5e8110fe91cde19820 | lib/dav/resource/children.rb | lib/dav/resource/children.rb | module DAV
class Children < Struct.new(:parent)
include DAV
attr_reader :uris
SEPARATOR = "\n"
def initialize(parent)
super parent
load_paths
end
def store
return unless changed?
unless @uris.empty?
relation_storage.set parent.id, @uris.join(SEPARATOR)
else
relation_storage.delete parent.id
end
reset!
return true
end
def add(child)
@uris.push child.decoded_uri
changed!
self
end
def remove(child)
@uris.delete_if { |uri| uri == child.decoded_uri }
changed!
self
end
def each
if block_given?
@uris.each { |uri| yield parent.join(uri.path) }
else
Enumerator.new self, :each
end
end
protected
attr_reader :changed
alias changed? changed
def changed!
@changed = true
end
def reset!
@changed = false
end
def load_paths
string = relation_storage.get parent.id
string ||= ''
@uris = string.split(SEPARATOR).map { |uri| URI.parse uri }
end
end
end
| require 'set'
module DAV
class Children < Struct.new(:parent)
include DAV
attr_reader :uris
SEPARATOR = "\n"
def initialize(parent)
super parent
load_paths
end
def store
return unless changed?
unless @uris.empty?
relation_storage.set parent.id, @uris.to_a.join(SEPARATOR)
else
relation_storage.delete parent.id
end
reset!
return true
end
def add(child)
@uris.add child.decoded_uri
changed!
self
end
def remove(child)
@uris.delete_if { |uri| uri == child.decoded_uri }
changed!
self
end
def each
if block_given?
@uris.each { |uri| yield parent.join(uri.path) }
else
Enumerator.new self, :each
end
end
protected
attr_reader :changed
alias changed? changed
def changed!
@changed = true
end
def reset!
@changed = false
end
def load_paths
string = relation_storage.get parent.id
string ||= ''
@uris = Set.new string.split(SEPARATOR).map { |uri| URI.parse uri }
end
end
end
| Use a set instead of simple array. | Use a set instead of simple array.
| Ruby | mit | fork/sinatra-webdav | ruby | ## Code Before:
module DAV
class Children < Struct.new(:parent)
include DAV
attr_reader :uris
SEPARATOR = "\n"
def initialize(parent)
super parent
load_paths
end
def store
return unless changed?
unless @uris.empty?
relation_storage.set parent.id, @uris.join(SEPARATOR)
else
relation_storage.delete parent.id
end
reset!
return true
end
def add(child)
@uris.push child.decoded_uri
changed!
self
end
def remove(child)
@uris.delete_if { |uri| uri == child.decoded_uri }
changed!
self
end
def each
if block_given?
@uris.each { |uri| yield parent.join(uri.path) }
else
Enumerator.new self, :each
end
end
protected
attr_reader :changed
alias changed? changed
def changed!
@changed = true
end
def reset!
@changed = false
end
def load_paths
string = relation_storage.get parent.id
string ||= ''
@uris = string.split(SEPARATOR).map { |uri| URI.parse uri }
end
end
end
## Instruction:
Use a set instead of simple array.
## Code After:
require 'set'
module DAV
class Children < Struct.new(:parent)
include DAV
attr_reader :uris
SEPARATOR = "\n"
def initialize(parent)
super parent
load_paths
end
def store
return unless changed?
unless @uris.empty?
relation_storage.set parent.id, @uris.to_a.join(SEPARATOR)
else
relation_storage.delete parent.id
end
reset!
return true
end
def add(child)
@uris.add child.decoded_uri
changed!
self
end
def remove(child)
@uris.delete_if { |uri| uri == child.decoded_uri }
changed!
self
end
def each
if block_given?
@uris.each { |uri| yield parent.join(uri.path) }
else
Enumerator.new self, :each
end
end
protected
attr_reader :changed
alias changed? changed
def changed!
@changed = true
end
def reset!
@changed = false
end
def load_paths
string = relation_storage.get parent.id
string ||= ''
@uris = Set.new string.split(SEPARATOR).map { |uri| URI.parse uri }
end
end
end
| + require 'set'
+
module DAV
class Children < Struct.new(:parent)
include DAV
attr_reader :uris
SEPARATOR = "\n"
def initialize(parent)
super parent
load_paths
end
def store
return unless changed?
unless @uris.empty?
- relation_storage.set parent.id, @uris.join(SEPARATOR)
+ relation_storage.set parent.id, @uris.to_a.join(SEPARATOR)
? +++++
else
relation_storage.delete parent.id
end
reset!
return true
end
def add(child)
- @uris.push child.decoded_uri
? ^^^^
+ @uris.add child.decoded_uri
? ^^^
changed!
self
end
def remove(child)
@uris.delete_if { |uri| uri == child.decoded_uri }
changed!
self
end
def each
if block_given?
@uris.each { |uri| yield parent.join(uri.path) }
else
Enumerator.new self, :each
end
end
protected
attr_reader :changed
alias changed? changed
def changed!
@changed = true
end
def reset!
@changed = false
end
def load_paths
string = relation_storage.get parent.id
string ||= ''
- @uris = string.split(SEPARATOR).map { |uri| URI.parse uri }
+ @uris = Set.new string.split(SEPARATOR).map { |uri| URI.parse uri }
? ++++++++
end
end
end | 8 | 0.123077 | 5 | 3 |
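The diff above replaces the `@uris` Array with a `Set`, so repeated `add` calls for the same child URI are collapsed automatically. A minimal standalone sketch of the difference (hypothetical URIs, not the DAV code itself):

```ruby
require 'set'

uris = ['a/b', 'a/b', 'c']  # hypothetical child URIs, one duplicate

as_array = []
uris.each { |u| as_array.push(u) }  # Array#push keeps the duplicate

as_set = Set.new
uris.each { |u| as_set.add(u) }     # Set#add silently drops the repeat

as_array.length         # => 3
as_set.size             # => 2
as_set.to_a.join("\n")  # => "a/b\nc" (same SEPARATOR join used by #store)
```

Ruby's `Set` is hash-backed and preserves insertion order, which is why the class can still serialize with `to_a.join(SEPARATOR)` without reordering children.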
f330bf82d7eea8de20924aeecdab7343a602e999 | core/string/unpack/z_spec.rb | core/string/unpack/z_spec.rb | require_relative '../../../spec_helper'
require_relative '../fixtures/classes'
require_relative 'shared/basic'
require_relative 'shared/string'
require_relative 'shared/taint'
describe "String#unpack with format 'Z'" do
it_behaves_like :string_unpack_basic, 'Z'
it_behaves_like :string_unpack_no_platform, 'Z'
it_behaves_like :string_unpack_string, 'Z'
it_behaves_like :string_unpack_taint, 'Z'
it "stops decoding at NULL bytes when passed the '*' modifier" do
"a\x00\x00 b \x00c".unpack('Z*Z*Z*Z*').should == ["a", "", " b ", "c"]
end
it "decodes the number of bytes specified by the count modifier and truncates the decoded string at the first NULL byte" do
[ ["a\x00 \x00b c", ["a", " "]],
["\x00a\x00 bc \x00", ["", "c"]]
].should be_computed_by(:unpack, "Z5Z")
end
end
| require_relative '../../../spec_helper'
require_relative '../fixtures/classes'
require_relative 'shared/basic'
require_relative 'shared/string'
require_relative 'shared/taint'
describe "String#unpack with format 'Z'" do
it_behaves_like :string_unpack_basic, 'Z'
it_behaves_like :string_unpack_no_platform, 'Z'
it_behaves_like :string_unpack_string, 'Z'
it_behaves_like :string_unpack_taint, 'Z'
it "stops decoding at NULL bytes when passed the '*' modifier" do
"a\x00\x00 b \x00c".unpack('Z*Z*Z*Z*').should == ["a", "", " b ", "c"]
end
it "decodes the number of bytes specified by the count modifier and truncates the decoded string at the first NULL byte" do
[ ["a\x00 \x00b c", ["a", " "]],
["\x00a\x00 bc \x00", ["", "c"]]
].should be_computed_by(:unpack, "Z5Z")
end
it "does not advance past the null byte when given a 'Z' format specifier" do
"a\x00\x0f".unpack('Zxc').should == ['a', 15]
"a\x00\x0f".unpack('Zcc').should == ['a', 0, 15]
end
end
| Add specs for consuming null byte. | Add specs for consuming null byte.
| Ruby | mit | ruby/spec,ruby/spec,ruby/spec | ruby | ## Code Before:
require_relative '../../../spec_helper'
require_relative '../fixtures/classes'
require_relative 'shared/basic'
require_relative 'shared/string'
require_relative 'shared/taint'
describe "String#unpack with format 'Z'" do
it_behaves_like :string_unpack_basic, 'Z'
it_behaves_like :string_unpack_no_platform, 'Z'
it_behaves_like :string_unpack_string, 'Z'
it_behaves_like :string_unpack_taint, 'Z'
it "stops decoding at NULL bytes when passed the '*' modifier" do
"a\x00\x00 b \x00c".unpack('Z*Z*Z*Z*').should == ["a", "", " b ", "c"]
end
it "decodes the number of bytes specified by the count modifier and truncates the decoded string at the first NULL byte" do
[ ["a\x00 \x00b c", ["a", " "]],
["\x00a\x00 bc \x00", ["", "c"]]
].should be_computed_by(:unpack, "Z5Z")
end
end
## Instruction:
Add specs for consuming null byte.
## Code After:
require_relative '../../../spec_helper'
require_relative '../fixtures/classes'
require_relative 'shared/basic'
require_relative 'shared/string'
require_relative 'shared/taint'
describe "String#unpack with format 'Z'" do
it_behaves_like :string_unpack_basic, 'Z'
it_behaves_like :string_unpack_no_platform, 'Z'
it_behaves_like :string_unpack_string, 'Z'
it_behaves_like :string_unpack_taint, 'Z'
it "stops decoding at NULL bytes when passed the '*' modifier" do
"a\x00\x00 b \x00c".unpack('Z*Z*Z*Z*').should == ["a", "", " b ", "c"]
end
it "decodes the number of bytes specified by the count modifier and truncates the decoded string at the first NULL byte" do
[ ["a\x00 \x00b c", ["a", " "]],
["\x00a\x00 bc \x00", ["", "c"]]
].should be_computed_by(:unpack, "Z5Z")
end
it "does not advance past the null byte when given a 'Z' format specifier" do
"a\x00\x0f".unpack('Zxc').should == ['a', 15]
"a\x00\x0f".unpack('Zcc').should == ['a', 0, 15]
end
end
| require_relative '../../../spec_helper'
require_relative '../fixtures/classes'
require_relative 'shared/basic'
require_relative 'shared/string'
require_relative 'shared/taint'
describe "String#unpack with format 'Z'" do
it_behaves_like :string_unpack_basic, 'Z'
it_behaves_like :string_unpack_no_platform, 'Z'
it_behaves_like :string_unpack_string, 'Z'
it_behaves_like :string_unpack_taint, 'Z'
it "stops decoding at NULL bytes when passed the '*' modifier" do
"a\x00\x00 b \x00c".unpack('Z*Z*Z*Z*').should == ["a", "", " b ", "c"]
end
it "decodes the number of bytes specified by the count modifier and truncates the decoded string at the first NULL byte" do
[ ["a\x00 \x00b c", ["a", " "]],
["\x00a\x00 bc \x00", ["", "c"]]
].should be_computed_by(:unpack, "Z5Z")
end
+
+ it "does not advance past the null byte when given a 'Z' format specifier" do
+ "a\x00\x0f".unpack('Zxc').should == ['a', 15]
+ "a\x00\x0f".unpack('Zcc').should == ['a', 0, 15]
+ end
end | 5 | 0.227273 | 5 | 0 |
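For quick reference, the behavior these specs pin down can be checked directly in irb; the expressions below are taken from the spec expectations themselves:

```ruby
# 'Z*' stops decoding each field at the first NUL byte:
"a\x00\x00 b \x00c".unpack('Z*Z*Z*Z*')  # => ["a", "", " b ", "c"]

# A bare 'Z' does not advance past the NUL it stops at, so the next
# directive ('x' = skip one byte, 'c' = signed 8-bit int) still sees it:
"a\x00\x0f".unpack('Zxc')  # => ["a", 15]
"a\x00\x0f".unpack('Zcc')  # => ["a", 0, 15]
```

The second pair is exactly the distinction the new spec documents: `Z*` consumes its terminating NUL, while `Z` with a (default or explicit) count leaves the stream position for following directives to handle.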
418dd74f2d89129e03fdbde232e1a846b607c190 | src/Ilios/WebBundle/Controller/IndexController.php | src/Ilios/WebBundle/Controller/IndexController.php | <?php
namespace Ilios\WebBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Response;
use Ilios\CliBundle\Command\UpdateFrontendCommand;
use Ilios\WebBundle\Service\WebIndexFromJson;
class IndexController extends Controller
{
public function indexAction()
{
$fs = $this->get('ilioscore.symfonyfilesystem');
$path = $this->getParameter('kernel.cache_dir') . '/' . UpdateFrontendCommand::CACHE_FILE_NAME;
if (!$fs->exists($path)) {
throw new \Exception(
'Unable to load the index file. Run ilios:maintenance:update-frontend to create it.'
);
}
$contents = $fs->readFile($path);
$response = new Response($contents);
$response->headers->set('Content-Type', 'text/html');
$response->setPublic();
$response->setMaxAge(60);
return $response;
}
}
| <?php
namespace Ilios\WebBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Response;
use Ilios\CliBundle\Command\UpdateFrontendCommand;
use Ilios\WebBundle\Service\WebIndexFromJson;
class IndexController extends Controller
{
public function indexAction()
{
$fs = $this->get('ilioscore.symfonyfilesystem');
$path = $this->getParameter('kernel.cache_dir') . '/' . UpdateFrontendCommand::CACHE_FILE_NAME;
if (!$fs->exists($path)) {
throw new \Exception(
"Unable to load the index file at {$path}. Run ilios:maintenance:update-frontend to create it."
);
}
$contents = $fs->readFile($path);
$response = new Response($contents);
$response->headers->set('Content-Type', 'text/html');
$response->setPublic();
$response->setMaxAge(60);
return $response;
}
}
| Improve exception message when fronted not found | Improve exception message when fronted not found
| PHP | mit | justinchu17/ilios,dartajax/ilios,justinchu17/ilios,stopfstedt/ilios,ilios/ilios,ilios/ilios,dartajax/ilios,Trott/ilios,justinchu17/ilios,justinchu17/ilios,stopfstedt/ilios,Trott/ilios,thecoolestguy/ilios,thecoolestguy/ilios | php | ## Code Before:
<?php
namespace Ilios\WebBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Response;
use Ilios\CliBundle\Command\UpdateFrontendCommand;
use Ilios\WebBundle\Service\WebIndexFromJson;
class IndexController extends Controller
{
public function indexAction()
{
$fs = $this->get('ilioscore.symfonyfilesystem');
$path = $this->getParameter('kernel.cache_dir') . '/' . UpdateFrontendCommand::CACHE_FILE_NAME;
if (!$fs->exists($path)) {
throw new \Exception(
'Unable to load the index file. Run ilios:maintenance:update-frontend to create it.'
);
}
$contents = $fs->readFile($path);
$response = new Response($contents);
$response->headers->set('Content-Type', 'text/html');
$response->setPublic();
$response->setMaxAge(60);
return $response;
}
}
## Instruction:
Improve exception message when fronted not found
## Code After:
<?php
namespace Ilios\WebBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Response;
use Ilios\CliBundle\Command\UpdateFrontendCommand;
use Ilios\WebBundle\Service\WebIndexFromJson;
class IndexController extends Controller
{
public function indexAction()
{
$fs = $this->get('ilioscore.symfonyfilesystem');
$path = $this->getParameter('kernel.cache_dir') . '/' . UpdateFrontendCommand::CACHE_FILE_NAME;
if (!$fs->exists($path)) {
throw new \Exception(
"Unable to load the index file at {$path}. Run ilios:maintenance:update-frontend to create it."
);
}
$contents = $fs->readFile($path);
$response = new Response($contents);
$response->headers->set('Content-Type', 'text/html');
$response->setPublic();
$response->setMaxAge(60);
return $response;
}
}
| <?php
namespace Ilios\WebBundle\Controller;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Response;
use Ilios\CliBundle\Command\UpdateFrontendCommand;
use Ilios\WebBundle\Service\WebIndexFromJson;
class IndexController extends Controller
{
public function indexAction()
{
$fs = $this->get('ilioscore.symfonyfilesystem');
$path = $this->getParameter('kernel.cache_dir') . '/' . UpdateFrontendCommand::CACHE_FILE_NAME;
if (!$fs->exists($path)) {
throw new \Exception(
- 'Unable to load the index file. Run ilios:maintenance:update-frontend to create it.'
? ^ ^
+ "Unable to load the index file at {$path}. Run ilios:maintenance:update-frontend to create it."
? ^ +++++++++++ ^
);
}
$contents = $fs->readFile($path);
$response = new Response($contents);
$response->headers->set('Content-Type', 'text/html');
$response->setPublic();
$response->setMaxAge(60);
return $response;
}
} | 2 | 0.060606 | 1 | 1 |
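Note that the fix works because it also switches the message from single to double quotes — PHP only interpolates `{$path}` inside double-quoted strings. Ruby draws the same single-vs-double-quote distinction; a hedged aside (the path below is a stand-in, not the real cache path):

```ruby
path = '/var/cache/index.json'  # stand-in value, not the real cache path

single = 'Unable to load the index file at #{path}.'  # left verbatim
double = "Unable to load the index file at #{path}."  # #{...} is evaluated

puts single  # Unable to load the index file at #{path}.
puts double  # Unable to load the index file at /var/cache/index.json.
```

In both languages, keeping the path out of the message (the single-quoted form) is exactly the bug class this commit fixes.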
33592a1379945be6be713124acb7bbc43f033f5f | app/BookingStrategyTimeslot.php | app/BookingStrategyTimeslot.php | <?php
namespace App;
use App\Appointment;
use App\Business;
use Carbon\Carbon;
class BookingStrategyTimeslot implements BookingStrategyInterface
{
public function makeReservation(Business $business, $data)
{
$data['business_id'] = $business->id;
$appointment = new Appointment($data);
return $appointment->save();
}
}
| <?php
namespace App;
use App\Appointment;
use App\Business;
use Carbon\Carbon;
class BookingStrategyTimeslot implements BookingStrategyInterface
{
public function makeReservation(User $issuer, Business $business, $data)
{
$data['business_id'] = $business->id;
$appointment = new Appointment($data);
return $appointment->save();
}
}
| Fix function parameters to comply interface | Fix function parameters to comply interface
| PHP | agpl-3.0 | alariva/timegrid,timegridio/timegrid,alariva/timegrid,alariva/timegrid,timegridio/timegrid,alariva/timegrid,timegridio/timegrid | php | ## Code Before:
<?php
namespace App;
use App\Appointment;
use App\Business;
use Carbon\Carbon;
class BookingStrategyTimeslot implements BookingStrategyInterface
{
public function makeReservation(Business $business, $data)
{
$data['business_id'] = $business->id;
$appointment = new Appointment($data);
return $appointment->save();
}
}
## Instruction:
Fix function parameters to comply interface
## Code After:
<?php
namespace App;
use App\Appointment;
use App\Business;
use Carbon\Carbon;
class BookingStrategyTimeslot implements BookingStrategyInterface
{
public function makeReservation(User $issuer, Business $business, $data)
{
$data['business_id'] = $business->id;
$appointment = new Appointment($data);
return $appointment->save();
}
}
| <?php
namespace App;
use App\Appointment;
use App\Business;
use Carbon\Carbon;
class BookingStrategyTimeslot implements BookingStrategyInterface
{
- public function makeReservation(Business $business, $data)
+ public function makeReservation(User $issuer, Business $business, $data)
? ++++++++++++++
{
$data['business_id'] = $business->id;
$appointment = new Appointment($data);
return $appointment->save();
}
} | 2 | 0.117647 | 1 | 1 |
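Each record's trailing numbers line up with the header columns `diff_length`, `relative_diff_length`, `n_lines_added`, `n_lines_deleted`. Assuming `diff_length` is added-plus-deleted lines and the relative figure divides that sum by the old file's line count (an inference from the records, not a documented formula), they can be recomputed roughly like this:

```python
import difflib

def diff_stats(old: str, new: str):
    """Recompute the per-record stats under the assumptions above:
    (diff_length, relative_diff_length, n_lines_added, n_lines_deleted)."""
    old_lines = old.splitlines()
    diff = list(difflib.ndiff(old_lines, new.splitlines()))
    added = sum(1 for line in diff if line.startswith("+ "))
    deleted = sum(1 for line in diff if line.startswith("- "))
    length = added + deleted
    return length, length / len(old_lines), added, deleted
```

The exact published figures also depend on blank lines in the original files, which this dump collapses, so the function only approximates them for these records.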
fc5274159d15d61c08ebf4cdfb1200b2f438b772 | .travis.yml | .travis.yml | sudo: required
services:
- docker
env:
- DOCKER_COMPOSE_VERSION=1.11.2
language: python
python:
- '3.6'
before_install:
# See https://github.com/travis-ci/travis-ci/issues/7940
- sudo rm -f /etc/boto.cfg
install:
- sudo docker --version
- sudo docker-compose --version
jobs:
include:
- stage: test
- script: ./helper ci test_suite
- script: ./helper ci test_backwards
if: type = pull_request
- script: ./helper ci style
notifications:
email: false
slack:
rooms: deptfunstuff:abJKvzApk5SKtcEyAgtswXAv
on_success: change
on_failure: change
stages:
- name: test
- name: develop deploy
if: branch = develop
- name: production deploy
if: branch = master
| sudo: required
services:
- docker
language: python
python:
- '3.6'
before_install:
# See https://github.com/travis-ci/travis-ci/issues/7940
- sudo rm -f /etc/boto.cfg
install:
- sudo docker --version
- sudo docker-compose --version
jobs:
include:
- stage: test
- script: ./helper ci test_suite
- script: ./helper ci test_backwards
if: type = pull_request
- script: ./helper ci style
notifications:
email: false
slack:
rooms: deptfunstuff:abJKvzApk5SKtcEyAgtswXAv
on_success: change
on_failure: change
stages:
- name: test
| Remove unused lines in Travis CI config file | Remove unused lines in Travis CI config file
| YAML | mit | uccser/cs4teachers,uccser/cs4teachers,uccser/cs4teachers,uccser/cs4teachers | yaml | ## Code Before:
sudo: required
services:
- docker
env:
- DOCKER_COMPOSE_VERSION=1.11.2
language: python
python:
- '3.6'
before_install:
# See https://github.com/travis-ci/travis-ci/issues/7940
- sudo rm -f /etc/boto.cfg
install:
- sudo docker --version
- sudo docker-compose --version
jobs:
include:
- stage: test
- script: ./helper ci test_suite
- script: ./helper ci test_backwards
if: type = pull_request
- script: ./helper ci style
notifications:
email: false
slack:
rooms: deptfunstuff:abJKvzApk5SKtcEyAgtswXAv
on_success: change
on_failure: change
stages:
- name: test
- name: develop deploy
if: branch = develop
- name: production deploy
if: branch = master
## Instruction:
Remove unused lines in Travis CI config file
## Code After:
sudo: required
services:
- docker
language: python
python:
- '3.6'
before_install:
# See https://github.com/travis-ci/travis-ci/issues/7940
- sudo rm -f /etc/boto.cfg
install:
- sudo docker --version
- sudo docker-compose --version
jobs:
include:
- stage: test
- script: ./helper ci test_suite
- script: ./helper ci test_backwards
if: type = pull_request
- script: ./helper ci style
notifications:
email: false
slack:
rooms: deptfunstuff:abJKvzApk5SKtcEyAgtswXAv
on_success: change
on_failure: change
stages:
- name: test
| sudo: required
services:
- docker
- env:
- - DOCKER_COMPOSE_VERSION=1.11.2
language: python
python:
- '3.6'
before_install:
# See https://github.com/travis-ci/travis-ci/issues/7940
- sudo rm -f /etc/boto.cfg
install:
- sudo docker --version
- sudo docker-compose --version
jobs:
include:
- stage: test
- script: ./helper ci test_suite
- script: ./helper ci test_backwards
if: type = pull_request
- script: ./helper ci style
notifications:
email: false
slack:
rooms: deptfunstuff:abJKvzApk5SKtcEyAgtswXAv
on_success: change
on_failure: change
stages:
- name: test
- - name: develop deploy
- if: branch = develop
- - name: production deploy
- if: branch = master | 6 | 0.181818 | 0 | 6 |
b40e492bd27e83350c2ca1982d5cca0e3ba21bc6 | config/initializers/panopticon_api_credentials.rb | config/initializers/panopticon_api_credentials.rb |
PANOPTICON_API_CREDENTIALS = {
bearer_token: "developmentapicredentials",
}
|
PANOPTICON_API_CREDENTIALS = {
bearer_token: ENV.fetch("PANOPTICON_BEARER_TOKEN", "developmentapicredentials"),
}
| Configure Panopticon API using env var | Configure Panopticon API using env var
| Ruby | mit | alphagov/whitehall,alphagov/whitehall,alphagov/whitehall,alphagov/whitehall | ruby | ## Code Before:
PANOPTICON_API_CREDENTIALS = {
bearer_token: "developmentapicredentials",
}
## Instruction:
Configure Panopticon API using env var
## Code After:
PANOPTICON_API_CREDENTIALS = {
bearer_token: ENV.fetch("PANOPTICON_BEARER_TOKEN", "developmentapicredentials"),
}
|
PANOPTICON_API_CREDENTIALS = {
- bearer_token: "developmentapicredentials",
+ bearer_token: ENV.fetch("PANOPTICON_BEARER_TOKEN", "developmentapicredentials"),
} | 2 | 0.5 | 1 | 1 |
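The change above swaps a hard-coded token for Ruby's `ENV.fetch` with a development fallback. The same lookup-with-default pattern in Python (key and default value copied from the record; the helper name and injectable `env` parameter are invented for the sketch):

```python
import os

def bearer_token(env=os.environ):
    # Like Ruby's ENV.fetch("PANOPTICON_BEARER_TOKEN", default):
    # prefer the configured value, fall back to the dev credential.
    return env.get("PANOPTICON_BEARER_TOKEN", "developmentapicredentials")
```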
4687878df61d13455f89f62c623f693765fb573b | .travis.yml | .travis.yml | language: node_js
node_js:
- "node"
cache:
yarn: true
directories:
- node_modules
install:
- yarn global add sequelize-cli
# - npm install sequelize-cli -g
- sequelize db:migrate
- sequelize db:seed --seed 20161204173948-default-admin
script:
- yarn test
# - npm test
| language: node_js
node_js:
- "node"
cache:
yarn: true
directories:
- node_modules
install:
# - yarn global add sequelize-cli
- npm install sequelize-cli
before_script:
- sequelize db:migrate
- sequelize db:seed --seed 20161204173948-default-admin
script:
# - yarn test
- npm test
| Move migration to before_script section. | Move migration to before_script section.
| YAML | mit | Calvin-Huang/LiveAPIExplore-Server,Calvin-Huang/LiveAPIExplore-Server | yaml | ## Code Before:
language: node_js
node_js:
- "node"
cache:
yarn: true
directories:
- node_modules
install:
- yarn global add sequelize-cli
# - npm install sequelize-cli -g
- sequelize db:migrate
- sequelize db:seed --seed 20161204173948-default-admin
script:
- yarn test
# - npm test
## Instruction:
Move migration to before_script section.
## Code After:
language: node_js
node_js:
- "node"
cache:
yarn: true
directories:
- node_modules
install:
# - yarn global add sequelize-cli
- npm install sequelize-cli
before_script:
- sequelize db:migrate
- sequelize db:seed --seed 20161204173948-default-admin
script:
# - yarn test
- npm test
| language: node_js
node_js:
- "node"
cache:
yarn: true
directories:
- node_modules
install:
- - yarn global add sequelize-cli
+ # - yarn global add sequelize-cli
? ++
- # - npm install sequelize-cli -g
? -- ---
+ - npm install sequelize-cli
+ before_script:
- sequelize db:migrate
- sequelize db:seed --seed 20161204173948-default-admin
script:
- - yarn test
+ # - yarn test
? ++
- # - npm test
? --
+ - npm test | 9 | 0.6 | 5 | 4 |
4f69a011c2373e47789b045ee2ccb9b5237676d3 | package.json | package.json | {
"name": "angular-client-side-auth",
"version": "0.0.1",
"dependencies": {
"express": "*",
"jade": "*",
"passport": "*",
"passport-local": "*",
"underscore": "*",
"validator": "~1.1.1"
},
"subdomain": "angular-client-side-auth",
"scripts": {
"start": "server.js"
},
"engines": {
"node": "0.8.x"
},
"devDependencies": {
"passport-twitter": "~0.1.4"
}
}
| {
"name": "angular-client-side-auth",
"version": "0.0.1",
"dependencies": {
"express": "*",
"jade": "*",
"passport": "*",
"passport-local": "*",
"underscore": "*",
"validator": "~1.1.1",
"passport-twitter": "~0.1.4",
"passport-facebook": "~0.1.5"
},
"subdomain": "angular-client-side-auth",
"scripts": {
"start": "server.js"
},
"engines": {
"node": "0.8.x"
}
}
| Add passport-facebook dependency, and move passport-twitter from devDependencies section. | Add passport-facebook dependency, and move passport-twitter from devDependencies section.
| JSON | mit | time4tigger/angular-client-side-auth,loiclacombe/angular-client-side-auth,mishavp2001/args,loiclacombe/angular-client-side-auth,fnakstad/angular-client-side-auth,time4tigger/angular-client-side-auth,fnakstad/angular-client-side-auth,loiclacombe/angular-client-side-auth,mishavp2001/args,time4tigger/angular-client-side-auth,jnber5/pool-hockey-lhplge,mishavp2001/args,eshopr/workshop,eshopr/workshop | json | ## Code Before:
{
"name": "angular-client-side-auth",
"version": "0.0.1",
"dependencies": {
"express": "*",
"jade": "*",
"passport": "*",
"passport-local": "*",
"underscore": "*",
"validator": "~1.1.1"
},
"subdomain": "angular-client-side-auth",
"scripts": {
"start": "server.js"
},
"engines": {
"node": "0.8.x"
},
"devDependencies": {
"passport-twitter": "~0.1.4"
}
}
## Instruction:
Add passport-facebook dependency, and move passport-twitter from devDependencies section.
## Code After:
{
"name": "angular-client-side-auth",
"version": "0.0.1",
"dependencies": {
"express": "*",
"jade": "*",
"passport": "*",
"passport-local": "*",
"underscore": "*",
"validator": "~1.1.1",
"passport-twitter": "~0.1.4",
"passport-facebook": "~0.1.5"
},
"subdomain": "angular-client-side-auth",
"scripts": {
"start": "server.js"
},
"engines": {
"node": "0.8.x"
}
}
| {
"name": "angular-client-side-auth",
"version": "0.0.1",
"dependencies": {
"express": "*",
"jade": "*",
"passport": "*",
"passport-local": "*",
"underscore": "*",
- "validator": "~1.1.1"
+ "validator": "~1.1.1",
? +
+ "passport-twitter": "~0.1.4",
+ "passport-facebook": "~0.1.5"
},
"subdomain": "angular-client-side-auth",
"scripts": {
"start": "server.js"
},
"engines": {
"node": "0.8.x"
- },
- "devDependencies": {
- "passport-twitter": "~0.1.4"
}
} | 7 | 0.318182 | 3 | 4 |
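The edit above moves `passport-twitter` out of `devDependencies` into `dependencies` and adds `passport-facebook`. The same manifest surgery can be scripted with the stdlib `json` module (toy manifest below, not the project's full file):

```python
import json

manifest = json.loads("""
{
  "dependencies": {"express": "*"},
  "devDependencies": {"passport-twitter": "~0.1.4"}
}
""")

# Promote passport-twitter to a runtime dependency, add passport-facebook,
# and drop the now-empty devDependencies section -- the shape of the commit.
manifest["dependencies"]["passport-twitter"] = manifest["devDependencies"].pop("passport-twitter")
manifest["dependencies"]["passport-facebook"] = "~0.1.5"
if not manifest["devDependencies"]:
    del manifest["devDependencies"]
```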
296c5d1999f820fb8e143c2d528e0a85bab33f69 | web/static/gm_pr.css | web/static/gm_pr.css | footer {
position: absolute;
bottom: 0;
left: 0;
width: 100%;
}
table {
width: 100%;
}
.label {
padding: 2px;
border-radius: 4px;
font-weight: bold;
font-size :12px;
text-transform: uppercase;
}
.feedback_ok {
color: green;
}
.feedback_ko {
color: red;
}
td.alignleft, th.alignleft {
text-align: left;
}
td.aligncenter, th.aligncenter {
text-align: center;
} | /*
Copyright 2015 Genymobile
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
footer {
position: absolute;
bottom: 0;
left: 0;
width: 100%;
}
table {
width: 100%;
}
.label {
padding: 2px;
border-radius: 4px;
font-weight: bold;
font-size :12px;
text-transform: uppercase;
}
.feedback_ok {
color: green;
}
.feedback_ko {
color: red;
}
td.alignleft, th.alignleft {
text-align: left;
}
td.aligncenter, th.aligncenter {
text-align: center;
} | Add copyright to CSS file | Add copyright to CSS file
| CSS | apache-2.0 | Genymobile/gm_pr,Genymobile/gm_pr | css | ## Code Before:
footer {
position: absolute;
bottom: 0;
left: 0;
width: 100%;
}
table {
width: 100%;
}
.label {
padding: 2px;
border-radius: 4px;
font-weight: bold;
font-size :12px;
text-transform: uppercase;
}
.feedback_ok {
color: green;
}
.feedback_ko {
color: red;
}
td.alignleft, th.alignleft {
text-align: left;
}
td.aligncenter, th.aligncenter {
text-align: center;
}
## Instruction:
Add copyright to CSS file
## Code After:
/*
Copyright 2015 Genymobile
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
footer {
position: absolute;
bottom: 0;
left: 0;
width: 100%;
}
table {
width: 100%;
}
.label {
padding: 2px;
border-radius: 4px;
font-weight: bold;
font-size :12px;
text-transform: uppercase;
}
.feedback_ok {
color: green;
}
.feedback_ko {
color: red;
}
td.alignleft, th.alignleft {
text-align: left;
}
td.aligncenter, th.aligncenter {
text-align: center;
} | + /*
+ Copyright 2015 Genymobile
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+ */
+
footer {
position: absolute;
bottom: 0;
left: 0;
width: 100%;
}
table {
width: 100%;
}
.label {
padding: 2px;
border-radius: 4px;
font-weight: bold;
font-size :12px;
text-transform: uppercase;
}
.feedback_ok {
color: green;
}
.feedback_ko {
color: red;
}
td.alignleft, th.alignleft {
text-align: left;
}
td.aligncenter, th.aligncenter {
text-align: center;
} | 16 | 0.571429 | 16 | 0 |
530966b8effadfa3e026118240ce21709fc4c843 | chrome/browser/sync/glue/chrome_report_unrecoverable_error.cc | chrome/browser/sync/glue/chrome_report_unrecoverable_error.cc | // Copyright (c) 2012 The Chromium Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#include "chrome/browser/sync/glue/chrome_report_unrecoverable_error.h"
#include "base/rand_util.h"
#include "build/build_config.h"
#if defined(OS_WIN)
#include <windows.h>
#endif
#include "chrome/common/chrome_constants.h"
namespace browser_sync {
void ChromeReportUnrecoverableError() {
// TODO(lipalani): Add this for other platforms as well.
#if defined(OS_WIN)
const double kErrorUploadRatio = 0.15;
// We only want to upload |kErrorUploadRatio| ratio of errors.
if (kErrorUploadRatio <= 0.0)
return; // We are not allowed to upload errors.
double random_number = base::RandDouble();
if (random_number > kErrorUploadRatio)
return;
// Get the breakpad pointer from chrome.exe
typedef void (__cdecl *DumpProcessFunction)();
DumpProcessFunction DumpProcess = reinterpret_cast<DumpProcessFunction>(
::GetProcAddress(::GetModuleHandle(
chrome::kBrowserProcessExecutableName),
"DumpProcessWithoutCrash"));
if (DumpProcess)
DumpProcess();
#endif // OS_WIN
}
} // namespace browser_sync
| // Copyright (c) 2012 The Chromium Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#include "chrome/browser/sync/glue/chrome_report_unrecoverable_error.h"
#include "base/rand_util.h"
#include "chrome/common/chrome_constants.h"
#include "chrome/common/chrome_version_info.h"
#include "chrome/common/dump_without_crashing.h"
namespace browser_sync {
void ChromeReportUnrecoverableError() {
// Only upload on canary/dev builds to avoid overwhelming crash server.
chrome::VersionInfo::Channel channel = chrome::VersionInfo::GetChannel();
if (channel != chrome::VersionInfo::CHANNEL_CANARY &&
channel != chrome::VersionInfo::CHANNEL_DEV) {
return;
}
// We only want to upload |kErrorUploadRatio| ratio of errors.
const double kErrorUploadRatio = 0.15;
if (kErrorUploadRatio <= 0.0)
return; // We are not allowed to upload errors.
double random_number = base::RandDouble();
if (random_number > kErrorUploadRatio)
return;
logging::DumpWithoutCrashing();
}
} // namespace browser_sync
| Support unrecoverable error uploading on non-win platforms | [Sync] Support unrecoverable error uploading on non-win platforms
We now use chrome/common/dump_without_crashing.h, and only upload on canary/
dev builds.
BUG=127429
Review URL: https://codereview.chromium.org/80963003
git-svn-id: de016e52bd170d2d4f2344f9bf92d50478b649e0@236623 0039d316-1c4b-4281-b951-d872f2087c98
| C++ | bsd-3-clause | anirudhSK/chromium,hgl888/chromium-crosswalk-efl,patrickm/chromium.src,TheTypoMaster/chromium-crosswalk,bright-sparks/chromium-spacewalk,patrickm/chromium.src,ltilve/chromium,dushu1203/chromium.src,bright-sparks/chromium-spacewalk,jaruba/chromium.src,bright-sparks/chromium-spacewalk,crosswalk-project/chromium-crosswalk-efl,ltilve/chromium,TheTypoMaster/chromium-crosswalk,ChromiumWebApps/chromium,ltilve/chromium,jaruba/chromium.src,anirudhSK/chromium,hgl888/chromium-crosswalk-efl,anirudhSK/chromium,patrickm/chromium.src,ondra-novak/chromium.src,mohamed--abdel-maksoud/chromium.src,Chilledheart/chromium,markYoungH/chromium.src,dushu1203/chromium.src,hgl888/chromium-crosswalk,Fireblend/chromium-crosswalk,mohamed--abdel-maksoud/chromium.src,Just-D/chromium-1,hgl888/chromium-crosswalk-efl,krieger-od/nwjs_chromium.src,M4sse/chromium.src,hgl888/chromium-crosswalk-efl,hgl888/chromium-crosswalk,ChromiumWebApps/chromium,mohamed--abdel-maksoud/chromium.src,crosswalk-project/chromium-crosswalk-efl,ChromiumWebApps/chromium,crosswalk-project/chromium-crosswalk-efl,ChromiumWebApps/chromium,Chilledheart/chromium,TheTypoMaster/chromium-crosswalk,crosswalk-project/chromium-crosswalk-efl,dushu1203/chromium.src,dushu1203/chromium.src,littlstar/chromium.src,fujunwei/chromium-crosswalk,markYoungH/chromium.src,ChromiumWebApps/chromium,dednal/chromium.src,bright-sparks/chromium-spacewalk,axinging/chromium-crosswalk,chuan9/chromium-crosswalk,axinging/chromium-crosswalk,littlstar/chromium.src,PeterWangIntel/chromium-crosswalk,markYoungH/chromium.src,markYoungH/chromium.src,dushu1203/chromium.src,patrickm/chromium.src,patrickm/chromium.src,bright-sparks/chromium-spacewalk,chuan9/chromium-crosswalk,fujunwei/chromium-crosswalk,mohamed--abdel-maksoud/chromium.src,TheTypoMaster/chromium-crosswalk,PeterWangIntel/chromium-crosswalk,Fireblend/chromium-crosswalk,markYoungH/chromium.src,jaruba/chromium.src,markYoungH/chromium.src,hgl888/chromium-crosswalk,TheTypoMaster/chromium-c
rosswalk,Jonekee/chromium.src,hgl888/chromium-crosswalk-efl,Chilledheart/chromium,anirudhSK/chromium,M4sse/chromium.src,dednal/chromium.src,axinging/chromium-crosswalk,mohamed--abdel-maksoud/chromium.src,hgl888/chromium-crosswalk,axinging/chromium-crosswalk,Jonekee/chromium.src,PeterWangIntel/chromium-crosswalk,dednal/chromium.src,Just-D/chromium-1,hgl888/chromium-crosswalk,dushu1203/chromium.src,ltilve/chromium,mohamed--abdel-maksoud/chromium.src,jaruba/chromium.src,Pluto-tv/chromium-crosswalk,jaruba/chromium.src,Jonekee/chromium.src,chuan9/chromium-crosswalk,Jonekee/chromium.src,dednal/chromium.src,hgl888/chromium-crosswalk,M4sse/chromium.src,ondra-novak/chromium.src,hgl888/chromium-crosswalk-efl,ondra-novak/chromium.src,PeterWangIntel/chromium-crosswalk,ltilve/chromium,krieger-od/nwjs_chromium.src,krieger-od/nwjs_chromium.src,littlstar/chromium.src,Chilledheart/chromium,ondra-novak/chromium.src,crosswalk-project/chromium-crosswalk-efl,hgl888/chromium-crosswalk-efl,ChromiumWebApps/chromium,PeterWangIntel/chromium-crosswalk,Pluto-tv/chromium-crosswalk,Pluto-tv/chromium-crosswalk,fujunwei/chromium-crosswalk,ChromiumWebApps/chromium,jaruba/chromium.src,Chilledheart/chromium,Just-D/chromium-1,chuan9/chromium-crosswalk,fujunwei/chromium-crosswalk,Just-D/chromium-1,axinging/chromium-crosswalk,Jonekee/chromium.src,Chilledheart/chromium,Fireblend/chromium-crosswalk,jaruba/chromium.src,M4sse/chromium.src,jaruba/chromium.src,anirudhSK/chromium,patrickm/chromium.src,ChromiumWebApps/chromium,ondra-novak/chromium.src,fujunwei/chromium-crosswalk,hgl888/chromium-crosswalk-efl,Just-D/chromium-1,Jonekee/chromium.src,anirudhSK/chromium,TheTypoMaster/chromium-crosswalk,M4sse/chromium.src,krieger-od/nwjs_chromium.src,krieger-od/nwjs_chromium.src,markYoungH/chromium.src,ltilve/chromium,Just-D/chromium-1,littlstar/chromium.src,chuan9/chromium-crosswalk,M4sse/chromium.src,dednal/chromium.src,ChromiumWebApps/chromium,markYoungH/chromium.src,crosswalk-project/chromium-crosswalk-efl,Joneke
e/chromium.src,Pluto-tv/chromium-crosswalk,patrickm/chromium.src,hgl888/chromium-crosswalk,Just-D/chromium-1,ltilve/chromium,M4sse/chromium.src,markYoungH/chromium.src,fujunwei/chromium-crosswalk,dushu1203/chromium.src,patrickm/chromium.src,Fireblend/chromium-crosswalk,PeterWangIntel/chromium-crosswalk,mohamed--abdel-maksoud/chromium.src,Chilledheart/chromium,bright-sparks/chromium-spacewalk,axinging/chromium-crosswalk,chuan9/chromium-crosswalk,PeterWangIntel/chromium-crosswalk,ChromiumWebApps/chromium,M4sse/chromium.src,dednal/chromium.src,Just-D/chromium-1,anirudhSK/chromium,chuan9/chromium-crosswalk,crosswalk-project/chromium-crosswalk-efl,hgl888/chromium-crosswalk,mohamed--abdel-maksoud/chromium.src,Jonekee/chromium.src,Fireblend/chromium-crosswalk,Pluto-tv/chromium-crosswalk,littlstar/chromium.src,Fireblend/chromium-crosswalk,hgl888/chromium-crosswalk,Jonekee/chromium.src,littlstar/chromium.src,jaruba/chromium.src,krieger-od/nwjs_chromium.src,anirudhSK/chromium,ltilve/chromium,dednal/chromium.src,axinging/chromium-crosswalk,M4sse/chromium.src,TheTypoMaster/chromium-crosswalk,chuan9/chromium-crosswalk,mohamed--abdel-maksoud/chromium.src,ondra-novak/chromium.src,hgl888/chromium-crosswalk-efl,Jonekee/chromium.src,jaruba/chromium.src,crosswalk-project/chromium-crosswalk-efl,mohamed--abdel-maksoud/chromium.src,anirudhSK/chromium,bright-sparks/chromium-spacewalk,patrickm/chromium.src,krieger-od/nwjs_chromium.src,ltilve/chromium,Fireblend/chromium-crosswalk,dushu1203/chromium.src,bright-sparks/chromium-spacewalk,TheTypoMaster/chromium-crosswalk,Fireblend/chromium-crosswalk,axinging/chromium-crosswalk,Pluto-tv/chromium-crosswalk,Pluto-tv/chromium-crosswalk,ChromiumWebApps/chromium,axinging/chromium-crosswalk,dushu1203/chromium.src,ondra-novak/chromium.src,axinging/chromium-crosswalk,littlstar/chromium.src,krieger-od/nwjs_chromium.src,Chilledheart/chromium,Fireblend/chromium-crosswalk,jaruba/chromium.src,anirudhSK/chromium,Chilledheart/chromium,dednal/chromium.src,dedna
l/chromium.src,markYoungH/chromium.src,bright-sparks/chromium-spacewalk,anirudhSK/chromium,krieger-od/nwjs_chromium.src,krieger-od/nwjs_chromium.src,dushu1203/chromium.src,M4sse/chromium.src,mohamed--abdel-maksoud/chromium.src,crosswalk-project/chromium-crosswalk-efl,ondra-novak/chromium.src,dushu1203/chromium.src,Jonekee/chromium.src,axinging/chromium-crosswalk,fujunwei/chromium-crosswalk,littlstar/chromium.src,fujunwei/chromium-crosswalk,markYoungH/chromium.src,TheTypoMaster/chromium-crosswalk,krieger-od/nwjs_chromium.src,ondra-novak/chromium.src,Pluto-tv/chromium-crosswalk,anirudhSK/chromium,Pluto-tv/chromium-crosswalk,PeterWangIntel/chromium-crosswalk,chuan9/chromium-crosswalk,Just-D/chromium-1,PeterWangIntel/chromium-crosswalk,fujunwei/chromium-crosswalk,dednal/chromium.src,M4sse/chromium.src,ChromiumWebApps/chromium,dednal/chromium.src,hgl888/chromium-crosswalk-efl | c++ | ## Code Before:
// Copyright (c) 2012 The Chromium Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#include "chrome/browser/sync/glue/chrome_report_unrecoverable_error.h"
#include "base/rand_util.h"
#include "build/build_config.h"
#if defined(OS_WIN)
#include <windows.h>
#endif
#include "chrome/common/chrome_constants.h"
namespace browser_sync {
void ChromeReportUnrecoverableError() {
// TODO(lipalani): Add this for other platforms as well.
#if defined(OS_WIN)
const double kErrorUploadRatio = 0.15;
// We only want to upload |kErrorUploadRatio| ratio of errors.
if (kErrorUploadRatio <= 0.0)
return; // We are not allowed to upload errors.
double random_number = base::RandDouble();
if (random_number > kErrorUploadRatio)
return;
// Get the breakpad pointer from chrome.exe
typedef void (__cdecl *DumpProcessFunction)();
DumpProcessFunction DumpProcess = reinterpret_cast<DumpProcessFunction>(
::GetProcAddress(::GetModuleHandle(
chrome::kBrowserProcessExecutableName),
"DumpProcessWithoutCrash"));
if (DumpProcess)
DumpProcess();
#endif // OS_WIN
}
} // namespace browser_sync
## Instruction:
[Sync] Support unrecoverable error uploading on non-win platforms
We now use chrome/common/dump_without_crashing.h, and only upload on canary/
dev builds.
BUG=127429
Review URL: https://codereview.chromium.org/80963003
git-svn-id: de016e52bd170d2d4f2344f9bf92d50478b649e0@236623 0039d316-1c4b-4281-b951-d872f2087c98
## Code After:
// Copyright (c) 2012 The Chromium Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#include "chrome/browser/sync/glue/chrome_report_unrecoverable_error.h"
#include "base/rand_util.h"
#include "chrome/common/chrome_constants.h"
#include "chrome/common/chrome_version_info.h"
#include "chrome/common/dump_without_crashing.h"
namespace browser_sync {
void ChromeReportUnrecoverableError() {
// Only upload on canary/dev builds to avoid overwhelming crash server.
chrome::VersionInfo::Channel channel = chrome::VersionInfo::GetChannel();
if (channel != chrome::VersionInfo::CHANNEL_CANARY &&
channel != chrome::VersionInfo::CHANNEL_DEV) {
return;
}
// We only want to upload |kErrorUploadRatio| ratio of errors.
const double kErrorUploadRatio = 0.15;
if (kErrorUploadRatio <= 0.0)
return; // We are not allowed to upload errors.
double random_number = base::RandDouble();
if (random_number > kErrorUploadRatio)
return;
logging::DumpWithoutCrashing();
}
} // namespace browser_sync
| // Copyright (c) 2012 The Chromium Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#include "chrome/browser/sync/glue/chrome_report_unrecoverable_error.h"
#include "base/rand_util.h"
- #include "build/build_config.h"
-
- #if defined(OS_WIN)
- #include <windows.h>
- #endif
-
#include "chrome/common/chrome_constants.h"
+ #include "chrome/common/chrome_version_info.h"
+ #include "chrome/common/dump_without_crashing.h"
namespace browser_sync {
void ChromeReportUnrecoverableError() {
- // TODO(lipalani): Add this for other platforms as well.
- #if defined(OS_WIN)
- const double kErrorUploadRatio = 0.15;
+ // Only upload on canary/dev builds to avoid overwhelming crash server.
+ chrome::VersionInfo::Channel channel = chrome::VersionInfo::GetChannel();
+ if (channel != chrome::VersionInfo::CHANNEL_CANARY &&
+ channel != chrome::VersionInfo::CHANNEL_DEV) {
+ return;
+ }
// We only want to upload |kErrorUploadRatio| ratio of errors.
+ const double kErrorUploadRatio = 0.15;
if (kErrorUploadRatio <= 0.0)
return; // We are not allowed to upload errors.
double random_number = base::RandDouble();
if (random_number > kErrorUploadRatio)
return;
+ logging::DumpWithoutCrashing();
- // Get the breakpad pointer from chrome.exe
- typedef void (__cdecl *DumpProcessFunction)();
- DumpProcessFunction DumpProcess = reinterpret_cast<DumpProcessFunction>(
- ::GetProcAddress(::GetModuleHandle(
- chrome::kBrowserProcessExecutableName),
- "DumpProcessWithoutCrash"));
- if (DumpProcess)
- DumpProcess();
- #endif // OS_WIN
-
}
} // namespace browser_sync | 29 | 0.690476 | 10 | 19 |
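The rewritten C++ keeps two gates: a channel check (canary/dev only, to avoid overwhelming the crash server) and a sampling check that uploads roughly `kErrorUploadRatio` of the errors. The same logic sketched in Python, with the RNG injectable so the behaviour is testable (function and channel names are illustrative, not Chromium APIs):

```python
import random

ERROR_UPLOAD_RATIO = 0.15  # mirrors kErrorUploadRatio in the record

def should_upload(channel: str, rng=random.random) -> bool:
    # Only canary/dev builds report, matching the CHANNEL_CANARY/
    # CHANNEL_DEV check in the C++.
    if channel not in ("canary", "dev"):
        return False
    # Drop the report when the draw exceeds the ratio, as the C++ does
    # with `if (random_number > kErrorUploadRatio) return;`.
    return rng() <= ERROR_UPLOAD_RATIO
```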
4c04979de66cf5d0858ff00002ef40df196ccd05 | serfnode/build/handler/handler.py | serfnode/build/handler/handler.py | import os
from serf_master import SerfHandlerProxy
from base_handler import BaseHandler
try:
from my_handler import MyHandler
except ImportError:
print "Could not import user's handler."
print "Defaulting to dummy handler."
MyHandler = BaseHandler
if __name__ == '__main__':
handler = SerfHandlerProxy()
role = os.environ.get('ROLE') or 'no_role'
handler.register(role, MyHandler())
handler.run() | import os
from serf_master import SerfHandlerProxy
from base_handler import BaseHandler
try:
from my_handler import MyHandler
except ImportError:
MyHandler = BaseHandler
if __name__ == '__main__':
handler = SerfHandlerProxy()
role = os.environ.get('ROLE') or 'no_role'
handler.register(role, MyHandler())
handler.run() | Remove prints that interfere with json output | Remove prints that interfere with json output | Python | mit | waltermoreira/serfnode,waltermoreira/serfnode,waltermoreira/serfnode | python | ## Code Before:
import os
from serf_master import SerfHandlerProxy
from base_handler import BaseHandler
try:
from my_handler import MyHandler
except ImportError:
print "Could not import user's handler."
print "Defaulting to dummy handler."
MyHandler = BaseHandler
if __name__ == '__main__':
handler = SerfHandlerProxy()
role = os.environ.get('ROLE') or 'no_role'
handler.register(role, MyHandler())
handler.run()
## Instruction:
Remove prints that interfere with json output
## Code After:
import os
from serf_master import SerfHandlerProxy
from base_handler import BaseHandler
try:
from my_handler import MyHandler
except ImportError:
MyHandler = BaseHandler
if __name__ == '__main__':
handler = SerfHandlerProxy()
role = os.environ.get('ROLE') or 'no_role'
handler.register(role, MyHandler())
handler.run() | import os
from serf_master import SerfHandlerProxy
from base_handler import BaseHandler
try:
from my_handler import MyHandler
except ImportError:
- print "Could not import user's handler."
- print "Defaulting to dummy handler."
MyHandler = BaseHandler
if __name__ == '__main__':
handler = SerfHandlerProxy()
role = os.environ.get('ROLE') or 'no_role'
handler.register(role, MyHandler())
handler.run() | 2 | 0.117647 | 0 | 2 |
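The handler above falls back to `BaseHandler` when the user's module is missing, and the commit removes the `print` calls because they interfered with the JSON output. That try/except-ImportError fallback in isolation (assuming no module named `my_handler` is importable here):

```python
class BaseHandler:
    """Stand-in default handler used when no user handler is provided."""

def load_handler():
    try:
        # my_handler is expected to be supplied by the user and will
        # usually be absent in this sketch.
        from my_handler import MyHandler
    except ImportError:
        # Fall back silently -- printing a notice here would corrupt
        # machine-readable (JSON) output, which is why the commit
        # removed the print statements.
        MyHandler = BaseHandler
    return MyHandler
```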
aac11aae27b02c3f27f96585cc4b415f995559ff | .travis.yml | .travis.yml | language: objective-c
osx_image: xcode7.3
before_install:
- gem install xcpretty -N --no-ri --no-rdoc
script:
- set -o pipefail
- xcodebuild clean test -workspace Example/AirRivet.xcworkspace -scheme AirRivet-Example -destination "platform=iOS Simulator,name=iPhone 6 Plus" -enableCodeCoverage YES | xcpretty
deploy:
skip_cleanup: true
provider: script
script: ./scripts/deploy.sh
| language: objective-c
osx_image: xcode7.3
before_install:
- gem install xcpretty -N --no-ri --no-rdoc
script:
- set -o pipefail
- xcodebuild clean test -workspace Example/AirRivet.xcworkspace -scheme AirRivet-Example -destination "platform=iOS Simulator,name=iPhone 6 Plus" -enableCodeCoverage YES | xcpretty
deploy:
skip_cleanup: true
provider: script
script: ./scripts/deploy.sh
on:
tags: true
| Add the on tag command. | Add the on tag command.
| YAML | mit | icapps/swiftGenericWebService,icapps/ios-air-rivet,icapps/ios-faro,icapps/ios-faro,icapps/ios-faro,icapps/ios-faro,icapps/ios-air-rivet,icapps/ios-air-rivet,icapps/ios-air-rivet,icapps/ios-air-rivet | yaml | ## Code Before:
language: objective-c
osx_image: xcode7.3
before_install:
- gem install xcpretty -N --no-ri --no-rdoc
script:
- set -o pipefail
- xcodebuild clean test -workspace Example/AirRivet.xcworkspace -scheme AirRivet-Example -destination "platform=iOS Simulator,name=iPhone 6 Plus" -enableCodeCoverage YES | xcpretty
deploy:
skip_cleanup: true
provider: script
script: ./scripts/deploy.sh
## Instruction:
Add the on tag command.
## Code After:
language: objective-c
osx_image: xcode7.3
before_install:
- gem install xcpretty -N --no-ri --no-rdoc
script:
- set -o pipefail
- xcodebuild clean test -workspace Example/AirRivet.xcworkspace -scheme AirRivet-Example -destination "platform=iOS Simulator,name=iPhone 6 Plus" -enableCodeCoverage YES | xcpretty
deploy:
skip_cleanup: true
provider: script
script: ./scripts/deploy.sh
on:
tags: true
| language: objective-c
osx_image: xcode7.3
before_install:
- gem install xcpretty -N --no-ri --no-rdoc
script:
- set -o pipefail
- xcodebuild clean test -workspace Example/AirRivet.xcworkspace -scheme AirRivet-Example -destination "platform=iOS Simulator,name=iPhone 6 Plus" -enableCodeCoverage YES | xcpretty
deploy:
skip_cleanup: true
provider: script
script: ./scripts/deploy.sh
+ on:
+ tags: true | 2 | 0.142857 | 2 | 0 |
cf60f5feee54a92af54a66605176d061d3051c58 | spec/fabricators/upload_fabricator.rb | spec/fabricators/upload_fabricator.rb |
Fabricator(:upload) do
user
sha1 "e9d71f5ee7c92d6dc9e92ffdad17b8bd49418f98"
original_filename "logo.png"
filesize 1234
width 100
height 200
url "/uploads/default/1/1234567890123456.png"
end
Fabricator(:subfolder_upload, from: :upload) do
user
sha1 "e9d71f5ee7c92d6dc9e92ffdad17b8bd49418f98"
original_filename "logo.png"
filesize 1234
width 100
height 200
url "/uploads/default/1/1234567890123456.png"
end
Fabricator(:attachment, from: :upload) do
id 42
user
original_filename "archive.zip"
filesize 1234
url "/uploads/default/42/66b3ed1503efc936.zip"
end
| Fabricator(:upload) do
user
sha1 "e9d71f5ee7c92d6dc9e92ffdad17b8bd49418f98"
original_filename "logo.png"
filesize 1234
width 100
height 200
url "/uploads/default/1/1234567890123456.png"
end
Fabricator(:attachment, from: :upload) do
id 42
user
original_filename "archive.zip"
filesize 1234
url "/uploads/default/42/66b3ed1503efc936.zip"
end
| Revert "Adds upload fabricator for subfolder image upload" | Revert "Adds upload fabricator for subfolder image upload"
This reverts commit b619bd2782d6d24f0cd93e22c50a74f8a27b8fda.
| Ruby | mit | natefinch/discourse,natefinch/discourse | ruby | ## Code Before:
Fabricator(:upload) do
user
sha1 "e9d71f5ee7c92d6dc9e92ffdad17b8bd49418f98"
original_filename "logo.png"
filesize 1234
width 100
height 200
url "/uploads/default/1/1234567890123456.png"
end
Fabricator(:subfolder_upload, from: :upload) do
user
sha1 "e9d71f5ee7c92d6dc9e92ffdad17b8bd49418f98"
original_filename "logo.png"
filesize 1234
width 100
height 200
url "/uploads/default/1/1234567890123456.png"
end
Fabricator(:attachment, from: :upload) do
id 42
user
original_filename "archive.zip"
filesize 1234
url "/uploads/default/42/66b3ed1503efc936.zip"
end
## Instruction:
Revert "Adds upload fabricator for subfolder image upload"
This reverts commit b619bd2782d6d24f0cd93e22c50a74f8a27b8fda.
## Code After:
Fabricator(:upload) do
user
sha1 "e9d71f5ee7c92d6dc9e92ffdad17b8bd49418f98"
original_filename "logo.png"
filesize 1234
width 100
height 200
url "/uploads/default/1/1234567890123456.png"
end
Fabricator(:attachment, from: :upload) do
id 42
user
original_filename "archive.zip"
filesize 1234
url "/uploads/default/42/66b3ed1503efc936.zip"
end
| Fabricator(:upload) do
- user
- sha1 "e9d71f5ee7c92d6dc9e92ffdad17b8bd49418f98"
- original_filename "logo.png"
- filesize 1234
- width 100
- height 200
- url "/uploads/default/1/1234567890123456.png"
- end
-
- Fabricator(:subfolder_upload, from: :upload) do
user
sha1 "e9d71f5ee7c92d6dc9e92ffdad17b8bd49418f98"
original_filename "logo.png"
filesize 1234
width 100
height 200
url "/uploads/default/1/1234567890123456.png"
end
Fabricator(:attachment, from: :upload) do
id 42
user
original_filename "archive.zip"
filesize 1234
url "/uploads/default/42/66b3ed1503efc936.zip"
end | 10 | 0.37037 | 0 | 10 |
dd9f11f36668717ee349b357b2f32a7a52e38863 | pagerduty_events_api/pagerduty_incident.py | pagerduty_events_api/pagerduty_incident.py |
from pagerduty_events_api.pagerduty_rest_client import PagerdutyRestClient
class PagerdutyIncident:
def __init__(self, service_key, incident_key):
self.__service_key = service_key
self.__incident_key = incident_key
def get_service_key(self):
return self.__service_key
def get_incident_key(self):
return self.__incident_key
def acknowledge(self):
payload = {'service_key': self.__service_key,
'event_type': 'acknowledge',
'incident_key': self.__incident_key}
PagerdutyRestClient().post(payload)
def resolve(self):
payload = {'service_key': self.__service_key,
'event_type': 'resolve',
'incident_key': self.__incident_key}
PagerdutyRestClient().post(payload)
| from pagerduty_events_api.pagerduty_rest_client import PagerdutyRestClient
class PagerdutyIncident:
def __init__(self, service_key, incident_key):
self.__service_key = service_key
self.__incident_key = incident_key
def get_service_key(self):
return self.__service_key
def get_incident_key(self):
return self.__incident_key
def acknowledge(self):
self.__send_request_with_event_type('acknowledge')
def resolve(self):
self.__send_request_with_event_type('resolve')
def __send_request_with_event_type(self, event_type):
payload = {'service_key': self.__service_key,
'event_type': event_type,
'incident_key': self.__incident_key}
PagerdutyRestClient().post(payload)
| Remove code duplication from PD Incident class. | Remove code duplication from PD Incident class.
| Python | mit | BlasiusVonSzerencsi/pagerduty-events-api | python | ## Code Before:
from pagerduty_events_api.pagerduty_rest_client import PagerdutyRestClient
class PagerdutyIncident:
def __init__(self, service_key, incident_key):
self.__service_key = service_key
self.__incident_key = incident_key
def get_service_key(self):
return self.__service_key
def get_incident_key(self):
return self.__incident_key
def acknowledge(self):
payload = {'service_key': self.__service_key,
'event_type': 'acknowledge',
'incident_key': self.__incident_key}
PagerdutyRestClient().post(payload)
def resolve(self):
payload = {'service_key': self.__service_key,
'event_type': 'resolve',
'incident_key': self.__incident_key}
PagerdutyRestClient().post(payload)
## Instruction:
Remove code duplication from PD Incident class.
## Code After:
from pagerduty_events_api.pagerduty_rest_client import PagerdutyRestClient
class PagerdutyIncident:
def __init__(self, service_key, incident_key):
self.__service_key = service_key
self.__incident_key = incident_key
def get_service_key(self):
return self.__service_key
def get_incident_key(self):
return self.__incident_key
def acknowledge(self):
self.__send_request_with_event_type('acknowledge')
def resolve(self):
self.__send_request_with_event_type('resolve')
def __send_request_with_event_type(self, event_type):
payload = {'service_key': self.__service_key,
'event_type': event_type,
'incident_key': self.__incident_key}
PagerdutyRestClient().post(payload)
| from pagerduty_events_api.pagerduty_rest_client import PagerdutyRestClient
class PagerdutyIncident:
def __init__(self, service_key, incident_key):
self.__service_key = service_key
self.__incident_key = incident_key
def get_service_key(self):
return self.__service_key
def get_incident_key(self):
return self.__incident_key
def acknowledge(self):
+ self.__send_request_with_event_type('acknowledge')
+
+ def resolve(self):
+ self.__send_request_with_event_type('resolve')
+
+ def __send_request_with_event_type(self, event_type):
payload = {'service_key': self.__service_key,
- 'event_type': 'acknowledge',
? ^^^^ ^^^ ----
+ 'event_type': event_type,
? ^^^ ^^^^^
'incident_key': self.__incident_key}
PagerdutyRestClient().post(payload)
-
- def resolve(self):
- payload = {'service_key': self.__service_key,
- 'event_type': 'resolve',
- 'incident_key': self.__incident_key}
-
- PagerdutyRestClient().post(payload) | 15 | 0.555556 | 7 | 8 |
a2906df7672ca06e622294cc65cd96b2ddba0373 | lib/redch/gateway.rb | lib/redch/gateway.rb |
require 'redch/producer'
module Redch
class Gateway
def initialize(api_key, interval)
@routing_key = "sensor.samples"
@api_key = api_key
@interval = interval
end
def send_samples(&generate_value)
begin
AMQP.start("amqp://guest:guest@dev.rabbitmq.com") do |connection, open_ok|
# Connect to a channel with a direct exchange
producer = Producer.new(AMQP::Channel.new(connection), AMQP::Exchange.default)
puts "Publishing sensor data..."
# Output a sample every interval indefinitely
EventMachine.add_periodic_timer(@interval) do
# Call the proc to generate a new sample and build a message
sample = generate_value.call
message = { api_key: @api_key, measurement: "#{'%.3f' % sample} kWh" }
# Publish the message and make it route by the key 'routing_key'
producer.publish(message, routing_key: @routing_key)
end
end
rescue Interrupt
raise Interrupt
end
end
end
end | require 'redch/producer'
module Redch
class Gateway
def initialize(api_key, interval)
@routing_key = "sensor.samples"
@api_key = api_key
@interval = interval
end
def send_samples(&generate_value)
begin
AMQP.start(settings) do |connection, open_ok|
# Connect to a channel with a direct exchange
producer = Producer.new(AMQP::Channel.new(connection), AMQP::Exchange.default)
puts "Publishing sensor data..."
# Output a sample every interval indefinitely
EventMachine.add_periodic_timer(@interval) do
# Call the proc to generate a new sample and build a message
sample = generate_value.call
message = { api_key: @api_key, measurement: "#{'%.3f' % sample} kWh" }
# Publish the message and make it route by the key 'routing_key'
producer.publish(message, routing_key: @routing_key)
end
end
rescue Interrupt
raise Interrupt
end
end
def default_settings
raise ArgumentError, "Environmental variable AMQP_URL must exist" if ENV["AMQP_URL"].nil?
URI.encode(ENV["AMQP_URL"])
end
def settings
@settings ||= default_settings
end
end
end | Handle AMQP url with an ENV | Handle AMQP url with an ENV
| Ruby | apache-2.0 | sauloperez/redch,sauloperez/redch | ruby | ## Code Before:
require 'redch/producer'
module Redch
class Gateway
def initialize(api_key, interval)
@routing_key = "sensor.samples"
@api_key = api_key
@interval = interval
end
def send_samples(&generate_value)
begin
AMQP.start("amqp://guest:guest@dev.rabbitmq.com") do |connection, open_ok|
# Connect to a channel with a direct exchange
producer = Producer.new(AMQP::Channel.new(connection), AMQP::Exchange.default)
puts "Publishing sensor data..."
# Output a sample every interval indefinitely
EventMachine.add_periodic_timer(@interval) do
# Call the proc to generate a new sample and build a message
sample = generate_value.call
message = { api_key: @api_key, measurement: "#{'%.3f' % sample} kWh" }
# Publish the message and make it route by the key 'routing_key'
producer.publish(message, routing_key: @routing_key)
end
end
rescue Interrupt
raise Interrupt
end
end
end
end
## Instruction:
Handle the AMQP URL with an ENV variable
## Code After:
require 'redch/producer'
module Redch
class Gateway
def initialize(api_key, interval)
@routing_key = "sensor.samples"
@api_key = api_key
@interval = interval
end
def send_samples(&generate_value)
begin
AMQP.start(settings) do |connection, open_ok|
# Connect to a channel with a direct exchange
producer = Producer.new(AMQP::Channel.new(connection), AMQP::Exchange.default)
puts "Publishing sensor data..."
# Output a sample every interval indefinitely
EventMachine.add_periodic_timer(@interval) do
# Call the proc to generate a new sample and build a message
sample = generate_value.call
message = { api_key: @api_key, measurement: "#{'%.3f' % sample} kWh" }
# Publish the message and make it route by the key 'routing_key'
producer.publish(message, routing_key: @routing_key)
end
end
rescue Interrupt
raise Interrupt
end
end
def default_settings
raise ArgumentError, "Environmental variable AMQP_URL must exist" if ENV["AMQP_URL"].nil?
URI.encode(ENV["AMQP_URL"])
end
def settings
@settings ||= default_settings
end
end
end | require 'redch/producer'
module Redch
class Gateway
def initialize(api_key, interval)
@routing_key = "sensor.samples"
@api_key = api_key
@interval = interval
end
def send_samples(&generate_value)
begin
- AMQP.start("amqp://guest:guest@dev.rabbitmq.com") do |connection, open_ok|
+ AMQP.start(settings) do |connection, open_ok|
# Connect to a channel with a direct exchange
producer = Producer.new(AMQP::Channel.new(connection), AMQP::Exchange.default)
puts "Publishing sensor data..."
# Output a sample every interval indefinitely
EventMachine.add_periodic_timer(@interval) do
# Call the proc to generate a new sample and build a message
sample = generate_value.call
message = { api_key: @api_key, measurement: "#{'%.3f' % sample} kWh" }
# Publish the message and make it route by the key 'routing_key'
producer.publish(message, routing_key: @routing_key)
end
end
rescue Interrupt
raise Interrupt
end
end
+
+ def default_settings
+ raise ArgumentError, "Environmental variable AMQP_URL must exist" if ENV["AMQP_URL"].nil?
+ URI.encode(ENV["AMQP_URL"])
+ end
+
+ def settings
+ @settings ||= default_settings
+ end
end
end | 11 | 0.297297 | 10 | 1 |
83420a2ae3ceea6187b9f8fda642c01d039ac042 | spec/scanny/checks/input_filtering_check_spec.rb | spec/scanny/checks/input_filtering_check_spec.rb | require "spec_helper"
module Scanny::Checks
describe InputFilteringCheck do
before do
@runner = Scanny::Runner.new(InputFilteringCheck.new)
@message = "Possible injection vulnerabilities"
@issue = issue(:low, @message, 20)
end
it "reports \"logger(params[:password])\" correctly" do
@runner.should check("params[:password]").with_issue(@issue)
end
end
end
| require "spec_helper"
module Scanny::Checks
describe InputFilteringCheck do
before do
@runner = Scanny::Runner.new(InputFilteringCheck.new)
@message = "Possible injection vulnerabilities"
@issue = issue(:low, @message, 20)
end
it "reports \"logger(params[:password])\" correctly" do
@runner.should check("params[:password]").with_issue(@issue)
end
it "reports \"system('\\033]30;command\\007')\" correctly" do
@runner.should check('system("\\033]30;command\\007")').with_issue(@issue)
end
end
end
| Add spec for special string sequence | Add spec for special string sequence
| Ruby | mit | openSUSE/scanny | ruby | ## Code Before:
require "spec_helper"
module Scanny::Checks
describe InputFilteringCheck do
before do
@runner = Scanny::Runner.new(InputFilteringCheck.new)
@message = "Possible injection vulnerabilities"
@issue = issue(:low, @message, 20)
end
it "reports \"logger(params[:password])\" correctly" do
@runner.should check("params[:password]").with_issue(@issue)
end
end
end
## Instruction:
Add spec for special string sequence
## Code After:
require "spec_helper"
module Scanny::Checks
describe InputFilteringCheck do
before do
@runner = Scanny::Runner.new(InputFilteringCheck.new)
@message = "Possible injection vulnerabilities"
@issue = issue(:low, @message, 20)
end
it "reports \"logger(params[:password])\" correctly" do
@runner.should check("params[:password]").with_issue(@issue)
end
it "reports \"system('\\033]30;command\\007')\" correctly" do
@runner.should check('system("\\033]30;command\\007")').with_issue(@issue)
end
end
end
| require "spec_helper"
module Scanny::Checks
describe InputFilteringCheck do
before do
@runner = Scanny::Runner.new(InputFilteringCheck.new)
@message = "Possible injection vulnerabilities"
@issue = issue(:low, @message, 20)
end
it "reports \"logger(params[:password])\" correctly" do
@runner.should check("params[:password]").with_issue(@issue)
end
+
+ it "reports \"system('\\033]30;command\\007')\" correctly" do
+ @runner.should check('system("\\033]30;command\\007")').with_issue(@issue)
+ end
end
end | 4 | 0.266667 | 4 | 0 |
ff0ae66ee16bc3ac07cb88ddacb52ffa41779757 | tests/test_func.py | tests/test_func.py | from .utils import assert_eval
def test_simple_func():
assert_eval('(def @a $a 8) (@a)', 1, 8)
def test_simple_func_args():
assert_eval(
'(def @a $a $a)'
'(@a 1)'
'(@a 2)'
'(@a 5)',
1,
1,
2,
5)
def test_func_args_overwrite_globals():
assert_eval(
'(def @a $a 3)'
'(set $a 10)'
'$a'
'(@a 8)'
'$a',
1,
10,
10,
3,
8,
)
def test_func_args_with_offset():
assert_eval(
'(def @a $d (+ $d $i))'
'(def @b $i (+ $i $j))'
'(@a 1 2 3)'
'(@b 8 9 10)'
'$a\n$b\n$c\n$d\n$e\n$i\n$j\n$k\n',
1, 1,
4,
17,
0, 0, 0, 1, 2, 8, 9, 10,
)
| from .utils import assert_eval
def test_simple_func():
assert_eval('(def @a $a 8) (@a)', 1, 8)
def test_simple_func_args():
assert_eval(
'(def @a $a $a)'
'(@a 1)'
'(@a 2)'
'(@a 5)',
1,
1,
2,
5)
def test_func_args_overwrite_globals():
assert_eval(
'(def @a $a 3)'
'(set $a 10)'
'$a'
'(@a 8)'
'$a',
1,
10,
10,
3,
8,
)
def test_func_args_with_offset():
assert_eval(
'(def @a $d (+ $d $i))'
'(def @b $i (+ $i $j))'
'(@a 1 2 3)'
'(@b 8 9 10)'
'$a\n$b\n$c\n$d\n$e\n$i\n$j\n$k\n',
1, 1,
4,
17,
0, 0, 0, 1, 2, 8, 9, 10,
)
def test_no_args():
assert_eval(
'(def @a (+ 0 5))'
'(@a)',
1,
5,
)
def test_nested_calls():
assert_eval(
'(def @a $a (+ $a $b))'
'(def @b (+ (@a 1 2) (@a 3 4)))'
'(@b)',
1,
1,
1 + 2 + 3 + 4,
)
| Add some more function tests. | Add some more function tests.
| Python | bsd-3-clause | sapir/tinywhat,sapir/tinywhat,sapir/tinywhat | python | ## Code Before:
from .utils import assert_eval
def test_simple_func():
assert_eval('(def @a $a 8) (@a)', 1, 8)
def test_simple_func_args():
assert_eval(
'(def @a $a $a)'
'(@a 1)'
'(@a 2)'
'(@a 5)',
1,
1,
2,
5)
def test_func_args_overwrite_globals():
assert_eval(
'(def @a $a 3)'
'(set $a 10)'
'$a'
'(@a 8)'
'$a',
1,
10,
10,
3,
8,
)
def test_func_args_with_offset():
assert_eval(
'(def @a $d (+ $d $i))'
'(def @b $i (+ $i $j))'
'(@a 1 2 3)'
'(@b 8 9 10)'
'$a\n$b\n$c\n$d\n$e\n$i\n$j\n$k\n',
1, 1,
4,
17,
0, 0, 0, 1, 2, 8, 9, 10,
)
## Instruction:
Add some more function tests.
## Code After:
from .utils import assert_eval
def test_simple_func():
assert_eval('(def @a $a 8) (@a)', 1, 8)
def test_simple_func_args():
assert_eval(
'(def @a $a $a)'
'(@a 1)'
'(@a 2)'
'(@a 5)',
1,
1,
2,
5)
def test_func_args_overwrite_globals():
assert_eval(
'(def @a $a 3)'
'(set $a 10)'
'$a'
'(@a 8)'
'$a',
1,
10,
10,
3,
8,
)
def test_func_args_with_offset():
assert_eval(
'(def @a $d (+ $d $i))'
'(def @b $i (+ $i $j))'
'(@a 1 2 3)'
'(@b 8 9 10)'
'$a\n$b\n$c\n$d\n$e\n$i\n$j\n$k\n',
1, 1,
4,
17,
0, 0, 0, 1, 2, 8, 9, 10,
)
def test_no_args():
assert_eval(
'(def @a (+ 0 5))'
'(@a)',
1,
5,
)
def test_nested_calls():
assert_eval(
'(def @a $a (+ $a $b))'
'(def @b (+ (@a 1 2) (@a 3 4)))'
'(@b)',
1,
1,
1 + 2 + 3 + 4,
)
| from .utils import assert_eval
def test_simple_func():
assert_eval('(def @a $a 8) (@a)', 1, 8)
def test_simple_func_args():
assert_eval(
'(def @a $a $a)'
'(@a 1)'
'(@a 2)'
'(@a 5)',
1,
1,
2,
5)
def test_func_args_overwrite_globals():
assert_eval(
'(def @a $a 3)'
'(set $a 10)'
'$a'
'(@a 8)'
'$a',
1,
10,
10,
3,
8,
)
def test_func_args_with_offset():
assert_eval(
'(def @a $d (+ $d $i))'
'(def @b $i (+ $i $j))'
'(@a 1 2 3)'
'(@b 8 9 10)'
'$a\n$b\n$c\n$d\n$e\n$i\n$j\n$k\n',
1, 1,
4,
17,
0, 0, 0, 1, 2, 8, 9, 10,
)
+
+
+ def test_no_args():
+ assert_eval(
+ '(def @a (+ 0 5))'
+ '(@a)',
+ 1,
+ 5,
+ )
+
+
+ def test_nested_calls():
+ assert_eval(
+ '(def @a $a (+ $a $b))'
+ '(def @b (+ (@a 1 2) (@a 3 4)))'
+ '(@b)',
+ 1,
+ 1,
+ 1 + 2 + 3 + 4,
+ ) | 20 | 0.434783 | 20 | 0 |
a07df4bdb0692fc289d2b89e8e1a634bddb5153d | app/assets/javascripts/osem-tickets.js | app/assets/javascripts/osem-tickets.js | function update_price($this){
var id = $this.data('id');
// Calculate price for row
var value = $this.val();
var price = $('#price_' + id).text();
$('#total_row_' + id).text(value * price);
// Calculate total price
var total = 0;
$('.total_row').each(function( index ) {
total += parseFloat($(this).text());
});
$('#total_price').text(total);
}
$( document ).ready(function() {
$('.quantity').each(function() {
update_price($(this));
});
$('.quantity').change(function() {
update_price($(this));
});
$(function () {
$('[data-toggle="tooltip"]').tooltip()
});
});
| function update_price($this){
var id = $this.data('id');
// Calculate price for row
var value = $this.val();
var price = $('#price_' + id).text();
$('#total_row_' + id).text((value * price).toFixed(2));
// Calculate total price
var total = 0;
$('.total_row').each(function( index ) {
total += parseFloat($(this).text());
});
$('#total_price').text(total);
}
$( document ).ready(function() {
$('.quantity').each(function() {
update_price($(this));
});
$('.quantity').change(function() {
update_price($(this));
});
$(function () {
$('[data-toggle="tooltip"]').tooltip()
});
});
| Fix decimal numbers limit to 2 | Fix decimal numbers limit to 2
While calculating the row total, JavaScript was called and, for some quantities, a number with a long decimal part was appearing. In this PR it is fixed.
Fixes https://github.com/opensuse/osem/issues/1709
| JavaScript | mit | hennevogel/osem,rishabhptr/osem,rishabhptr/osem,openSUSE/osem,kormoc/osem,hennevogel/osem,differentreality/osem,hennevogel/osem,namangupta01/osem,differentreality/osem,kormoc/osem,AndrewKvalheim/osem,bear454/osem,AndrewKvalheim/osem,SeaGL/osem,bear454/osem,bear454/osem,differentreality/osem,openSUSE/osem,kormoc/osem,openSUSE/osem,namangupta01/osem,differentreality/osem,kormoc/osem,namangupta01/osem,rishabhptr/osem,namangupta01/osem,SeaGL/osem,bear454/osem,SeaGL/osem,hennevogel/osem,AndrewKvalheim/osem,rishabhptr/osem | javascript | ## Code Before:
function update_price($this){
var id = $this.data('id');
// Calculate price for row
var value = $this.val();
var price = $('#price_' + id).text();
$('#total_row_' + id).text(value * price);
// Calculate total price
var total = 0;
$('.total_row').each(function( index ) {
total += parseFloat($(this).text());
});
$('#total_price').text(total);
}
$( document ).ready(function() {
$('.quantity').each(function() {
update_price($(this));
});
$('.quantity').change(function() {
update_price($(this));
});
$(function () {
$('[data-toggle="tooltip"]').tooltip()
});
});
## Instruction:
Fix decimal numbers limit to 2
While calculating the row total, JavaScript was called and, for some quantities, a number with a long decimal part was appearing. In this PR it is fixed.
Fixes https://github.com/opensuse/osem/issues/1709
## Code After:
function update_price($this){
var id = $this.data('id');
// Calculate price for row
var value = $this.val();
var price = $('#price_' + id).text();
$('#total_row_' + id).text((value * price).toFixed(2));
// Calculate total price
var total = 0;
$('.total_row').each(function( index ) {
total += parseFloat($(this).text());
});
$('#total_price').text(total);
}
$( document ).ready(function() {
$('.quantity').each(function() {
update_price($(this));
});
$('.quantity').change(function() {
update_price($(this));
});
$(function () {
$('[data-toggle="tooltip"]').tooltip()
});
});
| function update_price($this){
var id = $this.data('id');
// Calculate price for row
var value = $this.val();
var price = $('#price_' + id).text();
- $('#total_row_' + id).text(value * price);
+ $('#total_row_' + id).text((value * price).toFixed(2));
? + ++++++++++++
// Calculate total price
var total = 0;
$('.total_row').each(function( index ) {
total += parseFloat($(this).text());
});
$('#total_price').text(total);
}
$( document ).ready(function() {
$('.quantity').each(function() {
update_price($(this));
});
$('.quantity').change(function() {
update_price($(this));
});
$(function () {
$('[data-toggle="tooltip"]').tooltip()
});
}); | 2 | 0.071429 | 1 | 1 |
0e6f679f940736ff78efbe66d27bb2b8a181990f | src/main/java/net/wizardsoflua/lua/module/print/PrintRedirector.java | src/main/java/net/wizardsoflua/lua/module/print/PrintRedirector.java |
package net.wizardsoflua.lua.module.print;
import java.io.IOException;
import java.io.OutputStream;
import net.sandius.rembulan.Table;
import net.sandius.rembulan.lib.BasicLib;
import net.sandius.rembulan.runtime.LuaFunction;
public class PrintRedirector {
public interface Context {
public void send(String message);
}
public static PrintRedirector installInto(Table env, Context context) {
return new PrintRedirector(env, context);
}
private final Context context;
public PrintRedirector(Table env, Context context) {
this.context = context;
OutputStream out = new ChatOutputStream();
LuaFunction printFunc = BasicLib.print(out, env);
env.rawset("print", printFunc);
}
private class ChatOutputStream extends org.apache.commons.io.output.ByteArrayOutputStream {
@Override
public void flush() throws IOException {
String message = toString();
// Remove trailing line-feed.
if (message.endsWith("\n")) {
message = message.substring(0, message.length() - 1);
}
reset();
print(message);
}
}
private void print(String message) {
context.send(TabEncoder.encode(message));
}
}
| package net.wizardsoflua.lua.module.print;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.Charset;
import net.sandius.rembulan.Table;
import net.sandius.rembulan.lib.BasicLib;
import net.sandius.rembulan.runtime.LuaFunction;
public class PrintRedirector {
public interface Context {
public void send(String message);
}
public static PrintRedirector installInto(Table env, Context context) {
return new PrintRedirector(env, context);
}
private final Context context;
public PrintRedirector(Table env, Context context) {
this.context = context;
OutputStream out = new ChatOutputStream();
LuaFunction printFunc = BasicLib.print(out, env);
env.rawset("print", printFunc);
}
private class ChatOutputStream extends org.apache.commons.io.output.ByteArrayOutputStream {
@Override
public void flush() throws IOException {
String message = toString(Charset.defaultCharset());
// Remove trailing line-feed.
if (message.endsWith("\n")) {
message = message.substring(0, message.length() - 1);
}
reset();
print(message);
}
}
private void print(String message) {
context.send(TabEncoder.encode(message));
}
}
| Remove use of the deprecated method ByteArrayOutputStream.toString() | Remove use of the deprecated method ByteArrayOutputStream.toString()
| Java | apache-2.0 | mkarneim/luamod | java | ## Code Before:
package net.wizardsoflua.lua.module.print;
import java.io.IOException;
import java.io.OutputStream;
import net.sandius.rembulan.Table;
import net.sandius.rembulan.lib.BasicLib;
import net.sandius.rembulan.runtime.LuaFunction;
public class PrintRedirector {
public interface Context {
public void send(String message);
}
public static PrintRedirector installInto(Table env, Context context) {
return new PrintRedirector(env, context);
}
private final Context context;
public PrintRedirector(Table env, Context context) {
this.context = context;
OutputStream out = new ChatOutputStream();
LuaFunction printFunc = BasicLib.print(out, env);
env.rawset("print", printFunc);
}
private class ChatOutputStream extends org.apache.commons.io.output.ByteArrayOutputStream {
@Override
public void flush() throws IOException {
String message = toString();
// Remove trailing line-feed.
if (message.endsWith("\n")) {
message = message.substring(0, message.length() - 1);
}
reset();
print(message);
}
}
private void print(String message) {
context.send(TabEncoder.encode(message));
}
}
## Instruction:
Remove use of the deprecated method ByteArrayOutputStream.toString()
## Code After:
package net.wizardsoflua.lua.module.print;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.Charset;
import net.sandius.rembulan.Table;
import net.sandius.rembulan.lib.BasicLib;
import net.sandius.rembulan.runtime.LuaFunction;
public class PrintRedirector {
public interface Context {
public void send(String message);
}
public static PrintRedirector installInto(Table env, Context context) {
return new PrintRedirector(env, context);
}
private final Context context;
public PrintRedirector(Table env, Context context) {
this.context = context;
OutputStream out = new ChatOutputStream();
LuaFunction printFunc = BasicLib.print(out, env);
env.rawset("print", printFunc);
}
private class ChatOutputStream extends org.apache.commons.io.output.ByteArrayOutputStream {
@Override
public void flush() throws IOException {
String message = toString(Charset.defaultCharset());
// Remove trailing line-feed.
if (message.endsWith("\n")) {
message = message.substring(0, message.length() - 1);
}
reset();
print(message);
}
}
private void print(String message) {
context.send(TabEncoder.encode(message));
}
}
| package net.wizardsoflua.lua.module.print;
import java.io.IOException;
import java.io.OutputStream;
-
+ import java.nio.charset.Charset;
import net.sandius.rembulan.Table;
import net.sandius.rembulan.lib.BasicLib;
import net.sandius.rembulan.runtime.LuaFunction;
public class PrintRedirector {
public interface Context {
public void send(String message);
}
public static PrintRedirector installInto(Table env, Context context) {
return new PrintRedirector(env, context);
}
private final Context context;
public PrintRedirector(Table env, Context context) {
this.context = context;
OutputStream out = new ChatOutputStream();
LuaFunction printFunc = BasicLib.print(out, env);
env.rawset("print", printFunc);
}
private class ChatOutputStream extends org.apache.commons.io.output.ByteArrayOutputStream {
@Override
public void flush() throws IOException {
- String message = toString();
+ String message = toString(Charset.defaultCharset());
// Remove trailing line-feed.
if (message.endsWith("\n")) {
message = message.substring(0, message.length() - 1);
}
reset();
print(message);
}
}
private void print(String message) {
context.send(TabEncoder.encode(message));
}
} | 4 | 0.088889 | 2 | 2 |
3c32c2a446036dab674e76d6dad038b6e298bd22 | app/controllers/registrationController.js | app/controllers/registrationController.js | core.controller('RegistrationController', function ($controller, $location, $scope, $timeout, AlertService, RestApi) {
angular.extend(this, $controller('AbstractController', {$scope: $scope}));
$scope.registration = {
email: '',
token: ''
};
AlertService.create('auth/register');
AlertService.create('auth');
$scope.verifyEmail = function(email) {
RestApi.anonymousGet({
controller: 'auth',
method: 'register?email=' + email
}).then(function(data) {
$scope.registration.email = '';
});
};
if(typeof $location.search().token != 'undefined') {
$scope.registration.token = $location.search().token;
}
$scope.register = function() {
RestApi.anonymousGet({
'controller': 'auth',
'method': 'register',
'data': $scope.registration
}).then(function(data) {
$scope.registration = {
email: '',
token: ''
};
$location.path("/");
$timeout(function() {
AlertService.add(data.meta, 'auth/register');
});
});
};
}); | core.controller('RegistrationController', function ($controller, $location, $scope, $timeout, AlertService, RestApi) {
angular.extend(this, $controller('AbstractController', {$scope: $scope}));
$scope.registration = {
email: '',
token: ''
};
AlertService.create('auth/register');
AlertService.create('auth');
$scope.verifyEmail = function(email) {
RestApi.anonymousGet({
controller: 'auth',
method: 'register?email=' + email
}).then(function(data) {
$scope.registration.email = '';
$timeout(function() {
AlertService.add(data.meta, 'auth/register');
});
});
};
if(typeof $location.search().token != 'undefined') {
$scope.registration.token = $location.search().token;
}
$scope.register = function() {
RestApi.anonymousGet({
'controller': 'auth',
'method': 'register',
'data': $scope.registration
}).then(function(data) {
$scope.registration = {
email: '',
token: ''
};
$location.path("/");
$timeout(function() {
AlertService.add(data.meta, 'auth/register');
});
});
};
}); | Add success alert indicating registration began and to verify email. | Add success alert indicating registration began and to verify email.
| JavaScript | mit | TAMULib/Weaver-UI-Core,TAMULib/Weaver-UI-Core | javascript | ## Code Before:
core.controller('RegistrationController', function ($controller, $location, $scope, $timeout, AlertService, RestApi) {
angular.extend(this, $controller('AbstractController', {$scope: $scope}));
$scope.registration = {
email: '',
token: ''
};
AlertService.create('auth/register');
AlertService.create('auth');
$scope.verifyEmail = function(email) {
RestApi.anonymousGet({
controller: 'auth',
method: 'register?email=' + email
}).then(function(data) {
$scope.registration.email = '';
});
};
if(typeof $location.search().token != 'undefined') {
$scope.registration.token = $location.search().token;
}
$scope.register = function() {
RestApi.anonymousGet({
'controller': 'auth',
'method': 'register',
'data': $scope.registration
}).then(function(data) {
$scope.registration = {
email: '',
token: ''
};
$location.path("/");
$timeout(function() {
AlertService.add(data.meta, 'auth/register');
});
});
};
});
## Instruction:
Add success alert indicating registration began and to verify email.
## Code After:
core.controller('RegistrationController', function ($controller, $location, $scope, $timeout, AlertService, RestApi) {
angular.extend(this, $controller('AbstractController', {$scope: $scope}));
$scope.registration = {
email: '',
token: ''
};
AlertService.create('auth/register');
AlertService.create('auth');
$scope.verifyEmail = function(email) {
RestApi.anonymousGet({
controller: 'auth',
method: 'register?email=' + email
}).then(function(data) {
$scope.registration.email = '';
$timeout(function() {
AlertService.add(data.meta, 'auth/register');
});
});
};
if(typeof $location.search().token != 'undefined') {
$scope.registration.token = $location.search().token;
}
$scope.register = function() {
RestApi.anonymousGet({
'controller': 'auth',
'method': 'register',
'data': $scope.registration
}).then(function(data) {
$scope.registration = {
email: '',
token: ''
};
$location.path("/");
$timeout(function() {
AlertService.add(data.meta, 'auth/register');
});
});
};
}); | core.controller('RegistrationController', function ($controller, $location, $scope, $timeout, AlertService, RestApi) {
angular.extend(this, $controller('AbstractController', {$scope: $scope}));
$scope.registration = {
email: '',
token: ''
};
AlertService.create('auth/register');
AlertService.create('auth');
$scope.verifyEmail = function(email) {
RestApi.anonymousGet({
controller: 'auth',
method: 'register?email=' + email
}).then(function(data) {
$scope.registration.email = '';
+
+ $timeout(function() {
+ AlertService.add(data.meta, 'auth/register');
+ });
});
};
if(typeof $location.search().token != 'undefined') {
$scope.registration.token = $location.search().token;
}
$scope.register = function() {
+
RestApi.anonymousGet({
'controller': 'auth',
'method': 'register',
'data': $scope.registration
}).then(function(data) {
$scope.registration = {
email: '',
token: ''
};
$location.path("/");
$timeout(function() {
AlertService.add(data.meta, 'auth/register');
});
});
};
}); | 5 | 0.106383 | 5 | 0 |
bebf2b5aaa9d7a5071b01cc075dd218c29a64034 | .travis.yml | .travis.yml | language: go
go:
- 1.7
- master
sudo: false
install:
- go get -v github.com/Masterminds/glide
- cd $GOPATH/src/github.com/Masterminds/glide && git checkout tags/v0.12.3
- glide install
| language: go
go:
- 1.7
- master
sudo: false
install:
- go get -v github.com/Masterminds/glide
- pushd $GOPATH/src/github.com/Masterminds/glide && git checkout tags/v0.12.3 && popd
- glide install
| Return back to original directory | Return back to original directory
 | YAML | mit | gotgenes/getignore,gotgenes/getignore | yaml |
## Code Before:
language: go
go:
- 1.7
- master
sudo: false
install:
- go get -v github.com/Masterminds/glide
- cd $GOPATH/src/github.com/Masterminds/glide && git checkout tags/v0.12.3
- glide install
## Instruction:
Return back to original directory
## Code After:
language: go
go:
- 1.7
- master
sudo: false
install:
- go get -v github.com/Masterminds/glide
- pushd $GOPATH/src/github.com/Masterminds/glide && git checkout tags/v0.12.3 && popd
- glide install
| language: go
go:
- 1.7
- master
sudo: false
install:
- go get -v github.com/Masterminds/glide
- - cd $GOPATH/src/github.com/Masterminds/glide && git checkout tags/v0.12.3
? ^
+ - pushd $GOPATH/src/github.com/Masterminds/glide && git checkout tags/v0.12.3 && popd
? ^^^^ ++++++++
- glide install | 2 | 0.166667 | 1 | 1 |
2afc03a084cf736cc76f629a45a720c3c44eb27d | pkgs/applications/misc/hello/default.nix | pkgs/applications/misc/hello/default.nix | { lib
, stdenv
, fetchurl
, nixos
, testers
, hello
}:
stdenv.mkDerivation rec {
pname = "hello";
version = "2.12";
src = fetchurl {
url = "mirror://gnu/hello/${pname}-${version}.tar.gz";
sha256 = "1ayhp9v4m4rdhjmnl2bq3cibrbqqkgjbl3s7yk2nhlh8vj3ay16g";
};
doCheck = true;
passthru.tests = {
version = testers.testVersion { package = hello; };
invariant-under-noXlibs =
testers.testEqualDerivation
"hello must not be rebuilt when environment.noXlibs is set."
hello
(nixos { environment.noXlibs = true; }).pkgs.hello;
};
meta = with lib; {
description = "A program that produces a familiar, friendly greeting";
longDescription = ''
GNU Hello is a program that prints "Hello, world!" when you run it.
It is fully customizable.
'';
homepage = "https://www.gnu.org/software/hello/manual/";
changelog = "https://git.savannah.gnu.org/cgit/hello.git/plain/NEWS?h=v${version}";
license = licenses.gpl3Plus;
maintainers = [ maintainers.eelco ];
platforms = platforms.all;
};
}
| { lib
, runCommand
, stdenv
, fetchurl
, nixos
, testers
, hello
}:
stdenv.mkDerivation (self: {
pname = "hello";
version = "2.12";
src = fetchurl {
url = "mirror://gnu/hello/${self.pname}-${self.version}.tar.gz";
sha256 = "1ayhp9v4m4rdhjmnl2bq3cibrbqqkgjbl3s7yk2nhlh8vj3ay16g";
};
doCheck = true;
passthru.tests = {
version = testers.testVersion { package = hello; };
invariant-under-noXlibs =
testers.testEqualDerivation
"hello must not be rebuilt when environment.noXlibs is set."
hello
(nixos { environment.noXlibs = true; }).pkgs.hello;
};
passthru.tests.run = runCommand "hello-test-run" {
nativeBuildInputs = [ self ];
} ''
diff -U3 --color=auto <(hello) <(echo 'Hello, world!')
touch $out
'';
meta = with lib; {
description = "A program that produces a familiar, friendly greeting";
longDescription = ''
GNU Hello is a program that prints "Hello, world!" when you run it.
It is fully customizable.
'';
homepage = "https://www.gnu.org/software/hello/manual/";
changelog = "https://git.savannah.gnu.org/cgit/hello.git/plain/NEWS?h=v${self.version}";
license = licenses.gpl3Plus;
maintainers = [ maintainers.eelco ];
platforms = platforms.all;
};
})
| Define a passthru test via new mkDerivation self arg | hello: Define a passthru test via new mkDerivation self arg
 | Nix | mit | NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs | nix |
## Code Before:
{ lib
, stdenv
, fetchurl
, nixos
, testers
, hello
}:
stdenv.mkDerivation rec {
pname = "hello";
version = "2.12";
src = fetchurl {
url = "mirror://gnu/hello/${pname}-${version}.tar.gz";
sha256 = "1ayhp9v4m4rdhjmnl2bq3cibrbqqkgjbl3s7yk2nhlh8vj3ay16g";
};
doCheck = true;
passthru.tests = {
version = testers.testVersion { package = hello; };
invariant-under-noXlibs =
testers.testEqualDerivation
"hello must not be rebuilt when environment.noXlibs is set."
hello
(nixos { environment.noXlibs = true; }).pkgs.hello;
};
meta = with lib; {
description = "A program that produces a familiar, friendly greeting";
longDescription = ''
GNU Hello is a program that prints "Hello, world!" when you run it.
It is fully customizable.
'';
homepage = "https://www.gnu.org/software/hello/manual/";
changelog = "https://git.savannah.gnu.org/cgit/hello.git/plain/NEWS?h=v${version}";
license = licenses.gpl3Plus;
maintainers = [ maintainers.eelco ];
platforms = platforms.all;
};
}
## Instruction:
hello: Define a passthru test via new mkDerivation self arg
## Code After:
{ lib
, runCommand
, stdenv
, fetchurl
, nixos
, testers
, hello
}:
stdenv.mkDerivation (self: {
pname = "hello";
version = "2.12";
src = fetchurl {
url = "mirror://gnu/hello/${self.pname}-${self.version}.tar.gz";
sha256 = "1ayhp9v4m4rdhjmnl2bq3cibrbqqkgjbl3s7yk2nhlh8vj3ay16g";
};
doCheck = true;
passthru.tests = {
version = testers.testVersion { package = hello; };
invariant-under-noXlibs =
testers.testEqualDerivation
"hello must not be rebuilt when environment.noXlibs is set."
hello
(nixos { environment.noXlibs = true; }).pkgs.hello;
};
passthru.tests.run = runCommand "hello-test-run" {
nativeBuildInputs = [ self ];
} ''
diff -U3 --color=auto <(hello) <(echo 'Hello, world!')
touch $out
'';
meta = with lib; {
description = "A program that produces a familiar, friendly greeting";
longDescription = ''
GNU Hello is a program that prints "Hello, world!" when you run it.
It is fully customizable.
'';
homepage = "https://www.gnu.org/software/hello/manual/";
changelog = "https://git.savannah.gnu.org/cgit/hello.git/plain/NEWS?h=v${self.version}";
license = licenses.gpl3Plus;
maintainers = [ maintainers.eelco ];
platforms = platforms.all;
};
})
| { lib
+ , runCommand
, stdenv
, fetchurl
, nixos
, testers
, hello
}:
- stdenv.mkDerivation rec {
? ^ ^
+ stdenv.mkDerivation (self: {
? ^^ ^^^
pname = "hello";
version = "2.12";
src = fetchurl {
- url = "mirror://gnu/hello/${pname}-${version}.tar.gz";
+ url = "mirror://gnu/hello/${self.pname}-${self.version}.tar.gz";
? +++++ +++++
sha256 = "1ayhp9v4m4rdhjmnl2bq3cibrbqqkgjbl3s7yk2nhlh8vj3ay16g";
};
doCheck = true;
passthru.tests = {
version = testers.testVersion { package = hello; };
invariant-under-noXlibs =
testers.testEqualDerivation
"hello must not be rebuilt when environment.noXlibs is set."
hello
(nixos { environment.noXlibs = true; }).pkgs.hello;
};
+ passthru.tests.run = runCommand "hello-test-run" {
+ nativeBuildInputs = [ self ];
+ } ''
+ diff -U3 --color=auto <(hello) <(echo 'Hello, world!')
+ touch $out
+ '';
+
meta = with lib; {
description = "A program that produces a familiar, friendly greeting";
longDescription = ''
GNU Hello is a program that prints "Hello, world!" when you run it.
It is fully customizable.
'';
homepage = "https://www.gnu.org/software/hello/manual/";
- changelog = "https://git.savannah.gnu.org/cgit/hello.git/plain/NEWS?h=v${version}";
+ changelog = "https://git.savannah.gnu.org/cgit/hello.git/plain/NEWS?h=v${self.version}";
? +++++
license = licenses.gpl3Plus;
maintainers = [ maintainers.eelco ];
platforms = platforms.all;
};
- }
+ }) | 16 | 0.380952 | 12 | 4 |
4be53761875922cc8d27e54363624cd1ae34bc16 | app/containers/Main/index.js | app/containers/Main/index.js | import React, { Component, PropTypes } from 'react';
import fetchData from '../../actions/fetchData';
import Toast from '../../components/Toast';
import Modals from '../Modals';
import types from '../../utils/types';
import SocketEvents from '../../utils/socketEvents';
import './Main.scss';
import MainNav from '../MainNav';
export default class Main extends Component {
static propTypes = {
...types.entries,
...types.sections,
site: PropTypes.object.isRequired,
socket: PropTypes.object.isRequired,
ui: PropTypes.object.isRequired,
user: PropTypes.object.isRequired,
location: PropTypes.object.isRequired,
children: PropTypes.object.isRequired,
dispatch: PropTypes.func.isRequired,
};
componentDidMount() {
const { dispatch, socket } = this.props;
fetchData();
const events = new SocketEvents(socket, dispatch);
events.listen();
}
render() {
const { user, entries, sections, fields, assets, ui, dispatch, site } = this.props;
if (user.isFetching
|| entries.isFetching
|| sections.isFetching
|| assets.isFetching
|| fields.isFetching) return null;
return (
<main className="main">
<MainNav siteName={site.siteName} />
{React.cloneElement(this.props.children, {
...this.props,
key: this.props.location.pathname,
})}
<div className="toasts">
{ui.toasts.map(t => <Toast dispatch={dispatch} key={t.dateCreated} {...t} />)}
</div>
<Modals {...this.props} />
</main>
);
}
}
| import React, { Component, PropTypes } from 'react';
import fetchData from '../../actions/fetchData';
import Toast from '../../components/Toast';
import Modals from '../Modals';
import types from '../../utils/types';
import SocketEvents from '../../utils/socketEvents';
import './Main.scss';
import MainNav from '../MainNav';
export default class Main extends Component {
static propTypes = {
...types.entries,
...types.sections,
site: PropTypes.object.isRequired,
socket: PropTypes.object.isRequired,
ui: PropTypes.object.isRequired,
user: PropTypes.object.isRequired,
location: PropTypes.object.isRequired,
children: PropTypes.object.isRequired,
dispatch: PropTypes.func.isRequired,
};
componentDidMount() {
const { dispatch, socket } = this.props;
fetchData();
const events = new SocketEvents(socket, dispatch);
events.listen();
}
render() {
const { ui, dispatch, site } = this.props;
if (Object.keys(this.props).some(key => this.props[key].isFetching)) return null;
return (
<main className="main">
<MainNav siteName={site.siteName} />
{React.cloneElement(this.props.children, {
...this.props,
key: this.props.location.pathname,
})}
<div className="toasts">
{ui.toasts.map(t => <Toast dispatch={dispatch} key={t.dateCreated} {...t} />)}
</div>
<Modals {...this.props} />
</main>
);
}
}
| Check for isFetching with .some | :wrench: Check for isFetching with .some
 | JavaScript | mit | JasonEtco/flintcms,JasonEtco/flintcms | javascript |
## Code Before:
import React, { Component, PropTypes } from 'react';
import fetchData from '../../actions/fetchData';
import Toast from '../../components/Toast';
import Modals from '../Modals';
import types from '../../utils/types';
import SocketEvents from '../../utils/socketEvents';
import './Main.scss';
import MainNav from '../MainNav';
export default class Main extends Component {
static propTypes = {
...types.entries,
...types.sections,
site: PropTypes.object.isRequired,
socket: PropTypes.object.isRequired,
ui: PropTypes.object.isRequired,
user: PropTypes.object.isRequired,
location: PropTypes.object.isRequired,
children: PropTypes.object.isRequired,
dispatch: PropTypes.func.isRequired,
};
componentDidMount() {
const { dispatch, socket } = this.props;
fetchData();
const events = new SocketEvents(socket, dispatch);
events.listen();
}
render() {
const { user, entries, sections, fields, assets, ui, dispatch, site } = this.props;
if (user.isFetching
|| entries.isFetching
|| sections.isFetching
|| assets.isFetching
|| fields.isFetching) return null;
return (
<main className="main">
<MainNav siteName={site.siteName} />
{React.cloneElement(this.props.children, {
...this.props,
key: this.props.location.pathname,
})}
<div className="toasts">
{ui.toasts.map(t => <Toast dispatch={dispatch} key={t.dateCreated} {...t} />)}
</div>
<Modals {...this.props} />
</main>
);
}
}
## Instruction:
:wrench: Check for isFetching with .some
## Code After:
import React, { Component, PropTypes } from 'react';
import fetchData from '../../actions/fetchData';
import Toast from '../../components/Toast';
import Modals from '../Modals';
import types from '../../utils/types';
import SocketEvents from '../../utils/socketEvents';
import './Main.scss';
import MainNav from '../MainNav';
export default class Main extends Component {
static propTypes = {
...types.entries,
...types.sections,
site: PropTypes.object.isRequired,
socket: PropTypes.object.isRequired,
ui: PropTypes.object.isRequired,
user: PropTypes.object.isRequired,
location: PropTypes.object.isRequired,
children: PropTypes.object.isRequired,
dispatch: PropTypes.func.isRequired,
};
componentDidMount() {
const { dispatch, socket } = this.props;
fetchData();
const events = new SocketEvents(socket, dispatch);
events.listen();
}
render() {
const { ui, dispatch, site } = this.props;
if (Object.keys(this.props).some(key => this.props[key].isFetching)) return null;
return (
<main className="main">
<MainNav siteName={site.siteName} />
{React.cloneElement(this.props.children, {
...this.props,
key: this.props.location.pathname,
})}
<div className="toasts">
{ui.toasts.map(t => <Toast dispatch={dispatch} key={t.dateCreated} {...t} />)}
</div>
<Modals {...this.props} />
</main>
);
}
}
| import React, { Component, PropTypes } from 'react';
import fetchData from '../../actions/fetchData';
import Toast from '../../components/Toast';
import Modals from '../Modals';
import types from '../../utils/types';
import SocketEvents from '../../utils/socketEvents';
import './Main.scss';
import MainNav from '../MainNav';
export default class Main extends Component {
static propTypes = {
...types.entries,
...types.sections,
site: PropTypes.object.isRequired,
socket: PropTypes.object.isRequired,
ui: PropTypes.object.isRequired,
user: PropTypes.object.isRequired,
location: PropTypes.object.isRequired,
children: PropTypes.object.isRequired,
dispatch: PropTypes.func.isRequired,
};
componentDidMount() {
const { dispatch, socket } = this.props;
fetchData();
const events = new SocketEvents(socket, dispatch);
events.listen();
}
render() {
+ const { ui, dispatch, site } = this.props;
+ if (Object.keys(this.props).some(key => this.props[key].isFetching)) return null;
- const { user, entries, sections, fields, assets, ui, dispatch, site } = this.props;
- if (user.isFetching
- || entries.isFetching
- || sections.isFetching
- || assets.isFetching
- || fields.isFetching) return null;
return (
<main className="main">
<MainNav siteName={site.siteName} />
{React.cloneElement(this.props.children, {
...this.props,
key: this.props.location.pathname,
})}
<div className="toasts">
{ui.toasts.map(t => <Toast dispatch={dispatch} key={t.dateCreated} {...t} />)}
</div>
<Modals {...this.props} />
</main>
);
}
} | 8 | 0.142857 | 2 | 6 |
f739a7605b738b85ba32436825b44f0059b83f45 | test/test_cpuid.c | test/test_cpuid.c |
void test_GetVendorName_should_ReturnGenuineIntel(void) {
char expected[20] = "GenuineIntel";
char actual[20];
get_vendor_name(actual);
TEST_ASSERT_EQUAL_STRING(expected, actual);
}
void test_GetVendorSignature_should_asdf(void) {
uint32_t exp_ebx = 0x756e6547; // translates to "Genu"
uint32_t exp_ecx = 0x6c65746e; // translates to "ineI"
uint32_t exp_edx = 0x49656e69; // translates to "ntel"
cpuid_info_t sig;
sig = get_vendor_signature();
TEST_ASSERT_EQUAL_UINT32(exp_ebx, sig.ebx);
TEST_ASSERT_EQUAL_UINT32(exp_ecx, sig.ecx);
TEST_ASSERT_EQUAL_UINT32(exp_edx, sig.edx);
}
void test_GetProcessorSignature_should_ReturnFamily6(void) {
unsigned int expected = 0x6;
unsigned int family;
uint32_t processor_signature = get_processor_signature();
family = (processor_signature >> 8) & 0xf;
TEST_ASSERT_EQUAL_UINT(expected, family);
}
|
void test_GetVendorName_should_ReturnGenuineIntel(void) {
char expected[20] = "GenuineIntel";
char actual[20];
get_vendor_name(actual);
TEST_ASSERT_EQUAL_STRING(expected, actual);
}
void test_GetVendorSignature_should_ReturnCorrectCpuIdValues(void) {
uint32_t exp_ebx = 0x756e6547; // translates to "Genu"
uint32_t exp_ecx = 0x6c65746e; // translates to "ineI"
uint32_t exp_edx = 0x49656e69; // translates to "ntel"
cpuid_info_t sig;
sig = get_vendor_signature();
TEST_ASSERT_EQUAL_UINT32(exp_ebx, sig.ebx);
TEST_ASSERT_EQUAL_UINT32(exp_ecx, sig.ecx);
TEST_ASSERT_EQUAL_UINT32(exp_edx, sig.edx);
}
void test_GetProcessorSignature_should_ReturnFamily6(void) {
unsigned int expected = 0x6;
unsigned int family;
uint32_t processor_signature = get_processor_signature();
family = (processor_signature >> 8) & 0xf;
TEST_ASSERT_EQUAL_UINT(expected, family);
}
| Change method-name to something more appropriate | Change method-name to something more appropriate
 | C | bsd-2-clause | sosy-lab/power-gadget_benchexec,sosy-lab/power-gadget_benchexec | c |
## Code Before:
void test_GetVendorName_should_ReturnGenuineIntel(void) {
char expected[20] = "GenuineIntel";
char actual[20];
get_vendor_name(actual);
TEST_ASSERT_EQUAL_STRING(expected, actual);
}
void test_GetVendorSignature_should_asdf(void) {
uint32_t exp_ebx = 0x756e6547; // translates to "Genu"
uint32_t exp_ecx = 0x6c65746e; // translates to "ineI"
uint32_t exp_edx = 0x49656e69; // translates to "ntel"
cpuid_info_t sig;
sig = get_vendor_signature();
TEST_ASSERT_EQUAL_UINT32(exp_ebx, sig.ebx);
TEST_ASSERT_EQUAL_UINT32(exp_ecx, sig.ecx);
TEST_ASSERT_EQUAL_UINT32(exp_edx, sig.edx);
}
void test_GetProcessorSignature_should_ReturnFamily6(void) {
unsigned int expected = 0x6;
unsigned int family;
uint32_t processor_signature = get_processor_signature();
family = (processor_signature >> 8) & 0xf;
TEST_ASSERT_EQUAL_UINT(expected, family);
}
## Instruction:
Change method-name to something more appropriate
## Code After:
void test_GetVendorName_should_ReturnGenuineIntel(void) {
char expected[20] = "GenuineIntel";
char actual[20];
get_vendor_name(actual);
TEST_ASSERT_EQUAL_STRING(expected, actual);
}
void test_GetVendorSignature_should_ReturnCorrectCpuIdValues(void) {
uint32_t exp_ebx = 0x756e6547; // translates to "Genu"
uint32_t exp_ecx = 0x6c65746e; // translates to "ineI"
uint32_t exp_edx = 0x49656e69; // translates to "ntel"
cpuid_info_t sig;
sig = get_vendor_signature();
TEST_ASSERT_EQUAL_UINT32(exp_ebx, sig.ebx);
TEST_ASSERT_EQUAL_UINT32(exp_ecx, sig.ecx);
TEST_ASSERT_EQUAL_UINT32(exp_edx, sig.edx);
}
void test_GetProcessorSignature_should_ReturnFamily6(void) {
unsigned int expected = 0x6;
unsigned int family;
uint32_t processor_signature = get_processor_signature();
family = (processor_signature >> 8) & 0xf;
TEST_ASSERT_EQUAL_UINT(expected, family);
}
|
void test_GetVendorName_should_ReturnGenuineIntel(void) {
char expected[20] = "GenuineIntel";
char actual[20];
get_vendor_name(actual);
TEST_ASSERT_EQUAL_STRING(expected, actual);
}
- void test_GetVendorSignature_should_asdf(void) {
? --
+ void test_GetVendorSignature_should_ReturnCorrectCpuIdValues(void) {
? +++++++++++++++++++ +++
uint32_t exp_ebx = 0x756e6547; // translates to "Genu"
uint32_t exp_ecx = 0x6c65746e; // translates to "ineI"
uint32_t exp_edx = 0x49656e69; // translates to "ntel"
cpuid_info_t sig;
sig = get_vendor_signature();
TEST_ASSERT_EQUAL_UINT32(exp_ebx, sig.ebx);
TEST_ASSERT_EQUAL_UINT32(exp_ecx, sig.ecx);
TEST_ASSERT_EQUAL_UINT32(exp_edx, sig.edx);
}
void test_GetProcessorSignature_should_ReturnFamily6(void) {
unsigned int expected = 0x6;
unsigned int family;
uint32_t processor_signature = get_processor_signature();
family = (processor_signature >> 8) & 0xf;
TEST_ASSERT_EQUAL_UINT(expected, family);
} | 2 | 0.0625 | 1 | 1 |
f7f126e9bd900229aa04e9bf83da0ed8130b0d0f | environment/stage1/member.lisp | environment/stage1/member.lisp | ;;;;; TRE environment
;;;;; Copyright (c) 2005-2006,2008,2010 Sven Klose <pixel@copei.de>
(if (not (eq t *BUILTIN-MEMBER*))
(progn
(defun %member-r (elm lst)
(if lst
(if (equal elm (car lst))
lst
(%member-r elm (cdr lst)))))
(defun member (elm &rest lsts)
(or (%member-r elm (car lsts))
(if (cdr lsts)
(apply #'member elm (cdr lsts)))))))
(define-test "MEMBER finds elements"
((if (member 's '(i) '(l i k e) '(l i s p))
t))
t)
(define-test "MEMBER finds elements with user predicate"
((if (member "lisp" '("tre" "lisp") :test #'string=)
t))
t)
(define-test "MEMBER detects foureign elements"
((member 'A '(l i s p)))
nil)
(defun %member-if-r (pred lst)
(if lst
(if (funcall pred (car lst))
(car lst)
(%member-if-r pred (cdr lst)))))
(defun member-if (pred &rest lsts)
(or (%member-if-r pred (car lsts))
(if (cdr lsts)
(apply #'member-if pred (cdr lsts)))))
| ;;;;; TRE environment
;;;;; Copyright (c) 2005-2006,2008,2010-2011 Sven Klose <pixel@copei.de>
(if (not (eq t *BUILTIN-MEMBER*))
(progn
(defun %member-r (elm lst)
(if lst
(if (equal elm (car lst))
lst
(%member-r elm (cdr lst)))))
(defun member (elm &rest lsts)
(or (%member-r elm (car lsts))
(if (cdr lsts)
(apply #'member elm (cdr lsts)))))))
(define-test "MEMBER finds elements"
((if (member 's '(i) '(l i k e) '(l i s p))
t))
t)
(define-test "MEMBER finds elements with user predicate"
((if (member "lisp" '("tre" "lisp") :test #'string=)
t))
t)
(define-test "MEMBER detects foureign elements"
((member 'A '(l i s p)))
nil)
(defun %member-if-r (pred lst)
(if lst
(if (funcall pred (car lst))
lst
(%member-if-r pred (cdr lst)))))
(defun member-if (pred &rest lsts)
(or (%member-if-r pred (car lsts))
(if (cdr lsts)
(apply #'member-if pred (cdr lsts)))))
| Return the cell, not the element. | Return the cell, not the element.
 | Common Lisp | mit | SvenMichaelKlose/tre,SvenMichaelKlose/tre,SvenMichaelKlose/tre | common-lisp |
## Code Before:
;;;;; TRE environment
;;;;; Copyright (c) 2005-2006,2008,2010 Sven Klose <pixel@copei.de>
(if (not (eq t *BUILTIN-MEMBER*))
(progn
(defun %member-r (elm lst)
(if lst
(if (equal elm (car lst))
lst
(%member-r elm (cdr lst)))))
(defun member (elm &rest lsts)
(or (%member-r elm (car lsts))
(if (cdr lsts)
(apply #'member elm (cdr lsts)))))))
(define-test "MEMBER finds elements"
((if (member 's '(i) '(l i k e) '(l i s p))
t))
t)
(define-test "MEMBER finds elements with user predicate"
((if (member "lisp" '("tre" "lisp") :test #'string=)
t))
t)
(define-test "MEMBER detects foureign elements"
((member 'A '(l i s p)))
nil)
(defun %member-if-r (pred lst)
(if lst
(if (funcall pred (car lst))
(car lst)
(%member-if-r pred (cdr lst)))))
(defun member-if (pred &rest lsts)
(or (%member-if-r pred (car lsts))
(if (cdr lsts)
(apply #'member-if pred (cdr lsts)))))
## Instruction:
Return the cell, not the element.
## Code After:
;;;;; TRE environment
;;;;; Copyright (c) 2005-2006,2008,2010-2011 Sven Klose <pixel@copei.de>
(if (not (eq t *BUILTIN-MEMBER*))
(progn
(defun %member-r (elm lst)
(if lst
(if (equal elm (car lst))
lst
(%member-r elm (cdr lst)))))
(defun member (elm &rest lsts)
(or (%member-r elm (car lsts))
(if (cdr lsts)
(apply #'member elm (cdr lsts)))))))
(define-test "MEMBER finds elements"
((if (member 's '(i) '(l i k e) '(l i s p))
t))
t)
(define-test "MEMBER finds elements with user predicate"
((if (member "lisp" '("tre" "lisp") :test #'string=)
t))
t)
(define-test "MEMBER detects foureign elements"
((member 'A '(l i s p)))
nil)
(defun %member-if-r (pred lst)
(if lst
(if (funcall pred (car lst))
lst
(%member-if-r pred (cdr lst)))))
(defun member-if (pred &rest lsts)
(or (%member-if-r pred (car lsts))
(if (cdr lsts)
(apply #'member-if pred (cdr lsts)))))
| ;;;;; TRE environment
- ;;;;; Copyright (c) 2005-2006,2008,2010 Sven Klose <pixel@copei.de>
+ ;;;;; Copyright (c) 2005-2006,2008,2010-2011 Sven Klose <pixel@copei.de>
? +++++
(if (not (eq t *BUILTIN-MEMBER*))
(progn
(defun %member-r (elm lst)
(if lst
(if (equal elm (car lst))
lst
(%member-r elm (cdr lst)))))
(defun member (elm &rest lsts)
(or (%member-r elm (car lsts))
(if (cdr lsts)
(apply #'member elm (cdr lsts)))))))
(define-test "MEMBER finds elements"
((if (member 's '(i) '(l i k e) '(l i s p))
t))
t)
(define-test "MEMBER finds elements with user predicate"
((if (member "lisp" '("tre" "lisp") :test #'string=)
t))
t)
(define-test "MEMBER detects foureign elements"
((member 'A '(l i s p)))
nil)
(defun %member-if-r (pred lst)
(if lst
(if (funcall pred (car lst))
- (car lst)
+ lst
(%member-if-r pred (cdr lst)))))
(defun member-if (pred &rest lsts)
(or (%member-if-r pred (car lsts))
(if (cdr lsts)
(apply #'member-if pred (cdr lsts))))) | 4 | 0.1 | 2 | 2 |
75da8eb05f42e59cdcb944a12e42a0c1eeb936c1 | upgrade/includes/311to320.sql | upgrade/includes/311to320.sql | UPDATE `settings` SET `cmumversion`='3.2.0', `dbversion`='3.2.0' WHERE `id`=1;
ALTER TABLE `groups` ADD `enabled` tinyint(1) NULL DEFAULT '1' AFTER `comment`;
ALTER TABLE `settings` DROP `dbversion`;
ALTER TABLE `settings` ADD `autoupdcheck` TINYINT(1) NULL AFTER `soonexpusrorder`;
UPDATE `settings` SET `autoupdcheck`='0' WHERE `id`=1;
ALTER TABLE `log_activity` ADD `data` VARCHAR(50) NULL AFTER `userid`;
TRUNCATE TABLE `log_activity`;
ALTER TABLE `settings` ADD `usrorderby` VARCHAR(12) NULL AFTER `autoupdcheck`, ADD `usrorder` VARCHAR(4) NULL AFTER `usrorderby`;
UPDATE `settings` SET `usrorderby`='user',`usrorder`='asc' WHERE `id`=1;
ALTER TABLE `log_genxmlreq` CHANGE `ip` `ip` VARCHAR(45);
ALTER TABLE `log_login` CHANGE `ip` `ip` VARCHAR(45); | UPDATE `settings` SET `cmumversion`='3.2.0', `dbversion`='3.2.0' WHERE `id`=1;
ALTER TABLE `groups` ADD `enabled` tinyint(1) NULL DEFAULT '1' AFTER `comment`;
ALTER TABLE `settings` DROP `dbversion`;
ALTER TABLE `settings` ADD `autoupdcheck` TINYINT(1) NULL AFTER `soonexpusrorder`;
UPDATE `settings` SET `autoupdcheck`='0' WHERE `id`=1;
TRUNCATE TABLE `log_activity`;
ALTER TABLE `log_activity` ADD `data` VARCHAR(50) NULL AFTER `userid`;
ALTER TABLE `settings` ADD `usrorderby` VARCHAR(12) NULL AFTER `autoupdcheck`, ADD `usrorder` VARCHAR(4) NULL AFTER `usrorderby`;
UPDATE `settings` SET `usrorderby`='user',`usrorder`='asc' WHERE `id`=1;
ALTER TABLE `log_genxmlreq` CHANGE `ip` `ip` VARCHAR(45);
ALTER TABLE `log_login` CHANGE `ip` `ip` VARCHAR(45); | Truncate activity log table before altering it | Truncate activity log table before altering it
 | SQL | mit | dukereborn/cmum,dukereborn/cmum | sql |
## Code Before:
UPDATE `settings` SET `cmumversion`='3.2.0', `dbversion`='3.2.0' WHERE `id`=1;
ALTER TABLE `groups` ADD `enabled` tinyint(1) NULL DEFAULT '1' AFTER `comment`;
ALTER TABLE `settings` DROP `dbversion`;
ALTER TABLE `settings` ADD `autoupdcheck` TINYINT(1) NULL AFTER `soonexpusrorder`;
UPDATE `settings` SET `autoupdcheck`='0' WHERE `id`=1;
ALTER TABLE `log_activity` ADD `data` VARCHAR(50) NULL AFTER `userid`;
TRUNCATE TABLE `log_activity`;
ALTER TABLE `settings` ADD `usrorderby` VARCHAR(12) NULL AFTER `autoupdcheck`, ADD `usrorder` VARCHAR(4) NULL AFTER `usrorderby`;
UPDATE `settings` SET `usrorderby`='user',`usrorder`='asc' WHERE `id`=1;
ALTER TABLE `log_genxmlreq` CHANGE `ip` `ip` VARCHAR(45);
ALTER TABLE `log_login` CHANGE `ip` `ip` VARCHAR(45);
## Instruction:
Truncate activity log table before altering it
## Code After:
UPDATE `settings` SET `cmumversion`='3.2.0', `dbversion`='3.2.0' WHERE `id`=1;
ALTER TABLE `groups` ADD `enabled` tinyint(1) NULL DEFAULT '1' AFTER `comment`;
ALTER TABLE `settings` DROP `dbversion`;
ALTER TABLE `settings` ADD `autoupdcheck` TINYINT(1) NULL AFTER `soonexpusrorder`;
UPDATE `settings` SET `autoupdcheck`='0' WHERE `id`=1;
TRUNCATE TABLE `log_activity`;
ALTER TABLE `log_activity` ADD `data` VARCHAR(50) NULL AFTER `userid`;
ALTER TABLE `settings` ADD `usrorderby` VARCHAR(12) NULL AFTER `autoupdcheck`, ADD `usrorder` VARCHAR(4) NULL AFTER `usrorderby`;
UPDATE `settings` SET `usrorderby`='user',`usrorder`='asc' WHERE `id`=1;
ALTER TABLE `log_genxmlreq` CHANGE `ip` `ip` VARCHAR(45);
ALTER TABLE `log_login` CHANGE `ip` `ip` VARCHAR(45); | UPDATE `settings` SET `cmumversion`='3.2.0', `dbversion`='3.2.0' WHERE `id`=1;
ALTER TABLE `groups` ADD `enabled` tinyint(1) NULL DEFAULT '1' AFTER `comment`;
ALTER TABLE `settings` DROP `dbversion`;
ALTER TABLE `settings` ADD `autoupdcheck` TINYINT(1) NULL AFTER `soonexpusrorder`;
UPDATE `settings` SET `autoupdcheck`='0' WHERE `id`=1;
+ TRUNCATE TABLE `log_activity`;
ALTER TABLE `log_activity` ADD `data` VARCHAR(50) NULL AFTER `userid`;
- TRUNCATE TABLE `log_activity`;
ALTER TABLE `settings` ADD `usrorderby` VARCHAR(12) NULL AFTER `autoupdcheck`, ADD `usrorder` VARCHAR(4) NULL AFTER `usrorderby`;
UPDATE `settings` SET `usrorderby`='user',`usrorder`='asc' WHERE `id`=1;
ALTER TABLE `log_genxmlreq` CHANGE `ip` `ip` VARCHAR(45);
ALTER TABLE `log_login` CHANGE `ip` `ip` VARCHAR(45); | 2 | 0.181818 | 1 | 1 |
20829f036d7b54b15e3d748bad8923a00ff0313e | app/views/posts/show.html.haml | app/views/posts/show.html.haml | - if @pages.respond_to?(:total_pages)
- content_for(:rel_next_prev_link_tags) { rel_next_prev_link_tags(@pages) }
- content_for(:canonical_tag) do
%link{rel: :canonical, href: url_for(all: true, only_path: false)}
- breadcrumb :post, @post
- set_meta_tags title: @post.title
%article
%h1= @post.title
= raw render_markdown(@pages.map(&:body).join)
- if @pages.respond_to?(:total_pages) && @pages.total_pages > 1
%nav.pagination
= paginate @pages, theme: :post
= render partial: 'banner', locals: {position: :press_article_bottom}
%aside
= render partial: 'sns_share', locals: {title: @post.title, url: post_url(@post), twitter_account: current_site.twitter_account, caption: current_site.sns_share_caption}
= render partial: 'short_collection', locals: {posts: @post.related_posts}
| - if @pages.respond_to?(:total_pages)
- content_for(:rel_next_prev_link_tags) { rel_next_prev_link_tags(@pages) }
- content_for(:canonical_tag) do
%link{rel: :canonical, href: url_for(all: true, only_path: false)}
- breadcrumb :post, @post
- set_meta_tags title: @post.title, description: @post.body
%article
%h1= @post.title
= raw render_markdown(@pages.map(&:body).join)
- if @pages.respond_to?(:total_pages) && @pages.total_pages > 1
%nav.pagination
= paginate @pages, theme: :post
= render partial: 'banner', locals: {position: :press_article_bottom}
%aside
= render partial: 'sns_share', locals: {title: @post.title, url: post_url(@post), twitter_account: current_site.twitter_account, caption: current_site.sns_share_caption}
= render partial: 'short_collection', locals: {posts: @post.related_posts}
| Set meta description to post.body | Set meta description to post.body
| Haml | mit | bm-sms/daimon-news,bm-sms/daimon-news,bm-sms/daimon-news | haml | ## Code Before:
- if @pages.respond_to?(:total_pages)
- content_for(:rel_next_prev_link_tags) { rel_next_prev_link_tags(@pages) }
- content_for(:canonical_tag) do
%link{rel: :canonical, href: url_for(all: true, only_path: false)}
- breadcrumb :post, @post
- set_meta_tags title: @post.title
%article
%h1= @post.title
= raw render_markdown(@pages.map(&:body).join)
- if @pages.respond_to?(:total_pages) && @pages.total_pages > 1
%nav.pagination
= paginate @pages, theme: :post
= render partial: 'banner', locals: {position: :press_article_bottom}
%aside
= render partial: 'sns_share', locals: {title: @post.title, url: post_url(@post), twitter_account: current_site.twitter_account, caption: current_site.sns_share_caption}
= render partial: 'short_collection', locals: {posts: @post.related_posts}
## Instruction:
Set meta description to post.body
## Code After:
- if @pages.respond_to?(:total_pages)
- content_for(:rel_next_prev_link_tags) { rel_next_prev_link_tags(@pages) }
- content_for(:canonical_tag) do
%link{rel: :canonical, href: url_for(all: true, only_path: false)}
- breadcrumb :post, @post
- set_meta_tags title: @post.title, description: @post.body
%article
%h1= @post.title
= raw render_markdown(@pages.map(&:body).join)
- if @pages.respond_to?(:total_pages) && @pages.total_pages > 1
%nav.pagination
= paginate @pages, theme: :post
= render partial: 'banner', locals: {position: :press_article_bottom}
%aside
= render partial: 'sns_share', locals: {title: @post.title, url: post_url(@post), twitter_account: current_site.twitter_account, caption: current_site.sns_share_caption}
= render partial: 'short_collection', locals: {posts: @post.related_posts}
| - if @pages.respond_to?(:total_pages)
- content_for(:rel_next_prev_link_tags) { rel_next_prev_link_tags(@pages) }
- content_for(:canonical_tag) do
%link{rel: :canonical, href: url_for(all: true, only_path: false)}
- breadcrumb :post, @post
- - set_meta_tags title: @post.title
+ - set_meta_tags title: @post.title, description: @post.body
%article
%h1= @post.title
= raw render_markdown(@pages.map(&:body).join)
- if @pages.respond_to?(:total_pages) && @pages.total_pages > 1
%nav.pagination
= paginate @pages, theme: :post
= render partial: 'banner', locals: {position: :press_article_bottom}
%aside
= render partial: 'sns_share', locals: {title: @post.title, url: post_url(@post), twitter_account: current_site.twitter_account, caption: current_site.sns_share_caption}
= render partial: 'short_collection', locals: {posts: @post.related_posts} | 2 | 0.095238 | 1 | 1 |
09143f762ffa1f97dee2b69c878dd2061ab7e0aa | surelog/meta.yaml | surelog/meta.yaml | {% set version = '%s_%04i_%s'|format(GIT_DESCRIBE_TAG|replace('-v','.') or '0.X', GIT_DESCRIBE_NUMBER|int, GIT_DESCRIBE_HASH or 'gUNKNOWN') %}
package:
name: surelog
version: {{ version }}
source:
git_url: https://github.com/alainmarcel/Surelog
git_rev: master
build:
number: {{ environ.get('DATE_NUM') }}
string: {{ environ.get('DATE_STR') }}
script_env:
- CI
- TRAVIS
requirements:
build:
- {{ compiler('c') }}
- {{ compiler('cxx') }}
- cmake
- pkg-config
host:
- python {{ python }}
- gperftools
- tk
- libunwind
- libuuid
- swig
run:
- python
about:
home: https://github.com/alainmarcel/Surelog
license: Apache
license_file: LICENSE
summary: 'Parser/Compiler for SystemVerilog'
| {% set version = '%s_%04i_%s'|format(GIT_DESCRIBE_TAG|replace('-v','.') or '0.X', GIT_DESCRIBE_NUMBER|int, GIT_DESCRIBE_HASH or 'gUNKNOWN') %}
package:
name: surelog
version: {{ version }}
source:
git_url: https://github.com/alainmarcel/Surelog
git_rev: master
build:
number: {{ environ.get('DATE_NUM') }}
string: {{ environ.get('DATE_STR') }}
script_env:
- CI
- TRAVIS
requirements:
build:
- {{ compiler('c') }}
- {{ compiler('cxx') }}
- cmake
- pkg-config
host:
- python {{ python }}
- gperftools
- tk
- libunwind
- libuuid
- swig
run:
- python
- gperftools
about:
home: https://github.com/alainmarcel/Surelog
license: Apache
license_file: LICENSE
summary: 'Parser/Compiler for SystemVerilog'
| Make sure gperftools are available at runtime. | Make sure gperftools are available at runtime.
Signed-off-by: Henner Zeller <259ad50fabbfd39355f36ecd6c0afca541ea66ef@acm.org>
| YAML | apache-2.0 | SymbiFlow/conda-packages,SymbiFlow/conda-packages | yaml | ## Code Before:
{% set version = '%s_%04i_%s'|format(GIT_DESCRIBE_TAG|replace('-v','.') or '0.X', GIT_DESCRIBE_NUMBER|int, GIT_DESCRIBE_HASH or 'gUNKNOWN') %}
package:
name: surelog
version: {{ version }}
source:
git_url: https://github.com/alainmarcel/Surelog
git_rev: master
build:
number: {{ environ.get('DATE_NUM') }}
string: {{ environ.get('DATE_STR') }}
script_env:
- CI
- TRAVIS
requirements:
build:
- {{ compiler('c') }}
- {{ compiler('cxx') }}
- cmake
- pkg-config
host:
- python {{ python }}
- gperftools
- tk
- libunwind
- libuuid
- swig
run:
- python
about:
home: https://github.com/alainmarcel/Surelog
license: Apache
license_file: LICENSE
summary: 'Parser/Compiler for SystemVerilog'
## Instruction:
Make sure gperftools are available at runtime.
Signed-off-by: Henner Zeller <259ad50fabbfd39355f36ecd6c0afca541ea66ef@acm.org>
## Code After:
{% set version = '%s_%04i_%s'|format(GIT_DESCRIBE_TAG|replace('-v','.') or '0.X', GIT_DESCRIBE_NUMBER|int, GIT_DESCRIBE_HASH or 'gUNKNOWN') %}
package:
name: surelog
version: {{ version }}
source:
git_url: https://github.com/alainmarcel/Surelog
git_rev: master
build:
number: {{ environ.get('DATE_NUM') }}
string: {{ environ.get('DATE_STR') }}
script_env:
- CI
- TRAVIS
requirements:
build:
- {{ compiler('c') }}
- {{ compiler('cxx') }}
- cmake
- pkg-config
host:
- python {{ python }}
- gperftools
- tk
- libunwind
- libuuid
- swig
run:
- python
- gperftools
about:
home: https://github.com/alainmarcel/Surelog
license: Apache
license_file: LICENSE
summary: 'Parser/Compiler for SystemVerilog'
| {% set version = '%s_%04i_%s'|format(GIT_DESCRIBE_TAG|replace('-v','.') or '0.X', GIT_DESCRIBE_NUMBER|int, GIT_DESCRIBE_HASH or 'gUNKNOWN') %}
package:
name: surelog
version: {{ version }}
source:
git_url: https://github.com/alainmarcel/Surelog
git_rev: master
build:
number: {{ environ.get('DATE_NUM') }}
string: {{ environ.get('DATE_STR') }}
script_env:
- CI
- TRAVIS
requirements:
build:
- {{ compiler('c') }}
- {{ compiler('cxx') }}
- cmake
- pkg-config
host:
- python {{ python }}
- gperftools
- tk
- libunwind
- libuuid
- swig
run:
- python
+ - gperftools
about:
home: https://github.com/alainmarcel/Surelog
license: Apache
license_file: LICENSE
summary: 'Parser/Compiler for SystemVerilog' | 1 | 0.025641 | 1 | 0 |
03701d57af1c5c3093134d30e37a03fb140317de | _includes/footer.html | _includes/footer.html | <footer class="site-footer">
<!-- Latest compiled and minified JavaScript -->
<script src="https://code.jquery.com/jquery-2.1.4.min.js"crossorigin="anonymous"></script>
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js" integrity="sha384-0mSbJDEHialfmuBBQP6A4Qrprq5OVfW37PRR3j5ELqxss1yVqOtnepnHVP9aJ7xS" crossorigin="anonymous"></script>
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-69723592-1', 'auto');
ga('send', 'pageview');
</script>
</footer>
| <footer class="site-footer">
<!-- Latest compiled and minified JavaScript -->
<script src="https://code.jquery.com/jquery-2.1.4.min.js"crossorigin="anonymous"></script>
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js" integrity="sha384-0mSbJDEHialfmuBBQP6A4Qrprq5OVfW37PRR3j5ELqxss1yVqOtnepnHVP9aJ7xS" crossorigin="anonymous"></script>
</footer>
| Move the Google Analytics code to CloudFlare | Move the Google Analytics code to CloudFlare
Lets see if this feature works!
| HTML | mit | SalvatoreT/monkeynumber,SalvatoreT/monkeynumber | html | ## Code Before:
<footer class="site-footer">
<!-- Latest compiled and minified JavaScript -->
<script src="https://code.jquery.com/jquery-2.1.4.min.js"crossorigin="anonymous"></script>
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js" integrity="sha384-0mSbJDEHialfmuBBQP6A4Qrprq5OVfW37PRR3j5ELqxss1yVqOtnepnHVP9aJ7xS" crossorigin="anonymous"></script>
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-69723592-1', 'auto');
ga('send', 'pageview');
</script>
</footer>
## Instruction:
Move the Google Analytics code to CloudFlare
Lets see if this feature works!
## Code After:
<footer class="site-footer">
<!-- Latest compiled and minified JavaScript -->
<script src="https://code.jquery.com/jquery-2.1.4.min.js"crossorigin="anonymous"></script>
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js" integrity="sha384-0mSbJDEHialfmuBBQP6A4Qrprq5OVfW37PRR3j5ELqxss1yVqOtnepnHVP9aJ7xS" crossorigin="anonymous"></script>
</footer>
| <footer class="site-footer">
<!-- Latest compiled and minified JavaScript -->
<script src="https://code.jquery.com/jquery-2.1.4.min.js"crossorigin="anonymous"></script>
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js" integrity="sha384-0mSbJDEHialfmuBBQP6A4Qrprq5OVfW37PRR3j5ELqxss1yVqOtnepnHVP9aJ7xS" crossorigin="anonymous"></script>
- <script>
- (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
- (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
- m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
- })(window,document,'script','//www.google-analytics.com/analytics.js','ga');
-
- ga('create', 'UA-69723592-1', 'auto');
- ga('send', 'pageview');
- </script>
</footer> | 9 | 0.642857 | 0 | 9 |
b3972a05472f510278dbd5d94e95bde4ae14e6cc | packages/vulcan-admin/lib/modules/fragments.js | packages/vulcan-admin/lib/modules/fragments.js | import { registerFragment } from 'meteor/vulcan:lib';
// ------------------------------ Vote ------------------------------ //
// note: fragment used by default on the UsersProfile fragment
registerFragment(`
fragment UsersAdmin on User {
_id
username
createdAt
isAdmin
displayName
email
emailHash
slug
groups
services
avatarUrl
}
`);
| import { registerFragment } from 'meteor/vulcan:lib';
// ------------------------------ Vote ------------------------------ //
// note: fragment used by default on the UsersProfile fragment
registerFragment(`
fragment UsersAdmin on User {
_id
username
createdAt
isAdmin
displayName
email
emailHash
slug
groups
services
avatarUrl
pageUrl
pagePath
}
`);
| Add pageUrl and pagePath to user fragment; | Add pageUrl and pagePath to user fragment;
| JavaScript | mit | VulcanJS/Vulcan,VulcanJS/Vulcan | javascript | ## Code Before:
import { registerFragment } from 'meteor/vulcan:lib';
// ------------------------------ Vote ------------------------------ //
// note: fragment used by default on the UsersProfile fragment
registerFragment(`
fragment UsersAdmin on User {
_id
username
createdAt
isAdmin
displayName
email
emailHash
slug
groups
services
avatarUrl
}
`);
## Instruction:
Add pageUrl and pagePath to user fragment;
## Code After:
import { registerFragment } from 'meteor/vulcan:lib';
// ------------------------------ Vote ------------------------------ //
// note: fragment used by default on the UsersProfile fragment
registerFragment(`
fragment UsersAdmin on User {
_id
username
createdAt
isAdmin
displayName
email
emailHash
slug
groups
services
avatarUrl
pageUrl
pagePath
}
`);
| import { registerFragment } from 'meteor/vulcan:lib';
// ------------------------------ Vote ------------------------------ //
// note: fragment used by default on the UsersProfile fragment
registerFragment(`
fragment UsersAdmin on User {
_id
username
createdAt
isAdmin
displayName
email
emailHash
slug
groups
services
avatarUrl
+ pageUrl
+ pagePath
}
`); | 2 | 0.1 | 2 | 0 |
509c1679e22666385b44b8dd20236bcf31034fcb | .travis.yml | .travis.yml | sudo: false
language: rust
rust:
- nightly
- beta
- stable
- 1.9.0
before_script:
- |
pip install 'travis-cargo<0.2' --user &&
export PATH=$HOME/.local/bin:$PATH
script:
- |
travis-cargo build &&
travis-cargo test &&
travis-cargo build -- --no-default-features &&
travis-cargo test -- --no-default-features
after_success:
- |
if [ "$TRAVIS_RUST_VERSION" == "stable" ]; then
travis-cargo doc -- --no-default-features &&
mv target/doc target/doc_core &&
travis-cargo doc &&
mv target/doc_core target/doc/core &&
travis-cargo doc-upload
fi
env:
global:
- TRAVIS_CARGO_NIGHTLY_FEATURE="" # no unstable feature
- secure: ddcWXicVcCooC+Dy8guGruZY2bAU3oyGjrxdC3YNfBYdatEKzW1toAiQyN8SRyZyfoHsbb7lh4YeBfv1rpmTPM6nvHMz9CHMlvED8Y+/QuYoKN2qrNiQ7eQ9xSVhOVlha/GMPSZXxmEIuJVj0Dn1D/S4RWyNMKCJdj2YvybPzOU=
| sudo: false
language: rust
rust:
- nightly
- beta
- stable
- 1.9.0
before_script:
- |
pip install 'travis-cargo<0.2' --user &&
export PATH=$HOME/.local/bin:$PATH
script:
- |
travis-cargo test -- --no-default-features &&
travis-cargo test &&
travis-cargo test -- --all-features &&
rm Cargo.lock &&
# no version of quickcheck compiles with -Z minimal-versions, but serde does
travis-cargo --only nightly build -- -Z minimal-versions --features 'serde serde-test'
after_success:
- |
if [ "$TRAVIS_RUST_VERSION" == "stable" ]; then
travis-cargo doc -- --no-default-features &&
mv target/doc target/doc_core &&
travis-cargo doc &&
mv target/doc_core target/doc/core &&
travis-cargo doc-upload
fi
env:
global:
- TRAVIS_CARGO_NIGHTLY_FEATURE="" # no unstable feature
- secure: ddcWXicVcCooC+Dy8guGruZY2bAU3oyGjrxdC3YNfBYdatEKzW1toAiQyN8SRyZyfoHsbb7lh4YeBfv1rpmTPM6nvHMz9CHMlvED8Y+/QuYoKN2qrNiQ7eQ9xSVhOVlha/GMPSZXxmEIuJVj0Dn1D/S4RWyNMKCJdj2YvybPzOU=
| Test optional dependencies on Travis | Test optional dependencies on Travis
| YAML | apache-2.0 | tomprogrammer/rust-ascii | yaml | ## Code Before:
sudo: false
language: rust
rust:
- nightly
- beta
- stable
- 1.9.0
before_script:
- |
pip install 'travis-cargo<0.2' --user &&
export PATH=$HOME/.local/bin:$PATH
script:
- |
travis-cargo build &&
travis-cargo test &&
travis-cargo build -- --no-default-features &&
travis-cargo test -- --no-default-features
after_success:
- |
if [ "$TRAVIS_RUST_VERSION" == "stable" ]; then
travis-cargo doc -- --no-default-features &&
mv target/doc target/doc_core &&
travis-cargo doc &&
mv target/doc_core target/doc/core &&
travis-cargo doc-upload
fi
env:
global:
- TRAVIS_CARGO_NIGHTLY_FEATURE="" # no unstable feature
- secure: ddcWXicVcCooC+Dy8guGruZY2bAU3oyGjrxdC3YNfBYdatEKzW1toAiQyN8SRyZyfoHsbb7lh4YeBfv1rpmTPM6nvHMz9CHMlvED8Y+/QuYoKN2qrNiQ7eQ9xSVhOVlha/GMPSZXxmEIuJVj0Dn1D/S4RWyNMKCJdj2YvybPzOU=
## Instruction:
Test optional dependencies on Travis
## Code After:
sudo: false
language: rust
rust:
- nightly
- beta
- stable
- 1.9.0
before_script:
- |
pip install 'travis-cargo<0.2' --user &&
export PATH=$HOME/.local/bin:$PATH
script:
- |
travis-cargo test -- --no-default-features &&
travis-cargo test &&
travis-cargo test -- --all-features &&
rm Cargo.lock &&
# no version of quickcheck compiles with -Z minimal-versions, but serde does
travis-cargo --only nightly build -- -Z minimal-versions --features 'serde serde-test'
after_success:
- |
if [ "$TRAVIS_RUST_VERSION" == "stable" ]; then
travis-cargo doc -- --no-default-features &&
mv target/doc target/doc_core &&
travis-cargo doc &&
mv target/doc_core target/doc/core &&
travis-cargo doc-upload
fi
env:
global:
- TRAVIS_CARGO_NIGHTLY_FEATURE="" # no unstable feature
- secure: ddcWXicVcCooC+Dy8guGruZY2bAU3oyGjrxdC3YNfBYdatEKzW1toAiQyN8SRyZyfoHsbb7lh4YeBfv1rpmTPM6nvHMz9CHMlvED8Y+/QuYoKN2qrNiQ7eQ9xSVhOVlha/GMPSZXxmEIuJVj0Dn1D/S4RWyNMKCJdj2YvybPzOU=
| sudo: false
language: rust
rust:
- nightly
- beta
- stable
- 1.9.0
before_script:
- |
pip install 'travis-cargo<0.2' --user &&
export PATH=$HOME/.local/bin:$PATH
script:
- |
- travis-cargo build &&
+ travis-cargo test -- --no-default-features &&
travis-cargo test &&
- travis-cargo build -- --no-default-features &&
- travis-cargo test -- --no-default-features
? ------ - ^
+ travis-cargo test -- --all-features &&
? ^ +++
+ rm Cargo.lock &&
+ # no version of quickcheck compiles with -Z minimal-versions, but serde does
+ travis-cargo --only nightly build -- -Z minimal-versions --features 'serde serde-test'
after_success:
- |
if [ "$TRAVIS_RUST_VERSION" == "stable" ]; then
travis-cargo doc -- --no-default-features &&
mv target/doc target/doc_core &&
travis-cargo doc &&
mv target/doc_core target/doc/core &&
travis-cargo doc-upload
fi
env:
global:
- TRAVIS_CARGO_NIGHTLY_FEATURE="" # no unstable feature
- secure: ddcWXicVcCooC+Dy8guGruZY2bAU3oyGjrxdC3YNfBYdatEKzW1toAiQyN8SRyZyfoHsbb7lh4YeBfv1rpmTPM6nvHMz9CHMlvED8Y+/QuYoKN2qrNiQ7eQ9xSVhOVlha/GMPSZXxmEIuJVj0Dn1D/S4RWyNMKCJdj2YvybPzOU= | 8 | 0.228571 | 5 | 3 |
894c5b360f1451d20b5ac4f74d3185509c249a56 | lib/dogma/rules.ex | lib/dogma/rules.ex | defmodule Dogma.Rules do
@moduledoc """
Responsible for running of the appropriate rule set on a given set of scripts
with the appropriate configuration.
"""
alias Dogma.Formatter
alias Dogma.Script
@default_rule_set Dogma.RuleSet.All
@doc """
Runs the rules in the current rule set on the given scripts.
"""
def test(scripts) do
formatter = Formatter.default_formatter
scripts
|> Enum.map(&test_script(&1, formatter, selected_set))
end
@doc """
Returns currently selected rule set, as specified in the mix config.
Defaults to `Dogma.RuleSet.All`
"""
def selected_set do
Application.get_env :dogma, :rule_set, @default_rule_set
end
defp test_script(script, formatter, rule_set) do
rules = rule_set.list
errors = script |> Script.run_tests( rules )
script = %Script{ script | errors: errors }
Formatter.script( script, formatter )
script
end
end
| defmodule Dogma.Rules do
@moduledoc """
Responsible for running of the appropriate rule set on a given set of scripts
with the appropriate configuration.
"""
alias Dogma.Formatter
alias Dogma.Script
@default_rule_set Dogma.RuleSet.All
@doc """
Runs the rules in the current rule set on the given scripts.
"""
def test(scripts) do
formatter = Formatter.default_formatter
test_set = selected_set
scripts
|> Enum.map(&Task.async(fn -> test_script(&1, formatter, test_set) end))
|> Enum.map(&Task.await/1)
end
@doc """
Returns currently selected rule set, as specified in the mix config.
Defaults to `Dogma.RuleSet.All`
"""
def selected_set do
Application.get_env :dogma, :rule_set, @default_rule_set
end
defp test_script(script, formatter, rule_set) do
rules = rule_set.list
errors = script |> Script.run_tests( rules )
script = %Script{ script | errors: errors }
Formatter.script( script, formatter )
script
end
end
| Test files in parallel using Task.async | Test files in parallel using Task.async
| Elixir | mit | linearregression/dogma,eksperimental/dogma,meadsteve/dogma,knewter/dogma,henrik/dogma,rrrene/dogma,simonewebdesign/dogma,jacknoble/dogma,timbuchwaldt/dogma,aerosol/dogma,5thWall/dogma | elixir | ## Code Before:
defmodule Dogma.Rules do
@moduledoc """
Responsible for running of the appropriate rule set on a given set of scripts
with the appropriate configuration.
"""
alias Dogma.Formatter
alias Dogma.Script
@default_rule_set Dogma.RuleSet.All
@doc """
Runs the rules in the current rule set on the given scripts.
"""
def test(scripts) do
formatter = Formatter.default_formatter
scripts
|> Enum.map(&test_script(&1, formatter, selected_set))
end
@doc """
Returns currently selected rule set, as specified in the mix config.
Defaults to `Dogma.RuleSet.All`
"""
def selected_set do
Application.get_env :dogma, :rule_set, @default_rule_set
end
defp test_script(script, formatter, rule_set) do
rules = rule_set.list
errors = script |> Script.run_tests( rules )
script = %Script{ script | errors: errors }
Formatter.script( script, formatter )
script
end
end
## Instruction:
Test files in parallel using Task.async
## Code After:
defmodule Dogma.Rules do
@moduledoc """
Responsible for running of the appropriate rule set on a given set of scripts
with the appropriate configuration.
"""
alias Dogma.Formatter
alias Dogma.Script
@default_rule_set Dogma.RuleSet.All
@doc """
Runs the rules in the current rule set on the given scripts.
"""
def test(scripts) do
formatter = Formatter.default_formatter
test_set = selected_set
scripts
|> Enum.map(&Task.async(fn -> test_script(&1, formatter, test_set) end))
|> Enum.map(&Task.await/1)
end
@doc """
Returns currently selected rule set, as specified in the mix config.
Defaults to `Dogma.RuleSet.All`
"""
def selected_set do
Application.get_env :dogma, :rule_set, @default_rule_set
end
defp test_script(script, formatter, rule_set) do
rules = rule_set.list
errors = script |> Script.run_tests( rules )
script = %Script{ script | errors: errors }
Formatter.script( script, formatter )
script
end
end
| defmodule Dogma.Rules do
@moduledoc """
Responsible for running of the appropriate rule set on a given set of scripts
with the appropriate configuration.
"""
alias Dogma.Formatter
alias Dogma.Script
@default_rule_set Dogma.RuleSet.All
@doc """
Runs the rules in the current rule set on the given scripts.
"""
def test(scripts) do
formatter = Formatter.default_formatter
+ test_set = selected_set
scripts
- |> Enum.map(&test_script(&1, formatter, selected_set))
? ----- ^
+ |> Enum.map(&Task.async(fn -> test_script(&1, formatter, test_set) end))
? ++ +++++++++++++++++ ^^ ++++ +
+ |> Enum.map(&Task.await/1)
end
@doc """
Returns currently selected rule set, as specified in the mix config.
Defaults to `Dogma.RuleSet.All`
"""
def selected_set do
Application.get_env :dogma, :rule_set, @default_rule_set
end
-
defp test_script(script, formatter, rule_set) do
rules = rule_set.list
errors = script |> Script.run_tests( rules )
script = %Script{ script | errors: errors }
Formatter.script( script, formatter )
script
end
end | 5 | 0.128205 | 3 | 2 |
1118e43ba564bb499452bfc498deba6edc380cd3 | lib/hets/prove_caller.rb | lib/hets/prove_caller.rb | module Hets
class ProveCaller < ActionCaller
CMD = 'prove'
METHOD = :post
COMMAND_LIST = %w(auto)
PROVE_OPTIONS = {format: 'json', include: 'true'}
def call(iri, options = {})
escaped_iri = Rack::Utils.escape_path(iri)
arguments = [escaped_iri, *COMMAND_LIST]
api_uri = build_api_uri(CMD, arguments, build_query_string)
perform(api_uri, PROVE_OPTIONS.merge(options), METHOD)
end
end
end
| module Hets
class ProveCaller < ActionCaller
CMD = 'prove'
METHOD = :post
COMMAND_LIST = %w(auto)
PROVE_OPTIONS = {format: 'json', include: 'true'}
def call(iri, options = {})
escaped_iri = Rack::Utils.escape_path(iri)
arguments = [escaped_iri, *COMMAND_LIST]
api_uri = build_api_uri(CMD, arguments, build_query_string)
perform(api_uri, PROVE_OPTIONS.merge(options), METHOD)
rescue UnfollowableResponseError => error
handle_possible_hets_error(error)
end
end
end
| Add hets error handling to ProveCaller. | Add hets error handling to ProveCaller.
| Ruby | agpl-3.0 | ontohub/ontohub,ontohub/ontohub,ontohub/ontohub,ontohub/ontohub,ontohub/ontohub,ontohub/ontohub | ruby | ## Code Before:
module Hets
class ProveCaller < ActionCaller
CMD = 'prove'
METHOD = :post
COMMAND_LIST = %w(auto)
PROVE_OPTIONS = {format: 'json', include: 'true'}
def call(iri, options = {})
escaped_iri = Rack::Utils.escape_path(iri)
arguments = [escaped_iri, *COMMAND_LIST]
api_uri = build_api_uri(CMD, arguments, build_query_string)
perform(api_uri, PROVE_OPTIONS.merge(options), METHOD)
end
end
end
## Instruction:
Add hets error handling to ProveCaller.
## Code After:
module Hets
class ProveCaller < ActionCaller
CMD = 'prove'
METHOD = :post
COMMAND_LIST = %w(auto)
PROVE_OPTIONS = {format: 'json', include: 'true'}
def call(iri, options = {})
escaped_iri = Rack::Utils.escape_path(iri)
arguments = [escaped_iri, *COMMAND_LIST]
api_uri = build_api_uri(CMD, arguments, build_query_string)
perform(api_uri, PROVE_OPTIONS.merge(options), METHOD)
rescue UnfollowableResponseError => error
handle_possible_hets_error(error)
end
end
end
| module Hets
class ProveCaller < ActionCaller
CMD = 'prove'
METHOD = :post
COMMAND_LIST = %w(auto)
PROVE_OPTIONS = {format: 'json', include: 'true'}
def call(iri, options = {})
escaped_iri = Rack::Utils.escape_path(iri)
arguments = [escaped_iri, *COMMAND_LIST]
api_uri = build_api_uri(CMD, arguments, build_query_string)
perform(api_uri, PROVE_OPTIONS.merge(options), METHOD)
+ rescue UnfollowableResponseError => error
+ handle_possible_hets_error(error)
end
end
end | 2 | 0.125 | 2 | 0 |
103e19c090be59482d6f3518d70da2adc3676cf3 | README.md | README.md | Abungo
======
Simple chat app made with Socket.IO and node.js
|
Simple chat app made with node.js and Socket.IO.
## Simple and anonymous
The server doesn't store any information about the users or keep logs. It simply relays the data from one client to the rest.
## Responsive layout
The Abungo client is minimally designed and adapts to different screen sizes.


## Running Abungo
First, you'll need to download and install [node.js](http://nodejs.org) if you haven't already.
`cd` to the Abungo directory and install Socket.IO and Express:
```
npm install --save socket.io
npm install --save express
# save option automatically adds the package to package.json
```
After you have everything installed, run `node server` from the Abungo directory. Navigate to `http://localhost:3000` in your browser to see it in action.
| Add screenshots and installation instructions | Add screenshots and installation instructions | Markdown | mit | z-------------/Abungo,z-------------/Abungo | markdown | ## Code Before:
Abungo
======
Simple chat app made with Socket.IO and node.js
## Instruction:
Add screenshots and installation instructions
## Code After:
Simple chat app made with node.js and Socket.IO.
## Simple and anonymous
The server doesn't store any information about the users or keep logs. It simply relays the data from one client to the rest.
## Responsive layout
The Abungo client is minimally designed and adapts to different screen sizes.


## Running Abungo
First, you'll need to download and install [node.js](http://nodejs.org) if you haven't already.
`cd` to the Abungo directory and install Socket.IO and Express:
```
npm install --save socket.io
npm install --save express
# save option automatically adds the package to package.json
```
After you have everything installed, run `node server` from the Abungo directory. Navigate to `http://localhost:3000` in your browser to see it in action.
| - Abungo
- ======
- Simple chat app made with Socket.IO and node.js
? --------- --
+ Simple chat app made with node.js and Socket.IO.
? ++++++++++++
+
+ ## Simple and anonymous
+
+ The server doesn't store any information about the users or keep logs. It simply relays the data from one client to the rest.
+
+ ## Responsive layout
+
+ The Abungo client is minimally designed and adapts to different screen sizes.
+
+ 
+
+ 
+
+ ## Running Abungo
+
+ First, you'll need to download and install [node.js](http://nodejs.org) if you haven't already.
+
+ `cd` to the Abungo directory and install Socket.IO and Express:
+
+ ```
+ npm install --save socket.io
+ npm install --save express
+
+ # save option automatically adds the package to package.json
+ ```
+
+ After you have everything installed, run `node server` from the Abungo directory. Navigate to `http://localhost:3000` in your browser to see it in action. | 31 | 7.75 | 28 | 3 |
422520dac8a7204007e439ccc780593f36a63e1a | unittests/CodeGen/CMakeLists.txt | unittests/CodeGen/CMakeLists.txt | set(LLVM_LINK_COMPONENTS
AsmPrinter
Support
)
set(CodeGenSources
DIEHashTest.cpp
LowLevelTypeTest.cpp
)
add_llvm_unittest(CodeGenTests
${CodeGenSources}
)
add_subdirectory(GlobalISel)
| set(LLVM_LINK_COMPONENTS
AsmPrinter
CodeGen
Core
Support
)
set(CodeGenSources
DIEHashTest.cpp
LowLevelTypeTest.cpp
)
add_llvm_unittest(CodeGenTests
${CodeGenSources}
)
add_subdirectory(GlobalISel)
| Add missing link components to r277160 unittest. NFC. | [GlobalISel] Add missing link components to r277160 unittest. NFC.
It broke a shared builder:
http://lab.llvm.org:8011/builders/llvm-mips-linux/builds/17320
git-svn-id: 0ff597fd157e6f4fc38580e8d64ab130330d2411@277201 91177308-0d34-0410-b5e6-96231b3b80d8
| Text | apache-2.0 | llvm-mirror/llvm,apple/swift-llvm,GPUOpen-Drivers/llvm,llvm-mirror/llvm,llvm-mirror/llvm,GPUOpen-Drivers/llvm,apple/swift-llvm,apple/swift-llvm,llvm-mirror/llvm,GPUOpen-Drivers/llvm,apple/swift-llvm,apple/swift-llvm,llvm-mirror/llvm,apple/swift-llvm,apple/swift-llvm,GPUOpen-Drivers/llvm,llvm-mirror/llvm,apple/swift-llvm,llvm-mirror/llvm,GPUOpen-Drivers/llvm,GPUOpen-Drivers/llvm,llvm-mirror/llvm,GPUOpen-Drivers/llvm,GPUOpen-Drivers/llvm,llvm-mirror/llvm | text | ## Code Before:
set(LLVM_LINK_COMPONENTS
AsmPrinter
Support
)
set(CodeGenSources
DIEHashTest.cpp
LowLevelTypeTest.cpp
)
add_llvm_unittest(CodeGenTests
${CodeGenSources}
)
add_subdirectory(GlobalISel)
## Instruction:
[GlobalISel] Add missing link components to r277160 unittest. NFC.
It broke a shared builder:
http://lab.llvm.org:8011/builders/llvm-mips-linux/builds/17320
git-svn-id: 0ff597fd157e6f4fc38580e8d64ab130330d2411@277201 91177308-0d34-0410-b5e6-96231b3b80d8
## Code After:
set(LLVM_LINK_COMPONENTS
AsmPrinter
CodeGen
Core
Support
)
set(CodeGenSources
DIEHashTest.cpp
LowLevelTypeTest.cpp
)
add_llvm_unittest(CodeGenTests
${CodeGenSources}
)
add_subdirectory(GlobalISel)
| set(LLVM_LINK_COMPONENTS
AsmPrinter
+ CodeGen
+ Core
Support
)
set(CodeGenSources
DIEHashTest.cpp
LowLevelTypeTest.cpp
)
add_llvm_unittest(CodeGenTests
${CodeGenSources}
)
add_subdirectory(GlobalISel) | 2 | 0.133333 | 2 | 0 |
d2d11af057bee0af852a20f5a126bb4bb47bcc22 | app/controllers/clubs_controller.rb | app/controllers/clubs_controller.rb | class ClubsController < ApplicationController
before_filter :authenticate_user!
def edit
@club = Club.find params[:id]
authorize! :manage, @club
end
end
| class ClubsController < ApplicationController
respond_to :html, :json
before_filter :authenticate_user!
before_filter :get_club
def edit
authorize! :update, @club
end
def update
authorize! :update, @club
@club.update_attributes params[:club]
respond_with_bip @club
end
private
def get_club
@club = Club.find params[:id]
end
end
| Update ClubsController for Update Action | Update ClubsController for Update Action
Update the ClubsController to include the update action, as well as
provide some refactoring to eliminate code duplication.
| Ruby | mit | jekhokie/IfSimply,jekhokie/IfSimply,jekhokie/IfSimply | ruby | ## Code Before:
class ClubsController < ApplicationController
before_filter :authenticate_user!
def edit
@club = Club.find params[:id]
authorize! :manage, @club
end
end
## Instruction:
Update ClubsController for Update Action
Update the ClubsController to include the update action, as well as
provide some refactoring to eliminate code duplication.
## Code After:
class ClubsController < ApplicationController
respond_to :html, :json
before_filter :authenticate_user!
before_filter :get_club
def edit
authorize! :update, @club
end
def update
authorize! :update, @club
@club.update_attributes params[:club]
respond_with_bip @club
end
private
def get_club
@club = Club.find params[:id]
end
end
| class ClubsController < ApplicationController
+ respond_to :html, :json
+
before_filter :authenticate_user!
+ before_filter :get_club
def edit
+ authorize! :update, @club
+ end
+
+ def update
+ authorize! :update, @club
+
+ @club.update_attributes params[:club]
+
+ respond_with_bip @club
+ end
+
+ private
+
+ def get_club
@club = Club.find params[:id]
-
- authorize! :manage, @club
end
end | 19 | 2.111111 | 17 | 2 |
049be953ee24dc0ab6ffd8ab266ecc51b43d746f | tests/utils.php | tests/utils.php | <?php
function create_callable($type, $mode)
{
switch ($type) {
case 'method':
return [new TestClass(), 'method'.camelize($mode)];
case 'static_method':
return ['TestClass', 'staticMethod'.camelize($mode)];
case 'invoked_method':
return (new ReflectionClass('Invoke'.camelize($mode).'Class'))->newInstance();
case 'closure':
return (new ReflectionFunction('function_'.$mode))->getClosure();
case 'function':
return 'function_'.$mode;
}
}
function camelize($string)
{
return str_replace(' ', '', ucwords(str_replace('_', ' ', $string)));
}
| <?php
function create_callable($type, $mode)
{
switch ($type) {
case 'method':
return [new TestClass(), 'method'.camelize($mode)];
case 'static_method':
return ['TestClass', 'staticMethod'.camelize($mode)];
case 'invoked_method':
return (new ReflectionClass('Invoke'.camelize($mode).'Class'))->newInstance();
case 'closure':
return (new ReflectionFunction('function_'.$mode))->getClosure();
case 'function':
return 'function_'.$mode;
}
throw new \InvalidArgumentException(sprintf('Unsupported callable type "%s".', $type));
}
function camelize($string)
{
return str_replace(' ', '', ucwords(str_replace('_', ' ', $string)));
}
| Throw an exception on an unsupported callable type | Throw an exception on an unsupported callable type
| PHP | mit | rybakit/arguments-resolver | php | ## Code Before:
<?php
function create_callable($type, $mode)
{
switch ($type) {
case 'method':
return [new TestClass(), 'method'.camelize($mode)];
case 'static_method':
return ['TestClass', 'staticMethod'.camelize($mode)];
case 'invoked_method':
return (new ReflectionClass('Invoke'.camelize($mode).'Class'))->newInstance();
case 'closure':
return (new ReflectionFunction('function_'.$mode))->getClosure();
case 'function':
return 'function_'.$mode;
}
}
function camelize($string)
{
return str_replace(' ', '', ucwords(str_replace('_', ' ', $string)));
}
## Instruction:
Throw an exception on an unsupported callable type
## Code After:
<?php
function create_callable($type, $mode)
{
switch ($type) {
case 'method':
return [new TestClass(), 'method'.camelize($mode)];
case 'static_method':
return ['TestClass', 'staticMethod'.camelize($mode)];
case 'invoked_method':
return (new ReflectionClass('Invoke'.camelize($mode).'Class'))->newInstance();
case 'closure':
return (new ReflectionFunction('function_'.$mode))->getClosure();
case 'function':
return 'function_'.$mode;
}
throw new \InvalidArgumentException(sprintf('Unsupported callable type "%s".', $type));
}
function camelize($string)
{
return str_replace(' ', '', ucwords(str_replace('_', ' ', $string)));
}
| <?php
function create_callable($type, $mode)
{
switch ($type) {
case 'method':
return [new TestClass(), 'method'.camelize($mode)];
case 'static_method':
return ['TestClass', 'staticMethod'.camelize($mode)];
case 'invoked_method':
return (new ReflectionClass('Invoke'.camelize($mode).'Class'))->newInstance();
case 'closure':
return (new ReflectionFunction('function_'.$mode))->getClosure();
case 'function':
return 'function_'.$mode;
}
+
+ throw new \InvalidArgumentException(sprintf('Unsupported callable type "%s".', $type));
}
function camelize($string)
{
return str_replace(' ', '', ucwords(str_replace('_', ' ', $string)));
} | 2 | 0.076923 | 2 | 0 |
65a89bc9d43ce0d83c54da58bc18b7b5ae54405e | deploy-to-web.sh | deploy-to-web.sh | scp build/ NYoShWorkbench-129.1-1.0\ EAP1-linux.tar.gz
| find build/artifacts/NYoShWorkbenchDistribution/ -name NYoShWorkbench-\*|xargs -I{} scp {} campagnelab.org:/www/files/
| Update deploy to web script | Update deploy to web script
| Shell | apache-2.0 | CampagneLaboratory/NYoSh,CampagneLaboratory/NYoSh | shell | ## Code Before:
scp build/ NYoShWorkbench-129.1-1.0\ EAP1-linux.tar.gz
## Instruction:
Update deploy to web script
## Code After:
find build/artifacts/NYoShWorkbenchDistribution/ -name NYoShWorkbench-\*|xargs -I{} scp {} campagnelab.org:/www/files/
| - scp build/ NYoShWorkbench-129.1-1.0\ EAP1-linux.tar.gz
+ find build/artifacts/NYoShWorkbenchDistribution/ -name NYoShWorkbench-\*|xargs -I{} scp {} campagnelab.org:/www/files/
+ | 3 | 3 | 2 | 1 |
553176ae405b490e059a57d5fcc939e6e44f99d1 | src/System/Taffybar/Widget/Decorators.hs | src/System/Taffybar/Widget/Decorators.hs | {-# LANGUAGE OverloadedStrings #-}
module System.Taffybar.Widget.Decorators where
import Control.Monad.IO.Class
import qualified GI.Gtk as Gtk
import System.Taffybar.Widget.Util
-- | Wrap a widget with two container boxes. The inner box will have the class
-- "inner-pad", and the outer box will have the class "outer-pad". These boxes
-- can be used to add padding between the outline of the widget and its
-- contents, or for the purpose of displaying a different background behind the
-- widget.
buildPadBox :: MonadIO m => Gtk.Widget -> m Gtk.Widget
buildPadBox contents = liftIO $ do
innerBox <- Gtk.boxNew Gtk.OrientationHorizontal 0
outerBox <- Gtk.eventBoxNew
Gtk.containerAdd innerBox contents
Gtk.containerAdd outerBox innerBox
_ <- widgetSetClassGI innerBox "inner-pad"
_ <- widgetSetClassGI outerBox "outer-pad"
Gtk.widgetShow outerBox
Gtk.widgetShow innerBox
Gtk.toWidget outerBox
buildContentsBox :: MonadIO m => Gtk.Widget -> m Gtk.Widget
buildContentsBox widget = liftIO $ do
contents <- Gtk.boxNew Gtk.OrientationHorizontal 0
Gtk.containerAdd contents widget
_ <- widgetSetClassGI contents "contents"
Gtk.widgetShowAll contents
Gtk.toWidget contents >>= buildPadBox
| {-# LANGUAGE OverloadedStrings #-}
module System.Taffybar.Widget.Decorators where
import Control.Monad.IO.Class
import qualified GI.Gtk as Gtk
import System.Taffybar.Widget.Util
-- | Wrap a widget with two container boxes. The inner box will have the class
-- "inner-pad", and the outer box will have the class "outer-pad". These boxes
-- can be used to add padding between the outline of the widget and its
-- contents, or for the purpose of displaying a different background behind the
-- widget.
buildPadBox :: MonadIO m => Gtk.Widget -> m Gtk.Widget
buildPadBox contents = liftIO $ do
innerBox <- Gtk.boxNew Gtk.OrientationHorizontal 0
outerBox <- Gtk.boxNew Gtk.OrientationHorizontal 0
Gtk.containerAdd innerBox contents
Gtk.containerAdd outerBox innerBox
_ <- widgetSetClassGI innerBox "inner-pad"
_ <- widgetSetClassGI outerBox "outer-pad"
Gtk.widgetShow outerBox
Gtk.widgetShow innerBox
Gtk.toWidget outerBox
buildContentsBox :: MonadIO m => Gtk.Widget -> m Gtk.Widget
buildContentsBox widget = liftIO $ do
contents <- Gtk.boxNew Gtk.OrientationHorizontal 0
Gtk.containerAdd contents widget
_ <- widgetSetClassGI contents "contents"
Gtk.widgetShowAll contents
Gtk.toWidget contents >>= buildPadBox
| Use box new for outerBox | Use box new for outerBox
| Haskell | bsd-3-clause | teleshoes/taffybar | haskell | ## Code Before:
{-# LANGUAGE OverloadedStrings #-}
module System.Taffybar.Widget.Decorators where
import Control.Monad.IO.Class
import qualified GI.Gtk as Gtk
import System.Taffybar.Widget.Util
-- | Wrap a widget with two container boxes. The inner box will have the class
-- "inner-pad", and the outer box will have the class "outer-pad". These boxes
-- can be used to add padding between the outline of the widget and its
-- contents, or for the purpose of displaying a different background behind the
-- widget.
buildPadBox :: MonadIO m => Gtk.Widget -> m Gtk.Widget
buildPadBox contents = liftIO $ do
innerBox <- Gtk.boxNew Gtk.OrientationHorizontal 0
outerBox <- Gtk.eventBoxNew
Gtk.containerAdd innerBox contents
Gtk.containerAdd outerBox innerBox
_ <- widgetSetClassGI innerBox "inner-pad"
_ <- widgetSetClassGI outerBox "outer-pad"
Gtk.widgetShow outerBox
Gtk.widgetShow innerBox
Gtk.toWidget outerBox
buildContentsBox :: MonadIO m => Gtk.Widget -> m Gtk.Widget
buildContentsBox widget = liftIO $ do
contents <- Gtk.boxNew Gtk.OrientationHorizontal 0
Gtk.containerAdd contents widget
_ <- widgetSetClassGI contents "contents"
Gtk.widgetShowAll contents
Gtk.toWidget contents >>= buildPadBox
## Instruction:
Use box new for outerBox
## Code After:
{-# LANGUAGE OverloadedStrings #-}
module System.Taffybar.Widget.Decorators where
import Control.Monad.IO.Class
import qualified GI.Gtk as Gtk
import System.Taffybar.Widget.Util
-- | Wrap a widget with two container boxes. The inner box will have the class
-- "inner-pad", and the outer box will have the class "outer-pad". These boxes
-- can be used to add padding between the outline of the widget and its
-- contents, or for the purpose of displaying a different background behind the
-- widget.
buildPadBox :: MonadIO m => Gtk.Widget -> m Gtk.Widget
buildPadBox contents = liftIO $ do
innerBox <- Gtk.boxNew Gtk.OrientationHorizontal 0
outerBox <- Gtk.boxNew Gtk.OrientationHorizontal 0
Gtk.containerAdd innerBox contents
Gtk.containerAdd outerBox innerBox
_ <- widgetSetClassGI innerBox "inner-pad"
_ <- widgetSetClassGI outerBox "outer-pad"
Gtk.widgetShow outerBox
Gtk.widgetShow innerBox
Gtk.toWidget outerBox
buildContentsBox :: MonadIO m => Gtk.Widget -> m Gtk.Widget
buildContentsBox widget = liftIO $ do
contents <- Gtk.boxNew Gtk.OrientationHorizontal 0
Gtk.containerAdd contents widget
_ <- widgetSetClassGI contents "contents"
Gtk.widgetShowAll contents
Gtk.toWidget contents >>= buildPadBox
| {-# LANGUAGE OverloadedStrings #-}
module System.Taffybar.Widget.Decorators where
import Control.Monad.IO.Class
import qualified GI.Gtk as Gtk
import System.Taffybar.Widget.Util
-- | Wrap a widget with two container boxes. The inner box will have the class
-- "inner-pad", and the outer box will have the class "outer-pad". These boxes
-- can be used to add padding between the outline of the widget and its
-- contents, or for the purpose of displaying a different background behind the
-- widget.
buildPadBox :: MonadIO m => Gtk.Widget -> m Gtk.Widget
buildPadBox contents = liftIO $ do
innerBox <- Gtk.boxNew Gtk.OrientationHorizontal 0
- outerBox <- Gtk.eventBoxNew
+ outerBox <- Gtk.boxNew Gtk.OrientationHorizontal 0
Gtk.containerAdd innerBox contents
Gtk.containerAdd outerBox innerBox
_ <- widgetSetClassGI innerBox "inner-pad"
_ <- widgetSetClassGI outerBox "outer-pad"
Gtk.widgetShow outerBox
Gtk.widgetShow innerBox
Gtk.toWidget outerBox
buildContentsBox :: MonadIO m => Gtk.Widget -> m Gtk.Widget
buildContentsBox widget = liftIO $ do
contents <- Gtk.boxNew Gtk.OrientationHorizontal 0
Gtk.containerAdd contents widget
_ <- widgetSetClassGI contents "contents"
Gtk.widgetShowAll contents
Gtk.toWidget contents >>= buildPadBox | 2 | 0.064516 | 1 | 1 |
a8db50b51deed874853b72d08037b2232eda060c | module/geb-core/src/test/groovy/geb/interaction/InteractionsSupportSpec.groovy | module/geb-core/src/test/groovy/geb/interaction/InteractionsSupportSpec.groovy | package geb.interaction
import geb.test.CrossBrowser
import geb.test.GebSpecWithServer
@CrossBrowser
class InteractionsSupportSpec extends GebSpecWithServer {
def "navigators are unpacked in interact block"() {
given:
html {
body {
input(id: 'first-input', value: '')
input(id: 'second-input', value: '')
}
}
$('#first-input').click()
when:
interact {
moveToElement $('#first-input')
click()
sendKeys 'GEB'
moveToElement $('#second-input')
click()
sendKeys 'geb'
}
then:
$('#first-input').value() == 'GEB'
$('#second-input').value() == 'geb'
}
} | package geb.interaction
import geb.test.CrossBrowser
import geb.test.GebSpecWithServer
@CrossBrowser
class InteractionsSupportSpec extends GebSpecWithServer {
def setup() {
html {
body {
input(id: 'first-input', value: '')
input(id: 'second-input', value: '')
}
}
}
def "navigators are unpacked in interact block"() {
when:
interact {
moveToElement $('#first-input')
click()
sendKeys 'GEB'
moveToElement $('#second-input')
click()
sendKeys 'geb'
}
then:
$('#first-input').value() == 'GEB'
$('#second-input').value() == 'geb'
}
def "page content items are unpacked in interact block"() {
given:
at InteractionPage
when:
interact {
moveToElement first
click()
sendKeys 'GEB'
moveToElement second
click()
sendKeys 'geb'
}
then:
first == 'GEB'
second == 'geb'
}
}
class InteractionPage extends geb.Page {
static content = {
first { $('#first-input') }
second { $('#second-input') }
}
} | Test unpacking defined content in an interact {} block. | Test unpacking defined content in an interact {} block.
| Groovy | apache-2.0 | kgeis/geb,onBass-naga/geb,pierre-hilt/geb,ntotomanov-taulia/geb,pierre-hilt/geb,madmas/geb,ntotomanov-taulia/geb,madmas/geb,kgeis/geb,geb/geb,madmas/geb,geb/geb,pierre-hilt/geb,menonvarun/geb,kgeis/geb,onBass-naga/geb,menonvarun/geb,menonvarun/geb | groovy | ## Code Before:
package geb.interaction
import geb.test.CrossBrowser
import geb.test.GebSpecWithServer
@CrossBrowser
class InteractionsSupportSpec extends GebSpecWithServer {
def "navigators are unpacked in interact block"() {
given:
html {
body {
input(id: 'first-input', value: '')
input(id: 'second-input', value: '')
}
}
$('#first-input').click()
when:
interact {
moveToElement $('#first-input')
click()
sendKeys 'GEB'
moveToElement $('#second-input')
click()
sendKeys 'geb'
}
then:
$('#first-input').value() == 'GEB'
$('#second-input').value() == 'geb'
}
}
## Instruction:
Test unpacking defined content in an interact {} block.
## Code After:
package geb.interaction
import geb.test.CrossBrowser
import geb.test.GebSpecWithServer
@CrossBrowser
class InteractionsSupportSpec extends GebSpecWithServer {
def setup() {
html {
body {
input(id: 'first-input', value: '')
input(id: 'second-input', value: '')
}
}
}
def "navigators are unpacked in interact block"() {
when:
interact {
moveToElement $('#first-input')
click()
sendKeys 'GEB'
moveToElement $('#second-input')
click()
sendKeys 'geb'
}
then:
$('#first-input').value() == 'GEB'
$('#second-input').value() == 'geb'
}
def "page content items are unpacked in interact block"() {
given:
at InteractionPage
when:
interact {
moveToElement first
click()
sendKeys 'GEB'
moveToElement second
click()
sendKeys 'geb'
}
then:
first == 'GEB'
second == 'geb'
}
}
class InteractionPage extends geb.Page {
static content = {
first { $('#first-input') }
second { $('#second-input') }
}
} | package geb.interaction
import geb.test.CrossBrowser
import geb.test.GebSpecWithServer
@CrossBrowser
class InteractionsSupportSpec extends GebSpecWithServer {
+ def setup() {
- def "navigators are unpacked in interact block"() {
- given:
html {
body {
input(id: 'first-input', value: '')
input(id: 'second-input', value: '')
}
}
+ }
+ def "navigators are unpacked in interact block"() {
- $('#first-input').click()
-
when:
interact {
moveToElement $('#first-input')
click()
sendKeys 'GEB'
moveToElement $('#second-input')
click()
sendKeys 'geb'
}
then:
$('#first-input').value() == 'GEB'
$('#second-input').value() == 'geb'
}
+ def "page content items are unpacked in interact block"() {
+ given:
+ at InteractionPage
+
+ when:
+ interact {
+ moveToElement first
+ click()
+ sendKeys 'GEB'
+ moveToElement second
+ click()
+ sendKeys 'geb'
+ }
+
+ then:
+ first == 'GEB'
+ second == 'geb'
+ }
}
+
+ class InteractionPage extends geb.Page {
+ static content = {
+ first { $('#first-input') }
+ second { $('#second-input') }
+ }
+ } | 32 | 0.888889 | 28 | 4 |
c238807ccfe423e7e47831ef5d6fa598b245266f | src/components/controls/transmission-lines.js | src/components/controls/transmission-lines.js | import React from "react";
import { connect } from "react-redux";
import { withTranslation } from "react-i18next";
import Toggle from "./toggle";
import { controlsWidth } from "../../util/globals";
import { TOGGLE_TRANSMISSION_LINES } from "../../actions/types";
import { SidebarSubtitle } from "./styles";
@connect((state) => {
return {
showTransmissionLines: state.controls.showTransmissionLines
};
})
class TransmissionLines extends React.Component {
render() {
const { t } = this.props;
return (
<>
<SidebarSubtitle spaceAbove>
{t("sidebar:Transmission lines")}
</SidebarSubtitle>
<div style={{marginBottom: 10, width: controlsWidth, fontSize: 14}}>
<Toggle
display
on={this.props.showTransmissionLines}
callback={() => {
this.props.dispatch({ type: TOGGLE_TRANSMISSION_LINES, data: !this.props.showTransmissionLines });
}}
label={t("sidebar:Show transmission lines")}
/>
</div>
</>
);
}
}
export default withTranslation()(TransmissionLines);
| import React from "react";
import { connect } from "react-redux";
import { withTranslation } from "react-i18next";
import Toggle from "./toggle";
import { controlsWidth } from "../../util/globals";
import { TOGGLE_TRANSMISSION_LINES } from "../../actions/types";
@connect((state) => {
return {
showTransmissionLines: state.controls.showTransmissionLines
};
})
class TransmissionLines extends React.Component {
render() {
const { t } = this.props;
return (
<div style={{marginBottom: 10, width: controlsWidth, fontSize: 14}}>
<Toggle
display
on={this.props.showTransmissionLines}
callback={() => {
this.props.dispatch({ type: TOGGLE_TRANSMISSION_LINES, data: !this.props.showTransmissionLines });
}}
label={t("sidebar:Show transmission lines")}
/>
</div>
);
}
}
export default withTranslation()(TransmissionLines);
| Remove unnecessary byline in sidebar | Remove unnecessary byline in sidebar | JavaScript | agpl-3.0 | nextstrain/auspice,nextstrain/auspice,nextstrain/auspice | javascript | ## Code Before:
import React from "react";
import { connect } from "react-redux";
import { withTranslation } from "react-i18next";
import Toggle from "./toggle";
import { controlsWidth } from "../../util/globals";
import { TOGGLE_TRANSMISSION_LINES } from "../../actions/types";
import { SidebarSubtitle } from "./styles";
@connect((state) => {
return {
showTransmissionLines: state.controls.showTransmissionLines
};
})
class TransmissionLines extends React.Component {
render() {
const { t } = this.props;
return (
<>
<SidebarSubtitle spaceAbove>
{t("sidebar:Transmission lines")}
</SidebarSubtitle>
<div style={{marginBottom: 10, width: controlsWidth, fontSize: 14}}>
<Toggle
display
on={this.props.showTransmissionLines}
callback={() => {
this.props.dispatch({ type: TOGGLE_TRANSMISSION_LINES, data: !this.props.showTransmissionLines });
}}
label={t("sidebar:Show transmission lines")}
/>
</div>
</>
);
}
}
export default withTranslation()(TransmissionLines);
## Instruction:
Remove unnecessary byline in sidebar
## Code After:
import React from "react";
import { connect } from "react-redux";
import { withTranslation } from "react-i18next";
import Toggle from "./toggle";
import { controlsWidth } from "../../util/globals";
import { TOGGLE_TRANSMISSION_LINES } from "../../actions/types";
@connect((state) => {
return {
showTransmissionLines: state.controls.showTransmissionLines
};
})
class TransmissionLines extends React.Component {
render() {
const { t } = this.props;
return (
<div style={{marginBottom: 10, width: controlsWidth, fontSize: 14}}>
<Toggle
display
on={this.props.showTransmissionLines}
callback={() => {
this.props.dispatch({ type: TOGGLE_TRANSMISSION_LINES, data: !this.props.showTransmissionLines });
}}
label={t("sidebar:Show transmission lines")}
/>
</div>
);
}
}
export default withTranslation()(TransmissionLines);
| import React from "react";
import { connect } from "react-redux";
import { withTranslation } from "react-i18next";
import Toggle from "./toggle";
import { controlsWidth } from "../../util/globals";
import { TOGGLE_TRANSMISSION_LINES } from "../../actions/types";
- import { SidebarSubtitle } from "./styles";
@connect((state) => {
return {
showTransmissionLines: state.controls.showTransmissionLines
};
})
class TransmissionLines extends React.Component {
render() {
const { t } = this.props;
return (
- <>
- <SidebarSubtitle spaceAbove>
- {t("sidebar:Transmission lines")}
- </SidebarSubtitle>
- <div style={{marginBottom: 10, width: controlsWidth, fontSize: 14}}>
? --
+ <div style={{marginBottom: 10, width: controlsWidth, fontSize: 14}}>
- <Toggle
? --
+ <Toggle
- display
? --
+ display
- on={this.props.showTransmissionLines}
? --
+ on={this.props.showTransmissionLines}
- callback={() => {
? --
+ callback={() => {
- this.props.dispatch({ type: TOGGLE_TRANSMISSION_LINES, data: !this.props.showTransmissionLines });
? --
+ this.props.dispatch({ type: TOGGLE_TRANSMISSION_LINES, data: !this.props.showTransmissionLines });
- }}
? --
+ }}
- label={t("sidebar:Show transmission lines")}
? --
+ label={t("sidebar:Show transmission lines")}
- />
? --
+ />
- </div>
? --
+ </div>
- </>
);
}
}
export default withTranslation()(TransmissionLines); | 26 | 0.684211 | 10 | 16 |
e441fbcc5fdf0af0460d7b8ac210af3d96546287 | .travis.yml | .travis.yml | language: c
env:
- SMVERSION=1.5
- SMVERSION=1.6
- SMVERSION=1.7
before_install:
- sudo apt-get update
- sudo apt-get install gcc-multilib
- sudo apt-get install lynx
before_script:
- SMPATTERN="http:.*sourcemod-.*-linux.*"
- SMURL="http://www.sourcemod.net/smdrop/$SMVERSION/"
- SMPACKAGE=`lynx -dump "$SMURL" | egrep -o "$SMPATTERN" | tail -1`
- wget $SMPACKAGE
- tar -xzf $(basename "$SMPACKAGE")
- cd addons/sourcemod/scripting/
- chmod +x spcomp
script:
- ./spcomp jetpack_plus.sp
- ./spcomp jetpack_bling.sp
after_script:
- ls *.smx
notifications:
email: false
| language: c
env:
- SMVERSION=1.5
- SMVERSION=1.6
- SMVERSION=1.7
matrix:
fast_finish: true
allow_failures:
- env: SMVERSION=1.7
before_install:
- sudo apt-get update
- sudo apt-get install gcc-multilib
- sudo apt-get install lynx
before_script:
- SMPATTERN="http:.*sourcemod-.*-linux.*"
- SMURL="http://www.sourcemod.net/smdrop/$SMVERSION/"
- SMPACKAGE=`lynx -dump "$SMURL" | egrep -o "$SMPATTERN" | tail -1`
- wget $SMPACKAGE
- tar -xzf $(basename "$SMPACKAGE")
- cd addons/sourcemod/scripting/
- chmod +x spcomp
script:
- ./spcomp jetpack_plus.sp
- ./spcomp jetpack_bling.sp
after_script:
- ls *.smx
notifications:
email: false
| Allow failures for sourcemod 1.7 build | Allow failures for sourcemod 1.7 build
| YAML | mit | CrimsonTautology/sm_jetpack_plus | yaml | ## Code Before:
language: c
env:
- SMVERSION=1.5
- SMVERSION=1.6
- SMVERSION=1.7
before_install:
- sudo apt-get update
- sudo apt-get install gcc-multilib
- sudo apt-get install lynx
before_script:
- SMPATTERN="http:.*sourcemod-.*-linux.*"
- SMURL="http://www.sourcemod.net/smdrop/$SMVERSION/"
- SMPACKAGE=`lynx -dump "$SMURL" | egrep -o "$SMPATTERN" | tail -1`
- wget $SMPACKAGE
- tar -xzf $(basename "$SMPACKAGE")
- cd addons/sourcemod/scripting/
- chmod +x spcomp
script:
- ./spcomp jetpack_plus.sp
- ./spcomp jetpack_bling.sp
after_script:
- ls *.smx
notifications:
email: false
## Instruction:
Allow failures for sourcemod 1.7 build
## Code After:
language: c
env:
- SMVERSION=1.5
- SMVERSION=1.6
- SMVERSION=1.7
matrix:
fast_finish: true
allow_failures:
- env: SMVERSION=1.7
before_install:
- sudo apt-get update
- sudo apt-get install gcc-multilib
- sudo apt-get install lynx
before_script:
- SMPATTERN="http:.*sourcemod-.*-linux.*"
- SMURL="http://www.sourcemod.net/smdrop/$SMVERSION/"
- SMPACKAGE=`lynx -dump "$SMURL" | egrep -o "$SMPATTERN" | tail -1`
- wget $SMPACKAGE
- tar -xzf $(basename "$SMPACKAGE")
- cd addons/sourcemod/scripting/
- chmod +x spcomp
script:
- ./spcomp jetpack_plus.sp
- ./spcomp jetpack_bling.sp
after_script:
- ls *.smx
notifications:
email: false
| language: c
env:
- SMVERSION=1.5
- SMVERSION=1.6
- SMVERSION=1.7
+
+ matrix:
+ fast_finish: true
+ allow_failures:
+ - env: SMVERSION=1.7
before_install:
- sudo apt-get update
- sudo apt-get install gcc-multilib
- sudo apt-get install lynx
before_script:
- SMPATTERN="http:.*sourcemod-.*-linux.*"
- SMURL="http://www.sourcemod.net/smdrop/$SMVERSION/"
- SMPACKAGE=`lynx -dump "$SMURL" | egrep -o "$SMPATTERN" | tail -1`
- wget $SMPACKAGE
- tar -xzf $(basename "$SMPACKAGE")
- cd addons/sourcemod/scripting/
- chmod +x spcomp
script:
- ./spcomp jetpack_plus.sp
- ./spcomp jetpack_bling.sp
after_script:
- ls *.smx
notifications:
email: false | 5 | 0.166667 | 5 | 0 |
939270e96f367881eb1447a537ad6fb4fa09866a | setup.sql | setup.sql | CREATE DATABASE IF NOT EXISTS event_detection;
USE event_detection;
CREATE TABLE IF NOT EXISTS sources (
id varchar(255), reliability double,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS feeds (
id varchar(255), source varchar(255), url blob, scrapers blob, lastseen blob default NULL,
primary key (id),
foreign key (source) references sources(id)
);
CREATE TABLE IF NOT EXISTS articles (
id INT UNSIGNED AUTO_INCREMENT, title varchar(255), source varchar(255), url blob, filename blob default NULL,
primary key (id),
foreign key (source) references sources(id),
unique key (filename(100))
);
| CREATE DATABASE IF NOT EXISTS event_detection;
USE event_detection;
CREATE TABLE IF NOT EXISTS sources (
id varchar(255), reliability double,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS feeds (
id varchar(255), source varchar(255), url blob, scrapers blob, lastseen blob default NULL,
primary key (id),
foreign key (source) references sources(id)
);
CREATE TABLE IF NOT EXISTS articles (
id INT UNSIGNED AUTO_INCREMENT, title varchar(255), source varchar(255), url blob, filename blob default NULL,
primary key (id),
foreign key (source) references sources(id)
);
| Remove an invalid uniqueness constraint. | Remove an invalid uniqueness constraint.
| SQL | mit | beallej/event-detection,beallej/event-detection,beallej/event-detection,beallej/event-detection,beallej/event-detection | sql | ## Code Before:
CREATE DATABASE IF NOT EXISTS event_detection;
USE event_detection;
CREATE TABLE IF NOT EXISTS sources (
id varchar(255), reliability double,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS feeds (
id varchar(255), source varchar(255), url blob, scrapers blob, lastseen blob default NULL,
primary key (id),
foreign key (source) references sources(id)
);
CREATE TABLE IF NOT EXISTS articles (
id INT UNSIGNED AUTO_INCREMENT, title varchar(255), source varchar(255), url blob, filename blob default NULL,
primary key (id),
foreign key (source) references sources(id),
unique key (filename(100))
);
## Instruction:
Remove an invalid uniqueness constraint.
## Code After:
CREATE DATABASE IF NOT EXISTS event_detection;
USE event_detection;
CREATE TABLE IF NOT EXISTS sources (
id varchar(255), reliability double,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS feeds (
id varchar(255), source varchar(255), url blob, scrapers blob, lastseen blob default NULL,
primary key (id),
foreign key (source) references sources(id)
);
CREATE TABLE IF NOT EXISTS articles (
id INT UNSIGNED AUTO_INCREMENT, title varchar(255), source varchar(255), url blob, filename blob default NULL,
primary key (id),
foreign key (source) references sources(id)
);
| CREATE DATABASE IF NOT EXISTS event_detection;
USE event_detection;
CREATE TABLE IF NOT EXISTS sources (
id varchar(255), reliability double,
PRIMARY KEY (id)
);
CREATE TABLE IF NOT EXISTS feeds (
id varchar(255), source varchar(255), url blob, scrapers blob, lastseen blob default NULL,
primary key (id),
foreign key (source) references sources(id)
);
CREATE TABLE IF NOT EXISTS articles (
id INT UNSIGNED AUTO_INCREMENT, title varchar(255), source varchar(255), url blob, filename blob default NULL,
primary key (id),
- foreign key (source) references sources(id),
? -
+ foreign key (source) references sources(id)
- unique key (filename(100))
); | 3 | 0.15 | 1 | 2 |
113972c8a483db75c66fc5e4ce42b132da3ff20c | .travis.yml | .travis.yml | language: python
os: linux
dist: xenial
matrix:
include:
- env: NOXSESSION=docs
- env: NOXSESSION=lint
- python: 3.6
env: NOXSESSION=test-3.6
- python: 3.7
env: NOXSESSION=test-3.7
- python: 3.8
env: NOXSESSION=test-3.8
install: pip install nox
script: nox
| language: python
os: linux
dist: xenial
python: 3.8
matrix:
include:
- env: NOXSESSION=docs
- env: NOXSESSION=lint
- python: 3.6
env: NOXSESSION=test-3.6
- python: 3.7
env: NOXSESSION=test-3.7
- python: 3.8
env: NOXSESSION=test-3.8
install: pip install nox
script: nox
| Use Python 3.8 on Travis CI | Use Python 3.8 on Travis CI
| YAML | mit | GaretJax/sphinx-autobuild | yaml | ## Code Before:
language: python
os: linux
dist: xenial
matrix:
include:
- env: NOXSESSION=docs
- env: NOXSESSION=lint
- python: 3.6
env: NOXSESSION=test-3.6
- python: 3.7
env: NOXSESSION=test-3.7
- python: 3.8
env: NOXSESSION=test-3.8
install: pip install nox
script: nox
## Instruction:
Use Python 3.8 on Travis CI
## Code After:
language: python
os: linux
dist: xenial
python: 3.8
matrix:
include:
- env: NOXSESSION=docs
- env: NOXSESSION=lint
- python: 3.6
env: NOXSESSION=test-3.6
- python: 3.7
env: NOXSESSION=test-3.7
- python: 3.8
env: NOXSESSION=test-3.8
install: pip install nox
script: nox
| language: python
os: linux
dist: xenial
+ python: 3.8
matrix:
include:
- env: NOXSESSION=docs
- env: NOXSESSION=lint
- python: 3.6
env: NOXSESSION=test-3.6
- python: 3.7
env: NOXSESSION=test-3.7
- python: 3.8
env: NOXSESSION=test-3.8
install: pip install nox
script: nox | 1 | 0.058824 | 1 | 0 |
0470b9384b211a1d04bd6cbdfb84994b057676a2 | lib/cenit/redis.rb | lib/cenit/redis.rb | require 'redis'
module Cenit
module Redis
class << self
def client?
!client.nil?
end
def client
unless instance_variable_defined?(:@redis_client)
client = ::Redis.new
client =
begin
client.ping
puts 'Redis connection detected!'
client
rescue Exception => ex
puts "Redis connection rejected: #{ex.message}"
nil
end
instance_variable_set(:@redis_client, client)
end
instance_variable_get(:@redis_client)
end
def pipelined
yield client if client && block_given?
end
def method_missing(symbol, *args, &block)
if client&.respond_to?(symbol)
client.send(symbol, *args, &block)
else
super
end
end
end
end
end | require 'redis'
module Cenit
module Redis
class << self
def client?
!client.nil?
end
def client
unless instance_variable_defined?(:@redis_client)
client = ::Redis.new(host: ENV["REDIS_HOST"], port: 6379, db: 15)
client =
begin
client.ping
puts 'Redis connection detected!'
client
rescue Exception => ex
puts "Redis connection rejected: #{ex.message}"
nil
end
instance_variable_set(:@redis_client, client)
end
instance_variable_get(:@redis_client)
end
def pipelined
yield client if client && block_given?
end
def method_missing(symbol, *args, &block)
if client&.respond_to?(symbol)
client.send(symbol, *args, &block)
else
super
end
end
end
end
end | Add support for config Redis conection | Add support for config Redis conection
| Ruby | mit | macarci/cenit,cenit-io/cenit,cenit-io/cenit,macarci/cenit,cenit-io/cenit,cenit-io/cenit,macarci/cenit | ruby | ## Code Before:
require 'redis'
module Cenit
module Redis
class << self
def client?
!client.nil?
end
def client
unless instance_variable_defined?(:@redis_client)
client = ::Redis.new
client =
begin
client.ping
puts 'Redis connection detected!'
client
rescue Exception => ex
puts "Redis connection rejected: #{ex.message}"
nil
end
instance_variable_set(:@redis_client, client)
end
instance_variable_get(:@redis_client)
end
def pipelined
yield client if client && block_given?
end
def method_missing(symbol, *args, &block)
if client&.respond_to?(symbol)
client.send(symbol, *args, &block)
else
super
end
end
end
end
end
## Instruction:
Add support for config Redis conection
## Code After:
require 'redis'
module Cenit
module Redis
class << self
def client?
!client.nil?
end
def client
unless instance_variable_defined?(:@redis_client)
client = ::Redis.new(host: ENV["REDIS_HOST"], port: 6379, db: 15)
client =
begin
client.ping
puts 'Redis connection detected!'
client
rescue Exception => ex
puts "Redis connection rejected: #{ex.message}"
nil
end
instance_variable_set(:@redis_client, client)
end
instance_variable_get(:@redis_client)
end
def pipelined
yield client if client && block_given?
end
def method_missing(symbol, *args, &block)
if client&.respond_to?(symbol)
client.send(symbol, *args, &block)
else
super
end
end
end
end
end | require 'redis'
module Cenit
module Redis
class << self
def client?
!client.nil?
end
def client
unless instance_variable_defined?(:@redis_client)
- client = ::Redis.new
+ client = ::Redis.new(host: ENV["REDIS_HOST"], port: 6379, db: 15)
client =
begin
client.ping
puts 'Redis connection detected!'
client
rescue Exception => ex
puts "Redis connection rejected: #{ex.message}"
nil
end
instance_variable_set(:@redis_client, client)
end
instance_variable_get(:@redis_client)
end
def pipelined
yield client if client && block_given?
end
def method_missing(symbol, *args, &block)
if client&.respond_to?(symbol)
client.send(symbol, *args, &block)
else
super
end
end
end
end
end | 2 | 0.04878 | 1 | 1 |
30c559586a3cf11333f78ca5a40afb4334410653 | templates/index.html | templates/index.html | <!DOCTYPE html5>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Time Tracker</title>
<link rel="stylesheet" href="{{ url_for('static', filename='style.css') }}">
</head>
<body>
<div class="timer container">
<p>Work Timer</p>
<a href="" class="start button">
{% include start.svg %}
<p>Start</p>
</a>
<a href="" class="stop button">
{% include stop.svg %}
<p>Stop</p>
</a>
</div>
<div class="subject container">
{% for subject in subjects %}
<a href="{{ subject.link }}" class="button {{ subject.color }}">
{% include subject.icon %}
<p>{{ subject.shortname }}</p>
</a>
{% endfor %}
</div>
</body>
</html>
| <!DOCTYPE html5>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Time Tracker</title>
<link rel="stylesheet" href="{{ url_for('static', filename='style.css') }}">
</head>
<body>
<div class="timer container">
<p>Work Timer</p>
<a href="" class="start button">
{% include url_for('static', filename="images/start.svg") %}
<p>Start</p>
</a>
<a href="" class="stop button">
{% include url_for('static', filename="images/stop.svg") %}
<p>Stop</p>
</a>
</div>
<div class="subject container">
{% for subject in subjects %}
<a href="{{ subject.link }}" class="button {{ subject.color }}">
{% include url_for('static', filename=subject.icon) %}
<p>{{ subject.shortname }}</p>
</a>
{% endfor %}
</div>
</body>
</html>
| Revert permissions tweak, try to fix crashing on svg import | Revert permissions tweak, try to fix crashing on svg import
| HTML | mit | williamleuschner/TimeTracker,williamleuschner/TimeTracker | html | ## Code Before:
<!DOCTYPE html5>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Time Tracker</title>
<link rel="stylesheet" href="{{ url_for('static', filename='style.css') }}">
</head>
<body>
<div class="timer container">
<p>Work Timer</p>
<a href="" class="start button">
{% include start.svg %}
<p>Start</p>
</a>
<a href="" class="stop button">
{% include stop.svg %}
<p>Stop</p>
</a>
</div>
<div class="subject container">
{% for subject in subjects %}
<a href="{{ subject.link }}" class="button {{ subject.color }}">
{% include subject.icon %}
<p>{{ subject.shortname }}</p>
</a>
{% endfor %}
</div>
</body>
</html>
## Instruction:
Revert permissions tweak, try to fix crashing on svg import
## Code After:
<!DOCTYPE html5>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Time Tracker</title>
<link rel="stylesheet" href="{{ url_for('static', filename='style.css') }}">
</head>
<body>
<div class="timer container">
<p>Work Timer</p>
<a href="" class="start button">
{% include url_for('static', filename="images/start.svg") %}
<p>Start</p>
</a>
<a href="" class="stop button">
{% include url_for('static', filename="images/stop.svg") %}
<p>Stop</p>
</a>
</div>
<div class="subject container">
{% for subject in subjects %}
<a href="{{ subject.link }}" class="button {{ subject.color }}">
{% include url_for('static', filename=subject.icon) %}
<p>{{ subject.shortname }}</p>
</a>
{% endfor %}
</div>
</body>
</html>
| <!DOCTYPE html5>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Time Tracker</title>
<link rel="stylesheet" href="{{ url_for('static', filename='style.css') }}">
</head>
<body>
<div class="timer container">
<p>Work Timer</p>
<a href="" class="start button">
- {% include start.svg %}
+ {% include url_for('static', filename="images/start.svg") %}
<p>Start</p>
</a>
<a href="" class="stop button">
- {% include stop.svg %}
+ {% include url_for('static', filename="images/stop.svg") %}
<p>Stop</p>
</a>
</div>
<div class="subject container">
{% for subject in subjects %}
<a href="{{ subject.link }}" class="button {{ subject.color }}">
- {% include subject.icon %}
+ {% include url_for('static', filename=subject.icon) %}
? +++++++++++++++++++++++++++ +
<p>{{ subject.shortname }}</p>
</a>
{% endfor %}
</div>
</body>
</html> | 6 | 0.206897 | 3 | 3 |
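The change above rewrites `{% include start.svg %}` as `{% include url_for(...) %}`. Note that Jinja's `include` tag resolves template names through the template loader rather than URLs, so inlining SVG markup is usually done through a helper. A toy stand-in for such a helper (the `{{svg:NAME}}` token syntax is invented for illustration):

```python
def render_with_inline_svg(template, svgs):
    """Toy stand-in for an 'inline this SVG' template helper: replace
    each {{svg:NAME}} token with the markup registered under NAME."""
    out = template
    for name, markup in svgs.items():
        out = out.replace("{{svg:%s}}" % name, markup)
    return out
```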
6cc866b6537f0f5c0e172dfffba9db7427132b49 | README.md | README.md | Patronage 2015 Augmented Szczecin iOS client.
| Patronage 2015 Augmented Szczecin iOS client.
# Arrangements
- Use at least Xcode 6.3 (currently Beta) with Swift 1.2.
- Work with gitflow workflow.
# Extras
- [Gitflow Workflow](https://www.atlassian.com/git/tutorials/comparing-workflows/gitflow-workflow)
- [5 Useful Tips For A Better Commit Message](https://robots.thoughtbot.com/5-useful-tips-for-a-better-commit-message)
- [Note About Git Commit Messages](http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html)
| Add basic arrangements and some additional links. | Add basic arrangements and some additional links. | Markdown | apache-2.0 | blstream/AugmentedSzczecin_iOS,blstream/AugmentedSzczecin_iOS,blstream/AugmentedSzczecin_iOS,blstream/AugmentedSzczecin_iOS | markdown | ## Code Before:
Patronage 2015 Augmented Szczecin iOS client.
## Instruction:
Add basic arrangements and some additional links.
## Code After:
Patronage 2015 Augmented Szczecin iOS client.
# Arrangements
- Use at least Xcode 6.3 (currently Beta) with Swift 1.2.
- Work with gitflow workflow.
# Extras
- [Gitflow Workflow](https://www.atlassian.com/git/tutorials/comparing-workflows/gitflow-workflow)
- [5 Useful Tips For A Better Commit Message](https://robots.thoughtbot.com/5-useful-tips-for-a-better-commit-message)
- [Note About Git Commit Messages](http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html)
| Patronage 2015 Augmented Szczecin iOS client.
+
+
+ # Arrangements
+ - Use at least Xcode 6.3 (currently Beta) with Swift 1.2.
+ - Work with gitflow workflow.
+
+
+ # Extras
+ - [Gitflow Workflow](https://www.atlassian.com/git/tutorials/comparing-workflows/gitflow-workflow)
+ - [5 Useful Tips For A Better Commit Message](https://robots.thoughtbot.com/5-useful-tips-for-a-better-commit-message)
+ - [Note About Git Commit Messages](http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html) | 11 | 11 | 11 | 0 |
98d0339ce921950ca38fcd1004d3ff951e71920f | package.json | package.json | {
"private": true,
"author": "Mason Browne <mason@massivelyfun.com> (http://massivelyfun.com/people/maseb)",
"name": "mf-tools",
"description": "Build and test tools.",
"version": "0.0.3",
"homepage": "http://massivelyfun.com/projects/mf-tools",
"repository": {
"type": "git",
"url": "git://github.com/massivelyfun/mf-tools.git"
},
"main": "lib/mf-tools/index.js",
"engines": {
"node": "~0.6.1"
},
"dependencies": {
"glob" : "= 3.1.5"
},
"devDependencies": {
"coffee-script": "= 1.2.0",
"mocha": "= 0.10.1",
"chai": "= 0.3.3",
"jsdom": "= 0.2.10",
"glob": "= 3.1.5",
"phantom": "= 0.3.3"
},
"optionalDependencies": {}
}
| {
"private": true,
"author": "Mason Browne <mason@massivelyfun.com> (http://massivelyfun.com/people/maseb)",
"name": "mf-tools",
"description": "Build and test tools.",
"version": "0.0.4",
"homepage": "http://massivelyfun.com/projects/mf-tools",
"repository": {
"type": "git",
"url": "git://github.com/massivelyfun/mf-tools.git"
},
"main": "lib/mf-tools/index.js",
"engines": {
"node": "~0.6.1"
},
"dependencies": {
"glob" : "= 3.1.5",
"jsdom": "= 0.2.10",
"mocha": "= 0.10.1",
"chai": "= 0.3.3",
"phantom": "= 0.3.3"
},
"devDependencies": {
"coffee-script": "= 1.2.0",
},
"optionalDependencies": {}
}
| Move dev deps into main deps, as they're required by someone pulling this lib in. Bump version. | Move dev deps into main deps, as they're required by someone pulling this lib in. Bump version.
| JSON | mit | massivelyfun/mf-tools | json | ## Code Before:
{
"private": true,
"author": "Mason Browne <mason@massivelyfun.com> (http://massivelyfun.com/people/maseb)",
"name": "mf-tools",
"description": "Build and test tools.",
"version": "0.0.3",
"homepage": "http://massivelyfun.com/projects/mf-tools",
"repository": {
"type": "git",
"url": "git://github.com/massivelyfun/mf-tools.git"
},
"main": "lib/mf-tools/index.js",
"engines": {
"node": "~0.6.1"
},
"dependencies": {
"glob" : "= 3.1.5"
},
"devDependencies": {
"coffee-script": "= 1.2.0",
"mocha": "= 0.10.1",
"chai": "= 0.3.3",
"jsdom": "= 0.2.10",
"glob": "= 3.1.5",
"phantom": "= 0.3.3"
},
"optionalDependencies": {}
}
## Instruction:
Move dev deps into main deps, as they're required by someone pulling this lib in. Bump version.
## Code After:
{
"private": true,
"author": "Mason Browne <mason@massivelyfun.com> (http://massivelyfun.com/people/maseb)",
"name": "mf-tools",
"description": "Build and test tools.",
"version": "0.0.4",
"homepage": "http://massivelyfun.com/projects/mf-tools",
"repository": {
"type": "git",
"url": "git://github.com/massivelyfun/mf-tools.git"
},
"main": "lib/mf-tools/index.js",
"engines": {
"node": "~0.6.1"
},
"dependencies": {
"glob" : "= 3.1.5",
"jsdom": "= 0.2.10",
"mocha": "= 0.10.1",
"chai": "= 0.3.3",
"phantom": "= 0.3.3"
},
"devDependencies": {
"coffee-script": "= 1.2.0",
},
"optionalDependencies": {}
}
| {
"private": true,
"author": "Mason Browne <mason@massivelyfun.com> (http://massivelyfun.com/people/maseb)",
"name": "mf-tools",
"description": "Build and test tools.",
- "version": "0.0.3",
? ^
+ "version": "0.0.4",
? ^
"homepage": "http://massivelyfun.com/projects/mf-tools",
"repository": {
"type": "git",
"url": "git://github.com/massivelyfun/mf-tools.git"
},
"main": "lib/mf-tools/index.js",
"engines": {
"node": "~0.6.1"
},
"dependencies": {
- "glob" : "= 3.1.5"
+ "glob" : "= 3.1.5",
? +
+ "jsdom": "= 0.2.10",
+ "mocha": "= 0.10.1",
+ "chai": "= 0.3.3",
+ "phantom": "= 0.3.3"
},
"devDependencies": {
"coffee-script": "= 1.2.0",
- "mocha": "= 0.10.1",
- "chai": "= 0.3.3",
- "jsdom": "= 0.2.10",
- "glob": "= 3.1.5",
- "phantom": "= 0.3.3"
},
"optionalDependencies": {}
} | 13 | 0.464286 | 6 | 7 |
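The edit above promotes packages from `devDependencies` into `dependencies` so that consumers of the library receive them. The same manipulation, sketched over a parsed `package.json` dict (names are illustrative):

```python
def promote_dev_dependencies(pkg, names):
    """Move the named packages from devDependencies into dependencies,
    the manual edit performed in the record above."""
    dev = pkg.get("devDependencies", {})
    deps = pkg.setdefault("dependencies", {})
    for name in names:
        if name in dev:
            deps[name] = dev.pop(name)
    return pkg
```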
9ebaafef796f97941be812e371d58e8ba10321f5 | README.md | README.md | A trello in Genropy
| A Trello in Genropy
Checkout the live demo at [genrello.mkshid.me](genrello.mkshid.me)
- username: demo
- password: demo
| Add like to the demo and creditials | Add like to the demo and creditials
| Markdown | mit | mkshid/genrello,mkshid/genrello,mkshid/genrello | markdown | ## Code Before:
A trello in Genropy
## Instruction:
Add like to the demo and creditials
## Code After:
A Trello in Genropy
Checkout the live demo at [genrello.mkshid.me](genrello.mkshid.me)
- username: demo
- password: demo
| - A trello in Genropy
? ^
+ A Trello in Genropy
? ^
+
+ Checkout the live demo at [genrello.mkshid.me](genrello.mkshid.me)
+
+ - username: demo
+ - password: demo | 7 | 7 | 6 | 1 |
88af3a6c21c93988548b5933552200898caff830 | pkgs/applications/radio/qradiolink/default.nix | pkgs/applications/radio/qradiolink/default.nix | { stdenv, fetchFromGitHub, alsaLib, boost
, qt4, libpulseaudio, codec2, libconfig
, gnuradio, gr-osmosdr, gsm
, libopus, libjpeg, protobuf, qwt, speex
} :
let
version = "0.5.2";
in stdenv.mkDerivation {
pname = "qradiolink";
inherit version;
src = fetchFromGitHub {
owner = "kantooon";
repo = "qradiolink";
rev = version;
sha256 = "0djhrr96b8sr2vi6hwbzgzlp9771622dp93f0rsphsyxlwbzsrgl";
};
preBuild = ''
cd ext
protoc --cpp_out=. Mumble.proto
protoc --cpp_out=. QRadioLink.proto
cd ..
qmake
'';
installPhase = ''
mkdir -p $out/bin
cp qradiolink $out/bin
'';
buildInputs = [
qt4
alsaLib
boost
libpulseaudio
codec2
libconfig
gsm
gnuradio
gr-osmosdr
libopus
libjpeg
protobuf
speex
qwt
];
enableParallelBuilding = true;
meta = with stdenv.lib; {
description = "SDR transceiver application for analog and digital modes";
homepage = http://qradiolink.org/;
license = licenses.agpl3;
maintainers = [ maintainers.markuskowa ];
platforms = platforms.linux;
};
}
| { stdenv, fetchFromGitHub, alsaLib, boost
, qt4, libpulseaudio, codec2, libconfig
, gnuradio, gr-osmosdr, gsm
, libopus, libjpeg, protobuf, qwt, speex
} :
let
version = "0.5.0";
in stdenv.mkDerivation {
pname = "qradiolink";
inherit version;
src = fetchFromGitHub {
owner = "kantooon";
repo = "qradiolink";
rev = version;
sha256 = "0xhg5zhjznmls5m3rhpk1qx0dipxmca12s85w15d0i7qwva2f1gi";
};
preBuild = ''
cd ext
protoc --cpp_out=. Mumble.proto
protoc --cpp_out=. QRadioLink.proto
cd ..
qmake
'';
installPhase = ''
mkdir -p $out/bin
cp qradiolink $out/bin
'';
buildInputs = [
qt4
alsaLib
boost
libpulseaudio
codec2
libconfig
gsm
gnuradio
gr-osmosdr
libopus
libjpeg
protobuf
speex
qwt
];
enableParallelBuilding = true;
meta = with stdenv.lib; {
description = "SDR transceiver application for analog and digital modes";
homepage = http://qradiolink.org/;
license = licenses.agpl3;
maintainers = [ maintainers.markuskowa ];
platforms = platforms.linux;
};
}
| Revert "qradiolink: 0.5.0 -> 0.5.2" | Revert "qradiolink: 0.5.0 -> 0.5.2"
| Nix | mit | NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs | nix | ## Code Before:
{ stdenv, fetchFromGitHub, alsaLib, boost
, qt4, libpulseaudio, codec2, libconfig
, gnuradio, gr-osmosdr, gsm
, libopus, libjpeg, protobuf, qwt, speex
} :
let
version = "0.5.2";
in stdenv.mkDerivation {
pname = "qradiolink";
inherit version;
src = fetchFromGitHub {
owner = "kantooon";
repo = "qradiolink";
rev = version;
sha256 = "0djhrr96b8sr2vi6hwbzgzlp9771622dp93f0rsphsyxlwbzsrgl";
};
preBuild = ''
cd ext
protoc --cpp_out=. Mumble.proto
protoc --cpp_out=. QRadioLink.proto
cd ..
qmake
'';
installPhase = ''
mkdir -p $out/bin
cp qradiolink $out/bin
'';
buildInputs = [
qt4
alsaLib
boost
libpulseaudio
codec2
libconfig
gsm
gnuradio
gr-osmosdr
libopus
libjpeg
protobuf
speex
qwt
];
enableParallelBuilding = true;
meta = with stdenv.lib; {
description = "SDR transceiver application for analog and digital modes";
homepage = http://qradiolink.org/;
license = licenses.agpl3;
maintainers = [ maintainers.markuskowa ];
platforms = platforms.linux;
};
}
## Instruction:
Revert "qradiolink: 0.5.0 -> 0.5.2"
## Code After:
{ stdenv, fetchFromGitHub, alsaLib, boost
, qt4, libpulseaudio, codec2, libconfig
, gnuradio, gr-osmosdr, gsm
, libopus, libjpeg, protobuf, qwt, speex
} :
let
version = "0.5.0";
in stdenv.mkDerivation {
pname = "qradiolink";
inherit version;
src = fetchFromGitHub {
owner = "kantooon";
repo = "qradiolink";
rev = version;
sha256 = "0xhg5zhjznmls5m3rhpk1qx0dipxmca12s85w15d0i7qwva2f1gi";
};
preBuild = ''
cd ext
protoc --cpp_out=. Mumble.proto
protoc --cpp_out=. QRadioLink.proto
cd ..
qmake
'';
installPhase = ''
mkdir -p $out/bin
cp qradiolink $out/bin
'';
buildInputs = [
qt4
alsaLib
boost
libpulseaudio
codec2
libconfig
gsm
gnuradio
gr-osmosdr
libopus
libjpeg
protobuf
speex
qwt
];
enableParallelBuilding = true;
meta = with stdenv.lib; {
description = "SDR transceiver application for analog and digital modes";
homepage = http://qradiolink.org/;
license = licenses.agpl3;
maintainers = [ maintainers.markuskowa ];
platforms = platforms.linux;
};
}
| { stdenv, fetchFromGitHub, alsaLib, boost
, qt4, libpulseaudio, codec2, libconfig
, gnuradio, gr-osmosdr, gsm
, libopus, libjpeg, protobuf, qwt, speex
} :
let
- version = "0.5.2";
? ^
+ version = "0.5.0";
? ^
in stdenv.mkDerivation {
pname = "qradiolink";
inherit version;
src = fetchFromGitHub {
owner = "kantooon";
repo = "qradiolink";
rev = version;
- sha256 = "0djhrr96b8sr2vi6hwbzgzlp9771622dp93f0rsphsyxlwbzsrgl";
+ sha256 = "0xhg5zhjznmls5m3rhpk1qx0dipxmca12s85w15d0i7qwva2f1gi";
};
preBuild = ''
cd ext
protoc --cpp_out=. Mumble.proto
protoc --cpp_out=. QRadioLink.proto
cd ..
qmake
'';
installPhase = ''
mkdir -p $out/bin
cp qradiolink $out/bin
'';
buildInputs = [
qt4
alsaLib
boost
libpulseaudio
codec2
libconfig
gsm
gnuradio
gr-osmosdr
libopus
libjpeg
protobuf
speex
qwt
];
enableParallelBuilding = true;
meta = with stdenv.lib; {
description = "SDR transceiver application for analog and digital modes";
homepage = http://qradiolink.org/;
license = licenses.agpl3;
maintainers = [ maintainers.markuskowa ];
platforms = platforms.linux;
};
} | 4 | 0.066667 | 2 | 2 |
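The revert above changes `rev` and `sha256` together, because `fetchFromGitHub` pins a source by hashing what it fetches (Nix actually hashes the unpacked tree, in base32). The underlying idea — reject bytes that do not match the pinned digest — can be sketched with a plain SHA-256 check:

```python
import hashlib

def matches_pin(data: bytes, expected_hex: str) -> bool:
    """Plain sha256-over-bytes integrity check; a simplified stand-in
    for the (rev, sha256) pairing that fetchFromGitHub enforces."""
    return hashlib.sha256(data).hexdigest() == expected_hex
```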
842ca961a8bd3571c509ff6e0413d1e68c9729ca | RichTextExample/NoteEditorTextViewDelegate.swift | RichTextExample/NoteEditorTextViewDelegate.swift | //
// NoteEditorTextViewDelegate.swift
// RichTextExample
//
// Created by Clay Tinnell on 7/27/15.
// Copyright (c) 2015 Clay Tinnell. All rights reserved.
//
import UIKit
import Foundation
class NoteEditorTextViewDelegate: NSObject {
func characterIsAWordTerminator(text: String) -> Bool {
return text == " " || text == "." || text == "," || text == ":" || text == ";" || text == "\n" || text == "!" || text == "?"
}
func stripSpecialCharacters(text: String) -> [String]? {
let strippedLowerCaseText = " ".join(text.componentsSeparatedByCharactersInSet(NSCharacterSet.letterCharacterSet().invertedSet)).lowercaseString
let trimmedWhiteSpace = strippedLowerCaseText.componentsSeparatedByCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet()).filter({!isEmpty($0)})
return trimmedWhiteSpace
}
}
extension NoteEditorTextViewDelegate : UITextViewDelegate {
func textView(textView: UITextView, shouldChangeTextInRange range: NSRange, replacementText text: String) -> Bool {
println("text changed: \(text)")
if characterIsAWordTerminator(text) {
println("word terminator")
if let strippedText = stripSpecialCharacters(textView.text) {
println(",".join(strippedText))
}
}
return true
}
}
| //
// NoteEditorTextViewDelegate.swift
// RichTextExample
//
// Created by Clay Tinnell on 7/27/15.
// Copyright (c) 2015 Clay Tinnell. All rights reserved.
//
import UIKit
import Foundation
class NoteEditorTextViewDelegate: NSObject {
}
extension NoteEditorTextViewDelegate : UITextViewDelegate {
func textView(textView: UITextView, shouldChangeTextInRange range: NSRange, replacementText text: String) -> Bool {
if text.isAWordTerminator() {
println(textView.text.stripSpecialCharacters().trimExtraWhiteSpace())
}
return true
}
}
private extension String {
func isAWordTerminator() -> Bool {
return self == " " || self == "." || self == "," || self == ":" || self == ";" || self == "\n" || self == "!" || self == "?"
}
func stripSpecialCharacters() -> String {
return " ".join(self.componentsSeparatedByCharactersInSet(NSCharacterSet.letterCharacterSet().invertedSet)).lowercaseString
}
func componentsSeparatedByWhiteSpace() -> [String] {
return self.componentsSeparatedByCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet())
}
func trimExtraWhiteSpace() -> String {
return " ".join(self.componentsSeparatedByWhiteSpace().filter({!$0.isEmpty}))
}
}
| Refactor functions into String extension | Refactor functions into String extension
This seems to provide improved readability and usability.
| Swift | mit | ctinnell/RichTextEditorExample | swift | ## Code Before:
//
// NoteEditorTextViewDelegate.swift
// RichTextExample
//
// Created by Clay Tinnell on 7/27/15.
// Copyright (c) 2015 Clay Tinnell. All rights reserved.
//
import UIKit
import Foundation
class NoteEditorTextViewDelegate: NSObject {
func characterIsAWordTerminator(text: String) -> Bool {
return text == " " || text == "." || text == "," || text == ":" || text == ";" || text == "\n" || text == "!" || text == "?"
}
func stripSpecialCharacters(text: String) -> [String]? {
let strippedLowerCaseText = " ".join(text.componentsSeparatedByCharactersInSet(NSCharacterSet.letterCharacterSet().invertedSet)).lowercaseString
let trimmedWhiteSpace = strippedLowerCaseText.componentsSeparatedByCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet()).filter({!isEmpty($0)})
return trimmedWhiteSpace
}
}
extension NoteEditorTextViewDelegate : UITextViewDelegate {
func textView(textView: UITextView, shouldChangeTextInRange range: NSRange, replacementText text: String) -> Bool {
println("text changed: \(text)")
if characterIsAWordTerminator(text) {
println("word terminator")
if let strippedText = stripSpecialCharacters(textView.text) {
println(",".join(strippedText))
}
}
return true
}
}
## Instruction:
Refactor functions into String extension
This seems to provide improved readability and usability.
## Code After:
//
// NoteEditorTextViewDelegate.swift
// RichTextExample
//
// Created by Clay Tinnell on 7/27/15.
// Copyright (c) 2015 Clay Tinnell. All rights reserved.
//
import UIKit
import Foundation
class NoteEditorTextViewDelegate: NSObject {
}
extension NoteEditorTextViewDelegate : UITextViewDelegate {
func textView(textView: UITextView, shouldChangeTextInRange range: NSRange, replacementText text: String) -> Bool {
if text.isAWordTerminator() {
println(textView.text.stripSpecialCharacters().trimExtraWhiteSpace())
}
return true
}
}
private extension String {
func isAWordTerminator() -> Bool {
return self == " " || self == "." || self == "," || self == ":" || self == ";" || self == "\n" || self == "!" || self == "?"
}
func stripSpecialCharacters() -> String {
return " ".join(self.componentsSeparatedByCharactersInSet(NSCharacterSet.letterCharacterSet().invertedSet)).lowercaseString
}
func componentsSeparatedByWhiteSpace() -> [String] {
return self.componentsSeparatedByCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet())
}
func trimExtraWhiteSpace() -> String {
return " ".join(self.componentsSeparatedByWhiteSpace().filter({!$0.isEmpty}))
}
}
| //
// NoteEditorTextViewDelegate.swift
// RichTextExample
//
// Created by Clay Tinnell on 7/27/15.
// Copyright (c) 2015 Clay Tinnell. All rights reserved.
//
import UIKit
import Foundation
class NoteEditorTextViewDelegate: NSObject {
- func characterIsAWordTerminator(text: String) -> Bool {
- return text == " " || text == "." || text == "," || text == ":" || text == ";" || text == "\n" || text == "!" || text == "?"
- }
-
- func stripSpecialCharacters(text: String) -> [String]? {
- let strippedLowerCaseText = " ".join(text.componentsSeparatedByCharactersInSet(NSCharacterSet.letterCharacterSet().invertedSet)).lowercaseString
- let trimmedWhiteSpace = strippedLowerCaseText.componentsSeparatedByCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet()).filter({!isEmpty($0)})
- return trimmedWhiteSpace
- }
-
}
extension NoteEditorTextViewDelegate : UITextViewDelegate {
+
func textView(textView: UITextView, shouldChangeTextInRange range: NSRange, replacementText text: String) -> Bool {
-
- println("text changed: \(text)")
- if characterIsAWordTerminator(text) {
? ------ ^^ ----
+ if text.isAWordTerminator() {
? ^^^^
+ println(textView.text.stripSpecialCharacters().trimExtraWhiteSpace())
- println("word terminator")
- if let strippedText = stripSpecialCharacters(textView.text) {
- println(",".join(strippedText))
- }
}
return true
}
}
+
+ private extension String {
+ func isAWordTerminator() -> Bool {
+ return self == " " || self == "." || self == "," || self == ":" || self == ";" || self == "\n" || self == "!" || self == "?"
+ }
+
+ func stripSpecialCharacters() -> String {
+ return " ".join(self.componentsSeparatedByCharactersInSet(NSCharacterSet.letterCharacterSet().invertedSet)).lowercaseString
+ }
+
+ func componentsSeparatedByWhiteSpace() -> [String] {
+ return self.componentsSeparatedByCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet())
+ }
+
+ func trimExtraWhiteSpace() -> String {
+ return " ".join(self.componentsSeparatedByWhiteSpace().filter({!$0.isEmpty}))
+ }
+ }
+
+
+ | 41 | 1.078947 | 24 | 17 |
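The Swift refactor above moves the word-terminator test, special-character stripping, and whitespace trimming into a `String` extension. The same three utilities, sketched in Python (ASCII-only, whereas the Swift version uses Unicode `letterCharacterSet`):

```python
import re

_WORD_TERMINATORS = set(" .,:;!?\n")

def is_word_terminator(ch):
    """True for the characters the Swift isAWordTerminator() checks."""
    return ch in _WORD_TERMINATORS

def strip_special_characters(text):
    """Replace every non-letter run with a space and lowercase,
    like stripSpecialCharacters() above."""
    return re.sub(r"[^A-Za-z]+", " ", text).lower()

def trim_extra_whitespace(text):
    """Collapse runs of whitespace, like trimExtraWhiteSpace()."""
    return " ".join(text.split())
```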
b77e58c2c18c50990375ae46d811600143b3bed8 | tools/lld/TODO.txt | tools/lld/TODO.txt | tools/lld
~~~~~~~~~
Driver
------
lld needs a driver that supports gnu-ld_, ld64_, and link.exe_ arguments. It
would be nice to refactor the argument parsing parts of `Clang's`_ driver support
out to LLVM's Support library.
.. _gnu-ld: http://sourceware.org/binutils/docs-2.22/ld/Options.html#Options
.. _ld64: https://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/Xcode-3.2.5/man1/ld.1.html
.. _link.exe: http://msdn.microsoft.com/en-us/library/y0zzbyt4(v=vs.110).aspx
.. _Clang's: http://clang.llvm.org/docs/DriverInternals.html
| tools/lld
~~~~~~~~~
Driver
------
lld needs a driver that supports gnu-ld_, ld64_, and link.exe_ arguments. It
would be nice to refactor the argument parsing parts of `Clang's`_ driver support
out to LLVM's Support library.
.. _gnu-ld: http://sourceware.org/binutils/docs-2.22/ld/Options.html#Options
.. _ld64: https://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/Xcode-3.2.5/man1/ld.1.html
.. _link.exe: http://msdn.microsoft.com/en-us/library/y0zzbyt4(v=vs.110).aspx
.. _Clang's: http://clang.llvm.org/docs/DriverInternals.html
Driver Requirements
*******************
The following are the different types of arguments that the driver will need to
support for compatibility with each linker.
link.exe
^^^^^^^^
Output: Default output filename is the name of the first file with .obj on the
command line with .exe or .dll appended. Override with /OUT.
Options are case insensitive and can begin with - or /.
Types of options:
* @response_file
* {-/}flag
* {-/}flag[:number]
* {-/}flag[:string]
* {-/}flag:string
* {-/}flag:string[,[string][,string]]
* {-/}flag:string[,number]
* {-/}flag:@string,string
* {-/}flag:string[,@string[,string]][,string]
* {-/}flag:[string\]string
* {-/}flag:[string | string=string]
* {-/}flag:string,[[!]{DEKPRSW}][,ALIGN=#]
When options conflict, the last one wins.
ld64
^^^^
gnuld
^^^^^
| Add some docs on the type of flags link.exe has that our driver will need to support. | Add some docs on the type of flags link.exe has that our driver will need to support.
git-svn-id: f6089bf0e6284f307027cef4f64114ee9ebb0424@155861 91177308-0d34-0410-b5e6-96231b3b80d8
| Text | apache-2.0 | llvm-mirror/lld,llvm-mirror/lld | text | ## Code Before:
tools/lld
~~~~~~~~~
Driver
------
lld needs a driver that supports gnu-ld_, ld64_, and link.exe_ arguments. It
would be nice to refactor the argument parsing parts of `Clang's`_ driver support
out to LLVM's Support library.
.. _gnu-ld: http://sourceware.org/binutils/docs-2.22/ld/Options.html#Options
.. _ld64: https://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/Xcode-3.2.5/man1/ld.1.html
.. _link.exe: http://msdn.microsoft.com/en-us/library/y0zzbyt4(v=vs.110).aspx
.. _Clang's: http://clang.llvm.org/docs/DriverInternals.html
## Instruction:
Add some docs on the type of flags link.exe has that our driver will need to support.
git-svn-id: f6089bf0e6284f307027cef4f64114ee9ebb0424@155861 91177308-0d34-0410-b5e6-96231b3b80d8
## Code After:
tools/lld
~~~~~~~~~
Driver
------
lld needs a driver that supports gnu-ld_, ld64_, and link.exe_ arguments. It
would be nice to refactor the argument parsing parts of `Clang's`_ driver support
out to LLVM's Support library.
.. _gnu-ld: http://sourceware.org/binutils/docs-2.22/ld/Options.html#Options
.. _ld64: https://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/Xcode-3.2.5/man1/ld.1.html
.. _link.exe: http://msdn.microsoft.com/en-us/library/y0zzbyt4(v=vs.110).aspx
.. _Clang's: http://clang.llvm.org/docs/DriverInternals.html
Driver Requirements
*******************
The following are the different types of arguments that the driver will need to
support for compatibility with each linker.
link.exe
^^^^^^^^
Output: Default output filename is the name of the first file with .obj on the
command line with .exe or .dll appended. Override with /OUT.
Options are case insensitive and can begin with - or /.
Types of options:
* @response_file
* {-/}flag
* {-/}flag[:number]
* {-/}flag[:string]
* {-/}flag:string
* {-/}flag:string[,[string][,string]]
* {-/}flag:string[,number]
* {-/}flag:@string,string
* {-/}flag:string[,@string[,string]][,string]
* {-/}flag:[string\]string
* {-/}flag:[string | string=string]
* {-/}flag:string,[[!]{DEKPRSW}][,ALIGN=#]
When options conflict, the last one wins.
ld64
^^^^
gnuld
^^^^^
| tools/lld
~~~~~~~~~
Driver
------
lld needs a driver that supports gnu-ld_, ld64_, and link.exe_ arguments. It
would be nice to refactor the argument parsing parts of `Clang's`_ driver support
out to LLVM's Support library.
.. _gnu-ld: http://sourceware.org/binutils/docs-2.22/ld/Options.html#Options
.. _ld64: https://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/Xcode-3.2.5/man1/ld.1.html
.. _link.exe: http://msdn.microsoft.com/en-us/library/y0zzbyt4(v=vs.110).aspx
.. _Clang's: http://clang.llvm.org/docs/DriverInternals.html
+
+ Driver Requirements
+ *******************
+
+ The following are the different types of arguments that the driver will need to
+ support for compatibility with each linker.
+
+ link.exe
+ ^^^^^^^^
+
+ Output: Default output filename is the name of the first file with .obj on the
+ command line with .exe or .dll appended. Override with /OUT.
+
+ Options are case insensitive and can begin with - or /.
+
+ Types of options:
+
+ * @response_file
+ * {-/}flag
+ * {-/}flag[:number]
+ * {-/}flag[:string]
+ * {-/}flag:string
+ * {-/}flag:string[,[string][,string]]
+ * {-/}flag:string[,number]
+ * {-/}flag:@string,string
+ * {-/}flag:string[,@string[,string]][,string]
+ * {-/}flag:[string\]string
+ * {-/}flag:[string | string=string]
+ * {-/}flag:string,[[!]{DEKPRSW}][,ALIGN=#]
+
+ When options conflict, the last one wins.
+
+ ld64
+ ^^^^
+
+ gnuld
+ ^^^^^ | 37 | 2.642857 | 37 | 0 |
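The TODO text added above enumerates link.exe's option shapes: case-insensitive flags beginning with `-` or `/`, optionally followed by `:` and a value. A minimal recognizer for that shape (a sketch of the parsing rule, not lld's actual driver code):

```python
def parse_link_option(arg):
    """Split a link.exe-style argument into (flag, value).  Flags are
    case-insensitive and may start with '-' or '/'; anything else is a
    positional input such as an .obj file."""
    if not arg or arg[0] not in "-/":
        return None
    flag, sep, value = arg[1:].partition(":")
    return (flag.lower(), value if sep else None)
```

Because "the last one wins", a driver built on this would simply overwrite earlier values when storing parsed flags into a dict.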
7b7605dcc77909afed0141b87ec586a865b5f426 | modes/dired-conf.el | modes/dired-conf.el | ;;; dired-conf.el -- Settings for dired-mode
(eval-when-compile
(require 'dired)
(require 'dired-aux))
(setq dired-listing-switches "-lRA --ignore='.git' --group-directories-first"
dired-auto-revert-buffer t
dired-isearch-filenames t)
| ;;; dired-conf.el -- Settings for dired-mode
(eval-when-compile
(require 'dired)
(require 'dired-aux))
(setq dired-listing-switches "-lRA --ignore='.git' --group-directories-first"
dired-auto-revert-buffer t
dired-isearch-filenames t)
(defun pjones:dired-show-only-matching-files (regexp)
(interactive "sFiles to show (regexp): ")
(dired-mark-files-regexp regexp)
(dired-toggle-marks)
(dired-do-kill-lines))
(defun pjones:dired-load-hook ()
(define-key dired-mode-map [?%?h] 'pjones:dired-show-only-matching-files))
(add-hook 'dired-load-hook 'pjones:dired-load-hook)
(add-hook 'dired-mode-hook 'turn-on-gnus-dired-mode)
| Add a new dired function to only show files that match a regexp | Add a new dired function to only show files that match a regexp
| Emacs Lisp | bsd-3-clause | pjones/emacsrc | emacs-lisp | ## Code Before:
;;; dired-conf.el -- Settings for dired-mode
(eval-when-compile
(require 'dired)
(require 'dired-aux))
(setq dired-listing-switches "-lRA --ignore='.git' --group-directories-first"
dired-auto-revert-buffer t
dired-isearch-filenames t)
## Instruction:
Add a new dired function to only show files that match a regexp
## Code After:
;;; dired-conf.el -- Settings for dired-mode
(eval-when-compile
(require 'dired)
(require 'dired-aux))
(setq dired-listing-switches "-lRA --ignore='.git' --group-directories-first"
dired-auto-revert-buffer t
dired-isearch-filenames t)
(defun pjones:dired-show-only-matching-files (regexp)
(interactive "sFiles to show (regexp): ")
(dired-mark-files-regexp regexp)
(dired-toggle-marks)
(dired-do-kill-lines))
(defun pjones:dired-load-hook ()
(define-key dired-mode-map [?%?h] 'pjones:dired-show-only-matching-files))
(add-hook 'dired-load-hook 'pjones:dired-load-hook)
(add-hook 'dired-mode-hook 'turn-on-gnus-dired-mode)
| ;;; dired-conf.el -- Settings for dired-mode
(eval-when-compile
(require 'dired)
(require 'dired-aux))
(setq dired-listing-switches "-lRA --ignore='.git' --group-directories-first"
dired-auto-revert-buffer t
dired-isearch-filenames t)
+
+ (defun pjones:dired-show-only-matching-files (regexp)
+ (interactive "sFiles to show (regexp): ")
+ (dired-mark-files-regexp regexp)
+ (dired-toggle-marks)
+ (dired-do-kill-lines))
+
+ (defun pjones:dired-load-hook ()
+ (define-key dired-mode-map [?%?h] 'pjones:dired-show-only-matching-files))
+
+ (add-hook 'dired-load-hook 'pjones:dired-load-hook)
+ (add-hook 'dired-mode-hook 'turn-on-gnus-dired-mode) | 12 | 1.5 | 12 | 0 |
9de844864b3e6c732241a68d1871f701232d2733 | celery_janitor/utils.py | celery_janitor/utils.py | import importlib
import urlparse
from celery_janitor import conf
from celery_janitor.exceptions import BackendNotSupportedException
BACKEND_MAPPING = {
'sqs': 'celery_janitor.backends.sqs.SQSBackend'
}
def import_class(path):
path_bits = path.split('.')
class_name = path_bits.pop()
module_path = '.'.join(path_bits)
module_itself = importlib.import_module(module_path)
if not hasattr(module_itself, class_name):
raise ImportError("Module '%s' has no '%s' class." % (module_path, class_name))
return getattr(module_itself, class_name)
class Config(object):
def __init__(self):
self.broker = urlparse.urlparse(conf.BROKER_URL)
def get_backend_class(self):
try:
return BACKEND_MAPPING[self.broker.scheme]
except KeyError:
raise BackendNotSupportedException(
"{} not supported".format(self.broker.scheme))
def get_credentials(self):
if self.broker.scheme == 'sqs':
access_id, access_secret = self.broker.netloc.split(':')
access_secret = access_secret[:-1]
return (access_id, access_secret)
def get_backend():
config = Config()
backend_class = config.get_backend()
backend = import_class(backend_class)
return backend(*config.get_credentials())
| import importlib
try:
from urlparse import urlparse
except ImportError: # Python 3.x
from urllib.parse import urlparse
from celery_janitor import conf
from celery_janitor.exceptions import BackendNotSupportedException
BACKEND_MAPPING = {
'sqs': 'celery_janitor.backends.sqs.SQSBackend'
}
def import_class(path):
path_bits = path.split('.')
class_name = path_bits.pop()
module_path = '.'.join(path_bits)
module_itself = importlib.import_module(module_path)
if not hasattr(module_itself, class_name):
raise ImportError("Module '%s' has no '%s' class." % (module_path, class_name))
return getattr(module_itself, class_name)
class Config(object):
def __init__(self):
self.broker = urlparse(conf.BROKER_URL)
def get_backend_class(self):
try:
return BACKEND_MAPPING[self.broker.scheme]
except KeyError:
raise BackendNotSupportedException(
"{} not supported".format(self.broker.scheme))
def get_credentials(self):
if self.broker.scheme == 'sqs':
access_id, access_secret = self.broker.netloc.split(':')
access_secret = access_secret[:-1]
return (access_id, access_secret)
def get_backend():
config = Config()
backend_class = config.get_backend()
backend = import_class(backend_class)
return backend(*config.get_credentials())
| Fix Python 3.4 import error | Fix Python 3.4 import error
| Python | mit | comandrei/celery-janitor | python | ## Code Before:
import importlib
import urlparse
from celery_janitor import conf
from celery_janitor.exceptions import BackendNotSupportedException
BACKEND_MAPPING = {
'sqs': 'celery_janitor.backends.sqs.SQSBackend'
}
def import_class(path):
path_bits = path.split('.')
class_name = path_bits.pop()
module_path = '.'.join(path_bits)
module_itself = importlib.import_module(module_path)
if not hasattr(module_itself, class_name):
raise ImportError("Module '%s' has no '%s' class." % (module_path, class_name))
return getattr(module_itself, class_name)
class Config(object):
def __init__(self):
self.broker = urlparse.urlparse(conf.BROKER_URL)
def get_backend_class(self):
try:
return BACKEND_MAPPING[self.broker.scheme]
except KeyError:
raise BackendNotSupportedException(
"{} not supported".format(self.broker.scheme))
def get_credentials(self):
if self.broker.scheme == 'sqs':
access_id, access_secret = self.broker.netloc.split(':')
access_secret = access_secret[:-1]
return (access_id, access_secret)
def get_backend():
config = Config()
backend_class = config.get_backend()
backend = import_class(backend_class)
return backend(*config.get_credentials())
## Instruction:
Fix Python 3.4 import error
## Code After:
import importlib
try:
from urlparse import urlparse
except ImportError: # Python 3.x
from urllib.parse import urlparse
from celery_janitor import conf
from celery_janitor.exceptions import BackendNotSupportedException
BACKEND_MAPPING = {
'sqs': 'celery_janitor.backends.sqs.SQSBackend'
}
def import_class(path):
path_bits = path.split('.')
class_name = path_bits.pop()
module_path = '.'.join(path_bits)
module_itself = importlib.import_module(module_path)
if not hasattr(module_itself, class_name):
raise ImportError("Module '%s' has no '%s' class." % (module_path, class_name))
return getattr(module_itself, class_name)
class Config(object):
def __init__(self):
self.broker = urlparse(conf.BROKER_URL)
def get_backend_class(self):
try:
return BACKEND_MAPPING[self.broker.scheme]
except KeyError:
raise BackendNotSupportedException(
"{} not supported".format(self.broker.scheme))
def get_credentials(self):
if self.broker.scheme == 'sqs':
access_id, access_secret = self.broker.netloc.split(':')
access_secret = access_secret[:-1]
return (access_id, access_secret)
def get_backend():
config = Config()
backend_class = config.get_backend()
backend = import_class(backend_class)
return backend(*config.get_credentials())
| import importlib
- import urlparse
+ try:
+ from urlparse import urlparse
+ except ImportError: # Python 3.x
+ from urllib.parse import urlparse
from celery_janitor import conf
from celery_janitor.exceptions import BackendNotSupportedException
BACKEND_MAPPING = {
'sqs': 'celery_janitor.backends.sqs.SQSBackend'
}
def import_class(path):
path_bits = path.split('.')
class_name = path_bits.pop()
module_path = '.'.join(path_bits)
module_itself = importlib.import_module(module_path)
if not hasattr(module_itself, class_name):
raise ImportError("Module '%s' has no '%s' class." % (module_path, class_name))
return getattr(module_itself, class_name)
class Config(object):
def __init__(self):
- self.broker = urlparse.urlparse(conf.BROKER_URL)
? ---------
+ self.broker = urlparse(conf.BROKER_URL)
def get_backend_class(self):
try:
return BACKEND_MAPPING[self.broker.scheme]
except KeyError:
raise BackendNotSupportedException(
"{} not supported".format(self.broker.scheme))
def get_credentials(self):
if self.broker.scheme == 'sqs':
access_id, access_secret = self.broker.netloc.split(':')
access_secret = access_secret[:-1]
return (access_id, access_secret)
def get_backend():
config = Config()
backend_class = config.get_backend()
backend = import_class(backend_class)
return backend(*config.get_credentials()) | 7 | 0.145833 | 5 | 2 |
a90da195fd6163b97c42dc074a7930660fb7a72b | README.md | README.md |
- Node (8+)
- NPM
- [Jekyll](https://jekyllrb.com/docs/installation/)
## Setup
Install node dev packages:
```bash
npm i
```
Initialize lerna:
```bash
npx lerna init
```
## Build
Compile typescript for all packages:
```bash
./scripts/build/cli
```
## Run
Runs the command line app:
```bash
./scripts/run/cli --help
```
## Compile example project
Outputs a compiled JSON graph for one of the example projects:
```bash
./scripts/run/cli compile ts/example-bigquery
```
## Serve docs
Runs Jekyll locally at http://localhost:4000
```bash
(cd docs && bundle exec jekyll serve)
```
## Run tests
Run mocha tests in `ts/tests`:
```bash
./scripts/test
```
## Publish
Publish a new version with lerna:
```bash
npx lerna publish
```
|
- Node (8+)
- NPM
- [Jekyll](https://jekyllrb.com/docs/installation/)
## Setup
Install node dev packages:
```bash
npm i
```
Install lerna managed packages:
```bash
npx lerna bootstrap
```
## Build
Compile typescript for all packages:
```bash
./scripts/build/cli
```
## Run
Runs the command line app:
```bash
./scripts/run/cli --help
```
## Compile example project
Outputs a compiled JSON graph for one of the example projects:
```bash
./scripts/run/cli compile ts/example-bigquery
```
## Serve docs
Runs Jekyll locally at http://localhost:4000
```bash
(cd docs && bundle exec jekyll serve)
```
## Run tests
Run mocha tests in `ts/tests`:
```bash
./scripts/test
```
## Publish
Publish a new version with lerna:
```bash
npx lerna publish
```
| Update readme with proper lerna command | Update readme with proper lerna command
| Markdown | mit | dataform-co/dataform,dataform-co/dataform,dataform-co/dataform,dataform-co/dataform,dataform-co/dataform | markdown | ## Code Before:
- Node (8+)
- NPM
- [Jekyll](https://jekyllrb.com/docs/installation/)
## Setup
Install node dev packages:
```bash
npm i
```
Initialize lerna:
```bash
npx lerna init
```
## Build
Compile typescript for all packages:
```bash
./scripts/build/cli
```
## Run
Runs the command line app:
```bash
./scripts/run/cli --help
```
## Compile example project
Outputs a compiled JSON graph for one of the example projects:
```bash
./scripts/run/cli compile ts/example-bigquery
```
## Serve docs
Runs Jekyll locally at http://localhost:4000
```bash
(cd docs && bundle exec jekyll serve)
```
## Run tests
Run mocha tests in `ts/tests`:
```bash
./scripts/test
```
## Publish
Publish a new version with lerna:
```bash
npx lerna publish
```
## Instruction:
Update readme with proper lerna command
## Code After:
- Node (8+)
- NPM
- [Jekyll](https://jekyllrb.com/docs/installation/)
## Setup
Install node dev packages:
```bash
npm i
```
Install lerna managed packages:
```bash
npx lerna bootstrap
```
## Build
Compile typescript for all packages:
```bash
./scripts/build/cli
```
## Run
Runs the command line app:
```bash
./scripts/run/cli --help
```
## Compile example project
Outputs a compiled JSON graph for one of the example projects:
```bash
./scripts/run/cli compile ts/example-bigquery
```
## Serve docs
Runs Jekyll locally at http://localhost:4000
```bash
(cd docs && bundle exec jekyll serve)
```
## Run tests
Run mocha tests in `ts/tests`:
```bash
./scripts/test
```
## Publish
Publish a new version with lerna:
```bash
npx lerna publish
```
|
- Node (8+)
- NPM
- [Jekyll](https://jekyllrb.com/docs/installation/)
## Setup
Install node dev packages:
```bash
npm i
```
- Initialize lerna:
+ Install lerna managed packages:
```bash
- npx lerna init
+ npx lerna bootstrap
```
## Build
Compile typescript for all packages:
```bash
./scripts/build/cli
```
## Run
Runs the command line app:
```bash
./scripts/run/cli --help
```
## Compile example project
Outputs a compiled JSON graph for one of the example projects:
```bash
./scripts/run/cli compile ts/example-bigquery
```
## Serve docs
Runs Jekyll locally at http://localhost:4000
```bash
(cd docs && bundle exec jekyll serve)
```
## Run tests
Run mocha tests in `ts/tests`:
```bash
./scripts/test
```
## Publish
Publish a new version with lerna:
```bash
npx lerna publish
``` | 4 | 0.060606 | 2 | 2 |
1d660ad431e5b760359535a6563c557ec8490efe | app.yaml | app.yaml | runtime: go
vm: true
api_version: go1
handlers:
- url: /.*
script: _go_app
secure: always | runtime: go
vm: true
api_version: go1
handlers:
- url: /.*
script: _go_app
secure: always
redirect_http_response_code: 301
| Change HTTP -> HTTPS redirect to a 301. | Change HTTP -> HTTPS redirect to a 301.
Apparently 302 is the default: https://cloud.google.com/appengine/docs/go/config/appref#handlers_redirect_http_response_code
Thanks for @ivanr for finding (using hardenize.com).
| YAML | bsd-3-clause | chromium/hstspreload.org,chromium/hstspreload.org,chromium/hstspreload.org,chromium/hstspreload.org | yaml | ## Code Before:
runtime: go
vm: true
api_version: go1
handlers:
- url: /.*
script: _go_app
secure: always
## Instruction:
Change HTTP -> HTTPS redirect to a 301.
Apparently 302 is the default: https://cloud.google.com/appengine/docs/go/config/appref#handlers_redirect_http_response_code
Thanks for @ivanr for finding (using hardenize.com).
## Code After:
runtime: go
vm: true
api_version: go1
handlers:
- url: /.*
script: _go_app
secure: always
redirect_http_response_code: 301
| runtime: go
vm: true
api_version: go1
handlers:
- url: /.*
script: _go_app
secure: always
+ redirect_http_response_code: 301 | 1 | 0.125 | 1 | 0 |
73b9d262433290c174cd7decce85f167aae28adb | index.js | index.js | "use strict";
var port = process.env.PORT || 8080;
var file = require("fs").readFileSync;
var glob = require("glob");
var path = require("path");
var rested = require("rested");
var routes = glob.sync("routes/*.js").map(function (e, i, a) {
return require(path.join(__dirname, e));
});
var server = rested.createServer({
"crt": file(path.join(__dirname, "files/api.munapps.ca.crt")),
"key": file(path.join(__dirname, "files/api.munapps.ca.key")),
"secure": true,
"routes": routes
});
server.listen(port, function () {
console.log("Server listening on port " + port);
});
| "use strict";
var port = process.env.PORT || 8080;
var file = require("fs").readFileSync;
var glob = require("glob");
var path = require("path");
var rested = require("rested");
var routes = glob.sync("routes/*.js").map(function (e, i, a) {
return require(path.join(__dirname, e));
});
var server = rested.createServer({
"crt": file(path.join(__dirname, "files/api.munapps.ca.crt")),
"key": file(path.join(__dirname, "files/api.munapps.ca.key")),
"secure": true,
"routes": routes
});
server.eachRequest(function (request, response) {
response.setHeader("X-Powered-By", "Coffee");
response.setHeader("X-Shenanigans", "none");
response.setHeader("X-GitHub", "http://git.io/3K55mA");
});
server.listen(port, function () {
console.log("Server listening on port " + port);
});
| Add headers common to each request | Add headers common to each request
| JavaScript | bsd-3-clause | munapps/munapps-api | javascript | ## Code Before:
"use strict";
var port = process.env.PORT || 8080;
var file = require("fs").readFileSync;
var glob = require("glob");
var path = require("path");
var rested = require("rested");
var routes = glob.sync("routes/*.js").map(function (e, i, a) {
return require(path.join(__dirname, e));
});
var server = rested.createServer({
"crt": file(path.join(__dirname, "files/api.munapps.ca.crt")),
"key": file(path.join(__dirname, "files/api.munapps.ca.key")),
"secure": true,
"routes": routes
});
server.listen(port, function () {
console.log("Server listening on port " + port);
});
## Instruction:
Add headers common to each request
## Code After:
"use strict";
var port = process.env.PORT || 8080;
var file = require("fs").readFileSync;
var glob = require("glob");
var path = require("path");
var rested = require("rested");
var routes = glob.sync("routes/*.js").map(function (e, i, a) {
return require(path.join(__dirname, e));
});
var server = rested.createServer({
"crt": file(path.join(__dirname, "files/api.munapps.ca.crt")),
"key": file(path.join(__dirname, "files/api.munapps.ca.key")),
"secure": true,
"routes": routes
});
server.eachRequest(function (request, response) {
response.setHeader("X-Powered-By", "Coffee");
response.setHeader("X-Shenanigans", "none");
response.setHeader("X-GitHub", "http://git.io/3K55mA");
});
server.listen(port, function () {
console.log("Server listening on port " + port);
});
| "use strict";
var port = process.env.PORT || 8080;
var file = require("fs").readFileSync;
var glob = require("glob");
var path = require("path");
var rested = require("rested");
var routes = glob.sync("routes/*.js").map(function (e, i, a) {
return require(path.join(__dirname, e));
});
var server = rested.createServer({
"crt": file(path.join(__dirname, "files/api.munapps.ca.crt")),
"key": file(path.join(__dirname, "files/api.munapps.ca.key")),
"secure": true,
"routes": routes
});
+ server.eachRequest(function (request, response) {
+ response.setHeader("X-Powered-By", "Coffee");
+ response.setHeader("X-Shenanigans", "none");
+ response.setHeader("X-GitHub", "http://git.io/3K55mA");
+ });
+
server.listen(port, function () {
console.log("Server listening on port " + port);
}); | 6 | 0.26087 | 6 | 0 |
6eefc0acf6e7ffeaa9e22bd4c7929e75128f636d | README.md | README.md | Library for interacting with pCloud's API. Uses pCloud's own binary protocol.
|
An official type-safe networking client for [ pCloud's API][docs] binary protocol by pCloud AG.
## Requirements
- Java 7.0+
- Android 2.3+ (API9+)
## Documentation
The pCloud API and binary protocol documentation can be found [here][docs].
[site]: https://www.pcloud.com/
[docs]: https://docs.pcloud.com/
| Update the readme file with something more meaningful | (docs): Update the readme file with something more meaningful
| Markdown | apache-2.0 | pCloud/pcloud-networking-java | markdown | ## Code Before:
Library for interacting with pCloud's API. Uses pCloud's own binary protocol.
## Instruction:
(docs): Update the readme file with something more meaningful
## Code After:
An official type-safe networking client for [ pCloud's API][docs] binary protocol by pCloud AG.
## Requirements
- Java 7.0+
- Android 2.3+ (API9+)
## Documentation
The pCloud API and binary protocol documentation can be found [here][docs].
[site]: https://www.pcloud.com/
[docs]: https://docs.pcloud.com/
| - Library for interacting with pCloud's API. Uses pCloud's own binary protocol.
+
+ An official type-safe networking client for [ pCloud's API][docs] binary protocol by pCloud AG.
+
+ ## Requirements
+
+ - Java 7.0+
+ - Android 2.3+ (API9+)
+
+ ## Documentation
+
+ The pCloud API and binary protocol documentation can be found [here][docs].
+
+
+
+ [site]: https://www.pcloud.com/
+ [docs]: https://docs.pcloud.com/ | 17 | 17 | 16 | 1 |
8377ed36990acb2eae2b1a4775f00688ab4490ea | tests/performance.c | tests/performance.c | int test_performance() {
asic_t *device = asic_init(TI83p);
struct timespec start, stop;
unsigned long long t;
int i;
clock_gettime(CLOCK_MONOTONIC_RAW, &start);
for (i = 0; i < 1000000; i++) {
i -= cpu_execute(device->cpu, 1);
}
clock_gettime(CLOCK_MONOTONIC_RAW, &stop);
t = (stop.tv_sec*1000000000UL) + stop.tv_nsec;
t -= (start.tv_sec * 1000000000UL) + start.tv_nsec;
printf("executed 1,000,000 cycles in %llu microseconds (~%llu MHz)\n", t/1000, 1000000000/t);
asic_free(device);
return -1;
}
| int test_performance() {
asic_t *device = asic_init(TI83p);
clock_t start, stop;
int i;
start = clock();
for (i = 0; i < 1000000; i++) {
i -= cpu_execute(device->cpu, 1);
}
stop = clock();
double time = (double)(stop - start) / (CLOCKS_PER_SEC / 1000);
double mHz = 1000.0 / time;
printf("executed 1,000,000 cycles in %f milliseconds (~%f MHz)\n", time, mHz);
asic_free(device);
return -1;
}
| Switch to more common timing functions | Switch to more common timing functions
| C | mit | KnightOS/z80e,KnightOS/z80e | c | ## Code Before:
int test_performance() {
asic_t *device = asic_init(TI83p);
struct timespec start, stop;
unsigned long long t;
int i;
clock_gettime(CLOCK_MONOTONIC_RAW, &start);
for (i = 0; i < 1000000; i++) {
i -= cpu_execute(device->cpu, 1);
}
clock_gettime(CLOCK_MONOTONIC_RAW, &stop);
t = (stop.tv_sec*1000000000UL) + stop.tv_nsec;
t -= (start.tv_sec * 1000000000UL) + start.tv_nsec;
printf("executed 1,000,000 cycles in %llu microseconds (~%llu MHz)\n", t/1000, 1000000000/t);
asic_free(device);
return -1;
}
## Instruction:
Switch to more common timing functions
## Code After:
int test_performance() {
asic_t *device = asic_init(TI83p);
clock_t start, stop;
int i;
start = clock();
for (i = 0; i < 1000000; i++) {
i -= cpu_execute(device->cpu, 1);
}
stop = clock();
double time = (double)(stop - start) / (CLOCKS_PER_SEC / 1000);
double mHz = 1000.0 / time;
printf("executed 1,000,000 cycles in %f milliseconds (~%f MHz)\n", time, mHz);
asic_free(device);
return -1;
}
| int test_performance() {
asic_t *device = asic_init(TI83p);
+ clock_t start, stop;
- struct timespec start, stop;
- unsigned long long t;
int i;
- clock_gettime(CLOCK_MONOTONIC_RAW, &start);
+ start = clock();
for (i = 0; i < 1000000; i++) {
i -= cpu_execute(device->cpu, 1);
}
- clock_gettime(CLOCK_MONOTONIC_RAW, &stop);
- t = (stop.tv_sec*1000000000UL) + stop.tv_nsec;
- t -= (start.tv_sec * 1000000000UL) + start.tv_nsec;
+ stop = clock();
+ double time = (double)(stop - start) / (CLOCKS_PER_SEC / 1000);
+ double mHz = 1000.0 / time;
- printf("executed 1,000,000 cycles in %llu microseconds (~%llu MHz)\n", t/1000, 1000000000/t);
? ^^^ ^^^ ^^^ ^^^^^ ^^^^^^^^^^^^
+ printf("executed 1,000,000 cycles in %f milliseconds (~%f MHz)\n", time, mHz);
? ^ ^^^ ^ ^^^ ^^^
asic_free(device);
return -1;
} | 13 | 0.722222 | 6 | 7 |
0e8f6d4d134854eb38e89ba60d9ff9e591f2ce81 | graphql/types/comment-input.js | graphql/types/comment-input.js | import {
GraphQLInputObjectType,
GraphQLString,
GraphQLID,
GraphQLNonNull
} from 'graphql';
export default new GraphQLInputObjectType({
name: 'CommentInput',
fields: {
_id: {
type: new GraphQLNonNull(GraphQLID)
},
postId: {
type: new GraphQLNonNull(GraphQLID)
},
text: {
type: new GraphQLNonNull(GraphQLString)
}
}
});
| import {
GraphQLInputObjectType,
GraphQLString,
GraphQLID,
GraphQLNonNull
} from 'graphql';
export default new GraphQLInputObjectType({
name: 'CommentInput',
fields: {
postId: {
type: new GraphQLNonNull(GraphQLID)
},
text: {
type: new GraphQLNonNull(GraphQLString)
}
}
});
| Remove ID in CommentInput type | Remove ID in CommentInput type
| JavaScript | mit | maxpou/graphql-nodejs-mongodb | javascript | ## Code Before:
import {
GraphQLInputObjectType,
GraphQLString,
GraphQLID,
GraphQLNonNull
} from 'graphql';
export default new GraphQLInputObjectType({
name: 'CommentInput',
fields: {
_id: {
type: new GraphQLNonNull(GraphQLID)
},
postId: {
type: new GraphQLNonNull(GraphQLID)
},
text: {
type: new GraphQLNonNull(GraphQLString)
}
}
});
## Instruction:
Remove ID in CommentInput type
## Code After:
import {
GraphQLInputObjectType,
GraphQLString,
GraphQLID,
GraphQLNonNull
} from 'graphql';
export default new GraphQLInputObjectType({
name: 'CommentInput',
fields: {
postId: {
type: new GraphQLNonNull(GraphQLID)
},
text: {
type: new GraphQLNonNull(GraphQLString)
}
}
});
| import {
GraphQLInputObjectType,
GraphQLString,
GraphQLID,
GraphQLNonNull
} from 'graphql';
export default new GraphQLInputObjectType({
name: 'CommentInput',
fields: {
- _id: {
- type: new GraphQLNonNull(GraphQLID)
- },
postId: {
type: new GraphQLNonNull(GraphQLID)
},
text: {
type: new GraphQLNonNull(GraphQLString)
}
}
}); | 3 | 0.142857 | 0 | 3 |
4c656fa459751c968fc8aed6194ebc1a0fe0dc01 | lib/psd/layer/info/sheet_color.rb | lib/psd/layer/info/sheet_color.rb | require 'psd/layer_info'
class PSD
class SheetColor < LayerInfo
def self.should_parse?(key)
key == 'lclr'
end
def parse
@data = [
@file.read_short,
@file.read_short,
@file.read_short,
@file.read_short
]
end
end
end | require 'psd/layer_info'
class PSD
# This is the color label for a group/layer. Not sure why Adobe
# refers to it as the "Sheet Color".
class SheetColor < LayerInfo
def self.should_parse?(key)
key == 'lclr'
end
COLORS = [
:no_color,
:red,
:orange,
:yellow,
:green,
:blue,
:violet,
:gray
]
def parse
# Only the first entry is used, the rest are always 0.
@data = [
@file.read_short,
@file.read_short,
@file.read_short,
@file.read_short
]
end
def color
COLORS[@data[0]]
end
end
end
| Make it easy to get label color | Make it easy to get label color
| Ruby | mit | layervault/psd.rb | ruby | ## Code Before:
require 'psd/layer_info'
class PSD
class SheetColor < LayerInfo
def self.should_parse?(key)
key == 'lclr'
end
def parse
@data = [
@file.read_short,
@file.read_short,
@file.read_short,
@file.read_short
]
end
end
end
## Instruction:
Make it easy to get label color
## Code After:
require 'psd/layer_info'
class PSD
# This is the color label for a group/layer. Not sure why Adobe
# refers to it as the "Sheet Color".
class SheetColor < LayerInfo
def self.should_parse?(key)
key == 'lclr'
end
COLORS = [
:no_color,
:red,
:orange,
:yellow,
:green,
:blue,
:violet,
:gray
]
def parse
# Only the first entry is used, the rest are always 0.
@data = [
@file.read_short,
@file.read_short,
@file.read_short,
@file.read_short
]
end
def color
COLORS[@data[0]]
end
end
end
| require 'psd/layer_info'
class PSD
+ # This is the color label for a group/layer. Not sure why Adobe
+ # refers to it as the "Sheet Color".
class SheetColor < LayerInfo
def self.should_parse?(key)
key == 'lclr'
end
+ COLORS = [
+ :no_color,
+ :red,
+ :orange,
+ :yellow,
+ :green,
+ :blue,
+ :violet,
+ :gray
+ ]
+
def parse
+ # Only the first entry is used, the rest are always 0.
@data = [
@file.read_short,
@file.read_short,
@file.read_short,
@file.read_short
]
end
+
+ def color
+ COLORS[@data[0]]
+ end
end
end | 18 | 1 | 18 | 0 |
b97f50bc64ae57ac60b882a9281813c701adc325 | src/main/java/Main.java | src/main/java/Main.java | import java.util.Scanner;
public class Main {
public static void main(String[] args) {
printTitle();
titleWait();
System.out.println("You wake up on a beach.");
}
private static void printTitle() {
String banner = "___________ .__ ___________ __ \n" +
"\\_ _____/_____ |__| ___\\__ ___/___ ___ ____/ |_ \n" +
" | __)_\\____ \\| |/ ___\\| |_/ __ \\\\ \\/ /\\ __\\\n" +
" | \\ |_> > \\ \\___| |\\ ___/ > < | | \n" +
"/_______ / __/|__|\\___ >____| \\___ >__/\\_ \\ |__| \n" +
" \\/|__| \\/ \\/ \\/ ";
System.out.println(banner);
}
private static void titleWait() {
System.out.println("Press \"ENTER\" to continue...");
Scanner scanner = new Scanner(System.in);
scanner.nextLine();
clearScreen();
}
private static void clearScreen() {
}
}
| import java.util.Scanner;
public class Main {
public static void main(String[] args) {
printTitle();
titleWait();
System.out.println("You wake up on a beach.");
}
private static void printTitle() {
String banner = "___________ .__ ___________ __ \n" +
"\\_ _____/_____ |__| ___\\__ ___/___ ___ ____/ |_ \n" +
" | __)_\\____ \\| |/ ___\\| |_/ __ \\\\ \\/ /\\ __\\\n" +
" | \\ |_> > \\ \\___| |\\ ___/ > < | | \n" +
"/_______ / __/|__|\\___ >____| \\___ >__/\\_ \\ |__| \n" +
" \\/|__| \\/ \\/ \\/ ";
System.out.println(banner);
}
private static void titleWait() {
System.out.println("Press \"ENTER\" to continue...");
Scanner scanner = new Scanner(System.in);
scanner.nextLine();
clearScreen();
}
private static void clearScreen() {
System.out.println("\n\n\n\n\n\n\n\n\n\n");
}
}
| Add new line print to clearScreen() | Add new line print to clearScreen()
| Java | mit | MarkNBroadhead/EpicText | java | ## Code Before:
import java.util.Scanner;
public class Main {
public static void main(String[] args) {
printTitle();
titleWait();
System.out.println("You wake up on a beach.");
}
private static void printTitle() {
String banner = "___________ .__ ___________ __ \n" +
"\\_ _____/_____ |__| ___\\__ ___/___ ___ ____/ |_ \n" +
" | __)_\\____ \\| |/ ___\\| |_/ __ \\\\ \\/ /\\ __\\\n" +
" | \\ |_> > \\ \\___| |\\ ___/ > < | | \n" +
"/_______ / __/|__|\\___ >____| \\___ >__/\\_ \\ |__| \n" +
" \\/|__| \\/ \\/ \\/ ";
System.out.println(banner);
}
private static void titleWait() {
System.out.println("Press \"ENTER\" to continue...");
Scanner scanner = new Scanner(System.in);
scanner.nextLine();
clearScreen();
}
private static void clearScreen() {
}
}
## Instruction:
Add new line print to clearScreen()
## Code After:
import java.util.Scanner;
public class Main {
public static void main(String[] args) {
printTitle();
titleWait();
System.out.println("You wake up on a beach.");
}
private static void printTitle() {
String banner = "___________ .__ ___________ __ \n" +
"\\_ _____/_____ |__| ___\\__ ___/___ ___ ____/ |_ \n" +
" | __)_\\____ \\| |/ ___\\| |_/ __ \\\\ \\/ /\\ __\\\n" +
" | \\ |_> > \\ \\___| |\\ ___/ > < | | \n" +
"/_______ / __/|__|\\___ >____| \\___ >__/\\_ \\ |__| \n" +
" \\/|__| \\/ \\/ \\/ ";
System.out.println(banner);
}
private static void titleWait() {
System.out.println("Press \"ENTER\" to continue...");
Scanner scanner = new Scanner(System.in);
scanner.nextLine();
clearScreen();
}
private static void clearScreen() {
System.out.println("\n\n\n\n\n\n\n\n\n\n");
}
}
| import java.util.Scanner;
public class Main {
public static void main(String[] args) {
printTitle();
titleWait();
System.out.println("You wake up on a beach.");
}
private static void printTitle() {
String banner = "___________ .__ ___________ __ \n" +
"\\_ _____/_____ |__| ___\\__ ___/___ ___ ____/ |_ \n" +
" | __)_\\____ \\| |/ ___\\| |_/ __ \\\\ \\/ /\\ __\\\n" +
" | \\ |_> > \\ \\___| |\\ ___/ > < | | \n" +
"/_______ / __/|__|\\___ >____| \\___ >__/\\_ \\ |__| \n" +
" \\/|__| \\/ \\/ \\/ ";
System.out.println(banner);
}
private static void titleWait() {
System.out.println("Press \"ENTER\" to continue...");
Scanner scanner = new Scanner(System.in);
scanner.nextLine();
clearScreen();
}
private static void clearScreen() {
-
+ System.out.println("\n\n\n\n\n\n\n\n\n\n");
}
} | 2 | 0.064516 | 1 | 1 |
b971b3bedef799cfc044cf2712eb952acf83f812 | CONTRIBUTING.md | CONTRIBUTING.md | `Hello, dear developer`
We will be happy to see your Pull Request!
Actually, we don't want to limit you, so just write clean & good code with tests and everything will be okay :smile:
| `Hello, dear developer`
We will be happy to see your Pull Request!
Actually, we don't want to limit you, so just write clean & good code with tests and everything will be okay :smile:
####Building project:
To build and test the project you need to connect emulator or device and then execute:
```bash
sh ci/ci.sh
# from root project directory
```
| Add info about building project | Add info about building project | Markdown | apache-2.0 | zayass/storio,adamin1990/storio,murugespandian/storio,inigo0178/storio,ysnows/storio,zayass/storio,0359xiaodong/storio,sylsi/storio,ysnows/storio,gostik/tweets-key-value,geralt-encore/storio,WeRockStar/storio,geralt-encore/storio,awesome-niu/storio,TonyTangAndroid/storio,adamin1990/storio,geralt-encore/storio,hai-nguyen/storio,pushtorefresh/storio,talenguyen/storio,TonyTangAndroid/storio,talenguyen/storio,gostik/tweets-key-value,murugespandian/storio,awesome-niu/storio,hai-nguyen/storio,inigo0178/storio,0359xiaodong/storio,pushtorefresh/storio,sylsi/storio,pushtorefresh/storio,WeRockStar/storio | markdown | ## Code Before:
`Hello, dear developer`
We will be happy to see your Pull Request!
Actually, we don't want to limit you, so just write clean & good code with tests and everything will be okay :smile:
## Instruction:
Add info about building project
## Code After:
`Hello, dear developer`
We will be happy to see your Pull Request!
Actually, we don't want to limit you, so just write clean & good code with tests and everything will be okay :smile:
####Building project:
To build and test the project you need to connect emulator or device and then execute:
```bash
sh ci/ci.sh
# from root project directory
```
| `Hello, dear developer`
-
We will be happy to see your Pull Request!
Actually, we don't want to limit you, so just write clean & good code with tests and everything will be okay :smile:
+
+ ####Building project:
+
+ To build and test the project you need to connect emulator or device and then execute:
+ ```bash
+ sh ci/ci.sh
+ # from root project directory
+ ``` | 9 | 1.8 | 8 | 1 |