| commit stringlengths 40 40 | old_file stringlengths 4 184 | new_file stringlengths 4 184 | old_contents stringlengths 1 3.6k | new_contents stringlengths 5 3.38k | subject stringlengths 15 778 | message stringlengths 16 6.74k | lang stringclasses 201 values | license stringclasses 13 values | repos stringlengths 6 116k | config stringclasses 201 values | content stringlengths 137 7.24k | diff stringlengths 26 5.55k | diff_length int64 1 123 | relative_diff_length float64 0.01 89 | n_lines_added int64 0 108 | n_lines_deleted int64 0 106 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0503c3ceccb6e97cf0710b37ec4fe7ca762aea8f | test/unit/shell-tests.js | test/unit/shell-tests.js | testFiles = [
'graphics.js',
'displayObject.js',
'displayObjectContainer.js',
// 'eventDispatcher.js',
'broadcastEvents.js',
'bitmap.js',
'stage.js',
'filters.js',
'mouse.js',
'matrix3D.js',
];
| testFiles = [
'graphics.js',
'displayObject.js',
'displayObjectContainer.js',
// 'eventDispatcher.js',
'broadcastEvents.js',
'bitmap.js',
'stage.js',
'filters.js'
];
| Remove tests that are not committed. | Remove tests that are not committed.
| JavaScript | apache-2.0 | tschneidereit/shumway,yurydelendik/shumway,mbebenita/shumway,tschneidereit/shumway,yurydelendik/shumway,tschneidereit/shumway,mozilla/shumway,tschneidereit/shumway,mozilla/shumway,mbebenita/shumway,yurydelendik/shumway,mozilla/shumway,mozilla/shumway,mbebenita/shumway,tschneidereit/shumway,yurydelendik/shumway,yurydelendik/shumway,yurydelendik/shumway,mbebenita/shumway,yurydelendik/shumway,mbebenita/shumway,mozilla/shumway,tschneidereit/shumway,mozilla/shumway,mozilla/shumway,mbebenita/shumway,tschneidereit/shumway,mbebenita/shumway,tschneidereit/shumway,mbebenita/shumway,yurydelendik/shumway,mozilla/shumway | javascript | ## Code Before:
testFiles = [
'graphics.js',
'displayObject.js',
'displayObjectContainer.js',
// 'eventDispatcher.js',
'broadcastEvents.js',
'bitmap.js',
'stage.js',
'filters.js',
'mouse.js',
'matrix3D.js',
];
## Instruction:
Remove tests that are not committed.
## Code After:
testFiles = [
'graphics.js',
'displayObject.js',
'displayObjectContainer.js',
// 'eventDispatcher.js',
'broadcastEvents.js',
'bitmap.js',
'stage.js',
'filters.js'
];
| testFiles = [
'graphics.js',
'displayObject.js',
'displayObjectContainer.js',
// 'eventDispatcher.js',
'broadcastEvents.js',
'bitmap.js',
'stage.js',
- 'filters.js',
? -
+ 'filters.js'
- 'mouse.js',
- 'matrix3D.js',
]; | 4 | 0.333333 | 1 | 3 |
afc18ff91bde4e6e6da554c7f9e520e5cac89fa2 | streams.py | streams.py | import praw
r = praw.Reddit(user_agent='nba_stream_parser')
submissions = r.get_subreddit('nbastreams').get_hot(limit=10)
for submission in submissions:
print(submission.selftext_html)
| from bs4 import BeautifulSoup
import html
import praw
r = praw.Reddit(user_agent='nba_stream_parser')
def get_streams_for_team(teams):
teams.append('Game Thread')
submissions = r.get_subreddit('nbastreams').get_hot(limit=20)
streams = []
for submission in submissions:
if all(team in submission.title for team in teams):
for comment in submission.comments:
soup = BeautifulSoup(
html.unescape(comment.body_html), 'html.parser')
if soup.find('a'):
streams.append(soup.find('a')['href'])
return streams
if __name__ == '__main__':
print(get_streams_for_team(['Spurs']))
| Add comment parser; get stream links for any (2) team(s) | Add comment parser; get stream links for any (2) team(s)
| Python | mit | kshvmdn/NBAScores,kshvmdn/nba.js,kshvmdn/nba-scores | python | ## Code Before:
import praw
r = praw.Reddit(user_agent='nba_stream_parser')
submissions = r.get_subreddit('nbastreams').get_hot(limit=10)
for submission in submissions:
print(submission.selftext_html)
## Instruction:
Add comment parser; get stream links for any (2) team(s)
## Code After:
from bs4 import BeautifulSoup
import html
import praw
r = praw.Reddit(user_agent='nba_stream_parser')
def get_streams_for_team(teams):
teams.append('Game Thread')
submissions = r.get_subreddit('nbastreams').get_hot(limit=20)
streams = []
for submission in submissions:
if all(team in submission.title for team in teams):
for comment in submission.comments:
soup = BeautifulSoup(
html.unescape(comment.body_html), 'html.parser')
if soup.find('a'):
streams.append(soup.find('a')['href'])
return streams
if __name__ == '__main__':
print(get_streams_for_team(['Spurs']))
| + from bs4 import BeautifulSoup
+ import html
import praw
+
r = praw.Reddit(user_agent='nba_stream_parser')
+
+ def get_streams_for_team(teams):
+ teams.append('Game Thread')
- submissions = r.get_subreddit('nbastreams').get_hot(limit=10)
? ^
+ submissions = r.get_subreddit('nbastreams').get_hot(limit=20)
? ++++ ^
+ streams = []
- for submission in submissions:
+ for submission in submissions:
? ++++
- print(submission.selftext_html)
+ if all(team in submission.title for team in teams):
+ for comment in submission.comments:
+ soup = BeautifulSoup(
+ html.unescape(comment.body_html), 'html.parser')
+ if soup.find('a'):
+ streams.append(soup.find('a')['href'])
+ return streams
+
+ if __name__ == '__main__':
+ print(get_streams_for_team(['Spurs'])) | 22 | 3.666667 | 19 | 3 |
e6d89d89bcabbe079f32434ded15fafd23e98c47 | app/views/index.haml | app/views/index.haml | - if @incidents.present?
#incidents.span9
- @incidents.each do |incident|
.incident
.row-fluid
.span8
.row-fluid
%h3.name= "#{Faker::Name.last_name}, #{Faker::Name.first_name}" #incident.arrest_report.person_name
%p.docket-number= incident.docket_number
.span3
%h3.court-date= incident.next_court_date.strftime("%b %-d")
%span.court-date-note Next Court Date
.row-fluid
.span1
%p.sex= incident.defendant_sex
%p.age= incident.defendant_age
.span3
%p.severity= incident.top_charge_code_expanded
%p.charge-code= format_top_charge(incident)
.span3
%p.priors
%span.count 3
prior convictions
%p.open-cases
%span.count 1
open case
.span3.offset1
%p.court-part= incident.next_court_part
%p.last-updated
Last updated
%span.updated-date
May 16
.row-fluid
= will_paginate @incidents, renderer: BootstrapPagination::Sinatra
- else
No cases found.
| - if @incidents.present?
#incidents.span9
- @incidents.each do |incident|
.incident
.row-fluid
.span8
.row-fluid
%h3.name= "#{Faker::Name.last_name}, #{Faker::Name.first_name}" #incident.arrest_report.person_name
%p.docket-number= incident.docket_number
.span3
%h3.court-date= incident.next_court_date.strftime("%b %-d")
%span.court-date-note Next Court Date
.row-fluid
.span1
%p.sex= incident.defendant_sex
%p.age= incident.defendant_age
.span3
%p.severity= incident.top_charge_code_expanded
%p.charge-code= format_top_charge(incident)
.span3
%p.priors
%span.count 3
prior convictions
%p.open-cases
%span.count 1
open case
.span3.offset1
%p.court-part= incident.next_court_part
.row-fluid
= will_paginate @incidents, renderer: BootstrapPagination::Sinatra
- else
No cases found.
| Remove the last updated display. | Remove the last updated display.
| Haml | bsd-3-clause | codeforamerica/criminal_case_search,codeforamerica/criminal_case_search,codeforamerica/criminal_case_search | haml | ## Code Before:
- if @incidents.present?
#incidents.span9
- @incidents.each do |incident|
.incident
.row-fluid
.span8
.row-fluid
%h3.name= "#{Faker::Name.last_name}, #{Faker::Name.first_name}" #incident.arrest_report.person_name
%p.docket-number= incident.docket_number
.span3
%h3.court-date= incident.next_court_date.strftime("%b %-d")
%span.court-date-note Next Court Date
.row-fluid
.span1
%p.sex= incident.defendant_sex
%p.age= incident.defendant_age
.span3
%p.severity= incident.top_charge_code_expanded
%p.charge-code= format_top_charge(incident)
.span3
%p.priors
%span.count 3
prior convictions
%p.open-cases
%span.count 1
open case
.span3.offset1
%p.court-part= incident.next_court_part
%p.last-updated
Last updated
%span.updated-date
May 16
.row-fluid
= will_paginate @incidents, renderer: BootstrapPagination::Sinatra
- else
No cases found.
## Instruction:
Remove the last updated display.
## Code After:
- if @incidents.present?
#incidents.span9
- @incidents.each do |incident|
.incident
.row-fluid
.span8
.row-fluid
%h3.name= "#{Faker::Name.last_name}, #{Faker::Name.first_name}" #incident.arrest_report.person_name
%p.docket-number= incident.docket_number
.span3
%h3.court-date= incident.next_court_date.strftime("%b %-d")
%span.court-date-note Next Court Date
.row-fluid
.span1
%p.sex= incident.defendant_sex
%p.age= incident.defendant_age
.span3
%p.severity= incident.top_charge_code_expanded
%p.charge-code= format_top_charge(incident)
.span3
%p.priors
%span.count 3
prior convictions
%p.open-cases
%span.count 1
open case
.span3.offset1
%p.court-part= incident.next_court_part
.row-fluid
= will_paginate @incidents, renderer: BootstrapPagination::Sinatra
- else
No cases found.
| - if @incidents.present?
#incidents.span9
- @incidents.each do |incident|
.incident
.row-fluid
.span8
.row-fluid
%h3.name= "#{Faker::Name.last_name}, #{Faker::Name.first_name}" #incident.arrest_report.person_name
%p.docket-number= incident.docket_number
.span3
%h3.court-date= incident.next_court_date.strftime("%b %-d")
%span.court-date-note Next Court Date
.row-fluid
.span1
%p.sex= incident.defendant_sex
%p.age= incident.defendant_age
.span3
%p.severity= incident.top_charge_code_expanded
%p.charge-code= format_top_charge(incident)
.span3
%p.priors
%span.count 3
prior convictions
%p.open-cases
%span.count 1
open case
.span3.offset1
%p.court-part= incident.next_court_part
- %p.last-updated
- Last updated
- %span.updated-date
- May 16
.row-fluid
= will_paginate @incidents, renderer: BootstrapPagination::Sinatra
- else
No cases found. | 4 | 0.108108 | 0 | 4 |
680d938438d154c3c765b8cb2b2a975662a7e880 | src/main/resources/org/mifos/pentaho/tally/all_ledgers.ftl | src/main/resources/org/mifos/pentaho/tally/all_ledgers.ftl | <ALLLEDGERENTRIES.LIST>
<LEDGERNAME>${ledger.name}</LEDGERNAME>
<ISDEEMEDPOSITIVE>${ledger.isDeemedPositive}</ISDEEMEDPOSITIVE>
<ISPARTYLEDGER>No</ISPARTYLEDGER>
<AMOUNT>${ledger.amount}</AMOUNT>
<CATEGORYALLOCATIONS.LIST>
<CATEGORY>Primary Cost Category</CATEGORY>
<ISDEEMEDPOSITIVE>${ledger.isDeemedPositive}</ISDEEMEDPOSITIVE>
<COSTCENTREALLOCATIONS.LIST>
<NAME>${ledger.branchName}</NAME>
<AMOUNT>${ledger.amount}</AMOUNT>
</COSTCENTREALLOCATIONS.LIST>
</CATEGORYALLOCATIONS.LIST>
</ALLLEDGERENTRIES.LIST> | <ALLLEDGERENTRIES.LIST>
<LEDGERNAME>${ledger.name}</LEDGERNAME>
<ISDEEMEDPOSITIVE>${ledger.isDeemedPositive}</ISDEEMEDPOSITIVE>
<AMOUNT>${ledger.amount}</AMOUNT>
<CATEGORYALLOCATIONS.LIST>
<CATEGORY>Primary Cost Category</CATEGORY>
<COSTCENTREALLOCATIONS.LIST>
<NAME>${ledger.branchName}</NAME>
<AMOUNT>${ledger.amount}</AMOUNT>
</COSTCENTREALLOCATIONS.LIST>
</CATEGORYALLOCATIONS.LIST>
</ALLLEDGERENTRIES.LIST> | Remove ISPARTYLEDGER and ISDEEMEDPOSITIVE tags from Tally XML output | Remove ISPARTYLEDGER and ISDEEMEDPOSITIVE tags from Tally XML output
| FreeMarker | apache-2.0 | mifos/bi,mifos/bi,mifos/bi | freemarker | ## Code Before:
<ALLLEDGERENTRIES.LIST>
<LEDGERNAME>${ledger.name}</LEDGERNAME>
<ISDEEMEDPOSITIVE>${ledger.isDeemedPositive}</ISDEEMEDPOSITIVE>
<ISPARTYLEDGER>No</ISPARTYLEDGER>
<AMOUNT>${ledger.amount}</AMOUNT>
<CATEGORYALLOCATIONS.LIST>
<CATEGORY>Primary Cost Category</CATEGORY>
<ISDEEMEDPOSITIVE>${ledger.isDeemedPositive}</ISDEEMEDPOSITIVE>
<COSTCENTREALLOCATIONS.LIST>
<NAME>${ledger.branchName}</NAME>
<AMOUNT>${ledger.amount}</AMOUNT>
</COSTCENTREALLOCATIONS.LIST>
</CATEGORYALLOCATIONS.LIST>
</ALLLEDGERENTRIES.LIST>
## Instruction:
Remove ISPARTYLEDGER and ISDEEMEDPOSITIVE tags from Tally XML output
## Code After:
<ALLLEDGERENTRIES.LIST>
<LEDGERNAME>${ledger.name}</LEDGERNAME>
<ISDEEMEDPOSITIVE>${ledger.isDeemedPositive}</ISDEEMEDPOSITIVE>
<AMOUNT>${ledger.amount}</AMOUNT>
<CATEGORYALLOCATIONS.LIST>
<CATEGORY>Primary Cost Category</CATEGORY>
<COSTCENTREALLOCATIONS.LIST>
<NAME>${ledger.branchName}</NAME>
<AMOUNT>${ledger.amount}</AMOUNT>
</COSTCENTREALLOCATIONS.LIST>
</CATEGORYALLOCATIONS.LIST>
</ALLLEDGERENTRIES.LIST> | <ALLLEDGERENTRIES.LIST>
<LEDGERNAME>${ledger.name}</LEDGERNAME>
<ISDEEMEDPOSITIVE>${ledger.isDeemedPositive}</ISDEEMEDPOSITIVE>
- <ISPARTYLEDGER>No</ISPARTYLEDGER>
<AMOUNT>${ledger.amount}</AMOUNT>
<CATEGORYALLOCATIONS.LIST>
<CATEGORY>Primary Cost Category</CATEGORY>
- <ISDEEMEDPOSITIVE>${ledger.isDeemedPositive}</ISDEEMEDPOSITIVE>
<COSTCENTREALLOCATIONS.LIST>
<NAME>${ledger.branchName}</NAME>
<AMOUNT>${ledger.amount}</AMOUNT>
</COSTCENTREALLOCATIONS.LIST>
</CATEGORYALLOCATIONS.LIST>
</ALLLEDGERENTRIES.LIST> | 2 | 0.142857 | 0 | 2 |
d29a3e8af97fd892ac6e3757b27d1e3272f0153d | gh-pages.sh | gh-pages.sh | org="joneit"
# set variable repo to current directory name (without path)
repo=${PWD##*/}
# make sure the docs are built
gulp doc >/dev/null
# remove temp directory in case it already exists, remake it, switch to it
rm -rf ../temp >/dev/null
mkdir ../temp
pushd ../temp >/dev/null
# clone it so it will be a branch of the repo
git clone -q --single-branch http://github.com/$org/$repo.git
cd $repo >/dev/null
# create and switch to a new gh-pages branch
git checkout -q --orphan gh-pages
# remove all content from this new branch
git rm -rf -q .
# copy the doc directory from the workspace
cp -R ../../$repo/doc/* . >/dev/null
# send it up
git add . >/dev/null
git commit -q -m 'see gh-pages'
git push -ufq origin gh-pages >/dev/null
# back to workspace
popd >/dev/null
# remove temp directory
rm -rf ../temp >/dev/null
echo 'Opening page at http://$org.github.io/$repo/ ...'
open http://$org.github.io/$repo/
echo 'CAVEAT: New pages will not be immediately available so wait a few minutes and refresh.'
| org="joneit"
# set variable repo to current directory name (without path)
repo=${PWD##*/}
# make sure the docs are built
gulp doc >/dev/null
# remove temp directory in case it already exists, remake it, switch to it
rm -rf ../temp >/dev/null
mkdir ../temp
pushd ../temp >/dev/null
# clone it so it will be a branch of the repo
git clone -q --single-branch http://github.com/$org/$repo.git
cd $repo >/dev/null
# create and switch to a new gh-pages branch
git checkout -q --orphan gh-pages
# remove all content from this new branch
git rm -rf -q .
# copy the doc directory from the workspace
cp -R ../../$repo/doc/* . >/dev/null
# copy all source files from src/js to the cdn directory here
ln -s ../../$repo/src/js src
ls src | while read a; do uglify -s src/$a -o ${a%.js}.min.js; done
rm src
# send it up
git add . >/dev/null
git commit -q -m '(See gh-pages.sh on master branch.)'
git push -ufq origin gh-pages >/dev/null
# back to workspace
popd >/dev/null
# remove temp directory
rm -rf ../temp >/dev/null
echo 'Opening page at http://$org.github.io/$repo/ ...'
open http://$org.github.io/$repo/
echo 'CAVEAT: New pages will not be immediately available so wait a few minutes and refresh.'
| Add *.min.js file(s) to $org/gitub.io/$repo site. | Add *.min.js file(s) to $org/gitub.io/$repo site.
| Shell | mit | joneit/MojoSpy,joneit/MojoSpy | shell | ## Code Before:
org="joneit"
# set variable repo to current directory name (without path)
repo=${PWD##*/}
# make sure the docs are built
gulp doc >/dev/null
# remove temp directory in case it already exists, remake it, switch to it
rm -rf ../temp >/dev/null
mkdir ../temp
pushd ../temp >/dev/null
# clone it so it will be a branch of the repo
git clone -q --single-branch http://github.com/$org/$repo.git
cd $repo >/dev/null
# create and switch to a new gh-pages branch
git checkout -q --orphan gh-pages
# remove all content from this new branch
git rm -rf -q .
# copy the doc directory from the workspace
cp -R ../../$repo/doc/* . >/dev/null
# send it up
git add . >/dev/null
git commit -q -m 'see gh-pages'
git push -ufq origin gh-pages >/dev/null
# back to workspace
popd >/dev/null
# remove temp directory
rm -rf ../temp >/dev/null
echo 'Opening page at http://$org.github.io/$repo/ ...'
open http://$org.github.io/$repo/
echo 'CAVEAT: New pages will not be immediately available so wait a few minutes and refresh.'
## Instruction:
Add *.min.js file(s) to $org/gitub.io/$repo site.
## Code After:
org="joneit"
# set variable repo to current directory name (without path)
repo=${PWD##*/}
# make sure the docs are built
gulp doc >/dev/null
# remove temp directory in case it already exists, remake it, switch to it
rm -rf ../temp >/dev/null
mkdir ../temp
pushd ../temp >/dev/null
# clone it so it will be a branch of the repo
git clone -q --single-branch http://github.com/$org/$repo.git
cd $repo >/dev/null
# create and switch to a new gh-pages branch
git checkout -q --orphan gh-pages
# remove all content from this new branch
git rm -rf -q .
# copy the doc directory from the workspace
cp -R ../../$repo/doc/* . >/dev/null
# copy all source files from src/js to the cdn directory here
ln -s ../../$repo/src/js src
ls src | while read a; do uglify -s src/$a -o ${a%.js}.min.js; done
rm src
# send it up
git add . >/dev/null
git commit -q -m '(See gh-pages.sh on master branch.)'
git push -ufq origin gh-pages >/dev/null
# back to workspace
popd >/dev/null
# remove temp directory
rm -rf ../temp >/dev/null
echo 'Opening page at http://$org.github.io/$repo/ ...'
open http://$org.github.io/$repo/
echo 'CAVEAT: New pages will not be immediately available so wait a few minutes and refresh.'
| org="joneit"
# set variable repo to current directory name (without path)
repo=${PWD##*/}
# make sure the docs are built
gulp doc >/dev/null
# remove temp directory in case it already exists, remake it, switch to it
rm -rf ../temp >/dev/null
mkdir ../temp
pushd ../temp >/dev/null
# clone it so it will be a branch of the repo
git clone -q --single-branch http://github.com/$org/$repo.git
cd $repo >/dev/null
# create and switch to a new gh-pages branch
git checkout -q --orphan gh-pages
# remove all content from this new branch
git rm -rf -q .
# copy the doc directory from the workspace
cp -R ../../$repo/doc/* . >/dev/null
+ # copy all source files from src/js to the cdn directory here
+ ln -s ../../$repo/src/js src
+ ls src | while read a; do uglify -s src/$a -o ${a%.js}.min.js; done
+ rm src
+
# send it up
git add . >/dev/null
- git commit -q -m 'see gh-pages'
+ git commit -q -m '(See gh-pages.sh on master branch.)'
git push -ufq origin gh-pages >/dev/null
# back to workspace
popd >/dev/null
# remove temp directory
rm -rf ../temp >/dev/null
echo 'Opening page at http://$org.github.io/$repo/ ...'
open http://$org.github.io/$repo/
echo 'CAVEAT: New pages will not be immediately available so wait a few minutes and refresh.' | 7 | 0.175 | 6 | 1 |
9b5ce4a04efef34a83d5f4bd394dd9b3a526e8c2 | lib/compass/_css3.scss | lib/compass/_css3.scss | @import "css3/border-radius";
@import "css3/inline-block";
@import "css3/opacity";
@import "css3/box-shadow";
@import "css3/text-shadow";
@import "css3/columns";
@import "css3/box-sizing";
@import "css3/box";
@import "css3/images";
@import "css3/background-clip";
@import "css3/background-origin";
@import "css3/background-size";
@import "css3/font-face";
@import "css3/transform";
@import "css3/transition";
@import "css3/appearance";
@import "css3/regions";
@import "css3/hyphenation";
@import "css3/filter";
| @import "css3/border-radius";
@import "css3/inline-block";
@import "css3/opacity";
@import "css3/box-shadow";
@import "css3/text-shadow";
@import "css3/columns";
@import "css3/box-sizing";
@import "css3/box";
@import "css3/images";
@import "css3/background-clip";
@import "css3/background-origin";
@import "css3/background-size";
@import "css3/font-face";
@import "css3/transform";
@import "css3/transition";
@import "css3/appearance";
@import "css3/regions";
@import "css3/hyphenation";
@import "css3/filter";
@import "css3/flexbox";
| Add import to css3 module | Add import to css3 module
| SCSS | mit | rajasegar/compass-mixins | scss | ## Code Before:
@import "css3/border-radius";
@import "css3/inline-block";
@import "css3/opacity";
@import "css3/box-shadow";
@import "css3/text-shadow";
@import "css3/columns";
@import "css3/box-sizing";
@import "css3/box";
@import "css3/images";
@import "css3/background-clip";
@import "css3/background-origin";
@import "css3/background-size";
@import "css3/font-face";
@import "css3/transform";
@import "css3/transition";
@import "css3/appearance";
@import "css3/regions";
@import "css3/hyphenation";
@import "css3/filter";
## Instruction:
Add import to css3 module
## Code After:
@import "css3/border-radius";
@import "css3/inline-block";
@import "css3/opacity";
@import "css3/box-shadow";
@import "css3/text-shadow";
@import "css3/columns";
@import "css3/box-sizing";
@import "css3/box";
@import "css3/images";
@import "css3/background-clip";
@import "css3/background-origin";
@import "css3/background-size";
@import "css3/font-face";
@import "css3/transform";
@import "css3/transition";
@import "css3/appearance";
@import "css3/regions";
@import "css3/hyphenation";
@import "css3/filter";
@import "css3/flexbox";
| @import "css3/border-radius";
@import "css3/inline-block";
@import "css3/opacity";
@import "css3/box-shadow";
@import "css3/text-shadow";
@import "css3/columns";
@import "css3/box-sizing";
@import "css3/box";
@import "css3/images";
@import "css3/background-clip";
@import "css3/background-origin";
@import "css3/background-size";
@import "css3/font-face";
@import "css3/transform";
@import "css3/transition";
@import "css3/appearance";
@import "css3/regions";
@import "css3/hyphenation";
@import "css3/filter";
+ @import "css3/flexbox"; | 1 | 0.052632 | 1 | 0 |
226d04277ecc02082d70e57e725e02da533131b6 | ftdetect/scala.vim | ftdetect/scala.vim | au BufRead,BufNewFile *.scala set filetype=scala
| au BufRead,BufNewFile *.scala set filetype=scala
au BufNewFile,BufRead *.sbt set filetype=scala
" Use haml syntax for scaml
au BufRead,BufNewFile *.scaml set filetype=haml
| Add filetype detection for sbt and template files | Add filetype detection for sbt and template files
| VimL | apache-2.0 | ornicar/vim-scala | viml | ## Code Before:
au BufRead,BufNewFile *.scala set filetype=scala
## Instruction:
Add filetype detection for sbt and template files
## Code After:
au BufRead,BufNewFile *.scala set filetype=scala
au BufNewFile,BufRead *.sbt set filetype=scala
" Use haml syntax for scaml
au BufRead,BufNewFile *.scaml set filetype=haml
| au BufRead,BufNewFile *.scala set filetype=scala
+ au BufNewFile,BufRead *.sbt set filetype=scala
+
+ " Use haml syntax for scaml
+ au BufRead,BufNewFile *.scaml set filetype=haml | 4 | 4 | 4 | 0 |
a7792a50ca0d22f595d01d191252ab20f5cb206c | integration/cocos2d-x-v2/include/bee/Cocos2dxBeehive.h | integration/cocos2d-x-v2/include/bee/Cocos2dxBeehive.h | //
// Created by Dawid Drozd aka Gelldur on 08.10.17.
//
#pragma once
#include <map>
#include <string>
#include <vector>
#include <bee/Beehive.h>
namespace cocos2d
{
class CCNode;
}
namespace Bee
{
class Graph;
class Node;
class Cocos2dxBeehive
{
public:
Cocos2dxBeehive(const std::vector<std::string>& searchPaths);
~Cocos2dxBeehive();
cocos2d::CCNode* createView(const std::string& content);
cocos2d::CCNode* findViewById(const std::string& id);
private:
Graph* _graph;
std::shared_ptr<sel::State> _state;
Bee::Beehive _beehive;
void addRelation(Node* nodeParent, Node* nodeChild);
};
}
| //
// Created by Dawid Drozd aka Gelldur on 08.10.17.
//
#pragma once
#include <map>
#include <string>
#include <vector>
#include <bee/Beehive.h>
namespace cocos2d
{
class CCNode;
}
namespace Bee
{
class Graph;
class Node;
class Cocos2dxBeehive
{
public:
Cocos2dxBeehive(const std::vector<std::string>& searchPaths);
~Cocos2dxBeehive();
cocos2d::CCNode* createView(const std::string& content);
cocos2d::CCNode* findViewById(const std::string& id);
const std::shared_ptr<sel::State>& getState()
{
return _state;
}
private:
Graph* _graph;
std::shared_ptr<sel::State> _state;
Bee::Beehive _beehive;
void addRelation(Node* nodeParent, Node* nodeChild);
};
}
| Add getter for lua state | Add getter for lua state
| C | apache-2.0 | gelldur/Bee | c | ## Code Before:
//
// Created by Dawid Drozd aka Gelldur on 08.10.17.
//
#pragma once
#include <map>
#include <string>
#include <vector>
#include <bee/Beehive.h>
namespace cocos2d
{
class CCNode;
}
namespace Bee
{
class Graph;
class Node;
class Cocos2dxBeehive
{
public:
Cocos2dxBeehive(const std::vector<std::string>& searchPaths);
~Cocos2dxBeehive();
cocos2d::CCNode* createView(const std::string& content);
cocos2d::CCNode* findViewById(const std::string& id);
private:
Graph* _graph;
std::shared_ptr<sel::State> _state;
Bee::Beehive _beehive;
void addRelation(Node* nodeParent, Node* nodeChild);
};
}
## Instruction:
Add getter for lua state
## Code After:
//
// Created by Dawid Drozd aka Gelldur on 08.10.17.
//
#pragma once
#include <map>
#include <string>
#include <vector>
#include <bee/Beehive.h>
namespace cocos2d
{
class CCNode;
}
namespace Bee
{
class Graph;
class Node;
class Cocos2dxBeehive
{
public:
Cocos2dxBeehive(const std::vector<std::string>& searchPaths);
~Cocos2dxBeehive();
cocos2d::CCNode* createView(const std::string& content);
cocos2d::CCNode* findViewById(const std::string& id);
const std::shared_ptr<sel::State>& getState()
{
return _state;
}
private:
Graph* _graph;
std::shared_ptr<sel::State> _state;
Bee::Beehive _beehive;
void addRelation(Node* nodeParent, Node* nodeChild);
};
}
| //
// Created by Dawid Drozd aka Gelldur on 08.10.17.
//
#pragma once
#include <map>
#include <string>
#include <vector>
#include <bee/Beehive.h>
namespace cocos2d
{
class CCNode;
}
namespace Bee
{
class Graph;
class Node;
class Cocos2dxBeehive
{
public:
Cocos2dxBeehive(const std::vector<std::string>& searchPaths);
~Cocos2dxBeehive();
cocos2d::CCNode* createView(const std::string& content);
cocos2d::CCNode* findViewById(const std::string& id);
+ const std::shared_ptr<sel::State>& getState()
+ {
+ return _state;
+ }
+
private:
Graph* _graph;
std::shared_ptr<sel::State> _state;
Bee::Beehive _beehive;
void addRelation(Node* nodeParent, Node* nodeChild);
};
} | 5 | 0.125 | 5 | 0 |
7ebd1397980081115c1d2c2367dd36e52d795974 | src/js/components/Title.spec.js | src/js/components/Title.spec.js | import React from 'react';
import render from 'react-test-renderer';
import { shallow } from 'enzyme';
import Title from './Title';
import { singleLine } from '../../../test/test-helper';
it('renders', () => {
const wrapper = shallow(<Title />);
expect(wrapper.length).toBe(1);
});
it('matches snapshot', () => {
const tree = render.create(<Title />).toJSON();
expect(tree).toMatchSnapshot();
});
it('renders heading when has one child', () => {
const wrapper = shallow(<Title>Hello</Title>);
expect(wrapper.html()).toBe('<h1 class="title">Hello</h1>');
});
it('accepts className', () => {
const wrapper = shallow(<Title className="test">Hello</Title>);
expect(wrapper.html()).toBe('<h1 class="title test">Hello</h1>');
});
it('renders <hgroup> when has multiple children', () => {
const wrapper = shallow(
<Title>
<h1>Hello</h1>
<h2>World</h2>
</Title>,
);
const expected = singleLine`
<hgroup class="title">
<h1 class="title__main">Hello</h1>
<h2 class="title__secondary">World</h2>
</hgroup>
`;
expect(wrapper.html()).toBe(expected);
});
| import React from 'react';
import render from 'react-test-renderer';
import Title from './Title';
it('renders heading', () => {
const tree = render.create(<Title>Test</Title>);
expect(tree.toJSON()).toMatchSnapshot();
});
it('renders group of headings', () => {
const tree = render.create(
<Title>
<h1>Hello</h1>
<h2>World</h2>
</Title>,
);
expect(tree.toJSON()).toMatchSnapshot();
});
it('accepts class name', () => {
const tree = render.create(<Title className="test">Test</Title>);
expect(tree.toJSON()).toMatchSnapshot();
});
| Use snapshots in Title tests | Use snapshots in Title tests
| JavaScript | mit | slavapavlutin/pavlutin-node,slavapavlutin/pavlutin-node | javascript | ## Code Before:
import React from 'react';
import render from 'react-test-renderer';
import { shallow } from 'enzyme';
import Title from './Title';
import { singleLine } from '../../../test/test-helper';
it('renders', () => {
const wrapper = shallow(<Title />);
expect(wrapper.length).toBe(1);
});
it('matches snapshot', () => {
const tree = render.create(<Title />).toJSON();
expect(tree).toMatchSnapshot();
});
it('renders heading when has one child', () => {
const wrapper = shallow(<Title>Hello</Title>);
expect(wrapper.html()).toBe('<h1 class="title">Hello</h1>');
});
it('accepts className', () => {
const wrapper = shallow(<Title className="test">Hello</Title>);
expect(wrapper.html()).toBe('<h1 class="title test">Hello</h1>');
});
it('renders <hgroup> when has multiple children', () => {
const wrapper = shallow(
<Title>
<h1>Hello</h1>
<h2>World</h2>
</Title>,
);
const expected = singleLine`
<hgroup class="title">
<h1 class="title__main">Hello</h1>
<h2 class="title__secondary">World</h2>
</hgroup>
`;
expect(wrapper.html()).toBe(expected);
});
## Instruction:
Use snapshots in Title tests
## Code After:
import React from 'react';
import render from 'react-test-renderer';
import Title from './Title';
it('renders heading', () => {
const tree = render.create(<Title>Test</Title>);
expect(tree.toJSON()).toMatchSnapshot();
});
it('renders group of headings', () => {
const tree = render.create(
<Title>
<h1>Hello</h1>
<h2>World</h2>
</Title>,
);
expect(tree.toJSON()).toMatchSnapshot();
});
it('accepts class name', () => {
const tree = render.create(<Title className="test">Test</Title>);
expect(tree.toJSON()).toMatchSnapshot();
});
| import React from 'react';
import render from 'react-test-renderer';
- import { shallow } from 'enzyme';
import Title from './Title';
- import { singleLine } from '../../../test/test-helper';
- it('renders', () => {
+ it('renders heading', () => {
? ++++++++
- const wrapper = shallow(<Title />);
- expect(wrapper.length).toBe(1);
+ const tree = render.create(<Title>Test</Title>);
+ expect(tree.toJSON()).toMatchSnapshot();
});
+ it('renders group of headings', () => {
+ const tree = render.create(
- it('matches snapshot', () => {
- const tree = render.create(<Title />).toJSON();
- expect(tree).toMatchSnapshot();
- });
-
- it('renders heading when has one child', () => {
- const wrapper = shallow(<Title>Hello</Title>);
- expect(wrapper.html()).toBe('<h1 class="title">Hello</h1>');
- });
-
- it('accepts className', () => {
- const wrapper = shallow(<Title className="test">Hello</Title>);
- expect(wrapper.html()).toBe('<h1 class="title test">Hello</h1>');
- });
-
- it('renders <hgroup> when has multiple children', () => {
- const wrapper = shallow(
<Title>
<h1>Hello</h1>
<h2>World</h2>
</Title>,
);
+ expect(tree.toJSON()).toMatchSnapshot();
- const expected = singleLine`
- <hgroup class="title">
- <h1 class="title__main">Hello</h1>
- <h2 class="title__secondary">World</h2>
- </hgroup>
- `;
- expect(wrapper.html()).toBe(expected);
});
+
+ it('accepts class name', () => {
+ const tree = render.create(<Title className="test">Test</Title>);
+ expect(tree.toJSON()).toMatchSnapshot();
+ }); | 40 | 0.97561 | 11 | 29 |
845a40ba458e636b7611c760d558bb64836bef29 | benchmark/utils.js | benchmark/utils.js | function pad( str, len ) {
let res = str;
while ( res.length < len ) {
res += ' ';
}
return res;
}
function runBenchmark( name, preFunc, func, maxTime, maxIterations = Infinity ) {
let iterations = 0;
let elapsed = 0;
while ( elapsed < maxTime ) {
if ( preFunc ) preFunc;
let start = Date.now();
func();
elapsed += Date.now() - start;
iterations ++;
if ( iterations >= maxIterations ) break;
}
console.log( `\t${ pad( name, 25 ) }: ${ parseFloat( ( elapsed / iterations ).toFixed( 6 ) ) } ms` );
}
export { pad, runBenchmark };
| function pad( str, len ) {
let res = str;
while ( res.length < len ) {
res += ' ';
}
return res;
}
function runBenchmark( name, preFunc, func, maxTime, maxIterations = 100 ) {
let iterations = 0;
let elapsed = 0;
while ( elapsed < maxTime ) {
if ( preFunc ) preFunc();
let start = Date.now();
func();
elapsed += Date.now() - start;
iterations ++;
if ( iterations >= maxIterations ) break;
}
console.log( `\t${ pad( name, 25 ) }: ${ parseFloat( ( elapsed / iterations ).toFixed( 6 ) ) } ms` );
}
export { pad, runBenchmark };
| Fix pre func use in benchmark | Fix pre func use in benchmark
| JavaScript | mit | gkjohnson/three-mesh-bvh,gkjohnson/three-mesh-bvh,gkjohnson/three-mesh-bvh | javascript | ## Code Before:
function pad( str, len ) {
let res = str;
while ( res.length < len ) {
res += ' ';
}
return res;
}
function runBenchmark( name, preFunc, func, maxTime, maxIterations = Infinity ) {
let iterations = 0;
let elapsed = 0;
while ( elapsed < maxTime ) {
if ( preFunc ) preFunc;
let start = Date.now();
func();
elapsed += Date.now() - start;
iterations ++;
if ( iterations >= maxIterations ) break;
}
console.log( `\t${ pad( name, 25 ) }: ${ parseFloat( ( elapsed / iterations ).toFixed( 6 ) ) } ms` );
}
export { pad, runBenchmark };
## Instruction:
Fix pre func use in benchmark
## Code After:
function pad( str, len ) {
let res = str;
while ( res.length < len ) {
res += ' ';
}
return res;
}
function runBenchmark( name, preFunc, func, maxTime, maxIterations = 100 ) {
let iterations = 0;
let elapsed = 0;
while ( elapsed < maxTime ) {
if ( preFunc ) preFunc();
let start = Date.now();
func();
elapsed += Date.now() - start;
iterations ++;
if ( iterations >= maxIterations ) break;
}
console.log( `\t${ pad( name, 25 ) }: ${ parseFloat( ( elapsed / iterations ).toFixed( 6 ) ) } ms` );
}
export { pad, runBenchmark };
| function pad( str, len ) {
let res = str;
while ( res.length < len ) {
res += ' ';
}
return res;
}
- function runBenchmark( name, preFunc, func, maxTime, maxIterations = Infinity ) {
? ^^^^^^^^
+ function runBenchmark( name, preFunc, func, maxTime, maxIterations = 100 ) {
? ^^^
let iterations = 0;
let elapsed = 0;
while ( elapsed < maxTime ) {
- if ( preFunc ) preFunc;
+ if ( preFunc ) preFunc();
? ++
let start = Date.now();
func();
elapsed += Date.now() - start;
iterations ++;
if ( iterations >= maxIterations ) break;
}
console.log( `\t${ pad( name, 25 ) }: ${ parseFloat( ( elapsed / iterations ).toFixed( 6 ) ) } ms` );
}
export { pad, runBenchmark }; | 4 | 0.117647 | 2 | 2 |
f418b2182b8dbae45906ded6b4aedc07ffb7d90c | ext/glib/glib.c | ext/glib/glib.c |
static VALUE utf8_size(VALUE self, VALUE string)
{
VALUE result;
Check_Type(string, T_STRING);
result = ULONG2NUM(g_utf8_strlen(StringValuePtr(string), RSTRING(string)->len));
return result;
}
static VALUE utf8_upcase(VALUE self, VALUE string)
{
VALUE result;
gchar *temp;
Check_Type(string, T_STRING);
temp = g_utf8_strup(StringValuePtr(string), RSTRING(string)->len);
result = rb_str_new2(temp);
return result;
}
static VALUE utf8_downcase(VALUE self, VALUE string)
{
VALUE result;
gchar *temp;
Check_Type(string, T_STRING);
temp = g_utf8_strdown(StringValuePtr(string), RSTRING(string)->len);
result = rb_str_new2(temp);
return result;
}
void
Init_glib()
{
VALUE mGlib;
mGlib = rb_define_module("Glib");
rb_define_method(mGlib, "utf8_size", utf8_size, 1);
rb_define_method(mGlib, "utf8_upcase", utf8_upcase, 1);
rb_define_method(mGlib, "utf8_downcase", utf8_downcase, 1);
}
|
static VALUE utf8_size(VALUE self, VALUE string)
{
VALUE result;
Check_Type(string, T_STRING);
result = ULONG2NUM(g_utf8_strlen(StringValuePtr(string), RSTRING(string)->len));
return result;
}
static VALUE utf8_upcase(VALUE self, VALUE string)
{
VALUE result;
gchar *temp;
Check_Type(string, T_STRING);
temp = g_utf8_strup(StringValuePtr(string), RSTRING(string)->len);
result = rb_str_new2(temp);
return result;
}
static VALUE utf8_downcase(VALUE self, VALUE string)
{
VALUE result;
gchar *temp;
Check_Type(string, T_STRING);
temp = g_utf8_strdown(StringValuePtr(string), RSTRING(string)->len);
result = rb_str_new2(temp);
return result;
}
void
Init_glib()
{
VALUE mGlib;
mGlib = rb_define_module("Glib");
rb_define_module_function(mGlib, "utf8_size", utf8_size, 1);
rb_define_module_function(mGlib, "utf8_upcase", utf8_upcase, 1);
rb_define_module_function(mGlib, "utf8_downcase", utf8_downcase, 1);
}
| Define the Glib functions as module functions. | Define the Glib functions as module functions.
| C | mit | Manfred/unichars,Manfred/unichars | c | ## Code Before:
static VALUE utf8_size(VALUE self, VALUE string)
{
VALUE result;
Check_Type(string, T_STRING);
result = ULONG2NUM(g_utf8_strlen(StringValuePtr(string), RSTRING(string)->len));
return result;
}
static VALUE utf8_upcase(VALUE self, VALUE string)
{
VALUE result;
gchar *temp;
Check_Type(string, T_STRING);
temp = g_utf8_strup(StringValuePtr(string), RSTRING(string)->len);
result = rb_str_new2(temp);
return result;
}
static VALUE utf8_downcase(VALUE self, VALUE string)
{
VALUE result;
gchar *temp;
Check_Type(string, T_STRING);
temp = g_utf8_strdown(StringValuePtr(string), RSTRING(string)->len);
result = rb_str_new2(temp);
return result;
}
void
Init_glib()
{
VALUE mGlib;
mGlib = rb_define_module("Glib");
rb_define_method(mGlib, "utf8_size", utf8_size, 1);
rb_define_method(mGlib, "utf8_upcase", utf8_upcase, 1);
rb_define_method(mGlib, "utf8_downcase", utf8_downcase, 1);
}
## Instruction:
Define the Glib functions as module functions.
## Code After:
static VALUE utf8_size(VALUE self, VALUE string)
{
VALUE result;
Check_Type(string, T_STRING);
result = ULONG2NUM(g_utf8_strlen(StringValuePtr(string), RSTRING(string)->len));
return result;
}
static VALUE utf8_upcase(VALUE self, VALUE string)
{
VALUE result;
gchar *temp;
Check_Type(string, T_STRING);
temp = g_utf8_strup(StringValuePtr(string), RSTRING(string)->len);
result = rb_str_new2(temp);
return result;
}
static VALUE utf8_downcase(VALUE self, VALUE string)
{
VALUE result;
gchar *temp;
Check_Type(string, T_STRING);
temp = g_utf8_strdown(StringValuePtr(string), RSTRING(string)->len);
result = rb_str_new2(temp);
return result;
}
void
Init_glib()
{
VALUE mGlib;
mGlib = rb_define_module("Glib");
rb_define_module_function(mGlib, "utf8_size", utf8_size, 1);
rb_define_module_function(mGlib, "utf8_upcase", utf8_upcase, 1);
rb_define_module_function(mGlib, "utf8_downcase", utf8_downcase, 1);
}
|
static VALUE utf8_size(VALUE self, VALUE string)
{
VALUE result;
Check_Type(string, T_STRING);
result = ULONG2NUM(g_utf8_strlen(StringValuePtr(string), RSTRING(string)->len));
return result;
}
static VALUE utf8_upcase(VALUE self, VALUE string)
{
VALUE result;
gchar *temp;
Check_Type(string, T_STRING);
temp = g_utf8_strup(StringValuePtr(string), RSTRING(string)->len);
result = rb_str_new2(temp);
return result;
}
static VALUE utf8_downcase(VALUE self, VALUE string)
{
VALUE result;
gchar *temp;
Check_Type(string, T_STRING);
temp = g_utf8_strdown(StringValuePtr(string), RSTRING(string)->len);
result = rb_str_new2(temp);
return result;
}
void
Init_glib()
{
VALUE mGlib;
mGlib = rb_define_module("Glib");
- rb_define_method(mGlib, "utf8_size", utf8_size, 1);
? ---
+ rb_define_module_function(mGlib, "utf8_size", utf8_size, 1);
? ++++++++++++
- rb_define_method(mGlib, "utf8_upcase", utf8_upcase, 1);
? ---
+ rb_define_module_function(mGlib, "utf8_upcase", utf8_upcase, 1);
? ++++++++++++
- rb_define_method(mGlib, "utf8_downcase", utf8_downcase, 1);
? ---
+ rb_define_module_function(mGlib, "utf8_downcase", utf8_downcase, 1);
? ++++++++++++
} | 6 | 0.133333 | 3 | 3 |
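
The functions registered above wrap GLib's UTF-8 helpers, and `utf8_size` in particular calls `g_utf8_strlen`, which counts Unicode code points rather than bytes. Python 3 makes the same distinction explicit, which gives a quick way to illustrate it:

```python
text = "caf\u00e9"                         # "café": four code points

code_points = len(text)                    # what g_utf8_strlen reports: 4
utf8_bytes = len(text.encode("utf-8"))     # what C's strlen would see: 5 ("é" is 2 bytes)

assert code_points == 4
assert utf8_bytes == 5
```
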
24501be74423f062a48b19959c43790465eae6bd | lib/thinking_sphinx/deltas/delayed_delta.rb | lib/thinking_sphinx/deltas/delayed_delta.rb | require 'delayed/job'
require 'thinking_sphinx/deltas/delayed_delta/delta_job'
require 'thinking_sphinx/deltas/delayed_delta/flag_as_deleted_job'
require 'thinking_sphinx/deltas/delayed_delta/job'
module ThinkingSphinx
module Deltas
class DelayedDelta < ThinkingSphinx::Deltas::DefaultDelta
def index(model, instance = nil)
ThinkingSphinx::Deltas::Job.enqueue(
ThinkingSphinx::Deltas::DeltaJob.new(delta_index_name(model)),
ThinkingSphinx::Configuration.instance.delayed_job_priority
)
Delayed::Job.enqueue(
ThinkingSphinx::Deltas::FlagAsDeletedJob.new(
core_index_name(model), instance.sphinx_document_id
),
ThinkingSphinx::Configuration.instance.delayed_job_priority
) if instance
true
end
end
end
end
| require 'delayed/job'
require 'thinking_sphinx/deltas/delayed_delta/delta_job'
require 'thinking_sphinx/deltas/delayed_delta/flag_as_deleted_job'
require 'thinking_sphinx/deltas/delayed_delta/job'
module ThinkingSphinx
module Deltas
class DelayedDelta < ThinkingSphinx::Deltas::DefaultDelta
def index(model, instance = nil)
return true unless ThinkingSphinx.updates_enabled? && ThinkingSphinx.deltas_enabled?
return true if instance && !toggled(instance)
ThinkingSphinx::Deltas::Job.enqueue(
ThinkingSphinx::Deltas::DeltaJob.new(delta_index_name(model)),
ThinkingSphinx::Configuration.instance.delayed_job_priority
)
Delayed::Job.enqueue(
ThinkingSphinx::Deltas::FlagAsDeletedJob.new(
core_index_name(model), instance.sphinx_document_id
),
ThinkingSphinx::Configuration.instance.delayed_job_priority
) if instance
true
end
end
end
end
| Make delayed deltas respect the TS.deltas_enabled/TS.updates_enabled flags | Make delayed deltas respect the TS.deltas_enabled/TS.updates_enabled flags
| Ruby | mit | fxposter/thinking-sphinx,factorylabs/thinking-sphinx,pierrevalade/thinking-sphinx,raecoo/thinking-sphinx-chinese,Dishwasha/thinking-sphinx,talho/thinking-sphinx,agibralter/thinking-sphinx,dhh/thinking-sphinx,Napolskih/thinking-sphinx | ruby | ## Code Before:
require 'delayed/job'
require 'thinking_sphinx/deltas/delayed_delta/delta_job'
require 'thinking_sphinx/deltas/delayed_delta/flag_as_deleted_job'
require 'thinking_sphinx/deltas/delayed_delta/job'
module ThinkingSphinx
module Deltas
class DelayedDelta < ThinkingSphinx::Deltas::DefaultDelta
def index(model, instance = nil)
ThinkingSphinx::Deltas::Job.enqueue(
ThinkingSphinx::Deltas::DeltaJob.new(delta_index_name(model)),
ThinkingSphinx::Configuration.instance.delayed_job_priority
)
Delayed::Job.enqueue(
ThinkingSphinx::Deltas::FlagAsDeletedJob.new(
core_index_name(model), instance.sphinx_document_id
),
ThinkingSphinx::Configuration.instance.delayed_job_priority
) if instance
true
end
end
end
end
## Instruction:
Make delayed deltas respect the TS.deltas_enabled/TS.updates_enabled flags
## Code After:
require 'delayed/job'
require 'thinking_sphinx/deltas/delayed_delta/delta_job'
require 'thinking_sphinx/deltas/delayed_delta/flag_as_deleted_job'
require 'thinking_sphinx/deltas/delayed_delta/job'
module ThinkingSphinx
module Deltas
class DelayedDelta < ThinkingSphinx::Deltas::DefaultDelta
def index(model, instance = nil)
return true unless ThinkingSphinx.updates_enabled? && ThinkingSphinx.deltas_enabled?
return true if instance && !toggled(instance)
ThinkingSphinx::Deltas::Job.enqueue(
ThinkingSphinx::Deltas::DeltaJob.new(delta_index_name(model)),
ThinkingSphinx::Configuration.instance.delayed_job_priority
)
Delayed::Job.enqueue(
ThinkingSphinx::Deltas::FlagAsDeletedJob.new(
core_index_name(model), instance.sphinx_document_id
),
ThinkingSphinx::Configuration.instance.delayed_job_priority
) if instance
true
end
end
end
end
| require 'delayed/job'
require 'thinking_sphinx/deltas/delayed_delta/delta_job'
require 'thinking_sphinx/deltas/delayed_delta/flag_as_deleted_job'
require 'thinking_sphinx/deltas/delayed_delta/job'
module ThinkingSphinx
module Deltas
class DelayedDelta < ThinkingSphinx::Deltas::DefaultDelta
def index(model, instance = nil)
+ return true unless ThinkingSphinx.updates_enabled? && ThinkingSphinx.deltas_enabled?
+ return true if instance && !toggled(instance)
+
ThinkingSphinx::Deltas::Job.enqueue(
ThinkingSphinx::Deltas::DeltaJob.new(delta_index_name(model)),
ThinkingSphinx::Configuration.instance.delayed_job_priority
)
Delayed::Job.enqueue(
ThinkingSphinx::Deltas::FlagAsDeletedJob.new(
core_index_name(model), instance.sphinx_document_id
),
ThinkingSphinx::Configuration.instance.delayed_job_priority
) if instance
true
end
end
end
end | 3 | 0.111111 | 3 | 0 |
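
The two lines added above are guard clauses: when updates or deltas are globally disabled, `index` returns early and never enqueues a job. The same early-return pattern in a small Python sketch (class and names invented for illustration):

```python
class DeltaIndexer:
    """Toy stand-in for a delta indexer that queues background jobs."""

    def __init__(self, updates_enabled=True, deltas_enabled=True):
        self.updates_enabled = updates_enabled
        self.deltas_enabled = deltas_enabled
        self.queue = []

    def index(self, model, instance=None):
        if not (self.updates_enabled and self.deltas_enabled):
            return True                      # bail out before touching the queue
        self.queue.append(("delta", model))
        if instance is not None:
            self.queue.append(("flag_deleted", model, instance))
        return True

on = DeltaIndexer()
off = DeltaIndexer(deltas_enabled=False)
on.index("page", instance=42)    # queues two jobs
off.index("page", instance=42)   # queues nothing
```
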
297c0e72a047acb8d261dc1e44f1bf1e6fa1b694 | spec/commands/build_spec.rb | spec/commands/build_spec.rb | require 'spec_helper'
describe Polytexnic::Commands::Build do
before(:all) { generate_book }
after(:all) { remove_book }
context 'valid builder formats' do
Polytexnic::FORMATS.each do |format|
subject { Polytexnic::Commands::Build.builder_for(format) }
it { should be_a Polytexnic::Builder }
end
end
context 'invalid builder format' do
subject { lambda { Polytexnic::Commands::Build.for_format('derp') } }
it { should raise_error }
end
context 'building each format' do
Polytexnic::FORMATS.each do |format|
subject {
lambda {
silence { Polytexnic::Commands::Build.for_format format }
}
}
it { should_not raise_error }
end
end
context 'building all' do
subject {
lambda {
silence { Polytexnic::Commands::Build.all_formats }
}
}
it { should_not raise_error }
after(:all) do
chdir_to_md_book
Polytexnic::Builders::Html.new.clean!
end
end
end | require 'spec_helper'
describe Polytexnic::Commands::Build do
before(:all) { generate_book }
after(:all) { remove_book }
context 'valid builder formats' do
Polytexnic::FORMATS.each do |format|
subject { Polytexnic::Commands::Build.builder_for(format) }
it { should be_a Polytexnic::Builder }
end
end
context 'invalid builder format' do
subject { lambda { Polytexnic::Commands::Build.for_format('derp') } }
it { should raise_error }
end
context 'building all' do
subject(:build) { Polytexnic::Commands::Build }
it { should respond_to(:all_formats) }
it "should build all formats" do
pdf_builder = build.builder_for('pdf')
html_builder = build.builder_for('html')
epub_builder = build.builder_for('epub')
mobi_builder = build.builder_for('mobi')
pdf_builder .should_receive(:build!)
html_builder.should_receive(:build!)
epub_builder.should_receive(:build!)
mobi_builder.should_receive(:build!)
build.should_receive(:builder_for).with('pdf') .and_return(pdf_builder)
build.should_receive(:builder_for).with('html').and_return(html_builder)
build.should_receive(:builder_for).with('epub').and_return(epub_builder)
build.should_receive(:builder_for).with('mobi').and_return(mobi_builder)
build.all_formats
end
end
end | Use message expectations to speed up the build spec | Use message expectations to speed up the build spec
This commit also removes some (very slow) tests for `build!` that are
already tested in the tests for individual builders.
[#52758883]
| Ruby | mit | softcover/softcover,softcover/softcover,minireference/softcover,minireference/softcover,minireference/softcover,softcover/softcover | ruby | ## Code Before:
require 'spec_helper'
describe Polytexnic::Commands::Build do
before(:all) { generate_book }
after(:all) { remove_book }
context 'valid builder formats' do
Polytexnic::FORMATS.each do |format|
subject { Polytexnic::Commands::Build.builder_for(format) }
it { should be_a Polytexnic::Builder }
end
end
context 'invalid builder format' do
subject { lambda { Polytexnic::Commands::Build.for_format('derp') } }
it { should raise_error }
end
context 'building each format' do
Polytexnic::FORMATS.each do |format|
subject {
lambda {
silence { Polytexnic::Commands::Build.for_format format }
}
}
it { should_not raise_error }
end
end
context 'building all' do
subject {
lambda {
silence { Polytexnic::Commands::Build.all_formats }
}
}
it { should_not raise_error }
after(:all) do
chdir_to_md_book
Polytexnic::Builders::Html.new.clean!
end
end
end
## Instruction:
Use message expectations to speed up the build spec
This commit also removes some (very slow) tests for `build!` that are
already tested in the tests for individual builders.
[#52758883]
## Code After:
require 'spec_helper'
describe Polytexnic::Commands::Build do
before(:all) { generate_book }
after(:all) { remove_book }
context 'valid builder formats' do
Polytexnic::FORMATS.each do |format|
subject { Polytexnic::Commands::Build.builder_for(format) }
it { should be_a Polytexnic::Builder }
end
end
context 'invalid builder format' do
subject { lambda { Polytexnic::Commands::Build.for_format('derp') } }
it { should raise_error }
end
context 'building all' do
subject(:build) { Polytexnic::Commands::Build }
it { should respond_to(:all_formats) }
it "should build all formats" do
pdf_builder = build.builder_for('pdf')
html_builder = build.builder_for('html')
epub_builder = build.builder_for('epub')
mobi_builder = build.builder_for('mobi')
pdf_builder .should_receive(:build!)
html_builder.should_receive(:build!)
epub_builder.should_receive(:build!)
mobi_builder.should_receive(:build!)
build.should_receive(:builder_for).with('pdf') .and_return(pdf_builder)
build.should_receive(:builder_for).with('html').and_return(html_builder)
build.should_receive(:builder_for).with('epub').and_return(epub_builder)
build.should_receive(:builder_for).with('mobi').and_return(mobi_builder)
build.all_formats
end
end
end | require 'spec_helper'
describe Polytexnic::Commands::Build do
before(:all) { generate_book }
after(:all) { remove_book }
context 'valid builder formats' do
Polytexnic::FORMATS.each do |format|
subject { Polytexnic::Commands::Build.builder_for(format) }
it { should be_a Polytexnic::Builder }
end
end
context 'invalid builder format' do
subject { lambda { Polytexnic::Commands::Build.for_format('derp') } }
it { should raise_error }
end
- context 'building each format' do
? - ^^^^^^^^^
+ context 'building all' do
? ^^
+ subject(:build) { Polytexnic::Commands::Build }
- Polytexnic::FORMATS.each do |format|
- subject {
- lambda {
- silence { Polytexnic::Commands::Build.for_format format }
- }
- }
- it { should_not raise_error }
- end
- end
+ it { should respond_to(:all_formats) }
+ it "should build all formats" do
+ pdf_builder = build.builder_for('pdf')
+ html_builder = build.builder_for('html')
+ epub_builder = build.builder_for('epub')
+ mobi_builder = build.builder_for('mobi')
+ pdf_builder .should_receive(:build!)
+ html_builder.should_receive(:build!)
+ epub_builder.should_receive(:build!)
+ mobi_builder.should_receive(:build!)
- context 'building all' do
- subject {
- lambda {
- silence { Polytexnic::Commands::Build.all_formats }
- }
- }
- it { should_not raise_error }
+ build.should_receive(:builder_for).with('pdf') .and_return(pdf_builder)
+ build.should_receive(:builder_for).with('html').and_return(html_builder)
+ build.should_receive(:builder_for).with('epub').and_return(epub_builder)
+ build.should_receive(:builder_for).with('mobi').and_return(mobi_builder)
+ build.all_formats
- after(:all) do
- chdir_to_md_book
- Polytexnic::Builders::Html.new.clean!
end
end
end | 37 | 0.804348 | 17 | 20 |
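
The rewritten spec above trades real builds for message expectations: it only verifies that `build!` is requested on each builder, so nothing slow actually runs. Python's `unittest.mock` supports the same move; a sketch with illustrative names:

```python
from unittest.mock import MagicMock

def build_all(builder_for, formats=("pdf", "html", "epub", "mobi")):
    """Ask the factory for each format's builder and trigger its build."""
    for fmt in formats:
        builder_for(fmt).build()

builders = {fmt: MagicMock(name=f"{fmt}_builder")
            for fmt in ("pdf", "html", "epub", "mobi")}
build_all(lambda fmt: builders[fmt])

for mock in builders.values():
    mock.build.assert_called_once()   # fast: no real build ever happened
```
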
dc9901abcf9722e9505063afdb1d02dd4eb4b929 | release-checklist.rst | release-checklist.rst | Release checklist
=================
Things to remember when making a new release of pandas-validation.
#. Changes should be made to some branch other than master (a pull request
should then be created before making the release).
#. Make desirable changes to the code.
#. Check coding style against some of the conventions in PEP8:
.. code-block:: none
$ pycodestyle *.py
#. Run tests and report coverage:
.. code-block:: none
$ pytest -v test_pandasvalidation.py
$ coverage run -m pytest test_pandasvalidation.py
$ coverage report -m pandasvalidation.py
#. Update ``README.rst`` and the documentation (in ``docs/``).
.. code-block:: none
$ sphinx-build -b html ./docs/source ./docs/_build/html
#. Update ``CHANGELOG.rst`` and add a release date.
#. Update the release (version) number in ``setup.py`` and
``pandasvalidation.py``. Use `Semantic Versioning <http://semver.org>`_.
#. Create pull request(s) with changes for the new release.
#. Create distributions and upload the files to
`PyPI <https://pypi.python.org/pypi>`_ with
`twine <https://github.com/pypa/twine>`_.
.. code-block:: none
$ python setup.py sdist bdist_wheel
$ twine upload dist/*
#. Create the new release in GitHub.
#. Trigger a new build of the documentation on `http://readthedocs.io`_.
| Release checklist
=================
Things to remember when making a new release of pandas-validation.
#. Changes should be made to some branch other than master (a pull request
should then be created before making the release).
#. Make desirable changes to the code.
#. Check coding style against some of the conventions in PEP8:
.. code-block:: none
$ pycodestyle *.py
#. Run tests and report coverage:
.. code-block:: none
$ pytest -v test_pandasvalidation.py
$ coverage run -m pytest test_pandasvalidation.py
$ coverage report -m pandasvalidation.py
#. Update ``README.rst`` and the documentation (in ``docs/``).
.. code-block:: none
$ sphinx-build -b html ./docs/source ./docs/_build/html
#. Update ``CHANGELOG.rst`` and add a release date.
#. Update the release (version) number in ``setup.py`` and
``pandasvalidation.py``. Use `Semantic Versioning <http://semver.org>`_.
#. Create pull request(s) with changes for the new release.
#. Create distributions and upload the files to
`PyPI <https://pypi.python.org/pypi>`_ with
`twine <https://github.com/pypa/twine>`_.
.. code-block:: none
$ python setup.py sdist bdist_wheel
$ twine upload dist/*
#. Create the new release in GitHub.
#. Trigger a new build (latest version) of the documentation on
`http://readthedocs.io`_.
| Modify text in release checklist | Modify text in release checklist
| reStructuredText | mit | jmenglund/pandas-validation | restructuredtext | ## Code Before:
Release checklist
=================
Things to remember when making a new release of pandas-validation.
#. Changes should be made to some branch other than master (a pull request
should then be created before making the release).
#. Make desirable changes to the code.
#. Check coding style against some of the conventions in PEP8:
.. code-block:: none
$ pycodestyle *.py
#. Run tests and report coverage:
.. code-block:: none
$ pytest -v test_pandasvalidation.py
$ coverage run -m pytest test_pandasvalidation.py
$ coverage report -m pandasvalidation.py
#. Update ``README.rst`` and the documentation (in ``docs/``).
.. code-block:: none
$ sphinx-build -b html ./docs/source ./docs/_build/html
#. Update ``CHANGELOG.rst`` and add a release date.
#. Update the release (version) number in ``setup.py`` and
``pandasvalidation.py``. Use `Semantic Versioning <http://semver.org>`_.
#. Create pull request(s) with changes for the new release.
#. Create distributions and upload the files to
`PyPI <https://pypi.python.org/pypi>`_ with
`twine <https://github.com/pypa/twine>`_.
.. code-block:: none
$ python setup.py sdist bdist_wheel
$ twine upload dist/*
#. Create the new release in GitHub.
#. Trigger a new build of the documentation on `http://readthedocs.io`_.
## Instruction:
Modify text in release checklist
## Code After:
Release checklist
=================
Things to remember when making a new release of pandas-validation.
#. Changes should be made to some branch other than master (a pull request
should then be created before making the release).
#. Make desirable changes to the code.
#. Check coding style against some of the conventions in PEP8:
.. code-block:: none
$ pycodestyle *.py
#. Run tests and report coverage:
.. code-block:: none
$ pytest -v test_pandasvalidation.py
$ coverage run -m pytest test_pandasvalidation.py
$ coverage report -m pandasvalidation.py
#. Update ``README.rst`` and the documentation (in ``docs/``).
.. code-block:: none
$ sphinx-build -b html ./docs/source ./docs/_build/html
#. Update ``CHANGELOG.rst`` and add a release date.
#. Update the release (version) number in ``setup.py`` and
``pandasvalidation.py``. Use `Semantic Versioning <http://semver.org>`_.
#. Create pull request(s) with changes for the new release.
#. Create distributions and upload the files to
`PyPI <https://pypi.python.org/pypi>`_ with
`twine <https://github.com/pypa/twine>`_.
.. code-block:: none
$ python setup.py sdist bdist_wheel
$ twine upload dist/*
#. Create the new release in GitHub.
#. Trigger a new build (latest version) of the documentation on
`http://readthedocs.io`_.
| Release checklist
=================
Things to remember when making a new release of pandas-validation.
#. Changes should be made to some branch other than master (a pull request
should then be created before making the release).
#. Make desirable changes to the code.
#. Check coding style against some of the conventions in PEP8:
.. code-block:: none
$ pycodestyle *.py
#. Run tests and report coverage:
.. code-block:: none
$ pytest -v test_pandasvalidation.py
$ coverage run -m pytest test_pandasvalidation.py
$ coverage report -m pandasvalidation.py
#. Update ``README.rst`` and the documentation (in ``docs/``).
.. code-block:: none
$ sphinx-build -b html ./docs/source ./docs/_build/html
#. Update ``CHANGELOG.rst`` and add a release date.
#. Update the release (version) number in ``setup.py`` and
``pandasvalidation.py``. Use `Semantic Versioning <http://semver.org>`_.
#. Create pull request(s) with changes for the new release.
#. Create distributions and upload the files to
`PyPI <https://pypi.python.org/pypi>`_ with
`twine <https://github.com/pypa/twine>`_.
.. code-block:: none
$ python setup.py sdist bdist_wheel
$ twine upload dist/*
#. Create the new release in GitHub.
- #. Trigger a new build of the documentation on `http://readthedocs.io`_.
+ #. Trigger a new build (latest version) of the documentation on
+ `http://readthedocs.io`_. | 3 | 0.061224 | 2 | 1 |
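
The checklist above defers to Semantic Versioning for the release number. For the plain MAJOR.MINOR.PATCH case it refers to, the bump rules are mechanical enough to sketch (pre-release and build-metadata suffixes are ignored here):

```python
def bump(version, part):
    """Bump a MAJOR.MINOR.PATCH version string."""
    major, minor, patch = (int(p) for p in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"        # breaking change: reset minor and patch
    if part == "minor":
        return f"{major}.{minor + 1}.0"  # new feature: reset patch
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown part: {part!r}")

assert bump("1.2.3", "patch") == "1.2.4"
assert bump("1.2.3", "minor") == "1.3.0"
assert bump("1.2.3", "major") == "2.0.0"
```
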
e4f89f25d8ba9bf661738a8ba3a33e2ecaca014d | webserver/home/templates/home/_competition_list.html | webserver/home/templates/home/_competition_list.html | <div class="panel panel-default">
<div class="panel-heading">{{ section_name }}</div>
<div class="panel-body">
{% for competition in competitions %}
</a>
<div class="media">
<div class="media-left media-middle">
<a href="{{ competition.get_absolute_url }}">
<img class="media-object" src="{{ competition.thumbnail_url|default:competition.DEFAULT_THUMB }}" alt="{{ competition.name }}" style="width: 64px; height: 64px;">
</a>
</div>
<div class="media-body">
<h4 class="media-heading">{{ competition.name }}</h4>
{{ competition.description }}
</div>
</div>
{% endfor %}
</div>
</div>
| <div class="panel panel-default">
<div class="panel-heading">{{ section_name }}</div>
<div class="panel-body">
{% for competition in competitions %}
</a>
<div class="media">
<div class="media-left media-middle">
<a href="{{ competition.get_absolute_url }}">
<img class="media-object" src="{{ competition.thumbnail_url|default:competition.DEFAULT_THUMB }}" alt="{{ competition.name }}" style="width: 64px; height: 64px;">
</a>
</div>
<div class="media-body">
<h4 class="media-heading">
<a href="{{ competition.get_absolute_url }}">
{{ competition.name }}
</a>
</h4>
{{ competition.description }}
</div>
</div>
{% endfor %}
</div>
</div>
| Add competition link to competition name | Add competition link to competition name
| HTML | bsd-3-clause | siggame/webserver,siggame/webserver,siggame/webserver | html | ## Code Before:
<div class="panel panel-default">
<div class="panel-heading">{{ section_name }}</div>
<div class="panel-body">
{% for competition in competitions %}
</a>
<div class="media">
<div class="media-left media-middle">
<a href="{{ competition.get_absolute_url }}">
<img class="media-object" src="{{ competition.thumbnail_url|default:competition.DEFAULT_THUMB }}" alt="{{ competition.name }}" style="width: 64px; height: 64px;">
</a>
</div>
<div class="media-body">
<h4 class="media-heading">{{ competition.name }}</h4>
{{ competition.description }}
</div>
</div>
{% endfor %}
</div>
</div>
## Instruction:
Add competition link to competition name
## Code After:
<div class="panel panel-default">
<div class="panel-heading">{{ section_name }}</div>
<div class="panel-body">
{% for competition in competitions %}
</a>
<div class="media">
<div class="media-left media-middle">
<a href="{{ competition.get_absolute_url }}">
<img class="media-object" src="{{ competition.thumbnail_url|default:competition.DEFAULT_THUMB }}" alt="{{ competition.name }}" style="width: 64px; height: 64px;">
</a>
</div>
<div class="media-body">
<h4 class="media-heading">
<a href="{{ competition.get_absolute_url }}">
{{ competition.name }}
</a>
</h4>
{{ competition.description }}
</div>
</div>
{% endfor %}
</div>
</div>
| <div class="panel panel-default">
<div class="panel-heading">{{ section_name }}</div>
<div class="panel-body">
{% for competition in competitions %}
</a>
<div class="media">
<div class="media-left media-middle">
<a href="{{ competition.get_absolute_url }}">
<img class="media-object" src="{{ competition.thumbnail_url|default:competition.DEFAULT_THUMB }}" alt="{{ competition.name }}" style="width: 64px; height: 64px;">
</a>
</div>
<div class="media-body">
- <h4 class="media-heading">{{ competition.name }}</h4>
+ <h4 class="media-heading">
+ <a href="{{ competition.get_absolute_url }}">
+ {{ competition.name }}
+ </a>
+ </h4>
{{ competition.description }}
</div>
</div>
{% endfor %}
</div>
</div> | 6 | 0.285714 | 5 | 1 |
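
Each record in this dump ends with four numeric columns, including counts of added and deleted lines (here `5` and `1`). Counts like these can be recomputed from the before/after text with Python's `difflib`; this is a sketch of the idea, not necessarily the exact method used to build the dataset:

```python
import difflib

def count_changes(old, new):
    """Count added and deleted lines in an ndiff of two texts."""
    added = deleted = 0
    for line in difflib.ndiff(old.splitlines(), new.splitlines()):
        if line.startswith("+ "):
            added += 1
        elif line.startswith("- "):
            deleted += 1          # "? " hint lines are ignored
    return added, deleted

added, deleted = count_changes("a\nb\nc", "a\nB\nc\nd")
assert (added, deleted) == (2, 1)
```
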
b800d35342354c77d52d86faaf5a067ab253bcaa | src/USlugify.php | src/USlugify.php | <?php
namespace USlugify;
/**
* USlugify
*
* @author Florent Denis <dflorent.pokap@gmail.com>
*/
class USlugify implements USlugifyInterface
{
/**
* {@inheritdoc}
*/
public function slugify($text)
{
$reservedChars = $this->getReservedChars();
$text = str_replace($reservedChars, '-', $text);
$text = trim(preg_replace('/[-\s]+/u', '-', $text), '-');
return mb_strtolower($text, mb_detect_encoding($text));
}
/**
* Returns list of char reserved.
* http://en.wikipedia.org/wiki/Percent-encoding#Percent-encoding_reserved_characters
*
* @return array
*/
protected function getReservedChars()
{
return array('!', '#', '$', '&', '\'', '"', '(', ')', '*', '+', ',', '/', ':', ';', '=', '?', '@', '[', ']');
}
}
| <?php
namespace USlugify;
/**
* USlugify
*
* @author Florent Denis <dflorent.pokap@gmail.com>
*/
class USlugify implements USlugifyInterface
{
/**
* {@inheritdoc}
*/
public function slugify($text)
{
$reservedChars = $this->getReservedChars();
$text = str_replace($reservedChars, '-', $text);
$text = preg_replace('/\p{Mn}+/u', '-', $text);
$text = trim(preg_replace('/[-\s]+/u', '-', $text), '-');
return mb_strtolower($text, mb_detect_encoding($text));
}
/**
* Returns list of char reserved.
* http://en.wikipedia.org/wiki/Percent-encoding#Percent-encoding_reserved_characters
*
* @return array
*/
protected function getReservedChars()
{
return array('!', '#', '$', '&', '\'', '"', '(', ')', '*', '+', ',', '/', ':', ';', '=', '?', '@', '[', ']');
}
}
| Test with non-spacing mark filter | Test with non-spacing mark filter | PHP | mit | pokap/USlugify | php | ## Code Before:
<?php
namespace USlugify;
/**
* USlugify
*
* @author Florent Denis <dflorent.pokap@gmail.com>
*/
class USlugify implements USlugifyInterface
{
/**
* {@inheritdoc}
*/
public function slugify($text)
{
$reservedChars = $this->getReservedChars();
$text = str_replace($reservedChars, '-', $text);
$text = trim(preg_replace('/[-\s]+/u', '-', $text), '-');
return mb_strtolower($text, mb_detect_encoding($text));
}
/**
* Returns list of char reserved.
* http://en.wikipedia.org/wiki/Percent-encoding#Percent-encoding_reserved_characters
*
* @return array
*/
protected function getReservedChars()
{
return array('!', '#', '$', '&', '\'', '"', '(', ')', '*', '+', ',', '/', ':', ';', '=', '?', '@', '[', ']');
}
}
## Instruction:
Test with non-spacing mark filter
## Code After:
<?php
namespace USlugify;
/**
* USlugify
*
* @author Florent Denis <dflorent.pokap@gmail.com>
*/
class USlugify implements USlugifyInterface
{
/**
* {@inheritdoc}
*/
public function slugify($text)
{
$reservedChars = $this->getReservedChars();
$text = str_replace($reservedChars, '-', $text);
$text = preg_replace('/\p{Mn}+/u', '-', $text);
$text = trim(preg_replace('/[-\s]+/u', '-', $text), '-');
return mb_strtolower($text, mb_detect_encoding($text));
}
/**
* Returns list of char reserved.
* http://en.wikipedia.org/wiki/Percent-encoding#Percent-encoding_reserved_characters
*
* @return array
*/
protected function getReservedChars()
{
return array('!', '#', '$', '&', '\'', '"', '(', ')', '*', '+', ',', '/', ':', ';', '=', '?', '@', '[', ']');
}
}
| <?php
namespace USlugify;
/**
* USlugify
*
* @author Florent Denis <dflorent.pokap@gmail.com>
*/
class USlugify implements USlugifyInterface
{
/**
* {@inheritdoc}
*/
public function slugify($text)
{
$reservedChars = $this->getReservedChars();
$text = str_replace($reservedChars, '-', $text);
+ $text = preg_replace('/\p{Mn}+/u', '-', $text);
$text = trim(preg_replace('/[-\s]+/u', '-', $text), '-');
return mb_strtolower($text, mb_detect_encoding($text));
}
/**
* Returns list of char reserved.
* http://en.wikipedia.org/wiki/Percent-encoding#Percent-encoding_reserved_characters
*
* @return array
*/
protected function getReservedChars()
{
return array('!', '#', '$', '&', '\'', '"', '(', ')', '*', '+', ',', '/', ':', ';', '=', '?', '@', '[', ']');
}
} | 1 | 0.028571 | 1 | 0 |
8f696734ee407ac375d377b8ee15c758c3079f7f | app/datastore/sample-objects.ts | app/datastore/sample-objects.ts | import {IdaiFieldDocument} from '../model/idai-field-document';
export var DOCS: IdaiFieldDocument[] = [
{
"id" : "o1", "resource" : { "id": "o1",
"identifier": "ob1",
"title": "Obi One Kenobi",
"cuts" : ["o2"],
"type": "object"
},
"synced": 0
},
{
"id" : "o2", "resource" : { "id" : "o2",
"identifier": "ob2",
"title": "Qui Gon Jinn",
"isCutBy" : ["o1"],
"type": "object"
},
"synced": 0
},
{
"id" : "o3", "resource" : { "id": "o3",
"identifier": "ob3", "title": "Luke Skywalker", "type": "object"
},
"synced": 0
},
{
"id" : "o4", "resource" : { "id": "o4",
"identifier": "ob4", "title": "Han Solo", "type": "object"
},
"synced": 0
},
{
"id" : "o5", "resource" : { "id": "o5",
"identifier": "ob5", "title": "Boba Fett", "type": "object"
},
"synced": 0
}
]; | import {IdaiFieldDocument} from '../model/idai-field-document';
export var DOCS: IdaiFieldDocument[] = [
{
"id" : "o1", "resource" : { "id": "o1",
"identifier": "ob1",
"title": "Obi One Kenobi",
"relations" : { "cuts" : ["o2"] },
"type": "object"
},
"synced": 0
},
{
"id" : "o2", "resource" : { "id" : "o2",
"identifier": "ob2",
"title": "Qui Gon Jinn",
"relations" : { "isCutBy" : ["o1"] },
"type": "object"
},
"synced": 0
},
{
"id" : "o3", "resource" : { "id": "o3",
"identifier": "ob3", "title": "Luke Skywalker", "type": "object",
"relations" : {}
},
"synced": 0
},
{
"id" : "o4", "resource" : { "id": "o4",
"identifier": "ob4", "title": "Han Solo", "type": "object",
"relations" : {}
},
"synced": 0
},
{
"id" : "o5", "resource" : { "id": "o5",
"identifier": "ob5", "title": "Boba Fett", "type": "object",
"relations" : {}
},
"synced": 0
}
]; | Adjust sample objects. Add 'relations' level. | Adjust sample objects. Add 'relations' level.
| TypeScript | apache-2.0 | codarchlab/idai-field-client,codarchlab/idai-field-client,codarchlab/idai-field-client,codarchlab/idai-field-client | typescript | ## Code Before:
import {IdaiFieldDocument} from '../model/idai-field-document';
export var DOCS: IdaiFieldDocument[] = [
{
"id" : "o1", "resource" : { "id": "o1",
"identifier": "ob1",
"title": "Obi One Kenobi",
"cuts" : ["o2"],
"type": "object"
},
"synced": 0
},
{
"id" : "o2", "resource" : { "id" : "o2",
"identifier": "ob2",
"title": "Qui Gon Jinn",
"isCutBy" : ["o1"],
"type": "object"
},
"synced": 0
},
{
"id" : "o3", "resource" : { "id": "o3",
"identifier": "ob3", "title": "Luke Skywalker", "type": "object"
},
"synced": 0
},
{
"id" : "o4", "resource" : { "id": "o4",
"identifier": "ob4", "title": "Han Solo", "type": "object"
},
"synced": 0
},
{
"id" : "o5", "resource" : { "id": "o5",
"identifier": "ob5", "title": "Boba Fett", "type": "object"
},
"synced": 0
}
];
## Instruction:
Adjust sample objects. Add 'relations' level.
## Code After:
import {IdaiFieldDocument} from '../model/idai-field-document';
export var DOCS: IdaiFieldDocument[] = [
{
"id" : "o1", "resource" : { "id": "o1",
"identifier": "ob1",
"title": "Obi One Kenobi",
"relations" : { "cuts" : ["o2"] },
"type": "object"
},
"synced": 0
},
{
"id" : "o2", "resource" : { "id" : "o2",
"identifier": "ob2",
"title": "Qui Gon Jinn",
"relations" : { "isCutBy" : ["o1"] },
"type": "object"
},
"synced": 0
},
{
"id" : "o3", "resource" : { "id": "o3",
"identifier": "ob3", "title": "Luke Skywalker", "type": "object",
"relations" : {}
},
"synced": 0
},
{
"id" : "o4", "resource" : { "id": "o4",
"identifier": "ob4", "title": "Han Solo", "type": "object",
"relations" : {}
},
"synced": 0
},
{
"id" : "o5", "resource" : { "id": "o5",
"identifier": "ob5", "title": "Boba Fett", "type": "object",
"relations" : {}
},
"synced": 0
}
]; | import {IdaiFieldDocument} from '../model/idai-field-document';
export var DOCS: IdaiFieldDocument[] = [
{
"id" : "o1", "resource" : { "id": "o1",
"identifier": "ob1",
"title": "Obi One Kenobi",
- "cuts" : ["o2"],
+ "relations" : { "cuts" : ["o2"] },
? ++++++++++++++++ ++
"type": "object"
},
"synced": 0
},
{
"id" : "o2", "resource" : { "id" : "o2",
"identifier": "ob2",
"title": "Qui Gon Jinn",
- "isCutBy" : ["o1"],
+ "relations" : { "isCutBy" : ["o1"] },
? ++++++++++++++++ ++
"type": "object"
},
"synced": 0
},
{
"id" : "o3", "resource" : { "id": "o3",
- "identifier": "ob3", "title": "Luke Skywalker", "type": "object"
+ "identifier": "ob3", "title": "Luke Skywalker", "type": "object",
? +
+ "relations" : {}
},
"synced": 0
},
{
"id" : "o4", "resource" : { "id": "o4",
- "identifier": "ob4", "title": "Han Solo", "type": "object"
+ "identifier": "ob4", "title": "Han Solo", "type": "object",
? +
+ "relations" : {}
},
"synced": 0
},
{
"id" : "o5", "resource" : { "id": "o5",
- "identifier": "ob5", "title": "Boba Fett", "type": "object"
+ "identifier": "ob5", "title": "Boba Fett", "type": "object",
? +
+ "relations" : {}
},
"synced": 0
}
]; | 13 | 0.325 | 8 | 5 |
8f4c443a2367baca35b23679c9f45df3ed82deea | archive.html | archive.html | ---
license: MIT
---
<!DOCTYPE html>
<html>
<head>
<title>{{ page.title }}</title>
</head>
<body>
{% for post in page.archive.posts reversed %}
<article>
<h2><a href="{{post.url}}">{{ post.title }}</a></h2>
</article>
{% endfor %}
</body>
</html>
| ---
license: MIT
---
<!DOCTYPE html>
<html>
<head>
<title>{{ page.title }}</title>
</head>
<body>
{% for post in page.archive.posts reversed %}
<article>
<h2><a href="{{post.url}}">{{ post.title }}</a></h2>
</article>
{% endfor %}
<!-- Pagination links -->
<nav class="pagination">
{% if paginator.previous_page %}
<a href="/page{{ paginator.previous_page }}" class="previous">Previous</a>
{% else %}
<span class="previous">Previous</span>
{% endif %}
<span class="page_number ">Page: {{ paginator.page }} of {{ paginator.total_pages }}</span>
{% if paginator.next_page %}
<a href="/page{{ paginator.next_page }}" class="next">Next</a>
{% else %}
<span class="next ">Next</span>
{% endif %}
</nav>
</body>
</html>
| Add pagination to example layout. | Add pagination to example layout.
| HTML | mit | itafroma/jekyll-archive,itafroma/jekyll-archive | html | ## Code Before:
---
license: MIT
---
<!DOCTYPE html>
<html>
<head>
<title>{{ page.title }}</title>
</head>
<body>
{% for post in page.archive.posts reversed %}
<article>
<h2><a href="{{post.url}}">{{ post.title }}</a></h2>
</article>
{% endfor %}
</body>
</html>
## Instruction:
Add pagination to example layout.
## Code After:
---
license: MIT
---
<!DOCTYPE html>
<html>
<head>
<title>{{ page.title }}</title>
</head>
<body>
{% for post in page.archive.posts reversed %}
<article>
<h2><a href="{{post.url}}">{{ post.title }}</a></h2>
</article>
{% endfor %}
<!-- Pagination links -->
<nav class="pagination">
{% if paginator.previous_page %}
<a href="/page{{ paginator.previous_page }}" class="previous">Previous</a>
{% else %}
<span class="previous">Previous</span>
{% endif %}
<span class="page_number ">Page: {{ paginator.page }} of {{ paginator.total_pages }}</span>
{% if paginator.next_page %}
<a href="/page{{ paginator.next_page }}" class="next">Next</a>
{% else %}
<span class="next ">Next</span>
{% endif %}
</nav>
</body>
</html>
| ---
license: MIT
---
<!DOCTYPE html>
<html>
<head>
<title>{{ page.title }}</title>
</head>
<body>
{% for post in page.archive.posts reversed %}
<article>
<h2><a href="{{post.url}}">{{ post.title }}</a></h2>
</article>
{% endfor %}
+
+ <!-- Pagination links -->
+ <nav class="pagination">
+ {% if paginator.previous_page %}
+ <a href="/page{{ paginator.previous_page }}" class="previous">Previous</a>
+ {% else %}
+ <span class="previous">Previous</span>
+ {% endif %}
+ <span class="page_number ">Page: {{ paginator.page }} of {{ paginator.total_pages }}</span>
+ {% if paginator.next_page %}
+ <a href="/page{{ paginator.next_page }}" class="next">Next</a>
+ {% else %}
+ <span class="next ">Next</span>
+ {% endif %}
+ </nav>
</body>
</html> | 15 | 0.9375 | 15 | 0 |
20294daf4697b94016ebd6b4d38207ec785d0d5a | elasticsearch/init.sls | elasticsearch/init.sls | include:
- java
elasticsearch:
pkg:
- installed
- sources:
- elasticsearch: http://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-0.20.2.deb
service.running:
- require:
- pkg: elasticsearch
- file: /mnt/elasticsearch
- file: /var/log/elasticsearch
- watch:
- file: /etc/elasticsearch/elasticsearch.yml
- file: /etc/elasticsearch/default_mapping.json
/mnt/elasticsearch:
file.directory:
- user: elasticsearch
- group: elasticsearch
- require:
- pkg: elasticsearch
/var/log/elasticsearch:
file.directory:
- user: elasticsearch
- group: elasticsearch
- require:
- pkg: elasticsearch
/etc/elasticsearch/elasticsearch.yml:
file.managed:
- source: salt://elasticsearch/elasticsearch.yml
- user: root
- group: root
- mode: 0644
- require:
- pkg: elasticsearch
/etc/elasticsearch/default_mapping.json:
file.managed:
- source: salt://elasticsearch/default_mapping.json
- user: root
- group: root
- mode: 0644
- require:
- pkg: elasticsearch
| include:
- java
elasticsearch:
pkg:
- installed
- sources:
- elasticsearch: http://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-0.20.2.deb
service.running:
- require:
- pkg: java
- pkg: elasticsearch
- file: /mnt/elasticsearch
- file: /var/log/elasticsearch
- watch:
- file: /etc/elasticsearch/elasticsearch.yml
- file: /etc/elasticsearch/default_mapping.json
/mnt/elasticsearch:
file.directory:
- user: elasticsearch
- group: elasticsearch
- require:
- pkg: elasticsearch
/var/log/elasticsearch:
file.directory:
- user: elasticsearch
- group: elasticsearch
- require:
- pkg: elasticsearch
/etc/elasticsearch/elasticsearch.yml:
file.managed:
- source: salt://elasticsearch/elasticsearch.yml
- user: root
- group: root
- mode: 0644
- require:
- pkg: elasticsearch
/etc/elasticsearch/default_mapping.json:
file.managed:
- source: salt://elasticsearch/default_mapping.json
- user: root
- group: root
- mode: 0644
- require:
- pkg: elasticsearch
| Make sure java is installed before we start elasticsearch. | Make sure java is installed before we start elasticsearch.
| SaltStack | apache-2.0 | jesusaurus/hpcs-salt-state,jesusaurus/hpcs-salt-state | saltstack | ## Code Before:
include:
- java
elasticsearch:
pkg:
- installed
- sources:
- elasticsearch: http://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-0.20.2.deb
service.running:
- require:
- pkg: elasticsearch
- file: /mnt/elasticsearch
- file: /var/log/elasticsearch
- watch:
- file: /etc/elasticsearch/elasticsearch.yml
- file: /etc/elasticsearch/default_mapping.json
/mnt/elasticsearch:
file.directory:
- user: elasticsearch
- group: elasticsearch
- require:
- pkg: elasticsearch
/var/log/elasticsearch:
file.directory:
- user: elasticsearch
- group: elasticsearch
- require:
- pkg: elasticsearch
/etc/elasticsearch/elasticsearch.yml:
file.managed:
- source: salt://elasticsearch/elasticsearch.yml
- user: root
- group: root
- mode: 0644
- require:
- pkg: elasticsearch
/etc/elasticsearch/default_mapping.json:
file.managed:
- source: salt://elasticsearch/default_mapping.json
- user: root
- group: root
- mode: 0644
- require:
- pkg: elasticsearch
## Instruction:
Make sure java is installed before we start elasticsearch.
## Code After:
include:
- java
elasticsearch:
pkg:
- installed
- sources:
- elasticsearch: http://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-0.20.2.deb
service.running:
- require:
- pkg: java
- pkg: elasticsearch
- file: /mnt/elasticsearch
- file: /var/log/elasticsearch
- watch:
- file: /etc/elasticsearch/elasticsearch.yml
- file: /etc/elasticsearch/default_mapping.json
/mnt/elasticsearch:
file.directory:
- user: elasticsearch
- group: elasticsearch
- require:
- pkg: elasticsearch
/var/log/elasticsearch:
file.directory:
- user: elasticsearch
- group: elasticsearch
- require:
- pkg: elasticsearch
/etc/elasticsearch/elasticsearch.yml:
file.managed:
- source: salt://elasticsearch/elasticsearch.yml
- user: root
- group: root
- mode: 0644
- require:
- pkg: elasticsearch
/etc/elasticsearch/default_mapping.json:
file.managed:
- source: salt://elasticsearch/default_mapping.json
- user: root
- group: root
- mode: 0644
- require:
- pkg: elasticsearch
| include:
- java
elasticsearch:
pkg:
- installed
- sources:
- elasticsearch: http://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-0.20.2.deb
service.running:
- require:
+ - pkg: java
- pkg: elasticsearch
- file: /mnt/elasticsearch
- file: /var/log/elasticsearch
- watch:
- file: /etc/elasticsearch/elasticsearch.yml
- file: /etc/elasticsearch/default_mapping.json
/mnt/elasticsearch:
file.directory:
- user: elasticsearch
- group: elasticsearch
- require:
- pkg: elasticsearch
/var/log/elasticsearch:
file.directory:
- user: elasticsearch
- group: elasticsearch
- require:
- pkg: elasticsearch
/etc/elasticsearch/elasticsearch.yml:
file.managed:
- source: salt://elasticsearch/elasticsearch.yml
- user: root
- group: root
- mode: 0644
- require:
- pkg: elasticsearch
/etc/elasticsearch/default_mapping.json:
file.managed:
- source: salt://elasticsearch/default_mapping.json
- user: root
- group: root
- mode: 0644
- require:
- pkg: elasticsearch | 1 | 0.020833 | 1 | 0 |
3839ec0682fa03ebdfe2d14bc5aa9e65fb67b798 | docs/development.md | docs/development.md | - Based on `go-swagger` tool.
- The following files are completely generated.
- `models`/
- `restapi/`
- `restapi/server.go`
- `cmd/weaviate-server/main.go`
- The file `restapi/configure_weaviate.go` is partially automatically generated, partially hand-edited.
## Dockerized development environment
### Build and run Weaviate
```
docker build -f Dockerfile.dev -t weaviate/development .
docker run --rm -p 8080:8080 weaviate/development
```
### Build and run the acceptance tests
```
docker build -f Dockerfile.dev --target acceptance_test -t weaviate/acceptance_test .
docker run --net=host --rm weaviate/acceptance_test -args -server-port=8080 -server-host=localhost -api-token=blah -api-key=blah
```
| - Based on `go-swagger` tool.
- The following files are completely generated.
- `models`/
- `restapi/`
- `restapi/server.go`
- `cmd/weaviate-server/main.go`
- The file `restapi/configure_weaviate.go` is partially automatically generated, partially hand-edited.
## Data Model
- Weaviate stores Things, Actions and Keys.
- Keys are used both for authentication and authorization in Weaviate.
- Owners of a key can create more keys; the new key points to the parent key that is used to create the key.
- Permissions (read, write, delete, execute) are linked to a key.
- Each piece of data (e.g. Things & Actions) is associated with a Key.
## Dockerized development environment
### Build and run Weaviate
```
docker build -f Dockerfile.dev -t weaviate/development .
docker run --rm -p 8080:8080 weaviate/development
```
### Build and run the acceptance tests
```
docker build -f Dockerfile.dev --target acceptance_test -t weaviate/acceptance_test .
docker run --net=host --rm weaviate/acceptance_test -args -server-port=8080 -server-host=localhost -api-token=blah -api-key=blah
```
| Add some ruminations about how Things, Actions and Keys relate | Add some ruminations about how Things, Actions and Keys relate
| Markdown | bsd-3-clause | weaviate/weaviate,weaviate/weaviate | markdown | ## Code Before:
- Based on `go-swagger` tool.
- The following files are completely generated.
- `models`/
- `restapi/`
- `restapi/server.go`
- `cmd/weaviate-server/main.go`
- The file `restapi/configure_weaviate.go` is partially automatically generated, partially hand-edited.
## Dockerized development environment
### Build and run Weaviate
```
docker build -f Dockerfile.dev -t weaviate/development .
docker run --rm -p 8080:8080 weaviate/development
```
### Build and run the acceptance tests
```
docker build -f Dockerfile.dev --target acceptance_test -t weaviate/acceptance_test .
docker run --net=host --rm weaviate/acceptance_test -args -server-port=8080 -server-host=localhost -api-token=blah -api-key=blah
```
## Instruction:
Add some ruminations about how Things, Actions and Keys relate
## Code After:
- Based on `go-swagger` tool.
- The following files are completely generated.
- `models`/
- `restapi/`
- `restapi/server.go`
- `cmd/weaviate-server/main.go`
- The file `restapi/configure_weaviate.go` is partially automatically generated, partially hand-edited.
## Data Model
- Weaviate stores Things, Actions and Keys.
- Keys are used both for authentication and authorization in Weaviate.
- Owners of a key can create more keys; the new key points to the parent key that is used to create the key.
- Permissions (read, write, delete, execute) are linked to a key.
- Each piece of data (e.g. Things & Actions) is associated with a Key.
## Dockerized development environment
### Build and run Weaviate
```
docker build -f Dockerfile.dev -t weaviate/development .
docker run --rm -p 8080:8080 weaviate/development
```
### Build and run the acceptance tests
```
docker build -f Dockerfile.dev --target acceptance_test -t weaviate/acceptance_test .
docker run --net=host --rm weaviate/acceptance_test -args -server-port=8080 -server-host=localhost -api-token=blah -api-key=blah
```
| - Based on `go-swagger` tool.
- The following files are completely generated.
- `models`/
- `restapi/`
- `restapi/server.go`
- `cmd/weaviate-server/main.go`
- The file `restapi/configure_weaviate.go` is partially automatically generated, partially hand-edited.
+
+ ## Data Model
+ - Weaviate stores Things, Actions and Keys.
+ - Keys are used both for authentication and authorization in Weaviate.
+ - Owners of a key can create more keys; the new key points to the parent key that is used to create the key.
+ - Permissions (read, write, delete, execute) are linked to a key.
+ - Each piece of data (e.g. Things & Actions) is associated with a Key.
## Dockerized development environment
### Build and run Weaviate
```
docker build -f Dockerfile.dev -t weaviate/development .
docker run --rm -p 8080:8080 weaviate/development
```
### Build and run the acceptance tests
```
docker build -f Dockerfile.dev --target acceptance_test -t weaviate/acceptance_test .
docker run --net=host --rm weaviate/acceptance_test -args -server-port=8080 -server-host=localhost -api-token=blah -api-key=blah
``` | 7 | 0.291667 | 7 | 0 |
3b7da8cd1e2f77d73cc19533fc657ba10a80c8cd | include/tgbot/export.h | include/tgbot/export.h | #ifdef TGBOT_DLL
#if defined _WIN32 || defined __CYGWIN__
#define TGBOT_HELPER_DLL_IMPORT __declspec(dllimport)
#define TGBOT_HELPER_DLL_EXPORT __declspec(dllexport)
#else
#if __GNUC__ >= 4
#define TGBOT_HELPER_DLL_IMPORT __attribute__ ((visibility ("default")))
#define TGBOT_HELPER_DLL_EXPORT __attribute__ ((visibility ("default")))
#else
#define TGBOT_HELPER_DLL_IMPORT
#define TGBOT_HELPER_DLL_EXPORT
#endif
#endif
#ifdef TgBot_EXPORTS
#define TGBOT_API TGBOT_HELPER_DLL_EXPORT
#else
#define FOX_API TGBOT_HELPER_DLL_IMPORT
#endif
#else
#define TGBOT_API
#endif
#endif
#endif //TGBOT_EXPORT_H
| #ifdef TGBOT_DLL
#if defined _WIN32 || defined __CYGWIN__
#define TGBOT_HELPER_DLL_EXPORT __declspec(dllexport)
#define TGBOT_HELPER_DLL_IMPORT __declspec(dllimport)
#else
#if __GNUC__ >= 4
#define TGBOT_HELPER_DLL_EXPORT __attribute__ ((visibility ("default")))
#define TGBOT_HELPER_DLL_IMPORT __attribute__ ((visibility ("default")))
#else
#define TGBOT_HELPER_DLL_EXPORT
#define TGBOT_HELPER_DLL_IMPORT
#endif
#endif
#ifdef TgBot_EXPORTS
#define TGBOT_API TGBOT_HELPER_DLL_EXPORT
#else
#define TGBOT_API TGBOT_HELPER_DLL_IMPORT
#endif
#else
#define TGBOT_API
#endif
#endif
#endif //TGBOT_EXPORT_H
| Fix mistake FOX and reorder EXPORT/IMPORT | Fix mistake FOX and reorder EXPORT/IMPORT
| C | mit | reo7sp/tgbot-cpp,reo7sp/tgbot-cpp,reo7sp/tgbot-cpp | c | ## Code Before:
#ifdef TGBOT_DLL
#if defined _WIN32 || defined __CYGWIN__
#define TGBOT_HELPER_DLL_IMPORT __declspec(dllimport)
#define TGBOT_HELPER_DLL_EXPORT __declspec(dllexport)
#else
#if __GNUC__ >= 4
#define TGBOT_HELPER_DLL_IMPORT __attribute__ ((visibility ("default")))
#define TGBOT_HELPER_DLL_EXPORT __attribute__ ((visibility ("default")))
#else
#define TGBOT_HELPER_DLL_IMPORT
#define TGBOT_HELPER_DLL_EXPORT
#endif
#endif
#ifdef TgBot_EXPORTS
#define TGBOT_API TGBOT_HELPER_DLL_EXPORT
#else
#define FOX_API TGBOT_HELPER_DLL_IMPORT
#endif
#else
#define TGBOT_API
#endif
#endif
#endif //TGBOT_EXPORT_H
## Instruction:
Fix mistake FOX and reorder EXPORT/IMPORT
## Code After:
#ifdef TGBOT_DLL
#if defined _WIN32 || defined __CYGWIN__
#define TGBOT_HELPER_DLL_EXPORT __declspec(dllexport)
#define TGBOT_HELPER_DLL_IMPORT __declspec(dllimport)
#else
#if __GNUC__ >= 4
#define TGBOT_HELPER_DLL_EXPORT __attribute__ ((visibility ("default")))
#define TGBOT_HELPER_DLL_IMPORT __attribute__ ((visibility ("default")))
#else
#define TGBOT_HELPER_DLL_EXPORT
#define TGBOT_HELPER_DLL_IMPORT
#endif
#endif
#ifdef TgBot_EXPORTS
#define TGBOT_API TGBOT_HELPER_DLL_EXPORT
#else
#define TGBOT_API TGBOT_HELPER_DLL_IMPORT
#endif
#else
#define TGBOT_API
#endif
#endif
#endif //TGBOT_EXPORT_H
| #ifdef TGBOT_DLL
#if defined _WIN32 || defined __CYGWIN__
+ #define TGBOT_HELPER_DLL_EXPORT __declspec(dllexport)
#define TGBOT_HELPER_DLL_IMPORT __declspec(dllimport)
- #define TGBOT_HELPER_DLL_EXPORT __declspec(dllexport)
#else
#if __GNUC__ >= 4
+ #define TGBOT_HELPER_DLL_EXPORT __attribute__ ((visibility ("default")))
#define TGBOT_HELPER_DLL_IMPORT __attribute__ ((visibility ("default")))
- #define TGBOT_HELPER_DLL_EXPORT __attribute__ ((visibility ("default")))
#else
+ #define TGBOT_HELPER_DLL_EXPORT
#define TGBOT_HELPER_DLL_IMPORT
- #define TGBOT_HELPER_DLL_EXPORT
#endif
#endif
#ifdef TgBot_EXPORTS
#define TGBOT_API TGBOT_HELPER_DLL_EXPORT
#else
- #define FOX_API TGBOT_HELPER_DLL_IMPORT
? ^ ^
+ #define TGBOT_API TGBOT_HELPER_DLL_IMPORT
? ^^^ ^
#endif
#else
#define TGBOT_API
#endif
#endif
#endif //TGBOT_EXPORT_H | 8 | 0.333333 | 4 | 4 |
75615b2328e521b6bb37321d1cd7dc75c4d3bfef | hecate/core/topology/border.py | hecate/core/topology/border.py | from hecate.core.topology.mixins import DimensionsMixin
class Border(DimensionsMixin):
"""
Base class for all types of borders.
"""
def __init__(self):
self.topology = None
class TorusBorder(Border):
supported_dimensions = list(range(1, 100))
def wrap_coords(self, coord_prefix):
code = ""
for i in range(self.dimensions):
code += "{x}{i} %= {w}{i};\n".format(
x=coord_prefix, i=i,
w=self.topology.lattice.width_prefix
)
return code
| from hecate.core.topology.mixins import DimensionsMixin
class Border(DimensionsMixin):
"""
Base class for all types of borders.
"""
def __init__(self):
self.topology = None
class TorusBorder(Border):
supported_dimensions = list(range(1, 100))
def wrap_coords(self, coord_prefix):
code = ""
for i in range(self.dimensions):
code += "{x}{i} = ({x}{i} + {w}{i}) % {w}{i};\n".format(
x=coord_prefix, i=i,
w=self.topology.lattice.width_prefix
)
return code
| Fix incorrect TorusBorder wrapping in negative direction | Fix incorrect TorusBorder wrapping in negative direction
| Python | mit | a5kin/hecate,a5kin/hecate | python | ## Code Before:
from hecate.core.topology.mixins import DimensionsMixin
class Border(DimensionsMixin):
"""
Base class for all types of borders.
"""
def __init__(self):
self.topology = None
class TorusBorder(Border):
supported_dimensions = list(range(1, 100))
def wrap_coords(self, coord_prefix):
code = ""
for i in range(self.dimensions):
code += "{x}{i} %= {w}{i};\n".format(
x=coord_prefix, i=i,
w=self.topology.lattice.width_prefix
)
return code
## Instruction:
Fix incorrect TorusBorder wrapping in negative direction
## Code After:
from hecate.core.topology.mixins import DimensionsMixin
class Border(DimensionsMixin):
"""
Base class for all types of borders.
"""
def __init__(self):
self.topology = None
class TorusBorder(Border):
supported_dimensions = list(range(1, 100))
def wrap_coords(self, coord_prefix):
code = ""
for i in range(self.dimensions):
code += "{x}{i} = ({x}{i} + {w}{i}) % {w}{i};\n".format(
x=coord_prefix, i=i,
w=self.topology.lattice.width_prefix
)
return code
| from hecate.core.topology.mixins import DimensionsMixin
class Border(DimensionsMixin):
"""
Base class for all types of borders.
"""
def __init__(self):
self.topology = None
class TorusBorder(Border):
supported_dimensions = list(range(1, 100))
def wrap_coords(self, coord_prefix):
code = ""
for i in range(self.dimensions):
- code += "{x}{i} %= {w}{i};\n".format(
? -
+ code += "{x}{i} = ({x}{i} + {w}{i}) % {w}{i};\n".format(
? ++++++++++++++++++++
x=coord_prefix, i=i,
w=self.topology.lattice.width_prefix
)
return code | 2 | 0.086957 | 1 | 1 |
0ead9861a0d676a5564a6dcc6dd049edfa137954 | src/Http/Controllers/Auth/ApiAuthController.php | src/Http/Controllers/Auth/ApiAuthController.php | <?php
namespace Bishopm\Connexion\Http\Controllers\Auth;
use JWTAuth;
use Tymon\JWTAuth\Exceptions\JWTException;
use App\Http\Controllers\Controller;
use Illuminate\Support\Facades\Log;
class ApiAuthController extends Controller
{
public function login(Request $request)
{
// grab credentials from the request
$credentials = $request->only('name', 'password');
Log::info('API login attempt: ' . $credentials);
try {
// attempt to verify the credentials and create a token for the user
if (! $token = JWTAuth::attempt($credentials)) {
return response()->json(['error' => 'invalid_credentials'], 401);
}
} catch (JWTException $e) {
// something went wrong whilst attempting to encode the token
return response()->json(['error' => 'could_not_create_token'], 500);
}
// all good so return the token
return response()->json(compact('token'));
}
}
| <?php
namespace Bishopm\Connexion\Http\Controllers\Auth;
use JWTAuth;
use Tymon\JWTAuth\Exceptions\JWTException;
use App\Http\Controllers\Controller;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Log;
class ApiAuthController extends Controller
{
public function login(Request $request)
{
// grab credentials from the request
$credentials = $request->only('name', 'password');
Log::info('API login attempt: ' . $credentials);
try {
// attempt to verify the credentials and create a token for the user
if (! $token = JWTAuth::attempt($credentials)) {
return response()->json(['error' => 'invalid_credentials'], 401);
}
} catch (JWTException $e) {
// something went wrong whilst attempting to encode the token
return response()->json(['error' => 'could_not_create_token'], 500);
}
// all good so return the token
return response()->json(compact('token'));
}
}
| Add request to API auth | Add request to API auth
| PHP | mit | bishopm/base,bishopm/connexion,bishopm/connexion,bishopm/connexion,bishopm/base,bishopm/base | php | ## Code Before:
<?php
namespace Bishopm\Connexion\Http\Controllers\Auth;
use JWTAuth;
use Tymon\JWTAuth\Exceptions\JWTException;
use App\Http\Controllers\Controller;
use Illuminate\Support\Facades\Log;
class ApiAuthController extends Controller
{
public function login(Request $request)
{
// grab credentials from the request
$credentials = $request->only('name', 'password');
Log::info('API login attempt: ' . $credentials);
try {
// attempt to verify the credentials and create a token for the user
if (! $token = JWTAuth::attempt($credentials)) {
return response()->json(['error' => 'invalid_credentials'], 401);
}
} catch (JWTException $e) {
// something went wrong whilst attempting to encode the token
return response()->json(['error' => 'could_not_create_token'], 500);
}
// all good so return the token
return response()->json(compact('token'));
}
}
## Instruction:
Add request to API auth
## Code After:
<?php
namespace Bishopm\Connexion\Http\Controllers\Auth;
use JWTAuth;
use Tymon\JWTAuth\Exceptions\JWTException;
use App\Http\Controllers\Controller;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Log;
class ApiAuthController extends Controller
{
public function login(Request $request)
{
// grab credentials from the request
$credentials = $request->only('name', 'password');
Log::info('API login attempt: ' . $credentials);
try {
// attempt to verify the credentials and create a token for the user
if (! $token = JWTAuth::attempt($credentials)) {
return response()->json(['error' => 'invalid_credentials'], 401);
}
} catch (JWTException $e) {
// something went wrong whilst attempting to encode the token
return response()->json(['error' => 'could_not_create_token'], 500);
}
// all good so return the token
return response()->json(compact('token'));
}
}
| <?php
namespace Bishopm\Connexion\Http\Controllers\Auth;
use JWTAuth;
use Tymon\JWTAuth\Exceptions\JWTException;
use App\Http\Controllers\Controller;
+ use Illuminate\Http\Request;
use Illuminate\Support\Facades\Log;
class ApiAuthController extends Controller
{
public function login(Request $request)
{
// grab credentials from the request
$credentials = $request->only('name', 'password');
Log::info('API login attempt: ' . $credentials);
try {
// attempt to verify the credentials and create a token for the user
if (! $token = JWTAuth::attempt($credentials)) {
return response()->json(['error' => 'invalid_credentials'], 401);
}
} catch (JWTException $e) {
// something went wrong whilst attempting to encode the token
return response()->json(['error' => 'could_not_create_token'], 500);
}
// all good so return the token
return response()->json(compact('token'));
}
} | 1 | 0.033333 | 1 | 0 |
14329daf571400812594c0388eac87538cd10079 | denim/api.py | denim/api.py | from fabric import api as __api
# Setup some default values.
__api.env.deploy_user = 'webapps'
from denim.paths import (cd_deploy, cd_package, deploy_path, package_path)
from denim import (scm, service, system, virtualenv, webserver)
from denim.decorators import deploy_env
@__api.task(name="help")
def show_help():
"""
Help on common operations.
"""
from denim.environment import get_environments
import denim
print """
Common operations with Denim (%(version)s).
Provision server:
> fab {%(environments)s} init
Deploy (require a source control revision to be supplied. i.e. master):
> fab {%(environments)s} deploy:{revision}
Status of service:
> fab {%(environments)s} service.status
""" % {
'environments': '|'.join(get_environments()),
'version': denim.__version__,
}
@__api.task
def environment():
"""
Environments defined in fabfile.
"""
from denim.environment import get_environments
print 'Environments defined in fab file:'
print ', '.join(get_environments())
| from fabric import api as _api
# Setup some default values.
_api.env.deploy_user = 'webapps'
from denim.paths import (cd_deploy, cd_application, deploy_path, application_path)
from denim import (scm, service, system, virtualenv, webserver)
from denim.decorators import deploy_env
# Pending deprecation
from denim.paths import (cd_package, package_path)
@_api.task(name="help")
def show_help():
"""
Help on common operations.
"""
from denim.environment import get_environments
import denim
print """
Common operations with Denim (%(version)s).
Provision server:
> fab {%(environments)s} init
Deploy (require a source control revision to be supplied. i.e. master):
> fab {%(environments)s} deploy:{revision}
Status of service:
> fab {%(environments)s} service.status
""" % {
'environments': '|'.join(get_environments()),
'version': denim.__version__,
}
@_api.task
def environments():
"""
Environments defined in fabfile.
"""
from denim.environment import get_environments
print 'Environments defined in fab file:'
print ', '.join(get_environments())
| Break out items pending deprecation, remove double underscores | Break out items pending deprecation, remove double underscores
| Python | bsd-2-clause | timsavage/denim | python | ## Code Before:
from fabric import api as __api
# Setup some default values.
__api.env.deploy_user = 'webapps'
from denim.paths import (cd_deploy, cd_package, deploy_path, package_path)
from denim import (scm, service, system, virtualenv, webserver)
from denim.decorators import deploy_env
@__api.task(name="help")
def show_help():
"""
Help on common operations.
"""
from denim.environment import get_environments
import denim
print """
Common operations with Denim (%(version)s).
Provision server:
> fab {%(environments)s} init
Deploy (require a source control revision to be supplied. i.e. master):
> fab {%(environments)s} deploy:{revision}
Status of service:
> fab {%(environments)s} service.status
""" % {
'environments': '|'.join(get_environments()),
'version': denim.__version__,
}
@__api.task
def environment():
"""
Environments defined in fabfile.
"""
from denim.environment import get_environments
print 'Environments defined in fab file:'
print ', '.join(get_environments())
## Instruction:
Break out items pending deprecation, remove double underscores
## Code After:
from fabric import api as _api
# Setup some default values.
_api.env.deploy_user = 'webapps'
from denim.paths import (cd_deploy, cd_application, deploy_path, application_path)
from denim import (scm, service, system, virtualenv, webserver)
from denim.decorators import deploy_env
# Pending deprecation
from denim.paths import (cd_package, package_path)
@_api.task(name="help")
def show_help():
"""
Help on common operations.
"""
from denim.environment import get_environments
import denim
print """
Common operations with Denim (%(version)s).
Provision server:
> fab {%(environments)s} init
Deploy (require a source control revision to be supplied. i.e. master):
> fab {%(environments)s} deploy:{revision}
Status of service:
> fab {%(environments)s} service.status
""" % {
'environments': '|'.join(get_environments()),
'version': denim.__version__,
}
@_api.task
def environments():
"""
Environments defined in fabfile.
"""
from denim.environment import get_environments
print 'Environments defined in fab file:'
print ', '.join(get_environments())
| - from fabric import api as __api
? -
+ from fabric import api as _api
# Setup some default values.
- __api.env.deploy_user = 'webapps'
? -
+ _api.env.deploy_user = 'webapps'
- from denim.paths import (cd_deploy, cd_package, deploy_path, package_path)
? ^^^^^ ^^^^^
+ from denim.paths import (cd_deploy, cd_application, deploy_path, application_path)
? + ++++ ^^^^ + ++++ ^^^^
from denim import (scm, service, system, virtualenv, webserver)
from denim.decorators import deploy_env
+ # Pending deprecation
+ from denim.paths import (cd_package, package_path)
- @__api.task(name="help")
? -
+ @_api.task(name="help")
def show_help():
"""
Help on common operations.
"""
from denim.environment import get_environments
import denim
print """
Common operations with Denim (%(version)s).
Provision server:
> fab {%(environments)s} init
Deploy (require a source control revision to be supplied. i.e. master):
> fab {%(environments)s} deploy:{revision}
Status of service:
> fab {%(environments)s} service.status
""" % {
'environments': '|'.join(get_environments()),
'version': denim.__version__,
}
- @__api.task
? -
+ @_api.task
- def environment():
+ def environments():
? +
"""
Environments defined in fabfile.
"""
from denim.environment import get_environments
print 'Environments defined in fab file:'
print ', '.join(get_environments()) | 14 | 0.318182 | 8 | 6 |
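The denim change above parks `cd_package` and `package_path` under a "Pending deprecation" comment. A minimal stdlib sketch of how such a shim can actively warn callers before delegating to the new name (illustrative only; the fixed `/srv/app` return value is an assumption, not denim's code):

```python
import warnings


def application_path():
    """New, preferred name."""
    return "/srv/app"


def package_path():
    """Old name kept as a shim; warns, then delegates (illustrative only)."""
    warnings.warn(
        "package_path() is deprecated; use application_path() instead",
        DeprecationWarning,
        stacklevel=2,
    )
    return application_path()
```

Callers of the old name keep working but now emit a `DeprecationWarning`, which test runners can surface before the alias is finally removed.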
a1318a5ced6efc4ae88abc0b23190daea5899704 | open_humans/serializers.py | open_humans/serializers.py | from django.contrib.auth.models import User
from django.core.urlresolvers import reverse
from rest_framework import serializers
class ProfileSerializer(serializers.ModelSerializer):
url = serializers.SerializerMethodField('get_profile_url')
class Meta:
model = User
fields = ('id', 'url', 'username')
def get_profile_url(self, obj):
return reverse('member_profile', args=(obj.id,))
| from django.contrib.auth.models import User
# from django.core.urlresolvers import reverse
from rest_framework import serializers
class ProfileSerializer(serializers.ModelSerializer):
# url = serializers.SerializerMethodField('get_profile_url')
message = serializers.SerializerMethodField('get_message')
class Meta:
model = User
# fields = ('id', 'url', 'username')
fields = ('message',)
# def get_profile_url(self, obj):
# return reverse('member_profile', args=(obj.id,))
def get_message(self, obj):
return 'profiles are not yet implemented'
| Make /api/profile return no private data | Make /api/profile return no private data
| Python | mit | OpenHumans/open-humans,PersonalGenomesOrg/open-humans,PersonalGenomesOrg/open-humans,PersonalGenomesOrg/open-humans,OpenHumans/open-humans,OpenHumans/open-humans,OpenHumans/open-humans,PersonalGenomesOrg/open-humans | python | ## Code Before:
from django.contrib.auth.models import User
from django.core.urlresolvers import reverse
from rest_framework import serializers
class ProfileSerializer(serializers.ModelSerializer):
url = serializers.SerializerMethodField('get_profile_url')
class Meta:
model = User
fields = ('id', 'url', 'username')
def get_profile_url(self, obj):
return reverse('member_profile', args=(obj.id,))
## Instruction:
Make /api/profile return no private data
## Code After:
from django.contrib.auth.models import User
# from django.core.urlresolvers import reverse
from rest_framework import serializers
class ProfileSerializer(serializers.ModelSerializer):
# url = serializers.SerializerMethodField('get_profile_url')
message = serializers.SerializerMethodField('get_message')
class Meta:
model = User
# fields = ('id', 'url', 'username')
fields = ('message',)
# def get_profile_url(self, obj):
# return reverse('member_profile', args=(obj.id,))
def get_message(self, obj):
return 'profiles are not yet implemented'
| from django.contrib.auth.models import User
- from django.core.urlresolvers import reverse
+ # from django.core.urlresolvers import reverse
? ++
from rest_framework import serializers
class ProfileSerializer(serializers.ModelSerializer):
- url = serializers.SerializerMethodField('get_profile_url')
+ # url = serializers.SerializerMethodField('get_profile_url')
? ++
+ message = serializers.SerializerMethodField('get_message')
class Meta:
model = User
- fields = ('id', 'url', 'username')
+ # fields = ('id', 'url', 'username')
? ++
+ fields = ('message',)
- def get_profile_url(self, obj):
+ # def get_profile_url(self, obj):
? ++
- return reverse('member_profile', args=(obj.id,))
+ # return reverse('member_profile', args=(obj.id,))
? ++
+
+ def get_message(self, obj):
+ return 'profiles are not yet implemented' | 15 | 1.071429 | 10 | 5 |
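The serializer diff above relies on DRF's `SerializerMethodField`, which fills each declared field by calling a named getter on the serializer. A toy plain-Python mimic of that dispatch (not DRF internals; only the field name and the `get_<name>` convention come from the record):

```python
class MethodFieldSerializer:
    """Toy stand-in for SerializerMethodField dispatch (not real DRF)."""

    fields = ("message",)

    def to_representation(self, obj):
        # For each declared field, call get_<field>(obj), mirroring the
        # naming convention DRF uses for SerializerMethodField getters.
        return {name: getattr(self, "get_" + name)(obj) for name in self.fields}

    def get_message(self, obj):
        return "profiles are not yet implemented"


data = MethodFieldSerializer().to_representation(object())
```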
9d05f18dcb4b52c1d4e68f53f24e5ccebab10a58 | bot/models.py | bot/models.py | from sqlalchemy import create_engine, Column, String
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
def db_connect():
"""
Performs database connection
Returns sqlalchemy engine instance
"""
return create_engine('postgres://avvcurseaphtxf:X0466JySVtLq6nyq_5pb7BQNjR@'
'ec2-54-227-250-80.compute-1.amazonaws.com'
':5432/d7do67r1b7t1nn', echo=False)
def create_battletag_table(engine):
Base.metadata.create_all(engine)
class Battletags(Base):
"""
Table to store user battletags
"""
__tablename__ = 'Battletags'
disc_name = Column(String, primary_key=True)
battletag = Column(String, unique=True)
| from sqlalchemy import create_engine, Column, String
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
def db_connect():
"""
Performs database connection
Returns sqlalchemy engine instance
"""
return create_engine('postgres://fbcmeskynsvati:aURfAdENt6-kumO0j224GuXRWH'
'@ec2-54-221-235-135.compute-1.amazonaws.com'
':5432/d2cc1tb2t1iges', echo=False)
def create_battletag_table(engine):
Base.metadata.create_all(engine)
class Battletags(Base):
"""
Table to store user battletags
"""
__tablename__ = 'Battletags'
disc_name = Column(String, primary_key=True)
battletag = Column(String, unique=True)
| Change database url for create_engine() | Change database url for create_engine()
| Python | mit | alexbotello/BastionBot | python | ## Code Before:
from sqlalchemy import create_engine, Column, String
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
def db_connect():
"""
Performs database connection
Returns sqlalchemy engine instance
"""
return create_engine('postgres://avvcurseaphtxf:X0466JySVtLq6nyq_5pb7BQNjR@'
'ec2-54-227-250-80.compute-1.amazonaws.com'
':5432/d7do67r1b7t1nn', echo=False)
def create_battletag_table(engine):
Base.metadata.create_all(engine)
class Battletags(Base):
"""
Table to store user battletags
"""
__tablename__ = 'Battletags'
disc_name = Column(String, primary_key=True)
battletag = Column(String, unique=True)
## Instruction:
Change database url for create_engine()
## Code After:
from sqlalchemy import create_engine, Column, String
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
def db_connect():
"""
Performs database connection
Returns sqlalchemy engine instance
"""
return create_engine('postgres://fbcmeskynsvati:aURfAdENt6-kumO0j224GuXRWH'
'@ec2-54-221-235-135.compute-1.amazonaws.com'
':5432/d2cc1tb2t1iges', echo=False)
def create_battletag_table(engine):
Base.metadata.create_all(engine)
class Battletags(Base):
"""
Table to store user battletags
"""
__tablename__ = 'Battletags'
disc_name = Column(String, primary_key=True)
battletag = Column(String, unique=True)
| from sqlalchemy import create_engine, Column, String
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
def db_connect():
"""
Performs database connection
Returns sqlalchemy engine instance
"""
- return create_engine('postgres://avvcurseaphtxf:X0466JySVtLq6nyq_5pb7BQNjR@'
+ return create_engine('postgres://fbcmeskynsvati:aURfAdENt6-kumO0j224GuXRWH'
- 'ec2-54-227-250-80.compute-1.amazonaws.com'
? ^ - ^^
+ '@ec2-54-221-235-135.compute-1.amazonaws.com'
? + ^ + ^^^
- ':5432/d7do67r1b7t1nn', echo=False)
? ^^^^^^ ^ ^^
+ ':5432/d2cc1tb2t1iges', echo=False)
? ^^^ + ^ ^^^^
def create_battletag_table(engine):
Base.metadata.create_all(engine)
class Battletags(Base):
"""
Table to store user battletags
"""
__tablename__ = 'Battletags'
disc_name = Column(String, primary_key=True)
battletag = Column(String, unique=True) | 6 | 0.2 | 3 | 3 |
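The commit above rotates a hard-coded Heroku Postgres URL inside `db_connect()`. One common alternative (an assumption here, not something this bot's repo does) is to read the URL from the environment so credential rotations need no code change; a stdlib sketch with the actual `create_engine` call left out:

```python
import os

# Hypothetical development fallback; real deployments would set DATABASE_URL.
DEFAULT_DB_URL = "sqlite:///local.db"


def database_url():
    """Return the connection URL from the environment, else a dev fallback."""
    return os.environ.get("DATABASE_URL", DEFAULT_DB_URL)

# In real code this value would be passed to sqlalchemy.create_engine(...).
```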
fda24cbf96c0456c80560a6c7903301fa6110832 | requirements.txt | requirements.txt | pbr>=1.6 # Apache-2.0
six>=1.9.0 # MIT
Babel!=2.3.0,!=2.3.1,!=2.3.2,!=2.3.3,!=2.4.0,>=1.3 # BSD
cliff!=1.16.0,!=1.17.0,>=1.15.0 # Apache-2.0
eclsdk>=0.0.1 # Apache-2.0
keystoneauth1<=3.4.0,>=2.1.0 # Apache-2.0
os-client-config>=1.13.1 # Apache-2.0
oslo.config>=3.7.0 # Apache-2.0
oslo.i18n>=2.1.0 # Apache-2.0
oslo.utils>=3.5.0 # Apache-2.0
python-glanceclient>=2.0.0 # Apache-2.0
python-keystoneclient!=1.8.0,!=2.1.0,>=1.6.0 # Apache-2.0
python-novaclient!=2.33.0,>=2.29.0,<=9.1.0 # Apache-2.0
python-cinderclient>=1.3.1 # Apache-2.0
requests!=2.9.0,>=2.8.1 # Apache-2.0
stevedore>=1.5.0 # Apache-2.0
json-merge-patch>=0.2
| pbr>=1.6 # Apache-2.0
six>=1.9.0 # MIT
Babel!=2.3.0,!=2.3.1,!=2.3.2,!=2.3.3,!=2.4.0,>=1.3 # BSD
cliff!=1.16.0,!=1.17.0,>=1.15.0 # Apache-2.0
eclsdk>=0.0.1 # Apache-2.0
keystoneauth1<=3.4.0,>=2.1.0 # Apache-2.0
openstacksdk<=0.13.0 # Apache-2.0
os-client-config>=1.13.1 # Apache-2.0
oslo.config>=3.7.0 # Apache-2.0
oslo.i18n>=2.1.0 # Apache-2.0
oslo.utils>=3.5.0 # Apache-2.0
python-glanceclient>=2.0.0,<=2.11.0 # Apache-2.0
python-keystoneclient!=1.8.0,!=2.1.0,>=1.6.0 # Apache-2.0
python-novaclient!=2.33.0,>=2.29.0,<=9.1.0 # Apache-2.0
python-cinderclient>=1.3.1 # Apache-2.0
requests!=2.9.0,>=2.8.1 # Apache-2.0
stevedore>=1.5.0 # Apache-2.0
json-merge-patch>=0.2
| Fix openstacksdk and python-glanceclient versions | Fix openstacksdk and python-glanceclient versions
| Text | apache-2.0 | nttcom/eclcli | text | ## Code Before:
pbr>=1.6 # Apache-2.0
six>=1.9.0 # MIT
Babel!=2.3.0,!=2.3.1,!=2.3.2,!=2.3.3,!=2.4.0,>=1.3 # BSD
cliff!=1.16.0,!=1.17.0,>=1.15.0 # Apache-2.0
eclsdk>=0.0.1 # Apache-2.0
keystoneauth1<=3.4.0,>=2.1.0 # Apache-2.0
os-client-config>=1.13.1 # Apache-2.0
oslo.config>=3.7.0 # Apache-2.0
oslo.i18n>=2.1.0 # Apache-2.0
oslo.utils>=3.5.0 # Apache-2.0
python-glanceclient>=2.0.0 # Apache-2.0
python-keystoneclient!=1.8.0,!=2.1.0,>=1.6.0 # Apache-2.0
python-novaclient!=2.33.0,>=2.29.0,<=9.1.0 # Apache-2.0
python-cinderclient>=1.3.1 # Apache-2.0
requests!=2.9.0,>=2.8.1 # Apache-2.0
stevedore>=1.5.0 # Apache-2.0
json-merge-patch>=0.2
## Instruction:
Fix openstacksdk and python-glanceclient versions
## Code After:
pbr>=1.6 # Apache-2.0
six>=1.9.0 # MIT
Babel!=2.3.0,!=2.3.1,!=2.3.2,!=2.3.3,!=2.4.0,>=1.3 # BSD
cliff!=1.16.0,!=1.17.0,>=1.15.0 # Apache-2.0
eclsdk>=0.0.1 # Apache-2.0
keystoneauth1<=3.4.0,>=2.1.0 # Apache-2.0
openstacksdk<=0.13.0 # Apache-2.0
os-client-config>=1.13.1 # Apache-2.0
oslo.config>=3.7.0 # Apache-2.0
oslo.i18n>=2.1.0 # Apache-2.0
oslo.utils>=3.5.0 # Apache-2.0
python-glanceclient>=2.0.0,<=2.11.0 # Apache-2.0
python-keystoneclient!=1.8.0,!=2.1.0,>=1.6.0 # Apache-2.0
python-novaclient!=2.33.0,>=2.29.0,<=9.1.0 # Apache-2.0
python-cinderclient>=1.3.1 # Apache-2.0
requests!=2.9.0,>=2.8.1 # Apache-2.0
stevedore>=1.5.0 # Apache-2.0
json-merge-patch>=0.2
| pbr>=1.6 # Apache-2.0
six>=1.9.0 # MIT
Babel!=2.3.0,!=2.3.1,!=2.3.2,!=2.3.3,!=2.4.0,>=1.3 # BSD
cliff!=1.16.0,!=1.17.0,>=1.15.0 # Apache-2.0
eclsdk>=0.0.1 # Apache-2.0
keystoneauth1<=3.4.0,>=2.1.0 # Apache-2.0
+ openstacksdk<=0.13.0 # Apache-2.0
os-client-config>=1.13.1 # Apache-2.0
oslo.config>=3.7.0 # Apache-2.0
oslo.i18n>=2.1.0 # Apache-2.0
oslo.utils>=3.5.0 # Apache-2.0
- python-glanceclient>=2.0.0 # Apache-2.0
+ python-glanceclient>=2.0.0,<=2.11.0 # Apache-2.0
? +++++++++
python-keystoneclient!=1.8.0,!=2.1.0,>=1.6.0 # Apache-2.0
python-novaclient!=2.33.0,>=2.29.0,<=9.1.0 # Apache-2.0
python-cinderclient>=1.3.1 # Apache-2.0
requests!=2.9.0,>=2.8.1 # Apache-2.0
stevedore>=1.5.0 # Apache-2.0
json-merge-patch>=0.2 | 3 | 0.166667 | 2 | 1 |
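The requirements change above tightens bounds such as `python-glanceclient>=2.0.0,<=2.11.0`. A tiny stdlib checker for whether a plain numeric version satisfies such comma-separated specifiers (deliberately simplified, nowhere near PEP 440's full grammar):

```python
import re


def parse_version(text):
    """Parse '2.11.0' into (2, 11, 0); numeric dot-separated versions only."""
    return tuple(int(part) for part in text.split("."))


def satisfies(version, spec):
    """True if version meets every clause in e.g. '>=2.0.0,<=2.11.0'."""
    ops = {">=": lambda a, b: a >= b,
           "<=": lambda a, b: a <= b,
           "!=": lambda a, b: a != b}
    current = parse_version(version)
    for clause in spec.split(","):
        match = re.match(r"(>=|<=|!=)(.+)", clause.strip())
        op, wanted = match.group(1), parse_version(match.group(2))
        if not ops[op](current, wanted):
            return False
    return True
```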
1a71a91e9b143625d5863c21dc9de039e38541a2 | src/user/components/online_payments.js | src/user/components/online_payments.js | import React from 'react'
import CreaditCardPayment from './credit_card_payment.js'
import Paypal from './paypal.js'
export default (props) => {
return props.user_payments.payment_sent
? SuccessfulPayment(props)
: (props.user_payments.braintree_error ? Error() : PaymentOptions(props))
}
const Error = () =>
<div>There has been a network error. Please refresh the browser.</div>
const PaymentOptions = (props) =>
<div className='make-payment'>
<h1 className='title'>Online Payment</h1>
<h3 className='subtitle'>If you would prefer to pay by PayPal</h3>
<h3 className='subtitle'>Alternatively pay by card</h3>
<CreaditCardPayment {...props} />
</div>
// <Paypal {...props} />
const SuccessfulPayment = ({ user_payments }) =>
<div className='make-payment'>
<h1 className='title'>Successful Payment</h1>
<h3 className='subtitle'>
Thank you for your payment of £{user_payments.amount_entered}.
Your reference for that payment is {user_payments.payment_sent.reference}.
</h3>
</div>
| import React from 'react'
import CreaditCardPayment from './credit_card_payment.js'
import Paypal from './paypal.js'
export default (props) => {
return props.user_payments.payment_sent
? SuccessfulPayment(props)
: (props.user_payments.braintree_error ? Error() : PaymentOptions(props))
}
const Error = () =>
<div>There has been a network error. Please refresh the browser.</div>
const PaymentOptions = (props) =>
<div className='make-payment'>
<h1 className='title'>Online Payment</h1>
<h3 className='subtitle'>Pay using PayPal</h3>
<Paypal {...props} />
<h3 className='subtitle'>Alternatively pay by card</h3>
<CreaditCardPayment {...props} />
</div>
const SuccessfulPayment = ({ user_payments }) =>
<div className='make-payment'>
<h1 className='title'>Successful Payment</h1>
<h3 className='subtitle'>
Thank you for your payment of £{user_payments.amount_entered}.
Your reference for that payment is {user_payments.payment_sent.reference}.
</h3>
</div>
| Add paypal component to online payments. | Add paypal component to online payments.
| JavaScript | mit | foundersandcoders/sail-back,foundersandcoders/sail-back | javascript | ## Code Before:
import React from 'react'
import CreaditCardPayment from './credit_card_payment.js'
import Paypal from './paypal.js'
export default (props) => {
return props.user_payments.payment_sent
? SuccessfulPayment(props)
: (props.user_payments.braintree_error ? Error() : PaymentOptions(props))
}
const Error = () =>
<div>There has been a network error. Please refresh the browser.</div>
const PaymentOptions = (props) =>
<div className='make-payment'>
<h1 className='title'>Online Payment</h1>
<h3 className='subtitle'>If you would prefer to pay by PayPal</h3>
<h3 className='subtitle'>Alternatively pay by card</h3>
<CreaditCardPayment {...props} />
</div>
// <Paypal {...props} />
const SuccessfulPayment = ({ user_payments }) =>
<div className='make-payment'>
<h1 className='title'>Successful Payment</h1>
<h3 className='subtitle'>
Thank you for your payment of £{user_payments.amount_entered}.
Your reference for that payment is {user_payments.payment_sent.reference}.
</h3>
</div>
## Instruction:
Add paypal component to online payments.
## Code After:
import React from 'react'
import CreaditCardPayment from './credit_card_payment.js'
import Paypal from './paypal.js'
export default (props) => {
return props.user_payments.payment_sent
? SuccessfulPayment(props)
: (props.user_payments.braintree_error ? Error() : PaymentOptions(props))
}
const Error = () =>
<div>There has been a network error. Please refresh the browser.</div>
const PaymentOptions = (props) =>
<div className='make-payment'>
<h1 className='title'>Online Payment</h1>
<h3 className='subtitle'>Pay using PayPal</h3>
<Paypal {...props} />
<h3 className='subtitle'>Alternatively pay by card</h3>
<CreaditCardPayment {...props} />
</div>
const SuccessfulPayment = ({ user_payments }) =>
<div className='make-payment'>
<h1 className='title'>Successful Payment</h1>
<h3 className='subtitle'>
Thank you for your payment of £{user_payments.amount_entered}.
Your reference for that payment is {user_payments.payment_sent.reference}.
</h3>
</div>
| import React from 'react'
import CreaditCardPayment from './credit_card_payment.js'
import Paypal from './paypal.js'
export default (props) => {
return props.user_payments.payment_sent
? SuccessfulPayment(props)
: (props.user_payments.braintree_error ? Error() : PaymentOptions(props))
}
const Error = () =>
<div>There has been a network error. Please refresh the browser.</div>
const PaymentOptions = (props) =>
<div className='make-payment'>
<h1 className='title'>Online Payment</h1>
- <h3 className='subtitle'>If you would prefer to pay by PayPal</h3>
+ <h3 className='subtitle'>Pay using PayPal</h3>
+ <Paypal {...props} />
<h3 className='subtitle'>Alternatively pay by card</h3>
<CreaditCardPayment {...props} />
</div>
- // <Paypal {...props} />
const SuccessfulPayment = ({ user_payments }) =>
<div className='make-payment'>
<h1 className='title'>Successful Payment</h1>
<h3 className='subtitle'>
Thank you for your payment of £{user_payments.amount_entered}.
Your reference for that payment is {user_payments.payment_sent.reference}.
</h3>
</div> | 4 | 0.125 | 2 | 2 |
3886c04bd95f893e0889aeaf683a269b161104e2 | .azure/steps/codecov-setup.yml | .azure/steps/codecov-setup.yml | steps:
- bash: |
curl -Os https://uploader.codecov.io/latest/linux/codecov
set -ex
chmod +x codecov
./codecov
condition: and(succeeded(), eq(variables['Agent.OS'], 'Linux'))
displayName: 'Upload test result to codecov.io' | steps:
- bash: bash <(curl -s https://codecov.io/bash)
condition: and(succeeded(), eq(variables['Agent.OS'], 'Linux'))
displayName: 'Upload to codecov.io' | Rollback to prev. version of codecov.io bash | Rollback to prev. version of codecov.io bash
| YAML | mit | gimlichael/Cuemon,gimlichael/Cuemon | yaml | ## Code Before:
steps:
- bash: |
curl -Os https://uploader.codecov.io/latest/linux/codecov
set -ex
chmod +x codecov
./codecov
condition: and(succeeded(), eq(variables['Agent.OS'], 'Linux'))
displayName: 'Upload test result to codecov.io'
## Instruction:
Rollback to prev. version of codecov.io bash
## Code After:
steps:
- bash: bash <(curl -s https://codecov.io/bash)
condition: and(succeeded(), eq(variables['Agent.OS'], 'Linux'))
displayName: 'Upload to codecov.io' | steps:
+ - bash: bash <(curl -s https://codecov.io/bash)
- - bash: |
- curl -Os https://uploader.codecov.io/latest/linux/codecov
- set -ex
- chmod +x codecov
- ./codecov
condition: and(succeeded(), eq(variables['Agent.OS'], 'Linux'))
- displayName: 'Upload test result to codecov.io'
? ------------
+ displayName: 'Upload to codecov.io' | 8 | 1 | 2 | 6 |
f4f2257201b90f71d029a72531760c9da380ae8b | .travis.yml | .travis.yml | language: java
script:
- mvn clean package -B
sudo: false
notifications:
email: false
| language: java
script:
- mvn clean package -B
sudo: false
jdk: oraclejdk9
notifications:
email: false
| Declare dependency upon Java 9 for Travis CI | Declare dependency upon Java 9 for Travis CI | YAML | apache-2.0 | perdian/mp3tagtiger | yaml | ## Code Before:
language: java
script:
- mvn clean package -B
sudo: false
notifications:
email: false
## Instruction:
Declare dependency upon Java 9 for Travis CI
## Code After:
language: java
script:
- mvn clean package -B
sudo: false
jdk: oraclejdk9
notifications:
email: false
| language: java
script:
- mvn clean package -B
sudo: false
+ jdk: oraclejdk9
notifications:
email: false | 1 | 0.166667 | 1 | 0 |
7a4187aab2147778d6f77f8c5415fc618d1989d4 | src/main/java/graphql/schema/diff/DiffCategory.java | src/main/java/graphql/schema/diff/DiffCategory.java | package graphql.schema.diff;
import graphql.PublicApi;
/**
* A classification of difference events.
*/
@PublicApi
public enum DiffCategory {
/**
* The new API is missing something compared to the old API
*/
MISSING,
/**
* The new API has become stricter for existing clients than the old API
*/
STRICTER,
/**
* The new API has an invalid structure
*/
INVALID,
/**
* The new API has added something not present in the old API
*/
ADDITION,
/**
* The new API has changed something compared to the old API
*/
DIFFERENT
}
| package graphql.schema.diff;
import graphql.PublicApi;
/**
* A classification of difference events.
*/
@PublicApi
public enum DiffCategory {
/**
* The new API is missing something compared to the old API
*/
MISSING,
/**
* The new API has become stricter for existing clients than the old API
*/
STRICTER,
/**
* The new API has an invalid structure
*/
INVALID,
/**
* The new API has added something not present in the old API
*/
ADDITION,
/**
* The new API has changed something compared to the old API
*/
DIFFERENT,
/**
* The new API has deprecated something or removed something deprecated from the old API
*/
DEPRECATED
}
| Add new diff category for deprecated differences | Add new diff category for deprecated differences
| Java | mit | graphql-java/graphql-java,graphql-java/graphql-java | java | ## Code Before:
package graphql.schema.diff;
import graphql.PublicApi;
/**
* A classification of difference events.
*/
@PublicApi
public enum DiffCategory {
/**
* The new API is missing something compared to the old API
*/
MISSING,
/**
* The new API has become stricter for existing clients than the old API
*/
STRICTER,
/**
* The new API has an invalid structure
*/
INVALID,
/**
* The new API has added something not present in the old API
*/
ADDITION,
/**
* The new API has changed something compared to the old API
*/
DIFFERENT
}
## Instruction:
Add new diff category for deprecated differences
## Code After:
package graphql.schema.diff;
import graphql.PublicApi;
/**
* A classification of difference events.
*/
@PublicApi
public enum DiffCategory {
/**
* The new API is missing something compared to the old API
*/
MISSING,
/**
* The new API has become stricter for existing clients than the old API
*/
STRICTER,
/**
* The new API has an invalid structure
*/
INVALID,
/**
* The new API has added something not present in the old API
*/
ADDITION,
/**
* The new API has changed something compared to the old API
*/
DIFFERENT,
/**
* The new API has deprecated something or removed something deprecated from the old API
*/
DEPRECATED
}
| package graphql.schema.diff;
import graphql.PublicApi;
/**
* A classification of difference events.
*/
@PublicApi
public enum DiffCategory {
/**
* The new API is missing something compared to the old API
*/
MISSING,
/**
* The new API has become stricter for existing clients than the old API
*/
STRICTER,
/**
* The new API has an invalid structure
*/
INVALID,
/**
* The new API has added something not present in the old API
*/
ADDITION,
/**
* The new API has changed something compared to the old API
*/
- DIFFERENT
+ DIFFERENT,
? +
+ /**
+ * The new API has deprecated something or removed something deprecated from the old API
+ */
+ DEPRECATED
} | 6 | 0.2 | 5 | 1 |
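The Java diff above appends a `DEPRECATED` constant to the `DiffCategory` enum. The same shape expressed with Python's stdlib `enum`, as an illustrative parallel (not part of graphql-java):

```python
from enum import Enum, auto


class DiffCategory(Enum):
    """Classification of schema-difference events (mirrors the Java enum)."""
    MISSING = auto()
    STRICTER = auto()
    INVALID = auto()
    ADDITION = auto()
    DIFFERENT = auto()
    DEPRECATED = auto()  # newly added category for deprecation-related changes
```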
2de18601957568389d8514f0a3447f097be7b286 | test/helpers/setup.js | test/helpers/setup.js | var chai = require('chai');
var supertest = require('supertest-as-promised');
var api = require('../../server');
var request = supertest(api);
var blueprint_client = require('../../modules/blueprint');
var mqtt_client = require('../../modules/mqtt');
GLOBAL.AssertionError = chai.AssertionError;
GLOBAL.expect = chai.expect;
GLOBAL.request = request;
GLOBAL.blueprint_client = blueprint_client;
GLOBAL.mqtt_client = mqtt_client;
| /* global GLOBAL */
var chai = require('chai');
var supertest = require('supertest-as-promised');
var api = require('../../server');
var request = supertest(api);
var blueprint_client = require('../../modules/blueprint');
var mqtt_client = require('../../modules/mqtt');
GLOBAL.AssertionError = chai.AssertionError;
GLOBAL.expect = chai.expect;
GLOBAL.request = request;
GLOBAL.blueprint_client = blueprint_client;
GLOBAL.mqtt_client = mqtt_client;
| Add GLOBAL as global var | Add GLOBAL as global var
| JavaScript | mit | Altoros/refill-them-api | javascript | ## Code Before:
var chai = require('chai');
var supertest = require('supertest-as-promised');
var api = require('../../server');
var request = supertest(api);
var blueprint_client = require('../../modules/blueprint');
var mqtt_client = require('../../modules/mqtt');
GLOBAL.AssertionError = chai.AssertionError;
GLOBAL.expect = chai.expect;
GLOBAL.request = request;
GLOBAL.blueprint_client = blueprint_client;
GLOBAL.mqtt_client = mqtt_client;
## Instruction:
Add GLOBAL as global var
## Code After:
/* global GLOBAL */
var chai = require('chai');
var supertest = require('supertest-as-promised');
var api = require('../../server');
var request = supertest(api);
var blueprint_client = require('../../modules/blueprint');
var mqtt_client = require('../../modules/mqtt');
GLOBAL.AssertionError = chai.AssertionError;
GLOBAL.expect = chai.expect;
GLOBAL.request = request;
GLOBAL.blueprint_client = blueprint_client;
GLOBAL.mqtt_client = mqtt_client;
| + /* global GLOBAL */
var chai = require('chai');
var supertest = require('supertest-as-promised');
var api = require('../../server');
var request = supertest(api);
var blueprint_client = require('../../modules/blueprint');
var mqtt_client = require('../../modules/mqtt');
GLOBAL.AssertionError = chai.AssertionError;
GLOBAL.expect = chai.expect;
GLOBAL.request = request;
GLOBAL.blueprint_client = blueprint_client;
GLOBAL.mqtt_client = mqtt_client; | 1 | 0.083333 | 1 | 0 |
7c59831afc3903187803fdb533762da2704f8790 | tb_website/static/css/basic.css | tb_website/static/css/basic.css | /*! project specific CSS goes here. */
body > .container {
padding-top: 50px;
}
*[data-toggle=tooltip] {
cursor: pointer;
}
a, a:active, a:hover {
outline: 0 !important;
}
table.content-to-top td,
table.content-to-top th {
vertical-align: top;
}
table input {
width: 100%;
}
td.field {
}
td.field input,
td.field textarea {
width: 100%;
}
th[scope="row"] label {
text-align: right;
width: 100%;
}
.breadcrumb {
/* Breadcrumbs appear way too high */
margin-bottom: -15px;
}
.modal-body .modal-footer {
padding: 15px 0px 0px 0px;
}
/* stype the modules view */
.vis-network-tooltip ul {
list-style: none;
margin: 0px;
padding: 0px;
}
| /*! project specific CSS goes here. */
body > .container {
padding-top: 50px;
}
*[data-toggle=tooltip] {
cursor: pointer;
}
a, a:active, a:hover {
outline: 0 !important;
}
table.content-to-top td,
table.content-to-top th {
vertical-align: top;
}
table input {
width: 100%;
}
td.field {
}
td.field input,
td.field textarea {
width: 100%;
}
th[scope="row"] label {
text-align: right;
width: 100%;
}
.breadcrumb {
/* Breadcrumbs appear way too high */
margin-bottom: -15px;
}
.modal-body .modal-footer {
padding: 15px 0px 0px 0px;
}
/* override the checkbox position */
td.field input[type="checkbox"] {
width: 16px;
text-align: left;
}
/* stype the modules view */
.vis-network-tooltip ul {
list-style: none;
margin: 0px;
padding: 0px;
}
| Move checkbox to the left | Move checkbox to the left
| CSS | agpl-3.0 | IQSS/gentb-site,IQSS/gentb-site,IQSS/gentb-site,IQSS/gentb-site,IQSS/gentb-site,IQSS/gentb-site,IQSS/gentb-site,IQSS/gentb-site | css | ## Code Before:
/*! project specific CSS goes here. */
body > .container {
padding-top: 50px;
}
*[data-toggle=tooltip] {
cursor: pointer;
}
a, a:active, a:hover {
outline: 0 !important;
}
table.content-to-top td,
table.content-to-top th {
vertical-align: top;
}
table input {
width: 100%;
}
td.field {
}
td.field input,
td.field textarea {
width: 100%;
}
th[scope="row"] label {
text-align: right;
width: 100%;
}
.breadcrumb {
/* Breadcrumbs appear way too high */
margin-bottom: -15px;
}
.modal-body .modal-footer {
padding: 15px 0px 0px 0px;
}
/* stype the modules view */
.vis-network-tooltip ul {
list-style: none;
margin: 0px;
padding: 0px;
}
## Instruction:
Move checkbox to the left
## Code After:
/*! project specific CSS goes here. */
body > .container {
padding-top: 50px;
}
*[data-toggle=tooltip] {
cursor: pointer;
}
a, a:active, a:hover {
outline: 0 !important;
}
table.content-to-top td,
table.content-to-top th {
vertical-align: top;
}
table input {
width: 100%;
}
td.field {
}
td.field input,
td.field textarea {
width: 100%;
}
th[scope="row"] label {
text-align: right;
width: 100%;
}
.breadcrumb {
/* Breadcrumbs appear way too high */
margin-bottom: -15px;
}
.modal-body .modal-footer {
padding: 15px 0px 0px 0px;
}
/* override the checkbox position */
td.field input[type="checkbox"] {
width: 16px;
text-align: left;
}
/* stype the modules view */
.vis-network-tooltip ul {
list-style: none;
margin: 0px;
padding: 0px;
}
| /*! project specific CSS goes here. */
body > .container {
padding-top: 50px;
}
*[data-toggle=tooltip] {
cursor: pointer;
}
a, a:active, a:hover {
outline: 0 !important;
}
table.content-to-top td,
table.content-to-top th {
vertical-align: top;
}
table input {
width: 100%;
}
td.field {
}
td.field input,
td.field textarea {
width: 100%;
}
th[scope="row"] label {
text-align: right;
width: 100%;
}
.breadcrumb {
/* Breadcrumbs appear way too high */
margin-bottom: -15px;
}
.modal-body .modal-footer {
padding: 15px 0px 0px 0px;
}
+ /* override the checkbox position */
+ td.field input[type="checkbox"] {
+ width: 16px;
+ text-align: left;
+ }
+
/* stype the modules view */
.vis-network-tooltip ul {
list-style: none;
margin: 0px;
padding: 0px;
} | 6 | 0.115385 | 6 | 0 |
ec47a7050369d73dd1be2dac7b17f2201e26997b | lib/fog/hp/models/storage/shared_files.rb | lib/fog/hp/models/storage/shared_files.rb | require 'fog/core/collection'
require 'fog/hp/models/storage/shared_file'
module Fog
module Storage
class HP
class SharedFiles < Fog::Collection
attribute :shared_directory
model Fog::Storage::HP::SharedFile
def all
requires :shared_directory
parent = shared_directory.collection.get(shared_directory.url)
if parent
load(parent.files.map {|file| file.attributes})
else
nil
end
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def get(key, &block)
requires :shared_directory
shared_object_url = "#{shared_directory.url}/#{key}"
data = connection.get_shared_object(shared_object_url, &block)
file_data = data.headers.merge({
:body => data.body,
:key => key
})
new(file_data)
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def head(key)
requires :shared_directory
shared_object_url = "#{shared_directory.url}/#{key}"
data = connection.head_shared_object(shared_object_url)
file_data = data.headers.merge({
:key => key
})
new(file_data)
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def new(attributes = {})
requires :shared_directory
super({ :shared_directory => shared_directory }.merge!(attributes))
end
end
end
end
end
| require 'fog/core/collection'
require 'fog/hp/models/storage/shared_file'
module Fog
module Storage
class HP
class SharedFiles < Fog::Collection
attribute :shared_directory
model Fog::Storage::HP::SharedFile
def all
requires :shared_directory
parent = shared_directory.collection.get(shared_directory.url)
if parent
load(parent.files.map {|file| file.attributes})
else
nil
end
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def get(key, &block)
requires :shared_directory
shared_object_url = "#{shared_directory.url}/#{key}"
data = connection.get_shared_object(shared_object_url, &block)
file_data = data.headers.merge({
:body => data.body,
:key => key
})
new(file_data)
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def head(key)
requires :shared_directory
shared_object_url = "#{shared_directory.url}/#{key}"
data = connection.head_shared_object(shared_object_url)
file_data = data.headers.merge({
:body => '',
:key => key
})
new(file_data)
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def new(attributes = {})
requires :shared_directory
super({ :shared_directory => shared_directory }.merge!(attributes))
end
end
end
end
end
| Fix head call to return an empty body. | Fix head call to return an empty body.
| Ruby | mit | dtdream/fog,brilliomsinterop/fog,yyuu/fog,brilliomsinterop/fog,NETWAYS/fog,nandhanurrevanth/fog,runtimerevolution/fog,joshisa/fog,TerryHowe/fog,dustacio/fog,eLobato/fog,papedaniel/fog,mitchlloyd/fog,phillbaker/fog,seanhandley/fog,ManageIQ/fog,fog/fog,plribeiro3000/fog,displague/fog,github/fog,joshmyers/fog,github/fog,10io/fog,nicolasbrechet/fog,runtimerevolution/fog,ack/fog,alphagov/fog,rackspace/fog,b0ric/fog,martinb3/fog,unorthodoxgeek/fog,cocktail-io/fog,pyama86/fog,covario-cdiaz/fog,NETWAYS/fog,nalabjp/fog,theforeman/fog,mitchlloyd/fog,sferik/fog,fog/fog,nicolasbrechet/fog,10io/fog,dustacio/fog,dLobatog/fog,adamleff/fog,sideci-sample/sideci-sample-fog,petems/fog,seanhandley/fog,papedaniel/fog,Ladas/fog,asebastian-r7/fog,sapcc/fog,brilliomsinterop/fog,lwander/fog,icco/fog,backupify/fog,MSOpenTech/fog,Ladas/fog,MSOpenTech/fog,pravi/fog,SpryBTS/fog,mohitsethi/fog,mohitsethi/fog,bdunne/fog,dpowell7/fog,joshmyers/fog,duhast/fog,mavenlink/fog,alphagov/fog,Programatica/fog,surminus/fog,dhague/fog,martinb3/fog,joshisa/fog,sapcc/fog,surminus/fog,backupify/fog,ack/fog,eLobato/fog,covario-cdiaz/fog,b0e/fog,glennpratt/fog,bryanl/fog,nalabjp/fog,nandhanurrevanth/fog,duhast/fog,12spokes/fog,phillbaker/fog,bryanl/fog,adecarolis/fog,adamleff/fog,zephyrean/fog,dLobatog/fog,petems/fog,rackspace/fog,unorthodoxgeek/fog,pyama86/fog,mavenlink/fog,zephyrean/fog,dtdream/fog,nandhanurrevanth/fog,Programatica/fog,displague/fog,yyuu/fog,adecarolis/fog,dhague/fog,b0ric/fog,cocktail-io/fog,lwander/fog,plribeiro3000/fog,pravi/fog,kongslund/fog,ManageIQ/fog,sideci-sample/sideci-sample-fog,SpryBTS/fog,glennpratt/fog,zephyrean/fog,kongslund/fog,brandondunne/fog,TerryHowe/fog,asebastian-r7/fog,icco/fog,theforeman/fog,phillbaker/fog | ruby | ## Code Before:
require 'fog/core/collection'
require 'fog/hp/models/storage/shared_file'
module Fog
module Storage
class HP
class SharedFiles < Fog::Collection
attribute :shared_directory
model Fog::Storage::HP::SharedFile
def all
requires :shared_directory
parent = shared_directory.collection.get(shared_directory.url)
if parent
load(parent.files.map {|file| file.attributes})
else
nil
end
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def get(key, &block)
requires :shared_directory
shared_object_url = "#{shared_directory.url}/#{key}"
data = connection.get_shared_object(shared_object_url, &block)
file_data = data.headers.merge({
:body => data.body,
:key => key
})
new(file_data)
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def head(key)
requires :shared_directory
shared_object_url = "#{shared_directory.url}/#{key}"
data = connection.head_shared_object(shared_object_url)
file_data = data.headers.merge({
:key => key
})
new(file_data)
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def new(attributes = {})
requires :shared_directory
super({ :shared_directory => shared_directory }.merge!(attributes))
end
end
end
end
end
## Instruction:
Fix head call to return an empty body.
## Code After:
require 'fog/core/collection'
require 'fog/hp/models/storage/shared_file'
module Fog
module Storage
class HP
class SharedFiles < Fog::Collection
attribute :shared_directory
model Fog::Storage::HP::SharedFile
def all
requires :shared_directory
parent = shared_directory.collection.get(shared_directory.url)
if parent
load(parent.files.map {|file| file.attributes})
else
nil
end
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def get(key, &block)
requires :shared_directory
shared_object_url = "#{shared_directory.url}/#{key}"
data = connection.get_shared_object(shared_object_url, &block)
file_data = data.headers.merge({
:body => data.body,
:key => key
})
new(file_data)
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def head(key)
requires :shared_directory
shared_object_url = "#{shared_directory.url}/#{key}"
data = connection.head_shared_object(shared_object_url)
file_data = data.headers.merge({
:body => '',
:key => key
})
new(file_data)
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def new(attributes = {})
requires :shared_directory
super({ :shared_directory => shared_directory }.merge!(attributes))
end
end
end
end
end
| require 'fog/core/collection'
require 'fog/hp/models/storage/shared_file'
module Fog
module Storage
class HP
class SharedFiles < Fog::Collection
attribute :shared_directory
model Fog::Storage::HP::SharedFile
def all
requires :shared_directory
parent = shared_directory.collection.get(shared_directory.url)
if parent
load(parent.files.map {|file| file.attributes})
else
nil
end
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def get(key, &block)
requires :shared_directory
shared_object_url = "#{shared_directory.url}/#{key}"
data = connection.get_shared_object(shared_object_url, &block)
file_data = data.headers.merge({
:body => data.body,
:key => key
})
new(file_data)
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def head(key)
requires :shared_directory
shared_object_url = "#{shared_directory.url}/#{key}"
data = connection.head_shared_object(shared_object_url)
file_data = data.headers.merge({
+ :body => '',
:key => key
})
new(file_data)
rescue Fog::Storage::HP::NotFound, Fog::HP::Errors::Forbidden
nil
end
def new(attributes = {})
requires :shared_directory
super({ :shared_directory => shared_directory }.merge!(attributes))
end
end
end
end
end | 1 | 0.016667 | 1 | 0 |
f340a959555928e17c040d3e189f5e6abe31dd87 | .travis.yml | .travis.yml | language: php
sudo: false
matrix:
include:
- php: 5.3
- php: 5.4
- php: 5.5
- php: 5.6
- php: 7.0
- php: 7.1
- php: hhvm
dist: trusty
allow_failures:
- php: hhvm
install: travis_retry composer update --no-interaction
script: vendor/bin/phpunit
before_deploy: bin/package -v $TRAVIS_TAG
deploy:
provider: releases
api_key:
secure: LL8koDM1xDqzF9t0URHvmMPyWjojyd4PeZ7IW7XYgyvD6n1H6GYrVAeKCh5wfUKFbwHoa9s5AAn6pLzra00bODVkPTmUH+FSMWz9JKLw9ODAn8HvN7C+IooxmeClGHFZc0TfHfya8/D1E9C1iXtGGEoE/GqtaYq/z0C1DLpO0OU=
file_glob: true
file: dist/psysh-*.tar.gz
skip_cleanup: true
on:
tags: true
repo: bobthecow/psysh
condition: ($TRAVIS_PHP_VERSION = 5.3* || $TRAVIS_PHP_VERSION = 7.1*)
| language: php
sudo: false
matrix:
include:
- php: 5.3
- php: 5.4
- php: 5.5
- php: 5.6
- php: 7.0
- php: 7.1
- php: hhvm
dist: trusty
allow_failures:
- php: hhvm
install: travis_retry composer update --no-interaction
script: vendor/bin/phpunit --verbose
before_deploy: bin/package -v $TRAVIS_TAG
deploy:
provider: releases
api_key:
secure: LL8koDM1xDqzF9t0URHvmMPyWjojyd4PeZ7IW7XYgyvD6n1H6GYrVAeKCh5wfUKFbwHoa9s5AAn6pLzra00bODVkPTmUH+FSMWz9JKLw9ODAn8HvN7C+IooxmeClGHFZc0TfHfya8/D1E9C1iXtGGEoE/GqtaYq/z0C1DLpO0OU=
file_glob: true
file: dist/psysh-*.tar.gz
skip_cleanup: true
on:
tags: true
repo: bobthecow/psysh
condition: ($TRAVIS_PHP_VERSION = 5.3* || $TRAVIS_PHP_VERSION = 7.1*)
| Enable verbose mode on phpunit on travos | Enable verbose mode on phpunit on travos | YAML | mit | bobthecow/psysh,bobthecow/psysh | yaml | ## Code Before:
language: php
sudo: false
matrix:
include:
- php: 5.3
- php: 5.4
- php: 5.5
- php: 5.6
- php: 7.0
- php: 7.1
- php: hhvm
dist: trusty
allow_failures:
- php: hhvm
install: travis_retry composer update --no-interaction
script: vendor/bin/phpunit
before_deploy: bin/package -v $TRAVIS_TAG
deploy:
provider: releases
api_key:
secure: LL8koDM1xDqzF9t0URHvmMPyWjojyd4PeZ7IW7XYgyvD6n1H6GYrVAeKCh5wfUKFbwHoa9s5AAn6pLzra00bODVkPTmUH+FSMWz9JKLw9ODAn8HvN7C+IooxmeClGHFZc0TfHfya8/D1E9C1iXtGGEoE/GqtaYq/z0C1DLpO0OU=
file_glob: true
file: dist/psysh-*.tar.gz
skip_cleanup: true
on:
tags: true
repo: bobthecow/psysh
condition: ($TRAVIS_PHP_VERSION = 5.3* || $TRAVIS_PHP_VERSION = 7.1*)
## Instruction:
Enable verbose mode on phpunit on travos
## Code After:
language: php
sudo: false
matrix:
include:
- php: 5.3
- php: 5.4
- php: 5.5
- php: 5.6
- php: 7.0
- php: 7.1
- php: hhvm
dist: trusty
allow_failures:
- php: hhvm
install: travis_retry composer update --no-interaction
script: vendor/bin/phpunit --verbose
before_deploy: bin/package -v $TRAVIS_TAG
deploy:
provider: releases
api_key:
secure: LL8koDM1xDqzF9t0URHvmMPyWjojyd4PeZ7IW7XYgyvD6n1H6GYrVAeKCh5wfUKFbwHoa9s5AAn6pLzra00bODVkPTmUH+FSMWz9JKLw9ODAn8HvN7C+IooxmeClGHFZc0TfHfya8/D1E9C1iXtGGEoE/GqtaYq/z0C1DLpO0OU=
file_glob: true
file: dist/psysh-*.tar.gz
skip_cleanup: true
on:
tags: true
repo: bobthecow/psysh
condition: ($TRAVIS_PHP_VERSION = 5.3* || $TRAVIS_PHP_VERSION = 7.1*)
| language: php
sudo: false
matrix:
include:
- php: 5.3
- php: 5.4
- php: 5.5
- php: 5.6
- php: 7.0
- php: 7.1
- php: hhvm
dist: trusty
allow_failures:
- php: hhvm
install: travis_retry composer update --no-interaction
- script: vendor/bin/phpunit
+ script: vendor/bin/phpunit --verbose
? ++++++++++
before_deploy: bin/package -v $TRAVIS_TAG
deploy:
provider: releases
api_key:
secure: LL8koDM1xDqzF9t0URHvmMPyWjojyd4PeZ7IW7XYgyvD6n1H6GYrVAeKCh5wfUKFbwHoa9s5AAn6pLzra00bODVkPTmUH+FSMWz9JKLw9ODAn8HvN7C+IooxmeClGHFZc0TfHfya8/D1E9C1iXtGGEoE/GqtaYq/z0C1DLpO0OU=
file_glob: true
file: dist/psysh-*.tar.gz
skip_cleanup: true
on:
tags: true
repo: bobthecow/psysh
condition: ($TRAVIS_PHP_VERSION = 5.3* || $TRAVIS_PHP_VERSION = 7.1*) | 2 | 0.058824 | 1 | 1 |
bc97034ccba2673e08e4bc2bc23dfe81b5b8863b | lib/shoulda/matchers/routing.rb | lib/shoulda/matchers/routing.rb | module Shoulda
module Matchers
module Routing
# @private
def route(method, path)
ActionController::RouteMatcher.new(method, path, self)
end
end
end
end
| module Shoulda
module Matchers
# @private
module Routing
def route(method, path)
ActionController::RouteMatcher.new(method, path, self)
end
end
end
end
| Hide Routing module in docs | Hide Routing module in docs
[ci skip]
| Ruby | mit | cheshire-cat/shoulda-matchers,plribeiro3000/shoulda-matchers,biow0lf/shoulda-matchers,guialbuk/shoulda-matchers,guialbuk/shoulda-matchers,cheshire-cat/shoulda-matchers,alejandrogutierrez/shoulda-matchers,biow0lf/shoulda-matchers,cheshire-cat/shoulda-matchers,plribeiro3000/shoulda-matchers,reacuna/shoulda-matchers,thoughtbot/shoulda-matchers,guialbuk/shoulda-matchers,alejandrogutierrez/shoulda-matchers,plribeiro3000/shoulda-matchers,biow0lf/shoulda-matchers,alejandrogutierrez/shoulda-matchers,thoughtbot/shoulda-matchers,alejandrogutierrez/shoulda-matchers,plribeiro3000/shoulda-matchers,guialbuk/shoulda-matchers,reacuna/shoulda-matchers,reacuna/shoulda-matchers,reacuna/shoulda-matchers,thoughtbot/shoulda-matchers,thoughtbot/shoulda-matchers,biow0lf/shoulda-matchers,cheshire-cat/shoulda-matchers | ruby | ## Code Before:
module Shoulda
module Matchers
module Routing
# @private
def route(method, path)
ActionController::RouteMatcher.new(method, path, self)
end
end
end
end
## Instruction:
Hide Routing module in docs
[ci skip]
## Code After:
module Shoulda
module Matchers
# @private
module Routing
def route(method, path)
ActionController::RouteMatcher.new(method, path, self)
end
end
end
end
| module Shoulda
module Matchers
+ # @private
module Routing
- # @private
def route(method, path)
ActionController::RouteMatcher.new(method, path, self)
end
end
end
end | 2 | 0.2 | 1 | 1 |
94129a0f21fbfd2ab5e4ea9a03a2873f68d42798 | keps/sig-release/0000-anago-to-krel-migration/kep.yaml | keps/sig-release/0000-anago-to-krel-migration/kep.yaml | title: Anago to Krel Migration
authors:
- "@saschagrunert"
owning-sig: sig-release
reviewers:
- "@justaugustus"
approvers:
- "@justaugustus"
creation-date: 2020-09-22
status: implementable
| title: Anago to Krel Migration
authors:
- "@saschagrunert"
owning-sig: sig-release
reviewers:
- "@justaugustus"
approvers:
- "@justaugustus"
creation-date: 2020-09-22
latest-milestone: v1.20
status: implemented
| Mark anago krel migration KEP as implemented | Mark anago krel migration KEP as implemented
With the merge of the anago removal PR we can now consider that KEP as
done.
Signed-off-by: Sascha Grunert <70ab469ddb2ac3e35f32ed7c2fd1cca514b2e879@suse.com>
| YAML | apache-2.0 | kubernetes/enhancements,kubernetes/enhancements,kubernetes/enhancements | yaml | ## Code Before:
title: Anago to Krel Migration
authors:
- "@saschagrunert"
owning-sig: sig-release
reviewers:
- "@justaugustus"
approvers:
- "@justaugustus"
creation-date: 2020-09-22
status: implementable
## Instruction:
Mark anago krel migration KEP as implemented
With the merge of the anago removal PR we can now consider that KEP as
done.
Signed-off-by: Sascha Grunert <70ab469ddb2ac3e35f32ed7c2fd1cca514b2e879@suse.com>
## Code After:
title: Anago to Krel Migration
authors:
- "@saschagrunert"
owning-sig: sig-release
reviewers:
- "@justaugustus"
approvers:
- "@justaugustus"
creation-date: 2020-09-22
latest-milestone: v1.20
status: implemented
| title: Anago to Krel Migration
authors:
- "@saschagrunert"
owning-sig: sig-release
reviewers:
- "@justaugustus"
approvers:
- "@justaugustus"
creation-date: 2020-09-22
+ latest-milestone: v1.20
- status: implementable
? ---
+ status: implemented
? +
| 3 | 0.3 | 2 | 1 |
0eeb1104d3f59e68616bb081fd9d4e45f73e823e | public/index.php | public/index.php | <?php
if (php_sapi_name() === 'cli-server' && is_file(__DIR__ . parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH))) {
return false;
}
require 'vendor/autoload.php';
Zend\Mvc\Application::init(require 'config/application.config.php')->run();
| <?php chdir(dirname(__DIR__));
if (php_sapi_name() === 'cli-server' && is_file(__DIR__ . parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH))) {
return false;
}
require 'vendor/autoload.php';
Zend\Mvc\Application::init(require 'config/application.config.php')->run();
| Change directory to project root before bootstrapping | Change directory to project root before bootstrapping | PHP | bsd-3-clause | sdgoij/thing,sdgoij/thing,sdgoij/thing,sdgoij/thing | php | ## Code Before:
<?php
if (php_sapi_name() === 'cli-server' && is_file(__DIR__ . parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH))) {
return false;
}
require 'vendor/autoload.php';
Zend\Mvc\Application::init(require 'config/application.config.php')->run();
## Instruction:
Change directory to project root before bootstrapping
## Code After:
<?php chdir(dirname(__DIR__));
if (php_sapi_name() === 'cli-server' && is_file(__DIR__ . parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH))) {
return false;
}
require 'vendor/autoload.php';
Zend\Mvc\Application::init(require 'config/application.config.php')->run();
| - <?php
+ <?php chdir(dirname(__DIR__));
if (php_sapi_name() === 'cli-server' && is_file(__DIR__ . parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH))) {
return false;
}
require 'vendor/autoload.php';
Zend\Mvc\Application::init(require 'config/application.config.php')->run(); | 2 | 0.222222 | 1 | 1 |
bde3db8011bb83bb67937d0162c72f686b81dc3f | README.md | README.md | Generates configuration profiles to set Sparkle-updater-enabled apps off by default. Inspired by Ben Toms [post on the Sparkle fiasco](https://macmule.com/2016/01/31/sparkle-updater-framework-http-man-in-the-middle-vulnerability/) and [Greg Neagle's profiles](https://github.com/gregneagle/profiles/tree/master/autoupdate_disablers)
#### NOTICE - **You still need to test the effectiveness of the profiles created!** In many cases, this _will not be enough_!
Please see a tool like Tim Sutton's [mcxToProfile](https://github.com/timsutton/mcxToProfile) and follow the workflow that tool provides to collect the appropriate keys for an app that does not work with Extinguish out-of-the-box, or edit the profile generated on your own.
| Generates configuration profiles to set Sparkle-updater-enabled apps off by default. Inspired by Ben Toms [post on the Sparkle fiasco](https://macmule.com/2016/01/31/sparkle-updater-framework-http-man-in-the-middle-vulnerability/), [Nate Walck's Chef-driven version](https://github.com/natewalck/ChefExamples2016/blob/master/cookbooks/disablesparkle/recipes/default.rb), and [Greg Neagle's profiles](https://github.com/gregneagle/profiles/tree/master/autoupdate_disablers)
### Usage
Download from releases, unpack, and use it on the command line in any of these three ways:
1. `~/Downloads/Extinguish-0.2/extinguish.py /Applications/VLC.app`
By dragging the path to any affected app into the terminal window, this will generate a single profile in the root of your home folder (or whatever the current directory is you're working in) called `disable_autoupdates_VLC.app`.
2. `~/Downloads/Extinguish-0.2/extinguish.py -a com.mactrackerapp.Mactracker -a com.fluidapp.Fluid`
This will generate two separate profiles with naming like the VLC example above.
3. `~/Downloads/Extinguish-0.2/extinguish.py -g True -a com.mactrackerapp.Mactracker -a com.fluidapp.Fluid`
This will generate a single profile called `disable_all_sparkle_autoupdates.mobileconfig` containing payloads which will disable all apps specified in one shot.
#### NOTICE - **You still need to test the effectiveness of the profiles created!** In many cases, this _will not be enough_!
Please see a tool like Tim Sutton's [mcxToProfile](https://github.com/timsutton/mcxToProfile) and follow the workflow that tool provides to collect the appropriate keys for an app that does not work with Extinguish out-of-the-box, or edit the profile generated on your own.
| Add howto info to readme | Add howto info to readme
With credit to @natewalck for inspiration | Markdown | apache-2.0 | arubdesu/Extinguish | markdown | ## Code Before:
Generates configuration profiles to set Sparkle-updater-enabled apps off by default. Inspired by Ben Toms [post on the Sparkle fiasco](https://macmule.com/2016/01/31/sparkle-updater-framework-http-man-in-the-middle-vulnerability/) and [Greg Neagle's profiles](https://github.com/gregneagle/profiles/tree/master/autoupdate_disablers)
#### NOTICE - **You still need to test the effectiveness of the profiles created!** In many cases, this _will not be enough_!
Please see a tool like Tim Sutton's [mcxToProfile](https://github.com/timsutton/mcxToProfile) and follow the workflow that tool provides to collect the appropriate keys for an app that does not work with Extinguish out-of-the-box, or edit the profile generated on your own.
## Instruction:
Add howto info to readme
With credit to @natewalck for inspiration
## Code After:
Generates configuration profiles to set Sparkle-updater-enabled apps off by default. Inspired by Ben Toms [post on the Sparkle fiasco](https://macmule.com/2016/01/31/sparkle-updater-framework-http-man-in-the-middle-vulnerability/), [Nate Walck's Chef-driven version](https://github.com/natewalck/ChefExamples2016/blob/master/cookbooks/disablesparkle/recipes/default.rb), and [Greg Neagle's profiles](https://github.com/gregneagle/profiles/tree/master/autoupdate_disablers)
### Usage
Download from releases, unpack, and use it on the command line in any of these three ways:
1. `~/Downloads/Extinguish-0.2/extinguish.py /Applications/VLC.app`
By dragging the path to any affected app into the terminal window, this will generate a single profile in the root of your home folder (or whatever the current directory is you're working in) called `disable_autoupdates_VLC.app`.
2. `~/Downloads/Extinguish-0.2/extinguish.py -a com.mactrackerapp.Mactracker -a com.fluidapp.Fluid`
This will generate two separate profiles with naming like the VLC example above.
3. `~/Downloads/Extinguish-0.2/extinguish.py -g True -a com.mactrackerapp.Mactracker -a com.fluidapp.Fluid`
This will generate a single profile called `disable_all_sparkle_autoupdates.mobileconfig` containing payloads which will disable all apps specified in one shot.
#### NOTICE - **You still need to test the effectiveness of the profiles created!** In many cases, this _will not be enough_!
Please see a tool like Tim Sutton's [mcxToProfile](https://github.com/timsutton/mcxToProfile) and follow the workflow that tool provides to collect the appropriate keys for an app that does not work with Extinguish out-of-the-box, or edit the profile generated on your own.
| - Generates configuration profiles to set Sparkle-updater-enabled apps off by default. Inspired by Ben Toms [post on the Sparkle fiasco](https://macmule.com/2016/01/31/sparkle-updater-framework-http-man-in-the-middle-vulnerability/) and [Greg Neagle's profiles](https://github.com/gregneagle/profiles/tree/master/autoupdate_disablers)
+ Generates configuration profiles to set Sparkle-updater-enabled apps off by default. Inspired by Ben Toms [post on the Sparkle fiasco](https://macmule.com/2016/01/31/sparkle-updater-framework-http-man-in-the-middle-vulnerability/), [Nate Walck's Chef-driven version](https://github.com/natewalck/ChefExamples2016/blob/master/cookbooks/disablesparkle/recipes/default.rb), and [Greg Neagle's profiles](https://github.com/gregneagle/profiles/tree/master/autoupdate_disablers)
? ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+
+ ### Usage
+ Download from releases, unpack, and use it on the command line in any of these three ways:
+
+ 1. `~/Downloads/Extinguish-0.2/extinguish.py /Applications/VLC.app`
+
+ By dragging the path to any affected app into the terminal window, this will generate a single profile in the root of your home folder (or whatever the current directory is you're working in) called `disable_autoupdates_VLC.app`.
+
+ 2. `~/Downloads/Extinguish-0.2/extinguish.py -a com.mactrackerapp.Mactracker -a com.fluidapp.Fluid`
+
+ This will generate two separate profiles with naming like the VLC example above.
+
+ 3. `~/Downloads/Extinguish-0.2/extinguish.py -g True -a com.mactrackerapp.Mactracker -a com.fluidapp.Fluid`
+
+ This will generate a single profile called `disable_all_sparkle_autoupdates.mobileconfig` containing payloads which will disable all apps specified in one shot.
#### NOTICE - **You still need to test the effectiveness of the profiles created!** In many cases, this _will not be enough_!
Please see a tool like Tim Sutton's [mcxToProfile](https://github.com/timsutton/mcxToProfile) and follow the workflow that tool provides to collect the appropriate keys for an app that does not work with Extinguish out-of-the-box, or edit the profile generated on your own. | 17 | 4.25 | 16 | 1 |
bff33d5cb6763778980cab675ee742b2491f84ed | lib/rack/oauth2/models/active_record.rb | lib/rack/oauth2/models/active_record.rb | module Rack
module OAuth2
class Server
class ActiveRecord < ::ActiveRecord::Base
set_table_name do
"oauth2_provider_#{name.split("::").last.underscore}"
end
end
class << self
# Create new instance of the klass and populate its attributes.
def new_instance(klass, fields)
instance = klass.new fields
end
end
end
end
end
require "rack/oauth2/models/active_record/client"
require "rack/oauth2/models/active_record/auth_request"
require "rack/oauth2/models/active_record/access_grant"
require "rack/oauth2/models/active_record/access_token"
require "rack/oauth2/models/active_record/issuer"
| module Rack
module OAuth2
class Server
class ActiveRecord < ::ActiveRecord::Base
TABLE_PREFIX = "oauth2_provider_"
def self.table_name
TABLE_PREFIX + name.split("::").last.underscore
end
end
class << self
# Create new instance of the klass and populate its attributes.
def new_instance(klass, fields)
instance = klass.new fields
end
end
class CreateRackOauth2ServerSchema < ::ActiveRecord::Migration
def change
create_table "#{ActiveRecord::TABLE_PREFIX}client" do |t|
# Client identifier.
t.string :client_id
# Client secret: random, long, and hexy.
t.string :secret
# User see this.
t.string :display_name
# Link to client's Web site.
t.string :link
# Preferred image URL for this icon.
t.string :image_url
# Redirect URL. Supplied by the client if they want to restrict redirect URLs (better security).
t.string :redirect_uri
# List of scope the client is allowed to request.
t.string :scope
# Free form fields for internal use.
t.string :notes
# Timestamp if revoked.
t.datetime :revoked
# Counts how many access tokens were granted.
t.integer :tokens_granted
# Counts how many access tokens were revoked.
t. integer :tokens_revoked
t.timestamps
end
change_table "#{ActiveRecord::TABLE_PREFIX}client" do |t|
t.index :client_id
t.index [:client_id, :secret]
end
end
end
end
end
end
require "rack/oauth2/models/active_record/client"
require "rack/oauth2/models/active_record/auth_request"
require "rack/oauth2/models/active_record/access_grant"
require "rack/oauth2/models/active_record/access_token"
require "rack/oauth2/models/active_record/issuer"
| Add DB migrations setup - first try | Add DB migrations setup - first try | Ruby | mit | anerhan/rack-oauth2-server,anerhan/rack-oauth2-server,anerhan/rack-oauth2-server | ruby | ## Code Before:
module Rack
module OAuth2
class Server
class ActiveRecord < ::ActiveRecord::Base
set_table_name do
"oauth2_provider_#{name.split("::").last.underscore}"
end
end
class << self
# Create new instance of the klass and populate its attributes.
def new_instance(klass, fields)
instance = klass.new fields
end
end
end
end
end
require "rack/oauth2/models/active_record/client"
require "rack/oauth2/models/active_record/auth_request"
require "rack/oauth2/models/active_record/access_grant"
require "rack/oauth2/models/active_record/access_token"
require "rack/oauth2/models/active_record/issuer"
## Instruction:
Add DB migrations setup - first try
## Code After:
module Rack
module OAuth2
class Server
class ActiveRecord < ::ActiveRecord::Base
TABLE_PREFIX = "oauth2_provider_"
def self.table_name
TABLE_PREFIX + name.split("::").last.underscore
end
end
class << self
# Create new instance of the klass and populate its attributes.
def new_instance(klass, fields)
instance = klass.new fields
end
end
class CreateRackOauth2ServerSchema < ::ActiveRecord::Migration
def change
create_table "#{ActiveRecord::TABLE_PREFIX}client" do |t|
# Client identifier.
t.string :client_id
# Client secret: random, long, and hexy.
t.string :secret
# User see this.
t.string :display_name
# Link to client's Web site.
t.string :link
# Preferred image URL for this icon.
t.string :image_url
# Redirect URL. Supplied by the client if they want to restrict redirect URLs (better security).
t.string :redirect_uri
# List of scope the client is allowed to request.
t.string :scope
# Free form fields for internal use.
t.string :notes
# Timestamp if revoked.
t.datetime :revoked
# Counts how many access tokens were granted.
t.integer :tokens_granted
# Counts how many access tokens were revoked.
t. integer :tokens_revoked
t.timestamps
end
change_table "#{ActiveRecord::TABLE_PREFIX}client" do |t|
t.index :client_id
t.index [:client_id, :secret]
end
end
end
end
end
end
require "rack/oauth2/models/active_record/client"
require "rack/oauth2/models/active_record/auth_request"
require "rack/oauth2/models/active_record/access_grant"
require "rack/oauth2/models/active_record/access_token"
require "rack/oauth2/models/active_record/issuer"
| module Rack
module OAuth2
class Server
class ActiveRecord < ::ActiveRecord::Base
+ TABLE_PREFIX = "oauth2_provider_"
+
- set_table_name do
? ^^ ---
+ def self.table_name
? ++++ ^^^
- "oauth2_provider_#{name.split("::").last.underscore}"
+ TABLE_PREFIX + name.split("::").last.underscore
end
end
class << self
# Create new instance of the klass and populate its attributes.
def new_instance(klass, fields)
instance = klass.new fields
end
end
+ class CreateRackOauth2ServerSchema < ::ActiveRecord::Migration
+ def change
+ create_table "#{ActiveRecord::TABLE_PREFIX}client" do |t|
+ # Client identifier.
+ t.string :client_id
+ # Client secret: random, long, and hexy.
+ t.string :secret
+ # User see this.
+ t.string :display_name
+ # Link to client's Web site.
+ t.string :link
+ # Preferred image URL for this icon.
+ t.string :image_url
+ # Redirect URL. Supplied by the client if they want to restrict redirect URLs (better security).
+ t.string :redirect_uri
+ # List of scope the client is allowed to request.
+ t.string :scope
+ # Free form fields for internal use.
+ t.string :notes
+ # Timestamp if revoked.
+ t.datetime :revoked
+ # Counts how many access tokens were granted.
+ t.integer :tokens_granted
+ # Counts how many access tokens were revoked.
+ t. integer :tokens_revoked
+
+ t.timestamps
+ end
+ change_table "#{ActiveRecord::TABLE_PREFIX}client" do |t|
+ t.index :client_id
+ t.index [:client_id, :secret]
+ end
+ end
+ end
end
end
end
require "rack/oauth2/models/active_record/client"
require "rack/oauth2/models/active_record/auth_request"
require "rack/oauth2/models/active_record/access_grant"
require "rack/oauth2/models/active_record/access_token"
require "rack/oauth2/models/active_record/issuer" | 40 | 1.481481 | 38 | 2 |
43b60094098b162bdeb52d0867e738f04bb86a22 | _config/extensions.yml | _config/extensions.yml | ---
Name: mappable
After: 'framework/*','cms/*'
---
PointOfInterest:
extensions:
- MapExtension
DataObject:
extensions:
- MappableData
DataList:
extensions:
- MappableDataObjectSet
ArrayList:
extensions:
- MappableDataObjectSet
| ---
Name: mappable
After: 'framework/*','cms/*'
---
DataObject:
extensions:
- MappableData
DataList:
extensions:
- MappableDataObjectSet
ArrayList:
extensions:
- MappableDataObjectSet
| Move application of MapExtension for PointOfInterest to the points of interest module | FIX: Move application of MapExtension for PointOfInterest to the points of interest module
| YAML | bsd-3-clause | gordonbanderson/Mappable,gordonbanderson/Mappable | yaml | ## Code Before:
---
Name: mappable
After: 'framework/*','cms/*'
---
PointOfInterest:
extensions:
- MapExtension
DataObject:
extensions:
- MappableData
DataList:
extensions:
- MappableDataObjectSet
ArrayList:
extensions:
- MappableDataObjectSet
## Instruction:
FIX: Move application of MapExtension for PointOfInterest to the points of interest module
## Code After:
---
Name: mappable
After: 'framework/*','cms/*'
---
DataObject:
extensions:
- MappableData
DataList:
extensions:
- MappableDataObjectSet
ArrayList:
extensions:
- MappableDataObjectSet
| ---
Name: mappable
After: 'framework/*','cms/*'
---
- PointOfInterest:
- extensions:
- - MapExtension
-
DataObject:
extensions:
- MappableData
DataList:
extensions:
- MappableDataObjectSet
ArrayList:
extensions:
- MappableDataObjectSet | 4 | 0.210526 | 0 | 4 |
60f1569893f23052ad66ae08bc80507c41f06bd0 | src/main/resources/jooq-settings.xml | src/main/resources/jooq-settings.xml | <?xml version="1.0" encoding="UTF-8"?>
<settings>
<renderSchema>false</renderSchema>
<renderNameStyle>LOWER</renderNameStyle>
<renderKeywordStyle>UPPER</renderKeywordStyle>
</settings>
| <?xml version="1.0" encoding="UTF-8"?>
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.8.0.xsd">
<renderSchema>false</renderSchema>
<renderNameStyle>LOWER</renderNameStyle>
<renderKeywordStyle>UPPER</renderKeywordStyle>
</settings>
| Add XML schema for jOOQ settings | Add XML schema for jOOQ settings
| XML | apache-2.0 | pvorb/platon,pvorb/platon,pvorb/platon | xml | ## Code Before:
<?xml version="1.0" encoding="UTF-8"?>
<settings>
<renderSchema>false</renderSchema>
<renderNameStyle>LOWER</renderNameStyle>
<renderKeywordStyle>UPPER</renderKeywordStyle>
</settings>
## Instruction:
Add XML schema for jOOQ settings
## Code After:
<?xml version="1.0" encoding="UTF-8"?>
<settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.8.0.xsd">
<renderSchema>false</renderSchema>
<renderNameStyle>LOWER</renderNameStyle>
<renderKeywordStyle>UPPER</renderKeywordStyle>
</settings>
| <?xml version="1.0" encoding="UTF-8"?>
- <settings>
+ <settings xmlns="http://www.jooq.org/xsd/jooq-runtime-3.8.0.xsd">
<renderSchema>false</renderSchema>
<renderNameStyle>LOWER</renderNameStyle>
<renderKeywordStyle>UPPER</renderKeywordStyle>
</settings> | 2 | 0.333333 | 1 | 1 |
0e6d0af215b3deb7a23bddc93d1d7283e3841b81 | week-8/database_intro/my_solution.md | week-8/database_intro/my_solution.md | SELECT * FROM states;
### 2.
SELECT * FROM regions;
### 3.
SELECT state_name,populations FROM states;
### 4.
SELECT state_name,population FROM states
ORDER BY population DESC;
### 5.
SELECT state_name FROM states
WHERE region_id=7;
### 6.
SELECT state_name,population_density FROM states
WHERE population_density>=50
ORDER BY population_density ASC;
### 7.
SELECT state_name FROM states
WHERE population BETWEEN 1000000 AND 1500000;
### 8.
SELECT state_name,region_id FROM states
ORDER BY region_id ASC;
### 9.
SELECT region_name FROM regions
WHERE region_name LIKE '%Central';
### 10.
SELECT regions.region_name,states.state_name
FROM states
INNER JOIN regions
ON states.region_id=regions.id
ORDER BY regions.id;

| SELECT * FROM states;
###2.
SELECT * FROM regions;
###3.
SELECT state_name,populations FROM states;
###4.
SELECT state_name,population FROM states
ORDER BY population DESC;
###5.
SELECT state_name FROM states
WHERE region_id=7;
###6.
SELECT state_name,population_density FROM states
WHERE population_density>=50
ORDER BY population_density ASC;
###7.
SELECT state_name FROM states
WHERE population BETWEEN 1000000 AND 1500000;
###8.
SELECT state_name,region_id FROM states
ORDER BY region_id ASC;
###9.
SELECT region_name FROM regions
WHERE region_name LIKE '%Central';
###10.
SELECT regions.region_name,states.state_name
FROM states
INNER JOIN regions
ON states.region_id=regions.id
ORDER BY regions.id;

###What are databases for?
Databases are for storing large amounts of data in an organized way so that it can easily be created, updated, retrieved, and deleted
###What is a one-to-many relationship?
A one to many relationship is one in which the property or field applies to many other fields or properties, the example given was regions have many states. to take it further states have many cities, cities have many families, families have many people.
###What is a primary key? What is a foreign key? How can you determine which is which?
A primary key is the id which is being referenced by a foreign key. It is housed in the table for that property, a foreign key will exist in another table to link it to that information.
###How can you select information out of a SQL database? What are some general guidelines for that?
selecting information can be done using various operators that for the most part take the shape of words. You can find the various operators on w3schools.com but some examples are BETWEEN, WHERE, SELECT. you do not need to capitalize but it is the standard way of doing it so for readability it is a good idea.
| Add reflection to markdown file in database_intro | Add reflection to markdown file in database_intro
| Markdown | mit | thomas-yancey/phase-0,thomas-yancey/phase-0,thomas-yancey/phase-0 | markdown | ## Code Before:
SELECT * FROM states;
### 2.
SELECT * FROM regions;
### 3.
SELECT state_name,populations FROM states;
### 4.
SELECT state_name,population FROM states
ORDER BY population DESC;
### 5.
SELECT state_name FROM states
WHERE region_id=7;
### 6.
SELECT state_name,population_density FROM states
WHERE population_density>=50
ORDER BY population_density ASC;
### 7.
SELECT state_name FROM states
WHERE population BETWEEN 1000000 AND 1500000;
### 8.
SELECT state_name,region_id FROM states
ORDER BY region_id ASC;
### 9.
SELECT region_name FROM regions
WHERE region_name LIKE '%Central';
### 10.
SELECT regions.region_name,states.state_name
FROM states
INNER JOIN regions
ON states.region_id=regions.id
ORDER BY regions.id;

## Instruction:
Add reflection to markdown file in database_intro
## Code After:
SELECT * FROM states;
###2.
SELECT * FROM regions;
###3.
SELECT state_name,populations FROM states;
###4.
SELECT state_name,population FROM states
ORDER BY population DESC;
###5.
SELECT state_name FROM states
WHERE region_id=7;
###6.
SELECT state_name,population_density FROM states
WHERE population_density>=50
ORDER BY population_density ASC;
###7.
SELECT state_name FROM states
WHERE population BETWEEN 1000000 AND 1500000;
###8.
SELECT state_name,region_id FROM states
ORDER BY region_id ASC;
###9.
SELECT region_name FROM regions
WHERE region_name LIKE '%Central';
###10.
SELECT regions.region_name,states.state_name
FROM states
INNER JOIN regions
ON states.region_id=regions.id
ORDER BY regions.id;

###What are databases for?
Databases are for storing large amounts of data in an organized way so that it can easily be created, updated, retrieved, and deleted
###What is a one-to-many relationship?
A one to many relationship is one in which the property or field applies to many other fields or properties, the example given was regions have many states. to take it further states have many cities, cities have many families, families have many people.
###What is a primary key? What is a foreign key? How can you determine which is which?
A primary key is the id which is being referenced by a foreign key. It is housed in the table for that property, a foreign key will exist in another table to link it to that information.
###How can you select information out of a SQL database? What are some general guidelines for that?
selecting information can be done using various operators that for the most part take the shape of words. You can find the various operators on w3schools.com but some examples are BETWEEN, WHERE, SELECT. you do not need to capitalize but it is the standard way of doing it so for readability it is a good idea.
| SELECT * FROM states;
- ### 2.
? -
+ ###2.
SELECT * FROM regions;
- ### 3.
? -
+ ###3.
SELECT state_name,populations FROM states;
- ### 4.
? -
+ ###4.
SELECT state_name,population FROM states
ORDER BY population DESC;
- ### 5.
? -
+ ###5.
SELECT state_name FROM states
WHERE region_id=7;
- ### 6.
? -
+ ###6.
SELECT state_name,population_density FROM states
WHERE population_density>=50
ORDER BY population_density ASC;
- ### 7.
? -
+ ###7.
SELECT state_name FROM states
WHERE population BETWEEN 1000000 AND 1500000;
- ### 8.
? -
+ ###8.
SELECT state_name,region_id FROM states
ORDER BY region_id ASC;
- ### 9.
? -
+ ###9.
SELECT region_name FROM regions
WHERE region_name LIKE '%Central';
- ### 10.
? -
+ ###10.
SELECT regions.region_name,states.state_name
FROM states
INNER JOIN regions
ON states.region_id=regions.id
ORDER BY regions.id;

+
+ ###What are databases for?
+ Databases are for storing large amounts of data in an organized way so that it can easily be created, updated, retrieved, and deleted
+ ###What is a one-to-many relationship?
+ A one to many relationship is one in which the property or field applies to many other fields or properties, the example given was regions have many states. to take it further states have many cities, cities have many families, families have many people.
+ ###What is a primary key? What is a foreign key? How can you determine which is which?
+ A primary key is the id which is being referenced by a foreign key. It is housed in the table for that property, a foreign key will exist in another table to link it to that information.
+ ###How can you select information out of a SQL database? What are some general guidelines for that?
+ selecting information can be done using various operators that for the most part take the shape of words. You can find the various operators on w3schools.com but some examples are BETWEEN, WHERE, SELECT. you do not need to capitalize but it is the standard way of doing it so for readability it is a good idea. | 27 | 0.84375 | 18 | 9 |
e4e0ede5057d2fed03d1157c2029f2e56e57298b | coffee-chats/src/main/java/com/google/step/coffee/servlets/AuthCalendarServlet.java | coffee-chats/src/main/java/com/google/step/coffee/servlets/AuthCalendarServlet.java | package com.google.step.coffee.servlets;
import com.google.api.client.auth.oauth2.Credential;
import com.google.step.coffee.JsonServlet;
import com.google.step.coffee.JsonServletRequest;
import com.google.step.coffee.OAuthService;
import com.google.step.coffee.UserManager;
import java.io.IOException;
import javax.servlet.annotation.WebServlet;
/**
* Manage and check user authorisation for scopes for Google Calendar API in
* <code>OAuthService</code>
*/
@WebServlet("/api/auth/calendar")
public class AuthCalendarServlet extends JsonServlet {
/** Response object for fetching authorisation status of user. */
private static class CalAuthResponse {
private boolean authorised;
private String authLink;
CalAuthResponse(boolean authorised) {
this.authorised = authorised;
}
CalAuthResponse(boolean authorised, String authLink) {
this(authorised);
this.authLink = authLink;
}
}
@Override
public Object get(JsonServletRequest request) throws IOException {
CalAuthResponse responseData;
Credential credentials = OAuthService.getCredentials(UserManager.getCurrentUserId());
if (credentials != null) {
responseData = new CalAuthResponse(true);
} else {
responseData = new CalAuthResponse(false, OAuthService.getAuthURL(request));
}
return responseData;
}
}
| package com.google.step.coffee.servlets;
import com.google.api.client.auth.oauth2.Credential;
import com.google.step.coffee.JsonServlet;
import com.google.step.coffee.JsonServletRequest;
import com.google.step.coffee.OAuthService;
import com.google.step.coffee.UserManager;
import java.io.IOException;
import javax.servlet.annotation.WebServlet;
/**
* Manage and check user authorisation for scopes for Google Calendar API in
* <code>OAuthService</code>
*/
@WebServlet("/api/auth/calendar")
public class AuthCalendarServlet extends JsonServlet {
/** Response object for fetching authorisation status of user. */
private static class CalAuthResponse {
private boolean authorised;
private String authLink;
CalAuthResponse(boolean authorised) {
this.authorised = authorised;
}
CalAuthResponse(boolean authorised, String authLink) {
this(authorised);
this.authLink = authLink;
}
}
@Override
public Object get(JsonServletRequest request) throws IOException {
CalAuthResponse responseData;
if (OAuthService.userHasAuthorised(UserManager.getCurrentUserId())) {
responseData = new CalAuthResponse(false, OAuthService.getAuthURL(request));
} else {
responseData = new CalAuthResponse(true);
}
return responseData;
}
}
| Change credential check to not fetch credential to check existence | Change credential check to not fetch credential to check existence
| Java | apache-2.0 | googleinterns/step250-2020,googleinterns/step250-2020,googleinterns/step250-2020,googleinterns/step250-2020 | java | ## Code Before:
package com.google.step.coffee.servlets;
import com.google.api.client.auth.oauth2.Credential;
import com.google.step.coffee.JsonServlet;
import com.google.step.coffee.JsonServletRequest;
import com.google.step.coffee.OAuthService;
import com.google.step.coffee.UserManager;
import java.io.IOException;
import javax.servlet.annotation.WebServlet;
/**
* Manage and check user authorisation for scopes for Google Calendar API in
* <code>OAuthService</code>
*/
@WebServlet("/api/auth/calendar")
public class AuthCalendarServlet extends JsonServlet {
/** Response object for fetching authorisation status of user. */
private static class CalAuthResponse {
private boolean authorised;
private String authLink;
CalAuthResponse(boolean authorised) {
this.authorised = authorised;
}
CalAuthResponse(boolean authorised, String authLink) {
this(authorised);
this.authLink = authLink;
}
}
@Override
public Object get(JsonServletRequest request) throws IOException {
CalAuthResponse responseData;
Credential credentials = OAuthService.getCredentials(UserManager.getCurrentUserId());
if (credentials != null) {
responseData = new CalAuthResponse(true);
} else {
responseData = new CalAuthResponse(false, OAuthService.getAuthURL(request));
}
return responseData;
}
}
## Instruction:
Change credential check to not fetch credential to check existence
## Code After:
package com.google.step.coffee.servlets;
import com.google.api.client.auth.oauth2.Credential;
import com.google.step.coffee.JsonServlet;
import com.google.step.coffee.JsonServletRequest;
import com.google.step.coffee.OAuthService;
import com.google.step.coffee.UserManager;
import java.io.IOException;
import javax.servlet.annotation.WebServlet;
/**
* Manage and check user authorisation for scopes for Google Calendar API in
* <code>OAuthService</code>
*/
@WebServlet("/api/auth/calendar")
public class AuthCalendarServlet extends JsonServlet {
/** Response object for fetching authorisation status of user. */
private static class CalAuthResponse {
private boolean authorised;
private String authLink;
CalAuthResponse(boolean authorised) {
this.authorised = authorised;
}
CalAuthResponse(boolean authorised, String authLink) {
this(authorised);
this.authLink = authLink;
}
}
@Override
public Object get(JsonServletRequest request) throws IOException {
CalAuthResponse responseData;
if (OAuthService.userHasAuthorised(UserManager.getCurrentUserId())) {
responseData = new CalAuthResponse(false, OAuthService.getAuthURL(request));
} else {
responseData = new CalAuthResponse(true);
}
return responseData;
}
}
| package com.google.step.coffee.servlets;
import com.google.api.client.auth.oauth2.Credential;
import com.google.step.coffee.JsonServlet;
import com.google.step.coffee.JsonServletRequest;
import com.google.step.coffee.OAuthService;
import com.google.step.coffee.UserManager;
import java.io.IOException;
import javax.servlet.annotation.WebServlet;
/**
* Manage and check user authorisation for scopes for Google Calendar API in
* <code>OAuthService</code>
*/
@WebServlet("/api/auth/calendar")
public class AuthCalendarServlet extends JsonServlet {
/** Response object for fetching authorisation status of user. */
private static class CalAuthResponse {
private boolean authorised;
private String authLink;
CalAuthResponse(boolean authorised) {
this.authorised = authorised;
}
CalAuthResponse(boolean authorised, String authLink) {
this(authorised);
this.authLink = authLink;
}
}
@Override
public Object get(JsonServletRequest request) throws IOException {
CalAuthResponse responseData;
- Credential credentials = OAuthService.getCredentials(UserManager.getCurrentUserId());
- if (credentials != null) {
+ if (OAuthService.userHasAuthorised(UserManager.getCurrentUserId())) {
+ responseData = new CalAuthResponse(false, OAuthService.getAuthURL(request));
+ } else {
responseData = new CalAuthResponse(true);
- } else {
- responseData = new CalAuthResponse(false, OAuthService.getAuthURL(request));
}
return responseData;
}
} | 7 | 0.152174 | 3 | 4 |
d3b584fd7cc9436b4e4858d84f4519a97147defc | tests/lsb_release_test.rs | tests/lsb_release_test.rs | extern crate regex;
#[path="../src/lsb_release.rs"]
mod lsb_release;
#[path="../src/utils.rs"]
mod utils;
fn file() -> String {
"
Distributor ID: Debian
Description: Debian GNU/Linux 7.8 (wheezy)
Release: 7.8
Codename: wheezy
".to_string()
}
fn arch_file() -> String {
"
LSB Version: 1.4
Distributor ID: Arch
Description: Arch Linux
Release: rolling
Codename: n/a
".to_string()
}
#[test]
pub fn test_parses_lsb_distro() {
let parse_results = lsb_release::parse(file());
assert_eq!(parse_results.distro, Some("Debian".to_string()));
}
#[test]
pub fn test_parses_lsb_version() {
let parse_results = lsb_release::parse(file());
assert_eq!(parse_results.version, Some("7.8".to_string()));
}
#[test]
pub fn test_parses_arch_lsb_distro() {
let parse_results = lsb_release::parse(arch_file());
assert_eq!(parse_results.distro, Some("Arch".to_string()));
}
| extern crate regex;
#[path="../src/lsb_release.rs"]
mod lsb_release;
#[path="../src/utils.rs"]
mod utils;
fn file() -> String {
"
Distributor ID: Debian
Description: Debian GNU/Linux 7.8 (wheezy)
Release: 7.8
Codename: wheezy
".to_string()
}
fn arch_file() -> String {
"
LSB Version: 1.4
Distributor ID: Arch
Description: Arch Linux
Release: rolling
Codename: n/a
".to_string()
}
#[test]
pub fn test_parses_lsb_distro() {
let parse_results = lsb_release::parse(file());
assert_eq!(parse_results.distro, Some("Debian".to_string()));
}
#[test]
pub fn test_parses_lsb_version() {
let parse_results = lsb_release::parse(file());
assert_eq!(parse_results.version, Some("7.8".to_string()));
}
#[test]
pub fn test_parses_arch_lsb_distro() {
let parse_results = lsb_release::parse(arch_file());
assert_eq!(parse_results.distro, Some("Arch".to_string()));
}
#[test]
pub fn test_parses_arch_lsb_version() {
let parse_results = lsb_release::parse(arch_file());
assert_eq!(parse_results.version, Some("rolling".to_string()));
}
| Test that rolling release is recognized correctly | test: Test that rolling release is recognized correctly
| Rust | mit | schultyy/os_type | rust | ## Code Before:
extern crate regex;
#[path="../src/lsb_release.rs"]
mod lsb_release;
#[path="../src/utils.rs"]
mod utils;
fn file() -> String {
"
Distributor ID: Debian
Description: Debian GNU/Linux 7.8 (wheezy)
Release: 7.8
Codename: wheezy
".to_string()
}
fn arch_file() -> String {
"
LSB Version: 1.4
Distributor ID: Arch
Description: Arch Linux
Release: rolling
Codename: n/a
".to_string()
}
#[test]
pub fn test_parses_lsb_distro() {
let parse_results = lsb_release::parse(file());
assert_eq!(parse_results.distro, Some("Debian".to_string()));
}
#[test]
pub fn test_parses_lsb_version() {
let parse_results = lsb_release::parse(file());
assert_eq!(parse_results.version, Some("7.8".to_string()));
}
#[test]
pub fn test_parses_arch_lsb_distro() {
let parse_results = lsb_release::parse(arch_file());
assert_eq!(parse_results.distro, Some("Arch".to_string()));
}
## Instruction:
test: Test that rolling release is recognized correctly
## Code After:
extern crate regex;
#[path="../src/lsb_release.rs"]
mod lsb_release;
#[path="../src/utils.rs"]
mod utils;
fn file() -> String {
"
Distributor ID: Debian
Description: Debian GNU/Linux 7.8 (wheezy)
Release: 7.8
Codename: wheezy
".to_string()
}
fn arch_file() -> String {
"
LSB Version: 1.4
Distributor ID: Arch
Description: Arch Linux
Release: rolling
Codename: n/a
".to_string()
}
#[test]
pub fn test_parses_lsb_distro() {
let parse_results = lsb_release::parse(file());
assert_eq!(parse_results.distro, Some("Debian".to_string()));
}
#[test]
pub fn test_parses_lsb_version() {
let parse_results = lsb_release::parse(file());
assert_eq!(parse_results.version, Some("7.8".to_string()));
}
#[test]
pub fn test_parses_arch_lsb_distro() {
let parse_results = lsb_release::parse(arch_file());
assert_eq!(parse_results.distro, Some("Arch".to_string()));
}
#[test]
pub fn test_parses_arch_lsb_version() {
let parse_results = lsb_release::parse(arch_file());
assert_eq!(parse_results.version, Some("rolling".to_string()));
}
| extern crate regex;
#[path="../src/lsb_release.rs"]
mod lsb_release;
#[path="../src/utils.rs"]
mod utils;
fn file() -> String {
"
Distributor ID: Debian
Description: Debian GNU/Linux 7.8 (wheezy)
Release: 7.8
Codename: wheezy
".to_string()
}
fn arch_file() -> String {
"
LSB Version: 1.4
Distributor ID: Arch
Description: Arch Linux
Release: rolling
Codename: n/a
".to_string()
}
#[test]
pub fn test_parses_lsb_distro() {
let parse_results = lsb_release::parse(file());
assert_eq!(parse_results.distro, Some("Debian".to_string()));
}
#[test]
pub fn test_parses_lsb_version() {
let parse_results = lsb_release::parse(file());
assert_eq!(parse_results.version, Some("7.8".to_string()));
}
#[test]
pub fn test_parses_arch_lsb_distro() {
let parse_results = lsb_release::parse(arch_file());
assert_eq!(parse_results.distro, Some("Arch".to_string()));
}
+
+ #[test]
+ pub fn test_parses_arch_lsb_version() {
+ let parse_results = lsb_release::parse(arch_file());
+ assert_eq!(parse_results.version, Some("rolling".to_string()));
+ } | 6 | 0.142857 | 6 | 0 |
a23747f415b9d926025d4789d09bd6a0c5b8a2e7 | lib/jsonparser.ts | lib/jsonparser.ts | import epcis = require('./epcisevents');
export module EPCIS {
export class EpcisJsonParser {
constructor() {}
static parseObj(obj: Object) : epcis.EPCIS.EpcisEvent {
if(EpcisJsonParser.isEvent(obj)) {
if(obj['type'] === 'AggregationEvent') {
var agg = new epcis.EPCIS.AggregationEvent();
agg.loadFromObj(obj);
return agg;
} else if(obj['type'] === 'TransformationEvent') {
var trans = new epcis.EPCIS.TransformationEvent();
trans.loadFromObj(obj);
return trans;
}
}
}
static parseJson(json: string) : epcis.EPCIS.Events {
var obj = JSON.parse(json);
return this.parseObj(obj);
}
// just check whether the given object is a valid event object already
static isEvent(obj:any): boolean {
var allowedTypes:Array<string> = ['ObjectEvent', 'AggregationEvent', 'TransactionEvent'];
var type = obj['type'];
if(allowedTypes.indexOf(type) != -1) {
// allowed type
return true;
}
return false;
}
}
} | import epcis = require('./epcisevents');
export module EPCIS {
export class EpcisJsonParser {
constructor() {}
static parseObj(obj: Object) : epcis.EPCIS.EpcisEvent {
if(EpcisJsonParser.isEvent(obj)) {
if(obj['type'] === 'AggregationEvent') {
var agg = epcis.EPCIS.AggregationEvent.loadFromObj(obj);
return agg;
} else if(obj['type'] === 'TransformationEvent') {
var trans = epcis.EPCIS.TransformationEvent.loadFromObj(obj);
return trans;
}
}
}
static parseJson(json: string) : epcis.EPCIS.EpcisEvent {
var obj = JSON.parse(json);
return EpcisJsonParser.parseObj(obj);
}
// just check whether the given object is a valid event object already
static isEvent(obj:any): boolean {
var allowedTypes:Array<string> = ['ObjectEvent', 'AggregationEvent', 'TransactionEvent'];
var type = obj['type'];
if(allowedTypes.indexOf(type) != -1) {
// allowed type
return true;
}
return false;
}
}
} | Use new static parser functions | Use new static parser functions
| TypeScript | mit | matgnt/epcis-js,matgnt/epcis-js | typescript | ## Code Before:
import epcis = require('./epcisevents');
export module EPCIS {
export class EpcisJsonParser {
constructor() {}
static parseObj(obj: Object) : epcis.EPCIS.EpcisEvent {
if(EpcisJsonParser.isEvent(obj)) {
if(obj['type'] === 'AggregationEvent') {
var agg = new epcis.EPCIS.AggregationEvent();
agg.loadFromObj(obj);
return agg;
} else if(obj['type'] === 'TransformationEvent') {
var trans = new epcis.EPCIS.TransformationEvent();
trans.loadFromObj(obj);
return trans;
}
}
}
static parseJson(json: string) : epcis.EPCIS.Events {
var obj = JSON.parse(json);
return this.parseObj(obj);
}
// just check whether the given object is a valid event object already
static isEvent(obj:any): boolean {
var allowedTypes:Array<string> = ['ObjectEvent', 'AggregationEvent', 'TransactionEvent'];
var type = obj['type'];
if(allowedTypes.indexOf(type) != -1) {
// allowed type
return true;
}
return false;
}
}
}
## Instruction:
Use new static parser functions
## Code After:
import epcis = require('./epcisevents');
export module EPCIS {
export class EpcisJsonParser {
constructor() {}
static parseObj(obj: Object) : epcis.EPCIS.EpcisEvent {
if(EpcisJsonParser.isEvent(obj)) {
if(obj['type'] === 'AggregationEvent') {
var agg = epcis.EPCIS.AggregationEvent.loadFromObj(obj);
return agg;
} else if(obj['type'] === 'TransformationEvent') {
var trans = epcis.EPCIS.TransformationEvent.loadFromObj(obj);
return trans;
}
}
}
static parseJson(json: string) : epcis.EPCIS.EpcisEvent {
var obj = JSON.parse(json);
return EpcisJsonParser.parseObj(obj);
}
// just check whether the given object is a valid event object already
static isEvent(obj:any): boolean {
var allowedTypes:Array<string> = ['ObjectEvent', 'AggregationEvent', 'TransactionEvent'];
var type = obj['type'];
if(allowedTypes.indexOf(type) != -1) {
// allowed type
return true;
}
return false;
}
}
} | import epcis = require('./epcisevents');
export module EPCIS {
export class EpcisJsonParser {
constructor() {}
static parseObj(obj: Object) : epcis.EPCIS.EpcisEvent {
if(EpcisJsonParser.isEvent(obj)) {
if(obj['type'] === 'AggregationEvent') {
- var agg = new epcis.EPCIS.AggregationEvent();
? ----
+ var agg = epcis.EPCIS.AggregationEvent.loadFromObj(obj);
? ++++++++++++ +++
- agg.loadFromObj(obj);
return agg;
} else if(obj['type'] === 'TransformationEvent') {
- var trans = new epcis.EPCIS.TransformationEvent();
? ----
+ var trans = epcis.EPCIS.TransformationEvent.loadFromObj(obj);
? ++++++++++++ +++
- trans.loadFromObj(obj);
return trans;
}
}
}
- static parseJson(json: string) : epcis.EPCIS.Events {
? -
+ static parseJson(json: string) : epcis.EPCIS.EpcisEvent {
? +++++
var obj = JSON.parse(json);
- return this.parseObj(obj);
? ^^
+ return EpcisJsonParser.parseObj(obj);
? ^^^ ++++++++++
}
// just check whether the given object is a valid event object already
static isEvent(obj:any): boolean {
var allowedTypes:Array<string> = ['ObjectEvent', 'AggregationEvent', 'TransactionEvent'];
var type = obj['type'];
if(allowedTypes.indexOf(type) != -1) {
// allowed type
return true;
}
return false;
}
}
} | 10 | 0.27027 | 4 | 6 |
e9d2a3e047a1a21983ceffce915c7ed5baf56d43 | lib/ab_panel/mixpanel.rb | lib/ab_panel/mixpanel.rb | require 'mixpanel'
module AbPanel
module Mixpanel
class Tracker < ::Mixpanel::Tracker
def initialize(options={})
@tracker = ::Mixpanel::Tracker.new Config.token, ab_panel_options.merge(options)
end
def ab_panel_options
{
api_key: Config.api_key,
env: AbPanel.env
}
end
def track(event_name, properties, options={})
@tracker.track event_name, properties, options
end
end
class Config
def self.api_key
config['api_key']
end
def self.token
config['token']
end
def self.config
@settings ||= YAML.load(
ERB.new(File.read(File.join(Rails.root, 'config', 'mixpanel.yml'))).result)
end
end
end
end
| require 'mixpanel'
module AbPanel
module Mixpanel
class Tracker < ::Mixpanel::Tracker
def initialize(options={})
@tracker = ::Mixpanel::Tracker.new Config.token, ab_panel_options.merge(options)
end
def ab_panel_options
{
api_key: Config.api_key,
env: AbPanel.env
}
end
def track(event_name, properties, options={})
if defined?(Resque)
Resque.enqueue ResqueTracker, event_name, properties, options
else
@tracker.track event_name, properties, options
end
end
end
class ResqueTracker
@queue = :ab_panel
def self.perform(event_name, properties, options={})
Tracker.new.track(event_name, properties, options)
end
end
class Config
def self.api_key
config['api_key']
end
def self.token
config['token']
end
def self.config
@settings ||= YAML.load(
ERB.new(File.read(File.join(Rails.root, 'config', 'mixpanel.yml'))).result)
end
end
end
end
| Use Resque for background jobs if present. | Use Resque for background jobs if present.
| Ruby | mit | Springest/ab_panel,Springest/ab_panel,Springest/ab_panel | ruby | ## Code Before:
require 'mixpanel'
module AbPanel
module Mixpanel
class Tracker < ::Mixpanel::Tracker
def initialize(options={})
@tracker = ::Mixpanel::Tracker.new Config.token, ab_panel_options.merge(options)
end
def ab_panel_options
{
api_key: Config.api_key,
env: AbPanel.env
}
end
def track(event_name, properties, options={})
@tracker.track event_name, properties, options
end
end
class Config
def self.api_key
config['api_key']
end
def self.token
config['token']
end
def self.config
@settings ||= YAML.load(
ERB.new(File.read(File.join(Rails.root, 'config', 'mixpanel.yml'))).result)
end
end
end
end
## Instruction:
Use Resque for background jobs if present.
## Code After:
require 'mixpanel'
module AbPanel
module Mixpanel
class Tracker < ::Mixpanel::Tracker
def initialize(options={})
@tracker = ::Mixpanel::Tracker.new Config.token, ab_panel_options.merge(options)
end
def ab_panel_options
{
api_key: Config.api_key,
env: AbPanel.env
}
end
def track(event_name, properties, options={})
if defined?(Resque)
Resque.enqueue ResqueTracker, event_name, properties, options
else
@tracker.track event_name, properties, options
end
end
end
class ResqueTracker
@queue = :ab_panel
def self.perform(event_name, properties, options={})
Tracker.new.track(event_name, properties, options)
end
end
class Config
def self.api_key
config['api_key']
end
def self.token
config['token']
end
def self.config
@settings ||= YAML.load(
ERB.new(File.read(File.join(Rails.root, 'config', 'mixpanel.yml'))).result)
end
end
end
end
| require 'mixpanel'
module AbPanel
module Mixpanel
class Tracker < ::Mixpanel::Tracker
def initialize(options={})
@tracker = ::Mixpanel::Tracker.new Config.token, ab_panel_options.merge(options)
end
def ab_panel_options
{
api_key: Config.api_key,
env: AbPanel.env
}
end
def track(event_name, properties, options={})
+ if defined?(Resque)
+ Resque.enqueue ResqueTracker, event_name, properties, options
+ else
- @tracker.track event_name, properties, options
+ @tracker.track event_name, properties, options
? ++
+ end
+ end
+ end
+
+ class ResqueTracker
+ @queue = :ab_panel
+
+ def self.perform(event_name, properties, options={})
+ Tracker.new.track(event_name, properties, options)
end
end
class Config
def self.api_key
config['api_key']
end
def self.token
config['token']
end
def self.config
@settings ||= YAML.load(
ERB.new(File.read(File.join(Rails.root, 'config', 'mixpanel.yml'))).result)
end
end
end
end | 14 | 0.378378 | 13 | 1 |
0c8788e96fa58dea69b4fcb93adafefd54a29f16 | src/ts/index.ts | src/ts/index.ts | export function insertHTMLBeforeBegin(element: HTMLElement, text: string) {
element.insertAdjacentHTML("beforebegin", text);
};
export function insertHTMLAfterBegin(element: HTMLElement, text: string) {
element.insertAdjacentHTML("afterbegin", text);
};
export function insertHTMLBeforeEnd(element: HTMLElement, text: string) {
element.insertAdjacentHTML("beforeend", text);
};
export function insertHTMLAfterEnd(element: HTMLElement, text: string) {
element.insertAdjacentHTML("afterend", text);
};
export function insertElementBeforeBegin(targetElement: HTMLElement, element: HTMLElement) {
targetElement.insertAdjacentElement("beforebegin", element);
};
export function insertElementAfterBegin(targetElement: HTMLElement, element: HTMLElement) {
targetElement.insertAdjacentElement("afterbegin", element);
};
export function insertElementBeforeEnd(targetElement: HTMLElement, element: HTMLElement) {
targetElement.insertAdjacentElement("beforeend", element);
};
export function insertElementAfterEnd(targetElement: HTMLElement, element: HTMLElement) {
targetElement.insertAdjacentElement("afterend", element);
};
| export function insertHTMLBeforeBegin(element: HTMLElement, text: string): void {
element.insertAdjacentHTML("beforebegin", text);
};
export function insertHTMLAfterBegin(element: HTMLElement, text: string): void {
element.insertAdjacentHTML("afterbegin", text);
};
export function insertHTMLBeforeEnd(element: HTMLElement, text: string): void {
element.insertAdjacentHTML("beforeend", text);
};
export function insertHTMLAfterEnd(element: HTMLElement, text: string): void {
element.insertAdjacentHTML("afterend", text);
};
export function insertElementBeforeBegin(targetElement: HTMLElement, element: HTMLElement): HTMLElement | null {
return targetElement.insertAdjacentElement("beforebegin", element);
};
export function insertElementAfterBegin(targetElement: HTMLElement, element: HTMLElement): HTMLElement | null {
return targetElement.insertAdjacentElement("afterbegin", element);
};
export function insertElementBeforeEnd(targetElement: HTMLElement, element: HTMLElement): HTMLElement | null {
return targetElement.insertAdjacentElement("beforeend", element);
};
export function insertElementAfterEnd(targetElement: HTMLElement, element: HTMLElement): HTMLElement | null {
return targetElement.insertAdjacentElement("afterend", element);
};
| Fix return value and type | Fix return value and type
| TypeScript | mit | tee-talog/insert-adjacent-simple,tee-talog/insert-adjacent-simple,tee-talog/insert-adjacent-simple | typescript | ## Code Before:
export function insertHTMLBeforeBegin(element: HTMLElement, text: string) {
element.insertAdjacentHTML("beforebegin", text);
};
export function insertHTMLAfterBegin(element: HTMLElement, text: string) {
element.insertAdjacentHTML("afterbegin", text);
};
export function insertHTMLBeforeEnd(element: HTMLElement, text: string) {
element.insertAdjacentHTML("beforeend", text);
};
export function insertHTMLAfterEnd(element: HTMLElement, text: string) {
element.insertAdjacentHTML("afterend", text);
};
export function insertElementBeforeBegin(targetElement: HTMLElement, element: HTMLElement) {
targetElement.insertAdjacentElement("beforebegin", element);
};
export function insertElementAfterBegin(targetElement: HTMLElement, element: HTMLElement) {
targetElement.insertAdjacentElement("afterbegin", element);
};
export function insertElementBeforeEnd(targetElement: HTMLElement, element: HTMLElement) {
targetElement.insertAdjacentElement("beforeend", element);
};
export function insertElementAfterEnd(targetElement: HTMLElement, element: HTMLElement) {
targetElement.insertAdjacentElement("afterend", element);
};
## Instruction:
Fix return value and type
## Code After:
export function insertHTMLBeforeBegin(element: HTMLElement, text: string): void {
element.insertAdjacentHTML("beforebegin", text);
};
export function insertHTMLAfterBegin(element: HTMLElement, text: string): void {
element.insertAdjacentHTML("afterbegin", text);
};
export function insertHTMLBeforeEnd(element: HTMLElement, text: string): void {
element.insertAdjacentHTML("beforeend", text);
};
export function insertHTMLAfterEnd(element: HTMLElement, text: string): void {
element.insertAdjacentHTML("afterend", text);
};
export function insertElementBeforeBegin(targetElement: HTMLElement, element: HTMLElement): HTMLElement | null {
return targetElement.insertAdjacentElement("beforebegin", element);
};
export function insertElementAfterBegin(targetElement: HTMLElement, element: HTMLElement): HTMLElement | null {
return targetElement.insertAdjacentElement("afterbegin", element);
};
export function insertElementBeforeEnd(targetElement: HTMLElement, element: HTMLElement): HTMLElement | null {
return targetElement.insertAdjacentElement("beforeend", element);
};
export function insertElementAfterEnd(targetElement: HTMLElement, element: HTMLElement): HTMLElement | null {
return targetElement.insertAdjacentElement("afterend", element);
};
| - export function insertHTMLBeforeBegin(element: HTMLElement, text: string) {
+ export function insertHTMLBeforeBegin(element: HTMLElement, text: string): void {
? ++++++
element.insertAdjacentHTML("beforebegin", text);
};
- export function insertHTMLAfterBegin(element: HTMLElement, text: string) {
+ export function insertHTMLAfterBegin(element: HTMLElement, text: string): void {
? ++++++
element.insertAdjacentHTML("afterbegin", text);
};
- export function insertHTMLBeforeEnd(element: HTMLElement, text: string) {
+ export function insertHTMLBeforeEnd(element: HTMLElement, text: string): void {
? ++++++
element.insertAdjacentHTML("beforeend", text);
};
- export function insertHTMLAfterEnd(element: HTMLElement, text: string) {
+ export function insertHTMLAfterEnd(element: HTMLElement, text: string): void {
? ++++++
element.insertAdjacentHTML("afterend", text);
};
- export function insertElementBeforeBegin(targetElement: HTMLElement, element: HTMLElement) {
+ export function insertElementBeforeBegin(targetElement: HTMLElement, element: HTMLElement): HTMLElement | null {
? ++++++++++++++++++++
- targetElement.insertAdjacentElement("beforebegin", element);
+ return targetElement.insertAdjacentElement("beforebegin", element);
? +++++++
};
- export function insertElementAfterBegin(targetElement: HTMLElement, element: HTMLElement) {
+ export function insertElementAfterBegin(targetElement: HTMLElement, element: HTMLElement): HTMLElement | null {
? ++++++++++++++++++++
- targetElement.insertAdjacentElement("afterbegin", element);
+ return targetElement.insertAdjacentElement("afterbegin", element);
? +++++++
};
- export function insertElementBeforeEnd(targetElement: HTMLElement, element: HTMLElement) {
+ export function insertElementBeforeEnd(targetElement: HTMLElement, element: HTMLElement): HTMLElement | null {
? ++++++++++++++++++++
- targetElement.insertAdjacentElement("beforeend", element);
+ return targetElement.insertAdjacentElement("beforeend", element);
? +++++++
};
- export function insertElementAfterEnd(targetElement: HTMLElement, element: HTMLElement) {
+ export function insertElementAfterEnd(targetElement: HTMLElement, element: HTMLElement): HTMLElement | null {
? ++++++++++++++++++++
- targetElement.insertAdjacentElement("afterend", element);
+ return targetElement.insertAdjacentElement("afterend", element);
? +++++++
};
| 24 | 0.923077 | 12 | 12 |
7ff0d5e062a2d21feb1afd55ca5305656fe98460 | README.md | README.md |

Hotlibrary is a library management system. The users can be consultation all the library collection and make book loan request.
|

Hotlibrary is a library management system. The users can be consultation all the library collection and make book loan request.
## How to install
To install the Hotlibrary in your local computer you need:
* Composer
* Bower
* Web Server
* PHP > 5
* PostgreSQL
Clone this project in you machine with git:
```
git clone https://github.com/cronodev/hotlibrary.git
```
After that run __Composer__ and __bower__ inside the project folder to download the project
dependencies:
```
bower install
```
```
composer install
```
To finish the installation get the database script and run in PostgreSQL.
```
hotlibrary/db/database.sql
```
The last step is configurate the database in the project. The path is:
```
hotlibrary/application/config/database.php
```
Set the __username__, __password__ and __database__. Default values:
* username: postgres
* database: HOTLIBRARY
* password: _yourPassword_
| Insert "how to install" in readme | Insert "how to install" in readme
| Markdown | apache-2.0 | cronodev/hotlibrary,cronodev/hotlibrary,cronodev/hotlibrary | markdown | ## Code Before:

Hotlibrary is a library management system. The users can be consultation all the library collection and make book loan request.
## Instruction:
Insert "how to install" in readme
## Code After:

Hotlibrary is a library management system. The users can be consultation all the library collection and make book loan request.
## How to install
To install the Hotlibrary in your local computer you need:
* Composer
* Bower
* Web Server
* PHP > 5
* PostgreSQL
Clone this project in you machine with git:
```
git clone https://github.com/cronodev/hotlibrary.git
```
After that run __Composer__ and __bower__ inside the project folder to download the project
dependencies:
```
bower install
```
```
composer install
```
To finish the installation get the database script and run in PostgreSQL.
```
hotlibrary/db/database.sql
```
The last step is configurate the database in the project. The path is:
```
hotlibrary/application/config/database.php
```
Set the __username__, __password__ and __database__. Default values:
* username: postgres
* database: HOTLIBRARY
* password: _yourPassword_
|

Hotlibrary is a library management system. The users can be consultation all the library collection and make book loan request.
+
+ ## How to install
+
+ To install the Hotlibrary in your local computer you need:
+
+ * Composer
+ * Bower
+ * Web Server
+ * PHP > 5
+ * PostgreSQL
+
+ Clone this project in you machine with git:
+ ```
+ git clone https://github.com/cronodev/hotlibrary.git
+ ```
+
+ After that run __Composer__ and __bower__ inside the project folder to download the project
+ dependencies:
+
+ ```
+ bower install
+ ```
+
+ ```
+ composer install
+ ```
+
+ To finish the installation get the database script and run in PostgreSQL.
+
+ ```
+ hotlibrary/db/database.sql
+ ```
+ The last step is configurate the database in the project. The path is:
+ ```
+ hotlibrary/application/config/database.php
+ ```
+ Set the __username__, __password__ and __database__. Default values:
+ * username: postgres
+ * database: HOTLIBRARY
+ * password: _yourPassword_ | 40 | 10 | 40 | 0 |
e5ff3b3ff2ae0432be7fefcdf61a937a39c9835f | docker_node/create_node_container.sh | docker_node/create_node_container.sh | docker run -d -p 80:8080 -e NODE_ENV=production --link palindromer-mongo --restart=always seanreece/palindromer-node-cent | docker run -d -p 80:8080 -e NODE_ENV=production --link palindromer-mongo:palindromer-mongo --restart=always seanreece/palindromer
| Add Support for Docker Autobuild | Add Support for Docker Autobuild
Change the Docker container build script to use the Docker Hub autobuild image. | Shell | mit | SeanReece/palindromer,SeanReece/palindromer,SeanReece/palindromer | shell | ## Code Before:
docker run -d -p 80:8080 -e NODE_ENV=production --link palindromer-mongo --restart=always seanreece/palindromer-node-cent
## Instruction:
Add Support for Docker Autobuild
Change the Docker container build script to use the Docker Hub autobuild image.
## Code After:
docker run -d -p 80:8080 -e NODE_ENV=production --link palindromer-mongo:palindromer-mongo --restart=always seanreece/palindromer
| - docker run -d -p 80:8080 -e NODE_ENV=production --link palindromer-mongo --restart=always seanreece/palindromer-node-cent
? ----------
+ docker run -d -p 80:8080 -e NODE_ENV=production --link palindromer-mongo:palindromer-mongo --restart=always seanreece/palindromer
? ++++++++++++++++++
| 2 | 2 | 1 | 1 |
c334f19745b252ad5d536b00cd7a032c2e1d603e | Services/ServiceLegacyMavenProxy/src/main/java/fr/synchrotron/soleil/ica/ci/service/legacymavenproxy/HttpArtifactCaller.java | Services/ServiceLegacyMavenProxy/src/main/java/fr/synchrotron/soleil/ica/ci/service/legacymavenproxy/HttpArtifactCaller.java | package fr.synchrotron.soleil.ica.ci.service.legacymavenproxy;
import org.vertx.java.core.Vertx;
import org.vertx.java.core.http.HttpClient;
import org.vertx.java.core.http.HttpServerRequest;
/**
* @author Gregory Boissinot
*/
public class HttpArtifactCaller {
private final Vertx vertx;
private final String repoHost;
private final int repoPort;
private final String repoURIPath;
public HttpArtifactCaller(Vertx vertx,
String repoHost, int repoPort, String repoURIPath) {
this.vertx = vertx;
this.repoHost = repoHost;
this.repoPort = repoPort;
this.repoURIPath = repoURIPath;
}
public Vertx getVertx() {
return vertx;
}
public String getRepoHost() {
return repoHost;
}
public String buildRequestPath(final HttpServerRequest request) {
final String prefix = "/maven";
String artifactPath = request.path().substring(prefix.length() + 1);
return repoURIPath.endsWith("/") ? (repoURIPath + artifactPath) : (repoURIPath + "/" + artifactPath);
}
public HttpClient getPClient() {
return getVertx().createHttpClient()
.setHost(repoHost)
.setPort(repoPort)
.setConnectTimeout(10000);
}
}
| package fr.synchrotron.soleil.ica.ci.service.legacymavenproxy;
import org.vertx.java.core.Vertx;
import org.vertx.java.core.http.HttpClient;
import org.vertx.java.core.http.HttpServerRequest;
/**
* @author Gregory Boissinot
*/
public class HttpArtifactCaller {
private final Vertx vertx;
private final String repoHost;
private final int repoPort;
private final String repoURIPath;
public HttpArtifactCaller(Vertx vertx,
String repoHost, int repoPort, String repoURIPath) {
this.vertx = vertx;
this.repoHost = repoHost;
this.repoPort = repoPort;
this.repoURIPath = repoURIPath;
}
public Vertx getVertx() {
return vertx;
}
public String getRepoHost() {
return repoHost;
}
public String buildRequestPath(final HttpServerRequest request) {
final String prefix = HttpArtifactProxyEndpointVerticle.PROXY_PATH;
String artifactPath = request.path().substring(prefix.length() + 1);
return repoURIPath.endsWith("/") ? (repoURIPath + artifactPath) : (repoURIPath + "/" + artifactPath);
}
public HttpClient getPClient() {
return getVertx().createHttpClient()
.setHost(repoHost)
.setPort(repoPort)
.setConnectTimeout(10000);
}
}
| Fix regression Use the proxy path | Fix regression
Use the proxy path
| Java | mit | synchrotron-soleil-ica/continuous-materials,synchrotron-soleil-ica/continuous-materials,synchrotron-soleil-ica/continuous-materials | java | ## Code Before:
package fr.synchrotron.soleil.ica.ci.service.legacymavenproxy;
import org.vertx.java.core.Vertx;
import org.vertx.java.core.http.HttpClient;
import org.vertx.java.core.http.HttpServerRequest;
/**
* @author Gregory Boissinot
*/
public class HttpArtifactCaller {
private final Vertx vertx;
private final String repoHost;
private final int repoPort;
private final String repoURIPath;
public HttpArtifactCaller(Vertx vertx,
String repoHost, int repoPort, String repoURIPath) {
this.vertx = vertx;
this.repoHost = repoHost;
this.repoPort = repoPort;
this.repoURIPath = repoURIPath;
}
public Vertx getVertx() {
return vertx;
}
public String getRepoHost() {
return repoHost;
}
public String buildRequestPath(final HttpServerRequest request) {
final String prefix = "/maven";
String artifactPath = request.path().substring(prefix.length() + 1);
return repoURIPath.endsWith("/") ? (repoURIPath + artifactPath) : (repoURIPath + "/" + artifactPath);
}
public HttpClient getPClient() {
return getVertx().createHttpClient()
.setHost(repoHost)
.setPort(repoPort)
.setConnectTimeout(10000);
}
}
## Instruction:
Fix regression
Use the proxy path
## Code After:
package fr.synchrotron.soleil.ica.ci.service.legacymavenproxy;
import org.vertx.java.core.Vertx;
import org.vertx.java.core.http.HttpClient;
import org.vertx.java.core.http.HttpServerRequest;
/**
* @author Gregory Boissinot
*/
public class HttpArtifactCaller {
private final Vertx vertx;
private final String repoHost;
private final int repoPort;
private final String repoURIPath;
public HttpArtifactCaller(Vertx vertx,
String repoHost, int repoPort, String repoURIPath) {
this.vertx = vertx;
this.repoHost = repoHost;
this.repoPort = repoPort;
this.repoURIPath = repoURIPath;
}
public Vertx getVertx() {
return vertx;
}
public String getRepoHost() {
return repoHost;
}
public String buildRequestPath(final HttpServerRequest request) {
final String prefix = HttpArtifactProxyEndpointVerticle.PROXY_PATH;
String artifactPath = request.path().substring(prefix.length() + 1);
return repoURIPath.endsWith("/") ? (repoURIPath + artifactPath) : (repoURIPath + "/" + artifactPath);
}
public HttpClient getPClient() {
return getVertx().createHttpClient()
.setHost(repoHost)
.setPort(repoPort)
.setConnectTimeout(10000);
}
}
| package fr.synchrotron.soleil.ica.ci.service.legacymavenproxy;
import org.vertx.java.core.Vertx;
import org.vertx.java.core.http.HttpClient;
import org.vertx.java.core.http.HttpServerRequest;
/**
* @author Gregory Boissinot
*/
public class HttpArtifactCaller {
private final Vertx vertx;
private final String repoHost;
private final int repoPort;
private final String repoURIPath;
public HttpArtifactCaller(Vertx vertx,
String repoHost, int repoPort, String repoURIPath) {
this.vertx = vertx;
this.repoHost = repoHost;
this.repoPort = repoPort;
this.repoURIPath = repoURIPath;
}
public Vertx getVertx() {
return vertx;
}
public String getRepoHost() {
return repoHost;
}
public String buildRequestPath(final HttpServerRequest request) {
- final String prefix = "/maven";
+ final String prefix = HttpArtifactProxyEndpointVerticle.PROXY_PATH;
String artifactPath = request.path().substring(prefix.length() + 1);
return repoURIPath.endsWith("/") ? (repoURIPath + artifactPath) : (repoURIPath + "/" + artifactPath);
}
public HttpClient getPClient() {
return getVertx().createHttpClient()
.setHost(repoHost)
.setPort(repoPort)
.setConnectTimeout(10000);
}
} | 2 | 0.041667 | 1 | 1 |
b891828616f70f52dc09ca42c0960fb8ace74042 | lib/unshorten.rb | lib/unshorten.rb | module Unshorten
class << self
def unshorten(url, options = {})
uri_str = URI.encode(url)
uri = URI.parse(url) rescue nil
return url if uri.nil?
result = Curl::Easy.http_get(uri_str) do |curl|
curl.follow_location = false
end
if result && !result.nil? && result.header_str && !result.header_str.nil? && result.header_str.split('Location: ').count > 1
return result.header_str.split('Location: ')[1].split(' ')[0]
end
end
alias :'[]' :unshorten
end
end
| module Unshorten
class << self
def unshorten(url, options = {})
httpcheck = url.split('http://')
httpscheck = url.split('https://')
return url if httpcheck.count == 0 || httpscheck.count == 0
uri_str = URI.encode(url)
uri = URI.parse(url) rescue nil
return url if uri.nil?
result = Curl::Easy.http_get(uri_str) do |curl|
curl.follow_location = false
end
if result && !result.nil? && result.header_str && !result.header_str.nil? && result.header_str.split('Location: ').count > 1
return result.header_str.split('Location: ')[1].split(' ')[0]
end
return url
end
alias :'[]' :unshorten
end
end
| Fix the check for bad url | Fix the check for bad url
| Ruby | mit | rememberlenny/email-newsletter-stand,rememberlenny/email-newsletter-stand,rememberlenny/newsletterstand,rememberlenny/email-newsletter-stand,rememberlenny/newsletterstand,rememberlenny/newsletterstand | ruby | ## Code Before:
module Unshorten
class << self
def unshorten(url, options = {})
uri_str = URI.encode(url)
uri = URI.parse(url) rescue nil
return url if uri.nil?
result = Curl::Easy.http_get(uri_str) do |curl|
curl.follow_location = false
end
if result && !result.nil? && result.header_str && !result.header_str.nil? && result.header_str.split('Location: ').count > 1
return result.header_str.split('Location: ')[1].split(' ')[0]
end
end
alias :'[]' :unshorten
end
end
## Instruction:
Fix the check for bad url
## Code After:
module Unshorten
class << self
def unshorten(url, options = {})
httpcheck = url.split('http://')
httpscheck = url.split('https://')
return url if httpcheck.count == 0 || httpscheck.count == 0
uri_str = URI.encode(url)
uri = URI.parse(url) rescue nil
return url if uri.nil?
result = Curl::Easy.http_get(uri_str) do |curl|
curl.follow_location = false
end
if result && !result.nil? && result.header_str && !result.header_str.nil? && result.header_str.split('Location: ').count > 1
return result.header_str.split('Location: ')[1].split(' ')[0]
end
return url
end
alias :'[]' :unshorten
end
end
| module Unshorten
class << self
def unshorten(url, options = {})
+ httpcheck = url.split('http://')
+ httpscheck = url.split('https://')
+ return url if httpcheck.count == 0 || httpscheck.count == 0
uri_str = URI.encode(url)
uri = URI.parse(url) rescue nil
return url if uri.nil?
result = Curl::Easy.http_get(uri_str) do |curl|
curl.follow_location = false
end
if result && !result.nil? && result.header_str && !result.header_str.nil? && result.header_str.split('Location: ').count > 1
return result.header_str.split('Location: ')[1].split(' ')[0]
end
+ return url
end
alias :'[]' :unshorten
end
end | 4 | 0.235294 | 4 | 0 |
d0b732f14c7cdf441ff2689712ec7130f0add60b | available-roles.mk | available-roles.mk |
available-roles:
@echo ""
@echo "Available roles:"
@echo ""
@echo "common: a common Ansible role to install a host"
@echo " https://github.com/andreasscherbaum/ansible-common"
@echo "gpdb-ansible: an Ansible role to install Greenplum Database"
@echo " https://github.com/andreasscherbaum/gpdb-ansible"
common:
ifneq (,$(wildcard roles/common))
cd roles/common && git pull
ln -s roles/common/ansible.mk ansible-common.mk
else
git clone https://github.com/andreasscherbaum/ansible-common.git roles/common
endif
gpdb:
ifneq (,$(wildcard roles/gpdb-ansible))
cd roles/gpdb-ansible && git pull
ln -s roles/gpdb-ansible/ansible.mk ansible-gpdb-ansible.mk
else
git clone https://github.com/andreasscherbaum/gpdb-ansible.git roles/gpdb-ansible
endif
|
available-roles:
@echo ""
@echo "Available roles:"
@echo ""
@echo "common: a common Ansible role to install a host"
@echo " https://github.com/andreasscherbaum/ansible-common"
@echo "gpdb-ansible: an Ansible role to install Greenplum Database"
@echo " https://github.com/andreasscherbaum/gpdb-ansible"
common:
ifneq (,$(wildcard roles/common))
cd roles/common && git pull
else
git clone https://github.com/andreasscherbaum/ansible-common.git roles/common
ln -s roles/common/ansible.mk ansible-common.mk
endif
gpdb:
ifneq (,$(wildcard roles/gpdb-ansible))
cd roles/gpdb-ansible && git pull
else
git clone https://github.com/andreasscherbaum/gpdb-ansible.git roles/gpdb-ansible
ln -s roles/gpdb-ansible/ansible.mk ansible-gpdb-ansible.mk
endif
| Move link creation into the right place | Move link creation into the right place
| Makefile | apache-2.0 | andreasscherbaum/ansible-base | makefile | ## Code Before:
available-roles:
@echo ""
@echo "Available roles:"
@echo ""
@echo "common: a common Ansible role to install a host"
@echo " https://github.com/andreasscherbaum/ansible-common"
@echo "gpdb-ansible: an Ansible role to install Greenplum Database"
@echo " https://github.com/andreasscherbaum/gpdb-ansible"
common:
ifneq (,$(wildcard roles/common))
cd roles/common && git pull
ln -s roles/common/ansible.mk ansible-common.mk
else
git clone https://github.com/andreasscherbaum/ansible-common.git roles/common
endif
gpdb:
ifneq (,$(wildcard roles/gpdb-ansible))
cd roles/gpdb-ansible && git pull
ln -s roles/gpdb-ansible/ansible.mk ansible-gpdb-ansible.mk
else
git clone https://github.com/andreasscherbaum/gpdb-ansible.git roles/gpdb-ansible
endif
## Instruction:
Move link creation into the right place
## Code After:
available-roles:
@echo ""
@echo "Available roles:"
@echo ""
@echo "common: a common Ansible role to install a host"
@echo " https://github.com/andreasscherbaum/ansible-common"
@echo "gpdb-ansible: an Ansible role to install Greenplum Database"
@echo " https://github.com/andreasscherbaum/gpdb-ansible"
common:
ifneq (,$(wildcard roles/common))
cd roles/common && git pull
else
git clone https://github.com/andreasscherbaum/ansible-common.git roles/common
ln -s roles/common/ansible.mk ansible-common.mk
endif
gpdb:
ifneq (,$(wildcard roles/gpdb-ansible))
cd roles/gpdb-ansible && git pull
else
git clone https://github.com/andreasscherbaum/gpdb-ansible.git roles/gpdb-ansible
ln -s roles/gpdb-ansible/ansible.mk ansible-gpdb-ansible.mk
endif
|
available-roles:
@echo ""
@echo "Available roles:"
@echo ""
@echo "common: a common Ansible role to install a host"
@echo " https://github.com/andreasscherbaum/ansible-common"
@echo "gpdb-ansible: an Ansible role to install Greenplum Database"
@echo " https://github.com/andreasscherbaum/gpdb-ansible"
common:
ifneq (,$(wildcard roles/common))
cd roles/common && git pull
- ln -s roles/common/ansible.mk ansible-common.mk
else
git clone https://github.com/andreasscherbaum/ansible-common.git roles/common
+ ln -s roles/common/ansible.mk ansible-common.mk
endif
gpdb:
ifneq (,$(wildcard roles/gpdb-ansible))
cd roles/gpdb-ansible && git pull
- ln -s roles/gpdb-ansible/ansible.mk ansible-gpdb-ansible.mk
else
git clone https://github.com/andreasscherbaum/gpdb-ansible.git roles/gpdb-ansible
+ ln -s roles/gpdb-ansible/ansible.mk ansible-gpdb-ansible.mk
endif
| 4 | 0.142857 | 2 | 2 |
c3b0214f80b03b27d94d373f9f08c2e177d16d9b | setup-files/setup-smart-compile.el | setup-files/setup-smart-compile.el | ;; Time-stamp: <2016-05-19 22:15:18 kmodi>
;; Smart Compile
(use-package smart-compile
:defer t
:config
(progn
;; http://stackoverflow.com/a/15724162/1219634
(defun do-execute (exe)
(with-current-buffer "*eshell*"
(goto-char (point-max))
(insert exe)
(eshell-send-input))
(switch-to-buffer-other-window "*eshell*")
(end-of-buffer))
(defun save-compile-execute ()
(interactive)
(lexical-let ((exe (smart-compile-string "./%n"))
finish-callback)
;; When compilation is done, execute the program and remove the
;; callback from `compilation-finish-functions'
(setq finish-callback
(lambda (buf msg)
(do-execute exe)
(setq compilation-finish-functions
(delq finish-callback compilation-finish-functions))))
(push finish-callback compilation-finish-functions))
(smart-compile 1))))
(provide 'setup-smart-compile)
| ;; Time-stamp: <2016-12-04 12:33:25 kmodi>
;; Smart Compile
(use-package smart-compile
:commands (modi/save-compile-execute)
:init
(progn
(bind-keys
:map modi-mode-map
:filter (not (or (derived-mode-p 'emacs-lisp-mode)
(derived-mode-p 'verilog-mode)))
("<f9>" . modi/save-compile-execute)))
:config
(progn
;; Always use C99 standard for compilation
(setcdr (assoc "\\.c\\'" smart-compile-alist) "gcc -O2 %f -lm -o %n -std=gnu99")
;; http://stackoverflow.com/a/15724162/1219634
(defun modi/do-execute (exe)
"Run EXE in eshell."
(eshell) ; Start eshell or switch to an existing eshell session
(goto-char (point-max))
(insert exe)
(eshell-send-input))
(defun modi/save-compile-execute ()
"Save, compile and execute"
(interactive)
(lexical-let ((code-buf (buffer-name))
(exe (smart-compile-string "./%n"))
finish-callback)
(setq finish-callback
(lambda (buf msg)
;; Bury the compilation buffer
(with-selected-window (get-buffer-window "*compilation*")
(bury-buffer))
(modi/do-execute exe)
;; When compilation is done, execute the program and remove the
;; callback from `compilation-finish-functions'
(setq compilation-finish-functions
(delq finish-callback compilation-finish-functions))))
(push finish-callback compilation-finish-functions))
(smart-compile 1))))
(provide 'setup-smart-compile)
| Update how C code is compiled using `smart-compile` | Update how C code is compiled using `smart-compile`
| Emacs Lisp | mit | kaushalmodi/.emacs.d,kaushalmodi/.emacs.d | emacs-lisp | ## Code Before:
;; Time-stamp: <2016-05-19 22:15:18 kmodi>
;; Smart Compile
(use-package smart-compile
:defer t
:config
(progn
;; http://stackoverflow.com/a/15724162/1219634
(defun do-execute (exe)
(with-current-buffer "*eshell*"
(goto-char (point-max))
(insert exe)
(eshell-send-input))
(switch-to-buffer-other-window "*eshell*")
(end-of-buffer))
(defun save-compile-execute ()
(interactive)
(lexical-let ((exe (smart-compile-string "./%n"))
finish-callback)
;; When compilation is done, execute the program and remove the
;; callback from `compilation-finish-functions'
(setq finish-callback
(lambda (buf msg)
(do-execute exe)
(setq compilation-finish-functions
(delq finish-callback compilation-finish-functions))))
(push finish-callback compilation-finish-functions))
(smart-compile 1))))
(provide 'setup-smart-compile)
## Instruction:
Update how C code is compiled using `smart-compile`
## Code After:
;; Time-stamp: <2016-12-04 12:33:25 kmodi>
;; Smart Compile
(use-package smart-compile
:commands (modi/save-compile-execute)
:init
(progn
(bind-keys
:map modi-mode-map
:filter (not (or (derived-mode-p 'emacs-lisp-mode)
(derived-mode-p 'verilog-mode)))
("<f9>" . modi/save-compile-execute)))
:config
(progn
;; Always use C99 standard for compilation
(setcdr (assoc "\\.c\\'" smart-compile-alist) "gcc -O2 %f -lm -o %n -std=gnu99")
;; http://stackoverflow.com/a/15724162/1219634
(defun modi/do-execute (exe)
"Run EXE in eshell."
(eshell) ; Start eshell or switch to an existing eshell session
(goto-char (point-max))
(insert exe)
(eshell-send-input))
(defun modi/save-compile-execute ()
"Save, compile and execute"
(interactive)
(lexical-let ((code-buf (buffer-name))
(exe (smart-compile-string "./%n"))
finish-callback)
(setq finish-callback
(lambda (buf msg)
;; Bury the compilation buffer
(with-selected-window (get-buffer-window "*compilation*")
(bury-buffer))
(modi/do-execute exe)
;; When compilation is done, execute the program and remove the
;; callback from `compilation-finish-functions'
(setq compilation-finish-functions
(delq finish-callback compilation-finish-functions))))
(push finish-callback compilation-finish-functions))
(smart-compile 1))))
(provide 'setup-smart-compile)
| - ;; Time-stamp: <2016-05-19 22:15:18 kmodi>
? ^^ --- ^ ---
+ ;; Time-stamp: <2016-12-04 12:33:25 kmodi>
? +++ ^^ ^^^^
;; Smart Compile
(use-package smart-compile
- :defer t
+ :commands (modi/save-compile-execute)
+ :init
+ (progn
+ (bind-keys
+ :map modi-mode-map
+ :filter (not (or (derived-mode-p 'emacs-lisp-mode)
+ (derived-mode-p 'verilog-mode)))
+ ("<f9>" . modi/save-compile-execute)))
:config
(progn
+ ;; Always use C99 standard for compilation
+ (setcdr (assoc "\\.c\\'" smart-compile-alist) "gcc -O2 %f -lm -o %n -std=gnu99")
+
;; http://stackoverflow.com/a/15724162/1219634
- (defun do-execute (exe)
+ (defun modi/do-execute (exe)
? +++++
- (with-current-buffer "*eshell*"
+ "Run EXE in eshell."
+ (eshell) ; Start eshell or switch to an existing eshell session
- (goto-char (point-max))
? --
+ (goto-char (point-max))
- (insert exe)
? --
+ (insert exe)
- (eshell-send-input))
? --
+ (eshell-send-input))
- (switch-to-buffer-other-window "*eshell*")
- (end-of-buffer))
- (defun save-compile-execute ()
+ (defun modi/save-compile-execute ()
? +++++
+ "Save, compile and execute"
(interactive)
+ (lexical-let ((code-buf (buffer-name))
- (lexical-let ((exe (smart-compile-string "./%n"))
? ------------ ^
+ (exe (smart-compile-string "./%n"))
? ^^^^^^^^^^^^^
finish-callback)
- ;; When compilation is done, execute the program and remove the
- ;; callback from `compilation-finish-functions'
(setq finish-callback
(lambda (buf msg)
+ ;; Bury the compilation buffer
+ (with-selected-window (get-buffer-window "*compilation*")
+ (bury-buffer))
- (do-execute exe)
+ (modi/do-execute exe)
? +++++
+ ;; When compilation is done, execute the program and remove the
+ ;; callback from `compilation-finish-functions'
(setq compilation-finish-functions
(delq finish-callback compilation-finish-functions))))
(push finish-callback compilation-finish-functions))
(smart-compile 1))))
(provide 'setup-smart-compile) | 42 | 1.272727 | 28 | 14 |
5967940b83c14f5047863f1a6529c40f20af676b | jquery.sameheight.js | jquery.sameheight.js | (function($) {
$.fn.sameHeight = function() {
var these = this;
function setHeight() {
var max = 0;
these.height('auto').each(function() {
max = Math.max(max, $(this).height());
}).height(max);
};
$(window).resize(setHeight);
setHeight();
return this;
};
})(jQuery);
| /*
* sameHeight jQuery plugin
* http://github.com/sidewaysmilk/jquery-sameheight
*
* Copyright 2011, Justin Force
* Licensed under the MIT license
*/
jQuery.fn.sameHeight = function() {
var these = this;
function setHeight() {
var max = 0;
these.height('auto').each(function() {
max = Math.max(max, jQuery(this).height());
}).height(max);
};
jQuery(window).resize(setHeight);
setHeight();
return this;
};
| Add header and don't use $ for jQuery | Add header and don't use $ for jQuery
| JavaScript | bsd-3-clause | justinforce/jquery-sameheight | javascript | ## Code Before:
(function($) {
$.fn.sameHeight = function() {
var these = this;
function setHeight() {
var max = 0;
these.height('auto').each(function() {
max = Math.max(max, $(this).height());
}).height(max);
};
$(window).resize(setHeight);
setHeight();
return this;
};
})(jQuery);
## Instruction:
Add header and don't use $ for jQuery
## Code After:
/*
* sameHeight jQuery plugin
* http://github.com/sidewaysmilk/jquery-sameheight
*
* Copyright 2011, Justin Force
* Licensed under the MIT license
*/
jQuery.fn.sameHeight = function() {
var these = this;
function setHeight() {
var max = 0;
these.height('auto').each(function() {
max = Math.max(max, jQuery(this).height());
}).height(max);
};
jQuery(window).resize(setHeight);
setHeight();
return this;
};
| - (function($) {
+ /*
+ * sameHeight jQuery plugin
+ * http://github.com/sidewaysmilk/jquery-sameheight
+ *
+ * Copyright 2011, Justin Force
+ * Licensed under the MIT license
+ */
- $.fn.sameHeight = function() {
? ^^^
+ jQuery.fn.sameHeight = function() {
? ^^^^^^
- var these = this;
? --
+ var these = this;
- function setHeight() {
? --
+ function setHeight() {
- var max = 0;
? --
+ var max = 0;
- these.height('auto').each(function() {
? --
+ these.height('auto').each(function() {
- max = Math.max(max, $(this).height());
? -- ^
+ max = Math.max(max, jQuery(this).height());
? ^^^^^^
- }).height(max);
? --
+ }).height(max);
- };
- $(window).resize(setHeight);
- setHeight();
- return this;
};
- })(jQuery);
+ jQuery(window).resize(setHeight);
+ setHeight();
+ return this;
+ };
+ | 32 | 2.285714 | 19 | 13 |
0fa48c2f4da36ed57c9e84cec9f4c145b8a45947 | components/http-delivery.js | components/http-delivery.js | // http-delivery.js
var _ = require('underscore');
var wrapper = require('../src/javascript-wrapper.js');
/**
* This updater takes response content, response type, and a {req, res} pair as
* input. It then writes the content and type to the response and returns the
* input pair.
*/
module.exports = wrapper(function(content, type, input) {
if (type) input.res.writeHead(200, 'OK', {
'Content-Type': type || 'text/plain',
'Content-Length': content.length
});
input.res.write(content);
return input;
});
| // http-delivery.js
var _ = require('underscore');
var wrapper = require('../src/javascript-wrapper.js');
/**
* This updater takes response content, response type, and a {req, res} pair as
* input. It then writes the content and type to the response and returns the
* input pair.
*/
module.exports = wrapper(function(content, type, input) {
if (_.isArray(content)) {
input.res.writeHead(200, 'OK', {
'Content-Type': type || 'text/plain',
'Content-Length': _.pluck(content, 'length').reduce(add, 0)
});
content.forEach(input.res.write.bind(input.res));
} else if (_.isString(content)) {
input.res.writeHead(200, 'OK', {
'Content-Type': type || 'text/plain',
'Content-Length': content.length
});
input.res.write(content);
} else {
var json = JSON.stringify(content);
input.res.writeHead(200, 'OK', {
'Content-Type': type || 'application/json',
'Content-Length': json.length
});
input.res.write(json);
}
return input;
});
function add(a, b) {
return a + b;
}
| Support array of tokens in HTTP response | Support array of tokens in HTTP response
| JavaScript | apache-2.0 | rdf-pipeline/noflo-rdf-components,rdf-pipeline/noflo-rdf-components,rdf-pipeline/noflo-rdf-components | javascript | ## Code Before:
// http-delivery.js
var _ = require('underscore');
var wrapper = require('../src/javascript-wrapper.js');
/**
* This updater takes response content, response type, and a {req, res} pair as
* input. It then writes the content and type to the response and returns the
* input pair.
*/
module.exports = wrapper(function(content, type, input) {
if (type) input.res.writeHead(200, 'OK', {
'Content-Type': type || 'text/plain',
'Content-Length': content.length
});
input.res.write(content);
return input;
});
## Instruction:
Support array of tokens in HTTP response
## Code After:
// http-delivery.js
var _ = require('underscore');
var wrapper = require('../src/javascript-wrapper.js');
/**
* This updater takes response content, response type, and a {req, res} pair as
* input. It then writes the content and type to the response and returns the
* input pair.
*/
module.exports = wrapper(function(content, type, input) {
if (_.isArray(content)) {
input.res.writeHead(200, 'OK', {
'Content-Type': type || 'text/plain',
'Content-Length': _.pluck(content, 'length').reduce(add, 0)
});
content.forEach(input.res.write.bind(input.res));
} else if (_.isString(content)) {
input.res.writeHead(200, 'OK', {
'Content-Type': type || 'text/plain',
'Content-Length': content.length
});
input.res.write(content);
} else {
var json = JSON.stringify(content);
input.res.writeHead(200, 'OK', {
'Content-Type': type || 'application/json',
'Content-Length': json.length
});
input.res.write(json);
}
return input;
});
function add(a, b) {
return a + b;
}
| // http-delivery.js
var _ = require('underscore');
var wrapper = require('../src/javascript-wrapper.js');
/**
* This updater takes response content, response type, and a {req, res} pair as
* input. It then writes the content and type to the response and returns the
* input pair.
*/
module.exports = wrapper(function(content, type, input) {
+ if (_.isArray(content)) {
- if (type) input.res.writeHead(200, 'OK', {
? -- ^^^^^^
+ input.res.writeHead(200, 'OK', {
? ^^
- 'Content-Type': type || 'text/plain',
+ 'Content-Type': type || 'text/plain',
? ++++
+ 'Content-Length': _.pluck(content, 'length').reduce(add, 0)
+ });
+ content.forEach(input.res.write.bind(input.res));
+ } else if (_.isString(content)) {
+ input.res.writeHead(200, 'OK', {
+ 'Content-Type': type || 'text/plain',
- 'Content-Length': content.length
+ 'Content-Length': content.length
? ++++
- });
+ });
? ++++
- input.res.write(content);
+ input.res.write(content);
? ++++
+ } else {
+ var json = JSON.stringify(content);
+ input.res.writeHead(200, 'OK', {
+ 'Content-Type': type || 'application/json',
+ 'Content-Length': json.length
+ });
+ input.res.write(json);
+ }
return input;
});
+
+ function add(a, b) {
+ return a + b;
+ } | 29 | 1.526316 | 24 | 5 |
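The three-way content dispatch introduced in the `http-delivery.js` record above — an array streamed token by token, a plain string written as-is, anything else serialized as JSON, each with its own `Content-Length` — can be sketched in Python; the function name and the `(headers, chunks)` return shape here are illustrative, not part of the record:

```python
import json

def prepare_response(content, content_type=None):
    """Mirror the dispatch in the updated updater: lists are streamed
    chunk by chunk, strings are written whole, and any other value is
    serialized as JSON with a JSON default content type."""
    if isinstance(content, list):
        headers = {"Content-Type": content_type or "text/plain",
                   "Content-Length": sum(len(c) for c in content)}
        return headers, content
    if isinstance(content, str):
        headers = {"Content-Type": content_type or "text/plain",
                   "Content-Length": len(content)}
        return headers, [content]
    body = json.dumps(content)
    headers = {"Content-Type": content_type or "application/json",
               "Content-Length": len(body)}
    return headers, [body]
```

Note the list branch sums the chunk lengths before any chunk is written, just as the JavaScript version does with `_.pluck(content, 'length').reduce(add, 0)`.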
ee2e8c6832d9d2bf96f30785563cb0734b6069d1 | scripts/update_vagrant_boxes.sh | scripts/update_vagrant_boxes.sh | vagrant box outdated --global 2>/dev/null | grep outdated | tr -d "*'" | cut -d ' ' -f 2 | xargs -I {} vagrant box add --clean {}
# Prune out any old versions
vagrant box prune
| set -ev
# List all the outdated vagrant boxes, and do an upgrade for each one
vagrant box outdated --global 2>/dev/null | grep outdated | tr -d "*'" | cut -d ' ' -f 2 | xargs -I {} vagrant box add --clean {}
# Prune out any old versions
vagrant box prune
| Add stop on error to the vagrant box script (and also the verbose output) | Add stop on error to the vagrant box script (and also the verbose output)
| Shell | mit | triplepoint/dotfiles,triplepoint/dotfiles | shell | ## Code Before:
vagrant box outdated --global 2>/dev/null | grep outdated | tr -d "*'" | cut -d ' ' -f 2 | xargs -I {} vagrant box add --clean {}
# Prune out any old versions
vagrant box prune
## Instruction:
Add stop on error to the vagrant box script (and also the verbose output)
## Code After:
set -ev
# List all the outdated vagrant boxes, and do an upgrade for each one
vagrant box outdated --global 2>/dev/null | grep outdated | tr -d "*'" | cut -d ' ' -f 2 | xargs -I {} vagrant box add --clean {}
# Prune out any old versions
vagrant box prune
| + set -ev
+
+ # List all the outdated vagrant boxes, and do an upgrade for each one
vagrant box outdated --global 2>/dev/null | grep outdated | tr -d "*'" | cut -d ' ' -f 2 | xargs -I {} vagrant box add --clean {}
# Prune out any old versions
vagrant box prune | 3 | 0.75 | 3 | 0 |
700f0b54664351485396d456f4a2b1269cecc285 | _posts/2016-11-28-course-x-mobile-cloud.markdown | _posts/2016-11-28-course-x-mobile-cloud.markdown | ---
published: true
title: (Course) Cross-Platform Mobile App with Cloud Architecture
layout: post
comments: true
categories: course
---
# A Complementary Course Material
ใช้ประกอบการอบรมและทำ workshop สำหรับ Cross-Platform Mobile App with Cloud Architecture
<!-- break -->
For the workshop you might need to register to: [Microsoft Dev Essentials](http://aka.ms/vsdevhelp)
สำหรับคนที่ปิดหน้าต่าง ฺBenefit ไปแล้ว ให้เข้าไปที่ [https://my.visualstudio.com/benefits](https://my.visualstudio.com/benefits)
## Setting up Security Profile
* [ionic Package - Security Profile](http://docs.ionic.io/services/profiles/)
* [Android App KeyStore](http://docs.ionic.io/services/profiles/#android-app-keystore)
## Additional Resources:
* [Microsoft Virtual Academy](https://mva.microsoft.com/)
| ---
published: true
title: (Course) Cross-Platform Mobile App with Cloud Architecture
layout: post
comments: true
categories: course
---
# A Complementary Course Material
ใช้ประกอบการอบรมและทำ workshop สำหรับ Cross-Platform Mobile App with Cloud Architecture
<!-- break -->
For the workshop you might need to register to: [Microsoft Dev Essentials](http://aka.ms/vsdevhelp)
สำหรับคนที่ปิดหน้าต่าง ฺBenefit ไปแล้ว ให้เข้าไปที่ [https://my.visualstudio.com/benefits](https://my.visualstudio.com/benefits)
## Source Code
* [Horse Racer on Github](https://github.com/horseracer/cross-platform-with-cloud)
## Setting up Security Profile
* [ionic Package - Security Profile](http://docs.ionic.io/services/profiles/)
* [Android App KeyStore](http://docs.ionic.io/services/profiles/#android-app-keystore)
## How to use 3rd Party Libraries
* [Ionic Official Doc](http://ionicframework.com/docs/v2/resources/third-party-libs/)
* [How to fix Ionic Script](https://github.com/driftyco/ionic-app-scripts/issues/389)
## Additional Resources:
* [Microsoft Virtual Academy](https://mva.microsoft.com/)
| Add how to use 3rd party lib to Ionic | Add how to use 3rd party lib to Ionic
| Markdown | mit | tlaothong/tlaothong.github.io | markdown | ## Code Before:
---
published: true
title: (Course) Cross-Platform Mobile App with Cloud Architecture
layout: post
comments: true
categories: course
---
# A Complementary Course Material
ใช้ประกอบการอบรมและทำ workshop สำหรับ Cross-Platform Mobile App with Cloud Architecture
<!-- break -->
For the workshop you might need to register to: [Microsoft Dev Essentials](http://aka.ms/vsdevhelp)
สำหรับคนที่ปิดหน้าต่าง ฺBenefit ไปแล้ว ให้เข้าไปที่ [https://my.visualstudio.com/benefits](https://my.visualstudio.com/benefits)
## Setting up Security Profile
* [ionic Package - Security Profile](http://docs.ionic.io/services/profiles/)
* [Android App KeyStore](http://docs.ionic.io/services/profiles/#android-app-keystore)
## Additional Resources:
* [Microsoft Virtual Academy](https://mva.microsoft.com/)
## Instruction:
Add how to use 3rd party lib to Ionic
## Code After:
---
published: true
title: (Course) Cross-Platform Mobile App with Cloud Architecture
layout: post
comments: true
categories: course
---
# A Complementary Course Material
ใช้ประกอบการอบรมและทำ workshop สำหรับ Cross-Platform Mobile App with Cloud Architecture
<!-- break -->
For the workshop you might need to register to: [Microsoft Dev Essentials](http://aka.ms/vsdevhelp)
สำหรับคนที่ปิดหน้าต่าง ฺBenefit ไปแล้ว ให้เข้าไปที่ [https://my.visualstudio.com/benefits](https://my.visualstudio.com/benefits)
## Source Code
* [Horse Racer on Github](https://github.com/horseracer/cross-platform-with-cloud)
## Setting up Security Profile
* [ionic Package - Security Profile](http://docs.ionic.io/services/profiles/)
* [Android App KeyStore](http://docs.ionic.io/services/profiles/#android-app-keystore)
## How to use 3rd Party Libraries
* [Ionic Official Doc](http://ionicframework.com/docs/v2/resources/third-party-libs/)
* [How to fix Ionic Script](https://github.com/driftyco/ionic-app-scripts/issues/389)
## Additional Resources:
* [Microsoft Virtual Academy](https://mva.microsoft.com/)
| ---
published: true
title: (Course) Cross-Platform Mobile App with Cloud Architecture
layout: post
comments: true
categories: course
---
# A Complementary Course Material
ใช้ประกอบการอบรมและทำ workshop สำหรับ Cross-Platform Mobile App with Cloud Architecture
<!-- break -->
For the workshop you might need to register to: [Microsoft Dev Essentials](http://aka.ms/vsdevhelp)
สำหรับคนที่ปิดหน้าต่าง ฺBenefit ไปแล้ว ให้เข้าไปที่ [https://my.visualstudio.com/benefits](https://my.visualstudio.com/benefits)
+ ## Source Code
+ * [Horse Racer on Github](https://github.com/horseracer/cross-platform-with-cloud)
+
## Setting up Security Profile
* [ionic Package - Security Profile](http://docs.ionic.io/services/profiles/)
* [Android App KeyStore](http://docs.ionic.io/services/profiles/#android-app-keystore)
+ ## How to use 3rd Party Libraries
+ * [Ionic Official Doc](http://ionicframework.com/docs/v2/resources/third-party-libs/)
+ * [How to fix Ionic Script](https://github.com/driftyco/ionic-app-scripts/issues/389)
+
## Additional Resources:
* [Microsoft Virtual Academy](https://mva.microsoft.com/) | 7 | 0.318182 | 7 | 0 |
8acaaa22c869bdc6e9cac1d91a7dcc5aec3f1eb3 | package.json | package.json | {
"name": "bourbon.io",
"version": "1.0.0",
"devDependencies": {
"@thoughtbot/stylelint-config": "^0.2.0",
"sassdoc": "^2.5.1",
"stylelint": "^9.10.1"
},
"repository": {
"type": "git",
"url": "git+https://github.com/thoughtbot/bourbon.io.git"
}
}
| {
"name": "bourbon.io",
"version": "1.0.0",
"devDependencies": {
"@thoughtbot/stylelint-config": "^0.2.0",
"sassdoc": "^2.5.1",
"stylelint": "^9.10.1"
},
"repository": {
"type": "git",
"url": "git+https://github.com/thoughtbot/bourbon.io.git"
},
"scripts": {
"stylelint": "npx stylelint 'source/assets/stylesheets/**/*.scss'"
}
}
| Add a `stylelint` npm script | Add a `stylelint` npm script
| JSON | mit | thoughtbot/bourbon.io,thoughtbot/bourbon.io,thoughtbot/bourbon.io,thoughtbot/bourbon.io | json | ## Code Before:
{
"name": "bourbon.io",
"version": "1.0.0",
"devDependencies": {
"@thoughtbot/stylelint-config": "^0.2.0",
"sassdoc": "^2.5.1",
"stylelint": "^9.10.1"
},
"repository": {
"type": "git",
"url": "git+https://github.com/thoughtbot/bourbon.io.git"
}
}
## Instruction:
Add a `stylelint` npm script
## Code After:
{
"name": "bourbon.io",
"version": "1.0.0",
"devDependencies": {
"@thoughtbot/stylelint-config": "^0.2.0",
"sassdoc": "^2.5.1",
"stylelint": "^9.10.1"
},
"repository": {
"type": "git",
"url": "git+https://github.com/thoughtbot/bourbon.io.git"
},
"scripts": {
"stylelint": "npx stylelint 'source/assets/stylesheets/**/*.scss'"
}
}
| {
"name": "bourbon.io",
"version": "1.0.0",
"devDependencies": {
"@thoughtbot/stylelint-config": "^0.2.0",
"sassdoc": "^2.5.1",
"stylelint": "^9.10.1"
},
"repository": {
"type": "git",
"url": "git+https://github.com/thoughtbot/bourbon.io.git"
+ },
+ "scripts": {
+ "stylelint": "npx stylelint 'source/assets/stylesheets/**/*.scss'"
}
} | 3 | 0.230769 | 3 | 0 |
4c18d98b456d8a9f231a7009079f9b00f732c92e | comics/crawler/crawlers/ctrlaltdelsillies.py | comics/crawler/crawlers/ctrlaltdelsillies.py | from comics.crawler.base import BaseComicCrawler
from comics.crawler.meta import BaseComicMeta
class ComicMeta(BaseComicMeta):
name = 'Ctrl+Alt+Del Sillies'
language = 'en'
url = 'http://www.ctrlaltdel-online.com/'
start_date = '2008-06-27'
history_capable_date = '2008-06-27'
schedule = 'Mo,Tu,We,Th,Fr,Sa,Su'
time_zone = -5
rights = 'Tim Buckley'
class ComicCrawler(BaseComicCrawler):
def _get_url(self):
self.url = 'http://www.cad-comic.com/comics/Lite%(date)s.jpg' % {
'date': self.pub_date.strftime('%Y%m%d'),
}
| from comics.crawler.base import BaseComicCrawler
from comics.crawler.meta import BaseComicMeta
class ComicMeta(BaseComicMeta):
name = 'Ctrl+Alt+Del Sillies'
language = 'en'
url = 'http://www.ctrlaltdel-online.com/'
start_date = '2008-06-27'
history_capable_date = '2008-06-27'
schedule = 'Mo,Tu,We,Th,Fr,Sa,Su'
time_zone = -5
rights = 'Tim Buckley'
class ComicCrawler(BaseComicCrawler):
def _get_url(self):
self.url = 'http://www.ctrlaltdel-online.com/comics/Lite%(date)s.gif' % {
'date': self.pub_date.strftime('%Y%m%d'),
}
| Update Ctrl+Alt+Del Sillies crawler with new URL | Update Ctrl+Alt+Del Sillies crawler with new URL
| Python | agpl-3.0 | klette/comics,datagutten/comics,klette/comics,datagutten/comics,jodal/comics,jodal/comics,jodal/comics,datagutten/comics,jodal/comics,klette/comics,datagutten/comics | python | ## Code Before:
from comics.crawler.base import BaseComicCrawler
from comics.crawler.meta import BaseComicMeta
class ComicMeta(BaseComicMeta):
name = 'Ctrl+Alt+Del Sillies'
language = 'en'
url = 'http://www.ctrlaltdel-online.com/'
start_date = '2008-06-27'
history_capable_date = '2008-06-27'
schedule = 'Mo,Tu,We,Th,Fr,Sa,Su'
time_zone = -5
rights = 'Tim Buckley'
class ComicCrawler(BaseComicCrawler):
def _get_url(self):
self.url = 'http://www.cad-comic.com/comics/Lite%(date)s.jpg' % {
'date': self.pub_date.strftime('%Y%m%d'),
}
## Instruction:
Update Ctrl+Alt+Del Sillies crawler with new URL
## Code After:
from comics.crawler.base import BaseComicCrawler
from comics.crawler.meta import BaseComicMeta
class ComicMeta(BaseComicMeta):
name = 'Ctrl+Alt+Del Sillies'
language = 'en'
url = 'http://www.ctrlaltdel-online.com/'
start_date = '2008-06-27'
history_capable_date = '2008-06-27'
schedule = 'Mo,Tu,We,Th,Fr,Sa,Su'
time_zone = -5
rights = 'Tim Buckley'
class ComicCrawler(BaseComicCrawler):
def _get_url(self):
self.url = 'http://www.ctrlaltdel-online.com/comics/Lite%(date)s.gif' % {
'date': self.pub_date.strftime('%Y%m%d'),
}
| from comics.crawler.base import BaseComicCrawler
from comics.crawler.meta import BaseComicMeta
class ComicMeta(BaseComicMeta):
name = 'Ctrl+Alt+Del Sillies'
language = 'en'
url = 'http://www.ctrlaltdel-online.com/'
start_date = '2008-06-27'
history_capable_date = '2008-06-27'
schedule = 'Mo,Tu,We,Th,Fr,Sa,Su'
time_zone = -5
rights = 'Tim Buckley'
class ComicCrawler(BaseComicCrawler):
def _get_url(self):
- self.url = 'http://www.cad-comic.com/comics/Lite%(date)s.jpg' % {
? - ^ ^ --
+ self.url = 'http://www.ctrlaltdel-online.com/comics/Lite%(date)s.gif' % {
? +++ ++ ++ ^^ ^^ ++
'date': self.pub_date.strftime('%Y%m%d'),
} | 2 | 0.111111 | 1 | 1 |
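The crawler fix above only swaps the host and extension in a date-templated URL; the templating pattern itself, taken from the record's own `_get_url`, looks like this standalone (the function name is illustrative):

```python
from datetime import date

URL_TEMPLATE = 'http://www.ctrlaltdel-online.com/comics/Lite%(date)s.gif'

def strip_url(pub_date):
    """Build the per-day strip URL the way the updated crawler does:
    a %Y%m%d-formatted publication date filled into the path."""
    return URL_TEMPLATE % {'date': pub_date.strftime('%Y%m%d')}
```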
fde55045ef2acd2cc35d3bf657ecb20827655f2c | roles/jupyterhub/vars/main.yml | roles/jupyterhub/vars/main.yml | ---
jupyterhub_src_dir: /data/local/jupyterhub
| ---
jupyterhub_src_dir: /data/local/jupyterhub
jupyterhub_modules_dir: /data/local/jupyterhub/modules
| Define location for the extra modules for Jupyter. | Define location for the extra modules for Jupyter.
| YAML | apache-2.0 | nlesc-sherlock/emma,nlesc-sherlock/emma,nlesc-sherlock/emma | yaml | ## Code Before:
---
jupyterhub_src_dir: /data/local/jupyterhub
## Instruction:
Define location for the extra modules for Jupyter.
## Code After:
---
jupyterhub_src_dir: /data/local/jupyterhub
jupyterhub_modules_dir: /data/local/jupyterhub/modules
| ---
jupyterhub_src_dir: /data/local/jupyterhub
+ jupyterhub_modules_dir: /data/local/jupyterhub/modules | 1 | 0.5 | 1 | 0 |
2dd1e4cb4c66b00168f4cdcaae9ac31f9fe2332a | src/Contributors.json | src/Contributors.json | [
"113691352327389188",
"242727621518032896",
"152144229936660482",
"88410718314971136",
"196716483567419403",
"113378870228475904"
]
| {
"113691352327389188": "Creator",
"258750582964092953": "Support Staff",
"340861504255557634": "Support Staff",
"235542521894273024": "Support Staff",
"475701656693112842": "Support Staff",
"242727621518032896": "Contributor",
"183239837988421632": "Contributor",
"196716483567419403": "Contributor",
"174960694196699137": "Contributor",
"339177677326123018": "Contributor",
"152144229936660482": "Contributor"
}
| Update how contributors are handled (json changes) | Update how contributors are handled (json changes) | JSON | apache-2.0 | evaera/RoVer | json | ## Code Before:
[
"113691352327389188",
"242727621518032896",
"152144229936660482",
"88410718314971136",
"196716483567419403",
"113378870228475904"
]
## Instruction:
Update how contributors are handled (json changes)
## Code After:
{
"113691352327389188": "Creator",
"258750582964092953": "Support Staff",
"340861504255557634": "Support Staff",
"235542521894273024": "Support Staff",
"475701656693112842": "Support Staff",
"242727621518032896": "Contributor",
"183239837988421632": "Contributor",
"196716483567419403": "Contributor",
"174960694196699137": "Contributor",
"339177677326123018": "Contributor",
"152144229936660482": "Contributor"
}
| - [
+ {
- "113691352327389188",
? --
+ "113691352327389188": "Creator",
? +++++++++++
- "242727621518032896",
- "152144229936660482",
- "88410718314971136",
- "196716483567419403",
- "113378870228475904"
- ]
+ "258750582964092953": "Support Staff",
+ "340861504255557634": "Support Staff",
+ "235542521894273024": "Support Staff",
+ "475701656693112842": "Support Staff",
+ "242727621518032896": "Contributor",
+ "183239837988421632": "Contributor",
+ "196716483567419403": "Contributor",
+ "174960694196699137": "Contributor",
+ "339177677326123018": "Contributor",
+ "152144229936660482": "Contributor"
+ } | 21 | 2.625 | 13 | 8 |
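The JSON change above replaces a flat list of user IDs with an ID-to-role mapping. The practical effect — membership checks still work, but each ID now carries a label — can be illustrated in Python (the lookup helper is hypothetical, not from the repo):

```python
contributors = {
    "113691352327389188": "Creator",
    "242727621518032896": "Contributor",
}

def role_of(user_id):
    """Return the contributor's role, or None for non-contributors —
    the same O(1) membership test the old list gave, plus a label."""
    return contributors.get(user_id)
```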
32aa82d81750b1ce6e6c77f0e75635297f1d9d58 | roles/install_packages/tasks/main.yml | roles/install_packages/tasks/main.yml |
- name: Print summary of all the repositories enabled on the host
shell: "{{ yum_bin }} repolist all"
- name: Install test dependencies RPMs needed to run the tests
become: yes
become_method: sudo
yum: pkg="{{ item }}" state=latest
with_items: "{{ component_config.setup.install }}"
when: "{{ component_config.setup.install }}"
- name: Remove unwanted rpms
become: yes
become_method: sudo
yum: pkg="{{ item }}" state=absent
with_items: "{{ component_config.setup.remove }}"
when: "{{ component_config.setup.remove }}"
- name: Install packages needed for converting and collecting the logs
become: yes
become_method: sudo
yum: name="{{ logs_packages }}" state=present
|
- name: Print summary of all the repositories enabled on the host
shell: "{{ yum_bin }} repolist all"
- name: Install test dependencies RPMs needed to run the tests
become: yes
become_method: sudo
yum: pkg="{{ item }}" state=latest
with_items: "{{ component_config.setup.install | default([]) }}"
when: component_config.setup | default(false) and component_config.setup.install | default(false)
- name: Remove unwanted rpms
become: yes
become_method: sudo
yum: pkg="{{ item }}" state=absent
with_items: "{{ component_config.setup.remove | default([]) }}"
when: component_config.setup | default(false) and component_config.setup.remove | default(false)
- name: Install packages needed for converting and collecting the logs
become: yes
become_method: sudo
yum: name="{{ logs_packages }}" state=present
| Correct skip logic in install_packages | Correct skip logic in install_packages
[DEPRECATION WARNING]: Skipping task due to undefined attribute,
in the future this will be a fatal error..
according to https://github.com/ansible/ansible/issues/13274
the correct behavior is to include | default([]) on with_items if
may not be defined
Change-Id: I070de78e1d8e01274a62b2400f2b675c5e26382b
| YAML | apache-2.0 | redhat-openstack/octario,redhat-openstack/octario | yaml | ## Code Before:
- name: Print summary of all the repositories enabled on the host
shell: "{{ yum_bin }} repolist all"
- name: Install test dependencies RPMs needed to run the tests
become: yes
become_method: sudo
yum: pkg="{{ item }}" state=latest
with_items: "{{ component_config.setup.install }}"
when: "{{ component_config.setup.install }}"
- name: Remove unwanted rpms
become: yes
become_method: sudo
yum: pkg="{{ item }}" state=absent
with_items: "{{ component_config.setup.remove }}"
when: "{{ component_config.setup.remove }}"
- name: Install packages needed for converting and collecting the logs
become: yes
become_method: sudo
yum: name="{{ logs_packages }}" state=present
## Instruction:
Correct skip logic in install_packages
[DEPRECATION WARNING]: Skipping task due to undefined attribute,
in the future this will be a fatal error..
according to https://github.com/ansible/ansible/issues/13274
the correct behavior is to include | default([]) on with_items if
may not be defined
Change-Id: I070de78e1d8e01274a62b2400f2b675c5e26382b
## Code After:
- name: Print summary of all the repositories enabled on the host
shell: "{{ yum_bin }} repolist all"
- name: Install test dependencies RPMs needed to run the tests
become: yes
become_method: sudo
yum: pkg="{{ item }}" state=latest
with_items: "{{ component_config.setup.install | default([]) }}"
when: component_config.setup | default(false) and component_config.setup.install | default(false)
- name: Remove unwanted rpms
become: yes
become_method: sudo
yum: pkg="{{ item }}" state=absent
with_items: "{{ component_config.setup.remove | default([]) }}"
when: component_config.setup | default(false) and component_config.setup.remove | default(false)
- name: Install packages needed for converting and collecting the logs
become: yes
become_method: sudo
yum: name="{{ logs_packages }}" state=present
|
- name: Print summary of all the repositories enabled on the host
shell: "{{ yum_bin }} repolist all"
- name: Install test dependencies RPMs needed to run the tests
become: yes
become_method: sudo
yum: pkg="{{ item }}" state=latest
- with_items: "{{ component_config.setup.install }}"
+ with_items: "{{ component_config.setup.install | default([]) }}"
? ++++++++++++++
- when: "{{ component_config.setup.install }}"
+ when: component_config.setup | default(false) and component_config.setup.install | default(false)
- name: Remove unwanted rpms
become: yes
become_method: sudo
yum: pkg="{{ item }}" state=absent
- with_items: "{{ component_config.setup.remove }}"
+ with_items: "{{ component_config.setup.remove | default([]) }}"
? ++++++++++++++
- when: "{{ component_config.setup.remove }}"
+ when: component_config.setup | default(false) and component_config.setup.remove | default(false)
- name: Install packages needed for converting and collecting the logs
become: yes
become_method: sudo
yum: name="{{ logs_packages }}" state=present | 8 | 0.363636 | 4 | 4 |
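The fix in the record above guards `with_items` and `when` against an undefined `setup` section using Jinja's `default` filter, so a missing attribute skips the task instead of raising. The same defensive shape in plain Python (helper name illustrative):

```python
def packages_to_install(component_config):
    """Equivalent of `component_config.setup.install | default([])`:
    a missing `setup` block or `install` list yields an empty task
    list rather than a fatal undefined-attribute error."""
    setup = (component_config or {}).get("setup") or {}
    return setup.get("install") or []
```

Iterating over the returned list then naturally does nothing when the configuration omits the section, which is exactly the skip behavior the playbook change restores.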
1f32d06c4e075439f93ec03f645b00d9750fc595 | t/130-mode/all.do | t/130-mode/all.do | umask 0022
redo mode1
MODE="$(ls -l mode1 | cut -d' ' -f1)"
[ "$MODE" = "-rw-r--r--" ] || exit 78
| umask 0022
redo mode1
MODE=$(python -c 'import os; print oct(os.stat("mode1").st_mode & 07777)')
[ "$MODE" = "0644" ] || exit 78
| Fix t/130-mode: "ls -l" output is not always as expected | Fix t/130-mode: "ls -l" output is not always as expected
[tweaked by apenwarr to remove dependency on non-portable /usr/bin/stat]
| Stata | apache-2.0 | apenwarr/redo,apenwarr/redo,apenwarr/redo,apenwarr/redo | stata | ## Code Before:
umask 0022
redo mode1
MODE="$(ls -l mode1 | cut -d' ' -f1)"
[ "$MODE" = "-rw-r--r--" ] || exit 78
## Instruction:
Fix t/130-mode: "ls -l" output is not always as expected
[tweaked by apenwarr to remove dependency on non-portable /usr/bin/stat]
## Code After:
umask 0022
redo mode1
MODE=$(python -c 'import os; print oct(os.stat("mode1").st_mode & 07777)')
[ "$MODE" = "0644" ] || exit 78
| umask 0022
redo mode1
- MODE="$(ls -l mode1 | cut -d' ' -f1)"
+ MODE=$(python -c 'import os; print oct(os.stat("mode1").st_mode & 07777)')
- [ "$MODE" = "-rw-r--r--" ] || exit 78
? ^^^^^^^^^^
+ [ "$MODE" = "0644" ] || exit 78
? ^^^^
| 4 | 0.8 | 2 | 2 |
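The test fix above stops parsing `ls -l` (whose column layout varies across platforms) and asks `os.stat` for the permission bits directly. The record's one-liner is Python 2; the same check in Python 3 syntax:

```python
import os
import tempfile

def mode_octal(path):
    """Permission bits of `path` as an octal string, e.g. '0o644' —
    the portable equivalent of eyeballing `ls -l` output."""
    return oct(os.stat(path).st_mode & 0o7777)
```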
2937a0dca08a6afbd02ce0c744d9c266cafb2328 | .travis.yml | .travis.yml | language: node_js
node_js:
- "9"
- "8"
- "7"
- "6"
- "5"
- "4"
sudo: false
before_install: if [[ `npm -v | cut -d . -f 1` -lt 3 ]]; then npm i -g npm@3; fi
| language: node_js
node_js:
- "9"
- "8"
- "7"
- "6"
sudo: false
before_install: if [[ `npm -v | cut -d . -f 1` -lt 3 ]]; then npm i -g npm@3; fi
| Drop support for Node 4 and 5, as Babel no longer supports them. | Drop support for Node 4 and 5, as Babel no longer supports them.
https://github.com/babel/babel/releases/tag/v7.0.0-beta.45
| YAML | mit | benjamn/reify,benjamn/reify | yaml | ## Code Before:
language: node_js
node_js:
- "9"
- "8"
- "7"
- "6"
- "5"
- "4"
sudo: false
before_install: if [[ `npm -v | cut -d . -f 1` -lt 3 ]]; then npm i -g npm@3; fi
## Instruction:
Drop support for Node 4 and 5, as Babel no longer supports them.
https://github.com/babel/babel/releases/tag/v7.0.0-beta.45
## Code After:
language: node_js
node_js:
- "9"
- "8"
- "7"
- "6"
sudo: false
before_install: if [[ `npm -v | cut -d . -f 1` -lt 3 ]]; then npm i -g npm@3; fi
| language: node_js
node_js:
- "9"
- "8"
- "7"
- "6"
- - "5"
- - "4"
sudo: false
before_install: if [[ `npm -v | cut -d . -f 1` -lt 3 ]]; then npm i -g npm@3; fi | 2 | 0.153846 | 0 | 2 |
c8d55024519477dbf2bd53af6725a54161bad2af | views/js/displayGetContent.js | views/js/displayGetContent.js | $(document).ready(function() {
function displaySellermaniaCredentials()
{
if ($('#sm_import_orders_yes').attr('checked') == 'checked')
$('#sm_import_orders_credentials').fadeIn();
else
$('#sm_import_orders_credentials').fadeOut();
return true;
}
$('#sm_import_orders_yes').click(function() { return displaySellermaniaCredentials(); });
$('#sm_import_orders_no').click(function() { return displaySellermaniaCredentials(); });
displaySellermaniaCredentials();
}); | $(document).ready(function() {
function displaySellermaniaCredentials()
{
if ($('#sm_import_orders_yes').attr('checked') == 'checked' || $('#sm_import_orders_yes').attr('checked') == true)
$('#sm_import_orders_credentials').fadeIn();
else
$('#sm_import_orders_credentials').fadeOut();
return true;
}
$('#sm_import_orders_yes').click(function() { return displaySellermaniaCredentials(); });
$('#sm_import_orders_no').click(function() { return displaySellermaniaCredentials(); });
displaySellermaniaCredentials();
}); | Fix JS compatibility for PS 1.4 | Fix JS compatibility for PS 1.4
| JavaScript | apache-2.0 | FabienSerny/sellermania-prestashop,FabienSerny/sellermania-prestashop,FabienSerny/sellermania-prestashop | javascript | ## Code Before:
$(document).ready(function() {
function displaySellermaniaCredentials()
{
if ($('#sm_import_orders_yes').attr('checked') == 'checked')
$('#sm_import_orders_credentials').fadeIn();
else
$('#sm_import_orders_credentials').fadeOut();
return true;
}
$('#sm_import_orders_yes').click(function() { return displaySellermaniaCredentials(); });
$('#sm_import_orders_no').click(function() { return displaySellermaniaCredentials(); });
displaySellermaniaCredentials();
});
## Instruction:
Fix JS compatibility for PS 1.4
## Code After:
$(document).ready(function() {
function displaySellermaniaCredentials()
{
if ($('#sm_import_orders_yes').attr('checked') == 'checked' || $('#sm_import_orders_yes').attr('checked') == true)
$('#sm_import_orders_credentials').fadeIn();
else
$('#sm_import_orders_credentials').fadeOut();
return true;
}
$('#sm_import_orders_yes').click(function() { return displaySellermaniaCredentials(); });
$('#sm_import_orders_no').click(function() { return displaySellermaniaCredentials(); });
displaySellermaniaCredentials();
}); | $(document).ready(function() {
function displaySellermaniaCredentials()
{
- if ($('#sm_import_orders_yes').attr('checked') == 'checked')
+ if ($('#sm_import_orders_yes').attr('checked') == 'checked' || $('#sm_import_orders_yes').attr('checked') == true)
$('#sm_import_orders_credentials').fadeIn();
else
$('#sm_import_orders_credentials').fadeOut();
return true;
}
$('#sm_import_orders_yes').click(function() { return displaySellermaniaCredentials(); });
$('#sm_import_orders_no').click(function() { return displaySellermaniaCredentials(); });
displaySellermaniaCredentials();
}); | 2 | 0.133333 | 1 | 1 |
1a1d6abc84de04880bc5737f6af1aee8ef9a63c5 | Source/Shaders/CompositeOITFS.glsl | Source/Shaders/CompositeOITFS.glsl | /**
* Compositing for Weighted Blended Order-Independent Transparency. See:
* http://jcgt.org/published/0002/02/09/
*/
uniform sampler2D u_opaque;
uniform sampler2D u_accumulation;
uniform sampler2D u_revealage;
varying vec2 v_textureCoordinates;
void main()
{
vec4 opaque = texture2D(u_opaque, v_textureCoordinates);
vec4 accum = texture2D(u_accumulation, v_textureCoordinates);
float r = texture2D(u_revealage, v_textureCoordinates).r;
#ifdef MRT
vec4 transparent = vec4(accum.rgb / clamp(r, 1e-4, 5e4), accum.a);
#else
vec4 transparent = vec4(accum.rgb / clamp(accum.a, 1e-4, 5e4), r);
#endif
gl_FragColor = (1.0 - transparent.a) * transparent + transparent.a * opaque;
} | /**
* Compositing for Weighted Blended Order-Independent Transparency. See:
* http://jcgt.org/published/0002/02/09/
*/
uniform sampler2D u_opaque;
uniform sampler2D u_accumulation;
uniform sampler2D u_revealage;
varying vec2 v_textureCoordinates;
void main()
{
vec4 opaque = texture2D(u_opaque, v_textureCoordinates);
vec4 accum = texture2D(u_accumulation, v_textureCoordinates);
float r = texture2D(u_revealage, v_textureCoordinates).r;
#ifdef MRT
vec4 transparent = vec4(accum.rgb / clamp(r, 1e-4, 5e4), accum.a);
#else
vec4 transparent = vec4(accum.rgb / clamp(accum.a, 1e-4, 5e4), r);
#endif
transparent.rgb = mix(vec3(0.0), transparent.rgb, 1.0 - transparent.a);
gl_FragColor = (1.0 - transparent.a) * transparent + transparent.a * opaque;
}
subject: Adjust brightness of translucent geometry proportional to the alpha.
| GLSL | apache-2.0 | soceur/cesium,ggetz/cesium,denverpierce/cesium,NaderCHASER/cesium,YonatanKra/cesium,hodbauer/cesium,likangning93/cesium,AnimatedRNG/cesium,kaktus40/cesium,josh-bernstein/cesium,likangning93/cesium,denverpierce/cesium,emackey/cesium,AnalyticalGraphicsInc/cesium,CesiumGS/cesium,esraerik/cesium,CesiumGS/cesium,josh-bernstein/cesium,esraerik/cesium,YonatanKra/cesium,oterral/cesium,soceur/cesium,kiselev-dv/cesium,aelatgt/cesium,NaderCHASER/cesium,denverpierce/cesium,hodbauer/cesium,hodbauer/cesium,jasonbeverage/cesium,omh1280/cesium,wallw-bits/cesium,ggetz/cesium,oterral/cesium,progsung/cesium,YonatanKra/cesium,jason-crow/cesium,jason-crow/cesium,esraerik/cesium,AnimatedRNG/cesium,AnimatedRNG/cesium,oterral/cesium,esraerik/cesium,omh1280/cesium,likangning93/cesium,AnalyticalGraphicsInc/cesium,kiselev-dv/cesium,omh1280/cesium,progsung/cesium,wallw-bits/cesium,aelatgt/cesium,AnimatedRNG/cesium,denverpierce/cesium,jason-crow/cesium,ggetz/cesium,jason-crow/cesium,kiselev-dv/cesium,omh1280/cesium,aelatgt/cesium,soceur/cesium,ggetz/cesium,wallw-bits/cesium,kiselev-dv/cesium,geoscan/cesium,likangning93/cesium,NaderCHASER/cesium,emackey/cesium,jasonbeverage/cesium,CesiumGS/cesium,emackey/cesium,CesiumGS/cesium,kaktus40/cesium,geoscan/cesium,emackey/cesium,wallw-bits/cesium,aelatgt/cesium,CesiumGS/cesium,likangning93/cesium,YonatanKra/cesium | glsl | ## Code Before:
/**
* Compositing for Weighted Blended Order-Independent Transparency. See:
* http://jcgt.org/published/0002/02/09/
*/
uniform sampler2D u_opaque;
uniform sampler2D u_accumulation;
uniform sampler2D u_revealage;
varying vec2 v_textureCoordinates;
void main()
{
vec4 opaque = texture2D(u_opaque, v_textureCoordinates);
vec4 accum = texture2D(u_accumulation, v_textureCoordinates);
float r = texture2D(u_revealage, v_textureCoordinates).r;
#ifdef MRT
vec4 transparent = vec4(accum.rgb / clamp(r, 1e-4, 5e4), accum.a);
#else
vec4 transparent = vec4(accum.rgb / clamp(accum.a, 1e-4, 5e4), r);
#endif
gl_FragColor = (1.0 - transparent.a) * transparent + transparent.a * opaque;
}
## Instruction:
Adjust brightness of translucent geometry proportional to the alpha.
## Code After:
/**
* Compositing for Weighted Blended Order-Independent Transparency. See:
* http://jcgt.org/published/0002/02/09/
*/
uniform sampler2D u_opaque;
uniform sampler2D u_accumulation;
uniform sampler2D u_revealage;
varying vec2 v_textureCoordinates;
void main()
{
vec4 opaque = texture2D(u_opaque, v_textureCoordinates);
vec4 accum = texture2D(u_accumulation, v_textureCoordinates);
float r = texture2D(u_revealage, v_textureCoordinates).r;
#ifdef MRT
vec4 transparent = vec4(accum.rgb / clamp(r, 1e-4, 5e4), accum.a);
#else
vec4 transparent = vec4(accum.rgb / clamp(accum.a, 1e-4, 5e4), r);
#endif
transparent.rgb = mix(vec3(0.0), transparent.rgb, 1.0 - transparent.a);
gl_FragColor = (1.0 - transparent.a) * transparent + transparent.a * opaque;
}
## Diff:
/**
* Compositing for Weighted Blended Order-Independent Transparency. See:
* http://jcgt.org/published/0002/02/09/
*/
uniform sampler2D u_opaque;
uniform sampler2D u_accumulation;
uniform sampler2D u_revealage;
varying vec2 v_textureCoordinates;
void main()
{
vec4 opaque = texture2D(u_opaque, v_textureCoordinates);
vec4 accum = texture2D(u_accumulation, v_textureCoordinates);
float r = texture2D(u_revealage, v_textureCoordinates).r;
#ifdef MRT
vec4 transparent = vec4(accum.rgb / clamp(r, 1e-4, 5e4), accum.a);
#else
vec4 transparent = vec4(accum.rgb / clamp(accum.a, 1e-4, 5e4), r);
#endif
+ transparent.rgb = mix(vec3(0.0), transparent.rgb, 1.0 - transparent.a);
gl_FragColor = (1.0 - transparent.a) * transparent + transparent.a * opaque;
}
diff_length: 1 | relative_diff_length: 0.04 | n_lines_added: 1 | n_lines_deleted: 0
commit: d44ba81c4ff0b489e861dbb9312ffc98330e9a3b
old_file: app/backand/js/global.js
new_file: app/backand/js/global.js
## Old Contents:
var backandGlobal = {
url: "http://localhost:4109/backapi", //"https://api.backand.com:8080",//
defaultApp: null,
};
//load the backand banner
$(document).ready(function () {
angular.element(document).ready(function () {
backand.security.authentication.addLoginEvent();
document.dispatchEvent(backand.security.authentication.onlogin);
});
});
var zfill = function (num, len) { return (Array(len).join("0") + num).slice(-len); }
## New Contents:
var backandGlobal = {
url: "https://www.backand.com:8080",//
defaultApp: null,
};
//load the backand banner
$(document).ready(function () {
angular.element(document).ready(function () {
backand.security.authentication.addLoginEvent();
document.dispatchEvent(backand.security.authentication.onlogin);
});
});
var zfill = function (num, len) { return (Array(len).join("0") + num).slice(-len); }
subject: Move to production - DO NOT CHANGE
lang: JavaScript | license: mit | config: javascript
repos: backand/angularbknd,backand/diamining,itayher/scr-rma,backand/angularbknd
## Code Before:
var backandGlobal = {
url: "http://localhost:4109/backapi", //"https://api.backand.com:8080",//
defaultApp: null,
};
//load the backand banner
$(document).ready(function () {
angular.element(document).ready(function () {
backand.security.authentication.addLoginEvent();
document.dispatchEvent(backand.security.authentication.onlogin);
});
});
var zfill = function (num, len) { return (Array(len).join("0") + num).slice(-len); }
## Instruction:
Move to production - DO NOT CHANGE
## Code After:
var backandGlobal = {
url: "https://www.backand.com:8080",//
defaultApp: null,
};
//load the backand banner
$(document).ready(function () {
angular.element(document).ready(function () {
backand.security.authentication.addLoginEvent();
document.dispatchEvent(backand.security.authentication.onlogin);
});
});
var zfill = function (num, len) { return (Array(len).join("0") + num).slice(-len); }
## Diff:
var backandGlobal = {
- url: "http://localhost:4109/backapi", //"https://api.backand.com:8080",//
+ url: "https://www.backand.com:8080",//
defaultApp: null,
};
//load the backand banner
$(document).ready(function () {
angular.element(document).ready(function () {
backand.security.authentication.addLoginEvent();
document.dispatchEvent(backand.security.authentication.onlogin);
});
});
-
var zfill = function (num, len) { return (Array(len).join("0") + num).slice(-len); }
diff_length: 3 | relative_diff_length: 0.2 | n_lines_added: 1 | n_lines_deleted: 2
commit: d00de512059c0bc289f54ae0bf611286ac115e7c
old_file: spec/run-all.sh
new_file: spec/run-all.sh
## Old Contents:
for required_variable in \
GOOGLE_CLOUD_PROJECT \
GOOGLE_APPLICATION_CREDENTIALS \
GOOGLE_CLOUD_STORAGE_BUCKET \
ALTERNATE_GOOGLE_CLOUD_STORAGE_BUCKET \
TRANSLATE_API_KEY \
; do
if [[ -z "${!required_variable}" ]]; then
echo "Must set $required_variable"
exit 1
fi
done
script_directory="$(dirname "`realpath $0`")"
repo_directory="$(dirname $script_directory)"
status_return=0 # everything passed
# Print out Ruby version
ruby --version
# Run Tets
for product in \
bigquery \
datastore \
endpoints/getting-started \
language \
logging \
pubsub \
speech \
storage \
translate \
vision \
; do
echo "[$product]"
cd "$repo_directory/$product/"
bundle install && bundle exec rspec --format documentation
# Check status of bundle exec rspec
if [ $? != 0 ]; then
status_return=1
fi
done
exit $status_return
## New Contents:
for required_variable in \
GOOGLE_CLOUD_PROJECT \
GOOGLE_APPLICATION_CREDENTIALS \
GOOGLE_CLOUD_STORAGE_BUCKET \
ALTERNATE_GOOGLE_CLOUD_STORAGE_BUCKET \
; do
if [[ -z "${!required_variable}" ]]; then
echo "Must set $required_variable"
exit 1
fi
done
script_directory="$(dirname "`realpath $0`")"
repo_directory="$(dirname $script_directory)"
status_return=0 # everything passed
# Print out Ruby version
ruby --version
# Run Tets
for product in \
bigquery \
datastore \
endpoints/getting-started \
language \
logging \
pubsub \
speech \
storage \
translate \
vision \
; do
echo "[$product]"
cd "$repo_directory/$product/"
bundle install && bundle exec rspec --format documentation
# Check status of bundle exec rspec
if [ $? != 0 ]; then
status_return=1
fi
done
exit $status_return
subject: Remove TRANSLATE_API_KEY ~ no longer needed
lang: Shell | license: apache-2.0 | config: shell
repos: GoogleCloudPlatform/ruby-docs-samples,GoogleCloudPlatform/ruby-docs-samples,GoogleCloudPlatform/ruby-docs-samples,GoogleCloudPlatform/ruby-docs-samples
## Code Before:
for required_variable in \
GOOGLE_CLOUD_PROJECT \
GOOGLE_APPLICATION_CREDENTIALS \
GOOGLE_CLOUD_STORAGE_BUCKET \
ALTERNATE_GOOGLE_CLOUD_STORAGE_BUCKET \
TRANSLATE_API_KEY \
; do
if [[ -z "${!required_variable}" ]]; then
echo "Must set $required_variable"
exit 1
fi
done
script_directory="$(dirname "`realpath $0`")"
repo_directory="$(dirname $script_directory)"
status_return=0 # everything passed
# Print out Ruby version
ruby --version
# Run Tets
for product in \
bigquery \
datastore \
endpoints/getting-started \
language \
logging \
pubsub \
speech \
storage \
translate \
vision \
; do
echo "[$product]"
cd "$repo_directory/$product/"
bundle install && bundle exec rspec --format documentation
# Check status of bundle exec rspec
if [ $? != 0 ]; then
status_return=1
fi
done
exit $status_return
## Instruction:
Remove TRANSLATE_API_KEY ~ no longer needed
## Code After:
for required_variable in \
GOOGLE_CLOUD_PROJECT \
GOOGLE_APPLICATION_CREDENTIALS \
GOOGLE_CLOUD_STORAGE_BUCKET \
ALTERNATE_GOOGLE_CLOUD_STORAGE_BUCKET \
; do
if [[ -z "${!required_variable}" ]]; then
echo "Must set $required_variable"
exit 1
fi
done
script_directory="$(dirname "`realpath $0`")"
repo_directory="$(dirname $script_directory)"
status_return=0 # everything passed
# Print out Ruby version
ruby --version
# Run Tets
for product in \
bigquery \
datastore \
endpoints/getting-started \
language \
logging \
pubsub \
speech \
storage \
translate \
vision \
; do
echo "[$product]"
cd "$repo_directory/$product/"
bundle install && bundle exec rspec --format documentation
# Check status of bundle exec rspec
if [ $? != 0 ]; then
status_return=1
fi
done
exit $status_return
## Diff:
for required_variable in \
GOOGLE_CLOUD_PROJECT \
GOOGLE_APPLICATION_CREDENTIALS \
GOOGLE_CLOUD_STORAGE_BUCKET \
ALTERNATE_GOOGLE_CLOUD_STORAGE_BUCKET \
- TRANSLATE_API_KEY \
; do
if [[ -z "${!required_variable}" ]]; then
echo "Must set $required_variable"
exit 1
fi
done
script_directory="$(dirname "`realpath $0`")"
repo_directory="$(dirname $script_directory)"
status_return=0 # everything passed
# Print out Ruby version
ruby --version
# Run Tets
for product in \
bigquery \
datastore \
endpoints/getting-started \
language \
logging \
pubsub \
speech \
storage \
translate \
vision \
; do
echo "[$product]"
cd "$repo_directory/$product/"
bundle install && bundle exec rspec --format documentation
# Check status of bundle exec rspec
if [ $? != 0 ]; then
status_return=1
fi
done
exit $status_return
diff_length: 1 | relative_diff_length: 0.022222 | n_lines_added: 0 | n_lines_deleted: 1
commit: df156adb1fd9f27fbe243a5a143cdead981adb8a
old_file: src/test/_config.ts
new_file: src/test/_config.ts
## Old Contents:
import { Config, LoadBalancingStrategy } from "../connection";
const ARANGO_URL = process.env.TEST_ARANGODB_URL || "http://localhost:8529";
const ARANGO_VERSION = Number(
process.env.ARANGO_VERSION || process.env.ARANGOJS_DEVEL_VERSION || 39999
);
const ARANGO_LOAD_BALANCING_STRATEGY = process.env
.TEST_ARANGO_LOAD_BALANCING_STRATEGY as LoadBalancingStrategy | undefined;
export const config: Config & {
arangoVersion: NonNullable<Config["arangoVersion"]>;
} = ARANGO_URL.includes(",")
? {
url: ARANGO_URL.split(",").filter((s) => Boolean(s)),
arangoVersion: ARANGO_VERSION,
precaptureStackTraces: true,
loadBalancingStrategy: ARANGO_LOAD_BALANCING_STRATEGY || "ROUND_ROBIN",
}
: {
url: ARANGO_URL,
arangoVersion: ARANGO_VERSION,
precaptureStackTraces: true,
};
| import { Config, LoadBalancingStrategy } from "../connection";
const ARANGO_URL = process.env.TEST_ARANGODB_URL || "http://localhost:8529";
const ARANGO_VERSION = Number(
process.env.ARANGO_VERSION || process.env.ARANGOJS_DEVEL_VERSION || 39999
);
const ARANGO_LOAD_BALANCING_STRATEGY = process.env
.TEST_ARANGO_LOAD_BALANCING_STRATEGY as LoadBalancingStrategy | undefined;
export const config: Config & {
arangoVersion: NonNullable<Config["arangoVersion"]>;
} = ARANGO_URL.includes(",")
? {
url: ARANGO_URL.split(",").filter((s) => Boolean(s)),
arangoVersion: ARANGO_VERSION,
loadBalancingStrategy: ARANGO_LOAD_BALANCING_STRATEGY || "ROUND_ROBIN",
}
: {
url: ARANGO_URL,
arangoVersion: ARANGO_VERSION,
};
| Disable precaptureStackTraces in tests for now | Disable precaptureStackTraces in tests for now
Thinking this might cause the timeouts we're seeing in some cursor tests. | TypeScript | apache-2.0 | arangodb/arangojs,arangodb/arangojs | typescript | ## Code Before:
import { Config, LoadBalancingStrategy } from "../connection";
const ARANGO_URL = process.env.TEST_ARANGODB_URL || "http://localhost:8529";
const ARANGO_VERSION = Number(
process.env.ARANGO_VERSION || process.env.ARANGOJS_DEVEL_VERSION || 39999
);
const ARANGO_LOAD_BALANCING_STRATEGY = process.env
.TEST_ARANGO_LOAD_BALANCING_STRATEGY as LoadBalancingStrategy | undefined;
export const config: Config & {
arangoVersion: NonNullable<Config["arangoVersion"]>;
} = ARANGO_URL.includes(",")
? {
url: ARANGO_URL.split(",").filter((s) => Boolean(s)),
arangoVersion: ARANGO_VERSION,
precaptureStackTraces: true,
loadBalancingStrategy: ARANGO_LOAD_BALANCING_STRATEGY || "ROUND_ROBIN",
}
: {
url: ARANGO_URL,
arangoVersion: ARANGO_VERSION,
precaptureStackTraces: true,
};
## Instruction:
Disable precaptureStackTraces in tests for now
Thinking this might cause the timeouts we're seeing in some cursor tests.
## Code After:
import { Config, LoadBalancingStrategy } from "../connection";
const ARANGO_URL = process.env.TEST_ARANGODB_URL || "http://localhost:8529";
const ARANGO_VERSION = Number(
process.env.ARANGO_VERSION || process.env.ARANGOJS_DEVEL_VERSION || 39999
);
const ARANGO_LOAD_BALANCING_STRATEGY = process.env
.TEST_ARANGO_LOAD_BALANCING_STRATEGY as LoadBalancingStrategy | undefined;
export const config: Config & {
arangoVersion: NonNullable<Config["arangoVersion"]>;
} = ARANGO_URL.includes(",")
? {
url: ARANGO_URL.split(",").filter((s) => Boolean(s)),
arangoVersion: ARANGO_VERSION,
loadBalancingStrategy: ARANGO_LOAD_BALANCING_STRATEGY || "ROUND_ROBIN",
}
: {
url: ARANGO_URL,
arangoVersion: ARANGO_VERSION,
};
| import { Config, LoadBalancingStrategy } from "../connection";
const ARANGO_URL = process.env.TEST_ARANGODB_URL || "http://localhost:8529";
const ARANGO_VERSION = Number(
process.env.ARANGO_VERSION || process.env.ARANGOJS_DEVEL_VERSION || 39999
);
const ARANGO_LOAD_BALANCING_STRATEGY = process.env
.TEST_ARANGO_LOAD_BALANCING_STRATEGY as LoadBalancingStrategy | undefined;
export const config: Config & {
arangoVersion: NonNullable<Config["arangoVersion"]>;
} = ARANGO_URL.includes(",")
? {
url: ARANGO_URL.split(",").filter((s) => Boolean(s)),
arangoVersion: ARANGO_VERSION,
- precaptureStackTraces: true,
loadBalancingStrategy: ARANGO_LOAD_BALANCING_STRATEGY || "ROUND_ROBIN",
}
: {
url: ARANGO_URL,
arangoVersion: ARANGO_VERSION,
- precaptureStackTraces: true,
};
diff_length: 2 | relative_diff_length: 0.086957 | n_lines_added: 0 | n_lines_deleted: 2
commit: bfb4c71aa3cfa92a8dfcf300b547b66a7f22c063
old_file: src/CoreBrowser/wwwroot/js/site.js
new_file: src/CoreBrowser/wwwroot/js/site.js
## Old Contents:
$(document).ready(function () {
$("table.sortable").stupidtable();
$(document).on("click", "table tr", function (evt) {
var url = $(this).attr("data-url"),
type = $(this).attr("data-type"),
links = $(this).find("a");
if (typeof url === "undefined" || typeof type === "undefined")
return;
evt.preventDefault();
if (type === "folder" || type === "parent-folder") {
window.location.href = url;
}
else if (type === "image") {
links.first().ekkoLightbox({
always_show_close: false,
loadingMessage: '<div class="text-center"><i class="fa fa-circle-o-notch fa-spin fa-3x fa-fw margin-bottom"></i><span class="sr-only">Loading...</span></div>'
});
} else {
ga('send', 'event', 'Download', type, url);
window.location.href = url;
}
});
});
## New Contents:
$(document).ready(function () {
$("table.sortable").stupidtable();
$(document).on("click", "table tr", function (evt) {
var url = $(this).attr("data-url"),
type = $(this).attr("data-type"),
links = $(this).find("a");
if (typeof url === "undefined" || typeof type === "undefined")
return;
evt.preventDefault();
if (type === "folder" || type === "parent-folder") {
window.location.href = url;
}
else if (type === "image") {
links.first().ekkoLightbox({
always_show_close: false,
loadingMessage: '<div class="text-center"><i class="fa fa-circle-o-notch fa-spin fa-3x fa-fw margin-bottom"></i><span class="sr-only">Loading...</span></div>'
});
} else if (typeof(ga) != "undefined"){
ga('send', 'event', 'Download', type, url);
window.location.href = url;
}
});
});
subject: Check that GA is defined
lang: JavaScript | license: mit | config: javascript
repos: ollejacobsen/corebrowser,ollejacobsen/corebrowser
## Code Before:
$(document).ready(function () {
$("table.sortable").stupidtable();
$(document).on("click", "table tr", function (evt) {
var url = $(this).attr("data-url"),
type = $(this).attr("data-type"),
links = $(this).find("a");
if (typeof url === "undefined" || typeof type === "undefined")
return;
evt.preventDefault();
if (type === "folder" || type === "parent-folder") {
window.location.href = url;
}
else if (type === "image") {
links.first().ekkoLightbox({
always_show_close: false,
loadingMessage: '<div class="text-center"><i class="fa fa-circle-o-notch fa-spin fa-3x fa-fw margin-bottom"></i><span class="sr-only">Loading...</span></div>'
});
} else {
ga('send', 'event', 'Download', type, url);
window.location.href = url;
}
});
});
## Instruction:
Check that GA is defined
## Code After:
$(document).ready(function () {
$("table.sortable").stupidtable();
$(document).on("click", "table tr", function (evt) {
var url = $(this).attr("data-url"),
type = $(this).attr("data-type"),
links = $(this).find("a");
if (typeof url === "undefined" || typeof type === "undefined")
return;
evt.preventDefault();
if (type === "folder" || type === "parent-folder") {
window.location.href = url;
}
else if (type === "image") {
links.first().ekkoLightbox({
always_show_close: false,
loadingMessage: '<div class="text-center"><i class="fa fa-circle-o-notch fa-spin fa-3x fa-fw margin-bottom"></i><span class="sr-only">Loading...</span></div>'
});
} else if (typeof(ga) != "undefined"){
ga('send', 'event', 'Download', type, url);
window.location.href = url;
}
});
});
## Diff:
$(document).ready(function () {
$("table.sortable").stupidtable();
$(document).on("click", "table tr", function (evt) {
var url = $(this).attr("data-url"),
type = $(this).attr("data-type"),
links = $(this).find("a");
if (typeof url === "undefined" || typeof type === "undefined")
return;
evt.preventDefault();
if (type === "folder" || type === "parent-folder") {
window.location.href = url;
}
else if (type === "image") {
links.first().ekkoLightbox({
always_show_close: false,
loadingMessage: '<div class="text-center"><i class="fa fa-circle-o-notch fa-spin fa-3x fa-fw margin-bottom"></i><span class="sr-only">Loading...</span></div>'
});
- } else {
+ } else if (typeof(ga) != "undefined"){
ga('send', 'event', 'Download', type, url);
window.location.href = url;
}
});
});
diff_length: 2 | relative_diff_length: 0.074074 | n_lines_added: 1 | n_lines_deleted: 1
commit: 5b8ed6ee7453927472ad503cde497a8a17b10020
old_file: README.md
new_file: README.md
## Old Contents:
`panflute` is a Python package that makes creating Pandoc filters fun.
For a detailed user guide, documentation, and installation instructions, see
<http://scorreia.com/software/panflute/> (or the [PDF version](http://scorreia.com/software/panflute/Panflute.pdf))
### Install
To install panflute, open the command line and type::
```
pip install git+git://github.com/sergiocorreia/panflute.git
```
- Requires Python 3.2 or later.
- On windows, the command line (``cmd``) must be run as administrator.
# Dev Install
After cloning the repo and opening the panflute folder:
`python setup.py install`
: installs the package locally
`python setup.py develop`
: installs locally with a symlink so changes are automatically updated
### Contributing
Feel free to submit push requests. For consistency, code should comply with [pep8](https://pypi.python.org/pypi/pep8) (as long as its reasonable), and with the style guides by [@kennethreitz](http://docs.python-guide.org/en/latest/writing/style/) and [google](http://google.github.io/styleguide/pyguide.html).
# License
BSD3 license (following `pandocfilter` by @jgm)
## New Contents:
`panflute` is a Python package that makes creating Pandoc filters fun.
For a detailed user guide, documentation, and installation instructions, see
<http://scorreia.com/software/panflute/> (or the [PDF version](http://scorreia.com/software/panflute/Panflute.pdf))
## Install
To install panflute, open the command line and type::
### Python 3
```
pip install git+git://github.com/sergiocorreia/panflute.git
```
- Requires Python 3.2 or later.
- On windows, the command line (``cmd``) must be run as administrator.
### Python 2
```
pip install git+git://github.com/sergiocorreia/panflute.git@python2
```
## To Uninstall
```
pip uninstall panflute
```
## Dev Install
After cloning the repo and opening the panflute folder:
`python setup.py install`
: installs the package locally
`python setup.py develop`
: installs locally with a symlink so changes are automatically updated
## Contributing
Feel free to submit push requests. For consistency, code should comply with [pep8](https://pypi.python.org/pypi/pep8) (as long as its reasonable), and with the style guides by [@kennethreitz](http://docs.python-guide.org/en/latest/writing/style/) and [google](http://google.github.io/styleguide/pyguide.html).
## License
BSD3 license (following `pandocfilter` by @jgm)
subject: Update readme with install instructions for Python2
Also reorganize the heading markers
lang: Markdown | license: bsd-3-clause | config: markdown
repos: sergiocorreia/panflute
## Code Before:
`panflute` is a Python package that makes creating Pandoc filters fun.
For a detailed user guide, documentation, and installation instructions, see
<http://scorreia.com/software/panflute/> (or the [PDF version](http://scorreia.com/software/panflute/Panflute.pdf))
### Install
To install panflute, open the command line and type::
```
pip install git+git://github.com/sergiocorreia/panflute.git
```
- Requires Python 3.2 or later.
- On windows, the command line (``cmd``) must be run as administrator.
# Dev Install
After cloning the repo and opening the panflute folder:
`python setup.py install`
: installs the package locally
`python setup.py develop`
: installs locally with a symlink so changes are automatically updated
### Contributing
Feel free to submit push requests. For consistency, code should comply with [pep8](https://pypi.python.org/pypi/pep8) (as long as its reasonable), and with the style guides by [@kennethreitz](http://docs.python-guide.org/en/latest/writing/style/) and [google](http://google.github.io/styleguide/pyguide.html).
# License
BSD3 license (following `pandocfilter` by @jgm)
## Instruction:
Update readme with install instructions for Python2
Also reorganize the heading markers
## Code After:
`panflute` is a Python package that makes creating Pandoc filters fun.
For a detailed user guide, documentation, and installation instructions, see
<http://scorreia.com/software/panflute/> (or the [PDF version](http://scorreia.com/software/panflute/Panflute.pdf))
## Install
To install panflute, open the command line and type::
### Python 3
```
pip install git+git://github.com/sergiocorreia/panflute.git
```
- Requires Python 3.2 or later.
- On windows, the command line (``cmd``) must be run as administrator.
### Python 2
```
pip install git+git://github.com/sergiocorreia/panflute.git@python2
```
## To Uninstall
```
pip uninstall panflute
```
## Dev Install
After cloning the repo and opening the panflute folder:
`python setup.py install`
: installs the package locally
`python setup.py develop`
: installs locally with a symlink so changes are automatically updated
## Contributing
Feel free to submit push requests. For consistency, code should comply with [pep8](https://pypi.python.org/pypi/pep8) (as long as its reasonable), and with the style guides by [@kennethreitz](http://docs.python-guide.org/en/latest/writing/style/) and [google](http://google.github.io/styleguide/pyguide.html).
## License
BSD3 license (following `pandocfilter` by @jgm)
## Diff:
`panflute` is a Python package that makes creating Pandoc filters fun.
For a detailed user guide, documentation, and installation instructions, see
<http://scorreia.com/software/panflute/> (or the [PDF version](http://scorreia.com/software/panflute/Panflute.pdf))
- ### Install
? -
+ ## Install
To install panflute, open the command line and type::
+
+ ### Python 3
```
pip install git+git://github.com/sergiocorreia/panflute.git
```
- Requires Python 3.2 or later.
- On windows, the command line (``cmd``) must be run as administrator.
+ ### Python 2
+
+ ```
+ pip install git+git://github.com/sergiocorreia/panflute.git@python2
+ ```
+
+ ## To Uninstall
+
+ ```
+ pip uninstall panflute
+ ```
+
- # Dev Install
+ ## Dev Install
? +
After cloning the repo and opening the panflute folder:
`python setup.py install`
: installs the package locally
`python setup.py develop`
: installs locally with a symlink so changes are automatically updated
- ### Contributing
? -
+ ## Contributing
Feel free to submit push requests. For consistency, code should comply with [pep8](https://pypi.python.org/pypi/pep8) (as long as its reasonable), and with the style guides by [@kennethreitz](http://docs.python-guide.org/en/latest/writing/style/) and [google](http://google.github.io/styleguide/pyguide.html).
- # License
+ ## License
? +
BSD3 license (following `pandocfilter` by @jgm)
| 22 | 0.611111 | 18 | 4 |
f3ff5d1c6184f88c719f2d165224080e68588554 | test/unit/indexer/document_preparer_test.rb | test/unit/indexer/document_preparer_test.rb | require "test_helper"
require "indexer"
describe Indexer::DocumentPreparer do
describe "#prepared" do
let(:doc_hash) { {"link" => "some-slug" } }
before do
Indexer::TagLookup.stubs(:prepare_tags).returns(doc_hash)
end
describe "alpha taxonomies" do
it "adds an alpha taxonomy to the doc if a match is found" do
::TaxonomyPrototype::TaxonFinder.stubs(:find_by).returns(["taxon-1", "taxon-2"])
updated_doc_hash = Indexer::DocumentPreparer.new("fake_client").prepared(doc_hash, nil, true)
assert_equal doc_hash.merge("alpha_taxonomy" => ["taxon-1", "taxon-2"]), updated_doc_hash
end
it "does nothing to the doc if no match is found" do
::TaxonomyPrototype::TaxonFinder.stubs(:find_by).returns(nil)
updated_doc_hash = Indexer::DocumentPreparer.new("fake_client").prepared(doc_hash, nil, true)
assert_equal doc_hash, updated_doc_hash
end
end
end
end
## New Contents:
require "test_helper"
require "indexer"
describe Indexer::DocumentPreparer do
describe "#prepared" do
describe "alpha taxonomies" do
before do
Indexer::TagLookup.stubs(:prepare_tags).returns({"link" => "some-slug" })
end
it "adds an alpha taxonomy to the doc if a match is found" do
::TaxonomyPrototype::TaxonFinder.stubs(:find_by).returns(["taxon-1", "taxon-2"])
updated_doc_hash = Indexer::DocumentPreparer.new("fake_client").prepared({}, nil, true)
assert_equal ["taxon-1", "taxon-2"], updated_doc_hash['alpha_taxonomy']
end
it "does nothing to the doc if no match is found" do
::TaxonomyPrototype::TaxonFinder.stubs(:find_by).returns(nil)
updated_doc_hash = Indexer::DocumentPreparer.new("fake_client").prepared({}, nil, true)
assert_nil updated_doc_hash['alpha_taxonomy']
end
end
end
end
subject: Simplify tests for document preparer
This will allow us to add tests more easily.
lang: Ruby | license: mit | config: ruby
repos: alphagov/rummager,alphagov/rummager
## Code Before:
require "test_helper"
require "indexer"
describe Indexer::DocumentPreparer do
describe "#prepared" do
let(:doc_hash) { {"link" => "some-slug" } }
before do
Indexer::TagLookup.stubs(:prepare_tags).returns(doc_hash)
end
describe "alpha taxonomies" do
it "adds an alpha taxonomy to the doc if a match is found" do
::TaxonomyPrototype::TaxonFinder.stubs(:find_by).returns(["taxon-1", "taxon-2"])
updated_doc_hash = Indexer::DocumentPreparer.new("fake_client").prepared(doc_hash, nil, true)
assert_equal doc_hash.merge("alpha_taxonomy" => ["taxon-1", "taxon-2"]), updated_doc_hash
end
it "does nothing to the doc if no match is found" do
::TaxonomyPrototype::TaxonFinder.stubs(:find_by).returns(nil)
updated_doc_hash = Indexer::DocumentPreparer.new("fake_client").prepared(doc_hash, nil, true)
assert_equal doc_hash, updated_doc_hash
end
end
end
end
## Instruction:
Simplify tests for document preparer
This will allow us to add tests more easily.
## Code After:
require "test_helper"
require "indexer"
describe Indexer::DocumentPreparer do
describe "#prepared" do
describe "alpha taxonomies" do
before do
Indexer::TagLookup.stubs(:prepare_tags).returns({"link" => "some-slug" })
end
it "adds an alpha taxonomy to the doc if a match is found" do
::TaxonomyPrototype::TaxonFinder.stubs(:find_by).returns(["taxon-1", "taxon-2"])
updated_doc_hash = Indexer::DocumentPreparer.new("fake_client").prepared({}, nil, true)
assert_equal ["taxon-1", "taxon-2"], updated_doc_hash['alpha_taxonomy']
end
it "does nothing to the doc if no match is found" do
::TaxonomyPrototype::TaxonFinder.stubs(:find_by).returns(nil)
updated_doc_hash = Indexer::DocumentPreparer.new("fake_client").prepared({}, nil, true)
assert_nil updated_doc_hash['alpha_taxonomy']
end
end
end
end
## Diff:
require "test_helper"
require "indexer"
describe Indexer::DocumentPreparer do
describe "#prepared" do
- let(:doc_hash) { {"link" => "some-slug" } }
+ describe "alpha taxonomies" do
+ before do
+ Indexer::TagLookup.stubs(:prepare_tags).returns({"link" => "some-slug" })
+ end
- before do
- Indexer::TagLookup.stubs(:prepare_tags).returns(doc_hash)
- end
-
- describe "alpha taxonomies" do
it "adds an alpha taxonomy to the doc if a match is found" do
::TaxonomyPrototype::TaxonFinder.stubs(:find_by).returns(["taxon-1", "taxon-2"])
- updated_doc_hash = Indexer::DocumentPreparer.new("fake_client").prepared(doc_hash, nil, true)
? ^^^^^^^^
+ updated_doc_hash = Indexer::DocumentPreparer.new("fake_client").prepared({}, nil, true)
? ^^
- assert_equal doc_hash.merge("alpha_taxonomy" => ["taxon-1", "taxon-2"]), updated_doc_hash
+
+ assert_equal ["taxon-1", "taxon-2"], updated_doc_hash['alpha_taxonomy']
end
it "does nothing to the doc if no match is found" do
::TaxonomyPrototype::TaxonFinder.stubs(:find_by).returns(nil)
- updated_doc_hash = Indexer::DocumentPreparer.new("fake_client").prepared(doc_hash, nil, true)
? ^^^^^^^^
+ updated_doc_hash = Indexer::DocumentPreparer.new("fake_client").prepared({}, nil, true)
? ^^
- assert_equal doc_hash, updated_doc_hash
+
+ assert_nil updated_doc_hash['alpha_taxonomy']
end
end
end
end
diff_length: 20 | relative_diff_length: 0.714286 | n_lines_added: 10 | n_lines_deleted: 10
9ea2700a94c12f041f5f4111adba0d62c209cd68 | views/_utils.erb | views/_utils.erb | <div class="utils">
<% unless @meta.layers.first.nil? %>
<% @layers = '' %>
<% @meta.layers.each do |layer| %>
<% @layers = @layers + layer + ',' %>
<% end %>
<span class="kml-link">
<a href="<%= kml_url(@layers.chop) %>">Google Earth</a>
<img alt="Google Earth Link" src="/images/ge.gif" />
</span>
<% end %>
<% unless @meta.catalog_link.nil? %>
<span class="catalog-link">
<a href="<%= cat_url(@item_id) %>">Virgo</a>
<img alt="catalog link" src="/images/browser.png" />
</span>
<% end %>
</div>
| <div class="utils">
<% unless @meta.layers.first.nil? %>
<% @layers = '' %>
<% @meta.layers.each do |layer| %>
<% @layers = @layers + layer + ',' %>
<% end %>
<span class="kml-link">
<a href="<%= kml_url(@layers.chop) %>">Google Earth
<img alt="Google Earth Link" src="/images/ge.gif" />
</a>
</span>
<% end %>
<% unless @meta.catalog_link.nil? %>
<span class="catalog-link">
<a href="<%= cat_url(@item_id) %>">Virgo</a>
<img alt="catalog link" src="/images/browser.png" />
</span>
<% end %>
</div>
| Make the icon for Google Earth linked | Make the icon for Google Earth linked
| HTML+ERB | apache-2.0 | scholarslab/geoportal,scholarslab/geoportal,scholarslab/geoportal | html+erb | ## Code Before:
<div class="utils">
<% unless @meta.layers.first.nil? %>
<% @layers = '' %>
<% @meta.layers.each do |layer| %>
<% @layers = @layers + layer + ',' %>
<% end %>
<span class="kml-link">
<a href="<%= kml_url(@layers.chop) %>">Google Earth</a>
<img alt="Google Earth Link" src="/images/ge.gif" />
</span>
<% end %>
<% unless @meta.catalog_link.nil? %>
<span class="catalog-link">
<a href="<%= cat_url(@item_id) %>">Virgo</a>
<img alt="catalog link" src="/images/browser.png" />
</span>
<% end %>
</div>
## Instruction:
Make the icon for Google Earth linked
## Code After:
<div class="utils">
<% unless @meta.layers.first.nil? %>
<% @layers = '' %>
<% @meta.layers.each do |layer| %>
<% @layers = @layers + layer + ',' %>
<% end %>
<span class="kml-link">
<a href="<%= kml_url(@layers.chop) %>">Google Earth
<img alt="Google Earth Link" src="/images/ge.gif" />
</a>
</span>
<% end %>
<% unless @meta.catalog_link.nil? %>
<span class="catalog-link">
<a href="<%= cat_url(@item_id) %>">Virgo</a>
<img alt="catalog link" src="/images/browser.png" />
</span>
<% end %>
</div>
| <div class="utils">
<% unless @meta.layers.first.nil? %>
<% @layers = '' %>
- <% @meta.layers.each do |layer| %>
? --
+ <% @meta.layers.each do |layer| %>
- <% @layers = @layers + layer + ',' %>
? --
+ <% @layers = @layers + layer + ',' %>
- <% end %>
? ----
+ <% end %>
- <span class="kml-link">
? --
+ <span class="kml-link">
- <a href="<%= kml_url(@layers.chop) %>">Google Earth</a>
? -- ----
+ <a href="<%= kml_url(@layers.chop) %>">Google Earth
<img alt="Google Earth Link" src="/images/ge.gif" />
+ </a>
- </span>
? ---
+ </span>
- <% end %>
? ---
+ <% end %>
- <% unless @meta.catalog_link.nil? %>
? -
+ <% unless @meta.catalog_link.nil? %>
<span class="catalog-link">
- <a href="<%= cat_url(@item_id) %>">Virgo</a>
? --
+ <a href="<%= cat_url(@item_id) %>">Virgo</a>
- <img alt="catalog link" src="/images/browser.png" />
? --
+ <img alt="catalog link" src="/images/browser.png" />
- </span>
? --
+ </span>
- <% end %>
? ---
+ <% end %>
</div> | 25 | 1.315789 | 13 | 12 |
b1fde66e70ba88e3f8ed1b2c76934d05c713f4de | .travis.yml | .travis.yml | language: generic
dist: trusty
group: edge
sudo: true
env:
global:
- PATH="/opt/chefdk/bin:/opt/chefdk/embedded/bin:/opt/chef/bin:$PATH"
addons:
apt:
sources:
- chef-stable-trusty
packages:
- chefdk
install:
- unset GEM_PATH
- gem install git
- ./bin/packer-build-install
script:
- make hackcheck
- make
- git diff --exit-code
- git diff --cached --exit-code
- rspec
- ./runtests --env .example.env
- if [[ $TRAVIS_PULL_REQUEST = false ]] ; then
./bin/trigger-downstreams ;
fi
| language: generic
dist: trusty
group: edge
sudo: true
env:
global:
- PATH="/opt/chefdk/bin:/opt/chefdk/embedded/bin:/opt/chef/bin:$PATH"
addons:
apt:
sources:
- chef-stable-trusty
packages:
- chefdk
install:
- unset GEM_PATH
- gem install git
- ./bin/packer-build-install
script:
- make hackcheck
- make
- git diff --exit-code
- git diff --cached --exit-code
- rspec
- ./runtests --env .example.env
- if [[ $TRAVIS_PULL_REQUEST = false ]] ; then
./bin/trigger-downstreams ;
else
echo '✮✮✮ Skip trigger-downstreams for pull request builds ✮✮✮' ;
fi
| Print message when skipping ./trigger-downstreams | Print message when skipping ./trigger-downstreams
Signed-off-by: Carmen Andoh <f1196a8a993e28d05bd187b7b130720e5dd34147@travis-ci.org>
| YAML | mit | travis-ci/packer-templates,travis-ci/packer-templates,travis-ci/packer-templates | yaml | ## Code Before:
language: generic
dist: trusty
group: edge
sudo: true
env:
global:
- PATH="/opt/chefdk/bin:/opt/chefdk/embedded/bin:/opt/chef/bin:$PATH"
addons:
apt:
sources:
- chef-stable-trusty
packages:
- chefdk
install:
- unset GEM_PATH
- gem install git
- ./bin/packer-build-install
script:
- make hackcheck
- make
- git diff --exit-code
- git diff --cached --exit-code
- rspec
- ./runtests --env .example.env
- if [[ $TRAVIS_PULL_REQUEST = false ]] ; then
./bin/trigger-downstreams ;
fi
## Instruction:
Print message when skipping ./trigger-downstreams
Signed-off-by: Carmen Andoh <f1196a8a993e28d05bd187b7b130720e5dd34147@travis-ci.org>
## Code After:
language: generic
dist: trusty
group: edge
sudo: true
env:
global:
- PATH="/opt/chefdk/bin:/opt/chefdk/embedded/bin:/opt/chef/bin:$PATH"
addons:
apt:
sources:
- chef-stable-trusty
packages:
- chefdk
install:
- unset GEM_PATH
- gem install git
- ./bin/packer-build-install
script:
- make hackcheck
- make
- git diff --exit-code
- git diff --cached --exit-code
- rspec
- ./runtests --env .example.env
- if [[ $TRAVIS_PULL_REQUEST = false ]] ; then
./bin/trigger-downstreams ;
else
echo '✮✮✮ Skip trigger-downstreams for pull request builds ✮✮✮' ;
fi
| language: generic
dist: trusty
group: edge
sudo: true
env:
global:
- PATH="/opt/chefdk/bin:/opt/chefdk/embedded/bin:/opt/chef/bin:$PATH"
addons:
apt:
sources:
- chef-stable-trusty
packages:
- chefdk
install:
- unset GEM_PATH
- gem install git
- ./bin/packer-build-install
script:
- make hackcheck
- make
- git diff --exit-code
- git diff --cached --exit-code
- rspec
- ./runtests --env .example.env
- if [[ $TRAVIS_PULL_REQUEST = false ]] ; then
./bin/trigger-downstreams ;
+ else
+ echo '✮✮✮ Skip trigger-downstreams for pull request builds ✮✮✮' ;
fi | 2 | 0.074074 | 2 | 0 |
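Each record ends with four numeric columns — per the schema header, `diff_length`, `relative_diff_length`, `n_lines_added`, and `n_lines_deleted`. In every row visible here, `diff_length` equals `n_lines_added + n_lines_deleted` (e.g. `2 | … | 2 | 0` for this `.travis.yml` record). A sketch of how those counts could be recomputed from an ndiff-style diff — this is an inference about the dataset builder, not its actual code:

```python
import difflib

def diff_stats(old_lines, new_lines):
    """Count changed lines the way the trailing columns appear to be computed."""
    changed = [l for l in difflib.ndiff(old_lines, new_lines)
               if l.startswith(("+ ", "- "))]   # skip "  " context and "? " guides
    added = sum(l.startswith("+ ") for l in changed)
    deleted = len(changed) - added
    return added + deleted, added, deleted      # (diff_length, n_added, n_deleted)

# The .travis.yml change above: two lines inserted before "fi", none deleted.
old = ["  ./bin/trigger-downstreams ;", "  fi"]
new = ["  ./bin/trigger-downstreams ;",
       "  else",
       "    echo 'skipped' ;",
       "  fi"]
print(diff_stats(old, new))  # (2, 2, 0) — matching this record's columns
```

For this record, `relative_diff_length` = 0.074074 = 2/27, i.e. `diff_length` divided by the old file's 27 lines; that ratio holds for this row but the general rule is an observation, not documented.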
d64a171dfde57106a5abd7d46990c81c6250b965 | whitespaceterminator.py | whitespaceterminator.py |
from gi.repository import GObject, Gedit
class WhiteSpaceTerminator(GObject.Object, Gedit.WindowActivatable):
"""Strip trailing whitespace before saving."""
window = GObject.property(type=Gedit.Window)
def do_activate(self):
self.window.connect("tab-added", self.on_tab_added)
def on_tab_added(self, window, tab, data=None):
tab.get_document().connect("save", self.on_document_save)
def on_document_save(self, document, location, encoding, compression,
flags, data=None):
for i, text in enumerate(document.props.text.rstrip().split("\n")):
strip_stop = document.get_iter_at_line(i)
strip_stop.forward_to_line_end()
strip_start = strip_stop.copy()
strip_start.backward_chars(len(text) - len(text.rstrip()))
document.delete(strip_start, strip_stop)
document.delete(strip_start, document.get_end_iter())
|
from gi.repository import GObject, Gedit
class WhiteSpaceTerminator(GObject.Object, Gedit.WindowActivatable):
"""Strip trailing whitespace before saving."""
window = GObject.property(type=Gedit.Window)
def do_activate(self):
self.window.connect("tab-added", self.on_tab_added)
for document in self.window.get_documents():
document.connect("save", self.on_document_save)
def on_tab_added(self, window, tab, data=None):
tab.get_document().connect("save", self.on_document_save)
def on_document_save(self, document, location, encoding, compression,
flags, data=None):
for i, text in enumerate(document.props.text.rstrip().split("\n")):
strip_stop = document.get_iter_at_line(i)
strip_stop.forward_to_line_end()
strip_start = strip_stop.copy()
strip_start.backward_chars(len(text) - len(text.rstrip()))
document.delete(strip_start, strip_stop)
document.delete(strip_start, document.get_end_iter())
| Connect on existing tabs when activating the plugin. | Connect on existing tabs when activating the plugin.
| Python | bsd-3-clause | Kozea/Gedit-WhiteSpace-Terminator | python | ## Code Before:
from gi.repository import GObject, Gedit
class WhiteSpaceTerminator(GObject.Object, Gedit.WindowActivatable):
"""Strip trailing whitespace before saving."""
window = GObject.property(type=Gedit.Window)
def do_activate(self):
self.window.connect("tab-added", self.on_tab_added)
def on_tab_added(self, window, tab, data=None):
tab.get_document().connect("save", self.on_document_save)
def on_document_save(self, document, location, encoding, compression,
flags, data=None):
for i, text in enumerate(document.props.text.rstrip().split("\n")):
strip_stop = document.get_iter_at_line(i)
strip_stop.forward_to_line_end()
strip_start = strip_stop.copy()
strip_start.backward_chars(len(text) - len(text.rstrip()))
document.delete(strip_start, strip_stop)
document.delete(strip_start, document.get_end_iter())
## Instruction:
Connect on existing tabs when activating the plugin.
## Code After:
from gi.repository import GObject, Gedit
class WhiteSpaceTerminator(GObject.Object, Gedit.WindowActivatable):
"""Strip trailing whitespace before saving."""
window = GObject.property(type=Gedit.Window)
def do_activate(self):
self.window.connect("tab-added", self.on_tab_added)
for document in self.window.get_documents():
document.connect("save", self.on_document_save)
def on_tab_added(self, window, tab, data=None):
tab.get_document().connect("save", self.on_document_save)
def on_document_save(self, document, location, encoding, compression,
flags, data=None):
for i, text in enumerate(document.props.text.rstrip().split("\n")):
strip_stop = document.get_iter_at_line(i)
strip_stop.forward_to_line_end()
strip_start = strip_stop.copy()
strip_start.backward_chars(len(text) - len(text.rstrip()))
document.delete(strip_start, strip_stop)
document.delete(strip_start, document.get_end_iter())
|
from gi.repository import GObject, Gedit
class WhiteSpaceTerminator(GObject.Object, Gedit.WindowActivatable):
"""Strip trailing whitespace before saving."""
window = GObject.property(type=Gedit.Window)
def do_activate(self):
self.window.connect("tab-added", self.on_tab_added)
+ for document in self.window.get_documents():
+ document.connect("save", self.on_document_save)
def on_tab_added(self, window, tab, data=None):
tab.get_document().connect("save", self.on_document_save)
def on_document_save(self, document, location, encoding, compression,
flags, data=None):
for i, text in enumerate(document.props.text.rstrip().split("\n")):
strip_stop = document.get_iter_at_line(i)
strip_stop.forward_to_line_end()
strip_start = strip_stop.copy()
strip_start.backward_chars(len(text) - len(text.rstrip()))
document.delete(strip_start, strip_stop)
document.delete(strip_start, document.get_end_iter()) | 2 | 0.086957 | 2 | 0 |
d2ad0ea3806babe5967e82b009dcdd3537419e18 | README.md | README.md |
> Generate PHP validations from short and concise descriptions of the input format.
**NOTE:** This project is not related to the Laravel official Validator class. It's
a stricter and statically generated alternative to it.
## Installation
Install via [npm](https://npmjs.org/package/gulp-laravel-validator):
```
npm install gulp-laravel-validator --save-dev
```
## Example
```js
var gulp = require('gulp'),
laravelValidator = require('gulp-laravel-validator'),
rename = require('gulp-rename');
gulp.task('default', function() {
return gulp.src('app/validators/**/*.val')
.pipe(laravelValidator())
.pipe(rename({extname: '.php'}))
.pipe(gulp.dest('app/lib/Validators'));
});
```
For this example to work you'll need to install [gulp-rename](https://github.com/hparra/gulp-rename).
|
> Generate PHP validations from short and concise descriptions of the input format.
**NOTE:** This project is not related to the Laravel official Validator class. It's
a stricter and statically generated alternative to it.
To read more about the format of the validator files see the [laravel-validator](https://github.com/ernestoalejo/laravel-validator) project directly.
## Installation
Install via [npm](https://npmjs.org/package/gulp-laravel-validator):
```
npm install gulp-laravel-validator --save-dev
```
## Example
```js
var gulp = require('gulp'),
laravelValidator = require('gulp-laravel-validator'),
rename = require('gulp-rename');
gulp.task('default', function() {
return gulp.src('app/validators/**/*.val')
.pipe(laravelValidator())
.pipe(rename({extname: '.php'}))
.pipe(gulp.dest('app/lib/Validators'));
});
```
For this example to work you'll need to install [gulp-rename](https://github.com/hparra/gulp-rename).
| Add a reference to the original library in the docs. | Add a reference to the original library in the docs.
| Markdown | mit | ernestoalejo/gulp-laravel-validator,ernestoalejo/gulp-laravel-validator | markdown | ## Code Before:
> Generate PHP validations from short and concise descriptions of the input format.
**NOTE:** This project is not related to the Laravel official Validator class. It's
a stricter and statically generated alternative to it.
## Installation
Install via [npm](https://npmjs.org/package/gulp-laravel-validator):
```
npm install gulp-laravel-validator --save-dev
```
## Example
```js
var gulp = require('gulp'),
laravelValidator = require('gulp-laravel-validator'),
rename = require('gulp-rename');
gulp.task('default', function() {
return gulp.src('app/validators/**/*.val')
.pipe(laravelValidator())
.pipe(rename({extname: '.php'}))
.pipe(gulp.dest('app/lib/Validators'));
});
```
For this example to work you'll need to install [gulp-rename](https://github.com/hparra/gulp-rename).
## Instruction:
Add a reference to the original library in the docs.
## Code After:
> Generate PHP validations from short and concise descriptions of the input format.
**NOTE:** This project is not related to the Laravel official Validator class. It's
a stricter and statically generated alternative to it.
To read more about the format of the validator files see the [laravel-validator](https://github.com/ernestoalejo/laravel-validator) project directly.
## Installation
Install via [npm](https://npmjs.org/package/gulp-laravel-validator):
```
npm install gulp-laravel-validator --save-dev
```
## Example
```js
var gulp = require('gulp'),
laravelValidator = require('gulp-laravel-validator'),
rename = require('gulp-rename');
gulp.task('default', function() {
return gulp.src('app/validators/**/*.val')
.pipe(laravelValidator())
.pipe(rename({extname: '.php'}))
.pipe(gulp.dest('app/lib/Validators'));
});
```
For this example to work you'll need to install [gulp-rename](https://github.com/hparra/gulp-rename).
|
> Generate PHP validations from short and concise descriptions of the input format.
**NOTE:** This project is not related to the Laravel official Validator class. It's
a stricter and statically generated alternative to it.
+
+ To read more about the format of the validator files see the [laravel-validator](https://github.com/ernestoalejo/laravel-validator) project directly.
+
## Installation
Install via [npm](https://npmjs.org/package/gulp-laravel-validator):
```
npm install gulp-laravel-validator --save-dev
```
## Example
```js
var gulp = require('gulp'),
laravelValidator = require('gulp-laravel-validator'),
rename = require('gulp-rename');
gulp.task('default', function() {
return gulp.src('app/validators/**/*.val')
.pipe(laravelValidator())
.pipe(rename({extname: '.php'}))
.pipe(gulp.dest('app/lib/Validators'));
});
```
For this example to work you'll need to install [gulp-rename](https://github.com/hparra/gulp-rename).
| 3 | 0.09375 | 3 | 0 |
0cfd3b32dc8ba338c89150331d10dd0a974b84ee | layouts/shortcodes/list_core.html | layouts/shortcodes/list_core.html | <em>Active</em><br>
Bridget Kromhout (lead), Kris Buytaert, Jennifer Davis, Bernd Erk, Dan Maher, Matt Stratton (web team lead), John Willis<br>
<br>
<em>Historic</em><br>
Patrick Debois (founder), Damon Edwards, Anthony Goddard, Lindsay Holmwood, Gildas Le Nadan, Stephen Nelson-Smith, Andrew Clay Shafer, Julian Simpson, Christian Trabold, John Vincent, James Wickett
<br>
<br>
| <em>Active</em><br>
Bridget Kromhout (lead), Kris Buytaert, Jennifer Davis, Bernd Erk, Dan Maher, Mike Rosado, Matt Stratton (web team lead), John Willis<br>
<br>
<em>Historic</em><br>
Patrick Debois (founder), Damon Edwards, Anthony Goddard, Lindsay Holmwood, Gildas Le Nadan, Stephen Nelson-Smith, Andrew Clay Shafer, Julian Simpson, Christian Trabold, John Vincent, James Wickett
<br>
<br>
| Add Mike to core team | Add Mike to core team
| HTML | apache-2.0 | gomex/devopsdays-web,gomex/devopsdays-web,gomex/devopsdays-web,gomex/devopsdays-web | html | ## Code Before:
<em>Active</em><br>
Bridget Kromhout (lead), Kris Buytaert, Jennifer Davis, Bernd Erk, Dan Maher, Matt Stratton (web team lead), John Willis<br>
<br>
<em>Historic</em><br>
Patrick Debois (founder), Damon Edwards, Anthony Goddard, Lindsay Holmwood, Gildas Le Nadan, Stephen Nelson-Smith, Andrew Clay Shafer, Julian Simpson, Christian Trabold, John Vincent, James Wickett
<br>
<br>
## Instruction:
Add Mike to core team
## Code After:
<em>Active</em><br>
Bridget Kromhout (lead), Kris Buytaert, Jennifer Davis, Bernd Erk, Dan Maher, Mike Rosado, Matt Stratton (web team lead), John Willis<br>
<br>
<em>Historic</em><br>
Patrick Debois (founder), Damon Edwards, Anthony Goddard, Lindsay Holmwood, Gildas Le Nadan, Stephen Nelson-Smith, Andrew Clay Shafer, Julian Simpson, Christian Trabold, John Vincent, James Wickett
<br>
<br>
| <em>Active</em><br>
- Bridget Kromhout (lead), Kris Buytaert, Jennifer Davis, Bernd Erk, Dan Maher, Matt Stratton (web team lead), John Willis<br>
+ Bridget Kromhout (lead), Kris Buytaert, Jennifer Davis, Bernd Erk, Dan Maher, Mike Rosado, Matt Stratton (web team lead), John Willis<br>
? +++++++++++++
<br>
<em>Historic</em><br>
Patrick Debois (founder), Damon Edwards, Anthony Goddard, Lindsay Holmwood, Gildas Le Nadan, Stephen Nelson-Smith, Andrew Clay Shafer, Julian Simpson, Christian Trabold, John Vincent, James Wickett
<br>
<br> | 2 | 0.285714 | 1 | 1 |
937fd7c07dfe98a086a9af07f0f7b316a6f2f6d8 | invoke/main.py | invoke/main.py |
from ._version import __version__
from .program import Program
program = Program(name="Invoke", binary='inv[oke]', version=__version__)
|
from . import __version__, Program
program = Program(
name="Invoke",
binary='inv[oke]',
version=__version__,
)
| Clean up binstub a bit | Clean up binstub a bit
| Python | bsd-2-clause | frol/invoke,frol/invoke,pyinvoke/invoke,mkusz/invoke,mattrobenolt/invoke,pfmoore/invoke,pyinvoke/invoke,mkusz/invoke,mattrobenolt/invoke,pfmoore/invoke | python | ## Code Before:
from ._version import __version__
from .program import Program
program = Program(name="Invoke", binary='inv[oke]', version=__version__)
## Instruction:
Clean up binstub a bit
## Code After:
from . import __version__, Program
program = Program(
name="Invoke",
binary='inv[oke]',
version=__version__,
)
|
+ from . import __version__, Program
- from ._version import __version__
- from .program import Program
-
- program = Program(name="Invoke", binary='inv[oke]', version=__version__)
+ program = Program(
+ name="Invoke",
+ binary='inv[oke]',
+ version=__version__,
+ ) | 10 | 1.666667 | 6 | 4 |
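The `content` cell of every record interleaves three other cells under fixed headings (`## Code Before:`, `## Instruction:`, `## Code After:`), as in the `invoke/main.py` record above. A sketch of that assembly — only the heading order is taken from the rows themselves; the exact blank-line handling is a guess:

```python
def build_content(old_contents: str, message: str, new_contents: str) -> str:
    """Assemble the dataset's `content` cell from three other cells."""
    return "\n".join([
        "## Code Before:",
        old_contents.rstrip("\n"),
        "## Instruction:",
        message.rstrip("\n"),
        "## Code After:",
        new_contents.rstrip("\n"),
    ]) + "\n"

# Illustration with abbreviated cells from the invoke/main.py record.
print(build_content("program = Program(...)",
                    "Clean up binstub a bit",
                    "program = Program(\n    name='Invoke',\n)"))
```

The `config` column (e.g. `python`, `yaml`, `html+erb`) mirrors the `lang` column in lowercase, so a row can be rebuilt from its individual cells without extra metadata.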
2169af00b430578cc26ed81e448cd52ebb7e4ae2 | posts.md | posts.md | ---
layout: page
title: Posts
---
{% for post in site.posts %}
<article class="post-preview">
<a href="{{ post.url | prepend: site.baseurl | replace: '//', '/' }}">
<h2 class="post-title">{{ post.title }}</h2>
{% if post.subtitle %}
<h3 class="post-subtitle">{{ post.subtitle }}</h3>
{% else %}
<h3 class="post-subtitle">{{ post.excerpt | strip_html | truncatewords: 15 }}</h3>
{% endif %}
</a>
<p class="post-meta">Posted by
{% if post.author %}
{{ post.author }}
{% else %}
{{ site.author }}
{% endif %}
on
{{ post.date | date: '%B %d, %Y' }}
</p>
</article>
<hr>
{% endfor %}
| ---
layout: page
title: Posts
---
{% for post in site.posts %}
<article class="post-preview">
<a href="{{ post.url | prepend: site.baseurl | replace: '//', '/' }}">
<h2 class="post-title">{{ post.title }}</h2>
{% if post.subtitle %}
<h3 class="post-subtitle">{{ post.subtitle }}</h3>
{% else %}
<h3 class="post-subtitle">{{ post.excerpt | strip_html | truncatewords: 15 }}</h3>
{% endif %}
</a>
<p class="post-meta">Posted by
{% if post.author %}
{{ post.author }}
{% else %}
{{ site.author }}
{% endif %}
on
{{ post.date | date: '%B %d, %Y' }} · {% include read_time.html content=post.content %}
</p>
</article>
<hr>
{% endfor %}
| Revert "Attempt at fixing github-pages compilation error" | Revert "Attempt at fixing github-pages compilation error"
This reverts commit 89e74ca52514bcca524a28bd8f90d7b94b9743fa.
| Markdown | mit | SamuelTurner/samuelturner.github.io,SamuelTurner/samuelturner.github.io | markdown | ## Code Before:
---
layout: page
title: Posts
---
{% for post in site.posts %}
<article class="post-preview">
<a href="{{ post.url | prepend: site.baseurl | replace: '//', '/' }}">
<h2 class="post-title">{{ post.title }}</h2>
{% if post.subtitle %}
<h3 class="post-subtitle">{{ post.subtitle }}</h3>
{% else %}
<h3 class="post-subtitle">{{ post.excerpt | strip_html | truncatewords: 15 }}</h3>
{% endif %}
</a>
<p class="post-meta">Posted by
{% if post.author %}
{{ post.author }}
{% else %}
{{ site.author }}
{% endif %}
on
{{ post.date | date: '%B %d, %Y' }}
</p>
</article>
<hr>
{% endfor %}
## Instruction:
Revert "Attempt at fixing github-pages compilation error"
This reverts commit 89e74ca52514bcca524a28bd8f90d7b94b9743fa.
## Code After:
---
layout: page
title: Posts
---
{% for post in site.posts %}
<article class="post-preview">
<a href="{{ post.url | prepend: site.baseurl | replace: '//', '/' }}">
<h2 class="post-title">{{ post.title }}</h2>
{% if post.subtitle %}
<h3 class="post-subtitle">{{ post.subtitle }}</h3>
{% else %}
<h3 class="post-subtitle">{{ post.excerpt | strip_html | truncatewords: 15 }}</h3>
{% endif %}
</a>
<p class="post-meta">Posted by
{% if post.author %}
{{ post.author }}
{% else %}
{{ site.author }}
{% endif %}
on
{{ post.date | date: '%B %d, %Y' }} · {% include read_time.html content=post.content %}
</p>
</article>
<hr>
{% endfor %}
| ---
layout: page
title: Posts
---
{% for post in site.posts %}
<article class="post-preview">
<a href="{{ post.url | prepend: site.baseurl | replace: '//', '/' }}">
<h2 class="post-title">{{ post.title }}</h2>
{% if post.subtitle %}
<h3 class="post-subtitle">{{ post.subtitle }}</h3>
{% else %}
<h3 class="post-subtitle">{{ post.excerpt | strip_html | truncatewords: 15 }}</h3>
{% endif %}
</a>
<p class="post-meta">Posted by
{% if post.author %}
{{ post.author }}
{% else %}
{{ site.author }}
{% endif %}
on
- {{ post.date | date: '%B %d, %Y' }}
+ {{ post.date | date: '%B %d, %Y' }} · {% include read_time.html content=post.content %}
</p>
</article>
<hr>
{% endfor %} | 2 | 0.066667 | 1 | 1 |
8cc68b43f2a3941fe08a93252f0d136b11d93b72 | readme.md | readme.md | This is the source repo for the QuickUI runtime.
The main QuickUI project home page is at http://quickui.org.
Build
-----
1. Install the packages which the build depends upon:
```
npm install
```
2. Build:
```
grunt
```
| Notice
------
QuickUI is no longer under active development.
Many thanks to all who have supported this project. People interested in
component-based web user interface design and development should look at
HTML web components, for example those possible in the
[Polymer](http://polymer-project.org) framework.
About
-----
This is the source repo for the QuickUI runtime.
The main QuickUI project home page is at http://quickui.org.
Build
-----
1. Install the packages which the build depends upon:
```
npm install
```
2. Build:
```
grunt
```
| Add announcement of end of active development. | Add announcement of end of active development. | Markdown | mit | JanMiksovsky/quickui,JanMiksovsky/quickui | markdown | ## Code Before:
This is the source repo for the QuickUI runtime.
The main QuickUI project home page is at http://quickui.org.
Build
-----
1. Install the packages which the build depends upon:
```
npm install
```
2. Build:
```
grunt
```
## Instruction:
Add announcement of end of active development.
## Code After:
Notice
------
QuickUI is no longer under active development.
Many thanks to all who have supported this project. People interested in
component-based web user interface design and development should look at
HTML web components, for example those possible in the
[Polymer](http://polymer-project.org) framework.
About
-----
This is the source repo for the QuickUI runtime.
The main QuickUI project home page is at http://quickui.org.
Build
-----
1. Install the packages which the build depends upon:
```
npm install
```
2. Build:
```
grunt
```
| + Notice
+ ------
+ QuickUI is no longer under active development.
+ Many thanks to all who have supported this project. People interested in
+ component-based web user interface design and development should look at
+ HTML web components, for example those possible in the
+ [Polymer](http://polymer-project.org) framework.
+
+ About
+ -----
This is the source repo for the QuickUI runtime.
The main QuickUI project home page is at http://quickui.org.
Build
-----
1. Install the packages which the build depends upon:
```
npm install
```
2. Build:
```
grunt
``` | 10 | 0.588235 | 10 | 0 |
9a20a024897e0f82cda5cc4b693de1ff567c70d9 | app/src/main/kotlin/net/expandable/sample/MainActivity.kt | app/src/main/kotlin/net/expandable/sample/MainActivity.kt | package net.expandable.sample
import android.os.Bundle
import android.support.v7.app.AppCompatActivity
import android.widget.Toast
import net.expandable.ExpandableTextView
class MainActivity : AppCompatActivity() {
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
val textView = findViewById(R.id.text) as ExpandableTextView
textView.text = getString(R.string.long_text)
textView.setCollapseLines(2)
textView.setOnExpandableClickListener(
onExpand = { showToast("Expand") },
onCollapse = { showToast("Collapse") }
)
}
private fun showToast(text: String) {
Toast.makeText(this, text, Toast.LENGTH_SHORT).show()
}
}
| package net.expandable.sample
import android.os.Bundle
import android.support.v7.app.AppCompatActivity
import android.widget.Toast
import net.expandable.ExpandableTextView
class MainActivity : AppCompatActivity() {
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
val textView = findViewById<ExpandableTextView>(R.id.text)
textView.text = getString(R.string.long_text)
textView.setCollapseLines(2)
textView.setOnExpandableClickListener(
onExpand = { showToast("Expand") },
onCollapse = { showToast("Collapse") }
)
}
private fun showToast(text: String) {
Toast.makeText(this, text, Toast.LENGTH_SHORT).show()
}
}
| Convert cast to findViewById with type parameter | Convert cast to findViewById with type parameter
close #11
| Kotlin | apache-2.0 | yuzumone/ExpandableTextView | kotlin | ## Code Before:
package net.expandable.sample
import android.os.Bundle
import android.support.v7.app.AppCompatActivity
import android.widget.Toast
import net.expandable.ExpandableTextView
class MainActivity : AppCompatActivity() {
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
val textView = findViewById(R.id.text) as ExpandableTextView
textView.text = getString(R.string.long_text)
textView.setCollapseLines(2)
textView.setOnExpandableClickListener(
onExpand = { showToast("Expand") },
onCollapse = { showToast("Collapse") }
)
}
private fun showToast(text: String) {
Toast.makeText(this, text, Toast.LENGTH_SHORT).show()
}
}
## Instruction:
Convert cast to findViewById with type parameter
close #11
## Code After:
package net.expandable.sample
import android.os.Bundle
import android.support.v7.app.AppCompatActivity
import android.widget.Toast
import net.expandable.ExpandableTextView
class MainActivity : AppCompatActivity() {
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
val textView = findViewById<ExpandableTextView>(R.id.text)
textView.text = getString(R.string.long_text)
textView.setCollapseLines(2)
textView.setOnExpandableClickListener(
onExpand = { showToast("Expand") },
onCollapse = { showToast("Collapse") }
)
}
private fun showToast(text: String) {
Toast.makeText(this, text, Toast.LENGTH_SHORT).show()
}
}
| package net.expandable.sample
import android.os.Bundle
import android.support.v7.app.AppCompatActivity
import android.widget.Toast
import net.expandable.ExpandableTextView
class MainActivity : AppCompatActivity() {
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
- val textView = findViewById(R.id.text) as ExpandableTextView
? ^^^^^^^^^^^^^^^
+ val textView = findViewById<ExpandableTextView>(R.id.text)
? ^ ++++++++++++
textView.text = getString(R.string.long_text)
textView.setCollapseLines(2)
textView.setOnExpandableClickListener(
onExpand = { showToast("Expand") },
onCollapse = { showToast("Collapse") }
)
}
private fun showToast(text: String) {
Toast.makeText(this, text, Toast.LENGTH_SHORT).show()
}
} | 2 | 0.08 | 1 | 1 |
f38d084da289834b458baf53e0204528e54ac507 | app/src/main/res/layout/activity_main.xml | app/src/main/res/layout/activity_main.xml | <android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:ads="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context="com.grounduphq.arrispwgen.MainActivity">
<ListView
android:layout_width="344dp"
android:layout_height="445dp"
tools:layout_editor_absoluteY="8dp"
tools:layout_editor_absoluteX="8dp"
android:id="@+id/potd_list" />
<!-- view for AdMob Banner Ad -->
<com.google.android.gms.ads.AdView
android:id="@+id/adView"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginStart="24dp"
tools:layout_constraintBottom_creator="1"
tools:layout_constraintLeft_creator="1"
ads:adSize="BANNER"
ads:adUnitId="@string/banner_ad_unit_id"
ads:layout_constraintBottom_toBottomOf="parent"
ads:layout_constraintLeft_toLeftOf="parent" />
</android.support.constraint.ConstraintLayout>
| <android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:ads="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context="com.grounduphq.arrispwgen.MainActivity">
<ListView
android:id="@+id/potd_list"
android:layout_width="304dp"
android:layout_height="431dp"
android:layout_marginStart="16dp"
ads:layout_constraintLeft_toLeftOf="parent"
android:layout_marginEnd="16dp"
ads:layout_constraintRight_toRightOf="parent"
android:layout_marginTop="16dp"
ads:layout_constraintTop_toTopOf="parent" />
<!-- view for AdMob Banner Ad -->
<com.google.android.gms.ads.AdView
android:id="@+id/adView"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginStart="24dp"
tools:layout_constraintBottom_creator="1"
tools:layout_constraintLeft_creator="1"
ads:adSize="BANNER"
ads:adUnitId="@string/banner_ad_unit_id"
ads:layout_constraintBottom_toBottomOf="parent"
ads:layout_constraintLeft_toLeftOf="parent" />
</android.support.constraint.ConstraintLayout>
| Add missing constraints to POTD list in main activity layout | Add missing constraints to POTD list in main activity layout | XML | mit | borfast/arrispwgen-android,borfast/arrispwgen-android | xml | ## Code Before:
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:ads="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context="com.grounduphq.arrispwgen.MainActivity">
<ListView
android:layout_width="344dp"
android:layout_height="445dp"
tools:layout_editor_absoluteY="8dp"
tools:layout_editor_absoluteX="8dp"
android:id="@+id/potd_list" />
<!-- view for AdMob Banner Ad -->
<com.google.android.gms.ads.AdView
android:id="@+id/adView"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginStart="24dp"
tools:layout_constraintBottom_creator="1"
tools:layout_constraintLeft_creator="1"
ads:adSize="BANNER"
ads:adUnitId="@string/banner_ad_unit_id"
ads:layout_constraintBottom_toBottomOf="parent"
ads:layout_constraintLeft_toLeftOf="parent" />
</android.support.constraint.ConstraintLayout>
## Instruction:
Add missing constraints to POTD list in main activity layout
## Code After:
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:ads="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context="com.grounduphq.arrispwgen.MainActivity">
<ListView
android:id="@+id/potd_list"
android:layout_width="304dp"
android:layout_height="431dp"
android:layout_marginStart="16dp"
ads:layout_constraintLeft_toLeftOf="parent"
android:layout_marginEnd="16dp"
ads:layout_constraintRight_toRightOf="parent"
android:layout_marginTop="16dp"
ads:layout_constraintTop_toTopOf="parent" />
<!-- view for AdMob Banner Ad -->
<com.google.android.gms.ads.AdView
android:id="@+id/adView"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginStart="24dp"
tools:layout_constraintBottom_creator="1"
tools:layout_constraintLeft_creator="1"
ads:adSize="BANNER"
ads:adUnitId="@string/banner_ad_unit_id"
ads:layout_constraintBottom_toBottomOf="parent"
ads:layout_constraintLeft_toLeftOf="parent" />
</android.support.constraint.ConstraintLayout>
| <android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:ads="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context="com.grounduphq.arrispwgen.MainActivity">
<ListView
+ android:id="@+id/potd_list"
- android:layout_width="344dp"
? ^
+ android:layout_width="304dp"
? ^
- android:layout_height="445dp"
? ^^
+ android:layout_height="431dp"
? ^^
- tools:layout_editor_absoluteY="8dp"
- tools:layout_editor_absoluteX="8dp"
- android:id="@+id/potd_list" />
+ android:layout_marginStart="16dp"
+ ads:layout_constraintLeft_toLeftOf="parent"
+ android:layout_marginEnd="16dp"
+ ads:layout_constraintRight_toRightOf="parent"
+ android:layout_marginTop="16dp"
+ ads:layout_constraintTop_toTopOf="parent" />
<!-- view for AdMob Banner Ad -->
<com.google.android.gms.ads.AdView
android:id="@+id/adView"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginStart="24dp"
tools:layout_constraintBottom_creator="1"
tools:layout_constraintLeft_creator="1"
ads:adSize="BANNER"
ads:adUnitId="@string/banner_ad_unit_id"
ads:layout_constraintBottom_toBottomOf="parent"
ads:layout_constraintLeft_toLeftOf="parent" />
</android.support.constraint.ConstraintLayout> | 14 | 0.5 | 9 | 5 |
b50d434d086b47c6e5f5b746d06433b336d5bbd5 | developer/doctrine/data/fixtures/users.yml | developer/doctrine/data/fixtures/users.yml | Model_User:
Guest:
username: guest
password: 084e0343a0486ff05530df6c705c8bb4
Member:
username: member
password: aa08769cdcb26674c6706093503ff0a3
Admin:
username: admin
password: 21232f297a57a5a743894a0e4a801fc3
Owner:
username: owner
password: 72122ce96bfec66e2396d2e25225d70a | Model_User:
Guest:
username: guest
password: 084e0343a0486ff05530df6c705c8bb4
firstName: Guest
surname: User
Member:
username: member
password: aa08769cdcb26674c6706093503ff0a3
firstName: Member
surname: User
Admin:
username: admin
password: 21232f297a57a5a743894a0e4a801fc3
firstName: Admin
surname: User
Owner:
username: owner
password: 72122ce96bfec66e2396d2e25225d70a
firstName: Owner
surname: User | Add firstName and surname user fixture | Add firstName and surname user fixture | YAML | mit | lab2023/kebab-project,lab2023/kebab-project,lab2023/kebab-project | yaml | ## Code Before:
Model_User:
Guest:
username: guest
password: 084e0343a0486ff05530df6c705c8bb4
Member:
username: member
password: aa08769cdcb26674c6706093503ff0a3
Admin:
username: admin
password: 21232f297a57a5a743894a0e4a801fc3
Owner:
username: owner
password: 72122ce96bfec66e2396d2e25225d70a
## Instruction:
Add firstName and surname user fixture
## Code After:
Model_User:
Guest:
username: guest
password: 084e0343a0486ff05530df6c705c8bb4
firstName: Guest
surname: User
Member:
username: member
password: aa08769cdcb26674c6706093503ff0a3
firstName: Member
surname: User
Admin:
username: admin
password: 21232f297a57a5a743894a0e4a801fc3
firstName: Admin
surname: User
Owner:
username: owner
password: 72122ce96bfec66e2396d2e25225d70a
firstName: Owner
surname: User | Model_User:
Guest:
username: guest
password: 084e0343a0486ff05530df6c705c8bb4
+ firstName: Guest
+ surname: User
Member:
username: member
password: aa08769cdcb26674c6706093503ff0a3
+ firstName: Member
+ surname: User
Admin:
username: admin
password: 21232f297a57a5a743894a0e4a801fc3
+ firstName: Admin
+ surname: User
Owner:
username: owner
password: 72122ce96bfec66e2396d2e25225d70a
+ firstName: Owner
+ surname: User | 8 | 0.5 | 8 | 0 |
647e7ad365fc6918599f182ba603b9ec0ccaf23a | core/db/migrate/20140120135026_remove_facts_without_site.rb | core/db/migrate/20140120135026_remove_facts_without_site.rb | class RemoveFactsWithoutSite < Mongoid::Migration
def self.up
Fact.all.to_a
.keep_if{|f| !f.site || !f.site.url || f.site.url.blank?)}
.each do |fact|
Resque.enqueue ReallyRemoveFact, fact.id
end
end
def self.down
end
end
| class RemoveFactsWithoutSite < Mongoid::Migration
def self.up
Fact.all.ids.each do |fact_id|
fact = Fact[fact_id]
unless f.site and f.site.url and not f.site.url.blank?
Resque.enqueue ReallyRemoveFact, fact_id
end
end
end
def self.down
end
end
| Use Fact.ids and don't use keep_if | Use Fact.ids and don't use keep_if
| Ruby | mit | daukantas/factlink-core,daukantas/factlink-core,daukantas/factlink-core,Factlink/factlink-core,Factlink/factlink-core,Factlink/factlink-core,daukantas/factlink-core,Factlink/factlink-core | ruby | ## Code Before:
class RemoveFactsWithoutSite < Mongoid::Migration
def self.up
Fact.all.to_a
.keep_if{|f| !f.site || !f.site.url || f.site.url.blank?)}
.each do |fact|
Resque.enqueue ReallyRemoveFact, fact.id
end
end
def self.down
end
end
## Instruction:
Use Fact.ids and don't use keep_if
## Code After:
class RemoveFactsWithoutSite < Mongoid::Migration
def self.up
Fact.all.ids.each do |fact_id|
fact = Fact[fact_id]
unless f.site and f.site.url and not f.site.url.blank?
Resque.enqueue ReallyRemoveFact, fact_id
end
end
end
def self.down
end
end
| class RemoveFactsWithoutSite < Mongoid::Migration
def self.up
+ Fact.all.ids.each do |fact_id|
+ fact = Fact[fact_id]
- Fact.all.to_a
- .keep_if{|f| !f.site || !f.site.url || f.site.url.blank?)}
- .each do |fact|
+ unless f.site and f.site.url and not f.site.url.blank?
- Resque.enqueue ReallyRemoveFact, fact.id
? ^
+ Resque.enqueue ReallyRemoveFact, fact_id
? ^
-
+ end
end
end
def self.down
end
end | 10 | 0.714286 | 5 | 5 |
437444d8572d6713f1b3036c9dc69641b452351f | code/type_null_true_false.rb | code/type_null_true_false.rb | def if_value(values)
puts '"if value":'
values.each { |k, v| puts "#{k} - #{v ? 'true' : 'false'}" }
puts ''
end
def nil_value(values)
puts '"if value.nil?":'
values.each { |k, v| puts "#{k} - #{v.nil? ? 'true' : 'false'}" }
puts ''
end
def empty_value(values)
puts '"if value.empty?":'
values.each do |k, v|
puts "#{k} - #{v.empty? ? 'true' : 'false'}" if v.respond_to? :empty?
end
end
values = {
"'string'": 'string',
"''": '',
'[1, 2, 3]': [1, 2, 3],
'[]': [],
'5': 5,
'0': 0,
true: true,
false: false,
nil: nil
}
if_value(values)
nil_value(values)
empty_value(values)
| def check(label, fn, values)
puts label
values.each do |value|
begin
result = fn.call(value) ? 'true' : 'false'
rescue => e
result = "error: #{e}"
end
printf(" %-9p - %s\n", value, result)
end
puts ''
end
values = ['string', '', [1, 2, 3], [], 5, 0, true, false, nil]
check('if value:', -> (v) { v }, values)
check('if value.nil?:', -> (v) { v.nil? }, values)
check('if value.empty?:', -> (v) { v.empty? }, values)
| Refactor null-true-false example in Ruby to make it more readable | Refactor null-true-false example in Ruby to make it more readable
| Ruby | mit | Evmorov/ruby-coffeescript,evmorov/lang-compare,evmorov/lang-compare,Evmorov/ruby-coffeescript,evmorov/lang-compare,evmorov/lang-compare,Evmorov/ruby-coffeescript,evmorov/lang-compare,evmorov/lang-compare | ruby | ## Code Before:
def if_value(values)
puts '"if value":'
values.each { |k, v| puts "#{k} - #{v ? 'true' : 'false'}" }
puts ''
end
def nil_value(values)
puts '"if value.nil?":'
values.each { |k, v| puts "#{k} - #{v.nil? ? 'true' : 'false'}" }
puts ''
end
def empty_value(values)
puts '"if value.empty?":'
values.each do |k, v|
puts "#{k} - #{v.empty? ? 'true' : 'false'}" if v.respond_to? :empty?
end
end
values = {
"'string'": 'string',
"''": '',
'[1, 2, 3]': [1, 2, 3],
'[]': [],
'5': 5,
'0': 0,
true: true,
false: false,
nil: nil
}
if_value(values)
nil_value(values)
empty_value(values)
## Instruction:
Refactor null-true-false example in Ruby to make it more readable
## Code After:
def check(label, fn, values)
puts label
values.each do |value|
begin
result = fn.call(value) ? 'true' : 'false'
rescue => e
result = "error: #{e}"
end
printf(" %-9p - %s\n", value, result)
end
puts ''
end
values = ['string', '', [1, 2, 3], [], 5, 0, true, false, nil]
check('if value:', -> (v) { v }, values)
check('if value.nil?:', -> (v) { v.nil? }, values)
check('if value.empty?:', -> (v) { v.empty? }, values)
| - def if_value(values)
- puts '"if value":'
- values.each { |k, v| puts "#{k} - #{v ? 'true' : 'false'}" }
+ def check(label, fn, values)
+ puts label
+ values.each do |value|
+ begin
+ result = fn.call(value) ? 'true' : 'false'
+ rescue => e
+ result = "error: #{e}"
+ end
+ printf(" %-9p - %s\n", value, result)
+ end
puts ''
end
+ values = ['string', '', [1, 2, 3], [], 5, 0, true, false, nil]
- def nil_value(values)
- puts '"if value.nil?":'
- values.each { |k, v| puts "#{k} - #{v.nil? ? 'true' : 'false'}" }
- puts ''
- end
+ check('if value:', -> (v) { v }, values)
+ check('if value.nil?:', -> (v) { v.nil? }, values)
+ check('if value.empty?:', -> (v) { v.empty? }, values)
- def empty_value(values)
- puts '"if value.empty?":'
- values.each do |k, v|
- puts "#{k} - #{v.empty? ? 'true' : 'false'}" if v.respond_to? :empty?
- end
- end
-
- values = {
- "'string'": 'string',
- "''": '',
- '[1, 2, 3]': [1, 2, 3],
- '[]': [],
- '5': 5,
- '0': 0,
- true: true,
- false: false,
- nil: nil
- }
-
- if_value(values)
- nil_value(values)
- empty_value(values) | 44 | 1.294118 | 14 | 30 |
501422e46dbbd07436eec45c98abca0f3343a8e4 | spec/integration/spec_helper.rb | spec/integration/spec_helper.rb | require 'capybara/rspec'
Capybara.default_driver = :selenium
Capybara.app_host = 'http://localhost:8080'
def current_route
route = URI.parse(current_url).fragment
route if route and not route.empty?
end
Dir[File.join(File.dirname(__FILE__), 'helpers', "**", "*")].each {|f| require f}
RSpec.configure do |c|
c.include Capybara::DSL
c.include Capybara::RSpecMatchers
c.include LoginHelpers
c.include CleditorHelpers
Capybara.default_wait_time = 30
end | require 'capybara/rspec'
require 'capybara-screenshot'
Capybara.default_driver = :selenium
Capybara.app_host = 'http://localhost:8080'
Capybara.save_and_open_page_path = ENV['CC_BUILD_ARTIFACTS'] || File.join(File.dirname(__FILE__), '..', '..', '..', '..', '..', 'rspec_failures')
def current_route
route = URI.parse(current_url).fragment
route if route and not route.empty?
end
Dir[File.join(File.dirname(__FILE__), 'helpers', "**", "*")].each {|f| require f}
RSpec.configure do |c|
c.include Capybara::DSL
c.include Capybara::RSpecMatchers
c.include LoginHelpers
c.include CleditorHelpers
Capybara.default_wait_time = 30
end
| Save screenshot in build artifacts when integration test fails | Save screenshot in build artifacts when integration test fails
[fixes #27034517]
| Ruby | apache-2.0 | mpushpav/chorus,prakash-alpine/chorus,prakash-alpine/chorus,jamesblunt/chorus,prakash-alpine/chorus,mpushpav/chorus,jamesblunt/chorus,prakash-alpine/chorus,atul-alpine/chorus,hewtest/chorus,lukepolo/chorus,atul-alpine/chorus,mpushpav/chorus,lukepolo/chorus,jamesblunt/chorus,hewtest/chorus,atul-alpine/chorus,atul-alpine/chorus,atul-alpine/chorus,hewtest/chorus,jamesblunt/chorus,lukepolo/chorus,mpushpav/chorus,hewtest/chorus,atul-alpine/chorus,lukepolo/chorus,hewtest/chorus,lukepolo/chorus,hewtest/chorus,mpushpav/chorus,hewtest/chorus,prakash-alpine/chorus,jamesblunt/chorus,hewtest/chorus,lukepolo/chorus,lukepolo/chorus,jamesblunt/chorus,mpushpav/chorus,jamesblunt/chorus,mpushpav/chorus,atul-alpine/chorus | ruby | ## Code Before:
require 'capybara/rspec'
Capybara.default_driver = :selenium
Capybara.app_host = 'http://localhost:8080'
def current_route
route = URI.parse(current_url).fragment
route if route and not route.empty?
end
Dir[File.join(File.dirname(__FILE__), 'helpers', "**", "*")].each {|f| require f}
RSpec.configure do |c|
c.include Capybara::DSL
c.include Capybara::RSpecMatchers
c.include LoginHelpers
c.include CleditorHelpers
Capybara.default_wait_time = 30
end
## Instruction:
Save screenshot in build artifacts when integration test fails
[fixes #27034517]
## Code After:
require 'capybara/rspec'
require 'capybara-screenshot'
Capybara.default_driver = :selenium
Capybara.app_host = 'http://localhost:8080'
Capybara.save_and_open_page_path = ENV['CC_BUILD_ARTIFACTS'] || File.join(File.dirname(__FILE__), '..', '..', '..', '..', '..', 'rspec_failures')
def current_route
route = URI.parse(current_url).fragment
route if route and not route.empty?
end
Dir[File.join(File.dirname(__FILE__), 'helpers', "**", "*")].each {|f| require f}
RSpec.configure do |c|
c.include Capybara::DSL
c.include Capybara::RSpecMatchers
c.include LoginHelpers
c.include CleditorHelpers
Capybara.default_wait_time = 30
end
| require 'capybara/rspec'
+ require 'capybara-screenshot'
Capybara.default_driver = :selenium
Capybara.app_host = 'http://localhost:8080'
+ Capybara.save_and_open_page_path = ENV['CC_BUILD_ARTIFACTS'] || File.join(File.dirname(__FILE__), '..', '..', '..', '..', '..', 'rspec_failures')
def current_route
route = URI.parse(current_url).fragment
route if route and not route.empty?
end
Dir[File.join(File.dirname(__FILE__), 'helpers', "**", "*")].each {|f| require f}
RSpec.configure do |c|
c.include Capybara::DSL
c.include Capybara::RSpecMatchers
c.include LoginHelpers
c.include CleditorHelpers
Capybara.default_wait_time = 30
end | 2 | 0.095238 | 2 | 0 |
28e00395cd29dee1449ec522b55d08f68518eb70 | pyoctree/__init__.py | pyoctree/__init__.py | import version
__version__ = version.__version__
| from .version import __version__
__version__ = version.__version__
| Fix import bug in Python 3 | Fix import bug in Python 3 | Python | mit | mhogg/pyoctree,mhogg/pyoctree | python | ## Code Before:
import version
__version__ = version.__version__
## Instruction:
Fix import bug in Python 3
## Code After:
from .version import __version__
__version__ = version.__version__
| - import version
+ from .version import __version__
__version__ = version.__version__ | 2 | 1 | 1 | 1 |
0f47f712e45ade11aa6d297bfd362182d63fe7b6 | docs/source/api.rst | docs/source/api.rst | API
===
Symbols
-------
* **t** - True.
* **nil** - False.
* **pi** - 3.14159
* **e** - 2.7182
Basic Functions
---------------
.. automodule:: lcad_language.functions
:members: LCadAref, LCadBlock, LCadCond, LCadDef, LCadFor, LCadIf, LCadImport, LCadList, LCadMirror, LCadPart, LCadPrint, LCadRotate, LCadSet, LCadTranslate, LCadWhile
Comparison Operators
--------------------
.. automodule:: lcad_language.functions
:members: LCadEqual, LCadNe, LCadGt, LCadGe, LCadLt, LCadLe
Logical Operators
-----------------
.. automodule:: lcad_language.functions
:members: LCadAnd, LCadOr, LCadNot
Math Functions
--------------
.. automodule:: lcad_language.functions
:members: LCadPlus, LCadMinus, LCadMultiply, LCadDivide, LCadModulo
Python Math Functions
---------------------
All the functions in the python math library are also available:
Usage::
(cos x)
(sin x)
...
| API
===
Symbols
-------
* **t** - True
* **nil** - False
* **pi** - 3.14159
* **e** - 2.7182
* **time-index** - 0..N, for creating animations.
Basic Functions
---------------
.. automodule:: lcad_language.functions
:members: LCadAref, LCadBlock, LCadCond, LCadDef, LCadFor, LCadIf, LCadImport, LCadList, LCadMirror, LCadPart, LCadPrint, LCadRotate, LCadSet, LCadTranslate, LCadWhile
Comparison Operators
--------------------
.. automodule:: lcad_language.functions
:members: LCadEqual, LCadNe, LCadGt, LCadGe, LCadLt, LCadLe
Logical Operators
-----------------
.. automodule:: lcad_language.functions
:members: LCadAnd, LCadOr, LCadNot
Math Functions
--------------
.. automodule:: lcad_language.functions
:members: LCadPlus, LCadMinus, LCadMultiply, LCadDivide, LCadModulo
Python Math Functions
---------------------
All the functions in the python math library are also available:
Usage::
(cos x)
(sin x)
...
| Add time-index to the list of symbols. | Add time-index to the list of symbols.
| reStructuredText | mit | HazenBabcock/opensdraw | restructuredtext | ## Code Before:
API
===
Symbols
-------
* **t** - True.
* **nil** - False.
* **pi** - 3.14159
* **e** - 2.7182
Basic Functions
---------------
.. automodule:: lcad_language.functions
:members: LCadAref, LCadBlock, LCadCond, LCadDef, LCadFor, LCadIf, LCadImport, LCadList, LCadMirror, LCadPart, LCadPrint, LCadRotate, LCadSet, LCadTranslate, LCadWhile
Comparison Operators
--------------------
.. automodule:: lcad_language.functions
:members: LCadEqual, LCadNe, LCadGt, LCadGe, LCadLt, LCadLe
Logical Operators
-----------------
.. automodule:: lcad_language.functions
:members: LCadAnd, LCadOr, LCadNot
Math Functions
--------------
.. automodule:: lcad_language.functions
:members: LCadPlus, LCadMinus, LCadMultiply, LCadDivide, LCadModulo
Python Math Functions
---------------------
All the functions in the python math library are also available:
Usage::
(cos x)
(sin x)
...
## Instruction:
Add time-index to the list of symbols.
## Code After:
API
===
Symbols
-------
* **t** - True
* **nil** - False
* **pi** - 3.14159
* **e** - 2.7182
* **time-index** - 0..N, for creating animations.
Basic Functions
---------------
.. automodule:: lcad_language.functions
:members: LCadAref, LCadBlock, LCadCond, LCadDef, LCadFor, LCadIf, LCadImport, LCadList, LCadMirror, LCadPart, LCadPrint, LCadRotate, LCadSet, LCadTranslate, LCadWhile
Comparison Operators
--------------------
.. automodule:: lcad_language.functions
:members: LCadEqual, LCadNe, LCadGt, LCadGe, LCadLt, LCadLe
Logical Operators
-----------------
.. automodule:: lcad_language.functions
:members: LCadAnd, LCadOr, LCadNot
Math Functions
--------------
.. automodule:: lcad_language.functions
:members: LCadPlus, LCadMinus, LCadMultiply, LCadDivide, LCadModulo
Python Math Functions
---------------------
All the functions in the python math library are also available:
Usage::
(cos x)
(sin x)
...
| API
===
Symbols
-------
- * **t** - True.
? -
+ * **t** - True
- * **nil** - False.
? -
+ * **nil** - False
* **pi** - 3.14159
* **e** - 2.7182
+ * **time-index** - 0..N, for creating animations.
Basic Functions
---------------
.. automodule:: lcad_language.functions
:members: LCadAref, LCadBlock, LCadCond, LCadDef, LCadFor, LCadIf, LCadImport, LCadList, LCadMirror, LCadPart, LCadPrint, LCadRotate, LCadSet, LCadTranslate, LCadWhile
Comparison Operators
--------------------
.. automodule:: lcad_language.functions
:members: LCadEqual, LCadNe, LCadGt, LCadGe, LCadLt, LCadLe
Logical Operators
-----------------
.. automodule:: lcad_language.functions
:members: LCadAnd, LCadOr, LCadNot
Math Functions
--------------
.. automodule:: lcad_language.functions
:members: LCadPlus, LCadMinus, LCadMultiply, LCadDivide, LCadModulo
Python Math Functions
---------------------
All the functions in the python math library are also available:
Usage::
(cos x)
(sin x)
...
| 5 | 0.111111 | 3 | 2 |
3bc220819478dbb558fa9a2925cad11d79eb1980 | Casks/virtualbox.rb | Casks/virtualbox.rb | class Virtualbox < Cask
url 'http://download.virtualbox.org/virtualbox/4.2.10/VirtualBox-4.2.10-84104-OSX.dmg'
homepage 'http://www.virtualbox.org'
version '4.2.10-84104'
sha1 'e4d016587426c1ef041bd460905b5385e566fe77'
end
| class Virtualbox < Cask
url 'http://download.virtualbox.org/virtualbox/4.2.12/VirtualBox-4.2.12-84980-OSX.dmg'
homepage 'http://www.virtualbox.org'
version '4.2.12-84980'
sha1 '9a5913f0b25c69a67b554caa9bebd1a4d72d3d00'
end
| Upgrade Virtual box to 4.2.12 | Upgrade Virtual box to 4.2.12 | Ruby | bsd-2-clause | chino/homebrew-cask,bkono/homebrew-cask,kamilboratynski/homebrew-cask,maxnordlund/homebrew-cask,skatsuta/homebrew-cask,puffdad/homebrew-cask,hakamadare/homebrew-cask,dieterdemeyer/homebrew-cask,helloIAmPau/homebrew-cask,shoichiaizawa/homebrew-cask,n8henrie/homebrew-cask,taherio/homebrew-cask,paulombcosta/homebrew-cask,flaviocamilo/homebrew-cask,xcezx/homebrew-cask,xyb/homebrew-cask,sachin21/homebrew-cask,segiddins/homebrew-cask,lvicentesanchez/homebrew-cask,tmoreira2020/homebrew,garborg/homebrew-cask,faun/homebrew-cask,nshemonsky/homebrew-cask,AndreTheHunter/homebrew-cask,sohtsuka/homebrew-cask,Nitecon/homebrew-cask,mwean/homebrew-cask,wKovacs64/homebrew-cask,kevyau/homebrew-cask,djmonta/homebrew-cask,leonmachadowilcox/homebrew-cask,kevinoconnor7/homebrew-cask,troyxmccall/homebrew-cask,deanmorin/homebrew-cask,lolgear/homebrew-cask,williamboman/homebrew-cask,bcomnes/homebrew-cask,ch3n2k/homebrew-cask,tarwich/homebrew-cask,joshka/homebrew-cask,lcasey001/homebrew-cask,nanoxd/homebrew-cask,Ngrd/homebrew-cask,morsdyce/homebrew-cask,scribblemaniac/homebrew-cask,antogg/homebrew-cask,jellyfishcoder/homebrew-cask,ywfwj2008/homebrew-cask,joshka/homebrew-cask,fharbe/homebrew-cask,sirodoht/homebrew-cask,Philosoft/homebrew-cask,markhuber/homebrew-cask,lumaxis/homebrew-cask,danielbayley/homebrew-cask,shorshe/homebrew-cask,ericbn/homebrew-cask,tolbkni/homebrew-cask,optikfluffel/homebrew-cask,fazo96/homebrew-cask,hristozov/homebrew-cask,AnastasiaSulyagina/homebrew-cask,n0ts/homebrew-cask,greg5green/homebrew-cask,jhowtan/homebrew-cask,thomanq/homebrew-cask,dezon/homebrew-cask,Ngrd/homebrew-cask,onlynone/homebrew-cask,chrisfinazzo/homebrew-cask,kuno/homebrew-cask,patresi/homebrew-cask,ksylvan/homebrew-cask,zchee/homebrew-cask,joschi/homebrew-cask,fwiesel/homebrew-cask,reitermarkus/homebrew-cask,fly19890211/homebrew-cask,kronicd/homebrew-cask,zerrot/homebrew-cask,mwilmer/homebrew-cask,crzrcn/homebr
ew-cask,kevinoconnor7/homebrew-cask,julionc/homebrew-cask,victorpopkov/homebrew-cask,johnjelinek/homebrew-cask,malob/homebrew-cask,jacobdam/homebrew-cask,santoshsahoo/homebrew-cask,adrianchia/homebrew-cask,yutarody/homebrew-cask,scw/homebrew-cask,MoOx/homebrew-cask,boecko/homebrew-cask,scribblemaniac/homebrew-cask,gibsjose/homebrew-cask,ky0615/homebrew-cask-1,kingthorin/homebrew-cask,brianshumate/homebrew-cask,mAAdhaTTah/homebrew-cask,winkelsdorf/homebrew-cask,bgandon/homebrew-cask,gyndav/homebrew-cask,pablote/homebrew-cask,zeusdeux/homebrew-cask,cohei/homebrew-cask,paour/homebrew-cask,guylabs/homebrew-cask,RogerThiede/homebrew-cask,rednoah/homebrew-cask,renard/homebrew-cask,doits/homebrew-cask,pinut/homebrew-cask,asins/homebrew-cask,iAmGhost/homebrew-cask,renaudguerin/homebrew-cask,kevyau/homebrew-cask,hellosky806/homebrew-cask,optikfluffel/homebrew-cask,sjackman/homebrew-cask,ahvigil/homebrew-cask,y00rb/homebrew-cask,wickedsp1d3r/homebrew-cask,Whoaa512/homebrew-cask,julienlavergne/homebrew-cask,schneidmaster/homebrew-cask,d/homebrew-cask,diguage/homebrew-cask,jen20/homebrew-cask,delphinus35/homebrew-cask,diguage/homebrew-cask,lifepillar/homebrew-cask,rubenerd/homebrew-cask,paour/homebrew-cask,dunn/homebrew-cask,otaran/homebrew-cask,jrwesolo/homebrew-cask,claui/homebrew-cask,alexg0/homebrew-cask,stephenwade/homebrew-cask,johndbritton/homebrew-cask,johndbritton/homebrew-cask,donbobka/homebrew-cask,joaoponceleao/homebrew-cask,wolflee/homebrew-cask,shishi/homebrew-cask,pacav69/homebrew-cask,retbrown/homebrew-cask,RJHsiao/homebrew-cask,imgarylai/homebrew-cask,gyndav/homebrew-cask,stevehedrick/homebrew-cask,13k/homebrew-cask,nathancahill/homebrew-cask,inz/homebrew-cask,jrwesolo/homebrew-cask,shorshe/homebrew-cask,mingzhi22/homebrew-cask,mingzhi22/homebrew-cask,mwek/homebrew-cask,JikkuJose/homebrew-cask,cedwardsmedia/homebrew-cask,ywfwj2008/homebrew-cask,tedbundyjr/homebrew-cask,shanonvl/homebrew-cask,SentinelWarren/homebrew-cask,0xadada/homebrew-cask,bric3/homebrew-cask
,corbt/homebrew-cask,ptb/homebrew-cask,moogar0880/homebrew-cask,yuhki50/homebrew-cask,cliffcotino/homebrew-cask,askl56/homebrew-cask,bgandon/homebrew-cask,jspahrsummers/homebrew-cask,astorije/homebrew-cask,michelegera/homebrew-cask,FredLackeyOfficial/homebrew-cask,vigosan/homebrew-cask,nathanielvarona/homebrew-cask,christer155/homebrew-cask,mokagio/homebrew-cask,onlynone/homebrew-cask,adelinofaria/homebrew-cask,mAAdhaTTah/homebrew-cask,jamesmlees/homebrew-cask,muan/homebrew-cask,CameronGarrett/homebrew-cask,gurghet/homebrew-cask,bkono/homebrew-cask,thii/homebrew-cask,csmith-palantir/homebrew-cask,theoriginalgri/homebrew-cask,RickWong/homebrew-cask,Philosoft/homebrew-cask,donbobka/homebrew-cask,xakraz/homebrew-cask,larseggert/homebrew-cask,SamiHiltunen/homebrew-cask,mahori/homebrew-cask,wizonesolutions/homebrew-cask,rcuza/homebrew-cask,psibre/homebrew-cask,JikkuJose/homebrew-cask,fanquake/homebrew-cask,lucasmezencio/homebrew-cask,jacobbednarz/homebrew-cask,nrlquaker/homebrew-cask,jonathanwiesel/homebrew-cask,johnste/homebrew-cask,ahvigil/homebrew-cask,jgarber623/homebrew-cask,kiliankoe/homebrew-cask,mauricerkelly/homebrew-cask,yumitsu/homebrew-cask,pgr0ss/homebrew-cask,jtriley/homebrew-cask,cfillion/homebrew-cask,genewoo/homebrew-cask,gilesdring/homebrew-cask,segiddins/homebrew-cask,optikfluffel/homebrew-cask,bchatard/homebrew-cask,sscotth/homebrew-cask,JacopKane/homebrew-cask,alebcay/homebrew-cask,afdnlw/homebrew-cask,nicholsn/homebrew-cask,alloy/homebrew-cask,kpearson/homebrew-cask,csmith-palantir/homebrew-cask,bchatard/homebrew-cask,kirikiriyamama/homebrew-cask,coeligena/homebrew-customized,0rax/homebrew-cask,bendoerr/homebrew-cask,jeroenseegers/homebrew-cask,jangalinski/homebrew-cask,neverfox/homebrew-cask,sosedoff/homebrew-cask,chrisfinazzo/homebrew-cask,squid314/homebrew-cask,catap/homebrew-cask,nysthee/homebrew-cask,sanyer/homebrew-cask,hristozov/homebrew-cask,Ephemera/homebrew-cask,williamboman/homebrew-cask,kTitan/homebrew-cask,dcondrey/homebrew-cask,tdsmith
/homebrew-cask,aguynamedryan/homebrew-cask,FranklinChen/homebrew-cask,johan/homebrew-cask,troyxmccall/homebrew-cask,a1russell/homebrew-cask,ianyh/homebrew-cask,tdsmith/homebrew-cask,colindean/homebrew-cask,wizonesolutions/homebrew-cask,yurrriq/homebrew-cask,moimikey/homebrew-cask,lifepillar/homebrew-cask,ericbn/homebrew-cask,alebcay/homebrew-cask,mfpierre/homebrew-cask,lvicentesanchez/homebrew-cask,mahori/homebrew-cask,jpodlech/homebrew-cask,julienlavergne/homebrew-cask,larseggert/homebrew-cask,RogerThiede/homebrew-cask,freeslugs/homebrew-cask,asbachb/homebrew-cask,doits/homebrew-cask,arranubels/homebrew-cask,scw/homebrew-cask,Bombenleger/homebrew-cask,sysbot/homebrew-cask,mishari/homebrew-cask,kteru/homebrew-cask,petmoo/homebrew-cask,tedski/homebrew-cask,franklouwers/homebrew-cask,dustinblackman/homebrew-cask,ch3n2k/homebrew-cask,codeurge/homebrew-cask,bric3/homebrew-cask,amatos/homebrew-cask,pablote/homebrew-cask,aguynamedryan/homebrew-cask,daften/homebrew-cask,sachin21/homebrew-cask,wolflee/homebrew-cask,dwihn0r/homebrew-cask,singingwolfboy/homebrew-cask,inz/homebrew-cask,jeanregisser/homebrew-cask,mahori/homebrew-cask,mrmachine/homebrew-cask,goxberry/homebrew-cask,franklouwers/homebrew-cask,dlovitch/homebrew-cask,janlugt/homebrew-cask,sparrc/homebrew-cask,atsuyim/homebrew-cask,robbiethegeek/homebrew-cask,LaurentFough/homebrew-cask,jamesmlees/homebrew-cask,tranc99/homebrew-cask,dezon/homebrew-cask,jalaziz/homebrew-cask,0xadada/homebrew-cask,blogabe/homebrew-cask,howie/homebrew-cask,gguillotte/homebrew-cask,claui/homebrew-cask,sparrc/homebrew-cask,valepert/homebrew-cask,malford/homebrew-cask,gyugyu/homebrew-cask,vuquoctuan/homebrew-cask,wickedsp1d3r/homebrew-cask,ksylvan/homebrew-cask,elyscape/homebrew-cask,MoOx/homebrew-cask,mishari/homebrew-cask,gabrielizaias/homebrew-cask,napaxton/homebrew-cask,crmne/homebrew-cask,Gasol/homebrew-cask,neil-ca-moore/homebrew-cask,dvdoliveira/homebrew-cask,Ephemera/homebrew-cask,hovancik/homebrew-cask,englishm/homebrew-cask,wastra
chan/homebrew-cask,nickpellant/homebrew-cask,winkelsdorf/homebrew-cask,stevenmaguire/homebrew-cask,kassi/homebrew-cask,stevenmaguire/homebrew-cask,katoquro/homebrew-cask,Amorymeltzer/homebrew-cask,stigkj/homebrew-caskroom-cask,vuquoctuan/homebrew-cask,Fedalto/homebrew-cask,ctrevino/homebrew-cask,samshadwell/homebrew-cask,neverfox/homebrew-cask,nightscape/homebrew-cask,perfide/homebrew-cask,lukasbestle/homebrew-cask,kassi/homebrew-cask,nathancahill/homebrew-cask,coneman/homebrew-cask,MichaelPei/homebrew-cask,feigaochn/homebrew-cask,samshadwell/homebrew-cask,antogg/homebrew-cask,scottsuch/homebrew-cask,coneman/homebrew-cask,yurikoles/homebrew-cask,leipert/homebrew-cask,ksato9700/homebrew-cask,githubutilities/homebrew-cask,retrography/homebrew-cask,joaocc/homebrew-cask,supriyantomaftuh/homebrew-cask,ldong/homebrew-cask,nicholsn/homebrew-cask,crmne/homebrew-cask,gurghet/homebrew-cask,hakamadare/homebrew-cask,kostasdizas/homebrew-cask,askl56/homebrew-cask,leonmachadowilcox/homebrew-cask,mindriot101/homebrew-cask,mjgardner/homebrew-cask,ddm/homebrew-cask,danielbayley/homebrew-cask,jedahan/homebrew-cask,patresi/homebrew-cask,JosephViolago/homebrew-cask,dustinblackman/homebrew-cask,jalaziz/homebrew-cask,andersonba/homebrew-cask,bendoerr/homebrew-cask,casidiablo/homebrew-cask,feniix/homebrew-cask,sanchezm/homebrew-cask,shonjir/homebrew-cask,kesara/homebrew-cask,pinut/homebrew-cask,ajbw/homebrew-cask,reitermarkus/homebrew-cask,nightscape/homebrew-cask,jiashuw/homebrew-cask,sysbot/homebrew-cask,brianshumate/homebrew-cask,nathanielvarona/homebrew-cask,barravi/homebrew-cask,carlmod/homebrew-cask,nicolas-brousse/homebrew-cask,moimikey/homebrew-cask,tarwich/homebrew-cask,wuman/homebrew-cask,bdhess/homebrew-cask,norio-nomura/homebrew-cask,dictcp/homebrew-cask,rkJun/homebrew-cask,elyscape/homebrew-cask,rajiv/homebrew-cask,malford/homebrew-cask,xalep/homebrew-cask,tolbkni/homebrew-cask,ninjahoahong/homebrew-cask,qbmiller/homebrew-cask,diogodamiani/homebrew-cask,chuanxd/homebrew-cask,
Keloran/homebrew-cask,sanyer/homebrew-cask,kievechua/homebrew-cask,a-x-/homebrew-cask,chrisfinazzo/homebrew-cask,jacobdam/homebrew-cask,hovancik/homebrew-cask,mathbunnyru/homebrew-cask,sideci-sample/sideci-sample-homebrew-cask,squid314/homebrew-cask,blainesch/homebrew-cask,drostron/homebrew-cask,MicTech/homebrew-cask,afdnlw/homebrew-cask,kTitan/homebrew-cask,kolomiichenko/homebrew-cask,morganestes/homebrew-cask,Ephemera/homebrew-cask,bosr/homebrew-cask,mathbunnyru/homebrew-cask,akiomik/homebrew-cask,xiongchiamiov/homebrew-cask,fly19890211/homebrew-cask,johntrandall/homebrew-cask,niksy/homebrew-cask,yuhki50/homebrew-cask,syscrusher/homebrew-cask,jeroenseegers/homebrew-cask,jpmat296/homebrew-cask,kpearson/homebrew-cask,alloy/homebrew-cask,usami-k/homebrew-cask,jeroenj/homebrew-cask,uetchy/homebrew-cask,jmeridth/homebrew-cask,fwiesel/homebrew-cask,artdevjs/homebrew-cask,andrewdisley/homebrew-cask,mathbunnyru/homebrew-cask,d/homebrew-cask,carlmod/homebrew-cask,howie/homebrew-cask,epmatsw/homebrew-cask,timsutton/homebrew-cask,gyndav/homebrew-cask,neil-ca-moore/homebrew-cask,jmeridth/homebrew-cask,dlackty/homebrew-cask,exherb/homebrew-cask,dcondrey/homebrew-cask,Labutin/homebrew-cask,sjackman/homebrew-cask,prime8/homebrew-cask,flaviocamilo/homebrew-cask,mjgardner/homebrew-cask,colindunn/homebrew-cask,unasuke/homebrew-cask,haha1903/homebrew-cask,a1russell/homebrew-cask,goxberry/homebrew-cask,zerrot/homebrew-cask,gabrielizaias/homebrew-cask,kryhear/homebrew-cask,moogar0880/homebrew-cask,shoichiaizawa/homebrew-cask,Nitecon/homebrew-cask,stonehippo/homebrew-cask,casidiablo/homebrew-cask,ayohrling/homebrew-cask,atsuyim/homebrew-cask,perfide/homebrew-cask,josa42/homebrew-cask,athrunsun/homebrew-cask,mazehall/homebrew-cask,cobyism/homebrew-cask,gmkey/homebrew-cask,imgarylai/homebrew-cask,dwkns/homebrew-cask,klane/homebrew-cask,ohammersmith/homebrew-cask,boydj/homebrew-cask,neverfox/homebrew-cask,dspeckhard/homebrew-cask,mrmachine/homebrew-cask,mchlrmrz/homebrew-cask,retbrown/hom
ebrew-cask,githubutilities/homebrew-cask,shoichiaizawa/homebrew-cask,scottsuch/homebrew-cask,zhuzihhhh/homebrew-cask,alebcay/homebrew-cask,royalwang/homebrew-cask,tan9/homebrew-cask,Dremora/homebrew-cask,freeslugs/homebrew-cask,chrisopedia/homebrew-cask,mchlrmrz/homebrew-cask,mauricerkelly/homebrew-cask,delphinus35/homebrew-cask,illusionfield/homebrew-cask,samnung/homebrew-cask,esebastian/homebrew-cask,sebcode/homebrew-cask,royalwang/homebrew-cask,coeligena/homebrew-customized,gilesdring/homebrew-cask,mwilmer/homebrew-cask,underyx/homebrew-cask,miccal/homebrew-cask,julionc/homebrew-cask,ninjahoahong/homebrew-cask,tonyseek/homebrew-cask,hackhandslabs/homebrew-cask,My2ndAngelic/homebrew-cask,seanorama/homebrew-cask,muan/homebrew-cask,andrewdisley/homebrew-cask,vin047/homebrew-cask,forevergenin/homebrew-cask,tjnycum/homebrew-cask,deizel/homebrew-cask,opsdev-ws/homebrew-cask,jiashuw/homebrew-cask,catap/homebrew-cask,gerrypower/homebrew-cask,rogeriopradoj/homebrew-cask,uetchy/homebrew-cask,yurikoles/homebrew-cask,bosr/homebrew-cask,kongslund/homebrew-cask,puffdad/homebrew-cask,wesen/homebrew-cask,RickWong/homebrew-cask,ingorichter/homebrew-cask,jppelteret/homebrew-cask,samnung/homebrew-cask,adriweb/homebrew-cask,wmorin/homebrew-cask,ptb/homebrew-cask,andrewdisley/homebrew-cask,garborg/homebrew-cask,christer155/homebrew-cask,astorije/homebrew-cask,gregkare/homebrew-cask,xight/homebrew-cask,yutarody/homebrew-cask,jgarber623/homebrew-cask,prime8/homebrew-cask,ftiff/homebrew-cask,Hywan/homebrew-cask,rubenerd/homebrew-cask,riyad/homebrew-cask,ingorichter/homebrew-cask,m3nu/homebrew-cask,AndreTheHunter/homebrew-cask,englishm/homebrew-cask,hanxue/caskroom,mattrobenolt/homebrew-cask,samdoran/homebrew-cask,gibsjose/homebrew-cask,dspeckhard/homebrew-cask,gord1anknot/homebrew-cask,imgarylai/homebrew-cask,mlocher/homebrew-cask,JoelLarson/homebrew-cask,pgr0ss/homebrew-cask,paulombcosta/homebrew-cask,Ibuprofen/homebrew-cask,stonehippo/homebrew-cask,enriclluelles/homebrew-cask,deiga/ho
mebrew-cask,jalaziz/homebrew-cask,pkq/homebrew-cask,adriweb/homebrew-cask,gerrymiller/homebrew-cask,devmynd/homebrew-cask,BenjaminHCCarr/homebrew-cask,ksato9700/homebrew-cask,mattrobenolt/homebrew-cask,jaredsampson/homebrew-cask,anbotero/homebrew-cask,Fedalto/homebrew-cask,nathanielvarona/homebrew-cask,gregkare/homebrew-cask,zeusdeux/homebrew-cask,mikem/homebrew-cask,kesara/homebrew-cask,lukasbestle/homebrew-cask,markthetech/homebrew-cask,AdamCmiel/homebrew-cask,christophermanning/homebrew-cask,dictcp/homebrew-cask,johntrandall/homebrew-cask,chadcatlett/caskroom-homebrew-cask,hvisage/homebrew-cask,lukeadams/homebrew-cask,AnastasiaSulyagina/homebrew-cask,Hywan/homebrew-cask,iamso/homebrew-cask,nanoxd/homebrew-cask,iamso/homebrew-cask,bcaceiro/homebrew-cask,kteru/homebrew-cask,deizel/homebrew-cask,tjt263/homebrew-cask,underyx/homebrew-cask,lieuwex/homebrew-cask,SentinelWarren/homebrew-cask,ebraminio/homebrew-cask,jconley/homebrew-cask,reitermarkus/homebrew-cask,nysthee/homebrew-cask,stephenwade/homebrew-cask,af/homebrew-cask,wuman/homebrew-cask,MisumiRize/homebrew-cask,jaredsampson/homebrew-cask,josa42/homebrew-cask,jedahan/homebrew-cask,MerelyAPseudonym/homebrew-cask,michelegera/homebrew-cask,yutarody/homebrew-cask,bcaceiro/homebrew-cask,johnste/homebrew-cask,kamilboratynski/homebrew-cask,koenrh/homebrew-cask,fazo96/homebrew-cask,tan9/homebrew-cask,slnovak/homebrew-cask,LaurentFough/homebrew-cask,mattrobenolt/homebrew-cask,CameronGarrett/homebrew-cask,diogodamiani/homebrew-cask,uetchy/homebrew-cask,BenjaminHCCarr/homebrew-cask,singingwolfboy/homebrew-cask,aktau/homebrew-cask,moonboots/homebrew-cask,slack4u/homebrew-cask,paulbreslin/homebrew-cask,L2G/homebrew-cask,aki77/homebrew-cask,johan/homebrew-cask,haha1903/homebrew-cask,morganestes/homebrew-cask,ebraminio/homebrew-cask,ahundt/homebrew-cask,greg5green/homebrew-cask,slack4u/homebrew-cask,zorosteven/homebrew-cask,blogabe/homebrew-cask,rajiv/homebrew-cask,spruceb/homebrew-cask,seanorama/homebrew-cask,akiomik/homebre
w-cask,kolomiichenko/homebrew-cask,nivanchikov/homebrew-cask,JosephViolago/homebrew-cask,mindriot101/homebrew-cask,meduz/homebrew-cask,rhendric/homebrew-cask,markhuber/homebrew-cask,My2ndAngelic/homebrew-cask,sirodoht/homebrew-cask,yurrriq/homebrew-cask,xyb/homebrew-cask,miguelfrde/homebrew-cask,dunn/homebrew-cask,tangestani/homebrew-cask,ldong/homebrew-cask,ohammersmith/homebrew-cask,crzrcn/homebrew-cask,devmynd/homebrew-cask,3van/homebrew-cask,jasmas/homebrew-cask,jellyfishcoder/homebrew-cask,sscotth/homebrew-cask,lantrix/homebrew-cask,n0ts/homebrew-cask,3van/homebrew-cask,cobyism/homebrew-cask,iAmGhost/homebrew-cask,L2G/homebrew-cask,sohtsuka/homebrew-cask,Ibuprofen/homebrew-cask,adelinofaria/homebrew-cask,dwkns/homebrew-cask,gwaldo/homebrew-cask,MatzFan/homebrew-cask,jhowtan/homebrew-cask,arronmabrey/homebrew-cask,theoriginalgri/homebrew-cask,kuno/homebrew-cask,kiliankoe/homebrew-cask,rogeriopradoj/homebrew-cask,mjgardner/homebrew-cask,norio-nomura/homebrew-cask,a1russell/homebrew-cask,jtriley/homebrew-cask,ponychicken/homebrew-customcask,y00rb/homebrew-cask,xight/homebrew-cask,JosephViolago/homebrew-cask,vmrob/homebrew-cask,ashishb/homebrew-cask,xtian/homebrew-cask,linc01n/homebrew-cask,reelsense/homebrew-cask,jonathanwiesel/homebrew-cask,jeroenj/homebrew-cask,xight/homebrew-cask,cfillion/homebrew-cask,jawshooah/homebrew-cask,djakarta-trap/homebrew-myCask,Cottser/homebrew-cask,flada-auxv/homebrew-cask,cliffcotino/homebrew-cask,vigosan/homebrew-cask,skyyuan/homebrew-cask,markthetech/homebrew-cask,schneidmaster/homebrew-cask,santoshsahoo/homebrew-cask,robbiethegeek/homebrew-cask,thomanq/homebrew-cask,samdoran/homebrew-cask,ctrevino/homebrew-cask,qbmiller/homebrew-cask,barravi/homebrew-cask,timsutton/homebrew-cask,jen20/homebrew-cask,BenjaminHCCarr/homebrew-cask,muescha/homebrew-cask,vmrob/homebrew-cask,remko/homebrew-cask,chadcatlett/caskroom-homebrew-cask,frapposelli/homebrew-cask,hswong3i/homebrew-cask,katoquro/homebrew-cask,FinalDes/homebrew-cask,hackhandslabs
/homebrew-cask,inta/homebrew-cask,aktau/homebrew-cask,lantrix/homebrew-cask,feigaochn/homebrew-cask,mchlrmrz/homebrew-cask,a-x-/homebrew-cask,rednoah/homebrew-cask,ahbeng/homebrew-cask,Gasol/homebrew-cask,sscotth/homebrew-cask,reelsense/homebrew-cask,sosedoff/homebrew-cask,guylabs/homebrew-cask,yumitsu/homebrew-cask,buo/homebrew-cask,illusionfield/homebrew-cask,gustavoavellar/homebrew-cask,bcomnes/homebrew-cask,flada-auxv/homebrew-cask,mjdescy/homebrew-cask,MerelyAPseudonym/homebrew-cask,athrunsun/homebrew-cask,hyuna917/homebrew-cask,nshemonsky/homebrew-cask,thehunmonkgroup/homebrew-cask,JoelLarson/homebrew-cask,lauantai/homebrew-cask,linc01n/homebrew-cask,jacobbednarz/homebrew-cask,colindunn/homebrew-cask,nelsonjchen/homebrew-cask,mariusbutuc/homebrew-cask,dictcp/homebrew-cask,Dremora/homebrew-cask,rcuza/homebrew-cask,miku/homebrew-cask,jpodlech/homebrew-cask,renard/homebrew-cask,elseym/homebrew-cask,jppelteret/homebrew-cask,chrisopedia/homebrew-cask,ahbeng/homebrew-cask,koenrh/homebrew-cask,tranc99/homebrew-cask,deiga/homebrew-cask,faun/homebrew-cask,tsparber/homebrew-cask,KosherBacon/homebrew-cask,djakarta-trap/homebrew-myCask,ajbw/homebrew-cask,seanzxx/homebrew-cask,jconley/homebrew-cask,tyage/homebrew-cask,codeurge/homebrew-cask,cprecioso/homebrew-cask,sebcode/homebrew-cask,vitorgalvao/homebrew-cask,shishi/homebrew-cask,gwaldo/homebrew-cask,kkdd/homebrew-cask,petmoo/homebrew-cask,epmatsw/homebrew-cask,julionc/homebrew-cask,enriclluelles/homebrew-cask,kesara/homebrew-cask,xcezx/homebrew-cask,kkdd/homebrew-cask,xiongchiamiov/homebrew-cask,skyyuan/homebrew-cask,j13k/homebrew-cask,zchee/homebrew-cask,ddm/homebrew-cask,cblecker/homebrew-cask,boydj/homebrew-cask,alexg0/homebrew-cask,zorosteven/homebrew-cask,13k/homebrew-cask,xtian/homebrew-cask,cclauss/homebrew-cask,shonjir/homebrew-cask,malob/homebrew-cask,jangalinski/homebrew-cask,corbt/homebrew-cask,BahtiyarB/homebrew-cask,spruceb/homebrew-cask,cblecker/homebrew-cask,RJHsiao/homebrew-cask,kryhear/homebrew-cask,Bah
tiyarB/homebrew-cask,joschi/homebrew-cask,elnappo/homebrew-cask,anbotero/homebrew-cask,deanmorin/homebrew-cask,dvdoliveira/homebrew-cask,kongslund/homebrew-cask,rajiv/homebrew-cask,Saklad5/homebrew-cask,jawshooah/homebrew-cask,dwihn0r/homebrew-cask,andyli/homebrew-cask,cedwardsmedia/homebrew-cask,bric3/homebrew-cask,helloIAmPau/homebrew-cask,MisumiRize/homebrew-cask,dlackty/homebrew-cask,bdhess/homebrew-cask,MircoT/homebrew-cask,moimikey/homebrew-cask,Saklad5/homebrew-cask,christophermanning/homebrew-cask,daften/homebrew-cask,fkrone/homebrew-cask,psibre/homebrew-cask,ftiff/homebrew-cask,taherio/homebrew-cask,mfpierre/homebrew-cask,andyshinn/homebrew-cask,cobyism/homebrew-cask,yurikoles/homebrew-cask,arranubels/homebrew-cask,afh/homebrew-cask,inta/homebrew-cask,claui/homebrew-cask,gerrypower/homebrew-cask,jbeagley52/homebrew-cask,unasuke/homebrew-cask,JacopKane/homebrew-cask,malob/homebrew-cask,stonehippo/homebrew-cask,ashishb/homebrew-cask,andrewschleifer/homebrew-cask,remko/homebrew-cask,wayou/homebrew-cask,drostron/homebrew-cask,djmonta/homebrew-cask,nickpellant/homebrew-cask,mwek/homebrew-cask,qnm/homebrew-cask,guerrero/homebrew-cask,forevergenin/homebrew-cask,hellosky806/homebrew-cask,opsdev-ws/homebrew-cask,gerrymiller/homebrew-cask,albertico/homebrew-cask,jgarber623/homebrew-cask,rogeriopradoj/homebrew-cask,mokagio/homebrew-cask,miccal/homebrew-cask,stevehedrick/homebrew-cask,afh/homebrew-cask,jbeagley52/homebrew-cask,gustavoavellar/homebrew-cask,mazehall/homebrew-cask,chino/homebrew-cask,MircoT/homebrew-cask,wickles/homebrew-cask,ianyh/homebrew-cask,tonyseek/homebrew-cask,gmkey/homebrew-cask,napaxton/homebrew-cask,kirikiriyamama/homebrew-cask,miccal/homebrew-cask,hanxue/caskroom,6uclz1/homebrew-cask,Bombenleger/homebrew-cask,lukeadams/homebrew-cask,Keloran/homebrew-cask,giannitm/homebrew-cask,syscrusher/homebrew-cask,scribblemaniac/homebrew-cask,sanchezm/homebrew-cask,mkozjak/homebrew-cask,Amorymeltzer/homebrew-cask,lcasey001/homebrew-cask,FredLackeyOfficial/
homebrew-cask,danielgomezrico/homebrew-cask,nivanchikov/homebrew-cask,mgryszko/homebrew-cask,riyad/homebrew-cask,tjnycum/homebrew-cask,mattfelsen/homebrew-cask,robertgzr/homebrew-cask,leoj3n/homebrew-cask,mwean/homebrew-cask,jspahrsummers/homebrew-cask,bsiddiqui/homebrew-cask,rhendric/homebrew-cask,kei-yamazaki/homebrew-cask,tangestani/homebrew-cask,boecko/homebrew-cask,Amorymeltzer/homebrew-cask,guerrero/homebrew-cask,MicTech/homebrew-cask,lucasmezencio/homebrew-cask,m3nu/homebrew-cask,6uclz1/homebrew-cask,lalyos/homebrew-cask,andyli/homebrew-cask,lumaxis/homebrew-cask,mhubig/homebrew-cask,mikem/homebrew-cask,miku/homebrew-cask,tangestani/homebrew-cask,valepert/homebrew-cask,wKovacs64/homebrew-cask,elnappo/homebrew-cask,Ketouem/homebrew-cask,nrlquaker/homebrew-cask,jayshao/homebrew-cask,mkozjak/homebrew-cask,ponychicken/homebrew-customcask,farmerchris/homebrew-cask,winkelsdorf/homebrew-cask,kingthorin/homebrew-cask,caskroom/homebrew-cask,mjdescy/homebrew-cask,danielgomezrico/homebrew-cask,SamiHiltunen/homebrew-cask,amatos/homebrew-cask,qnm/homebrew-cask,huanzhang/homebrew-cask,fanquake/homebrew-cask,epardee/homebrew-cask,gguillotte/homebrew-cask,huanzhang/homebrew-cask,aki77/homebrew-cask,ayohrling/homebrew-cask,skatsuta/homebrew-cask,leipert/homebrew-cask,renaudguerin/homebrew-cask,decrement/homebrew-cask,paulbreslin/homebrew-cask,dieterdemeyer/homebrew-cask,wastrachan/homebrew-cask,jeanregisser/homebrew-cask,buo/homebrew-cask,okket/homebrew-cask,otzy007/homebrew-cask,andersonba/homebrew-cask,ahundt/homebrew-cask,nelsonjchen/homebrew-cask,lalyos/homebrew-cask,cprecioso/homebrew-cask,nathansgreen/homebrew-cask,antogg/homebrew-cask,MichaelPei/homebrew-cask,mlocher/homebrew-cask,dlovitch/homebrew-cask,wayou/homebrew-cask,asbachb/homebrew-cask,farmerchris/homebrew-cask,gyugyu/homebrew-cask,phpwutz/homebrew-cask,zmwangx/homebrew-cask,danielbayley/homebrew-cask,vitorgalvao/homebrew-cask,ericbn/homebrew-cask,AdamCmiel/homebrew-cask,af/homebrew-cask,sanyer/homebrew-cask,s
tigkj/homebrew-caskroom-cask,wickles/homebrew-cask,Whoaa512/homebrew-cask,jpmat296/homebrew-cask,artdevjs/homebrew-cask,seanzxx/homebrew-cask,deiga/homebrew-cask,zmwangx/homebrew-cask,thii/homebrew-cask,chuanxd/homebrew-cask,rickychilcott/homebrew-cask,stephenwade/homebrew-cask,slnovak/homebrew-cask,arronmabrey/homebrew-cask,colindean/homebrew-cask,usami-k/homebrew-cask,kostasdizas/homebrew-cask,KosherBacon/homebrew-cask,wmorin/homebrew-cask,kingthorin/homebrew-cask,xyb/homebrew-cask,hyuna917/homebrew-cask,xalep/homebrew-cask,retrography/homebrew-cask,pkq/homebrew-cask,pkq/homebrew-cask,paour/homebrew-cask,thehunmonkgroup/homebrew-cask,tedbundyjr/homebrew-cask,cclauss/homebrew-cask,joschi/homebrew-cask,toonetown/homebrew-cask,exherb/homebrew-cask,JacopKane/homebrew-cask,tmoreira2020/homebrew,caskroom/homebrew-cask,pacav69/homebrew-cask,lieuwex/homebrew-cask,esebastian/homebrew-cask,joshka/homebrew-cask,lauantai/homebrew-cask,otaran/homebrew-cask,albertico/homebrew-cask,rickychilcott/homebrew-cask,shanonvl/homebrew-cask,chrisRidgers/homebrew-cask,xakraz/homebrew-cask,vin047/homebrew-cask,fengb/homebrew-cask,feniix/homebrew-cask,decrement/homebrew-cask,cblecker/homebrew-cask,genewoo/homebrew-cask,toonetown/homebrew-cask,fkrone/homebrew-cask,Ketouem/homebrew-cask,MatzFan/homebrew-cask,esebastian/homebrew-cask,tsparber/homebrew-cask,kei-yamazaki/homebrew-cask,tyage/homebrew-cask,klane/homebrew-cask,joaocc/homebrew-cask,otzy007/homebrew-cask,wesen/homebrew-cask,blainesch/homebrew-cask,chrisRidgers/homebrew-cask,jasmas/homebrew-cask,scottsuch/homebrew-cask,tjt263/homebrew-cask,andrewschleifer/homebrew-cask,gord1anknot/homebrew-cask,FranklinChen/homebrew-cask,asins/homebrew-cask,cohei/homebrew-cask,nrlquaker/homebrew-cask,hanxue/caskroom,fharbe/homebrew-cask,Labutin/homebrew-cask,rkJun/homebrew-cask,m3nu/homebrew-cask,sgnh/homebrew-cask,axodys/homebrew-cask,maxnordlund/homebrew-cask,alexg0/homebrew-cask,lolgear/homebrew-cask,mariusbutuc/homebrew-cask,axodys/homebrew-cask,m
oonboots/homebrew-cask,zhuzihhhh/homebrew-cask,hvisage/homebrew-cask,tjnycum/homebrew-cask,josa42/homebrew-cask,0rax/homebrew-cask,singingwolfboy/homebrew-cask,epardee/homebrew-cask,morsdyce/homebrew-cask,n8henrie/homebrew-cask,tedski/homebrew-cask,shonjir/homebrew-cask,johnjelinek/homebrew-cask,FinalDes/homebrew-cask,giannitm/homebrew-cask,okket/homebrew-cask,ky0615/homebrew-cask-1,janlugt/homebrew-cask,nicolas-brousse/homebrew-cask,elseym/homebrew-cask,sgnh/homebrew-cask,andyshinn/homebrew-cask,sideci-sample/sideci-sample-homebrew-cask,joaoponceleao/homebrew-cask,Cottser/homebrew-cask,robertgzr/homebrew-cask,nathansgreen/homebrew-cask,miguelfrde/homebrew-cask,frapposelli/homebrew-cask,mattfelsen/homebrew-cask,mhubig/homebrew-cask,adrianchia/homebrew-cask,coeligena/homebrew-customized,timsutton/homebrew-cask,kievechua/homebrew-cask,wmorin/homebrew-cask,supriyantomaftuh/homebrew-cask,hswong3i/homebrew-cask,jayshao/homebrew-cask,adrianchia/homebrew-cask,kronicd/homebrew-cask,bsiddiqui/homebrew-cask,victorpopkov/homebrew-cask,mgryszko/homebrew-cask,phpwutz/homebrew-cask,j13k/homebrew-cask,blogabe/homebrew-cask | ruby | ## Code Before:
class Virtualbox < Cask
url 'http://download.virtualbox.org/virtualbox/4.2.10/VirtualBox-4.2.10-84104-OSX.dmg'
homepage 'http://www.virtualbox.org'
version '4.2.10-84104'
sha1 'e4d016587426c1ef041bd460905b5385e566fe77'
end
## Instruction:
Upgrade Virtual box to 4.2.12
## Code After:
class Virtualbox < Cask
url 'http://download.virtualbox.org/virtualbox/4.2.12/VirtualBox-4.2.12-84980-OSX.dmg'
homepage 'http://www.virtualbox.org'
version '4.2.12-84980'
sha1 '9a5913f0b25c69a67b554caa9bebd1a4d72d3d00'
end
| class Virtualbox < Cask
- url 'http://download.virtualbox.org/virtualbox/4.2.10/VirtualBox-4.2.10-84104-OSX.dmg'
? ^ ^ ^ -
+ url 'http://download.virtualbox.org/virtualbox/4.2.12/VirtualBox-4.2.12-84980-OSX.dmg'
? ^ ^ ^^
homepage 'http://www.virtualbox.org'
- version '4.2.10-84104'
? ^ ^ -
+ version '4.2.12-84980'
? ^ ^^
- sha1 'e4d016587426c1ef041bd460905b5385e566fe77'
+ sha1 '9a5913f0b25c69a67b554caa9bebd1a4d72d3d00'
end | 6 | 1 | 3 | 3 |
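The four numbers closing the row above (`6 | 1 | 3 | 3`) line up with the `diff_length`, `relative_diff_length`, `n_lines_added` and `n_lines_deleted` columns declared in the schema at the top of the dump. A minimal sketch of how those figures could be recomputed from the before/after texts; the exact diff tool used to build the dataset is not stated, so the `ndiff`-based counting here is an assumption that happens to reproduce the visible rows (6 changed lines over a 6-line cask gives a relative length of 1.0, and 8/22 gives 0.363636 for the README row further down):

```python
import difflib

def diff_stats(old: str, new: str):
    """Recompute (n_added, n_deleted, diff_length, relative_diff_length).

    Assumptions (inferred from the rows in this dump, not documented):
      * diff_length     = n_added + n_deleted
      * relative length = diff_length / number of lines in the old file
    """
    old_lines = old.splitlines()
    new_lines = new.splitlines()
    # ndiff prefixes each output line with "+ ", "- ", "  " or "? " (guides).
    marks = [line[:2] for line in difflib.ndiff(old_lines, new_lines)]
    added = marks.count("+ ")
    deleted = marks.count("- ")
    diff_length = added + deleted
    relative = diff_length / len(old_lines) if old_lines else 0.0
    return added, deleted, diff_length, relative
```

Note the trailing-column order implied by the schema header is `diff_length | relative | added | deleted`, which is how `6 | 1 | 3 | 3` reads for this cask.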
ac2cf0400b2a9b22bd0b1f43b36be99f5d1a787c | README.rst | README.rst | .. image:: https://img.shields.io/pypi/v/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: Version
.. image:: https://img.shields.io/pypi/l/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: License
.. image:: https://img.shields.io/pypi/pyversions/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: Support Python versions
TinyAPI is a Python wrapper around `TinyLetter's <https://tinyletter.com/>`_ publicly accessible — but undocumented — API.
Brought to you by `Data Is Plural <https://tinyletter.com/data-is-plural>`_, a weekly newsletter of interesting/curious datasets.
Key links
---------
* Documentation: https://tinyapi.readthedocs.io/
* Repository: https://github.com/jsvine/tinyapi
* Issues: https://github.com/jsvine/tinyapi/issues
| .. image:: https://travis-ci.org/jsvine/tinyapi.png
:target: https://travis-ci.org/jsvine/tinyapi
:alt: Build status
.. image:: https://img.shields.io/coveralls/jsvine/tinyapi.svg
:target: https://coveralls.io/github/jsvine/tinyapi
:alt: Code coverage
.. image:: https://img.shields.io/pypi/v/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: Version
.. image:: https://img.shields.io/pypi/l/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: License
.. image:: https://img.shields.io/pypi/pyversions/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: Support Python versions
TinyAPI is a Python wrapper around `TinyLetter's <https://tinyletter.com/>`_ publicly accessible — but undocumented — API.
Brought to you by `Data Is Plural <https://tinyletter.com/data-is-plural>`_, a weekly newsletter of interesting/curious datasets.
Key links
---------
* Documentation: https://tinyapi.readthedocs.io/
* Repository: https://github.com/jsvine/tinyapi
* Issues: https://github.com/jsvine/tinyapi/issues
| Add Travis and Coveralls shields | Add Travis and Coveralls shields
| reStructuredText | mit | jsvine/tinyapi | restructuredtext | ## Code Before:
.. image:: https://img.shields.io/pypi/v/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: Version
.. image:: https://img.shields.io/pypi/l/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: License
.. image:: https://img.shields.io/pypi/pyversions/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: Support Python versions
TinyAPI is a Python wrapper around `TinyLetter's <https://tinyletter.com/>`_ publicly accessible — but undocumented — API.
Brought to you by `Data Is Plural <https://tinyletter.com/data-is-plural>`_, a weekly newsletter of interesting/curious datasets.
Key links
---------
* Documentation: https://tinyapi.readthedocs.io/
* Repository: https://github.com/jsvine/tinyapi
* Issues: https://github.com/jsvine/tinyapi/issues
## Instruction:
Add Travis and Coveralls shields
## Code After:
.. image:: https://travis-ci.org/jsvine/tinyapi.png
:target: https://travis-ci.org/jsvine/tinyapi
:alt: Build status
.. image:: https://img.shields.io/coveralls/jsvine/tinyapi.svg
:target: https://coveralls.io/github/jsvine/tinyapi
:alt: Code coverage
.. image:: https://img.shields.io/pypi/v/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: Version
.. image:: https://img.shields.io/pypi/l/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: License
.. image:: https://img.shields.io/pypi/pyversions/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: Support Python versions
TinyAPI is a Python wrapper around `TinyLetter's <https://tinyletter.com/>`_ publicly accessible — but undocumented — API.
Brought to you by `Data Is Plural <https://tinyletter.com/data-is-plural>`_, a weekly newsletter of interesting/curious datasets.
Key links
---------
* Documentation: https://tinyapi.readthedocs.io/
* Repository: https://github.com/jsvine/tinyapi
* Issues: https://github.com/jsvine/tinyapi/issues
| + .. image:: https://travis-ci.org/jsvine/tinyapi.png
+ :target: https://travis-ci.org/jsvine/tinyapi
+ :alt: Build status
+
+ .. image:: https://img.shields.io/coveralls/jsvine/tinyapi.svg
+ :target: https://coveralls.io/github/jsvine/tinyapi
+ :alt: Code coverage
+
.. image:: https://img.shields.io/pypi/v/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: Version
.. image:: https://img.shields.io/pypi/l/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: License
.. image:: https://img.shields.io/pypi/pyversions/tinyapi.svg
:target: https://pypi.python.org/pypi/tinyapi
:alt: Support Python versions
TinyAPI is a Python wrapper around `TinyLetter's <https://tinyletter.com/>`_ publicly accessible — but undocumented — API.
Brought to you by `Data Is Plural <https://tinyletter.com/data-is-plural>`_, a weekly newsletter of interesting/curious datasets.
Key links
---------
* Documentation: https://tinyapi.readthedocs.io/
* Repository: https://github.com/jsvine/tinyapi
* Issues: https://github.com/jsvine/tinyapi/issues | 8 | 0.363636 | 8 | 0 |
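In every complete row of this chunk the `subject` column is just the first line of `message`: the tinyapi row repeats it verbatim, while the PeerTube row below adds a `fixes #2392` body line after it. A throwaway check for that invariant (the helper name is mine, not part of the dataset):

```python
def subject_of(message: str) -> str:
    """Return the conventional commit subject: the first non-empty line."""
    for line in message.splitlines():
        if line.strip():
            return line.strip()
    return ""
```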
5c7a411c45f793b846734053b18511b6528f7d1c | client/src/app/shared/channel/avatar.component.scss | client/src/app/shared/channel/avatar.component.scss | @import '_mixins';
.wrapper {
width: 35px;
height: 35px;
min-width: 35px;
min-height: 35px;
position: relative;
margin-right: 5px;
a {
@include disable-outline;
}
a img {
height: 100%;
object-fit: cover;
position: absolute;
top:50%;
left:50%;
border-radius: 50%;
transform: translate(-50%,-50%)
}
a:nth-of-type(2) img {
height: 60%;
width: 60%;
border: 2px solid var(--mainBackgroundColor);
transform: translateX(15%);
position: relative;
}
}
| @import '_mixins';
.wrapper {
width: 35px;
height: 35px;
min-width: 35px;
min-height: 35px;
position: relative;
margin-right: 5px;
a {
@include disable-outline;
}
a img {
height: 100%;
object-fit: cover;
position: absolute;
top:50%;
left:50%;
border-radius: 50%;
transform: translate(-50%,-50%)
}
a:nth-of-type(2) img {
height: 60%;
width: 60%;
border: 2px solid var(--mainBackgroundColor);
transform: translateX(15%);
position: relative;
background-color: var(--mainBackgroundColor);
}
}
| Set a default background color for account avatar | Set a default background color for account avatar
fixes #2392
| SCSS | agpl-3.0 | Chocobozzz/PeerTube,Chocobozzz/PeerTube,Chocobozzz/PeerTube,Chocobozzz/PeerTube | scss | ## Code Before:
@import '_mixins';
.wrapper {
width: 35px;
height: 35px;
min-width: 35px;
min-height: 35px;
position: relative;
margin-right: 5px;
a {
@include disable-outline;
}
a img {
height: 100%;
object-fit: cover;
position: absolute;
top:50%;
left:50%;
border-radius: 50%;
transform: translate(-50%,-50%)
}
a:nth-of-type(2) img {
height: 60%;
width: 60%;
border: 2px solid var(--mainBackgroundColor);
transform: translateX(15%);
position: relative;
}
}
## Instruction:
Set a default background color for account avatar
fixes #2392
## Code After:
@import '_mixins';
.wrapper {
width: 35px;
height: 35px;
min-width: 35px;
min-height: 35px;
position: relative;
margin-right: 5px;
a {
@include disable-outline;
}
a img {
height: 100%;
object-fit: cover;
position: absolute;
top:50%;
left:50%;
border-radius: 50%;
transform: translate(-50%,-50%)
}
a:nth-of-type(2) img {
height: 60%;
width: 60%;
border: 2px solid var(--mainBackgroundColor);
transform: translateX(15%);
position: relative;
background-color: var(--mainBackgroundColor);
}
}
| @import '_mixins';
.wrapper {
width: 35px;
height: 35px;
min-width: 35px;
min-height: 35px;
position: relative;
margin-right: 5px;
a {
@include disable-outline;
}
a img {
height: 100%;
object-fit: cover;
position: absolute;
top:50%;
left:50%;
border-radius: 50%;
transform: translate(-50%,-50%)
}
a:nth-of-type(2) img {
height: 60%;
width: 60%;
border: 2px solid var(--mainBackgroundColor);
transform: translateX(15%);
position: relative;
+ background-color: var(--mainBackgroundColor);
}
} | 1 | 0.03125 | 1 | 0 |
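Collecting the seventeen column names from the `stringlengths` / `int64` / `float64` schema block at the very top of the dump, a parsed row can be modeled as a plain dataclass. Types are inferred from that schema; the class itself is illustrative, not something the dataset defines:

```python
from dataclasses import dataclass

@dataclass
class DumpRow:
    commit: str               # 40-char hash (schema: stringlengths 40 40)
    old_file: str
    new_file: str
    old_contents: str
    new_contents: str
    subject: str
    message: str
    lang: str                 # e.g. "SCSS"
    license: str              # e.g. "agpl-3.0"
    repos: str                # comma-separated fork list
    config: str               # lowercase language tag, e.g. "scss"
    content: str              # "## Code Before / Instruction / Code After" block
    diff: str
    diff_length: int
    relative_diff_length: float
    n_lines_added: int
    n_lines_deleted: int
```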
c6be5567905898377ee9ffc41afe1e12ec057cbb | workshift/templates/search/indexes/workshift/workshiftinstance_text.txt | workshift/templates/search/indexes/workshift/workshiftinstance_text.txt | {{ object.title }}
{{ object.workshifter }}
{{ object.pool.title }},
{{ object.semester.get_season_display }}
{{ object.semester.year }}
{% if object.description %}
{{ object.description|safe }}
{% endif %}
| {{ object.title }}
{{ object.workshifter }}
{{ object.pool.title }},
{{ object.semester.get_season_display }}
{{ object.semester.year }}
{% if object.description %}
{{ object.description|safe }}
{% endif %}
{% for log in object.logs.all %}
log.person
{% endfor %}
| Add logs to instance search | Add logs to instance search
| Text | bsd-2-clause | knagra/farnsworth,knagra/farnsworth,knagra/farnsworth,knagra/farnsworth | text | ## Code Before:
{{ object.title }}
{{ object.workshifter }}
{{ object.pool.title }},
{{ object.semester.get_season_display }}
{{ object.semester.year }}
{% if object.description %}
{{ object.description|safe }}
{% endif %}
## Instruction:
Add logs to instance search
## Code After:
{{ object.title }}
{{ object.workshifter }}
{{ object.pool.title }},
{{ object.semester.get_season_display }}
{{ object.semester.year }}
{% if object.description %}
{{ object.description|safe }}
{% endif %}
{% for log in object.logs.all %}
log.person
{% endfor %}
| {{ object.title }}
{{ object.workshifter }}
{{ object.pool.title }},
{{ object.semester.get_season_display }}
{{ object.semester.year }}
{% if object.description %}
{{ object.description|safe }}
{% endif %}
+ {% for log in object.logs.all %}
+ log.person
+ {% endfor %} | 3 | 0.375 | 3 | 0 |
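Each `diff` field ends with the four numeric columns pipe-appended to its last line, as in `{% endfor %} | 3 | 0.375 | 3 | 0` just above. Assuming the column order from the schema header (`diff_length`, `relative_diff_length`, `n_lines_added`, `n_lines_deleted`), they can be peeled off like this; a sketch only, since a diff line that itself happened to end in a similar pipe pattern would confuse the regex:

```python
import re

TRAILING_STATS = re.compile(
    r"\|\s*(\d+)\s*\|\s*([\d.]+)\s*\|\s*(\d+)\s*\|\s*(\d+)\s*$"
)

def trailing_stats(last_diff_line: str):
    """Split '<text> | length | relative | added | deleted' into its parts."""
    m = TRAILING_STATS.search(last_diff_line)
    if m is None:
        return None
    length, relative, added, deleted = m.groups()
    text = last_diff_line[: m.start()].rstrip()
    return text, int(length), float(relative), int(added), int(deleted)
```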
c8c904ad5e920ebc36af10c775fea57561398c10 | compile.sh | compile.sh | set -e
# sudo apt-get install libxpm-dev libxt-dev
make clean
./configure --with-features=huge \
--enable-fail-if-missing \
--with-x \
--enable-multibyte \
--enable-rubyinterp=yes \
--disable-gui \
--enable-luainterp=yes \
--enable-pythoninterp=no \
--enable-python3interp=yes \
--with-python3-command=python3.6 \
# --enable-gtk2-check \
# --enable-gnome-check \
# --enable-gui=gnome2 \
# --enable-python3interp=yes \
# --with-python-config-dir=/usr/lib/python2.7/config \
# --with-python3-config-dir=/usr/lib/python3.5/config \
# --enable-perlinterp=yes \
# --enable-gui=gtk2 \
# --enable-cscope \
# --prefix=/usr
# make VIMRUNTIMEDIR=/usr/share/vim/vim74/
make VIMRUNTIMEDIR=~/workspace/vim/runtime/
set +e
| set -e
# sudo apt-get install libxpm-dev libxt-dev
# sudo ln -s /usr/include/lua5.3 /usr/include/lua
# sudo ln -s /usr/lib/x86_64-linux-gnu/liblua5.3.so /usr/local/lib/liblua.so
make clean
./configure --with-features=huge \
--enable-fail-if-missing \
--with-x \
--enable-multibyte \
--enable-rubyinterp=yes \
--disable-gui \
--enable-luainterp=yes \
--enable-pythoninterp=no \
--enable-python3interp=yes \
--with-python3-command=python3.6 \
# --enable-gtk2-check \
# --enable-gnome-check \
# --enable-gui=gnome2 \
# --enable-python3interp=yes \
# --with-python-config-dir=/usr/lib/python2.7/config \
# --with-python3-config-dir=/usr/lib/python3.5/config \
# --enable-perlinterp=yes \
# --enable-gui=gtk2 \
# --enable-cscope \
# --prefix=/usr
# make VIMRUNTIMEDIR=/usr/share/vim/vim74/
make VIMRUNTIMEDIR=~/workspace/vim/runtime/
set +e
| Add reminder about linking lua libs | Add reminder about linking lua libs
| Shell | mit | JoseKilo/vimrc,JoseKilo/vimrc | shell | ## Code Before:
set -e
# sudo apt-get install libxpm-dev libxt-dev
make clean
./configure --with-features=huge \
--enable-fail-if-missing \
--with-x \
--enable-multibyte \
--enable-rubyinterp=yes \
--disable-gui \
--enable-luainterp=yes \
--enable-pythoninterp=no \
--enable-python3interp=yes \
--with-python3-command=python3.6 \
# --enable-gtk2-check \
# --enable-gnome-check \
# --enable-gui=gnome2 \
# --enable-python3interp=yes \
# --with-python-config-dir=/usr/lib/python2.7/config \
# --with-python3-config-dir=/usr/lib/python3.5/config \
# --enable-perlinterp=yes \
# --enable-gui=gtk2 \
# --enable-cscope \
# --prefix=/usr
# make VIMRUNTIMEDIR=/usr/share/vim/vim74/
make VIMRUNTIMEDIR=~/workspace/vim/runtime/
set +e
## Instruction:
Add reminder about linking lua libs
## Code After:
set -e
# sudo apt-get install libxpm-dev libxt-dev
# sudo ln -s /usr/include/lua5.3 /usr/include/lua
# sudo ln -s /usr/lib/x86_64-linux-gnu/liblua5.3.so /usr/local/lib/liblua.so
make clean
./configure --with-features=huge \
--enable-fail-if-missing \
--with-x \
--enable-multibyte \
--enable-rubyinterp=yes \
--disable-gui \
--enable-luainterp=yes \
--enable-pythoninterp=no \
--enable-python3interp=yes \
--with-python3-command=python3.6 \
# --enable-gtk2-check \
# --enable-gnome-check \
# --enable-gui=gnome2 \
# --enable-python3interp=yes \
# --with-python-config-dir=/usr/lib/python2.7/config \
# --with-python3-config-dir=/usr/lib/python3.5/config \
# --enable-perlinterp=yes \
# --enable-gui=gtk2 \
# --enable-cscope \
# --prefix=/usr
# make VIMRUNTIMEDIR=/usr/share/vim/vim74/
make VIMRUNTIMEDIR=~/workspace/vim/runtime/
set +e
| set -e
# sudo apt-get install libxpm-dev libxt-dev
+ # sudo ln -s /usr/include/lua5.3 /usr/include/lua
+ # sudo ln -s /usr/lib/x86_64-linux-gnu/liblua5.3.so /usr/local/lib/liblua.so
make clean
./configure --with-features=huge \
--enable-fail-if-missing \
--with-x \
--enable-multibyte \
--enable-rubyinterp=yes \
--disable-gui \
--enable-luainterp=yes \
--enable-pythoninterp=no \
--enable-python3interp=yes \
--with-python3-command=python3.6 \
# --enable-gtk2-check \
# --enable-gnome-check \
# --enable-gui=gnome2 \
# --enable-python3interp=yes \
# --with-python-config-dir=/usr/lib/python2.7/config \
# --with-python3-config-dir=/usr/lib/python3.5/config \
# --enable-perlinterp=yes \
# --enable-gui=gtk2 \
# --enable-cscope \
# --prefix=/usr
# make VIMRUNTIMEDIR=/usr/share/vim/vim74/
make VIMRUNTIMEDIR=~/workspace/vim/runtime/
set +e | 2 | 0.066667 | 2 | 0 |
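The extension-to-tag pairs visible in this chunk (`.rst` to `restructuredtext`, `.scss` to `scss`, `.txt` to `text`, `.sh` to `shell`, `.js` to `javascript`, and presumably `.rb` to `ruby` for the cask row whose filename is cut off at the top) suggest the `config` column is derived mechanically from the file extension. That is exactly why the metadata header warns the language hint MAY BE WRONG: the workshift `.txt` file above is really a Django search template. A sketch of that derivation, covering only the extensions seen here:

```python
import os

# Only the extensions that actually occur in this chunk of the dump.
EXT_TO_CONFIG = {
    ".rb": "ruby",
    ".rst": "restructuredtext",
    ".scss": "scss",
    ".txt": "text",
    ".sh": "shell",
    ".js": "javascript",
}

def config_hint(path: str) -> str:
    """Guess the `config` language tag from a path's extension.

    Purely mechanical, hence unreliable: workshiftinstance_text.txt is a
    Django template but would still be tagged 'text'.
    """
    ext = os.path.splitext(path)[1].lower()
    return EXT_TO_CONFIG.get(ext, "unknown")
```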
547d1eea4940e4beb48a2a85b3b48bef6eb2ff25 | tests/helpers/readGraph.js | tests/helpers/readGraph.js | const fs = require('fs')
const file = require('file')
const path = require('path')
const toml = require('toml')
const parsers = {
'.toml': toml.parse,
'.json': JSON.parse
}
function readGraphObjects(objectsPath) {
let result = []
file.walkSync(objectsPath, (dirPath, dirs, filePaths) => {
result = result.concat(
filePaths
.filter(filePath => !!parsers[path.extname(filePath)])
.map(filePath => {
const absoluteFilePath = path.resolve(dirPath, filePath)
const fileContent = fs.readFileSync(absoluteFilePath, { encoding: 'utf8' })
const parser = parsers[path.extname(filePath)]
const content = parser(fileContent)
return {
absoluteFilePath,
content
}
})
)
})
return result
}
module.exports = function readGraph(graphPath) {
return {
nodeLabels: readGraphObjects(path.resolve(graphPath, 'nodeLabels')),
nodes: readGraphObjects(path.resolve(graphPath, 'nodes')),
relationships: readGraphObjects(path.resolve(graphPath, 'relationships')),
relationshipTypes: readGraphObjects(path.resolve(graphPath, 'relationshipTypes')),
}
}
| const fs = require('fs')
const file = require('file')
const path = require('path')
const toml = require('toml')
const parsers = {
'.toml': toml.parse,
'.json': JSON.parse
}
function readGraphObjects(objectsPath, mapResult) {
let result = []
file.walkSync(objectsPath, (dirPath, dirs, filePaths) => {
result = result.concat(
filePaths
.filter(filePath => !!parsers[path.extname(filePath)])
.map(filePath => {
const absoluteFilePath = path.resolve(dirPath, filePath)
const fileContent = fs.readFileSync(absoluteFilePath, { encoding: 'utf8' })
const parser = parsers[path.extname(filePath)]
const content = parser(fileContent)
return mapResult({
absoluteFilePath,
content
})
})
)
})
return result
}
module.exports = function readGraph(graphPath, mapResult = i => i) {
return {
nodeLabels: readGraphObjects(path.resolve(graphPath, 'nodeLabels'), mapResult),
nodes: readGraphObjects(path.resolve(graphPath, 'nodes'), mapResult),
relationships: readGraphObjects(path.resolve(graphPath, 'relationships'), mapResult),
relationshipTypes: readGraphObjects(path.resolve(graphPath, 'relationshipTypes'), mapResult),
}
}
| Allow custom mapping function when reading graph. | Allow custom mapping function when reading graph.
| JavaScript | mit | frontiernav/frontiernav-data,frontiernav/frontiernav-data | javascript | ## Code Before:
const fs = require('fs')
const file = require('file')
const path = require('path')
const toml = require('toml')
const parsers = {
'.toml': toml.parse,
'.json': JSON.parse
}
function readGraphObjects(objectsPath) {
let result = []
file.walkSync(objectsPath, (dirPath, dirs, filePaths) => {
result = result.concat(
filePaths
.filter(filePath => !!parsers[path.extname(filePath)])
.map(filePath => {
const absoluteFilePath = path.resolve(dirPath, filePath)
const fileContent = fs.readFileSync(absoluteFilePath, { encoding: 'utf8' })
const parser = parsers[path.extname(filePath)]
const content = parser(fileContent)
return {
absoluteFilePath,
content
}
})
)
})
return result
}
module.exports = function readGraph(graphPath) {
return {
nodeLabels: readGraphObjects(path.resolve(graphPath, 'nodeLabels')),
nodes: readGraphObjects(path.resolve(graphPath, 'nodes')),
relationships: readGraphObjects(path.resolve(graphPath, 'relationships')),
relationshipTypes: readGraphObjects(path.resolve(graphPath, 'relationshipTypes')),
}
}
## Instruction:
Allow custom mapping function when reading graph.
## Code After:
const fs = require('fs')
const file = require('file')
const path = require('path')
const toml = require('toml')
const parsers = {
'.toml': toml.parse,
'.json': JSON.parse
}
function readGraphObjects(objectsPath, mapResult) {
let result = []
file.walkSync(objectsPath, (dirPath, dirs, filePaths) => {
result = result.concat(
filePaths
.filter(filePath => !!parsers[path.extname(filePath)])
.map(filePath => {
const absoluteFilePath = path.resolve(dirPath, filePath)
const fileContent = fs.readFileSync(absoluteFilePath, { encoding: 'utf8' })
const parser = parsers[path.extname(filePath)]
const content = parser(fileContent)
return mapResult({
absoluteFilePath,
content
})
})
)
})
return result
}
module.exports = function readGraph(graphPath, mapResult = i => i) {
return {
nodeLabels: readGraphObjects(path.resolve(graphPath, 'nodeLabels'), mapResult),
nodes: readGraphObjects(path.resolve(graphPath, 'nodes'), mapResult),
relationships: readGraphObjects(path.resolve(graphPath, 'relationships'), mapResult),
relationshipTypes: readGraphObjects(path.resolve(graphPath, 'relationshipTypes'), mapResult),
}
}
| const fs = require('fs')
const file = require('file')
const path = require('path')
const toml = require('toml')
const parsers = {
'.toml': toml.parse,
'.json': JSON.parse
}
- function readGraphObjects(objectsPath) {
+ function readGraphObjects(objectsPath, mapResult) {
? +++++++++++
let result = []
file.walkSync(objectsPath, (dirPath, dirs, filePaths) => {
result = result.concat(
filePaths
.filter(filePath => !!parsers[path.extname(filePath)])
.map(filePath => {
const absoluteFilePath = path.resolve(dirPath, filePath)
const fileContent = fs.readFileSync(absoluteFilePath, { encoding: 'utf8' })
const parser = parsers[path.extname(filePath)]
const content = parser(fileContent)
- return {
+ return mapResult({
? ++++++++++
absoluteFilePath,
content
- }
+ })
? +
})
)
})
return result
}
- module.exports = function readGraph(graphPath) {
+ module.exports = function readGraph(graphPath, mapResult = i => i) {
? ++++++++++++++++++++
return {
- nodeLabels: readGraphObjects(path.resolve(graphPath, 'nodeLabels')),
+ nodeLabels: readGraphObjects(path.resolve(graphPath, 'nodeLabels'), mapResult),
? +++++++++++
- nodes: readGraphObjects(path.resolve(graphPath, 'nodes')),
+ nodes: readGraphObjects(path.resolve(graphPath, 'nodes'), mapResult),
? +++++++++++
- relationships: readGraphObjects(path.resolve(graphPath, 'relationships')),
+ relationships: readGraphObjects(path.resolve(graphPath, 'relationships'), mapResult),
? +++++++++++
- relationshipTypes: readGraphObjects(path.resolve(graphPath, 'relationshipTypes')),
+ relationshipTypes: readGraphObjects(path.resolve(graphPath, 'relationshipTypes'), mapResult),
? +++++++++++
}
} | 16 | 0.380952 | 8 | 8 |
cc337da05fbe42067ca5b878dd9ab5f26d6bfc9e | spec/models/miq_product_feature_spec.rb | spec/models/miq_product_feature_spec.rb | require "spec_helper"
describe MiqProductFeature do
context ".seed" do
it "empty table" do
MiqRegion.seed
MiqProductFeature.seed
MiqProductFeature.count.should eq(846)
end
it "run twice" do
MiqRegion.seed
MiqProductFeature.seed
MiqProductFeature.seed
MiqProductFeature.count.should eq(846)
end
it "with existing records" do
deleted = FactoryGirl.create(:miq_product_feature, :identifier => "xxx")
changed = FactoryGirl.create(:miq_product_feature, :identifier => "about", :name => "XXX")
unchanged = FactoryGirl.create(:miq_product_feature_everything)
unchanged_orig_updated_at = unchanged.updated_at
MiqRegion.seed
MiqProductFeature.seed
MiqProductFeature.count.should eq(846)
expect { deleted.reload }.to raise_error(ActiveRecord::RecordNotFound)
changed.reload.name.should == "About"
unchanged.reload.updated_at.should be_same_time_as unchanged_orig_updated_at
end
end
end
| require "spec_helper"
describe MiqProductFeature do
before do
@expected_feature_count = 846
end
context ".seed" do
it "empty table" do
MiqRegion.seed
MiqProductFeature.seed
MiqProductFeature.count.should eq(@expected_feature_count)
end
it "run twice" do
MiqRegion.seed
MiqProductFeature.seed
MiqProductFeature.seed
MiqProductFeature.count.should eq(@expected_feature_count)
end
it "with existing records" do
deleted = FactoryGirl.create(:miq_product_feature, :identifier => "xxx")
changed = FactoryGirl.create(:miq_product_feature, :identifier => "about", :name => "XXX")
unchanged = FactoryGirl.create(:miq_product_feature_everything)
unchanged_orig_updated_at = unchanged.updated_at
MiqRegion.seed
MiqProductFeature.seed
MiqProductFeature.count.should eq(@expected_feature_count)
expect { deleted.reload }.to raise_error(ActiveRecord::RecordNotFound)
changed.reload.name.should == "About"
unchanged.reload.updated_at.should be_same_time_as unchanged_orig_updated_at
end
end
end
| Set expected feature count in one place instead of 3 | Set expected feature count in one place instead of 3
| Ruby | apache-2.0 | NickLaMuro/manageiq,NickLaMuro/manageiq,NaNi-Z/manageiq,jvlcek/manageiq,syncrou/manageiq,yaacov/manageiq,jameswnl/manageiq,djberg96/manageiq,syncrou/manageiq,chessbyte/manageiq,gerikis/manageiq,kbrock/manageiq,mzazrivec/manageiq,maas-ufcg/manageiq,mresti/manageiq,jvlcek/manageiq,gmcculloug/manageiq,tinaafitz/manageiq,NaNi-Z/manageiq,mzazrivec/manageiq,ailisp/manageiq,ailisp/manageiq,romaintb/manageiq,andyvesel/manageiq,tzumainn/manageiq,pkomanek/manageiq,billfitzgerald0120/manageiq,billfitzgerald0120/manageiq,durandom/manageiq,chessbyte/manageiq,pkomanek/manageiq,mkanoor/manageiq,gmcculloug/manageiq,mresti/manageiq,hstastna/manageiq,branic/manageiq,ManageIQ/manageiq,jvlcek/manageiq,devise/manageiq,pkomanek/manageiq,maas-ufcg/manageiq,lpichler/manageiq,romaintb/manageiq,maas-ufcg/manageiq,lpichler/manageiq,fbladilo/manageiq,KevinLoiseau/manageiq,tinaafitz/manageiq,gerikis/manageiq,KevinLoiseau/manageiq,skateman/manageiq,romanblanco/manageiq,aufi/manageiq,jntullo/manageiq,matobet/manageiq,djberg96/manageiq,mfeifer/manageiq,maas-ufcg/manageiq,lpichler/manageiq,israel-hdez/manageiq,jameswnl/manageiq,djberg96/manageiq,syncrou/manageiq,gerikis/manageiq,matobet/manageiq,syncrou/manageiq,romanblanco/manageiq,josejulio/manageiq,mfeifer/manageiq,josejulio/manageiq,yaacov/manageiq,jntullo/manageiq,durandom/manageiq,mresti/manageiq,mresti/manageiq,juliancheal/manageiq,fbladilo/manageiq,billfitzgerald0120/manageiq,NaNi-Z/manageiq,fbladilo/manageiq,juliancheal/manageiq,durandom/manageiq,borod108/manageiq,borod108/manageiq,israel-hdez/manageiq,mfeifer/manageiq,chessbyte/manageiq,d-m-u/manageiq,mkanoor/manageiq,ailisp/manageiq,kbrock/manageiq,lpichler/manageiq,kbrock/manageiq,fbladilo/manageiq,mzazrivec/manageiq,romanblanco/manageiq,mkanoor/manageiq,romaintb/manageiq,durandom/manageiq,gerikis/manageiq,romaintb/manageiq,borod108/manageiq,NaNi-Z/manageiq,skateman/manageiq,juliancheal/manageiq,ManageIQ/manageiq,jrafanie/manageiq,israel-hdez/manageiq,jvlcek/manageiq,d-m-u/manageiq,maas-ufcg/manageiq,ilackarms/manageiq,billfitzgerald0120/manageiq,andyvesel/manageiq,KevinLoiseau/manageiq,ilackarms/manageiq,yaacov/manageiq,maas-ufcg/manageiq,KevinLoiseau/manageiq,jrafanie/manageiq,kbrock/manageiq,andyvesel/manageiq,skateman/manageiq,branic/manageiq,d-m-u/manageiq,ilackarms/manageiq,djberg96/manageiq,romaintb/manageiq,jameswnl/manageiq,ManageIQ/manageiq,jntullo/manageiq,jvlcek/manageiq,tzumainn/manageiq,jrafanie/manageiq,andyvesel/manageiq,ailisp/manageiq,israel-hdez/manageiq,aufi/manageiq,branic/manageiq,matobet/manageiq,ManageIQ/manageiq,josejulio/manageiq,NickLaMuro/manageiq,d-m-u/manageiq,romanblanco/manageiq,tzumainn/manageiq,tzumainn/manageiq,hstastna/manageiq,romaintb/manageiq,gmcculloug/manageiq | ruby | ## Code Before:
require "spec_helper"
describe MiqProductFeature do
context ".seed" do
it "empty table" do
MiqRegion.seed
MiqProductFeature.seed
MiqProductFeature.count.should eq(846)
end
it "run twice" do
MiqRegion.seed
MiqProductFeature.seed
MiqProductFeature.seed
MiqProductFeature.count.should eq(846)
end
it "with existing records" do
deleted = FactoryGirl.create(:miq_product_feature, :identifier => "xxx")
changed = FactoryGirl.create(:miq_product_feature, :identifier => "about", :name => "XXX")
unchanged = FactoryGirl.create(:miq_product_feature_everything)
unchanged_orig_updated_at = unchanged.updated_at
MiqRegion.seed
MiqProductFeature.seed
MiqProductFeature.count.should eq(846)
expect { deleted.reload }.to raise_error(ActiveRecord::RecordNotFound)
changed.reload.name.should == "About"
unchanged.reload.updated_at.should be_same_time_as unchanged_orig_updated_at
end
end
end
## Instruction:
Set expected feature count in one place instead of 3
## Code After:
require "spec_helper"
describe MiqProductFeature do
before do
@expected_feature_count = 846
end
context ".seed" do
it "empty table" do
MiqRegion.seed
MiqProductFeature.seed
MiqProductFeature.count.should eq(@expected_feature_count)
end
it "run twice" do
MiqRegion.seed
MiqProductFeature.seed
MiqProductFeature.seed
MiqProductFeature.count.should eq(@expected_feature_count)
end
it "with existing records" do
deleted = FactoryGirl.create(:miq_product_feature, :identifier => "xxx")
changed = FactoryGirl.create(:miq_product_feature, :identifier => "about", :name => "XXX")
unchanged = FactoryGirl.create(:miq_product_feature_everything)
unchanged_orig_updated_at = unchanged.updated_at
MiqRegion.seed
MiqProductFeature.seed
MiqProductFeature.count.should eq(@expected_feature_count)
expect { deleted.reload }.to raise_error(ActiveRecord::RecordNotFound)
changed.reload.name.should == "About"
unchanged.reload.updated_at.should be_same_time_as unchanged_orig_updated_at
end
end
end
| require "spec_helper"
describe MiqProductFeature do
+ before do
+ @expected_feature_count = 846
+ end
+
context ".seed" do
it "empty table" do
MiqRegion.seed
MiqProductFeature.seed
- MiqProductFeature.count.should eq(846)
? ^^^
+ MiqProductFeature.count.should eq(@expected_feature_count)
? ^^^^^^^^^^^^^^^^^^^^^^^
end
it "run twice" do
MiqRegion.seed
MiqProductFeature.seed
MiqProductFeature.seed
- MiqProductFeature.count.should eq(846)
? ^^^
+ MiqProductFeature.count.should eq(@expected_feature_count)
? ^^^^^^^^^^^^^^^^^^^^^^^
end
it "with existing records" do
deleted = FactoryGirl.create(:miq_product_feature, :identifier => "xxx")
changed = FactoryGirl.create(:miq_product_feature, :identifier => "about", :name => "XXX")
unchanged = FactoryGirl.create(:miq_product_feature_everything)
unchanged_orig_updated_at = unchanged.updated_at
MiqRegion.seed
MiqProductFeature.seed
- MiqProductFeature.count.should eq(846)
? ^^^
+ MiqProductFeature.count.should eq(@expected_feature_count)
? ^^^^^^^^^^^^^^^^^^^^^^^
expect { deleted.reload }.to raise_error(ActiveRecord::RecordNotFound)
changed.reload.name.should == "About"
unchanged.reload.updated_at.should be_same_time_as unchanged_orig_updated_at
end
end
end | 10 | 0.3125 | 7 | 3 |
09efad38b72dc76361cb2f869e5b676a4e2136d9 | lib/spiderfw/controller/page_controller.rb | lib/spiderfw/controller/page_controller.rb | require 'spiderfw/controller/app_controller'
module Spider
class PageController < Controller
include Visual
def initialize(request, response, scene=nil)
super
@widgets = {}
end
def get_route(path)
if (path =~ /^[^:]+:([^:\/]+)[:\/]?(.*)$/) # route to widgets
if (@widgets[$1])
return Route.new(:path => path, :dest => @widgets[$1], :action => $2)
end
end
return super
end
def init_template(path)
template = load_template(path)
template.widgets = @widgets
template.init(@request, @scene)
return template
end
def render(path=nil, scene=nil)
scene ||= @scene
scene[:widgets] = @widgets
super(path, scene)
end
end
end | require 'spiderfw/controller/app_controller'
module Spider
class PageController < Controller
include Visual
def initialize(request, response, scene=nil)
super
@widgets = {}
end
def get_route(path)
if (path =~ /^[^:]+:([^:\/]+)[:\/]?(.*)$/) # route to widgets
if (@widgets[$1])
return Route.new(:path => path, :dest => @widgets[$1], :action => $2)
end
end
return super
end
def init_template(path)
template = super
template.widgets = @widgets
return template
end
def render(path=nil, scene=nil)
scene ||= @scene
scene[:widgets] = @widgets
super(path, scene)
end
end
end | Fix for new template sintax | Fix for new template sintax
| Ruby | mit | me/spider,me/spider,me/spider,me/spider | ruby | ## Code Before:
require 'spiderfw/controller/app_controller'
module Spider
class PageController < Controller
include Visual
def initialize(request, response, scene=nil)
super
@widgets = {}
end
def get_route(path)
if (path =~ /^[^:]+:([^:\/]+)[:\/]?(.*)$/) # route to widgets
if (@widgets[$1])
return Route.new(:path => path, :dest => @widgets[$1], :action => $2)
end
end
return super
end
def init_template(path)
template = load_template(path)
template.widgets = @widgets
template.init(@request, @scene)
return template
end
def render(path=nil, scene=nil)
scene ||= @scene
scene[:widgets] = @widgets
super(path, scene)
end
end
end
## Instruction:
Fix for new template sintax
## Code After:
require 'spiderfw/controller/app_controller'
module Spider
class PageController < Controller
include Visual
def initialize(request, response, scene=nil)
super
@widgets = {}
end
def get_route(path)
if (path =~ /^[^:]+:([^:\/]+)[:\/]?(.*)$/) # route to widgets
if (@widgets[$1])
return Route.new(:path => path, :dest => @widgets[$1], :action => $2)
end
end
return super
end
def init_template(path)
template = super
template.widgets = @widgets
return template
end
def render(path=nil, scene=nil)
scene ||= @scene
scene[:widgets] = @widgets
super(path, scene)
end
end
end | require 'spiderfw/controller/app_controller'
module Spider
class PageController < Controller
include Visual
def initialize(request, response, scene=nil)
super
@widgets = {}
end
def get_route(path)
if (path =~ /^[^:]+:([^:\/]+)[:\/]?(.*)$/) # route to widgets
if (@widgets[$1])
return Route.new(:path => path, :dest => @widgets[$1], :action => $2)
end
end
return super
end
def init_template(path)
- template = load_template(path)
+ template = super
template.widgets = @widgets
- template.init(@request, @scene)
return template
end
def render(path=nil, scene=nil)
scene ||= @scene
scene[:widgets] = @widgets
super(path, scene)
end
end
end | 3 | 0.073171 | 1 | 2 |
0eaa149abc80ec8913cf60d19b67da56884d0f34 | source/developers_guide.rst | source/developers_guide.rst | .. _developers_guide:
%%%%%%%%%%%%%%%%
Developers guide
%%%%%%%%%%%%%%%%
This section contains tutorials showing how to use the diferent VCA filters.
Berfore starting with this section, it would be very interesting that you have
a look to the :doc:`API <APIs>` and the :doc:`architecture <architecture>`.
As it was explained on the :doc:`API <APIs>` section, these filters can be used
with different technologies.Here, we are going to explain this guide using a
web application which interacts with and application server based on JavaEE
technogoly. This application server will be in charge of holding the logic
orchestration, the communication among the clients and controlling the Server
capabilites.
Here we are going to see the following examples:
NuboFaceDetector
================
- You can access to this example :doc:`here <face_detector>`
NuboNoseDetector
================
- You can access to this example :doc:`here <nose_detector>`
NuboMouthDetector
=================
- You can access to this example :doc:`here <mouth_detector>`
NuboEyeDetector
===============
- You can access to this example :doc:`here <eye_detector>`
NuboEarDetector
===============
- You can access to this example :doc:`here <ear_detector>`
NuboFaceProfile
===============
- You can access to this example :doc:`here <face_profile>`
NuboTracker
===========
- You can access to this example :doc:`here <tracker>`
| .. _developers_guide:
%%%%%%%%%%%%%%%%
Developers guide
%%%%%%%%%%%%%%%%
This section contains tutorials showing how to use the diferent VCA filters.
Berfore starting with this section, it would be very interesting that you have
a look to the :doc:`API <APIs>` and the :doc:`architecture <architecture>`.
As it was explained on the :doc:`API <APIs>` section, these filters can be used
with different technologies.Here, we are going to explain this guide using a
web application which interacts with and application server based on JavaEE
technogoly. This application server will be in charge of holding the logic
orchestration, the communication among the clients and controlling the Server
capabilites.
Here we are going to see the following examples:
.. toctree:: :maxdepth: 1
face_detector.rst
nose_detector.rst
mouth_detector.rst
eye_detector.rst
ear_detector.rst
face_profile.rst
face_profile.rst
tracker.rst
| Include examples in developer guide toctree | Include examples in developer guide toctree
| reStructuredText | apache-2.0 | nubomedia/DOC-NUBOMEDIA-VCA | restructuredtext | ## Code Before:
.. _developers_guide:
%%%%%%%%%%%%%%%%
Developers guide
%%%%%%%%%%%%%%%%
This section contains tutorials showing how to use the diferent VCA filters.
Berfore starting with this section, it would be very interesting that you have
a look to the :doc:`API <APIs>` and the :doc:`architecture <architecture>`.
As it was explained on the :doc:`API <APIs>` section, these filters can be used
with different technologies.Here, we are going to explain this guide using a
web application which interacts with and application server based on JavaEE
technogoly. This application server will be in charge of holding the logic
orchestration, the communication among the clients and controlling the Server
capabilites.
Here we are going to see the following examples:
NuboFaceDetector
================
- You can access to this example :doc:`here <face_detector>`
NuboNoseDetector
================
- You can access to this example :doc:`here <nose_detector>`
NuboMouthDetector
=================
- You can access to this example :doc:`here <mouth_detector>`
NuboEyeDetector
===============
- You can access to this example :doc:`here <eye_detector>`
NuboEarDetector
===============
- You can access to this example :doc:`here <ear_detector>`
NuboFaceProfile
===============
- You can access to this example :doc:`here <face_profile>`
NuboTracker
===========
- You can access to this example :doc:`here <tracker>`
## Instruction:
Include examples in developer guide toctree
## Code After:
.. _developers_guide:
%%%%%%%%%%%%%%%%
Developers guide
%%%%%%%%%%%%%%%%
This section contains tutorials showing how to use the diferent VCA filters.
Berfore starting with this section, it would be very interesting that you have
a look to the :doc:`API <APIs>` and the :doc:`architecture <architecture>`.
As it was explained on the :doc:`API <APIs>` section, these filters can be used
with different technologies.Here, we are going to explain this guide using a
web application which interacts with and application server based on JavaEE
technogoly. This application server will be in charge of holding the logic
orchestration, the communication among the clients and controlling the Server
capabilites.
Here we are going to see the following examples:
.. toctree:: :maxdepth: 1
face_detector.rst
nose_detector.rst
mouth_detector.rst
eye_detector.rst
ear_detector.rst
face_profile.rst
face_profile.rst
tracker.rst
| .. _developers_guide:
%%%%%%%%%%%%%%%%
Developers guide
%%%%%%%%%%%%%%%%
This section contains tutorials showing how to use the diferent VCA filters.
Berfore starting with this section, it would be very interesting that you have
a look to the :doc:`API <APIs>` and the :doc:`architecture <architecture>`.
As it was explained on the :doc:`API <APIs>` section, these filters can be used
with different technologies.Here, we are going to explain this guide using a
web application which interacts with and application server based on JavaEE
technogoly. This application server will be in charge of holding the logic
orchestration, the communication among the clients and controlling the Server
capabilites.
Here we are going to see the following examples:
+ .. toctree:: :maxdepth: 1
- NuboFaceDetector
- ================
+ face_detector.rst
+ nose_detector.rst
+ mouth_detector.rst
+ eye_detector.rst
+ ear_detector.rst
+ face_profile.rst
+ face_profile.rst
+ tracker.rst
- - You can access to this example :doc:`here <face_detector>`
-
- NuboNoseDetector
- ================
-
- - You can access to this example :doc:`here <nose_detector>`
-
- NuboMouthDetector
- =================
-
- - You can access to this example :doc:`here <mouth_detector>`
-
- NuboEyeDetector
- ===============
-
- - You can access to this example :doc:`here <eye_detector>`
-
- NuboEarDetector
- ===============
-
- - You can access to this example :doc:`here <ear_detector>`
-
- NuboFaceProfile
- ===============
-
- - You can access to this example :doc:`here <face_profile>`
-
- NuboTracker
- ===========
-
- - You can access to this example :doc:`here <tracker>` | 42 | 0.792453 | 9 | 33 |
6164224d4cca60aabc83d58d0dd2dd668cf43078 | packages/ye/yesod-links.yaml | packages/ye/yesod-links.yaml | homepage: http://github.com/pbrisbin/yesod-goodies/yesod-links
changelog-type: ''
hash: 9a9a1fadd8fe794c8766bd5c6fce81de99b7d2b1e14436759fadfd7d0735aa0c
test-bench-deps: {}
maintainer: me@pbrisbin.com
synopsis: A typeclass which simplifies creating link widgets throughout your site
changelog: ''
basic-deps:
yesod-core: ! '>=1.2 && <1.3'
base: ! '>=4 && <5'
text: ! '>=0.11 && <0.12'
all-versions:
- '0.2'
- 0.2.1
- 0.3.0
author: Patrick Brisbin
latest: 0.3.0
description-type: haddock
description: A yesod goody, yesod links
license-name: BSD-3-Clause
| homepage: http://github.com/pbrisbin/yesod-goodies/yesod-links
changelog-type: ''
hash: 4cc2c445dc8b8ad863eeab693948b54208d718e4106deeec9392f29b24a1df5e
test-bench-deps: {}
maintainer: me@pbrisbin.com
synopsis: A typeclass which simplifies creating link widgets throughout your site
changelog: ''
basic-deps:
yesod-core: '>=1.2'
base: '>=4 && <5'
text: '>=0.11'
all-versions:
- '0.2'
- 0.2.1
- 0.3.0
author: Patrick Brisbin
latest: 0.3.0
description-type: haddock
description: A yesod goody, yesod links
license-name: BSD-3-Clause
| Update from Hackage at 2021-06-28T13:12:44Z | Update from Hackage at 2021-06-28T13:12:44Z
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: http://github.com/pbrisbin/yesod-goodies/yesod-links
changelog-type: ''
hash: 9a9a1fadd8fe794c8766bd5c6fce81de99b7d2b1e14436759fadfd7d0735aa0c
test-bench-deps: {}
maintainer: me@pbrisbin.com
synopsis: A typeclass which simplifies creating link widgets throughout your site
changelog: ''
basic-deps:
yesod-core: ! '>=1.2 && <1.3'
base: ! '>=4 && <5'
text: ! '>=0.11 && <0.12'
all-versions:
- '0.2'
- 0.2.1
- 0.3.0
author: Patrick Brisbin
latest: 0.3.0
description-type: haddock
description: A yesod goody, yesod links
license-name: BSD-3-Clause
## Instruction:
Update from Hackage at 2021-06-28T13:12:44Z
## Code After:
homepage: http://github.com/pbrisbin/yesod-goodies/yesod-links
changelog-type: ''
hash: 4cc2c445dc8b8ad863eeab693948b54208d718e4106deeec9392f29b24a1df5e
test-bench-deps: {}
maintainer: me@pbrisbin.com
synopsis: A typeclass which simplifies creating link widgets throughout your site
changelog: ''
basic-deps:
yesod-core: '>=1.2'
base: '>=4 && <5'
text: '>=0.11'
all-versions:
- '0.2'
- 0.2.1
- 0.3.0
author: Patrick Brisbin
latest: 0.3.0
description-type: haddock
description: A yesod goody, yesod links
license-name: BSD-3-Clause
| homepage: http://github.com/pbrisbin/yesod-goodies/yesod-links
changelog-type: ''
- hash: 9a9a1fadd8fe794c8766bd5c6fce81de99b7d2b1e14436759fadfd7d0735aa0c
+ hash: 4cc2c445dc8b8ad863eeab693948b54208d718e4106deeec9392f29b24a1df5e
test-bench-deps: {}
maintainer: me@pbrisbin.com
synopsis: A typeclass which simplifies creating link widgets throughout your site
changelog: ''
basic-deps:
- yesod-core: ! '>=1.2 && <1.3'
? -- --------
+ yesod-core: '>=1.2'
- base: ! '>=4 && <5'
? --
+ base: '>=4 && <5'
- text: ! '>=0.11 && <0.12'
+ text: '>=0.11'
all-versions:
- '0.2'
- 0.2.1
- 0.3.0
author: Patrick Brisbin
latest: 0.3.0
description-type: haddock
description: A yesod goody, yesod links
license-name: BSD-3-Clause | 8 | 0.4 | 4 | 4 |
8319038d642d15e7a00d280a8115d17d04d9cb57 | content/sbmarathon/donate.md | content/sbmarathon/donate.md | Our runners are making an impact in Uganda by raising funding and awareness. Join us in sustainably empowering widows and orphans with your donation today! 100% of your donation directly supports our women and children in Uganda.
## BOH’S RUNNER PAGES
Select the runner you will help reach their $1,000 fundraising goal! Thank you for your support!
* [Dalia Caballero](https://becauseofhope.webconnex.com/SBMDalia)
* [Kara Randall](https://becauseofhope.webconnex.com/SBMKara)
* [Natalie Lemonnier](https://becauseofhope.webconnex.com/SBMNatalie)
* [Sarah Ruiz](https://becauseofhope.webconnex.com/SBMSarah)
| Our runners are making an impact in Uganda by raising funding and awareness. Join us in sustainably empowering widows and orphans with your donation today! 100% of your donation directly supports our women and children in Uganda.
## BOH’S RUNNER PAGES
Select the runner you will help reach their $1,000 fundraising goal! Thank you for your support!
* [Daniel Ruiz](https://becauseofhope.webconnex.com/SBMDaniel)
* [Dalia Caballero](https://becauseofhope.webconnex.com/SBMDalia)
* [Kara Randall](https://becauseofhope.webconnex.com/SBMKara)
* [Natalie Lemonnier](https://becauseofhope.webconnex.com/SBMNatalie)
* [Sarah Ruiz](https://becauseofhope.webconnex.com/SBMSarah)
| Add Daniel Ruiz to marathon page. | Add Daniel Ruiz to marathon page.
https://trello.com/c/VihQVVut/375-add-daniel-ruiz-to-sb-marathon-donation-page | Markdown | mit | becauseofhope/because-of-hope,becauseofhope/because-of-hope,becauseofhope/because-of-hope,becauseofhope/because-of-hope | markdown | ## Code Before:
Our runners are making an impact in Uganda by raising funding and awareness. Join us in sustainably empowering widows and orphans with your donation today! 100% of your donation directly supports our women and children in Uganda.
## BOH’S RUNNER PAGES
Select the runner you will help reach their $1,000 fundraising goal! Thank you for your support!
* [Dalia Caballero](https://becauseofhope.webconnex.com/SBMDalia)
* [Kara Randall](https://becauseofhope.webconnex.com/SBMKara)
* [Natalie Lemonnier](https://becauseofhope.webconnex.com/SBMNatalie)
* [Sarah Ruiz](https://becauseofhope.webconnex.com/SBMSarah)
## Instruction:
Add Daniel Ruiz to marathon page.
https://trello.com/c/VihQVVut/375-add-daniel-ruiz-to-sb-marathon-donation-page
## Code After:
Our runners are making an impact in Uganda by raising funding and awareness. Join us in sustainably empowering widows and orphans with your donation today! 100% of your donation directly supports our women and children in Uganda.
## BOH’S RUNNER PAGES
Select the runner you will help reach their $1,000 fundraising goal! Thank you for your support!
* [Daniel Ruiz](https://becauseofhope.webconnex.com/SBMDaniel)
* [Dalia Caballero](https://becauseofhope.webconnex.com/SBMDalia)
* [Kara Randall](https://becauseofhope.webconnex.com/SBMKara)
* [Natalie Lemonnier](https://becauseofhope.webconnex.com/SBMNatalie)
* [Sarah Ruiz](https://becauseofhope.webconnex.com/SBMSarah)
| Our runners are making an impact in Uganda by raising funding and awareness. Join us in sustainably empowering widows and orphans with your donation today! 100% of your donation directly supports our women and children in Uganda.
## BOH’S RUNNER PAGES
Select the runner you will help reach their $1,000 fundraising goal! Thank you for your support!
+ * [Daniel Ruiz](https://becauseofhope.webconnex.com/SBMDaniel)
* [Dalia Caballero](https://becauseofhope.webconnex.com/SBMDalia)
* [Kara Randall](https://becauseofhope.webconnex.com/SBMKara)
* [Natalie Lemonnier](https://becauseofhope.webconnex.com/SBMNatalie)
* [Sarah Ruiz](https://becauseofhope.webconnex.com/SBMSarah) | 1 | 0.111111 | 1 | 0 |
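The four trailing numeric columns on each record (`diff_length`, `relative_diff_length`, `n_lines_added`, `n_lines_deleted`) are derivable from `old_contents` and `new_contents`. Below is a minimal sketch of recomputing them; it assumes the `diff` column follows Python's `difflib.ndiff` format (lines prefixed `- `, `+ `, `  `, or `? `) and that `relative_diff_length` is normalised by the old file's line count — both inferred from the records above, not stated by the dataset itself.

```python
import difflib

def diff_stats(old_contents: str, new_contents: str):
    """Recompute the derived diff columns for one record.

    Assumption: the dataset's `diff` field is difflib.ndiff output,
    and relative_diff_length = diff_length / old-file line count.
    """
    old_lines = old_contents.splitlines()
    new_lines = new_contents.splitlines()
    diff = list(difflib.ndiff(old_lines, new_lines))
    # Count only real change lines; '? ' intraline guide lines are excluded.
    n_lines_added = sum(1 for line in diff if line.startswith("+ "))
    n_lines_deleted = sum(1 for line in diff if line.startswith("- "))
    diff_length = n_lines_added + n_lines_deleted
    relative_diff_length = diff_length / max(len(old_lines), 1)
    return diff_length, relative_diff_length, n_lines_added, n_lines_deleted
```

For instance, the yesod-links record above reports `8 | 0.4 | 4 | 4`, which is consistent with 8 changed lines (4 added, 4 deleted) over a 20-line old file.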